40.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description
As a member of the Support organization, your focus is to deliver post-sales support and solutions to the Oracle customer base while serving as an advocate for customer needs. This involves resolving post-sales non-technical customer inquiries via phone and electronic means, as well as technical questions regarding the use of and troubleshooting for our Electronic Support Services. As a primary point of contact for customers, you are responsible for facilitating customer relationships with Support and providing advice and assistance to internal Oracle employees on diverse customer situations and escalated issues. The Fusion Supply Chain / Manufacturing Support Team is expanding to support our rapidly increasing customer base in the Cloud (SaaS), as well as growing numbers of on-premise accounts. The team partners with Oracle Development in supporting early adopters and many other new customers. This is a unique opportunity to be part of the future of Oracle Support and help shape the product and the organization to benefit our customers and our employees. This position supports Fusion Applications, particularly the Fusion SCM modules: Fusion SCM Planning, Fusion SCM Manufacturing, and Fusion SCM Maintenance.
- Research, resolve and respond to complex issues across the Application product lines and product boundaries in accordance with current standards
- Demonstrate strong follow-through and consistently keep commitments to customers and employees
- Ensure that each and every customer is handled with a consummately professional attitude and the highest possible level of service
- Take ownership and responsibility for priority customer cases where and when required
- Review urgent and critical incidents for quality
- Run queue reviews with analysts to ensure quality and efficiency of support
- Report high-visibility cases, escalations, and customer trends to management
- Act as an information resource to the management team
- Contribute to an environment that encourages information sharing, team-based resolution activity, cross-training, and an absolute focus on resolving customer cases as quickly and effectively as possible
- Participate in projects that enhance the quality or efficiency of support
- Participate in system and release testing, as needed
- Act as a role model and mentor for other analysts
- Perform detailed technical analysis and troubleshooting using SQL, PL/SQL, Java, ADF, Redwood, VBCS, SOA and REST APIs
- Participate in after-hours support as required
- Work with Oracle Development/Support Development on product-related issues
- Demonstrate core competencies (employ sound business judgment, find creative and innovative ways to solve problems, show a strong work ethic, and do whatever it takes to get the job done)

Business process and functional knowledge required for our support organization for the Maintenance Module:

Asset Management: Oversee the entire lifecycle of physical assets to optimize utilization and visibility into maintenance operations. Track and manage enterprise-owned and customer-owned assets, including Install Base Assets.
Preventive Maintenance/Maintenance Program: Define and generate daily preventive maintenance forecasts for affected assets within maintenance-enabled organizations. Utilize forecasts to create preventive maintenance work orders, reducing workload for planners and enhancing program auditing, optimization, and exception management.

Work Definition: Identify and manage Maintenance Work Areas based on physical, geographical, or logical groupings of work centers. Define and manage resources, work centers, and standard operations. Develop reusable operation templates (standard operations) detailing operation specifics and required resources. Apply standard operations to multiple maintenance work definitions and work orders.

Work Order Creation, Scheduling and Dispatch: Track material usage and labour hours against planned activities. Manage component installation and removal. Conduct inspections and ensure seamless execution of work orders.

Work Order Transactions: Apply knowledge of operation pull, assembly pull, and backflush concepts. Execute operation transactions to update dispatch status in count point operations. Manage re-sequenced operations within work order processes. Charge maintenance work orders for utilized resources and ensure accurate transaction recording.

Technical skills required for our support organization for the Maintenance Module:
- SQL and PL/SQL
- REST APIs: creating, different methods, and testing via Postman
- Knowledge of JSON format
- Knowledge of WSDL, XML and SOAP web services
- Oracle SOA: composites, business events, debugging via SOA composite trace and logs
- Java and Oracle ADF
- Oracle Visual Builder Studio (good to have)
- Page Composer (Fusion Apps): customize existing UI (good to have)
- Application Composer (Fusion Apps): sandbox, creating custom objects and fields, dynamic page layouts, and Object Functions (good to have)

Career Level - IC3

Responsibilities
As a Sr.
Support Engineer, you will be the technical interface to customers for the resolution of problems related to the maintenance and use of Oracle products. You should have an understanding of all Oracle products in your competencies and in-depth knowledge of several products and/or platforms. You should also be highly experienced in multiple platforms and able to complete assigned duties with minimal direction from management. In this position, you will routinely act independently while researching and developing solutions to customer issues. The responsibilities, business-process knowledge, and technical skills for this role are as detailed in the job description above.
Qualifications
Career Level - IC3

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability, status as a protected veteran, or any other characteristic protected by law.
Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
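The REST API skills listed above (creating requests, testing via Postman, JSON) can be sketched with a short Python example. This is an illustrative sketch only: the host, credentials, and work order number are invented, and the resource path follows the documented Fusion Cloud SCM REST pattern but should be verified against the API reference for your release.

```python
# Hypothetical example of building an authenticated GET request against a
# Fusion Maintenance REST resource. Host, user, password, and work order
# number are placeholders, not real values.
import base64
import urllib.request

def build_work_order_request(host: str, user: str, password: str,
                             work_order_number: str) -> urllib.request.Request:
    """Build a Basic-auth GET request for a maintenance work order."""
    url = (f"https://{host}/fscmRestApi/resources/11.13.18.05/"
           f"maintenanceWorkOrders?q=WorkOrderNumber={work_order_number}")
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return urllib.request.Request(url, headers={
        "Authorization": f"Basic {token}",
        "Content-Type": "application/json",
    })

req = build_work_order_request("example.fa.ocs.oraclecloud.com",
                               "mx.user", "secret", "WO-1001")
print(req.full_url)  # the request could then be sent with urllib.request.urlopen
```

The same request, pasted into Postman with the two headers shown, is how the "testing via Postman" step in the skills list would typically be exercised.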
Posted 2 weeks ago
8.0 - 10.0 years
20 - 35 Lacs
Pune
Work from Office
Job Summary:
We are looking for a seasoned Senior ETL/DB Tester with deep expertise in data validation and database testing across modern data platforms. This role requires strong proficiency in SQL and experience with tools like Talend, ADF, Snowflake, and Power BI. The ideal candidate will be highly analytical, detail-oriented, and capable of working across cross-functional teams in a fast-paced data engineering environment.

Key Responsibilities:
- Design, develop, and execute comprehensive test plans for ETL and database validation processes
- Validate data transformations and integrity across multiple stages and systems (Talend, ADF, Snowflake, Power BI)
- Perform manual testing and defect tracking using Zephyr or Tosca
- Analyze business and data requirements to ensure full test coverage
- Write and execute complex SQL queries for data reconciliation
- Identify data-related issues and conduct root cause analysis in collaboration with developers
- Track and manage bugs and enhancements through appropriate tools
- Optimize testing strategies for performance, scalability, and accuracy in ETL workflows

Mandatory Skills:
- ETL tools: Talend, ADF
- Data platforms: Snowflake
- Reporting/analytics: Power BI, VPI
- Testing tools: Zephyr or Tosca, manual testing
- Strong SQL expertise for validating complex data workflows

Good-to-Have Skills:
- API testing exposure
- Power BI advanced features (dashboards, DAX, data modelling)
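The data-reconciliation responsibility above can be sketched with a small example. This uses sqlite3 purely as a local stand-in; in practice the same count and row-level comparison queries would run against the staging layer loaded by Talend/ADF and the Snowflake target. Table and column names are invented.

```python
# Minimal source-vs-target reconciliation sketch: compare row counts and
# flag rows that are missing or have drifted values in the target.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.0);  -- row 2 drifted, row 3 missing
""")

def reconcile(conn):
    """Return (row count difference, list of mismatched source rows)."""
    (src_n,) = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()
    (tgt_n,) = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()
    mismatches = conn.execute("""
        SELECT s.order_id, s.amount, t.amount
        FROM src_orders s LEFT JOIN tgt_orders t USING (order_id)
        WHERE t.order_id IS NULL OR s.amount <> t.amount
        ORDER BY s.order_id
    """).fetchall()
    return src_n - tgt_n, mismatches

diff, rows = reconcile(conn)
print(diff, rows)  # → 1 [(2, 20.0, 25.0), (3, 30.0, None)]
```

The LEFT JOIN catches both failure modes a tester typically reports: rows dropped by the pipeline (NULL on the target side) and rows whose values were transformed incorrectly.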
Posted 2 weeks ago
6.0 years
7 - 8 Lacs
Hyderabad
On-site
Company Description
Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, create marketing solutions, and gain deeper insights into the automotive market, all using our unique combination of data, analytics and software. We also assist millions of people to realize their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com.

Job Description
The Azure Big Data Engineer is an important role in which you are responsible for designing, implementing, and managing comprehensive big data solutions on the Azure platform. You will report to the Senior Manager, and the role requires you to work hybrid, with 2 days per week from the office in Hyderabad.

Responsibilities:
- Design, implement, and maintain scalable and reliable data pipelines using Azure Data Factory, Databricks, Synapse Analytics, and Microsoft Fabric.
- Develop, configure, and optimize data lakes and warehouses on Azure using services like Azure Data Lake Storage (ADLS), Azure Lakehouse, and Warehouse, and monitor data pipelines for performance, scalability, and reliability.
- Collaborate with data scientists, architects, analysts, and other stakeholders to understand data requirements and deliver solutions that meet business needs.
- Ensure that data is secure and meets all regulatory compliance standards, including role-based access control (RBAC) and data encryption in Azure environments.
- Develop and configure monitoring and alerting mechanisms to proactively enhance performance and optimize data systems.
- Troubleshoot and resolve data-related issues in a timely manner.
- Produce clear, concise technical documentation for all developed solutions.

Requirements:
- Experience with SSIS, SQL Jobs, BIDS and ADF.
- Experience with Azure services (Microsoft Fabric, Azure Synapse, Azure SQL Database, Azure Key Vault, etc.).
- Proficiency in Azure data services, including Azure Data Factory, Azure Databricks, Azure Synapse Analytics and Azure Data Lake Storage.
- Experience in data modeling, data architecture, and implementing ETL/ELT solutions.
- Proficiency in SQL and familiarity with other programming languages such as Python or Scala.
- Knowledge of data modeling, data warehousing, and big data technologies.
- Experience with data governance and security best practices.

Qualifications:
- Bachelor's in computer science or a related field; Master's preferred.
- 6+ years of professional data ingestion experience with ETL/ELT tools like SSIS, ADF, and Synapse.
- 2+ years of Azure cloud experience.
- Preferred: experience with Microsoft Fabric and Azure Synapse; understanding of ML/AI concepts; Azure certifications.

Additional Information
Our uniqueness is that we truly celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what truly matters: DEI, work/life balance, development, authenticity, engagement, collaboration, wellness, reward & recognition, volunteering... the list goes on. Experian's strong people-first approach is award-winning: Great Place To Work™ in 24 countries, FORTUNE Best Companies to Work For, and Glassdoor Best Places to Work (globally, 4.4 stars), to name a few. Check out Experian Life on social or our Careers Site to understand why. Experian is proud to be an Equal Opportunity and Affirmative Action employer.
Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.

Experian Careers - Creating a better tomorrow together

Benefits
Experian cares for employees' work-life balance, health, safety and wellbeing. In support of this endeavor, we offer best-in-class family well-being benefits, enhanced medical benefits and paid time off. #LI-Hybrid

Find out what it's like to work for Experian by clicking here
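The monitoring-and-alerting responsibility above is platform-specific in practice (Azure Monitor, ADF alerts), but the underlying pattern can be sketched generically in plain Python. Everything here is an illustrative assumption: the wrapper name, retry counts, and the alert callback are invented, not part of any Azure SDK.

```python
# Generic retry-with-alerting wrapper for a pipeline step: run the step,
# retry on failure, and fire an alert callback for each failed attempt.
import time
from typing import Callable

def run_with_alerting(step: Callable[[], None], *, retries: int = 3,
                      backoff_s: float = 0.0,
                      alert: Callable[[str], None] = print) -> bool:
    """Return True if the step eventually succeeded, False if retries ran out."""
    for attempt in range(1, retries + 1):
        try:
            step()
            return True
        except Exception as exc:
            alert(f"attempt {attempt}/{retries} failed: {exc}")
            time.sleep(backoff_s)
    return False

# Demo: a step that always fails collects one alert per attempt.
alerts = []
ok = run_with_alerting(lambda: 1 / 0, retries=2, alert=alerts.append)
print(ok, len(alerts))  # → False 2
```

In a real deployment the `alert` callback would post to a monitoring channel or raise an Azure Monitor alert rather than append to a list.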
Posted 2 weeks ago
4.0 years
0 Lacs
Hyderabad
On-site
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

As part of our strategic initiative to build a centralized capability around data and cloud engineering, we are establishing a dedicated Azure Cloud Data Engineering practice. This team will be at the forefront of designing, developing, and deploying scalable data solutions on the cloud, primarily using the Microsoft Azure platform. The practice will serve as a centralized team, driving innovation, standardization, and best practices across cloud-based data initiatives. New hires will play a pivotal role in shaping the future of our data landscape, collaborating with cross-functional teams, clients, and stakeholders to deliver impactful, end-to-end solutions.

Primary Responsibilities:
- Ingest data from multiple on-prem and cloud data sources using various tools and capabilities in Azure
- Design and develop Azure Databricks processes using PySpark/Spark-SQL
- Design and develop orchestration jobs using ADF and Databricks Workflows
- Analyze data engineering processes being developed and act as an SME to troubleshoot performance issues and suggest solutions to improve them
- Develop and maintain CI/CD processes using Jenkins, GitHub, GitHub Actions, etc.
- Build test frameworks for Databricks notebook jobs for automated testing before code deployment
- Design and build POCs to validate new ideas, tools, and architectures in Azure
- Continuously explore new Azure services and capabilities; assess their applicability to business needs
- Create detailed documentation for cloud processes, architecture, and implementation patterns
- Work with the data and analytics team to build and deploy efficient data engineering processes and jobs on the Azure cloud
- Prepare case studies and technical write-ups to showcase successful implementations and lessons learned
- Work closely with clients, business stakeholders, and internal teams to gather requirements and translate them into technical solutions using best practices and appropriate architecture
- Contribute to full-lifecycle project implementations, from design and development to deployment and monitoring
- Ensure solutions adhere to security, compliance, and governance standards
- Monitor and optimize data pipelines and cloud resources for cost and performance efficiency
- Identify solutions to non-standard requests and problems
- Support and maintain the self-service BI warehouse
- Mentor and support existing on-prem developers in the cloud environment
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment).
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- Undergraduate degree or equivalent experience
- 4+ years of overall experience in data and analytics engineering
- 4+ years of experience working with Azure, Databricks, ADF, and Data Lake
- Solid experience working with data platforms and products using PySpark and Spark-SQL
- Solid experience with CI/CD tools such as Jenkins, GitHub, GitHub Actions, Maven, etc.
- In-depth understanding of Azure architecture and the ability to come up with efficient designs and solutions
- Highly proficient in Python and SQL
- Proven excellent communication skills

Preferred Qualifications:
- Snowflake and Airflow experience
- Power BI development experience
- Experience or knowledge of health care concepts: E&I, M&R, C&S LOBs, Claims, Members, Providers, Payers, Underwriting

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission. #NIC
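The "test framework for notebook jobs" responsibility above usually boils down to running data-quality assertions before a deployment is allowed. A minimal sketch of such checks, shown on plain Python rows rather than Spark DataFrames; all column names (`member_id`, `claim_amount`) are illustrative, not from the posting.

```python
# Hypothetical pre-deployment data-quality checks: missing columns,
# null values in required columns, and duplicate business keys.
def quality_checks(rows, key="member_id", required=("member_id", "claim_amount")):
    """Return a dict of failed checks; an empty dict means the data passed."""
    failures = {}
    missing = [c for c in required if rows and c not in rows[0]]
    if missing:
        failures["missing_columns"] = missing
    nulls = [r for r in rows if any(r.get(c) is None for c in required)]
    if nulls:
        failures["null_values"] = len(nulls)
    keys = [r.get(key) for r in rows]
    dups = {k for k in keys if keys.count(k) > 1}
    if dups:
        failures["duplicate_keys"] = sorted(dups)
    return failures

rows = [{"member_id": 1, "claim_amount": 100.0},
        {"member_id": 1, "claim_amount": 250.0},
        {"member_id": 2, "claim_amount": None}]
print(quality_checks(rows))  # → {'null_values': 1, 'duplicate_keys': [1]}
```

On Databricks the same checks would typically be expressed as Spark aggregations and wired into the CI/CD pipeline (Jenkins or GitHub Actions) as a gating step.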
Posted 2 weeks ago
2.0 - 3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: AI/GenAI Engineer
Job ID: POS-13731
Primary Skills: Databricks, ADF
Location: Hyderabad
Experience: 3.00
Secondary Skills: Python, LLM, LangChain, vectors, and AWS
Mode of Work: Work from Office
Experience: 2-3 years

About The Job
We are seeking a highly motivated and innovative Generative AI Engineer to join our team and drive the exploration of cutting-edge AI capabilities. You will be at the forefront of developing solutions using Generative AI technologies, primarily focusing on Large Language Models (LLMs) and foundation models, deployed on either the AWS or Azure cloud platform. This role involves rapid prototyping, experimentation, and collaboration with various stakeholders to assess the feasibility and potential impact of GenAI solutions on our business challenges. If you are passionate about the potential of GenAI and enjoy hands-on building in a fast-paced environment, this is the role for you.

Know Your Team
At ValueMomentum's Engineering Center, we are a team of passionate engineers who thrive on tackling complex business challenges with innovative solutions while transforming the P&C insurance value chain. We achieve this through a strong engineering foundation and by continuously refining our processes, methodologies, tools, agile delivery teams, and core engineering archetypes. Our core expertise lies in six key areas: Cloud Engineering, Application Engineering, Data Engineering, Core Engineering, Quality Engineering, and Domain expertise. Join a team that invests in your growth. Our Infinity Program empowers you to build your career with role-specific skill development, leveraging immersive learning platforms. You'll have the opportunity to showcase your talents by contributing to impactful projects.

Responsibilities
- Develop GenAI solutions: Develop, and rapidly iterate on, GenAI solutions leveraging LLMs and other foundation models available on AWS and/or Azure platforms.
- Cloud platform implementation: Utilize relevant cloud services (e.g., AWS SageMaker, Bedrock, Lambda, Step Functions; Azure Machine Learning, Azure OpenAI Service, Azure Functions) for model access, deployment, and data processing.
- Explore GenAI techniques: Experiment with and implement techniques like Retrieval-Augmented Generation (RAG), evaluating the feasibility of model fine-tuning or other adaptation methods for specific PoC requirements.
- API integration: Integrate GenAI models (via APIs from cloud providers, OpenAI, Hugging Face, etc.) into prototype applications and workflows.
- Data handling for AI: Prepare, manage, and process data required for GenAI tasks, such as data for RAG indexes, datasets for evaluating fine-tuning feasibility, or example data for few-shot prompting.
- Documentation and presentation: Clearly document PoC architectures, implementation details, findings, limitations, and results for both technical and non-technical audiences.

Requirements
- Overall, 2-3 years of experience
- Expert in Python, with advanced programming skills and concepts
- Solid understanding of Generative AI concepts, including LLMs, foundation models, prompt engineering, embeddings, and common architectures (e.g., RAG)
- Demonstrable experience working with at least one major cloud platform (AWS or Azure)
- Hands-on experience using cloud-based AI/ML services relevant to GenAI (e.g., AWS SageMaker, Bedrock; Azure Machine Learning, Azure OpenAI Service)
- Experience interacting with APIs, particularly AI/ML model APIs
- Bachelor's degree in computer science, AI, Data Science or equivalent practical experience

About The Company
Headquartered in New Jersey, US, ValueMomentum is the largest standalone provider of IT Services and Solutions to Insurers. Our industry focus, expertise in technology backed by R&D, and our customer-first approach uniquely position us to deliver the value we promise and drive momentum to our customers' initiatives.
ValueMomentum is amongst the top 10 insurance-focused IT services firms in North America by number of customers. Leading insurance firms trust ValueMomentum with their Digital, Data, Core, and IT Transformation initiatives.

Benefits
We at ValueMomentum offer you a congenial environment in which to work and grow, in the company of experienced professionals. Some benefits available to you are:
- Competitive compensation package
- Career advancement: individual career development, coaching and mentoring programs for professional and leadership skill development
- Comprehensive training and certification programs
- Performance management: goal setting, continuous feedback and year-end appraisal
- Reward and recognition for extraordinary performers
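The RAG technique named in the responsibilities above has a retrieval step that can be shown in a few lines. This sketch uses tiny hand-made vectors in place of real embeddings, which in practice would come from an embedding API such as Azure OpenAI or Bedrock; the corpus, vectors, and document titles are all invented for illustration.

```python
# Minimal RAG retrieval step: rank documents by cosine similarity of their
# embedding vectors to the query vector, then return the top k.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

# Toy "index": document title -> pretend 2-dimensional embedding.
corpus = {
    "policy renewal steps": [0.9, 0.1],
    "claims submission guide": [0.2, 0.8],
}

def retrieve(query_vec, k=1):
    """Return the k corpus documents most similar to the query vector."""
    ranked = sorted(corpus, key=lambda doc: cosine(query_vec, corpus[doc]),
                    reverse=True)
    return ranked[:k]

print(retrieve([0.85, 0.2]))  # → ['policy renewal steps']
```

In a full RAG pipeline the retrieved documents would then be stuffed into the LLM prompt as context; vector databases replace the dictionary, but the ranking logic is the same idea.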
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title/Skill: System Administrator - Oracle WebLogic and SOA Administrator
Experience: 4 - 8 years
Location: Hyderabad

Required Skills:
- Experience in Oracle WebLogic Server installation, configuration, performance tuning and troubleshooting
- Knowledge of and hands-on experience with Oracle FMW SOA/OSB administration, including installation, configuration, performance tuning and troubleshooting
- Experience in patching/upgrading Oracle FMW products (WebLogic, ADF, SOA, OSB, etc.)
- Experience in troubleshooting WebLogic and SOA/OSB logs and working with the Oracle team for resolution
Posted 2 weeks ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: Data Engineer
Location: Pune, India (Hybrid)
Type: Contract (6 months)
Experience: 5-8 years
Domain: Financial Services
Work Timing: Regular day shift
Background Check: Mandatory before onboarding

Job Description:
Seeking experienced Data Engineers with strong hands-on skills in SQL, Python, Azure Databricks, ADF, and PySpark. Candidates should have experience in data modeling, ETL design, big data technologies, and large-scale on-prem-to-cloud migrations using the Azure data stack.

Mandatory Skills:
- Azure Databricks
- Azure Data Factory
- Python
- PySpark

Preferred Skills:
- Spark, Kafka
- Azure Synapse, Azure SQL, Azure Data Lake, Azure Cosmos DB
- Batch and real-time ingestion
Posted 2 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Experience level: 5 to 10 years. Strong in Azure and ADF, Azure Databricks or Microsoft Fabric, and data pipelines. At least 3 years of relevant experience in Azure Databricks.
Posted 2 weeks ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
The Azure Big Data Engineer is an important role in which you are responsible for designing, implementing, and managing comprehensive big data solutions on the Azure platform. You will report to the Senior Manager, and the role requires you to work hybrid, with 2 days per week from the office in Hyderabad.

Responsibilities
- Design, implement, and maintain scalable and reliable data pipelines using Azure Data Factory, Databricks, Synapse Analytics, and Microsoft Fabric.
- Develop, configure, and optimize data lakes and warehouses on Azure using services like Azure Data Lake Storage (ADLS), Azure Lakehouse, and Warehouse, and monitor data pipelines for performance, scalability, and reliability.
- Collaborate with data scientists, architects, analysts, and other stakeholders to understand data requirements and deliver solutions that meet business needs.
- Ensure that data is secure and meets all regulatory compliance standards, including role-based access control (RBAC) and data encryption in Azure environments.
- Develop and configure monitoring and alerting mechanisms to proactively enhance performance and optimize data systems.
- Troubleshoot and resolve data-related issues in a timely manner.
- Produce clear, concise technical documentation for all developed solutions.

Requirements
- Experience with SSIS, SQL Jobs, BIDS and ADF.
- Experience with Azure services (Microsoft Fabric, Azure Synapse, Azure SQL Database, Azure Key Vault, etc.).
- Proficiency in Azure data services, including Azure Data Factory, Azure Databricks, Azure Synapse Analytics and Azure Data Lake Storage.
- Experience in data modeling, data architecture, and implementing ETL/ELT solutions.
- Proficiency in SQL and familiarity with other programming languages such as Python or Scala.
- Knowledge of data modeling, data warehousing, and big data technologies.
- Experience with data governance and security best practices.

About Experian
Experian is a global data and technology company, powering opportunities for people and businesses around the world.
We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, create marketing solutions, and gain deeper insights into the automotive market, all using our unique combination of data, analytics and software. We also assist millions of people to realize their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com.

Experience And Skills
- Bachelor's in computer science or a related field; Master's preferred
- 6+ years of professional data ingestion experience with ETL/ELT tools like SSIS, ADF, and Synapse
- 2+ years of Azure cloud experience
- Preferred: experience with Microsoft Fabric and Azure Synapse; understanding of ML/AI concepts; Azure certifications

Additional Information
Our uniqueness is that we truly celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what truly matters: DEI, work/life balance, development, authenticity, engagement, collaboration, wellness, reward & recognition, volunteering... the list goes on. Experian's strong people-first approach is award-winning: Great Place To Work™ in 24 countries, FORTUNE Best Companies to Work For, and Glassdoor Best Places to Work (globally, 4.4 stars), to name a few. Check out Experian Life on social or our Careers Site to understand why. Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success.
Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability, or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.

Experian Careers - Creating a better tomorrow together

Benefits
Experian cares for employees' work-life balance, health, safety, and wellbeing. In support of this endeavor, we offer best-in-class family well-being benefits, enhanced medical benefits, and paid time off. Find out what it's like to work for Experian by clicking here
Posted 2 weeks ago
8.0 years
0 Lacs
Itanagar, Arunachal Pradesh, India
On-site
Job Description
We are seeking a skilled and motivated Data Engineer with 8+ years of experience to join our team. The ideal candidate should be experienced in data engineering on Snowflake, Azure (ADF), Microsoft MDS, SQL, and data pipelines, with a focus on developing and maintaining data analytics solutions. You will collaborate with cross-functional teams to deliver high-quality data solutions that meet business requirements.

Required Skills And Experience
Bachelor's or Master's degree in computer science, data science, engineering, or a related field. 8+ years of experience in data engineering or related fields. Strong proficiency in SQL, Snowflake, stored procedures, and views. Hands-on experience with Snowflake SQL, ADF (Azure Data Factory), and Microsoft MDS (Master Data Services). Knowledge of data warehousing concepts. Experience with cloud platforms (Azure). Understanding of data modeling and data warehousing principles. Strong problem-solving and analytical skills, with attention to detail. Excellent communication and collaboration skills.

Bonus Skills
Exposure to CI/CD practices using Microsoft Azure DevOps. Basic knowledge or understanding of Power BI (PBI).

Key Responsibilities
Design, develop, and maintain scalable and efficient data pipelines using Azure Data Factory (ADF). Build and optimize data models and data warehousing solutions within Snowflake. Develop and maintain data integration processes, ensuring data quality and integrity. Utilize strong SQL skills to query, transform, and analyze data within Snowflake. Develop and manage stored procedures and views in Snowflake. Implement and manage master data using Microsoft Master Data Services (MDS). Collaborate with data analysts and business stakeholders to understand data requirements and deliver effective data solutions. Ensure the performance, reliability, and security of data pipelines and data warehousing systems. Troubleshoot and resolve data-related issues in a timely manner.
Stay up-to-date with the latest advancements in data engineering technologies, particularly within the Snowflake and Azure ecosystems. Contribute to the documentation of data pipelines, data models, and ETL processes (ref:hirist.tech)
Posted 2 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Summary
We are looking for a seasoned Senior ETL/DB Tester with deep expertise in data validation and database testing across modern data platforms. This role requires strong proficiency in SQL and experience with tools like Talend, ADF, Snowflake, and Power BI. The ideal candidate will be highly analytical, detail-oriented, and capable of working across cross-functional teams in a fast-paced data engineering environment.

Responsibilities:
Design, develop, and execute comprehensive test plans for ETL and database validation processes. Validate data transformations and integrity across multiple stages and systems (Talend, ADF, Snowflake, Power BI). Perform manual testing and defect tracking using Zephyr or Tosca. Analyze business and data requirements to ensure full test coverage. Write and execute complex SQL queries for data reconciliation. Identify data-related issues and conduct root cause analysis in collaboration with developers. Track and manage bugs and enhancements through appropriate tools. Optimize testing strategies for performance, scalability, and accuracy in ETL workflows.

Skills:
ETL Tools: Talend, ADF. Data Platforms: Snowflake. Reporting/Analytics: Power BI, VPI. Testing Tools: Zephyr or Tosca, manual testing. Strong SQL expertise for validating complex data.

Preferred Skills:
API testing exposure. Power BI advanced features (dashboards, DAX, data modelling). (ref:hirist.tech)
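The reconciliation work described above typically boils down to comparing row counts, checksums, and key coverage between a source and a target system. The following is a minimal, hypothetical sketch of that pattern using Python's built-in sqlite3 as a stand-in for the real source and target databases (in this role those would be Talend/ADF outputs landing in Snowflake); the table and column names are illustrative only.

```python
import sqlite3

def reconcile(conn, source_table, target_table, key_col, value_col):
    """Compare row counts and a simple SUM checksum between two tables,
    and list keys present in the source but missing from the target."""
    cur = conn.cursor()
    counts, checksums = {}, {}
    for table in (source_table, target_table):
        cur.execute(f"SELECT COUNT(*), COALESCE(SUM({value_col}), 0) FROM {table}")
        counts[table], checksums[table] = cur.fetchone()
    # Anti-join: keys loaded into the source but absent from the target,
    # a typical reconciliation query an ETL tester would run.
    cur.execute(
        f"SELECT s.{key_col} FROM {source_table} s "
        f"LEFT JOIN {target_table} t ON s.{key_col} = t.{key_col} "
        f"WHERE t.{key_col} IS NULL"
    )
    missing = [row[0] for row in cur.fetchall()]
    return {
        "count_match": counts[source_table] == counts[target_table],
        "checksum_match": checksums[source_table] == checksums[target_table],
        "missing_in_target": missing,
    }

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, amount REAL);
    CREATE TABLE tgt (id INTEGER, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.0);
""")
result = reconcile(conn, "src", "tgt", "id", "amount")
print(result)  # row 3 never reached the target, so counts and checksums differ
```

The same three checks (counts, checksums, anti-join on the business key) translate directly into Snowflake SQL for production-scale validation.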
Posted 2 weeks ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Databricks Data Engineer

Key Responsibilities:
Design, develop, and maintain high-performance data pipelines using Databricks and Apache Spark. Implement medallion architecture (Bronze, Silver, Gold layers) for efficient data processing. Optimize Delta Lake tables, partitioning, Z-ordering, and performance tuning in Databricks. Develop ETL/ELT processes using PySpark, SQL, and Databricks Workflows. Manage Databricks clusters, jobs, and notebooks for batch and real-time data processing. Work with Azure Data Lake, AWS S3, or GCP Cloud Storage for data ingestion and storage. Implement CI/CD pipelines for Databricks jobs and notebooks using DevOps tools. Monitor and troubleshoot performance bottlenecks, cluster optimization, and cost management. Ensure data quality, governance, and security using Unity Catalog, ACLs, and encryption. Collaborate with Data Scientists, Analysts, and Business Teams to deliver data solutions.

Skills & Experience:
5+ years of hands-on experience in Databricks, Apache Spark, and Delta Lake. Strong SQL, PySpark, and Python programming skills. Experience in Azure Data Factory (ADF), AWS Glue, or GCP Dataflow. Expertise in performance tuning, indexing, caching, and parallel processing. Hands-on experience with Lakehouse architecture and Databricks SQL. Strong understanding of data governance, lineage, and cataloging (e.g., Unity Catalog). Experience with CI/CD pipelines (Azure DevOps, GitHub Actions, or Jenkins). Familiarity with Airflow, Databricks Workflows, or orchestration tools. Strong problem-solving skills with experience in troubleshooting Spark jobs.

Nice To Have:
Hands-on experience with Kafka, Event Hubs, or real-time streaming in Databricks. Certifications in Databricks, Azure, AWS, or GCP. (ref:hirist.tech)
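The medallion architecture mentioned above layers data as raw (Bronze), cleaned and deduplicated (Silver), and aggregated for consumption (Gold). In Databricks each layer would be a Delta Lake table transformed with PySpark; the plain-Python sketch below only illustrates the layering idea with made-up records, not the actual Spark APIs.

```python
from collections import defaultdict

# Bronze: raw events landed as-is, duplicates and bad rows included.
bronze = [
    {"order_id": 1, "region": "EU", "amount": "100"},
    {"order_id": 1, "region": "EU", "amount": "100"},   # duplicate delivery
    {"order_id": 2, "region": "US", "amount": "250"},
    {"order_id": 3, "region": "US", "amount": None},    # bad record
]

# Silver: deduplicate on the business key, drop invalid rows, cast types.
seen, silver = set(), []
for rec in bronze:
    if rec["amount"] is None or rec["order_id"] in seen:
        continue
    seen.add(rec["order_id"])
    silver.append({**rec, "amount": float(rec["amount"])})

# Gold: business-level aggregate ready for reporting.
gold = defaultdict(float)
for rec in silver:
    gold[rec["region"]] += rec["amount"]

print(dict(gold))  # {'EU': 100.0, 'US': 250.0}
```

The value of the pattern is that each layer has a single responsibility, so quality checks and Z-ordering/partitioning optimizations can be applied per layer rather than in one monolithic job.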
Posted 2 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Years of Experience: 5

Job Description
We are looking for a skilled and experienced Senior Azure Developer to join our team! As part of the team, you will be involved in the implementation of ongoing and new initiatives for our company. If you love learning, thinking strategically, innovating, and helping others, this job is for you!

Primary Skills: ADF, Databricks
Secondary Skills:

Responsibilities:
Translate functional specifications and change requests into technical specifications. Translate business requirement documents, functional specifications, and technical specifications into related coding. Develop efficient code with unit testing and code documentation. Ensure accuracy and integrity of data and applications through analysis, coding, documenting, testing, and problem solving. Set up the development environment and configure the development tools. Communicate with all project stakeholders on the project status. Manage, monitor, and ensure the security and privacy of data to satisfy business needs. Contribute to the automation of modules, wherever required. Be proficient in written, verbal, and presentation communication (English). Coordinate with the UAT team.

Role Requirements
Proficient in basic and advanced SQL programming concepts (procedures, analytical functions, etc.). Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.).
Knowledgeable in Shell/PowerShell scripting. Knowledgeable in relational databases, non-relational databases, data streams, and file stores. Knowledgeable in performance tuning and optimization. Experience in data profiling and data validation. Experience in requirements gathering and documentation processes and in performing unit testing. Understanding and implementing QA and various testing processes in the project. Knowledge of any BI tool is an added advantage. Sound aptitude, outstanding logical reasoning, and analytical skills. Willingness to learn and take initiative. Ability to adapt to a fast-paced Agile environment.

Additional Requirements
Demonstrated expertise as a Data Engineer, specializing in Azure cloud services. Highly skilled in Azure Data Factory, Azure Data Lake, Azure Databricks, and Azure Synapse Analytics. Create and execute efficient, scalable, and dependable data pipelines utilizing Azure Data Factory. Utilize Azure Databricks for data transformation and processing. Effectively oversee and enhance data storage solutions, emphasizing Azure Data Lake and other Azure storage services. Construct and uphold workflows for data orchestration and scheduling using Azure Data Factory or equivalent tools. Proficient in programming languages like Python and SQL, and conversant with pertinent scripting languages. (ref:hirist.tech)
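The slowly changing dimensions concept listed in the role requirements above is most often implemented as SCD Type 2: instead of overwriting a changed attribute, the current row is closed out and a new current row is appended, preserving history. This is a minimal, hypothetical in-memory sketch of that logic; in ADF or Databricks it would be expressed as a MERGE against a dimension table, and all field names here are illustrative.

```python
from datetime import date

def scd2_upsert(dim_rows, business_key, incoming, today):
    """Apply one incoming record to a Type-2 dimension: close the
    current row if a tracked attribute changed, then append a new
    current row. dim_rows is mutated in place."""
    current = next(
        (r for r in dim_rows
         if r[business_key] == incoming[business_key] and r["is_current"]),
        None,
    )
    if current is not None:
        tracked = {k: v for k, v in incoming.items() if k != business_key}
        if all(current.get(k) == v for k, v in tracked.items()):
            return  # no attribute changed, keep history as-is
        current["is_current"] = False
        current["valid_to"] = today
    dim_rows.append({**incoming, "valid_from": today,
                     "valid_to": None, "is_current": True})

dim = [{"customer_id": 42, "city": "Pune", "valid_from": date(2023, 1, 1),
        "valid_to": None, "is_current": True}]
scd2_upsert(dim, "customer_id",
            {"customer_id": 42, "city": "Chennai"}, date(2024, 6, 1))
print([(r["city"], r["is_current"]) for r in dim])
# [('Pune', False), ('Chennai', True)]
```

Both the old and new city values survive, with validity dates marking when each was true, which is exactly what reporting over historical dimensions requires.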
Posted 2 weeks ago
7.0 years
0 Lacs
Greater Kolkata Area
Remote
Job Title: Senior Data Engineer (Azure, ETL, Snowflake). Experience: 7+ yrs. Location: Remote.

Job Summary
We are seeking a highly skilled and experienced Senior Data Engineer with a strong background in ETL processes, cloud data platforms (Azure), Snowflake, SQL, and Python scripting. The ideal candidate will have hands-on experience building robust data pipelines, performing data ingestion from multiple sources, and working with modern data tools like ADF, Databricks, Fivetran, and DBT.

Key Responsibilities
Develop and maintain end-to-end data pipelines using Azure Data Factory, Databricks, and Snowflake. Write optimized SQL queries, stored procedures, and views to transform and retrieve data. Perform data ingestion and integration from various formats including JSON, XML, Parquet, TXT, XLSX, etc. Work on data mapping, modelling, and transformation tasks across multiple data sources. Build and deploy custom connectors using Python, PySpark, or ADF. Implement and manage Snowflake as a data storage and processing solution. Collaborate with cross-functional teams to ensure code promotion and versioning using GitHub. Ensure smooth cloud migration and data pipeline deployment using Azure services. Work with Fivetran and DBT for ingestion and transformation as required. Participate in Agile/Scrum ceremonies and follow DevSecOps practices.

Mandatory Skills & Qualifications
7+ years of experience in Data Engineering, ETL development, or similar roles. Proficient in SQL with a strong understanding of joins, filters, and aggregations. Solid programming skills in Python (functions, loops, API requests, JSON parsing, etc.). Strong experience with ETL tools such as Informatica, Talend, Teradata, or DataStage. Experience with Azure cloud services, specifically Azure Data Factory (ADF), Databricks, and Azure Data Lake. Hands-on experience in Snowflake implementation (ETL or storage layer). Familiarity with data modelling, data mapping, and pipeline creation.
Experience working with semi-structured/unstructured data formats. Working knowledge of GitHub for version control and code management.

Good To Have / Preferred Skills
Experience using Fivetran and DBT for ingestion and transformation. Knowledge of AWS or GCP cloud environments. Familiarity with DevSecOps processes and CI/CD pipelines within Azure. Proficiency in Excel and macros. Exposure to Agile methodologies (Scrum/Kanban). Understanding of custom connector creation using PySpark or ADF.

Soft Skills
Strong analytical and problem-solving skills. Effective communication and teamwork abilities. Ability to work independently and take ownership of deliverables. Detail-oriented with a commitment to quality.

Why Join Us?
Work on modern, cloud-based data platforms. Exposure to a diverse tech stack and new-age data tools. Flexible remote working opportunity aligned with a global team. Opportunity to work on critical enterprise-level data solutions. (ref:hirist.tech)
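The "custom connectors using Python" responsibility above usually means paging through a source API and parsing JSON until the data is exhausted. Below is a minimal, hypothetical sketch of that loop; the HTTP call is injected as a function so the pattern is testable offline (in a real connector it would be an HTTP client call, and `fetch_page`, `SOURCE`, and the payload shape are all assumptions, not a real API).

```python
import json

def pull_all_pages(fetch_page, page_size=2):
    """Generic paginated pull: keep requesting pages until the source
    returns fewer rows than the page size. fetch_page(offset, limit)
    returns a JSON string, as an HTTP response body would."""
    offset, records = 0, []
    while True:
        payload = json.loads(fetch_page(offset, page_size))
        rows = payload["data"]
        records.extend(rows)
        if len(rows) < page_size:
            return records
        offset += page_size

# Stand-in for a real HTTP endpoint, for demonstration only.
SOURCE = [{"id": i} for i in range(5)]
def fake_fetch(offset, limit):
    return json.dumps({"data": SOURCE[offset:offset + limit]})

print(len(pull_all_pages(fake_fetch)))  # 5
```

Injecting the fetch function also makes the connector easy to unit test and to retarget between REST sources without touching the paging logic.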
Posted 2 weeks ago
7.0 years
0 Lacs
Greater Kolkata Area
On-site
Job Description
7+ years of hands-on experience with Azure (ADF pipelines), Databricks, data lake, Python, and SQL/Snowflake [Must Have]. Hands-on experience with data architecture and design techniques (local/abstract), including API techniques/development [Must Have]. Hands-on experience with real-time data pipelining technologies such as FastAPI [Must Have]. Experience with Delta Lake is preferred. Experience with architecting, designing, and developing big-data processing pipelines. Extensive experience with data platforms: working with large data sets, large amounts of data in motion, and numerous big data technologies. Experience with cloud-based software, specifically Microsoft Azure (including but not limited to Databricks and Azure Functions). Experience with Agile project delivery techniques (e.g., Scrum, Kanban). Good interpersonal, communication, facilitation, and presentation skills. Comfortable communicating with business colleagues and working in cross-functional global teams. Ability to work under pressure with tight deadlines while balancing multiple priorities. Analytical, troubleshooting, and problem-solving skills. Excellent analytical and problem-solving skills, with willingness to take ownership and resolve technical challenges. Excellent communication and stakeholder management skills. (ref:hirist.tech)
Posted 2 weeks ago
7.0 - 14.0 years
0 Lacs
Greater Kolkata Area
On-site
Key Responsibilities
Develop and optimize complex SQL queries, including joins (inner/outer), filters, and aggregations. Work with diverse datasets from multiple database sources, ensuring data quality and integrity. Leverage Python for data manipulation, including functions, iterations, API requests, and JSON flattening. Use Python to interpret, manipulate, and process data to facilitate downstream analysis. Design, implement, and optimize ETL processes and workflows. Manage data ingestion from various formats (e.g., JSON, Parquet, TXT, XLSX) using tools like Informatica, Teradata, DataStage, Talend, and Snowflake. Demonstrate expertise in Azure services, specifically ADF, Databricks, and Azure Data Lake. Create, manage, and optimize cloud-based data pipelines. Integrate data sources via Fivetran or custom connectors (e.g., PySpark, ADF). Lead the implementation of Snowflake as an ETL and storage layer. Ensure seamless data connectivity, including handling semi-structured/unstructured data. Promote code and manage changes across various environments. Proficient in writing complex SQL scripts, including stored procedures, views, and functions. Hands-on experience with Snowflake in multiple projects. Familiarity with DBT for transformation logic and Fivetran for data ingestion. Strong understanding of data modeling and data warehousing fundamentals. Experience with GitHub for version control and code management.

Skills & Experience:
7 to 14 years of experience in Data Engineering, with a focus on SQL, Python, ETL, and cloud technologies. Hands-on experience with Snowflake implementation and data pipeline management. In-depth understanding of Azure cloud tools and services, such as ADF, Databricks, and Azure Data Lake. Expertise in designing and managing ETL workflows, data mapping, and ingestion from multiple data sources/formats. Proficient in Python for data interpretation, manipulation, and automation tasks.
Strong knowledge of SQL, including advanced techniques such as stored procedures and functions. Experience with GitHub for version control and collaborative development.

Nice to Have:
Experience with other cloud platforms (e.g., AWS, GCP). Familiarity with DataOps and continuous integration/continuous delivery (CI/CD) practices. Prior experience leading or mentoring teams of data engineers. (ref:hirist.tech)
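The JSON flattening task mentioned in the responsibilities above is a common pre-step before loading semi-structured API payloads into relational tables: nested objects and arrays are collapsed into a single level with compound column names. A minimal sketch (the key separator and sample record are arbitrary choices, not a fixed standard):

```python
def flatten_json(obj, parent_key="", sep="_"):
    """Recursively flatten nested dicts and lists into a single-level
    dict with compound keys, e.g. {'a': {'b': 1}} -> {'a_b': 1}."""
    items = {}
    if isinstance(obj, dict):
        for k, v in obj.items():
            new_key = f"{parent_key}{sep}{k}" if parent_key else str(k)
            items.update(flatten_json(v, new_key, sep))
    elif isinstance(obj, list):
        # Array elements get a positional index in the key.
        for i, v in enumerate(obj):
            items.update(flatten_json(v, f"{parent_key}{sep}{i}", sep))
    else:
        items[parent_key] = obj
    return items

record = {"order": {"id": 7, "items": [{"sku": "A"}, {"sku": "B"}]}}
print(flatten_json(record))
# {'order_id': 7, 'order_items_0_sku': 'A', 'order_items_1_sku': 'B'}
```

The flattened keys map naturally onto table columns, which is why this shape is convenient before staging data into Snowflake or a warehouse layer.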
Posted 2 weeks ago
5.0 years
0 Lacs
Greater Kolkata Area
On-site
Job Description

Responsibilities:
Architect and design end-to-end data solutions on a cloud platform, focusing on data warehousing and big data platforms. Collaborate with clients, developers, and architecture teams to understand requirements and translate them into effective data solutions. Develop high-level and detailed data architecture and design documentation. Implement data management and data governance strategies, ensuring compliance with industry standards. Architect both batch and real-time data solutions, leveraging cloud-native services and technologies. Design and manage data pipeline processes for historic data migration and data integration. Collaborate with business analysts to understand domain data requirements and incorporate them into the design deliverables. Drive innovation in data analytics by leveraging cutting-edge technologies and methodologies. Demonstrate excellent verbal and written communication skills to communicate complex ideas and concepts effectively. Stay updated on the latest advancements in data analytics, data architecture, and data management techniques.

Requirements
Minimum of 5 years of experience in a Data Architect role, supporting warehouse and cloud data platforms/environments (Azure). Extensive experience with common Azure services such as ADLS, Synapse, Databricks, Azure SQL, etc. Experience with Azure services such as ADF, PolyBase, and Azure Stream Analytics. Proven expertise in Databricks architecture, Delta Lake, Delta Sharing, Unity Catalog, data pipelines, and Spark tuning. Strong knowledge of Power BI architecture, DAX, and dashboard optimization. In-depth experience with SQL, Python, and/or PySpark. Hands-on knowledge of data governance, lineage, and cataloging tools such as Azure Purview and Unity Catalog. Experience in implementing CI/CD pipelines for data and BI components (e.g., using DevOps or GitHub). Experience building semantic models in Power BI.
Strong expertise in data exploration using SQL and a deep understanding of data relationships. Extensive knowledge and implementation experience in data management, governance, and security frameworks. Proven experience in creating high-level and detailed data architecture and design documentation. Strong aptitude for business analysis to understand domain data requirements. Proficiency in data modeling using any modeling tool for conceptual, logical, and physical models is preferred. Hands-on experience with architecting end-to-end data solutions for both batch and real-time designs. Ability to collaborate effectively with clients, developers, and architecture teams to implement enterprise-level data solutions. Familiarity with Data Fabric and Data Mesh architecture is a plus. Excellent verbal and written communication skills. (ref:hirist.tech)
Posted 2 weeks ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company Description
Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, create marketing solutions, and gain deeper insights into the automotive market, all using our unique combination of data, analytics, and software. We also assist millions of people to realize their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com.

Job Description
The Azure Big Data Engineer is an important role in which you are responsible for designing, implementing, and managing comprehensive big data solutions on the Azure platform. You will report to the Senior Manager; this is a hybrid role requiring two days per week in our Hyderabad office.

Responsibilities
Design, implement, and maintain scalable and reliable data pipelines using Azure Data Factory, Databricks, Synapse Analytics, and Microsoft Fabric. Develop, configure, and optimize data lakes and warehouses on Azure using services such as Azure Data Lake Storage (ADLS), Azure Lakehouse, and Warehouse, and monitor data pipelines for performance, scalability, and reliability. Collaborate with data scientists, architects, analysts, and other stakeholders to understand data requirements and deliver solutions that meet business needs. Ensure that data is secure and meets all regulatory compliance standards, including role-based access control (RBAC) and data encryption in Azure environments.
Develop and configure monitoring and alerting mechanisms to proactively enhance performance and optimize data systems. Troubleshoot and resolve data-related issues in a timely manner. Produce clear, concise technical documentation for all developed solutions.

Requirements
Experience with SSIS, SQL Jobs, BIDS, and ADF. Experience with Azure services (Microsoft Fabric, Azure Synapse, Azure SQL Database, Azure Key Vault, etc.). Proficiency in Azure data services, including Azure Data Factory, Azure Databricks, Azure Synapse Analytics, and Azure Data Lake Storage. Experience in data modeling, data architecture, and implementing ETL/ELT solutions. Proficiency in SQL and familiarity with other programming languages such as Python or Scala. Knowledge of data modeling, data warehousing, and big data technologies. Experience with data governance and security best practices.

Qualifications
Bachelor's degree in computer science or a related field; Master's preferred. 6+ years of professional data ingestion experience with ETL/ELT tools like SSIS, ADF, and Synapse. 2+ years of Azure cloud experience. Experience with Microsoft Fabric and Azure Synapse. Understanding of ML/AI concepts. Azure certifications.

Additional Information
Our uniqueness is that we truly celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what truly matters: DEI, work/life balance, development, authenticity, engagement, collaboration, wellness, reward & recognition, volunteering... the list goes on. Experian's strong people-first approach is award-winning: Great Place To Work™ in 24 countries, FORTUNE Best Companies to Work For, and Glassdoor Best Places to Work (globally, 4.4 stars), to name a few. Check out Experian Life on social or our Careers Site to understand why. Experian is proud to be an Equal Opportunity and Affirmative Action employer.
Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability, or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.

Experian Careers - Creating a better tomorrow together

Benefits
Experian cares for employees' work-life balance, health, safety, and wellbeing. In support of this endeavor, we offer best-in-class family well-being benefits, enhanced medical benefits, and paid time off. Find out what it's like to work for Experian by clicking here
Posted 3 weeks ago
5.0 - 10.0 years
10 - 20 Lacs
Hyderabad, Gurugram
Work from Office
Job Description

About the Company:
Headquartered in California, U.S.A., GSPANN provides consulting and IT services to global clients. We help clients transform how they deliver business value by helping them optimize their IT capabilities, practices, and operations with our experience in retail, high-technology, and manufacturing. With five global delivery centers and 1900+ employees, we provide the intimacy of a boutique consultancy with the capabilities of a large IT services firm.

Role: Azure Data Engineer. Experience: 4+ Years. Skill Set: Azure Synapse, PySpark, ADF, and SQL. Location: Pune, Hyderabad, Gurgaon.

5+ years of experience in software development, technical operations, and running large-scale applications. 4+ years of experience in developing or supporting Azure Data Factory (API/APIM), Azure Databricks, Azure DevOps, Azure Data Lake Storage (ADLS), SQL and Synapse data warehouse, and Azure Cosmos DB. 2+ years of experience working in data engineering. Any experience in data virtualization products like Denodo is desirable. Azure Data Engineer or Solutions Architect certification is desirable. Should have a good understanding of container platforms like Docker and Kubernetes. Should be able to assess the application/platform from time to time for architectural improvements and provide inputs to the relevant teams. Very good troubleshooting skills (quick identification of application issues and providing quick resolutions with no or minimal user/business impact). Hands-on experience working with high-volume, mission-critical applications. Deep appreciation of IT tools, techniques, systems, and solutions. Excellent communication skills, along with experience driving triage calls involving different technical stakeholders. Creative problem-solving skills for cross-functional issues amidst changing priorities. Should be flexible and resourceful in swiftly managing changing operational goals and demands.
Good experience in handling escalations, taking complete responsibility and ownership of all critical issues to reach technical/logical closure. Good understanding of the IT Infrastructure Library (ITIL) framework and the various IT Service Management (ITSM) tools available in the marketplace.
Posted 3 weeks ago
3.0 - 6.0 years
5 - 8 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Work from Office
About the Role:
We're hiring 2 Cloud & Data Engineering Specialists to join our fast-paced, agile team. These roles are focused on designing, developing, and scaling modern, cloud-based data engineering solutions using tools like Azure, AWS, GCP, Databricks, Kafka, PySpark, SQL, Snowflake, and ADF.

Position 1: Cloud & Data Engineering Specialist (Resource 1)

Key Responsibilities:
Develop and manage cloud-native solutions on Azure or AWS. Build real-time streaming apps with Kafka. Engineer services using Java and Python. Deploy and manage Kubernetes-based containerized applications. Process big data using Databricks. Administer SQL Server and Snowflake databases, and write advanced SQL. Utilize Unix/Linux for system operations.

Must-Have Skills:
Azure or AWS cloud experience. Kafka, Java, Python, Kubernetes. Databricks, SQL Server, Snowflake. Unix/Linux commands.

Location: Remote - Bengaluru, Hyderabad, Delhi/NCR, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
Posted 3 weeks ago
7.0 - 12.0 years
6 - 16 Lacs
Bengaluru
Remote
5+ years' experience with strong SQL query and development skills. Hands-on experience with ETL tools. Experience working in the healthcare industry with PHI/PII.
Posted 3 weeks ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Description
Full Stack Engineer / .Net, Nightwatch, Azure, Angular/React / Azure Data Sources (Azure SQL, ADF, Cosmos, Mongo, etc.)

Requirements
Full Stack Engineer / .Net, Nightwatch, Azure, Angular/React / Azure Data Sources (Azure SQL, ADF, Cosmos, Mongo, etc.)

Job responsibilities
Full Stack Engineer / .Net, Nightwatch, Azure, Angular/React / Azure Data Sources (Azure SQL, ADF, Cosmos, Mongo, etc.)

What we offer
Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders.

Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.

Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today.

Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life.
Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way! High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do. About GlobalLogic GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world’s largest and most forward-thinking companies. Since 2000, we’ve been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
Posted 3 weeks ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Description
Full Stack Engineer / .Net, Nightwatch, Azure, Angular/React / Azure Data Sources (Azure SQL, ADF, Cosmos, Mongo, etc.)

Requirements
Full Stack Engineer / .Net, Nightwatch, Azure, Angular/React / Azure Data Sources (Azure SQL, ADF, Cosmos, Mongo, etc.)

Job responsibilities
Full Stack Engineer / .Net, Nightwatch, Azure, Angular/React / Azure Data Sources (Azure SQL, ADF, Cosmos, Mongo, etc.)

What we offer
Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you'll experience an inclusive culture of acceptance and belonging, where you'll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders.
Learning and development. We are committed to your continuous learning and development. You'll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.
Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you'll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what's possible and bring new solutions to market. In the process, you'll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today.
Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way!
High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you're placing your trust in a safe, reliable, and ethical global company. Integrity and trust are cornerstones of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do.

About GlobalLogic
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world's largest and most forward-thinking companies. Since 2000, we've been at the forefront of the digital revolution, helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.