
56639 Azure Jobs - Page 20

JobPe aggregates listings for easy access, but you apply directly on the job portal.

5.0 - 9.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

As a Senior Data Analytics Engineer at Ajmera Infotech Private Limited (AIPL), you will power mission-critical decisions with governed insights. Ajmera Infotech builds planet-scale software for NYSE-listed clients in highly regulated domains (HIPAA, FDA, SOC 2). Our team of 120 engineers delivers production-grade systems that provide strategic advantage through data-driven decision-making. You will build end-to-end analytics solutions, from lakehouse pipelines to real-time dashboards, applying fail-safe engineering practices: TDD, CI/CD, DAX optimization, Unity Catalog, and cluster tuning. The stack includes Databricks, PySpark, Delta Lake, Power BI, and Airflow. AIPL fosters a mentorship culture where you can lead code reviews, share best practices, and grow as a domain expert, helping enterprises migrate legacy analytics onto cloud-native, governed platforms with a compliance-first mindset in HIPAA-aligned environments.

Key Responsibilities:
- Build scalable pipelines using SQL, PySpark, and Delta Live Tables on Databricks.
- Orchestrate workflows with Databricks Workflows or Airflow; implement SLA-backed retries and alerting.
- Design dimensional models (star/snowflake) with Unity Catalog and Great Expectations validation.
- Deliver robust Power BI solutions, including dashboards, semantic layers, and paginated reports, with a focus on DAX optimization.
- Migrate legacy SSRS reports to Power BI with zero loss of logic or governance.
- Optimize compute and cost through cache tuning, partitioning, and capacity monitoring.
- Document pipeline logic, RLS rules, and related artifacts in Git-controlled formats.
- Collaborate cross-functionally to convert product analytics needs into resilient BI assets.
- Champion mentorship by reviewing notebooks and dashboards and sharing platform standards.

Must-Have Skills:
- 5+ years in analytics engineering, with 3+ years in production Databricks/Spark contexts.
- Advanced SQL (including window functions), expert PySpark, Delta Lake, and Unity Catalog.
- Mastery of Power BI, including DAX optimization, security rules, and paginated reports.
- Experience with SSRS-to-Power BI migration, including RDL logic replication.
- Strong Git and CI/CD familiarity, and cloud platform know-how (Azure/AWS).
- Excellent communication skills to bridge technical and business audiences.

Nice-to-Have Skills:
- Databricks Data Engineer Associate certification.
- Experience with streaming pipelines (Kafka, Structured Streaming).
- Familiarity with data quality frameworks such as dbt or Great Expectations.
- BI breadth: experience with Tableau, Looker, or similar platforms.
- Knowledge of cost governance (Power BI Premium capacity, Databricks chargeback).

Join us at AIPL and enjoy a competitive salary package with performance-based bonuses, along with comprehensive health insurance for you and your family.
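The dimensional-modeling responsibility above (star/snowflake schemas) can be sketched in miniature. This is not AIPL's actual schema: sqlite3 stands in for the warehouse, and all table and column names are illustrative.

```python
import sqlite3

# In-memory database stands in for the warehouse; schema names are illustrative.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, iso_date TEXT, month TEXT);
CREATE TABLE fact_sales  (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    units       INTEGER,
    revenue     REAL
);
""")
con.executemany("INSERT INTO dim_product VALUES (?,?,?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
con.executemany("INSERT INTO dim_date VALUES (?,?,?)",
                [(20240101, "2024-01-01", "2024-01"), (20240102, "2024-01-02", "2024-01")])
con.executemany("INSERT INTO fact_sales VALUES (?,?,?,?)",
                [(1, 20240101, 3, 30.0), (2, 20240101, 1, 25.0), (1, 20240102, 2, 20.0)])

# Typical star-schema query: join the fact to its dimensions, aggregate by attributes.
rows = con.execute("""
    SELECT p.category, d.month, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_product p USING (product_key)
    JOIN dim_date d USING (date_key)
    GROUP BY p.category, d.month
""").fetchall()
print(rows)  # [('Hardware', '2024-01', 75.0)]
```

The same shape scales up: facts hold additive measures keyed by surrogate keys, dimensions hold descriptive attributes that queries group by.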

Posted 13 hours ago


7.0 - 12.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Full Stack Developer at VBeyond Corporation, you will lead the development of end-to-end solutions using Angular on the frontend and Python (Django or Flask) on the backend. You will architect and implement scalable, performant applications; collaborate with cross-functional teams; write clean, efficient code; and oversee code quality to ensure adherence to best practices. You will design, implement, and maintain RESTful APIs and integrations, and lead, mentor, and guide a team of junior developers. Troubleshooting and debugging issues across frontend and backend systems, ensuring application performance, scalability, and reliability across all environments, and participating in code sprints and project milestones are also key responsibilities.

To succeed in this role, you should have 7-12 years of professional full-stack experience focused on Angular and Python. Proficiency in Angular (Angular CLI, HTML5, CSS3, JavaScript, responsive design principles) is required, along with experience in state management libraries such as NgRx or Akita and backend frameworks such as Django, Flask, or FastAPI. Knowledge of databases (PostgreSQL, MySQL, or MongoDB), caching solutions (Redis or Memcached), and version control with Git is essential. Experience with Agile/Scrum methodologies, CI/CD pipelines, DevOps practices, cloud platforms, containerization, and orchestration tools will be beneficial, as will leadership skills for mentoring developers, collaborating with stakeholders, and ensuring successful project delivery.

Preferred qualifications include experience with GraphQL, serverless architectures, and test-driven development practices. Overall, you will play a vital role in delivering cutting-edge solutions and maintaining high-quality software development practices throughout the project lifecycle.

Posted 13 hours ago


5.0 - 9.0 years

0 Lacs

Haryana

On-site

You will be responsible for API-related tasks such as defining quotas, implementing security measures, enforcing governance policies, and ensuring system resilience. Expertise in managing API gateways such as Kong, Apigee, Tyk, or Istio is required, along with proficiency in cloud platforms (GCP or Azure) and experience operating Kubernetes clusters. Hands-on experience with JWT, OAuth2, and OpenID Connect is required; familiarity with Redis, New Relic, IAM, and RBAC will help you fulfill your responsibilities effectively. The role is based in Gurgaon. If you are passionate about API management, cloud technologies, and the security and performance of IT systems, join our team and contribute to the success of our projects.
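The JWT requirement above comes down to signed tokens that a gateway can verify without a database lookup. A minimal sketch of HS256 signing and verification using only the standard library (the secret and claims are made up; in production a vetted library and key management would be used):

```python
import base64, hashlib, hmac, json

def b64url(data: bytes) -> str:
    # JWTs use unpadded URL-safe base64 for all three segments.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: bytes) -> str:
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = (b64url(json.dumps(header, separators=(",", ":")).encode())
                     + "." +
                     b64url(json.dumps(payload, separators=(",", ":")).encode()))
    sig = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + b64url(sig)

def verify_jwt(token: str, secret: bytes) -> bool:
    signing_input, _, sig = token.rpartition(".")
    expected = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(b64url(expected), sig)

token = sign_jwt({"sub": "user-123", "scope": "read"}, b"demo-secret")
print(verify_jwt(token, b"demo-secret"))   # True
print(verify_jwt(token, b"wrong-secret"))  # False
```

A gateway enforcing quotas typically verifies the token like this on every request, then keys its rate counters on the `sub` claim.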

Posted 13 hours ago


1.0 years

0 Lacs

Mohali, Punjab

On-site

Job Title: .NET Developer (1+ Year Experience)
Location: D 179, Phase 8B, Industrial Area, Sector 74, Sahibzada Ajit Singh Nagar, Punjab 160055
Job Type: Full-time, 5 working days, 11 AM to 8 PM

About Us: Ensuesoft helps brands adopt the best technologies, tools, platforms, and practices to smooth their digital transformation journey. We are committed to delivering high-quality products and services to our clients. We specialise in ASP.NET, MSSQL, MySQL, MVC, Angular.js, cryptocurrency, PHP, web designing, WPF, WCF, Web APIs, Windows Server, Azure, DevExpress, ASP.NET Core, Shopify web and apps, Selenium, Android applications, and iOS applications.

Position Overview: We are seeking a passionate and motivated .NET Developer to join our dynamic development team. You will be involved in designing, developing, and maintaining web-based applications using the .NET framework. This is an excellent opportunity for early-career developers to gain hands-on experience and grow their software development skills.

Key Responsibilities:
- Develop and maintain web applications, services, and components using .NET technologies (C#, ASP.NET, .NET Core, MVC, Web API).
- Collaborate with cross-functional teams to define, design, and ship new features.
- Write clean, scalable, and efficient code following best practices and design patterns.
- Participate in code reviews and ensure adherence to development standards and guidelines.
- Troubleshoot and resolve application issues, bugs, and performance bottlenecks.
- Ensure application quality through unit testing and debugging.
- Assist with database design and optimization (SQL Server, Entity Framework).
- Continuously learn and apply new technologies and techniques to enhance application performance.
- Document development processes, code changes, and technical specifications.
- Participate in Agile development cycles, including sprint planning, stand-ups, and retrospectives.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
- 1+ years of experience in .NET development (C#, ASP.NET, .NET Core).
- Strong understanding of object-oriented programming (OOP) principles.
- Familiarity with front-end technologies (HTML, CSS, JavaScript, jQuery).
- Experience with relational databases (SQL Server, MySQL) and ORM frameworks like Entity Framework.
- Good understanding of web services and API design (RESTful APIs, Web API).
- Knowledge of version control systems like Git.
- Ability to work in a collaborative, fast-paced environment.
- Strong problem-solving skills and attention to detail.
- Excellent communication and interpersonal skills.
- Ability to learn quickly and adapt to new technologies.

Preferred Qualifications:
- Experience with cloud platforms (Azure, AWS).
- Familiarity with modern front-end frameworks (Angular, React).
- Knowledge of Agile methodologies and experience working in Agile teams.

Work Location: In person

Posted 13 hours ago


3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a member of the IT Security team at Fresenius Digital Technology, you will play a crucial role in implementing, managing, and operating security capabilities across the business segments of the Fresenius Group. Your responsibilities include deploying and maintaining Identity Governance and Administration (IGA) solutions in alignment with security, business, and compliance objectives. You will handle technical integrations of business applications such as Active Directory, SAP, and cloud platforms in collaboration with application owners, and implement best practices for IGA processes, including identity lifecycle management, access reviews, role modeling, and access policies. You will troubleshoot and resolve IGA-related incidents and service requests and monitor and report on access risks and policy violations. Collaborating with cross-functional teams (business, security, infrastructure, HR, and application teams, both internal and external) to develop identity security workflows and integrations is integral to the role, as is staying current with industry trends, emerging technologies, and best practices in identity governance.

Requirements: a minimum of 3 years of experience in Identity Governance and Administration or IAM roles; hands-on experience with the SailPoint ISC platform; a solid understanding of identity governance principles; familiarity with security protocols and authentication standards; experience integrating IGA tools with cloud, on-premises, and SaaS applications; and strong collaboration, communication, and documentation skills.

Preferred qualifications include SailPoint ISC or IdentityNow Engineer/Architect certification, experience with cloud environments such as AWS and Azure, and prior exposure to regulated industries and modern identity ecosystems. The role is based at Fresenius Digital Technology in Bangalore, India. To apply, please reach out to Amit Kumar at Amit.Singh1@fresenius.com.

*Please note that by applying for this position, you agree that the country-specific labor laws of the respective legal entity will apply to the application process.

Posted 13 hours ago


5.0 - 9.0 years

0 Lacs

Karnataka

On-site

Wipro Limited is a leading technology services and consulting company focused on building innovative solutions for clients' complex digital transformation needs. With a holistic portfolio spanning consulting, design, engineering, and operations, and over 230,000 employees and business partners across 65 countries, Wipro helps clients achieve their boldest ambitions and build sustainable, future-ready businesses.

We are currently looking for an ETL Test Lead.

Primary Skill: ETL Testing
Secondary Skill: Azure

Key Requirements:
- 5+ years of experience in data warehouse testing, with at least 2 years on Azure Cloud.
- Strong understanding of data marts and data warehouse concepts.
- Expertise in SQL, including the ability to create source-to-target comparison test cases.
- Proficiency in creating test plans, test cases, traceability matrices, and closure reports.
- Familiarity with PySpark, Python, Git, Jira, and JTM.

Band: B3
Location: Pune, Chennai, Coimbatore, Bangalore
Mandatory Skills: ETL Testing
Experience: 5-8 years

At Wipro, we are building a modern organization focused on digital transformation. We look for individuals inspired by reinvention and committed to evolving themselves, their careers, and their skills. Join us in our journey to constantly evolve and adapt. We welcome applications from individuals with disabilities.
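A source-to-target comparison test case, as required above, usually boils down to reconciling row counts and aggregate checksums between the two sides. A minimal sketch, with sqlite3 standing in for the warehouse and all table names invented for illustration:

```python
import sqlite3

# Source and target tables stand in for the warehouse layers; names are illustrative.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE src_orders (id INTEGER, amount REAL);
CREATE TABLE tgt_orders (id INTEGER, amount REAL);
INSERT INTO src_orders VALUES (1, 10.5), (2, 20.0), (3, 7.25);
INSERT INTO tgt_orders VALUES (1, 10.5), (2, 20.0), (3, 7.25);
""")

def reconcile(con, src, tgt):
    """Row-count check plus a simple aggregate checksum on each side."""
    checks = {}
    for metric, expr in [("row_count", "COUNT(*)"),
                         ("amount_sum", "ROUND(SUM(amount), 2)")]:
        s = con.execute(f"SELECT {expr} FROM {src}").fetchone()[0]
        t = con.execute(f"SELECT {expr} FROM {tgt}").fetchone()[0]
        checks[metric] = (s, t, s == t)
    return checks

result = reconcile(con, "src_orders", "tgt_orders")
print(result)  # every tuple ends in True when source and target agree
```

Real test suites extend the same pattern with per-column hashes and key-level diffs, and record each check in the traceability matrix.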

Posted 13 hours ago


5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Data Engineer/ETL Developer at NTT DATA Business Solutions, you will develop and implement ETL processes using Python and SQL to extract, transform, and load data from various sources into our data warehouse. You will optimize and maintain existing ETL workflows and data pipelines to improve performance and scalability, and design, develop, and maintain efficient, reusable, and reliable Python code while supporting Python version upgrade activities. You will collaborate with cross-functional teams to understand data requirements and ensure data integrity and quality, monitor and troubleshoot data processing systems to guarantee timely and accurate data delivery, maintain documentation for ETL processes, data models, and workflows, and participate in code reviews, providing constructive feedback to team members.

The ideal candidate has a Bachelor's degree in IT, computer science, computer engineering, or a related field, proven experience as a Data Engineer or ETL Developer focused on Python and SQL, and a minimum of 5 years of ETL experience. A strong understanding of ETL concepts and data warehousing principles and proficiency in writing complex SQL queries are essential. Familiarity with cloud platforms such as Azure or OCI is a plus, as is experience with version control systems like Git, familiarity with Agile working environments, and the ability to work collaboratively with Product Owners, Business Analysts, and Architects. Staying up to date with industry trends and emerging technologies is encouraged. This position is based offshore in India.
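The extract-transform-load cycle described above can be sketched end to end in a few lines. This is an illustration only: the CSV source, column names, and reject-handling policy are invented, and sqlite3 stands in for the warehouse.

```python
import csv
import io
import sqlite3

# Hypothetical CSV export standing in for a source system.
raw = "order_id,amount,currency\n1,10.50,usd\n2,oops,usd\n3,7.25,USD\n"

def extract(text):
    """Read the raw feed into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Type-cast and normalize; drop rows that fail validation."""
    clean = []
    for r in rows:
        try:
            clean.append((int(r["order_id"]), float(r["amount"]), r["currency"].upper()))
        except ValueError:
            continue  # a real pipeline would route bad records to a reject stream
    return clean

def load(con, rows):
    con.execute("CREATE TABLE IF NOT EXISTS orders "
                "(order_id INTEGER, amount REAL, currency TEXT)")
    con.executemany("INSERT INTO orders VALUES (?,?,?)", rows)

con = sqlite3.connect(":memory:")
load(con, transform(extract(raw)))
loaded = con.execute("SELECT COUNT(*), ROUND(SUM(amount),2) FROM orders").fetchone()
print(loaded)  # (2, 17.75) — the malformed second row was rejected
```

Production versions swap the in-memory pieces for real connectors and add the monitoring and documentation duties the role describes.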
Join us at NTT DATA Business Solutions and be part of our mission to transform SAP solutions into value. For any queries regarding this opportunity, please reach out to our recruiter, Pragya Kalra (Pragya.Kalra@nttdata.com). NTT DATA Business Solutions is a rapidly growing international IT company and one of the world's leading SAP partners, offering a full range of services from business consulting to SAP solution implementation, including hosting services and support.

Posted 13 hours ago


2.0 years

0 Lacs

Noida, Uttar Pradesh, India

Remote

Position Overview: As a Slot Game Backend Developer (Mid-I), you will design, develop, and maintain backend systems and services tailored for slot games. You will build robust, scalable, and efficient backend architectures to support game logic, player interactions, and real-time data streaming, and contribute to the development and maintenance of a Remote Gaming Server (RGS) that manages game mechanics and operator integrations while ensuring compliance with industry standards.

Key Responsibilities:
- Game-Specific Architecture & Design: design and implement scalable backend architectures optimized for slot games, including random number generation (RNG) handling, payout calculations, and in-game event tracking.
- Slot Game Backend Development: develop and maintain backend services for slot games using Node.js, ensuring seamless execution of game mechanics, bonus rounds, and jackpot functionality.
- Remote Gaming Server Development & Maintenance: collaborate on the development and upkeep of an RGS handling game logic, player session management, and operator integrations.
- Real-Time Messaging: use Apache Kafka to build real-time data pipelines for game events, player session tracking, leaderboards, and bonus triggers.
- API Integration: integrate and manage third-party APIs, including RNG services, payment gateways, analytics platforms, and operator systems; develop RESTful APIs for internal game clients and external integrations.
- Performance Optimization: analyze and address performance bottlenecks and latency issues to keep gameplay smooth even during high-traffic periods.
- Game Logic Implementation: work closely with game designers to implement backend logic for slot mechanics, including win/loss logic, RTP calculations, and feature triggers.
- Code Reviews & Mentorship: conduct thorough code reviews, enforce coding standards, and mentor junior developers.
- Testing & Deployment: ensure robust testing of backend systems, including game scenarios and stress testing; manage CI/CD pipelines for smooth deployment and updates.
- Troubleshooting & Live Support: monitor live environments, troubleshoot backend systems, and minimize downtime for slot games in production.

Required Qualifications:
- Experience: 2+ years in backend development, including 1+ years on slot game backend systems and 1+ years on a Remote Gaming Server.
- Technical Skills: proficiency in Node.js and related frameworks (e.g., Express, hapi); expertise in Apache Kafka for real-time messaging and event-driven systems; hands-on experience with third-party RNG services and payment gateway integrations; solid understanding of RESTful API design and implementation.
- Game-Specific Experience: a proven track record of designing and implementing backend systems for slot games, including RTP configuration, progressive jackpots, and in-game rewards; experience maintaining RGS components for game operations and operator integrations.
- Databases: experience with relational (SQL) databases (e.g., PostgreSQL, MySQL, MariaDB), with a focus on player data, game state management, and transaction logs.
- Version Control & Cloud Platforms: proficiency with Git; familiarity with cloud platforms such as AWS, Azure, or Google Cloud for gaming backends.

Preferred Qualifications:
- Experience with containerization tools like Docker and orchestration platforms like Kubernetes.
- Familiarity with casino backend frameworks and gaming protocols.
- Experience working in Agile/Scrum environments, ideally on gaming projects.
- Knowledge of CI/CD pipelines and automation tools for slot game deployments.

Soft Skills:
- Problem-Solving: strong analytical and troubleshooting skills to identify and resolve backend issues in live slot game environments.
- Communication: excellent verbal and written communication skills for collaborating with game designers, product managers, and stakeholders.
- Teamwork: proven ability to work in a team-oriented environment, mentoring junior developers and fostering collaboration.
- Adaptability: ability to adapt to the fast-paced, evolving requirements of slot game development and live operations.
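The RTP (return-to-player) calculations mentioned above are just total payout divided by total wagered. A toy simulation, with a made-up three-reel pay table and Python's seeded PRNG standing in for the certified RNG service a real RGS would call:

```python
import random

# Illustrative pay table and reel strip; the numbers are invented for the sketch.
PAYTABLE = {("7", "7", "7"): 50, ("BAR", "BAR", "BAR"): 10}
REEL = ["7", "BAR", "CHERRY", "BLANK"]

def spin(rng):
    return tuple(rng.choice(REEL) for _ in range(3))

def simulate_rtp(n_spins, bet=1.0, seed=42):
    """RTP = total payout / total wagered over many spins."""
    rng = random.Random(seed)  # deterministic here; production uses a certified RNG
    wagered = paid = 0.0
    for _ in range(n_spins):
        wagered += bet
        paid += PAYTABLE.get(spin(rng), 0) * bet
    return paid / wagered

rtp = simulate_rtp(100_000)
print(f"simulated RTP: {rtp:.3f}")
```

With each triple having probability 1/64, the theoretical RTP here is (50 + 10) / 64 ≈ 0.9375, and the simulation should land close to that; regulators audit exactly this kind of configured-versus-measured agreement.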

Posted 13 hours ago


4.0 - 8.0 years

0 Lacs

Haryana

On-site

As an AI/ML professional with 4-8 years of experience, you will develop and implement AI models and pipelines that use LLMs to meet business requirements. You will design and deploy machine learning solutions on Azure with high performance, scalability, and reliability, and use Python for data preprocessing, model development, and experimentation. You will build and integrate REST APIs using Flask to provide seamless model access for various applications, and containerize models and applications with Docker to streamline deployment and improve portability. Collaborating with cross-functional teams to understand project objectives and develop customized AI solutions is a significant part of the job, as is staying abreast of the latest advancements in LLMs and cloud technologies and optimizing models for accuracy, performance, and inference speed on real-time data.

Qualifications include 4-8 years of experience in data science, specifically working with LLMs in domains such as CPG, Retail, and Supply Chain; hands-on experience deploying AI/ML models on Azure using cloud-based tools; proficiency in Python for data manipulation, model building, and evaluation; experience with Docker for deploying and managing AI applications; knowledge of Flask for building lightweight APIs; and strong problem-solving skills to translate business challenges into AI solutions. Good communication skills are vital for collaborating with stakeholders and presenting technical results convincingly. Working knowledge of the finance domain is a must.

Posted 13 hours ago


3.0 - 8.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

You are a highly skilled and motivated Lead DevOps Engineer with Solution Architect expertise, responsible for managing end-to-end infrastructure projects across cloud, hybrid, and dedicated server environments. The role requires hands-on experience with WHM/cPanel, OpenPanel, and load balancers, and in-depth knowledge of modern DevOps practices. You will lead a team of DevOps engineers, drive technical excellence, and serve as the go-to expert for scalable, secure, high-availability infrastructure solutions.

Key responsibilities include architecting, implementing, and maintaining scalable infrastructure across cloud and dedicated server environments; managing hosting infrastructure; designing load-balancing strategies; automating provisioning and monitoring; and ensuring reliability, security, and disaster recovery protocols are in place. You will translate business and application requirements into robust infrastructure blueprints, lead design reviews for client and internal projects, create documentation, define architectural best practices, and mentor a team of DevOps engineers across multiple projects. You will also allocate resources, manage project timelines, foster innovation and collaboration, conduct performance reviews, provide training, and support the career development of team members. Further, you will set up and maintain observability systems, perform performance tuning, cost optimization, and environment hardening, and ensure compliance with internal policies and external standards.

To succeed in this role, you should have 8+ years of experience in DevOps, systems engineering, or cloud infrastructure management, with proven expertise in hosting infrastructure, Linux servers, networking, security, automation scripting, and cloud platforms. Preferred qualifications include certifications such as AWS Solutions Architect, RHCE, CKA, or Linux Foundation Certified Engineer; experience in IT services or hosting/cloud consulting environments; knowledge of compliance frameworks; and familiarity with agile methodologies and DevOps lifecycle management tools.

Posted 13 hours ago


6.0 - 12.0 years

0 Lacs

Hyderabad, Telangana

On-site

Choosing Capgemini means choosing a company where you will be empowered to shape your career the way you'd like, supported and inspired by a collaborative community of colleagues around the world, and able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Responsibilities:
- Install and configure ArcGIS Pro, Portal, and Enterprise with SDE.
- Deploy ArcGIS components: Server, Data Store, Image and GP Servers.
- Manage enterprise service migration and patch deployments.
- Configure SSO, SSL, tokens, and web services.
- Support Azure hosting, HA setup, and ESRI web/mobile apps.

Requirements:
- 6-12 years in ArcGIS Pro, Portal, and Enterprise setup with SDE.
- Skilled in ArcGIS components, FME, ArcPy, and DI tools.
- Experience with Azure, HA architecture, and ESRI licensing.
- Familiar with Agile, DevOps, and change management.
- Strong communication and documentation skills.

What we offer:
- Flexible work option: hybrid.
- Competitive salary and benefits package.
- Career growth with SAP and cloud certifications.
- Inclusive and collaborative work environment.

Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology across the entire breadth of their business needs. It delivers end-to-end services and solutions, from strategy and design to engineering, fueled by market-leading capabilities in AI, generative AI, cloud, and data, combined with deep industry expertise and a partner ecosystem.

Posted 13 hours ago


5.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Greetings from Synergy Resource Solutions, a leading recruitment consultancy. Our client is an ISO 27001:2013 and ISO 9001 certified company and a pioneering web design and development company from India, voted among the top 10 mobile app development companies in the country. The company is a leading IT consulting and web solution provider for custom software, websites, games, custom web applications, enterprise mobility, mobile apps, and cloud-based application design and development, and is ranked among the fastest-growing web design and development companies in India, with 3900+ successfully delivered projects across the United States, UK, UAE, Canada, and other countries. An over 95% client retention rate demonstrates their level of service and client satisfaction.

Position: Senior Data Engineer
Experience: 5+ years relevant experience
Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
Job Location: Ahmedabad
Shift: 11 AM - 8.30 PM

Key Responsibilities: Our client is seeking an experienced and motivated Senior Data Engineer to join their AI & Automation team. The ideal candidate has 5-8 years of experience in data engineering, with a proven track record of designing and implementing scalable data solutions. A strong background in database technologies, data modeling, and data pipeline orchestration is essential, and hands-on experience applying generative AI technologies to data workflows will set you apart. In this role, you will lead data engineering efforts to enhance automation, drive efficiency, and deliver data-driven insights across the organization.

Job Description:
• Design, build, and maintain scalable, high-performance data pipelines and ETL/ELT processes across diverse database platforms.
• Architect and optimize data storage solutions to ensure reliability, security, and scalability.
• Leverage generative AI tools and models to enhance data engineering workflows, drive automation, and improve insight generation.
• Collaborate with cross-functional teams (data scientists, analysts, and engineers) to understand and deliver on data requirements.
• Develop and enforce data quality standards, governance policies, and monitoring systems to ensure data integrity.
• Create and maintain comprehensive documentation for data systems, workflows, and models.
• Implement data modeling best practices and optimize data retrieval processes for better performance.
• Stay up to date with emerging technologies and bring innovative solutions to the team.

Qualifications:
• Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
• 5-8 years of experience in data engineering, designing and managing large-scale data systems.
• Strong expertise in database technologies. Mandatory skills: SQL; NoSQL (MongoDB, Cassandra, or Cosmos DB); one of Snowflake, Redshift, BigQuery, or Microsoft Fabric; Azure.
• Hands-on experience implementing and working with generative AI tools and models in production workflows.
• Proficiency in Python and SQL, with experience in data processing frameworks (e.g., Pandas, PySpark).
• Experience with ETL tools (e.g., Apache Airflow, MS Fabric, Informatica, Talend) and data pipeline orchestration platforms.
• Strong understanding of data architecture, data modeling, and data governance principles.
• Experience with cloud platforms (preferably Azure) and associated data services.

Skills:
• Advanced knowledge of database management systems and ETL/ELT processes.
• Expertise in data modeling, data quality, and data governance.
• Proficiency in Python, version control systems (Git), and data pipeline orchestration tools.
• Familiarity with AI/ML technologies and their application in data engineering.
• Strong problem-solving and analytical skills, with the ability to troubleshoot complex data issues.
• Excellent communication skills, with the ability to explain technical concepts to non-technical stakeholders.
• Ability to work independently, lead projects, and mentor junior team members.
• Commitment to staying current with emerging technologies, trends, and best practices in the data engineering domain.

If your profile matches the requirements and you are interested in this job, please share your updated resume along with details of your present salary, expected salary, and notice period.
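The data quality standards mentioned above are usually expressed as declarative checks over each batch. A dependency-free sketch of the kind of rules that frameworks like Great Expectations formalize; the records, column names, and thresholds are all illustrative:

```python
# Toy batch standing in for a pipeline's staging output.
records = [
    {"user_id": 1, "email": "a@example.com", "age": 34},
    {"user_id": 2, "email": "b@example.com", "age": 29},
    {"user_id": 2, "email": "c@example.com", "age": -5},
]

def run_checks(rows):
    """Return a list of failed expectations for this batch."""
    failures = []
    ids = [r["user_id"] for r in rows]
    if len(ids) != len(set(ids)):
        failures.append("user_id not unique")
    if any(not (0 <= r["age"] <= 120) for r in rows):
        failures.append("age out of range")
    if any("@" not in r["email"] for r in rows):
        failures.append("email malformed")
    return failures

failures = run_checks(records)
print(failures)  # ['user_id not unique', 'age out of range']
```

In a real pipeline these checks would run as a gate after each load step, failing the batch (or quarantining rows) instead of just printing.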

Posted 13 hours ago

Apply

1.0 - 2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

DIGITAP.AI provides advanced AI/ML solutions to new-age, internet-driven businesses for reliable, fast, and 100% compliant customer onboarding and automated risk management, along with big-data-enabled services such as risk analytics and customized scorecards. For customer onboarding and risk management, DIGITAP.AI extracts data from various sources through web scraping.

The Role: We are seeking a talented Python Backend Developer with 1-2 years of experience to join our team. The ideal candidate is passionate about creating scalable and efficient backend solutions, has a strong understanding of Python development best practices, and thrives in a fast-paced, collaborative environment. You will work closely with cross-functional teams to translate business requirements into high-quality software solutions.

Responsibilities:
Design, develop, and maintain scalable and efficient backend systems using Python and related technologies.
Collaborate with other developers, product managers, and stakeholders to understand requirements and translate them into technical solutions.
Implement and maintain RESTful APIs for seamless integration with frontend applications and third-party services.
Optimize application performance and scalability through code refactoring, database optimization, and other best practices.
Write clean, maintainable, and well-documented code while adhering to coding standards and software development methodologies.
Conduct code reviews, provide constructive feedback, and mentor junior team members to promote a culture of continuous learning and improvement.
Stay up-to-date with the latest trends and technologies in backend development, and actively contribute to the technical growth of the team.

Required Skill Sets:
Strong Python programming knowledge.
Familiarity with cloud platforms such as AWS, Azure, or Google Cloud Platform.
Ability to integrate unit test frameworks seamlessly with modern IDEs and CI/CD workflows.
Knowledge of web scraping and structured data parsing is an added advantage.
Good knowledge of databases from RDBMS to NoSQL (MySQL, Redis, MongoDB).
Experience developing RESTful web APIs/microservices and working with Django.
Strong logical and analytical skills, with good soft and communication skills.

Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field (minimum four-year degree). 1-2 years of professional experience in backend development using Python.
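As an illustration of the web-scraping and structured-data-parsing skills mentioned above, here is a small, standard-library-only sketch that extracts links from HTML; the markup and the `LinkExtractor` class are made-up examples, not part of any DIGITAP.AI codebase.

```python
# Toy example of structured data parsing using the standard library's
# html.parser, in the spirit of the scraping work described above.

from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect (href, text) pairs for every <a> tag."""

    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        # Only buffer text while inside an <a> element.
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

parser = LinkExtractor()
parser.feed('<p>See <a href="/jobs">all jobs</a> and <a href="/alerts">alerts</a>.</p>')
# parser.links -> [('/jobs', 'all jobs'), ('/alerts', 'alerts')]
```

Real scraping pipelines would add fetching, retries, and politeness controls on top of a parser like this; the parsing step itself stays essentially this shape.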

Posted 14 hours ago

Apply

2.0 - 10.0 years

0 Lacs

pune, maharashtra

On-site

As an experienced IT professional with a passion for data and technology, your role will involve ensuring that data accurately reflects business requirements and targets. Collaborating closely with the Procurement & Logistics department and external providers in an agile environment, you will leverage your deep understanding of technology stack capabilities to facilitate engagements and solve impediments for delivering data use cases to drive business value and contribute to the vision of becoming a data-driven company.

You will play a crucial role in the energy transformation at the Siemens Energy ABP Procurement team, working alongside a diverse team of innovative and hardworking data enthusiasts and AI professionals. Your impact will be significant, with responsibilities including service operation and end-to-end delivery management, interacting with business users and key collaborators, developing and maintaining data architecture and governance standards, designing optimized data architecture frameworks, providing guidance to developers, ensuring data quality, and collaborating with various functions to translate user requirements into technical specifications.

To excel in this role, you should bring 8 to 10 years of IT experience with a focus on ETL tools and platforms, proficiency in Snowflake SQL Scripting, JavaScript, PL/SQL, and data modeling for relational databases. Experience in data warehousing, data migration, building data pipelines, and working with AWS, Azure & GCP data services is essential. Additionally, familiarity with Qlik, Power BI, and a degree in computer science or IT are preferred. Strong English skills, intercultural communication abilities, and a background in international collaboration are also key requirements. Joining the Value Center ERP team at Siemens Energy, you will be part of a dynamic group dedicated to driving digital transformation in manufacturing and contributing to the achievement of Siemens Energy's objectives.
This role offers the opportunity to work on innovative projects that have a substantial impact on the business and industry, enabling you to be a part of the energy transition and the future of sustainable energy solutions. Siemens Energy is a global leader in energy technology, with a commitment to sustainability and innovation. With a diverse team of over 100,000 employees worldwide, we are dedicated to meeting the energy demands of the future in a reliable and sustainable manner. By joining Siemens Energy, you will contribute to the development of energy systems that drive the energy transition and shape the future of electricity generation. Diversity and inclusion are at the core of Siemens Energy's values, celebrating uniqueness and creativity across over 130 nationalities. The company provides employees with benefits such as Medical Insurance and Meal Card options, supporting a healthy work-life balance and overall well-being. If you are ready to make a difference in the energy sector and be part of a global team committed to sustainable energy solutions, Siemens Energy offers a rewarding and impactful career opportunity.

Posted 14 hours ago

Apply

7.0 - 11.0 years

0 Lacs

haryana

On-site

As a Solution Architect specializing in AI and automation, you will be responsible for designing comprehensive solutions for contact center and back-office operations. Your role will involve leading architectural discussions, developing architectures for Generative AI use cases, and collaborating with various stakeholders to translate business requirements into technical solutions. You will guide the implementation team on best practices, perform architecture reviews, and stay updated on emerging AI, automation, and cloud technologies relevant to the industry. To excel in this role, you should possess a minimum of 7 years of experience in solution architecture with a focus on cloud-native applications. Demonstrated expertise in designing and implementing AI-driven solutions, hands-on experience with major public cloud providers, and proficiency in programming languages such as C++, C#, Java, or Python are essential. Additionally, experience in delivering Generative AI solutions and knowledge of microservices, APIs, data platforms, and security are valuable assets. Preferred qualifications include certifications in cloud architecture, direct experience with contact center platforms and RPA tools, as well as a background in regulated industries. Strong communication skills, stakeholder engagement abilities, and experience with Agile and DevOps delivery models are also desired. In return, you will have the opportunity to work on cutting-edge AI and automation projects, be part of a dynamic and innovative team environment, and receive competitive compensation and benefits. Join FIS BPS to lead the transformation of business process services with next-generation AI and automation solutions. As an Expert professional in this role, you will be recognized as a Subject Matter Expert (SME) responsible for developing large and complex business process solutions. 
You will act as an internal consultant for issue resolution, train and mentor staff, and work independently on significant projects. Superior presentation, communication, and negotiation skills are required, along with strategic planning capabilities. FIS is committed to protecting the privacy and security of all personal information processed to provide services. Recruitment at FIS primarily operates on a direct sourcing model, and resumes from recruitment agencies not on the preferred supplier list are not accepted.

Posted 14 hours ago

Apply

2.0 - 6.0 years

0 Lacs

chandigarh

On-site

As a Mobile Application Developer, you will play a crucial role in our tech-driven team by leveraging your skills in React Native, JavaScript, and TypeScript to create high-quality mobile applications. Your responsibilities will involve collaborating with cross-functional teams to ensure the delivery of seamless and efficient user experiences. You will be responsible for developing and maintaining mobile applications using React Native, writing clean and maintainable code in JavaScript and TypeScript, and collaborating with UI/UX designers to implement intuitive interfaces. Additionally, you will optimize application performance for maximum speed and scalability, integrate RESTful APIs and third-party libraries, and utilize state management libraries like Redux or MobX. Furthermore, you will manage code versioning with Git, participate in code reviews and team collaborations, and stay updated with emerging technologies and best practices. To excel in this role, you should possess a Bachelor's degree in Computer Science, Engineering, or a related field, be proficient in JavaScript and TypeScript, and have strong experience with React Native for mobile app development. Preferred skills for this position include familiarity with CSS pre-processors like SASS or LESS, experience with front-end tools such as Babel, Webpack, and NPM, knowledge of RESTful APIs and asynchronous programming, and proficiency with code versioning tools like Git. Understanding mobile app deployment on iOS and Android platforms is also essential. 
Moreover, experience with state management libraries (Redux, MobX), knowledge of modern authentication and authorization mechanisms, experience deploying applications on cloud platforms (AWS, Azure, Google Cloud), familiarity with containerization (Docker) and orchestration (Kubernetes), understanding of GraphQL and Apollo Client, experience with CI/CD pipelines and automation tools, knowledge of testing frameworks like Jest or Enzyme, and familiarity with React Navigation for handling app routing and navigation are considered preferred skills for this role.

Posted 14 hours ago

Apply

8.0 - 12.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Java Development Lead for Functions DFT, you will be responsible for coding, leading, and designing solutions within the DFT Functions Domain. Your role will involve developing applications, providing reusable patterns/components for development, and ensuring end-to-end ownership throughout the lifecycle. Additionally, you will lead and guide a community group contributing to reusable components/libraries. Your key responsibilities will include demonstrating a strong sense of responsibility and ownership in technical evaluations, decisions, and software design/development activities. You will contribute to building prototypes, translating them into production-grade applications or reusable components, and staying updated on industry developments to incorporate best practices. Leading end-to-end delivery with adherence to timelines and agile practices, participating in code reviews, and mentoring team members on technical aspects are also crucial aspects of your role. Your expertise should include proficiency in Java programming language, web application development using frameworks like Spring Boot, Quarkus, and Node.js, as well as designing applications using microservices, cloud-native architecture, and REST API. You should have experience with API design, various design patterns, databases (RDBMS & NoSQL), coding, debugging, testing, documentation, version control (GitHub/Bitbucket), container ecosystems (Kubernetes, EKS, AKS), messaging systems (Kafka, RabbitMQ), cloud computing platforms (AWS, GCP, Azure), high-level architecture, CI/CD (ADO), and Scrum methodology. In addition, you will be expected to display exemplary conduct, adhere to regulatory and business standards, and collaborate effectively with key stakeholders such as Enterprise Architects, Technology Delivery teams, and Business Product owners. 
Your qualifications should include a Bachelor's degree in computer science or a related technical field, with at least 8 years of relevant work experience. Certifications in Java, Architecture, Cyber Security, or Cloud will be advantageous. As part of Standard Chartered, an international bank committed to making a positive difference, you will have the opportunity to grow, innovate, and contribute to our purpose of driving commerce and prosperity. We value diversity, inclusion, and continuous learning, offering various benefits and support systems to help you thrive and succeed in your role. If you are looking for a purpose-driven career in a values-driven organization that celebrates diversity and inclusion, we encourage you to apply and be part of our global team at Standard Chartered.

Posted 14 hours ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

Join as a Big Data Engineer at Barclays and lead the evolution of the digital landscape to drive innovation and excellence. Utilize cutting-edge technology to revolutionize digital offerings and ensure unparalleled customer experiences. To succeed in this role, you should possess the following essential skills: - Full Stack Software Development for large-scale, mission-critical applications. - Proficiency in distributed big data systems like Spark, Hive, Kafka streaming, Hadoop, Airflow. - Expertise in Scala, Java, Python, J2EE technologies, Microservices, Spring, Hibernate, REST APIs. - Experience with n-tier web application development and frameworks such as Spring Boot, Spring MVC, JPA, Hibernate. - Familiarity with version control systems, particularly Git; GitHub Copilot experience is a bonus. - Proficient in API Development using SOAP or REST, JSON, and XML. - Hands-on experience in developing back-end applications with multi-process and multi-threaded architectures. - Skilled in building scalable microservices solutions using integration design patterns, Dockers, Containers, and Kubernetes. - Knowledge of DevOps practices including CI/CD, Test Automation, Build Automation using tools like Jenkins, Maven, Chef, Git, Docker. - Experience with data processing in cloud environments like Azure or AWS. - Essential experience in Data Product development and Agile development methodologies like SCRUM. - Result-oriented with strong analytical and problem-solving skills. - Excellent verbal and written communication and presentation skills. Your primary responsibilities will include: - Developing and delivering high-quality software solutions using industry-aligned programming languages, frameworks, and tools, ensuring scalability, maintainability, and performance optimization. - Collaborating cross-functionally with product managers, designers, and engineers to define software requirements, devise solution strategies, and align with business objectives. 
- Promoting a culture of code quality and knowledge sharing through participation in code reviews and industry technology communities.
- Ensuring secure coding practices to protect data and mitigate vulnerabilities, along with effective unit testing practices for proper code design and reliability.
As a Big Data Engineer at Barclays, you will play a crucial role in designing, developing, and enhancing software to provide business, platform, and technology capabilities for customers and colleagues. You will contribute to technical excellence, continuous improvement, and risk mitigation while adhering to Barclays' values of Respect, Integrity, Service, Excellence, and Stewardship, and embodying the Barclays Mindset of Empower, Challenge, and Drive.
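The multi-threaded back-end pattern called out in the skills list above can be sketched with the standard-library `ThreadPoolExecutor`; `fetch_record` here is a hypothetical stand-in for an I/O-bound call such as a REST request, not code from any Barclays system.

```python
# Hedged sketch of a multi-threaded back-end pattern using the
# standard-library ThreadPoolExecutor. fetch_record is a hypothetical
# stand-in for an I/O-bound operation (e.g. an HTTP call).

from concurrent.futures import ThreadPoolExecutor

def fetch_record(record_id):
    # Placeholder for an I/O-bound operation such as a REST request.
    return {"id": record_id, "status": "ok"}

def fetch_all(record_ids, max_workers=4):
    # pool.map preserves input order even though calls run concurrently.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(fetch_record, record_ids))

results = fetch_all([1, 2, 3])
# results -> [{'id': 1, 'status': 'ok'}, {'id': 2, 'status': 'ok'},
#             {'id': 3, 'status': 'ok'}]
```

Threads suit I/O-bound fan-out like this; CPU-bound work in Python would instead use multiple processes, which is why the role mentions both multi-process and multi-threaded architectures.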

Posted 14 hours ago

Apply

2.0 - 6.0 years

0 Lacs

noida, uttar pradesh

On-site

As a Front-End Developer, you will be responsible for building and maintaining web applications using React.js. Your role will involve optimizing user interfaces to ensure maximum performance across various devices and browsers. Implementing responsive design principles will be crucial to enhance user experience. In Back-End Development, you will develop and maintain server-side applications using Node.js. Your tasks will include designing and integrating RESTful API endpoints while ensuring the implementation of security and data protection practices. Collaboration is a key aspect of this role. You will collaborate closely with UI/UX designers to create user-centric designs and work with DevOps teams to deploy and maintain applications. Participating in code reviews and providing constructive feedback will also be part of your responsibilities. Your expertise in Debugging and Optimization will be essential to identify and resolve performance bottlenecks in the application. Monitoring and enhancing the stability and scalability of web applications will contribute to the overall success of the projects. Documentation is another important aspect of this role. You will be required to create and maintain technical documentation for code and processes to ensure transparency and knowledge sharing within the team. In terms of Technical Skills, proficiency in React.js and Node.js is a must. Strong knowledge of JavaScript (ES6+), HTML5, and CSS3 is required. Experience with state management libraries like Redux, familiarity with database technologies such as MongoDB, PostgreSQL, or MySQL, and expertise in version control systems, especially Git, are essential. Knowledge of cloud platforms like Azure, AWS, or Google Cloud is a plus, as well as familiarity with other programming languages like Python or PHP. Soft Skills play a significant role in this position. Strong problem-solving skills, attention to detail, excellent communication, and teamwork abilities are crucial. 
The ability to adapt to a fast-paced environment and meet deadlines will be key to your success in this role. Preferred Qualifications include experience with JavaScript, familiarity with CI/CD pipelines and DevOps practices, knowledge of testing frameworks such as Jest, Mocha, or Cypress, and exposure to Agile/Scrum methodologies.

Posted 14 hours ago

Apply

10.0 - 14.0 years

0 Lacs

pune, maharashtra

On-site

You are an experienced Web Application Architect with over 10 years of experience in full-stack development and cloud-native architecture. Your primary responsibility will be to design end-to-end technical solutions, guide development teams, and deliver scalable, secure, and high-performing enterprise applications. Working closely with stakeholders from product, engineering, and operations, you will architect solutions for diverse domains such as SaaS, ERP, IoT, and Supply Chain. Your key responsibilities will include designing and owning the architecture for scalable and distributed systems, utilizing modern technologies like Node.js, NestJS, Laravel, ReactJS, and AWS. You will translate business requirements into high-level technical solutions, define system standards, development processes, and deployment strategies, and provide technical leadership throughout the software development lifecycle. Collaborating with cross-functional teams, you will guide best practices in microservices, APIs, cloud-native patterns, and database design. As a Web Application Architect, you will lead and mentor development teams, conduct architecture workshops, participate in sprint planning and delivery, and ensure that solution designs meet performance, scalability, availability, and security requirements. Additionally, you will evaluate and recommend emerging tools, technologies, and frameworks to improve development productivity and product quality. Your expertise should cover a range of key skills and technologies, including Architecture & Design principles like Microservices, Distributed Systems, API Design, and Event-Driven Architecture. You should be proficient in Backend technologies such as Node.js, NestJS, PHP, Laravel, Frontend technologies like ReactJS, Angular, Vue.js, and GraphQL, as well as Cloud & DevOps tools like AWS (Lambda, EC2, ECS, S3, RDS, Elasticache), Azure, Docker, GitLab CI/CD, and Jenkins. 
Your knowledge should also extend to various databases such as MongoDB, MySQL, PostgreSQL, Elasticsearch, and other technologies like Kafka, REST/GraphQL APIs, Agile methodologies (Scrum/Kanban), and tools like Jira and Confluence.

Posted 14 hours ago

Apply

6.0 - 10.0 years

0 Lacs

Itanagar, Arunachal Pradesh, India

On-site

Experience Required: 6-10 years in AI/ML development, with 3+ years of hands-on experience in Generative AI, RAG frameworks, and Agentic AI systems.

Job Summary: We are seeking highly skilled Generative AI Engineers to join a dynamic team focused on building enterprise-grade, production-ready AI systems using RAG and Agentic AI paradigms. The ideal candidates will have hands-on experience developing and fine-tuning LLM-based applications, integrating feedback loops, and implementing safeguards in regulated or complex business environments.

Key Responsibilities:
Design, develop, and optimize RAG pipelines using frameworks such as LangChain, LlamaIndex, or custom-built stacks.
Implement Agentic AI architectures involving task-based agents, stateful memory, planning-execution workflows, and tool augmentation.
Perform model fine-tuning, embedding generation, and evaluation of LLM outputs; incorporate human and automated feedback loops.
Build and enforce guardrails to ensure safe, compliant, and robust model behavior, including prompt validation, output moderation, and access controls.
Collaborate with cross-functional teams to deploy solutions in cloud-native environments such as Azure OpenAI, AWS Bedrock, or Google Vertex AI.
Contribute to system observability via dashboards and logging, and support post-deployment model monitoring and optimization.
Required Qualifications:
Proven production experience with RAG frameworks like LangChain, LlamaIndex, or custom-built solutions.
Solid understanding of Agentic AI design patterns: task agents, memory/state tracking, and orchestration logic.
Strong expertise in LLM fine-tuning, vector embeddings, evaluation strategies, and feedback integration.
Experience implementing AI guardrails (e.g., moderation, filtering, prompt validation).
Proficiency in Python, LLM APIs (OpenAI, Anthropic, Cohere, etc.), and vector database integration.
Familiarity with CI/CD pipelines, API integrations, and cloud-native deployment patterns.

Preferred Qualifications:
Experience working on AI projects in regulated environments (banking domain).
Hands-on experience with cloud AI platforms: Azure OpenAI, AWS Bedrock, or Google Vertex AI.
Knowledge of prompt engineering, RLHF, and LLM observability frameworks.
Experience building or leveraging internal LLM evaluation harnesses, agent orchestration layers, or compliance dashboards.
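As a toy illustration of the retrieval step in the RAG pipelines this role describes: rank documents by cosine similarity and hand the best matches to the model as context. A production system would use learned embeddings and a vector database; this standard-library-only sketch uses bag-of-words vectors and invented documents purely to show the retrieve-then-augment shape.

```python
# Toy retrieval step for a RAG pipeline: cosine similarity over
# bag-of-words vectors. Real systems use learned embeddings and a
# vector database; the documents below are invented examples.

import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, documents: list, k: int = 1) -> list:
    """Return the k documents most similar to the query."""
    qv = Counter(query.lower().split())
    scored = [(cosine(qv, Counter(d.lower().split())), d) for d in documents]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [d for _, d in scored[:k]]

docs = [
    "loan approval requires a credit check",
    "the cafeteria menu changes weekly",
]
context = retrieve("how is a loan approved", docs, k=1)
# context -> ['loan approval requires a credit check']
```

The retrieved `context` would then be prepended to the LLM prompt; guardrails such as prompt validation and output moderation sit on either side of that generation call.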

Posted 14 hours ago

Apply

0 years

0 Lacs

Itanagar, Arunachal Pradesh, India

On-site

AI Model Development & Support: Assist in fine-tuning LLMs and building innovative, prompt-driven solutions using Azure OpenAI. Develop robust data preprocessing pipelines, focusing on data cleaning, tokenization, and normalization. Contribute to the implementation of RAG pipelines and Azure Document Intelligence solutions.

MLOps & Lifecycle Management: Support the implementation and maintenance of Azure ML Pipelines, leveraging MLflow and DVC for efficient version control and experiment tracking. Assist in monitoring AI model performance using Azure Monitor and Application Insights to ensure optimal operation. Maintain and manage AI notebooks and experiments effectively within Azure AI Studio (Foundry).

Collaboration & Continuous Learning: Work closely with Senior AI Engineers, Data Scientists, and teams like DataOps and PlatformOps. Actively participate in peer reviews, knowledge-sharing sessions, and documentation efforts to foster a collaborative environment. Seize every opportunity to strengthen your AI and MLOps competencies, contributing to your professional growth.

Quality Assurance & Delivery Support: Ensure high code quality, adherence to documentation standards, and timely delivery of project milestones. Contribute to rigorous testing, validation, and seamless integration of AI solutions into production environments.
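The cleaning, tokenization, and normalization steps mentioned above might look like the following standard-library sketch; a real pipeline would typically use a model-specific tokenizer, and the sample input is invented.

```python
# Minimal sketch of text preprocessing (cleaning, tokenization,
# normalization) using only the standard library. A production LLM
# pipeline would use a model-specific tokenizer instead.

import re
import unicodedata

def preprocess(text: str) -> list:
    # Normalize unicode forms, lowercase, strip punctuation, then split.
    text = unicodedata.normalize("NFKC", text)
    text = text.lower()
    text = re.sub(r"[^\w\s]", " ", text)
    return text.split()

tokens = preprocess("Azure OpenAI: fine-tuning LLMs!")
# tokens -> ['azure', 'openai', 'fine', 'tuning', 'llms']
```

Keeping preprocessing as a small, pure function like this makes it easy to version and test within an Azure ML pipeline step.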

Posted 14 hours ago

Apply

3.0 - 4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Summary: We are looking for a skilled Python Developer with 3-4 years of experience, specializing in FastAPI and Microservices architecture. The ideal candidate should have strong backend development skills, expertise in building scalable APIs, and experience working with distributed systems.

Key Responsibilities:
Develop, test, and maintain scalable backend applications using Python and FastAPI.
Design and implement microservices-based architectures for high-performance applications.
Develop and optimize RESTful APIs and asynchronous services.
Work with databases like PostgreSQL, MySQL, or MongoDB, ensuring efficient queries and indexing.
Implement authentication, authorization, and security best practices in APIs.
Integrate third-party services and APIs as needed.
Write clean, maintainable, and well-documented code following industry best practices.
Collaborate with frontend developers, DevOps, and other stakeholders to ensure smooth project execution.
Optimize system performance, scalability, and reliability.
Work with Docker, Kubernetes, and CI/CD pipelines for deployment and automation.
Monitor, troubleshoot, and resolve application issues in a timely manner.

Skills & Qualifications:
2-3 years of experience in Python backend development.
Strong experience with FastAPI (or Flask/Django with a willingness to switch).
Hands-on experience in Microservices architecture.
Proficiency in SQL and NoSQL databases (PostgreSQL, MySQL, MongoDB, or Redis).
Experience with Celery, RabbitMQ, or Kafka for asynchronous processing.
Knowledge of containerization using Docker and Kubernetes.
Experience working with authentication mechanisms (JWT, OAuth, API Keys).
Good understanding of asynchronous programming and event-driven architectures.
Strong problem-solving and debugging skills.
Proficiency in Git and version control.

Nice to Have:
Experience with GraphQL.
Exposure to AWS, GCP, or Azure cloud services.
Understanding of WebSockets and real-time applications.
Experience with CI/CD tools like Jenkins, GitHub Actions, or GitLab CI.
Knowledge of unit testing and test-driven development (TDD).
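The asynchronous-programming requirement above can be sketched with stdlib asyncio, which is also the concurrency model FastAPI builds on: fan out to several services concurrently and gather the results. The service names and delays here are hypothetical.

```python
# Small asyncio sketch of the asynchronous, event-driven style these
# FastAPI/microservice roles call for. Service names and delays are
# hypothetical stand-ins for real downstream calls.

import asyncio

async def call_service(name: str, delay: float) -> str:
    await asyncio.sleep(delay)  # stands in for a network round-trip
    return f"{name}: done"

async def main() -> list:
    # Fan out to several "services" concurrently; gather preserves
    # the argument order in its result list.
    return await asyncio.gather(
        call_service("inventory", 0.01),
        call_service("pricing", 0.02),
    )

results = asyncio.run(main())
# results -> ['inventory: done', 'pricing: done']
```

The total wall time is roughly the slowest call rather than the sum, which is the practical payoff of async handlers in an API gateway or aggregator service.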

Posted 14 hours ago

Apply

2.0 - 10.0 years

0 Lacs

Mulshi, Maharashtra, India

On-site

Area(s) of responsibility
Skills: Azure DevOps with C#
Experience: 2-10 years
Location: Pune only

Role and Responsibilities:
Collaboration and Communication: Strong skills in designing and implementing processes for collaboration and communication within cross-functional teams; experience in end-customer/end-user communication and support; an excellent communicator, capable of cooperating inside and outside of the team.
Proficiency in Azure Services: Understanding of Azure cloud services, including virtual machines, containers, networking, and databases.
CI/CD Pipelines: Experience in designing, implementing, and managing Continuous Integration/Continuous Deployment (CI/CD) pipelines using tools like Azure DevOps, Jenkins, or GitHub Actions.
Automation and Scripting: C# skills in automating the software delivery process, scripting deployment tasks, managing build processes, and orchestrating automated tests.
Testing: Experience with and understanding of testing is important.
Source Control Management: Proficiency in using source control systems like Git to manage code repositories.
Standards, Security, and Compliance: Ability to develop and implement security and compliance plans.
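A minimal `azure-pipelines.yml` for the kind of C#/.NET CI/CD work this role describes might look like the sketch below. The SDK version, pool image, and step layout are illustrative assumptions; `UseDotNet@2` and the `dotnet` CLI commands are standard Azure Pipelines conventions.

```yaml
# Hypothetical minimal azure-pipelines.yml for a C#/.NET build.
trigger:
  - main

pool:
  vmImage: ubuntu-latest

steps:
  - task: UseDotNet@2        # install the .NET SDK on the agent
    inputs:
      packageType: sdk
      version: 8.x
  - script: dotnet build --configuration Release
    displayName: Build
  - script: dotnet test --configuration Release
    displayName: Run tests
```

Real pipelines would add stages for artifact publishing, environment-gated deployments, and the security/compliance checks the responsibilities call out.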

Posted 14 hours ago

Apply

3.0 - 7.0 years

0 Lacs

chennai, tamil nadu

On-site

The Senior Data Engineer position at Annalect within the Technology team involves building products on cloud-based data infrastructure while collaborating with a team that shares a passion for technology, design, development, and data integration. Your main responsibilities will include designing, building, testing, and deploying data transfers across various cloud environments such as Azure, GCP, AWS, and Snowflake. You will also be tasked with developing data pipelines, monitoring, maintaining, and optimizing them. Writing at-scale data transformations using SQL and Python will be a crucial part of your role. Additionally, you will be expected to conduct code reviews and provide mentorship to junior developers. To excel in this position, you should possess a keen curiosity for understanding the business requirements driving the engineering needs. An enthusiasm for exploring new technologies and bringing innovative ideas to the team is highly valued. A minimum of 3 years of experience in SQL, Python, and Linux is required, along with familiarity with Snowflake, AWS, GCP, and Azure cloud environments. Intellectual curiosity, self-motivation, and a genuine passion for technology are essential attributes for success in this role. For this role, a degree in Computer Science, Engineering, or equivalent practical experience is preferred. Experience with big data, infrastructure setup, and working with relational databases like Postgres, MySQL, and MSSQL is advantageous. Familiarity with data processing tools such as Hadoop, Hive, Spark, and Redshift is beneficial as a significant amount of time will be dedicated to building and optimizing data transformations. The ability to independently manage projects from concept to implementation and maintenance is a key requirement. 
Working at Annalect comes with various perks, including a vibrant and collaborative work environment with engaging social and learning activities, a generous vacation policy, extended time off during the holiday season, and the advantage of being part of a global company while maintaining a startup-like flexibility and pace. The role offers the opportunity to work with a modern stack and environment, enabling continuous learning and experimentation with cutting-edge technologies to drive innovation.
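The at-scale SQL-plus-Python transformation work this role describes can be sketched with an in-memory SQLite database from the standard library; the table and column names are hypothetical, and a real workload would run against Snowflake, Postgres, or similar.

```python
# Stdlib-only sketch of a SQL-plus-Python data transformation of the
# kind described above, using an in-memory SQLite database. The table
# and column names are hypothetical.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("north", 10.0), ("north", 5.0), ("south", 7.5)],
)

# Aggregate in SQL, then post-process the rows in Python.
totals = {
    region: total
    for region, total in conn.execute(
        "SELECT region, SUM(amount) FROM events GROUP BY region"
    )
}
# totals -> {'north': 15.0, 'south': 7.5}
```

Pushing the aggregation into SQL and keeping Python for orchestration mirrors how warehouse pipelines keep heavy lifting close to the data.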

Posted 14 hours ago

Apply