7.0 - 12.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Your future role Take on a new challenge and apply your data engineering expertise in a cutting-edge field. You'll work alongside collaborative and innovative teammates. You'll play a key role in enabling data-driven decision-making across the organization by ensuring data availability, quality, and accessibility. Day-to-day, you'll work closely with teams across the business (e.g., Data Scientists, Analysts, and ML Engineers), mentor junior engineers, and contribute to the architecture and design of our data platforms and solutions. You'll specifically take care of designing and developing scalable data pipelines, as well as managing and optimizing object storage systems. We'll look to you for: Designing, developing, and maintaining scalable and efficient data pipelines using tools like Apache NiFi and Apache Airflow. Creating robust Python scripts for data ingestion, transformation, and validation. Managing and optimizing object storage systems such as Amazon S3, Azure Blob, or Google Cloud Storage. Collaborating with Data Scientists and Analysts to understand data requirements and deliver production-ready datasets. Implementing data quality checks, monitoring, and alerting mechanisms. Ensuring data security, governance, and compliance with industry standards. Mentoring junior engineers and promoting best practices in data engineering. All about you We value passion and attitude over experience. That's why we don't expect you to have every single skill. Instead, we've listed some that we think will help you succeed and grow in this role: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 7+ years of experience in data engineering or a similar role. Strong proficiency in Python and data processing libraries (e.g., Pandas, PySpark). Hands-on experience with Apache NiFi for data flow automation. Deep understanding of object storage systems and cloud data architectures. Proficiency in SQL and experience with both relational and NoSQL databases. Familiarity with cloud platforms (AWS, Azure, or GCP). Exposure to the Data Science ecosystem, including tools like Jupyter, scikit-learn, TensorFlow, or MLflow. Experience working in cross-functional teams with Data Scientists and ML Engineers. Cloud certifications or relevant technical certifications are a plus.
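For illustration only (this sketch is not part of the Alstom posting; the DAG, task, and function names are hypothetical), a minimal Apache Airflow 2.x pipeline in Python showing the ingest, transform, and validate steps the description mentions:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder callables standing in for real ingestion/transformation/validation logic.
def ingest():
    # e.g. pull raw files from an object store (S3 / Azure Blob / GCS) into a staging area
    print("ingesting raw data")

def transform():
    # e.g. clean and reshape the staged data with Pandas or PySpark
    print("transforming data")

def validate():
    # e.g. run row-count and schema checks before publishing the dataset
    print("validating output")

with DAG(
    dag_id="example_ingest_pipeline",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                  # Airflow 2.4+ scheduling argument
    catchup=False,
) as dag:
    t_ingest = PythonOperator(task_id="ingest", python_callable=ingest)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_validate = PythonOperator(task_id="validate", python_callable=validate)

    t_ingest >> t_transform >> t_validate
```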
Posted 13 hours ago
7.0 - 12.0 years
8 - 13 Lacs
Bengaluru
Work from Office
Your future role Take on a new challenge and apply your data engineering expertise in a cutting-edge field. You'll work alongside collaborative and innovative teammates. You'll play a key role in enabling data-driven decision-making across the organization by ensuring data availability, quality, and accessibility. Day-to-day, you'll work closely with teams across the business (e.g., Data Scientists, Analysts, and ML Engineers), mentor junior engineers, and contribute to the architecture and design of our data platforms and solutions. You'll specifically take care of designing and developing scalable data pipelines, as well as managing and optimizing object storage systems. We'll look to you for: Designing, developing, and maintaining scalable and efficient data pipelines using tools like Apache NiFi and Apache Airflow. Creating robust Python scripts for data ingestion, transformation, and validation. Managing and optimizing object storage systems such as Amazon S3, Azure Blob, or Google Cloud Storage. Collaborating with Data Scientists and Analysts to understand data requirements and deliver production-ready datasets. Implementing data quality checks, monitoring, and alerting mechanisms. Ensuring data security, governance, and compliance with industry standards. Mentoring junior engineers and promoting best practices in data engineering. All about you We value passion and attitude over experience. That's why we don't expect you to have every single skill. Instead, we've listed some that we think will help you succeed and grow in this role: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 7+ years of experience in data engineering or a similar role. Strong proficiency in Python and data processing libraries (e.g., Pandas, PySpark). Hands-on experience with Apache NiFi for data flow automation. Deep understanding of object storage systems and cloud data architectures. Proficiency in SQL and experience with both relational and NoSQL databases. Familiarity with cloud platforms (AWS, Azure, or GCP). Exposure to the Data Science ecosystem, including tools like Jupyter, scikit-learn, TensorFlow, or MLflow. Experience working in cross-functional teams with Data Scientists and ML Engineers. Cloud certifications or relevant technical certifications are a plus.
Posted 2 days ago
6.0 - 7.0 years
14 - 18 Lacs
Kochi
Work from Office
As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Total Exp: 6-7 Yrs (Relevant: 4-5 Yrs). Mandatory Skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob. Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed. Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java. Preferred technical and professional experience: You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
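For illustration only (not part of the IBM posting; the input and output paths are hypothetical), a minimal PySpark sketch of the kind of cleanse-and-integrate step described above, as it might run on Azure Databricks:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession already exists as `spark`; getOrCreate() simply reuses it.
spark = SparkSession.builder.appName("cleanse_and_integrate").getOrCreate()

# Hypothetical source path; in practice a mount point or an abfss:// URI on Azure Blob / ADLS.
raw = spark.read.json("/mnt/raw/events/")

cleansed = (
    raw.dropDuplicates(["event_id"])                     # drop duplicate events
       .withColumn("event_date", F.to_date("event_ts"))  # derive a partition column
       .filter(F.col("event_type").isNotNull())          # discard malformed rows
)

# Write partitioned Parquet for downstream analytics or model training.
cleansed.write.mode("overwrite").partitionBy("event_date").parquet("/mnt/curated/events/")
```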
Posted 2 days ago
6.0 - 7.0 years
14 - 18 Lacs
Bengaluru
Work from Office
As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Total Exp: 6-7 Yrs (Relevant: 4-5 Yrs). Mandatory Skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob. Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed. Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java. Preferred technical and professional experience: You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
Posted 2 days ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Role Description: We are looking for motivated individuals to deliver innovative digital solutions through design, development, deployment, and support. The Senior Integration Software Engineer role involves designing, developing, testing, and maintaining API & integration solutions using Microsoft Azure Integration Services. The developer will collaborate with cross-functional teams to gather requirements, optimize performance, and ensure scalability and reliability. Key skills include proficiency in Azure Integration Services, C#, .NET, RESTful APIs, relational databases, and implementing best practices for code quality and security. Key Responsibilities: Design comprehensive integration architectures, lead complex integration projects, and establish integration standards and practices. Mentor junior engineers. Design, develop, and maintain API & integration solutions using Microsoft Azure Integration Services (e.g., Azure Logic Apps, Azure Functions, Azure API Management) to facilitate communication between different systems. Collaborate with cross-functional teams to gather requirements and design integration solutions that meet business needs. Implement and manage data integration processes, ensuring data accuracy and consistency across systems. Optimize API & integration solutions for performance, scalability, and reliability. Troubleshoot and resolve integration issues in a timely manner. Design and manage relational database schemas, performance, and queries. Utilize data lake technologies (e.g., Azure Data Lake, Azure Blob Storage) to store and process large volumes of data. Document API & integration processes, configurations, and best practices. Participate in code reviews and contribute to the continuous improvement of development processes. Stay current with industry trends and emerging technologies to ensure our integration solutions remain cutting-edge. The required and preferred qualifications mirror the responsibilities listed above.
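As a hedged illustration of the Azure Functions work mentioned above (this is not the employer's actual code; the route, payload fields, and function name are assumptions), a minimal HTTP-triggered function using the Python v2 programming model:

```python
import json
import logging

import azure.functions as func

app = func.FunctionApp()

@app.route(route="orders", auth_level=func.AuthLevel.FUNCTION)  # hypothetical route
def ingest_order(req: func.HttpRequest) -> func.HttpResponse:
    """Accept an order payload and hand it off to a downstream integration."""
    try:
        payload = req.get_json()
    except ValueError:
        return func.HttpResponse("Invalid JSON body", status_code=400)

    logging.info("Received order %s", payload.get("order_id"))
    # In a real integration the message would be forwarded from here,
    # e.g. to an Azure Service Bus queue or a Logic App endpoint.
    return func.HttpResponse(
        json.dumps({"status": "accepted", "order_id": payload.get("order_id")}),
        mimetype="application/json",
        status_code=202,
    )
```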
Posted 3 days ago
3.0 - 7.0 years
0 Lacs
Kolkata, West Bengal
On-site
As an Azure Data Engineer, you will utilize your expertise and hands-on experience with Azure Data Lake, Azure Data Factory, SQL Data Warehouse, Azure Blob, and Azure Storage Explorer. Your responsibilities will include creating Data Factory pipelines for on-cloud ETL processing, performing copy activities, and engaging in custom Azure development. You should possess a deep understanding of Azure Data Catalog, Event Grid, Service Bus, SQL, and Synapse. In addition, you will demonstrate proficiency in the SQL Server BI suite, encompassing ETL processes, reporting, analytics, and dashboards using tools such as SSIS, SSAS, SSRS, and Power BI. Your role will involve designing and constructing data architectures, including scalable data lakes in full cloud or hybrid cloud environments, as well as implementing scheduled transformations and data governance practices. Ideal candidates for this position would hold certifications such as Azure Data Engineer Certification or Data Architect Certification, although these are preferred qualifications rather than strict requirements. If you are passionate about working with cutting-edge technologies and have a strong background in Azure data engineering, this role based in Kolkata offers a rewarding opportunity to contribute to the design and implementation of innovative data solutions.
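Purely as an illustration of the Azure Blob handling this role touches on (the container, blob, and file names are hypothetical), a minimal upload/download sketch with the azure-storage-blob SDK:

```python
import os

from azure.storage.blob import BlobServiceClient

# The connection string is assumed to be supplied via an environment variable.
service = BlobServiceClient.from_connection_string(os.environ["AZURE_STORAGE_CONNECTION_STRING"])
container = service.get_container_client("raw-landing")  # hypothetical container name

# Upload a local extract produced by an ETL step.
with open("daily_extract.csv", "rb") as data:
    container.upload_blob(name="2024/06/daily_extract.csv", data=data, overwrite=True)

# Download it again for verification or further processing.
blob = container.get_blob_client("2024/06/daily_extract.csv")
with open("verify_copy.csv", "wb") as out:
    out.write(blob.download_blob().readall())
```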
Posted 1 week ago
1.0 - 11.0 years
0 Lacs
Karnataka
On-site
The ideal candidate for this position should have 8-11 years of experience in DevOps Engineering with expertise in Azure Cloud Architecture, Azure DevOps CI/CD pipelines, Terraform, Azure infrastructure, and Azure security concepts. You should be proficient in writing deployment scripts, creating deployment packages, and integrating with code quality tools in CI/CD pipelines. Experience with Docker or containerized applications on Azure AKS is a plus. Knowledge of Agile methodology and good communication skills are essential for this role. Your responsibilities will include reviewing the current Azure architecture to identify and address security gaps, updating configurations using the Azure Portal or automation scripts, collaborating with customers to enhance security posture, creating and updating CI/CD pipelines, writing deployment scripts, managing resource provisioning with Terraform, and ensuring timely completion of DevOps tasks in alignment with organizational standards. As a DevOps Engineer at Infogain, you will play a crucial role in driving business outcomes for Fortune 500 companies and digital natives across various industries. Infogain is a Microsoft Gold Partner and Azure Expert Managed Services Provider (MSP), known for accelerating experience-led transformations through cutting-edge technologies like cloud, microservices, automation, IoT, and artificial intelligence. Join us in shaping the future of digital platforms and experience-led transformations. If you have the required skills and experience in DevOps Engineering, including Azure Cloud Architecture, Azure DevOps CI/CD pipelines, Terraform, and Azure security concepts, we encourage you to apply for this exciting opportunity at Infogain.
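To make the "deployment scripts" and Terraform provisioning above concrete (a hedged sketch, not Infogain's actual tooling; the working directory and variable names are assumptions), a small Python wrapper that a CI/CD stage might invoke:

```python
import subprocess
import sys

def run(cmd: list[str], cwd: str) -> None:
    """Run a shell command and fail the pipeline stage if it fails."""
    print(f"+ {' '.join(cmd)}")
    result = subprocess.run(cmd, cwd=cwd)
    if result.returncode != 0:
        sys.exit(result.returncode)

def deploy(workdir: str = "infra/azure", environment: str = "dev") -> None:
    # Initialise providers and backend, produce a saved plan, then apply exactly that plan.
    run(["terraform", "init", "-input=false"], cwd=workdir)
    run(["terraform", "plan", "-input=false", f"-var=environment={environment}", "-out=tfplan"], cwd=workdir)
    run(["terraform", "apply", "-input=false", "tfplan"], cwd=workdir)

if __name__ == "__main__":
    deploy(environment=sys.argv[1] if len(sys.argv) > 1 else "dev")
```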
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
You should have experience with data transfer tools and methods, including the ability to create and process change requests for new and existing clients. This includes expertise in GlobalScape, NDM (including Secure+), AWS S3/AWS CLI, GCP, Azure Blob, and WinSCP/PuTTY. Additionally, you should have proficiency in certificate management and application license management. Experience with server patching, maintenance, vulnerability remediation, and server monitoring is essential. Familiarity with system diagnostic tools and maintenance reports such as Rapid7 and Brinqa is required. You should also possess expertise in file server management, Active Directory, and DNS management. Extensive knowledge of AWS services is a must, including VPC, EC2, EMR, S3, Fargate, load balancers, EFX, EBS, and AWS Workspaces. You should be able to install and configure software according to organizational guidelines and plans, including system configuration and default user settings. Managing server access requests, system accounts, password management, Instance/EBS snapshots, server decommissions, and change control processes will be part of your responsibilities. You should also have experience in setting up and configuring user tools like DBeaver, Excel macro functionality, and VEDIT, and troubleshooting any related issues. This position was posted by Hymavati Sarojini from Softility.
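As a hedged sketch of the S3 side of the data-transfer work listed above (the bucket and object keys are hypothetical, not taken from the posting):

```python
import boto3

# Credentials are assumed to come from the environment, an IAM role, or ~/.aws/credentials.
s3 = boto3.client("s3")

BUCKET = "client-file-exchange"                    # hypothetical bucket name
INBOUND_KEY = "inbound/2024-06-30/statements.zip"  # hypothetical object key

# Push an outbound file to the exchange bucket.
s3.upload_file("statements.zip", BUCKET, INBOUND_KEY)

# Pull an acknowledgement file delivered by a partner.
s3.download_file(BUCKET, "outbound/ack/statements_ack.xml", "statements_ack.xml")

# List what has been dropped off under today's prefix.
response = s3.list_objects_v2(Bucket=BUCKET, Prefix="inbound/2024-06-30/")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])
```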
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
You should have a minimum of 3 years of experience and be ready to join immediately. You will be based in Hyderabad and work in the general shift. Your main responsibilities will include being a Management Migration Expert with expertise in Azure management tools and frameworks such as Azure ARM templates, Bicep, and Terraform. You will be managing resources like virtual machines and storage. Familiarity with DevOps principles and CI/CD tools like Azure DevOps or GitHub Actions will be essential to streamline deployment and integration processes. You should be proficient in Azure cost management, performance monitoring, and usage tracking to optimize resources and ensure cost-effective usage of cloud resources. Experience in implementing Azure Monitor, Application Insights, and Log Analytics will be crucial to proactively monitor performance, detect issues, and enhance resource efficiency. It would be beneficial if you have familiarity with scripting languages like Python, the Azure SDK, and JavaScript for automation and integration tasks. Your ability to analyze problem statements and propose scalable solutions for Azure cloud management and migration will be highly valued. You should have experience in developing and deploying highly available, scalable, and secure Azure solutions following industry best practices. Knowledge of advanced monitoring and diagnostic tools like Azure Monitor and Log Analytics will be necessary for continuous resource monitoring and improvement. Preferred qualifications include strong analytical and problem-solving skills, a customer-focused approach, relevant certifications such as Microsoft Certified Azure Administrator Associate or Azure Solutions Architect Expert, and excellent communication skills to provide technical support and resolve issues. Mandatory skills for this role include Azure Infra Services and Azure Monitor, while good-to-have skills encompass Azure AKS, Azure API Management, Azure Blob, and Azure Database Service. If you meet these requirements and are enthusiastic about Azure cloud management and migration, we look forward to potentially welcoming you to our team. Best regards, Poornima P, LTIMindtree
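A minimal sketch of the Azure SDK automation this role alludes to (an assumption for illustration, not LTIMindtree's tooling): enumerating the virtual machines in a subscription, a common starting point for usage tracking and cost reviews:

```python
import os

from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

# The subscription ID is assumed to be supplied via an environment variable.
subscription_id = os.environ["AZURE_SUBSCRIPTION_ID"]

# DefaultAzureCredential works with managed identity, environment variables, or an Azure CLI login.
credential = DefaultAzureCredential()
compute = ComputeManagementClient(credential, subscription_id)

# List every VM in the subscription with its region and size, as raw material
# for tagging audits, right-sizing, and cost reports.
for vm in compute.virtual_machines.list_all():
    print(vm.name, vm.location, vm.hardware_profile.vm_size)
```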
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Cloud Platform Engineer, you will play a crucial role in developing and maintaining Terraform modules and patterns for AWS and Azure. Your responsibilities will include creating platform landing zones, application landing zones, and deploying application infrastructure. Managing the lifecycle of these patterns will be a key aspect of your role, encompassing tasks such as releases, bug fixes, feature integrations, and updating test cases. You will be responsible for developing and releasing Terraform modules, landing zones, and patterns for both AWS and Azure platforms. Providing ongoing support for these patterns, including bug fixing and maintenance, will be essential. Additionally, you will need to integrate new features into existing patterns to enhance their functionality and ensure that updated and new patterns meet the current requirements. Updating and maintaining test cases for patterns will also be part of your responsibilities to guarantee reliability and performance. To qualify for this role, you should have at least 5 years of experience in AWS and Azure cloud migration. Proficiency in cloud compute (such as EC2, EKS, Azure VM, AKS) and storage (like S3, EBS, EFS, Azure Blob, Azure Managed Disks, Azure Files) is required. A strong knowledge of AWS and Azure cloud services, along with expertise in Terraform, is essential. Possessing AWS or Azure certification would be advantageous for this position. Key Qualifications: - 5+ years of AWS/Azure cloud migration experience - Proficiency in cloud compute and storage - Strong knowledge of AWS and Azure cloud services - Expertise in Terraform - AWS/Azure certification preferred Mandatory Skills: Cloud AWS DevOps (Minimum 5 Years of Migration Experience) Relevant Experience: 5-8 Years This is a Full-time, Permanent, or Contractual / Temporary job with a contract length of 12 months. Benefits: - Health insurance - Provident Fund Schedule: - Day shift, Monday to Friday, Morning shift Additional Information: - Performance bonus - Yearly bonus.
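Since the responsibilities above include keeping test cases for Terraform patterns up to date, here is a hedged sketch (the modules/ directory layout is an assumption) of a pytest check that validates every module with the Terraform CLI:

```python
import pathlib
import subprocess

import pytest

# Assumed layout: one subdirectory per reusable Terraform module/pattern.
MODULE_DIRS = sorted(p for p in pathlib.Path("modules").iterdir() if p.is_dir())

@pytest.mark.parametrize("module_dir", MODULE_DIRS, ids=lambda p: p.name)
def test_module_is_valid(module_dir):
    """Each module must pass `terraform init` (without a backend) and `terraform validate`."""
    subprocess.run(
        ["terraform", "init", "-backend=false", "-input=false"],
        cwd=module_dir, check=True, capture_output=True,
    )
    result = subprocess.run(
        ["terraform", "validate", "-no-color"],
        cwd=module_dir, capture_output=True, text=True,
    )
    assert result.returncode == 0, result.stderr
```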
Posted 3 weeks ago
5.0 - 9.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Educational Requirements: MBA, MSc, MTech, Bachelor of Science (Tech), Bachelor of Engineering, Bachelor of Technology (Integrated). Service Line: Enterprise Package Application Services. Responsibilities: Over 7+ years of IT experience, which includes 5+ years of extensive experience as a React JS Developer and 5 years of experience as a UI/UX Developer/API Developer. Extensive experience in developing web pages and single-page apps using HTML/HTML5, DHTML, CSS3, JavaScript, React JS 16+, Redux, Node.js, Express.js, and JEST. Experienced in MERN stack development: MongoDB, Express.js, Node, and ReactJS. Experience in all phases of the SDLC, such as requirement analysis, implementation, and maintenance, and extensive experience with Agile and SCRUM. Extensive knowledge in developing single-page applications (SPAs). Working knowledge of web protocols and standards (REST, SSO, etc.). Good expertise in development and debugging tools such as VS Code, git, npm, and Chrome developer tools. Familiar with creating custom reusable React component libraries. Involved in writing application-level code to interact with APIs and RESTful web services using AJAX and JSON. Knowledge of utilizing cloud technologies including Amazon Web Services (AWS), Microsoft Azure Blob, and Pivotal Cloud Foundry (PCF). Expertise in RESTful services to integrate between applications. Experience with front-end development with back-end system integration. Proficient in using the JEST framework for unit testing. Good experience in bug tracking tools like JIRA and HP Quality Center. Ability to work effectively both as a team member and individually. Excellent communication and interpersonal skills; well organized and goal oriented. Additional Responsibilities: What's in it for you? We are not just a technology company full of people, we're a people company full of technology. It is people like you who make us what we are today. Welcome to our world: our people, our culture, our voices, and our passions. What's better than building the next big thing? It's doing so while never letting go of the little things that matter. None of the amazing things we do at Infosys would be possible without an equally amazing culture, the environment in which to do them, one where ideas can flourish, and where you are empowered to move forward as far as your ideas will take you. This is something we achieve through cultivating a culture of inclusiveness and openness, and a mindset of exploration and applied innovation. A career at Infosys means experiencing and contributing to this environment every day. It means being a part of a dynamic culture where we are united by a common purpose: to navigate further, together. EOE/Minority/Female/Veteran/Disabled/Sexual Orientation/Gender Identity/National Origin. At Infosys, we recognize that everyone has individual requirements. If you are a person with a disability, illness, or injury and require adjustments to the recruitment and selection process, please contact our Recruitment team for adjustments only on Infosys_ta@infosys.com, or include your preferred method of communication in the email and someone will be in touch. Please note, in order to protect the interest of all parties involved in the recruitment process, Infosys does not accept any unsolicited resumes from third-party vendors. In the absence of a signed agreement, any submission will be deemed non-binding, and Infosys explicitly reserves the right to pursue and hire the submitted profile. All recruitment activity must be coordinated through the Talent Acquisition department.
Technical and Professional Requirements: Participate in the estimation of work products in order to provide the right information to the TL/PM for overall project estimation. Understand the requirements, both functional and non-functional, by going through the specifications and with inputs from business analysts; participate in creating high-level estimates as well as translating them into systems requirements in order to create a systems requirements document; and participate effectively in the design, development, and testing phases of the project. Develop and review artifacts (code, documentation, unit test scripts), conduct reviews for self and peers, and conduct unit tests and document unit test results for complex programs in order to build the application and make it ready for validation/delivery. Preferred Skills: Technology: Reactive Programming, React JS
Posted 4 weeks ago
3.0 - 6.0 years
14 - 18 Lacs
Kochi
Work from Office
As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Total Exp: 6-7 Yrs (Relevant: 4-5 Yrs). Mandatory Skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob. Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed. Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java. Preferred technical and professional experience: You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
Posted 4 weeks ago
3.0 - 6.0 years
14 - 18 Lacs
Bengaluru
Work from Office
As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Total Exp: 3-6 Yrs (Relevant: 4-5 Yrs). Mandatory Skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob. Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed. Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java. Preferred technical and professional experience: You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
Posted 4 weeks ago
4.0 - 9.0 years
15 - 30 Lacs
Bengaluru
Work from Office
Data Engineer - Python. We are looking for a Data Engineer with experience in building data pipelines and implementing AI/ML solutions in Azure. This role involves integrating structured and unstructured data sources for efficient retrieval and processing in support of OpenAI-based RAG pipelines. Responsibilities: Design and implement data pipelines using Azure Data Factory or equivalent. Set up and manage SQL databases and integrate with Azure AI Search. Prepare and store embeddings for RAG using Azure Vector Search. Ensure data quality, versioning, and security with Azure Blob, Key Vault, and Monitoring. Collaborate with prompt engineers and backend teams to optimize data flow. Collaborate with various stakeholders to determine software requirements. Design and develop logical flows for each business requirement. Prepare technical documentation for each feature and guide/coach the junior developers during the implementation phase. Work closely with the other members of the backend and frontend teams to integrate different components into the applications. Research and implement new technologies for the product. Troubleshoot and resolve issues with coding or design. Test the final product to ensure it is completely functional and meets requirements. Requirements: SQL, Azure Data Factory, Azure Blob, Azure Key Vault. Experience with vector stores and embeddings. Familiarity with Azure OpenAI and AI Search. Data modeling and performance tuning. REST API integration and scripting. Experience with building software using Python. Experience with building applications with microservices and serverless architecture. 4+ years of experience in software development roles. Good communication skills. Technical Skills: Language/Framework: Python, Azure Data Factory, OneLake, Azure Blob, Azure Key Vault. Database: Postgres, SQL Server. OS: Unix/Linux/Windows/Serverless. Others (good to have): Machine learning model building, NLP (NLTK, spaCy, NumPy, etc.). Location: Bangalore
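As a hedged sketch of the embedding-preparation step described above (the endpoint, deployment name, and documents are assumptions, not the team's actual pipeline), using the openai package's Azure client:

```python
import os

from openai import AzureOpenAI

# Endpoint, key, and deployment name are assumed to be supplied via environment variables or config.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

documents = [
    "Refund policy: customers may return items within 30 days.",
    "Shipping: standard delivery takes 3-5 business days.",
]

# Generate one embedding per chunk; in a RAG pipeline these vectors would then be
# written to a vector index (e.g. Azure AI Search) for retrieval at query time.
response = client.embeddings.create(
    model="text-embedding-ada-002",  # hypothetical embedding deployment name
    input=documents,
)

vectors = [item.embedding for item in response.data]
print(len(vectors), "embeddings of dimension", len(vectors[0]))
```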
Posted 4 weeks ago
7.0 - 11.0 years
0 Lacs
Kolkata, West Bengal
On-site
Wipro Limited is a leading technology services and consulting company dedicated to creating innovative solutions for clients' most complex digital transformation needs. With a vast portfolio of capabilities in consulting, design, engineering, and operations, Wipro aims to help clients achieve their boldest ambitions and establish future-ready, sustainable businesses. As a company with over 230,000 employees and business partners spanning 65 countries, Wipro is committed to supporting customers, colleagues, and communities in navigating an ever-changing world. Role Purpose: The primary objective of this role is to facilitate process delivery by ensuring the daily performance of Production Specialists, addressing technical escalations, and enhancing the technical capabilities of the Production Specialists. Key Requirements: - 7-10 years of software development experience - Proficiency in .NET Core 5.0 or above (Web, API) - Proficiency in Azure services (serverless computing, Azure Functions, Azure Durable Functions, Azure Storage, Azure Service Bus, Azure Blob, Azure Table storage, Azure APIM) - Strong object-oriented programming (OOPS) design skills and software design patterns proficiency - Strong knowledge of SQL Server - Experience in microservices architecture-based development - Good communication skills Responsibilities: - Handle technical escalations by diagnosing and troubleshooting client queries effectively - Manage and resolve technical roadblocks/escalations within SLA and quality requirements - Escalate unresolved issues to TA & SES when necessary - Provide product support and resolutions to clients through guided step-by-step solutions - Troubleshoot client queries professionally and courteously - Offer alternative solutions to retain customers' business - Communicate effectively with listeners and situations - Conduct triage-based trainings to bridge skill gaps and enhance technical knowledge of Production Specialists - Stay current with product features through relevant trainings - Identify common problems, recommend resolutions, and document findings - Continuously update job knowledge through self-learning opportunities and networks maintenance Performance Parameters: 1. Process: No. of cases resolved per day, compliance with process and quality standards, meeting SLAs, Pulse score, customer feedback 2. Team Management: Productivity, efficiency, absenteeism 3. Capability Development: Triages completed, Technical Test performance Join Wipro in reinventing your world and be a part of a company that encourages constant evolution and personal reinvention. Applications from individuals with disabilities are warmly welcomed.
Posted 1 month ago
12.0 - 16.0 years
0 Lacs
Hyderabad, Telangana
On-site
You are an experienced Snowflake Architect with over 12 years of experience in data warehousing, cloud architecture, and Snowflake implementations. Your expertise lies in designing, optimizing, and managing large-scale Snowflake data platforms to ensure scalability, performance, and security. You are expected to possess deep technical knowledge of Snowflake, cloud ecosystems, and data engineering best practices. Your key responsibilities will include leading the design and implementation of Snowflake data warehouses, data lakes, and data marts. You will define best practices for Snowflake schema design, clustering, partitioning, and optimization. Additionally, you will architect multi-cloud Snowflake deployments with seamless integration and design data sharing, replication, and failover strategies for high availability. You will be responsible for optimizing query performance using Snowflake features, implementing automated scaling strategies for dynamic workloads, and troubleshooting performance bottlenecks in large-scale Snowflake environments. You will architect ETL/ELT pipelines using Snowflake, Coalesce, and other tools, integrate Snowflake with BI tools, ML platforms, and APIs, and implement CDC, streaming, and batch processing solutions. In terms of security, governance, and compliance, you will define RBAC, data masking, row-level security, and encryption policies in Snowflake. You will ensure compliance with GDPR, CCPA, HIPAA, and SOC2 regulations and establish data lineage, cataloging, and auditing using Snowflake's governance features. As a leader, you will mentor data engineers, analysts, and developers on Snowflake best practices, collaborate with C-level executives to align Snowflake strategy with business goals, and evaluate emerging trends for innovation. Your required skills and qualifications include over 12 years of experience in data warehousing, cloud architecture, and database technologies, 8+ years of hands-on Snowflake architecture and administration experience, and expertise in SQL and Python for data processing. Deep knowledge of Snowflake features, experience with cloud platforms, and strong understanding of data modeling are also essential. Certification as a Snowflake Advanced Architect is a must. Preferred skills include knowledge of DataOps, MLOps, and CI/CD pipelines, as well as familiarity with DBT, Airflow, SSIS, and IICS.
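To illustrate the RBAC and masking responsibilities mentioned above (a sketch only; the role, schema, and policy names are hypothetical, and masking policies require a Snowflake edition that supports them), issuing the governance DDL through the Snowflake Python connector:

```python
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    role="SECURITYADMIN",
)

statements = [
    # Role-based access: analysts get read-only access to the curated schema.
    "CREATE ROLE IF NOT EXISTS ANALYST_RO",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST_RO",
    "GRANT USAGE ON SCHEMA ANALYTICS.CURATED TO ROLE ANALYST_RO",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.CURATED TO ROLE ANALYST_RO",
    # Column-level masking: hide email addresses from non-privileged roles.
    """CREATE MASKING POLICY IF NOT EXISTS ANALYTICS.CURATED.MASK_EMAIL AS (val STRING)
       RETURNS STRING ->
       CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val ELSE '***MASKED***' END""",
    "ALTER TABLE ANALYTICS.CURATED.CUSTOMERS MODIFY COLUMN EMAIL "
    "SET MASKING POLICY ANALYTICS.CURATED.MASK_EMAIL",
]

cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```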
Posted 1 month ago
6.0 - 7.0 years
14 - 18 Lacs
Bengaluru
Work from Office
As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Total Exp: 6-7 Yrs (Relevant: 4-5 Yrs). Mandatory Skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob. Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed. Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java. Preferred technical and professional experience: You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
Posted 1 month ago
7.0 - 12.0 years
8 - 13 Lacs
Bengaluru
Work from Office
Date 25 Jun 2025 Location: Bangalore, KA, IN Company Alstom At Alstom, we understand transport networks and what moves people. From high-speed trains, metros, monorails, and trams, to turnkey systems, services, infrastructure, signalling, and digital mobility, we offer our diverse customers the broadest portfolio in the industry. Every day, 80,000 colleagues lead the way to greener and smarter mobility worldwide, connecting cities as we reduce carbon and replace cars. Your future role Take on a new challenge and apply your data engineering expertise in a cutting-edge field. You'll work alongside collaborative and innovative teammates. You'll play a key role in enabling data-driven decision-making across the organization by ensuring data availability, quality, and accessibility. Day-to-day, you'll work closely with teams across the business (e.g., Data Scientists, Analysts, and ML Engineers), mentor junior engineers, and contribute to the architecture and design of our data platforms and solutions. You'll specifically take care of designing and developing scalable data pipelines, as well as managing and optimizing object storage systems. We'll look to you for: Designing, developing, and maintaining scalable and efficient data pipelines using tools like Apache NiFi and Apache Airflow. Creating robust Python scripts for data ingestion, transformation, and validation. Managing and optimizing object storage systems such as Amazon S3, Azure Blob, or Google Cloud Storage. Collaborating with Data Scientists and Analysts to understand data requirements and deliver production-ready datasets. Implementing data quality checks, monitoring, and alerting mechanisms. Ensuring data security, governance, and compliance with industry standards. Mentoring junior engineers and promoting best practices in data engineering. All about you We value passion and attitude over experience. That's why we don't expect you to have every single skill. Instead, we've listed some that we think will help you succeed and grow in this role: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 7+ years of experience in data engineering or a similar role. Strong proficiency in Python and data processing libraries (e.g., Pandas, PySpark). Hands-on experience with Apache NiFi for data flow automation. Deep understanding of object storage systems and cloud data architectures. Proficiency in SQL and experience with both relational and NoSQL databases. Familiarity with cloud platforms (AWS, Azure, or GCP). Exposure to the Data Science ecosystem, including tools like Jupyter, scikit-learn, TensorFlow, or MLflow. Experience working in cross-functional teams with Data Scientists and ML Engineers. Cloud certifications or relevant technical certifications are a plus. Things you'll enjoy Join us on a life-long transformative journey: the rail industry is here to stay, so you can grow and develop new skills and experiences throughout your career. You'll also: Enjoy stability, challenges, and a long-term career free from boring daily routines. Work with advanced data and cloud technologies to drive innovation. Collaborate with cross-functional teams and helpful colleagues. Contribute to innovative projects that have a global impact. Utilise our flexible and hybrid working environment. Steer your career in whatever direction you choose across functions and countries. Benefit from our investment in your development, through award-winning learning programs. Progress towards leadership roles or specialized technical paths.
Benefit from a fair and dynamic reward package that recognises your performance and potential, plus comprehensive and competitive social coverage (life, medical, pension). You don't need to be a train enthusiast to thrive with us. We guarantee that when you step onto one of our trains with your friends or family, you'll be proud. If you're up for the challenge, we'd love to hear from you! Important to note As a global business, we're an equal-opportunity employer that celebrates diversity across the 63 countries we operate in. We're committed to creating an inclusive workplace for everyone.
Posted 1 month ago
5.0 - 8.0 years
10 - 14 Lacs
Chennai
Work from Office
Mandatory Skills: Terraform modules, DevOps, AWS and Azure. Years of experience needed: Minimum of 8 years. Work Location: Chennai. Cloud Platform Engineer, Chennai, JC-75156/75158, Band B3, No. of positions: 3. Position Overview: The Cloud Platform Engineer will be responsible for developing and maintaining Terraform modules and patterns for AWS and Azure. These modules and patterns will be used for platform landing zones, application landing zones, and application infrastructure deployments. The role involves managing the lifecycle of these patterns, including releases, bug fixes, feature integrations, and updates to test cases. Key Responsibilities: Develop and release Terraform modules, landing zones, and patterns for AWS and Azure. Provide lifecycle support for patterns, including bug fixing and maintenance. Integrate new features into existing patterns to enhance functionality. Release updated and new patterns to ensure they meet current requirements. Update and maintain test cases for patterns to ensure reliability and performance. Qualifications: 5+ years of AWS/Azure cloud migration experience. Proficiency in cloud compute (EC2, EKS, Azure VM, AKS) and storage (S3, EBS, EFS, Azure Blob, Azure Managed Disks, Azure Files). Strong knowledge of AWS and Azure cloud services. Expert in Terraform. AWS/Azure certification preferred. Provide customer support/service on the DevOps tools. Provide timely support for internal and external customer escalations on multiple platforms. Troubleshoot the various problems that arise in the implementation of DevOps tools across the project/module. Perform root cause analysis of major incidents/critical issues which may hamper project timeliness, quality, or cost. Develop alternate plans/solutions to be implemented as per the root cause analysis of critical problems. Follow the escalation matrix/process as soon as a resolution gets complicated or isn't resolved. Provide knowledge transfer, share best practices with the team, and motivate them. Team Management - Resourcing: Forecast talent requirements as per the current and future business needs. Hire adequate and right resources for the team. Train direct reportees to make right recruitment and selection decisions. Talent Management: Ensure 100% compliance with Wipro's standards of adequate onboarding and training for team members to enhance capability and effectiveness. Build an internal talent pool of HiPos and ensure their career progression within the organization. Promote diversity in leadership positions. Performance Management: Set goals for direct reportees, conduct timely performance reviews and appraisals, and give constructive feedback to direct reports. In case of performance issues, take necessary action with zero tolerance for will-based performance issues. Ensure that organizational programs like Performance Nxt are well understood and that the team is taking the opportunities presented by such programs to their level and the levels below. Employee Satisfaction and Engagement: Lead and drive engagement initiatives for the team. Track team satisfaction scores and identify initiatives to build engagement within the team. Proactively challenge the team with larger and enriching projects/initiatives for the organization or team. Exercise employee recognition and appreciation. Deliver (Performance Parameter and Measure): 1. Continuous Integration, Deployment & Monitoring: 100% error-free onboarding and implementation. 2. CSAT: Manage service tools, troubleshoot queries, customer experience. 3. Capability Building & Team Management: % trained on new-age skills, team attrition %, employee satisfaction score. Mandatory Skills: Cloud AWS DevOps. Experience: 5-8 Years.
Posted 1 month ago
5.0 - 8.0 years
10 - 14 Lacs
Chennai
Work from Office
Mandatory Skills: Terraform modules, DevOps, AWS and Azure. Years of experience needed: Minimum of 8 years. Work Location: Chennai. Rates including mark-up: 170 K/M. Cloud Platform Engineer, Chennai, Band B3, No. of positions: 3. Position Overview: The Cloud Platform Engineer will be responsible for developing and maintaining Terraform modules and patterns for AWS and Azure. These modules and patterns will be used for platform landing zones, application landing zones, and application infrastructure deployments. The role involves managing the lifecycle of these patterns, including releases, bug fixes, feature integrations, and updates to test cases. Key Responsibilities: Develop and release Terraform modules, landing zones, and patterns for AWS and Azure. Provide lifecycle support for patterns, including bug fixing and maintenance. Integrate new features into existing patterns to enhance functionality. Release updated and new patterns to ensure they meet current requirements. Update and maintain test cases for patterns to ensure reliability and performance. Qualifications: 5+ years of AWS/Azure cloud migration experience. Proficiency in cloud compute (EC2, EKS, Azure VM, AKS) and storage (S3, EBS, EFS, Azure Blob, Azure Managed Disks, Azure Files). Strong knowledge of AWS and Azure cloud services. Expert in Terraform. AWS/Azure certification preferred. Team Management - Resourcing: Forecast talent requirements as per the current and future business needs. Hire adequate and right resources for the team. Train direct reportees to make right recruitment and selection decisions. Talent Management: Ensure 100% compliance with Wipro's standards of adequate onboarding and training for team members to enhance capability and effectiveness. Build an internal talent pool of HiPos and ensure their career progression within the organization. Promote diversity in leadership positions. Performance Management: Set goals for direct reportees, conduct timely performance reviews and appraisals, and give constructive feedback to direct reports. In case of performance issues, take necessary action with zero tolerance for will-based performance issues. Ensure that organizational programs like Performance Nxt are well understood and that the team is taking the opportunities presented by such programs to their level and the levels below. Employee Satisfaction and Engagement: Lead and drive engagement initiatives for the team. Track team satisfaction scores and identify initiatives to build engagement within the team. Proactively challenge the team with larger and enriching projects/initiatives for the organization or team. Exercise employee recognition and appreciation. Deliver (Performance Parameter and Measure): 1. Continuous Integration, Deployment & Monitoring: 100% error-free onboarding and implementation. 2. CSAT: Manage service tools, troubleshoot queries, customer experience. 3. Capability Building & Team Management: % trained on new-age skills, team attrition %, employee satisfaction score. Mandatory Skills: Cloud AWS DevOps. Experience: 5-8 Years.
Posted 1 month ago
8.0 - 13.0 years
20 - 35 Lacs
Chennai
Work from Office
Warm greetings from SP Staffing Services Pvt Ltd! Experience: 8-15 yrs. Work Location: Chennai. Job Description: Required Technical Skill Set: Azure-native technology, Synapse and Databricks, Python. Desired Experience Range: 8+ years. Location of Requirement: Chennai. Required Skills: Previous experience as a data engineer or in a similar role. Must have experience with MS Azure services such as Data Lake Storage, Data Factory, Databricks, Azure SQL Database, Azure Synapse Analytics, and Azure Functions. Technical expertise with data models, data mining, analytics, and segmentation techniques. Knowledge of programming languages and environments such as Python, Java, Scala, R, .NET/C#. Hands-on experience with SQL database design. Great numerical and analytical skills. Degree in Computer Science, IT, or a similar field; a master's is a plus. Experience working in integrating Azure PaaS services. Interested candidates, kindly share your updated resume with ramya.r@spstaffing.in or contact 8667784354 (WhatsApp: 9597467601) to proceed further.
Posted 1 month ago
6.0 - 11.0 years
3 - 6 Lacs
Noida
Work from Office
We are looking for a skilled Snowflake Ingress/Egress Specialist with 6 to 12 years of experience to manage and optimize data flow into and out of our Snowflake data platform. This role involves implementing secure, scalable, and high-performance data pipelines, ensuring seamless integration with upstream and downstream systems, and maintaining compliance with data governance policies. Roles and Responsibilities: Design, implement, and monitor data ingress and egress pipelines in and out of Snowflake. Develop and maintain ETL/ELT processes using tools like Snowpipe, Streams, Tasks, and external stages (S3, Azure Blob, GCS). Optimize data load and unload processes for performance, cost, and reliability. Coordinate with data engineering and business teams to support data movement for analytics, reporting, and external integrations. Ensure data security and compliance by managing encryption, masking, and access controls during data transfers. Monitor data movement activities using Snowflake Resource Monitors and Query History. Job Requirements: Bachelor's degree in Computer Science, Information Systems, or a related field. 6-12 years of experience in data engineering, cloud architecture, or Snowflake administration. Hands-on experience with Snowflake features such as Snowpipe, Streams, Tasks, External Tables, and Secure Data Sharing. Proficiency in SQL, Python, and data movement tools (e.g., AWS CLI, Azure Data Factory, Google Cloud Storage Transfer). Experience with data pipeline orchestration tools such as Apache Airflow, dbt, or Informatica. Strong understanding of cloud storage services (S3, Azure Blob, GCS) and working with external stages. Familiarity with network security, encryption, and data compliance best practices. Snowflake certification (SnowPro Core or Advanced) is preferred. Experience with real-time streaming data (Kafka, Kinesis) is desirable. Knowledge of DevOps tools (Terraform, CI/CD pipelines) is a plus. Strong communication and documentation skills are essential.
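A hedged sketch of the bulk-load path described above (the stage, table, warehouse, and file-format settings are hypothetical), issuing a COPY INTO from an external stage through the Snowflake Python connector:

```python
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="LOAD_WH",   # hypothetical warehouse
    database="RAW",
    schema="LANDING",
)

cur = conn.cursor()
try:
    # The external stage @S3_LANDING is assumed to already point at an S3/Blob/GCS location.
    cur.execute("""
        COPY INTO RAW.LANDING.ORDERS
        FROM @S3_LANDING/orders/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    # Each result row summarises one loaded file (name, status, rows parsed/loaded, first error).
    for row in cur.fetchall():
        print(row)
finally:
    cur.close()
    conn.close()
```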
Posted 1 month ago
4.0 - 9.0 years
6 - 11 Lacs
Vadodara
Work from Office
Cloud Consultant | Document Management Software | Docsvault. Summary: We are looking for an Azure Consultant with relevant experience of 4+ years in Microsoft Azure to design and implement the cloud architecture for our new cloud application. Responsibilities and Duties: Design, consult, and advise on state-of-the-art technical solutions on Azure that address our Angular/.NET Core/MySQL application requirements for scalability, reliability, security, and performance. Take complete ownership of the design, deployment, implementation, security, and maintenance plans of our application on Microsoft Azure. Evaluate stakeholder requirements; develop, communicate, and present solutions to the dev team and management. Support our teams in driving documentation, written requirements, and strategic direction to transfer knowledge and responsibility of our cloud infrastructure. Recommend client value creation initiatives and implement industry best practices. Provide valuable contributions and adaptation to post-implementation support (long term). Demonstrate a hands-on ability to deliver appropriate technical solutions within project and program time frames. Desired Candidate Profile: Significant experience in solution design, architecture, and hands-on delivery within the Azure cloud and Azure DevOps environment (advanced Azure knowledge). Extensive experience in relevant hosting solutions like Azure Functions, Azure DB for MySQL, Azure Blobs, modern network design, and a senior technical support role. An ongoing willingness to learn, upskill in cutting-edge technologies, train, coach, and mentor.
Posted 1 month ago
5.0 - 10.0 years
15 - 30 Lacs
Hyderabad, Pune, Delhi / NCR
Work from Office
Job Description: You should be an experienced professional with a data engineering background, able to work without much guidance. Design, develop, and maintain scalable ETL pipelines using Azure services to process, transform, and load large datasets into AWS Datalake or other data stores. Collaborate with cross-functional teams, including data architects, analysts, and business stakeholders, to gather data requirements and deliver efficient data solutions. Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure. Work together with data scientists and analysts to understand the needs for data and create effective data workflows. Create and maintain data storage solutions including Azure SQL Database, Azure Data Lake, and Azure Blob Storage. Create and maintain ETL (Extract, Transform, Load) operations using Azure Data Factory or comparable technologies. Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data. Improve the scalability, efficiency, and cost-effectiveness of data pipelines. Monitor and resolve data pipeline problems to guarantee consistency and availability of the data. Key Skill Sets Required: Experience in designing and hands-on development of cloud-based analytics solutions. Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required. Designing and building data pipelines using API ingestion and streaming ingestion methods. Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is desirable. Strong experience in common data warehouse modelling principles, including Kimball and Inmon. Knowledge of Azure Databricks, Azure IoT, Azure HDInsight + Spark, Azure Stream Analytics, and Power BI is desirable. Working knowledge of Python is desirable. Experience developing security models.
Posted 1 month ago
6.0 - 7.0 years
14 - 18 Lacs
Pune
Work from Office
As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Total Exp: 6-7 Yrs (Relevant: 4-5 Yrs). Mandatory Skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob. Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed. Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java. Preferred technical and professional experience: You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
Posted 1 month ago