8.0 - 13.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Project Role: BI Engineer
Project Role Description: Develop, migrate, deploy, and maintain data-driven insights and knowledge engine interfaces that drive adoption and decision making. Integrate security and data privacy protection.
Must-have skills: SAS Analytics
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education
Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.
Roles & Responsibilities:
1. Lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Bring deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines (see the illustrative sketch after this posting).
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize the performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud Platform as required.
9. Provide guidance and support for testing, validation, and production deployment.
Professional & Technical Skills:
- Must-have skills: proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity
Additional Information:
- The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering.
Qualification: 15 years of full-time education
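The migration work described above typically amounts to re-expressing SAS DATA steps and PROC summaries as Python transformations. A minimal sketch of that idea, assuming pandas and entirely hypothetical file, column, and status names (none of these come from the posting):

import pandas as pd

def prepare_customer_feed(input_path: str, output_path: str) -> pd.DataFrame:
    """Rough pandas equivalent of a SAS DATA step (filter + derived column) followed by a PROC SUMMARY-style aggregation."""
    df = pd.read_csv(input_path, parse_dates=["txn_date"])
    # WHERE-clause equivalent: keep only settled transactions
    df = df[df["status"] == "SETTLED"]
    # DATA-step assignment equivalent: derive a new column
    df["net_amount"] = df["gross_amount"] - df["fees"]
    # PROC SUMMARY / PROC MEANS equivalent: aggregate per customer
    summary = (
        df.groupby("customer_id", as_index=False)
          .agg(total_net=("net_amount", "sum"), txn_count=("net_amount", "size"))
    )
    summary.to_csv(output_path, index=False)
    return summary

In a real migration the read and write endpoints would be whatever the legacy feed uses (databases, GCS buckets, and so on), and the converted job would be validated row for row against the SAS output before cutover.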
Posted 2 hours ago
8.0 - 13.0 years
14 - 19 Lacs
Coimbatore
Work from Office
Project Role: BI Architect
Project Role Description: Build and design scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. Create industry and function data models used to build reports and dashboards. Ensure the architecture and interface seamlessly integrate with Accenture's Data and AI framework, meeting client needs.
Must-have skills: SAS Base & Macros
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education
Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.
Roles & Responsibilities:
1. Lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Bring deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize the performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud Platform as required.
9. Provide guidance and support for testing, validation, and production deployment.
Professional & Technical Skills:
- Must-have skills: proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity
Additional Information:
- The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering.
Qualification: 15 years of full-time education
Posted 2 hours ago
8.0 - 13.0 years
14 - 19 Lacs
Bengaluru
Work from Office
Project Role: BI Architect
Project Role Description: Build and design scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. Create industry and function data models used to build reports and dashboards. Ensure the architecture and interface seamlessly integrate with Accenture's Data and AI framework, meeting client needs.
Must-have skills: SAS Base & Macros
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education
Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.
Roles & Responsibilities:
1. Lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Bring deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize the performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud Platform as required.
9. Provide guidance and support for testing, validation, and production deployment.
Professional & Technical Skills:
- Must-have skills: proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity
Additional Information:
- The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering.
Qualification: 15 years of full-time education
Posted 2 hours ago
3.0 - 8.0 years
5 - 9 Lacs
Mumbai
Work from Office
3+ years of experience building software solutions using Python.
Strong Python fundamentals, including data layout, generators, decorators, file I/O, dynamic programming, and algorithms (a brief illustration follows this posting).
Working knowledge of the Python standard library and packages such as an ORM library, NumPy, SciPy, matplotlib, and mlab.
Knowledge of fundamental design principles for building scalable applications.
Knowledge of Python web frameworks.
Working knowledge of core Java is an added plus.
Knowledge of web technologies (HTTP, JavaScript) is an added plus.
A financial background is an added plus.
Technical capabilities in big data analytics are also an added plus.
Salary Package: as per industry standard.
Preferred Programs: BE or BTech or an equivalent degree with a strong mathematics and statistics foundation (for example, a B.Sc. or M.Sc. in Mathematics & Computer Science).
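Two of the fundamentals listed above, generators and decorators, compose naturally for streaming file processing. A minimal, self-contained sketch; all function and file names are invented for illustration:

import functools
import time

def timed(func):
    """Decorator that reports how long the wrapped function takes."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        print(f"{func.__name__} took {time.perf_counter() - start:.4f}s")
        return result
    return wrapper

def read_records(path):
    """Generator: yields stripped, non-empty lines without loading the whole file into memory."""
    with open(path, encoding="utf-8") as handle:
        for line in handle:
            line = line.strip()
            if line:
                yield line

@timed
def count_records(path):
    # Consumes the generator lazily, one line at a time
    return sum(1 for _ in read_records(path))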
Posted 1 day ago
6.0 - 8.0 years
15 - 20 Lacs
Mumbai, Chennai, Bengaluru
Work from Office
Strong Python programming skills with expertise in pandas, lxml, ElementTree, file I/O operations, smtplib, and the logging library. Basic understanding of XML structures and the ability to extract key parent and child XML tag elements from an XML document (a minimal example follows this posting).
Required Candidate Profile: Java Spring Boot API microservices (8+ years of experience), SQL (5+ years), and Azure (3+ years).
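Extracting parent and child tags as described above takes only a few lines with the standard-library ElementTree module, and pandas can then tabulate the results. A minimal sketch, assuming a hypothetical XML layout (the actual schema is not given in the posting):

import xml.etree.ElementTree as ET
import pandas as pd

SAMPLE_XML = """
<orders>
  <order id="1001"><item>Widget</item><qty>3</qty></order>
  <order id="1002"><item>Gadget</item><qty>7</qty></order>
</orders>
"""

root = ET.fromstring(SAMPLE_XML)
rows = []
for order in root.findall("order"):          # parent elements
    rows.append({
        "order_id": order.get("id"),         # attribute on the parent tag
        "item": order.findtext("item"),      # child tag text
        "qty": int(order.findtext("qty")),   # child tag text, cast to int
    })
df = pd.DataFrame(rows)
print(df)

lxml exposes a largely compatible API (lxml.etree) plus XPath support for larger or messier documents.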
Posted 6 days ago
6.0 - 8.0 years
15 - 22 Lacs
Mumbai
Work from Office
Strong Python programming skills with expertise in pandas, lxml, ElementTree, file I/O operations, smtplib, and the logging library. Basic understanding of XML structures and the ability to extract key parent and child XML tag elements from an XML document.
Required Candidate Profile: Java Spring Boot API microservices (8+ years of experience), SQL (5+ years), and Azure (3+ years).
Posted 6 days ago
6.0 - 10.0 years
8 - 18 Lacs
Bengaluru
Work from Office
Team: Development - Alpha Data
Position Overview: We are seeking an experienced Python developer to join our Alpha Data team, responsible for delivering a vast quantity of data served to users worldwide. You will be a cornerstone of a growing Data team, becoming a technical subject matter expert and developing strong working relationships with quant researchers, traders, and fellow colleagues across our Technology organisation. Alpha Data teams deploy valuable data to the rest of the Squarepoint business at speed. Ingestion pipelines and data transformation jobs are resilient and highly maintainable, while the data models are carefully designed in close collaboration with our researchers for efficient query construction and alpha generation. We achieve economies of scale by building new frameworks, libraries, and services that increase the team's quality of life, throughput, and code quality. Teamwork and collaboration are encouraged, excellence is rewarded, and diversity of thought and creative solutions are valued. Our emphasis is on a culture of learning, development, and growth.
Responsibilities:
Take part ownership of our ever-growing estate of data pipelines.
Propose and contribute to new abstractions and improvements - make a real positive impact across our team globally.
Design, implement, test, optimize, and troubleshoot our data pipelines, frameworks, and services (a minimal ingestion sketch follows this posting).
Collaborate with researchers to onboard new datasets.
Regularly take the lead on production support operations - during normal working hours only.
Required Qualifications:
5+ years of experience coding to a high standard in Python, React, and JavaScript.
Bachelor's degree in a STEM subject.
Experience with and knowledge of SQL and one or more common RDBMS systems (we mostly use Postgres).
Practical knowledge of commonly used protocols and tools for transferring data (e.g. FTP, SFTP, HTTP APIs, AWS S3).
Excellent communication skills.
Nice to have:
Experience with big data frameworks, databases, distributed systems, or cloud development.
Experience with any of these: C++, kdb+/q, Rust.
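The ingestion pipelines described above usually reduce to: pull a file over one of the listed protocols, normalise it, and land it in Postgres. A minimal sketch of that shape using requests, pandas, and SQLAlchemy; the endpoint, table name, column names, and connection string are hypothetical, not taken from the posting:

import io
import requests
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical endpoint and connection string, purely for illustration.
SOURCE_URL = "https://example.com/vendor/daily_prices.csv"
PG_URL = "postgresql+psycopg2://user:password@localhost:5432/alpha"

def ingest_daily_prices() -> int:
    """Fetch a vendor CSV over HTTP, normalise it, and append it to Postgres."""
    resp = requests.get(SOURCE_URL, timeout=30)
    resp.raise_for_status()
    df = pd.read_csv(io.StringIO(resp.text), parse_dates=["as_of_date"])
    # Light normalisation before loading
    df.columns = [c.strip().lower() for c in df.columns]
    df = df.dropna(subset=["symbol", "close"])
    engine = create_engine(PG_URL)
    df.to_sql("daily_prices", engine, if_exists="append", index=False)
    return len(df)

An SFTP-based feed would swap the requests call for something like paramiko or a managed transfer service, while the load-and-validate tail of the pipeline stays the same.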
Posted 1 week ago
3.0 - 5.0 years
8 - 12 Lacs
Gurugram
Work from Office
Position Summary: This requisition is for the Employee Referral Campaign. We are seeking high-energy, driven, and innovative Data Scientists to join our Data Science Practice to develop new, specialized capabilities for Axtria and to accelerate the company's growth by supporting our clients' commercial and clinical strategies.
Job Responsibilities:
Be an individual contributor to the Data Science team and solve real-world problems using cutting-edge capabilities and emerging technologies.
Help clients translate the business use cases they are trying to crack into data science solutions.
Provide genuine assistance to users by advising them on how to leverage Dataiku DSS to implement data science projects, from design to production.
Configure and maintain data sources; document and maintain work instructions.
Deep working knowledge of machine learning frameworks such as TensorFlow, Caffe, Keras, and SparkML.
Expert knowledge of statistical and probabilistic methods such as SVM, decision trees, and clustering.
Expert knowledge of Python data-science and math packages such as NumPy, pandas, and scikit-learn (a minimal example follows this posting).
Proficiency in object-oriented languages (Java and/or Kotlin), Python, and common machine learning frameworks (TensorFlow, NLTK, Stanford NLP, LingPipe, etc.).
Education: Bachelor's equivalent - Engineering; Master's equivalent - Engineering.
Work Experience:
Data Scientist: 3-5 years of relevant experience in advanced statistical and mathematical models and predictive modeling using Python. Prior relevant experience in artificial intelligence and machine learning algorithms for developing scalable models, using supervised and unsupervised techniques such as NLP and deep learning algorithms. Ability to build scalable models using Python, RStudio, R Shiny, PySpark, Keras, and TensorFlow. Experience in delivering data science projects leveraging cloud infrastructure. Familiarity with cloud technology such as AWS / Azure and knowledge of AWS tools such as S3, EMR, EC2, Redshift, and Glue; visualization tools like Tableau and Power BI. Relevant experience in feature engineering, feature selection, and model validation on big data. Knowledge of self-service analytics platforms such as Dataiku, KNIME, or Alteryx is an added advantage.
ML Ops Engineering: 3-5 years of experience with MLOps frameworks like Kubeflow, MLflow, DataRobot, Airflow, etc., and experience with Docker, Kubernetes, and OpenShift. Prior experience with end-to-end automated ecosystems including, but not limited to, building data pipelines, developing and deploying scalable models, orchestration, scheduling, automation, and ML operations. Ability to design and implement cloud solutions and to build MLOps pipelines on cloud platforms (AWS, MS Azure, or GCP). Programming languages like Python, Go, Ruby, or Bash; a good understanding of Linux; knowledge of frameworks such as Keras, PyTorch, TensorFlow, etc. Ability to understand the tools used by data scientists, with experience in software development and test automation. Good understanding of advanced AI/ML algorithms and their applications.
Gen AI: Minimum of 4-6 years developing, testing, and deploying Python-based applications on Azure/AWS platforms. Must have basic knowledge of Generative AI / LLM / GPT concepts. Deep understanding of architecture and work experience with web technologies. Hands-on experience with Python and SQL. Expertise in any popular Python web framework, e.g. Flask or Django. Familiarity with frontend technologies like HTML, JavaScript, and React. Be an individual contributor in the Analytics and Development team and solve real-world problems using cutting-edge capabilities and emerging technologies based on LLM/GenAI/GPT. Able to interact with clients on GenAI-related capabilities and use cases.
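For the supervised-modeling skills listed above, a minimal scikit-learn example of the kind of reproducible model building the role describes; the data is synthetic and all names are illustrative:

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for a real commercial/clinical dataset
X, y = make_classification(n_samples=2000, n_features=20, n_informative=8, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

# Scaling + model in one Pipeline keeps preprocessing reproducible at deployment time
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)
print(f"hold-out accuracy: {model.score(X_test, y_test):.3f}")

In an MLOps setting the same pipeline object would typically be tracked and versioned (for example with MLflow) rather than evaluated ad hoc.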
Posted 1 week ago
5.0 - 10.0 years
5 - 15 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Role & Responsibilities:
Strong understanding of the Azure environment (PaaS, IaaS) and experience working with a hybrid model.
At least one project experience with the Azure data stack, involving components such as Azure Data Lake, Azure Synapse Analytics, Azure Data Factory, Azure Databricks, Azure Analysis Services, and Azure SQL DWH.
Strong hands-on SQL/T-SQL/Spark SQL skills and database concepts.
Strong experience with Azure Blob Storage and ADLS Gen2.
Strong knowledge of Azure Key Vault, Managed Identity, and RBAC.
Strong experience with and understanding of DAX and tabular models.
Experience in performance tuning, security, sizing, and deployment automation of SQL/Spark.
Good to have: knowledge of advanced analytics tools such as Azure Machine Learning, Event Hubs, and Azure Stream Analytics.
Good knowledge of data visualization with Power BI.
Able to do code reviews as per the organization's best practices.
Exposure to or knowledge of NoSQL databases.
Good hands-on experience with Azure DevOps tools.
Should have experience with a multi-site project model and client communication skills.
Strong working experience in ingesting data from various data sources and data types.
Good knowledge of Azure DevOps, including an understanding of build and release pipelines.
Good knowledge of push/pull requests in Azure Repos/Git repositories.
Good knowledge of code review and coding standards.
Good knowledge of unit and functional testing.
Expert knowledge of advanced calculations in MS Power BI Desktop (aggregate, date, logical, string, table).
Good at creating different visualizations using slicers, lines, pies, histograms, maps, scatter plots, bullets, heat maps, tree maps, etc.
Exceptional interpersonal and communication (verbal and written) skills.
Strong communication skills.
Ability to manage mid-sized teams and customer interaction.
Posted 1 week ago
5.0 - 9.0 years
11 - 16 Lacs
Chennai
Work from Office
Job Summary:
Synechron is seeking a detail-oriented and knowledgeable Senior QA Engineer specializing in Quality Assurance (QA) with a solid foundation in Business Analysis within Rates Derivatives. In this role, you will contribute to ensuring the quality and accuracy of derivative products, focusing on derivatives, fixed income products, and market data processes. Your expertise will support the organization's efforts in maintaining high standards in trading and risk management systems, directly impacting operational efficiency and compliance.
Software Requirements:
Required skills:
Proficiency in MS Excel, including advanced functionality such as macros.
Strong working knowledge of MS SQL Server for data querying and management.
Competence in Python for automation, scripting, and data analysis.
Experience with automation testing tools and frameworks.
Basic understanding of version control tools (e.g. Git) is preferred.
Preferred skills:
Familiarity with business analysis tools and requirements-gathering platforms.
Exposure to cloud data environments or cloud-based testing tools.
Overall Responsibilities:
Analyze and validate derivative trade data related to the Rates business, ensuring accuracy in P&L and risk calculations (a minimal validation sketch follows this posting).
Develop and execute test cases, scripts, and automation processes to verify system integrity and data consistency.
Collaborate with quantitative analysts and business teams to understand trading models, market data, and risk methodologies.
Assist in analyzing and optimizing P&L and risk computation processes.
Document testing procedures, results, and process improvements to uphold quality standards.
Support the identification of system flaws or data discrepancies and recommend corrective actions.
Participate in requirement review sessions to ensure testing coverage aligns with business needs.
Technical Skills (by category):
Programming languages (essential): Python (required for automation and data analysis); Excel macros (VBA) for automation and data manipulation.
Databases/data management (essential): MS SQL Server (for data querying, validation, and management).
Cloud technologies: not essential for this role, but familiarity with cloud data environments (Azure, AWS) is a plus.
Frameworks and libraries: use of Python data libraries (e.g. pandas, NumPy) is preferred for automation tasks.
Development tools and methodologies: version control (Git) is preferred; test automation frameworks and scripting practices to ensure repeatability and accuracy.
Security protocols: not specifically applicable; adherence to data confidentiality and compliance standards is required.
Experience Requirements:
Minimum of 6+ years in Quality Assurance, Business Analysis, or related roles within the derivatives or financial markets industry.
Strong understanding of Rates Derivatives, fixed income products, and associated market data.
Hands-on experience in P&L and risk measurement calculations.
Proven history of developing and maintaining automation scripts, particularly in Python and Excel macros.
Experience working with SQL databases to extract, analyze, and validate data.
Industry experience in trading, risk, or quantitative teams preferred.
Day-to-Day Activities:
Collaborate with quantitative analysts, business teams, and developers to understand trade data and risk models.
Develop, execute, and maintain automation scripts to streamline testing and validation processes.
Validate trade data accuracy and integrity across systems, focusing on P&L and risk calculations.
Perform detailed testing of business workflows, ensuring compliance with requirements and risk standards.
Analyze market data inputs and trade data discrepancies, reporting findings and potential improvements.
Prepare documentation of test cases, findings, and processes for audit and review purposes.
Participate in daily stand-ups, requirement discussions, and defect review meetings.
Provide ongoing feedback on system enhancements and automation opportunities.
Qualifications:
Bachelor's degree in Finance, Economics, Computer Science, Information Technology, or a related field; equivalent professional experience is acceptable.
Relevant certifications such as CFA, CQF, or ISTQB are preferred.
Demonstrated experience in QA, Business Analysis, or related roles within financial derivatives markets.
Commitment to continuous learning in financial products, quantitative methods, and automation techniques.
Professional Competencies:
Strong analytical and critical thinking skills, essential for understanding complex financial data and risk metrics.
Effective communicator capable of articulating technical and business issues clearly.
Collaborative team player able to work across business, technical, and QA teams.
Adaptability to evolving technology tools, processes, and regulatory requirements.
Focus on continuous improvement and process efficiency.
Good time management skills to prioritize tasks in a fast-paced environment.
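The P&L validation work described above often reduces to "recompute independently and compare within a tolerance". A minimal sketch, assuming hypothetical column names and a simple linear P&L definition (quantity times price move); in practice the reference values would be queried from MS SQL Server rather than built in memory:

import logging
import numpy as np
import pandas as pd

logging.basicConfig(level=logging.INFO)

# Stand-in for trade data that would normally be pulled from MS SQL Server
trades = pd.DataFrame({
    "trade_id": ["T1", "T2", "T3"],
    "quantity": [1_000_000, -500_000, 250_000],
    "open_price": [101.25, 99.80, 100.10],
    "close_price": [101.40, 99.75, 100.10],
    "reported_pnl": [150_000.0, 25_000.0, 100.0],   # T3 deliberately inconsistent
})

def validate_pnl(df: pd.DataFrame, tolerance: float = 0.01) -> pd.DataFrame:
    """Recompute a simple linear P&L and flag rows that disagree with the reported value."""
    recomputed = df["quantity"] * (df["close_price"] - df["open_price"])
    mismatches = df.loc[~np.isclose(recomputed, df["reported_pnl"], atol=tolerance)]
    for _, row in mismatches.iterrows():
        logging.warning("P&L mismatch on %s: reported %.2f", row["trade_id"], row["reported_pnl"])
    return mismatches

print(validate_pnl(trades))

Real rates products would use the desk's risk library for the recomputation, but the compare-and-log structure stays the same.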
Posted 4 weeks ago
5 - 10 years
14 - 19 Lacs
Bengaluru
Work from Office
Project Role: BI Architect
Project Role Description: Build and design scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. Create industry and function data models used to build reports and dashboards. Ensure the architecture and interface seamlessly integrate with Accenture's Data and AI framework, meeting client needs.
Must-have skills: SAS Base & Macros
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education
Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.
Roles & Responsibilities:
1. Lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Bring deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize the performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud Platform as required.
9. Provide guidance and support for testing, validation, and production deployment.
Professional & Technical Skills:
- Must-have skills: proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity
Additional Information:
- The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering.
Qualification: 15 years of full-time education
Posted 1 month ago
3 - 8 years
14 - 19 Lacs
Bengaluru
Work from Office
Project Role: BI Architect
Project Role Description: Build and design scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. Create industry and function data models used to build reports and dashboards. Ensure the architecture and interface seamlessly integrate with Accenture's Data and AI framework, meeting client needs.
Must-have skills: SAS Base & Macros
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education
Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.
Roles & Responsibilities:
1. Lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Bring deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize the performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud Platform as required.
9. Provide guidance and support for testing, validation, and production deployment.
Professional & Technical Skills:
- Must-have skills: proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity
Additional Information:
- The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering.
Qualification: 15 years of full-time education
Posted 1 month ago
5 - 10 years
6 - 10 Lacs
Bengaluru
Work from Office
Project Role: BI Engineer
Project Role Description: Develop, migrate, deploy, and maintain data-driven insights and knowledge engine interfaces that drive adoption and decision making. Integrate security and data privacy protection.
Must-have skills: SAS Analytics
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education
Summary: As a SAS Base & Macros specialist, you will be responsible for building and designing scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.
Roles & Responsibilities:
1. Lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Bring deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize the performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud Platform as required.
9. Provide guidance and support for testing, validation, and production deployment.
Professional & Technical Skills:
- Must-have skills: proficiency in SAS Base & Macros
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity
Additional Information:
- The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering.
Qualification: 15 years of full-time education
Posted 1 month ago