5.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Discover your next opportunity with an organization that ranks among the world's 500 largest companies. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that push you to grow every day. We know what it takes to lead UPS into the future: passionate people with a unique combination of skills. If you have the qualities, motivation, autonomy, or leadership to lead teams, there are roles suited to your aspirations and skills today and tomorrow. Role Overview: The UPS Data Science and Machine Learning team is seeking a highly skilled and experienced Lead Machine Learning Engineer to manage our AI, ML, and GenAI applications focused on cross-border logistics. This position applies continuous integration and deployment best practices, including test automation and monitoring, to ensure successful deployment of optimal ML models and analytical systems. You will be responsible for the end-to-end lifecycle of AI models, from experimentation and fine-tuning to deployment and management in production. A strong background in prompt engineering and practical experience with Google Cloud's Vertex AI platform are essential for this role. You will also provide technical leadership and mentorship to other members of the AI/ML team. Key Responsibilities: Lead the development and deployment of generative AI solutions utilizing LLMs, SLMs, and FMs for various applications (e.g., content generation, chatbots, summarization, code generation). Architect and implement robust and scalable infrastructure for training, fine-tuning, and serving large-scale AI models, leveraging Vertex AI. Drive the fine-tuning and adaptation of pre-trained models using proprietary data to achieve state-of-the-art performance on specific tasks. Develop and implement effective prompt engineering strategies to elicit desired outputs and control the behavior of generative models. Manage the lifecycle of deployed models, including production support, performance monitoring, identifying areas for improvement, and implementing necessary updates or retraining. Collaborate closely with cross-functional teams (e.g., product, engineering, research) to understand business requirements and translate them into technical solutions. Provide technical leadership and mentorship to junior machine learning engineers, fostering a culture of learning and innovation. Ensure the responsible and ethical development and deployment of AI models, considering factors such as bias, fairness, and privacy. Stay up to date with the latest advancements in generative AI, LLMs, and related technologies, and evaluate their potential application within the company. Document technical designs, implementation details, and deployment processes. Troubleshoot and resolve issues related to model performance and deployment. Required Skills And Experience: Bachelor's or Master's degree in Computer Science, Machine Learning, Artificial Intelligence, or a related field. Minimum of 5-8 years of hands-on experience in building, deploying, and managing machine learning models in a production environment.
Demonstrable experience in managing, deploying, and fine-tuning large language models (LLMs), small language models (SLMs), and foundation models (FMs). Significant hands-on experience with prompt engineering techniques for various generative AI tasks. Proven experience working with Google Cloud's Vertex AI platform, including its model registry, deployment tools, and MLOps features. Strong programming skills in Python and experience with relevant machine learning libraries (e.g., TensorFlow, PyTorch, Transformers). Experience with cloud computing platforms beyond Vertex AI (e.g., Azure) is a plus. Solid understanding of machine learning principles, deep learning architectures, and evaluation metrics. Excellent problem-solving, analytical, and communication skills. Ability to work independently and as part of a collaborative team. Experience with MLOps practices and tools for continuous integration and continuous delivery (CI/CD) of ML models is highly desirable. Experience with version control systems (e.g., Git). Bonus Points: Experience with model governance frameworks and implementing ethical AI practices. Experience with specific generative AI use cases relevant to the logistics industry. Publications or contributions to open-source projects, technical blogs, or industry conferences are considered a plus. Familiarity with data engineering pipelines and tools. Familiarity with emerging trends in generative AI, reinforcement learning from human feedback (RLHF), and federated learning approaches. Contract type: permanent (CDI). At UPS, equal opportunity, fair treatment, and an inclusive work environment are key values to which we are committed.
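For context on the prompt-driven inference work described above, here is a minimal, hypothetical sketch of calling a generative model on Vertex AI with the google-cloud-aiplatform SDK. The project ID, model name, and prompt text are placeholders for illustration, not UPS specifics.

```python
# Minimal sketch: prompt a Vertex AI generative model from Python.
# Assumes the google-cloud-aiplatform SDK is installed and application
# default credentials are configured; project/model are placeholders.
import vertexai
from vertexai.generative_models import GenerativeModel, GenerationConfig

vertexai.init(project="my-gcp-project", location="us-central1")

model = GenerativeModel("gemini-1.5-flash")

# A simple prompt template of the kind used in cross-border logistics assistants.
prompt = (
    "You are a logistics assistant. Summarize the customs status below in two sentences.\n"
    "Status: Shipment held pending HS-code verification at the destination broker."
)

response = model.generate_content(
    prompt,
    generation_config=GenerationConfig(temperature=0.2, max_output_tokens=128),
)
print(response.text)
```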
Posted 14 hours ago
2.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
🌍 Join the Mission to Shape Global Futures | Study Abroad Counselor @ Vertex Edu 🌍 Are you passionate about education and ready to grow with a fast-scaling startup that’s helping Indian students get into top global universities? At Vertex Edu, we don’t just offer jobs — we offer ownership, purpose, and the power to make an impact. We're India’s leading institute for study abroad test prep & admissions counseling, guiding students on exams like GMAT, GRE, SAT, and IELTS, and helping them land admits from top universities in the USA, UK, Canada, Australia, and Germany — with complete visa support. Now, we’re building a dream team. Want in? 🚀 Role: Study Abroad Counselor As a Study Abroad Counselor, you’ll not just guide students — you’ll be a co-creator of their dreams and a builder of Vertex Edu’s future. We’re looking for someone who wants more than just a job — someone who wants to grow, own their outcomes, and build something meaningful with us. 🔍 What You’ll Do ✅ Student Counseling & Career Mapping Guide students on their international education journey: university shortlisting, applications, SOPs, and visa processes. Offer end-to-end support — from their first inquiry to the moment they board their flight. ✅ Lead Nurturing & Conversion Own the lead funnel: follow up, convert, and close — with warmth, clarity, and persistence. Craft smart conversion strategies that drive results — we’ll give you the freedom to innovate. ✅ Visa & Documentation Guidance Help students confidently navigate visa applications, financial documents, SOPs, and interviews. Stay on top of changing regulations in major study destinations. ✅ Events, Webinars & Collaboration Conduct impactful seminars & webinars. Collaborate with universities, embassies, and internal teams — be a bridge between students and their futures. 🧠 You’re a Fit If You Have… 2+ years of experience in Study Abroad Counseling or EdTech/student services. A Bachelor’s or Master’s degree (Education, International Relations, or related field preferred). Strong grasp of visa requirements, university admissions, and financial docs. Excellent spoken & written communication and a go-getter attitude. CRM & MS Office skills, and the ability to thrive under deadlines. 🤝 Who Shouldn’t Apply Let’s keep it real — we respect your time and ask you to respect ours. ❌ If you tend to ghost after saying yes to interviews... ❌ If you struggle with time commitments or basic professional communication... Please don’t apply — our team puts a lot of effort into this process, and we want to work with people who value that. 💼 Why Join Vertex Edu? ✨ Work at the intersection of education & impact ✨ Define your own role — we believe in giving ownership, not just tasks ✨ Competitive pay + performance incentives ✨ Be part of a tight-knit team that’s building a nationally respected brand ✨ Grow your career and leadership in the booming study abroad space Salary- 22000- 40000 per month (Negotiable, depends on profile) 📩 Apply Now – Send your resume to hr@vertexedu.com Come build something that matters. Let’s grow together.
Posted 14 hours ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
When you join Verizon You want more out of a career. A place to share your ideas freely even if they're daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love, driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together, lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the V Team Life. We are seeking an experienced S/4HANA Functional Manager with thorough knowledge of SAP FICO who will strategically drive and lead the Invoice to Pay & Tax team. What You'll Be Doing... As an SAP Senior Manager in the IT Corporate Systems group, you will be responsible for SAP Finance - Invoice to Pay and Taxation, driving and leading all aspects of strategic and transformation initiatives in 1ERP S/4HANA and delivering enhancements in the form of Change Requests. 1ERP is a multi-year program to consolidate various ERP platforms into a single ERP platform to drive efficiencies. The primary function of this role is to take a broad-based view of business processes across the core functional domain for both US and global business entities. You will make sure that the team delivers best-in-class, out-of-the-box capabilities through planning, analysis, design and leading the delivery. These will be enterprise-wide business processes centered on the SAP S/4HANA ERP platform and spanning other SAP and non-SAP systems. Role This role will interface with Business Partners, System Integration Delivery Leads, and SAP Technical Managers in order to fulfill the stated primary goal. You should have experience in handling, guiding and leading the team in the following technical aspects of delivery: Requirements gathering and preparation, configuration, preparing test scenarios and test scripts. Preparing functional specifications, cutover strategies and issue resolution post go-live. Creating and tracking SAP OSS notes and working with SAP to resolve issues. Preparing reports and training materials, training personnel, and delivering presentations. Implementing SAP Best Practices business processes, global templates, and configuration of these Best Practices. Identifying as-is and to-be processes and mapping business processes in the SAP S/4HANA system. Where you'll be working In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your supervisor. What We're Looking For... You will need to have: Bachelor's degree or six or more years of experience. Six or more years of relevant work experience in SAP FI-GL. Experience working closely with SAP Finance process owners. Experience working on at least 3 full life cycle implementation projects, including at least 2 implementations in S/4HANA. Good understanding of Finance business processes at a high level and client-interfacing experience. End-to-end configuration skills in SAP Finance and Invoice to Pay and Tax on S/4HANA. Experience working on at least two full life cycle implementation projects in S/4HANA. Experience with Accounts Payable, Travel Expenses, Concur Integration, Taxation, and Vertex Integration in SAP S/4HANA, along with Treasury, In-House Cash, and ICMR.
Good knowledge and experience in Accounts Payable configuration in S/4HANA: bank methods, DMEE, BCM, Business Partners, validations, and substitutions. Experience in integration between Material Management and Accounts Payable, Accounts Payable and Treasury integration, and an understanding of integration between IHC, Treasury, and Accounts Payable. Worked on interfaces. Worked on preparation of functional specifications. Working knowledge of Fiori apps. Coordination with the ABAP team for any development work. Should possess good domain knowledge in Accounts Payable, Travel Expenses, and Taxation. Experience in preparation of training documents and end-user training. Even better if you have one or more of the following: Master's degree in Commerce / MBA Finance / Chartered Accountant and 13 or more years of relevant work experience. Experience in S/4HANA implementation, and certification. Experience with tax and working with large, complex transformation projects. Knowledge of indirect tax analysis of supply chains. Experience with gathering tax business requirements and designing tax configuration/solutions. Experience in implementation of new SAP tax applications as required per business requirements. Experience with SAP, both transactional processes and core tax components. Experience with Vertex: use/input taxation; VAT/VAT-exempt tax (tax codes, Tax Assist rules, jurisdiction codes, calculation procedures); Vertex Tax Accelerator mapping (tax drivers mapping); Vertex custom user exit mapping; Vertex SAP Tax Accelerator reports; Advanced Tax Return for Tax on Sales/Purchases; Vertex non-deductible tax/reverse charge; Vertex RFC connectivity and updates. Good knowledge of Master Data Governance. Working knowledge of fixed assets accounting and general ledger accounting. Experience on the business side. Experience in integration points with other SAP modules and non-SAP systems, IDoc/XML, and other interfaces. Ability to deliver simple-to-complex design solutions for process enhancements in RICEFW. Knowledge of custom programs and ability to troubleshoot by debugging whenever required. Knowledge of implementing ERP systems using project lifecycle processes, including design, testing, implementation, and support. Ability to perform functional and performance tests on the system in order to verify the changes implemented by developers. Strong written, verbal, and interpersonal communication skills with management, technical peers, and business stakeholders. Strong analytical and innovative skills. If Verizon and this role sound like a fit for you, we encourage you to apply even if you don't meet every "even better" qualification listed above. Where you'll be working In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager. Scheduled Weekly Hours 40 Equal Employment Opportunity Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics. Locations Chennai, India; Hyderabad, India
Posted 1 day ago
6.0 years
0 Lacs
India
On-site
Immediate joiners only. We need someone with Python, Google Pub/Sub, CI/CD, Terraform, and Vertex AI experience. Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience). Experience: 6+ years of experience in cloud architecture, with a focus on GCP. Technical Expertise: Strong knowledge of GCP core services, including compute, storage, networking, and database solutions. Proficiency in Infrastructure as Code (IaC) tools like Terraform, Deployment Manager, or Pulumi. Experience with containerization and orchestration tools (e.g., Docker, Kubernetes, GKE, or Cloud Run). Understanding of DevOps practices, CI/CD pipelines, and automation. Strong command of networking concepts such as VPCs, load balancing, and firewall rules. Familiarity with scripting languages like Python or Bash. Preferred Qualifications: Google Cloud Certified – Professional Cloud Architect or Professional DevOps Engineer. Expertise in engineering and maintaining MLOps and AI applications. Experience in hybrid cloud or multi-cloud environments. Familiarity with monitoring and logging tools such as Cloud Monitoring, ELK Stack, or Datadog.
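As a small illustration of the Python + Pub/Sub skills this role names, here is a hedged sketch of publishing a message with the google-cloud-pubsub client library; the project and topic names are placeholders and assume the topic already exists.

```python
# Minimal sketch: publish an event to a Google Pub/Sub topic from Python.
# Requires the google-cloud-pubsub library and application default credentials.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-gcp-project", "events-topic")  # placeholders

# Message payloads are bytes; attributes are optional string metadata.
future = publisher.publish(
    topic_path,
    b'{"event": "model_retrain_requested"}',
    source="ci-pipeline",
)
print("Published message id:", future.result(timeout=30))
```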
Posted 1 day ago
0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Minimum of 3+ years of experience in AI-based application development. Fine-tune pre-existing models to improve performance and accuracy. Experience with TensorFlow, PyTorch, Scikit-learn, or similar ML frameworks and familiarity with APIs like OpenAI or Vertex AI. Experience with NLP tools and libraries (e.g., NLTK, SpaCy, GPT, BERT). Implement frameworks like LangChain, Anthropic's Constitutional AI, OpenAI, Hugging Face, and prompt engineering techniques to build robust and scalable AI applications. Evaluate and analyze RAG solutions and utilize best-in-class LLMs to define customer experience solutions (fine-tune large language models (LLMs)). Architect and develop advanced generative AI solutions leveraging state-of-the-art language models (LLMs) such as GPT, LLaMA, PaLM, BLOOM, and others. Strong understanding and experience with open-source multimodal LLM models to customize and create solutions. Explore and implement cutting-edge techniques like Few-Shot Learning, Reinforcement Learning, Multi-Task Learning, and Transfer Learning for AI model training and fine-tuning. Proficiency in data preprocessing, feature engineering, and data visualization using tools like Pandas, NumPy, and Matplotlib. Optimize model performance through experimentation, hyperparameter tuning, and advanced optimization techniques. Proficiency in Python with the ability to get hands-on with coding at a deep level. Develop and maintain APIs using Python's FastAPI, Flask, or Django for integrating AI capabilities into various systems. Ability to write optimized and high-performing scripts on relational databases (e.g., MySQL, PostgreSQL) or non-relational databases (e.g., MongoDB or Cassandra). Enthusiasm for continuous learning and professional development in AI and related technologies. Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions. Knowledge of cloud services like AWS, Google Cloud, or Azure. Proficiency with version control systems, especially Git. Familiarity with data pre-processing techniques and pipeline development for AI model training. Experience with deploying models using Docker and Kubernetes. Experience with AWS Bedrock and SageMaker is a plus. Strong problem-solving skills with the ability to translate complex business problems into AI solutions.
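To make the RAG work mentioned above concrete, here is an illustrative sketch of the retrieval step: given precomputed chunk embeddings, find the chunks most similar to a query embedding before passing them to an LLM. The embeddings here are random stand-ins; in practice they would come from an embedding model (e.g., OpenAI or Vertex AI embeddings).

```python
# Illustrative retrieval step of a RAG pipeline using cosine similarity.
import numpy as np

rng = np.random.default_rng(0)
doc_chunks = ["refund policy ...", "shipping terms ...", "warranty details ..."]
doc_embeddings = rng.normal(size=(len(doc_chunks), 768))  # placeholder vectors
query_embedding = rng.normal(size=768)                    # placeholder query vector

def top_k(query, docs, k=2):
    # Cosine similarity between the query and each document embedding.
    sims = docs @ query / (np.linalg.norm(docs, axis=1) * np.linalg.norm(query))
    return np.argsort(sims)[::-1][:k]

for idx in top_k(query_embedding, doc_embeddings):
    print(doc_chunks[idx])  # chunks that would be stuffed into the LLM prompt
```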
Posted 1 day ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Our Company Changing the world through digital experiences is what Adobe’s all about. We give everyone—from emerging artists to global brands—everything they need to design and deliver exceptional digital experiences! We’re passionate about empowering people to create beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen. We’re on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours! Platform Development and Evangelism: Build scalable AI platforms that are customer-facing. Evangelize the platform with customers and internal stakeholders. Ensure platform scalability, reliability, and performance to meet business needs. Machine Learning Pipeline Design: Design ML pipelines for experiment management, model management, feature management, and model retraining. Implement A/B testing of models. Design APIs for model inferencing at scale. Proven expertise with MLflow, SageMaker, Vertex AI, and Azure AI. LLM Serving and GPU Architecture: Serve as an SME in LLM serving paradigms. Possess deep knowledge of GPU architectures. Expertise in distributed training and serving of large language models. Proficient in model and data parallel training using frameworks like DeepSpeed and serving frameworks like vLLM. Model Fine-Tuning and Optimization: Demonstrate proven expertise in model fine-tuning and optimization techniques. Achieve better latencies and accuracies in model results. Reduce training and resource requirements for fine-tuning LLM and LVM models. LLM Models and Use Cases: Have extensive knowledge of different LLM models. Provide insights on the applicability of each model based on use cases. Proven experience in delivering end-to-end solutions from engineering to production for specific customer use cases. DevOps and LLMOps Proficiency: Proven expertise in DevOps and LLMOps practices. Knowledgeable in Kubernetes, Docker, and container orchestration. Deep understanding of LLM orchestration frameworks like Flowise, Langflow, and LangGraph. Skill Matrix LLM: Hugging Face OSS LLMs, GPT, Gemini, Claude, Mixtral, Llama LLM Ops: MLflow, LangChain, LangGraph, Langflow, Flowise, LlamaIndex, SageMaker, AWS Bedrock, Vertex AI, Azure AI Databases/Data Warehouse: DynamoDB, Cosmos, MongoDB, RDS, MySQL, PostgreSQL, Aurora, Spanner, Google BigQuery. Cloud Knowledge: AWS/Azure/GCP DevOps (Knowledge): Kubernetes, Docker, Fluentd, Kibana, Grafana, Prometheus Cloud Certifications (Bonus): AWS Professional Solution Architect, AWS Machine Learning Specialty, Azure Solutions Architect Expert Proficient in Python, SQL, JavaScript Adobe is proud to be an Equal Employment Opportunity employer. We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more about our vision here. Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.
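As a hedged illustration of the LLM serving paradigms this posting references, here is a minimal sketch of batched offline inference with vLLM. The model checkpoint is a placeholder, and the sketch assumes a GPU with enough memory; it is illustrative only, not Adobe's setup.

```python
# Minimal sketch: batched offline text generation with vLLM.
from vllm import LLM, SamplingParams

llm = LLM(model="meta-llama/Llama-3.1-8B-Instruct")  # placeholder checkpoint
params = SamplingParams(temperature=0.7, top_p=0.95, max_tokens=128)

prompts = [
    "Explain model parallelism in one paragraph.",
    "List three trade-offs of quantizing an LLM.",
]

# vLLM batches and schedules the prompts internally for GPU-efficient serving.
for output in llm.generate(prompts, params):
    print(output.outputs[0].text)
```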
Posted 1 day ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Summary Position Summary Product/Place Data Engineer, Senior Consultant Work you’ll do Work with Product Managers/Owners to understand how products can be used and should be implemented to solve client problems. Work with Project Managers to understand client requirements, brainstorm on solutioning, and implement the assets/products as applicable. Build large-scale batch and real-time data pipelines with data processing frameworks on Microsoft Azure, Google Cloud Platform (GCP), or Amazon Web Services (AWS). Understanding of Vertex AI configuration/requirements to be able to handle large data volumes (several GB to TB). Develop data pipelines using Python, PySpark, and SQL. Understanding of data science techniques is an added advantage. Deploy data pipelines into production. Consistently strive to acquire new skills in Cloud and Big Data technologies. Support and coach your team on best coding practices, development tools, and pathfinding and surveys for technologies. The team ConvergeCONSUMER™ is a product-driven business that combines differentiated consumer insights with next-generation decision and experience platforms to help consumer-focused businesses optimize decision making and deliver personalized experiences to drive growth, consumer loyalty and profitability. We operate with the speed and agility of a startup, inside the world’s largest professional services firm. Qualifications Required: A bachelor's degree in Computer Science, Data Engineering, Applied Mathematics, or a similar quantitative field with a minimum of 3+ years of experience as a data engineer. 3+ years of experience with at least one of the following cloud platforms: Microsoft Azure, Google Cloud Platform (GCP), Amazon Web Services (AWS); Azure, GCP, and/or AWS certification preferred. 3+ years of experience in SQL, data transformations, statistical analysis, and troubleshooting across more than one database platform (BigQuery, MySQL, Snowflake, PostgreSQL, etc.). 3+ years of experience in the design and build of data extraction, transformation, and loading processes by writing custom data pipelines. Proven skills in various programming and database languages, i.e., Python, PySpark, SQL. Experience working with either a MapReduce or an MPP system at any size/scale preferred. Familiarity with consumer-related data and a general interest in the retail, consumer products manufacturing, automotive, transportation, or hospitality sectors. A desire to take initiative and continuously provide feedback on improving the products you are responsible for. A general interest in relevant emerging technologies and a constant thirst to further your own technical abilities. Experience working in an Agile development environment. The ability to work well independently and as a team player. Excellent conversational and written communication skills. Experience with BI/analytics tools, such as Tableau, Power BI, or similar tools. Preferred: Master's degree in a relevant field from a top-tier university.
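To ground the pipeline work described in this posting, here is a small, hypothetical PySpark sketch of a batch extract-transform-load job; the bucket paths and column names are illustrative placeholders, not Deloitte assets.

```python
# Minimal sketch: a daily batch aggregation pipeline in PySpark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("consumer-sales-daily").getOrCreate()

# Extract: read raw CSV files (header row assumed).
sales = spark.read.option("header", True).csv("gs://example-bucket/raw/sales/*.csv")

# Transform: cast, derive a date column, and aggregate revenue per store per day.
daily = (
    sales
    .withColumn("amount", F.col("amount").cast("double"))
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "store_id")
    .agg(F.sum("amount").alias("daily_revenue"))
)

# Load: write partitioned Parquet for downstream analytics.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "gs://example-bucket/curated/daily_sales/"
)
```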
How You’ll Grow At Deloitte, our professional development plan focuses on helping people at every level of their career to identify and use their strengths to do their best work every day. From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to help sharpen skills in addition to hands-on experience in the global, fast-changing business world. From on-the-job learning experiences to formal development programs at Deloitte University, our professionals have a variety of opportunities to continue to grow throughout their career. Explore Deloitte University, The Leadership Center. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Deloitte’s culture Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture where our people excel and lead healthy, happy lives. Learn more about Life at Deloitte. Corporate citizenship Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte’s impact on the world. Recruiter tips We want job seekers exploring opportunities at Deloitte to feel prepared and confident. To help you with your interview, we suggest that you do your research: know some background about the organization and the business area you’re applying to. Check out recruiting tips from Deloitte professionals. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India . Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. 
Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 302244
Posted 1 day ago
3.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Summary Position Summary Product/Place Data Engineer, Senior Consultant Work you’ll do Work with Product Managers/Owners to understand how products can be used and should be implemented to solve client problems. Work with Project Managers to understand client requirements, brainstorm on solutioning, and implement the assets/products as applicable. Build large-scale batch and real-time data pipelines with data processing frameworks on Microsoft Azure, Google Cloud Platform (GCP), or Amazon Web Services (AWS). Understanding of Vertex AI configuration/requirements to be able to handle large data volumes (several GB to TB). Develop data pipelines using Python, PySpark, and SQL. Understanding of data science techniques is an added advantage. Deploy data pipelines into production. Consistently strive to acquire new skills in Cloud and Big Data technologies. Support and coach your team on best coding practices, development tools, and pathfinding and surveys for technologies. The team ConvergeCONSUMER™ is a product-driven business that combines differentiated consumer insights with next-generation decision and experience platforms to help consumer-focused businesses optimize decision making and deliver personalized experiences to drive growth, consumer loyalty and profitability. We operate with the speed and agility of a startup, inside the world’s largest professional services firm. Qualifications Required: A bachelor's degree in Computer Science, Data Engineering, Applied Mathematics, or a similar quantitative field with a minimum of 3+ years of experience as a data engineer. 3+ years of experience with at least one of the following cloud platforms: Microsoft Azure, Google Cloud Platform (GCP), Amazon Web Services (AWS); Azure, GCP, and/or AWS certification preferred. 3+ years of experience in SQL, data transformations, statistical analysis, and troubleshooting across more than one database platform (BigQuery, MySQL, Snowflake, PostgreSQL, etc.). 3+ years of experience in the design and build of data extraction, transformation, and loading processes by writing custom data pipelines. Proven skills in various programming and database languages, i.e., Python, PySpark, SQL. Experience working with either a MapReduce or an MPP system at any size/scale preferred. Familiarity with consumer-related data and a general interest in the retail, consumer products manufacturing, automotive, transportation, or hospitality sectors. A desire to take initiative and continuously provide feedback on improving the products you are responsible for. A general interest in relevant emerging technologies and a constant thirst to further your own technical abilities. Experience working in an Agile development environment. The ability to work well independently and as a team player. Excellent conversational and written communication skills. Experience with BI/analytics tools, such as Tableau, Power BI, or similar tools. Preferred: Master's degree in a relevant field from a top-tier university.
How You’ll Grow At Deloitte, our professional development plan focuses on helping people at every level of their career to identify and use their strengths to do their best work every day. From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to help sharpen skills in addition to hands-on experience in the global, fast-changing business world. From on-the-job learning experiences to formal development programs at Deloitte University, our professionals have a variety of opportunities to continue to grow throughout their career. Explore Deloitte University, The Leadership Center. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Deloitte’s culture Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture where our people excel and lead healthy, happy lives. Learn more about Life at Deloitte. Corporate citizenship Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte’s impact on the world. Recruiter tips We want job seekers exploring opportunities at Deloitte to feel prepared and confident. To help you with your interview, we suggest that you do your research: know some background about the organization and the business area you’re applying to. Check out recruiting tips from Deloitte professionals. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India . Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. 
Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 302244
Posted 1 day ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
If you are looking for a career at a dynamic company with a people-first mindset and a deep culture of growth and autonomy, ACV is the right place for you! With competitive compensation packages and learning and development opportunities, ACV has what you need to advance to the next level in your career. We will continue to raise the bar every day by investing in our people and technology to help our customers succeed. We hire people who share our passion, bring innovative ideas to the table, and enjoy a collaborative atmosphere. Who We Are ACV is a technology company that has revolutionized how dealers buy and sell cars online. We are transforming the automotive industry. ACV Auctions Inc. (ACV) has applied innovation and user-designed, data-driven applications and solutions. We are building the most trusted and efficient digital marketplace with data solutions for sourcing, selling and managing used vehicles with transparency and comprehensive insights that were once unimaginable. We are disruptors of the industry and we want you to join us on our journey. Our network of brands includes ACV Auctions, ACV Transportation, ClearCar, MAX Digital and ACV Capital within its Marketplace Products, as well as True360 and Data Services. ACV Auctions in Chennai, India is looking for talented individuals to join our team. As we expand our platform, we're offering a wide range of exciting opportunities across various roles in corporate, operations, and product and technology. Our global product and technology organization spans product management, engineering, data science, machine learning, DevOps and program leadership. What unites us is a deep sense of customer centricity, calm persistence in solving hard problems, and a shared passion for innovation. If you're looking to grow, lead, and contribute to something larger than yourself, we'd love to have you on this journey. Let's build something extraordinary together. Join us in shaping the future of automotive! At ACV we focus on the Health, Physical, Financial, Social and Emotional Wellness of our Teammates and to support this we offer industry leading benefits and wellness programs. What You Will Do ACV’s Machine Learning (ML) team is looking to grow its MLOps team. Multiple ACV operations and product teams rely on the ML team’s solutions. Current deployments drive opportunities in the marketplace, in operations, and sales, to name a few. As ACV has experienced hyper growth over the past few years, the volume, variety, and velocity of these deployments has grown considerably. Thus, the training, deployment, and monitoring needs of the ML team have grown as we’ve gained traction. MLOps is a critical function to help us continue to deliver value to our partners and our customers. Successful candidates will demonstrate excellent skill and maturity, be self-motivated as well as team-oriented, and have the ability to support the development and implementation of end-to-end ML-enabled software solutions to meet the needs of their stakeholders. Those who will excel in this role will be those who listen with an ear to the overarching goal, not just the immediate concern that started the query. They will be able to show their recommendations are contextually grounded in an understanding of the practical problem, the data, and theory as well as what product and software solutions are feasible and desirable. The Core Responsibilities Of This Role Are Working with fellow machine learning engineers to build, automate, deploy, and monitor ML applications.
Developing data pipelines that feed ML models. Deploying new ML models into production. Building REST APIs to serve ML model predictions. Monitoring performance of models in production. Required Qualifications Graduate education (MS or PhD) in a computationally intensive domain or equivalent work experience. 3+ years of prior relevant work or lab experience in ML projects/research. Advanced proficiency with Python, SQL, etc. Experience with building and deploying REST APIs (Flask, FastAPI). Experience with cloud services (AWS/GCP), Kubernetes, Docker, and CI/CD. Preferred Qualifications Experience with MLOps-specific tooling like Vertex AI, Ray, Feast, Kubeflow, or ClearML is a plus. Experience with distributed caching technologies (Redis). Experience with real-time data streaming and processing (Kafka). Experience with building data pipelines. Experience with training ML models. Our Values Trust & Transparency | People First | Positive Experiences | Calm Persistence | Never Settling At ACV, we are committed to an inclusive culture in which every individual is welcomed and empowered to celebrate their true selves. We achieve this by fostering a work environment of acceptance and understanding that is free from discrimination. ACV is committed to being an equal opportunity employer regardless of sex, race, creed, color, religion, marital status, national origin, age, pregnancy, sexual orientation, gender, gender identity, gender expression, genetic information, disability, military status, status as a veteran, or any other protected characteristic. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. If you have a disability or special need that requires reasonable accommodation, please let us know.
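As an illustration of the "REST APIs to serve ML model predictions" responsibility above, here is a minimal, hypothetical FastAPI sketch. The model artifact name and feature fields are placeholders, and it assumes a pre-trained scikit-learn-compatible model saved with joblib.

```python
# Minimal sketch: serve predictions from a pre-trained model over HTTP.
# Run with: uvicorn app:app --reload
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.pkl")  # assumed pre-trained artifact (placeholder)

class VehicleFeatures(BaseModel):
    mileage: float
    age_years: float
    num_prior_owners: int

@app.post("/predict")
def predict(features: VehicleFeatures):
    # Order of features must match how the model was trained.
    X = [[features.mileage, features.age_years, features.num_prior_owners]]
    return {"predicted_price": float(model.predict(X)[0])}
```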
Posted 1 day ago
4.0 - 8.0 years
3 - 5 Lacs
Chennai
On-site
Date: 27 Jun 2025 Company: Qualitest Group Country/Region: IN Key Responsibilities Design, develop, and deploy ML models and AI solutions across various domains such as NLP, computer vision, recommendation systems, time-series forecasting, etc. Perform data preprocessing, feature engineering, and model training using frameworks like TensorFlow, PyTorch, Scikit-learn, or similar. Collaborate with cross-functional teams to understand business problems and translate them into AI/ML solutions. Optimize models for performance, scalability, and reliability in production environments. Integrate ML pipelines with production systems using tools like MLflow, Airflow, Docker, or Kubernetes. Conduct rigorous model evaluation using metrics and validation techniques. Stay up-to-date with state-of-the-art AI/ML research and apply findings to enhance existing systems. Mentor junior engineers and contribute to best practices in ML engineering. Required Skills & Qualifications Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, or a related field. 4–8 years of hands-on experience in machine learning, deep learning, or applied AI. Proficiency in Python and ML libraries/frameworks (e.g., Scikit-learn, TensorFlow, PyTorch, XGBoost). Experience with data wrangling tools (Pandas, NumPy) and SQL/NoSQL databases. Familiarity with cloud platforms (AWS, GCP, or Azure) and ML tools (SageMaker, Vertex AI, etc.). Solid understanding of model deployment, monitoring, and CI/CD pipelines. Strong problem-solving skills and the ability to communicate technical concepts clearly.
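As an illustration of the model training and evaluation workflow this role describes, here is a minimal scikit-learn sketch on a toy dataset; the dataset, model, and metric choices are illustrative, not Qualitest-specific.

```python
# Minimal sketch: train a classifier and evaluate it on a held-out split.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

# Evaluate with a threshold-free metric before considering deployment.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Holdout ROC AUC: {auc:.3f}")
```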
Posted 1 day ago
0 years
5 - 7 Lacs
Noida
Remote
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI. We are inviting applications for the role of Management Trainee/Assistant Manager - US Sales and Use Tax Compliance. Ensure timeliness and accuracy of all deliverables; provide guidance to the team on correct accounting treatment; act as an escalation point if needed. Responsibilities The person will be responsible for the following day-to-day activities: Business License: Coordinate with the Business License team and Sales Tax team to execute business licenses, including renewals. Coordinate with the Treasury team to pay for licenses. US Property Tax: Tracking and adherence to US Property Tax Return filings per due date; track exemptions and exceptions for applicable states; keeping close track of new stores built / stores closed; assessment notice reconciliation to PTMS Property Manager; reconciling the PPTX accrual account and booking the monthly accrual and adjustment entries. US Sales Tax: Understanding of Sales Tax Return filing on schedule and ensuring timely payment; review of Sales Tax Payable using Alteryx; performing reconciliation and resolving variances, if any; manage vendor queries. Maintain all documents related to the above. Investigate and research open items and follow up/escalate with different teams for resolution of open items. Adherence to internal/external US GAAP/SOX audits. Qualifications we seek in you Minimum qualifications • Accounting graduates with relevant experience, CA/CMA preferred. Good written and verbal communication skills. Working experience with ERPs, specifically Oracle, would be preferred. Experience with tools such as Alteryx, AS400 and PTMS; Vertex, PTMS, Sovos, Bloomberg sites, and middleware for tax rates would be an added advantage. Prior experience with retail clients of a similar size would be preferred. Good interpersonal skills and the ability to manage complex tasks and communicate well. Leadership. Preferred qualifications Strong accounting and analytical skills. Good understanding of GAAP accounting principles (US GAAP preferred) and strong analytical skills. Ability to prioritize work, multi-task and drive things to closure. Good hands-on knowledge of Microsoft Excel and other Microsoft applications. Collaborate with systems, customers and key stakeholders. Prior experience working remotely in a US time zone would be a plus. Attention to Detail: Accomplishing tasks by considering all areas involved, no matter how small; showing concern for all aspects of the job, accurately checking processes and tasks. Determining Financial Impact: Understanding the financial consequences of decisions. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws.
Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit www.genpact.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training. Job Management Trainee Primary Location India-Noida Schedule Full-time Education Level Bachelor's / Graduation / Equivalent Job Posting Jun 27, 2025, 3:15:22 AM Unposting Date Ongoing Master Skills List Operations Job Category Full Time
Posted 1 day ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Summary Position Summary Product/Place Data Engineer, Senior Consultant Work you’ll do Work with Product Managers/Owners to understand how products can be used and should be implemented to solve client problems. Work with Project Managers to understand client requirements, brainstorm on solutioning, and implement the assets/products as applicable. Build large-scale batch and real-time data pipelines with data processing frameworks on Microsoft Azure, Google Cloud Platform (GCP), or Amazon Web Services (AWS). Understanding of Vertex AI configuration/requirements to be able to handle large data volumes (several GB to TB). Develop data pipelines using Python, PySpark, and SQL. Understanding of data science techniques is an added advantage. Deploy data pipelines into production. Consistently strive to acquire new skills in Cloud and Big Data technologies. Support and coach your team on best coding practices, development tools, and pathfinding and surveys for technologies. The team ConvergeCONSUMER™ is a product-driven business that combines differentiated consumer insights with next-generation decision and experience platforms to help consumer-focused businesses optimize decision making and deliver personalized experiences to drive growth, consumer loyalty and profitability. We operate with the speed and agility of a startup, inside the world’s largest professional services firm. Qualifications Required: A bachelor's degree in Computer Science, Data Engineering, Applied Mathematics, or a similar quantitative field with a minimum of 3+ years of experience as a data engineer. 3+ years of experience with at least one of the following cloud platforms: Microsoft Azure, Google Cloud Platform (GCP), Amazon Web Services (AWS); Azure, GCP, and/or AWS certification preferred. 3+ years of experience in SQL, data transformations, statistical analysis, and troubleshooting across more than one database platform (BigQuery, MySQL, Snowflake, PostgreSQL, etc.). 3+ years of experience in the design and build of data extraction, transformation, and loading processes by writing custom data pipelines. Proven skills in various programming and database languages, i.e., Python, PySpark, SQL. Experience working with either a MapReduce or an MPP system at any size/scale preferred. Familiarity with consumer-related data and a general interest in the retail, consumer products manufacturing, automotive, transportation, or hospitality sectors. A desire to take initiative and continuously provide feedback on improving the products you are responsible for. A general interest in relevant emerging technologies and a constant thirst to further your own technical abilities. Experience working in an Agile development environment. The ability to work well independently and as a team player. Excellent conversational and written communication skills. Experience with BI/analytics tools, such as Tableau, Power BI, or similar tools. Preferred: Master's degree in a relevant field from a top-tier university.
How You’ll Grow At Deloitte, our professional development plan focuses on helping people at every level of their career to identify and use their strengths to do their best work every day. From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to help sharpen skills in addition to hands-on experience in the global, fast-changing business world. From on-the-job learning experiences to formal development programs at Deloitte University, our professionals have a variety of opportunities to continue to grow throughout their career. Explore Deloitte University, The Leadership Center. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Deloitte’s culture Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture where our people excel and lead healthy, happy lives. Learn more about Life at Deloitte. Corporate citizenship Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte’s impact on the world. Recruiter tips We want job seekers exploring opportunities at Deloitte to feel prepared and confident. To help you with your interview, we suggest that you do your research: know some background about the organization and the business area you’re applying to. Check out recruiting tips from Deloitte professionals. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India . Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. 
Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 302244
Posted 1 day ago
3.0 years
4 - 8 Lacs
Calcutta
On-site
Line of Service: Advisory. Industry/Sector: Not Applicable. Specialism: Data, Analytics & AI. Management Level: Manager.
Job Description & Summary: At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. As a business application consulting generalist at PwC, you will provide consulting services for a wide range of business applications. You will leverage a broad understanding of various software solutions to assist clients in optimising operational efficiency through analysis, implementation, training, and support.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Responsibilities: Position responsibilities and expectations
· Designing and building analytical/DL/ML algorithms using Python, R, and other statistical tools.
· Strong data representation and lucid presentation (of analysis/modelling output) using Python, R Markdown, PowerPoint, Excel, etc.
· Ability to learn new scripting languages or analytics platforms.
Technical Skills required (must have)
· Hands-on exposure to Generative AI (design and development of GenAI applications in production).
· Strong understanding of RAG, vector databases, LangChain, and multimodal AI applications.
· Strong understanding of deploying and optimizing AI applications in production.
· Strong knowledge of statistical and data mining techniques like Linear & Logistic Regression analysis, Decision Trees, Bagging, Boosting, Time Series, and Non-parametric analysis.
· Strong knowledge of DL & Neural Network Architectures (CNN, RNN, LSTM, Transformers, etc.).
· Strong knowledge of SQL and R/Python and experience with distributed data/computing tools/IDEs.
· Experience in advanced Text Analytics (NLP, NLU, NLG).
· Strong hands-on experience of end-to-end statistical model development and implementation.
· Understanding of LLMOps and MLOps for scalable ML development.
· Basic understanding of DevOps and deployment of models into production (PyTorch, TensorFlow, etc.).
· Expert-level proficiency in algorithm-building languages like SQL, R, and Python, and data visualization tools like Shiny, Qlik, Power BI, etc.
· Exposure to Cloud Platform (Azure, AWS, or GCP) technologies and services like Azure AI/SageMaker/Vertex AI, AutoML, Azure Index, Azure Functions, OCR, OpenAI, storage, scaling, etc.
Technical Skills required (any one or more)
· Experience in video/image analytics (Computer Vision)
· Experience in IoT/machine logs data analysis
· Exposure to data analytics platforms like Domino Data Lab, c3.ai, H2O, Alteryx, or KNIME
· Expertise in cloud analytics platforms (Azure, AWS, or Google)
· Experience in Process Mining with expertise in Celonis or other tools
· Proven capability in using Generative AI services like OpenAI, Google (Gemini)
· Understanding of agentic AI frameworks (LangGraph, AutoGen, etc.)
· Understanding of fine-tuning for pre-trained models like GPT, LLaMA, Claude, etc. using LoRA, QLoRA, and PEFT techniques.
· Proven capability in building customized models from open-source distributions like Llama, Stable Diffusion
Mandatory skill sets: AI chatbots, Data structures, GenAI, object-oriented programming, IDE, API, LLM Prompts, Streamlit
Preferred skill sets: AI chatbots, Data structures, GenAI, object-oriented programming, IDE, API, LLM Prompts, Streamlit
Years of experience required: 3-10 years
Education qualification: BE, B.Tech, M.Tech, M.Stat, Ph.D., M.Sc. (Stats/Maths)
Education (if blank, degree and/or field of study not specified). Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering. Degrees/Field of Study preferred: Certifications (if blank, certifications not specified)
Required Skills: Data Science
Optional Skills: Accepting Feedback, Active Listening, Analytical Reasoning, Analytical Thinking, Application Software, Business Data Analytics, Business Management, Business Technology, Business Transformation, Coaching and Feedback, Communication, Creativity, Documentation Development, Embracing Change, Emotional Regulation, Empathy, Implementation Research, Implementation Support, Implementing Technology, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Performance Assessment {+ 21 more}
Desired Languages (if blank, desired languages not specified). Travel Requirements: Not Specified. Available for Work Visa Sponsorship? No. Government Clearance Required? No. Job Posting End Date
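To illustrate the kind of RAG work this role describes, here is a minimal retrieval sketch in plain Python and NumPy. The `embed` function, the sample documents, and the retrieve-then-prompt flow are illustrative assumptions, not part of any specific PwC stack; in practice a hosted embedding model and a vector database would replace the toy pieces.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    # Hypothetical embedding function: a real system would call an embedding
    # model (OpenAI, Vertex AI, sentence-transformers, etc.). Words are hashed
    # into a small vector purely so the example runs end to end.
    vec = np.zeros(64)
    for word in text.lower().split():
        vec[hash(word) % 64] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

documents = [
    "Vertex AI lets you train and deploy models on Google Cloud.",
    "RAG retrieves relevant passages and feeds them to an LLM as context.",
    "LoRA fine-tunes large models by learning low-rank weight updates.",
]
doc_vectors = np.stack([embed(d) for d in documents])

def retrieve(query: str, k: int = 2) -> list[str]:
    # Rank documents by cosine similarity to the query embedding
    scores = doc_vectors @ embed(query)
    top = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top]

context = retrieve("How does retrieval-augmented generation work?")
prompt = "Answer using this context:\n" + "\n".join(context)
print(prompt)  # this prompt would then be sent to the LLM
```

A production pipeline would swap the toy embedding for a hosted model and the in-memory array for a vector database, but the retrieve-then-prompt pattern stays the same.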
Posted 1 day ago
6.0 - 9.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Skill: SAP SD Integration. Job description: Cross-functional integration with MM-LE and FI. Must have completed at least 2 implementations. Functional knowledge in SD, MM, HCM, and FI within the Media domain. Should have worked with developers on custom Fiori apps; ideally should have AATP experience. Should have experience with flexible workflows, Vertex, and custom IDocs, with solid integration experience (upstream/downstream) and an understanding of BRF+.
Posted 1 day ago
10.0 - 14.0 years
27 - 30 Lacs
Chennai
Work from Office
Job Title: Consultant Machine Learning Engineer Location: Chennai, India Company: Altimetrik Experience: 8-10 years Job Summary: We are looking for a highly skilled Senior Machine Learning Engineer with strong expertise in Python, SQL, Google Cloud Platform (GCP), BigQuery, Terraform, GCS, Looker, Vertex AI, Airflow, TensorFlow, and modern MLOps tools such as Tekton. The ideal candidate will design, build, and deploy scalable machine learning solutions and data pipelines in production environments, working closely with data scientists, data engineers, and business stakeholders. Roles & Responsibilities: ML Solution Development: Develop, train, and deploy ML models using Python, TensorFlow, and Vertex AI. Optimize model performance, scalability, and maintainability. Data Engineering & Pipelines: Design and implement data pipelines on GCP using Airflow, BigQuery (BQ), GCS, and Terraform for infrastructure automation. Automate model training and deployment workflows using Tekton. Cloud Infrastructure & MLOps: Build robust CI/CD pipelines for ML models and data workflows. Manage infrastructure provisioning and configuration using Terraform. Ensure monitoring, observability, and governance for ML services. Visualization & Insights: Develop dashboards and reports in Looker to communicate insights to business stakeholders. Collaboration: Work cross-functionally with data scientists, product managers, and engineers. Mentor junior engineers and contribute to knowledge sharing across the team. Key Skills: Mandatory Technical Skills: Python (Advanced), SQL (Advanced), GCP services (BigQuery, GCS, Vertex AI, Looker), TensorFlow, Terraform, Airflow, Tekton, MLOps best practices. Additional Desired Skills: Experience with containerization and orchestration (e.g., Docker, Kubernetes). Exposure to Data Security and Governance principles. Familiarity with other ML frameworks (e.g., PyTorch). Qualifications: Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field. 8-10 years of professional experience in machine learning, data engineering, and cloud infrastructure. Proven track record of building and deploying end-to-end ML solutions in production. Why Join Altimetrik? Opportunity to work on cutting-edge AI/ML solutions at enterprise scale. Collaborative culture focused on continuous learning and innovation. Access to the latest tools and technologies in cloud and machine learning. Preferred candidate profile
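As a rough illustration of the Airflow-orchestrated training pipelines this role mentions, the sketch below wires two placeholder tasks into a daily DAG. The DAG id, task names, and stub callables are assumptions made for the example; real pipelines would call BigQuery, Vertex AI, or Tekton-triggered jobs instead of the stub functions.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_features():
    # Placeholder: a real pipeline might run a BigQuery extraction job here
    print("extracting features")

def train_model():
    # Placeholder: a real pipeline might submit a Vertex AI training job here
    print("training model")

with DAG(
    dag_id="example_training_pipeline",  # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_features", python_callable=extract_features)
    train = PythonOperator(task_id="train_model", python_callable=train_model)
    extract >> train  # training runs only after feature extraction succeeds
```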
Posted 1 day ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Position: MLOps Engineer. Experience: 3–5 Years. Location: Hyderabad (Onsite). Job Description: We are seeking a skilled and motivated MLOps Engineer with 3–5 years of experience in building and maintaining machine learning infrastructure and deployment pipelines. The ideal candidate will have a solid foundation in DevOps and a deep understanding of ML workflows in production environments. Required Skills: Hands-on experience with MLOps platforms: MLflow, Kubeflow. Strong proficiency in Python and modular code development. Proficient with Docker and CI/CD tools (Jenkins, GitLab CI). Familiarity with Kubernetes and container orchestration. IaC tools: Terraform, Ansible. Monitoring tools: Prometheus, Grafana, Datadog, AWS CloudWatch. Experience with AWS SageMaker, Google Vertex AI, or Databricks. Competency in Linux, shell scripting, and command-line operations. Solid understanding of microservices, load balancing, and service discovery. Proficiency with Git and version control workflows. Knowledge of ML fundamentals and lifecycle. Qualifications: Bachelor’s in Computer Science or related field. 3–5+ years in MLOps/SRE/Software Engineering. 2+ years working on ML/AI systems in production. Certification (e.g., AWS DevOps Engineer – Professional) is a plus.
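For context on the MLflow experience listed above, here is a minimal experiment-tracking sketch: train a model, then log a parameter, a metric, and the model artifact. The experiment name, dataset, and model choice are arbitrary illustrations; pointing MLflow at a remote tracking server or a model registry is configuration layered on top of the same calls.

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

# Synthetic data so the example is self-contained
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demo-experiment")  # illustrative experiment name
with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)
    score = f1_score(y_test, model.predict(X_test))

    mlflow.log_param("n_estimators", 100)   # hyperparameter for this run
    mlflow.log_metric("f1", score)          # evaluation metric for this run
    mlflow.sklearn.log_model(model, "model")  # model stored as a run artifact
```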
Posted 1 day ago
2.0 - 4.0 years
6 - 12 Lacs
Hyderabad
Work from Office
Key Responsibilities: Support cutover execution activities related to indirect tax configurations and data migration. Perform reconciliation of tax-relevant data between the legacy and new ERP systems to ensure completeness and accuracy. Validate and verify indirect tax outcomes on transactions in the new ERP system, ensuring compliance with applicable tax laws. Act as a first line of support during hypercare to triage, track, and resolve indirect tax issues post go-live. Drive issue resolution by coordinating with BT, Tax, Controllership, and business process owners to identify root causes and implement fixes. Collaborate with stakeholders to ensure smooth functioning of upstream and downstream systems affecting tax outcomes. Maintain detailed documentation of configurations, issues, resolutions, and system behaviour during cutover and hypercare. Provide ad-hoc support for training and change management efforts related to tax processes and tools. Required Skills: ERP Knowledge: Familiarity with ERP systems (e.g., Workday, Oracle) including navigation, configuration, and tax-related modules. Indirect Tax Expertise: Strong understanding of indirect tax types and compliance (e.g., VAT, GST, Sales & Use Tax), tax determination logic, and reporting requirements. Tax Technology: Hands-on experience with tax engines (e.g., Vertex) and their integration with ERP systems. Exposure to global taxation. Data Reconciliation: Ability to analyze and reconcile tax-relevant data between legacy and new systems to ensure accuracy and completeness. Issue Resolution: Strong troubleshooting skills to investigate tax discrepancies and coordinate timely resolution with BT and Tax teams. System Integration Knowledge: Understanding of upstream/downstream system impacts on tax (e.g., procurement, sales, billing, master data). Communication Skills: Clear and effective communication with cross-functional teams (Finance, IT, Tax, Operations). Time Management: Ability to work under tight deadlines.
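The reconciliation responsibility above is essentially a keyed comparison of tax amounts between two extracts. A pandas sketch of that check follows; the file names, column names, and the one-cent tolerance are assumptions for illustration, not a reference to any particular ERP or tax-engine export format.

```python
import pandas as pd

# Hypothetical extracts keyed by invoice; column names are assumed
legacy = pd.read_csv("legacy_tax_extract.csv")    # invoice_id, tax_amount
new_erp = pd.read_csv("new_erp_tax_extract.csv")  # invoice_id, tax_amount

merged = legacy.merge(
    new_erp, on="invoice_id", how="outer",
    suffixes=("_legacy", "_new"), indicator=True,
)
merged["delta"] = merged["tax_amount_new"] - merged["tax_amount_legacy"]

missing = merged[merged["_merge"] != "both"]  # present in only one system
mismatched = merged[(merged["_merge"] == "both") & (merged["delta"].abs() > 0.01)]

print(f"{len(missing)} invoices present in only one system")
print(f"{len(mismatched)} invoices with tax differences above tolerance")
mismatched.to_csv("tax_reconciliation_exceptions.csv", index=False)
```

The exception file then becomes the worklist for root-cause analysis with the Tax and BT teams during hypercare.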
Posted 2 days ago
6.0 - 10.0 years
14 - 19 Lacs
Coimbatore
Work from Office
We are seeking a Senior Data & AI/ML Engineer with deep expertise in GCP, who will not only build intelligent and scalable data solutions but also champion our internal capability building and partner-level excellence. This is a high-impact role for a seasoned engineer who thrives in designing GCP-native AI/ML-enabled data platforms. You'll play a dual role as a hands-on technical lead and a strategic enabler, helping drive our Google Cloud Data & AI/ML specialization track forward through successful implementations, reusable assets, and internal skill development. Preferred Qualification: GCP Professional Certifications: Data Engineer or Machine Learning Engineer. Experience contributing to a GCP Partner specialization journey. Familiarity with Looker, Data Catalog, Dataform, or other GCP data ecosystem tools. Knowledge of data privacy, model explainability, and AI governance is a plus. Work Location: Remote. Key Responsibilities: Data & AI/ML Architecture: Design and implement data architectures for real-time and batch pipelines, leveraging GCP services such as BigQuery, Dataflow, Dataproc, Pub/Sub, Vertex AI, and Cloud Storage. Lead the development of ML pipelines, from feature engineering to model training and deployment using Vertex AI, AI Platform, and Kubeflow Pipelines. Collaborate with data scientists to operationalize ML models and support MLOps practices using Cloud Functions, CI/CD, and Model Registry. Define and implement data governance, lineage, monitoring, and quality frameworks. Google Cloud Partner Enablement: Build and document GCP-native solutions and architectures that can be used for case studies and specialization submissions. Lead client-facing PoCs or MVPs to showcase AI/ML capabilities using GCP. Contribute to building repeatable solution accelerators in Data & AI/ML. Work with the leadership team to align with Google Cloud Partner Program metrics. Team Development: Mentor engineers and data scientists toward achieving GCP certifications, especially in Data Engineering and Machine Learning. Organize and lead internal GCP AI/ML enablement sessions. Represent the company in Google partner ecosystem events, tech talks, and joint GTM engagements. What We Offer: Best-in-class packages. Paid holidays and flexible time-off policies. Casual dress code and a flexible working environment. Opportunities for professional development in an engaging, fast-paced environment. Medical insurance covering self and family up to 4 lakhs per person. Diverse and multicultural work environment.
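As a sketch of the Vertex AI / Kubeflow Pipelines work described above, the example below defines two placeholder components and compiles them into a pipeline spec with the KFP v2 SDK. The component bodies, names, and output file are illustrative assumptions; the compiled JSON would typically be submitted for execution as a Vertex AI PipelineJob.

```python
from kfp import compiler, dsl

@dsl.component(base_image="python:3.10")
def prepare_features(rows: int) -> str:
    # Placeholder feature-engineering step
    return f"prepared {rows} rows"

@dsl.component(base_image="python:3.10")
def train_model(features: str) -> str:
    # Placeholder training step
    return f"trained on: {features}"

@dsl.pipeline(name="example-training-pipeline")  # illustrative name
def training_pipeline(rows: int = 1000):
    prep = prepare_features(rows=rows)
    train_model(features=prep.output)  # runs after prepare_features completes

# Produces a pipeline spec that Vertex AI Pipelines can execute
compiler.Compiler().compile(training_pipeline, "training_pipeline.json")
```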
Posted 2 days ago
10.0 - 14.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Skill required: Statutory Reporting & Tax Compliance - Indirect Tax Processing Designation: Tax Associate Manager Qualifications: BCom/MCom/Master of Business Administration Years of Experience: 10 to 14 years About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com What would you do: You will be aligned with our Finance Operations vertical and will be helping us in determining financial outcomes by collecting operational data/reports, whilst conducting analysis and reconciling transactions. The process of generating source data and performing required analysis to support indirect periodic (monthly, quarterly, annual) tax filings, based on the client's interpretation of statutory and tax requirements. Examples include VAT, local sales and use taxes, property and income (business) taxes. Preferred Qualifications: Experience: Previous experience in tax analysis or tax technology roles, particularly involving Vertex. Certifications: CPA, CMI, or equivalent certification is a plus. Software Proficiency: Familiarity with data visualization tools and advanced Excel functions. What are we looking for: GBS Tax Team is seeking a Data and Tax Analyst with specialized expertise in the Vertex tax system. The successful candidate will manage and analyze tax-related data, ensuring compliance and optimizing tax strategies through effective use of Vertex. Technical Skills: Proficiency in Vertex tax software, including configuration and maintenance. Strong SQL skills for data extraction and analysis. Experience with ERP systems and their integration with tax software. Analytical Skills: Ability to interpret complex tax data and provide actionable insights. Problem-Solving: Demonstrated ability to identify issues within tax systems and develop effective solutions. Communication: Excellent verbal and written communication skills for effective collaboration across teams. Ownership: Proactive approach with a strong sense of ownership and accountability for tasks and projects. Certifications: CPA - Certified Public Accountant Roles and Responsibilities: Vertex Tax System Management: Administer and maintain the Vertex tax software, ensuring accurate configuration and integration with financial systems. Tax Code Updates: Monitor and implement changes in tax codes within the Vertex system to ensure compliance with evolving tax regulations. This includes staying informed about tax rate changes and jurisdictional tax data updates. Tax Data Analysis: Extract and analyze tax data using SQL to identify trends, discrepancies, and opportunities for tax optimization. Compliance and Reporting: Utilize Vertex to prepare and file tax returns, ensuring compliance with all applicable tax regulations and deadlines. Issue Resolution: Troubleshoot and resolve issues related to tax calculations and reporting within the Vertex system. Process Improvement: Identify and implement enhancements to tax data management and reporting processes within Vertex to increase efficiency and accuracy. Cross-functional Collaboration: Work closely with finance, legal, and IT teams to ensure seamless integration and operation of Vertex within the broader financial ecosystem.
Qualification: BCom, MCom, Master of Business Administration
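The SQL-based tax data analysis called for in this role usually starts with an aggregate extract of tax lines by jurisdiction and tax code. The sketch below shows one way to pull such a summary into pandas; the connection string, table, and column names are illustrative assumptions and would differ in any real ERP or Vertex reporting schema.

```python
import pandas as pd
from sqlalchemy import create_engine

# Illustrative connection; a real setup would point at the ERP reporting database
engine = create_engine("postgresql://user:password@reporting-host/erp")

query = """
SELECT jurisdiction,
       tax_code,
       SUM(taxable_amount) AS taxable_amount,
       SUM(tax_amount)     AS tax_amount
FROM   invoice_tax_lines          -- assumed table name
WHERE  posting_date >= DATE '2024-01-01'
GROUP  BY jurisdiction, tax_code
ORDER  BY tax_amount DESC
"""

summary = pd.read_sql(query, engine)  # aggregate tax by jurisdiction and code
print(summary.head(10))
```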
Posted 2 days ago
8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About Spyne At Spyne, we’re reimagining how cars are bought and sold globally with cutting-edge Generative AI. What started as a bold idea — using AI to help auto dealers sell faster online — has evolved into a full-fledged, AI-first automotive retail ecosystem. Backed by $16M in Series A from Accel, Vertex Ventures, and other top investors, we’re shaping the future of car retail: AI-powered image, video, and 360° solutions for automotive dealers GenAI Retail Suite for Inventory, Marketing, and CRM across global markets 1,500+ dealers onboarded across the US, EU, and key markets Team of 150+ passionate individuals, equally split across R&D and GTM Explore Spyne AI Studio | Series A Announcement Location: Gurugram | On-site | Full-time Role Overview We’re looking for a hands‑on Lead – Demand Generation to own the growth funnel — from acquiring quality traffic to converting Marketing Qualified Leads (MQLs) into revenue. You will be responsible for planning, executing, and optimizing demand generation campaigns across organic, paid, and account-based marketing efforts. If you have deep experience in SEO, CRO, and funnel management, and are passionate about driving revenue impact, we’d love to meet you! What Will You Do? ⚡️ SEO & Organic Growth Lead end‑to‑end SEO strategy — site audits, keyword research, backlink planning, and site optimization Collaborate closely with Product, Marketing, and Engineering to implement and measure impact Stay updated with search engine trends, best practices, and algorithm changes 📈 Conversion Rate Optimization (CRO) Build, test, and optimize landing pages and conversion flows for higher traffic‑to‑lead conversion Lead A/B testing, user experience reviews, and site optimization efforts Analyze site behavior and implement recommendations for better conversion metrics 🎯 Demand Generation & Funnel Management Plan, launch, and manage multi‑channel demand generation campaigns across paid and organic platforms Measure campaign effectiveness across metrics like CPL, CAC, MQL‑to‑Revenue conversion, and pipeline velocity Lead nurturing efforts to drive high‑quality MQLs and enable seamless conversion to revenue 💡 Analytics & Optimization Maintain and manage dashboards for campaign performance across traffic, lead, MQL, and revenue metrics Identify trends, bottlenecks, and growth opportunities within the marketing funnel Leverage marketing automation platforms (HubSpot, Marketo, or equivalent) for nurturing and conversion optimization What We’re Looking For 5–8 years of experience in Demand Generation roles, with a focus on SEO, CRO, and MQL‑to‑Revenue funnel optimization Strong knowledge of SEO best practices, site analytics, and conversion optimization techniques Hands‑on experience managing platforms like Google Analytics, SEMrush/Ahrefs, HubSpot, Marketo, Salesforce, or similar A data‑driven approach with a proven track record of optimizing marketing funnel metrics Strong communication and stakeholder management abilities across Product, Sales, and Marketing teams An agile mindset and comfort working in a high‑growth, fast‑paced SaaS environment Why Join Spyne?
🚀 High‑Growth Company – We’ve scaled revenues 5X in 15 months and are poised for another 3–4X growth in the coming year 👥 High Ownership Culture – Autonomy, accountability, and room to make an impact every day 💻 Best‑in‑Class Tools – Laptop of your choice + access to premium marketing platforms and analytics tools 🌍 Global Impact – Build marketing strategies that drive results across the US, EU, and beyond 🥇 Learning Culture – We hire sharp minds and foster a space for them to learn, experiment, and evolve 📈 Exponential Growth – Join at the best time and scale your role, expertise, and career 10X with the company If you’re passionate about scaling growth , optimizing conversion , and making a global impact, we’d love to have you on the journey!
Posted 2 days ago
5.0 - 10.0 years
7 - 13 Lacs
Mumbai
Work from Office
Design, develop, and maintain best-practice, scalable, and efficient CPQ models that meet global and local business requirements. Progressive maintenance of document proposals based on specifications. Design and conduct testing scenarios. Work with business users to understand business requirements and then translate those requirements or high-level user stories into detailed user stories and technical issues for model development or CPQ parametrization. Provide assistance to CPQ users by answering questions, resolving technical problems, and escalating issues that cannot be solved to other departments or the CPQ vendor. Communicate relevant changes on models, templates, or new enhancements to CPQ users. Make recommendations based on own findings or user feedback regarding the implementation of new CPQ initiatives and platform enhancements. Responsible for maintaining the CPQ models from development through the entire lifecycle. Responsible for managing change control of the CPQ models and executing updates in a timely manner. Responsible for working with other IT experts to integrate CPQ models, using the CPI platform, with other systems as needed; other systems include but are not limited to SAP ECC, Salesforce, PriceFx, Vertex, etc. Prepare documentation and provide user training. Bachelor's Degree in IT or Engineering required, with 5 years of experience in developing and implementing SAP VC models using SAP CPQ and 2+ years of experience as Product or Configurator Manager in industrial companies using SAP CPQ. Good understanding of sales business processes. Proficiency in Salesforce Sales and eCommerce platform desired.
Posted 2 days ago
7.0 - 10.0 years
9 - 12 Lacs
Hyderabad
Work from Office
Gather requirements from business stakeholders regarding tax-specific needs. Document and understand the client's current tax processes and configurations. Solution Design: Design SAP-Vertex integration points based on the client's business processes. Create detailed design documentation, ensuring it aligns with tax regulations and client's business needs. Configuration: Configure SAP to integrate seamlessly with Vertex. Set up tax-related master data and ensure its correctness. Custom Development: Identify areas where standard integration might not suffice, and custom developments are needed. Work with the technical team to develop, test, and implement custom solutions. Testing: Conduct unit tests to ensure individual components work as expected. Support business users in User Acceptance Testing (UAT) to validate the overall solution. Document test cases and outcomes. Deployment: Assist in the migration of configurations and custom developments to the production environment. Monitor post-go-live activities to ensure smooth operation. Training and Documentation: Train end-users on how the integrated system functions. Create user manuals and documentation to support the solutions implemented.
Posted 2 days ago
0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Minimum of 3+ years of experience in AI-based application development. Fine-tune pre-existing models to improve performance and accuracy. Experience with TensorFlow or PyTorch, Scikit-learn, or similar ML frameworks, and familiarity with APIs like OpenAI or Vertex AI. Experience with NLP tools and libraries (e.g., NLTK, SpaCy, GPT, BERT). Implement frameworks like LangChain, Anthropic's Constitutional AI, OpenAI's, Hugging Face, and prompt engineering techniques to build robust and scalable AI applications. Evaluate and analyze RAG solutions and utilise the best-in-class LLMs to define customer experience solutions (fine-tune Large Language Models (LLMs)). Architect and develop advanced generative AI solutions leveraging state-of-the-art language models (LLMs) such as GPT, LLaMA, PaLM, BLOOM, and others. Strong understanding and experience with open-source multimodal LLM models to customize and create solutions. Explore and implement cutting-edge techniques like Few-Shot Learning, Reinforcement Learning, Multi-Task Learning, and Transfer Learning for AI model training and fine-tuning. Proficiency in data preprocessing, feature engineering, and data visualization using tools like Pandas, NumPy, and Matplotlib. Optimize model performance through experimentation, hyperparameter tuning, and advanced optimization techniques. Proficiency in Python with the ability to get hands-on with coding at a deep level. Develop and maintain APIs using Python's FastAPI, Flask, or Django for integrating AI capabilities into various systems. Ability to write optimized and high-performing scripts on relational databases (e.g., MySQL, PostgreSQL) or non-relational databases (e.g., MongoDB or Cassandra). Enthusiasm for continuous learning and professional development in AI and related technologies. Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions. Knowledge of cloud services like AWS, Google Cloud, or Azure. Proficiency with version control systems, especially Git. Familiarity with data pre-processing techniques and pipeline development for AI model training. Experience with deploying models using Docker, Kubernetes. Experience with AWS Bedrock and SageMaker is a plus. Strong problem-solving skills with the ability to translate complex business problems into AI solutions.
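Since the posting asks for APIs built with FastAPI to expose AI capabilities, here is a minimal sketch of such an endpoint. The route, request schema, and the `generate_reply` stub are assumptions for illustration; in practice the stub would call an LLM provider such as OpenAI or Vertex AI.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Prompt(BaseModel):
    text: str

def generate_reply(prompt: str) -> str:
    # Stub: a real implementation would call an LLM API here
    return f"echo: {prompt}"

@app.post("/generate")
def generate(prompt: Prompt) -> dict:
    # FastAPI validates the request body against the Prompt schema,
    # then the handler returns a JSON response
    return {"reply": generate_reply(prompt.text)}

# Run locally with: uvicorn main:app --reload  (assuming this file is main.py)
```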
Posted 2 days ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About the job: We are seeking an experienced AI/NLP Engineer to join our team. The ideal candidate will have expertise in working with large language models and AI-based tools, strong analytical skills, and experience developing, testing, and refining AI-driven applications. Know your team (“Legacy Rewired, Engineering the Future”): At ValueMomentum’s Engineering Centre, we are a team of passionate engineers who thrive on tackling complex business challenges with innovative solutions while transforming the P&C insurance value chain. We achieve this through a strong engineering foundation and by continuously refining our processes, methodologies, tools, agile delivery teams, and core engineering archetypes. Our core expertise lies in six key areas: Cloud Engineering, Application Engineering, Data Engineering, Core Engineering, Quality Engineering, and Domain expertise. Relevant experience in the same area should be more than 3 years. ✔ Programming: Python, TensorFlow, PyTorch, Scikit-learn ✔ MLOps & Deployment: Docker, Kubernetes, MLflow, Airflow ✔ Cloud: AWS (SageMaker), GCP (Vertex AI), Azure ML ✔ Big Data: Spark, Kafka, Hadoop ✔ Databases: SQL, NoSQL, GraphDBs ✔ DevOps: CI/CD, GitHub Actions, Terraform ✔ Optimization: ONNX, TensorRT, Pruning & Quantization. Feature: AI Engineer. Focus: Building and deploying AI systems. Responsibilities: Developing algorithms, deploying models, ensuring scalability. Skills: Strong programming, AI frameworks, cloud computing. Goal: Create AI-powered solutions.
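As a small example of the NLP side of this role, the sketch below trains a TF-IDF plus logistic regression classifier on a handful of made-up claim descriptions. The texts, labels, and categories are invented purely for illustration; a real P&C insurance workflow would use transformer models and much larger labelled datasets.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

# Tiny invented dataset: claim notes mapped to a line of business
texts = [
    "water damage to kitchen ceiling after pipe burst",
    "rear-end collision at traffic light, bumper damage",
    "hail damage to roof shingles and gutters",
    "vehicle stolen from apartment parking lot overnight",
]
labels = ["property", "auto", "property", "auto"]

clf = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),   # word and bigram features
    ("model", LogisticRegression(max_iter=1000)),     # simple linear classifier
])
clf.fit(texts, labels)

# Prints the predicted line of business for a new claim note
print(clf.predict(["windshield cracked by road debris"]))
```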
Posted 2 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Before applying for a job, select your preferred language from the options available at the top right of this page. Discover your next opportunity within an organization that ranks among the world's 500 largest companies. Consider innovative opportunities, discover our rewarding culture, and work with talented teams that push you to grow every day. We know what it takes to lead UPS into the future: passionate people with a unique combination of skills. If you have the qualities, motivation, autonomy, or leadership to lead teams, there are roles suited to your aspirations and skills, today and tomorrow. Job Summary: We are seeking a highly skilled MLOps Engineer to design, deploy, and manage machine learning pipelines in Google Cloud Platform (GCP). In this role, you will be responsible for automating ML workflows, optimizing model deployment, ensuring model reliability, and implementing CI/CD pipelines for ML systems. You will work with Vertex AI, Kubernetes (GKE), BigQuery, and Terraform to build scalable and cost-efficient ML infrastructure. The ideal candidate must have a good understanding of ML algorithms and experience in model monitoring, performance optimization, Looker dashboards, and infrastructure as code (IaC), ensuring ML models are production-ready, reliable, and continuously improving. You will be interacting with multiple technical teams, including architects and business stakeholders, to develop state-of-the-art machine learning systems that create value for the business. Responsibilities: Managing the deployment and maintenance of machine learning models in production environments and ensuring seamless integration with existing systems. Monitoring model performance using metrics such as accuracy, precision, recall, and F1 score, and addressing issues like performance degradation, drift, or bias. Troubleshoot and resolve problems, maintain documentation, and manage model versions for audit and rollback. Analyzing monitoring data to preemptively identify potential issues and providing regular performance reports to stakeholders. Optimization of queries and pipelines. Modernization of applications whenever required. Qualifications: Expertise in programming languages like Python and SQL. Solid understanding of best MLOps practices and concepts for deploying enterprise-level ML systems. Understanding of Machine Learning concepts, models, and algorithms, including traditional regression, clustering models, and neural networks (including deep learning, transformers, etc.). Understanding of model evaluation metrics, model monitoring tools, and practices. Experienced with GCP tools like BigQuery ML, MLOps, Vertex AI Pipelines (Kubeflow Pipelines on GCP), Model Versioning & Registry, Cloud Monitoring, Kubernetes, etc. Solid oral and written communication skills and ability to prepare detailed technical documentation of new and existing applications. Strong ownership and collaborative qualities in their domain. Takes initiative to identify and drive opportunities for improvement and process streamlining. Bachelor’s Degree in a quantitative field of mathematics, computer science, physics, economics, engineering, statistics (operations research, quantitative social science, etc.), international equivalent, or equivalent job experience. Bonus Qualifications: Experience in Azure MLOps; familiarity with Cloud Billing.
Experience in setting up or supporting NLP, GenAI, and LLM applications with MLOps features. Experience working in an Agile environment; understanding of Lean Agile principles. Contract type: Permanent (CDI). At UPS, equal opportunity, fair treatment, and an inclusive work environment are key values to which we are committed.
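The monitoring duties described above boil down to computing classification metrics on fresh labelled data and watching for shifts in the prediction distribution. A simple sketch follows; the arrays, baseline rate, and the 10-point drift threshold are made-up values for illustration, and production monitoring would typically rely on Vertex AI Model Monitoring or similar tooling rather than an ad-hoc script.

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

# Made-up ground-truth labels and deployed-model predictions for one window
y_true = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 0])
y_pred = np.array([1, 0, 0, 1, 0, 1, 1, 0, 1, 0])

metrics = {
    "accuracy": accuracy_score(y_true, y_pred),
    "precision": precision_score(y_true, y_pred),
    "recall": recall_score(y_true, y_pred),
    "f1": f1_score(y_true, y_pred),
}
for name, value in metrics.items():
    print(f"{name}: {value:.3f}")

# Crude drift signal: compare the current positive-prediction rate with a baseline
baseline_positive_rate = 0.45              # assumed value from training data
current_positive_rate = float(y_pred.mean())
if abs(current_positive_rate - baseline_positive_rate) > 0.10:
    print("Alert: prediction distribution has drifted from the baseline")
```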
Posted 2 days ago
India has seen a rise in demand for professionals with expertise in Vertex, a cloud-based tax technology solution. Companies across various industries are actively seeking individuals with skills in Vertex to manage their tax compliance processes efficiently. If you are a job seeker looking to explore opportunities in this field, read on to learn more about the Vertex job market in India.
The salary range for Vertex professionals in India varies based on experience levels. Entry-level professionals can expect to earn around INR 4-6 lakhs per annum, while experienced professionals with several years in the industry can earn upwards of INR 12-15 lakhs per annum.
In the Vertex domain, a typical career progression path may include roles such as Tax Analyst, Tax Consultant, Tax Manager, and Tax Director. Professionals may advance from Junior Tax Analyst to Senior Tax Analyst, and eventually take on leadership roles as Tax Managers or Directors.
Alongside expertise in Vertex, professionals in this field are often expected to have skills in tax compliance, tax regulations, accounting principles, and data analysis. Knowledge of ERP systems and experience in tax software implementation can also be beneficial.
As you explore job opportunities in the Vertex domain in India, remember to showcase your expertise, skills, and experience confidently during interviews. Prepare thoroughly for technical questions and demonstrate your understanding of tax compliance processes. With dedication and continuous learning, you can build a successful career in Vertex roles. Good luck!