1266 Vertex Jobs - Page 4

JobPe aggregates these listings for easy access; applications are submitted directly on the original job portal.

2.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About Spyne
At Spyne, we are transforming how cars are marketed and sold with cutting-edge Generative AI. What started as a bold idea—using AI-powered visuals to help auto dealers sell faster online—has now evolved into a full-fledged, AI-first automotive retail ecosystem. Backed by $16M in Series A funding from Accel, Vertex Ventures, and other top investors, we're scaling at breakneck speed:
Launched industry-first AI-powered Image, Video & 360° solutions for automotive dealers
Launching a Gen AI-powered Automotive Retail Suite to power Inventory, Marketing, and CRM for dealers
Onboarded 1,500+ dealers across the US, EU, and other key markets in the two years since launch
Gearing up to onboard 10K+ dealers across a global market of 200K+ dealers
150+ member team with a near-equal split between R&D and GTM
Learn more about our products: Spyne AI Products - StudioAI, RetailAI
Series A Announcement - CNBC-TV18, Yourstory

What are we looking for?
We are looking for Account Executives who can sell, hustle, and thrive in chaos. You will work on B2B SaaS sales and drive revenue growth, and will be responsible for building relationships with car dealers while helping shape the overall sales strategy in the US region.
📍 Location: Gurugram (Work from Office, 5 days a week)
🌎 Shift Timings: US Shift (6 PM – 3 AM IST)

🚀 Why this role?
At Spyne, we believe that a high-performing sales team is the key to growth. As an Account Executive - US, you will be the driving force behind our expansion in the US market. This role offers the opportunity to work with a rapidly growing AI tech company, build strong client relationships, and make a significant impact on our business. If you are passionate about sales, thrive in a fast-paced environment, and want to be part of an exciting growth journey, this role is for you!

📌 What will you do?
Generate leads, nurture prospects, and close high-value deals.
Own and exceed annual sales targets.
Deliver compelling product demos and refine sales strategies.
Build long-term client relationships and explore upsell opportunities.
Develop and execute account growth plans.
Communicate value propositions effectively.
Use a Challenger-based selling approach to close complex deals.

🏆 What will make you successful in this role?
Deep understanding of the US market: experience selling to SMBs and enterprise accounts in the US.
Sales expertise and strong communication skills: a proven track record of achieving and exceeding sales targets; ability to engage clients effectively via phone, email, and presentations.
Relationship-building skills: ability to foster long-term, trusting relationships with clients.
Strategic thinking: ability to identify and execute upselling and cross-selling opportunities.
Process-oriented mindset: proficiency with CRM software like HubSpot to manage and track sales efforts.

📊 What will a typical quarter at Spyne look like?
Identify and close high-value deals with SMBs and enterprise clients in the US market.
Conduct engaging product demos to convert prospects into long-term customers.
Build and manage a strong sales pipeline while exceeding set targets.
Collaborate with internal teams to ensure a seamless customer experience.
Track sales metrics and refine strategies to improve conversion rates and revenue.

🔹 How will we set you up for success?
Comprehensive onboarding and training to help you understand our product and market.
Continuous learning and development opportunities to enhance your sales skills.
A culture that values innovation, customer obsession, and long-term success.
Access to cutting-edge AI technology that makes selling more effective and impactful.

🎯 What you must have?
Bachelor's or Master's degree with 3-5 years of relevant sales experience (US sales experience preferred).
Experience working with SMB/enterprise accounts in the US market.
Proficiency with HubSpot or other CRM software.
Prior experience as a sales professional with a track record of achieving sales quotas.

🚀 Why Spyne?
Strong culture: a supportive and collaborative work environment.
Transparency & trust: high levels of autonomy and decision-making.
Competitive salary & equity: stock options for top performers.
Health insurance: coverage for employees and dependents, including GMC, GPA, and GTLI.
Dynamic growth environment: join a high-growth startup and accelerate your career.

📢 If you're a go-getter, passionate about B2B SaaS sales, and excited to drive revenue growth in a Series A startup, apply now! 🚀

Posted 4 days ago

Apply

10.0 - 15.0 years

10 - 14 Lacs

Ahmedabad

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: SAP FI S/4HANA Accounting
Good-to-have skills: NA
Minimum 12 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems that apply across multiple teams.

Roles & Responsibilities:
Expertise in Tax and its application in SAP Finance modules and aligned business processes.
Own end-to-end responsibility for the Indirect Tax solution being deployed in S/4HANA.
Extensive experience with SAP FI tax configuration.
Knowledge of country-specific indirect tax scenarios with applicable business scenarios.
Cross-functional knowledge of OTC, PTP, and Intercompany.
Prepare functional specification documents for enhancements related to IDT.
Design, build, and support integrations to external tax authorities where needed.
Ability to understand complex business/tax requirements within the business processes and communicate effectively with a non-technical audience.
Support testing (Integration, UAT), cutover, go-live, and hypercare.
Experience with Vertex interface administration (added advantage).

Professional & Technical Skills:
Must-have skills: proficiency in SAP FI/CO Finance.
Extensive knowledge of S/4HANA implementation of Tax & FICO.
Minimum 6-10+ years of experience in implementation, application enhancement, or production support projects in Finance.
Minimum experience of 4+ implementation projects.
Knowledge and hands-on experience across all SAP Finance modules and DevOps methodology.
Good to have: hands-on knowledge and experience with support tools such as ServiceNow, JIRA, and ChaRM.
Effective business communication and stakeholder management skills.

Additional Information:
- The candidate should have a minimum of 12 years of experience in SAP FI/CO Finance.
- This position is based at our Hyderabad office.
- A 15-year full-time education is required.

Qualification: 15 years full time education

Posted 4 days ago

Apply

1.0 years

0 Lacs

Noida, Uttar Pradesh, India

Remote

Position – Sales Coordinator
Company – Attentive OS Pvt Ltd
Location – Remote, India
Department – Growth

About Attentive.ai
Attentive.ai is a fast-growing vertical SaaS startup backed by Peak XV (Surge), InfoEdge, and Vertex Ventures. We build innovative software solutions for the landscape, paving, and construction industries in the United States. Our mission is to help these businesses improve operations and win more work through AI-powered takeoffs and a streamlined software platform. We're looking for a resourceful and highly motivated professional to join our Growth team. This role will support sales execution, deal flow operations, partner outreach, and executive-level initiatives, making it ideal for someone who thrives in a fast-paced, high-ownership support role.

Job Description
The ideal candidate is a self-starter who brings structure, initiative, and attention to detail. This role will support Account Executives, assist the President of Field Services, and act as a key communication bridge between our internal teams and external stakeholders. You'll work across CRM, partner communications, customer preparation, and executive projects to ensure smooth sales execution and strategic growth initiatives.

Responsibilities of the Role
Manage deal flow and communication on behalf of Account Executives, including outreach, follow-ups, and recap emails.
Assist in CRM management (HubSpot): ensuring pipeline hygiene, updating deal data, logging call notes, and maintaining accuracy across records.
Support credential creation and routing of free trials for the sales team.
Collaborate directly with the President of Field Services on customer follow-ups, proposal development, partner outreach, and strategic initiatives.
Draft emails, memos, and proposals; create both internal and customer-facing decks and supporting materials from scratch.
Pull together data from various internal sources and synthesize it into structured documents with initial insights.
Participate in select customer and partner meetings to support note-taking, documentation, and follow-up.
Assist in preparing agendas, customer correspondence, and partner updates for ongoing executive-level accounts and initiatives.

Requirements for the Role
1+ years' experience in a similar sales support, business operations, or executive assistant role within a B2B/SaaS environment (preferred).
Experience working with North American teams and availability during EST business hours (7 am - 4 pm EST).
Proficiency in Google Suite and Slack; familiarity with Notion and HubSpot is a plus.
Excellent written and verbal communication skills across both business formal and conversational styles.
High attention to detail, organizational strength, and the ability to manage multiple priorities independently.
Professional discretion and sound judgment when working with sensitive business information.
Traits that will help you thrive: resourcefulness, initiative, and a strong sense of urgency.

Posted 4 days ago

Apply

5.0 years

0 Lacs

India

On-site

Role Overview:
We are looking for a skilled and versatile AI Infrastructure Engineer (DevOps/MLOps) to build and manage the cloud infrastructure, deployment pipelines, and machine learning operations behind our AI-powered products. You will work at the intersection of software engineering, ML, and cloud architecture to ensure that our models and systems are scalable, reliable, and production-ready.

Key Responsibilities:
Design and manage CI/CD pipelines for both software applications and machine learning workflows.
Deploy and monitor ML models in production using tools like MLflow, SageMaker, Vertex AI, or similar.
Automate the provisioning and configuration of infrastructure using IaC tools (Terraform, Pulumi, etc.).
Build robust monitoring, logging, and alerting systems for AI applications.
Manage containerized services with Docker and orchestration platforms like Kubernetes.
Collaborate with data scientists and ML engineers to streamline model experimentation, versioning, and deployment.
Optimize compute resources and storage costs across cloud environments (AWS, GCP, or Azure).
Ensure system reliability, scalability, and security across all environments.

Requirements:
5+ years of experience in DevOps, MLOps, or infrastructure engineering roles.
Hands-on experience with cloud platforms (AWS, GCP, or Azure) and services related to ML workloads.
Strong knowledge of CI/CD tools (e.g., GitHub Actions, Jenkins, GitLab CI).
Proficiency in Docker, Kubernetes, and infrastructure-as-code frameworks.
Experience with ML pipelines, model versioning, and ML monitoring tools.
Scripting skills in Python, Bash, or similar for automation tasks.
Familiarity with monitoring/logging tools (Prometheus, Grafana, ELK, CloudWatch, etc.).
Understanding of ML lifecycle management and reproducibility.

Preferred Qualifications:
Experience with Kubeflow, MLflow, DVC, or Triton Inference Server.
Exposure to data versioning, feature stores, and model registries.
Certification in AWS/GCP DevOps or Machine Learning Engineering is a plus.
Background in software engineering, data engineering, or ML research is a bonus.

What We Offer:
Work on cutting-edge AI platforms and infrastructure.
Cross-functional collaboration with top ML, research, and product teams.
Competitive compensation package – no constraints for the right candidate.

To apply, send your CV to thasleema@qcentro.com.
Job Type: Permanent
Ability to commute/relocate: Thiruvananthapuram District, Kerala: reliably commute or plan to relocate before starting work (Required).
Experience: DevOps and MLOps: 5 years (Required)
Work Location: In person
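For context on the MLOps tooling this listing names, the following is a minimal sketch of training a toy model and logging it to MLflow; the experiment name, parameters, and dataset are illustrative assumptions, not part of the original posting.

```python
# Minimal sketch: log a trained model, a parameter, and a metric to MLflow.
# Uses the default local tracking store; experiment name and data are placeholders.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

mlflow.set_experiment("demo-classifier")  # hypothetical experiment name

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", acc)
    mlflow.sklearn.log_model(model, "model")  # stores the model as a run artifact
```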

Posted 4 days ago

Apply

8.0 years

0 Lacs

India

Remote

Location: Remote
Experience: 8+ years
Job Type: Contract

Job Overview:
We are seeking a highly skilled Machine Learning Architect to design and implement cutting-edge AI/ML solutions that drive business innovation and operational efficiency. The ideal candidate will have deep expertise in Google Cloud Platform, Gurobi, and Google OR-Tools, with a proven ability to build scalable, optimized machine learning models for complex decision-making processes.

Key Responsibilities:
Design and develop robust machine learning architectures aligned with business objectives.
Implement optimization models using Gurobi and Google OR-Tools to address complex operational problems.
Leverage Google Cloud AI/ML services (Vertex AI, TensorFlow, AutoML) for scalable model training and deployment.
Build automated pipelines for data preprocessing, model training, evaluation, and deployment.
Ensure high-performance computing and efficient resource usage in cloud environments.
Collaborate with data scientists, ML engineers, and business stakeholders to integrate ML solutions into production.
Monitor, retrain, and enhance model performance to maintain accuracy and efficiency.
Stay current with emerging AI/ML trends, tools, and best practices.

Required Qualifications:
Bachelor's or Master's degree in Computer Science, AI, Data Science, or a related field.
5+ years of experience in machine learning solution architecture and deployment.
Strong hands-on experience with Google Cloud AI/ML services (Vertex AI, AutoML, BigQuery, etc.).
Deep expertise in optimization modeling using Gurobi and Google OR-Tools.
Proficiency in Python, TensorFlow, PyTorch, and ML libraries/frameworks.
Solid understanding of big data processing frameworks (e.g., Apache Spark, BigQuery).
Excellent problem-solving skills with the ability to work across cross-functional teams.

Preferred Qualifications:
PhD in Machine Learning, Artificial Intelligence, or a related field.
Experience with Reinforcement Learning and complex optimization algorithms.
Working knowledge of MLOps, CI/CD pipelines, and Kubernetes for model lifecycle management.
Familiarity with Google Cloud security best practices and identity/access management.
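To illustrate the kind of optimization modeling this role calls for, here is a minimal Google OR-Tools sketch of a toy product-mix linear program; the variables, constraints, and coefficients are invented purely for illustration.

```python
# Minimal sketch: a toy product-mix LP solved with Google OR-Tools (GLOP backend).
# All coefficients below are invented for illustration.
from ortools.linear_solver import pywraplp

solver = pywraplp.Solver.CreateSolver("GLOP")

# Decision variables: units of product A and product B to produce.
a = solver.NumVar(0, solver.infinity(), "units_A")
b = solver.NumVar(0, solver.infinity(), "units_B")

# Resource constraints (machine hours and labor hours).
solver.Add(2 * a + 1 * b <= 100)  # machine hours available
solver.Add(1 * a + 3 * b <= 90)   # labor hours available

# Objective: maximize total profit.
solver.Maximize(30 * a + 40 * b)

if solver.Solve() == pywraplp.Solver.OPTIMAL:
    print("units_A =", a.solution_value())
    print("units_B =", b.solution_value())
    print("profit  =", solver.Objective().Value())
```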

Posted 4 days ago

Apply

16.0 - 25.0 years

0 Lacs

Gurgaon

On-site

Skill required: Financial Planning & Analysis - Financial Planning and Analysis (FP&A)
Designation: Delivery Lead Senior Manager
Qualifications: Any Graduation
Years of Experience: 16 to 25 years

About Accenture
Accenture is a global professional services company with leading capabilities in digital, cloud, and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song—all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners, and communities. Visit us at www.accenture.com

What would you do?
You will be aligned with our Finance Operations vertical and will help us determine financial outcomes by collecting operational data and reports, conducting analysis, and reconciling transactions. The work covers financial planning, reporting, variance analysis, budgeting, and forecasting. Financial planning and analysis (FP&A) refers to the processes designed to help organizations accurately plan, forecast, and budget to support the company's major business decisions and future financial health. These processes include planning, budgeting, forecasting, scenario modeling, and performance reporting.
Build, mentor, and inspire a high-performing tax team, setting aspirations, defining career paths, and promoting technical training. Foster a culture of excellence, continuous improvement, and ethical conduct. Serve as primary liaison with external entities—regulatory authorities, auditors, tax advisors—and internal stakeholders across Finance, Legal, Treasury, and Operations. Communicate tax strategy, issues, and risks to C-level and global leadership with clarity and confidence.

What are we looking for?
• Direct Tax Processing
• Indirect Tax Processing
1. Establish and maintain robust governance and control frameworks for indirect tax, U.S. GAAP, and federal tax reporting, identifying and mitigating key risks.
2. Lead evaluation of high-risk processes, audits, and open compliance items; take decisive action and implement corrective plans.
3. Serve as a strategic advisor to senior leadership on all tax matters—policy, compliance, operational risks, and automation initiatives.
4. Ensure full compliance across Indirect Taxes (customs, federal excise, FIP, export reporting, state excise, sales/use, and PACT Act filings) and Direct Taxes (FBAR (FinCEN 114), Form 5472, federal/state income taxes, extensions, estimated payments, and Country-by-Country Reporting).
5. Oversee timely filings and payments of tax returns, with final approval and validation.
6. Lead digital transformation initiatives, including SAP, Power BI, RPA, and tax engines (Vertex, Avalara, OneSource, etc.), to drive efficiency and accuracy.
7. Manage the tax technology roadmap and liaise with Finance, IT, and external vendors to implement integrated systems.
8. Lead cross-functional, global teams to execute tax initiatives, ensuring timelines, governance, and quality standards.

Roles and Responsibilities:
• In this role you are required to identify and assess complex problems for your area(s) of responsibility.
• The individual should create solutions in situations in which analysis requires in-depth knowledge of organizational objectives.
• Requires involvement in setting strategic direction to establish near-term goals for area(s) of responsibility.
• Interaction is with senior management levels at a client and/or within Accenture, involving negotiating or influencing on significant matters.
• Should have latitude in decision-making and determination of objectives and approaches to critical assignments.
• Their decisions have a lasting impact on the area of responsibility, with the potential to impact areas outside of their own responsibility.
• The individual manages large teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
• Please note that this role may require you to work in rotational shifts.

Qualification: Any Graduation

Posted 4 days ago

Apply

5.0 years

10 Lacs

Calcutta

On-site

Lexmark is now a proud part of Xerox, bringing together two trusted names and decades of expertise into a bold and shared vision. When you join us, you step into a technology ecosystem where your ideas, skills, and ambition can shape what comes next. Whether you're just starting out or leading at the highest levels, this is a place to grow, stretch, and make real impact—across industries, countries, and careers. From engineering and product to digital services and customer experience, you'll help connect data, devices, and people in smarter, faster ways. This is meaningful, connected work—on a global stage, with the backing of a company built for the future, and a robust benefits package designed to support your growth, well-being, and life beyond work.

Responsibilities:
A Data Engineer with an AI/ML focus combines traditional data engineering responsibilities with the technical requirements for supporting Machine Learning (ML) systems and artificial intelligence (AI) applications. This role involves not only designing and maintaining scalable data pipelines but also integrating advanced AI/ML models into the data infrastructure, and is critical for enabling data scientists and ML engineers to efficiently train, test, and deploy models in production. The role is also responsible for designing, building, and maintaining scalable data infrastructure and systems to support advanced analytics and business intelligence, and often involves leading and mentoring junior team members and collaborating with cross-functional teams.

Key Responsibilities:
Data Infrastructure for AI/ML: Design and implement robust data pipelines that support data preprocessing, model training, and deployment. Ensure that the data pipeline is optimized for the high-volume and high-velocity data required by ML models. Build and manage feature stores that can efficiently store, retrieve, and serve features for ML models.
AI/ML Model Integration: Collaborate with ML engineers and data scientists to integrate machine learning models into production environments. Implement tools for model versioning, experimentation, and deployment (e.g., MLflow, Kubeflow, TensorFlow Extended). Support automated retraining and model monitoring pipelines to ensure models remain performant over time.
Data Architecture & Design: Design and maintain scalable, efficient, and secure data pipelines and architectures. Develop data models (both OLTP and OLAP). Create and maintain ETL/ELT processes.
Data Pipeline Development: Build automated pipelines to collect, transform, and load data from various sources (internal and external). Optimize data flow and collection for cross-functional teams.
MLOps Support: Develop CI/CD pipelines to deploy models into production environments. Implement model monitoring, alerting, and logging for real-time model predictions.
Data Quality & Governance: Ensure high data quality, integrity, and availability. Implement data validation, monitoring, and alerting mechanisms. Support data governance initiatives and ensure compliance with data privacy laws (e.g., GDPR, HIPAA).
Tooling & Infrastructure: Work with cloud platforms (AWS, Azure, GCP) and data engineering tools like Apache Spark, Kafka, Airflow, etc. Use containerization (Docker, Kubernetes) and CI/CD pipelines for data engineering deployments.
Team Collaboration & Mentorship: Collaborate with data scientists, analysts, product managers, and other engineers. Provide technical leadership and mentor junior data engineers.

Core Competencies
Data Engineering: Apache Spark, Airflow, Kafka, dbt, ETL/ELT pipelines
ML/AI Integration: MLflow, Feature Store, TensorFlow, PyTorch, Hugging Face
GenAI: LangChain, OpenAI API, Vector DBs (FAISS, Pinecone, Weaviate)
Cloud Platforms: AWS (S3, SageMaker, Glue), GCP (BigQuery, Vertex AI)
Languages: Python, SQL, Scala, Bash
DevOps & Infra: Docker, Kubernetes, Terraform, CI/CD pipelines

Educational Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
5+ years of experience in data engineering or a related field.
Strong understanding of data modeling, ETL/ELT concepts, and distributed systems.
Experience with big data tools and cloud platforms.

Soft Skills:
Strong problem-solving and critical-thinking skills.
Excellent communication and collaboration abilities.
Leadership experience and the ability to guide technical decisions.

How to Apply
Are you an innovator? Here is your chance to make your mark with a global technology leader. Apply now!
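As a rough illustration of the pipeline orchestration named in the tooling list above, here is a minimal Apache Airflow DAG sketch (assuming a recent Airflow 2.x install); the DAG id, schedule, and task bodies are placeholder assumptions rather than anything specified in the posting.

```python
# Minimal sketch: a daily extract -> transform -> load DAG in Apache Airflow.
# Task bodies are placeholders; a real pipeline would call Spark jobs, APIs, etc.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    print("pulling raw records from a source system")

def transform(**context):
    print("cleaning and enriching records")

def load(**context):
    print("writing curated records to the warehouse")

with DAG(
    dag_id="example_etl",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load  # linear dependency chain
```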

Posted 4 days ago

Apply

1.0 - 3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Title: Inside Sales Executive – Study Abroad (Full-Time)
📍 Location: Noida
📅 Experience: 1-3 years in EdTech / Study Abroad
💼 Department: Sales & Student Counseling
🎓 Industry: Education & Overseas Consulting
🕒 Working Hours: 10 AM – 7 PM (6 days/week)

About Vertex Edu
Vertex Edu is a trusted name in global education consulting, empowering students to win admission to top universities worldwide, including Ivy League and other elite institutions. With a personalized mentorship model and an 80% success rate, we are reimagining how students and families plan for international education.

Job Overview
We are seeking a driven and empathetic Inside Sales Executive to join our dynamic team. You will be the first point of contact for students and parents exploring study abroad options. Your role is not just to sell, but to guide, build trust, and convert inquiries into committed journeys.

Key Responsibilities (KRAs)
✅ Lead Conversion & Sales Closure
Handle inbound and outbound calls with parents and students who have expressed interest.
Conduct needs assessments and explain Vertex Edu's offerings.
Convert qualified leads into enrollments through consultative sales.
Consistently meet or exceed monthly sales targets.
✅ Student/Parent Consultation
Provide clarity on study abroad options, exams, countries, costs, scholarships, and timelines.
Book Zoom/phone sessions with seniors or academic mentors as needed.
✅ CRM & Follow-ups
Maintain daily records of calls, leads, and student data in the CRM.
Follow up regularly through calls, emails, and WhatsApp with interested leads.
Update status and detailed remarks for each lead.
✅ Collaboration
Coordinate with counselors, admission teams, and marketing for a smooth handover post-enrollment.
Share student/parent feedback with the marketing team for better campaign targeting.
✅ Reporting
Submit daily, weekly, and monthly reports on leads, conversions, and pipeline movement.

Requirements
1–3 years of experience in Inside Sales / Tele-sales / Counseling in EdTech, overseas education, or a similar industry.
Strong communication skills in English and Hindi.
Ability to handle objections and emotionally connect with parents and students.
Experience with CRM tools and Google Workspace preferred.
Passion for education, empathy, and a result-oriented approach.

What We Offer
A fast-growing startup environment with direct mentorship
Opportunity to work on impactful global student journeys
Incentive structure for high performers
Career growth path into Senior Counselor / Sales Manager roles
Salary: 20,000–40,000 rupees per month (depending on profile)

How to Apply
📩 Send your CV with the subject "Inside Sales – Vertex Edu" to [hr@vertexedu.com]
🌐 Visit us at: www.vertexedu.com

Posted 4 days ago

Apply

15.0 years

0 Lacs

India

On-site

Job Location: Hyderabad / Bangalore / Pune
Immediate joiners / less than 30 days' notice

About the Role
We are looking for a seasoned AI/ML Solutions Architect with deep expertise in designing and deploying scalable AI/ML and GenAI solutions on cloud platforms. The ideal candidate will have a strong track record in BFSI, leading end-to-end projects—from use case discovery to productionization—while ensuring governance, compliance, and performance at scale.

Key Responsibilities
Lead the design and deployment of enterprise-scale AI/ML and GenAI architectures.
Drive end-to-end AI/ML project delivery: discovery, prototyping, productionization.
Architect solutions using leading cloud-native AI services (AWS, Azure, GCP).
Implement MLOps/LLMOps pipelines for model lifecycle and automation.
Guide teams in selecting and integrating GenAI/LLM frameworks (OpenAI, Cohere, Hugging Face, LangChain, etc.).
Ensure robust AI governance, model risk management, and compliance practices.
Collaborate with senior business stakeholders and cross-functional engineering teams.

Required Skills & Experience
15+ years in AI/ML, cloud architecture, and data engineering.
At least 10 end-to-end AI/ML project implementations.
Hands-on expertise in one or more of the following:
ML frameworks: scikit-learn, XGBoost, TensorFlow, PyTorch
GenAI/LLM tools: OpenAI, Cohere, LangChain, Hugging Face, FAISS, Pinecone
Cloud platforms: AWS, Azure, GCP (AI/ML services)
MLOps: MLflow, SageMaker Pipelines, Kubeflow, Vertex AI
Strong understanding of data privacy, model governance, and compliance frameworks in BFSI.
Proven leadership of cross-functional technical teams and stakeholder engagement.
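To give a flavor of the GenAI/RAG tooling listed above, here is a minimal FAISS similarity-search sketch of the retrieval step that sits behind a RAG pipeline; the embeddings are random placeholders rather than outputs of a real embedding model, and the dimensions are arbitrary.

```python
# Minimal sketch: nearest-neighbour retrieval over toy embeddings with FAISS.
# A real RAG system would embed documents and queries with a text-embedding model.
import faiss
import numpy as np

dim = 128
rng = np.random.default_rng(0)
doc_vectors = rng.random((1000, dim)).astype("float32")  # pretend document embeddings

index = faiss.IndexFlatL2(dim)  # exact L2 search, fine for small corpora
index.add(doc_vectors)

query = rng.random((1, dim)).astype("float32")  # pretend query embedding
distances, ids = index.search(query, 5)         # top-5 closest documents
print("top-5 document ids:", ids[0])
print("distances:", distances[0])
```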

Posted 4 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Your potential has a place here with TTEC's award-winning employment experience. As a Principal Data Scientist working onsite in Hyderabad, India, you'll be a part of bringing humanity to business. #experienceTTEC
Our employees have spoken. Our purpose, team, and company culture are amazing, and our Great Place to Work® certification in India says it all!

What You'll Do
In this role, you'll work on everything from data ingestion and model training to deployment and dashboarding using BigQuery, Vertex AI, PySpark, and advanced ML frameworks. You'll report to the Director, Data Engineering.

During a Typical Day, You'll
Prepare and manage training data for machine learning models using BigQuery and GCS.
Design and optimize complex SQL queries for efficient data processing and preparation.
Build, deploy, and support end-to-end machine learning systems, from data ingestion to dashboarding.
Develop and maintain machine learning models using frameworks like TensorFlow, PyTorch, and scikit-learn.
Implement MLOps best practices to streamline model training, deployment, and monitoring.
Utilize Google Cloud tools such as Vertex AI, BigQuery, and Vertex AI Pipelines for ML workflows.
Lead ML projects from conception to deployment, ensuring documentation and collaboration via JIRA and Confluence.

What You Bring To The Role
Strong knowledge of DevOps/MLOps best practices.
Experience with machine learning training, prediction, and deployment workflows.
Proficiency in Python and R, with experience using multiple ML libraries.
Good understanding of internal business data domains (Empower, Oracle, Kronos, Employee, NICE).
Familiarity with Contact Center Switch data.
Ability to write clear, concise code and maintain strong documentation throughout the ML lifecycle.
Excellent communication, collaboration, and problem-solving skills; able to work across cross-functional teams and build relationships.

What You Can Expect
Support for your career and professional development.
An inclusive culture and community-minded organization where giving back is encouraged.
A global team of curious lifelong learners guided by our company values.
Ask us about our paid time off (PTO) and wellness and healthcare benefits.
And yes... a great compensation package and performance bonus opportunities, benefits you'd expect, and maybe a few that would pleasantly surprise you (like tuition reimbursement).

About TTEC
Our business is about making customers happy. That's all we do. Since 1982, we've helped companies build engaged, pleased, profitable customer experiences powered by our combination of humanity and technology. On behalf of many of the world's leading iconic and hypergrowth brands, we talk, message, text, and video chat with millions of customers every day. These exceptional customer experiences start with you.

TTEC is proud to be an equal opportunity employer where all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. TTEC embraces and is committed to building a diverse and inclusive workforce that respects and empowers the cultures and perspectives within our global teams. We aim to reflect the communities we serve by not only delivering amazing service and technology, but also humanity. We make it a point to ensure all our employees feel valued, feel they belong, and are comfortable being their authentic selves at work. As a global company, we know diversity is our strength because it enables us to view things from different vantage points and for you to bring value to the table in your own unique way.

Primary Location: India-Telangana-Hyderabad
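As a small illustration of the BigQuery-based training-data preparation this role describes, here is a sketch using the google-cloud-bigquery Python client; the project, dataset, table, and column names are hypothetical, and the query shape is only an example.

```python
# Minimal sketch: run an aggregation query in BigQuery and pull the result into pandas.
# Requires google-cloud-bigquery plus pandas/db-dtypes for to_dataframe().
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # assumed project id

sql = """
    SELECT customer_id,
           COUNT(*) AS interactions,
           AVG(handle_time_sec) AS avg_handle_time
    FROM `my-analytics-project.contact_center.interactions`
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY)
    GROUP BY customer_id
"""

df = client.query(sql).to_dataframe()  # blocks until the query completes
print(df.head())
```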

Posted 4 days ago

Apply

0.0 - 5.0 years

0 Lacs

Thiruvananthapuram District, Kerala

On-site

Role Overview:
We are looking for a skilled and versatile AI Infrastructure Engineer (DevOps/MLOps) to build and manage the cloud infrastructure, deployment pipelines, and machine learning operations behind our AI-powered products. You will work at the intersection of software engineering, ML, and cloud architecture to ensure that our models and systems are scalable, reliable, and production-ready.

Key Responsibilities:
Design and manage CI/CD pipelines for both software applications and machine learning workflows.
Deploy and monitor ML models in production using tools like MLflow, SageMaker, Vertex AI, or similar.
Automate the provisioning and configuration of infrastructure using IaC tools (Terraform, Pulumi, etc.).
Build robust monitoring, logging, and alerting systems for AI applications.
Manage containerized services with Docker and orchestration platforms like Kubernetes.
Collaborate with data scientists and ML engineers to streamline model experimentation, versioning, and deployment.
Optimize compute resources and storage costs across cloud environments (AWS, GCP, or Azure).
Ensure system reliability, scalability, and security across all environments.

Requirements:
5+ years of experience in DevOps, MLOps, or infrastructure engineering roles.
Hands-on experience with cloud platforms (AWS, GCP, or Azure) and services related to ML workloads.
Strong knowledge of CI/CD tools (e.g., GitHub Actions, Jenkins, GitLab CI).
Proficiency in Docker, Kubernetes, and infrastructure-as-code frameworks.
Experience with ML pipelines, model versioning, and ML monitoring tools.
Scripting skills in Python, Bash, or similar for automation tasks.
Familiarity with monitoring/logging tools (Prometheus, Grafana, ELK, CloudWatch, etc.).
Understanding of ML lifecycle management and reproducibility.

Preferred Qualifications:
Experience with Kubeflow, MLflow, DVC, or Triton Inference Server.
Exposure to data versioning, feature stores, and model registries.
Certification in AWS/GCP DevOps or Machine Learning Engineering is a plus.
Background in software engineering, data engineering, or ML research is a bonus.

What We Offer:
Work on cutting-edge AI platforms and infrastructure.
Cross-functional collaboration with top ML, research, and product teams.
Competitive compensation package – no constraints for the right candidate.

To apply, send your CV to thasleema@qcentro.com.
Job Type: Permanent
Ability to commute/relocate: Thiruvananthapuram District, Kerala: reliably commute or plan to relocate before starting work (Required).
Experience: DevOps and MLOps: 5 years (Required)
Work Location: In person

Posted 4 days ago

Apply

5.0 years

0 Lacs

Tamil Nadu, India

On-site

Role: Sr. AI/ML Engineer
Years of experience: 5+ years (with a minimum of 4 years of relevant experience)
Work mode: WFO - Chennai (mandatory)
Type: FTE
Notice Period: Immediate to 15 days ONLY
Key skills: Python, TensorFlow, Generative AI, Machine Learning, AWS, Agentic AI, OpenAI, Claude, FastAPI

JD:
Experience in Gen AI, CI/CD pipelines, and scripting languages, with a deep understanding of version control systems (e.g., Git), containerization (e.g., Docker), and continuous integration/deployment tools (e.g., Jenkins); third-party integration is a plus. Experience with cloud computing platforms (e.g., AWS, GCP, Azure), Kubernetes, and Kafka.
Experience building production-grade ML pipelines.
Proficient in Python and frameworks like TensorFlow, Keras, or PyTorch.
Experience with cloud build, deployment, and orchestration tools.
Experience with MLOps tools such as MLflow, Kubeflow, Weights & Biases, AWS SageMaker, Vertex AI, DVC, Airflow, Prefect, etc.
Experience in statistical modeling, machine learning, data mining, and unstructured data analytics.
Understanding of the ML lifecycle and MLOps, with hands-on experience productionizing ML models.
Detail-oriented, with the ability to work both independently and collaboratively.
Ability to work successfully with multi-functional teams, principals, and architects across organizational boundaries and geographies.
Equal comfort driving low-level technical implementation and high-level architecture evolution.
Experience working with data engineering pipelines.
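For a sense of the FastAPI-based model serving named in the key skills, here is a minimal sketch; the toy model, schema, and endpoint are illustrative assumptions, not the employer's actual service.

```python
# Minimal sketch: serve a scikit-learn model behind a FastAPI prediction endpoint.
# The model is trained inline on a toy dataset purely for illustration.
from fastapi import FastAPI
from pydantic import BaseModel
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

app = FastAPI(title="toy-model-service")

# Train a small classifier at startup (a real service would load a versioned artifact).
X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000).fit(X, y)

class Features(BaseModel):
    values: list[float]  # expects the 4 iris features

@app.post("/predict")
def predict(features: Features):
    pred = model.predict([features.values])[0]
    return {"predicted_class": int(pred)}

# Run locally with: uvicorn this_module:app --reload  (module name is hypothetical)
```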

Posted 4 days ago

Apply

5.0 years

0 Lacs

Greater Kolkata Area

On-site

Lexmark is now a proud part of Xerox, bringing together two trusted names and decades of expertise into a bold and shared vision. When you join us, you step into a technology ecosystem where your ideas, skills, and ambition can shape what comes next. Whether you're just starting out or leading at the highest levels, this is a place to grow, stretch, and make real impact—across industries, countries, and careers. From engineering and product to digital services and customer experience, you'll help connect data, devices, and people in smarter, faster ways. This is meaningful, connected work—on a global stage, with the backing of a company built for the future, and a robust benefits package designed to support your growth, well-being, and life beyond work.

Responsibilities:
A Data Engineer with an AI/ML focus combines traditional data engineering responsibilities with the technical requirements for supporting Machine Learning (ML) systems and artificial intelligence (AI) applications. This role involves not only designing and maintaining scalable data pipelines but also integrating advanced AI/ML models into the data infrastructure, and is critical for enabling data scientists and ML engineers to efficiently train, test, and deploy models in production. The role is also responsible for designing, building, and maintaining scalable data infrastructure and systems to support advanced analytics and business intelligence, and often involves leading and mentoring junior team members and collaborating with cross-functional teams.

Key Responsibilities:
Data Infrastructure for AI/ML: Design and implement robust data pipelines that support data preprocessing, model training, and deployment. Ensure that the data pipeline is optimized for the high-volume and high-velocity data required by ML models. Build and manage feature stores that can efficiently store, retrieve, and serve features for ML models.
AI/ML Model Integration: Collaborate with ML engineers and data scientists to integrate machine learning models into production environments. Implement tools for model versioning, experimentation, and deployment (e.g., MLflow, Kubeflow, TensorFlow Extended). Support automated retraining and model monitoring pipelines to ensure models remain performant over time.
Data Architecture & Design: Design and maintain scalable, efficient, and secure data pipelines and architectures. Develop data models (both OLTP and OLAP). Create and maintain ETL/ELT processes.
Data Pipeline Development: Build automated pipelines to collect, transform, and load data from various sources (internal and external). Optimize data flow and collection for cross-functional teams.
MLOps Support: Develop CI/CD pipelines to deploy models into production environments. Implement model monitoring, alerting, and logging for real-time model predictions.
Data Quality & Governance: Ensure high data quality, integrity, and availability. Implement data validation, monitoring, and alerting mechanisms. Support data governance initiatives and ensure compliance with data privacy laws (e.g., GDPR, HIPAA).
Tooling & Infrastructure: Work with cloud platforms (AWS, Azure, GCP) and data engineering tools like Apache Spark, Kafka, Airflow, etc. Use containerization (Docker, Kubernetes) and CI/CD pipelines for data engineering deployments.
Team Collaboration & Mentorship: Collaborate with data scientists, analysts, product managers, and other engineers. Provide technical leadership and mentor junior data engineers.

Core Competencies
Data Engineering: Apache Spark, Airflow, Kafka, dbt, ETL/ELT pipelines
ML/AI Integration: MLflow, Feature Store, TensorFlow, PyTorch, Hugging Face
GenAI: LangChain, OpenAI API, Vector DBs (FAISS, Pinecone, Weaviate)
Cloud Platforms: AWS (S3, SageMaker, Glue), GCP (BigQuery, Vertex AI)
Languages: Python, SQL, Scala, Bash
DevOps & Infra: Docker, Kubernetes, Terraform, CI/CD pipelines

Educational Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
5+ years of experience in data engineering or a related field.
Strong understanding of data modeling, ETL/ELT concepts, and distributed systems.
Experience with big data tools and cloud platforms.

Soft Skills:
Strong problem-solving and critical-thinking skills.
Excellent communication and collaboration abilities.
Leadership experience and the ability to guide technical decisions.

How to Apply
Are you an innovator? Here is your chance to make your mark with a global technology leader. Apply now!

Global Privacy Notice
Lexmark is committed to appropriately protecting and managing any personal information you share with us. Click here to view Lexmark's Privacy Notice.

Posted 4 days ago

Apply

5.0 years

0 Lacs

Greater Kolkata Area

On-site

Lexmark is now a proud part of Xerox, bringing together two trusted names and decades of expertise into a bold and shared vision. When you join us, you step into a technology ecosystem where your ideas, skills, and ambition can shape what comes next. Whether you're just starting out or leading at the highest levels, this is a place to grow, stretch, and make real impact—across industries, countries, and careers. From engineering and product to digital services and customer experience, you'll help connect data, devices, and people in smarter, faster ways. This is meaningful, connected work—on a global stage, with the backing of a company built for the future, and a robust benefits package designed to support your growth, well-being, and life beyond work.

Responsibilities:
A Senior Data Engineer with an AI/ML focus combines traditional data engineering responsibilities with the technical requirements for supporting Machine Learning (ML) systems and artificial intelligence (AI) applications. This role involves not only designing and maintaining scalable data pipelines but also integrating advanced AI/ML models into the data infrastructure, and is critical for enabling data scientists and ML engineers to efficiently train, test, and deploy models in production. The role is also responsible for designing, building, and maintaining scalable data infrastructure and systems to support advanced analytics and business intelligence, and often involves leading data engineering projects, mentoring junior team members, and collaborating with cross-functional teams.

Key Responsibilities:
Data Infrastructure for AI/ML: Design and implement robust data pipelines that support data preprocessing, model training, and deployment. Ensure that the data pipeline is optimized for the high-volume and high-velocity data required by ML models. Build and manage feature stores that can efficiently store, retrieve, and serve features for ML models.
AI/ML Model Integration: Collaborate with ML engineers and data scientists to integrate machine learning models into production environments. Implement tools for model versioning, experimentation, and deployment (e.g., MLflow, Kubeflow, TensorFlow Extended). Support automated retraining and model monitoring pipelines to ensure models remain performant over time.
Data Architecture & Design: Design and maintain scalable, efficient, and secure data pipelines and architectures. Develop data models (both OLTP and OLAP). Create and maintain ETL/ELT processes.
Data Pipeline Development: Build automated pipelines to collect, transform, and load data from various sources (internal and external). Optimize data flow and collection for cross-functional teams.
MLOps Support: Develop CI/CD pipelines to deploy models into production environments. Implement model monitoring, alerting, and logging for real-time model predictions.
Data Quality & Governance: Ensure high data quality, integrity, and availability. Implement data validation, monitoring, and alerting mechanisms. Support data governance initiatives and ensure compliance with data privacy laws (e.g., GDPR, HIPAA).
Tooling & Infrastructure: Work with cloud platforms (AWS, Azure, GCP) and data engineering tools like Apache Spark, Kafka, Airflow, etc. Use containerization (Docker, Kubernetes) and CI/CD pipelines for data engineering deployments.
Team Collaboration & Mentorship: Collaborate with data scientists, analysts, product managers, and other engineers. Provide technical leadership and mentor junior data engineers.

Core Competencies
Data Engineering: Apache Spark, Airflow, Kafka, dbt, ETL/ELT pipelines
ML/AI Integration: MLflow, Feature Store, TensorFlow, PyTorch, Hugging Face
GenAI: LangChain, OpenAI API, Vector DBs (FAISS, Pinecone, Weaviate)
Cloud Platforms: AWS (S3, SageMaker, Glue), GCP (BigQuery, Vertex AI)
Languages: Python, SQL, Scala, Bash
DevOps & Infra: Docker, Kubernetes, Terraform, CI/CD pipelines

Educational Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
5+ years of experience in data engineering or a related field.
Strong understanding of data modeling, ETL/ELT concepts, and distributed systems.
Experience with big data tools and cloud platforms.

Soft Skills:
Strong problem-solving and critical-thinking skills.
Excellent communication and collaboration abilities.
Leadership experience and the ability to guide technical decisions.

How to Apply
Are you an innovator? Here is your chance to make your mark with a global technology leader. Apply now!

Global Privacy Notice
Lexmark is committed to appropriately protecting and managing any personal information you share with us. Click here to view Lexmark's Privacy Notice.

Posted 4 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Introducing Thinkproject Platform
Pioneering a new era and offering a cohesive alternative to the fragmented landscape of construction software, Thinkproject seamlessly integrates the most extensive portfolio of mature solutions with an innovative platform, providing unparalleled features, integrations, user experiences, and synergies. By combining information management expertise and in-depth knowledge of the building, infrastructure, and energy industries, Thinkproject empowers customers to efficiently deliver, operate, regenerate, and dispose of their built assets across their entire lifecycle through a Connected Data Ecosystem.

We are seeking a hands-on Applied Machine Learning Engineer to join our team and lead the development of ML-driven insights from historical data in our contracts management, assets management, and common data platform. This individual will work closely with our data engineering and product teams to design, develop, and deploy scalable machine learning models that can parse, learn from, and generate value from both structured and unstructured contract data. You will use BigQuery and its ML capabilities (including SQL and Python integrations) to prototype and productionize models across a variety of NLP and predictive analytics use cases. Your work will be critical in enhancing our platform's intelligence layer, including search, classification, recommendations, and risk detection.

What your day will look like
Key Responsibilities
Model Development: Design and implement machine learning models using structured and unstructured historical contract data to support intelligent document search, clause classification, metadata extraction, and contract risk scoring.
BigQuery ML Integration: Build, train, and deploy ML models directly within BigQuery using SQL and/or Python, leveraging native GCP tools (e.g., Vertex AI, Dataflow, Pub/Sub).
Data Preprocessing & Feature Engineering: Clean, enrich, and transform raw data (e.g., legal clauses, metadata, audit trails) into model-ready features using scalable and efficient pipelines.
Model Evaluation & Experimentation: Conduct experiments, model validation, and A/B testing, and iterate based on precision, recall, F1-score, RMSE, etc.
Deployment & Monitoring: Operationalize models in production environments with monitoring, retraining pipelines, and CI/CD best practices for ML (MLOps).
Collaboration: Work cross-functionally with data engineers, product managers, legal domain experts, and frontend teams to align ML solutions with product needs.

What you need to fulfill the role
Skills And Experience
Education: Bachelor's or Master's degree in Computer Science, Machine Learning, Data Science, or a related field.
ML Expertise: Strong applied knowledge of supervised and unsupervised learning, classification, regression, clustering, feature engineering, and model evaluation.
NLP Experience: Hands-on experience working with textual data, especially in NLP use cases like entity extraction, classification, and summarization.
GCP & BigQuery: Proficiency with Google Cloud Platform, especially BigQuery and BigQuery ML; comfort querying large-scale datasets and integrating with external ML tooling.
Programming: Proficient in Python and SQL; familiarity with libraries such as scikit-learn, TensorFlow, PyTorch, and Keras.
MLOps Knowledge: Experience with model deployment, monitoring, versioning, and ML CI/CD best practices.
Data Engineering Alignment: Comfortable working with data pipelines and tools like Apache Beam, Dataflow, Cloud Composer, and pub/sub systems.
Version Control: Strong Git skills and experience collaborating in Agile teams.

Preferred Qualifications
Experience working with contractual or legal text datasets.
Familiarity with document management systems, annotation tools, or enterprise collaboration platforms.
Exposure to Vertex AI, LangChain, RAG-based retrieval, or embedding models for Gen AI use cases.
Comfortable working in a fast-paced, iterative environment with changing priorities.

What we offer
Lunch 'n' Learn Sessions I Women's Network I LGBTQIA+ Network I Coffee Chat Roulette I Free English Lessons I Thinkproject Academy I Social Events I Volunteering Activities I Open Forum with Leadership Team (Tp Café) I Hybrid working I Unlimited learning

We are a passionate bunch here. To join Thinkproject is to shape what our company becomes. We take feedback from our staff very seriously and give them the tools they need to help us create our fantastic culture of mutual respect. We believe that investing in our staff is crucial to the success of our business.

Your contact: Mehal Mehta
Please submit your application, including salary expectations and potential date of entry, by submitting the form on the next page.

Working at thinkproject.com - think career. think ahead.
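Below is a rough sketch of the BigQuery ML workflow described above: training a classifier inside BigQuery with a CREATE MODEL statement submitted from the Python client, then scoring new rows with ML.PREDICT. The project, dataset, model, and column names are hypothetical, and the feature set is deliberately simplified.

```python
# Minimal sketch: train and use a BigQuery ML classifier from Python.
# All identifiers (project, dataset, tables, columns) are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="contracts-platform")  # assumed project id

create_model_sql = """
CREATE OR REPLACE MODEL `contracts-platform.ml.clause_classifier`
OPTIONS (model_type = 'logistic_reg',
         input_label_cols = ['clause_label']) AS
SELECT clause_word_count, num_defined_terms, clause_label
FROM `contracts-platform.contracts.labeled_clauses`
"""
client.query(create_model_sql).result()  # blocks until training finishes

predict_sql = """
SELECT *
FROM ML.PREDICT(MODEL `contracts-platform.ml.clause_classifier`,
                (SELECT clause_word_count, num_defined_terms
                 FROM `contracts-platform.contracts.new_clauses`))
"""
for row in client.query(predict_sql).result():
    print(dict(row))  # predicted label plus class probabilities
```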

Posted 4 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Location: Hyderabad and Chennai (immediate joiners)
Experience: 3 to 5 years
Mandatory skills: MLOps and model lifecycle, Python, PySpark, GCP (BigQuery, Dataproc, and Airflow), and CI/CD

Required Skills and Experience:
Strong programming skills: proficiency in languages like Python, with experience in libraries like TensorFlow, PyTorch, or scikit-learn.
Cloud Computing: deep understanding of GCP services relevant to ML, such as Vertex AI, BigQuery, Cloud Storage, Dataflow, Dataproc, and others.
Data Science Fundamentals: solid foundation in machine learning concepts, statistical analysis, and data modeling.
Software Engineering Principles: experience with software development best practices, version control (e.g., Git), and testing methodologies.
MLOps: familiarity with MLOps principles and practices.
Data Engineering: experience in building and managing data pipelines.
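To illustrate the PySpark-on-GCP stack in the mandatory skills, here is a minimal sketch of a Dataproc-style PySpark job that reads and writes BigQuery tables via the Spark BigQuery connector; the table and bucket names are hypothetical, and the connector JAR is assumed to be available on the cluster.

```python
# Minimal sketch: aggregate an events table with PySpark and write results back
# to BigQuery. Assumes the Spark BigQuery connector is installed on the cluster.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("feature-aggregation").getOrCreate()

events = (
    spark.read.format("bigquery")
    .option("table", "my-project.analytics.events")   # hypothetical source table
    .load()
)

daily_counts = (
    events.groupBy("user_id", F.to_date("event_ts").alias("event_date"))
    .agg(F.count("*").alias("event_count"))
)

(
    daily_counts.write.format("bigquery")
    .option("table", "my-project.features.daily_event_counts")  # hypothetical sink
    .option("temporaryGcsBucket", "my-temp-bucket")              # staging bucket
    .mode("overwrite")
    .save()
)
```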

Posted 4 days ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana

On-site

Req ID: 333041

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a GCP Python Gen AI LLM RAG Vertex AI to join our team in Hyderabad, Telangana (IN-TG), India (IN).

Required Qualifications:
4+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education (Must Have)
2+ years working with GCP (Google Cloud Platform and its services) or an alternate public/hybrid cloud, with a proven track record of delivering products with cloud services and cloud architectures at scale (Must Have)
2+ years of experience with Python (Must Have)
3+ years of experience with GenAI, LLMs, RAG, vector databases, and conversational bots (Must Have)
1+ years of experience with Playbooks and Vertex AI (Must Have)
Exposure to ADK (hands-on) and Voice AI (Good to have)
LangChain and/or LangGraph is a plus (Good to have)
4+ years of Contact Center industry experience with design, development, testing, and integration with vendors, CRMs, and business applications; proven knowledge in one or more contact center sub-domains such as IVR/IVA, NLU/NLP, real-time omnichannel agent experience and customer journey, and CX/AX experience optimization using AI/ML (Artificial Intelligence/Machine Learning) (Good to have)
4+ years of experience with Node.js, Java, Spring Boot, Kafka, distributed caches (GemFire, Redis), Elasticsearch technologies, GraphQL, NoSQL databases (Cassandra or Mongo), graph databases, and public cloud marketplace services (Good to have)
2+ years of deep Domain-Driven Design experience with cloud-native microservices designed and developed for massive scale and seamless resiliency, deployed on PCF/VMware Tanzu, K8s, or serverless cloud technologies.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize, and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here.
If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
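As a small illustration of the RAG pattern referenced in the qualifications above, here is a sketch of the retrieval step using plain NumPy cosine similarity; the passages and embeddings are placeholders, and a production system would use a real embedding model and a vector database.

```python
# Minimal sketch of the retrieval step in a RAG pipeline: score knowledge-base
# passages against a query by cosine similarity and assemble a grounded prompt.
# Embeddings are random placeholders; a real system would call an embedding model.
import numpy as np

rng = np.random.default_rng(7)
passages = [
    "How to reset an account password.",
    "Refund policy for cancelled orders.",
    "Steps to update billing information.",
]
passage_vecs = rng.random((len(passages), 64))  # placeholder passage embeddings
question_vec = rng.random(64)                   # placeholder query embedding

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

scores = [cosine(question_vec, v) for v in passage_vecs]
best = int(np.argmax(scores))

prompt = (
    "Answer using only the context below.\n"
    f"Context: {passages[best]}\n"
    "Question: How do I get my money back for a cancelled order?"
)
print(prompt)  # this grounded prompt would then be sent to the chosen LLM
```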

Posted 4 days ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana

On-site

Req ID: 333041

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a GCP Python Gen AI LLM RAG Vertex AI engineer to join our team in Hyderabad, Telangana (IN-TG), India (IN).

Required Qualifications:
4+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education (Must Have)
2+ years working with GCP (Google Cloud Platform and its services) or an alternate public/hybrid cloud, with a proven track record of delivering products with cloud services and cloud architectures at scale (Must Have)
2+ years of experience with Python (Must Have)
3+ years of experience with GenAI, LLMs, RAG, vector databases and conversational bots (Must Have)
1+ years of experience with Playbooks and Vertex AI (Must Have)
Hands-on exposure to ADK and Voice AI (Good to have)
LangChain and/or LangGraph is a plus (Good to have)
4+ years of Contact Center industry experience covering design, development, testing, and integration with vendors, CRMs and business applications, with proven knowledge in one or more contact center sub-domains such as IVR/IVA, NLU/NLP, real-time omni-channel agent experience and customer journey, or CX/AX experience optimization using AI/ML (Artificial Intelligence/Machine Learning) (Good to have)
4+ years of experience with Node JS, Java, Spring Boot, Kafka, distributed caches (GemFire, Redis), Elasticsearch, GraphQL, NoSQL databases (Cassandra or Mongo), graph databases, and public cloud marketplace services (Good to have)
2+ years of deep Domain-Driven Design experience with cloud-native microservices designed and developed for massive scale and seamless resiliency, deployed on PCF/VMware Tanzu, K8s or serverless cloud technologies

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
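For readers unfamiliar with the RAG/Vertex AI stack this posting lists, the following is a minimal, illustrative sketch of the pattern in Python using the Vertex AI SDK (google-cloud-aiplatform). The project ID, model names, and the tiny in-memory retrieval are placeholder assumptions for illustration, not details taken from the role; a production system would use a managed vector database.

```python
# Rough sketch of a RAG flow on Vertex AI: embed documents, retrieve the most
# relevant ones for a question, and ground the model's answer in that context.
# Project ID, region, and model names below are placeholders.
import numpy as np
import vertexai
from vertexai.generative_models import GenerativeModel
from vertexai.language_models import TextEmbeddingModel

vertexai.init(project="your-gcp-project", location="us-central1")  # placeholder project

# 1. Embed a handful of documents (stand-in for a real vector database).
docs = [
    "Refunds are processed within 5 business days.",
    "Support is available 24/7 via chat and phone.",
    "Premium plans include a dedicated account manager.",
]
embedder = TextEmbeddingModel.from_pretrained("text-embedding-004")
doc_vecs = np.array([e.values for e in embedder.get_embeddings(docs)])

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the question (cosine similarity)."""
    q = np.array(embedder.get_embeddings([question])[0].values)
    scores = doc_vecs @ q / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q))
    return [docs[i] for i in np.argsort(scores)[::-1][:k]]

# 2. Ground the LLM answer in the retrieved context.
question = "How long do refunds take?"
context = "\n".join(retrieve(question))
model = GenerativeModel("gemini-1.5-flash")
answer = model.generate_content(
    f"Answer using only this context:\n{context}\n\nQuestion: {question}"
)
print(answer.text)
```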

Posted 4 days ago

Apply

10.0 years

0 Lacs

India

On-site

Avensys is a reputed global IT professional services company headquartered in Singapore. Our service spectrum includes enterprise solution consulting, business intelligence, business process automation and managed services. Given our decade of success, we have evolved to become one of the top trusted providers in Singapore and serve a client base across banking and financial services, insurance, information technology, healthcare, retail and supply chain.

Job Description:
Experience: 10+ Years
Work Timing: 4:00 PM IST to 1:00–1:30 AM IST
Joining: Immediate to a maximum of 20 days' notice

Job Summary:
We are looking for an experienced SAP FICO Consultant with a strong background in the Retail domain and solid expertise in Record to Report (RTR) processes. The ideal candidate will have at least 4 years of retail-specific experience and demonstrate hands-on experience in integrating third-party financial systems such as Vertex and Kyriba.

Key Responsibilities:
Lead and support SAP FICO RTR modules in a retail-focused business environment
Collaborate with cross-functional teams to ensure seamless financial operations
Manage third-party interface integrations (e.g., Vertex, Kyriba)
Translate business requirements into functional and technical specifications
Provide solutions for financial reporting, month-end processes, and reconciliations
Support continuous improvements and resolve system-related issues promptly
Engage with stakeholders and ensure compliance with internal controls and statutory requirements

Required Skills & Experience:
Minimum 10 years of experience in SAP FICO
4+ years of experience in the retail domain (mandatory)
Strong understanding of finance and accounting principles
Expertise in Record to Report (RTR) processes
Hands-on experience in third-party integrations, specifically Vertex and Kyriba
Strong communication and stakeholder management skills

Preferred Candidate Profile:
Immediate joiners or those with a maximum of 20 days' notice

WHAT'S ON OFFER:
You will be remunerated with an excellent base salary and entitled to attractive company benefits. Additionally, you will get the opportunity to enjoy a fun and collaborative work environment, alongside strong career progression.

To submit your application, please apply online or email your updated CV to swathi@aven-sys.com. Your interest will be treated with strict confidentiality.

CONSULTANT DETAILS:
Consultant Name: Swathi
Avensys Consulting Pte Ltd
EA Licence 12C5759

Privacy Statement: Data collected will be used for recruitment purposes only. Personal data provided will be used strictly in accordance with the relevant data protection law and Avensys' privacy policy.

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Software Developer based in Pune, you will be responsible for using your 3-4 years of experience to design and develop robust software solutions. Your expertise in Core Java (1.7 or higher), OOPS concepts, and the Spring framework (Core, AOP, Batch, JMS) will be crucial in creating efficient applications. Your role will involve designing Web Services (SOAP and REST) and developing Microservices APIs using Spring and Spring Boot. Your proficiency in working with databases such as MySQL, PostgreSQL, and Oracle PL/SQL will be essential for database development tasks. Strong coding skills, along with good analytical and problem-solving abilities, will enable you to deliver high-quality code. Your understanding of website authentication, Identity Management, REST APIs, security principles, and industry best practices will be key to ensuring secure and reliable software solutions. Additionally, your knowledge of Java Native Interface (JNI) and experience with web and application servers such as Apache Tomcat, nginx, Vertex/Grizzly, or JBoss will contribute to the successful implementation of projects. Familiarity with Java Cryptography Architecture (JCA) will be an added advantage in this role.

Posted 5 days ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

At PwC, our people in tax services focus on providing advice and guidance to clients on tax planning, compliance, and strategy. These individuals help businesses navigate complex tax regulations and optimise their tax positions. In indirect tax at PwC, you will focus on value-added tax (VAT), goods and services tax (GST), sales tax and other indirect taxes. Your work will involve providing advice and guidance to clients on indirect tax planning, compliance, and strategy, helping businesses navigate complex indirect tax regulations and optimise their indirect tax positions.

Enhancing your leadership style, you motivate, develop and inspire others to deliver quality. You are responsible for coaching, leveraging team members' unique strengths, and managing performance to deliver on client expectations. With your growing knowledge of how business works, you play an important role in identifying opportunities that contribute to the success of our Firm. You are expected to lead with integrity and authenticity, articulating our purpose and values in a meaningful way. You embrace technology and innovation to enhance your delivery and encourage others to do the same.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
Analyse and identify the linkages and interactions between the component parts of an entire system.
Take ownership of projects, ensuring their successful planning, budgeting, execution, and completion.
Partner with team leadership to ensure collective ownership of quality, timelines, and deliverables.
Develop skills outside your comfort zone, and encourage others to do the same.
Effectively mentor others.
Use the review of work as an opportunity to deepen the expertise of team members.
Address conflicts or issues, engaging in difficult conversations with clients, team members and other stakeholders, escalating where appropriate.
Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements.

Required skills and experience:
7+ years of experience.
Project management experience.
Strong ERP skills (SAP/Oracle), including configuration related to tax business processes (order-to-cash and procure-to-pay).
Tax engine experience (Vertex, Avalara, OneSource, Sovos), including configurations related to tax determination and reporting.
Ability to independently design solutions based on the requirements, with strong functional expertise and deep problem-solving skills to shape the ideal future-state solution for our clients.
Build indirect tax requirements through interaction with client process teams.
Deep understanding of the business requirements in indirect tax process automation, with the ability to take ownership of the functional design.
Perform testing in ERP systems and tax engines.
Ability to identify the root cause of calculation issues in Oracle or SAP.
Ability to create documentation: Business Requirements Documents, Functional Designs, Configurations, Test Scenarios and Training Guides.
Ensure timely resolution of issues through training, coaching, issue triage, classification, resolution, escalation, and SLA enforcement where necessary.
Support existing and new client integrations through effective coordination, communication, and project management activities.
A strong customer service mindset, with the ability to build, maintain, and enhance relationships.

We're looking for people who can speak up confidently, with a genuine desire to make things better across the business. If you're ready to further build on your reputation, apply now.

Posted 5 days ago

Apply

4.0 years

0 Lacs

Andhra Pradesh, India

On-site

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

As part of our Analytics and Insights Consumption team, you'll analyze data to drive useful insights for clients to address core business issues or to drive strategic outcomes. You'll use visualization, statistical and analytics models, AI/ML techniques, ModelOps and other techniques to develop these insights.

Years of Experience: Candidates with 4+ years of hands-on experience

Must Have
Internal and external stakeholder management
Familiarity with the CCaaS domain, CCaaS application development, contact center solution design and presales consulting
In-depth knowledge of CCaaS platforms such as MS DCCP, Amazon Connect, NICE CXone, Genesys Cloud, Cisco Webex CC, Cisco HCS and UCCE/PCCE, including their architecture, functionalities, application development and integration capabilities
Governance and communication skills
Hands-on configuration of Gen AI and LLMs built on top of CCaaS platforms (MS DCCP, Amazon Connect, Genesys Cloud/NICE CXone), including:
Develop and implement generative AI models to enhance customer interactions, including chatbots, virtual agents, and automated response systems
Speech science and conversational fine-tuning (grammar and pattern analysis)
Collaborate with stakeholders to identify business needs and define AI-driven solutions that improve customer experiences
Analyze existing customer service processes and workflows to identify areas for AI integration and optimization
Create and maintain documentation for AI solutions, including design specifications and user guides
Monitor and evaluate the performance of AI models, making adjustments as necessary to improve accuracy and effectiveness
Stay updated on the latest advancements in AI technologies and their applications in customer service and contact centers
Conduct training sessions for team members and stakeholders on the use and benefits of AI technologies in the contact center
Understanding of the fundamental ingredients of enterprise integration, including interface definitions and contracts; REST APIs or SOAP web services; SQL, MySQL, Oracle, PostgreSQL, DynamoDB, S3, RDS
Ability to deliver effective real-time demonstrations of CCaaS and AI (bot) platforms
High proficiency in creating top-notch customer-facing slides/presentations
Must-have Gen AI/LLM platforms and technologies: Copilot, Copilot Studio, Amazon Bedrock, Amazon Titan, SageMaker, Azure OpenAI, Azure AI Services, Google Vertex AI, Gemini AI
Proficiency in data visualization tools such as Tableau, Power BI, QuickSight and others

Nice To Have
Experience in CPaaS platforms (Twilio, Infobip) for synergies between Communications Platform as a Service and Contact Center as a Service
Understanding of cloud platforms (e.g., AWS, Azure, Google Cloud) and their services for scalable data storage, processing, and analytics
Work on high-velocity presales solution consulting engagements (RFP, RFI, RFQ)
Define industry-specific use cases (BFS&I, Telecom, Retail, Manlog etc.)
Work on high-volume presales consulting engagements, including solution design document definition and commercial construct (CCaaS)
Defining the business case
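As a point of reference for the hands-on Gen AI configuration work described in this posting, a minimal agent-assist turn on Google Vertex AI (one of the listed platforms) could be sketched as below. The project ID, region, model name, and intent labels are placeholder assumptions for illustration only; a production CCaaS integration would sit behind the contact center platform's own bot and routing framework.

```python
# Illustrative only: classify a customer message and draft a reply for a human
# agent to review, using the Vertex AI Python SDK. All identifiers below are
# placeholders, not details from the posting.
import vertexai
from vertexai.generative_models import GenerationConfig, GenerativeModel

vertexai.init(project="your-gcp-project", location="us-central1")  # placeholders
model = GenerativeModel("gemini-1.5-flash")

INTENTS = ["billing", "technical_support", "cancellation", "other"]

def classify_intent(message: str) -> str:
    """Ask the model to map a customer message onto one known intent label."""
    prompt = (
        f"Classify this contact-center message into one of {INTENTS}. "
        f"Reply with the label only.\n\nMessage: {message}"
    )
    label = model.generate_content(
        prompt, generation_config=GenerationConfig(temperature=0.0)
    ).text.strip().lower()
    return label if label in INTENTS else "other"

def draft_reply(message: str, intent: str) -> str:
    """Draft a short agent-assist reply for a human agent to review."""
    prompt = (
        f"You assist a contact-center agent. The customer intent is '{intent}'. "
        f"Draft a brief, polite reply to: {message}"
    )
    return model.generate_content(prompt).text

msg = "I was charged twice for my subscription this month."
intent = classify_intent(msg)
print(intent)
print(draft_reply(msg, intent))
```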

Posted 5 days ago

Apply

16.0 - 25.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Skill required: Financial Planning & Analysis - Financial Planning and Analysis (FP&A)
Designation: Delivery Lead Senior Manager
Qualifications: Any Graduation
Years of Experience: 16 to 25 years

About Accenture
Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com

What would you do?
You will be aligned with our Finance Operations vertical and will help us determine financial outcomes by collecting operational data/reports, conducting analysis and reconciling transactions. The work spans financial planning, reporting, variance analysis, budgeting and forecasting. Financial planning and analysis (FP&A) refers to the processes designed to help organizations accurately plan, forecast, and budget to support the company's major business decisions and future financial health. These processes include planning, budgeting, forecasting, scenario modeling, and performance reporting.
Build, mentor, and inspire a high-performing tax team, setting aspirations, defining career paths, and promoting technical training. Foster a culture of excellence, continuous improvement, and ethical conduct. Serve as the primary liaison with external entities (regulatory authorities, auditors, tax advisors) and internal stakeholders across Finance, Legal, Treasury, and Operations. Communicate tax strategy, issues, and risks to C-level and global leadership with clarity and confidence.

What are we looking for?
Direct Tax Processing
Indirect Tax Processing
1. Establish and maintain robust governance and control frameworks for indirect tax, U.S. GAAP and federal tax reporting, identifying and mitigating key risks.
2. Lead evaluation of high-risk processes, audits, and open compliance items; take decisive action and implement corrective plans.
3. Serve as a strategic advisor to senior leadership on all tax matters: policy, compliance, operational risks, and automation initiatives.
4. Ensure full compliance across:
Indirect Taxes: Customs, federal excise, FIP, export reporting, state excise, sales/use, and PACT Act filings.
Direct Taxes: FBAR (FinCEN 114), Form 5472, federal/state income taxes, extensions, estimated payments, and Country-by-Country Reporting.
5. Oversee timely filings and payments of tax returns, with final approval and validation.
6. Lead digital transformation initiatives, including SAP, Power BI, RPA, and tax engines (Vertex, Avalara, OneSource, etc.), to drive efficiency and accuracy.
7. Manage the tax technology roadmap and liaise with Finance, IT, and external vendors to implement integrated systems.
8. Lead cross-functional, global teams to execute tax initiatives, ensuring timelines, governance, and quality standards.

Roles and Responsibilities:
In this role you are required to identify and assess complex problems for your area(s) of responsibility. You should create solutions in situations in which analysis requires in-depth knowledge of organizational objectives. The role involves setting strategic direction to establish near-term goals for your area(s) of responsibility. Interaction is with senior management levels at a client and/or within Accenture, involving negotiating or influencing on significant matters. You should have latitude in decision-making and in determining objectives and approaches to critical assignments. Your decisions have a lasting impact on your area of responsibility, with the potential to impact areas outside of it. You will manage large teams and/or work efforts (if in an individual contributor role) at a client or within Accenture. Please note that this role may require you to work in rotational shifts.

Posted 5 days ago

Apply

0 years

0 Lacs

India

Remote

Avensys is a reputed global IT professional services company headquartered in Singapore. Our service spectrum includes enterprise solution consulting, business intelligence, business process automation and managed services. Given our decade of success, we have evolved to become one of the top trusted providers in Singapore and serve a client base across banking and financial services, insurance, information technology, healthcare, retail and supply chain.

We are seeking an Oracle EBS Consultant.

Job Details:
Job Type: 6-month renewable contract
Location: Remote, India
Timings: Regular working hours (IST)

Job Description:
• 14+ years of experience in Oracle E-Business Suite applications, with expertise in Oracle Tax, GL, AP, AR and CM.
• Understanding of Fusion AP, AR, GL, FA, CM, Tax, Costing and Projects modules and how they integrate with core supply chain modules.
• Excellent understanding of, and hands-on experience with, the Oracle E-Business Tax solution.
• Experience in (US/CA/Vertex) EBTax is mandatory.
• Experience in implementing Oracle Finance EBS.
• Must have worked on at least two E-Business Suite implementation projects.

WHAT'S ON OFFER:
You will be remunerated with an excellent base salary and entitled to attractive company benefits. Additionally, you will get the opportunity to enjoy a fun and collaborative work environment, alongside strong career progression.

To submit your application, please apply online or email your updated CV in Microsoft Word format to uma@aven-sys.com. Your interest will be treated with strict confidentiality.

CONSULTANT DETAILS:
Consultant Name: Uma
Avensys Consulting Pte Ltd

Privacy Statement: Data collected will be used for recruitment purposes only. Personal data provided will be used strictly in accordance with the relevant data protection law and Avensys' privacy policy.

Posted 5 days ago

Apply

4.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Description
We have an outstanding opportunity for an expert AI and Data Cloud Solutions Engineer to work with our trailblazing customers in crafting ground-breaking customer engagement roadmaps, demonstrating the Salesforce applications and platform across the machine learning and LLM/GPT domains in India. The successful applicant will have a track record of driving business outcomes through technology solutions, with experience engaging at the C-level with Business and Technology groups.

Responsibilities
Primary pre-sales technical authority for all aspects of AI usage within the Salesforce product portfolio: existing Einstein ML-based capabilities and new (2023) generative AI
Majority of time (60%+) will be customer/external facing
Evangelisation of Salesforce AI capabilities
Assessing customer requirements and use cases and aligning them to these capabilities
Solution proposals, working with Architects and the wider Solution Engineer (SE) teams

Preferred Qualifications
Expertise in an AI-related subject (ML, deep learning, NLP etc.)
Familiarity with technologies such as OpenAI, Google Vertex, Amazon SageMaker, Snowflake, Databricks etc.

Required Qualifications
Experience will be evaluated based on the core proficiencies of the role.
4+ years working directly in the commercial technology space with AI products and solutions
Data knowledge: data science, data lakes and warehouses, ETL, ELT, data quality
AI knowledge: application of algorithms and models to solve business problems (ML, LLMs, GPT)
10+ years working in a sales, pre-sales, consulting or related function in a commercial software company
Strong focus on, and experience in, pre-sales or implementation is required
Experience in demonstrating customer engagement solutions; ability to understand and drive use cases and customer journeys, and to draw a 'day in the life of' across different LOBs
Business analysis, business case and return-on-investment construction
Demonstrable experience in presenting and communicating complex concepts to large audiences
A broad understanding of, and the ability to articulate, the benefits of CRM, Sales, Service and Marketing Cloud offerings
Strong verbal and written communication skills with a focus on needs analysis, positioning, business justification, and closing techniques
A continuous-learning demeanour with a demonstrated history of self-enablement and advancement in both technology and behavioural areas
Building reference models/ideas/approaches for the inclusion of GPT-based products within wider Salesforce solution architectures, especially involving Data Cloud
Alignment with customer security and privacy teams on the trust capabilities and values of our solution(s)
Presenting at multiple customer events, from single-account sessions through to major strategic events (World Tour, Dreamforce)
Representing Salesforce at other events (subject to PM approval)
Sales and SE organisation education and enablement, e.g. roadmap, for all roles across all product areas
Acting as the bridge/primary contact point to product management
Provide thought leadership in how large enterprise organisations can drive customer success through digital transformation
Ability to uncover the challenges and issues a business is facing by running successful and targeted discovery sessions and workshops
Be an innovator who can build new solutions using out-of-the-box thinking
Demonstrate the business value of our AI solutions to the business using solution presentations, demonstrations and prototypes
Build roadmaps that clearly articulate how partners can implement and adopt solutions to move from the current to the future state
Deliver functional and technical responses to RFPs/RFIs
Work as an excellent teammate by contributing, learning and sharing new knowledge
Demonstrate conceptual knowledge of how to integrate cloud applications with existing business applications and technology
Lead multiple customer engagements concurrently
Be self-motivated, flexible, and take initiative

Posted 5 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies