
39 Textract Jobs - Page 2

JobPe aggregates listings for easy access; you apply directly on the original job portal.

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


About Us
Yubi stands for ubiquitous. But Yubi will also stand for transparency, collaboration, and the power of possibility. From being a disruptor in India’s debt market to marching towards global corporate markets, from one product to one holistic product suite of seven products, Yubi is the place to unleash potential. Freedom, not fear. Avenues, not roadblocks. Opportunity, not obstacles.

About Yubi
Yubi, formerly known as CredAvenue, is re-defining global debt markets by freeing the flow of finance between borrowers, lenders, and investors. We are the world's possibility platform for the discovery, investment, fulfillment, and collection of any debt solution. At Yubi, opportunities are plenty and we equip you with the tools to seize them. In March 2022, we became India's fastest fintech and most impactful startup to join the unicorn club with a Series B fundraising round of $137 million. In 2020, we began our journey with a vision of transforming and deepening the global institutional debt market through technology. Our two-sided debt marketplace helps institutional and HNI investors find the widest network of corporate borrowers and debt products on one side, and helps corporates discover investors and access debt capital efficiently on the other. Switching between platforms is easy, which means investors can lend, invest and trade bonds, all in one place. Our platforms shake up the traditional debt ecosystem and offer new ways of digital finance.

- Yubi Credit Marketplace - With the largest selection of lenders on one platform, our credit marketplace helps enterprises partner with lenders of their choice for any capital requirements.
- Yubi Invest - Fixed income securities platform for wealth managers & financial advisors to channel client investments in fixed income.
- Financial Services Platform - Designed for financial institutions to manage co-lending partnerships & asset-based securitization.
- Spocto - Debt recovery & risk mitigation platform.
- Accumn - Dedicated SaaS solutions platform powered by decision-grade data, analytics, pattern identification, early warning signals and predictions for lenders, investors and business enterprises.

So far, we have onboarded more than 17,000 enterprises and 6,200 investors and lenders, and facilitated debt volumes of over INR 1,40,000 crore. Backed by marquee investors like Insight Partners, B Capital Group, Dragoneer, Sequoia Capital, LightSpeed and Lightrock, we are a one-of-its-kind debt platform globally, revolutionizing the segment. At Yubi, people are at the core of the business and our most valuable assets. Yubi is constantly growing, with 1000+ like-minded individuals today who are changing the way people perceive debt. We are a fun bunch who are highly motivated and driven to create a purposeful impact. Come join the club to be a part of our epic growth story.

Requirements

Key Responsibilities:
- Lead and mentor a dynamic Data Science team in developing scalable, reusable tools and capabilities to advance machine learning models, specializing in computer vision, natural language processing, API development and product building.
- Drive innovative solutions for complex CV-NLP challenges, including tasks like image classification, data extraction, text classification, and summarization, leveraging a diverse set of data inputs such as images, documents, and text.
- Collaborate with cross-functional teams, including DevOps and Data Engineering, to design and implement efficient ML pipelines that facilitate seamless model integration and deployment in production environments.
- Spearhead the optimization of the model development lifecycle, focusing on scalability for training and production scoring to manage significant data volumes and user traffic.
- Implement cutting-edge technologies and techniques to enhance model training throughput and response times.

Required Experience & Expertise:
- 3+ years of experience in developing computer vision models and applications.
- Extensive knowledge and experience in Data Science and Machine Learning techniques, with a proven track record in leading and executing complex projects.
- Deep understanding of the entire ML model development lifecycle, including design, development, training, testing/evaluation, and deployment, with the ability to guide best practices.
- Expertise in writing high-quality, reusable code for various stages of model development, including training, testing, and deployment.
- Advanced proficiency in Python programming, with extensive experience in ML frameworks such as Scikit-learn, TensorFlow, and Keras, and API development frameworks such as Django and FastAPI.
- Demonstrated success in overcoming OCR challenges using advanced methodologies and libraries like Tesseract, Keras-OCR, EasyOCR, etc.
- Proven experience in architecting reusable APIs to integrate OCR capabilities across diverse applications and use cases.
- Proficiency with public cloud OCR services like AWS Textract, GCP Vision, and Document AI.
- History of integrating OCR solutions into production systems for efficient text extraction from various media, including images and PDFs.
- Comprehensive understanding of convolutional neural networks (CNNs) and hands-on experience with deep learning models such as YOLO.
- Strong capability to prototype, evaluate, and implement state-of-the-art ML advancements, particularly in OCR and CV-NLP.
- Extensive experience in NLP tasks, such as Named Entity Recognition (NER), text classification, and fine-tuning of Large Language Models (LLMs).

This senior role is tailored for visionary professionals eager to push the boundaries of CV-NLP and drive impactful data-driven innovations using both well-established methods and the latest technological advancements.

Benefits
We are committed to creating a diverse environment and are proud to be an equal-opportunity employer. All qualified applicants receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, or age.
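The OCR requirements above center on AWS Textract, whose responses arrive as a flat list of typed blocks rather than plain text. A minimal, hedged sketch of flattening that response (the helper name and file path are ours; the boto3 call is shown but commented out because it needs AWS credentials):

```python
def textract_lines(response):
    """Collect the LINE blocks from a Textract DetectDocumentText
    response, preserving their top-to-bottom order."""
    return [
        block["Text"]
        for block in response.get("Blocks", [])
        if block["BlockType"] == "LINE"
    ]


# Calling the synchronous API (sketch only; requires AWS credentials,
# and multi-page PDFs need the async StartDocumentTextDetection flow):
# import boto3
# client = boto3.client("textract")
# with open("invoice.png", "rb") as f:
#     response = client.detect_document_text(Document={"Bytes": f.read()})
# print("\n".join(textract_lines(response)))
```

The same flattening works on responses stored in S3, which is how pipelines usually replay or re-process Textract output.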

Posted 3 weeks ago

Apply

1.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Requirements

Key Responsibilities:
- Join a dynamic Data Science team as a CV-NLP Engineer, where you'll develop reusable tools and capabilities for building advanced machine learning models.
- Tackle cutting-edge CV-NLP challenges, including image classification, data extraction, text classification, and summarization, using images, documents, and text data.
- Collaborate closely with DevOps and Data Engineering teams to create efficient ML pipelines, ensuring seamless integration and deployment of models into production environments.
- Accelerate the model development lifecycle, ensuring scalability for training and production scoring to handle large volumes of data and user traffic.
- Optimize model training throughput and response times using the latest technologies and techniques.

Required Experience & Expertise:
- 1-3 years of experience in developing computer vision models and applications.
- Foundational knowledge in API development and experience in Data Science and Machine Learning techniques.
- Strong understanding of the complete ML model development lifecycle, including development, training, testing/evaluation, and deployment.
- Proficient in writing reusable code for various ML stages, such as model training, testing, and deployment.
- Hands-on experience in Python programming.
- Proven track record in developing solutions for ML problems using frameworks like Scikit-learn, TensorFlow, Keras, etc.
- Experience solving OCR challenges with pre-trained models and libraries such as Tesseract, Keras-OCR, EasyOCR, etc.
- Skilled in developing reusable APIs for integrating OCR capabilities with various applications.
- Familiarity with public cloud OCR services like AWS Textract, GCP Vision, etc.
- Experience in integrating OCR solutions into production systems for extracting text from diverse images, PDFs, and other document types.
- Solid understanding of CNN concepts and experience with deep learning models such as YOLO.
- Ability to prototype, evaluate, and incorporate the latest ML advancements, particularly in OCR.
- Experience in NLP tasks, including Named Entity Recognition (NER) and text classification.
- Experience with Large Language Models (LLMs).

This role is for those who are enthusiastic about pushing the boundaries of what's possible in CV-NLP, leveraging both established and cutting-edge methodologies.

Benefits
We are committed to creating a diverse environment and are proud to be an equal-opportunity employer. All qualified applicants receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, or age.
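Much of the text-classification work named above can be covered by a classical baseline before any deep model is needed. A sketch with scikit-learn (the toy corpus and labels are invented for illustration):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy corpus: label short document snippets by type.
docs = [
    "invoice total amount due payment",
    "invoice balance payable by date",
    "meeting agenda notes discussion",
    "team meeting minutes action items",
]
labels = ["finance", "finance", "general", "general"]

# TF-IDF features + logistic regression: a standard baseline pipeline.
clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(docs, labels)

print(clf.predict(["amount payable on this invoice"])[0])
```

In practice the corpus would be OCR output from Textract or Tesseract, and the baseline's accuracy sets the bar a deep model has to beat.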

Posted 3 weeks ago

Apply

0 years

3 - 5 Lacs

Ahmedabad

On-site


Data Engineer

Location: Ahmedabad, Surat & Mumbai
Apply Now: https://forms.office.com/r/z987VFhAH1 999 8741 755

Role Summary: The Data Engineer sets up data ingestion pipelines, normalizes incoming data, and ensures clean, structured data for AI models and downstream agents.

Key Responsibilities:
- Set up Kinesis Data Streams and Lambda triggers for real-time data ingestion.
- Use AWS Textract to parse DA documents and store them in S3/RDS.
- Develop ETL jobs using AWS Glue for historical data transformation.
- Manage data schemas in RDS.
- Implement data validation rules and maintain data quality standards.
- Optimize S3 lifecycle and storage policies (Intelligent-Tiering, Glacier).

Skills & Experience:
- Experience with data pipelines (Glue, Kinesis, Lambda).
- Proficiency in SQL (Postgres) and DynamoDB modeling.
- S3 and object storage best practices.
- Python (Pandas, data processing libraries).
- Data governance and security awareness.

Job Type: Full-time
Pay: ₹350,000.00 - ₹500,000.00 per year
Work Location: In person
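The Kinesis-to-Lambda ingestion step above amounts to decoding the base64-encoded record payloads Lambda receives and normalizing them before they are written to S3/RDS. A hedged sketch (the `doc_id`/`amount` field names are hypothetical, not from the posting):

```python
import base64
import json


def lambda_handler(event, context=None):
    """Decode base64-encoded JSON payloads from a Kinesis trigger event
    and normalize them into flat records for downstream storage."""
    rows = []
    for record in event.get("Records", []):
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        rows.append({
            "doc_id": str(payload["doc_id"]),       # coerce to stable types
            "amount": float(payload.get("amount", 0)),
        })
    return rows
```

In a real pipeline the returned rows would be batched into S3 or inserted into RDS; validation failures would be routed to a dead-letter queue rather than raised.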

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


About You – Experience, Education, Skills, And Accomplishments
- A Bachelor's in Engineering or a Master's degree (BE, ME, B.Tech, M.Tech, MCA, MS) with strong communication and reasoning abilities is required.
- Over 5 years of hands-on technical experience with AWS serverless resources, including but not limited to ECS, Lambda, RDS, API Gateway, S3, CloudFront, and ALB.
- Over 8 years of experience independently developing modules in one or more of: Python, web development, JavaScript/TypeScript, and containers.
- Experience in design and development of web-based applications using NodeJS.
- Experience with modern JavaScript frameworks (Vue.js, Angular, React) and UI testing (Puppeteer, Playwright, Selenium).
- Experience working in a CI/CD setup with multiple environments, with the ability to manage code and deployments towards incrementally faster releases.
- Experience with RDBMS and NoSQL databases, particularly MySQL or PostgreSQL.

Additionally, it would be advantageous if you have:
- Experience with Terraform or similar, and IaC in general.
- Familiarity with AWS Bedrock.
- Experience with OCR engines and solutions, e.g. AWS Textract, Google Cloud Vision.
- Interest in exploring and adopting data science methodologies and AI/ML technologies to optimize project outcomes.

What will you be doing in this role?
Overall, you will play a pivotal role in driving the success of the development projects and achieving business objectives through innovative and efficient agile software development practices.
- Provide technical guidance to the dev team so proofs of concept can be productionized.
- Drive and execute productionizing activities.
- Identify and pursue opportunities for reuse across team boundaries.
- Quickly and efficiently resolve complex technical issues by analysing information, evaluating options, and executing decisions.
- Participate in technical design discussions and groups for feature development.
- Understand the impact of architecture and hosting strategies on technical design and apply industry best practices in software development, including unit testing, object-oriented design, and code reviews.
- Work with team members to address findings from security, functionality, and performance tests.
- Conduct detailed code reviews for intricate solutions, offering enhancements where feasible.
- Prioritize security and performance in all implementations.

About The Team
Our team comprises driven professionals who are deeply committed to leveraging technology to make a tangible impact in the patent services area. Joining us, you'll thrive in a multi-region, cross-cultural environment, collaborating on cutting-edge technologies with a strong emphasis on a user-centric approach.

At Clarivate, we are committed to providing equal employment opportunities for all persons with respect to hiring, compensation, promotion, training, and other terms, conditions, and privileges of employment. We comply with applicable laws and regulations governing non-discrimination in all locations.
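For the Lambda + API Gateway stack listed above, the proxy integration expects a handler that returns a mapping with `statusCode`, `headers`, and a string `body`. A minimal sketch (the greeting endpoint is invented for illustration):

```python
import json


def lambda_handler(event, context=None):
    """Minimal AWS Lambda handler for the API Gateway proxy integration.
    The body must be a string, so JSON payloads are serialized explicitly."""
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello {name}"}),
    }
```

The same shape applies whether the function is deployed by hand, via Terraform, or behind an ALB with the Lambda target type.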

Posted 3 weeks ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Clarivate is on the lookout for a Sr. Software Engineer ML (machine learning) to join our Patent Service team in Noida. The successful candidate will focus on supporting machine learning (ML) projects: deploying, scaling, and maintaining ML models in production environments, and working closely with data scientists, ML engineers, and software developers to architect robust infrastructure, implement automation pipelines, and ensure the reliability and scalability of our ML systems. The ideal candidate should be eager to learn, equipped with strong hands-on technical and analytical thinking skills, have a passion for teamwork, and stay updated with the latest technological trends.

About You – Experience, Education, Skills, And Accomplishments
- A Bachelor's in Engineering or a Master's degree (BE, ME, B.Tech, M.Tech, MCA, MS) with strong communication and reasoning abilities is required.
- Proven experience as a Machine Learning Engineer or in a similar position.
- Deep knowledge of math, probability, statistics and algorithms.
- Outstanding analytical and problem-solving skills.
- Understanding of data structures, data modeling and software architecture.
- Good understanding of ML concepts and frameworks (e.g., TensorFlow, Keras, PyTorch).
- Proficiency with Python and basic libraries for machine learning such as scikit-learn and pandas.
- Expertise in prompt engineering.
- Expertise in visualizing and manipulating big datasets.
- Working experience managing ML workloads in production.
- Experience implementing and/or practicing MLOps or LLMOps concepts.

Additionally, it would be advantageous if you have:
- Experience with Terraform or similar, and IaC in general.
- Familiarity with AWS Bedrock.
- Experience with OCR engines and solutions, e.g. AWS Textract, Google Cloud Vision.
- Interest in exploring and adopting data science methodologies and AI/ML technologies to optimize project outcomes.
- Experience working in a CI/CD setup with multiple environments, with the ability to manage code and deployments towards incrementally faster releases.
- Experience with RDBMS and NoSQL databases, particularly MySQL or PostgreSQL.

What will you be doing in this role?
Overall, you will play a pivotal role in driving the success of the development projects and achieving business objectives through innovative and efficient agile software development practices.
- Designing and developing machine learning systems.
- Implementing appropriate ML algorithms, analyzing ML algorithms that could be used to solve a given problem, and ranking them by their success probability.
- Running machine learning tests and experiments, performing statistical analysis and fine-tuning using test results, and training and retraining systems when necessary.
- Implementing monitoring and alerting systems to track the performance and health of ML models in production.
- Ensuring security best practices are followed in the deployment and management of ML systems.
- Optimizing infrastructure for performance, scalability, and cost efficiency.
- Developing and maintaining CI/CD pipelines for automated model training, testing, and deployment.
- Troubleshooting issues related to infrastructure, deployments, and performance of ML models.
- Staying up to date with the latest advancements in ML technologies and evaluating their potential impact on our workflows.

About The Team
Our team comprises driven professionals who are deeply committed to leveraging technology to make a tangible impact in the patent services area. Joining us, you'll thrive in a multi-region, cross-cultural environment, collaborating on cutting-edge technologies with a strong emphasis on a user-centric approach.

At Clarivate, we are committed to providing equal employment opportunities for all qualified persons with respect to hiring, compensation, promotion, training, and other terms, conditions, and privileges of employment. We comply with applicable laws and regulations governing non-discrimination in all locations.

Posted 1 month ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


We are looking for a Document Extraction and Inference Engineer with expertise in traditional machine learning algorithms and rule-based NLP techniques. The ideal candidate will have a strong foundation in document processing, structured data extraction, and inference modeling using classical ML approaches. You will work on designing, implementing, and optimizing document extraction pipelines for various applications, ensuring accuracy and efficiency.

Key Responsibilities
- Develop and implement document parsing and structured data extraction techniques.
- Utilize OCR (Optical Character Recognition) and pattern-based NLP for text extraction.
- Optimize rule-based and statistical models for document classification and entity recognition.
- Design feature engineering strategies for improving inference accuracy.
- Work with structured and semi-structured data (PDFs, scanned documents, XML, JSON).
- Implement knowledge-based inference models for decision-making applications.
- Collaborate with data engineers to build scalable document processing pipelines.
- Conduct error analysis and improve extraction accuracy through iterative refinements.
- Stay updated with advancements in traditional NLP and document processing techniques.

Required Qualifications
- Bachelor’s or Master’s degree in Computer Science, AI, Machine Learning, or a related field.
- 3+ years of experience in document extraction and inference modeling.
- Strong proficiency in Python and ML libraries (Scikit-learn, NLTK, OpenCV, Tesseract).
- Expertise in OCR technologies, regular expressions, and rule-based NLP.
- Experience with SQL and database management for handling extracted data.
- Knowledge of probabilistic models, optimization techniques, and statistical inference.
- Familiarity with cloud-based document processing (AWS Textract, Azure Form Recognizer).
- Strong analytical and problem-solving skills.

Preferred Qualifications
- Experience with graph-based document analysis and knowledge graphs.
- Knowledge of time series analysis for document-based forecasting.
- Exposure to reinforcement learning for adaptive document processing.
- Understanding of the credit/loan processing domain.

Location: Chennai, India
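The rule-based extraction described above typically pairs OCR output with hand-written patterns for each field. A sketch of regex-based field extraction (the two patterns and the sample sentence are invented for illustration, not from the posting):

```python
import re

# Hypothetical field patterns for rule-based extraction from loan documents.
AMOUNT_RE = re.compile(r"(?:INR|Rs\.?|₹)\s?([\d,]+(?:\.\d+)?)")
DATE_RE = re.compile(r"\b(\d{2}[/-]\d{2}[/-]\d{4})\b")


def extract_fields(text):
    """Pull currency amounts and dd/mm/yyyy dates out of raw OCR text."""
    return {
        "amounts": AMOUNT_RE.findall(text),
        "dates": DATE_RE.findall(text),
    }


print(extract_fields("Loan of INR 2,50,000 sanctioned on 15/04/2024."))
```

Patterns like these are where most iterative refinement happens: error analysis on real documents grows the pattern set one failure case at a time.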

Posted 1 month ago

Apply

6.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Application Deadline: 30th May 2025

We are seeking an experienced Senior Developer to lead the engineering behind our Core AI Orchestration Platform, leveraging LangGraph, LangChain, and cutting-edge LLMs. You’ll design, build, and scale a multi-agent system for document parsing, contract validation, and workflows, with a focus on performance, explainability, and real-time traceability. You will get an opportunity to shape a next-gen AI product with a truly global team and work with edge tools (LangGraph, Claude, GPT-4.1). You will work at the intersection of backend APIs, AI pipeline orchestration, and frontend dashboards, bringing together structured reasoning, vision models, and document intelligence. Apply now and help us build the orchestration layer powering the next generation of intelligent systems.

Key Responsibilities
- Implement multi-agent workflows using LangGraph and LangChain, enabling conditional routing, tool invocation, and memory-based decisions.
- Integrate LLMs (Claude 3, GPT-4.1) and vision models (Claude Opus, OpenAI Vision) for document understanding and structured output generation.
- Build robust APIs using FastAPI, including support for async processing, webhook-based triggers, and job queues.
- Implement PDF/DOCX parsing pipelines using Textract and Unstructured.io, combined with RAG-based retrieval for clause-level reasoning.
- Manage and optimize data pipelines leveraging Supabase Postgres, pgvector, and Amazon S3 for structured and unstructured storage.
- Build internal tools and dashboards using Next.js, React, and Tailwind CSS for audit workflows, feedback loops, and reviewer management.
- Own deployment and DevOps workflows.
- Set up observability and testing infrastructure using LangSmith or LangFuse, with monitoring.

Requirements
- 6+ years of hands-on development experience (Python + JS preferred)
- Deep understanding of LLM integration, prompt engineering, and RAG systems
- Proven experience building async-ready APIs and document processing pipelines
- Strong understanding of Postgres schemas, joins, indexing, and pgvector usage
- Familiarity with Next.js and frontend best practices
- DevOps comfort with EC2, Docker, and CI/CD

Bonus:
- Experience with LangGraph, LangSmith, or Bedrock/OpenAI SDKs
- Prior experience with multi-agent LLM systems
- Background in document intelligence or compliance tooling
- Experience scaling real-time dashboards for multi-user environments
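The conditional routing that LangGraph provides can be illustrated without the library itself: each node is a function over a shared state dict, and a classifier node chooses the next edge. A plain-Python sketch (the node names and keyword routing rule are ours, not LangGraph's API):

```python
def classify(state):
    """Router node: pick a branch based on the document text."""
    text = state["text"].lower()
    state["route"] = "contract" if ("party" in text or "clause" in text) else "generic"
    return state


def contract_agent(state):
    """Branch for contract-like documents (would run clause validation)."""
    state["result"] = f"validated contract ({len(state['text'])} chars)"
    return state


def generic_agent(state):
    """Fallback branch (would run generic summarization)."""
    state["result"] = "summarized generic document"
    return state


NODES = {"contract": contract_agent, "generic": generic_agent}


def run_workflow(text):
    state = classify({"text": text})
    return NODES[state["route"]](state)


print(run_workflow("This clause binds each party to the schedule.")["route"])
```

In LangGraph the same shape becomes a `StateGraph` with conditional edges, but the mental model, typed state flowing through routed nodes, is the same.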

Posted 1 month ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Position Summary Job Title: Manager - Government and Public Services Enabling Areas (GPS EA). The Team: GPS GSi The Role: Senior Data Scientist The Team: Do you have a strong background in machine learning and deep learning? Are you interested in utilizing your data science skills and collaborating with a small team in a fast-paced environment to achieve strategic mission goals? If so, Deloitte has an exciting opportunity for you! As a member of our GPS GSi group, you will play a crucial role in the development and maintenance of our data science and business intelligence solutions. This role will specialize in assisting with machine learning, deep learning, and generative AI initiatives that will be utilized by Enabling Area professionals to enhance and expedite decision-making. You will provide expertise within and across business teams, demonstrate the ability to work independently / as a team, and apply problem-solving skills to resolve complex issues. Work you will do. Technology: Deliver exceptional client service. Maximizes results and drives high performance from people while fostering collaboration across businesses and geographies Interfacing with business customers and leadership to gather requirements and deliver complete Data Engineering, Data Warehousing, and BI solutions. Design, train, and deploy machine learning and deep learning models to AWS, Databricks, and Dataiku platforms. Develop, design, and/or advise on Large Language Model (LLM) solutions for enterprise-wide documentation (e.g., Retrieval-Augmented Generation (RAG), Continued Pre-training (CPT), Supervised Fine-tuning (SFT), etc.) Utilize Machine Learning Operations (MLOps) pipelines, including knowledge of containerization (Docker) and CI/CD for training and deploying models. Maintain structured documentation of project development stages, including the utilization of GitHub and/or Jira for version control and project management. 
Demonstrate effective communication skills with the ability to provide expertise and break down complex analytical solutions to explain to clients. Remain current with latest industry trends and developments in data science and/or related fields, with the ability to learn new skills and knowledge to advance the skillset of our Data Science team. Apply thorough attention to detail, and carefully review data science solutions for accuracy and quality. Leadership: Develop high-performing teams by providing challenging and meaningful opportunities, and acknowledge their contributions to the organization's success. Establish the team's strategy and roadmap, prioritizing initiatives based on their broader business impact. Demonstrate leadership in guiding both US and USI teams to deliver advanced technical solutions across the GPS practice. Serve as a role model for junior practitioners, inspiring action and fostering positive behaviors. Pursue new and challenging initiatives that have a positive impact on our Practice and our personnel. Establish a reputation as a Deloitte expert and be acknowledged as a role model and senior member by client teams. Support and participate in the recognition and reward of junior team members. People Development: Actively seek, provide, and respond to constructive feedback. Offer development guidance to the GSi team, enhancing their people, leadership, and client management skills. Play a pivotal role in recruitment and the onboarding of new hires. Engage in formal performance assessment activities for assigned staff and collaborate with Practice leadership to address and resolve performance issues. Serve as an effective coach by helping counselees identify their strengths and opportunities to capitalize on them. Foster a "One Team" mindset among US and USI team members. 
Qualifications: Required/Preferred: Bachelor's degree, preferably in Management Information Systems, Computer Science, Software Engineering, or related IT discipline Minimum of 10+ years of relevant experience with data science technologies and analytics advisory or consulting firms. Strong knowledge of LLMs and RAG. Familiarity with AWS, Databricks, and/or Dataiku platforms. Working knowledge of MLOps, including familiarity with containerization (e.g., Docker). Excellent troubleshooting skills and the ability to work independently. Strong organizational skills, including clear documentation of projects and ability to write clean code. Familiarity with agile project methodology and/or project development lifecycle. Experience with GitHub for version control. Excellent communication and presentation skills, with the ability to explain complex data science concepts to non-technical audiences. Ability to complete work in an acceptable timeframe and manage a variety of detailed tasks and responsibilities simultaneously and with accuracy to meet deadlines, goals, and objectives and satisfy internal and external customer needs related to the job. Extensive experience with MLOps and associated serving frameworks (i.e., Flask, FastAPI, etc.) and orchestration pipelines (e.g., Sage Maker Pipelines, Step Functions, Metaflow, etc.). Extensive experience working with open source LLMs (e.g., serving via TGI / vLLM, performing CPT and/or SFT, etc.). Experience using various AWS Services (e.g., Textract, Transcribe, Lambda, etc.). Proficiency in basic front-end web development (e.g., Streamlit). Knowledge of Object-Oriented Programming (OOP) concepts. At least 3-4 years of people management skills are required. Work Location:Hyderabad Timings: 2 PM – 11PM How You’ll Grow At Deloitte, we’ve invested a great deal to create a rich environment in which our professionals can grow. 
We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. And, as a part of our efforts, we provide our professionals with a variety of learning and networking opportunities— including exposure to leaders, sponsors, coaches, and challenging assignments—to help accelerate their careers along the way. No two people learn in exactly the same way. So, we provide a range of resources including live classrooms, team-based learning, and eLearning. DU: The Leadership Center in India, our state-of-the-art, worldclass learning Center in the Hyderabad offices is an extension of the Deloitte University (DU) in Westlake, Texas, and represents a tangible symbol of our commitment to our people’s growth and development. Explore DU: The Leadership Center in India Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Deloitte’s culture Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives. Learn more about Life at Deloitte. Corporate citizenship Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people and our communities.We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte’s impact on the world. 
Recruiting tips: From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Our people and culture: Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose: Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development: From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to build new skills, take on leadership opportunities, and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their careers.

Requisition code: 302611

Posted 1 month ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

Remote

About the Role
Drive the digital backbone of a growing commercial real-estate group. You'll prototype, test, and ship automations that save our teams more than 10 hours/week in the first 90 days.

Availability: ~20 hrs/week (flexible), Gurgaon/remote hybrid.
Engagement model: on-site 1 day/week during rollout peaks.
Compensation: ₹55–70k per month.

Core Responsibilities
1. Systems Audit & Consolidation – unify Google Workspace tenants, rationalise shared drives.
2. Database & CRM Build-out – design, deploy, and maintain an occupant tracker and a lightweight CRM; migrate legacy data.
3. Automation & Integration – link the CRM, Google Sheets, and Tally using Apps Script/Zoho Flow/Zapier.
4. Process Documentation – own the internal wiki; keep SOPs and RACI charts current.
5. Dashboards & Reporting – craft Looker Studio boards for collections, projects, and facility KPIs.
6. User Training & Support – deliver monthly clinics; teach teams how to use Google Workspace and ChatGPT to improve productivity.
7. Security & Compliance – enforce 2FA, backup policies, and basic network hygiene.
8. Vendor Co-ordination – liaise with Zoho, Tally consultants, and ISP/MSP vendors; manage small capex items.

🔧 Required Skills & Experience
We're looking for a hands-on builder with a strong track record in automation, low-code systems, and internal tooling. The ideal candidate will bring most (not necessarily all) of the following:

⚙️ Automation & Low-Code Workflows
- Practical experience building solutions with Google Apps Script or Zoho Creator/Flow, including REST APIs and webhooks
- Familiarity with workflow bridges like Zapier, Make, or n8n
- Bonus: exposure to AI-based low-code tools like Cursor or Lovable

📄 Data Extraction & Integrations
- Hands-on experience using OCR/Document AI tools (e.g. Google DocAI, AWS Textract) to parse and structure lease or legal documents
- Familiarity with Tally Prime integrations via API or ODBC for syncing financial data

📇 CRM & Customer View
- Experience with end-to-end CRM rollouts (Zoho/Freshsales preferred), including data migration and module customization
- Bonus: exposure to helpdesk tools like Zoho Desk or Freshdesk

📊 Analytics & Reporting
- Advanced proficiency in Google Sheets (ARRAYFORMULA, QUERY, IMPORTRANGE)
- Experience designing interactive dashboards in Looker Studio
- Bonus: awareness of data warehousing concepts (BigQuery, Redshift) for creating a unified customer view

🧠 Scripting & AI
- Comfortable writing Python or Node.js scripts for lightweight cloud functions and ETL
- Experience using OpenAI/Claude APIs to build small copilots or automations (e.g., résumé rankers, document summarizers)

📋 Project & Knowledge Management
- Bonus: familiarity with Trello or other Kanban-style project boards
- Strong documentation skills with Notion or Google Sites for building wikis, SOPs, and internal help resources

🗣️ Soft Skills
- Able to explain technical systems clearly to non-technical stakeholders
- Comfortable training teams in both English and Hindi

📩 How to Apply?
If this sounds like you, please apply via this short form:
👉 https://forms.gle/3gPwMqnadpf3dP159
We'll review responses daily. If you clear the knockout round, you'll receive a 30-minute skills test within 24 hours.
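The lease-document extraction described above typically means flattening an OCR service's key-value output into rows a tracker or CRM can ingest. A minimal sketch in Python, assuming a simplified Textract-like payload (real Textract responses are block-based and far more nested; the `fields` shape and `structure_key_values` helper here are illustrative):

```python
import json

def structure_key_values(ocr_json: str) -> dict:
    """Flatten a simplified OCR key-value payload into a plain dict.

    Assumes a hypothetical shape: a list of {"key": ..., "value": ...}
    pairs under "fields". Keys are trimmed and trailing colons dropped
    so the output is ready for a spreadsheet or CRM import.
    """
    payload = json.loads(ocr_json)
    return {f["key"].strip().rstrip(":"): f["value"].strip()
            for f in payload.get("fields", [])}

# Example lease fields as an OCR service might return them
sample = json.dumps({"fields": [
    {"key": "Tenant Name:", "value": " Acme Traders "},
    {"key": "Monthly Rent:", "value": "INR 1,25,000"},
]})
record = structure_key_values(sample)
```

From here, a row like `record` could be appended to a Google Sheet via Apps Script or pushed into the CRM's import API.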

Posted 1 month ago

Apply

8 - 12 years

0 Lacs

Mumbai, Maharashtra, India

Remote

We are seeking a talented individual to join our Data Science team at Marsh. This role will be based in Mumbai. This is a hybrid role that requires working at least three days a week in the office.

Senior Manager - Data Science and Automation

We will count on you to:
- Identify opportunities that add value to the business and make processes more efficient.
- Invest in understanding the core business, including products, processes, documents, and data points, with the objective of identifying efficiency and value-addition opportunities.
- Design and develop end-to-end NLP/LLM solutions for document parsing, information extraction, and summarization from PDFs and scanned text.
- Develop AI applications to automate manual and repetitive tasks using generative AI and machine learning.
- Fine-tune open-source LLMs (such as LLaMA, Mistral, Falcon, or similar) or build custom pipelines using APIs (OpenAI, Anthropic, Azure OpenAI).
- Build custom extraction logic using tools like LangChain, Haystack, Hugging Face Transformers, and OCR tools like Tesseract or Azure Form Recognizer.
- Create pipelines to convert outputs into formatted Microsoft Word or PDF files using libraries like python-docx, PDFKit, ReportLab, or LaTeX.
- Collaborate with data engineers and software developers to integrate AI models into production workflows.
- Ensure model performance, accuracy, scalability, and cost-efficiency across business use cases.
- Stay updated with the latest advancements in generative AI, LLMs, and NLP research to identify innovative solutions.
- Design, develop, and maintain robust data pipelines for extracting, transforming, and loading (ETL) data from diverse sources.
- As operations scale up, design and implement scalable data storage solutions and integrate them with existing systems.
- Utilize cloud platforms (AWS, Azure, Google Cloud) for data storage and processing.
- Conduct code reviews and provide mentorship to junior developers.
- Stay up to date with the latest technology trends and best practices in data engineering and cloud services.
- Lead initiatives and deliver results by engaging with cross-functional teams and resolving data-ambiguity issues.
- Take responsibility for the professional development of your projects and institute a succession plan.

What you need to have:
- Bachelor's degree in Engineering, Analytics, Computer Applications, IT, Business Analytics, or a related discipline; an MBA is a plus.
- Proven experience of 8-12 years in Python development.
- Hands-on experience with frameworks and libraries like Transformers, LangChain, PyTorch/TensorFlow, spaCy, Hugging Face, and Haystack.
- Strong expertise in document parsing, OCR (Tesseract, AWS Textract, Azure Form Recognizer), and entity extraction.
- Proficiency in Python and familiarity with cloud-based environments (Azure, AWS, GCP).
- Experience deploying models as APIs/microservices using FastAPI, Flask, or similar.
- Familiarity with PDF parsing libraries (PDFMiner, PyMuPDF, Apache PDFBox) and Word generation libraries (python-docx, PDFKit).
- Solid understanding of prompt engineering and prompt-tuning techniques.
- Proven experience with data automation and building data pipelines.
- Proven track record in building and maintaining data pipelines and ETL processes.
- Strong knowledge of Python libraries such as Pandas, NumPy, PySpark, and Camelot.
- Familiarity with database management systems (SQL and NoSQL databases).
- Experience in designing and implementing system architecture.
- Ability to operate in a multi-layered technology architecture and shape the technology maturity of the organization.
- Solid understanding of software development best practices, including version control (Git), code reviews, and testing frameworks (PyTest, unittest).
- Strong attention to detail and ability to work with complex data sets.
- Effective communication skills to present findings and insights to both technical and non-technical stakeholders.
- Superior listening, verbal, and written communication skills.
- Excellent project management and organization skills.
- Superlative stakeholder management skills, with the ability to positively influence stakeholders.
- Synthesis skills: the ability to connect the dots and answer the business question.
- Excellent problem-solving, structuring, and critical-thinking skills.
- Ability to work independently and collaboratively in a fast-paced environment.

What makes you stand out?
- Master's degree in Computer Science, Engineering, or related fields.
- Experience working with large-scale data sets and real-time data processing.
- Familiarity with additional programming languages like Java, C++, or R.
- Strong problem-solving skills and ability to work in a fast-paced environment.

Why join our team:
- We help you be your best through professional development opportunities, interesting work and supportive leaders.
- We foster a vibrant and inclusive culture where you can work with talented colleagues to create new solutions and have impact for colleagues, clients and communities.
- Our scale enables us to provide a range of career opportunities, as well as benefits and rewards to enhance your well-being.

Marsh, a business of Marsh McLennan (NYSE: MMC), is the world's top insurance broker and risk advisor. Marsh McLennan is a global leader in risk, strategy and people, advising clients in 130 countries across four businesses: Marsh, Guy Carpenter, Mercer and Oliver Wyman. With annual revenue of $24 billion and more than 90,000 colleagues, Marsh McLennan helps build the confidence to thrive through the power of perspective. For more information, visit marsh.com, or follow on LinkedIn and X.

Marsh McLennan is committed to embracing a diverse, inclusive and flexible work environment.
We aim to attract and retain the best people and embrace diversity of age, background, caste, disability, ethnic origin, family duties, gender orientation or expression, gender reassignment, marital status, nationality, parental status, personal or social status, political affiliation, race, religion and beliefs, sex/gender, sexual orientation or expression, skin color, or any other characteristic protected by applicable law.

Marsh McLennan is committed to hybrid work, which includes the flexibility of working remotely and the collaboration, connections and professional development benefits of working together in the office. All Marsh McLennan colleagues are expected to be in their local office or working onsite with clients at least three days per week. Office-based teams will identify at least one "anchor day" per week on which their full team will be together in person.

R_308144
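The document-parsing and entity-extraction work described in this listing often starts with plain pattern-based extraction before any LLM is involved. A minimal sketch in Python, assuming simple date and amount formats (`extract_fields` and the sample text are illustrative, not a Marsh API):

```python
import re
from dataclasses import dataclass

@dataclass
class ExtractedFields:
    dates: list
    amounts: list

def extract_fields(text: str) -> ExtractedFields:
    """Pull dates and currency amounts out of free text.

    Covers DD/MM/YYYY and ISO YYYY-MM-DD dates, and amounts prefixed
    with USD, INR, or $. Real pipelines would add many more formats
    and fall back to an LLM for anything the patterns miss.
    """
    dates = re.findall(r"\b\d{2}/\d{2}/\d{4}\b|\b\d{4}-\d{2}-\d{2}\b", text)
    amounts = re.findall(r"(?:USD|INR|\$)\s?[\d,]+(?:\.\d{2})?", text)
    return ExtractedFields(dates=dates, amounts=amounts)

doc = "Policy effective 2024-04-01; premium INR 1,25,000 payable by 15/04/2024."
fields = extract_fields(doc)
```

Deterministic extraction like this is cheap to validate, which is why it usually sits in front of the LLM step rather than being replaced by it.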

Posted 1 month ago

Apply

0 years

0 Lacs

Ahmedabad, Gujarat, India

Remote

Company Description
Prakash Software Solutions Pvt Ltd (PSSPL), founded in 2001, is a globally recognized software development consultancy in the IT space. As a certified Microsoft Solution Partner for Data & AI and Digital & App Innovation (Azure), and an ISO 9001:2015 and ISO 27001:2022 certified company, PSSPL has developed over 500 custom B2B and B2C applications for diverse industries. We provide end-to-end mobile and web development, UI/UX design, cloud-based solutions, AR, VR, AI, Big Data, and IoT solutions, as well as advanced quality assurance and dedicated Agile teams. Our focus is on delivering high-quality projects through frequent and open communication and collaboration with our clients.

Role Description
Experience: 4+ yrs.
Location: Ahmedabad/Vadodara

Skills required: technical knowledge of Azure, AWS, Python, Jupyter Notebook, Colab, ChatGPT, Jurassic-1, Matplotlib, scikit-learn, PyTorch, spaCy, Hugging Face, SpeechBrain, wav2letter, Elasticsearch, AWS Transcribe, Textract, Amazon SageMaker; Natural Language Processing (NLP), voice app development, data structures & algorithms, web development, machine learning, deep learning, data science, Tableau, SAS programming, SQL for data analytics, Power BI, clinical trial analysis & reporting, Git & GitHub.

Soft skills: leadership, managerial, communication, and presentation skills; analytical and logical skills; team building and client relationship management.

What you will do:
Project Planning and Management: Develop and execute comprehensive project plans, including scope, timelines, resources, and budget allocation. Define project milestones and deliverables, ensuring adherence to project management best practices throughout the project lifecycle.
Requirement Analysis: Collaborate with clients and stakeholders to gather and analyse business requirements. Translate these requirements into technical specifications and project deliverables. Identify any customization or integration needs and define the technical approach accordingly.
Solution Architecture: Create comprehensive AI/ML system designs, including architecture, data pipelines, algorithms, and model selection. Ensure that the solution aligns with business goals and is scalable and maintainable. Evaluate and select the most suitable AI/ML technologies, frameworks, and tools for each project. Stay up to date with emerging AI/ML trends and technologies.
Team Coordination: Lead and manage cross-functional project teams, including developers, consultants, and other stakeholders. Define roles and responsibilities, assign tasks, and provide guidance and support throughout the project lifecycle. Foster effective collaboration and communication among team members.
Risk and Issue Management: Identify and proactively mitigate project risks and issues. Develop contingency plans to address any potential obstacles that may arise during the project. Monitor project progress, track project metrics, and communicate project status to stakeholders, ensuring transparency and timely reporting.
Quality Assurance: Establish and enforce quality standards and best practices throughout the project. Conduct regular quality reviews and ensure adherence to coding standards, testing protocols, and documentation requirements. Perform thorough testing and coordinate user acceptance testing (UAT) activities.
Client Relationship Management: Build and maintain strong relationships with clients and stakeholders. Provide regular project updates, manage expectations, and address any concerns or issues that arise. Ensure a high level of client satisfaction by delivering projects that meet or exceed their expectations.
Continuous Improvement: AI/ML technologies are rapidly evolving. Stay current with the latest research, frameworks, algorithms, and best practices in AI/ML. Subscribe to relevant journals and blogs, and attend conferences and workshops.
Invest in ongoing education and training for your AI/ML professionals: provide opportunities for them to acquire new skills and certifications, attend training programs, and participate in online courses.

Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field (master's degree preferred).
- 3+ years of experience.
- Practical experience in at least one of Java, Python, or JavaScript.
- Practical experience with at least one of Spring, Flask, Django, or Node.js.
- Effective communication skills, including tailoring communication for various leadership levels.
- Experience moving technical or engineering programs and products from inception to delivery, articulating the impact using metrics.
- Collaborate with the team in solutioning, design, and code reviews.
- Analytical and problem-solving experience with large-scale systems.
- Experience establishing work relationships across multi-disciplinary teams working remotely.
- Interpersonal skills, including relationship building and collaboration within a diverse, cross-functional team to develop solutions.
- Organizational and coordination skills, with multitasking experience to get things done in an ambiguous, fast-paced environment.
- Analytical mindset with the ability to identify and mitigate risks, solve problems, and make data-driven decisions.
- Strong organizational skills and attention to detail, with the ability to manage multiple projects simultaneously.
- Acquire, clean, and preprocess data from various sources to prepare it for analysis and modelling.
- Perform exploratory data analysis (EDA) to identify trends, patterns, and anomalies in the data.
- Design and implement data pipelines for efficient data processing and model training.
- Develop and maintain documentation for data processes, datasets, and models.
- Collaborate with data scientists to design and evaluate machine learning models.
- Monitor and assess the performance of machine learning models and make recommendations for improvements.
- Stay up to date with industry trends and best practices in AI/ML and data analysis.
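The exploratory data analysis (EDA) step mentioned above can be illustrated with a tiny standard-library sketch; in practice this would use Pandas, and the two-sigma outlier rule here is just one common heuristic:

```python
import statistics

def summarize(values):
    """Basic EDA summary: central tendency, spread, and simple outlier flags.

    Flags any value more than two sample standard deviations from the
    mean; real EDA would also look at distributions, missingness, and
    correlations before modelling.
    """
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    outliers = [v for v in values if abs(v - mean) > 2 * stdev]
    return {"mean": mean,
            "median": statistics.median(values),
            "stdev": round(stdev, 2),
            "outliers": outliers}

# Hypothetical API latencies in milliseconds; 480 is an obvious anomaly
latencies = [110, 95, 102, 99, 480, 105, 98]
summary = summarize(latencies)
```

Spotting the anomaly numerically like this is the kind of check that feeds into the "identify trends, patterns, and anomalies" responsibility above.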

Posted 1 month ago

Apply

0.0 - 3.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

We are seeking a highly skilled and motivated DevOps Engineer to join our international IT client's team.

Key Responsibilities:
- Design, implement, and manage CI/CD pipelines using GitLab, Jenkins, and Bitbucket.
- Administer Linux servers, including networking configurations, DNS, and system troubleshooting.
- Maintain artifact repositories and Artifactory systems.
- Utilize a wide range of AWS services: EC2, S3, ECS, RDS (Postgres), Lambda (Python runtime), DynamoDB, Comprehend, Textract, and SageMaker for ML deployments.
- Optimize AWS resource usage for performance and cost-efficiency.
- Develop infrastructure using Terraform and manage Infrastructure as Code (IaC) workflows.
- Deploy and manage Kubernetes clusters, including EKS, and work with microservices architecture, load balancers, and database replication (Postgres, MongoDB).
- Work hands-on with Redis clusters, Elasticsearch, and Amazon OpenSearch.
- Integrate monitoring tools such as CloudWatch and Grafana, and implement alerting solutions.
- Support DevOps scripting using tools like the AWS CLI, Python, PowerShell, and optionally FileMaker.
- Implement and maintain automated troubleshooting and system health checks, and ensure maximum uptime.
- Collaborate with development teams to interpret test data and meet quality goals.
- Create system architecture diagrams and provide scalable, cost-effective solutions to clients.
- Implement best practices for network security, data encryption, and overall cybersecurity.
- Stay current with industry trends and introduce modern DevOps tools and practices.
- Handle client interviews with strong communication skills.

Key Skills & Requirements:
- 3–4 years of experience in DevOps roles.
- Strong knowledge of CI/CD tools (Jenkins, GitLab CI/CD, Bitbucket).
- Proficiency with AWS cloud infrastructure, including serverless technologies.
- Experience with Docker, Kubernetes, and IaC tools like Terraform.
- Expertise in Linux systems, networking, and scripting (Python, Shell, PowerShell).
- Experience working with Postgres, MongoDB, and DynamoDB.
- Knowledge of Redis, Elasticsearch, and monitoring tools (CloudWatch, Grafana).
- Understanding of microservices architecture, performance optimization, and security.

Preferred Qualifications:
- Hands-on experience with GCP and services like BigQuery, Composer, Airflow, and Pub/Sub.
- Experience designing and deploying applications on Vercel.
- Knowledge of AWS ML and NLP services (Comprehend, Textract, SageMaker).
- Familiarity with streaming data platforms and real-time pipelines.
- AWS certification (e.g., AWS Solutions Architect) or Kubernetes certification is a strong plus.
- Strong leadership and cross-functional collaboration skills.

Job Types: Full-time, Permanent
Pay: ₹540,000.00 - ₹660,000.00 per year
Benefits: Leave encashment, paid sick time, paid time off, Provident Fund
Schedule: Day shift, Monday to Friday
Ability to commute/relocate: Ahmedabad, Gujarat: Reliably commute or plan to relocate before starting work (Required)
Application Question(s): Please share your current CTC, expected CTC, and notice period.
Experience: DevOps: 3 years (Required)
Work Location: In person
Speak with the employer: +91 9727330030
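The automated system health checks mentioned in this listing often start as small scripts before graduating to CloudWatch or Grafana alerts. A minimal disk-usage sketch (the 90% threshold and result shape are illustrative choices):

```python
import shutil

def disk_health(path: str = "/", warn_pct: float = 90.0) -> dict:
    """Report disk usage for `path` and flag it when usage crosses warn_pct.

    A minimal local check; a production setup would publish this as a
    custom CloudWatch metric or a Grafana datasource rather than
    returning a dict.
    """
    usage = shutil.disk_usage(path)
    used_pct = usage.used / usage.total * 100
    return {"path": path,
            "used_pct": round(used_pct, 1),
            "alert": used_pct >= warn_pct}

status = disk_health("/")
```

Running a check like this on a schedule (cron, systemd timer, or a Lambda against EBS metrics) is the simplest version of the "automated troubleshooting and system health checks" responsibility above.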

Posted 1 month ago

Apply

0 years

0 Lacs

India

Remote

Job Title: Prompt Engineer
Location: Remote
Mode: Full-Time Employment (FTE)
Experience Required: 5 to 10 Years
Notice Period: Immediate Joiners Preferred

Job Description
We are looking for experienced Prompt Engineers to join our team and push the boundaries of what Large Language Models (LLMs) can do. You will play a crucial role in designing, developing, and optimizing prompts to enhance the performance of LLMs across a wide range of common and complex tasks. This includes building intelligent systems capable of extracting structured data from various scanned documents.

Key Responsibilities:
- Develop and optimize prompts to improve LLM performance on both generic and domain-specific tasks.
- Work on solutions that enable LLMs to extract data from scanned and unstructured documents.
- Collaborate with cross-functional teams, including data scientists, ML engineers, and product teams, to deliver robust and scalable solutions.
- Continuously test and iterate prompts for optimal performance.
- Apply prompt-tuning techniques to improve accuracy and reliability in downstream applications.

Required Skills & Experience:
- 5–10 years of experience in a technical role involving AI/ML, NLP, or software development.
- Strong hands-on experience with prompt engineering and working with LLMs (such as GPT, Claude, Gemini, etc.).
- Proficiency in Python and experience with frameworks like LangChain, OpenAI APIs, or similar.
- Familiarity with AWS Lambda or similar serverless computing platforms.
- Solid understanding of NLP tasks such as entity recognition, summarization, classification, and data extraction.
- Experience with scanned-document processing (OCR tools like Tesseract, Azure Form Recognizer, Amazon Textract, etc.) is a strong plus.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.

If you're passionate about the intersection of AI, NLP, and real-world problem solving, this role is for you.
Apply now or send your CV to jobs@weareanomaly.in
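Prompt engineering for structured extraction, as this listing describes, usually comes down to rendering a deterministic template around the document. A minimal sketch (the template wording and `build_prompt` helper are illustrative, not any specific product's API):

```python
from string import Template

EXTRACTION_PROMPT = Template(
    "Extract the following fields from the document below and answer "
    "with JSON only, using null for anything missing.\n"
    "Fields: $fields\n"
    "---\n"
    "$document"
)

def build_prompt(fields, document):
    """Render a deterministic extraction prompt.

    Listing the target schema and demanding JSON-only output are
    common prompt-engineering tactics for getting parseable answers
    from an LLM; the rendered string would be sent as the user message.
    """
    return EXTRACTION_PROMPT.substitute(
        fields=", ".join(fields), document=document)

prompt = build_prompt(["invoice_number", "total"],
                      "Invoice #881, total $42.00")
```

Keeping the template in one place makes iterating on wording (the "continuously test and iterate prompts" responsibility) a version-controlled change rather than scattered string edits.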

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Key Responsibilities
- Apply expertise in Data Science, Generative AI, and Machine Learning to design and develop solutions.
- Work with AWS Cloud services to build, deploy, and manage AI/ML applications and infrastructure components.
- Develop and integrate backend components using Java and a microservices architecture.
- Utilize Python for data analysis, model development, scripting, and other data science tasks related to AI/ML projects.
- Collaborate effectively with cross-functional teams.
- Design and implement various components of AI/ML pipelines.
- Explore and implement new Generative AI models and techniques.
- Contribute to the full development lifecycle of AI-driven applications, from concept to deployment.
- Troubleshoot and optimize AI/ML models and their performance.

Required Skills & Qualifications
- 5+ years of experience in relevant roles such as Data Science, Machine Learning Engineering, or Software Engineering with a focus on AI/ML.
- Expertise in Data Science, Generative AI, and Machine Learning concepts and practices.
- Experience with AWS Cloud services relevant to AI/ML workloads.
- Experience with Java and microservices architecture.
- Proficiency in Python.
- Good communication skills.

Plus Points
- Experience with specific Generative AI models (e.g., LLMs, specific frameworks).
- Experience with MLOps practices and tools.
- Experience with other AWS AI/ML services (e.g., SageMaker, Textract, Rekognition).
- Experience with other cloud platforms (Azure, GCP).
- Experience with database technologies.

(ref:hirist.tech)
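The AI/ML pipeline components this listing mentions are often easiest to reason about as composable stages. A minimal sketch, where each stage is a plain callable (the stage names and record shape are illustrative):

```python
def run_pipeline(record, stages):
    """Apply each stage to the record in order.

    Keeping stages as plain callables means individual steps
    (cleaning, feature extraction, model scoring) stay independently
    testable and can be rearranged without touching the runner.
    """
    for stage in stages:
        record = stage(record)
    return record

# Illustrative stages: normalize text, then split into tokens
def clean(r):
    return {**r, "text": r["text"].strip().lower()}

def tokenize(r):
    return {**r, "tokens": r["text"].split()}

out = run_pipeline({"text": "  Loan Agreement  "}, [clean, tokenize])
```

The same shape scales up: in a production setting each stage could wrap a microservice call or an AWS service (e.g., Textract for OCR, SageMaker for scoring) behind the same callable interface.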

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies