
957 Parsing Jobs - Page 5

JobPe aggregates results for easy access, but you apply directly on the original job portal.

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Description: SDET (Software Development Engineer in Test)
Notice Period Requirement: Immediate to 2 months (official)
Job Locations: Gurgaon/Delhi
Experience: 5 to 8 years
Skills: SDET, Automation, Java programming, Selenium, Playwright, Cucumber, Rest Assured, API coding (all mandatory)
Job Type: Full-Time

Job Description
We are seeking an experienced and highly skilled SDET (Software Development Engineer in Test) to join our Quality Engineering team. The ideal candidate will possess a strong background in test automation for API, mobile, or web testing, with hands-on experience creating robust automation frameworks and scripts. This role demands a thorough understanding of quality engineering practices, microservices architecture, and software testing tools.

Key Responsibilities:
- Design and develop scalable and modular automation frameworks using industry best practices such as the Page Object Model.
- Automate testing for distributed, highly scalable systems.
- Create and execute test scripts for GUI-based, API, and mobile applications.
- Perform end-to-end testing for APIs, ensuring thorough validation of request and response schemas, status codes, and exception handling.
- Conduct API testing using tools like Rest Assured, SOAP UI, NodeJS, and Postman, and validate data with serialization techniques (e.g., POJO classes).
- Implement and maintain BDD/TDD frameworks using tools like Cucumber, TestNG, or JUnit.
- Write and optimize SQL queries for data validation and backend testing.
- Integrate test suites into test management systems and CI/CD pipelines using tools like Maven, Gradle, and Git.
- Mentor team members and quickly adapt to new technologies and tools.
- Select and implement appropriate test automation tools and strategies based on project needs.
- Apply design patterns, modularization, and user libraries for efficient framework creation.
- Collaborate with cross-functional teams to ensure the quality and scalability of microservices and APIs.

Must-Have Skills:
- Proficiency in designing and developing automation frameworks from scratch.
- Strong programming skills in Java, Groovy, or JavaScript with a solid understanding of OOP concepts.
- Hands-on experience with at least one GUI automation tool (desktop/mobile); experience with multiple tools is an advantage.
- In-depth knowledge of API testing and microservices architecture.
- Experience with BDD and TDD methodologies and associated tools.
- Familiarity with SOAP and REST principles.
- Expertise in parsing and validating complex JSON and XML responses.
- Ability to create and manage test pipelines in CI/CD environments.

Nice-to-Have Skills:
- Experience with multiple test automation tools for GUI or mobile platforms.
- Knowledge of advanced serialization techniques and custom test harness implementation.
- Exposure to various test management tools and automation strategies.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years in software quality engineering and test automation.
- Strong analytical and problem-solving skills with attention to detail.
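The response validation this role describes (status codes, schema checks, exception handling) is usually done with Rest Assured in Java; as a minimal illustrative sketch, the same idea looks like this in stdlib Python. The function name, schema format, and sample payloads are invented for illustration.

```python
import json

def validate_response(status_code, body_json, required_fields):
    """Return a list of validation errors (an empty list means the response passed)."""
    errors = []
    if status_code != 200:
        errors.append(f"unexpected status code: {status_code}")
    try:
        body = json.loads(body_json)
    except json.JSONDecodeError as exc:
        # Malformed body: report it instead of raising, so tests can assert on it.
        return errors + [f"body is not valid JSON: {exc}"]
    for field, expected_type in required_fields.items():
        if field not in body:
            errors.append(f"missing field: {field}")
        elif not isinstance(body[field], expected_type):
            errors.append(f"wrong type for {field}")
    return errors

# Example: validate a user payload against a minimal schema.
schema = {"id": int, "name": str, "active": bool}
ok = validate_response(200, '{"id": 7, "name": "asha", "active": true}', schema)
bad = validate_response(200, '{"id": "7", "name": "asha"}', schema)
```

A real suite would layer this behind a BDD framework and run it per endpoint in CI, as the responsibilities above describe.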

Posted 1 week ago

Apply

0 years

0 Lacs

India

Remote

Title: SnapLogic Developer
Experience: 6+ years
Timings: 8:30 PM to 5:30 AM (EST timezone)
Location: Remote
Salary: Up to 1 Lakh/month (depending on experience)
*This is a freelance role, not a permanent position.

Role:
We are seeking a Senior SnapLogic Developer to lead the design, development, and maintenance of complex data integration pipelines using SnapLogic. This role will play a key part in managing all incoming and outgoing data flows across the enterprise, with a strong emphasis on EDI (X12) parsing, Salesforce integrations, and SnapLogic best practices. The ideal candidate is a technical expert who can also mentor junior developers and contribute to the evolution of our integration standards and architecture.

Key Responsibilities:
- Lead and own SnapLogic pipeline development for various enterprise integration needs.
- Design, build, and maintain scalable integration workflows involving EDI X12 formats, Salesforce Snaps, REST/SOAP APIs, and file-based transfers (SFTP, CSV, etc.).
- Parse and transform EDI documents, particularly X12 837, 835, 834, and 270/271, into target system formats such as Salesforce, databases, or flat files.
- Manage and monitor SnapLogic dataflows for production and non-production environments.
- Collaborate with business and technical teams to understand integration requirements and deliver reliable solutions.
- Lead a team of SnapLogic developers, providing technical guidance, mentorship, and code reviews.
- Document integration flows, error handling mechanisms, retry logic, and operational procedures.
- Establish and enforce SnapLogic development standards and reusable components (SnapPacks, pipelines, assets).
- Collaborate with DevOps/SecOps to ensure deployments are automated and compliant.
- Troubleshoot issues in existing integrations and optimize performance where needed.

Required Skills and Experience:
- Proven expertise in parsing and transforming EDI X12 transactions (especially 837, 835, 834, 270/271).
- Strong experience using Salesforce Snaps, including data sync between Salesforce and external systems.
- Deep understanding of SnapLogic architecture, pipeline execution patterns, error handling, and best practices.
- Experience working with REST APIs, SOAP services, OAuth, JWT, and token management in integrations.
- Knowledge of JSON, XML, XSLT, and data transformation logic.
- Strong leadership and communication skills; ability to mentor junior developers and lead a small team.
- Comfortable working in Agile environments with tools like Jira, Confluence, Git, etc.
- Experience with data privacy and security standards (HIPAA, PHI) is a plus, especially in healthcare integrations.
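For readers unfamiliar with the X12 formats named above: an X12 interchange is a flat string of segments, each starting with a segment ID followed by elements. A toy stdlib-Python parser shows the shape (the '~' segment and '*' element separators below are the common defaults; real X12 reads them from the ISA envelope, and the sample segments are illustrative, not a valid 835).

```python
def parse_x12(raw, segment_sep="~", element_sep="*"):
    """Split a raw X12 string into a list of (segment_id, elements) tuples."""
    segments = []
    for chunk in raw.strip().split(segment_sep):
        chunk = chunk.strip()
        if not chunk:
            continue  # trailing separator leaves an empty chunk
        parts = chunk.split(element_sep)
        segments.append((parts[0], parts[1:]))
    return segments

# A fragment in the style of an 835 remittance: transaction set header,
# payment order, transaction set trailer.
sample = "ST*835*0001~BPR*I*1250.00*C*ACH~SE*3*0001~"
parsed = parse_x12(sample)
```

In SnapLogic this splitting/mapping would be done by pipeline Snaps rather than hand-written code; the sketch only illustrates the data model being transformed.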

Posted 1 week ago

Apply

0 years

0 Lacs

Kozhikode, Kerala, India

On-site

Pfactorial Technologies is a fast-growing AI/ML/NLP company at the forefront of innovation in Generative AI, voice technology, and intelligent automation. We specialize in building next-gen solutions using LLMs, agent frameworks, and custom ML pipelines. Join our dynamic team to work on real-world challenges and shape the future of AI-driven systems and smart automation.

We are looking for an AI/ML Engineer – LLMs, Voice Agents & Workflow Automation (0–3 years experience):
- Experience with LLM integration pipelines (OpenAI, Vertex AI, Hugging Face models)
- Hands-on experience working with voice agents, TTS, STT, caching mechanisms, and ElevenLabs voice technology
- Strong understanding of vector databases like Qdrant or Milvus
- Hands-on experience with LangChain, LlamaIndex, or agent frameworks (e.g., AutoGen, CrewAI)
- Knowledge of FastAPI, Celery, and orchestration of ML/AI services
- Familiarity with cloud deployment on GCP, AWS, or Azure
- Ability to build and fine-tune matching, ranking, or retrieval-based models
- Developing agentic workflows for automation
- Implementing NLP pipelines for parsing, summarizing, and communication (e.g., email bots, script generators)
- Comfortable working with graph-based data representation and integrating with the frontend
- Experience in multi-agent collaboration frameworks like Google Agent2Agent
- Practical experience in data scraping and enrichment for ML training datasets
- Understanding of compliance in AI applications

👉 For more updates, follow us on our LinkedIn page! https://in.linkedin.com/company/pfactorial
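The "matching, ranking, or retrieval-based models" the listing mentions reduce, at their core, to scoring a query vector against document vectors. A minimal stdlib sketch, with hand-made vectors standing in for real embeddings from a model plus a vector store such as Qdrant or Milvus:

```python
import math

def cosine(a, b):
    """Cosine similarity of two equal-length vectors (0.0 for a zero vector)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def rank(query_vec, docs):
    """docs: {doc_id: vector}. Returns doc ids sorted by similarity, best first."""
    return sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)

# Toy 3-dimensional "embeddings"; doc ids are invented for illustration.
docs = {"faq": [0.9, 0.1, 0.0], "pricing": [0.1, 0.9, 0.2], "api": [0.8, 0.2, 0.1]}
order = rank([1.0, 0.1, 0.0], docs)
```

A production retrieval pipeline delegates exactly this scoring to the vector database's approximate-nearest-neighbor index instead of a Python loop.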

Posted 1 week ago

Apply

5.0 years

0 Lacs

New Delhi, Delhi, India

Remote

Location: Remote (India-based preferred)
Type: Full-time | Founding Team | High Equity
Company: Flickd (www.flickd.in)

About the Role
We’re building India’s most advanced virtual try-on engine — think Doji meets TryOnDiffusion, but optimized for real-world speed, fashion, and body diversity. As our ML Engineer (Computer Vision + Try-On), you’ll own the end-to-end pipeline: from preprocessing user/product images to generating hyper-realistic try-on results with preserved pose, skin, texture, and identity. You’ll have full autonomy to build, experiment, and ship — working directly with React, Spring Boot, DevOps, and design folks already in place. This is not a junior researcher role. This is one person building the brain of the system, and setting the foundation for India's biggest visual shopping innovation.

What You’ll Build
Stage 1: User Image Preprocessing
- Human parsing (face, body, hair), pose detection, face/limb alignment
- Auto orientation, canvas resizing, brightness/contrast normalization
Stage 2: Product Image Processing
- Background removal, garment segmentation (SAM/U^2-Net/YOLOv8)
- Handling occlusions, transparent clothes, long sleeves, etc.
Stage 3: Try-On Engine
- Implement and iterate on CP-VTON / TryOnDiffusion / FlowNet
- Fine-tune on custom data for realism, garment drape, identity retention
Inference Optimisation
- TorchScript / ONNX, batching, inference latency minimization
- Collaborate with DevOps for Lambda/EC2 + GPU deployment
Postprocessing
- Alpha blending, edge smoothing, fake shadows, cloth-body warps

You’re a Fit If You:
- Have 2–5 years in ML/CV with real shipped work (not just notebooks)
- Have worked on human parsing, pose estimation, cloth warping, GANs
- Are hands-on with PyTorch, OpenCV, segmentation models, Flow or ViT
- Can replicate models from arXiv fast, and care about output quality
- Want to own a system seen by millions, not just improve metrics

Stack You’ll Use
- PyTorch, ONNX, TorchScript, Hugging Face
- DensePose, OpenPose, Segment Anything, Diffusion Models
- Docker, Redis, AWS Lambda, S3 (infra is already set up)
- MLflow or DVC (can be implemented from scratch)

For exceptional talent, we’re flexible on the cash vs equity split.

Why This Is a Rare Opportunity
- Build the core AI product that powers a breakout consumer app
- Work in a zero-BS, full-speed team (React, Spring Boot, DevOps, Design all in place)
- Be the founding ML brain and shape all future hires
- Ship in weeks, not quarters — and see your output in front of users instantly

Apply now, or DM Dheekshith (Founder) on LinkedIn with your GitHub or project links. Let’s build something India’s never seen before.
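To make the Stage 1 "brightness/contrast normalization" step concrete: the simplest form is a min-max stretch of grayscale values to the full 0-255 range. A pure-Python toy on a nested list (a real pipeline would use OpenCV/NumPy on actual image arrays):

```python
def normalize_brightness(pixels):
    """pixels: list of rows of 0-255 ints. Returns a min-max rescaled copy."""
    flat = [p for row in pixels for p in row]
    lo, hi = min(flat), max(flat)
    if hi == lo:
        return [row[:] for row in pixels]  # flat image: nothing to stretch
    return [[round((p - lo) * 255 / (hi - lo)) for p in row] for row in pixels]

dim = [[50, 60], [70, 80]]  # a low-contrast 2x2 "image"
out = normalize_brightness(dim)
```

This linear stretch is only one normalization choice; histogram equalization or CLAHE are common alternatives when lighting varies across user photos.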

Posted 1 week ago

Apply

3.0 years

0 Lacs

India

Remote

Job Title: AI Engineer – Web Crawling & Field Data Extraction
Location: Remote
Department: Engineering / Data Science
Experience Level: Mid to Senior
Employment Type: Contract to Hire

About the Role:
We are looking for a skilled AI Engineer with strong experience in web crawling, data parsing, and AI/ML-driven information extraction to join our team. You will be responsible for developing systems that automatically crawl websites, extract structured and unstructured data, and intelligently map the extracted content to predefined fields for business use. This role combines practical web scraping, NLP techniques, and AI model integration to automate workflows that involve large-scale content ingestion.

Key Responsibilities:
- Design and develop automated web crawlers and scrapers to extract information from various websites and online resources.
- Implement robust and scalable data extraction pipelines that convert semi-structured/unstructured data into structured field-level data.
- Use Natural Language Processing (NLP) and ML models to intelligently interpret and map extracted content to specific form fields or schemas.
- Build systems that can handle dynamic web content, CAPTCHAs, JavaScript-rendered pages, and anti-bot mechanisms.
- Collaborate with frontend/backend teams to integrate extracted data into user-facing applications.
- Monitor crawler performance, ensure compliance with legal/data policies, and manage scheduling, deduplication, and logging.
- Optimize crawling strategies using AI/heuristics for prioritization, entity recognition, and data validation.
- Create tools for auto-filling forms or generating structured records from crawled data.

Required Skills and Qualifications:
- Bachelor’s or Master’s degree in Computer Science, AI/ML, Data Science, or a related field.
- 3+ years of hands-on experience with web scraping frameworks (e.g., Scrapy, Puppeteer, Playwright, Selenium).
- Proficiency in Python, with experience in BeautifulSoup, lxml, requests, aiohttp, or similar libraries.
- Experience with NLP libraries (e.g., spaCy, NLTK, Hugging Face Transformers) to parse and map extracted data.
- Familiarity with ML-based data classification, extraction, and field mapping.
- Knowledge of structured data formats (JSON, XML, CSV) and RESTful APIs.
- Experience handling anti-scraping techniques and rate-limiting controls.
- Strong problem-solving skills, clean coding practices, and the ability to work independently.

Nice-to-Have:
- Experience with AI form understanding (e.g., LayoutLM, DocAI, OCR).
- Familiarity with Large Language Models (LLMs) for intelligent data labeling or validation.
- Exposure to data pipelines, ETL frameworks, or orchestration tools (Airflow, Prefect).
- Understanding of data privacy, compliance, and ethical crawling standards.

Why Join Us?
- Work on cutting-edge AI applications in real-world automation.
- Be part of a fast-growing and collaborative team.
- Opportunity to lead and shape intelligent data ingestion solutions from the ground up.
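The core move the role describes, turning unstructured markup into structured field-level data, can be sketched with nothing but the stdlib `html.parser`. The class and sample markup below are invented for illustration; real crawlers in the stack named above (Scrapy, BeautifulSoup, Playwright) handle rendering, encodings, and malformed HTML on top of this.

```python
from html.parser import HTMLParser

class FieldExtractor(HTMLParser):
    """Pull the page title and all link targets out of raw HTML."""

    def __init__(self):
        super().__init__()
        self.links = []
        self.title = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)
        elif tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title = data.strip()

html = ('<html><head><title>Jobs</title></head>'
        '<body><a href="/a">A</a><a href="/b">B</a></body></html>')
ex = FieldExtractor()
ex.feed(html)
```

Mapping `title`/`links` onto a predefined schema record is then ordinary dictionary work; the NLP/ML layer mentioned in the listing comes in when fields are not marked up this cleanly.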

Posted 1 week ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Life at MX
We are driven by our moral imperative to advance mankind - and it all starts with our people, product, and purpose. We always carry a deep sense of drive and passion with us. If you thrive in a challenging work environment, surrounded by incredible team members who will help you grow, MX is the right place for you. Come build with us and be part of an award-winning company that’s helping create meaningful and lasting change in the financial industry.

We’re looking for a highly skilled engineer who thrives at the intersection of automation, AI, and web data extraction. You will be responsible for building advanced web scraping systems, designing evasion strategies to bypass anti-bot mechanisms, and integrating intelligent data extraction techniques. This role requires strong expertise in TypeScript, Puppeteer (or Playwright), and modern scraping architectures, along with a practical understanding of bot detection mechanisms and machine learning for smarter data acquisition.

Key Responsibilities
- Design and maintain scalable web scraping pipelines using Puppeteer, Playwright, or headless browsers
- Implement evasion techniques to bypass bot detection systems (e.g., fingerprint spoofing, dynamic delays, proxy rotation)
- Leverage AI/ML models for intelligent parsing, CAPTCHA solving, and anomaly detection
- Handle large-scale data collection with distributed scraping infrastructure
- Monitor scraping performance, detect bans, and auto-recover from failure states
- Build structured outputs (e.g., JSON, GraphQL feeds) from semi-structured/unstructured sources
- Collaborate with product and data science teams to shape high-quality, reliable data inputs
- Ensure compliance with legal and ethical scraping practices

Required Skills & Experience
- 4+ years of experience building and scaling web scraping tools
- Strong proficiency in TypeScript and Node.js
- Hands-on with Puppeteer, Playwright, or Selenium for browser automation
- Deep understanding of how bot detection systems work (e.g., Cloudflare, Akamai, hCaptcha)
- Experience with proxy management, user-agent spoofing, and fingerprint manipulation
- Familiarity with CAPTCHA-solving libraries/APIs, ML-based screen parsing, and OCR
- Working knowledge of AI/ML for parsing or automation (e.g., Tesseract, TensorFlow, OpenAI APIs)
- Comfortable working with large-scale data pipelines, queues (e.g., Kafka, RabbitMQ), and headless fleet management

Additional Skills
- Experience with cloud infrastructure (AWS/GCP) for scalable scraping jobs
- CI/CD and containerization (Docker, Kubernetes) for deployment
- Knowledge of ethical and legal considerations around data scraping
- Contributions to open-source scraping frameworks or tools

Work Environment
In this role, a significant aspect of the job involves working in the office for a standard 40-hour workweek. We believe that the collaborative nature of our work and the face-to-face interactions among team members are essential for fostering a dynamic and productive work environment. Being present in the office enables seamless communication, facilitates quick decision-making, and encourages spontaneous collaboration that contributes to the overall success of our projects. We value the synergy that comes from having our team members physically together, allowing for immediate problem-solving, idea exchange, and team building.

Compensation
The expected earnings for this role could be comprised of a base salary and other forms of cash compensation, such as bonuses or commissions as applicable. This pay range is just one component of MX’s total rewards package. MX takes a number of factors into account when determining individual starting pay, including the job and level the candidate is hired into, location, skillset, and peer compensation.

Please note that applicants applying for this position must have the legal right to work in India without the need for sponsorship. We are unable to provide work sponsorship for this role, and candidates should be able to verify their eligibility to work in the country independently. Proof of eligibility to work in India will be required as part of the hiring process.
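The rotation techniques this listing names (user-agent rotation, proxy rotation, dynamic delays) amount to varying the request profile per request. A minimal stdlib sketch, with made-up user-agent strings and proxy hostnames; the role itself would express this in TypeScript around Puppeteer/Playwright:

```python
import itertools
import random

# Illustrative values only; real pools are much larger and refreshed often.
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)",
    "Mozilla/5.0 (X11; Linux x86_64)",
]
PROXIES = ["http://proxy-1:8080", "http://proxy-2:8080"]

proxy_cycle = itertools.cycle(PROXIES)  # round-robin over the proxy pool

def next_request_profile(rng=random):
    """Return headers, proxy, and a jittered delay for the next request."""
    return {
        "headers": {"User-Agent": rng.choice(USER_AGENTS)},
        "proxy": next(proxy_cycle),
        "delay_s": round(rng.uniform(1.0, 4.0), 2),  # dynamic delay
    }

profile = next_request_profile()
```

Modern detectors fingerprint far more than headers (TLS, canvas, timing), so this is only the outermost layer of the evasion stack the posting describes.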

Posted 1 week ago

Apply

0 years

2 - 3 Lacs

Ahmedabad, Gujarat, India

On-site

Company Profile
Nextgen is a UK-based company that provides services for mobile operators worldwide. We are a growing company with 300+ employees and offices in Europe, Asia, India, Cairo, and the US. Our core competency is the provision of services around the commercial aspects of mobile roaming, data, and financial clearing. Our services are based on proprietary software and operated centrally. The software is based on web and Oracle technology, and its main purpose is the processing and distribution of roaming data, settlement of charges between operators, and providing business intelligence applications to our customers.

Role Purpose & Context
Accounts Assistants in the Receivable Management Team must ensure that all GSM and SMS invoices are generated within the invoice-generation deadline set by the operations calendar. Team members will allocate all bank receipts within 24 hours of receipt loading.

Responsibilities

Invoice Generation & Dispatch
- Sanity-check GSM & SMS data received from DCH/client for invoice generation.
- Load data and generate GSM & SMS invoices within the deadline.
- Check error logs and update the "All Clients Sheet" (Missing Roaming Agreement Sheet).
- Send generated invoices for client confirmation through an Issue ID for their respective clients.
- Create Hub parent positions.
- Check Payable and Receivable RAPs once data is loaded and invoices are generated.
- Cross-check MFS/SMS data against generated invoices before dispatch.
- Manually check for duplicate TAP file billing.
- Update the "Data Parsing & Invoice Generation" sheet in a timely manner during invoice generation.
- Create MRAs once received from the client.
- Regenerate invoices once RAPs are approved by the Account Manager.
- Notify the Account Manager to generate a Credit Note/Debit Note if an invoice is generated with a negative value.
- Share formatted data to the shared path for future reference.

Cash Allocation
- Allocate receipts, or take relevant action, on a daily basis within 24 hours.
- Clear the remittance queue on OTRS and share items to the relevant folders.
- Chase missing PNs every alternate day; if a PN is not received after three chases from the system and one personalized email to the partner, log an issue with the relevant Account Manager.
- Take confirmation from the AM, through the Issue Log, in case of FX loss/gain or any other PN-related queries that the back office is not authorized to act on.
- Chase missing invoices required for payment allocation from the APEX or Operations mailbox.
- Provide and reply to emails with missing-invoice requests.
- Send requested payment notifications to the partner/FCH.
- Chase and follow up on missing invoices for our customers.

Requirements
- Bachelor's degree in Business, Accounts, or a related field preferred
- Strong communication and relationship-building skills
- Experience in invoice reconciliation
- Ability to work in a fast-paced, dynamic environment with a focus on results
- Excellent analytical skills and attention to detail
- Proficient in Microsoft Office and CRM software
- Strong organizational skills
- Ability to harness financial data to inform decisions

Benefits
- Health insurance
- Provident Fund, gratuity
- 5-day work week (Monday-Friday)
- Quarterly employee engagement activities

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Software Engineer - Content Parsing

The Opportunity
We're looking for a talented and detail-oriented Software Engineer - Content Parsing to join our dynamic team. In this role, you'll be crucial in extracting, categorizing, and structuring vast amounts of content from various web sources. You'll leverage your expertise in Python and related parsing technologies to build robust, scalable, and highly accurate parsing solutions. This position requires a strong focus on data quality, comprehensive testing, and the ability to implement effective alerting and notification systems.

Responsibilities:
- Design, develop, and maintain robust and scalable HTML parsing solutions to extract diverse web content.
- Implement advanced content categorization logic to accurately classify and tag extracted data based on predefined schemas and business rules, incorporating AI/ML techniques where applicable.
- Develop and integrate alerting and notification systems to monitor parsing performance, identify anomalies, and report on data quality issues.
- Write comprehensive unit, integration, and end-to-end test cases to ensure the accuracy, reliability, and robustness of parsing logic, covering all boundary conditions and edge cases.
- Optimize parsing performance and efficiency to handle large volumes of data.
- Troubleshoot and resolve parsing issues, adapting to changes in website structures and content formats.
- Contribute to the continuous improvement of our parsing infrastructure and methodologies, including the research and adoption of new AI-driven parsing techniques.
- Manage and deploy parsing solutions in a Linux environment.
- Collaborate with DevOps engineers to improve the scaling, deployment, and operational efficiency of parsing solutions.
- This role requires occasional weekend work, as content changes are typically deployed on weekends, necessitating monitoring and immediate adjustments.

Qualifications:
- Bachelor's degree in Computer Science or a closely related technical field.
- Experience in software development with a strong focus on data extraction and parsing.
- Proficiency in Python and its ecosystem, particularly with libraries for web scraping and parsing (e.g., Beautiful Soup, lxml, Scrapy, Playwright, Selenium).
- Demonstrated experience in parsing complex and unstructured HTML content into structured data formats.
- Understanding and practical experience with content categorization techniques (e.g., keyword extraction, rule-based classification, basic NLP concepts).
- Proven ability to design and implement effective alerting and notification systems (e.g., integrating with Slack, PagerDuty, email, custom dashboards).
- Attention to detail and strong unit testing skills, with a meticulous approach to covering all boundary conditions, error cases, and edge scenarios.
- Experience working in a Linux environment, including shell scripting and command-line tools.
- Familiarity with data storage solutions (e.g., SQL databases) and data serialization formats (e.g., JSON, XML).
- Experience with version control systems (e.g., Git).
- Excellent problem-solving skills.
- Strong communication and collaboration abilities.
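The "rule-based classification" named in the qualifications is the simplest form of the categorization logic this role builds: tag extracted text by keyword rules against a predefined schema. The categories and keywords below are invented for illustration; a production system would layer NLP/ML on top of rules like these.

```python
# Category -> trigger keywords (all lowercase; matched as substrings).
RULES = {
    "pricing": ["price", "subscription", "per month"],
    "careers": ["hiring", "job", "apply"],
    "support": ["help", "faq", "contact"],
}

def categorize(text):
    """Return the sorted list of category tags whose keywords appear in text."""
    lowered = text.lower()
    return sorted(cat for cat, kws in RULES.items()
                  if any(kw in lowered for kw in kws))

tags = categorize("We are hiring! Apply now, or contact our help desk.")
```

Substring matching is deliberately crude (it tags "helping" as "help"); tokenizing or using word boundaries is the usual first refinement, and exactly the kind of edge case the unit-testing emphasis above is about.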

Posted 1 week ago

Apply

0.0 - 2.0 years

0 - 0 Lacs

Mohali, Punjab

On-site

Job Description: Flutter Developer
Job Location: Mohali
Experience: 1-2 years

Mobile App Development
- Build responsive and scalable cross-platform mobile apps using Flutter (iOS & Android).
- Convert UI/UX designs into functional mobile app components.
- Use Flutter widgets effectively to craft clean and reusable code.
API Integration
- Consume RESTful APIs and WebSockets to connect with backend services.
- Handle data parsing (JSON) and error handling gracefully.
Performance Optimization
- Optimize application performance, responsiveness, and speed.
- Use tools like Flutter DevTools for debugging and profiling.
Testing & Debugging
- Write unit, widget, and integration tests.
- Debug and resolve technical issues.
App Store Deployment
- Prepare and publish apps to the Apple App Store and Google Play Store.
- Handle app versioning, code signing, and platform-specific build issues.
Cross-functional Responsibilities
- Knowledge of backend skills (Node.js, PHP) is a plus.
- Collaborate with designers, product managers, and QA engineers.
- Review code (pull requests), suggest improvements, and mentor junior devs if needed.
- Experience with Git and version control workflows.
- Knowledge of containerization (Docker) is a plus.
- Ability to troubleshoot both frontend and backend bugs.

For further queries, call/WhatsApp 7743059799. #FlutterDeveloper #iOS #Android #NodeJS #PHP #MobileAppDevelopment #APIIntegration

Job Type: Full-time
Pay: ₹20,000.00 - ₹30,000.00 per month
Schedule: Day shift
Application Questions:
- How many years of experience do you have in a Flutter role?
- Do you have experience in mobile app development?
- Do you have experience in API integration?
Location: Mohali, Punjab (Required)
Work Location: In person
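The "handle data parsing (JSON) and error handling gracefully" responsibility is a language-agnostic pattern: parse defensively and fall back to a default model rather than crashing the UI. Sketched here in stdlib Python for brevity (a Flutter app would express the same idea in Dart with `jsonDecode` and a try/catch); the user model and defaults are invented for illustration.

```python
import json

DEFAULT_USER = {"id": 0, "name": "unknown"}

def parse_user(payload):
    """Parse a user JSON payload, falling back to defaults on any bad input."""
    try:
        data = json.loads(payload)
    except (json.JSONDecodeError, TypeError):
        return dict(DEFAULT_USER)  # graceful fallback: never raise to the UI
    if not isinstance(data, dict):
        return dict(DEFAULT_USER)
    # Missing fields also fall back per-field instead of failing whole-payload.
    return {"id": data.get("id", 0), "name": data.get("name", "unknown")}

good = parse_user('{"id": 3, "name": "Ravi"}')
bad = parse_user("not json")
```

The per-field `get` defaults mirror what Dart model classes typically do in a `fromJson` factory constructor.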

Posted 1 week ago

Apply

0 years

0 Lacs

Delhi, India

On-site

This is a test job post created for the purpose of evaluating and improving our internal recruitment system through Zoho Recruit. It allows the HR and marketing team to simulate the complete hiring journey, including job posting visibility, candidate application tracking, pipeline management, and automation of communications via WhatsApp, email, or integrated tools like Pabbly. This post is not intended for real hiring purposes.

Through this test, we aim to verify system performance, data accuracy, resume parsing, automated responses, and cross-platform syncing. Team members may use this listing to submit dummy applications, upload trial resumes, and check whether each automation trigger works as intended. It also helps in assessing interview scheduling features and lead management. All actions and feedback from this test will guide us in setting up a seamless experience for real candidates in the future. Please do not treat this listing as a real job opportunity.

Requirements
- No real qualifications required – this is a dummy listing
- Should help test resume upload, forms, or auto-tagging
- May be used by the internal team only (HR / Tech / Marketing)

Benefits
- Enables smooth hiring operations for real job roles
- Verifies Zoho Recruit + Pabbly integrations
- Improves the candidate journey through testing
- 100% safe for trial runs – no real candidates will be processed

Posted 1 week ago

Apply

0 years

0 Lacs

India

Remote

📍 Location: India (Remote)
💼 Equity Only
🧠 Effort = Reward

We’re building something bold at the edge of Web3: a decentralized crypto copy trading and ghost trading platform that helps new users simulate and learn before they trade. Think real-time DEX insights, AI-generated trading signals, and structured liquidity analytics, all feeding into a clean, gamified trading experience. We’re assembling a founding tech crew to bring this to life.

🔧 WHO WE’RE LOOKING FOR:
- A Solana-native builder who knows how to work with Solana RPCs, data parsing, and on-chain indexing
- Can create or evolve a DEX analytics engine specializing in structured liquidity data (multiple pools, LP behavior, routing paths, etc.)
- Comfortable scraping, mining, and transforming on-chain data from top Solana DEXs (Orca, Raydium, Phoenix, etc.)
- Can think beyond dashboards and imagine real-time copy trading mechanics, ghost portfolio simulations, and custom trade signal feeds
- Must be India-based, able to work independently, and committed to confidentiality
- Bonus: prior experience in crypto simulations, DeFi UI/UX, or AI modeling

💰 COMPENSATION
This is a pure equity opportunity, not a salary role. We're looking for someone who wants skin in the game, someone who sees the upside and is ready to build alongside us on a "reward equals effort" basis.

🧩 THE STACK
- Solana RPC, Web3.js, Serum/Raydium APIs
- Node.js / TypeScript / Rust (optional)
- GraphQL indexing, TimescaleDB, The Graph (Solana equivalent)
- GitHub, Discord, Notion – async-first collaboration

⚡ WHAT YOU GET
- Equity in a platform that will onboard the next 10M crypto users
- A seat at the table to shape product direction, tokenomics, and architecture
- Built-in runway via Paywaz.com LLC and upcoming integrations
- Equity stake tied to milestone contributions

🧠 If you're a builder who thinks in models, sees beyond APIs, and lives to decode on-chain behavior, we want to hear from you.
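Working "with Solana RPCs" as the post describes means speaking JSON-RPC 2.0 to a node. A minimal stdlib sketch that only builds the payload (no network call): `getBalance` and `getAccountInfo` are standard Solana RPC methods, and the address below is a placeholder, not a real account of interest.

```python
import json

def rpc_payload(method, params, request_id=1):
    """Build a JSON-RPC 2.0 request body for a Solana node."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": method,
        "params": params,
    })

# Placeholder base58 address (the all-ones system program id).
payload = rpc_payload("getBalance", ["11111111111111111111111111111111"])
decoded = json.loads(payload)
```

In practice this payload is POSTed to an RPC endpoint via an HTTP client or wrapped by Web3.js; indexing pipelines batch thousands of such calls and cache the decoded results.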

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

India

Remote

About Us: YipitData is the leading market research and analytics firm for the disruptive economy and most recently raised $475M from The Carlyle Group at a valuation of over $1B. Every day, our proprietary technology analyzes billions of alternative data points to uncover actionable insights across sectors like software, AI, cloud, e-commerce, ridesharing, and payments. Our data and research teams transform raw data into strategic intelligence, delivering accurate, timely, and deeply contextualized analysis that our customers—ranging from the world’s top investment funds to Fortune 500 companies—depend on to drive high-stakes decisions. From sourcing and licensing novel datasets to rigorous analysis and expert narrative framing, our teams ensure clients get not just data, but clarity and confidence. We operate globally with offices in the US (NYC, Austin, Miami, Mountain View), APAC (Hong Kong, Shanghai, Beijing, Guangzhou, Singapore), and India. Our award-winning, people-centric culture—recognized by Inc. as a Best Workplace for three consecutive years—emphasizes transparency, ownership, and continuous mastery. What It’s Like to Work at YipitData: YipitData isn’t a place for coasting—it’s a launchpad for ambitious, impact-driven professionals. From day one, you’ll take the lead on meaningful work, accelerate your growth, and gain exposure that shapes careers. Why Top Talent Chooses YipitData: Ownership That Matters: You’ll lead high-impact projects with real business outcomes Rapid Growth: We compress years of learning into months Merit Over Titles: Trust and responsibility are earned through execution, not tenure Velocity with Purpose: We move fast, support each other, and aim high—always with purpose and intention If your ambition is matched by your work ethic—and you're hungry for a place where growth, impact, and ownership are the norm—YipitData might be the opportunity you’ve been waiting for. 
About The Role: We are seeking a Web Scraping Engineer to join our growing engineering team. In this hands-on role, you’ll take ownership of designing, building, and maintaining robust web scrapers that power critical reports and customer experiences across our organization. You will work on complex, high-impact scraping challenges and collaborate closely with cross-functional teams to ensure our data ingestion processes are resilient, efficient, and scalable, while delivering high-quality data to our products and stakeholders. As Our Web Scraping Engineer You Will: Refactor and Maintain Web Scrapers Overhaul existing scraping scripts to improve reliability, maintainability, and efficiency. Implement best coding practices (clean code, modular architecture, code reviews, etc.) to ensure quality and sustainability. Implement Advanced Scraping Techniques Utilize sophisticated fingerprinting methods (cookies, headers, user-agent rotation, proxies) to avoid detection and blocking. Handle dynamic content, navigate complex DOM structures, and manage session/cookie lifecycles effectively. Collaborate with Cross-Functional Teams Work closely with analysts and other stakeholders to gather requirements, align on targets, and ensure data quality. Provide support, documentation, and best practices to internal stakeholders to ensure effective use of our web scraped data in critical reporting workflows. Monitor and Troubleshoot Develop robust monitoring solutions, alerting frameworks to quickly identify and address failures. Continuously evaluate scraper performance, proactively diagnosing bottlenecks and scaling issues. Drive Continuous Improvement Propose new tooling, methodologies, and technologies to enhance our scraping capabilities and processes. Stay up to date with industry trends, evolving bot-detection tactics, and novel approaches to web data extraction. This is a fully-remote opportunity based in India. 
Standard work hours are from 11am to 8pm IST, but there is flexibility here.

You Are Likely To Succeed If:
- You communicate effectively in English with both technical and non-technical stakeholders.
- You have a track record of mentoring engineers and managing performance in a fast-paced environment.
- You have 3+ years of experience with web scraping frameworks (e.g., Selenium, Playwright, or Puppeteer).
- You have a strong understanding of HTTP, RESTful APIs, HTML parsing, browser rendering, and TLS/SSL mechanics.
- You have expertise in advanced fingerprinting and evasion strategies (e.g., browser fingerprint spoofing, request signature manipulation).
- You have deep experience managing cookies, headers, session states, and proxy rotations, including the deployment of both residential and data center proxies.
- You have experience with logging, metrics, and alerting to ensure high availability.
- You have the troubleshooting skills to optimize scraper performance for efficiency, reliability, and scalability.

What We Offer:
Our compensation package includes comprehensive benefits, perks, and a competitive salary.
- We care about your personal life, and we mean it. We offer flexible work hours, flexible vacation, a generous 401K match, parental leave, team events, a wellness budget, learning reimbursement, and more!
- Your growth at YipitData is determined by the impact that you are making, not by tenure, unnecessary facetime, or office politics. Everyone at YipitData is empowered to learn, self-improve, and master their skills in an environment focused on ownership, respect, and trust. See more on our high-impact, high-opportunity work environment above!

We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, marital status, disability, gender, gender identity or expression, or veteran status. We are proud to be an equal-opportunity employer.

Job Applicant Privacy Notice
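The rotation techniques this role describes (user-agent rotation, residential/data center proxy pools) can be sketched in a few lines of Python. This is a minimal illustration only, not YipitData's actual stack; the pools and the `next_request_profile` helper are hypothetical names invented here.

```python
import itertools

# Hypothetical pools: a production scraper would load these from config
# and use far larger, regularly refreshed lists.
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:128.0) Gecko/20100101 Firefox/128.0",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36",
]
PROXIES = ["http://residential-pool-1:8080", "http://datacenter-pool-1:8080"]

_ua_cycle = itertools.cycle(USER_AGENTS)
_proxy_cycle = itertools.cycle(PROXIES)

def next_request_profile():
    """Rotate both pools and return keyword arguments for an HTTP client."""
    proxy = next(_proxy_cycle)
    return {
        "headers": {"User-Agent": next(_ua_cycle)},
        "proxies": {"http": proxy, "https": proxy},
    }
```

Each call yields a different fingerprint, e.g. `requests.get(url, **next_request_profile())`; real evasion work layers cookies, TLS fingerprints, and timing jitter on top of this.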

Posted 2 weeks ago

Apply


0 years

0 Lacs

India

Remote

Your Next Challenge: Reinvent Healthcare with Code & AI

We believe healthcare deserves better — better efficiency, better tech, and better outcomes. And we believe that happens when brilliant minds like yours meet bold problems like ours. At Jorie AI, we’re building automation and AI solutions that transform the painful inefficiencies of Revenue Cycle Management (RCM) into seamless, intelligent workflows. We already help hospitals and providers save millions, and now we’re ready to take it to the next level.

We need a doer, dreamer, and builder who thrives at the intersection of:
🧠 Artificial Intelligence
💻 Python Automation
🩺 Healthcare RCM

What You’ll Actually Do
- Break down messy RCM processes, then rebuild them with smart, scalable automation.
- Code in Python like it’s second nature — building bots, APIs, and backends to make healthcare run smarter.
- Apply AI/ML to solve real-world problems like intelligent coding, claim denials prediction, document parsing, and AR prioritization.
- Work with a cross-functional team of product managers, engineers, and healthcare domain experts who geek out just as much as you do.
- Stay ahead of the curve — you’ll have the freedom to innovate, experiment, and iterate.

Skills We Are Looking For
✨ Someone who loves to solve hard problems with smart code.
✨ Fluent in Python (not just “comfortable”).
✨ Knows healthcare RCM processes inside out (or at least has solid experience working in RCM environments).
✨ Has played with AI/ML tools, APIs, or models — and knows how to make them work in production.
✨ Curious. Relentless. Collaborative. You ship fast and learn faster.

What’s In It for You?
🎯 Work that actually matters — fixing a broken healthcare system, one automated workflow at a time.
🎯 Remote-first, flexible work culture.
🎯 A fast-moving team where your ideas shape the product.
🎯 A front-row seat in the healthtech revolution.
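As a flavor of the document-parsing work the role mentions, here is a minimal sketch of pulling claim fields out of plain text. The remittance format, field names, and `parse_claims` helper are all invented for illustration and bear no relation to Jorie AI's actual pipelines.

```python
import re

# Invented plain-text remittance snippet, for illustration only.
SAMPLE_REMIT = """\
Claim: 12345 Status: DENIED Reason: CO-97
Claim: 67890 Status: PAID Amount: 412.50
"""

CLAIM_RE = re.compile(r"Claim:\s*(?P<claim_id>\d+)\s+Status:\s*(?P<status>\w+)")

def parse_claims(text):
    """Extract claim_id/status pairs; denied claims feed downstream AR work."""
    return [m.groupdict() for m in CLAIM_RE.finditer(text)]

denied = [c for c in parse_claims(SAMPLE_REMIT) if c["status"] == "DENIED"]
```

In practice this kind of extraction sits behind OCR and far messier formats, but the shape of the workflow (parse, validate, route denials) is the same.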

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

India

On-site

About Us
Adfinity Global Solutions is a technology-driven company focused on delivering effective digital display solutions. We design and deploy outdoor, indoor and transparent displays that help brands reach people with clarity and purpose. We are now expanding into the entertainment space, guided by the same principles that define our work. Every solution we build is meant to serve real needs, create real engagement and reflect the trust our clients place in us. As we grow, we are looking for individuals who share this mindset and are ready to contribute meaningfully to what we are building. Visit www.adfinityglobal.com for more details.

Role Overview
We are looking for a Flutter developer who knows their way around Dart and cares about writing clean, reliable code. You should be comfortable working on real production apps, handling state, structuring things well and making sure everything runs smoothly on both Android and iOS. You will be working closely with the design, product and backend teams to bring ideas to life and make sure the app feels right in the hands of our users.

Key Responsibilities
- Build and maintain scalable, modular Flutter applications for both Android and iOS.
- Work with Riverpod (including riverpod_generator) and Freezed to implement clean, immutable, reactive architecture.
- Integrate with RESTful APIs using robust error handling and state management practices.
- Implement custom UI/UX, animations, and transitions based on design mocks.
- Optimize app performance using profiling tools and asynchronous programming best practices.
- Use tools like build_runner, linters, and custom annotations to maintain clean code and architecture.
- Work closely with backend and design teams to ensure accurate data flow and UI/UX precision.
- Debug platform-specific issues on iOS and Android and ensure smooth deployment pipelines.
- Integrate third-party packages (e.g., cached_network_image, flutter_html) and native SDKs as needed.
- Maintain and improve collaborative workflows with Git and code reviews.

Required Skills and Experience
- 3+ years of professional Flutter development experience.
- Strong expertise in Dart and the Flutter SDK.
- Hands-on experience with Riverpod for state management and code generation using riverpod_generator.
- Experience with Freezed for building immutable models.
- Proficiency in REST API integration, JSON parsing, and structured error handling.
- Familiarity with dependency injection, modular code structure, and clean architecture principles.
- Understanding of Flutter performance profiling tools and optimization techniques.
- Prior work with custom UI design, animations, and third-party libraries.
- Solid grasp of Git and collaborative version control practices.
- Experience with build_runner, annotations, linter rules, and project structuring.

Bonus Points For
- Experience with Firebase services such as Analytics, Crashlytics, and Messaging.
- Hands-on experience with app store deployment, platform-specific debugging, and resolving Android/iOS build issues.
- Contributions to open source or Flutter community plugins.
- A good eye for design, transitions, and micro-interactions.

What We Offer
- An opportunity to be part of a growing platform in the entertainment space.
- A focused team where your work is seen and matters.
- Clear ownership over features that reach real users every day.
- The space to learn, try things and grow through hands-on work.

Experience: 3+ years
Salary: Based on the candidate's experience and current CTC

To Apply
Send your resume, GitHub/portfolio/app links, and a short note about a Flutter feature or UI you’re proud of to haritha@adfinityglobal.com
Subject: Flutter Frontend Engineer – Application

Job Types: Full-time, Permanent
Benefits: Health insurance
Work Location: In person

Posted 2 weeks ago

Apply

4.0 - 5.0 years

6 - 8 Lacs

Gurgaon

On-site

Project description
We are looking for a skilled Document AI / NLP Engineer to develop intelligent systems that extract meaningful data from documents such as PDFs, scanned images, and forms. In this role, you will build document processing pipelines using OCR and NLP technologies, fine-tune ML models for tasks like entity extraction and classification, and integrate those solutions into scalable cloud-based applications. You will collaborate with cross-functional teams to deliver high-performance, production-ready pipelines and stay up to date with advancements in the document understanding and machine learning space.

Responsibilities
- Design, build, and optimize document parsing pipelines using tools like Amazon Textract, Azure Form Recognizer, or Google Document AI.
- Perform data preprocessing, labeling, and annotation for training machine learning and NLP models.
- Fine-tune or train models for tasks such as Named Entity Recognition (NER), text classification, and layout understanding using PyTorch, TensorFlow, or HuggingFace Transformers.
- Integrate document intelligence capabilities into larger workflows and applications using REST APIs, microservices, and cloud components (e.g., AWS Lambda, S3, SageMaker).
- Evaluate model and OCR accuracy, applying post-processing techniques or heuristics to improve precision and recall.
- Collaborate with data engineers, DevOps, and product teams to ensure solutions are robust, scalable, and meet business KPIs.
- Monitor, debug, and continuously enhance deployed document AI solutions.
- Maintain up-to-date knowledge of industry trends in OCR, Document AI, NLP, and machine learning.

Skills
Must have
- 4-5 years of hands-on experience in machine learning, document AI, or NLP-focused roles.
- Strong expertise in OCR tools and frameworks, especially Amazon Textract, Azure Form Recognizer, Google Document AI, or open-source tools like Tesseract, LayoutLM, or PaddleOCR.
- Solid programming skills in Python and familiarity with ML/NLP libraries: scikit-learn, spaCy, transformers, PyTorch, TensorFlow, etc.
- Experience working with structured and unstructured data formats, including PDF, images, JSON, and XML.
- Hands-on experience with REST APIs, microservices, and integrating ML models into production pipelines.
- Working knowledge of cloud platforms, especially AWS (S3, Lambda, SageMaker) or their equivalents.
- Understanding of NLP techniques such as NER, text classification, and language modeling.
- Strong debugging, problem-solving, and analytical skills.
- Clear verbal and written communication skills for technical and cross-functional collaboration.

Nice to have
N/A

Other Languages
English: B2 Upper Intermediate

Seniority
Senior

Gurugram, India
Req. VR-116250
AI/ML BCM Industry
29/07/2025
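The "post-processing techniques or heuristics to improve precision and recall" mentioned above can be as simple as repairing common OCR character confusions before validating an extracted field. A minimal sketch, assuming an invented confusion table and validation regex (not from any vendor toolkit):

```python
import re

# Common OCR confusions in numeric fields (illustrative subset only).
OCR_DIGIT_FIXES = str.maketrans({"O": "0", "l": "1", "S": "5", "B": "8"})

# Currency amount like "1,284.00", with optional leading "$".
AMOUNT_RE = re.compile(r"\$?\d[\d,]*\.\d{2}")

def normalize_amount(token):
    """Repair confusions, then accept the token only if it looks like money."""
    cleaned = token.translate(OCR_DIGIT_FIXES)
    return cleaned if AMOUNT_RE.fullmatch(cleaned) else None
```

Rejecting tokens that still fail validation trades a little recall for precision, which is usually the right call for financial fields.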

Posted 2 weeks ago

Apply

4.0 years

2 - 6 Lacs

Ahmedabad

On-site

Position: Android Developer (CE48SF RM 3425)
Shift timing (if any): General Shift
Work Mode: EIC office / Hybrid
Minimum Relevant Experience: 4+ years
Education Required: Bachelor’s / Master’s / PhD; B.E. Computers or MCA is preferable
Must have: XAML for UI development, RESTful APIs, JSON/XML parsing, networking on Android, debugging and troubleshooting, mobile application lifecycle (Android), Java, Kotlin
Good to have: Bluetooth/BLE programming, Java, C, C++

Overview
We are looking for a talented and motivated Android Developer to join our innovative software development team. The ideal candidate should have a strong passion for mobile application development and a proven track record of building high-quality native Android applications. You will collaborate with cross-functional teams to design, develop, and deploy Android solutions that align with our product vision and business goals.

Key Responsibilities
- Design, develop, and maintain native Android applications using Kotlin and/or Java.
- Collaborate with product managers, designers, and fellow developers to define, design, and implement new features.
- Write clean, maintainable, and scalable code following Android development best practices.
- Optimize application performance, responsiveness, and usability.
- Participate in Agile development processes: sprint planning, daily stand-ups, retrospectives.
- Diagnose and resolve bugs, crashes, and performance issues.
- Conduct code reviews and support internal development improvements.
- Implement security and data protection practices across the app.

Required Skills & Qualifications
- Strong experience in native Android development using Kotlin and/or Java.
- Solid understanding of the Android SDK, Jetpack components, and Material Design.
- Experience working with MVVM, MVP, or Clean Architecture patterns.
- Proficiency in integrating RESTful APIs and handling JSON/XML data.
- Experience with Room, SQLite, or other local storage solutions.
- Hands-on experience publishing apps to the Google Play Store.
- Familiarity with the Android lifecycle, background processing, and threading.
- Experience with platform-specific features such as camera, GPS, sensors, and notifications.
- Strong debugging and performance tuning skills.
- Good communication and documentation abilities.
- Ability to work both independently and collaboratively in a team.

Nice to Have
- Experience with Bluetooth/BLE integration.
- Familiarity with Firebase services (Authentication, Cloud Messaging, Analytics).
- Experience working with CI/CD pipelines and tools like Fastlane or GitHub Actions.
- Exposure to Jetpack Compose and willingness to adopt it.
- Knowledge of Gradle, Proguard, and general mobile app optimization techniques.
- Understanding of Unit Testing and UI Testing using tools like JUnit, Espresso, or Mockito.
- Familiarity with UML diagrams, flow charts, and technical documentation.

Tools & Technologies
- Languages: Kotlin, Java
- Development Tools: Android Studio, ADB, Android Emulator
- Version Control: Git, Bitbucket, GitHub
- Project Management: JIRA, Confluence
- Testing Tools: Espresso, JUnit, Mockito, Firebase Test Lab
- Build & Release: Gradle, Proguard, Fastlane, Play Console

Job Category: Digital_Cloud_Web Technologies
Job Type: Full Time
Job Location: Ahmedabad
Experience: 4-8 years
Notice period: 0-15 days

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Jaipur, Rajasthan, India

On-site

Position: Lead Android Developer
Location: Jaipur

Khushi Baby, a nonprofit organization in India, serves as a technical partner to health departments. Established in 2016 from a Yale University classroom, it has grown into a 90-member team with offices in Jaipur, Udaipur, Delhi, and Bengaluru. Khushi Baby focuses on digital health solutions, health program strengthening, and R&D. Its flagship platform, the Community Health Integrated Platform (CHIP), supports over 70,000 community health workers across 40,000 villages, reaching 45 million beneficiaries. The platform has identified and monitored 5+ million high-risk individuals, with the Ministry of Health allocating ₹160 crore ($20M) for its scale-up. CHIP has enabled initiatives like Rajasthan's digital health census, TB case finding, vector-borne disease surveillance, labor room monitoring, and immunization drives, co-designed with extensive field input.

In R&D, Khushi Baby advances community-level geospatial analysis and individual health diagnostics, including smartphone-based tools and low-literacy models. Programmatically, it focuses on maternal health, child malnutrition, and zero-dose children. Backed by donors like GAVI, Skoll Foundation, and CSR funding, Khushi Baby partners with IITs, AIIMS Jodhpur, JPAL South Asia, MIT, Microsoft Research, WHO, and multiple state governments. Khushi Baby seeks skilled, creative, and driven candidates eager to make a large-scale public health impact by joining its interdisciplinary team in policy, design, development, implementation, and data science.

What we require:
- A willingness to put our mission first and to go the last mile to ensure our solution is creating impact
- 5+ years of professional experience in developing Android applications
- More than 3 years of experience leading a team of developers across various projects
- Good exposure to Android Studio/Android SDKs with Android tools, Kotlin, and frameworks
- Research and suggest new mobile products, applications and protocols
- Working in close collaboration with back-end developers, designers, and the rest of the team to deliver well-architected and high-quality solutions
- Continuously discover, evaluate, and implement new technologies to maximize development efficiency
- Familiarity with industry-standard design patterns for the most commonly encountered situations
- A solid understanding of operating system fundamentals such as processes, inter-process communication, multi-threading primitives, race conditions and deadlocks
- Good knowledge of multithreading, process optimisation, and system resource planning in native Android
- Experience using web services and data parsing using JSON, XML, etc.
- Good knowledge of OO design, database design, data structures and algorithms
- Experience working in an Agile team, familiarity with Agile best practices, and ability to manage individual task deliverables
- A sense of user engagement, in order to deep dive into real end users' needs and improve the product over time
- Work closely with developers, backend lead, product and project managers to meet project deadlines

What we prefer:
- Background in public health, ICT4D, and digital health standard frameworks
- Experience with building offline-online capable apps
- Experience with facial biometrics, Near Field Communication, edge analytics
- Development of currently live Android applications with over 1,000 downloads and a 4+ rating on the Play Store

Projects / Responsibilities:
- Community Health Integrated Platform for ASHAs, ANMs and MOCs
- Khushi Baby Reproductive and Child Health Solution
- Decision Support Tool for Community Health Officers
- Health Worker Diligence and High-Risk Prediction module in collaboration with Google AI for Social Good
- IoT device integration, facial biometric module integration, NFC device integration for decentralized health records, NDHM implementation
- Health and Wellness Center Digital Platform
- Ensuring end-to-end encryption, version control and backwards compatibility, automated testing, systematic documentation
- Conducting field tests and analyzing automated user metrics to understand and improve the user interface

Remuneration
The remuneration offered will range between 20-25 LPA, commensurate with the candidate's experience and skill sets. Other benefits include:
- Medical insurance
- Paid sick leave, paid parental leave and menstrual leave
- Learning stipend policy
- A flexible, enabling workplace with the opportunity to grow into leadership roles
- Opportunities to attend and actively participate in prestigious international conferences and workshops

Note: The candidate will be on a probationary period for the first 90 days of the contract.

How to Apply
To apply for the above position, share your CV at careers@khushibaby.org. Due to the high number of applicants, we will only reach out to those who are shortlisted. Rest assured that your application will be carefully reviewed, and if you are shortlisted, you will receive a call or email from us.

Posted 2 weeks ago

Apply

1.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Location: Indore/Chennai
Job Type: Full-time
Experience Required: 1+ years
Department: Technology / Mobile Development

Job Summary:
We are seeking a passionate and skilled Swift Developer with 1+ years of experience in iOS application development. The ideal candidate should have hands-on experience building mobile applications using Swift and a strong understanding of Apple’s ecosystem. You will collaborate with cross-functional teams to develop, test, and deliver high-performance iOS applications.

Key Responsibilities:
- Develop and maintain advanced iOS applications using Swift.
- Collaborate with UX/UI designers, product managers, and backend developers.
- Integrate APIs and third-party services into applications.
- Write clean, scalable, and well-documented code.
- Debug and resolve issues, and improve performance and stability.
- Keep up to date with the latest iOS trends, technologies, and best practices.
- Participate in code reviews and contribute to technical discussions.

Requirements:
- 1+ years of experience in Swift and iOS app development.
- Strong knowledge of Xcode, UIKit, CoreData, and other iOS frameworks.
- Familiarity with RESTful APIs, JSON parsing, and third-party libraries.
- Good understanding of mobile UI/UX standards.
- Experience with version control systems like Git.
- Ability to write unit and UI tests to ensure robustness.
- Strong analytical and problem-solving skills.
- Bachelor’s degree in Computer Science, Engineering, or a related field.

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Project Role : Engineering Services Practitioner Project Role Description : Assist with end-to-end engineering services to develop technical engineering solutions to solve problems and achieve business objectives. Solve engineering problems and achieve business objectives using scientific, socio-economic, technical knowledge and practical experience. Work across structural and stress design, qualification, configuration and technical management. Must have skills : 5G Wireless Networks & Technologies Good to have skills : NA Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Job Title: 5G Core Network Ops Senior Engineer Summary: We are seeking a skilled 5G Core Network Senior Engineer to join our team. The ideal candidate will have extensive experience with Nokia 5G Core platforms and will be responsible for fault handling, troubleshooting, session and service investigation, configuration review, performance monitoring, security support, change management, and escalation coordination. Roles and Responsibilities: 1. Fault Handling & Troubleshooting: Provide Level 2 (L2) support for 5G Core SA network functions in production environment. Nokia EDR Operations & Support, Monitor and maintain the health of Nokia EDR systems. Perform log analysis and troubleshoot issues related to EDR generation, parsing, and delivery. Ensure EDRs are correctly generated for all relevant 5G Core functions (AMF, SMF, UPF, etc.) and interfaces (N4, N6, N11, etc.). Validate EDR formats and schemas against 3GPP and Nokia specifications. NCOM Platform Operations Operate and maintain the Nokia Cloud Operations Manager (NCOM) platform. Manage lifecycle operations of CNFs, VNFs, and network services (NSs) across distributed Kubernetes and OpenStack environments. Analyze alarms from NetAct/Mantaray, or external monitoring tools. Correlate events using Netscout, Mantaray, and PM/CM data. 
Troubleshoot and resolve complex issues related to registration, session management, mobility, policy, charging, DNS, IPSec and Handover issues. Handle node-level failures (AMF/SMF/UPF/NRF/UDM/UDR/PCF/CHF restarts, crashes, overload). Perform packet tracing (Wireshark) or core trace (PCAP, logs) and Nokia PCMD trace capturing and analysis. Perform root cause analysis (RCA) and implement corrective actions. Handle escalations from Tier-1 support and provide timely resolution. 2. Automation & Orchestration Automate deployment, scaling, healing, and termination of network functions using NCOM. Develop and maintain Ansible playbooks, Helm charts, and GitOps pipelines (FluxCD, ArgoCD). Integrate NCOM with third-party systems using open APIs and custom plugins. 3. Session & Service Investigation: Trace subscriber issues (5G attach, PDU session, QoS). Use tools like EDR, Flow Tracer, Nokia Cloud Operations Manager (COM). Correlate user-plane drops, abnormal release, bearer QoS mismatch. Work on Preventive measures with L1 team for health check & backup. 4. Configuration and Change Management: Create a MOP for required changes, validate MOP with Ops teams, stakeholders before rollout/implementation. Maintain detailed documentation of network configurations, incident reports, and operational procedures. Support software upgrades, patch management, and configuration changes. Maintain documentation for known issues, troubleshooting guides, and standard operating procedures (SOPs). Audit NRF/PCF/UDM etc configuration & Database. Validate policy rules, slicing parameters, and DNN/APN settings. Support integration of new 5G Core nodes and features into the live network. 5. Performance Monitoring: Use KPI dashboards (NetAct/NetScout) to monitor 5G Core KPIs e.g registration success rate, PDU session setup success, latency, throughput, user-plane utilization. Proactively detect degrading KPIs trends. 6. Security & Access Support: Application support for Nokia EDR and CrowdStrike. 
Assist with certificate renewal, firewall/NAT issues, and access failures. 7. Escalation & Coordination: Escalate unresolved issues to L3 teams, Nokia TAC, OSS/Core engineering. Work with L3 and care team for issue resolution. Ensure compliance with SLAs and contribute to continuous service improvement. 8. Reporting Generate daily/weekly/monthly reports on network performance, incident trends, and SLA compliance. Technical Experience and Professional Attributes: 5–9 years of experience in Telecom industry with hands on experience. Mandatory experience with Nokia 5G Core-SA platform. Handson Experience on Nokia EDR Operations & Support, Monitor and maintain the health of Nokia EDR systems. Perform log analysis and troubleshoot issues related to EDR generation, parsing, and delivery. Experience on NCOM Platform Operations Operate and maintain the Nokia Cloud Operations Manager (NCOM) platform NF deployment and troubleshooting experience on deployment, scaling, healing, and termination of network functions using NCOM. Solid understanding for 5G Core Packet Core Network Protocol such as N1, N2, N3, N6, N7, N8, 5G Core interfaces, GTP-C/U, HTTPS and including ability to trace, debug the issues. Hands-on experience with 5GC components: AMF, SMF, UPF, NRF, AUSF, NSSF, UDM, PCF, CHF, SDL, NEDR, Provisioning and Flowone. In-depth understanding of 3GPP call flows for 5G-SA, 5G NSA, Call routing, number analysis, system configuration, call flow, Data roaming, configuration and knowledge of Telecom standards e.g. 3GPP, ITU-T and ANSI. Familiarity with policy control mechanisms, QoS enforcement, and charging models (event-based, session-based). Hands-on experience with Diameter, HTTP/2, REST APIs, and SBI interfaces. Strong analytical and troubleshooting skills. Proficiency in monitoring and tracing tools (NetAct, NetScout, PCMD tracing). And log management systems (e.g., Prometheus, Grafana). Knowledge of network protocols and security (TLS, IPsec). 
- Excellent communication and documentation skills.

Educational Qualification: BE/BTech (15 years full-time education)

Additional Information:
- Nokia certifications (e.g., NCOM, NCS, NSP, Kubernetes).
- Experience with the Nokia 5G Core platform, NCOM, NCS, Nokia private cloud and public cloud (AWS preferred), and cloud-native environments (Kubernetes, Docker, CI/CD pipelines).
- AWS cloud certifications or experience on AWS Cloud.
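The KPI monitoring this role describes (registration success rate, PDU session setup success) boils down to ratio arithmetic over node counters. A minimal sketch, assuming nothing about any Nokia interface; the counter names and sample values are invented for illustration:

```python
# Compute basic 5G Core KPIs from raw event counters.
# Counter names and values are illustrative placeholders, not a vendor API.

def success_rate(successes: int, attempts: int) -> float:
    """Return a success-rate percentage, guarding against zero attempts."""
    return 100.0 * successes / attempts if attempts else 0.0

counters = {
    "registration_attempts": 12_500,
    "registration_successes": 12_320,
    "pdu_session_attempts": 9_800,
    "pdu_session_successes": 9_650,
}

registration_sr = success_rate(
    counters["registration_successes"], counters["registration_attempts"]
)
pdu_session_sr = success_rate(
    counters["pdu_session_successes"], counters["pdu_session_attempts"]
)

print(f"Registration success rate: {registration_sr:.2f}%")
print(f"PDU session setup success rate: {pdu_session_sr:.2f}%")
```

In practice such ratios would be computed per interval from dashboard counters (NetAct/NetScout exports) and alarmed when they trend downward.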

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary: Analyst, Inclusive Innovation & Analytics, Center for Inclusive Growth
The Center for Inclusive Growth is the social impact hub at Mastercard. The organization seeks to ensure that the benefits of an expanding economy accrue to all segments of society. Through actionable research, impact data science, programmatic grants, stakeholder engagement and global partnerships, the Center advances equitable and sustainable economic growth and financial inclusion around the world. The Center’s work is at the heart of Mastercard’s objective to be a force for good in the world.
Reporting to the Vice President, Inclusive Innovation & Analytics, the Analyst will 1) create and/or scale data, data science, and AI solutions, methodologies, products, and tools to advance inclusive growth and the field of impact data science, 2) work on the execution and implementation of key priorities to advance external and internal data-for-social strategies, and 3) manage operations to ensure operational excellence across the Inclusive Innovation & Analytics team.

Key Responsibilities

Data Analysis & Insight Generation
- Design, develop, and scale data science and AI solutions, tools, and methodologies to support inclusive growth and impact data science.
- Analyze structured and unstructured datasets to uncover trends, patterns, and actionable insights related to economic inclusion, public policy, and social equity.
- Translate analytical findings into compelling visualizations and dashboards that inform policy, program design, and strategic decision-making.
- Create dashboards, reports, and visualizations that communicate findings to both technical and non-technical audiences.
- Provide data-driven support for convenings involving philanthropy, government, private sector, and civil society partners.

Data Integration & Operationalization
- Assist in building and maintaining data pipelines for ingesting and processing diverse data sources (e.g., open data, text, survey data).
- Ensure data quality, consistency, and compliance with privacy and ethical standards.
- Collaborate with data engineers and AI developers to support backend infrastructure and model deployment.

Team Operations
- Manage team operations, meeting agendas, project management, and strategic follow-ups to ensure alignment with organizational goals.
- Lead internal reporting processes, including the preparation of dashboards, performance metrics, and impact reports.
- Support team budgeting, financial tracking, and process optimization.
- Support grantees and grants management as needed.
- Develop briefs, talking points, and presentation materials for leadership and external engagements.
- Translate strategic objectives into actionable data initiatives and track progress against milestones.
- Coordinate key activities and priorities in the portfolio, working across teams at the Center and the business as applicable to facilitate collaboration and information sharing.
- Support the revamp of the Measurement, Evaluation, and Learning frameworks and workstreams at the Center.
- Provide administrative support as needed.
- Manage ad-hoc projects and event organization.

Qualifications
- Bachelor’s degree in Data Science, Statistics, Computer Science, Public Policy, or a related field.
- 2–4 years of experience in data analysis, preferably in a mission-driven or interdisciplinary setting.
- Strong proficiency in Python and SQL; experience with data visualization tools (e.g., Tableau, Power BI, Looker, Plotly, Seaborn, D3.js).
- Familiarity with unstructured data processing and machine learning concepts.
- Excellent communication skills and ability to work across technical and non-technical teams.
Technical Skills & Tools

Data Wrangling & Processing
- Data cleaning, transformation, and normalization techniques
- Pandas, NumPy, Dask, Polars
- Regular expressions, JSON/XML parsing, web scraping (e.g., BeautifulSoup, Scrapy)

Machine Learning & Modeling
- Scikit-learn, XGBoost, LightGBM
- Proficiency in supervised/unsupervised learning, clustering, classification, regression
- Familiarity with LLM workflows and tools like Hugging Face Transformers, LangChain (a plus)

Visualization & Reporting
- Power BI, Tableau, Looker
- Python libraries: Matplotlib, Seaborn, Plotly, Altair
- Dashboarding tools: Streamlit, Dash
- Storytelling with data and stakeholder-ready reporting

Cloud & Collaboration Tools
- Google Cloud Platform (BigQuery, Vertex AI), Microsoft Azure
- Git/GitHub, Jupyter Notebooks, VS Code
- Experience with APIs and data integration tools (e.g., Airflow, dbt)

Ideal Candidate
You are a curious and collaborative analyst who believes in the power of data to drive social change. You’re excited to work with cutting-edge tools while staying grounded in the real-world needs of communities and stakeholders.

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard’s security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
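The data wrangling skills this listing names (regular expressions, JSON parsing, cleaning and normalization) cover routines like the following standard-library sketch; the record layout and field names are invented for illustration:

```python
import json
import re

# Normalize a messy JSON survey record: trim whitespace, standardize case,
# and extract a numeric amount from free text. Fields are illustrative.
raw = '{"name": "  Jane DOE ", "income": "about $1,250 per month"}'

record = json.loads(raw)
record["name"] = record["name"].strip().title()

# Pull the first number (allowing thousands separators) out of the text.
match = re.search(r"\$?([\d,]+(?:\.\d+)?)", record["income"])
record["income_usd"] = float(match.group(1).replace(",", "")) if match else None

print(record["name"])        # Jane Doe
print(record["income_usd"])  # 1250.0
```

The same shape of work scales up with Pandas or Polars: parse, standardize, then validate before the data enters a pipeline.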

Posted 2 weeks ago

Apply

7.0 - 12.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Role: Data Quality (DQ) Specialist
Experience: 7-12 years of relevant professional experience in Data Quality and/or Data Governance, Data Management, or Data Lineage solution implementation.
Location: Pune
Notice: Immediate / max 15 days
Work Mode: 5 days WFO

JOB RESPONSIBILITIES:
The job entails working with our clients and partners to design, define, implement, roll out, and improve Data Quality solutions that leverage tools available in the market, for example Informatica IDQ, SAP DQ, SAP MDG, Collibra DQ, Talend DQ, a custom DQ solution, and/or other leading platforms, for the client’s business benefit. The ideal candidate will be responsible for ensuring the accuracy, completeness, consistency, and reliability of data across systems. You will work closely with data engineers, analysts, and business stakeholders to define and implement data quality frameworks and tools. As part of your role, you will be involved in the entire business development life cycle:
- Meet with business individuals to gather information and analyze existing business processes, determine and document gaps and areas for improvement, and prepare requirements documents, functional design documents, etc. In summary, work with project stakeholders to identify business needs and gather requirements for Data Quality and/or Data Governance or Master Data.
- Follow up on implementation by conducting training sessions and by planning and executing the technical and functional transition to the support team.
- Grasp business and technical concepts and transform them into creative, lean, and smart data management solutions.
- Develop and implement Data Quality solutions on any of the above leading enterprise data management platforms.
- Assess and improve data quality across multiple systems and domains.
- Define and implement data quality rules, metrics, and dashboards.
- Perform data profiling, cleansing, and validation using industry-standard tools.
- Collaborate with data stewards and business units to resolve data issues.
- Develop and maintain data quality documentation and standards.
- Support data governance initiatives and master data management (MDM).
- Recommend and implement data quality tools and automation strategies.
- Conduct root cause analysis of data quality issues and propose remediation plans.
- Apply AI to improve and automate the Data Quality solution.
- Leverage SAP MDG/ECC experience to deep-dive into root cause analysis for assigned use cases; work with Azure Data Lake (via Databricks) using SQL/Python.
- Identify and build data models (conceptual and physical) that provide an automated mechanism to monitor ongoing DQ issues; multiple workshops may be needed to work through options and identify the most efficient and effective one.
- Work with the business (Data Owners/Data Stewards) to profile data and expose patterns indicating data quality issues; identify the impact on specific CDEs deemed important by each individual business.
- Identify the financial impact of data quality issues, as well as the business benefit (quantitative/qualitative) of remediation, while managing implementation timelines.
- Schedule regular working groups with business teams that have identified DQ issues and ensure progress toward RCA/remediation or presentation in DGFs.
- Identify business DQ rules on which KPIs/measures are stood up that feed into dashboards/workflows for BAU monitoring; raise and investigate red flags.
- Understand the Data Quality value chain, starting with Critical Data Element concepts, Data Quality Issues, and Data Quality KPIs/Measures.
- Own and execute Data Quality Issue assessments to aid improvements to operational processes and BAU initiatives.
- Highlight risks and hidden DQ issues to the Lead/Manager for further guidance/escalation.
- Communicate clearly; this is an outward-facing role, so the focus has to be on clearly articulated messages.
- Support the design, build, and deployment of data quality dashboards via Power BI.
- Determine escalation paths and construct workflows and alerts that notify process and data owners of unresolved data quality issues.
- Collaborate with IT and analytics teams to drive innovation (AI, ML, cognitive science, etc.).
- Work with business functions and projects to create data quality improvement plans.
- Set targets for data improvements/maturity; monitor and intervene when sufficient progress is not being made.
- Support initiatives driving data clean-up of the existing data landscape.

JOB REQUIREMENTS:
i. Education or Certifications:
- Bachelor's/Master's degree in engineering/technology or other related degrees.
- Relevant professional-level certifications from Informatica, SAP, Collibra, Talend, or any other leading platform/tool.
- Relevant certifications from DAMA, EDM Council, and CMMI-DMM will be a bonus.
ii. Work Experience:
- You have 4-10 years of relevant experience within the Data & Analytics area, with major experience around data management areas: ideally Data Quality (DQ) and/or Data Governance or Master Data using relevant tools.
- You have in-depth knowledge of Data Quality and Data Governance concepts, approaches, methodologies, and tools.
- Client-facing consulting experience will be considered a plus.
iii.
Technical and Functional Skills:
- Hands-on experience with any of the above DQ tools in the area of enterprise data management, preferably in complex and diverse systems environments.
- Exposure to data quality concepts: data lifecycle, data profiling, data quality remediation (cleansing, parsing, standardization, enrichment using 3rd-party plugins, etc.).
- Strong understanding of data quality best practices, concepts, data quality management frameworks, and data quality dimensions/KPIs.
- Deep knowledge of SQL and stored procedures.
- Strong knowledge of Master Data, Data Governance, and Data Security.
- Domain knowledge of SAP Finance modules preferred.
- Good to have hands-on experience with AI use cases in Data Quality or Data Management areas.
- Concepts and hands-on experience in master data management preferred: matching, merging, and creation of golden records for master data entities.
- Strong soft skills: interpersonal, team, and communication skills (both verbal and written).
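The DQ rule/KPI work described in this listing (completeness, validity, and uniqueness checks feeding a dashboard) can be sketched with nothing but the standard library; the records, fields, and rules below are invented for illustration, not from any DQ platform:

```python
import re

# Minimal data-quality rule checks over a list of records.
# Records, rule names, and regex patterns are illustrative placeholders.
records = [
    {"id": "1001", "email": "a@example.com", "amount": "120.50"},
    {"id": "1002", "email": "",              "amount": "75"},
    {"id": "1002", "email": "bad-address",   "amount": "n/a"},
]

def completeness(rows, field):
    """Share of rows where the field is non-empty."""
    return sum(1 for r in rows if r[field]) / len(rows)

def validity(rows, field, pattern):
    """Share of rows where the field fully matches a regex rule."""
    return sum(1 for r in rows if re.fullmatch(pattern, r[field])) / len(rows)

def uniqueness(rows, field):
    """Share of distinct values among all values of the field."""
    values = [r[field] for r in rows]
    return len(set(values)) / len(values)

kpis = {
    "email_completeness": completeness(records, "email"),
    "email_validity": validity(records, "email", r"[^@\s]+@[^@\s]+\.[^@\s]+"),
    "amount_validity": validity(records, "amount", r"\d+(\.\d+)?"),
    "id_uniqueness": uniqueness(records, "id"),
}
print(kpis)
```

Platform tools (Informatica IDQ, Collibra DQ, Talend DQ) express the same dimension scores as configurable rules rather than code, and surface them as dashboard KPIs with thresholds and alerts.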

Posted 2 weeks ago

Apply

4.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

The role of Data Governance Manager is a first-level leadership position within the Finance Data Office’s Data Management team. This is a pivotal role for setting up and driving the data governance framework and data-related principles and policies to create a culture of data accountability across all finance data domains. The role is responsible for leading and executing the data governance agenda, including data definition, data ownership, data standards, data remediation, and master data governance processes across Finance. The Data Governance Manager is expected to partner with the wider data management team to improve data quality by implementing data monitoring solutions. The ideal candidate will have a proven track record of working with data governance platforms such as Alation or Collibra for SAP master data domains. This position will take accountability for defining and driving data governance aspects, including leading meetings and data governance forums with Data Stewards, Data Owners, Data Engineers, and other key stakeholders.
 Coordinate with Data Owners to enable identification of critical data elements for SAP master data: Supplier/Finance/Bank master.
 Develop and maintain a business-facing data glossary and data catalog for SAP master data (Supplier, Customer, Finance (GL, Cost Center, Profit Center, etc.)), capturing data definitions, lineage, and usage.
 Define the data governance framework: develop and implement data governance policies, standards, and processes to ensure data quality, data management, and compliance for relevant SAP master data (Finance, Supplier, and Customer master data).
 Conduct data quality assessments and implement corrective actions to address data quality issues.
 Collaborate with cross-functional teams to ensure data governance practices are integrated into all relevant SAP business processes.
 Data cataloging and lineage: manage data cataloging and lineage to provide visibility into data assets, their origins, and transformations in the SAP environment.
 Facilitate governance forums, data domain councils, and change advisory boards to review data issues, standards, and continuous improvements.
 Prepare data documentation, including data models, process flows, governance policies, and stewardship responsibilities.
 Collaboration: work closely with IT, data management teams, and business units to implement data governance best practices and tools.
 Monitoring and reporting: monitor data governance activities, measure progress, and report on key metrics to senior management.
 Training and awareness: conduct training sessions and create awareness programs to promote data governance within the organization.
 Data structures and models: demonstrate a deep understanding of SAP (and other ERP systems such as JD Edwards) master data structures such as Vendor, Customer, Cost Center, Profit Center, and GL Accounts.
 Data policies: collaborate and coordinate with the respective pillar leads to ensure necessary policies related to data privacy, data lifecycle management, and data quality management are developed.

JOB REQUIREMENTS:
i. Education or Certifications:
 Bachelor's/Master's degree in engineering/technology or other related degrees.
 Relevant professional-level certifications from Informatica, SAP, Collibra, Alation, or any other leading platform/tool.
 Relevant certifications from DAMA, EDM Council, and CMMI-DMM will be a bonus.
ii. Work Experience:
 You have 4-10 years of relevant experience within the Data & Analytics area, with major experience around data management areas: ideally Data Governance and/or Data Quality, Master Data, or Data Lineage using relevant tools like Informatica, SAP MDG, Collibra, Alation, or other market-leading tools.
 You have in-depth knowledge of Data Quality and Data Governance concepts, approaches, methodologies, and tools.
 Client-facing consulting experience will be considered a plus.
iii. Technical and Functional Skills:
 Hands-on experience with any of the above tools in the area of enterprise data governance, preferably in SAP or complex and diverse systems environments.
 Experience implementing data governance in an SAP environment for both transactional and master data.
 Expert knowledge of data governance concepts around data definition and catalogs, data ownership, data lineage, data policies and controls, data monitoring, and data governance forums.
 Strong knowledge of SAP peripheral systems and a good understanding of the upstream and downstream impact of master data.
 Exposure to data quality concepts: data lifecycle, data profiling, data quality remediation (cleansing, parsing, standardization, enrichment using 3rd-party plugins, etc.).
 Strong understanding of data quality best practices, concepts, data quality management frameworks, and data quality dimensions/KPIs.
 Deep knowledge of SQL and stored procedures.
 Strong knowledge of Master Data and Data Security.
 Domain knowledge of SAP Finance modules preferred.
 Good to have hands-on experience with AI use cases in Data Quality, Data Governance, or other data management areas.
 Concepts and hands-on experience in master data management preferred: matching, merging, and creation of golden records for master data entities.
 Strong soft skills: interpersonal, team, and communication skills (both verbal and written).
 Preferred: project management, domain knowledge (Procurement, Finance, Customer), business acumen, critical thinking, and storytelling.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

Genpact is a global professional services and solutions firm with a team of over 125,000 professionals in more than 30 countries. Driven by curiosity, agility, and the desire to create lasting value for clients, we serve leading enterprises worldwide, including the Fortune Global 500. Our purpose is the relentless pursuit of a world that works better for people, and we achieve this through our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI. We are currently seeking applications for the position of Principal Consultant, Research Data Scientist. We are looking for candidates with relevant experience in Text Mining/Natural Language Processing (NLP) tools, Data sciences, Big Data, and algorithms. The ideal candidate should have full cycle experience in at least one large-scale Text Mining/NLP project, including creating a business use case, Text Analytics assessment/roadmap, technology & analytic solutioning, implementation, and change management. Experience in Hadoop, including development in the map-reduce framework, is also desirable. The Text Mining Scientist (TMS) will play a crucial role in bridging enterprise database teams and business/functional resources, translating business needs into techno-analytic problems, and working with database teams to deliver large-scale text analytic solutions. 
Responsibilities:
- Develop transformative AI/ML solutions to address clients' business requirements.
- Manage project delivery involving data pre-processing, model training and evaluation, and parameter tuning.
- Manage stakeholder/customer expectations and project documentation.
- Research cutting-edge developments in AI/ML with NLP/NLU applications in various industries.
- Design and develop solution algorithms within tight timelines.
- Interact with clients to collect and synthesize requirements for an effective analytics/text mining roadmap.
- Work with digital development teams to integrate algorithms into production applications.
- Conduct applied research on text analytics and machine learning projects, file patents, and publish papers.

Qualifications

Minimum Qualifications/Skills:
- MS in Computer Science, Information Systems, or Computer Engineering.
- Relevant experience in Text Mining/Natural Language Processing (NLP) tools, data science, Big Data, and algorithms.

Technology:
- Open-source text mining paradigms (NLTK, OpenNLP, OpenCalais, StanfordNLP, GATE, UIMA, Lucene) and cloud-based NLU tools (DialogFlow, MS LUIS).
- Statistical toolkits (R, Weka, S-Plus, Matlab, SAS Text Miner).
- Strong core Java experience, programming in the Hadoop ecosystem, and distributed computing concepts.
- Proficiency in Python/R programming; Java programming skills are a plus.

Methodology:
- Solutioning and consulting experience in verticals like BFSI and CPG, with text analytics experience on large structured and unstructured data.
- Knowledge of AI methodologies (ML, DL, NLP, Neural Networks, Information Retrieval, NLG, NLU).
- Familiarity with Natural Language Processing and statistics concepts, especially in their application.
- Ability to conduct client research to enhance the analytics agenda.

Preferred Qualifications/Skills:

Technology:
- Expertise in NLP, NLU, and machine learning/deep learning methods.
- UI development paradigms for text mining insights visualization.
- Experience with Linux, Windows, GPU, Spark, Scala, and deep learning frameworks.

Methodology:
- Social network modeling paradigms, tools, and techniques.
- Text analytics using NLP tools like Support Vector Machines and social network analysis.
- Previous experience with text analytics implementations using open-source packages or SAS Text Miner.
- Strong prioritization, a consultative mindset, and time management skills.

Job Details:
- Job Title: Principal Consultant
- Primary Location: India-Gurugram
- Schedule: Full-time
- Education Level: Master's/Equivalent
- Job Posting Date: Oct 4, 2024, 12:27:03 PM
- Unposting Date: Ongoing
- Master Skills List: Digital
- Job Category: Full Time
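As a flavor of the text-mining fundamentals this role calls for (tokenization, term weighting, information retrieval), here is a standard-library TF-IDF sketch; the three-document corpus is invented and no NLP toolkit (NLTK, Lucene, etc.) is assumed:

```python
import math
import re
from collections import Counter

# Tiny TF-IDF scorer over an illustrative corpus.
docs = [
    "the bank approved the loan",
    "the loan was repaid to the bank",
    "the stock markets fell sharply today",
]

def tokenize(text):
    """Lowercase and split on non-letters - a deliberately simple tokenizer."""
    return re.findall(r"[a-z]+", text.lower())

tokenized = [tokenize(d) for d in docs]
n_docs = len(tokenized)

def tf_idf(term, doc_tokens):
    """Term frequency times smoothed inverse document frequency."""
    tf = Counter(doc_tokens)[term] / len(doc_tokens)
    df = sum(1 for toks in tokenized if term in toks)
    idf = math.log(n_docs / (1 + df)) + 1
    return tf * idf

# "loan" keeps its full weight in doc 0; "the", present in every
# document, has its weight discounted by the idf factor.
print(round(tf_idf("loan", tokenized[0]), 3))
print(round(tf_idf("the", tokenized[0]), 3))
```

Production text-mining stacks (Lucene, scikit-learn's TfidfVectorizer) use the same idea with better tokenization, normalization, and smoothing variants.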

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Skills & Expertise
- Angular (v8.0+) & TypeScript: 8+ years of hands-on development experience
- REST API Integration: proficient in building and consuming APIs
- UI Frameworks: expertise in Material Design and Bootstrap
- Template-to-Screen Conversion: skilled at transforming UI/UX designs into functional screens
- Version Control Tools: experience with GitHub, TeamCity
- Testing: development of JUnit/MUnit test cases
- Ticketing Tools: familiar with JIRA, ServiceNow
- Server Knowledge: working knowledge of Apache Tomcat
- Frontend Technologies: proficient in HTML, CSS
- Backend & DB Basics: basic Java skills and experience with databases like MySQL, MS-SQL, Oracle
- Agile & Waterfall Methodologies: experience in both project management styles
- Communication: excellent verbal and written communication skills
- JSON Handling: competent in parsing and managing JSON data

Key Responsibilities

Development & UI/UX Implementation
- Lead the design, development, and maintenance of robust, responsive, and user-friendly web applications using Angular (v8.0+) and TypeScript.
- Transform raw UI/UX designs and templates into functional, pixel-perfect screens.
- Leverage expertise in Material Design and Bootstrap to build visually appealing and consistent user interfaces.

API Integration & Data Handling
- Build and consume REST APIs, ensuring seamless data flow between frontend and backend systems.
- Parse and manage JSON data for effective client-side operations.

Frontend & Backend Foundations
- Show strong proficiency in fundamental frontend technologies: HTML and CSS.
- Apply basic Java skills and experience with databases like MySQL, MS-SQL, and Oracle to understand full-stack interactions.

Testing & Quality Assurance
- Develop JUnit/MUnit test cases to ensure code quality, reliability, and maintainability.

Tools & Methodologies
- Utilize version control tools such as GitHub and TeamCity for collaborative development.
- Work with ticketing tools like JIRA and ServiceNow for project tracking and issue management.
- Possess working knowledge of Apache Tomcat for deployment understanding.
- Operate effectively within both Agile and Waterfall methodologies. (ref:hirist.tech)

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
