Home
Jobs

446 Parsing Jobs - Page 12

Filter Interviews
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Set up a Job Alert
Filter
JobPe aggregates job listings to make them easy to find in one place; applications are always submitted directly on the original job portal.

0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Source: LinkedIn

When you join Verizon

You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.

What You’ll Be Doing...

As a Data Engineer with ETL/ELT expertise for our growing data platform and analytics teams, you will understand and enable the required data sets from different sources. This includes bringing both structured and unstructured data into our data warehouse and data lake with real-time streaming and/or batch processing to generate insights and perform analytics for business teams within Verizon. Responsibilities include: understanding the business requirements; transforming them into technical design; working on data ingestion, preparation and transformation; developing the scripts for data sourcing and parsing; developing data streaming applications; debugging production failures and identifying solutions; and working on ETL/ELT development.

What We’re Looking For...

You’re curious about new technologies and the game-changing possibilities they create. You like to stay up-to-date with the latest trends and apply your technical expertise to solve business problems.

You'll Need To Have: a Bachelor’s degree or one or more years of experience, and experience with Data Warehouse concepts and the Data Management life cycle.

Even better if you have one or more of the following: a related ETL/ELT developer certification; accuracy and attention to detail; good problem solving, analytical, and research capabilities; good verbal and written communication; experience presenting to and influencing partners.

If Verizon and this role sound like a fit for you, we encourage you to apply even if you don’t meet every “even better” qualification listed above. #AI&D

Where you’ll be working: In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager. Scheduled Weekly Hours: 40.

Equal Employment Opportunity: Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.
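The batch side of the ingestion-preparation-transformation work described above follows a common extract-transform-load shape. Below is a minimal, illustrative Python sketch of that pattern; the file name, column names, and SQLite target are hypothetical stand-ins, not Verizon's actual stack:

```python
import sqlite3
import pandas as pd

# Hypothetical source file and table names, for illustration only.
SOURCE_CSV = "orders_raw.csv"
TARGET_TABLE = "orders_clean"

def run_batch_etl(conn: sqlite3.Connection) -> int:
    # Extract: read a structured batch file.
    df = pd.read_csv(SOURCE_CSV, parse_dates=["order_date"])

    # Transform: deduplicate, normalize a text column, derive a field.
    df = df.drop_duplicates(subset="order_id")
    df["region"] = df["region"].str.strip().str.upper()
    df["order_month"] = df["order_date"].dt.to_period("M").astype(str)

    # Load: append into the warehouse table.
    df.to_sql(TARGET_TABLE, conn, if_exists="append", index=False)
    return len(df)

if __name__ == "__main__":
    with sqlite3.connect("warehouse.db") as conn:
        print(f"Loaded {run_batch_etl(conn)} rows")
```

A production pipeline would swap the CSV and SQLite pieces for streaming sources and a real warehouse, but the extract/transform/load boundaries stay the same.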

Posted 3 weeks ago

Apply

2.0 - 5.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Source: LinkedIn

Job Description:
We are looking for a skilled Technical Trainer with expertise in Zoho’s Deluge scripting language to train and mentor aspiring Zoho developers. The ideal candidate should have 2-5 years of experience in Zoho Creator/CRM development and Deluge scripting.

Roles & Responsibilities:
- Conduct hands-on training sessions on Deluge scripting across Zoho Creator, CRM, and other Zoho applications.
- Design and deliver structured learning paths, exercises, and capstone projects.
- Guide learners in developing custom workflows, automations, and integrations using Deluge.
- Provide ongoing mentorship, code reviews, and support.
- Evaluate students’ understanding through projects and assignments.
- Stay updated with new features in Zoho and Deluge.
- Host webinars, live coding demos, and interactive Q&A sessions.
- Customize teaching methods to suit beginner and advanced learners.

Technology-Specific Responsibilities:
- Zoho Creator: Teach how to build apps, forms, reports, and automate them with Deluge.
- Zoho CRM: Instruct on custom modules, buttons, workflows, and scripting for business logic.
- Deluge Scripting: Guide end-to-end from basics to advanced concepts including integration, loops, maps, etc.
- API Integration: Train students to consume REST APIs, parse JSON, and trigger webhooks.
- Best Practices: Emphasize clean code, modular functions, and efficient workflows.

Requirements:
- 2-5 years of experience in Zoho One, Creator, CRM, and Deluge scripting
- Proficiency in writing workflows, automations, and integrations
- Solid understanding of REST APIs and JSON parsing
- Clear communication and mentorship ability

Preferred Skills:
- Experience with Zoho Analytics, Zoho Flow, or Zoho Books
- Familiarity with OAuth2 authentication in API integrations
- Exposure to No-code/Low-code platforms
- Knowledge of Webhook handling and third-party API setup

Why Join Us?
- Opportunity to shape the next generation of Zoho developers.
- A dynamic and supportive team environment.
- Remote-friendly with flexible working hours.
- Competitive pay with growth and leadership paths.
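For the API Integration module above, the flow a lesson would walk through — call a REST endpoint, parse the JSON response, trigger a webhook — looks roughly like the following. The sketch is written in Python rather than Deluge purely to illustrate the concept, and both endpoints are hypothetical:

```python
import requests

# Hypothetical endpoints used purely for teaching purposes.
API_URL = "https://api.example.com/v1/leads"
WEBHOOK_URL = "https://hooks.example.com/crm-sync"

def sync_new_leads(token: str) -> None:
    # Consume a REST API and parse the JSON response.
    resp = requests.get(API_URL, headers={"Authorization": f"Bearer {token}"}, timeout=10)
    resp.raise_for_status()
    leads = resp.json().get("data", [])

    # Trigger a webhook for each qualifying record.
    for lead in leads:
        if lead.get("status") == "new":
            requests.post(WEBHOOK_URL,
                          json={"id": lead["id"], "email": lead.get("email")},
                          timeout=10)

if __name__ == "__main__":
    sync_new_leads("demo-token")
```

In a Deluge lesson the same steps map onto invokeurl calls and map/list handling.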

Posted 3 weeks ago

Apply

0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Source: LinkedIn

Job Overview
We are seeking a highly skilled Python Developer to join our dynamic team. The ideal candidate should have strong expertise in Python and its associated libraries, with experience in web scraping, data handling, and automation. You should be an excellent problem solver with great communication skills and a solid understanding of object-oriented programming and data structures.

Key Responsibilities
- Develop, test, and maintain efficient Python-based desktop applications.
- Work with pandas for data manipulation and analysis.
- Write optimized SQL queries for database interactions.
- Utilize BeautifulSoup and Selenium for web scraping and automation.
- Handle JSON data efficiently for API integrations and data exchange.
- Apply object-oriented programming (OOP) principles to software development.
- Implement data structures and algorithms to optimize performance.
- Troubleshoot and debug code for functionality and efficiency.
- Collaborate with cross-functional teams to deliver high-quality solutions.
- Document processes and write clean, maintainable code.

Must-Have Skills
✅ Python – Strong proficiency in Python programming.
✅ Pandas – Experience with data manipulation and analysis.
✅ SQL – Ability to write and optimize queries.
✅ BeautifulSoup – Web scraping and parsing HTML/XML data.
✅ JSON – Handling structured data for APIs and storage.
✅ Selenium – Automation and web testing.
✅ OOP Concepts – Strong understanding of object-oriented principles.
✅ Data Structures & Algorithms – Efficient problem-solving abilities.
✅ Problem-Solving Skills – Ability to tackle complex technical challenges.
✅ Communication Skills – Strong verbal and written communication.
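As a flavor of the BeautifulSoup-plus-pandas-plus-JSON work listed above, here is a small sketch. The target URL and CSS selectors are invented for illustration, and real scraping should respect the site's terms of use:

```python
import json
import pandas as pd
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/products"  # placeholder page

def scrape_products(url: str) -> pd.DataFrame:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Assumes each product sits in a <div class="product"> with name/price children.
    rows = []
    for card in soup.select("div.product"):
        price_text = card.select_one(".price").get_text(strip=True)
        rows.append({
            "name": card.select_one(".name").get_text(strip=True),
            "price": float(price_text.replace("₹", "").replace(",", "")),
        })
    return pd.DataFrame(rows)

if __name__ == "__main__":
    df = scrape_products(URL)
    print(df.describe())  # pandas analysis
    # JSON handling for downstream APIs or storage.
    print(json.dumps(df.to_dict("records")[:2], ensure_ascii=False))
```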

Posted 3 weeks ago

Apply

0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Source: LinkedIn

Key Requirements
- Strong proficiency in Android (Kotlin/Java)
- Strong knowledge of OOP fundamentals
- Dynamic layout design
- Deep understanding of MVVM architecture and dependency injection (Dagger/Hilt).
- Experience in RESTful APIs, JSON parsing, and third-party libraries such as Retrofit.
- Location and Map integration.
- Proficiency in Firebase, push notifications, and real-time database handling.
- Knowledge of version control systems such as Git/GitHub/GitLab.
- Ability to optimize applications for performance and scalability.
- Experience in writing unit tests and UI tests is a plus.
- Exposure to Agile development methodologies.

Additional Preferences
- Strong problem-solving skills and debugging capabilities.
- Experience with CI/CD pipelines for mobile applications.
- Familiarity with Play Store deployment processes.

Posted 3 weeks ago

Apply

7.0 - 10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

Job Title: Lead Splunk Engineer
Location: Gurgaon (Hybrid)
Experience: 7-10 Years
Employment Type: Full-time
Notice Period: Immediate Joiners Preferred

Job Summary:
We are seeking an experienced Lead Splunk Engineer to design, deploy, and optimize SIEM solutions with expertise in Splunk architecture, log management, and security event monitoring. The ideal candidate will have hands-on experience in Linux administration, scripting, and integrating Splunk with tools like ELK & DataDog.

Key Responsibilities:
✔ Design & deploy scalable Splunk SIEM solutions (UF, HF, SH, Indexer Clusters).
✔ Optimize log collection, parsing, normalization, and retention.
✔ Ensure license & log optimization for cost efficiency.
✔ Integrate Splunk with 3rd-party tools (ELK, DataDog, etc.).
✔ Develop automation scripts (Python/Bash/PowerShell).
✔ Create technical documentation (HLD, LLD, Runbooks).

Skills Required:
🔹 Expert in Splunk (Architecture, Deployment, Troubleshooting)
🔹 Strong SIEM & Log Management Knowledge
🔹 Linux/Unix Administration
🔹 Scripting (Python, Bash, PowerShell)
🔹 Experience with ELK/DataDog
🔹 Understanding of German Data Security Standards (GDPR/Data Parsimony)

Why Join Us?
- Opportunity to work with cutting-edge security tools.
- Hybrid work model (Gurgaon-based).
- Collaborative & growth-oriented environment.
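The "log collection, parsing, normalization" responsibility above boils down to turning raw event lines into structured, consistently named fields. A minimal Python sketch, assuming a syslog-style input format (real sourcetypes vary, so treat the pattern as a sample):

```python
import re

# Assumed raw format; actual device logs differ per vendor.
LINE = "Jun  3 10:12:45 fw01 dropbear[231]: Failed login attempt for root from 10.1.2.3"

PATTERN = re.compile(
    r"(?P<timestamp>\w{3}\s+\d+\s[\d:]+)\s"
    r"(?P<host>\S+)\s"
    r"(?P<process>[\w-]+)\[(?P<pid>\d+)\]:\s"
    r"(?P<message>.*)"
)

def normalize(line: str) -> dict:
    m = PATTERN.match(line)
    if not m:
        return {"raw": line, "parse_ok": False}
    event = m.groupdict()
    event["parse_ok"] = True
    # Normalization step: map free-text messages onto a common action field.
    event["action"] = "login_failure" if "Failed login" in event["message"] else "other"
    return event

print(normalize(LINE))
```

In Splunk itself this logic would live in props.conf/transforms.conf field extractions; the script shows the same idea in standalone form.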

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

Role: Cloud and Observability Engineer
Experience: 3-6+ Years
Location: Gurugram
To Apply: https://forms.gle/mu8BgX7j5PTKF1Lz5

About the Job
Coralogix is a modern, full-stack observability platform transforming how businesses process and understand their data. Our unique architecture powers in-stream analytics without reliance on expensive indexing or hot storage. We specialize in comprehensive monitoring of logs, metrics, traces and security events with features such as APM, RUM, SIEM, Kubernetes monitoring and more, all enhancing operational efficiency and reducing observability spend by up to 70%. Coralogix is rebuilding the path to observability using a real-time streaming analytics pipeline that provides monitoring, visualization, and alerting capabilities without the burden of indexing. By enabling users to define different data pipelines per use case, we provide deep Observability and Security insights, at an infinite scale, for less than half the cost.

We are looking for a Customer Success Engineer to join our highly experienced global team. The Customer Success Engineer role embodies the critical intersection of technical expertise and a focus on customer satisfaction. This role is tasked with helping Coralogix customers by answering technical questions, advising on solution architecture, and ensuring successful adoption of the Coralogix Platform.

About The Position
As a Cloud and Observability Engineer, you will play a critical role in ensuring a smooth transition of customers’ monitoring and observability infrastructure. Your expertise in various other observability tools, coupled with a strong understanding of DevOps, will be essential in successfully migrating alerts and dashboards through creating extension packages and enhancing the customer's monitoring capabilities. You will collaborate with cross-functional teams, understand their requirements, design migration & extension strategies, execute the migration process, and provide training and support throughout the engagement.

Responsibilities:
- Extension Delivery: Build & enhance quality extension packages for alerts, dashboards and parsing rules in the Coralogix Platform to improve the monitoring experience for key services using our platform. This entails: research related to building world-class extensions, including for container technology, services from cloud service providers, etc.; building related alerts and dashboards in Coralogix, validating their accuracy & consistency and creating their detailed overviews and documentation; configuring parsing rules in Coralogix using regex to structure the data as per requirements; building packages as per Coralogix methodology and standards and automating the ongoing process using scripting; and supporting internal stakeholders and customers with queries, issues and feedback on deployed extensions.
- Migration Delivery: Help migrate customer alerts, dashboards and parsing rules from leading competitive observability and security platforms to Coralogix.
- Knowledge Management: Build, maintain and evolve documentation covering all aspects of extensions and migration. Conduct training sessions for internal stakeholders and customers on all aspects of the platform functionality (alerts, dashboards, parsing, querying, etc.), migration process & techniques, and extension content. Collaborate closely with internal stakeholders and customers to understand their specific monitoring needs, gather requirements, and ensure alignment during the extension building process.

Professional Experience:
- Minimum 3+ years of experience as a Systems Engineer, DevOps Engineer, or similar roles, with a focus on monitoring, alerting, and observability solutions.
- Cloud Technology Experience: 2+ years of hands-on experience with and understanding of cloud and container technologies (GCP/Azure/AWS + K8s/EKS/GKE/AKS). Cloud Service Provider DevOps certifications would be a plus.
- Observability Expertise: Good knowledge and hands-on experience with 2 or more observability platforms, including alert creation, dashboard creation, and infrastructure monitoring. Researching the latest industry trends is part of the scope.
- Deployments & Automation: Good understanding of CI/CD with at least one deployment and version control tool. Engineers need to package alerts and dashboards as extension packs on an ongoing basis.
- Grafana & PromQL Proficiency: Basic understanding and practical experience with PromQL, Prometheus's query language, for querying metrics and creating custom dashboards. You would also need to learn DataPrime and Lucene syntax on the job.
- Troubleshooting Skills: Excellent problem-solving and debugging skills to diagnose issues, identify root causes, and propose effective solutions.
- Communication Skills: Strong English verbal and written communication skills to collaborate with the customer's cross-functional teams, deliver training sessions, and create clear technical documentation.
- Analytical Thinking: Ability to analyze complex systems, identify inefficiencies or gaps, and propose optimized monitoring solutions.
- Availability: Ability to work across US and European timezones. This is a work-from-office role.

Cultural Fit
We’re seeking candidates who are hungry, humble, and smart. Coralogix fosters a culture of innovation and continuous learning, where team members are encouraged to challenge the status quo and contribute to our shared mission. If you thrive in dynamic environments and are eager to shape the future of observability solutions, we’d love to hear from you. Coralogix is an equal opportunity employer and encourages applicants from all backgrounds to apply.
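The extension-delivery workflow above amounts to validating alert definitions and bundling them into a versioned package. A rough Python sketch of that automation step; the pack schema and field names are invented, since Coralogix's real extension format is not described here:

```python
import json
from pathlib import Path

# Hypothetical pack layout, for illustrating the validate-then-bundle step only.
REQUIRED_ALERT_KEYS = {"name", "query", "severity"}

def build_pack(alerts: list[dict], out_dir: str = "dist") -> Path:
    # Validate every alert definition before it can ship.
    for alert in alerts:
        missing = REQUIRED_ALERT_KEYS - alert.keys()
        if missing:
            raise ValueError(f"Alert {alert.get('name', '?')} missing fields: {missing}")

    pack = {"schema_version": "1.0", "alerts": alerts}
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    target = out / "k8s-extension-pack.json"
    target.write_text(json.dumps(pack, indent=2))
    return target

alerts = [{"name": "PodCrashLoop", "query": "reason:CrashLoopBackOff", "severity": "critical"}]
print(build_pack(alerts))
```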

Posted 3 weeks ago

Apply

0.0 - 2.0 years

0 Lacs

Kollam, Kerala

On-site

Source: Indeed

Amrita Vishwa Vidyapeetham, Bengaluru Campus is inviting applications from qualified candidates for the post of Flutter Developer. For details contact: paikrishnang@am.amrita.edu

Job Title: Flutter Developer
Location: Kollam, Kerala
Required Number: 2

Job description
- App Development: Develop and maintain cross-platform mobile applications using Flutter and Dart. Build responsive and pixel-perfect UIs based on Figma/Adobe XD/UI designs. Implement new features and functionalities based on project requirements.
- State Management: Use appropriate state management techniques such as BLoC, Provider, Riverpod, or GetX. Maintain scalable and clean state handling across screens and modules.
- API Integration: Integrate RESTful APIs and handle data fetching, parsing, and error handling. Use tools like Dio or HTTP for network calls.
- Code Quality: Write clean, maintainable, and testable Dart code. Follow version control best practices using Git.
- Testing and Debugging: Conduct unit testing and widget testing. Debug and fix performance, UI, and logic issues during development and after release.
- Build & Deployment: Understand how to build, sign, and release Android (APK/AAB) and iOS apps. Collaborate with seniors for publishing apps to the Play Store or App Store.
- Documentation: Maintain proper documentation of code and app architecture. Write README files and API usage notes where applicable.
- Learning & Improvement: Stay updated with Flutter releases and best practices. Actively learn and apply new tools or libraries relevant to the project.

Qualification: BTech/BCA/MCA/MTech
Job category: Project
Experience: 1-2 years
Last date to apply: June 20, 2025

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

Source: Indeed

Noida, Uttar Pradesh, India | Job ID 766940

Join our Team

About this opportunity:
We are looking for a skilled Telecom Billing Mediation Specialist to manage and optimize the mediation process between network elements and the postpaid billing system.

What you will do:
- Implement rules for data filtering, deduplication, and enrichment before sending to the billing system.
- Work with network, IT, and billing teams to ensure smooth integration between mediation and billing platforms.
- Optimize mediation rules to handle high-volume CDR processing efficiently.
- Perform data reconciliation between network elements, mediation, and billing systems.
- Investigate and resolve discrepancies in mediation and billing data.
- Monitor system health, troubleshoot issues, and ensure high availability of mediation services.
- Conduct root cause analysis (RCA) for mediation-related issues and implement corrective actions.

You will bring:
- Hands-on experience with billing mediation platforms (e.g. Amdocs Mediation, IBM, HP Openet, etc.)
- Proficiency in SQL, Linux/Unix scripting, and data transformation tools.
- Familiarity with ETL processes, data parsing, and API integrations.
- Solid understanding of telecom postpaid billing systems (e.g., Amdocs, HP, Oracle BRM).
- Knowledge of network elements (MSC, MME, SGSN, GGSN, PCRF, OCS, IN) and their impact on mediation.
- Awareness of revenue assurance and fraud detection in telecom billing.

Key Qualification:
- Bachelor’s degree in Computer Science, E.C.E., or Telecommunications.
- 10+ years of experience in telecom billing mediation.
- Experience in cloud-based mediation solutions (AWS, Azure, GCP) is a plus.
- Knowledge of 5G mediation and real-time charging architectures is an advantage.

What happens once you apply?
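The filtering, deduplication, and enrichment rules described above can be pictured as a small per-record pipeline between the network feed and the billing system. A simplified Python sketch, with an invented CDR schema (cdr_id, msisdn, start_time, duration_sec):

```python
import csv
from datetime import datetime

seen_ids: set[str] = set()

def mediate(record: dict) -> dict | None:
    """Filter, deduplicate, and enrich one CDR before billing hand-off."""
    # Filter: drop zero-duration records that should not be billed.
    if int(record["duration_sec"]) <= 0:
        return None
    # Deduplicate on the CDR identifier.
    if record["cdr_id"] in seen_ids:
        return None
    seen_ids.add(record["cdr_id"])
    # Enrich: derive a rating date in the format the billing system expects.
    record["rating_date"] = datetime.strptime(
        record["start_time"], "%Y%m%d%H%M%S"
    ).date().isoformat()
    return record

# Hypothetical input file with columns cdr_id,msisdn,start_time,duration_sec.
with open("cdrs.csv", newline="") as f:
    billable = [r for r in (mediate(row) for row in csv.DictReader(f)) if r]
print(f"{len(billable)} billable CDRs")
```

Real mediation platforms express these as configurable rule chains, but the filter/dedupe/enrich stages are the same.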

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana

On-site

Source: Indeed

- 3+ years of building models for business application experience
- PhD, or Master's degree and 4+ years of CS, CE, ML or related field experience
- Experience in patents or publications at top-tier peer-reviewed conferences or journals
- Experience programming in Java, C++, Python or related language
- Experience in any of the following areas: algorithms and data structures, parsing, numerical optimization, data mining, parallel and distributed computing, high-performance computing

Interested in building something new? Join the Amazon Autos team on an exhilarating journey to redefine the vehicle shopping experience. This is an opportunity to be part of Amazon's new business ventures. Our goal is to create innovative automotive discovery and shopping experiences on Amazon, providing customers with greater convenience and a wider selection. You'll work in a creative, fast-paced, and entrepreneurial environment at the center of Amazon's innovation. As a key member, you'll play a pivotal role in helping us achieve our mission. We are looking for a highly accomplished Applied Science professional to drive our science strategy, foster a culture of data-driven decision-making, and drive impactful business outcomes through advanced state-of-the-art science methodologies. If you're enthusiastic about innovating and delivering exceptional shopping experiences to customers, thrive on new challenges, and excel at solving complex problems using top-notch ML models, LLM and GenAI techniques, then you're the perfect candidate for this role. Strong business acumen and interpersonal skills are a must, as you'll work closely with business owners to understand customer needs and design scalable solutions. Join us on this exhilarating journey and be part of redefining the vehicle shopping experience.

Key job responsibilities
As an Applied Scientist in Amazon Autos, you will:
- Shape the roadmap and strategy for applying science to solve customer problems in the Amazon AutoStore domain.
- Drive big picture innovations with clear roadmaps for intermediate delivery.
- Apply your skills in areas such as deep learning and reinforcement learning while building scalable solutions for business problems.
- Produce and deliver models that help build best-in-class customer experiences, and build systems that allow us to deploy these models to production with low latency and high throughput.
- Utilize your Generative AI, time series and predictive modeling skills, and creative problem-solving skills to drive new projects from ideation to implementation.
- Interface with business customers, gathering requirements and delivering science solutions.
- Collaborate with cross-functional teams, including software engineers, data scientists, and product managers, to define project requirements, establish success metrics, and deliver high-quality solutions.
- Effectively communicate complicated machine learning concepts to multiple partners.
- Research new and innovative machine learning approaches.

A day in the life
In this role, you will be part of a multidisciplinary team working on one of Amazon's newest business ventures. As a key member, you will collaborate closely with engineering, product, design, operations, and business development to bring innovative solutions to our customers. Your science expertise will be leveraged to research and deliver novel solutions to existing problems, explore emerging problem spaces, and create new knowledge. You will invent and apply state-of-the-art technologies, such as large language models, machine learning, natural language processing, and computer vision, to build next-generation solutions for Amazon. You'll publish papers, file patents, and work closely with engineers to bring your ideas to production.

About the team
This is a critical role for the Amazon Autos team, with a vision to create innovative automotive discovery and shopping experiences on Amazon, providing customers better convenience and more selection. We’re collaborating with other experienced teams at Amazon to define the future of how customers research and shop for cars online.

- Experience using Unix/Linux
- Experience in professional software development
- Experience building complex software systems, especially involving deep learning, machine learning and computer vision, that have been successfully delivered to customers

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Responsibilities
As a Web Scraper, your role is to apply your knowledge set to fetch data from multiple online sources:
- Develop highly reliable web scrapers and parsers for various websites.
- Extract structured/unstructured data and store it in SQL/NoSQL data stores.
- Work closely with Project/Business/Research teams to provide scraped data for analysis.
- Maintain the scraping projects delivered to production.
- Develop frameworks for automating and maintaining a constant flow of data from multiple sources.
- Work independently with minimum supervision.
- Develop a deep understanding of the data sources on the web and know exactly how, when, and which data to scrape, parse, and store.

Required Skills And Experience
- Experience as a Web Scraper of 3 to 7 years.
- Proficient knowledge of Python and working knowledge of web crawling/web scraping in Python using Requests, BeautifulSoup or urllib, and Selenium, Playwright.
- Strong knowledge of basic Linux commands for system navigation, management, and troubleshooting.
- Expertise in proxy usage to ensure secure and efficient network operations.
- Experience with captcha-solving techniques for seamless automation and data extraction.
- Experience with data parsing: strong knowledge of regular expressions, HTML, CSS, DOM, and XPath. Knowledge of JavaScript would be a plus.
- Able to access, manipulate, and transform data from a variety of database and flat file sources. MongoDB & MySQL skills are essential.
- Able to develop reusable code-based scraping products which can be used by others.
- Git knowledge is mandatory for version control and collaborative development workflows.
- Experience handling cloud servers on platforms like AWS, GCP, and LEAPSWITCH for scalable and reliable infrastructure management.
- Ability to ask the right questions and deliver the right results in a way that is understandable and usable to your clients.
- A track record of digging into tough problems, attacking them from different angles, and bringing innovative approaches to bear is highly desirable.
- Must be capable of self-teaching new techniques.
(ref:hirist.tech)
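To make the XPath/regex/proxy combination above concrete, here is a compact Python sketch using requests and lxml; the proxy address, URL, and page structure are placeholders, and any real crawl must honor the target site's terms:

```python
import re
import requests
from lxml import html

# Placeholder proxy and target.
PROXIES = {"http": "http://user:pass@proxy.example.com:8080",
           "https": "http://user:pass@proxy.example.com:8080"}

def scrape_listings(url: str) -> list[dict]:
    resp = requests.get(url, proxies=PROXIES, timeout=15,
                        headers={"User-Agent": "Mozilla/5.0"})
    tree = html.fromstring(resp.text)

    items = []
    # XPath pulls each listing row (structure assumed for illustration).
    for node in tree.xpath('//div[@class="listing"]'):
        title = node.xpath('string(.//h2)').strip()
        price_text = node.xpath('string(.//span[@class="price"])')
        price = re.search(r"[\d,]+", price_text)  # regex-based field extraction
        items.append({"title": title,
                      "price": int(price.group().replace(",", "")) if price else None})
    return items

print(scrape_listings("https://example.com/listings"))
```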

Posted 3 weeks ago

Apply

1.0 - 3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Responsibilities
As a Web Scraper, your role is to apply your knowledge set to fetch data from multiple online sources:
- Develop highly reliable web scrapers and parsers for various websites.
- Extract structured/unstructured data and store it in SQL/NoSQL data stores.
- Work closely with Project/Business/Research teams to provide scraped data for analysis.
- Maintain the scraping projects delivered to production.
- Develop frameworks for automating and maintaining a constant flow of data from multiple sources.
- Work independently with minimum supervision.
- Develop a deep understanding of the data sources on the web and know exactly how, when, and which data to scrape, parse, and store.

Required Skills And Experience
- Experience as a Web Scraper of 1 to 3 years.
- Proficient knowledge of Python and working knowledge of web crawling/web scraping in Python using Requests, BeautifulSoup or urllib, and Selenium, Playwright.
- Strong knowledge of basic Linux commands for system navigation, management, and troubleshooting.
- Expertise in proxy usage to ensure secure and efficient network operations.
- Experience with captcha-solving techniques for seamless automation and data extraction.
- Experience with data parsing: strong knowledge of regular expressions, HTML, CSS, DOM, and XPath. Knowledge of JavaScript would be a plus.
- Able to access, manipulate, and transform data from a variety of database and flat file sources. MongoDB & MySQL skills are essential.
- Able to develop reusable code-based scraping products which can be used by others.
- Git knowledge is mandatory for version control and collaborative development workflows.
- Experience handling cloud servers on platforms like AWS, GCP, and LEAPSWITCH for scalable and reliable infrastructure management.
- Ability to ask the right questions and deliver the right results in a way that is understandable and usable to your clients.
- A track record of digging into tough problems, attacking them from different angles, and bringing innovative approaches to bear is highly desirable.
(ref:hirist.tech)
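Where the listing above mentions Playwright, the typical use is rendering JavaScript-heavy pages that plain HTTP requests cannot see. A minimal Python sketch with a placeholder URL and selector:

```python
from playwright.sync_api import sync_playwright

def render_and_extract(url: str) -> list[str]:
    """Load a JS-rendered page in headless Chromium and pull element text."""
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto(url, wait_until="networkidle")
        # These elements only exist after client-side rendering finishes.
        titles = page.locator("h2.job-title").all_inner_texts()
        browser.close()
    return titles

if __name__ == "__main__":
    print(render_and_extract("https://example.com/jobs"))
```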

Posted 3 weeks ago

Apply

3.0 - 6.0 years

12 - 18 Lacs

Pune

Work from Office

Source: Naukri

Job Description: We’re searching for a Senior Security Engineer to assist our 24x7 managed security operations center. This role is in the Integration Department, responsible for the strategic, technical, and operational direction of the Integration Team.

Responsibilities:
• IBM QRadar/Sentinel/Datadog integration and content management; Event Collector deployment/upgradation.
• Troubleshooting skills at all layers of the OSI Model.
• Onboard all standard devices to QRadar, such as Windows Security Events, firewalls, antivirus, proxy etc.
• Onboard non-standard devices by researching the product and coordinating with different teams, such as application onboarding or onboarding new security products.
• Develop and deploy connectors and scripts for log collection for cloud-based solutions.
• Detailed validation of parsing and normalization of logs before handover to the SOC team will be a day-to-day job.
• Coordinate between customer and internal teams for issues related to log collection.
• Make sure that various teams have completed their tasks, such as log validation, Log Source Not Reporting (LSNR) automation, and content management, before the log source is in production.
• Troubleshooting API-based log sources.
• Documentation of integrations and versioning.

Essential Skills:
• Prior SIEM administration and integration experience (QRadar, Splunk, Datadog, Azure Sentinel)
• Network and endpoint device integration and administration
• Knowledge of device integration: log and flow collection
• Knowledge of regular expressions and scripting languages (e.g. Bash, Python, PowerShell), API implementation and development
• Knowledge of parser creation and maintenance
• Knowledge of cloud technologies and implementation
• Excellent verbal and written communication
• Hands-on experience in networking, security solutions, and endpoint administration and operations

Additional Desired Skills:
• Excel, formulation
• Documentation and presentation
• Quick response to issues and mail, with prioritization
• Ready to work in a 24x7 environment

Education Requirements & Experience:
• BE/B.Tech, BCA
• Experience Level: 3+ Years
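The "detailed validation of parsing and normalization" step above is essentially a required-fields check over parsed events before a log source is promoted to production. A toy Python sketch; the field list and sample events are invented:

```python
# Required normalized fields a parsed event must carry (assumed for illustration).
REQUIRED_FIELDS = {"event_time", "src_ip", "username", "action"}

def validate_parsed_events(events: list[dict]) -> dict:
    report = {"total": len(events), "failed": []}
    for i, event in enumerate(events):
        # A field counts as present only if it is non-empty.
        present = {k for k, v in event.items() if v not in (None, "")}
        missing = REQUIRED_FIELDS - present
        if missing:
            report["failed"].append({"index": i, "missing": sorted(missing)})
    report["pass_rate"] = 1 - len(report["failed"]) / max(report["total"], 1)
    return report

sample = [
    {"event_time": "2025-05-20T10:00:00Z", "src_ip": "10.0.0.5",
     "username": "alice", "action": "logon"},
    {"event_time": "2025-05-20T10:00:02Z", "src_ip": "",
     "username": "bob", "action": "logon"},
]
print(validate_parsed_events(sample))  # flags the record with an empty src_ip
```

A real hand-off would pull parsed events from the SIEM's API instead of an in-memory list, but the acceptance check is the same shape.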

Posted 3 weeks ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

Remote

Source: LinkedIn

🚀 Software Engineer Intern (Remote-First | Hybrid Option | Summer 2025)
You bring the fundamentals. We’ll hand you the fire.

🧠 About Hirebix
Hirebix is not another job board or agency. We’re rebuilding technical hiring from the ground up—with code, not spreadsheets. We’re building an AI-powered recruitment SaaS to help startups filter noise, run async tech rounds, and cut hiring time by 50%. We’re early-stage, product-obsessed, and moving fast. This is your chance to build the core engine of what could power hundreds of tech teams.

💻 Internship Structure
This is a remote-first, hybrid-enabled internship designed for:
🧑‍💻 Students or fresh grads looking for summer internships or a first break into product building.
📍 A 3-month unpaid internship focused on skill-building, mentorship, and contribution.
💼 High-performing interns will be offered a full-time paid role post internship, based on contribution.

🔨 What You’ll Work On
No fake tasks. No shadowing. You’ll work directly with our founding engineer(s) to:
- Architect and build modules of the Hirebix recruitment SaaS (backend-heavy)
- Integrate AI/LLM features for resume parsing, feedback generation, and interview simulation
- Write scalable, modular code (Python, Node.js, FastAPI, etc.)

🧠 What We’re Looking For
We don’t care about your GPA. We do care if you’ve:
✅ Strong fundamentals in Data Structures, Algorithms, and OOP
✅ Built anything end-to-end (solo or in a team—hackathons count!)
✅ Explored Python, Node.js, or any backend stack
✅ Curiosity to work with AI/ML models and automate things
✅ Hunger to learn, fail fast, and ask better questions

Bonus points:
- You’ve dabbled with Docker, APIs, MongoDB, or FastAPI
- You’ve tried building your own bots, tools, or scrapers

📍 Logistical Bits
Mode: Remote-first (Hybrid available in Gurugram if desired)
Duration: 3 Months (May–Aug 2025 preferred)
Stipend: Unpaid during internship
Post-internship: Performance-based full-time paid role opportunity
Certificate + LOR: Yes, for all sincere contributors
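As a taste of the resume-parsing module mentioned above, a minimal FastAPI stub might look like the following. The endpoint name is hypothetical, and the regex-only "parser" is a toy stand-in for what would really be an LLM call:

```python
import re
from fastapi import FastAPI, UploadFile

app = FastAPI()

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

@app.post("/parse-resume")
async def parse_resume(file: UploadFile):
    """Toy parser: a real pipeline would hand the text to an LLM instead."""
    text = (await file.read()).decode("utf-8", errors="ignore")
    email = EMAIL_RE.search(text)
    return {
        "filename": file.filename,
        "email": email.group() if email else None,
        "word_count": len(text.split()),
    }

# Run with: uvicorn resume_api:app --reload
```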

Posted 3 weeks ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

As the global leader in high-speed connectivity, Ciena is committed to a people-first approach. Our teams enjoy a culture focused on prioritizing a flexible work environment that empowers individual growth, well-being, and belonging. We’re a technology company that leads with our humanity—driving our business priorities alongside meaningful social, community, and societal impact.

BPI is built on the Blue Planet Cloud Native Platform (CNP), a modern OSS that converges design, delivery, and assurance software applications to eliminate inefficient operational silos and helps streamline the introduction and delivery of innovative services across network domains and vendors. We are looking for a software engineer who will contribute to developing industry-leading dynamic network inventory software.

Job Requirements
- Software development experience in Java. Extremely competent in Java, with emphasis on Core Java (OOP concepts, design skills, multi-threading, concurrency, the Collection framework, exception handling and debugging skills), Java Swing, JavaFX, JAXB, XML parsing techniques, socket programming, etc.
- Familiarity with relational and non-relational database concepts. Experience in writing queries on databases like Oracle and Neo4j.
- Familiarity with UI technologies such as Angular.
- Excellent troubleshooting/debugging skills.
- Excellent problem-solving skills.
- Strong knowledge of operating systems: Linux, Mac, and Windows.
- Strong commitment to product excellence and quality.
- Ability to resolve complex issues that may require design trade-offs.
- Bachelor’s/Master of Engineering in computer science or a related discipline.
- Excellent written and verbal communication skills; able to collaborate effectively with multiple teams across geographically diverse areas.

Not ready to apply? Join our Talent Community to get relevant job alerts straight to your inbox.

At Ciena, we are committed to building and fostering an environment in which our employees feel respected, valued, and heard. Ciena values the diversity of its workforce and respects its employees as individuals. We do not tolerate any form of discrimination. Ciena is an Equal Opportunity Employer, including disability and protected veteran status. If contacted in relation to a job opportunity, please advise Ciena of any accommodation measures you may require.

Posted 3 weeks ago

Apply

7.0 - 10.0 years

2 - 3 Lacs

Gurgaon

On-site

Source: Glassdoor

Experience: 7 - 10 Years
Location: Gurgaon / Hybrid Mode
CTC to be offered: mention your current & expected CTC
Notice Period: Immediate to 30 days

Key Skills: SPLUNK, SIEM DOMAIN, BACKEND OPERATIONS, UF, HF, SH, INDEXER CLUSTER, LOG MANAGEMENT, LOG COLLECTION, PARSING, NORMALIZATION, RETENTION PRACTICES, LOGS/LICENSE OPTIMIZATION, DESIGNING, DEPLOYMENT & IMPLEMENTATION, DATA PARSIMONY, GERMAN DATA SECURITY STANDARDS, SPLUNK LOGGING INFRASTRUCTURE, OBSERVABILITY TOOLS, ELK, DATADOG, NETWORK ARCHITECTURE, LINUX ADMINISTRATION, SYSLOG, PYTHON, POWERSHELL, OR BASH, OEM SIEM, HLD, LLD, IMPLEMENTATION GUIDE, OPERATION MANUALS

Job Description:
As Lead Splunk, your role and responsibilities would include:
- Hands-on experience in the SIEM domain
- Expert knowledge of Splunk backend operations (UF, HF, SH and Indexer Cluster) and architecture
- Expert knowledge of Log Management and Splunk SIEM; understanding of log collection, parsing, normalization, and retention practices
- Expert in logs/license optimization techniques and strategy
- Good understanding of designing, deployment & implementation of a scalable SIEM architecture
- Understanding of data parsimony as a concept, especially in terms of German data security standards
- Working knowledge of integrating Splunk logging infrastructure with 3rd-party observability tools (e.g. ELK, DataDog etc.)
- Experience in identifying security and non-security logs and applying adequate filters / re-routing the logs accordingly
- Expert in understanding network architecture and identifying the components of impact
- Expert in Linux administration
- Proficient in working with Syslog
- Proficiency in scripting languages like Python, PowerShell, or Bash to automate tasks
- Expertise with OEM SIEM tools, preferably Splunk
- Experience with open-source SIEM/log storage solutions like ELK or Datadog
- Very good with documentation of HLD, LLD, implementation guides and operation manuals

Note: (i) Our client is looking for immediate & early joiners. (ii) Having a LinkedIn profile is a must. (iii) Being an immediate & high-priority requirement, interested candidates can share their resumes with photograph in Word doc format.
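The security/non-security log identification and re-routing called out above reduces to a classification step in front of the indexers. A schematic Python sketch; the marker keywords and index names are invented, and in practice this routing would live in Splunk configuration rather than a script:

```python
# Stand-in markers for what counts as a security-relevant event.
SECURITY_MARKERS = ("authentication", "firewall", "ids", "malware", "privilege")

def route(event: dict) -> str:
    """Decide which index (and therefore retention/licensing tier) an event goes to."""
    text = f'{event.get("sourcetype", "")} {event.get("message", "")}'.lower()
    if any(marker in text for marker in SECURITY_MARKERS):
        return "siem_index"   # longer retention, full parsing
    return "ops_index"        # cheaper storage, shorter retention

events = [
    {"sourcetype": "cisco:asa", "message": "FIREWALL deny tcp 10.0.0.1"},
    {"sourcetype": "app:web", "message": "GET /health 200"},
]
for e in events:
    print(route(e), "<-", e["message"])
```

Splitting feeds this way is also the main lever for the license and log-volume optimization the role emphasizes.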

Posted 3 weeks ago

Apply

0 years

3 - 5 Lacs

Chennai

On-site

Source: Glassdoor

Primary Responsibilities:
- Design and develop AI-driven web applications using Streamlit and LangChain.
- Implement multi-agent workflows with LangGraph.
- Integrate Claude 3 (via AWS Bedrock) into intelligent systems for document and image processing.
- Work with FAISS for vector search and similarity matching.
- Develop document integration solutions for PDF, DOCX, XLSX, PPTX, and image-based formats.
- Implement OCR and summarization features using EasyOCR, PyMuPDF, and AI models.
- Create features such as spell-check, chatbot accuracy tracking, and automatic re-training pipelines.
- Build secure apps with SSO authentication, transcript downloads, and reference link generation.
- Integrate external platforms like Confluence, SharePoint, ServiceNow, Veeva Vault, Outlook, G.Net/G.Share, and JIRA.
- Collaborate on architecture, performance optimization, and deployment.

Required Skills:
- Strong expertise in Streamlit, LangChain, LangGraph, and Claude 3 (AWS Bedrock).
- Hands-on experience with boto3, FAISS, EasyOCR, and PyMuPDF.
- Advanced skills in document parsing and image/video-to-text summarization.
- Proficient in modular architecture design and real-time AI response systems.
- Experience in enterprise integration with tools like ServiceNow, Confluence, Outlook, and JIRA.
- Familiar with chatbot monitoring and retraining strategies.

Secondary Skills:
- Working knowledge of PostgreSQL, JSON, and file I/O with Python libraries like os, io, time, datetime, and typing.
- Experience with dataclasses and numpy for efficient data handling and numerical processing.

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
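For the FAISS vector-search requirement above, the core API is small: build an index, add embedding vectors, query nearest neighbors. A self-contained sketch with random vectors standing in for real document embeddings:

```python
import faiss
import numpy as np

# Toy corpus: 1,000 random 384-dim vectors standing in for document embeddings.
dim = 384
rng = np.random.default_rng(0)
corpus = rng.random((1000, dim), dtype="float32")

index = faiss.IndexFlatL2(dim)   # exact L2 search; IVF/HNSW variants scale further
index.add(corpus)

query = rng.random((1, dim), dtype="float32")
distances, ids = index.search(query, 5)
print("nearest document ids:", ids[0], "distances:", distances[0])
```

In the application described above, the corpus vectors would come from an embedding model and the returned ids would map back to parsed document chunks.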

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Bengaluru

On-site

Source: Glassdoor

Company Description
Bosch Global Software Technologies Private Limited is a 100% owned subsidiary of Robert Bosch GmbH, one of the world's leading global suppliers of technology and services, offering end-to-end Engineering, IT and Business Solutions. With over 28,200+ associates, it’s the largest software development center of Bosch outside Germany, indicating that it is the Technology Powerhouse of Bosch in India, with a global footprint and presence in the US, Europe and the Asia Pacific region.

Job Description
Roles & Responsibilities:
The engineer’s role would be to support testing efforts for the XR software stack and adjacent technologies like game/3D app dev kits (Unity 3D, Unreal Engine), connectivity (USB, Wi-Fi 6, Wi-Fi 6E), and multimedia (video, camera, graphics, audio, display). Primary responsibilities will be to write and execute test cases and device automation; evaluate technologies, systems, and devices through testing, logging and analysis; create test environments using software and hardware tools; develop test procedures, execute tests, and isolate problems; develop automation systems for executing tests and parsing results; develop app/test content for exercising new features; and actively study user experience to improve the customer experience of Qualcomm multimedia solutions across ARM-based chipset products.

Qualifications
Educational qualification: Bachelor’s or Master’s degree in Computer Science, Electronics, Electrical Engineering, or a related field.
Experience: 4-8 Years
Skills:
- Experience working with Windows, Linux and Android.
- Expertise in Windows and Linux OS concepts and tools – bat scripts, Linux commands.
- Expert in C/C++/Python programming.
- Experience in game/3D app dev kits (Unity 3D, Unreal Engine), connectivity (USB, Wi-Fi 6, Wi-Fi 6E), multimedia (video, camera, graphics, audio, display).
- Outstanding problem-solving skills.
- Excellent communication and team-working skills.
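The "develop automation systems for executing tests and parsing results" duty above usually starts with a log-parsing pass like the following Python sketch; the [RESULT] log line format is assumed for illustration:

```python
import re
from collections import Counter

# Assumed log shape: "[RESULT] <suite>.<case>: PASS|FAIL (<ms> ms)".
RESULT_RE = re.compile(r"\[RESULT\]\s+(?P<case>[\w.]+):\s+(?P<verdict>PASS|FAIL)")

def summarize(log_text: str) -> Counter:
    verdicts = Counter()
    for m in RESULT_RE.finditer(log_text):
        verdicts[m["verdict"]] += 1
        if m["verdict"] == "FAIL":
            print("failing case:", m["case"])
    return verdicts

log = """\
[RESULT] wifi6e.throughput: PASS (833 ms)
[RESULT] usb.enum_speed: FAIL (120 ms)
[RESULT] display.vsync: PASS (97 ms)
"""
print(summarize(log))  # Counter({'PASS': 2, 'FAIL': 1})
```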

Posted 3 weeks ago

Apply

0 years

0 - 0 Lacs

Bengaluru

Remote

Source: Glassdoor

Job Description:
We are seeking a creative and independent Web Crawler Developer to join our Seattle-based construction team. The ideal candidate will have a keen eye for detail, a passion for problem-solving, and the ability to think outside the box to develop sophisticated web scraping solutions.

Responsibilities:
- Design, implement, and maintain web crawlers that can effectively extract data from various websites.
- Analyze web page structures and adapt crawlers to extract relevant information efficiently.
- Monitor crawler performance and make necessary adjustments to ensure optimal data collection.
- Work independently to identify new opportunities for data extraction and offer insightful recommendations.
- Ensure compliance with legal and ethical standards for data scraping.
- Collaborate with data analysts and other team members to understand data needs and improve data accuracy.
- Keep up-to-date with the latest web scraping technologies and best practices.

Qualifications:
- Strong experience with web scraping tools and frameworks (e.g., Scrapy, BeautifulSoup, Selenium, etc.).
- Proficiency in programming languages such as Python, Java, or others relevant to web crawling.
- Experience with handling and parsing different data formats like HTML, JSON, XML, etc.
- Excellent problem-solving skills and the ability to think outside the box.
- Ability to work independently and manage multiple tasks efficiently.
- Solid understanding of web protocols (HTTP, HTTPS) and web technologies.
- Familiarity with version control systems, preferably Git.
- Knowledge of data privacy laws and ethical web scraping practices.

Preferred:
- Experience with cloud services like AWS or Azure for deploying and managing web crawlers.
- Understanding of databases and data storage solutions.
- Previous experience in a similar role or related projects.

Job Type: Contractual / Temporary
Contract length: 2 months
Pay: ₹76,000.00 - ₹80,000.00 per month
Benefits: Work from home
Supplemental Pay: Performance bonus
Expected Start Date: 03/06/2025
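A skeleton of the Scrapy-based crawler this listing asks for might look like the following; the start URL and selectors are placeholders, and the settings reflect the compliance-minded crawling the job emphasizes:

```python
import scrapy

class ListingSpider(scrapy.Spider):
    """Skeleton spider; start URL and selectors are placeholders."""
    name = "listings"
    start_urls = ["https://example.com/projects"]
    custom_settings = {
        "ROBOTSTXT_OBEY": True,   # stay on the right side of site policy
        "DOWNLOAD_DELAY": 1.0,    # polite crawl rate
    }

    def parse(self, response):
        for row in response.css("div.project-card"):
            yield {
                "title": row.css("h3::text").get(default="").strip(),
                "location": row.css(".location::text").get(),
            }
        # Follow pagination until it runs out.
        next_page = response.css("a.next::attr(href)").get()
        if next_page:
            yield response.follow(next_page, callback=self.parse)
```

Saved as listing_spider.py, it can be run standalone with: scrapy runspider listing_spider.py -O out.json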

Posted 3 weeks ago

Apply

3.0 years

4 - 10 Lacs

Pune

Remote

Source: Glassdoor

We help the world run better

At SAP, we enable you to bring out your best. Our company culture is focused on collaboration and a shared passion to help the world run better. How? We focus every day on building the foundation for tomorrow and creating a workplace that embraces differences, values flexibility, and is aligned to our purpose-driven and future-focused work. We offer a highly collaborative, caring team environment with a strong focus on learning and development, recognition for your individual contributions, and a variety of benefit options for you to choose from.

The SAP HANA Database and Analytics Core engine team is looking for an intermediate or senior developer to contribute to our Knowledge Graph Database System engine development. In this role, you will be designing, developing features, and maintaining our Knowledge Graph engine, which runs inside the SAP HANA in-memory database. At SAP, all members of the engineering team, including management, are hands-on and close to the code. If you think you can thrive in such an environment, and you have the necessary skills and experience, please do not hesitate to apply.

WHAT YOU’LL DO
As a developer, you will have the opportunity to:
- Contribute to hands-on coding, design, and architecture that is best suited for our team size and performance targets.
- Collaborate in a team environment that extends to colleagues in remote locations and from various lines of business within the company.
- Communicate with and guide other teams to construct the best possible queries for their needs.
- Assess new technology, tools, and infrastructure to keep up with the rapid pace of change.
- Embrace lean and agile software development principles.
- Debug, troubleshoot, and communicate with customers about issues with their data models and queries.
- Continually enhance existing skills and seek new areas for personal development.

WHAT YOU BRING
- Bachelor’s degree or equivalent university education in computer science or engineering, with 3-5 years of experience in developing enterprise-class software.
- Experience in development with modern C++.
- Knowledge of database internals such as Query Optimizer/Planner, Query Executor, System Management, Transaction Management, and/or Persistence.
- Knowledge of SQL and graph technologies like RDF/SPARQL.
- Knowledge of the full SDLC and development of tests using Python or other tools.
- Experience designing and developing well-encapsulated, object-oriented code.
- Solution-oriented and open-minded.
- Manage collaboration with sister teams and partner resources in remote locations.
- High service and customer orientation.
- Skilled in process optimization and drives for permanent change.
- Strong in analytical thinking/problem solving.
- Interpersonal skills: team player, proactive networking, results and execution oriented, motivated to work in an international and intercultural environment.
- Excellent oral and written communication skills and presentation skills.

MEET YOUR TEAM
The team is responsible for developing HANA Knowledge Graph, a high-performance graph analytics database system, made available to SAP customers, partners, and various internal groups as part of the HANA Multi Model Database System. It is specifically designed for processing large-scale graph data and executing complex graph queries with high efficiency. HANA Knowledge Graph enables organizations to gain insights from their graph datasets, discover patterns, perform advanced graph analytics, and unlock the value of interconnected data. HANA Knowledge Graph utilizes a massively parallel processing (MPP) architecture to leverage the power of distributed computing. It is built with the W3C web standards specifications of graph data and query language – RDF and SPARQL. The various components of the HANA Knowledge Graph system include Storage, Data Load, Query Parsing, Query Planning and Optimization, Query Execution, Transaction Management, Memory Management, Network Communications, System Management, Data Persistence, Backup & Restore, Performance Tuning, etc. At SAP, HANA Knowledge Graph is set to play a critical role in the development of several AI products.

Bring out your best
SAP innovations help more than four hundred thousand customers worldwide work together more efficiently and use business insight more effectively. Originally known for leadership in enterprise resource planning (ERP) software, SAP has evolved to become a market leader in end-to-end business application software and related services for database, analytics, intelligent technologies, and experience management. As a cloud company with two hundred million users and more than one hundred thousand employees worldwide, we are purpose-driven and future-focused, with a highly collaborative team ethic and commitment to personal development. Whether connecting global industries, people, or platforms, we help ensure every challenge gets the solution it deserves. At SAP, you can bring out your best.

We win with inclusion
SAP’s culture of inclusion, focus on health and well-being, and flexible working models help ensure that everyone – regardless of background – feels included and can run at their best. At SAP, we believe we are made stronger by the unique capabilities and qualities that each person brings to our company, and we invest in our employees to inspire confidence and help everyone realize their full potential. We ultimately believe in unleashing all talent and creating a better and more equitable world. SAP is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to the values of Equal Employment Opportunity and provide accessibility accommodations to applicants with physical and/or mental disabilities. If you are interested in applying for employment with SAP and are in need of accommodation or special assistance to navigate our website or to complete your application, please send an e-mail with your request to the Recruiting Operations Team: Careers@sap.com

For SAP employees: Only permanent roles are eligible for the SAP Employee Referral Program, according to the eligibility rules set in the SAP Referral Policy. Specific conditions may apply for roles in Vocational Training.

EOE AA M/F/Vet/Disability: Qualified applicants will receive consideration for employment without regard to their age, race, religion, national origin, ethnicity, gender (including pregnancy, childbirth, et al), sexual orientation, gender identity or expression, protected veteran status, or disability. Successful candidates might be required to undergo a background verification with an external vendor.

Requisition ID: 396628 | Work Area: Software-Design and Development | Expected Travel: 0 - 10% | Career Status: Professional | Employment Type: Regular Full Time | Additional Locations: #LI-Hybrid.
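Since HANA Knowledge Graph is built on the W3C RDF and SPARQL standards, the data model can be illustrated with a tiny in-memory example using Python's rdflib. This is a stand-in only; the real engine is queried through HANA itself:

```python
from rdflib import Graph, Namespace, RDF

# Toy namespace and triples, invented for illustration.
EX = Namespace("http://example.org/")
g = Graph()
g.add((EX.alice, RDF.type, EX.Engineer))
g.add((EX.alice, EX.worksOn, EX.queryOptimizer))
g.add((EX.bob, RDF.type, EX.Engineer))
g.add((EX.bob, EX.worksOn, EX.persistence))

# SPARQL: find every engineer and the component they work on.
results = g.query("""
    PREFIX ex: <http://example.org/>
    SELECT ?person ?component WHERE {
        ?person a ex:Engineer ;
                ex:worksOn ?component .
    }
""")
for person, component in results:
    print(person, "->", component)
```

The engine's job, at massively parallel scale, is answering queries of exactly this shape over billions of triples.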

Posted 3 weeks ago

Apply

9.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Introduction
As a Hardware Developer at IBM, you’ll get to work on the systems that are driving the quantum revolution and the AI era. Join an elite team of engineering professionals who enable IBM customers to make better decisions quicker on the most trusted hardware platform in today’s market.

Your Role And Responsibilities
We are seeking a highly motivated test engineer to be part of the hardware team. Join a great team of engineering professionals who are involved in the development, validation, and delivery of DFT patterns and testing of those patterns for IBM’s microprocessor chip design team.

Preferred Education: Master's Degree

Required Technical And Professional Expertise
- 4–9 years of experience in ATE test development, silicon debug, and production support for complex SoC or ASIC devices.
- Strong expertise in test program development, test vector translation, timing setup, and ATE bring-up workflows.
- Proven ability in debugging test failures, analyzing yield and parametric issues, and resolving silicon bring-up and characterization challenges.
- Experience with RMA debug – reproducing, analyzing, and isolating failures in customer-returned or field-returned silicon.
- Hands-on experience with PVT (Process, Voltage, Temperature) characterization using ATE.
- Experience in pattern generation, pattern retargeting, and vector-level debug using standard ATE tools (e.g., Teradyne, Advantest).
- Strong knowledge of pin margin analysis, voltage/timing margining, and correlation between simulation and ATE results.
- Proficient in automation and scripting using VB (Visual Basic), Perl, Python, and TCL for test flow automation, log parsing, and pattern manipulation.
- Effective collaboration with cross-functional teams including design, validation, product engineering, and silicon debug to ensure test robustness and quality.
- Excellent debug and bring-up skills – considered key requirements for this role.
- Detail-oriented with solid analytical and problem-solving abilities.
- Strong communication skills and ability to work across global teams.

Preferred Technical And Professional Experience
- Experience with the Teradyne UltraFlex (UFlex) tester is a plus.
- Familiarity with microcontroller architecture, embedded firmware, and functional verification concepts.
- Experience in post-silicon validation, system-level debug, and yield optimization workflows.
- Knowledge of processor-based test flows, scan diagnostics, and test time optimization.

Posted 3 weeks ago

Apply

14.0 - 16.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

We have an urgent job opportunity with us.

C# Developer with XML Expertise
Location: Pune

Overview:
We are seeking a talented and motivated C# Developer with a strong background in XML technologies to join our dynamic team in the life insurance sector. The ideal candidate will have 14-16 years of experience in software development, with a robust understanding of C# programming, XML handling, and the nuances of the US life insurance domain. You will play a key role in developing and maintaining software solutions that support our business operations and enhance our customer experience.

Key Responsibilities:
- Design, develop, and implement software applications using C# that meet business requirements in the life insurance sector.
- Work with XML data formats for data interchange and processing within our applications.
- Collaborate with cross-functional teams including business analysts, quality assurance, and project management to gather and refine requirements.
- Perform code reviews, unit testing, and debugging to ensure high-quality software delivery.
- Maintain and enhance existing applications to improve performance and functionality.
- Document software designs, processes, and technical specifications to facilitate knowledge sharing and compliance.
- Stay current with industry trends and technologies related to life insurance and software development.

Qualifications:
- 14-16 years of professional experience in software development using C#, the .NET framework, and related technologies.
- Strong understanding of XML and experience in parsing, transforming, and validating XML data.
- Knowledge of life insurance industry practices, products, and regulatory requirements.
- Experience with SQL Server or other database technologies for data management.
- Proficiency in working with RESTful and SOAP web services.
- Familiarity with Agile development methodologies and tools.
- Excellent problem-solving skills and attention to detail.
- Strong communication skills and ability to work collaboratively in a team-oriented environment.

Preferred Qualifications:
- Experience with related technologies such as ASP.NET, MVC frameworks, and cloud services (e.g., Azure).
- Knowledge of life insurance policy administration systems or underwriting processes.
- Certifications in C#, XML technologies, or Agile methodologies are a plus.

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Data Integration Developer

About the Role
We are looking for a skilled Data Integration Developer to join our award-winning team that recently earned the "Outstanding Data Engineering Team" award at DES 2025. In this role, you will be instrumental in building and maintaining cloud-based data pipelines that power AI and Intelligent Document Automation services. Your work will directly support scalable, production-grade workflows that transform structured and unstructured documents into actionable data using cutting-edge machine learning solutions. You’ll collaborate cross-functionally with data scientists, AI/ML engineers, cloud engineers, and product owners to ensure robust pipeline design, integration, observability, and performance at scale.

Key Responsibilities
- Design, develop, and maintain end-to-end data ingestion and integration pipelines on Google Cloud Platform (GCP).
- Implement robust workflows from document ingestion and file triggers to downstream ML integration and storage (e.g., BigQuery).
- Integrate and manage RESTful APIs and asynchronous job queues to support ML/OCR services.
- Collaborate with AI/ML teams to deploy and scale intelligent automation solutions.
- Containerize services using Docker and manage deployments via Cloud Run and Cloud Build.
- Ensure production readiness through monitoring, logging, and observability best practices.
- Utilize Infrastructure-as-Code tools (e.g., Terraform) for provisioning and environment consistency.
- Work in Agile/Scrum teams, participating in sprint planning, reviews, and backlog refinement.

Required Skills & Experience
- 7+ years of experience in data integration, cloud data engineering, or backend development.
- Strong proficiency in Python, REST APIs, and handling asynchronous workloads.
- Solid hands-on experience with Google Cloud Platform (GCP) services including Pub/Sub, Cloud Storage (GCS), BigQuery, Cloud Run, and Cloud Build.
- Experience with Docker and managing containerized microservices.
- Familiarity with Infrastructure-as-Code tools like Terraform.
- Exposure to AI/ML integration, OCR pipelines, or document understanding frameworks is a strong advantage.
- Experience with CI/CD pipelines and automated deployment workflows.

Preferred Qualifications
- Knowledge of data formats like JSON, XML, or PDF parsing.
- Prior experience working on document processing, intelligent automation, or ML Ops projects.
- GCP certification (e.g., Professional Data Engineer or Associate Cloud Engineer) is a plus.

Why Join Us?
- Join a recognized leader in data engineering, recently awarded the "Outstanding Data Engineering Team" at DES 2025.
- Work on mission-critical AI and automation products that directly impact real-world use cases.
- Thrive in a collaborative, learning-driven culture with opportunities to grow in AI/ML and Cloud technologies.

Experience: 7 to 12 Yrs
Location: Chennai/Pune/Coimbatore/Bangalore
Notice: Immediate to 1 Week

Regards,
TA Team
KANINI Software Solutions
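The ingestion flow described above — a file trigger arriving via Pub/Sub, then downstream storage in BigQuery — can be sketched with the standard GCP client libraries. The project, subscription, and table identifiers below are placeholders:

```python
import json
from google.cloud import bigquery, pubsub_v1

# Placeholder identifiers, for illustration only.
PROJECT = "my-project"
SUBSCRIPTION = "doc-ingest-sub"
TABLE = "my-project.docs.extracted_fields"

bq = bigquery.Client(project=PROJECT)

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # A GCS notification carries the bucket/object of the new document.
    event = json.loads(message.data)
    row = {"uri": f'gs://{event["bucket"]}/{event["name"]}', "status": "RECEIVED"}
    errors = bq.insert_rows_json(TABLE, [row])  # streaming insert
    if errors:
        message.nack()   # redeliver and retry later
    else:
        message.ack()

subscriber = pubsub_v1.SubscriberClient()
path = subscriber.subscription_path(PROJECT, SUBSCRIPTION)
streaming_pull = subscriber.subscribe(path, callback=callback)
print("Listening for new documents...")
streaming_pull.result()  # block until cancelled
```

A production version would add the OCR/ML call between receipt and insert, plus dead-lettering and structured logging for observability.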

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

Job Title: Sr. Splunk Architect
Experience: 7+ Years
Location: Gurgaon (Hybrid)
Notice Period: Immediate Joiner/Serving

Responsibilities
As a Splunk Lead, your role and responsibilities include:
- Hands-on experience in the SIEM domain.
- Expert knowledge of Splunk backend operations (UF, HF, SH, and Indexer Cluster) and architecture.
- Expert knowledge of Log Management and Splunk SIEM, including log collection, parsing, normalization, and retention practices.
- Expertise in logs/license optimization techniques and strategy.
- Good understanding of designing, deploying, and implementing a scalable SIEM architecture.
- Understanding of data parsimony as a concept, especially in terms of German data security standards.
- Working knowledge of integrating Splunk logging infrastructure with third-party observability tools (e.g., ELK, Datadog).
- Experience in identifying security and non-security logs and applying adequate filters to re-route the logs accordingly.
- Expertise in understanding network architecture and identifying the components of impact.
- Expert in Linux administration.
- Proficient in working with Syslog.
- Proficiency in scripting languages like Python, PowerShell, or Bash to automate tasks.
- Expertise with OEM SIEM tools, preferably Splunk.
- Experience with open-source SIEM/log storage solutions such as ELK or Datadog.
- Strong documentation skills: HLDs, LLDs, implementation guides, and operations manuals.
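As a flavor of the filter-and-re-route responsibility above, here is a minimal Python sketch that forwards only security-relevant syslog lines to Splunk's HTTP Event Collector. The endpoint, token, and keyword list are hypothetical, and production routing would normally be done in props.conf/transforms.conf on the forwarder or indexer tier rather than in a script.

```python
# Minimal sketch: classify syslog lines and forward security events to
# Splunk HEC. Endpoint, token, and keywords are hypothetical placeholders.
import requests

HEC_URL = "https://splunk.example.com:8088/services/collector/event"  # hypothetical
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"                    # hypothetical
SECURITY_KEYWORDS = ("authentication failure", "sudo", "denied")      # hypothetical

def is_security_event(line: str) -> bool:
    """Crude classifier: keep lines matching known security keywords."""
    lowered = line.lower()
    return any(keyword in lowered for keyword in SECURITY_KEYWORDS)

def route(lines):
    for line in lines:
        if not is_security_event(line):
            continue  # drop non-security noise instead of indexing it
        resp = requests.post(
            HEC_URL,
            headers={"Authorization": f"Splunk {HEC_TOKEN}"},
            json={"event": line, "sourcetype": "syslog", "index": "security"},
            timeout=5,
        )
        resp.raise_for_status()
```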

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Job Title: QA Networking Manual + Python
Location: Bangalore
Experience: 10+ Years
Notice Period: Immediate to 30 Days

About the Company
Calsoft is a leading technology-first partner, providing digital and product engineering services. For over 25 years, Calsoft has been helping customers solve business challenges through technologies in Storage, Virtualization, Networking, Security, Cloud, AI/ML, IoT, and Telecommunications. As a Technology First company, our mission is to find new ways to solve customer problems by leveraging agility, technological innovation, and deep engineering expertise. With a global presence, thousands of innovators, and a clientele including Fortune 500 companies, Calsoft is a trusted digital transformation partner.

Job Title: SONIC Platform Validation Engineer (Layer 1/Layer 2)

Job Description:
We are seeking a highly skilled and motivated SONIC Platform Validation Engineer to join our team. The ideal candidate will be responsible for validating and ensuring the robustness of the SONIC platform at Layer 1, and will bring SerDes-level testing experience. This role involves working closely with hardware and software teams to ensure seamless integration and optimal performance.

Key Responsibilities:
- Develop and execute test plans for validating Layer 1 functionalities of the SONIC platform or any other NOS.
- Hands-on experience with industry network operating systems such as Cisco XR/JUNOS at the L1 and L2 levels.
- Hands-on experience designing and developing scripted test automation infrastructure for system electrical validation and test automation.
- Experience using Python for data parsing/analysis of L1-L3 performance metrics (e.g., SerDes performance, FEC SER, packet statistics, power, temperature).
- Experience with iterative test-infrastructure handling of traffic generators and PDUs through APIs/RPCs, TCP/IP, and Python.
- Good understanding of PCS/PMA and MAC-level test cases.
- Good understanding of Ethernet standards, including speed and FEC.
- Ensure all hardware and protocols meet industry standards and regulatory requirements, including Ethernet (IEEE 802.3), VLANs (IEEE 802.1Q), and VRF.
- Experience testing Layer 1 and hardware validation for data center switching products from 1.8T to 25.6T with 40G, 100G, and 400G/800G ports.
- Hands-on experience with SerDes tuning and channel optimization: Tx FIR tuning for host-side links between ASIC/retimer/module loopbacks, and optical eye compliance to meet IEEE 802.3 requirements at 10G, 25G, and 50Gbps data rates.
- Hands-on experience with different media types such as DAC/AEC/optical modules ranging from 10G to 800G in form factors such as QSFP-DD/OSFP/QSFP56/QSFP112.
- Must have worked with traffic generators such as IXIA/XENA/JDSU/Spirent.
- Collaborate with hardware teams to ensure proper integration and functionality of hardware components (e.g., ASICs).
- Perform detailed analysis and debugging of Layer 1 issues.
- Document test results and provide comprehensive reports to stakeholders.
- Work with cross-functional teams to resolve issues and improve platform stability.
- Stay updated with the latest advancements in SONIC and related technologies.
- Collaborate with firmware and driver teams to validate Layer 1 interactions with higher layers.

Qualifications:
- Bachelor's or Master's degree in Electrical Engineering, Computer Science, or a related field.
- 10+ years of experience in Layer 1 and Layer 2 validation.
- Strong understanding of Layer 1 and Layer 2 protocols and technologies.
- Experience with SONIC or any other NOS platform.
- Proficiency in scripting languages such as Python or Bash.
- Familiarity with hardware validation and debugging tools.
- Excellent problem-solving skills and attention to detail.

Application Instructions
Please share the following mandatory details along with your application:
- Total IT experience:
- Experience in a validation engineer role:
- Experience in networking protocols (Layer 1 and Layer 2 validation):
- Experience in Docker/containerization:
- Current CTC:
- Expected CTC:
- Current location:
- Notice period:
- Are you comfortable working in Indore (in-person interview required)?
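To illustrate the Python data-parsing work this role calls out, here is a minimal sketch that extracts FEC counters from CLI output and computes a per-port symbol error rate. The output format and numbers are hypothetical, not taken from any real NOS.

```python
# Minimal sketch of L1 metrics parsing: pull FEC counters out of raw CLI
# text and compute a symbol error rate. Format and values are hypothetical.
import re

SAMPLE_OUTPUT = """
Ethernet0  fec_corrected: 1234  fec_uncorrected: 2  symbols: 98765432100
Ethernet4  fec_corrected: 87    fec_uncorrected: 0  symbols: 98765432100
"""

LINE_RE = re.compile(
    r"(?P<port>\S+)\s+fec_corrected:\s*(?P<corr>\d+)\s+"
    r"fec_uncorrected:\s*(?P<uncorr>\d+)\s+symbols:\s*(?P<sym>\d+)"
)

def fec_ser(output: str) -> dict:
    """Return a per-port FEC symbol error rate parsed from CLI text."""
    results = {}
    for m in LINE_RE.finditer(output):
        errors = int(m["corr"]) + int(m["uncorr"])
        results[m["port"]] = errors / int(m["sym"])
    return results

if __name__ == "__main__":
    for port, ser in fec_ser(SAMPLE_OUTPUT).items():
        print(f"{port}: SER = {ser:.2e}")
```

In practice the same parse-then-aggregate loop would be driven across sweep iterations (Tx FIR settings, media types, temperatures) and the resulting rates compared against the applicable IEEE 802.3 FEC thresholds.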

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Indeed logo

Basic qualifications
- 3+ years of experience building models for business applications
- PhD, or Master's degree and 4+ years of experience in CS, CE, ML, or a related field
- Experience in patents or publications at top-tier peer-reviewed conferences or journals
- Experience programming in Java, C++, Python, or a related language
- Experience in any of the following areas: algorithms and data structures, parsing, numerical optimization, data mining, parallel and distributed computing, high-performance computing

Do you want to join an innovative team of scientists who use machine learning and statistical techniques to create state-of-the-art solutions for providing better value to Amazon’s customers? Do you want to build and deploy advanced algorithmic systems that help optimize millions of transactions every day? Are you excited by the prospect of analyzing and modeling terabytes of data to solve real-world problems? Do you like to own end-to-end business problems/metrics and directly impact the profitability of the company? Do you like to innovate and simplify? If yes, then you may be a great fit for the Machine Learning team for India Consumer Businesses. If you have an entrepreneurial spirit, know how to deliver, love to work with data, are deeply technical, highly innovative, and long for the opportunity to build solutions to challenging problems that directly impact the company's bottom line, we want to talk to you.

Major responsibilities
- Use machine learning and analytical techniques to create scalable solutions for business problems
- Analyze and extract relevant information from large amounts of Amazon’s historical business data to help automate and optimize key processes
- Design, develop, evaluate, and deploy innovative and highly scalable models for predictive learning
- Research and implement novel machine learning and statistical approaches
- Work closely with software engineering teams to drive real-time model implementations and new feature creation
- Work closely with business owners and operations staff to optimize various business operations
- Establish scalable, efficient, automated processes for large-scale data analyses, model development, model validation, and model implementation
- Mentor other scientists and engineers in the use of ML techniques

About the team
The India Machine Learning team works closely with the business and engineering teams to build ML solutions that create an impact for Amazon's IN businesses. This is a great opportunity to leverage your machine learning and data mining skills to create a direct impact on consumers and end users.

Preferred qualifications
- Experience using Unix/Linux
- Experience in professional software development

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
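For candidates wondering what a minimal predictive-learning workflow looks like in practice, here is an illustrative sketch on synthetic data. Production systems of the kind the role describes would add feature pipelines, distributed training, and deployment infrastructure on top of this shape.

```python
# Illustrative sketch of a basic predictive-learning workflow: train a model
# on tabular data and evaluate it. The data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(seed=0)
X = rng.normal(size=(5000, 8))  # synthetic features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=5000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Validation AUC: {auc:.3f}")
```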

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies