
428 Parsing Jobs

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

2.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Source: LinkedIn

Job Description
Quick turnaround and increased test coverage are the need of the hour; hence the increased emphasis on automation testing across projects. Publicis Sapient is looking for automation experts who, in addition to expertise in the tools and techniques, have the knack to understand the business needs and ROI and create the automation strategy accordingly. The role assures consistent quality of software applications in production by developing and enforcing a robust automated software QE strategy, practices, and processes, providing documentation, and managing people. You will collaborate with the project, business, and QE teams to develop detailed automated scripts and test frameworks that make the overall system more effective and efficient for our clients. You are responsible for the overall quality of the project through effective QE leadership and management, ensuring that all deliverables are met in terms of time, price, and quality. This individual must have a proven track record of success building, leading, and managing a functional and technical QE team with a strong sense of quality ownership. This is a hands-on job that requires strategic thinking and planning to provide leadership and expertise throughout the entire QA lifecycle, ensuring the success of the team's manual and automation efforts in an agile working environment.

Responsibilities: Able to estimate low- and medium-complexity applications and has used at least one estimation technique. Able to handle and oversee a small team of 2-5 people and guide them through the complete SDLC, from test case creation to test closure activities. Well-versed in most of the activities in the defect management process; can define and enhance the defect documentation and TAR lifecycle process independently. Has the expertise to enforce and adhere to defect and other processes in the team.

Preferred (mostly for people being hired at the Senior Associate career stage): Has mentored or coached at least one person. Can define the automation test strategy and test plan for low- and medium-complexity applications, taking into account the business needs, ROI, etc. Able to maintain and report the test coverage matrix. Able to identify device coverage for the application in question. Can devise the regression testing approach.

Qualifications: 2-4 years of experience. Experience with QE for distributed, highly scalable systems. Good understanding of OOP concepts and strong programming skills in Java, Groovy, or JavaScript. Hands-on experience with at least one GUI-based test automation tool for desktop and/or mobile automation; experience with multiple tools is an added advantage. Proficient in writing SQL queries. Familiarity with the process of test automation tool selection and test approach. Experience in designing and developing automation frameworks and creating scripts using industry best practices such as the Page Object Model. Able to integrate test suites into the test management system and a custom test harness. Familiar with the implementation of design patterns, modularization, and user libraries for framework creation. Can mentor the team and has a short learning curve for new technologies. Understands all aspects of Quality Engineering. Understanding of SOAP and REST principles. Thorough understanding of microservices architecture. In-depth hands-on experience with at least one API testing tool such as Rest Assured, SOAP UI, or NodeJS. Hands-on experience with Postman or a similar tool. Hands-on experience in parsing complex JSON and XML and validating data using serialization techniques such as POJO classes or similar. Hands-on experience in performing request and response schema validation, response codes, and exceptions. Good understanding of BDD and TDD methodologies and tools like Cucumber, TestNG, JUnit, or similar. Experience in defining the API E2E testing strategy and designing and developing an API automation framework. Working experience with build tools (Maven/Gradle), Git, etc. Experience in creating CI/CD test pipelines. Possesses domain knowledge to identify issues across those domains, understand their impact, and drive resolution (familiar with or expert in domains like retail banking, automobile, insurance, betting, food markets, the hotel industry, healthcare). Has used or has exposure to automation tools for automating mobile applications. Has used or has exposure to automation tools for non-functional testing. Able to set up test environments for execution on cloud platforms such as Sauce Labs and BrowserStack. Knowledge of new tools (open source and licensed) in the automation world, with the knack to explore them and keep abreast of the latest tools in the market. Expertise in creating test automation frameworks and implementing and maintaining them on a project. Experience with modern agile practices such as BDD/Cucumber and DevOps. Knowledge of and experience in service virtualization and tools like CA LISA. Hands-on knowledge of setting up PACT Broker and writing PACT tests is desirable. Experience with test management tools like Xray and Zephyr and integration of the test framework with these tools. Understanding of commonly used software design patterns like Builder, Factory, Singleton, and Façade. Excellent communication skills (written and verbal, both formal and informal). Helps to create a positive, collaborative working environment for the team. Quick to grasp and flexible to adapt to new technologies and processes. Ability to multi-task under pressure and work independently with minimal supervision, i.e., the ability to prioritize when under pressure. Efficiently makes tough decisions and communicates them effectively. Independently manages operational-level client meetings. Develops strong relationships with appropriate client stakeholders. Acts as the primary POC/facilitator for planned (regular) client meetings. Manages peer-level client relationships (expectations, communications, negotiations, escalation, feedback, etc.).

Education: Full-time Bachelor's/Master's engineering degree.

Additional Information: Gender-neutral policy. 18 paid holidays throughout the year. Generous parental leave and new-parent transition program. Flexible work arrangements. Employee Assistance Programs to support your wellness and well-being.

Company Description: Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting, and customer experience with agile engineering and problem-solving creativity. United by our core values and our purpose of helping people thrive in the brave pursuit of next, our 20,000+ people in 53 offices around the world combine experience across technology, data sciences, consulting, and customer obsession to accelerate our clients' businesses through designing the products and services their customers truly value.
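The qualifications above ask for request/response schema validation and POJO-based deserialization of JSON responses (the role itself would typically use Rest Assured in Java). As a rough, language-agnostic sketch of the same idea, here is a hedged Python version using requests and jsonschema; the endpoint URL, schema, and field names are illustrative assumptions, not part of the listing.

```python
# Minimal sketch of API response validation: status code, response schema, and
# typed deserialization. The endpoint and schema below are hypothetical.
from dataclasses import dataclass

import requests
from jsonschema import validate

USER_SCHEMA = {
    "type": "object",
    "required": ["id", "name", "email"],
    "properties": {
        "id": {"type": "integer"},
        "name": {"type": "string"},
        "email": {"type": "string"},
    },
}

@dataclass
class User:          # POJO-style container for the deserialized payload
    id: int
    name: str
    email: str

def test_get_user():
    resp = requests.get("https://api.example.com/users/42", timeout=10)
    assert resp.status_code == 200                  # response code check
    body = resp.json()
    validate(instance=body, schema=USER_SCHEMA)     # schema validation
    user = User(**{k: body[k] for k in ("id", "name", "email")})
    assert "@" in user.email                        # simple field-level assertion
```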

Posted 1 hour ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Description
Do you want to join an innovative team of scientists who use machine learning and statistical techniques to create state-of-the-art solutions for providing better value to Amazon’s customers? Do you want to build and deploy advanced algorithmic systems that help optimize millions of transactions every day? Are you excited by the prospect of analyzing and modeling terabytes of data to solve real-world problems? Do you like to own end-to-end business problems/metrics and directly impact the profitability of the company? Do you like to innovate and simplify? If yes, then you may be a great fit to join the Machine Learning team for India Consumer Businesses. If you have an entrepreneurial spirit, know how to deliver, love to work with data, are deeply technical, highly innovative, and long for the opportunity to build solutions to challenging problems that directly impact the company's bottom line, we want to talk to you.

Major Responsibilities
Use machine learning and analytical techniques to create scalable solutions for business problems. Analyze and extract relevant information from large amounts of Amazon’s historical business data to help automate and optimize key processes. Design, develop, evaluate, and deploy innovative and highly scalable models for predictive learning. Research and implement novel machine learning and statistical approaches. Work closely with software engineering teams to drive real-time model implementations and new feature creation. Work closely with business owners and operations staff to optimize various business operations. Establish scalable, efficient, automated processes for large-scale data analysis, model development, model validation, and model implementation. Mentor other scientists and engineers in the use of ML techniques.

About The Team
The India Machine Learning team works closely with the business and engineering teams to build ML solutions that create an impact for Amazon's IN businesses. This is a great opportunity to leverage your machine learning and data mining skills to create a direct impact on consumers and end users.

Basic Qualifications
3+ years of experience building models for business applications. PhD, or a Master's degree and 4+ years of experience in CS, CE, ML, or a related field. Patents or publications at top-tier peer-reviewed conferences or journals. Experience programming in Java, C++, Python, or a related language. Experience in any of the following areas: algorithms and data structures, parsing, numerical optimization, data mining, parallel and distributed computing, high-performance computing.

Preferred Qualifications
Experience using Unix/Linux. Experience in professional software development.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI - Karnataka Job ID: A2997187

Posted 1 hour ago

Apply

1.0 - 3.0 years

1 - 4 Lacs

India

On-site

Source: Glassdoor

We’re Hiring: Flutter (Android) Developer at Neovify Technolabs Pvt. Ltd. Location: On-site – Ahmedabad Experience: 1–3 years (Mandatory) Employment Type: Full-time Neovify Technolabs Pvt. Ltd. is looking for a talented Flutter Developer to join our on-site team in Ahmedabad. If you're passionate about creating beautiful and high-performance cross-platform mobile apps, this is the perfect opportunity to grow with a fast-evolving tech company. Key Responsibilities: Develop high-quality cross-platform mobile applications using Flutter Collaborate with UI/UX designers, backend developers, and project managers Translate designs and wireframes into clean, responsive, and scalable code Integrate APIs and third-party libraries into mobile applications Debug and optimize application performance across Android and iOS platforms Keep up with emerging trends in mobile development and the Flutter ecosystem ✅ Requirements: 1–3 years of solid experience in Flutter development Proficient in Dart and Flutter SDK Must have hands-on experience with native Android development Experience with state management (e.g., Provider, Bloc, Riverpod, etc.) Familiarity with REST APIs, JSON parsing, and third-party integrations Good understanding of the mobile app lifecycle and app store deployment processes Knowledge of Git and collaborative development workflows Strong problem-solving skills and attention to detail ➕ Bonus Skills: Experience with Firebase (Firestore, Auth, Push Notifications) Knowledge of native iOS development basics Familiarity with Clean Architecture or Modular Design in Flutter How to Apply: Send your resume and portfolio (if available) to careers@neovify.com Join Neovify Technolabs Pvt. Ltd. and help us build beautiful, fast, and scalable mobile solutions. Job Types: Full-time, Permanent Pay: ₹15,000.00 - ₹40,000.00 per month Schedule: Monday to Friday Morning shift Experience: Android Development: 1 year (Required) Flutter: 2 years (Required) Location: Makarba, Ahmedabad, Gujarat (Required) Shift availability: Day Shift (Preferred) Work Location: In person

Posted 2 hours ago

Apply

2.0 years

10 Lacs

Indore

On-site

Source: Glassdoor

Job Summary: We are looking for a skilled and motivated AI Developer with hands-on experience in Retrieval-Augmented Generation (RAG) techniques. The ideal candidate should have a deep understanding of NLP, LLMs (like GPT, LLaMA, or similar), vector databases, and integration of retrieval mechanisms into generative models to create intelligent, context-aware systems. Key Responsibilities: Design and develop AI-powered applications using RAG-based architectures. Integrate large language models (LLMs) with retrieval systems such as vector databases (e.g., FAISS, Pinecone, Weaviate, Qdrant). Fine-tune and evaluate language models for domain-specific tasks. Implement document parsing, chunking, and embedding generation using NLP techniques. Create end-to-end pipelines for document ingestion, semantic search, and context-aware generation. Optimize performance and accuracy of RAG systems. Collaborate with cross-functional teams including data engineers, product managers, and frontend/backend developers. Key Requirements: Bachelor's/Master's degree in Computer Science, Artificial Intelligence, or a related field. 2+ years of experience in AI/ML development, with at least 1 year working on RAG or related LLM-based applications. Strong programming skills in Python and experience with libraries like LangChain, Hugging Face Transformers, PyTorch, etc. Hands-on experience with vector databases like FAISS, Pinecone, or Qdrant. Good understanding of semantic search, embeddings, and prompt engineering. Familiarity with APIs from OpenAI, Cohere, Hugging Face, etc. Knowledge of cloud services (AWS, Azure, GCP) is a plus. Strong problem-solving skills and a passion for innovation in generative AI. Job Type: Full-time Pay: Up to ₹1,000,000.00 per year Schedule: Day shift Application Question(s): How much experience do you have in building robust RAG-based systems? Are you comfortable relocating to Indore? Please mention your current ctc. Work Location: In person
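The listing above describes the standard RAG pipeline: parse and chunk documents, generate embeddings, retrieve relevant context from a vector store, and ground the LLM's answer in that context. A minimal, self-contained Python sketch of that flow is shown below; the embedding function is a toy stand-in for a real model and vector database (FAISS, Pinecone, Qdrant, etc.), and the document text and question are invented.

```python
# Minimal RAG sketch: chunk a document, embed the chunks, retrieve the most
# relevant ones for a query, and build a context-grounded prompt. embed() is a
# toy bag-of-words stand-in; in practice you would call a real embedding model
# and store vectors in a vector database.
import numpy as np

def chunk(text: str, size: int = 40, overlap: int = 10) -> list[str]:
    """Split text into overlapping word windows."""
    words = text.split()
    step = max(size - overlap, 1)
    return [" ".join(words[i:i + size]) for i in range(0, len(words), step)]

def embed(texts: list[str]) -> np.ndarray:
    """Toy embedding: hashed bag-of-words, L2-normalised so dot == cosine."""
    vecs = np.zeros((len(texts), 256))
    for i, t in enumerate(texts):
        for w in t.lower().split():
            vecs[i, hash(w) % 256] += 1.0
    return vecs / np.clip(np.linalg.norm(vecs, axis=1, keepdims=True), 1e-9, None)

def retrieve(query: str, chunks: list[str], index: np.ndarray, k: int = 2) -> list[str]:
    scores = index @ embed([query])[0]
    return [chunks[i] for i in np.argsort(scores)[::-1][:k]]

document = ("Refunds are issued within 14 days of delivery. "
            "Shipping is free on orders above 999 rupees. "
            "Support is available Monday to Friday, 9 am to 6 pm IST.")
pieces = chunk(document)
index = embed(pieces)
question = "When can a customer get a refund?"
context = "\n".join(retrieve(question, pieces, index))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
# `prompt` would then be sent to the LLM of choice (GPT, LLaMA, etc.).
print(prompt)
```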

Posted 2 hours ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

Req ID: 329321. NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Software Development Senior Specialist to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).

L4 support resource required with the below skill set:
Experience in pentesting skill sets: HTTP/HTTPS protocol; HTML, CSS, and JavaScript; web frameworks (Spring Framework & Spring Boot); OWASP Top Ten vulnerabilities (e.g., SQL Injection, XSS, CSRF); Burp Suite; OWASP ZAP; SQLMap; Postman; RESTful and SOAP API testing; Bash scripting; session hijacking; vulnerability reporting and documentation.
Other mandatory skill set: Microservices architecture, Spring Boot, cloud integration (AWS, Azure, Google Cloud), containerization (Docker, Kubernetes), performance optimization (profiling and tuning the JVM), security (OAuth, JWT, secure coding practices), API gateways (Kong, Zuul, AWS API Gateway), reactive programming (Spring WebFlux, RxJava), CI/CD pipelines (Jenkins, GitLab CI, CircleCI), UI frontend frameworks such as React JS, jQuery 1.8.x, Bootstrap, AngularJS, CSS3, JavaScript frameworks, and HTML5. Experience in Core Java, Web Services (SOAP/REST), JSF/JSP/Servlets, dependency injection frameworks, and ORM tools. Experience in build tools (Ant, Maven, etc.) and unit testing using JUnit and other unit testing tools. Knowledge of JSON, XSL, jQuery, XML parsing, and XPath. Proficient in the Spring framework, Spring modules, and MVC. Use of DevOps tools: Jenkins, Bitbucket, Selenium automation scripting, SonarQube, etc. Java/J2EE/web and application servers/XML/CSS. Experience on Windows/Unix/AIX systems and Oracle/SQL. Experience with multi-tier architecture. Exposure to retail/online banking applications. Experience in performing pentesting. ITIL production support experience. Good communication and coordination experience between teams. BFSI domain experience would be desirable. Excellent written and verbal communication skills and excellent analytical and troubleshooting skills.

Role Responsibilities
Incident, problem, and change management and other ITIL functions. Enhancement and release support. Participate in service introduction activities. Delivery as per agreed SLAs. Knowledge management. Daily checks and reporting. Provide on-call support. The role includes direct interaction with customers and client stakeholders. Participate in and manage technical issue and other governance calls. Irish working hours, on-call support, and weekend support.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize, and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com. NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users.
If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here . If you'd like more information on your EEO rights under the law, please click here . For Pay Transparency information, please click here .
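The pentesting skill set in the listing above names the OWASP Top Ten, with SQL Injection as the canonical example. Below is a minimal, hedged illustration in Python (using sqlite3 and a made-up users table) of why parameterized queries are the standard mitigation a tester would verify.

```python
# Minimal illustration of SQL Injection from the OWASP Top Ten: the unsafe query
# concatenates untrusted input, the safe query uses a bound parameter.
# The table and data are invented for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0), ('root', 1)")

user_input = "x' OR '1'='1"   # classic injection payload a pentester would try

# Vulnerable: the payload changes the query's logic and returns every row.
unsafe = conn.execute(f"SELECT name FROM users WHERE name = '{user_input}'").fetchall()

# Safe: the payload is treated as a literal value, so nothing matches.
safe = conn.execute("SELECT name FROM users WHERE name = ?", (user_input,)).fetchall()

print("unsafe:", unsafe)   # [('alice',), ('root',)]
print("safe:", safe)       # []
```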

Posted 2 hours ago

Apply

0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Source: LinkedIn

Designation: Talent Acquisition Trainee Location: Ahmedabad Experience: Fresher What We Offer: 3-month internship with the potential for a full-time job opportunity Hands-on training and mentorship from experienced HR professionals Exposure to live recruitment projects and real-time candidate interactions A vibrant, growth-oriented work culture 5-day work week with flexible work timings Job Summary: As a Talent Acquisition Trainee at Techify Solutions, you will gain hands-on experience in end-to-end recruitment processes. This role is ideal for final-year MBA (HR) students looking to build a strong foundation in talent acquisition within a fast-paced IT services company. You will assist in sourcing, screening, coordinating interviews, and supporting various recruitment campaigns. Key Responsibilities: Source candidates via job portals (LinkedIn, Naukri), social media, and college outreach Review resumes and conduct initial screenings aligned with JDs Coordinate interview schedules between candidates and hiring managers Maintain seamless candidate communication to ensure a positive hiring experience Support campus hiring events and other recruitment drives Update ATS, candidate databases, and recruitment trackers accurately Learn ATS functions such as resume parsing, keyword matching, pipeline management, and automated communications Requirements: Preferred: Final-year MBA student (HR specialization); other graduates with interest in HR are welcome Strong enthusiasm for recruitment and talent acquisition Excellent verbal and written communication skills Familiarity with LinkedIn, Naukri, or other job portals Basic understanding of ATS and interest in gaining practical ATS experience Proficient in Microsoft Office (Word, Excel, PowerPoint) High attention to detail, organized multitasker Professional, proactive, and eager to learn

Posted 4 hours ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

Job Description: SDET (Software Development Engineer in Test)
Notice Period: Immediate to 2 months (official)
Job Location: Gurgaon
Experience: 5 to 8 years
Skills: SDET, automation, Java programming, Selenium, Cucumber, Rest Assured
Job Type: Full-Time

Job Description
We are seeking an experienced and highly skilled SDET (Software Development Engineer in Test) to join our Quality Engineering team. The ideal candidate will possess a strong background in test automation across API, mobile, or web testing, with hands-on experience in creating robust automation frameworks and scripts. This role demands a thorough understanding of quality engineering practices, microservices architecture, and software testing tools.

Key Responsibilities:
- Design and develop scalable and modular automation frameworks using best industry practices such as the Page Object Model.
- Automate testing for distributed, highly scalable systems.
- Create and execute test scripts for GUI-based, API, and mobile applications.
- Perform end-to-end testing for APIs, ensuring thorough validation of request and response schemas, status codes, and exception handling.
- Conduct API testing using tools like Rest Assured, SOAP UI, NodeJS, and Postman, and validate data with serialization techniques (e.g., POJO classes).
- Implement and maintain BDD/TDD frameworks using tools like Cucumber, TestNG, or JUnit.
- Write and optimize SQL queries for data validation and backend testing.
- Integrate test suites into test management systems and CI/CD pipelines using tools like Maven, Gradle, and Git.
- Mentor team members and quickly adapt to new technologies and tools.
- Select and implement appropriate test automation tools and strategies based on project needs.
- Apply design patterns, modularization, and user libraries for efficient framework creation.
- Collaborate with cross-functional teams to ensure the quality and scalability of microservices and APIs.

Must-Have Skills:
- Proficiency in designing and developing automation frameworks from scratch.
- Strong programming skills in Java, Groovy, or JavaScript with a solid understanding of OOP concepts.
- Hands-on experience with at least one GUI automation tool (desktop/mobile); experience with multiple tools is an advantage.
- In-depth knowledge of API testing and microservices architecture.
- Experience with BDD and TDD methodologies and associated tools.
- Familiarity with SOAP and REST principles.
- Expertise in parsing and validating complex JSON and XML responses.
- Ability to create and manage test pipelines in CI/CD environments.

Nice-to-Have Skills:
- Experience with multiple test automation tools for GUI or mobile platforms.
- Knowledge of advanced serialization techniques and custom test harness implementation.
- Exposure to various test management tools and automation strategies.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years in software quality engineering and test automation.
- Strong analytical and problem-solving skills with attention to detail.
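Several of the responsibilities above reference the Page Object Model. Here is a hedged Python/Selenium sketch of the pattern (the role lists Java, but the structure is identical): locators and page actions live in a page class, and tests call only the page's methods. The URL and locators are invented for illustration.

```python
# Page Object Model sketch: the page class owns locators and actions;
# the test owns only assertions. A real suite would add explicit waits,
# configuration, and test fixtures.
from selenium import webdriver
from selenium.webdriver.common.by import By

class LoginPage:
    """Encapsulates locators and actions for a login screen."""
    URL = "https://example.com/login"

    def __init__(self, driver):
        self.driver = driver

    def open(self):
        self.driver.get(self.URL)
        return self

    def login(self, username: str, password: str):
        self.driver.find_element(By.ID, "username").send_keys(username)
        self.driver.find_element(By.ID, "password").send_keys(password)
        self.driver.find_element(By.CSS_SELECTOR, "button[type='submit']").click()
        return self

    def error_message(self) -> str:
        return self.driver.find_element(By.CSS_SELECTOR, ".error").text

def test_invalid_login_shows_error():
    driver = webdriver.Chrome()
    try:
        page = LoginPage(driver).open()
        page.login("qa_user", "wrong-password")
        assert "Invalid" in page.error_message()
    finally:
        driver.quit()
```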

Posted 4 hours ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Roles and Responsibilities: As a Data Scientist / Senior Data Scientist, you will solve some of the most impactful business problems for our clients using a variety of AI and ML technologies. You will collaborate with business partners and domain experts to design and develop innovative solutions on the data to achieve predefined outcomes. Engage with clients to understand current and future business goals and translate business problems into analytical frameworks. Develop custom models based on an in-depth understanding of the underlying data, data structures, and business problems to ensure deliverables meet client needs. Create repeatable, interpretable, and scalable models. Effectively communicate the analytics approach and insights to a larger business audience. Collaborate with team members, peers, and leadership at Tredence and client companies.

Qualification: Bachelor's or Master's degree in a quantitative field (CS, machine learning, mathematics, statistics) or equivalent experience. 3+ years of experience in data science, building hands-on ML models. Experience with language models (Llama 1/2/3, T5, Falcon) and with LangChain or a similar framework. The candidate must be aware of the entire evolution of NLP (from traditional language models to modern large language models), training data creation, training setup, and fine-tuning. The candidate must be comfortable interpreting research papers and architecture diagrams of language models. The candidate must be comfortable with LoRA, RAG, instruct fine-tuning, quantization, etc. Experience leading the end-to-end design, development, and deployment of predictive modeling solutions. Excellent programming skills in Python. Strong working knowledge of Python's numerical, data analysis, and AI frameworks such as NumPy, Pandas, Scikit-learn, Jupyter, etc. Advanced SQL skills with SQL Server and Spark experience. Knowledge of predictive/prescriptive analytics, including machine learning algorithms (supervised and unsupervised), deep learning algorithms, and artificial neural networks. Experience with natural language processing (NLTK) and text analytics for information extraction, parsing, and topic modeling. Excellent verbal and written communication. Strong troubleshooting and problem-solving skills. Thrives in a fast-paced, innovative environment. Experience with data visualization tools such as Power BI, Tableau, R Shiny, etc. is preferred. Experience with cloud platforms such as Azure and AWS is preferred but not required.
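The qualifications above mention text analytics for information extraction, parsing, and topic modeling. Below is a small illustrative sketch, not tied to this role's stack, of topic modeling with scikit-learn TF-IDF and NMF on toy documents; a real pipeline would add tokenisation and cleaning (for example with NLTK) and far more data.

```python
# Toy topic-modeling sketch: TF-IDF features + non-negative matrix factorization.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

docs = [
    "card payment failed and the refund was delayed",
    "refund not credited after payment failure",
    "delivery was late and the package arrived damaged",
    "courier delayed the delivery of my package",
]

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)            # documents x terms matrix

nmf = NMF(n_components=2, random_state=0)     # 2 latent topics
doc_topics = nmf.fit_transform(X)             # documents x topics

terms = vectorizer.get_feature_names_out()
for t, weights in enumerate(nmf.components_):  # topics x terms
    top = [terms[i] for i in weights.argsort()[::-1][:4]]
    print(f"topic {t}: {', '.join(top)}")
```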

Posted 4 hours ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Source: LinkedIn

Role Description
This is a remote, paid internship role for a Full Stack Engineer Intern at JOB.AI. The role involves working on real-world AI-powered applications, building features across the stack, from file parsing and AI model integration to frontend development and secure authentication. Recently graduated Computer Science students are welcome to apply.

Qualifications
• 0–3 years of experience in full-stack web development
• Bachelor's degree in Computer Science or equivalent
• Strong in JavaScript, Python, and REST APIs
• Hands-on experience with AI agents and AI models
• React.js / Next.js or similar frontend frameworks
• Node.js / Django / Flask for backend development
• MongoDB, PostgreSQL, or Firebase for database management
• Familiar with resume parsing, text similarity (NLP), or keyword extraction
• Experience handling file uploads (PDF/DOCX), parsing, and generation
• Experience with NLP libraries (spaCy, scikit-learn, or fuzzywuzzy)
• Experience implementing secure login with Firebase Auth (or JWT) for multiple roles
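The listing above mentions resume parsing, text similarity, and keyword extraction. A deliberately simple Python sketch of the underlying matching idea follows, scoring resume-to-job-description keyword overlap; real systems would use spaCy, embeddings, or fuzzy matching rather than raw token sets, and the sample texts are invented.

```python
# Toy resume-vs-JD matcher: extract keyword sets and score their Jaccard overlap.
import re

STOPWORDS = {"and", "or", "with", "in", "of", "for", "the", "a", "to", "experience"}

def keywords(text: str) -> set[str]:
    """Lowercase, keep alphanumeric-ish tokens, drop trivial stopwords."""
    tokens = re.findall(r"[a-zA-Z][a-zA-Z0-9.+#]*", text.lower())
    return {t for t in tokens if t not in STOPWORDS}

def match_score(resume: str, job_description: str) -> float:
    """Jaccard overlap between resume and JD keyword sets, in [0, 1]."""
    r, j = keywords(resume), keywords(job_description)
    return len(r & j) / len(r | j) if r | j else 0.0

resume = "Python developer, 2 years REST APIs, React, PostgreSQL, Docker"
jd = "Looking for a full-stack intern: Python, REST APIs, React.js, MongoDB"
print(f"match: {match_score(resume, jd):.2f}")
```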

Posted 7 hours ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Excellent knowledge of Node.js. Has worked with Express.js. Knowledge of ORMs like Drizzle, TypeORM, or Prisma. Knowledge of MySQL or PostgreSQL. Has worked extensively with REST APIs. Knowledge of XML parsing and construction. SAP exposure is good to have.

Posted 8 hours ago

Apply

0.0 - 2.0 years

0 - 0 Lacs

Makarba, Ahmedabad, Gujarat

On-site

Source: Indeed

We’re Hiring: Flutter (Android) Developer at Neovify Technolabs Pvt. Ltd. Location: On-site – Ahmedabad Experience: 1–3 years (Mandatory) Employment Type: Full-time Neovify Technolabs Pvt. Ltd. is looking for a talented Flutter Developer to join our on-site team in Ahmedabad. If you're passionate about creating beautiful and high-performance cross-platform mobile apps, this is the perfect opportunity to grow with a fast-evolving tech company. Key Responsibilities: Develop high-quality cross-platform mobile applications using Flutter Collaborate with UI/UX designers, backend developers, and project managers Translate designs and wireframes into clean, responsive, and scalable code Integrate APIs and third-party libraries into mobile applications Debug and optimize application performance across Android and iOS platforms Keep up with emerging trends in mobile development and the Flutter ecosystem ✅ Requirements: 1–3 years of solid experience in Flutter development Proficient in Dart and Flutter SDK Must have hands-on experience with native Android development Experience with state management (e.g., Provider, Bloc, Riverpod, etc.) Familiarity with REST APIs, JSON parsing, and third-party integrations Good understanding of the mobile app lifecycle and app store deployment processes Knowledge of Git and collaborative development workflows Strong problem-solving skills and attention to detail ➕ Bonus Skills: Experience with Firebase (Firestore, Auth, Push Notifications) Knowledge of native iOS development basics Familiarity with Clean Architecture or Modular Design in Flutter How to Apply: Send your resume and portfolio (if available) to careers@neovify.com Join Neovify Technolabs Pvt. Ltd. and help us build beautiful, fast, and scalable mobile solutions. Job Types: Full-time, Permanent Pay: ₹15,000.00 - ₹40,000.00 per month Schedule: Monday to Friday Morning shift Experience: Android Development: 1 year (Required) Flutter: 2 years (Required) Location: Makarba, Ahmedabad, Gujarat (Required) Shift availability: Day Shift (Preferred) Work Location: In person

Posted 9 hours ago

Apply

0 years

0 Lacs

Vadodara, Gujarat, India

Remote

Source: LinkedIn

We’re reinventing the market research industry. Let’s reinvent it together. At Numerator, we believe tomorrow’s success starts with today’s market intelligence. We empower the world’s leading brands and retailers with unmatched insights into consumer behavior and the influencers that drive it. Numerator is currently looking for a Configuration Engineer to join our Data Extraction (DX) team in India (Remote). In this role, you will be responsible for helping to grow and maintain a library of thousands of receipt parsing configurations used by Fortune 500 brands and retailers. Day-to-day, you’ll come up with creative solutions to complex problems and learn new skills to complement your existing abilities. This is a great role for those who are looking for hands-on experience with high visibility and impact. We welcome fresh ideas and approaches as we constantly aim to improve our development processes. Our team has experience using a wide range of technologies and years of cloud and big data experience. We are always learning and growing, so we can guarantee that you won’t be bored with us! If you are seeking an environment where you get to do meaningful work with other great engineers, then we want to hear from you!

What You Will Get To Do
Write clean, efficient, thoroughly tested code, backed by pair programming and code reviews. Much of our code is Python, but we use all kinds of languages and frameworks. Create complex regexes that pull structured data out of OCR-transcribed receipt images, as well as XPaths to extract data from receipt emails. Maintain the platform that drives our receipt extraction at scale. Troubleshoot, test, and maintain the platform and configurations to ensure strong optimization and functionality. Evaluate the technical tradeoffs of decisions and build things that last and scale. Maintain and fix existing configuration issues. Create and analyze new configuration technologies, figuring out how we can scale up our receipt extraction.

What You'll Bring to Numerator
Programming experience in Python. An eagerness to learn new things and improve upon existing skills, abilities, and practices. Familiarity with web technology, such as HTTP, JSON, HTML, XPath, or JavaScript. Experience in an Agile software development environment. Experience with version control systems (Git, Subversion, etc.). A real passion for clean code and finding elegant solutions to problems. Eagerness to expand your knowledge and abilities in Python and cloud-based technologies. Motivation to participate in ongoing learning and growth through pair programming, test-driven development, code reviews, and the application of new technologies and best practices. You look ahead to identify opportunities and foster a culture of innovation. Good communication skills (verbal and written).

Nice to haves
Knowledge of web scraping. Knowledge of regular expressions. Knowledge of business rules engines. Familiarity with virtual software development environments (e.g., Vagrant, Docker, etc.). Familiarity with object-oriented programming. Scripting knowledge. Familiarity with JSON and similar data formats. Experience with databases, SQL or NoSQL. Programming experience on Unix-based infrastructure. Knowledge of cloud-based systems (EC2, Rackspace, etc.). Expertise with big data, analytics, machine learning, and personalization.
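The role above centers on regexes that pull structured data out of OCR-transcribed receipts and XPaths that extract fields from receipt emails, and the team notes that much of its code is Python. Below is a hedged sketch of both tasks; the receipt text, email HTML, and field names are made up for illustration.

```python
# Two small extraction sketches: a regex for OCR'd receipt line items,
# and an XPath (via lxml) for a field in a receipt email.
import re
from lxml import html

ocr_text = """WALMART SUPERCENTER
MILK 2% 1GAL        3.48
BREAD WHEAT         2.29
TOTAL               5.77"""

# One line item per line: item name, then a price at the end of the line.
LINE_ITEM = re.compile(r"^(?P<item>[A-Z][A-Z0-9% ]+?)\s{2,}(?P<price>\d+\.\d{2})$", re.MULTILINE)
items = [(m.group("item").strip(), float(m.group("price"))) for m in LINE_ITEM.finditer(ocr_text)]
print(items)   # [('MILK 2% 1GAL', 3.48), ('BREAD WHEAT', 2.29), ('TOTAL', 5.77)]

email_html = ("<html><body><table><tr><td class='label'>Order Total</td>"
              "<td class='amount'>$12.99</td></tr></table></body></html>")
tree = html.fromstring(email_html)
total = tree.xpath("//td[@class='label' and text()='Order Total']/following-sibling::td/text()")
print(total)   # ['$12.99']
```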

Posted 19 hours ago

Apply

2.0 years

0 Lacs

Guwahati, Assam, India

On-site

Source: LinkedIn

*Note:* This position is open only to candidates currently residing in Guwahati or nearby regions. Alegra Labs is looking for Flutter Developers in Guwahati Name of Post: Flutter Developer No. of Posts: 2 Pay: INR 25,000 - 45,000 Age: NA Educational Qualification: MCA / M.E / M.Tech / MSc in IT or Computer Science Role Description This is a full-time on-site role for Flutter Developers at Alegra Labs, Guwahati. You will be responsible for developing cross-platform mobile applications using the Flutter framework, maintaining performance, and ensuring smooth user experience. Daily responsibilities include UI design, feature implementation, debugging, testing, and collaborating with back-end and design teams. Skills and Experience Required Essential Experience: - Minimum 2 years of experience in Flutter development for both iOS and Android. - Proficiency in Flutter and Dart programming language. - Experience in using third-party APIs and handling API integration. - Working knowledge of Firebase services: Auth, Firestore, Realtime DB, Cloud Functions. - Firebase Cloud Messaging (FCM) & Push Notifications implementation. - Published applications on Google Play Store and/or Apple App Store. - Familiar with source code management tools: GitHub, GitLab, Bitbucket. - Capable of building and handling local databases (SQLite). - Good understanding of State Management (Provider, Bloc, Riverpod, etc.). - Experience with Geolocation services and Google Maps integration. - File management: Downloading, uploading, and file storage handling. - Knowledge of Localization for multi-language apps. - Experience implementing Pagination. - Hands-on experience with Speech-to-Text functionalities. - Solid understanding of Memory Management, Caching, and crash handling. - Experience with CI/CD pipelines and DevOps workflow. Additional Skills: - Knowledge of Regular Expressions (RegEx) and Pattern Matching for input validation, parsing, and automation. - Understanding of Basic Data Structures and Algorithms (DSA) relevant to mobile development. - Familiarity with UI/UX principles, custom widgets, and responsive layout techniques. - Knowledge of CSS is a plus (especially for hybrid app design). Interview & Application Details Walk-in Date & Time: Shortlisted candidates will be notified and invited for an in-person interview. Application Fee: NIL How to Apply: Apply online at https://www.alegralabs.com/career/ Last Date: 31/07/2024

Posted 20 hours ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

BlitzenX is looking for a meticulous, driven, and reliable Data Entry Executive to support our Sales and Recruitment functions. This is not a routine typing job — this role is mission-critical in helping us identify the right decision-makers in our target accounts and ensuring our applicant tracking system (ATS) remains sharp, accurate, and de-cluttered. This is a high-trust, high-detail role ideal for someone who thrives in fast-paced, data-heavy environments. Key Responsibilities Sales Support (60%) Research and update CRM records with accurate, verified Sales POCs (decision-makers, influencers, buying group personas) for targeted accounts across Insurance verticals. Clean and enrich contact databases using LinkedIn Sales Navigator, ZoomInfo, Apollo, or similar tools. Ensure sales leads and contacts are tagged by portfolio (P&C, Life, Health) and by sales priority. De-duplicate contacts and accounts, eliminate bad records, and correct contact hierarchy mismatches. Partner with Sales and Portfolio Heads to ensure territory-level data hygiene is maintained. Support in pulling accurate contact lists for campaigns and leadership outreach. Recruitment Support (40%) Maintain and clean ATS records — remove duplicates, merge candidate profiles, and validate source tagging. Align candidate records with the current job hierarchy and BlitzenX hiring model (e.g., team mapping for Developers/Sr. Developers/Leads). Perform weekly audits on candidate pipelines to eliminate spam, outdated, or non-profile-matching resumes. Validate and tag referrals, agency submissions, and inbound profiles for hiring analytics. Work closely with Recruitment Operations to ensure interview logs, offer tracking, and candidate stage updates are accurately reflected Required Skills & Experience 3+ years of hands-on experience in data entry, CRM/ATS operations, or lead enrichment roles, preferably supporting Sales and Recruitment teams in fast-paced technology or staffing environments. CRM Tools Expertise: Proven experience with Apollo.io, HubSpot, and ZoomInfo for lead enrichment, persona mapping, and contact verification. Ability to perform advanced filtering, segmentation, and list building within these tools, aligned to Ideal Customer Profiles (ICPs) across verticals (e.g., P&C, Life, Health Insurance). Familiarity with LinkedIn Sales Navigator and Hunter.io to validate and cross-check contact intelligence. ATS Platform Knowledge: Practical working knowledge of JobDiva, Bullhorn, or equivalent ATS platforms. Proficient in candidate record de-duplication, source tagging, pipeline stage updates, and job-to-candidate mapping. Understanding of resume parsing, metadata fields, and tagging standards for large-scale recruitment workflows. Data Integrity & Operational Rigor: Demonstrated capability in managing large datasets with zero-tolerance for duplicates, missing fields, or bad data hierarchies. Strong command over Microsoft Excel or Google Sheets, including lookups, pivot tables, conditional formatting, and data validation rules. Familiarity with data hygiene automation tools and Chrome extensions for enrichment/sync (e.g., Clearbit, ContactOut, Skrapp). Process Mindset: Able to follow and continuously improve SOPs related to CRM/ATS hygiene, enrichment cycles, and reporting standards. Strong documentation discipline — can log changes, maintain audit trails, and build reusable checklists. Execution-Focused: Can process 500+ contact or candidate updates per week with high accuracy and within defined SLAs. 
Operates with speed, but never at the cost of precision; understands how poor data directly impacts sales and hiring effectiveness. Cross-Team Collaboration: Experience working with Sales Operations, Recruitment Operations, and Leadership stakeholders to prioritize cleanup backlogs and support live campaign or hiring sprint needs. Capable of translating abstract requirements into system-ready records with minimal hand-holding. Mindset & Cultural Fit You are a fixer — if you see clutter, you organize it without waiting to be told. You thrive on structure — clean data, accurate records, and tight workflows are your motivators. You understand speed + accuracy matters — and know how to balance both. You own your numbers — if there's a gap, you dig in until it's resolved. Performance Metrics Sales Contact Accuracy % ATS Duplicate Clean-Up Rate Weekly CRM/ATS Audit Completion Rate Task SLA Compliance Stakeholder Satisfaction Score (Sales & Recruitment Ops)

Posted 1 day ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Thank you for considering the Backend Engineer position at Reveal Health Tech. We are an early-stage IT startup based in the US and India, focused on leveraging technology to deliver transformative healthcare solutions. About The Applied AI Lab The Applied AI Lab is an internal R&D team at Reveal dedicated to identifying high-impact problems in healthcare, life sciences, and adjacent verticals—and transforming those insights into repeatable AI solutions. We operate as a nimble product studio within the company: researching emerging technologies, rapidly prototyping AI and ML-powered tools, and building foundational infrastructure to support long-term product plays. Our output ranges from sandbox-ready MVPs to reusable components and SaaS-aligned platforms. Our team is multi-disciplinary—engineering, design, research, and business—and we work closely with client-facing and go-to-market teams to validate our ideas in the real world. About The Role We're looking for a Backend Engineer to join the Applied AI Lab and play a critical role in building robust, scalable, and flexible backend systems that power intelligent products and agentic workflows. You'll be responsible for developing APIs, integrating ML components, and helping stand up infrastructure that connects user interfaces with AI/ML logic and data services. You'll work closely with ML engineers, frontend developers, and designers to enable end-to-end functionality, rapid iteration, and long-term reliability. This role is ideal for someone who enjoys wearing multiple hats, thrives in a fast-moving environment, and wants to build backend systems that support real product innovation. Requirements Design and implement scalable APIs and service layers to support agentic AI workflows and prototypes Work closely with ML engineers to integrate models into backend infrastructure and orchestrate inference flows Build data access layers and connect to databases, vector stores, and external APIs Collaborate with frontend engineers to define data contracts and enable seamless UI integration Develop infrastructure for task orchestration, agent state tracking, and output management Contribute to DevOps efforts (e.g., CI/CD pipelines, deployment scripts, logging/monitoring) Optimize backend systems for performance, modularity, and reuse across Lab projects Support rapid prototyping and contribute to turning MVPs into stable, reusable assets Participate in roadmap planning, design sessions, and prioritization of Lab initiatives Desired Qualifications Please note that while you do not need to be an expert in every area, being familiar with most of the following is important. We are looking for someone who can effectively integrate everything, with team support to fill any gaps. 
5+ years of experience as a backend engineer.

Programming & Backend Skills
Terraform: Proficient in writing, testing, and managing Terraform modules
Primary Language: Python (3.x), with a solid understanding of object-oriented design
Secondary (Nice to Have): Go, Node.js, or Java
API Development: Experience with RESTful APIs and/or GraphQL (FastAPI or Flask preferred), microservices, and service-oriented architecture
Testing: Pytest, unit/integration testing best practices
CI/CD Pipelines: Experience with GitHub Actions, GitLab CI, or similar

Cloud Platforms (AWS, Azure, GCP)
Compute: Lambda, ECS/Fargate, EC2
Storage: S3, EFS
Networking: VPC, Route53, API Gateway
Databases: RDS (PostgreSQL/MySQL), DynamoDB
Monitoring: CloudWatch, X-Ray
IAM: Policies, roles, permissions model

Data & Event Processing (Optional but valuable)
Message Brokers: SQS, Kafka
Data Pipelines / ETL: AWS Step Functions, Airflow (especially on MWAA), Glue
File Parsing: JSON, XML, CSV, Parquet, etc.

Tooling & Environment
Version Control: Git (GitHub or GitLab workflows)
Secrets Management: AWS Secrets Manager or SSM Parameter Store
Dev Tools: Docker Compose, Make, VS Code

How will you enrich us? Energetic and enthusiastic. Autonomous and self-motivated. Growth mindset. Embraces challenges. Building new things gets your blood pumping. Curiosity and a deep interest in the world. Challenges the status quo constructively.

Benefits
What do you get in return? Be part of a high-impact team shaping the future of our IP and product innovation strategy. Work with cutting-edge technologies in a focused but flexible lab environment. Help define how applied AI can solve real-world problems in complex, high-stakes domains. Grow with a small, mission-aligned team with executive support and a long-term vision. Industry-best compensation and benefits.

Next Steps
Send us your updated CV - if you can mention how you have enriched your previous organisation in a cover letter, that would be great! If we find your profile suitable, our Talent personnel will reach out to you to understand your profile/interests and how best we can align mutually. Finally, you would have a chat with our Leadership to understand more about us and see if this is the right next career move!
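The API development requirement above names FastAPI or Flask for RESTful service layers. Here is a minimal, hedged FastAPI sketch of the kind of typed endpoint such a service layer might expose; the route, models, and stubbed inference call are illustrative assumptions, not part of the listing.

```python
# Minimal FastAPI service-layer sketch: a typed request/response model and one
# endpoint that validates input and delegates to a stubbed ML/agent call.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class InferenceRequest(BaseModel):
    document_id: str
    question: str

class InferenceResponse(BaseModel):
    answer: str
    confidence: float

@app.post("/v1/inference", response_model=InferenceResponse)
def run_inference(req: InferenceRequest) -> InferenceResponse:
    """Validate the request, call the (stubbed) model, and return a typed response."""
    if not req.question.strip():
        raise HTTPException(status_code=422, detail="question must not be empty")
    # Stand-in for a call into the ML/agent layer.
    return InferenceResponse(answer=f"Echo for {req.document_id}: {req.question}", confidence=0.5)

# Run locally with: uvicorn main:app --reload   (assuming this file is main.py)
```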

Posted 1 day ago

Apply

5.0 years

5 - 6 Lacs

Hyderābād

On-site

Source: Glassdoor

Job title: Cyber Defense - Splunk Admin – Assistant Manager Do you thrive on developing creative and innovative insights to solve complex issues? Want to work on next-generation, cutting-edge products and services that deliver outstanding value and that are global in vision and scope? Work with other experts in your field? Work for a world-class organization that provides an exceptional career experience with an inclusive and collaborative culture? Want to make an impact that matters? Consider Deloitte Global. Work you’ll do: The Cybersecurity Engineer position supports the SIEM Health Monitoring team which is responsible for monitoring the health and performance of the Splunk platform and data within Splunk. This role is responsible for supporting the Cybersecurity, SIEM and SOC, IR, Threat Intel teams to ensure the efficacy of the Splunk platform by creating content, mitigating monitoring gaps, performing RCA on critical components and creating content. The Role also requires you to work closely with our stakeholders and clients and deliver SIEM Health Monitoring solutions accordingly. Troubleshoot and perform RCA on various data quality alerts and SIEM platform alerts. Create and drive vendor (Splunk) support cases independently. Maintain the SIEM Health Monitoring group in ServiceNow or Azure Devops and ensure all tasks and incident SLAs as met as required by our stakeholders. Create, document and update playbooks, process documents, SOW(s), RCA content periodically. Actively seek to improve and develop new content to drive process improvement and innovation. Participate in bi-annual health checks and strategize monitoring maturity road-map. Provide excellent customer service, as we will be required to interact/work with other teams to complete our daily tasks. What you’ll be part of—our Deloitte Global culture: At Deloitte, we expect results. Incredible—tangible—results. And Deloitte Global professionals play a unique role in delivering those results. We reach across disciplines and borders to serve our global organization. We are the engine of Deloitte. We develop and implement global strategies and provide programs and services that unite our network. In Deloitte Global, everyone has opportunities. We see the importance of your perspective and your ability to create value. We want you to fit in—with an inclusive culture, focus on work-life fit and well-being, and a supportive, connected environment; but we also want you to stand out—with opportunities to have a strategic impact, innovate, and take the risks necessary to make your mark. Who you’ll work with: The Deloitte Global Cybersecurity function is responsible for enhancing data protection, standardizing and securing critical infrastructure, and gaining cyber visibility through security operations centers. The Cybersecurity organization delivers a comprehensive set of security services to Deloitte’s global network of firms around the globe. Qualifications Required: Bachelor’s degree in Computer Science, Information Technology, or relevant educational or professional experience. Atleast 5 years of hands-on Splunk Enterprise and or SplunkCloud Administration experience. Splunk Enterprise Core certified Admin, Power User, & User Strong Working Knowledge of the Splunk Platform and integrations to public cloud, EDR, Networking toolsets. Proficient in troubleshooting Splunk performance and data quality issues. Strong experience in analyzing, troubleshooting and providing solutions for technical issues. 
Knowledge about various data onboarding methods (UF, HEC, DBConnect, syslog-ng, rsyslog) and means to troubleshoot them. Knowledge and experience in GIT, Microsoft Azure DevOps, or any CI/CD tools. Experience in requirement gathering and documentation. Experience in Log parsing, lookups, calculated fields extractions using regular expression (regex). Experience in creating and troubleshooting Splunk Dashboards, Reports, Alerts, Visualizations and optimize SPL searches. Sound judgment and deduction skills with a knack to see out patterns. Proactive mindset and a self-starter with minimum supervision Excellent interpersonal and organizational skills. Preferred: Splunk Enterprise Certified Admin SplunkCloud experience is a huge plus Cribl User / Admin certification Knowledge of risk assessment tools, technologies and methods Experience with Splunk Enterprise Security How you’ll grow: Deloitte Global inspires our people at every level. We believe in investing in you, helping you at every step of your career, and helping you identify and hone your unique strengths. We encourage you to grow by providing formal and informal development programs, coaching and mentoring. We want you to ask questions, take chances, and explore the possible. Benefits you’ll receive: Deloitte’s Total Rewards program reflects our continued commitment to lead from the front in everything we do — that’s why we take pride in offering a comprehensive variety of programs and resources to support your health and well-being needs. We provide the benefits, competitive compensation, and recognition to help sustain your efforts in making an impact that matters. Corporate Citizenship Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people, and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits to help you thrive At Deloitte, we know that great people make a great organization. 
Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 305357
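The qualifications above include log parsing, lookups, and calculated field extraction using regular expressions. Below is a small Python sketch of a named-group extraction over an invented log line; in Splunk the same pattern could back a search-time field extraction (for example via the rex command), though SPL itself is not shown here.

```python
# Regex field extraction sketch: pull named fields out of a key=value log line.
import re

log_line = '2024-06-01T10:15:42Z host=web-01 action=login user="asmith" src=10.2.3.4 status=failure'

PATTERN = re.compile(
    r'host=(?P<host>\S+)\s+'
    r'action=(?P<action>\S+)\s+'
    r'user="(?P<user>[^"]+)"\s+'
    r'src=(?P<src>\S+)\s+'
    r'status=(?P<status>\S+)'
)

match = PATTERN.search(log_line)
if match:
    fields = match.groupdict()
    print(fields)  # {'host': 'web-01', 'action': 'login', 'user': 'asmith', 'src': '10.2.3.4', 'status': 'failure'}
```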

Posted 1 day ago

Apply

3.0 years

0 Lacs

India

On-site

Source: Glassdoor

Line of Service Advisory Industry/Sector Not Applicable Specialism Microsoft Management Level Senior Associate Job Description & Summary At PwC, our people in cybersecurity focus on protecting organisations from cyber threats through advanced technologies and strategies. They work to identify vulnerabilities, develop secure systems, and provide proactive solutions to safeguard sensitive data. Those in security architecture at PwC will focus on designing and implementing robust security frameworks to protect organisations from cyber threats. You will develop strategies and solutions to safeguard sensitive data and enable the integrity of systems and networks. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us . At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations Job Description & Summary: A career within…. Responsibilities: 1. Splunk Environment Management: o Install, configure, and maintain Splunk software across distributed and clustered environments. o Monitor & Keep the Splunk Enterprise instances in good health to serve our customers with highest platform availability. 2. Data Collection and Integration: o Collaborate with teams to identify and integrate necessary data sources. o Manage data inputs, parsing, indexing, and storage while monitoring performance, security, and availability. o Configure and maintain forwarders and data ingestion pipelines, including custom log source integration. o Integrate Splunk with various legacy data sources using diverse protocols. 3. Search Alerts/Reporting/Dashboard: o Develop and optimize search queries, dashboards, and reports for meaningful data insights. o Create alerts and scheduled reports for critical events and stakeholder notifications. o Create visualizations and custom queries to enhance dashboards and data views. 4. User Access and Role Management: o Manage user accounts, roles, and access controls o Ensure compliance with security policies. 5. Troubleshooting and Support: o Provide technical support and resolve issues related to log outage, data ingestion, system performance, and Splunk modules. o Collaborate with security teams on vulnerabilities and incident response activities. 6. Performance Tuning and Optimization: o Conduct performance tuning and apply best practices for efficient indexing and searching. o Filtering unwanted data and ensuring data hygiene 7. Documentation and Training: o Maintain detailed documentation of configurations, policies, and procedures. 
o Provide training and support to Splunk users and stakeholders. 8. System Upgrades and Patching: o Plan and execute software updates, upgrades, and patching, assessing their impact on systems. 9. Incident Management and Response: o Participate in incident response to identify and mitigate issues, collaborating with IT and security teams. 10. Innovation and Improvement: o Research and implement new Splunk features and tools for enhanced data analysis. o Continuously seek process improvements and provide consulting services to customize Splunk for client needs. Mandatory skill sets: · Must have Splunk Enterprise Admin Certification. · Good to have Splunk Enterprise Architect Certification. · Proven experience as a Splunk Administrator or in a similar role. · Strong understanding of Splunk architecture, data collection, and log management. · Strong understanding of networking/routing fundamentals, traffic, and operating systems (Windows & Unix/Linux), TCP/IP, DNS, firewalls, security proxies. · Good knowledge of Linux/UNIX scripting and RegEx. · Excellent troubleshooting and problem-solving skills. · Ability to work independently and collaboratively in a team environment. · Strong interpersonal and communication skills. · Ready to work across different shifts and flexible on working days. Preferred skill sets: Splunk Enterprise Certified Administrator, Splunk Core Certified Power User Years of experience required: 3-7 Years Education qualification: B.Tech/B.E. Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Technology, Bachelor of Engineering Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills Splunk Phantom Optional Skills Accepting Feedback, Active Listening, Amazon Web Services (AWS), Analytical Thinking, Azure Data Factory, Communication, Compliance, Safety, Accountability (CSA), Computer Network Defense, Creativity, Cybersecurity, Cybersecurity Framework, Cybersecurity Requirements, Embracing Change, Emotional Regulation, Empathy, Encryption Technologies, Forensic Investigation, Incident Response Tool, Inclusion, Intellectual Curiosity, Java (Programming Language), Learning Agility, Optimism, Security Architecture {+ 14 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date
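
The posting above repeatedly asks for log parsing and field extraction using regular expressions. Purely as a hedged illustration (it is not part of the posting, and the log format and field names are assumptions), the following Python sketch shows the kind of named-group regex a Splunk administrator typically prototypes before turning it into a search-time field extraction:

```python
import re

# Hypothetical application log line; the format is an assumption for illustration.
LOG_LINE = '2024-05-01 10:15:32 level=ERROR user=asmith src=10.0.0.7 msg="login failed"'

# Named groups mirror the fields a Splunk field extraction would pull out.
PATTERN = re.compile(
    r'(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})\s+'
    r'level=(?P<level>\w+)\s+'
    r'user=(?P<user>\w+)\s+'
    r'src=(?P<src>[\d.]+)\s+'
    r'msg="(?P<msg>[^"]*)"'
)

def extract_fields(line: str) -> dict:
    """Return a dict of extracted fields, or an empty dict if the line does not match."""
    match = PATTERN.search(line)
    return match.groupdict() if match else {}

if __name__ == "__main__":
    print(extract_fields(LOG_LINE))
```

In practice a pattern like this would be validated against a sample of real events before being configured in the platform.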

Posted 1 day ago

Apply

3.0 years

5 - 8 Lacs

Calcutta

On-site

GlassDoor logo

Line of Service Advisory Industry/Sector Not Applicable Specialism Microsoft Management Level Senior Associate Job Description & Summary At PwC, our people in cybersecurity focus on protecting organisations from cyber threats through advanced technologies and strategies. They work to identify vulnerabilities, develop secure systems, and provide proactive solutions to safeguard sensitive data. Those in security architecture at PwC will focus on designing and implementing robust security frameworks to protect organisations from cyber threats. You will develop strategies and solutions to safeguard sensitive data and enable the integrity of systems and networks. Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. Responsibilities: 1. Splunk Environment Management: o Install, configure, and maintain Splunk software across distributed and clustered environments. o Monitor and keep the Splunk Enterprise instances in good health to serve our customers with the highest platform availability. 2. Data Collection and Integration: o Collaborate with teams to identify and integrate necessary data sources. o Manage data inputs, parsing, indexing, and storage while monitoring performance, security, and availability. o Configure and maintain forwarders and data ingestion pipelines, including custom log source integration. o Integrate Splunk with various legacy data sources using diverse protocols. 3. Search Alerts/Reporting/Dashboards: o Develop and optimize search queries, dashboards, and reports for meaningful data insights. o Create alerts and scheduled reports for critical events and stakeholder notifications. o Create visualizations and custom queries to enhance dashboards and data views. 4. User Access and Role Management: o Manage user accounts, roles, and access controls. o Ensure compliance with security policies. 5. Troubleshooting and Support: o Provide technical support and resolve issues related to log outages, data ingestion, system performance, and Splunk modules. o Collaborate with security teams on vulnerabilities and incident response activities. 6. Performance Tuning and Optimization: o Conduct performance tuning and apply best practices for efficient indexing and searching. o Filter unwanted data and ensure data hygiene. 7. Documentation and Training: o Maintain detailed documentation of configurations, policies, and procedures.
o Provide training and support to Splunk users and stakeholders. 8. System Upgrades and Patching: o Plan and execute software updates, upgrades, and patching, assessing their impact on systems. 9. Incident Management and Response: o Participate in incident response to identify and mitigate issues, collaborating with IT and security teams. 10. Innovation and Improvement: o Research and implement new Splunk features and tools for enhanced data analysis. o Continuously seek process improvements and provide consulting services to customize Splunk for client needs. Mandatory skill sets: · Must have Splunk Enterprise Admin Certification. · Good to have Splunk Enterprise Architect Certification. · Proven experience as a Splunk Administrator or in a similar role. · Strong understanding of Splunk architecture, data collection, and log management. · Strong understanding of networking/routing fundamentals, traffic, and operating systems (Windows & Unix/Linux), TCP/IP, DNS, firewalls, security proxies. · Good knowledge of Linux/UNIX scripting and RegEx. · Excellent troubleshooting and problem-solving skills. · Ability to work independently and collaboratively in a team environment. · Strong interpersonal and communication skills. · Ready to work across different shifts and flexible on working days. Preferred skill sets: Splunk Enterprise Certified Administrator, Splunk Core Certified Power User Years of experience required: 3-7 Years Education qualification: B.Tech/B.E. Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Technology, Bachelor of Engineering Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills Splunk Administration Optional Skills Accepting Feedback, Active Listening, Amazon Web Services (AWS), Analytical Thinking, Azure Data Factory, Communication, Compliance, Safety, Accountability (CSA), Computer Network Defense, Creativity, Cybersecurity, Cybersecurity Framework, Cybersecurity Requirements, Embracing Change, Emotional Regulation, Empathy, Encryption Technologies, Forensic Investigation, Incident Response Tool, Inclusion, Intellectual Curiosity, Java (Programming Language), Learning Agility, Optimism, Security Architecture {+ 14 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date

Posted 1 day ago

Apply

3.0 - 7.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Linkedin logo

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Title: Informatica IDQ/CDQ Developer / Data Quality Specialist-Senior Job Summary: We are seeking a highly skilled Informatica DQ (Data Quality)/Cloud Data Quality Developer. The candidate will be responsible for designing, developing, and deploying data quality solutions using Informatica IDQ/CDQ to ensure accurate, complete, and reliable data across the enterprise. This role involves close collaboration with data stewards, business analysts, data engineers, and governance teams to define and enforce data quality standards, rules, and processes. Key Responsibilities: Design and implement data quality rules, scorecards, and dashboards using Informatica DQ. Perform data profiling, data cleansing, standardization, parsing, matching, and de-duplication. Collaborate with business stakeholders to define data quality metrics, thresholds, and SLAs. Develop reusable data quality assets (rules, mappings, workflows) and deploy them in production. Integrate DQ solutions with Informatica MDM, PowerCenter, IICS, or other ETL platforms. Monitor and troubleshoot DQ jobs and provide data quality issue resolution support. Work with data stewards and governance teams to establish data stewardship workflows. Conduct data analysis to identify root causes of data quality issues and recommend improvements. Create and maintain technical documentation, including data dictionaries and rule repositories. Participate in data governance programs, supporting continuous improvement and regulatory compliance. Required Qualifications: 3-7 years of experience in Informatica Data Quality (IDQ/CDQ) development. Strong knowledge of data profiling, cleansing, and transformation techniques. Proficiency in building Informatica DQ mappings, mapplets, workflows, and scorecards. Experience working with relational databases (Oracle, SQL Server, etc.) and writing SQL queries. Familiarity with data governance frameworks and master data concepts. Solid understanding of data lifecycle management and data architecture principles. Strong problem-solving, analytical, and communication skills. Preferred Qualifications: Informatica DQ certification. Experience with IICS (Informatica Intelligent Cloud Services) and Cloud Data Quality modules. Exposure to data governance tools like Informatica Axon, Collibra , or similar. Familiarity with Agile or DevOps methodologies and tools like JIRA, Git, Jenkins. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
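
The Informatica IDQ/CDQ role above centres on profiling, validity rules, and de-duplication. Purely as a hedged illustration of those concepts (this is plain pandas, not Informatica, and the column names and rules are assumptions invented for the example), a minimal sketch might look like:

```python
import pandas as pd  # third-party: pip install pandas

# Hypothetical customer records; columns and rules are assumptions for illustration.
df = pd.DataFrame({
    "customer_id": [1, 2, 2, 3],
    "email": ["a@example.com", "bad-email", "bad-email", None],
    "phone": ["+91-9876543210", "12345", "12345", "+91-9123456780"],
})

# Completeness: share of non-null values per column (a basic profiling metric).
completeness = df.notna().mean()

# Validity: a simple email-format check, analogous to a reusable data quality rule.
valid_email = df["email"].fillna("").str.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

# Uniqueness / de-duplication: drop exact duplicate records.
deduped = df.drop_duplicates()

print("Completeness:\n", completeness)
print("Invalid emails:", df.loc[~valid_email, "email"].tolist())
print("Rows after de-duplication:", len(deduped))
```

A dedicated DQ platform adds reusable rule libraries, scorecards, and stewardship workflows on top of checks like these; the sketch only shows the underlying ideas.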

Posted 1 day ago

Apply

3.0 - 7.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

Linkedin logo

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Title: Informatica IDQ/CDQ Developer / Data Quality Specialist-Senior Job Summary: We are seeking a highly skilled Informatica DQ (Data Quality)/Cloud Data Quality Developer. The candidate will be responsible for designing, developing, and deploying data quality solutions using Informatica IDQ/CDQ to ensure accurate, complete, and reliable data across the enterprise. This role involves close collaboration with data stewards, business analysts, data engineers, and governance teams to define and enforce data quality standards, rules, and processes. Key Responsibilities: Design and implement data quality rules, scorecards, and dashboards using Informatica DQ. Perform data profiling, data cleansing, standardization, parsing, matching, and de-duplication. Collaborate with business stakeholders to define data quality metrics, thresholds, and SLAs. Develop reusable data quality assets (rules, mappings, workflows) and deploy them in production. Integrate DQ solutions with Informatica MDM, PowerCenter, IICS, or other ETL platforms. Monitor and troubleshoot DQ jobs and provide data quality issue resolution support. Work with data stewards and governance teams to establish data stewardship workflows. Conduct data analysis to identify root causes of data quality issues and recommend improvements. Create and maintain technical documentation, including data dictionaries and rule repositories. Participate in data governance programs, supporting continuous improvement and regulatory compliance. Required Qualifications: 3-7 years of experience in Informatica Data Quality (IDQ/CDQ) development. Strong knowledge of data profiling, cleansing, and transformation techniques. Proficiency in building Informatica DQ mappings, mapplets, workflows, and scorecards. Experience working with relational databases (Oracle, SQL Server, etc.) and writing SQL queries. Familiarity with data governance frameworks and master data concepts. Solid understanding of data lifecycle management and data architecture principles. Strong problem-solving, analytical, and communication skills. Preferred Qualifications: Informatica DQ certification. Experience with IICS (Informatica Intelligent Cloud Services) and Cloud Data Quality modules. Exposure to data governance tools like Informatica Axon, Collibra , or similar. Familiarity with Agile or DevOps methodologies and tools like JIRA, Git, Jenkins. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 day ago

Apply

0 years

0 Lacs

Vadodara, Gujarat, India

Remote

Linkedin logo

We’re reinventing the market research industry. Let’s reinvent it together. At Numerator, we believe tomorrow’s success starts with today’s market intelligence. We empower the world’s leading brands and retailers with unmatched insights into consumer behavior and the influencers that drive it. About the role: Numerator is currently looking for a Configuration Engineer to join our Data Extraction (DX) team in India (Remote). In this role, you will be responsible for helping to grow and maintain a library of thousands of receipt parsing configurations used by Fortune 500 brands and retailers. Day-to-day, you’ll come up with creative solutions to complex problems, and learn new skills to complement your existing abilities. This is a great role for those who are looking for hands-on experience with high visibility and impact. We welcome fresh ideas and approaches as we constantly aim to improve our development processes. Our team has experience using a wide range of technologies and years of cloud and big data experience. We are always learning and growing, so we can guarantee that you won’t be bored with us! If you are seeking an environment where you get to do meaningful work with other great engineers, then we want to hear from you! What you will get to do: Write clean, efficient, thoroughly tested code, backed by pair programming and code reviews. Much of our code is Python, but we use all kinds of languages and frameworks. Create complex regexes that pull structured data out of OCR-transcribed receipt images as well as XPaths to extract data from receipt emails. Maintain the platform that drives our receipt extraction at scale. Troubleshoot, test, and maintain the platform and configurations to ensure strong optimization and functionality. Evaluate the technical tradeoffs of decisions and build things that last and scale. Maintain and fix existing configuration issues. Create and analyze new configuration technologies - figuring out how we can scale up our receipt extraction. Skills & Requirements: Programming experience in Python. An eagerness to learn new things and improve upon existing skills, abilities, and practices. Familiarity with web technology, such as HTTP, JSON, HTML, XPath, or JavaScript. Knowledge of Agile software development environments. Experience with version control systems (Git, Subversion, etc.). A real passion for clean code and finding elegant solutions to problems. Eagerness to expand your knowledge and abilities in Python and cloud-based technologies. Motivation to participate in ongoing learning and growth through pair programming, test-driven development, code reviews, and application of new technologies and best practices. You look ahead to identify opportunities and foster a culture of innovation. Good communication skills (verbal and written). What You'll Bring to Numerator Requirements: Knowledge of web scraping. Knowledge of regular expressions. Knowledge of business rules engines. Familiarity with virtual software development environments (i.e., Vagrant, Docker, etc.). Familiarity with object-oriented programming. Scripting knowledge. Familiarity with JSON and similar data formats. Experience with databases, SQL or NoSQL. Programming experience on Unix-based infrastructure. Knowledge of cloud-based systems (EC2, Rackspace, etc.). Expertise with big data, analytics, machine learning, and personalization.
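
The Numerator role above names two concrete techniques: regexes over OCR-transcribed receipt text and XPath over receipt emails. The Python sketch below is a hedged illustration of both; the receipt layout, email markup, and selectors are invented for the example and would differ per retailer in real configurations:

```python
import re
from lxml import html  # third-party: pip install lxml

# Hypothetical OCR output from a receipt image; the layout is an assumption.
OCR_TEXT = """ACME SUPERMART
2024-05-01 18:42
MILK 1L            2.49
BREAD              1.99
TOTAL              4.48"""

# Pulls rows of the form "<name>  <price>"; note the TOTAL row also matches this
# simple pattern, so real configurations separate line items from totals.
ITEM_RE = re.compile(r"^(?P<name>[A-Z][A-Z0-9 ]+?)\s{2,}(?P<price>\d+\.\d{2})$", re.MULTILINE)
items = [m.groupdict() for m in ITEM_RE.finditer(OCR_TEXT)]

# Hypothetical e-receipt email body; the XPath targets an assumed markup structure.
EMAIL_HTML = "<html><body><table><tr><td class='total'>4.48</td></tr></table></body></html>"
tree = html.fromstring(EMAIL_HTML)
total = tree.xpath("//td[@class='total']/text()")

print(items)   # e.g. [{'name': 'MILK 1L', 'price': '2.49'}, ...]
print(total)   # e.g. ['4.48']
```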

Posted 1 day ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Summary Position Summary Job title: Cyber Defense - Splunk Admin – Assistant Manager Do you thrive on developing creative and innovative insights to solve complex issues? Want to work on next-generation, cutting-edge products and services that deliver outstanding value and that are global in vision and scope? Work with other experts in your field? Work for a world-class organization that provides an exceptional career experience with an inclusive and collaborative culture? Want to make an impact that matters? Consider Deloitte Global. Work you’ll do: The Cybersecurity Engineer position supports the SIEM Health Monitoring team, which is responsible for monitoring the health and performance of the Splunk platform and data within Splunk. This role is responsible for supporting the Cybersecurity, SIEM and SOC, IR, and Threat Intel teams to ensure the efficacy of the Splunk platform by creating content, mitigating monitoring gaps, and performing RCA on critical components. The role also requires you to work closely with our stakeholders and clients and deliver SIEM Health Monitoring solutions accordingly. Troubleshoot and perform RCA on various data quality alerts and SIEM platform alerts. Create and drive vendor (Splunk) support cases independently. Maintain the SIEM Health Monitoring group in ServiceNow or Azure DevOps and ensure all task and incident SLAs are met as required by our stakeholders. Create, document and update playbooks, process documents, SOW(s), and RCA content periodically. Actively seek to improve and develop new content to drive process improvement and innovation. Participate in bi-annual health checks and strategize the monitoring maturity roadmap. Provide excellent customer service, as we will be required to interact/work with other teams to complete our daily tasks. What you’ll be part of—our Deloitte Global culture: At Deloitte, we expect results. Incredible—tangible—results. And Deloitte Global professionals play a unique role in delivering those results. We reach across disciplines and borders to serve our global organization. We are the engine of Deloitte. We develop and implement global strategies and provide programs and services that unite our network. In Deloitte Global, everyone has opportunities. We see the importance of your perspective and your ability to create value. We want you to fit in—with an inclusive culture, focus on work-life fit and well-being, and a supportive, connected environment; but we also want you to stand out—with opportunities to have a strategic impact, innovate, and take the risks necessary to make your mark. Who you’ll work with: The Deloitte Global Cybersecurity function is responsible for enhancing data protection, standardizing and securing critical infrastructure, and gaining cyber visibility through security operations centers. The Cybersecurity organization delivers a comprehensive set of security services to Deloitte’s global network of firms around the globe. Qualifications Required: Bachelor’s degree in Computer Science, Information Technology, or relevant educational or professional experience. At least 5 years of hands-on Splunk Enterprise and/or Splunk Cloud administration experience. Splunk Enterprise Core Certified Admin, Power User, & User. Strong working knowledge of the Splunk platform and integrations with public cloud, EDR, and networking toolsets. Proficient in troubleshooting Splunk performance and data quality issues. Strong experience in analyzing, troubleshooting and providing solutions for technical issues.
Knowledge of various data onboarding methods (UF, HEC, DBConnect, syslog-ng, rsyslog) and the means to troubleshoot them. Knowledge and experience in Git, Microsoft Azure DevOps, or any CI/CD tools. Experience in requirement gathering and documentation. Experience in log parsing, lookups, and calculated field extractions using regular expressions (regex). Experience in creating and troubleshooting Splunk Dashboards, Reports, Alerts, and Visualizations, and in optimizing SPL searches. Sound judgment and deduction skills with a knack for spotting patterns. A proactive, self-starter mindset requiring minimal supervision. Excellent interpersonal and organizational skills. Preferred: Splunk Enterprise Certified Admin. Splunk Cloud experience is a huge plus. Cribl User/Admin certification. Knowledge of risk assessment tools, technologies, and methods. Experience with Splunk Enterprise Security. How you’ll grow: Deloitte Global inspires our people at every level. We believe in investing in you, helping you at every step of your career, and helping you identify and hone your unique strengths. We encourage you to grow by providing formal and informal development programs, coaching and mentoring. We want you to ask questions, take chances, and explore the possible. Benefits you’ll receive: Deloitte’s Total Rewards program reflects our continued commitment to lead from the front in everything we do — that’s why we take pride in offering a comprehensive variety of programs and resources to support your health and well-being needs. We provide the benefits, competitive compensation, and recognition to help sustain your efforts in making an impact that matters. Corporate Citizenship Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people, and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization.
Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 305357
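
The Deloitte posting above lists HEC (HTTP Event Collector) among the data onboarding methods to know. As a hedged sketch only (the host, token, sourcetype, and index below are placeholders, not values from the posting), one common way to smoke-test an HEC input from Python is to post a single JSON event to the collector endpoint:

```python
import requests  # third-party: pip install requests

# Placeholder values; the real host, token, and index come from your Splunk deployment.
HEC_URL = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"

def send_test_event() -> int:
    """Send a single test event to the HTTP Event Collector and return the HTTP status."""
    payload = {
        "event": {"message": "HEC onboarding smoke test", "severity": "info"},
        "sourcetype": "hec:smoke_test",
        "index": "main",
    }
    resp = requests.post(
        HEC_URL,
        json=payload,
        headers={"Authorization": f"Splunk {HEC_TOKEN}"},
        timeout=10,
    )
    return resp.status_code

if __name__ == "__main__":
    print(send_test_event())  # 200 indicates the collector accepted the event
```

Environments with self-signed certificates or indexer acknowledgement enabled need additional configuration; this sketch only checks basic reachability and token validity.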

Posted 1 day ago

Apply

5.0 - 11.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Linkedin logo

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Senior (CTM – Threat Detection & Response) KEY Capabilities: Experience in working with Splunk Enterprise, Splunk Enterprise Security & Splunk UEBA Minimum of Splunk Power User Certification Good knowledge of programming or scripting languages such as Python (preferred), JavaScript (preferred), Bash, PowerShell, etc. Perform remote and on-site gap assessment of the SIEM solution. Define evaluation criteria & approach based on the client requirement & scope, factoring in industry best practices & regulations Conduct interviews with stakeholders and review documents (SOPs, architecture diagrams, etc.) Evaluate SIEM based on the defined criteria and prepare audit reports Good experience in providing consulting to customers during the testing, evaluation, pilot, production and training phases to ensure a successful deployment. Understand customer requirements and recommend best practices for SIEM solutions. Offer consultative advice on security principles and best practices related to SIEM operations Design and document a SIEM solution to meet the customer's needs Experience in onboarding data into Splunk from various sources, including unsupported (in-house built) sources, by creating custom parsers Verification of log source data in the SIEM, following the Common Information Model (CIM) Experience in parsing and masking of data prior to ingestion in SIEM Provide support for the data collection, processing, analysis and operational reporting systems including planning, installation, configuration, testing, troubleshooting and problem resolution Assist clients to fully optimize the SIEM system capabilities as well as the audit and logging features of the event log sources Assist client with technical guidance to configure end log sources (in-scope) to be integrated to the SIEM Experience in handling big data integration via Splunk Expertise in SIEM content development, which includes developing processes for automated security event monitoring and alerting along with corresponding event response plans for systems Hands-on experience in development and customization of Splunk Apps & Add-Ons Build advanced visualizations (interactive drilldowns, glass tables, etc.) Build and integrate contextual data into notable events Experience in creating use cases under the Cyber Kill Chain and MITRE ATT&CK frameworks Capability in developing advanced dashboards (with CSS, JavaScript, HTML, XML) and reports that can provide near real-time visibility into the performance of client applications. Experience in installation, configuration and usage of premium Splunk Apps and Add-ons such as ES App, UEBA, ITSI, etc. Sound knowledge of the configuration of alerts and reports. Good exposure to automatic lookups, data models, and creating complex SPL queries. Create, modify and tune the SIEM rules to adjust the specifications of alerts and incidents to meet client requirements Work with the client SPOC for correlation rule tuning (as per the use case management life cycle), incident classification, and prioritization recommendations Experience in creating custom commands, custom alert actions, adaptive response actions, etc.
Qualification & experience: Minimum of 5 to 11 years’ experience with a depth of network architecture knowledge that will translate over to deploying and integrating a complicated security intelligence solution into global enterprise environments. Strong oral, written and listening skills are an essential component of effective consulting. Strong background in network administration. Ability to work at all layers of the OSI model, including being able to explain communication at any level, is necessary. Must have knowledge of Vulnerability Management, Windows and Linux basics including installations, Windows Domains, trusts, GPOs, server roles, Windows security policies, user administration, Linux security and troubleshooting. Good to have the below-mentioned experience with the design and implementation of Splunk, with a focus on IT Operations, Application Analytics, User Experience, Application Performance, and Security Management: Multiple cluster deployments & management experience as per vendor guidelines and industry best practices. Troubleshoot Splunk platform and application issues, escalate issues, and work with Splunk support to resolve them. Certification in any one of the SIEM solutions such as IBM QRadar, Exabeam, or Securonix will be an added advantage. Certifications in a core security-related discipline will be an added advantage. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
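
One capability the EY posting above calls out is parsing and masking data prior to ingestion in the SIEM. The following Python sketch illustrates the general idea with a few regex-based masking rules; the patterns, replacement tokens, and sample event are assumptions for illustration, not a prescribed EY or Splunk approach:

```python
import re

# Simple masking rules; patterns and tokens are assumptions for illustration.
MASKS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "<SSN>"),          # US-style SSN
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "<EMAIL>"),  # email addresses
    (re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"), "<IP>"),     # IPv4 addresses
]

def mask_event(raw_event: str) -> str:
    """Return the event with sensitive substrings replaced before ingestion."""
    masked = raw_event
    for pattern, token in MASKS:
        masked = pattern.sub(token, masked)
    return masked

if __name__ == "__main__":
    sample = "user=jdoe email=jdoe@example.com src=192.168.1.20 ssn=123-45-6789 action=login"
    print(mask_event(sample))
    # user=jdoe email=<EMAIL> src=<IP> ssn=<SSN> action=login
```

In production this kind of transformation usually runs in the ingestion pipeline itself rather than in a standalone script, but the rule structure is the same.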

Posted 1 day ago

Apply

5.0 - 11.0 years

0 Lacs

Kolkata, West Bengal, India

Remote

Linkedin logo

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Senior (CTM – Threat Detection & Response) KEY Capabilities: Experience in working with Splunk Enterprise, Splunk Enterprise Security & Splunk UEBA Minimum of Splunk Power User Certification Good knowledge of programming or scripting languages such as Python (preferred), JavaScript (preferred), Bash, PowerShell, etc. Perform remote and on-site gap assessment of the SIEM solution. Define evaluation criteria & approach based on the client requirement & scope, factoring in industry best practices & regulations Conduct interviews with stakeholders and review documents (SOPs, architecture diagrams, etc.) Evaluate SIEM based on the defined criteria and prepare audit reports Good experience in providing consulting to customers during the testing, evaluation, pilot, production and training phases to ensure a successful deployment. Understand customer requirements and recommend best practices for SIEM solutions. Offer consultative advice on security principles and best practices related to SIEM operations Design and document a SIEM solution to meet the customer's needs Experience in onboarding data into Splunk from various sources, including unsupported (in-house built) sources, by creating custom parsers Verification of log source data in the SIEM, following the Common Information Model (CIM) Experience in parsing and masking of data prior to ingestion in SIEM Provide support for the data collection, processing, analysis and operational reporting systems including planning, installation, configuration, testing, troubleshooting and problem resolution Assist clients to fully optimize the SIEM system capabilities as well as the audit and logging features of the event log sources Assist client with technical guidance to configure end log sources (in-scope) to be integrated to the SIEM Experience in handling big data integration via Splunk Expertise in SIEM content development, which includes developing processes for automated security event monitoring and alerting along with corresponding event response plans for systems Hands-on experience in development and customization of Splunk Apps & Add-Ons Build advanced visualizations (interactive drilldowns, glass tables, etc.) Build and integrate contextual data into notable events Experience in creating use cases under the Cyber Kill Chain and MITRE ATT&CK frameworks Capability in developing advanced dashboards (with CSS, JavaScript, HTML, XML) and reports that can provide near real-time visibility into the performance of client applications. Experience in installation, configuration and usage of premium Splunk Apps and Add-ons such as ES App, UEBA, ITSI, etc. Sound knowledge of the configuration of alerts and reports. Good exposure to automatic lookups, data models, and creating complex SPL queries. Create, modify and tune the SIEM rules to adjust the specifications of alerts and incidents to meet client requirements Work with the client SPOC for correlation rule tuning (as per the use case management life cycle), incident classification, and prioritization recommendations Experience in creating custom commands, custom alert actions, adaptive response actions, etc.
Qualification & experience: Minimum of 5 to 11 years’ experience with a depth of network architecture knowledge that will translate over to deploying and integrating a complicated security intelligence solution into global enterprise environments. Strong oral, written and listening skills are an essential component of effective consulting. Strong background in network administration. Ability to work at all layers of the OSI model, including being able to explain communication at any level, is necessary. Must have knowledge of Vulnerability Management, Windows and Linux basics including installations, Windows Domains, trusts, GPOs, server roles, Windows security policies, user administration, Linux security and troubleshooting. Good to have the below-mentioned experience with the design and implementation of Splunk, with a focus on IT Operations, Application Analytics, User Experience, Application Performance, and Security Management: Multiple cluster deployments & management experience as per vendor guidelines and industry best practices. Troubleshoot Splunk platform and application issues, escalate issues, and work with Splunk support to resolve them. Certification in any one of the SIEM solutions such as IBM QRadar, Exabeam, or Securonix will be an added advantage. Certifications in a core security-related discipline will be an added advantage. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 day ago

Apply

5.0 - 11.0 years

0 Lacs

Kanayannur, Kerala, India

Remote

Linkedin logo

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Senior (CTM – Threat Detection & Response) KEY Capabilities: Experience in working with Splunk Enterprise, Splunk Enterprise Security & Splunk UEBA Minimum of Splunk Power User Certification Good knowledge of programming or scripting languages such as Python (preferred), JavaScript (preferred), Bash, PowerShell, etc. Perform remote and on-site gap assessment of the SIEM solution. Define evaluation criteria & approach based on the client requirement & scope, factoring in industry best practices & regulations Conduct interviews with stakeholders and review documents (SOPs, architecture diagrams, etc.) Evaluate SIEM based on the defined criteria and prepare audit reports Good experience in providing consulting to customers during the testing, evaluation, pilot, production and training phases to ensure a successful deployment. Understand customer requirements and recommend best practices for SIEM solutions. Offer consultative advice on security principles and best practices related to SIEM operations Design and document a SIEM solution to meet the customer's needs Experience in onboarding data into Splunk from various sources, including unsupported (in-house built) sources, by creating custom parsers Verification of log source data in the SIEM, following the Common Information Model (CIM) Experience in parsing and masking of data prior to ingestion in SIEM Provide support for the data collection, processing, analysis and operational reporting systems including planning, installation, configuration, testing, troubleshooting and problem resolution Assist clients to fully optimize the SIEM system capabilities as well as the audit and logging features of the event log sources Assist client with technical guidance to configure end log sources (in-scope) to be integrated to the SIEM Experience in handling big data integration via Splunk Expertise in SIEM content development, which includes developing processes for automated security event monitoring and alerting along with corresponding event response plans for systems Hands-on experience in development and customization of Splunk Apps & Add-Ons Build advanced visualizations (interactive drilldowns, glass tables, etc.) Build and integrate contextual data into notable events Experience in creating use cases under the Cyber Kill Chain and MITRE ATT&CK frameworks Capability in developing advanced dashboards (with CSS, JavaScript, HTML, XML) and reports that can provide near real-time visibility into the performance of client applications. Experience in installation, configuration and usage of premium Splunk Apps and Add-ons such as ES App, UEBA, ITSI, etc. Sound knowledge of the configuration of alerts and reports. Good exposure to automatic lookups, data models, and creating complex SPL queries. Create, modify and tune the SIEM rules to adjust the specifications of alerts and incidents to meet client requirements Work with the client SPOC for correlation rule tuning (as per the use case management life cycle), incident classification, and prioritization recommendations Experience in creating custom commands, custom alert actions, adaptive response actions, etc.
Qualification & experience: Minimum of 5 to 11 years’ experience with a depth of network architecture knowledge that will translate over to deploying and integrating a complicated security intelligence solution into global enterprise environments. Strong oral, written and listening skills are an essential component of effective consulting. Strong background in network administration. Ability to work at all layers of the OSI model, including being able to explain communication at any level, is necessary. Must have knowledge of Vulnerability Management, Windows and Linux basics including installations, Windows Domains, trusts, GPOs, server roles, Windows security policies, user administration, Linux security and troubleshooting. Good to have the below-mentioned experience with the design and implementation of Splunk, with a focus on IT Operations, Application Analytics, User Experience, Application Performance, and Security Management: Multiple cluster deployments & management experience as per vendor guidelines and industry best practices. Troubleshoot Splunk platform and application issues, escalate issues, and work with Splunk support to resolve them. Certification in any one of the SIEM solutions such as IBM QRadar, Exabeam, or Securonix will be an added advantage. Certifications in a core security-related discipline will be an added advantage. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 day ago

Apply

Exploring Parsing Jobs in India

The parsing job market in India is thriving, with a growing demand for professionals skilled in parsing techniques across various industries. Employers are actively seeking individuals who can effectively extract and analyze structured data from different sources. If you are a job seeker looking to explore parsing roles in India, this article will provide you with valuable insights and guidance.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Delhi

These cities are known for their vibrant tech industries and offer numerous opportunities for individuals interested in parsing roles.

Average Salary Range

The average salary range for parsing professionals in India varies with experience level. Entry-level professionals can expect to earn between INR 3-6 lakhs per annum, while experienced professionals can command salaries ranging from INR 8-15 lakhs per annum.

Career Path

In the field of parsing, a typical career path may include the following progression:

  1. Junior Developer
  2. Software Engineer
  3. Senior Developer
  4. Tech Lead
  5. Architect

As professionals gain experience and expertise in parsing techniques, they can advance to higher roles with increased responsibilities.

Related Skills

In addition to parsing skills, individuals pursuing roles in this field are often expected to possess or develop the following skills (a short example combining several of them appears after this list):

  • Data analysis
  • Programming languages (e.g., Python, Java)
  • Knowledge of databases
  • Problem-solving abilities
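
As a small, hedged illustration of how these skills combine in day-to-day parsing work (the CSV content, table name, and column names below are invented for the example), the following Python sketch parses a few records and runs a simple SQL query over them:

```python
import csv
import io
import sqlite3

# A tiny inline CSV; in practice this would come from a file, an API, or a scraped page.
RAW_CSV = "name,language,years\nAsha,Python,3\nRahul,Java,5\n"

# Parsing step: read the CSV into dictionaries and convert numeric fields.
rows = [{**r, "years": int(r["years"])} for r in csv.DictReader(io.StringIO(RAW_CSV))]

# Database step: load the parsed records into an in-memory SQLite table and query them.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE candidates (name TEXT, language TEXT, years INTEGER)")
conn.executemany("INSERT INTO candidates VALUES (:name, :language, :years)", rows)

for record in conn.execute("SELECT name, years FROM candidates WHERE language = 'Python'"):
    print(record)  # ('Asha', 3)
```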

Interview Questions

  • What is parsing and why is it important? (basic)
  • Explain the difference between lexical analysis and syntax analysis. (medium)
  • How does a parser handle ambiguity in a grammar? (advanced)
  • Can you give an example of a top-down parsing algorithm? (medium)
  • What is the difference between LL and LR parsing? (advanced)
  • What are the common parsing techniques used in natural language processing? (medium)
  • How do you handle errors during parsing? (basic)
  • Explain the concept of parsing trees. (medium)
  • What is the role of a parser generator in parsing? (medium)
  • How does a recursive descent parser work? (advanced; see the sketch after this list)
  • Can you explain the shift-reduce parsing technique? (medium)
  • What are the advantages and disadvantages of bottom-up parsing? (medium)
  • How does a predictive parser work? (advanced)
  • What is the difference between top-down and bottom-up parsing? (medium)
  • How can you optimize a parsing algorithm for efficiency? (advanced)
  • Explain the concept of tokenization in parsing. (basic)
  • How do you handle nested structures during parsing? (medium)
  • What is the role of a lexer in the parsing process? (basic)
  • Can you explain how a shift-reduce parser operates? (medium)
  • What are the limitations of recursive descent parsing? (medium)
  • How do you handle left-recursion in parsing? (advanced)
  • Explain the concept of ambiguous grammars and how they are resolved during parsing. (advanced)
  • What are some common challenges faced during parsing large datasets? (medium)
  • How do you choose the appropriate parsing technique for a given problem? (medium)
  • Can you discuss the impact of parsing errors on data analysis? (basic)
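
Several of the questions above (tokenization, recursive descent, rewriting left recursion, error handling) can be tied together in one small program. The sketch below is a minimal recursive descent parser and evaluator for arithmetic expressions in Python; the grammar and token set are chosen purely for illustration, with the left-recursive rules rewritten as iteration:

```python
import re

# Grammar (left recursion rewritten as repetition):
#   expr   -> term (('+' | '-') term)*
#   term   -> factor (('*' | '/') factor)*
#   factor -> NUMBER | '(' expr ')'

TOKEN_RE = re.compile(r"\s*(?:(?P<num>\d+(?:\.\d+)?)|(?P<op>[+\-*/()]))")

def tokenize(text: str):
    """Lexical analysis: turn the input string into a list of (kind, value) tokens."""
    tokens, pos, text = [], 0, text.strip()
    while pos < len(text):
        match = TOKEN_RE.match(text, pos)
        if not match:
            raise SyntaxError(f"Unexpected character at position {pos}: {text[pos]!r}")
        if match.group("num") is not None:
            tokens.append(("NUM", float(match.group("num"))))
        else:
            tokens.append(("OP", match.group("op")))
        pos = match.end()
    return tokens

class Parser:
    """A tiny recursive descent parser that evaluates arithmetic expressions."""

    def __init__(self, tokens):
        self.tokens = tokens
        self.pos = 0

    def peek(self):
        return self.tokens[self.pos] if self.pos < len(self.tokens) else (None, None)

    def expect_op(self, op):
        kind, value = self.peek()
        if kind != "OP" or value != op:
            raise SyntaxError(f"Expected {op!r}, found {value!r}")
        self.pos += 1

    def parse_expr(self):
        value = self.parse_term()
        while self.peek() in (("OP", "+"), ("OP", "-")):
            op = self.tokens[self.pos][1]
            self.pos += 1
            rhs = self.parse_term()
            value = value + rhs if op == "+" else value - rhs
        return value

    def parse_term(self):
        value = self.parse_factor()
        while self.peek() in (("OP", "*"), ("OP", "/")):
            op = self.tokens[self.pos][1]
            self.pos += 1
            rhs = self.parse_factor()
            value = value * rhs if op == "*" else value / rhs
        return value

    def parse_factor(self):
        kind, value = self.peek()
        if kind == "NUM":
            self.pos += 1
            return value
        if (kind, value) == ("OP", "("):
            self.pos += 1
            inner = self.parse_expr()
            self.expect_op(")")
            return inner
        raise SyntaxError(f"Unexpected token: {value!r}")

def evaluate(text: str) -> float:
    parser = Parser(tokenize(text))
    result = parser.parse_expr()
    if parser.pos != len(parser.tokens):
        raise SyntaxError("Trailing input after a complete expression")
    return result

if __name__ == "__main__":
    print(evaluate("2 + 3 * (4 - 1)"))   # 11.0
```

Walking through how parse_expr, parse_term, and parse_factor mirror the grammar rules, and where SyntaxError is raised, covers the recursive descent, left-recursion, and error-handling questions in a single worked example.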

Closing Remark

As you prepare for parsing roles in India, remember to showcase your expertise in parsing techniques and related skills during interviews. Stay updated with the latest trends in the field and practice answering common interview questions to boost your confidence. With dedication and perseverance, you can secure a rewarding career in parsing in India. Good luck!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies