
327 Solr Jobs - Page 4

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

4.0 years

0 Lacs

India

Remote


Job Title: MERN Stack Developer
Experience: 4+ Years
Location: Remote
Contract Duration: Long Term
Work Timing: IST Shift

Job Description: We are looking for an experienced MERN Stack Developer with over 4 years of expertise in building scalable, high-performance web applications. This is a long-term remote opportunity ideal for a passionate full stack developer with strong hands-on experience in the MERN stack, microservices architecture, search technologies (Elasticsearch/Solr), and AWS cloud services.

Responsibilities:
- Develop, maintain, and scale full stack applications using MongoDB, Express, React, and Node.js.
- Work with Vue.js and Strapi.js for additional front-end and CMS functionality.
- Design and implement microservices architecture for scalable application development.
- Integrate advanced search capabilities using Elasticsearch, Solr, and graph databases.
- Collaborate with stakeholders to define architectural solutions and ensure best practices.
- Utilize AWS services (EC2, S3, RDS, Lambda, API Gateway, SQS, SNS) for cloud-based deployments.
- Optional but beneficial: experience with PostgreSQL, the Java stack, and data engineering tasks such as data modeling, pipelining, and cleansing.

Skills & Requirements:
- 4+ years of full stack development in agile environments.
- 2+ years of professional experience with the MERN stack and tools such as Vue.js and Strapi.js.
- Proven ability in designing and implementing microservices.
- Hands-on knowledge of search technologies: Elasticsearch, Solr, graph databases.
- Cloud proficiency with AWS: EC2, S3, RDS, Lambda, API Gateway, SQS, SNS.
- Nice to have: PostgreSQL, Java stack, data modeling, data staging, data cleansing.

Posted 1 week ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


As a trusted global transformation partner, Welocalize accelerates the global business journey by enabling brands and companies to reach, engage, and grow international audiences. Welocalize delivers multilingual content transformation services in translation, localization, and adaptation for over 250 languages with a growing network of over 400,000 in-country linguistic resources. Driving innovation in language services, Welocalize delivers high-quality training data transformation solutions for NLP-enabled machine learning by blending technology and human intelligence to collect, annotate, and evaluate all content types. Our team works across locations in North America, Europe, and Asia, serving our global clients in the markets that matter to them. www.welocalize.com

To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.

Job Reference:

ROLE OVERVIEW
The Software Development Engineer plays a pivotal role in the development of advanced information systems by deeply understanding business operations and translating them into scalable, maintainable, and efficient software solutions. This position requires a combination of technical expertise, creativity, and a collaborative approach to ensure that software components meet both current business needs and future growth. By analyzing operational requirements, the Senior Software Engineer will design and implement complex software components, focusing on quality, performance, and long-term system sustainability.

In this role, the Senior Software Engineer works closely with cross-functional teams, including product management, quality assurance, and technical leads, to ensure the seamless integration of solutions across different systems. They are responsible for guiding the development process from concept to implementation, ensuring that software meets rigorous standards for security, reliability, and scalability. Additionally, they actively participate in code reviews, mentor junior engineers, and foster a culture of continuous learning and improvement within the development team. This position requires a strategic thinker with a passion for technology: someone who can balance hands-on coding with architectural oversight and technical leadership. Through their expertise and proactive problem-solving, the Senior Software Engineer will contribute to the success of the team and the overall organization by delivering high-quality software that drives business results.

MAIN DUTIES
- Design and develop scalable applications focusing on high availability, fault tolerance, and performance optimization, while collaborating with DevOps teams to ensure smooth deployment and continuous delivery.
- Implement and maintain RESTful APIs while working with SQL (MySQL) and NoSQL databases (MongoDB), with expertise in writing and optimizing complex queries.
- Provide technical leadership and mentorship to software engineers, ensuring adherence to best practices and architectural standards.
- Collaborate with stakeholders to analyze requirements, troubleshoot issues, and deliver high-quality solutions aligned with business objectives.
- Demonstrate expertise in system design, architecture, and optimization of complex systems, ensuring scalability and maintainability.
- Quickly understand and adapt to existing system design and architecture to drive solutions that integrate seamlessly within established frameworks.
- Optimize and fine-tune existing solutions for enhanced performance and operational efficiency.
- Provide robust solutions for complex XML parsing and integrating third-party systems into the application ecosystem.
- Architect systems using strong system design principles, incorporating security best practices, and demonstrating proficiency in OAuth 2.0 for authentication and authorization.
- Leverage domain expertise in localization to deliver tailored solutions that meet industry-specific requirements.
- Foster effective cross-team communication to align technical strategies with business goals and ensure smooth collaboration across departments.

REQUIREMENTS

Education Level
Post-secondary degree in Computer Science or equivalent professional experience.

Experience
- Demonstrable professional experience as the technical owner of major components or subsystems through the software development lifecycle.
- Previous professional experience collaborating with business stakeholders.
- Experience shipping code in an agile SDLC.
- Previous experience in performance analysis and benchmarking.
- Able to apply continuous integration, development, and automated testing concepts.
- Knowledge of multiple frameworks, paradigms, languages, and trends in the relevant domain.
- Proven experience in full-stack development, with a deep understanding of front-end, back-end, and cloud technologies.

Technical Skills
- Programming & Frameworks: Expertise in Node.js, Spring Framework (Boot, MVC, Data), Hibernate, React, Vue.js.
- Cloud & DevOps: Expertise in AWS, Azure, Docker, Kubernetes, Jenkins, Git, Bitbucket, CI/CD pipelines.
- Databases: Expertise in MySQL and MongoDB, including database design, optimization, and complex queries.
- Data & Search Technologies: Familiarity with Elasticsearch, Apache Solr.
- Testing & Monitoring: JUnit, Mockito, Postman, JMeter, Dynatrace, New Relic.
- Architectural Patterns: Expertise in microservices, RESTful services, and integration architecture.
- Soft Skills: Strong problem-solving, collaboration, and communication.
- Microservices & Architecture: Expertise in microservices design, deployment, and orchestration.
- Security: OAuth, JWT, Spring Security, SAST, DAST.
- Scheduler: Expertise in managing and optimizing Quartz jobs for scheduling complex workflows.
- Version Control: Expertise in managing complex version control scenarios.
- API Documentation: Expertise in API documentation and automation.
- Project Management Tools: Expertise in project tracking and collaboration with Confluence.

Join our team and contribute to creating cutting-edge solutions that support the future growth and success of Welocalize. If you're ready to take on this challenge and help us build the next generation of technology, we encourage you to apply today.
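The duties above call out robust XML parsing. As a minimal illustration with Python's standard library (the document structure is invented here; real localization payloads such as XLIFF are far richer), this finds translation units whose target is missing or empty:

```python
import xml.etree.ElementTree as ET

# Hypothetical translation-unit payload, invented for illustration.
doc = """<units>
  <unit id="1"><source>Hello</source><target>Hola</target></unit>
  <unit id="2"><source>Goodbye</source><target/></unit>
</units>"""

root = ET.fromstring(doc)

# Collect units whose <target> element is missing or empty (still untranslated).
untranslated = []
for unit in root.findall("unit"):
    target = unit.find("target")
    if target is None or not (target.text or "").strip():
        untranslated.append(unit.get("id"))

print(untranslated)  # ['2']
```

Production work would add namespace handling and streaming (`iterparse`) for large files, but the traversal pattern is the same.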

Posted 1 week ago

Apply

8.0 years

0 Lacs

Greater Kolkata Area

On-site


Years of Experience: 5–8 Years
Contract Duration: 10 Months
Work Location: PAN India
Mode of Work: Hybrid

Position Overview
We are looking for a skilled and experienced AEM (Adobe Experience Manager) Developer with Solr expertise, possessing 5 to 8 years of hands-on experience. The ideal candidate will be responsible for the development, upgrade, configuration, and optimization of both AEM and Apache Solr environments. The role focuses on creating AEM custom components, enhancing search functionality, and ensuring seamless integration with enterprise applications to improve the overall user experience.

Key Responsibilities

AEM Component and Template Development
- Develop AEM components, templates, dialogs, and workflows using AEM architecture (Sling, CRX, OSGi, JCR).
- Configure AEM workflows, Sling mappings, and multi-site management (including translation frameworks).
- Implement front-end solutions using HTML, CSS, and JavaScript.
- Provide technical guidance and collaborate with project leadership for successful AEM implementations.

AEM and Solr Upgrade Management
- Lead upgrade planning and execution for both AEM websites and Solr, ensuring minimal service disruption.
- Evaluate and implement new Solr versions to enhance system performance and features.

Configuration and Management
- Configure and maintain AEM tools and Solr instances for performance, reliability, and security.
- Develop custom AEM configurations and manage Solr schemas, ingestion pipelines, and indexing strategies.

Search Optimization
- Analyze user behavior and search metrics to identify optimization opportunities.
- Tune search queries and indexing logic to improve search speed and result relevance.
- Apply best practices for search relevance, ranking, and performance tuning.

System Integration
- Integrate Solr with AEM and other enterprise applications.
- Develop and manage RESTful APIs for data communication between AEM and other systems.

Monitoring and Troubleshooting
- Monitor the health and performance of AEM environments (DEV, QA, Stage, Prod) and Solr servers.
- Troubleshoot issues related to AEM website functionality and search performance, ensuring system stability.

Required Skills
- Technical Expertise: Strong understanding of Adobe Experience Manager (AEM) architecture and Apache Solr; proven ability to configure and manage both platforms effectively.
- Programming Skills: Proficiency in Java, HTML, CSS, JavaScript, jQuery, and related web development technologies.
- Search Optimization: Experience in search tuning, query optimization, and relevance ranking algorithms.
- Integration Experience: Hands-on experience integrating Solr with enterprise systems; working knowledge of RESTful API development and integration with AEM.
- Analytical and Problem-Solving Skills: Ability to analyze complex issues and deliver optimized, scalable solutions for AEM and search platforms.
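Much of the search-tuning work described above comes down to shaping Solr query parameters. A minimal sketch using only the standard library; the core name (`products`), fields, and boosts are invented for illustration, and the URL is only built, not sent:

```python
from urllib.parse import urlencode

# Hypothetical core, fields, and boosts; values are illustrative only.
params = {
    "q": "running shoes",
    "defType": "edismax",         # extended DisMax query parser
    "qf": "title^3 description",  # weight title matches 3x over description
    "fq": "category:footwear",    # filter query: restricts results, cached separately
    "facet": "true",
    "facet.field": "brand",
    "rows": 10,
}
url = "http://localhost:8983/solr/products/select?" + urlencode(params)
print(url)
```

Relevance tuning in practice iterates on exactly these knobs: `qf` boosts, filter queries, and facet configuration, measured against search metrics.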

Posted 1 week ago

Apply

4.0 - 5.0 years

0 Lacs

India

On-site


We are seeking a highly skilled Search Engineer with deep expertise in designing, implementing, and optimizing search solutions using Apache Solr, Elasticsearch, and Apache Spark. The role requires substantial experience handling big data search and document-based retrieval, with a strong focus on writing complex queries and indexing strategies for large-scale systems.

Key Responsibilities:
- Design and implement robust, scalable search architectures using Solr and Elasticsearch.
- Write, optimize, and maintain complex search queries (including full-text, faceted, fuzzy, geospatial, and nested queries) using the Solr query parsers and Elasticsearch DSL.
- Work with business stakeholders to understand search requirements and translate them into performant and accurate queries.
- Build and manage custom analyzers, tokenizers, filters, and index mappings/schemas tailored to domain-specific search needs.
- Develop and optimize indexing pipelines using Apache Spark for processing large-scale structured and unstructured datasets.
- Perform query tuning and search relevance optimization based on precision, recall, and user engagement metrics.
- Create and maintain query templates and search APIs for integration with enterprise applications.
- Monitor, troubleshoot, and improve search performance and infrastructure reliability.
- Conduct evaluations and benchmarking of search quality, query latency, and index refresh times.

Required Skills and Qualifications:
- 4 to 5 years of hands-on experience with Apache Solr and/or Elasticsearch in production environments.
- Proven ability to write and optimize complex Solr queries (standard, dismax, and edismax parsers) and Elasticsearch Query DSL, including:
  - Full-text search with analyzers
  - Faceted and filtered search
  - Boolean and range queries
  - Aggregations and suggesters
  - Nested and parent/child queries
- Strong understanding of indexing principles, Lucene internals, and relevance scoring mechanisms (BM25, TF-IDF).
- Proficiency with Apache Spark for custom indexing workflows and large-scale data processing.
- Experience with document parsing and extraction (JSON, XML, PDFs, etc.) for search indexing.
- Experience integrating search into web applications or enterprise software platforms.
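The listing above cites BM25 relevance scoring. A toy single-term BM25 calculation shows the intuition (a simplified textbook form; Lucene's production variant differs in details such as IDF smoothing and length normalization):

```python
import math

def bm25(tf, df, n_docs, doc_len, avg_len, k1=1.2, b=0.75):
    """Toy single-term BM25 score: IDF times a saturating term-frequency factor."""
    idf = math.log((n_docs - df + 0.5) / (df + 0.5) + 1)
    return idf * (tf * (k1 + 1)) / (tf + k1 * (1 - b + b * doc_len / avg_len))

# A rarer term (low document frequency) outscores a common one
# at equal term frequency and document length.
rare = bm25(tf=3, df=5, n_docs=1000, doc_len=120, avg_len=100)
common = bm25(tf=3, df=800, n_docs=1000, doc_len=120, avg_len=100)
print(rare > common)  # True
```

The `k1` parameter caps how much repeated occurrences help, and `b` controls how strongly long documents are penalized; these are the same knobs exposed by Solr and Elasticsearch similarity configuration.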

Posted 1 week ago

Apply

4.0 - 9.0 years

14 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office


Role & Responsibilities
Experience: 4 to 12 Years
Skills: Senior Java, Lucidworks, Solr, with good communication

This role requires deep technical knowledge of Lucidworks Fusion (including Solr, Spark, and AI/ML integration), along with strong hands-on skills in data ingestion, pipeline development, query tuning, and relevance engineering.

Key Responsibilities:

Design & Development:
- Build end-to-end intelligent search solutions using Lucidworks Fusion.
- Develop custom ingestion pipelines, index configurations, query pipelines, and business rules.
- Create meaningful user experiences through relevance tuning, signals-based learning, and ML integration.

Data Engineering:
- Configure connectors to ingest structured and unstructured data from multiple sources (SQL, NoSQL, REST APIs, etc.).
- Transform and enrich data during ingestion using Fusion's Index Pipelines and JavaScript/Python scripting.

Relevance Engineering:
- Apply signal-based personalization techniques and AI models to enhance relevance.
- Optimize ranking models using Fusion's Smart Answers, Predictive Merchandiser, or Learning to Rank (LTR).

System Integration & Deployment:
- Integrate Fusion with enterprise applications, microservices, and analytics platforms.
- Manage Fusion environments, handle deployments, and monitor system performance.

Collaboration:
- Work closely with product managers, UX designers, and data scientists to meet business goals.
- Participate in Agile ceremonies, provide estimates, and maintain technical documentation.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 3+ years of hands-on experience with Lucidworks Fusion, Apache Solr, or similar enterprise search platforms.
- Strong expertise in search relevance tuning, query pipeline development, and signals-based search.
- Proficiency in Java, JavaScript, Python, or Scala.
- Experience with data pipeline tools, ETL, and RESTful API integration.
- Solid understanding of Solr schema design, faceting, filtering, and query optimization.
- Familiarity with containerization tools (Docker, Kubernetes) and CI/CD pipelines.
- Excellent problem-solving and communication skills.

Preferred Qualifications:
- Experience with Lucidworks Smart Answers, App Studio, AI-powered search use cases, or predictive merchandising.
- Familiarity with Apache Spark, Kafka, or cloud platforms such as AWS, GCP, or Azure.
- Understanding of machine learning models and their integration into search platforms.
- Exposure to headless CMS, e-commerce search, or knowledge management systems.
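Index pipelines of the kind this role describes transform documents at ingestion time. A generic sketch of such an enrichment step (this is not Fusion's actual scripting API; the field names follow Solr's dynamic-field suffix convention but are invented):

```python
def enrich(doc):
    """Normalize and derive fields on a document dict before indexing.

    A generic ingestion-time enrichment sketch, not a real pipeline stage.
    """
    out = dict(doc)
    out["title_s"] = doc.get("title", "").strip().lower()  # normalized string field
    out["has_price_b"] = "price" in doc                    # derived boolean field
    return out

print(enrich({"title": "  Trail Shoe ", "price": 79.0}))
```

Real pipelines chain many such stages (parsing, field mapping, entity extraction) before the document reaches the index.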

Posted 1 week ago

Apply


3.0 - 7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Role: Java Engineer
Experience: 3-5 yrs
Location: Bangalore
Notice Period: Immediate Joiner

- Bachelor's/Master's in Computer Science from a reputed institute/university
- 3-7 years of strong experience in building Java/Golang/Python based server-side solutions
- Strong in data structures, algorithms, and software design
- Experience in designing and building RESTful microservices
- Experience with server-side frameworks such as JPA (Hibernate/Spring Data), Spring, Vert.x, Spring Boot, Redis, Kafka, Lucene/Solr/Elasticsearch, etc.
- Experience in data modeling and design, database query tuning
- Experience in MySQL and a strong understanding of relational databases
- Comfortable with agile, iterative development practices
- Excellent communication (verbal & written), interpersonal, and leadership skills
- Previous experience as part of a start-up or a product company
- Experience with AWS technologies would be a plus
- Experience with reactive programming frameworks would be a plus
- Contributions to open source are a plus
- Familiarity with deployment architecture principles and prior experience with container orchestration platforms, particularly Kubernetes, would be a significant advantage

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Role: Backend Developer
Experience: 3-5 yrs
Location: Bangalore

- Bachelor's/Master's in Computer Science from a reputed institute/university
- 3-7 years of strong experience in building Java/Golang/Python based server-side solutions
- Strong in data structures, algorithms, and software design
- Experience in designing and building RESTful microservices
- Experience with server-side frameworks such as JPA (Hibernate/Spring Data), Spring, Vert.x, Spring Boot, Redis, Kafka, Lucene/Solr/Elasticsearch, etc.
- Experience in data modeling and design, database query tuning
- Experience in MySQL and a strong understanding of relational databases
- Comfortable with agile, iterative development practices
- Excellent communication (verbal & written), interpersonal, and leadership skills
- Previous experience as part of a start-up or a product company
- Experience with AWS technologies would be a plus
- Experience with reactive programming frameworks would be a plus
- Contributions to open source are a plus
- Familiarity with deployment architecture principles and prior experience with container orchestration platforms, particularly Kubernetes, would be a significant advantage

Posted 1 week ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

Remote


Role Overview
We are hiring a passionate and experienced Senior Java Engineer to join a dynamic product development team delivering high-impact, high-scale solutions for enterprise clients. This role offers the opportunity to work on mission-critical digital products and platforms, with exposure to greenfield and tier-one enterprise builds across diverse domains including payments, asset management, and financial services.

As a Senior Java Engineer, you will collaborate with senior architects and engineering leaders to design, build, and maintain performant, scalable Java applications. You will be part of a high-velocity Scrum team working on real-world, client-facing problems that demand modern engineering practices and robust architectural thinking.

Key Responsibilities
- Develop high-performance, scalable Java-based applications.
- Architect and implement end-to-end solutions with a focus on low-latency, high-volume systems.
- Collaborate closely with clients and cross-functional teams to understand requirements and deliver solutions.
- Mentor junior engineers and contribute to team growth and technical leadership.
- Participate in Agile development processes, including sprint planning, estimation, and delivery.
- Drive code quality through continuous integration, automated testing, and code reviews.
- Support system design and contribute to CI/CD pipeline enhancements.

Core Requirements
- 6+ years of Java development experience in enterprise-level environments.
- Strong command of Java 8 (preferably Java 11+), including advanced features such as lambda expressions, the Stream API, and CompletableFuture.
- Proven experience with Spring Boot, microservices, asynchronous programming, and multithreading.
- Strong understanding of CI/CD practices and shift-left testing.
- Hands-on experience with SQL and relational databases.
- Experience with system design, data sourcing, modeling, and enrichment.
- Familiarity with cloud platforms, preferably AWS.
- Excellent communication skills with a consultant-like, client-facing mindset.

Nice to Have
- Exposure to Golang and/or Rust.
- Experience with messaging and streaming systems such as Kafka.
- Familiarity with tools/technologies such as:
  - MongoDB, Sonar, Jenkins
  - Oracle DB, Sybase IQ, DB2
  - Drools or other rule engines
  - Adobe AEM or similar CMS platforms
  - Search technologies like Algolia, Elasticsearch, Solr
  - Apache Spark

Preferred Experience
- Domain experience in Payments or Asset/Wealth Management.
- Strong server-side development background with long-term experience in enterprise-grade product development.
- Proven track record of deploying products to production.
- Experience working in distributed or remote teams.
- Demonstrated ability to coach junior engineers while remaining hands-on in development.

Skills: communication skills, CompletableFuture, asynchronous programming, AWS, Java 8+, CI/CD practices, relational databases, SQL, lambda expressions, Java 11+, Stream API, microservices, Java 8, Spring Boot, multithreading
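The asynchronous fan-out/join pattern this role names (CompletableFuture, multithreading) is language-neutral; here is a minimal sketch in Python with `concurrent.futures` as a stand-in, with invented source names, fanning out parallel fetches and joining the results:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def fetch(source):
    # Stand-in for an I/O-bound call (database query, HTTP request, etc.).
    return f"{source}-data"

# Hypothetical upstream sources for a financial-data aggregation step.
sources = ["accounts", "positions", "prices"]

with ThreadPoolExecutor(max_workers=3) as pool:
    futures = [pool.submit(fetch, s) for s in sources]
    # Join: collect each result as its task completes, then sort for stable output.
    results = sorted(f.result() for f in as_completed(futures))

print(results)  # ['accounts-data', 'positions-data', 'prices-data']
```

In Java the same shape is `CompletableFuture.supplyAsync(...)` per source followed by `CompletableFuture.allOf(...)`; the point in both languages is overlapping the I/O waits rather than serializing them.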

Posted 1 week ago

Apply

5.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site


Company Name: Stridely Solutions

Stridely Solutions is an ISO 9001:2015 certified leading global technology solution provider, enabling digital transformation solutions based on cutting-edge technology tools and platforms. We are a trusted global partner offering services in SAP, IoT, RPA, Advanced Analytics, Microsoft Dynamics, and Microsoft CRM, with a highly qualified team of 450+ techno brains and a direct presence in India, the USA, and Canada. We have extensive experience automating business processes and migrating technology platforms, with committed techno-enthusiasts who provide quality enterprise software solutions that add value and generate a favorable return on your investment. Working with our clients more like associates and partners is our standard engagement model.

Employee strength: 500+

Position: SAP Hybris Developer
Required Experience: 5-14 Years
Location: Ahmedabad/Pune/Baroda
Work Mode: Work from Office

High-Level Skill Set Required: 5+ years of relevant experience in SAP Commerce Cloud (Hybris). The ideal candidate should be able to think creatively when solving problems and have a strong understanding of coding and design principles.

Responsibilities:
- Hands-on experience designing and developing e-commerce applications using SAP Commerce Cloud.
- Excellent knowledge of SAP Commerce Cloud core and commerce concepts, including the development of extensions, CronJobs, WCMS, cart, checkout, payment integrations, and more.
- Good experience developing e-commerce applications on the SAP Commerce Cloud platform using Spring, REST/API services (OCC), and web services.
- Good understanding of catalog, Solr, order management, and media management in SAP Commerce Cloud.
- Knowledge of web technologies, including HTML, CSS, and JavaScript.
- Experience implementing Agile methodology.
- Familiarity with continuous integration build tools and code quality tools.
- Strong understanding of design patterns and software development best practices.
- Excellent communication and teamwork skills.

What We Can Offer:
- Attractive and competitive salary, matching your expectations.
- Opportunity to work in a world-class organization.
- Onsite opportunity.
- Flexible work hours.
- Opportunity to work with global clients.
- Awesome place to work.

Posted 1 week ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Req ID: 324638

NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.

We are currently seeking a Data Engineer to join our team in Chennai, Tamil Nādu (IN-TN), India (IN).

Key Responsibilities:
- Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical stack.
- Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure.
- Demonstrate proficiency in coding, utilizing languages such as Python, Java, and Scala to efficiently move solutions into production while prioritizing performance, security, scalability, and robust data integrations.
- Collaborate seamlessly across diverse technical stacks, including Cloudera, Databricks, Snowflake, and AWS.
- Develop and deliver detailed presentations to effectively communicate complex technical concepts.
- Generate comprehensive solution documentation, including sequence diagrams, class hierarchies, logical system views, etc.
- Adhere to Agile practices throughout the solution development process.
- Design, build, and deploy databases and data stores to support organizational requirements.

Basic Qualifications:
- 4+ years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects.
- 2+ years of experience leading a team supporting data-related projects to develop end-to-end technical solutions.
- Experience with Informatica, Python, Databricks, and Azure data engineering.
- Ability to travel at least 25%.

Preferred Skills:
- Production experience in core data platforms such as Snowflake, Databricks, AWS, Azure, GCP, Hadoop, and more.
- Hands-on knowledge of cloud and distributed data storage, including expertise in HDFS, S3, ADLS, GCS, Kudu, Elasticsearch/Solr, Cassandra, or other NoSQL storage systems.
- Strong understanding of data integration technologies, encompassing Informatica, Spark, Kafka, eventing/streaming, StreamSets, NiFi, AWS Database Migration Service, Azure Data Factory, and Google Dataproc.
- Professional written and verbal communication skills to effectively convey complex technical concepts.
- Undergraduate or graduate degree preferred.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize, and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications.

NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Posted 1 week ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Req ID: 324631

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Engineer to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).

Key Responsibilities:
- Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond, within a dynamically evolving technical stack.
- Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure.
- Demonstrate proficiency in coding, utilizing languages such as Python, Java, and Scala to efficiently move solutions into production while prioritizing performance, security, scalability, and robust data integrations.
- Collaborate seamlessly across diverse technical stacks, including Cloudera, Databricks, Snowflake, and AWS.
- Develop and deliver detailed presentations to effectively communicate complex technical concepts.
- Generate comprehensive solution documentation, including sequence diagrams, class hierarchies, logical system views, etc.
- Adhere to Agile practices throughout the solution development process.
- Design, build, and deploy databases and data stores to support organizational requirements.

Basic Qualifications:
- 4+ years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects.
- 2+ years of experience leading a team supporting data-related projects to develop end-to-end technical solutions.
- Experience with Informatica, Python, Databricks, and Azure data engineering.
- Ability to travel at least 25%.

Preferred Skills:
- Production experience in core data platforms such as Snowflake, Databricks, AWS, Azure, GCP, and Hadoop.
- Hands-on knowledge of cloud and distributed data storage, including HDFS, S3, ADLS, GCS, Kudu, ElasticSearch/Solr, Cassandra, or other NoSQL storage systems.
- Strong understanding of data integration technologies, including Informatica, Spark, Kafka, eventing/streaming, StreamSets, NiFi, AWS Database Migration Service, Azure Data Factory, and Google Dataproc.
- Professional written and verbal communication skills to effectively convey complex technical concepts.
- Undergraduate or graduate degree preferred.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.

Posted 1 week ago


4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Req ID: 324632

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Engineer to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).

Key Responsibilities:
- Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond, within a dynamically evolving technical stack.
- Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure.
- Demonstrate proficiency in coding, utilizing languages such as Python, Java, and Scala to efficiently move solutions into production while prioritizing performance, security, scalability, and robust data integrations.
- Collaborate seamlessly across diverse technical stacks, including Cloudera, Databricks, Snowflake, and AWS.
- Develop and deliver detailed presentations to effectively communicate complex technical concepts.
- Generate comprehensive solution documentation, including sequence diagrams, class hierarchies, logical system views, etc.
- Adhere to Agile practices throughout the solution development process.
- Design, build, and deploy databases and data stores to support organizational requirements.

Basic Qualifications:
- 4+ years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects.
- 2+ years of experience leading a team supporting data-related projects to develop end-to-end technical solutions.
- Experience with Informatica, Python, Databricks, and Azure data engineering.
- Ability to travel at least 25%.

Preferred Skills:
- Production experience in core data platforms such as Snowflake, Databricks, AWS, Azure, GCP, and Hadoop.
- Hands-on knowledge of cloud and distributed data storage, including HDFS, S3, ADLS, GCS, Kudu, ElasticSearch/Solr, Cassandra, or other NoSQL storage systems.
- Strong understanding of data integration technologies, including Informatica, Spark, Kafka, eventing/streaming, StreamSets, NiFi, AWS Database Migration Service, Azure Data Factory, and Google Dataproc.
- Professional written and verbal communication skills to effectively convey complex technical concepts.
- Undergraduate or graduate degree preferred.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.

Posted 1 week ago


25.0 years

0 Lacs

Gurugram, Haryana, India

On-site


About your new company!
Collegedunia is an education portal, matching students with the best colleges in India and abroad. We help with college research, exam prep tips, and the application process, and also provide insights on campus life. Launched in 2014, we are the highest-ranked education portal by SimilarWeb. We were awarded "Best Educational Portal" by IAMAI in 2017 and listed by Tech in Asia among the "Top 100 Startups in Asia". Collegedunia is fuelled by the energy of over 280 individuals with an average age of around 25. The talent pool comprises data analysts, engineers, designers, writers, managers and marketers, and is growing at 10% every month.

Job Title: PHP Developer

Key Responsibilities:
- Design the overall architecture of the web application.
- Optimize the application for maximum speed and scalability.
- Solve complex performance problems and architectural challenges.
- Integrate user-facing elements developed by front-end developers with server-side logic.
- Learn and use core AWS technologies to design and build available, scalable backend web services and customer-facing APIs.
- Strong problem-solving, algorithmic, and data-structure skills.
- Experience in agile methodologies like Scrum.
- Good understanding of branching, build, deployment, and continuous-integration methodologies.

Skills & Qualifications:
- Strong knowledge of PHP frameworks such as Laravel, Symfony, etc., depending on your technology stack.
- Experience working with MySQL databases and analyzing query efficiency.
- Worked with real-time web applications and event-driven architectures using technologies like Node.js or jQuery.
- Full-cycle PHP development experience, including debugging and performance analysis.
- Building scalable, performance-oriented services with caching techniques and systems like Memcached and Redis.
- Experience with MySQL and distributed databases like MongoDB, Cassandra, or Redis.
- Comfortable with search engines like Solr or Elasticsearch.
- Working understanding of NGINX and Apache web servers.
- Passion for products, empathy for users, and aspiration to make a big impact.
- Strong database design and query-writing skills with a commitment to performance and efficiency.
- Assure the quality of the solutions being developed within the team.
- Knowledge of service-oriented architecture, microservices, and distributed systems.
- Troubleshoot and debug support issues autonomously.
- Maintain technical mastery of the products being developed.
- A track record of delivering innovative solutions in the field of algorithm development.

Education Qualification: B.Tech, MCA, or M.Tech

Posted 1 week ago


3.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Role: Senior Generative AI Engineer
Location: Chennai (on-site)
Experience: 3-5 years

FarmwiseAI is a leading geospatial AI company based in Chennai, specializing in AI-driven agriculture solutions that enable data-driven decision-making for governments, lenders, and businesses. Founded in 2020, we deliver real-time advisory, automated land mapping, and crop-monitoring products at scale to foster sustainable development. As an AI-first organization, we embed AI assistance across our entire product lifecycle, from brainstorming and architecture to testing and deployment, to empower every team member to leverage AI in their day-to-day work.

We are hiring a Generative AI Engineer to build, deploy, and optimize multimodal AI services across text, speech, and vision. You'll work on RAG, synthetic data generation, and agent workflows, and integrate STT/TTS/OCR with scalable backend systems.

Responsibilities:
- Generative pipelines: Design applications for RAG, CAG, text classification, summarization, image/video generation, OCR, and synthetic data generation.
- Multimodal integration: Work with STT, TTS, IVR, OCR, and vision inputs to enable seamless AI interactions.
- AI agent workflows: Develop modular, multi-step orchestrations for document, conversational, and data-based user journeys.
- Containerization & deployment: Collaborate with DevOps to containerize services, manage Kubernetes orchestration, and implement CI/CD for agile delivery.
- Observability: Instrument services using OpenTelemetry, Prometheus, and logging tools to ensure SLO-driven production reliability.
- Collaboration: Work cross-functionally with product, data science, and frontend teams to define APIs (REST/GraphQL) and ensure smooth integration.
- Documentation & mentorship: Participate in architecture reviews, write clear documentation, and mentor junior engineers and interns.

Qualifications:
- Bachelor's/Master's in Computer Science, Data Science, IT, or a related field.
- 2-3 years of experience building AI/ML products in Python.
- Proficiency in AI-first coding tools like Claude Code, Cursor, Roo Code, etc.
- Proven experience deploying GenAI applications and agents in production.
- Strong hands-on experience with vector search, embedding-based retrieval, STT, TTS, and OCR/vision.
- Familiarity with Docker, Kubernetes, frontend development, and CI/CD workflows.
- Strong debugging, performance-tuning, and cost-optimization skills.
- Excellent communication, teamwork, and mentoring abilities.

Stack:
- Languages & tools (mandatory): Python (pandas, scikit-learn, PyTorch, TensorFlow, etc.), Git/GitHub, AWS or GCP.
- Generative AI stack (mandatory): LangChain, LlamaIndex, transformers, frontier LLMs (OpenAI, Anthropic, Gemini models) and open models (DeepSeek, Qwen, Llama, and Phi models).
- Vector stores: FAISS, Pinecone, Qdrant, Weaviate, etc.
- Keyword index: Elasticsearch, Apache Solr, Typesense, etc.
- Validation frameworks: Pydantic, Instructor, etc.
- LLM abstraction libraries: LiteLLM.
- Asynchronous or parallel programming: asyncio, joblib, etc.
- API frameworks: FastAPI, Flask, etc.
- FE prototyping: Streamlit, Gradio, etc.
- Agentic AI frameworks (mandatory, at least one): Google Agent Development Kit, LangGraph, OpenAI Agents SDK, PydanticAI.
- Speech & vision (nice-to-have): OpenAI Realtime Voice API/Whisper; ElevenLabs/Smallest.ai TTS; LlamaParse/Jina AI/Mistral OCR.
- Observability & monitoring (nice-to-have): OpenTelemetry, Prometheus, LangSmith, Pydantic Logfire.
- Cloud & DevOps (nice-to-have): Docker, Kubernetes, GitHub Actions.

Nice to have:
- Domain experience in AgriTech, FinTech, HRTech, or EduTech.
- Experience and profound interest in reading and implementing research papers.
- Open-source contributions or published evaluation suites.
- Exposure to managed cloud AI services (Vertex AI, Bedrock, JumpStart).
- Familiarity with React/Next.js integration.
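The posting above centers on embedding-based retrieval for RAG. As a rough illustration of the core idea only (not FarmwiseAI's actual stack), cosine-similarity search over toy vectors can be sketched in pure Python; in production the embeddings would come from a model and the search from a vector store such as FAISS or Qdrant, and the document ids below are invented:

```python
import math

def cosine(a, b):
    # cosine similarity between two equal-length vectors
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def retrieve(query_vec, docs, top_k=2):
    # docs: list of (doc_id, embedding); returns the top_k closest ids
    scored = sorted(docs, key=lambda d: cosine(query_vec, d[1]), reverse=True)
    return [doc_id for doc_id, _ in scored[:top_k]]

# toy 3-dimensional "embeddings", for illustration only
corpus = [("crop-report", [0.9, 0.1, 0.0]),
          ("loan-faq",    [0.1, 0.9, 0.1]),
          ("soil-guide",  [0.8, 0.2, 0.1])]
print(retrieve([1.0, 0.0, 0.0], corpus))  # → ['crop-report', 'soil-guide']
```

A real pipeline would then pass the retrieved documents to the LLM as context; the ranking step itself is exactly this nearest-neighbour search, just accelerated by an index.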

Posted 1 week ago


2.0 years

0 Lacs

Mumbai Metropolitan Region

On-site


We're hiring a Python SDE 1 to join our Commerce Team. The Commerce Engineering Team forms the backbone of our core business. We build and iterate on our core platform, which handles everything from onboarding a seller to serving finished products to end customers across different channels, with customisation and configuration. Our team consists of generalist engineers who work on building REST APIs, internal tools, and infrastructure.

Some Specific Requirements:
- At least 2+ years of development experience.
- Prior experience developing and working on consumer-facing web/app products.
- Solid experience in Python, with experience building web/app-based tech products.
- Experience in at least one of the following frameworks: Sanic, Django, Flask, Falcon, web2py, Twisted, Tornado.
- Working knowledge of MySQL, MongoDB, Redis, Aerospike.
- Good understanding of data structures, algorithms, and operating systems.
- Experience with core AWS services: EC2, ELB, Auto Scaling, CloudFront, S3, ElastiCache.
- Understanding of Kafka, Docker, Kubernetes.
- Knowledge of Solr, Elasticsearch.
- Attention to detail.
- Can dabble in frontend codebases using HTML, CSS, and JavaScript.
- You love doing things efficiently: the work you do will have a disproportionate impact on the business. We believe in systems and processes that let us scale our impact to be larger than ourselves.
- You might not have experience with all the tools we use, but you can learn them given guidance and resources.
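The posting above pairs a relational store with Redis-style caching for consumer-facing APIs. A minimal sketch of the cache-aside pattern it implies, with a plain dict standing in for Redis and a stub standing in for the MySQL query (all names here are illustrative, not this company's code):

```python
import time

# Cache-aside with TTL: check the cache first, fall back to the source of
# truth (a stub standing in for a MySQL query), then populate the cache.
# In production the dict would be Redis or Memcached.
CACHE = {}          # key -> (expires_at, value)
TTL_SECONDS = 60

def fetch_product_from_db(product_id):
    # stub for a database query
    return {"id": product_id, "name": f"product-{product_id}"}

def get_product(product_id):
    now = time.time()
    hit = CACHE.get(product_id)
    if hit and hit[0] > now:                      # fresh cache entry: serve it
        return hit[1]
    value = fetch_product_from_db(product_id)     # cache miss: load from DB
    CACHE[product_id] = (now + TTL_SECONDS, value)
    return value

print(get_product(42))  # → {'id': 42, 'name': 'product-42'}
```

The second call for the same id within the TTL is served from the cache, which is what keeps read-heavy commerce endpoints off the database.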

Posted 1 week ago


5.0 - 10.0 years

15 - 30 Lacs

Pune, Gurugram

Work from Office


In one sentence: We are seeking an experienced Kafka Administrator to manage and maintain our Apache Kafka infrastructure, with a strong focus on deployments within OpenShift and Cloudera environments. The ideal candidate will have hands-on experience with Kafka clusters, container orchestration, and big data platforms, ensuring high availability, performance, and security.

What will your job look like?
- Install, configure, and manage Kafka clusters in production and non-production environments.
- Deploy and manage Kafka on OpenShift using Confluent for Kubernetes (CFK) or similar tools.
- Integrate Kafka with Cloudera Data Platform (CDP), including services like NiFi, HBase, and Solr.
- Monitor Kafka performance and implement tuning strategies for optimal throughput and latency.
- Implement and manage Kafka security using SASL_SSL, Kerberos, and RBAC.
- Perform upgrades, patching, and backup/recovery of Kafka environments.
- Collaborate with DevOps and development teams to support CI/CD pipelines and application integration.
- Troubleshoot and resolve Kafka-related issues in a timely manner.
- Maintain documentation and provide knowledge transfer to team members.

All you need is...
- 5+ years of experience as a Kafka Administrator.
- 2+ years of experience deploying Kafka on OpenShift or Kubernetes.
- Strong experience with the Cloudera ecosystem and its integration with Kafka.
- Proficiency in Kafka security protocols (SASL_SSL, Kerberos).
- Experience with monitoring tools like Prometheus, Grafana, or Confluent Control Center.
- Solid understanding of Linux systems and shell scripting.
- Familiarity with CI/CD tools (Jenkins, GitLab CI, etc.).
- Excellent problem-solving and communication skills.
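The Kafka security work described above (SASL_SSL with Kerberos) is mostly configuration. As a hedged sketch of the client-side settings such a setup typically involves, here they are as a Python dict in confluent-kafka/librdkafka property style; every hostname, keytab path, and principal below is a placeholder, and real deployments vary:

```python
# Client-side settings a Kerberos-secured (SASL_SSL + GSSAPI) Kafka cluster
# typically requires, expressed as librdkafka-style configuration properties.
# All hostnames, paths, and principals below are placeholders.
def kerberos_client_config(bootstrap, keytab, principal):
    return {
        "bootstrap.servers": bootstrap,           # e.g. "broker1.example.com:9093"
        "security.protocol": "SASL_SSL",          # TLS transport + SASL authentication
        "sasl.mechanism": "GSSAPI",               # Kerberos
        "sasl.kerberos.service.name": "kafka",    # broker service principal name
        "sasl.kerberos.keytab": keytab,           # path to the client keytab
        "sasl.kerberos.principal": principal,     # e.g. "svc-app@EXAMPLE.COM"
        "ssl.ca.location": "/etc/pki/tls/certs/ca-bundle.crt",  # CA bundle (placeholder)
    }

conf = kerberos_client_config("broker1.example.com:9093",
                              "/etc/security/keytabs/app.keytab",
                              "svc-app@EXAMPLE.COM")
# Passing conf to confluent_kafka.Producer(conf) would create an
# authenticated producer; the dict alone is shown here to keep the
# sketch self-contained.
```

The same property names appear in broker and connector configs, which is why administrators tend to manage them centrally (e.g. via Ansible templates).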

Posted 1 week ago


6.0 - 11.0 years

8 - 13 Lacs

Bengaluru

Work from Office


Requirements:
- 6+ years of experience.
- Proficiency in Java, Spring Boot, object databases, Elasticsearch/Solr.
- Practice in using AWS cloud, Docker & Kubernetes, REST APIs.
- Experience in building scalable, high-performance systems.
- Strong communication skills in English (B2+).
- Nice to have: knowledge of Python, ETL experience, and big data solutions.

Responsibilities:
- Maintenance of a large, modern search platform.
- Handling production issues and incidents.
- Optimizing and maintaining existing code for performance and availability.
- Ensuring high performance and availability of the system.
- Engaging in the release process.

Team Information:
- Work within a SAFe, Scrum/Kanban methodology and an agile approach.
- Collaborative and friendly atmosphere.
- Microservice architecture and extensive CI/CD automation.
- Tools used: Git, IntelliJ, Jira, Confluence, i3 by Tieto as the search backend.

Posted 1 week ago


4.0 - 5.0 years

6 - 7 Lacs

Pune

Work from Office


As a Consultant/Developer, you will serve as a client-facing practitioner who sells, leads, and implements expert services utilizing the breadth of IBM's offerings and technologies. A successful Consultant is regarded by clients as a trusted business advisor who collaborates to provide innovative solutions to the most challenging business problems. You will work on projects that help clients integrate strategy, process, technology, and information to increase effectiveness, reduce costs, and improve profit and shareholder value. You can take advantage of opportunities to master new skills, work across different disciplines, move into new challenges, and develop a robust understanding of different industries.

Your primary responsibilities include:
- Work in an Agile environment (with Product Owners, Requirement Managers, Developers, QA) to define requirements and design the solution, depending on the project.
- Participate in software design, software development, and package implementation using industry-standard platforms of SAP Hybris Commerce.
- Interface with system integrators and IT engineers on projects to ensure proper adherence to the SDLC and platform integrity.
- Ensure software developed is held to high quality standards through code and design reviews.
- Become an integral part of providing Hybris solutions to B2B and B2C customers.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Minimum 4 to 5 years of experience in the IT industry with SAP Hybris Commerce.
- Expertise with 5+ years in Java, Advanced Java, Spring Framework, JavaScript.
- Proficient in Hybris Commerce, with 8+ years of experience including Hybris Commerce implementation projects.
- A portfolio of previous work must be presented, incorporating UI designs and/or sketches that demonstrate an understanding of wireframing, interaction design, visual design, and design best practices.

Preferred technical and professional experience:
- Proven soft skills, written and verbal.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Posted 1 week ago


12.0 - 17.0 years

14 - 19 Lacs

Bengaluru

Work from Office


Your primary responsibilities include:
- Work in an Agile environment (with Product Owners, Requirement Managers, Developers, QA) to define requirements and design the solution, depending on the project.
- Participate in software design, software development, and package implementation using industry-standard platforms of SAP Hybris Commerce.
- Interface with system integrators and IT engineers on projects to ensure proper adherence to the SDLC and platform integrity.
- Ensure software developed is held to high quality standards through code and design reviews.
- Become an integral part of providing Hybris solutions to B2B and B2C customers.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Overall industry experience of 12+ years.
- Expertise with 5+ years in Java, Advanced Java, Spring Framework, JavaScript.
- Proficient in Hybris Commerce, with 8+ years of experience including Hybris Commerce implementation projects.
- Hands-on development experience in Hybris Commerce.
- A portfolio of previous work must be presented, incorporating UI designs and/or sketches that demonstrate an understanding of wireframing, interaction design, visual design, and design best practices.

Preferred technical and professional experience:
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
- Keeps technical knowledge up to date by attending educational workshops and reviewing publications.

Posted 1 week ago


14.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Position Summary:
A highly skilled Big Data (Hadoop) Administrator responsible for the installation, configuration, engineering, and architecture of Cloudera Data Platform (CDP) and Cloudera Flow Management (CFM) streaming clusters on RedHat Linux. Strong expertise in DevOps practices, scripting, and infrastructure-as-code for automating and optimizing operations is highly desirable. Experience collaborating with cross-functional teams, including application development, infrastructure, and operations, is highly preferred.

Job Responsibilities:
- Manages the design, distribution, performance, replication, security, availability, and access requirements for large and complex Big Data clusters.
- Designs and develops the architecture and configurations to support various application needs; implements backup, recovery, archiving, conversion strategies, and performance tuning; manages job scheduling, application releases, cluster changes, and compliance.
- Identifies and resolves issues utilizing structured tools and techniques.
- Provides technical assistance and mentoring to staff in all aspects of Hadoop cluster management; consults and advises application development teams on security, query optimization, and performance.
- Writes scripts to automate routine cluster management tasks and documents maintenance processing flows per standards.
- Implements industry best practices while performing Hadoop cluster administration tasks.
- Works in an Agile model with a strong understanding of Agile concepts.
- Collaborates with development teams to provide and implement new features.
- Debugs production issues by analyzing logs directly and using tools like Splunk and Elastic.
- Addresses organizational obstacles to enhance processes and workflows.
- Adopts and learns new technologies based on demand and supports team members by coaching and assisting.

Education: Bachelor's degree in Computer Science, Information Systems, or another related field, with 14+ years of IT and infrastructure engineering work experience.

Experience: 14+ years total IT experience, with 10+ years of relevant experience in Big Data database technologies.

Technical Skills:
- Big Data platform management: Expertise in managing and optimizing the Cloudera Data Platform, including components such as Apache Hadoop (YARN and HDFS), Apache HBase, Apache Solr, Apache Hive, Apache Kafka, Apache NiFi, Apache Ranger, Apache Spark, as well as JanusGraph and IBM BigSQL.
- Data infrastructure & security: Proficient in designing and implementing robust data infrastructure solutions with a strong focus on data security, utilizing tools like Apache Ranger and Kerberos.
- Performance tuning & optimization: Skilled in performance tuning and optimization of big data environments, leveraging advanced techniques to enhance system efficiency and reduce latency.
- Backup & recovery: Experienced in developing and executing comprehensive backup and recovery strategies to safeguard critical data and ensure business continuity.
- Linux & troubleshooting: Strong knowledge of Linux operating systems, with a proven ability to troubleshoot and resolve complex technical issues, collaborating effectively with cross-functional teams.
- DevOps & scripting: Proficient in scripting and automation using tools like Ansible, enabling seamless integration and automation of cluster operations. Experienced in infrastructure-as-code practices and observability tools such as Elastic.
- Agile & collaboration: Strong understanding of Agile (SAFe for Teams), with the ability to work effectively in Agile environments and collaborate with cross-functional teams.
- ITSM processes & tools: Knowledgeable in ITSM processes and tools such as ServiceNow.

Other Critical Requirements:
- Automation and scripting: Proficiency in automation tools and programming languages such as Ansible and Python to streamline operations and improve efficiency.
- Analytical and problem-solving skills: Strong analytical and problem-solving abilities to address complex technical challenges in a dynamic enterprise environment.
- 24x7 support: Ability to work in a 24x7 rotational shift to support Hadoop platforms and ensure high availability.
- Team management and leadership: Proven experience managing geographically distributed and culturally diverse teams, with strong leadership, coaching, and mentoring skills.
- Communication skills: Exceptional written and oral communication skills, with the ability to clearly articulate technical and functional issues, conclusions, and recommendations to stakeholders at all levels.
- Stakeholder management: Prior experience effectively managing both onshore and offshore stakeholders, ensuring alignment and collaboration across teams.
- Business presentations: Skilled in creating and delivering impactful business presentations to communicate key insights and recommendations.
- Collaboration and independence: Demonstrated ability to work independently as well as collaboratively within a team environment, ensuring successful project delivery in a complex enterprise setting.

About MetLife
Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World's 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies, providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple: to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we're inspired to transform the next century in financial services. At MetLife, it's #AllTogetherPossible. Join us!
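The posting above calls for scripts that automate routine Hadoop cluster checks. A small, hedged illustration of that kind of script: parsing the capacity summary that `hdfs dfsadmin -report` emits to flag clusters running low on space. The sample text is invented for the sketch; real output varies by Hadoop version:

```python
import re

# Parses the capacity summary emitted by `hdfs dfsadmin -report` to flag
# clusters running low on space. SAMPLE_REPORT is illustrative only.
SAMPLE_REPORT = """\
Configured Capacity: 1000000000000 (1 TB)
DFS Used: 870000000000 (870 GB)
DFS Remaining: 130000000000 (130 GB)
"""

def dfs_used_percent(report):
    # byte counts precede the human-readable figures in parentheses
    capacity = int(re.search(r"Configured Capacity:\s+(\d+)", report).group(1))
    used = int(re.search(r"DFS Used:\s+(\d+)", report).group(1))
    return 100.0 * used / capacity

pct = dfs_used_percent(SAMPLE_REPORT)
print(f"DFS used: {pct:.1f}%")          # → DFS used: 87.0%
if pct > 80:
    print("WARNING: cluster above 80% capacity")
```

In practice such a check would run on a schedule (cron, Oozie, or an Ansible playbook) with the report captured via `subprocess`, and alert through the monitoring stack rather than print.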

Posted 1 week ago


3.0 - 7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Role: Java Developer
Experience: 3-5 yrs
Location: Bangalore

Backend:
- Bachelor's/Master's in Computer Science from a reputed institute/university.
- 3-7 years of strong experience building Java/Golang/Python-based server-side solutions.
- Strong in data structures, algorithms, and software design.
- Experience designing and building RESTful microservices.
- Experience with server-side frameworks such as JPA (Hibernate/Spring Data), Spring, Vert.x, Spring Boot, Redis, Kafka, Lucene/Solr/Elasticsearch, etc.
- Experience in data modeling and design, and database query tuning.
- Experience in MySQL and a strong understanding of relational databases.
- Comfortable with agile, iterative development practices.
- Excellent communication (verbal & written), interpersonal, and leadership skills.
- Previous experience as part of a start-up or a product company.
- Experience with AWS technologies would be a plus.
- Experience with reactive programming frameworks would be a plus.
- Contributions to open source are a plus.
- Familiarity with deployment architecture principles and prior experience with container orchestration platforms, particularly Kubernetes, would be a significant advantage.

Posted 1 week ago


0 years

0 Lacs

Pune, Maharashtra, India

On-site


Current scope and span of work:

Summary: We need a data engineer to handle day-to-day activities involving data ingestion from multiple source locations, help identify data sources, troubleshoot issues, and engage with a third-party vendor to meet stakeholders' needs.

Required Skills:
- Python
- Processing of large quantities of text documents
- Extraction of text from Office and PDF documents
- Input JSON to an API, output JSON to an API
- NiFi (or a similar technology compatible with current EMIT practices)
- Basic understanding of AI/ML concepts
- Database/search engine/Solr skills
- SQL: build queries to analyze, create, and update databases
- Understands the basics of hybrid search
- Experience working with terabytes (TB) of data
- Basic OpenML/Python/Azure knowledge
- Scripting knowledge/experience in an Azure environment to automate
- Cloud systems experience related to search and databases

Platforms:
- Databricks
- Snowflake
- ESRI ArcGIS / SDE
- New GenAI app being developed

Scope of work:
1. Ingest TB of data from multiple sources identified by the Ingestion Lead
2. Optimize data pipelines to improve data processing, speed, and data availability
3. Make data available for end users from several hundred LAN and SharePoint areas
4. Monitor data pipelines daily and fix issues related to scripts, platforms, and ingestion
5. Work closely with the Ingestion Lead and vendor on issues related to data ingestion

Technical Skills demonstrated:
1. Solr: backend database
2. NiFi: data movement
3. PySpark: data processing
4. Hive & Oozie: job monitoring
5. Querying: SQL, HQL, and Solr querying
6. Python

Behavioral Skills demonstrated:
1. Excellent communication skills
2. Ability to receive direction from a lead and implement it
3. Prior experience working in an Agile setup, preferred
4. Experience troubleshooting technical issues and quality-control checking of work
5. Experience working with a globally distributed team in different time zones
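The posting above asks for Solr querying alongside SQL. As a small, hedged sketch of what that involves, here is a standard Solr `/select` query URL built with only the standard library; the host and core name ("documents") are placeholders, and the parameters shown (`q`, `fq`, `rows`, `wt`) are Solr's common query parameters:

```python
from urllib.parse import urlencode

# Builds a Solr /select query URL of the kind described above.
# Host and core name ("documents") are placeholders.
def solr_select_url(base, core, query, filters=None, rows=10):
    params = [("q", query), ("rows", rows), ("wt", "json")]
    for f in filters or []:
        params.append(("fq", f))   # fq: filter query; repeatable and cacheable
    return f"{base}/solr/{core}/select?{urlencode(params)}"

url = solr_select_url("http://localhost:8983", "documents",
                      "title:ingestion", filters=["source:sharepoint"], rows=5)
print(url)
# → http://localhost:8983/solr/documents/select?q=title%3Aingestion&rows=5&wt=json&fq=source%3Asharepoint
```

In the hybrid-search setups the posting mentions, a request like this supplies the keyword leg, while a vector store supplies the semantic leg and the two result lists are merged.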

Posted 1 week ago


0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


We intend to hire a Data Engineer to handle day-to-day activities involving data ingestion from multiple source locations, help identify data sources, troubleshoot issues, and engage with a third-party vendor to meet stakeholders’ needs.

Work Location: Chennai, Hyderabad, or Pune (WFO). Shift hours: 2.00 pm to 11.00 pm IST. Immediate joiners required.

Required Skills:
- Python; processing of large quantities of text documents
- Extraction of text from Office and PDF documents
- JSON input to and output from APIs
- NiFi (or a similar technology compatible with current EMIT practices)
- Basic understanding of AI/ML concepts
- Database/search engine/SOLR skills
- SQL: build queries to analyze, create, and update databases
- Understanding of the basics of hybrid search
- Experience working with terabytes (TB) of data
- Basic OpenML/Python/Azure knowledge
- Scripting knowledge/experience in an Azure environment for automation
- Cloud systems experience related to search and databases

Platforms:
- Databricks
- Snowflake
- ESRI ArcGIS / SDE
- New GenAI app being developed

Scope of Work:
1. Ingest TB of data from multiple sources identified by the Ingestion Lead
2. Optimize data pipelines to improve data processing, speed, and data availability
3. Make data available to end users from several hundred LAN and SharePoint areas
4. Monitor data pipelines daily and fix issues related to scripts, platforms, and ingestion
5. Work closely with the Ingestion Lead and vendor on issues related to data ingestion

Technical Skills demonstrated:
1. SOLR: backend database
2. NiFi: data movement
3. PySpark: data processing
4. Hive & Oozie: job monitoring
5. Querying: SQL, HQL, and SOLR querying
6. Python

Behavioral Skills demonstrated:
1. Excellent communication skills
2. Ability to take direction from a Lead and implement it
3. Prior experience working in an Agile setup preferred
4. Experience troubleshooting technical issues and quality-control checking of work
5. Experience working with a globally distributed team in different time zones
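The posting asks for SOLR querying skills and an understanding of hybrid-search basics. As a minimal illustrative sketch, the snippet below composes parameters for a Solr `/select` request that combines a relevance query with a non-scoring filter; the field names (`title`, `body`, `source_system`) are assumptions for illustration, not taken from the posting.

```python
# Sketch: building Solr /select query parameters in pure Python.
# Field names here (title, body, source_system) are hypothetical.
from urllib.parse import urlencode

def build_solr_params(keywords, source=None, rows=10):
    """Build a parameter dict for a Solr keyword search,
    optionally filtered by source system."""
    params = {
        "q": " ".join(keywords),      # main relevance query
        "defType": "edismax",         # common parser for multi-field search
        "qf": "title^2 body",         # boost title matches over body matches
        "rows": rows,
        "wt": "json",
    }
    if source:
        # fq restricts results without affecting relevance scoring
        params["fq"] = f'source_system:"{source}"'
    return params

params = build_solr_params(["pipeline", "ingestion"], source="sharepoint")
print(urlencode(params))
```

The same parameter dict could be sent to a Solr endpoint with any HTTP client; keeping query construction separate from transport makes it easy to unit-test.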

Posted 1 week ago

Apply

8.0 years

0 Lacs

Bengaluru

On-site


Imagine what you could do here. At Apple, phenomenal ideas have a way of becoming great products, services, and customer experiences very quickly. Bring passion and dedication to your job and there's no telling what you could accomplish. The people here at Apple don’t just create products - they create the kind of wonder that’s revolutionized entire industries. It’s the diversity of those people and their ideas that inspires the innovation that runs through everything we do, from amazing technology to industry-leading environmental efforts. Join Apple, and help us leave the world better than we found it! We are looking for a passionate NoSQL / Search Engineer to help manage large-scale data store environments. This team is responsible for providing new architectures and scalability solutions for ever-growing business and data-processing needs. The individual should be able to dive deep to solve complex problems and have the curiosity to explore and learn new technologies for innovative solutions.

Description
- Design, implement, and maintain NoSQL database systems / search engines.
- Develop and optimize search algorithms to ensure high performance and accuracy.
- Analyze and understand data requirements to design appropriate data models.
- Monitor and troubleshoot database performance issues, ensuring system stability and efficiency.
- Implement data indexing and ensure efficient data-retrieval processes.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Stay updated on the latest advancements in NoSQL and search technologies, and apply them to improve existing systems.
- Build and maintain documentation related to database configurations, schemas, and processes.
- Work with global teams in the US.
- Deliver solutions that can keep up with a rapidly evolving product in a timely fashion.

Minimum Qualifications
- 8+ years of experience as a NoSQL / Search Engineer or in a similar role
- Education: Bachelor's or Master's in Computer Science or equivalent
- Strong understanding of and hands-on experience with NoSQL databases such as Cassandra, Couchbase, or similar
- Expertise in search technologies such as Elasticsearch, Solr, or similar
- Proficiency in programming languages such as Java and Python
- Familiarity with data modeling, indexing, and query-optimization techniques
- Experience with large-scale data processing and distributed systems
- Strong problem-solving skills and attention to detail
- Good in-depth understanding of Linux in terms of debugging tools and performance tuning
- Open-source contribution is a must

Preferred Qualifications
- Experience with cloud platforms such as AWS, Google Cloud, or Azure is a plus
- Knowledge of machine-learning techniques and their application in search is an added bonus
- JVM tuning tools, OS performance, and debugging

DISCLAIMER: We take affirmative action to offer employment and advancement opportunities to all applicants, including minorities, women, protected veterans, and individuals with disabilities. Apple will not discriminate or retaliate against applicants who inquire about, disclose, or discuss their compensation or that of other applicants. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. Submit CV
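The role above centers on data indexing and efficient retrieval, the core mechanism behind search engines such as Elasticsearch and Solr. The toy sketch below shows the underlying idea of an inverted index with AND-semantics lookup; it is a teaching-level illustration, not any product's actual implementation, and the documents and tokenizer are assumptions.

```python
# Minimal inverted-index sketch: map each term to the set of documents
# containing it, then intersect posting sets to answer multi-term queries.
from collections import defaultdict

def build_index(docs):
    """Map each lowercase whitespace token to the IDs of docs containing it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, query):
    """Return IDs of docs containing every query term (AND semantics)."""
    postings = [index.get(t, set()) for t in query.lower().split()]
    return set.intersection(*postings) if postings else set()

docs = {
    1: "NoSQL database systems",
    2: "search engine indexing",
    3: "database indexing",
}
idx = build_index(docs)
print(search(idx, "database indexing"))  # only doc 3 contains both terms
```

Real engines extend this idea with analyzers, scoring (e.g. BM25), and distributed sharding, but the term-to-postings mapping is the same foundation.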

Posted 1 week ago

Apply