12 years
0 Lacs
Chennai, Tamil Nadu
Work from Office
Skills: Java, Spring Boot, PL/SQL, Elasticsearch, Neo4j, ECS OpenShift, Docker, Kubernetes, Helm, Spring Core, Reactive Programming, and RESTful APIs

We are seeking a Senior Search Domain Expert and Java Developer with 12+ years of experience and a strong background in Spring Boot, ECS OpenShift, Docker, and Kubernetes (Helm). The successful candidate will be responsible for designing, developing, and implementing high-quality software solutions.

Responsibilities:
- Design, develop, and maintain efficient, reusable, and reliable Java code.
- Use Spring Boot to develop microservices and manage cross-cutting concerns.
- Use Docker for containerization and Kubernetes for orchestration of services.
- Identify bottlenecks and bugs, and devise solutions to these problems.
- Help maintain code quality, organization, and automation.
- Collaborate with other team members and stakeholders.

Requirements:
- 12+ years of software development experience with Java.
- Strong experience with Spring Boot.
- Proficiency in PL/SQL.
- Experience with Elasticsearch, including configuration, content ingestion, and query building (see the sketch after this listing).
- Experience with Neo4j and familiarity with knowledge graph database concepts.
- Proven implementation of search indexing for efficient and accurate data retrieval.
- Experience with ECS OpenShift, Docker, and Kubernetes.
- Experience with Helm for managing Kubernetes deployments.
- Solid understanding of object-oriented programming.
- Familiarity with Spring Core, Reactive Programming, and RESTful API development.
- Understanding of code versioning tools such as Git.
- Familiarity with build tools such as Maven or Gradle.
- Excellent problem-solving skills and attention to detail.
- Strong communication skills and the ability to work as part of a team.

About Virtusa: Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth, one that seeks to provide you with exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status, or any other basis covered by applicable law. All employment is decided on the basis of qualifications, merit, and business need.
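For illustration, a minimal sketch of the content-ingestion and query-building skills the Elasticsearch requirement describes. It is shown in Python with the elasticsearch client purely for brevity (the role itself is Java-centric); the host, index name, and document fields are assumptions, not details from the posting.

```python
# Hedged sketch: index a document, then run a full-text query against it.
# Host, index name, and fields are illustrative assumptions.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumed local dev cluster

# Content ingestion: store a document in an assumed "products" index.
es.index(index="products", id="1", document={
    "name": "graph database primer",
    "category": "books",
})

# Query building: a simple match query on the "name" field.
resp = es.search(index="products", query={"match": {"name": "graph"}})
for hit in resp["hits"]["hits"]:
    print(hit["_source"]["name"])
```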
Posted 1 month ago
3 - 7 years
5 - 15 Lacs
Chennai
Remote
Keytag.ai is looking for a skilled Python Developer with expertise in Neo4j and Cypher to join our growing team. You'll be working on enhancing existing products and contributing to all stages of the development lifecycle, from design and implementation to deployment and support. This is a great opportunity to work in a dynamic, startup-style environment with the security of long-term customer contracts already in place.

Core Skills Required:
- Strong Python development skills, particularly in designing and building FastAPI web services
- Hands-on experience with Neo4j and the Cypher query language (see the sketch after this listing)
- Proficient with Docker, Git, and CI/CD pipelines
- Solid experience in data engineering, including use of pandas and other data science libraries
- Comfortable working in a Linux environment using VSCode and GitHub
- A proactive, self-starting attitude with a passion for learning, problem-solving, and contributing ideas to both design and development

Desirable (not required; you'll gain experience here):
- Experience with Apache Airflow (designing and troubleshooting DAGs)
- Familiarity with Redis (used for caching and vector storage)

What You'll Be Working On:
- Extend and enhance existing products, including a knowledge-graph-based intelligence system for a pharmaceutical client, data lineage and discovery tools, and multiple agent-based systems powered by large language models (LLMs)
- Design, build, and maintain APIs using FastAPI
- Model and optimize complex data relationships in Neo4j; manage related caching with Redis
- Collaborate closely with frontend developers, architects, and data experts to deliver new features
- Contribute to overall system architecture, performance tuning, and security enhancements
- Participate in daily team standups, work closely on shared tasks, and engage directly with clients to discuss requirements, troubleshoot, and deliver solutions

Benefits:
- Remote work opportunity, providing flexibility and work-life balance
- Medical insurance coverage for you and your family
- A friendly and well-organized team with an open communication style
- Continuous opportunities to grow your technical and soft skills
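As a flavor of the stack above, a minimal sketch of a FastAPI endpoint backed by a parameterized Cypher query through the official Neo4j Python driver. The connection URI, credentials, node label, and relationship type are illustrative assumptions, not details from the posting.

```python
# Hedged sketch: FastAPI route that returns neighbors of a node in Neo4j.
from fastapi import FastAPI
from neo4j import GraphDatabase

# Assumed local instance and credentials.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))
app = FastAPI()

@app.get("/related/{name}")
def related_entities(name: str, limit: int = 10):
    # Parameterized Cypher keeps user input out of the query string.
    cypher = (
        "MATCH (e:Entity {name: $name})-[:RELATED_TO]->(n:Entity) "
        "RETURN n.name AS name LIMIT $limit"
    )
    with driver.session() as session:
        result = session.run(cypher, name=name, limit=limit)
        return [record["name"] for record in result]
```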
Posted 1 month ago
12 - 17 years
14 - 19 Lacs
Pune, Bengaluru
Work from Office
Project Role: Application Architect
Project Role Description: Provide functional and/or technical expertise to plan, analyze, define, and support the delivery of future functional and technical capabilities for an application or group of applications. Assist in facilitating impact assessment efforts and in producing and reviewing estimates for client work requests.
Must-have skills: Manufacturing Operations
Good-to-have skills: NA
Minimum 12 years of experience is required.
Educational Qualification: BTech/BE

Job Title: Industrial Data Architect

Summary: We are seeking a highly skilled and experienced Industrial Data Architect with a proven track record of providing functional and/or technical expertise to plan, analyze, define, and support the delivery of future functional and technical capabilities for an application or group of applications. Well versed in OT data quality, data modelling, data governance, data contextualization, database design, and data warehousing.

Must-have skills: Domain knowledge in Manufacturing IT/OT in one or more of the following verticals: Automotive, Discrete Manufacturing, Consumer Packaged Goods, Life Sciences.

Key Responsibilities:
- Develop and oversee industrial data architecture strategies to support advanced data analytics, business intelligence, and machine learning initiatives.
- Collaborate with various teams to design and implement efficient, scalable, and secure data solutions for industrial operations.
- Design, build, and manage the data architecture of industrial systems.
- Assist in facilitating impact assessment efforts and in producing and reviewing estimates for client work requests.
- Own the offerings and assets on key components of the data supply chain: data governance, curation, data quality and master data management, data integration, data replication, and data virtualization.
- Create scalable and secure data structures, integrating with existing systems and ensuring efficient data flow.

Qualifications:
- Data Modeling and Architecture: proficiency in data modeling techniques (conceptual, logical, and physical models); knowledge of database design principles and normalization; experience with data architecture frameworks and methodologies (e.g., TOGAF).
- Relational Databases: expertise in SQL databases such as MySQL, PostgreSQL, Oracle, and Microsoft SQL Server.
- NoSQL Databases: experience with at least one NoSQL database such as MongoDB, Cassandra, or Couchbase for handling unstructured data.
- Graph Databases: proficiency with at least one graph database such as Neo4j, Amazon Neptune, or ArangoDB; understanding of graph data models, including property graphs and RDF (Resource Description Framework).
- Query Languages: experience with at least one of Cypher (Neo4j), SPARQL (RDF), or Gremlin (Apache TinkerPop); familiarity with ontologies, RDF Schema, and OWL (Web Ontology Language); exposure to semantic web technologies and standards.
- Data Integration and ETL (Extract, Transform, Load): proficiency in ETL tools and processes (e.g., Talend, Informatica, Apache NiFi); experience with data integration tools and techniques to consolidate data from various sources.
- IoT and Industrial Data Systems: familiarity with Industrial Internet of Things (IIoT) platforms and protocols (e.g., MQTT, OPC UA); experience with an IoT data platform such as AWS IoT, Azure IoT Hub, or Google Cloud IoT Core; experience with one or more streaming data platforms such as Apache Kafka, Amazon Kinesis, or Apache Flink; ability to design and implement real-time data pipelines (see the sketch after this listing); familiarity with processing frameworks such as Apache Storm, Spark Streaming, or Google Cloud Dataflow; understanding of event-driven design patterns and practices; experience with message brokers like RabbitMQ or ActiveMQ; exposure to edge computing platforms such as AWS IoT Greengrass or Azure IoT Edge.
- AI/ML and GenAI: experience preparing data for AI/ML/GenAI applications; exposure to machine learning frameworks such as TensorFlow, PyTorch, or Keras.
- Cloud Platforms: experience with cloud data services from at least one provider, e.g., AWS (Amazon Redshift, AWS Glue), Microsoft Azure (Azure SQL Database, Azure Data Factory), or Google Cloud Platform (BigQuery, Dataflow).
- Data Warehousing and BI Tools: expertise in data warehousing solutions (e.g., Snowflake, Amazon Redshift, Google BigQuery); proficiency with Business Intelligence (BI) tools such as Tableau, Power BI, and QlikView.
- Data Governance and Security: understanding of data governance principles, data quality management, and metadata management; knowledge of data security best practices, compliance standards (e.g., GDPR, HIPAA), and data masking techniques.
- Big Data Technologies: experience with big data platforms and tools such as Hadoop, Spark, and Apache Kafka; understanding of distributed computing and data processing frameworks.
- Excellent communication: superior written and verbal communication skills, with the ability to effectively articulate complex technical concepts to diverse audiences.
- Problem-solving acumen: a passion for tackling intricate challenges and devising elegant solutions.
- Collaborative spirit: a track record of successful collaboration with cross-functional teams and stakeholders.
- Certifications: AWS Certified Data Engineer Associate, Microsoft Certified: Azure Data Engineer Associate, or Google Cloud Certified Professional Data Engineer certification is mandatory.
- Minimum of 14-18 years of progressive information technology experience.
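To make the real-time pipeline requirement concrete, a minimal sketch of consuming IIoT telemetry from Kafka using the kafka-python client. The topic, broker address, message schema, and alert threshold are illustrative assumptions.

```python
# Hedged sketch: consume machine telemetry from an assumed Kafka topic and
# flag high-temperature readings for a downstream alerting step.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "machine-telemetry",                 # assumed topic name
    bootstrap_servers="localhost:9092",  # assumed broker address
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    reading = message.value  # assumed shape: {"machine_id": "m-01", "temp_c": 81.4}
    if reading.get("temp_c", 0) > 80:
        # In a real pipeline this would feed alerting or contextualization.
        print(f"High temperature on {reading['machine_id']}: {reading['temp_c']}")
```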
Posted 1 month ago
5 - 10 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Neo4j
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-have: proficiency in Neo4j.
- Strong understanding of graph database concepts and data modeling (see the sketch after this listing).
- Experience with application development frameworks and methodologies.
- Familiarity with RESTful APIs and microservices architecture.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Neo4j.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
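For a concrete taste of the graph data modeling this role calls for, a minimal sketch that creates nodes and a relationship in Neo4j via the official Python driver. Labels, properties, and credentials are illustrative assumptions.

```python
# Hedged sketch: model a customer-purchases-product relationship in Neo4j.
from neo4j import GraphDatabase

# Assumed local instance and credentials.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

with driver.session() as session:
    # MERGE is idempotent: it matches existing nodes/edges or creates them.
    session.run(
        "MERGE (c:Customer {id: $cid}) "
        "MERGE (p:Product {sku: $sku}) "
        "MERGE (c)-[:PURCHASED {at: datetime($ts)}]->(p)",
        cid="c-100", sku="sku-42", ts="2024-01-15T10:00:00",
    )
```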
Posted 1 month ago
18 - 22 years
0 Lacs
Hyderabad, Telangana, India
Hybrid
DATAECONOMY is one of the fastest-growing Data & AI companies with a global presence. We are well-differentiated and known for our thought leadership, out-of-the-box products, cutting-edge solutions, accelerators, innovative use cases, and cost-effective service offerings. We offer products and solutions in Cloud, Data Engineering, Data Governance, AI/ML, DevOps, and Blockchain to large corporates across the globe. Strategic partners with AWS, Collibra, Cloudera, Neo4j, DataRobot, Global IDs, Tableau, MuleSoft, and Talend.

Job Title: Delivery Head
Experience: 18-22 years
Location: Hyderabad
Notice Period: immediate joiners are preferred

Job Summary: We are seeking a seasoned Technical Delivery Manager with deep expertise in Data Engineering and Data Science to lead complex data initiatives and drive successful delivery across cross-functional teams. The ideal candidate brings a blend of strategic thinking, technical leadership, and project execution skills, along with hands-on knowledge of modern data platforms, machine learning, and analytics frameworks.

Key Responsibilities:

Program & Delivery Management
- Oversee end-to-end delivery of large-scale data programs, ensuring alignment with business goals, timelines, and quality standards.
- Manage cross-functional project teams including data engineers, data scientists, analysts, and DevOps personnel.
- Ensure agile delivery through structured sprint planning, backlog grooming, and iterative delivery.

Technical Leadership
- Provide architectural guidance and review of data engineering pipelines and machine learning models.
- Evaluate and recommend modern data platforms (e.g., Snowflake, Databricks, Azure Data Services, AWS Redshift, GCP BigQuery).
- Ensure best practices in data governance, quality, and compliance (e.g., GDPR, HIPAA).

Stakeholder & Client Management
- Act as the primary point of contact for technical discussions with clients, business stakeholders, and executive leadership.
- Translate complex data requirements into actionable project plans.
- Present technical roadmaps and delivery status to stakeholders and C-level executives.

Team Development & Mentoring
- Lead, mentor, and grow a high-performing team of data professionals.
- Conduct code and design reviews; promote innovation and continuous improvement.

Key Skills and Qualifications:
- Bachelor's or master's degree in Computer Science, Data Science, Engineering, or a related field.
- 18-22 years of total IT experience with at least 8-10 years in data engineering, analytics, or data science.
- Proven experience delivering enterprise-scale data platforms, including ETL/ELT pipelines using tools like Apache Spark, Airflow, Kafka, Talend, or Informatica; data warehouse and lake architectures (e.g., Snowflake, Azure Synapse, AWS Redshift, Delta Lake); and machine learning lifecycle management (e.g., model training, deployment, and MLOps using MLflow, SageMaker, or Vertex AI).
- Strong knowledge of cloud platforms (Azure, AWS, or GCP).
- Deep understanding of Agile, Scrum, and DevOps principles.
- Excellent problem-solving, communication, and leadership skills.

Preferred Certifications (optional but beneficial):
- PMP, SAFe Agile, or similar project management certifications.
- Certifications in cloud platforms (e.g., AWS Certified Data Analytics, Azure Data Engineer Associate).
- Certified Scrum Master (CSM) or equivalent.
Posted 1 month ago
5 - 8 years
0 Lacs
Pune, Maharashtra, India
Remote
Contract Role - Full Stack Developer (AWS, Node.js, Python, Terraform)
Location: Remote (Offshore)
Contract Type: Day rate / Contract

A UK-based cyber consultancy is seeking an experienced Full Stack Developer for an offshore contract role. You'll help build a next-gen SaaS platform processing IoT edge data, all within a secure, serverless AWS environment.

Key Skills:
- AWS (Lambda, API Gateway, S3, IAM, CodePipeline); see the sketch after this listing
- Node.js & Python: strong backend/API development
- Terraform: modular IaC, state management
- CI/CD & DevOps: build, deploy, secure
- Graph databases (Neo4j/AuraDB), Cypher queries
- Security-first mindset: SAST, DAST, OWASP, etc.

Nice to Have:
- IoT, event-driven design (SNS/SQS), Cognito
- Docker/Fargate, AWS certifications

Join a collaborative team pushing the boundaries of secure SaaS and IoT solutions. Apply now to be part of something innovative!
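As a small illustration of the serverless Python work described above, a sketch of an AWS Lambda handler (behind an API Gateway proxy integration) that stores an IoT reading in S3. The bucket name and payload shape are assumptions, not details from the posting.

```python
# Hedged sketch: Lambda handler that persists a posted IoT reading to S3.
import json
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # With an API Gateway proxy integration, the request body is a JSON string.
    reading = json.loads(event["body"])
    key = f"readings/{reading['device_id']}.json"  # assumed payload field
    s3.put_object(Bucket="iot-edge-data", Key=key, Body=json.dumps(reading))
    return {"statusCode": 200, "body": json.dumps({"stored": key})}
```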
Posted 1 month ago
12 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About the Company: We are Mindsprint! A leading-edge technology and business services firm that provides impact-driven solutions to businesses, enabling them to outpace the speed of change. For over three decades we have been accelerating technology transformation for the Olam Group and their large base of global clients. Working with leading technologies and empowered with the freedom to create new solutions and better existing ones, we have been inspiring businesses with pioneering initiatives.

Awards bagged in recent years:
- Great Place To Work® Certified™ for 2023-2024
- Best Shared Services in India Award by Shared Services Forum, 2019
- Asia's No. 1 Shared Services in Process Improvement and Value Creation by Shared Services and Outsourcing Network Forum, 2019
- International Innovation Award for Best Services and Solutions, 2019
- Kincentric Best Employer India, 2020
- Creative Talent Management Impact Award, SSON Impact Awards 2021
- The Economic Times Best Workplaces for Women, 2021 & 2022
- #SSFExcellenceAward for Delivering Business Impact through Innovative People Practices, 2022

For more info: https://www.mindsprint.org/
Follow us on LinkedIn: Mindsprint

Position: Associate Director

Responsibilities:
- Lead, mentor, and manage the Data Architects, Apps DBA, and DB Operations teams.
- Possess strong experience and a deep understanding of major RDBMS, NoSQL, and Big Data technologies, with expertise in system design and advanced troubleshooting in high-pressure production environments.
- Core technologies include SQL Server, PostgreSQL, MySQL, TigerGraph, Neo4j, Elasticsearch, ETL concepts, and a high-level understanding of data warehouse platforms such as Snowflake, ClickHouse, etc.
- Define, validate, and implement robust data models and database solutions for clients across sectors such as Agriculture, Supply Chain, and Life Sciences.
- Oversee end-to-end database resource provisioning in the cloud, primarily on Azure, covering IaaS, PaaS, and SaaS models, along with proactive cost management and optimization.
- Hands-on expertise in data migration strategies between on-premises and cloud environments, ensuring minimal downtime and secure transitions.
- Experienced in database performance tuning: identifying and resolving SQL code bottlenecks, code review, optimization for high throughput, and regular database maintenance including defragmentation.
- Solid understanding of High Availability (HA) and Disaster Recovery (DR) solutions, with experience in setting up failover, replication, backup, and recovery strategies.
- Expertise in implementing secure data protection measures such as encryption (at rest and in transit), data masking, access controls, and DLP strategies, and in ensuring regulatory compliance with GDPR, PII, PCI-DSS, HIPAA, etc.
- Skilled in managing data integration, data movement, and data reporting pipelines using tools like Azure Data Factory (ADF), Apache NiFi, and Talend.
- Fair understanding of database internals, storage engines, indexing strategies, and partitioning for optimal resource and performance management.
- Strong knowledge of Master Data Management (MDM), data cataloging, metadata management, and building comprehensive data lineage frameworks.
- Proven experience in implementing monitoring and alerting systems for database health and capacity planning using tools like Azure Monitor, Grafana, or custom scripts (see the sketch after this listing).
- Exposure to DevOps practices for database management, including CI/CD pipelines for database deployments, version control of database schemas, and Infrastructure as Code (IaC) practices (e.g., Terraform, ARM templates).
- Experience collaborating with data analytics teams to provision optimized environments as data is shared between RDBMS, NoSQL, and Snowflake layers.
- Knowledge of security best practices for multi-tenant database environments and data segmentation strategies.
- Ability to guide the evolution of data governance frameworks, defining policies, standards, and best practices for database environments.

Job Location: Chennai
Notice Period: 15 days / immediate / currently serving notice period (max 30 days)
Shift: day shift
Experience: minimum 12 years
Work Mode: Hybrid
Grade: D1 (Associate Director)
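To illustrate the "custom scripts" style of monitoring mentioned above, a minimal sketch that checks PostgreSQL replication lag. Connection details and the alert threshold are assumptions; a production version would feed Azure Monitor or Grafana rather than print.

```python
# Hedged sketch: flag replicas whose WAL replay lags the primary too far.
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="postgres", user="monitor")
with conn.cursor() as cur:
    # pg_stat_replication reports per-replica WAL positions on the primary.
    cur.execute(
        "SELECT application_name, "
        "pg_wal_lsn_diff(pg_current_wal_lsn(), replay_lsn) AS lag_bytes "
        "FROM pg_stat_replication"
    )
    for name, lag_bytes in cur.fetchall():
        if lag_bytes and lag_bytes > 64 * 1024 * 1024:  # assumed 64 MB threshold
            print(f"ALERT: replica {name} lagging by {lag_bytes} bytes")
conn.close()
```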
Posted 1 month ago
4 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Mid-Level Data Scientist / AI-ML Engineer
Location: On-site, Hyderabad
Experience: 4+ years of relevant experience

Job Summary: We are looking for a Data Scientist to join our team in Hyderabad. This role is ideal for candidates with 4+ years of experience and strong ML, GenAI, and cloud skills.

Must-Have Skills:
- Experience with GenAI: working with LLMs like Llama, Mistral, DeepSeek, and GPTs; open-source GenAI tools like Flowise/Langflow; knowledge graphs like Neo4j; LangChain
- Solid grasp of machine learning algorithms and model deployment
- REST APIs and microservices (e.g., FastAPI, Flask, and Streamlit)
- Experience with AWS, GCP, or Azure (S3, Lambda, SageMaker, Bedrock, or Vertex AI)
- Containerization (Docker) and CI/CD basics
- SQL and NoSQL databases
- Data preprocessing and ETL pipelines
- Agentic AI, MLOps, prompt engineering, and RAG pipelines (see the retrieval sketch after this listing)
- Git; vector DBs (Pinecone, FAISS, and ChromaDB)
- Experience integrating LLMs into applications
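To ground the RAG-pipeline requirement, a minimal sketch of the retrieval step using FAISS. The embeddings here are random placeholders standing in for a real embedding model, and the document corpus is invented.

```python
# Hedged sketch: exact nearest-neighbor retrieval over placeholder embeddings.
import numpy as np
import faiss

dim = 384  # assumed embedding dimension
docs = ["drug interactions", "trial design", "adverse events"]
vectors = np.random.rand(len(docs), dim).astype("float32")  # placeholder embeddings

index = faiss.IndexFlatL2(dim)  # exact L2 index; real systems may use ANN variants
index.add(vectors)

query = np.random.rand(1, dim).astype("float32")  # placeholder query embedding
distances, ids = index.search(query, 2)
print([docs[i] for i in ids[0]])  # top-2 documents to hand to the LLM prompt
```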
Posted 1 month ago
0 years
0 Lacs
Kolkata, West Bengal, India
Remote
Unified Infotech is a 14-year-old, multi-award-winning digital transformation partner. We turbocharge business growth for Fortune 500 companies, multinational corporations (MNCs), small and medium-sized enterprises (SMEs), and startups using emerging tech and streamlined digital processes.

We're your go-to partner for:
- Digital Transformation; Custom Web, Mobile, and Desktop Software Development
- Digital Customer Experience: UX/UI Research & Design
- SaaS and Software Product Development
- IT Consulting & Staff Augmentation
- Software Modernization & Cloud Migration
- Data and Analytics
- Cloud Engineering

You can get more details about us from our website: www.unifiedinfotech.net

Position Overview: We are looking for a highly skilled and experienced Solution Architect to join our team. This role is responsible for delivering both technical and functional expertise to clients across various projects. The ideal candidate will have a strong background in designing, implementing, and optimizing scalable and highly available cloud (SaaS) services and solutions. This role involves collaborating closely with business development, account management, and executive leadership teams to ensure that technical solutions align with business goals and are implemented seamlessly.

Key Responsibilities:
- Solution Design & Development: Analyze client requirements and functional specifications, and collaborate with development teams to design and implement scalable, distributed cloud-based solutions (SaaS).
- Cloud Architecture: Lead the design and implementation of highly available, resilient, and efficient cloud architectures. Build complex distributed systems from the ground up with a focus on minimizing downtime, ensuring failproof deployments, and maintaining data integrity.
- Stakeholder Collaboration: Work closely with business development, account managers, and executive management to align technical solutions with business goals and increase overall company productivity and profitability.
- Database Expertise: Provide expertise in SQL and NoSQL databases such as MySQL, Oracle, MongoDB, Cassandra, Redis, and Neo4j to design efficient data models and schemas.
- Continuous Improvement: Evaluate and recommend improvements to current technologies and processes within the organization to drive greater efficiency and performance.
- Mentorship & Best Practices: Mentor development teams by guiding them in best practices for coding, architecture design, and software development methodologies.
- Version Control & CI/CD: Implement and manage version control systems (e.g., Git) and Continuous Integration/Continuous Deployment (CI/CD) pipelines to ensure smooth, efficient development workflows.
- Security & Compliance: Ensure that all solutions adhere to security best practices and comply with relevant standards to protect data and systems.
- Agile Methodology: Participate in Agile/Scrum development processes, collaborating with cross-functional teams to deliver high-quality solutions on time.
- Strategic Planning: Contribute to long-term architectural strategy, identifying areas for improvement and ensuring solutions meet business requirements and performance goals.

Desired Candidate Profile:
- Experience: Proven experience in solution architecture, design, and implementation of scalable cloud-based solutions (SaaS). Hands-on experience with high availability and distributed systems is essential.
- Technical Skills:
  - Strong proficiency in SQL and NoSQL databases (e.g., MySQL, MongoDB, Cassandra, Neo4j, Redis).
  - Expertise in cloud architectures, distributed systems, and high-performance computing.
  - Proficient in version control systems, particularly Git.
  - Familiarity with CI/CD processes and pipeline automation.
  - Understanding of web application security principles.
- Programming & Frameworks: Experience with technologies and frameworks such as NodeJS, Laravel, Spring, Angular, React, or similar frameworks is highly desirable.
- Leadership & Mentorship: Strong ability to mentor and guide technical teams in adopting best practices and delivering high-quality solutions.
- Methodology: Practical experience in Agile/Scrum development methodologies with a collaborative approach to team success.
- Communication: Excellent communication skills, with the ability to effectively present complex technical concepts to both technical and non-technical stakeholders.
Posted 1 month ago
3 years
0 Lacs
Greater Kolkata Area
On-site
This role is for one of Weekday's clients.
Salary range: Rs 600000 - Rs 1700000 (i.e., INR 6-17 LPA)
Min Experience: 3 years
Location: Bangalore, Chennai, Pune, Kolkata, Gurugram
Job Type: full-time
Experience: 6+ years in IT with 3+ years in Data Warehouse/ETL projects

Primary Responsibilities:
- Design and develop modern data warehouse solutions using Snowflake, Databricks, and Azure Data Factory (ADF).
- Deliver forward-looking data engineering and analytics solutions that scale with business needs.
- Work with DW/BI leads to gather and implement requirements for new ETL pipelines.
- Troubleshoot and resolve issues in existing pipelines, identifying root causes and implementing fixes.
- Partner with business stakeholders to understand reporting requirements and build corresponding data models.
- Provide technical mentorship to junior team members and assist with issue resolution.
- Engage in technical discussions with client architects and team members to align on best practices.
- Orchestrate data workflows using scheduling tools like Apache Airflow.

Qualifications:
- Bachelor's or Master's degree in Computer Science or a related field.
- Expertise in Snowflake, including security, SQL, and object design/implementation.
- Proficient with Snowflake tools such as SnowSQL, Snowpipe, Snowsight, and Snowflake connectors.
- Strong understanding of Star and Snowflake schema modeling.
- Deep knowledge of data management principles and data warehousing.
- Experience with Databricks and a solid grasp of Delta Lake architecture.
- Hands-on with SQL and Spark, preferably PySpark (see the sketch after this listing).
- Experience developing ETL processes and transformations for data warehousing solutions.
- Familiarity with NoSQL and open-source databases such as MongoDB, Cassandra, or Neo4j.
- Exposure to structured and unstructured data, including imaging and geospatial formats.
- Proficient in DevOps tools and practices including Terraform, CircleCI, and Git.
- Strong background in RDBMS, PL/SQL, Unix shell scripting, and query performance tuning.
- Databricks Certified Data Engineer Associate/Professional certification is a plus.
- Ability to thrive in a fast-paced, dynamic environment managing multiple projects.
- Experience working within Agile development frameworks.
- Excellent communication, analytical, and problem-solving skills with strong attention to detail.

Mandatory Skills: Snowflake, Azure Data Factory, PySpark, Databricks, SQL, Python
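As a quick illustration of the PySpark work above, a minimal sketch that reads raw data, aggregates it, and writes a Delta table. Paths and column names are assumptions, not details from the posting.

```python
# Hedged sketch: daily revenue aggregation from a raw orders zone to Delta.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

orders = spark.read.parquet("/raw/orders")  # assumed raw-zone path
daily = (
    orders
    .filter(F.col("status") == "COMPLETE")                     # assumed column
    .groupBy(F.to_date("order_ts").alias("order_date"))        # assumed column
    .agg(F.sum("amount").alias("revenue"))
)
daily.write.format("delta").mode("overwrite").save("/curated/daily_revenue")
```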
Posted 1 month ago
3 - 8 years
6 - 10 Lacs
Chennai
Work from Office
Overview:
- Java development with hands-on experience in Spring Boot.
- Strong knowledge of UI frameworks, particularly Angular, for developing dynamic, interactive web applications.
- Experience with Kubernetes for managing microservices-based applications in a cloud environment.
- Familiarity with Postgres (relational) and Neo4j (graph database) for managing complex data models.
- Experience in metadata modeling and designing data structures that support high performance and scalability.
- Expertise in Camunda BPMN and business process automation.
- Experience implementing rules with the Drools rules engine.
- Knowledge of Unix/Linux systems for application deployment and management.
- Experience building data ingestion frameworks to process and handle large datasets.
Posted 1 month ago
2 - 7 years
4 - 8 Lacs
Chennai
Work from Office
Overview:
- Java development with hands-on experience in Spring Boot.
- Strong knowledge of UI frameworks, particularly Angular, for developing dynamic, interactive web applications.
- Experience with Kubernetes for managing microservices-based applications in a cloud environment.
- Familiarity with Postgres (relational) and Neo4j (graph database) for managing complex data models.
- Experience in metadata modeling and designing data structures that support high performance and scalability.
- Expertise in Camunda BPMN and business process automation.
- Experience implementing rules with the Drools rules engine.
- Knowledge of Unix/Linux systems for application deployment and management.
- Experience building data ingestion frameworks to process and handle large datasets.
Posted 1 month ago
6 - 11 years
6 - 9 Lacs
Chennai
Work from Office
Overview:
- Java development with hands-on experience in Spring Boot.
- Strong knowledge of UI frameworks, particularly Angular, for developing dynamic, interactive web applications.
- Experience with Kubernetes for managing microservices-based applications in a cloud environment.
- Familiarity with Postgres (relational) and Neo4j (graph database) for managing complex data models.
- Experience in metadata modeling and designing data structures that support high performance and scalability.
- Expertise in Camunda BPMN and business process automation.
- Experience implementing rules with the Drools rules engine.
- Knowledge of Unix/Linux systems for application deployment and management.
- Experience building data ingestion frameworks to process and handle large datasets.

Key Responsibilities:
- Metadata Modeling: Develop and implement metadata models that represent complex data structures and relationships across the system. Collaborate with cross-functional teams to design flexible, efficient, and scalable metadata models to support application and data processing requirements.
- Software Development (Java & Spring Boot): Develop high-quality, efficient, and scalable Java applications using Spring Boot and other Java-based frameworks. Participate in the full software development lifecycle: design, coding, testing, deployment, and maintenance. Optimize Java applications for performance and scalability.
- UI Development (Angular, optional): Design and implement dynamic, responsive, and user-friendly web UIs using Angular. Integrate the UI with backend microservices, ensuring a seamless and efficient user experience. Ensure that the UI adheres to best practices in terms of accessibility, security, and usability.
- Containerization & Microservices (Kubernetes): Design, develop, and deploy microservices using Kubernetes to ensure high availability and scalability of applications. Use Docker containers and Kubernetes for continuous deployment and automation of the application lifecycle. Maintain and troubleshoot containerized applications in a cloud or on-premise Kubernetes environment.

Requirements:
- Database Management (Postgres & Neo4j): Design and implement database schemas and queries for both relational databases (Postgres) and graph databases (Neo4j). Develop efficient data models and support high-performance query optimization. Collaborate with the data engineering team to integrate data pipelines and ensure the integrity of data storage.
- Business Process Modeling (BPMN): Utilize BPMN to model business processes and workflows. Design and optimize process flows to improve operational efficiency. Work with stakeholders to understand business requirements and implement process automation.
- Rule Engine (Drools): Implement business logic using the Drools rules engine to automate decision-making processes. Work with stakeholders to design and define business rules and integrate them into applications.
- Ingestion Framework: Build and maintain robust data ingestion frameworks that process large volumes of data efficiently. Ensure proper data validation, cleansing, and enrichment during the ingestion process (see the sketch after this listing).
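To make the ingestion-framework responsibility concrete, a minimal validate-cleanse-enrich sketch. It is written in Python with pandas purely for brevity (the role itself is Java/Spring Boot), and the column names and rules are assumptions.

```python
# Hedged sketch: the three ingestion steps the listing names, on toy records.
import pandas as pd

def ingest(records: list[dict]) -> pd.DataFrame:
    df = pd.DataFrame(records)
    # Validation: drop rows missing required fields (assumed schema).
    df = df.dropna(subset=["id", "amount"])
    # Cleansing: normalize casing and coerce types.
    df["currency"] = df["currency"].str.upper().fillna("USD")
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    # Enrichment: derive a field downstream consumers need.
    df["amount_usd_cents"] = (df["amount"] * 100).round().astype("Int64")
    return df

print(ingest([{"id": 1, "amount": "12.50", "currency": "usd"}]))
```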
Posted 1 month ago
7 - 10 years
0 Lacs
Bengaluru, Karnataka
Work from Office
About Us: Data Scientist – 3 – Kotak811

Kotak811 is a neobank incubated by Kotak Mahindra Bank, with a view to providing completely digitized banking services in the convenience of the customer's mobile phone. 811 is an early mover in the Indian fintech space that started off as a downloadable savings bank account in 2017, post demonetization, when India took one step closer to a digital economy.

The Data Scientist-3 in Bangalore (or Mumbai) will be part of the 811 Data Strategy Group that comprises Data Engineers, Data Scientists, and Data Analytics professionals. He/she will be associated with one of the key functional areas such as Product Strategy, Cross-Sell, Asset Risk, Fraud Risk, or Customer Experience, and will help build robust and scalable solutions that are deployed for real-time or near-real-time consumption and integrated into our proprietary Customer Data Platform (CDP). This is an exciting opportunity to work on data-driven analytical solutions and have a profound influence on the growth trajectory of a fast-evolving digital product.

Key Requirements of the Role:
- Advanced degree in an analytical field (e.g., Data Science, Computer Science, Engineering, Applied Mathematics, Statistics, Data Analysis) or substantial hands-on work experience in the space
- 7-10 years of relevant experience in the space
- Expertise in mining AI/ML opportunities from open-ended business problems and driving solution design/development while closely collaborating with engineering, product, and business teams
- Strong understanding of advanced data mining techniques; curating, processing, and transforming data to produce sound datasets. Strong experience in NLP, time series forecasting (see the sketch after this listing), and recommendation engines preferred
- Ability to create great data stories, with expertise in robust EDA and statistical inference. Should have at least a foundational understanding of experimentation design
- Strong understanding of the machine learning lifecycle: feature engineering, training, validation, scaling, deployment, scoring, monitoring, and the feedback loop. Exposure to deep learning applications and tools like TensorFlow, Theano, Torch, or Caffe preferred
- Experience with analytical programming languages, tools, and libraries (Python a must) as well as shell scripting. Should be proficient in developing production-ready code as per best practices. Experience using Scala/Java/Go-based libraries a big plus
- Very proficient in SQL and other relational databases, along with PySpark or Spark SQL. Proficient in using NoSQL databases. Experience using graph databases like Neo4j a plus. Should be able to handle unstructured data with ease
- Experience working with MLEs and proficiency with MLOps tools, consuming their capabilities with a deep understanding of the deployment lifecycle. Experience in CI/CD deployment is a big plus
- Knowledge of key concepts in distributed systems like replication, serialization, and concurrency control a big plus
- Good understanding of programming best practices and building code artifacts for reuse. Should be comfortable with version control and collaborate comfortably in tools like git
- Ability to create frameworks that can perform model RCAs using analytical and interpretability tools. Should be able to peer-review model documentation/code bases and find opportunities
- Experience in end-to-end delivery of AI-driven solutions (deep learning and traditional data science projects)
- Strong communication, partnership, and teamwork skills
- Should be able to guide and mentor teams while leading them by example, and be an integral part of creating a team culture focused on collaboration, technical expertise, and partnerships with other teams
- Ability to work in an extremely fast-paced environment, meet deadlines, and perform at high standards with limited supervision
- A self-starter looking to build from the ground up and contribute to the making of a potential big name in the space
- Experience in banking and financial services is a plus; however, sound logical reasoning and first-principles problem solving are even more critical

A typical day in the life of the job role:
1. As a key partner at the table, attend key meetings with the business team to bring the data perspective to the discussions
2. Perform comprehensive data explorations to generate inquisitive insights and scope out the problem
3. Develop simple to advanced solutions to address the problem at hand; we believe in making swift (albeit sometimes marginal) impact to business KPIs and hence adopt an MVP approach to solution development
4. Build reusable analytical code frameworks to address commonly occurring business questions
5. Perform 360-degree customer profiling and opportunity analyses to guide new product strategy; this is a nascent business, so opportunities to guide business strategy are plenty
6. Guide team members on data science and analytics best practices to help them overcome bottlenecks and challenges
7. The role will be approximately 60% IC and 40% leading; the ratio can vary based on need and fit
8. Develop Customer-360 features that will be integrated into the Customer Data Platform (CDP) to enhance the single view of our customer

Website: https://www.kotak811.com/
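To ground the time-series forecasting skill named above, a minimal sketch using statsmodels' ARIMA on a toy daily series. The data and the (p, d, q) order are assumptions for illustration only.

```python
# Hedged sketch: fit ARIMA to toy daily counts and forecast three days ahead.
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Toy daily transaction counts; a real case would pull from the warehouse.
series = pd.Series(
    [120, 132, 129, 141, 155, 149, 162, 171, 168, 180],
    index=pd.date_range("2024-01-01", periods=10, freq="D"),
)

model = ARIMA(series, order=(1, 1, 1)).fit()  # assumed (p, d, q) order
print(model.forecast(steps=3))                # next three days' forecast
```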
Posted 1 month ago
8 - 10 years
25 - 30 Lacs
Chennai, Pune, Delhi
Work from Office
Description:

ACCOUNTABILITIES
- Provides database operations support, monitoring databases and backups, resolving repetitive and simple events, and escalating more complex incidents according to the standard operating procedures.
- Provides production database support with specific focus on availability, capacity, performance, security, and recoverability.
- Performs database installation and configuration, tuning, capacity planning, health checks, backups & recovery, and change management according to documented standards.

Additional Details: India general shift (IST); needs to join global meetings during evening IST hours. Candidate needs to be onsite at the Dell Bangalore or Dell Hyderabad office location.

Education and Experience:
- Bachelor's degree in computer science or a related technical discipline, or the equivalent combination of education, technical training, or work experience.
- Requires 8-10 years of related experience in the design, maintenance, and administration of NoSQL databases: Elasticsearch, Neo4j, Cassandra, SingleStore, etc.
- Hands-on experience in Ansible automation development for NoSQL DB platform provisioning, DB installation, upgrades/patching, and DBA administration activities.
- Deep understanding of DB cluster management, replication, and multi-datacenter configuration.
- Strong knowledge of monitoring, management, capacity planning, and compaction strategy (see the health-check sketch after this listing).
- Good knowledge of database backup and recovery, connectivity and security, and role management.
- Ability to express complex technical concepts effectively, both verbally and in writing.
- Ability to work well with people from many different disciplines with varying degrees of technical experience.
- Must be versatile, flexible, and proactive when resolving technical issues.
- Excellent interpersonal communication skills.

Professional Certifications: Ansible automation development and Elasticsearch / Neo4j / Cassandra / SingleStore certifications are preferred.
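As an example of the kind of availability check this role automates for NoSQL platforms, a minimal sketch against an Elasticsearch cluster. The endpoint is an assumption, and a production check would page an alerting system rather than print.

```python
# Hedged sketch: basic cluster availability/health check for Elasticsearch.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumed cluster endpoint

health = es.cluster.health()
print(f"status={health['status']} nodes={health['number_of_nodes']}")
if health["status"] == "red" or health["relocating_shards"] > 0:
    print("ALERT: cluster degraded or rebalancing")
```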
Posted 1 month ago
2 - 4 years
15 - 15 Lacs
Hyderabad
Work from Office
We are seeking a highly skilled AI/ML Engineer to join our dynamic team to build next-gen applications for our global customers. If you are a technology enthusiast and highly passionate, we are eager to discuss the potential role with you.

Responsibilities:
- Implement and deploy machine learning solutions to solve complex problems and deliver real business value, i.e., revenue, engagement, and customer satisfaction.
- Collaborate with data product managers, software engineers, and SMEs to identify AI/ML opportunities for improving process efficiency.
- Develop production-grade ML models to enhance customer experience, content recommendation, content generation, and predictive analysis.
- Monitor and improve model performance via data enhancement, feature engineering, experimentation, and online/offline evaluation.
- Stay up to date with the latest in machine learning and artificial intelligence, and influence AI/ML for the life sciences industry.

Requirements:
- 2-4 years of experience in AI/ML engineering, with a track record of handling increasingly complex projects.
- Strong programming skills in Python and Rust.
- Experience with Pandas, NumPy, SciPy, and OpenCV (for image processing).
- Experience with ML frameworks such as scikit-learn, TensorFlow, and PyTorch.
- Experience with GenAI tools such as LangChain, LlamaIndex, and open-source vector DBs.
- Experience with one or more graph DBs: Neo4j, ArangoDB.
- Experience with MLOps platforms such as Kubeflow or MLflow (see the tracking sketch after this listing).
- Expertise in one or more of the following AI/ML domains: Causal AI, Reinforcement Learning, Generative AI, NLP, Dimension Reduction, Computer Vision, Sequential Models.
- Expertise in building, deploying, measuring, and maintaining machine learning models to address real-world problems.
- Thorough understanding of the software product development lifecycle, DevOps (build, continuous integration, deployment tools), and best practices.
- Excellent written and verbal communication skills and interpersonal skills.
- Advanced degree in Computer Science, Machine Learning, or a related field.

Benefits

We're on a Mission: In 2005, we disrupted the life sciences industry by introducing the world's first digital validation lifecycle management system. ValGenesis VLMS® revolutionized compliance-based corporate validation activities and has remained the industry standard. Today, we continue to push the boundaries of innovation, enhancing and expanding our portfolio beyond validation with an end-to-end digital transformation platform. We combine our purpose-built systems with world-class consulting services to help every facet of GxP meet evolving regulations and quality expectations.

The Team You'll Join: Our customers' success is our success. We keep the customer experience centered in our decisions, from product to marketing to sales to services to support. Life sciences companies exist to improve humanity's quality of life, and we honor that mission. We work together: we communicate openly, support each other without reservation, and never hesitate to wear multiple hats to get the job done. We think big: innovation is the heart of ValGenesis, and that spirit drives product development as well as personal growth; we never stop aiming upward. We're in it to win it: we're on a path to becoming the number one intelligent validation platform in the market, and we won't settle for anything less than being a market leader.
How We Work: Our Chennai and Bangalore offices are onsite, 5 days per week. We believe that in-person interaction and collaboration foster creativity and a sense of community, and are critical to our future success as a company. ValGenesis is an equal-opportunity employer that makes employment decisions on the basis of merit. Our goal is to have the best-qualified people in every job. All qualified applicants will receive consideration for employment without regard to race, religion, sex, sexual orientation, gender identity, national origin, disability, or any other characteristics protected by local law.
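To ground the MLOps requirement above, a minimal sketch of MLflow experiment tracking around a toy scikit-learn model. The experiment name and dataset are assumptions for illustration only.

```python
# Hedged sketch: log params, a metric, and a model artifact with MLflow.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=200, random_state=42)  # toy data
mlflow.set_experiment("demo-classifier")  # assumed experiment name

with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=50, random_state=42)
    model.fit(X, y)
    mlflow.log_param("n_estimators", 50)
    mlflow.log_metric("train_accuracy", model.score(X, y))
    mlflow.sklearn.log_model(model, "model")  # stored as a run artifact
```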
Posted 1 month ago