2 - 4 years
4 - 7 Lacs
Hyderabad
Work from Office
Associate Data Engineer Graph – Research Data and Analytics

What you will do
Let's do this. Let's change the world. In this vital role you will be part of Research's Semantic Graph team, which is seeking a dedicated and skilled Data Engineer to design, build and maintain solutions for scientific data that drive business decisions for Research. You will build scalable, high-performance, graph-based data engineering solutions for large scientific datasets and collaborate with Research partners. The ideal candidate has experience in the pharmaceutical or biotech industry, deep technical skills, experience with semantic data modeling and graph databases, and an understanding of data architecture and ETL processes.

Roles & Responsibilities:
- Design, develop, and implement data pipelines, ETL/ELT processes, and data integration solutions
- Contribute to data pipeline projects from inception to deployment; manage scope, timelines, and risks
- Contribute to data models for biopharma scientific data, data dictionaries, and other documentation to ensure data accuracy and consistency
- Optimize large datasets for query performance
- Collaborate with global multi-functional teams, including research scientists, to understand data requirements and design solutions that meet business needs
- Implement data security and privacy measures to protect sensitive data
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Collaborate with Data Architects, Business SMEs, Software Engineers and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
- Identify and resolve data-related challenges
- Adhere to best practices for coding, testing, and designing reusable code/components
- Explore new tools and technologies that will help to improve ETL platform performance
- Participate in sprint planning meetings and provide estimations on technical implementation
- Maintain documentation of processes, systems, and solutions

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications and Experience:
- Bachelor's degree and 1 to 3 years of Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or related field experience, OR
- Diploma and 4 to 7 years of Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or related field experience

Functional Skills:
Must-Have Skills:
- Advanced semantic and relational data skills: proficiency in Python, RDF, SPARQL, graph databases (e.g. AllegroGraph), SQL, relational databases, ETL pipelines, big data technologies (e.g. Databricks), semantic data standards (OWL, W3C, FAIR principles), ontology development and semantic modeling practices
- Hands-on experience with big data technologies and platforms such as Databricks, workflow orchestration, and performance tuning of data processing
- Excellent problem-solving skills and the ability to work with large, complex datasets

Good-to-Have Skills:
- A passion for tackling complex challenges in drug discovery with technology and data
- System administration skills, such as managing Linux and Windows servers, configuring network infrastructure, and automating tasks with shell scripting; examples include setting up and maintaining virtual machines, troubleshooting server issues, and ensuring data security through regular updates and backups
- Solid understanding of data modeling, data warehousing, and data integration concepts
- Solid experience using RDBMS (e.g. Oracle, MySQL, SQL Server, PostgreSQL)
- Knowledge of cloud data platforms (AWS preferred)
- Experience with data visualization tools (e.g. Dash, Plotly, Spotfire)
- Experience with diagramming and collaboration tools such as Miro or Lucidchart for process mapping and brainstorming
- Experience writing and maintaining user documentation in Confluence

Professional Certifications:
- Databricks Certified Data Engineer Professional preferred

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Good communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
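The posting above centers on RDF, SPARQL, and semantic modeling. As a hedged illustration only (a pure-Python sketch, not the AllegroGraph API or a real SPARQL engine), a triple store and a SPARQL basic graph pattern can be mimicked with a small in-memory matcher; all `ex:` URIs and the compound/protein examples are invented for this sketch.

```python
# Minimal sketch of RDF-style triples and a SPARQL-like single triple
# pattern match. Illustrative only: real work would use a triple store
# (e.g. AllegroGraph) and a SPARQL engine; every URI here is invented.

triples = {
    ("ex:aspirin", "rdf:type", "ex:Compound"),
    ("ex:aspirin", "ex:targets", "ex:COX1"),
    ("ex:ibuprofen", "rdf:type", "ex:Compound"),
    ("ex:ibuprofen", "ex:targets", "ex:COX2"),
    ("ex:COX1", "rdf:type", "ex:Protein"),
}

def match(pattern, store):
    """Match one (s, p, o) pattern against the store; terms starting
    with '?' are variables. Returns a list of variable bindings, like
    a single SPARQL triple pattern would."""
    bindings = []
    for s, p, o in store:
        row = {}
        ok = True
        for term, value in zip(pattern, (s, p, o)):
            if term.startswith("?"):
                row[term] = value
            elif term != value:
                ok = False
                break
        if ok:
            bindings.append(row)
    return bindings

# Analogue of: SELECT ?c WHERE { ?c rdf:type ex:Compound }
compounds = sorted(b["?c"] for b in match(("?c", "rdf:type", "ex:Compound"), triples))
print(compounds)  # ['ex:aspirin', 'ex:ibuprofen']
```

A full SPARQL engine joins many such patterns and handles `OPTIONAL`, filters, and inference; this only shows the pattern-to-bindings idea behind the query language the role asks for.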
Posted 3 months ago
10 - 15 years
27 - 35 Lacs
Hyderabad
Work from Office
Node.js experience, TypeScript, JavaScript, NoSQL/Graph databases, and API development. Expertise in GraphQL, Docker, cloud deployment (AWS/Azure), and React.js/AngularJS is required. Knowledge of API security, logging, Koa.js, and build tools is a plus.
Posted 3 months ago
12 - 17 years
14 - 19 Lacs
Pune, Bengaluru
Work from Office
Project Role: Application Architect
Project Role Description: Provide functional and/or technical expertise to plan, analyze, define and support the delivery of future functional and technical capabilities for an application or group of applications. Assist in facilitating impact assessment efforts and in producing and reviewing estimates for client work requests.
Must have skills: Manufacturing Operations
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: BTech/BE

Job Title: Industrial Data Architect

Summary: We are seeking a highly skilled and experienced Industrial Data Architect with a proven track record in providing functional and/or technical expertise to plan, analyse, define and support the delivery of future functional and technical capabilities for an application or group of applications. Well versed in OT data quality, data modelling, data governance, data contextualization, database design, and data warehousing.

Must-have skills: Domain knowledge in Manufacturing IT/OT in one or more of the following verticals: Automotive, Discrete Manufacturing, Consumer Packaged Goods, Life Science.

Key Responsibilities:
- Develop and oversee the industrial data architecture strategies to support advanced data analytics, business intelligence, and machine learning initiatives; collaborate with various teams to design and implement efficient, scalable, and secure data solutions for industrial operations.
- Design, build, and manage the data architecture of industrial systems.
- Assist in facilitating impact assessment efforts and in producing and reviewing estimates for client work requests.
- Own the offerings and assets on key components of the data supply chain: data governance, curation, data quality and master data management, data integration, data replication, data virtualization.
- Create scalable and secure data structures, integrating with existing systems and ensuring efficient data flow.

Qualifications:
- Data Modeling and Architecture:
  - Proficiency in data modeling techniques (conceptual, logical, and physical models)
  - Knowledge of database design principles and normalization
  - Experience with data architecture frameworks and methodologies (e.g., TOGAF)
- Database Technologies:
  - Relational databases: expertise in SQL databases such as MySQL, PostgreSQL, Oracle, and Microsoft SQL Server
  - NoSQL databases: experience with at least one of MongoDB, Cassandra, or Couchbase for handling unstructured data
  - Graph databases: proficiency with at least one of Neo4j, Amazon Neptune, or ArangoDB; understanding of graph data models, including property graphs and RDF (Resource Description Framework)
  - Query languages: experience with at least one of Cypher (Neo4j), SPARQL (RDF), or Gremlin (Apache TinkerPop); familiarity with ontologies, RDF Schema, and OWL (Web Ontology Language); exposure to semantic web technologies and standards
- Data Integration and ETL (Extract, Transform, Load):
  - Proficiency in ETL tools and processes (e.g., Talend, Informatica, Apache NiFi)
  - Experience with data integration tools and techniques to consolidate data from various sources
- IoT and Industrial Data Systems:
  - Familiarity with Industrial Internet of Things (IIoT) platforms and protocols (e.g., MQTT, OPC UA)
  - Experience with an IoT data platform such as AWS IoT, Azure IoT Hub, or Google Cloud IoT Core
  - Experience working with one or more streaming data platforms such as Apache Kafka, Amazon Kinesis, or Apache Flink
  - Ability to design and implement real-time data pipelines; familiarity with processing frameworks such as Apache Storm, Spark Streaming, or Google Cloud Dataflow
  - Understanding of event-driven design patterns and practices; experience with message brokers like RabbitMQ or ActiveMQ
  - Exposure to edge computing platforms like AWS IoT Greengrass or Azure IoT Edge
- AI/ML, GenAI:
  - Experience with data readiness for feeding into AI/ML/GenAI applications
  - Exposure to machine learning frameworks such as TensorFlow, PyTorch, or Keras
- Cloud Platforms:
  - Experience with cloud data services from at least one of AWS (Amazon Redshift, AWS Glue), Microsoft Azure (Azure SQL Database, Azure Data Factory), or Google Cloud Platform (BigQuery, Dataflow)
- Data Warehousing and BI Tools:
  - Expertise in data warehousing solutions (e.g., Snowflake, Amazon Redshift, Google BigQuery)
  - Proficiency with Business Intelligence (BI) tools such as Tableau, Power BI, and QlikView
- Data Governance and Security:
  - Understanding of data governance principles, data quality management, and metadata management
  - Knowledge of data security best practices, compliance standards (e.g., GDPR, HIPAA), and data masking techniques
- Big Data Technologies:
  - Experience with big data platforms and tools such as Hadoop, Spark, and Apache Kafka
  - Understanding of distributed computing and data processing frameworks
- Excellent communication: superior written and verbal communication skills, with the ability to effectively articulate complex technical concepts to diverse audiences
- Problem-solving acumen: a passion for tackling intricate challenges and devising elegant solutions
- Collaborative spirit: a track record of successful collaboration with cross-functional teams and stakeholders
- Certifications: AWS Certified Data Engineer Associate / Microsoft Certified: Azure Data Engineer Associate / Google Cloud Certified Professional Data Engineer certification is mandatory

Minimum of 14-18 years progressive information technology experience.
Qualifications: BTech/BE
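The qualifications above distinguish property graphs (LPG) from RDF. As a rough sketch of why that distinction matters (all node names and the machine/line example are invented; real systems would use Neo4j or a triple store, not Python dicts), the same fact can be held in both paradigms:

```python
# Sketch contrasting the two graph paradigms the posting names.
# A property graph (LPG) attaches key/value properties to nodes and
# relationships; RDF decomposes everything into triples. The example
# (a machine feeding a production line) is invented.

# LPG view: nodes and relationships carry property maps.
lpg_nodes = {
    "m1": {"label": "Machine", "props": {"name": "Press-01", "site": "Pune"}},
    "l1": {"label": "Line", "props": {"name": "Line-A"}},
}
lpg_rels = [("m1", "FEEDS", "l1", {"since": 2021})]

# RDF view: the same information as subject/predicate/object triples.
rdf_triples = [
    ("ex:m1", "rdf:type", "ex:Machine"),
    ("ex:m1", "ex:name", "Press-01"),
    ("ex:m1", "ex:site", "Pune"),
    ("ex:m1", "ex:feeds", "ex:l1"),
    ("ex:l1", "rdf:type", "ex:Line"),
    ("ex:l1", "ex:name", "Line-A"),
]

# Note the LPG edge property ("since": 2021) has no direct triple
# equivalent: plain RDF needs reification or RDF-star to state facts
# about a relationship itself.
feeds = [(s, o) for s, p, o in rdf_triples if p == "ex:feeds"]
print(feeds)  # [('ex:m1', 'ex:l1')]
```

The edge-property gap is one practical reason an architect choosing between Neo4j-style LPG and RDF stores needs both models in hand.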
Posted 3 months ago
7 - 9 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Graph Databases
Good to have skills: Life Sciences, Autosys
Minimum 7.5 year(s) of experience is required
Educational Qualification: BE

Summary: As a Data Platform Engineer, you will be responsible for assisting with the blueprint and design of the data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, utilizing your expertise in Graph Databases and Life Sciences.

Roles & Responsibilities:
- Assist with the blueprint and design of the data platform components.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Utilize expertise in Graph Databases to design and implement data models.
- Develop and maintain data pipelines and ETL processes.
- Ensure data quality and integrity through testing and validation processes.

Professional & Technical Skills:
- Must-Have Skills: Expertise in Graph Databases.
- Good-To-Have Skills: Knowledge of Life Sciences.
- Experience in designing and implementing data models.
- Proficiency in developing and maintaining data pipelines and ETL processes.
- Strong understanding of testing and validation processes for ensuring data quality and integrity.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Graph Databases.
- This position is based at our Bengaluru office.

Qualification: BE
Posted 3 months ago
5 - 10 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Neo4j
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Neo4j.
- Strong understanding of graph database concepts and data modeling.
- Experience with application development frameworks and methodologies.
- Familiarity with RESTful APIs and microservices architecture.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Neo4j.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 3 months ago
2 - 7 years
4 - 8 Lacs
Chennai
Work from Office
Overview (the Responsibilities and Requirements sections of this posting list the same skills)
- Java development with hands-on experience in Spring Boot.
- Strong knowledge of UI frameworks, particularly Angular, for developing dynamic, interactive web applications.
- Experience with Kubernetes for managing microservices-based applications in a cloud environment.
- Familiarity with Postgres (relational) and Neo4j (graph database) for managing complex data models.
- Experience in metadata modeling and designing data structures that support high performance and scalability.
- Expertise in Camunda BPMN and business process automation.
- Experience implementing rules with the Drools rules engine.
- Knowledge of Unix/Linux systems for application deployment and management.
- Experience building data ingestion frameworks to process and handle large datasets.
Posted 3 months ago
3 - 8 years
5 - 10 Lacs
Delhi NCR, Bengaluru, Mumbai (All Areas)
Work from Office
Hi Folks, Greetings!! We are hiring for the below-mentioned position for a leading brand.
Location: Pan India
Must have skills: Graph Databases
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work closely with the team to ensure the successful delivery of high-quality software solutions. Your typical day will involve collaborating with stakeholders to gather requirements, designing and implementing application features, and troubleshooting and resolving any issues that arise. You will also have the opportunity to contribute to the continuous improvement of development processes and practices.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with stakeholders to gather and analyze requirements.
- Design and develop applications based on business process and application requirements.
- Configure and customize applications to meet specific needs.
- Perform unit testing and debugging of applications.
- Troubleshoot and resolve any issues or bugs that arise.
- Contribute to the continuous improvement of development processes and practices.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Graph Databases.
- Strong understanding of software development principles and methodologies.
- Experience with programming languages such as Java or Python.
- Knowledge of database management systems and SQL.
- Familiarity with web development frameworks and technologies.
- Good-To-Have Skills: Experience with cloud platforms such as AWS or Azure.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Graph Databases.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.
Posted 3 months ago
9.0 - 14.0 years
35 - 45 Lacs
hyderabad, chennai, bengaluru
Hybrid
Role: Principal Engineer - Data Engineering
Location: Bangalore / Hyderabad / Chennai
Experience: 9 to 13 years
Primary skills: GCP, Python, Graph DB (Spanner or Neo4j)

Role & responsibilities:
- Bachelor's degree or four or more years of work experience.
- Six or more years of relevant work experience.
- Knowledge of information systems and their applications to data management processes.
- Experience performing detailed analysis of business problems and technical environments and designing the solution.
- Experience working with Google Cloud Platform and BigQuery.
- Experience working with big data technologies and utilities: Hadoop, Spark, Scala, Kafka, NiFi.
- Experience with relational SQL and NoSQL databases.
- Experience with data pipeline and workflow management and governance tools.
- Experience with stream-processing systems.
- Experience with object-oriented/object-function scripting languages.
- Experience building data solutions for machine learning and artificial intelligence.
- Knowledge of data analytics and modeling tools.
Posted Date not available
3.0 - 6.0 years
11 - 20 Lacs
pune
Hybrid
- Bachelor's degree in Computer Science, Software Engineering, or a related field.
- At least 3-5 years of professional experience in backend web development, including familiarity with the full software development lifecycle.
- Proficiency in backend programming languages such as Python, Java, or Node.js (with frameworks like Django, Flask, or Express).
- Strong experience with database management systems, including both SQL (e.g., MySQL, PostgreSQL) and NoSQL databases, particularly graph databases (e.g., Neo4j).
- Expertise in designing and implementing RESTful APIs.
- Knowledge of version control tools like Git.
- Understanding of object-oriented programming, design patterns, and best practices.
- Experience with cloud platforms (e.g., AWS, Azure) and deploying applications in cloud environments is a significant plus.
- Strong problem-solving and communication skills to collaborate effectively with team members and stakeholders.
- Familiarity with DevOps tools and practices for deployment and monitoring.
Posted Date not available
5.0 - 8.0 years
4 - 9 Lacs
bengaluru
Work from Office
- Own the infrastructure: coordination for provisioning, maintenance and monitoring.
- Work closely with the Platform Architecture and Engineering team to improve/establish governance.
- Collaborate with the use case team and the Graph DB product team to enable the technical capabilities.
- Onboard new use cases in Graph DB.
- Own the DevOps pipeline and processes: creation, enhancement and maintenance.
- Own admin-related activities: Graph DB new feature enablement, access, authorization, etc.
Posted Date not available
3.0 - 8.0 years
4 - 8 Lacs
chennai
Work from Office
We are looking to fill immediate job openings for Neo4j Graph Database in Chennai (contract).
Skills: Neo4j Graph Database
Experience: 3+ years
Location: Chennai
Notice Period: Immediate
Employment Type: Contract

Job Description:
- Build Knowledge Graph solutions leveraging large-scale datasets
- Design and build graph database schemas to support various use cases, including knowledge graphs
- Design and develop a Neo4j data model for a new application as per the use cases
- Design and build graph database load processes to efficiently populate the knowledge graphs
- Migrate an existing relational database (BigQuery) to Neo4j
- Build design/integration patterns for both batch and real-time update processes to keep the knowledge graphs in sync
- Work with stakeholders to understand the requirements and translate them into technical architecture
- Select and configure appropriate Neo4j features and capabilities as applicable for the given use case(s)
- Optimize the performance of a Neo4j-based recommendation engine
- Set up a Neo4j cluster in the cloud
- Configure Neo4j security features to protect sensitive data
- Ensure the security and reliability of Neo4j deployments
- Provide guidance and support to other developers on Neo4j best practices

Qualifications:
- Minimum 3+ years of working experience with knowledge graphs/graph databases
- Expertise with graph database technology, especially Neo4j
- Expertise with Python and related software engineering platforms/frameworks
- Experience in designing and building highly scalable knowledge graphs in production
- Experience developing APIs leveraging knowledge graph data
- Experience querying knowledge graphs using a graph query language (e.g. Cypher)
- Experience working with end-to-end CI/CD pipelines

The ideal candidate will have strong knowledge of graph solutions (especially Neo4j) and Python, experience working with massive amounts of data in the retail space, a strong curiosity for data, and a proven track record of successfully implementing graph database solutions with proficiency in software engineering.
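One step in a relational-to-graph migration like the BigQuery-to-Neo4j move described above is reshaping flat rows into nodes and relationships. A hedged sketch (the customer/order table, labels, and IDs are invented; a real load would send the Cypher through the Neo4j driver or `neo4j-admin` import rather than print it):

```python
# Flat relational rows -> deduplicated (:Customer) nodes, (:Order)
# nodes, and PLACED relationships, plus the batched Cypher load
# pattern that would write them. Nothing here touches a database.

rows = [
    {"customer_id": "c1", "customer_name": "Asha", "order_id": "o1", "total": 250},
    {"customer_id": "c1", "customer_name": "Asha", "order_id": "o2", "total": 90},
    {"customer_id": "c2", "customer_name": "Ravi", "order_id": "o3", "total": 120},
]

# Deduplicate customers (one node each); keep every order row.
customers = {r["customer_id"]: r["customer_name"] for r in rows}
orders = [(r["order_id"], r["customer_id"], r["total"]) for r in rows]

# Idempotent batch-load pattern: UNWIND a parameter list, MERGE nodes
# and the relationship so re-running the load does not duplicate data.
cypher = (
    "UNWIND $rows AS row "
    "MERGE (c:Customer {id: row.customer_id}) "
    "SET c.name = row.customer_name "
    "MERGE (o:Order {id: row.order_id}) "
    "SET o.total = row.total "
    "MERGE (c)-[:PLACED]->(o)"
)

print(len(customers), len(orders))  # 2 3
```

The `MERGE`-on-a-key pattern is what keeps batch and real-time update paths (both mentioned in the posting) from creating duplicate nodes when they overlap.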
Posted Date not available
5.0 - 8.0 years
2 - 6 Lacs
mumbai
Work from Office
A Graph Data Engineer is required for a complex supply chain project.

Key required skills:
- Graph data modelling: experience with graph data models (LPG, RDF) and a graph language (Cypher); exposure to various graph data modelling techniques
- Experience with Neo4j Aura; optimizing complex queries
- Experience with GCP stacks like BigQuery, GCS, Dataproc
- Experience in PySpark and SparkSQL is desirable
- Experience exposing graph data to visualisation tools such as NeoDash, Tableau and Power BI

The Expertise You Have:
- Bachelor's or Master's degree in a technology-related field (e.g. Engineering, Computer Science)
- Demonstrable experience implementing data solutions in the graph DB space
- Hands-on experience with graph databases (Neo4j preferred, or any other)
- Experience tuning graph databases
- Understanding of graph data model paradigms (LPG, RDF) and graph language; hands-on experience with Cypher is required
- Solid understanding of graph data modelling, graph schema development, and graph data design
- Relational database experience; hands-on SQL experience is required

Desirable (optional) skills:
- Data ingestion technologies (ETL/ELT), messaging/streaming technologies (GCP Data Fusion, Kinesis/Kafka), API and in-memory technologies
- Understanding of developing highly scalable distributed systems using open-source technologies
- Experience with supply chain data is desirable but not essential

Location: Pune, Mumbai, Chennai, Bangalore, Hyderabad
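The supply-chain use case above is a natural fit for multi-hop graph queries. As an illustrative sketch only (the supplier/part/finished-good graph is invented, and the traversal is plain Python standing in for what Neo4j evaluates natively), a Cypher variable-length pattern such as `MATCH (s:Supplier {name: 'S1'})-[:SUPPLIES*1..]->(p) RETURN p` amounts to a reachability search:

```python
# Toy supply-chain graph and a downstream-impact query: everything
# reachable from a node via SUPPLIES edges, i.e. what a disruption at
# that node could affect. All node names are invented.

from collections import deque

supplies = {           # adjacency list: who supplies whom
    "S1": ["P1", "P2"],
    "P1": ["F1"],      # part P1 goes into finished good F1
    "P2": ["F1", "F2"],
    "F1": [],
    "F2": [],
}

def downstream(graph, start):
    """All nodes reachable from `start` (breadth-first search)."""
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return sorted(seen)

print(downstream(supplies, "S1"))  # ['F1', 'F2', 'P1', 'P2']
```

A graph database stores the adjacency natively and indexes it, which is why these variable-depth traversals stay fast where the equivalent recursive SQL joins degrade.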
Posted Date not available
4.0 - 8.0 years
3 - 7 Lacs
bengaluru
Work from Office
Position Overview
This is an exciting opportunity to join an innovation team focused on developing Generative AI based solutions to accelerate the software development and modernisation process. You will be responsible for driving technological advancements in our products and services, ensuring our solutions remain at the forefront of AI innovation. This is a technologist's dream role, and the successful candidate must display a real passion for technology and continuous learning. There are no passengers in this team!

Key Responsibilities:
- Design, develop and maintain Java-based applications with a focus on scalability, reliability, and performance in an AWS cloud environment
- Design and implement AWS infrastructure and services to support Java application deployment, including EC2, EKS, Lambda, API Gateway and others
- Help us deliver on our vision to enable our AI-based Next Generation Development capability
- Evaluate new technologies to ensure we deliver cutting-edge and innovative solutions
- Demonstrate drive and passion for technology on a daily basis (an absolute must)
- Mentor junior members of the team

Qualifications, Skills & Experience Required:
- Bachelor's degree in Software Engineering or related field
- Minimum of 6+ years Java experience
- Minimum of 3+ years AWS experience
- Strong interest in exploring how to apply Generative AI to the development process
- Excellent English-speaking skills
- Excellent communication and planning skills

Desired Skills:
- Knowledge of front-end development (React or Angular)
- Knowledge of AI coding assistants and techniques
- Knowledge of graph database technology
- Experience in a Tech Lead role

Additional Information - Benefits:
- Work with a global award-winning innovation team focused on engineering excellence
- Enjoy flexible work hours and a hybrid schedule for a healthy work-life balance
- Benefit from a safe and inclusive workplace for women
- Receive a competitive remuneration package with comprehensive benefits for well-being, professional growth, and financial stability
- Participate in a profit share scheme and get a portion of our company's profits each quarter
- Gain cross-skilling opportunities in various technologies
- Access tech-related benefits, including an innovative Tech Scheme and incentivized certifications (AWS, Microsoft, Oracle, Red Hat)
- Collaborate with a diverse team and enjoy private medical and life insurance coverage, free eye tests, and contributions towards glasses
- Work in a brand-new office with state-of-the-art facilities
- Be part of a company recognized as a Great Place to Work
Posted Date not available
4.0 - 6.0 years
0 - 1 Lacs
pune
Remote
Role & responsibilities
Title: Senior Software Engineer
Location: Remote
Experience Required: 5-6 Years
Position Type: Full-Time
Openings: 1

Job Overview: We are seeking a Senior Software Engineer with 5-6 years of hands-on experience in full-stack development. This is a remote position, offering the opportunity to work on cutting-edge projects involving Node.js, React.js, SQL, GraphQL, AWS Serverless, and graph databases. The ideal candidate will also have exposure to Large Language Model (LLM) integrations and OCR techniques, contributing to smart, AI-driven applications.

Key Responsibilities:
- Architect and develop scalable, cloud-native backend services using Node.js and AWS Serverless infrastructure (Lambda, API Gateway, DynamoDB, S3, etc.)
- Build responsive and high-performance frontend applications using React.js
- Design and implement flexible, secure GraphQL APIs
- Manage and query graph databases, and integrate Large Language Models (LLMs) such as OpenAI/GPT for smart automation, chatbots, or semantic search features
- Implement OCR solutions using tools such as Tesseract, AWS Textract, or similar for document data extraction
- Contribute to DevOps workflows, including CI/CD pipelines and infrastructure automation using Terraform or CloudFormation
- Participate in code reviews, knowledge sharing, and mentoring of junior team members

Required Skills:
- Strong proficiency in Node.js and React.js
- Experience in SQL database development and optimization
- Working knowledge of GraphQL and API security best practices
- Hands-on experience with AWS Serverless architecture
- Practical knowledge of Neo4j or other graph databases
- Experience integrating LLM APIs (e.g., OpenAI, Anthropic, LangChain)
- Understanding of OCR tools like Tesseract, AWS Textract, or Google Vision API
- Familiarity with modern software development practices including CI/CD, Git, and agile methodologies

Nice to Have:
- Experience with Terraform or AWS CloudFormation
- Familiarity with TypeScript, NoSQL, and microservices architecture
- Understanding of containerization using Docker, ECS, or EKS
- Prior experience with AI/ML workflows or prompt engineering
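The GraphQL APIs called for above rest on a simple idea: each field maps to a resolver, and the server returns exactly the fields the client asked for. A minimal sketch of that idea only (no real GraphQL parser or library; the user type, field names, and data are invented, and a production service would use something like Apollo Server in Node.js):

```python
# Resolver-map sketch: a "query" is just the set of requested fields,
# and the response contains those fields and nothing more, which is
# the core contrast with a fixed REST payload.

USERS = {"u1": {"id": "u1", "name": "Meera", "email": "meera@example.com"}}

resolvers = {
    "id": lambda user: user["id"],
    "name": lambda user: user["name"],
    "email": lambda user: user["email"],
}

def resolve_user(user_id, requested_fields):
    """Rough analogue of: query { user(id: "u1") { name email } }"""
    user = USERS[user_id]
    return {f: resolvers[f](user) for f in requested_fields}

print(resolve_user("u1", ["name", "email"]))
# {'name': 'Meera', 'email': 'meera@example.com'}
```

Per-field resolvers are also where API-security practices from the skills list (field-level authorization, query depth limits) typically hook in.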
Posted Date not available
5.0 - 8.0 years
4 - 7 Lacs
pune
Work from Office
- Expert C# REST API skills (mandatory)
- Windows Services development skills (mandatory)
- Expert object-oriented programming skills
- MongoDB Atlas experience (document DB) (mandatory)
- Neo4j experience (graph DB)

Nice to Haves:
- Humana experience
- Kafka experience
- SQL queries
- Azure Kubernetes Service (note: Azure is not required)

Mandatory Skills: C# programming. Experience: 5-8 years.
Posted Date not available
8.0 - 12.0 years
20 - 25 Lacs
hyderabad
Hybrid
About Client: Hiring for one of the most prestigious multinational corporations!

Job Title: Data Analyst
Required skills and qualifications:
- Experience in data projects (e.g. data warehousing, business intelligence, analytics)
- Strong experience with Python and Spark
- Strong experience with Big Data platforms (preferred: Cloudera Hadoop)
- Strong experience with the Neo4j Graph Database platform

Qualification: Any Graduate or Above
Relevant Experience: 8 to 12 yrs
Location: Hyderabad
CTC Range: 20 to 25 LPA
Notice Period: immediate joiners / currently serving
Mode of Interview: Virtual
Mode of Work: Hybrid

Gayatri G
Staffing Analyst - IT Recruiter
Black and White Business Solutions Pvt Ltd
Bangalore, Karnataka, INDIA
gayatri@blackwhite.in / www.blackwhite.in
+91 8067432472
3.0 - 6.0 years
4 - 8 Lacs
Noida, India
Work from Office
Sr. Engineer/Engineer - DBA

The role involves designing, developing, enhancing, and monitoring all production Postgres databases within the technical architecture. Optimizing and tuning existing programs and developing new routines will be an integral part of the profile.

Experience Range: 3 - 6 years
Educational Qualifications: B.Tech/B.E

Job Responsibilities:
- Work closely with programming teams to deliver high-quality software.
- Analyze application problems and provide solutions.
- Participate in the creation of development, staging, and production database instances and the migration from one environment to another.
- Review requirements with users and provide time estimates for task completion.
- Responsible for the monitoring and uptime of all production databases.
- Responsible for regular backups and recovery of databases, including PITR (point-in-time recovery).
- Responsible for high-availability failover replication.
- Responsible for regular maintenance on databases (e.g., vacuum, reindexing, archiving).
- Responsible for query tuning, process optimization, and preventative maintenance.
- Provide support in different phases of software development.
- Develop and schedule cron jobs as required.

Skills Required: Database Administration, RDBMS, NoSQL, Cloud, PostgreSQL, Cassandra, Graph DB, Tuning, Recovery, Backup

Candidate Attributes:
- Strong knowledge of Postgres architecture (versions 9/10/11), including master-slave replication.
- Hands-on experience in Postgres installation, configuration, and upgrades.
- Good understanding of data and schema standards and user/application security concepts.
- Experience in monitoring, troubleshooting, and fixing Postgres issues.
- Experience automating regular maintenance tasks (e.g., vacuum, reindexing, archiving) with proactive intervention to reduce downtime.
- Good working experience with SQL query optimization.
- Experience with Postgres backups, restore, and recovery (including PITR).
- Strong knowledge of UNIX/Linux platforms and shell scripting.
- Knowledge of high-availability failover replication and pgpool.
- Ability to quickly learn new business and technical concepts in a fast-paced, service-focused environment.
- Demonstrated analytical and problem-solving skills.
- Good customer service and communication skills; comfortable interacting with internal clients and staff.
- Highly motivated, with the ability to work well independently and within a team.
- Ability to work under pressure and handle several tasks efficiently.
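The routine-maintenance duties in the posting above (notably vacuum with proactive intervention) usually come down to watching dead-tuple bloat in `pg_stat_user_tables` and vacuuming before autovacuum falls behind. A minimal sketch of that decision logic, with the statistics hard-coded so it is self-contained (the threshold and table names are illustrative, not from the posting):

```python
# Sketch: flag tables whose dead-tuple ratio suggests a manual VACUUM.
# In production the rows would come from a query such as:
#   SELECT relname, n_live_tup, n_dead_tup FROM pg_stat_user_tables;
# Here they are hard-coded; the threshold is illustrative.

DEAD_RATIO_THRESHOLD = 0.2  # vacuum when more than 20% of tuples are dead

def tables_needing_vacuum(stats, threshold=DEAD_RATIO_THRESHOLD):
    """Return names of tables whose dead/(live+dead) ratio exceeds the threshold."""
    flagged = []
    for relname, live, dead in stats:
        total = live + dead
        if total > 0 and dead / total > threshold:
            flagged.append(relname)
    return flagged

sample = [
    ("orders", 90_000, 40_000),    # ~31% dead tuples -> flag for vacuum
    ("customers", 50_000, 5_000),  # ~9% dead tuples  -> fine
]
```

The same check could then feed a cron job (as the responsibilities list mentions) that runs `VACUUM (VERBOSE, ANALYZE)` on the flagged tables during a low-traffic window.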
5.0 - 10.0 years
22 - 32 Lacs
Pune, Bengaluru, Delhi / NCR
Hybrid
Role and Responsibilities:
- Model complex data sets in graph technology (Neo4j).
- Work collaboratively with the team to plan and solve complex problems, such as fraud investigations across different products and transactions.
- Apply expertise in graph modeling based on business rules.
- Work with business teams to develop analytical visualizations, especially identifying fraud rings through linked-party analysis using Neo4j.
- Build and set up Cypher queries that can be reused by business operations teams.
- Develop customizations based on evolving fraud trends as defined by the business team.
- Work as part of the product team, alongside Business Analysts, in building capabilities defined by the business teams.
- Establish a close partnership with the client-side Neo4j team for infra setup and requirement specification activities.

Candidate Profile:

Must Have:
- 2+ years hands-on experience with the Neo4j toolset: query development and visualization.
- Strong Neo4j graph database experience (minimum two years) with demonstrated hands-on development.
- Expert in graph modeling, ingestion, and Cypher/APOC.
- Experience working in a cross-functional team in a fast-paced environment with business, technology, and product teams.
- 4+ years hands-on experience with Hadoop ecosystems, including Hive, HDFS, Spark, Kafka, etc.
- Experience using the Agile approach to deliver solutions.

Nice to Have:
- Experience working in financial services and the fraud/risk analytics domain.
- Experience handling large and complex data in a Big Data environment.
- Experience designing and developing complex data ingestion and transformation routines.
- Understanding of data warehouse and data lake design, standards, and best practices.
- Strong record of achievement, solid analytical ability, and an entrepreneurial, hands-on approach to work.
- Outstanding written and verbal communication skills.
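Linked-party analysis of the kind this role describes connects accounts that share identifiers (phone, address, device) and looks for suspiciously large connected clusters, the "fraud rings." A minimal pure-Python sketch of that idea, with a comment noting the Cypher shape it would take in Neo4j (all account names, identifiers, and the ring-size threshold are illustrative, not the employer's actual model):

```python
# Sketch: find "fraud rings" as connected components of accounts that share
# an identifier. In Neo4j the equivalent would be a Cypher pattern such as:
#   MATCH (a:Account)-[:USES]->(i:Identifier)<-[:USES]-(b:Account)
# followed by a community/component step; everything below is illustrative.

from collections import defaultdict

def fraud_rings(accounts, min_size=3):
    """Group accounts sharing any identifier; return clusters of >= min_size."""
    # Index: identifier -> set of accounts using it.
    by_id = defaultdict(set)
    for acct, ids in accounts.items():
        for ident in ids:
            by_id[ident].add(acct)

    # Traverse connected components over the shared-identifier links.
    seen, rings = set(), []
    for acct in accounts:
        if acct in seen:
            continue
        stack, component = [acct], set()
        while stack:
            cur = stack.pop()
            if cur in component:
                continue
            component.add(cur)
            for ident in accounts[cur]:
                stack.extend(by_id[ident] - component)
        seen |= component
        if len(component) >= min_size:
            rings.append(sorted(component))
    return rings

accounts = {
    "A1": {"phone:111", "addr:X"},
    "A2": {"phone:111"},          # linked to A1 by phone
    "A3": {"addr:X", "addr:Y"},   # linked to A1 by address
    "A4": {"addr:Z"},             # isolated account, not a ring
}
```

In practice the graph lives in Neo4j and the operations team reruns parameterized Cypher queries rather than Python, but the clustering logic is the same.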