2.0 years
0 Lacs
Jaipur, Rajasthan
On-site
Full Time | Jaipur

About Khushi Baby

Khushi Baby, a nonprofit organization in India, serves as a technical partner to health departments. Established in 2016 from a Yale University classroom, it has grown into a 90+ member team with offices in Jaipur, Udaipur, Delhi, and Bengaluru. Khushi Baby focuses on digital health solutions, health program strengthening, and R&D. Its flagship platform, the Community Health Integrated Platform (CHIP), supports over 70,000 community health workers across 40,000 villages, reaching 45 million beneficiaries. The platform has identified and monitored 5+ million high-risk individuals, with the Ministry of Health allocating ₹160 crore ($20M) for its scale-up. CHIP has enabled initiatives like Rajasthan's digital health census, TB case finding, vector-borne disease surveillance, labor room monitoring, and immunization drives, co-designed with extensive field input. In R&D, Khushi Baby advances community-level geospatial analysis and individual health diagnostics, including smartphone-based tools and low-literacy models. Programmatically, it focuses on maternal health, child malnutrition, and zero-dose children. Backed by donors like GAVI, the Skoll Foundation, and CSR funding, Khushi Baby partners with IITs, AIIMS Jodhpur, J-PAL South Asia, MIT, Microsoft Research, WHO, and multiple state governments. Khushi Baby seeks skilled, creative, and driven candidates eager to make a large-scale public health impact by joining its interdisciplinary team in policy, design, development, implementation, and data science.

Job Overview

We are looking for a Lead Data Engineer to design, build, and optimize scalable data systems for public health analytics. You will define data workflows, layer architecture, and pipelines, ensuring data quality, security, and efficiency while leading a team of engineers.

Key Responsibilities

1. Data Architecture & Infrastructure
- Plan, define, and implement scalable, efficient data architectures, modeling strategies, and workflows to support experimentation, analytics, and product development.
- Develop and manage real-time and batch processing systems (e.g., Kafka, Flink, RisingWave) to support agile experimentation (A/B testing) and analytics at scale.
- Strategically anticipate and plan for future data infrastructure needs, ensuring scalability, performance, and cost-effectiveness on cloud platforms (AWS, GCP, Azure).
- Build and optimize ETL/ELT pipelines for structured and unstructured data, integrating data from diverse public health sources, including state and national health portals.

2. Data Quality, Security & Compliance
- Lead the design and implementation of robust data quality protocols, embedding quality assurance at every stage, from data collection to ingestion and processing.
- Proactively shape data collection processes to embed data integrity and standardization from the source.
- Ensure security, privacy, and compliance with public health data standards such as FHIR, HL7, and ICD-10.
- Implement robust access control, encryption, and compliance frameworks to meet industry and public health regulatory requirements.

3. Collaboration & Impact
- Work closely with product, design, and field implementation teams to co-define indicators, monitor product performance, and refine tools based on real-time feedback and experimentation.
- Translate technical concepts into actionable, business-friendly insights for cross-functional stakeholders.
- Contribute to defining and iterating KPIs and success metrics through data-driven insights.

4. Team Leadership & Mentorship
- Manage and mentor data engineers and analysts to develop technical depth, promote a culture of experimentation, and enhance data literacy across the organization.
- Foster innovation, continuous learning, and the adoption of emerging technologies within the data team.

5. Technical Optimization & Documentation
- Conduct performance tuning for databases, queries, and data pipelines.
- Monitor and optimize cloud usage for performance and cost efficiency.
- Ensure strong documentation of data architectures, workflows, lineage, cataloging, and metadata management processes.

Required Qualifications
- Master's degree in Computer Science, Data Engineering, or a related field.
- 7+ years of experience in data engineering, with 2+ years in a leadership role.
- Expertise in SQL, Python, data modeling, and pipeline orchestration (Airflow, Mage AI, etc.).
- Experience with big data technologies such as Apache Iceberg or Delta Lake.
- Knowledge of streaming & CDC tools (Kafka, Debezium, Redpanda, etc.).
- Deep understanding of data engineering best practices, including partitioning, indexing, caching, and compression techniques.
- Strong problem-solving, logical reasoning, communication, and leadership abilities.
- Hands-on experience with cloud data services (AWS, GCP, Azure).

Preferred / Good to Have
- Experience working on public health data projects, with knowledge of key health indicators and metrics.
- Familiarity with data lakehouse architectures and federated learning frameworks.
- Strong understanding of public health data interoperability standards (FHIR, HL7, ICD-10).
- Passion for using data to drive impact in the health and development sectors.

Remuneration

The remuneration offered will be in the range of 15–25 LPA, depending on the candidate's experience, skill set, and evaluation based on our internal parameters.
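As a toy illustration of one idea in the responsibilities above, embedding quality assurance directly into an ETL step means bad records are quarantined with reasons rather than silently loaded. This is a hedged sketch, not the organization's actual pipeline: the record shape, field names, and validation rules are invented for the example.

```python
# Sketch: an ETL batch step with embedded data-quality checks.
# Field names and rules are hypothetical, for illustration only.
REQUIRED_FIELDS = ("beneficiary_id", "village", "visit_date")

def validate(record: dict) -> list[str]:
    """Return a list of quality violations for one record."""
    errors = [f"missing {f}" for f in REQUIRED_FIELDS if not record.get(f)]
    if record.get("age") is not None and not (0 <= record["age"] <= 120):
        errors.append("age out of range")
    return errors

def etl_batch(records: list[dict]):
    """Split a batch into clean rows and quarantined rows with reasons."""
    clean, quarantined = [], []
    for rec in records:
        errs = validate(rec)
        if errs:
            quarantined.append((rec, errs))
        else:
            clean.append(rec)
    return clean, quarantined

batch = [
    {"beneficiary_id": "B1", "village": "V1", "visit_date": "2024-01-05", "age": 29},
    {"beneficiary_id": "", "village": "V2", "visit_date": "2024-01-06", "age": 200},
]
clean, bad = etl_batch(batch)
print(len(clean), len(bad))  # 1 clean row, 1 quarantined row
```

Quarantining (rather than dropping) preserves the failing rows and their violation reasons for later review, which supports the "quality assurance at every stage" goal described above.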
Benefits
- Medical insurance
- Flexible work policies for those menstruating, observing religious fasts, or needing time to grieve the loss of a loved one
- Monthly field visits, annual retreat
- Learning opportunities with world-class research institutions (Yale, Harvard)
- Learning Stipend Policy
- Sponsored Workshops and Seminars

How to Apply

To apply for the above position, share your CV at careers@khushibaby.org. Due to the high number of applicants, we will only reach out to those who are shortlisted. Rest assured that your application will be carefully reviewed, and if you are shortlisted, you will receive a call or email from us.
Posted 3 days ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
At least 7 years of experience is required in the Spring ecosystem (Boot, Data, Security, WebFlux [most important]).
- Modern Architecture & Design (9/10): Proficient in designing microservices (2-5 years).
- Cloud-Native & DevOps (9/10): Hands-on experience with cloud, Kubernetes, and CI/CD pipelines (2-5 years).
- Core Technical Foundations (8/10): Advanced proficiency in algorithms & data structures and practical Python skills.
- Data & Search Technologies (8/10): Experience with data streaming (Kafka), search (Apache Solr), and RDBMS.
- NoSQL Experience (7/10): Familiarity with NoSQL databases like MongoDB is a strong plus.

The resource must be hands-on in SQL and should be able to write queries covering: DDL, DML, views, CTEs, operators, aggregate functions, window functions, stored procedures, and so on. Knowledge of a scripting language is desirable. Knowledge of DB administration is an additional plus but not mandatory. 7+ years of experience in SQL/NoSQL.
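A small worked example of two of the SQL areas listed above (a CTE combined with a window function), run here through Python's bundled SQLite so it is self-contained. The table and data are invented for illustration; window functions require SQLite 3.25 or newer.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# DDL: define a table
cur.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")

# DML: insert rows
cur.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("north", 100), ("north", 300), ("south", 200), ("south", 50)],
)

# CTE + window function: top sale per region via ROW_NUMBER() OVER (PARTITION BY ...)
rows = cur.execute("""
    WITH ranked AS (
        SELECT region, amount,
               ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) AS rn
        FROM sales
    )
    SELECT region, amount FROM ranked WHERE rn = 1 ORDER BY region
""").fetchall()

print(rows)  # [('north', 300), ('south', 200)]
```

The same pattern (rank within a partition, then filter on the rank) is a standard interview-style exercise for the window-function skills the posting asks for.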
Posted 3 days ago
0 years
0 Lacs
Greater Kolkata Area
On-site
Job Summary

We are seeking a forward-thinking AI Architect to design, lead, and scale enterprise-grade AI systems and solutions across domains. This role demands deep expertise in machine learning, generative AI, data engineering, cloud-native architecture, and orchestration frameworks. You will collaborate with cross-functional teams to translate business requirements into intelligent, production-ready AI solutions.

Key Responsibilities

Architecture & Strategy
- Design end-to-end AI architectures that include data pipelines, model development, MLOps, and inference serving.
- Create scalable, reusable, and modular AI components for different use cases (vision, NLP, time series, etc.).
- Drive architecture decisions across AI solutions, including multi-modal models, LLMs, and agentic workflows.
- Ensure interoperability of AI systems across cloud (AWS/GCP/Azure), edge, and hybrid environments.

Technical Leadership
- Guide teams in selecting appropriate models (traditional ML, deep learning, transformers, etc.) and technologies.
- Lead architectural reviews and ensure compliance with security, performance, and governance policies.
- Mentor engineering and data science teams in best practices for AI/ML, GenAI, and MLOps.

Model Lifecycle & Engineering
- Oversee implementation of the model lifecycle using CI/CD for ML (MLOps) and/or LLMOps workflows.
- Define architecture for Retrieval-Augmented Generation (RAG), vector databases, embeddings, prompt engineering, etc.
- Design pipelines for fine-tuning, evaluation, monitoring, and retraining of models.

Data & Infrastructure
- Collaborate with data engineers to ensure data quality, feature pipelines, and scalable data stores.
- Architect systems for synthetic data generation, augmentation, and real-time streaming inputs.
- Define solutions leveraging data lakes, data warehouses, and graph databases.

Client Engagement / Product Integration
- Interface with business/product stakeholders to align AI strategy with KPIs.
- Collaborate with DevOps teams to integrate models into products via APIs/microservices.

Required Skills & Experience

Core Skills
- Strong foundation in AI/ML/DL (scikit-learn, TensorFlow, PyTorch, Transformers, LangChain, etc.)
- Advanced knowledge of generative AI (LLMs, diffusion models, multimodal models, etc.)
- Proficiency in cloud-native architectures (AWS/GCP/Azure) and containerization (Docker, Kubernetes)
- Experience with orchestration frameworks (Airflow, Ray, LangGraph, or similar)
- Familiarity with vector databases (Weaviate, Pinecone, FAISS), LLMOps platforms, and RAG design

Architecture & Programming
- Solid experience in architectural patterns (microservices, event-driven, serverless)
- Proficient in Python and optionally Java/Go
- Knowledge of APIs (REST, GraphQL), streaming (Kafka), and observability tooling (Prometheus, ELK, Grafana)

Tools & Platforms
- ML lifecycle tools: MLflow, Kubeflow, Vertex AI, SageMaker, Hugging Face, etc.
- Prompt orchestration tools: LangChain, CrewAI, Semantic Kernel, DSPy (nice to have)
- Knowledge of security, privacy, and compliance (GDPR, SOC 2, HIPAA, etc.)
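As a toy illustration of the retrieval step in the RAG design mentioned above: documents are ranked by cosine similarity between embedding vectors, and the best match is passed as context to the LLM. The 3-dimensional vectors and document names here are invented; a real system would use a trained embedding model and a vector database such as those listed in the requirements.

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical document embeddings (tiny, hand-made for the example).
docs = {
    "doc_a": [1.0, 0.0, 0.2],
    "doc_b": [0.1, 0.9, 0.3],
    "doc_c": [0.9, 0.1, 0.1],
}
query = [1.0, 0.1, 0.1]

# Retrieve the top-1 document to include as context in the prompt.
best = max(docs, key=lambda d: cosine(query, docs[d]))
print(best)
```

The vector databases named in the posting (Weaviate, Pinecone, FAISS) exist to make exactly this nearest-neighbor lookup fast at millions-of-documents scale.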
Posted 3 days ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description

About Oracle Analytics & Big Data Service: Oracle Analytics is a complete platform that supports every role within analytics, offering cloud-native services or on-premises solutions without compromising security or governance. Our platform delivers a unified system for managing everything from data collection to decision-making, with seamless integration of AI and machine learning to help businesses accelerate productivity and uncover critical insights.

Oracle Big Data Service, a part of Oracle Analytics, is a fully managed, automated cloud service designed to help enterprises create scalable Hadoop-based data lakes. The service's scope encompasses not just close integration with OCI's native infrastructure (security, cloud, storage, etc.) but also deep integration with other relevant cloud-native services in OCI. This includes cloud-native approaches to service-level patching and upgrades, and maintaining high availability of the service in the face of random failures and planned downtimes in the underlying infrastructure (e.g., patching Linux kernels to address a security vulnerability). Developing systems to monitor and collect telemetry on the service's runtime characteristics, and to act on that telemetry data, is also part of the charter.

We are interested in experienced engineers with expertise and passion for solving difficult problems in distributed systems and highly available services to join our Oracle Big Data Service team. In this role, you will be instrumental in building, maintaining, and enhancing our managed, cloud-native Big Data service focused on large-scale data processing and analytics. At Oracle, you can help shape, design, and build innovative new systems from the ground up. These are exciting times in our space: we are growing fast, still at an early stage, and working on ambitious new initiatives. Engineers at any level can have significant technical and business impact.
Minimum Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
- Minimum of 3-5 years of experience in software development, with a focus on large-scale distributed systems, cloud services, or Big Data technologies.
- Must hold a US passport; this is required by the position to access US Government regions.
- Expertise in coding in Java and Python, with an emphasis on tuning/optimization.
- Experience with Linux systems administration, troubleshooting, and security best practices in cloud environments.
- Familiarity with containerization and orchestration technologies like Docker and Kubernetes.
- Experience with automation tools and techniques for deployment, patching, and service monitoring.
- Experience with open-source software in the Big Data ecosystem.
- Experience at an organization with an operational/DevOps culture.
- Solid understanding of networking, storage, and security components related to cloud infrastructure.
- Solid foundation in data structures, algorithms, and software design, with strong analytical and debugging skills.

Preferred Qualifications:
- Proven expertise in cloud-native architectures and services, preferably within Oracle Cloud Infrastructure (OCI), AWS, Azure, or GCP.
- Deep understanding of Java and JVM mechanics.
- Hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, YARN), Spark, Kafka, Flink, and other big data technologies is a plus.
- Excellent problem-solving skills and the ability to work in a fast-paced, agile environment.

Responsibilities

Key Responsibilities:
- Participate in the design, development, and maintenance of a scalable and secure Hadoop-based data lake service.
- Code, integrate, and operationalize open and closed source data ecosystem components for Oracle cloud service offerings.
- Collaborate with cross-functional teams including DevOps, Security, and Product Management to define and execute product roadmaps, service updates, and feature enhancements.
- Become an active member of the Apache open source community when working on open source components.
- Implement high availability strategies and design patterns to manage system failures and ensure uninterrupted service during planned or unplanned infrastructure downtimes.
- Ensure compliance with security protocols and industry best practices when handling large-scale data processing in the cloud.

Qualifications

Career Level - IC3

About Us

As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
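One of the high-availability design patterns mentioned in the responsibilities above is retrying transient failures with exponential backoff, so short infrastructure blips don't surface as outages. This is a generic, hedged sketch rather than Oracle's implementation; the flaky operation and timing parameters are invented for the example.

```python
import time

def with_retries(op, attempts=4, base_delay=0.01):
    """Call op(), retrying transient ConnectionErrors with exponential backoff."""
    for i in range(attempts):
        try:
            return op()
        except ConnectionError:
            if i == attempts - 1:
                raise  # budget exhausted: surface the failure
            time.sleep(base_delay * (2 ** i))  # 0.01s, 0.02s, 0.04s, ...

# Simulated flaky dependency: fails twice, then recovers.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

result = with_retries(flaky)
print(result)  # "ok", reached on the third attempt
```

Real services typically add jitter to the delay and cap the total retry budget so that many clients retrying in lockstep don't create a thundering-herd problem.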
Posted 3 days ago
1.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description

About Oracle Analytics & Big Data Service: Oracle Analytics is a complete platform that supports every role within analytics, offering cloud-native services or on-premises solutions without compromising security or governance. Our platform delivers a unified system for managing everything from data collection to decision-making, with seamless integration of AI and machine learning to help businesses accelerate productivity and uncover critical insights.

Oracle Big Data Service, a part of Oracle Analytics, is a fully managed, automated cloud service designed to help enterprises create scalable Hadoop-based data lakes. The service's scope encompasses not just close integration with OCI's native infrastructure (security, cloud, storage, etc.) but also deep integration with other relevant cloud-native services in OCI. This includes cloud-native approaches to service-level patching and upgrades, and maintaining high availability of the service in the face of random failures and planned downtimes in the underlying infrastructure (e.g., patching Linux kernels to address a security vulnerability). Developing systems to monitor and collect telemetry on the service's runtime characteristics, and to act on that telemetry data, is also part of the charter.

We are interested in experienced engineers with expertise and passion for solving difficult problems in distributed systems and highly available services to join our Oracle Big Data Service team. In this role, you will be instrumental in building, maintaining, and enhancing our managed, cloud-native Big Data service focused on large-scale data processing and analytics. At Oracle, you can help shape, design, and build innovative new systems from the ground up. These are exciting times in our space: we are growing fast, still at an early stage, and working on ambitious new initiatives. Engineers at any level can have significant technical and business impact.
Minimum Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
- Minimum of 1-2 years of experience in software development, with a focus on large-scale distributed systems, cloud services, or Big Data technologies.
- Must hold a US passport; this is required by the position to access US Government regions.
- Expertise in coding in Java and Python, with an emphasis on tuning/optimization.
- Experience with Linux systems administration, troubleshooting, and security best practices in cloud environments.
- Experience with open-source software in the Big Data ecosystem.
- Experience at an organization with an operational/DevOps culture.
- Solid understanding of networking, storage, and security components related to cloud infrastructure.
- Solid foundation in data structures, algorithms, and software design, with strong analytical and debugging skills.

Preferred Qualifications:
- Hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, YARN), Spark, Kafka, Flink, and other big data technologies.
- Proven expertise in cloud-native architectures and services, preferably within Oracle Cloud Infrastructure (OCI), AWS, Azure, or GCP.
- In-depth understanding of Java and JVM mechanics.
- Good problem-solving skills and the ability to work in a fast-paced, agile environment.

Responsibilities

Key Responsibilities:
- Participate in the development and maintenance of a scalable and secure Hadoop-based data lake service.
- Code, integrate, and operationalize open and closed source data ecosystem components for Oracle cloud service offerings.
- Collaborate with cross-functional teams including DevOps, Security, and Product Management to define and execute product roadmaps, service updates, and feature enhancements.
- Become an active member of the Apache open source community when working on open source components.
- Ensure compliance with security protocols and industry best practices when handling large-scale data processing in the cloud.
Qualifications

Career Level - IC2

About Us

As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 3 days ago
3.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
DevOps Engineer : Bangalore

Job Description

DevOps Engineer, Qilin Lab, Bangalore, India

Role

We are seeking an experienced DevOps Engineer to deliver insights from massive-scale data in real time. Specifically, we're searching for someone who has fresh ideas and a unique viewpoint, and who enjoys collaborating with a cross-functional team to develop real-world solutions and positive user experiences for everyone.

Responsibilities of this role:
- Work with DevOps to run the production environment by monitoring availability and taking a holistic view of system health.
- Build software and systems to manage our Data Platform infrastructure.
- Improve reliability, quality, and time-to-market of our Global Data Platform.
- Measure and optimize system performance and innovate for continual improvement.
- Provide operational support and engineering for a distributed platform at scale.
- Define, publish, and defend service-level objectives (SLOs).
- Partner with data engineers to improve services through rigorous testing and release procedures.
- Participate in system design, platform management, and capacity planning.
- Create sustainable systems and services through automation and automated runbooks.
- Take a proactive approach to identifying problems and seeking areas for improvement.
- Mentor the team in infrastructure best practices.

Qualifications:
- Bachelor's degree in Computer Science or an IT-related field, or equivalent practical experience with a proven track record.
- Hands-on working knowledge and experience with Kubernetes, EC2, RDS, the ELK Stack, and cloud platforms (AWS, Azure, GCP), preferably AWS.
- Experience building and operating clusters, with related technologies such as containers, Helm, Kustomize, and Argo CD.
- Ability to program (structured and OOP) using at least one high-level language such as Python, Java, Go, etc.
- Agile methodologies (Scrum, TDD, BDD, etc.).
- Continuous integration and continuous delivery tools (GitOps), Terraform, Unix/Linux environments.

Experience with several of the following tools/technologies is desirable:
- Big Data platforms (e.g., Apache Hadoop and Apache Spark)
- Streaming technologies (Kafka, Kinesis, etc.)
- ElasticSearch Service, Mesh
- Orchestration technologies, e.g., Argo

Knowledge of the following is a plus:
- Security (OWASP, SIEM, etc.)
- Infrastructure testing (Chaos, Load, Security)
- GitHub, microservices architectures

Notice period: Immediate to 15 days
Experience: 3 to 5 years
Job Type: Full-time
Schedule: Day shift, Monday to Friday
Work Location: On Site
Job Type: Payroll

Must Have Skills
- Python - 3 Years - Intermediate
- DevOps - 3 Years - Intermediate
- AWS - 2 Years - Intermediate
- Agile Methodology - 3 Years - Intermediate
- Kubernetes - 3 Years - Intermediate
- ElasticSearch - 3 Years - Intermediate
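Defining and defending SLOs, as the responsibilities above require, usually starts with computing the error budget: the downtime an objective permits over a window. A quick hedged sketch of that arithmetic follows; the 99.9% target and 30-day window are example figures, not values from the posting.

```python
def error_budget_minutes(slo: float, window_days: int) -> float:
    """Allowed downtime (in minutes) for a given availability SLO and window."""
    total_minutes = window_days * 24 * 60
    return total_minutes * (1 - slo)

# Example: a 99.9% availability SLO over a 30-day window.
budget = error_budget_minutes(0.999, 30)
print(round(budget, 1))  # about 43.2 minutes of allowed downtime
```

Once the budget is known, "defending" the SLO becomes concrete: if incidents have consumed most of the 43 minutes, risky releases are paused until the window rolls forward.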
Posted 3 days ago
10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Strong functional testing experience with 10+ years. Experience in Kafka validations, AWS microservices, Google lenses, etc.
- Understand the project and business requirements.
- Develop the test strategy/test plan; create test scenarios/cases.
- Review project deliverables.
- Execute test cases; log and retest defects.
- Highlight key issues/challenges with project stakeholders, incumbent vendors, and developers.
- Manage multiple priorities, customer expectations, and relationships with all stakeholders.
- Manage relationships with customers, project managers, and all levels of management.
Posted 3 days ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Total Experience: 1+ years
Notice Period: Immediate to a maximum of 45 days
Mode of Hire: Permanent

Required Skills (Mandatory): Object-oriented programming, data structures, algorithms, software design, and database systems.

Desired Skills (Good if you have): ElasticSearch, MongoDB, MySQL, Redis, Spring Boot, crawling/web scraping, programming languages (C, C++, Java, Groovy).

Job Responsibilities
- Collaborate with managers and other engineers to help define, scope, and implement high-quality features that solve critical user needs.
- Break down requirements into architecture and deliver code, while keeping operational issues in mind.
- Own end-to-end responsibility, right from the requirement to the release.
- Write clear documentation so that other engineers can jump in and get things done.
- Actively participate in design and code reviews.
- Help take Tracxn to the next level as a world-class engineering team.

Job Requirements
- Experience with building backend services.
- Strong algorithm and CS skills.
- 1+ years of experience designing and implementing large-scale distributed systems.
- Experience with multiple programming languages (Groovy, Java) and data stores (MySQL, MongoDB, Redis, etc.).
- Proven ability to work in a fast-paced, agile, ownership-driven, and results-oriented culture.
- Strong problem-solving and analytical skills.

Culture
- Work with performance-oriented teams driven by ownership and passion.
- Learn to design systems for high accuracy, efficiency, and scalability.
- No strict deadlines; the focus is on delivering quality work.
- Meritocracy-driven, candid culture. No politics.
- Very high visibility regarding which startups and markets are exciting globally.

About Tracxn

Tracxn (Tracxn.com) is a Bangalore-based product company providing a research and deal-sourcing platform for Venture Capital, Private Equity, Corp Dev, and professionals working around the startup ecosystem. We are a team of 600+ working professionals serving customers across the globe.
Our clients include funds like Andreessen Horowitz, Matrix Partners, and GGV Capital, and large corporates such as Citi and Embraer.

Founders
- Neha Singh (ex-Sequoia, BCG | MBA - Stanford GSB)
- Abhishek Goyal (ex-Accel Partners, Amazon | BTech - IIT Kanpur)

About Technology Team

Tracxn's Technology team is 50+ members strong and growing. The technology team is subdivided into multiple smaller teams, each of which owns one or more services/components of the technology platform. Ours is a young team of motivated engineers with a minimal management structure, where almost everyone is actively involved in technical development and design activities. We have a team-centric culture where the ownership and responsibility of a feature or module lie with a team rather than an individual. We work on an array of technologies, including but not limited to ReactJS, Next.js, Storybook, Webpack, Node, Mongo, AWS Lambda, Spring, Elastic Stack, MySQL, Kafka, Redis, Ansible, etc. We value ownership, continuous learning, consistency, and discipline as a team.
Posted 3 days ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description

About Oracle Analytics & Big Data Service: Oracle Analytics is a complete platform that supports every role within analytics, offering cloud-native services or on-premises solutions without compromising security or governance. Our platform delivers a unified system for managing everything from data collection to decision-making, with seamless integration of AI and machine learning to help businesses accelerate productivity and uncover critical insights.

Oracle Big Data Service, a part of Oracle Analytics, is a fully managed, automated cloud service designed to help enterprises create scalable Hadoop-based data lakes. The service's scope encompasses not just close integration with OCI's native infrastructure (security, cloud, storage, etc.) but also deep integration with other relevant cloud-native services in OCI. This includes cloud-native approaches to service-level patching and upgrades, and maintaining high availability of the service in the face of random failures and planned downtimes in the underlying infrastructure (e.g., patching Linux kernels to address a security vulnerability). Developing systems to monitor and collect telemetry on the service's runtime characteristics, and to act on that telemetry data, is also part of the charter.

We are interested in experienced engineers with expertise and passion for solving difficult problems in distributed systems and highly available services to join our Oracle Big Data Service team. In this role, you will be instrumental in building, maintaining, and enhancing our managed, cloud-native Big Data service focused on large-scale data processing and analytics. At Oracle, you can help shape, design, and build innovative new systems from the ground up. These are exciting times in our space: we are growing fast, still at an early stage, and working on ambitious new initiatives. Engineers at any level can have significant technical and business impact.
Minimum Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
- Minimum of 3-5 years of experience in software development, with a focus on large-scale distributed systems, cloud services, or Big Data technologies.
- Must hold a US passport; this is required by the position to access US Government regions.
- Expertise in coding in Java and Python, with an emphasis on tuning/optimization.
- Experience with Linux systems administration, troubleshooting, and security best practices in cloud environments.
- Familiarity with containerization and orchestration technologies like Docker and Kubernetes.
- Experience with automation tools and techniques for deployment, patching, and service monitoring.
- Experience with open-source software in the Big Data ecosystem.
- Experience at an organization with an operational/DevOps culture.
- Solid understanding of networking, storage, and security components related to cloud infrastructure.
- Solid foundation in data structures, algorithms, and software design, with strong analytical and debugging skills.

Preferred Qualifications:
- Proven expertise in cloud-native architectures and services, preferably within Oracle Cloud Infrastructure (OCI), AWS, Azure, or GCP.
- Deep understanding of Java and JVM mechanics.
- Hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, YARN), Spark, Kafka, Flink, and other big data technologies is a plus.
- Excellent problem-solving skills and the ability to work in a fast-paced, agile environment.

Responsibilities

Key Responsibilities:
- Participate in the design, development, and maintenance of a scalable and secure Hadoop-based data lake service.
- Code, integrate, and operationalize open and closed source data ecosystem components for Oracle cloud service offerings.
- Collaborate with cross-functional teams including DevOps, Security, and Product Management to define and execute product roadmaps, service updates, and feature enhancements.
- Become an active member of the Apache open-source community when working on open-source components.
- Implement high-availability strategies and design patterns to manage system failures and ensure uninterrupted service during planned or unplanned infrastructure downtime.
- Ensure compliance with security protocols and industry best practices when handling large-scale data processing in the cloud.

Qualifications
Career Level - IC3

About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency, and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 3 days ago
1.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description: see the Oracle Analytics & Big Data Service description in the first posting above.
Minimum Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related technical field.
- 1-2 years of experience in software development, with a focus on large-scale distributed systems, cloud services, or Big Data technologies.
- US passport holder (required by the position to access US Government regions).
- Expertise in coding in Java and Python, with an emphasis on tuning/optimization.
- Experience with Linux systems administration, troubleshooting, and security best practices in cloud environments.
- Experience with open-source software in the Big Data ecosystem.
- Experience at an organization with an operational/DevOps culture.
- Solid understanding of the networking, storage, and security components of cloud infrastructure.
- Solid foundation in data structures, algorithms, and software design, with strong analytical and debugging skills.

Preferred Qualifications:
- Hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, YARN), Spark, Kafka, Flink, and other big data technologies.
- Proven expertise in cloud-native architectures and services, preferably within Oracle Cloud Infrastructure (OCI), AWS, Azure, or GCP.
- In-depth understanding of Java and JVM mechanics.
- Good problem-solving skills and the ability to work in a fast-paced, agile environment.

Responsibilities

Key Responsibilities:
- Participate in the development and maintenance of a scalable and secure Hadoop-based data lake service.
- Code, integrate, and operationalize open- and closed-source data ecosystem components for Oracle cloud service offerings.
- Collaborate with cross-functional teams including DevOps, Security, and Product Management to define and execute product roadmaps, service updates, and feature enhancements.
- Become an active member of the Apache open-source community when working on open-source components.
- Ensure compliance with security protocols and industry best practices when handling large-scale data processing in the cloud.
Qualifications
Career Level - IC2
Posted 3 days ago
3.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Job Description: see the Oracle Analytics & Big Data Service description in the first posting above.
Minimum Qualifications, Preferred Qualifications, and Key Responsibilities: as in the IC3 posting above.
Qualifications
Career Level - IC3
Posted 3 days ago
1.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Job Description: see the Oracle Analytics & Big Data Service description in the first posting above.
Minimum Qualifications, Preferred Qualifications, and Key Responsibilities: as in the IC2 posting above.
Qualifications
Career Level - IC2
Posted 3 days ago
4.0 - 7.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: Lead Full Stack Developer
Experience: 4-7 Years
Location: Gurgaon (Hybrid)

Key Responsibilities
- Lead the design and development of full stack applications using React.js, Next.js, and Spring Boot.
- Build and maintain scalable, high-performance services using Java, MongoDB, and Kafka.
- Drive front-end architecture and design with React, Next.js, HTML5, CSS3, and modern JavaScript (ES6+).
- Implement RESTful APIs and integrate with third-party services and AWS cloud components.
- Apply Reactive Programming principles to improve system responsiveness (bonus).
- Take ownership of technical designs and code quality, and mentor developers on best practices.
- Collaborate with DevOps teams to ensure reliable CI/CD pipelines and cloud deployments (AWS).
- Conduct code reviews, optimize performance, and ensure adherence to security and scalability standards.

Required Skills
Frontend:
- React.js & Next.js (expert)
- JavaScript (ES6+), HTML5, CSS3 (strong fundamentals)
- Webpack, Babel, and modern front-end tooling

Backend:
- Java (8+), Spring Boot (hands-on experience)
- MongoDB or similar NoSQL databases
- Kafka (event-driven architecture experience)
- RESTful APIs, microservices architecture

Others:
- AWS services (EC2, S3, Lambda, etc.)
- Git, CI/CD tools (e.g., Jenkins, GitHub Actions)
- Unit and integration testing (Jest, JUnit, etc.)
- Understanding of Reactive Programming (Project Reactor/WebFlux) — good to have

(ref:hirist.tech)
Posted 4 days ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Level AI was founded in 2019 and is a Series C startup headquartered in Mountain View, California. Level AI revolutionizes customer engagement by transforming contact centers into strategic assets. Our AI-native platform leverages advanced technologies such as Large Language Models to extract deep insights from customer interactions. By providing actionable intelligence, Level AI empowers organizations to enhance customer experience and drive growth. Consistently updated with the latest AI innovations, Level AI stands as the most adaptive and forward-thinking solution in the industry.

Overview
We are looking for a Lead Software Engineer to help raise the engineering bar for the entire technology stack at Level AI, including applications, platform, and infrastructure. They will actively collaborate with team members and the wider Level AI engineering community to develop highly scalable and performant systems. They will be a technical thought leader who will help solve complex problems of today and the future by designing and building simple and elegant technical solutions. They will coach and mentor junior engineers and drive engineering best practices. They will actively collaborate with product managers and other stakeholders both inside and outside the company.

Skills: Large-scale distributed systems, search (such as Elasticsearch), high-scale messaging systems (such as Kafka), real-time job queues, high-throughput and low-latency systems, Python, Django, relational databases (such as PostgreSQL), data modeling, DB query optimization, caching, Redis, Celery, CI/CD, GCP, Kubernetes.

Responsibilities:
- Develop and execute the technical roadmap to scale Level AI's technology stack.
- Design and build highly scalable, low-latency distributed systems to process large-scale real-time data.
- Drive best-in-class engineering practices through the software development lifecycle.
- Drive operational excellence for critical services that need high uptime.
- Collaborate with a variety of stakeholders within and outside engineering to create technical plans that deliver on important business goals, and lead their execution.
- Stay up to date with the latest technologies and thoughtfully apply them to Level AI's tech stack.

Qualifications:
- B.E/B.Tech/M.E/M.Tech/PhD from tier 1/2 engineering institutes, with relevant work experience at a top technology company.
- 5+ years of backend and infrastructure experience, with a strong track record in development, architecture, and design.
- Hands-on experience with large-scale databases, high-scale messaging systems, and real-time job queues.
- Experience navigating and understanding large-scale systems, complex code-bases, and architectural patterns.
- Experience mentoring and providing technical leadership to other engineers in the team.

Nice To Have:
- Experience with Google Cloud, Django, Postgres, Celery, Redis.
- Some experience with AI infrastructure.

Compensation: We offer market-leading compensation, based on the skills and aptitude of the candidate.

(ref:hirist.tech)
Posted 4 days ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description: see the Oracle Analytics & Big Data Service description in the first posting above.
Minimum Qualifications, Preferred Qualifications, and Key Responsibilities: as in the IC3 posting above.
Qualifications
Career Level - IC3
Posted 4 days ago
1.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description: see the Oracle Analytics & Big Data Service description in the first posting above.
Minimum Qualifications, Preferred Qualifications, and Key Responsibilities: as in the IC2 posting above.
Qualifications Career Level - IC2 About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 4 days ago
10.0 - 14.0 years
0 Lacs
punjab
On-site
About Us We are a global climate technologies company engineered for sustainability. We create sustainable and efficient residential, commercial and industrial spaces through HVACR technologies. We protect temperature-sensitive goods throughout the cold chain. And we bring comfort to people globally. Best-in-class engineering, design and manufacturing combined with category-leading brands in compression, controls, software and monitoring solutions result in next-generation climate technology that is built for the needs of the world ahead. Whether you are a professional looking for a career change, an undergraduate student exploring your first opportunity, or a recent graduate with an advanced degree, we have opportunities that will allow you to innovate, be challenged and make an impact. Join our team and start your journey today!
JOB DESCRIPTION / RESPONSIBILITIES:
We are looking for Data Engineers. Candidates must have a minimum of 10 years of experience in a Data Engineer role, including the following tools/technologies:
- Experience with relational (SQL) databases.
- Experience with data warehouses like Oracle, SQL, and Snowflake.
- Technical expertise in data modeling, data mining, and segmentation techniques.
- Experience building new and troubleshooting existing data pipelines using tools like Pentaho Data Integration (PDI), ADF (Azure Data Factory), Snowpipe, Fivetran, and DBT.
- Experience with batch and real-time data ingestion and processing frameworks.
- Experience with languages like Python, Java, etc.
- Knowledge of additional cloud-based analytics solutions, along with Kafka, Spark, and Scala, is a plus.
- Develops code and solutions that transfer/transform data across various systems.
- Maintains deep technical knowledge of various tools in the data warehouse, data hub, and analytical tools.
- Ensures data is transformed and stored in efficient methods for retrieval and use.
- Maintains data systems to ensure optimal performance.
- Develops a deep understanding of underlying business systems involved with analytical systems.
- Follows standard software development lifecycle, code control, code standards, and process standards.
- Maintains and develops technical knowledge through self-training on current toolsets and computing environments, participates in educational opportunities, maintains professional networks, and participates in professional organizations related to their tech skills.
Systems Analysis
- Works with key stakeholders to understand business needs and capture functional and technical requirements.
- Offers ideas that simplify the design and complexity of solutions delivered.
- Effectively communicates any expectations required of stakeholders or other resources during solution delivery.
- Develops and executes test plans to ensure successful rollout of solutions, including accuracy and quality of data.
Service Management
- Effectively communicates to leaders and stakeholders any obstacles that occur during solution delivery.
- Defines and manages promised delivery dates.
- Proactively researches, analyzes, and predicts operational issues, informing leadership where appropriate.
- Offers viable options to solve unexpected or unknown issues that occur during solution development and delivery.
EDUCATION / JOB-RELATED TECHNICAL SKILLS:
- Bachelor's Degree in Computer Science/Information Technology or equivalent.
- Ability to effectively communicate with others at all levels of the Company, both verbally and in writing. Demonstrates a courteous, tactful, and professional approach with employees and others.
- Ability to work in a large, global corporate structure.
Our Commitment to Our People Across the globe, we are united by a singular Purpose: Sustainability is no small ambition. That's why everything we do is geared toward a sustainable future: for our generation and all those to come.
Through groundbreaking innovations, HVACR technology and cold chain solutions, we are reducing carbon emissions and improving energy efficiency in spaces of all sizes, from residential to commercial to industrial. Our employees are our greatest strength. We believe that our culture of passion, openness, and collaboration empowers us to work toward the same goal - to make the world a better place. We invest in the end-to-end development of our people, beginning at onboarding and through senior leadership, so they can thrive personally and professionally. Flexible and competitive benefits plans offer the right options to meet your individual/family needs. We provide employees with flexible time off plans, including paid parental leave (maternal and paternal), vacation and holiday leave. Together, we have the opportunity and the power to continue to revolutionize the technology behind air conditioning, heating and refrigeration, and cultivate a better future. Learn more about us and how you can join our team! Our Commitment to Diversity, Equity & Inclusion At Copeland, we believe having a diverse, equitable and inclusive environment is critical to our success. We are committed to creating a culture where every employee feels welcomed, heard, respected, and valued for their experiences, ideas, perspectives and expertise. Ultimately, our diverse and inclusive culture is the key to driving industry-leading innovation, better serving our customers and making a positive impact in the communities where we live. Equal Opportunity Employer
Posted 4 days ago
1.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job Description About Oracle Analytics & Big Data Service: Oracle Analytics is a complete platform that supports every role within analytics, offering cloud-native services or on-premises solutions without compromising security or governance. Our platform delivers a unified system for managing everything from data collection to decision-making, with seamless integration of AI and machine learning to help businesses accelerate productivity and uncover critical insights. Oracle Big Data Service, a part of Oracle Analytics, is a fully managed, automated cloud service designed to help enterprises create scalable Hadoop-based data lakes. The service's scope covers not only tight integration with OCI's native infrastructure (security, cloud, storage, etc.) but also deep integration with other relevant cloud-native services in OCI. It includes cloud-native approaches to service-level patching and upgrades, and maintaining high availability of the service in the face of random failures and planned downtime in the underlying infrastructure (for example, patching the Linux kernel to address a security vulnerability). Building systems that monitor the service, gather telemetry on its runtime characteristics, and act on that telemetry data is also part of the charter. We are looking for experienced engineers with expertise in, and a passion for, solving difficult problems in distributed systems and highly available services to join our Oracle Big Data Service team. In this role, you will be instrumental in building, maintaining, and enhancing our managed, cloud-native Big Data service focused on large-scale data processing and analytics. At Oracle, you can help shape, design, and build innovative new systems from the ground up. These are exciting times in our space: we are growing fast, still at an early stage, and working on ambitious new initiatives. Engineers at any level can have significant technical and business impact.
Minimum Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
- 1-2 years of experience in software development, with a focus on large-scale distributed systems, cloud services, or Big Data technologies.
- US passport holder (required by the position to access US Government regions).
- Expertise in Java and Python, with an emphasis on tuning and optimization.
- Experience with Linux systems administration, troubleshooting, and security best practices in cloud environments.
- Experience with open-source software in the Big Data ecosystem.
- Experience at an organization with an operational/DevOps culture.
- Solid understanding of the networking, storage, and security components of cloud infrastructure.
- Solid foundation in data structures, algorithms, and software design, with strong analytical and debugging skills.
Preferred Qualifications:
- Hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, YARN), Spark, Kafka, Flink, and other big data technologies.
- Proven expertise in cloud-native architectures and services, preferably within Oracle Cloud Infrastructure (OCI), AWS, Azure, or GCP.
- In-depth understanding of Java and JVM mechanics.
- Good problem-solving skills and the ability to work in a fast-paced, agile environment.
Responsibilities
Key Responsibilities:
- Participate in the development and maintenance of a scalable and secure Hadoop-based data lake service.
- Code, integrate, and operationalize open- and closed-source data ecosystem components for Oracle cloud service offerings.
- Collaborate with cross-functional teams, including DevOps, Security, and Product Management, to define and execute product roadmaps, service updates, and feature enhancements.
- Become an active member of the Apache open-source community when working on open-source components.
- Ensure compliance with security protocols and industry best practices when handling large-scale data processing in the cloud.
Qualifications Career Level - IC2 About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 4 days ago
3.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job Description About Oracle Analytics & Big Data Service: Oracle Analytics is a complete platform that supports every role within analytics, offering cloud-native services or on-premises solutions without compromising security or governance. Our platform delivers a unified system for managing everything from data collection to decision-making, with seamless integration of AI and machine learning to help businesses accelerate productivity and uncover critical insights. Oracle Big Data Service, a part of Oracle Analytics, is a fully managed, automated cloud service designed to help enterprises create scalable Hadoop-based data lakes. The service's scope covers not only tight integration with OCI's native infrastructure (security, cloud, storage, etc.) but also deep integration with other relevant cloud-native services in OCI. It includes cloud-native approaches to service-level patching and upgrades, and maintaining high availability of the service in the face of random failures and planned downtime in the underlying infrastructure (for example, patching the Linux kernel to address a security vulnerability). Building systems that monitor the service, gather telemetry on its runtime characteristics, and act on that telemetry data is also part of the charter. We are looking for experienced engineers with expertise in, and a passion for, solving difficult problems in distributed systems and highly available services to join our Oracle Big Data Service team. In this role, you will be instrumental in building, maintaining, and enhancing our managed, cloud-native Big Data service focused on large-scale data processing and analytics. At Oracle, you can help shape, design, and build innovative new systems from the ground up. These are exciting times in our space: we are growing fast, still at an early stage, and working on ambitious new initiatives. Engineers at any level can have significant technical and business impact.
Minimum Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
- 3-5 years of experience in software development, with a focus on large-scale distributed systems, cloud services, or Big Data technologies.
- US passport holder (required by the position to access US Government regions).
- Expertise in Java and Python, with an emphasis on tuning and optimization.
- Experience with Linux systems administration, troubleshooting, and security best practices in cloud environments.
- Familiarity with containerization and orchestration technologies like Docker and Kubernetes.
- Experience with automation tools and techniques for deployment, patching, and service monitoring.
- Experience with open-source software in the Big Data ecosystem.
- Experience at an organization with an operational/DevOps culture.
- Solid understanding of the networking, storage, and security components of cloud infrastructure.
- Solid foundation in data structures, algorithms, and software design, with strong analytical and debugging skills.
Preferred Qualifications:
- Proven expertise in cloud-native architectures and services, preferably within Oracle Cloud Infrastructure (OCI), AWS, Azure, or GCP.
- Deep understanding of Java and JVM mechanics.
- Hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, YARN), Spark, Kafka, Flink, and other big data technologies is a plus.
- Excellent problem-solving skills and the ability to work in a fast-paced, agile environment.
Responsibilities
Key Responsibilities:
- Participate in the design, development, and maintenance of a scalable and secure Hadoop-based data lake service.
- Code, integrate, and operationalize open- and closed-source data ecosystem components for Oracle cloud service offerings.
- Collaborate with cross-functional teams, including DevOps, Security, and Product Management, to define and execute product roadmaps, service updates, and feature enhancements.
- Become an active member of the Apache open-source community when working on open-source components.
- Implement high-availability strategies and design patterns to manage system failures and ensure uninterrupted service during planned or unplanned infrastructure downtime.
- Ensure compliance with security protocols and industry best practices when handling large-scale data processing in the cloud.
Qualifications Career Level - IC3 About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
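The high-availability responsibility named in this posting (uninterrupted service through planned or unplanned infrastructure downtime) is commonly realized by failing over across replicas. A minimal illustrative sketch, with hypothetical names and no relation to Oracle's actual implementation:

```python
# Illustrative high-availability pattern: try each replica in order and fail
# over to the next when one is unavailable (e.g. down for kernel patching).
def call_with_failover(replicas, request):
    """Return the first successful response; raise if every replica fails."""
    last_error = None
    for replica in replicas:
        try:
            return replica(request)
        except ConnectionError as exc:
            last_error = exc  # replica unavailable; try the next one
    raise RuntimeError("all replicas failed") from last_error
```

Real services layer health checks, backoff, and request deadlines on top of this basic loop, but the failover decision itself stays this simple.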
Posted 4 days ago
3.0 - 7.0 years
0 Lacs
Nagpur, Maharashtra, India
On-site
Job Summary We are seeking a highly skilled Core Java Developer with experience in the WhatsApp Business API and RCS. The candidate will be responsible for designing, developing, and optimizing messaging solutions using Java-based frameworks. This role requires hands-on experience in backend development, API integrations, and cloud-based deployments.
Responsibilities:
- Develop, optimize, and maintain Java-based applications for WhatsApp Business API and RCS messaging.
- Integrate and manage the WhatsApp Cloud API and RCS messaging platforms to enhance communication solutions.
- Work with RESTful APIs, Webhooks, and WebSocket protocols for seamless messaging functionality.
- Implement secure and scalable solutions for handling high-volume messaging transactions.
- Collaborate with product, DevOps, and front-end teams to improve platform performance and reliability.
- Ensure compliance with WhatsApp and RCS guidelines and telecom regulations.
- Troubleshoot and resolve technical issues related to messaging services and third-party integrations.
- Optimize backend performance using multithreading, caching, and database indexing.
Skills & Experience:
- 3-7 years of experience in Java development, with expertise in Spring Boot, Hibernate, and Microservices.
- Hands-on experience with the WhatsApp Business API and RCS messaging.
- Strong knowledge of RESTful APIs, JSON, and Webhooks.
- Experience with MySQL, PostgreSQL, or NoSQL databases (MongoDB, Redis).
- Exposure to cloud platforms (AWS, GCP, or Azure) for deploying scalable applications.
- Familiarity with RabbitMQ, Kafka, or other message queuing systems.
- Understanding of OAuth, JWT, and security best practices.
- Experience with Docker, Kubernetes, and CI/CD pipelines is a plus.
Qualifications:
- Experience in the CPaaS (Communication Platform as a Service) domain.
- Knowledge of chatbots, AI-based messaging, and NLP integration. (ref:hirist.tech)
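Webhook-driven messaging platforms like the ones this role describes typically retry deliveries, so handlers must be idempotent. A minimal illustrative sketch of deduplicating callbacks by message ID; the field names here are hypothetical, not the actual WhatsApp Business API payload schema:

```python
# Illustrative sketch: process each webhook event at most once by tracking
# already-seen message IDs, so retried deliveries are safely ignored.
def process_webhook(event, seen_ids, handler):
    """Invoke handler only for unseen message IDs; return True if handled."""
    msg_id = event.get("message_id")
    if msg_id is None or msg_id in seen_ids:
        return False  # malformed event or duplicate delivery
    seen_ids.add(msg_id)
    handler(event)
    return True
```

In production the `seen_ids` set would live in a shared store such as Redis (with expiry) rather than process memory, which is exactly where the posting's caching skills come in.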
Posted 4 days ago
7.0 - 11.0 years
0 Lacs
noida, uttar pradesh
On-site
Job Title/Role: Tech Lead [Java & Python]
Location: Noida/Delhi NCR
Experience: 7-10 yrs
Roles & Responsibilities
- Understand the client's business use cases and technical requirements, and convert them into technical solutions that elegantly meet the requirements.
- Identify different solutions and narrow down the best option that meets the business requirements.
- Develop solution designs considering aspects like extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensure that all relevant best practices are followed.
- Develop and design the overall solution for defined functional and non-functional requirements, and define the technologies, patterns, and frameworks to materialize it.
- Understand and relate technology integration scenarios, and apply these learnings in projects.
- Excellent communication and teamwork skills.
- Resolve issues raised during code review through exhaustive, systematic analysis of the root cause, and be able to justify the decisions taken.
- Should be confident and self-driven, with plenty of initiative, and should have the zeal and energy to quickly ramp up on upcoming technologies.
- Create and contribute to an environment geared toward innovation, high productivity, high quality, and customer service.
- Experience communicating with end clients, business users, and other technical teams, and providing estimates.
Qualification
- B.Tech or MCA in computer science.
- More than 7 years of experience as a Java full-stack technologist (software development, testing, and production support), plus experience with the Python programming language and the FastAPI framework.
- Design/development experience in the Java technical stack: Java/J2EE, design patterns, Spring Framework (Core, Boot, MVC), Hibernate, JavaScript, CSS, HTML, multithreading, data structures, Kafka, and SQL.
- Experience with data analytics tools and libraries such as pandas, NumPy, and scikit-learn.
- Familiarity with content management tools, and experience integrating both relational databases (e.g., Oracle, PostgreSQL, MySQL) and non-relational databases (e.g., DynamoDB, MongoDB, Cassandra).
- An in-depth understanding of Public/Private/Hybrid Cloud solutions, and experience securely integrating public cloud into traditional hosting/delivery models, with a specific focus on AWS (S3, Lambda, API Gateway, EC2, Cloudflare).
- Working knowledge of Docker, Kubernetes, UNIX-based operating systems, and microservices.
- Should have a clear understanding of continuous integration, build, release, and code quality (GitHub/Jenkins).
- Should have experience managing teams and time-bound projects. Working in the F&B industry or aerospace could be an added advantage.
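Since the role pairs Java with Python analytics work (pandas, NumPy), the kind of aggregation involved can be shown with a pure-Python stand-in for a pandas-style groupby/mean, kept dependency-free for illustration. Column names are hypothetical:

```python
# Pure-Python stand-in for df.groupby(key)[value].mean(): average a numeric
# column per distinct group key over a list of row dictionaries.
from collections import defaultdict


def group_mean(rows, key, value):
    """Return {group: mean of `value`} over rows grouped by `key`."""
    sums, counts = defaultdict(float), defaultdict(int)
    for row in rows:
        sums[row[key]] += row[value]
        counts[row[key]] += 1
    return {k: sums[k] / counts[k] for k in sums}
```

With pandas the same result is one line (`df.groupby("store")["sales"].mean()`); the explicit loop just makes the split-apply-combine steps visible.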
Posted 4 days ago
6.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
Role Description Salesforce has immediate opportunities for software developers who want their lines of code to have significant and measurable positive impact for users, the company's bottom line, and the industry. You will be working with a group of world-class engineers to build the breakthrough features our customers will love, adopt, and use while keeping our trusted CRM platform stable and scalable. The software engineer role at Salesforce encompasses architecture, design, implementation, and testing to ensure we build products right and release them with high quality. We pride ourselves on writing high quality, maintainable code that strengthens the stability of the product and makes our lives easier. We embrace the hybrid model and celebrate the individual strengths of each team member while cultivating everyone on the team to grow into the best version of themselves. We believe that autonomous teams with the freedom to make decisions will empower the individuals, the product, the company, and the customers they serve to thrive. Your Impact As a Senior Backend Software Engineer, your job responsibilities will include: Build new and exciting components in an ever-growing and evolving market technology to provide scale and efficiency. Develop high-quality, production-ready code that millions of users of our cloud platform can use. Design, implement, and tune robust APIs and API framework-related features that perform and scale in a multi-tenant environment. Work in a Hybrid Engineering model and contribute to all phases of SDLC including design, implementation, code reviews, automation, and testing of the features. 
Build efficient components/algorithms in a microservice, multi-tenant SaaS cloud environment. Code reviews, mentoring junior engineers, and providing technical guidance to the team (depending on the seniority level). Required Skills: Mastery of multiple programming languages and platforms. 6+ years of backend software development experience, including designing and developing distributed systems at scale. Deep knowledge of object-oriented programming and other languages: Java, Python, Scala, C#, Go, Node.js, and C++. Strong PostgreSQL/SQL skills and experience with relational and non-relational databases, including writing queries. A deep understanding of software development best practices, and demonstrated leadership skills. Degree or equivalent relevant experience required. Experience will be evaluated based on the core competencies for the role (e.g. extracurricular leadership roles, military experience, volunteer roles, work experience, etc.). Preferred Skills: Experience developing SaaS products over public cloud infrastructure - AWS/Azure/GCP. Experience with Big Data/ML and S3. Hands-on experience with streaming technologies like Kafka. Experience with Elasticsearch. Experience with Terraform, Kubernetes, Docker. Experience working in a fast-paced and rapidly growing multinational organization. BENEFITS & PERKS Comprehensive benefits package including well-being reimbursement, generous parental leave, adoption assistance, fertility benefits, and more! World-class enablement and on-demand training with Trailhead.com. Exposure to executive thought leaders and regular 1:1 coaching with leadership. Volunteer opportunities and participation in our 1:1:1 model for giving back to the community. For more details, visit https://www.salesforcebenefits.com/
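A core concern when designing APIs "that perform and scale in a multi-tenant environment", as this posting puts it, is tenant isolation: every query must be scoped to the calling tenant. A minimal illustrative sketch of that idea, with a hypothetical shape that is not Salesforce's actual implementation:

```python
# Illustrative sketch: force a tenant predicate onto every query's filters so
# one tenant can never read another tenant's rows, even if the caller tries.
def scope_query(filters, tenant_id):
    """Return filters with the tenant predicate applied, rejecting overrides."""
    if not tenant_id:
        raise ValueError("tenant_id is required")
    scoped = dict(filters)
    scoped["tenant_id"] = tenant_id  # any caller-supplied tenant_id is overwritten
    return scoped
```

Placing this in one shared query layer, rather than trusting each endpoint to remember the predicate, is the usual design choice for multi-tenant SaaS backends.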
Posted 4 days ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As a Cloud Engineering Specialist at BT, you will have the opportunity to be part of a team that is shaping the future of communication services and defining how people interact with these services. Your role will involve fulfilling various requirements in Voice platforms, ensuring timely delivery and integration with other platform components. Your responsibilities will include deploying infrastructure, networking, and software packages, as well as automating deployments. You will implement up-to-date security practices and manage issue diagnosis and resolution across infrastructure, software, and networking areas. Collaboration with development, design, ops, and test teams will be essential to ensure the reliable delivery of services. To excel in this role, you should possess in-depth knowledge of Linux, server management, and issue diagnosis, along with hands-on experience. Proficiency in TCP/IP, HTTP, SIP, DNS, and Linux tooling for debugging is required. Additionally, you should be comfortable with Bash/Python scripting, have a strong understanding of Git, and experience in automation through tools like Ansible and Terraform. Your expertise should also include a solid background in cloud technologies, preferably Azure, and familiarity with container technologies such as Docker, Kubernetes, and GitOps tooling like FluxCD/ArgoCD. Exposure to CI/CD frameworks, observability tooling, RDBMS, NoSQL databases, service discovery, message queues, and Agile methodologies will be beneficial. At BT, we value inclusivity, safety, integrity, and customer-centricity. Our leadership standards emphasize building trust, owning outcomes, delivering value to customers, and demonstrating a growth mindset. We are committed to building diverse, future-ready teams where individuals can thrive and contribute positively. BT, as part of BT Group, plays a vital role in connecting people, businesses, and public services. 
We embrace diversity and inclusion in everything we do, reflecting our core values of being Personal, Simple, and Brilliant. Join us in making a difference through digital transformation, and be part of a team that empowers lives and businesses through innovative communication solutions.
Posted 4 days ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
Software Engineer The Software Engineering team delivers next-generation application enhancements and new products for a changing world. Working at the cutting edge, we design and develop software for platforms, peripherals, applications, and diagnostics, all with the most advanced technologies, tools, software engineering methodologies, and the collaboration of internal and external partners. Join us as a Software Engineer on our Software Engineering team in Bangalore to do the best work of your career and make a profound social impact. What You'll Achieve As a Software Engineer, you will be responsible for developing sophisticated systems and software based on the customer's business goals, needs, and general business environment. You will: Manage all activities necessary to take a product from concept to production, including analysing requirements, system architecture, proof of concept, contribution to the behavioural specification, defining the engineering functional plan, design, coding, and engineering interlocks with stakeholders including Product Management, the core team, cross-functional engineering teams, the Architecture Review Board, the factory, and customer support. Develop technical specifications, define proofs of concept, evaluate prototypes and make recommendations, define program scope, and drive design/engineering reviews. Experience with data center or cloud infrastructure management software, fault-tolerant systems, and high availability is a plus. The candidate should have strong leadership skills with a focus on driving results. This is an IC role with more focus on technical content. Take the first step towards your dream career. Every Dell Technologies team member brings something unique to the table. Here's what we are looking for with this role: Essential Requirements 5+ years of experience using languages like Golang or Java, with the Spring Framework for Java.
Experience with cloud platforms or platforms as a service such as Kubernetes, Cloud Foundry, Azure, AWS. Experience working with one or more of the following: Postgres, Cassandra, Redis, MongoDB, Elasticsearch. Experience working with messaging systems like RabbitMQ and Kafka. Bachelor's or Master's degree (or equivalent) in Computer Science, Computer Engineering, or a related field, with 5+ years of experience. Desirable Requirements Experience with a scripting language like Perl, Python, or Bash. Knowledge of and experience with Agile. Who We Are We believe that each of us has the power to make an impact. That's why we put our team members at the center of everything we do. If you're looking for an opportunity to grow your career with some of the best minds and most advanced tech in the industry, we're looking for you. Dell Technologies is a unique family of businesses that helps individuals and organizations transform how they work, live and play. Join us to build a future that works for everyone because Progress Takes All of Us. Application closing date: 30 March 2025. Dell Technologies is committed to the principle of equal employment opportunity for all employees and to providing employees with a work environment free of discrimination and harassment. Read the full Equal Employment Opportunity Policy here. Job ID: R262633
1.0 - 3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Position Description

Founded in 1976, CGI is among the largest independent IT and business consulting services firms in the world. With 94,000 consultants and professionals across the globe, CGI delivers an end-to-end portfolio of capabilities, from strategic IT and business consulting to systems integration, managed IT and business process services, and intellectual property solutions. CGI works with clients through a local relationship model complemented by a global delivery network that helps clients digitally transform their organizations and accelerate results. CGI's Fiscal 2024 reported revenue is CA$14.68 billion, and CGI shares are listed on the TSX (GIB.A) and the NYSE (GIB). Learn more at cgi.com.

Job Title: Test Engineer - Associate Software Engineer/Software Engineer - Java Full Stack
Position: Associate Software Engineer/Software Engineer
Experience: 1-3 years
Category: Software Development/Engineering
Shift: General shift
Main location: India - Bangalore/Hyderabad
Position ID: J0725-0474
Employment Type: Full Time

We are looking for a Java Full Stack Developer (1-3 years' experience) with strong skills in Core Java, Spring Boot, and Microservices to develop scalable backend services and contribute to frontend development. Familiarity with cloud technologies and tools like Angular, Kafka, or Camunda is a plus.

Your Future Duties and Responsibilities
- Develop, test, and maintain high-quality, scalable, and maintainable Java-based backend services using Core Java, Spring Boot, and Microservices Architecture.
- Support front-end development using modern web technologies; Angular knowledge is a plus.
- Assist in building cloud-native applications and contribute to the deployment process using cloud-based technologies.
- Participate in the end-to-end development process: analyzing requirements, designing, coding, testing, and deploying.
- Collaborate with cross-functional teams to deliver a streamlined and high-performance user experience.
- Write clean, well-documented, and tested code following industry best practices and internal coding standards.
- Participate in code reviews, debugging, and troubleshooting, ensuring code quality and consistency.
- Contribute to technical design discussions and help make decisions around backend and frontend modules.
- Support and learn from senior developers and tech leads in resolving technical challenges and project blockers.
- Demonstrate a proactive attitude and willingness to learn advanced technologies like Camunda, Kafka, Apache NiFi, and modern front-end frameworks.
- Engage in continuous learning and professional development to grow into a senior or lead role.

Required Qualifications to Be Successful in This Role

Education: Bachelor's degree in computer science or a related field, or higher, with a minimum of 2 years of relevant experience.

Must-Have Skills
- 1-3 years of hands-on experience with Core Java, Spring Boot, and Microservices Architecture
- Basic understanding of frontend development (e.g., HTML, CSS, JavaScript)
- Familiarity with REST APIs and backend integration
- Good understanding of cloud-based technologies (e.g., AWS, Azure, GCP)
- Strong problem-solving and analytical skills
- Ability to write clean, testable, and well-documented code
- Strong communication and teamwork skills

Good-to-Have Skills
- Working knowledge of Angular or similar frontend frameworks
- Exposure to Camunda, Kafka, or Apache NiFi
- Experience with unit testing frameworks (e.g., JUnit, TestNG)
- Understanding of CI/CD pipelines and version control tools like Git
- Familiarity with Agile/Scrum methodologies
- Enthusiasm to learn new technologies and grow into a senior role

CGI is an equal opportunity employer. In addition, CGI is committed to providing accommodation for people with disabilities in accordance with provincial legislation. Please let us know if you require reasonable accommodation due to a disability during any aspect of the recruitment process, and we will work with you to address your needs.

Together, as owners, let's turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect, and belonging. Here, you'll reach your full potential because:
- You are invited to be an owner from day 1 as we work together to bring our Dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction.
- Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise.
- You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons.

Come join our team, one of the largest IT and business consulting services firms in the world.