22 Job openings at Onix
About Onix

Onix is a leading provider of technology solutions that help organizations manage their digital transformation, including cloud services, data analytics, and IT consulting.

DevOps Lead

Noida, Pune, Bengaluru

8 - 11 years

INR 0.5 - 0.5 Lacs P.A.

Hybrid

Full Time

Role: DevOps Lead
Locations: Bangalore | Pune | Delhi NCR
Notice Period: Immediate to 30 Days
Experience: 7 to 12 Years

We are looking for an experienced and motivated DevOps Lead to join our high-performing engineering team, driving the development of next-generation cloud platforms on Google Cloud Platform (GCP).

Role Overview: As a DevOps Lead, you will play a critical role in architecting and implementing scalable, secure, and automated infrastructure. You'll lead by example and collaborate across teams to ensure DevOps best practices are consistently applied in our cloud-native environment.

Key Responsibilities:
- Lead the design, implementation, and maintenance of infrastructure as code using Terraform (see the sketch below).
- Provide deep technical expertise in GCP services including Compute, IAM, VPC, GKE, Cloud Functions, and others.
- Oversee and optimize CI/CD pipelines using tools such as Jenkins, GitLab CI, ArgoCD, and Spinnaker.
- Drive container orchestration with Docker and Kubernetes (GKE preferred).
- Implement automated configuration management with Ansible.
- Lead monitoring and observability initiatives using Prometheus, Grafana, Stackdriver, and the ELK Stack.
- Mentor junior engineers and collaborate with cross-functional teams to deliver high-quality infrastructure solutions.

Ideal Candidate Profile:
- 7 to 12 years of hands-on experience in DevOps, with a minimum of 2+ years in a lead or senior engineering role.
- Proven experience with Google Cloud Platform (GCP) is mandatory.
- Strong expertise in infrastructure automation, cloud architecture, and CI/CD practices.
- Demonstrated leadership in DevOps strategy and delivery within agile teams.
- Excellent communication, collaboration, and problem-solving skills.

Why Join Us?
- Lead the development of cutting-edge Generative AI solutions for real-world applications.
- Be part of a collaborative, innovative, and technology-driven team.
- Opportunity to work with advanced AI/ML tools and frameworks.
- Drive innovation through technical leadership, mentorship, and solution evangelization.
- Continuous professional growth with access to the latest AI/ML technologies and frameworks.
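As a concrete illustration of the infrastructure-as-code guardrails this role owns, here is a minimal sketch (not part of the posting) that scans a Terraform plan exported with `terraform show -json tfplan > plan.json` and fails a CI stage if any resource would be destroyed. The file name and the policy itself are assumptions.

```python
# Illustrative CI guardrail: block pipelines whose Terraform plan destroys resources.
import json
import sys

def destroyed_resources(plan_path: str) -> list[str]:
    """Return addresses of resources the exported plan would destroy."""
    with open(plan_path) as fh:
        plan = json.load(fh)
    flagged = []
    for change in plan.get("resource_changes", []):
        if "delete" in change.get("change", {}).get("actions", []):
            flagged.append(change["address"])
    return flagged

if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "plan.json"
    doomed = destroyed_resources(path)
    if doomed:
        print("Plan would destroy:", ", ".join(doomed))
        sys.exit(1)  # fail this pipeline stage
    print("No destructive changes detected.")
```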

DevOps Engineer - GCP

Pune, Bengaluru

4 - 12 years

INR 8.0 - 11.0 Lacs P.A.

Work from Office

Full Time

Proven experience as a DevOps engineer or in a similar role with a focus on monitoring and observability. Expert-level knowledge of Splunk (advanced configuration, data indexing, search optimization, and alerting). Advanced experience with Grafana for creating real-time, interactive dashboards and visualizations. Strong proficiency in Linux/Unix systems administration and scripting (Bash, Python, etc.). Solid understanding of cloud platforms like AWS, Azure, or GCP and how to integrate monitoring solutions into these environments. Experience with containerization (Docker, Kubernetes) and orchestration tools. Familiarity with Infrastructure as Code tools (Terraform, Ansible, etc.). Experience with automation tools (Jenkins, GitLab CI, etc.) for deploying and managing infrastructure. Strong problem-solving skills with the ability to troubleshoot and resolve complex technical issues in a fast-paced environment. Experience with distributed systems and knowledge of performance tuning, scaling, and high-availability setups. Preferred Skills : Experience in managing large-scale Splunk and Grafana environments. Knowledge of log aggregation technologies (Fluentd, Logstash, etc.). Familiarity with Alerting Incident Management Tools (PagerDuty, Opsgenie, etc.). Certifications in Cloud Platforms (AWS Certified DevOps Engineer, Azure DevOps Engineer, etc.). Familiarity with Agile methodologies (Scrum/Kanban) and DevOps practices. Understanding of Security principles and practices as they relate to logging and monitoring.
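For illustration only, a small sketch of the kind of monitoring glue this role involves: pushing a deployment marker to Splunk's HTTP Event Collector and a matching Grafana annotation so dashboards line up with releases. Hostnames, tokens, and field values are placeholders, not details from the posting.

```python
# Minimal sketch: record a deployment in Splunk (HEC) and Grafana (annotations API).
import requests

SPLUNK_HEC = "https://splunk.example.com:8088/services/collector/event"
GRAFANA_API = "https://grafana.example.com/api/annotations"
SPLUNK_TOKEN = "<hec-token>"      # placeholder
GRAFANA_TOKEN = "<api-token>"     # placeholder

def send_splunk_event(message: str) -> None:
    # HEC expects a JSON payload with an "event" field and a "Splunk <token>" header.
    resp = requests.post(
        SPLUNK_HEC,
        headers={"Authorization": f"Splunk {SPLUNK_TOKEN}"},
        json={"event": message, "sourcetype": "deployment"},
        timeout=10,
    )
    resp.raise_for_status()

def annotate_grafana(text: str) -> None:
    # Grafana's annotations API takes free-form text plus optional tags.
    resp = requests.post(
        GRAFANA_API,
        headers={"Authorization": f"Bearer {GRAFANA_TOKEN}"},
        json={"text": text, "tags": ["deployment"]},
        timeout=10,
    )
    resp.raise_for_status()

if __name__ == "__main__":
    send_splunk_event("service-x deployed to production")
    annotate_grafana("service-x deployment")
```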

MS SQL DBA

Pune

6 - 8 years

INR 4.0 - 8.0 Lacs P.A.

Work from Office

Full Time

MS SQL Server Database Administrator (DBA):
- Proficiency in MS SQL Server database administration.
- Demonstrated ability to:
  - Export data from on-premises environments and transfer it to Cloud Storage buckets (see the sketch below).
  - Import data into Cloud SQL from Cloud Storage.
  - Implement and manage backup and recovery strategies.
  - Manage and enforce database security measures.

Cloud Native Engineer:
- Strong understanding of Google Cloud Platform (GCP) core services, including Compute Engine, networking, storage, and Identity and Access Management (IAM).
- Proficiency in using Terraform to provision and manage virtual machines (VMs) on GCP.
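The export/import flow in the DBA profile above might look roughly like this in practice. This is a hedged sketch with placeholder bucket, file, and instance names: it stages a backup in Cloud Storage and notes the Cloud SQL import step separately.

```python
# Sketch: stage an on-premises SQL Server backup in Cloud Storage for Cloud SQL import.
from google.cloud import storage  # pip install google-cloud-storage

def upload_backup(bucket_name: str, local_path: str, blob_name: str) -> str:
    """Upload an on-premises .bak file to a Cloud Storage bucket and return its URI."""
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(blob_name)
    blob.upload_from_filename(local_path)
    return f"gs://{bucket_name}/{blob_name}"

if __name__ == "__main__":
    uri = upload_backup("onprem-staging-bucket", "/backups/sales.bak", "sales.bak")
    print("Staged backup at", uri)
    # The import into Cloud SQL is typically a separate step, e.g. with gcloud:
    #   gcloud sql import bak TARGET_INSTANCE gs://onprem-staging-bucket/sales.bak --database=sales
```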

Big Data Lead

Pune

6 - 10 years

INR 11.0 - 15.0 Lacs P.A.

Work from Office

Full Time

We at Onix Datametica Solutions Private Limited are looking for a Big Data Lead with a passion for cloud and knowledge of on-premises and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks, and the like. Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.

Job Description:
- 6+ years of overall experience in developing, testing, and implementing Big Data projects using Hadoop, Spark, and Hive.
- Hands-on experience playing a lead role in Big Data projects: responsible for implementing one or more tracks within projects, identifying and assigning tasks within the team, and providing technical guidance to team members.
- Experience in setting up Hadoop services and implementing ETL/ELT pipelines, working with terabytes of data ingested and processed from varied systems.
- Experience working in an onshore/offshore model, leading technical discussions with customers, mentoring and guiding teams on technology, and preparing HDD/LDD documents.

Required Skills and Abilities:
- Mandatory skills: Spark, Scala/PySpark, the Hadoop ecosystem including Hive, Sqoop, Impala, Oozie, Hue, Java, Python, SQL, Flume, and Bash (shell scripting).
- Experience implementing CI/CD pipelines and working experience with SCM tools such as Git, Bitbucket, etc.
- Hands-on experience writing data ingestion and data processing pipelines using Spark and SQL; experience implementing SCD Type 1/2, auditing, and exception-handling mechanisms (see the sketch below).
- Data warehousing project implementation with either a Scala or Hadoop programming background.
- Proficient with various development methodologies such as waterfall and agile/scrum.
- Exceptional communication, organisation, and time management skills.
- Collaborative approach to decision-making.
- Strong analytical skills.
- Good to have: certifications in any of GCP, AWS, Azure, or Cloudera.
- Ability to work on multiple projects simultaneously, prioritising appropriately.
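As a small, hypothetical example of the Spark work described above (the SCD handling in particular), the following PySpark sketch applies a Type 1 refresh. Table names, paths, and the shared schema of the staged and current datasets are assumptions.

```python
# Sketch: SCD Type 1 refresh in PySpark (latest staged row wins, history is not kept).
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("scd_type1_refresh")
    .enableHiveSupport()
    .getOrCreate()
)

# Incoming daily extract and the existing dimension; both are assumed to share a schema.
staged = spark.read.parquet("gs://landing/customers/dt=2024-01-01/")
current = spark.read.table("dw.dim_customer")

# Keep current rows absent from the extract, and take the staged version of the rest.
refreshed = (
    current.join(staged.select("customer_id"), on="customer_id", how="left_anti")
    .unionByName(staged)
)

# Write to a staging table first; a separate step can swap it into place.
refreshed.write.mode("overwrite").saveAsTable("dw.dim_customer_stg")
spark.stop()
```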

Kafka Architect

Pune, Bengaluru

10 - 15 years

INR 12.0 - 16.0 Lacs P.A.

Work from Office

Full Time

We are seeking a talented and experienced Kafka Architect with migration experience to Google Cloud Platform (GCP) to join our team. As a Kafka Architect, you will be responsible for designing, implementing, and managing our Kafka infrastructure to support our data processing and messaging needs, while also leading the migration of our Kafka ecosystem to GCP. You will work closely with our engineering and data teams to ensure seamless integration and optimal performance of Kafka on GCP. Responsibilities: Discovery, analysis, planning, design, and implementation of Kafka deployments on GKE, with a specific focus on migrating Kafka from AWS to GCP. Design, architect and implement scalable, high-performance Kafka architectures and clusters to meet our data processing and messaging requirements. Lead the migration of our Kafka infrastructure from on-premises or other cloud platforms to Google Cloud Platform (GCP). Conduct thorough discovery and analysis of existing Kafka deployments on AWS. Develop and implement best practices for Kafka deployment, configuration, and monitoring on GCP. Develop a comprehensive migration strategy for moving Kafka from AWS to GCP. Collaborate with engineering and data teams to integrate Kafka into our existing systems and applications on GCP. Optimize Kafka performance and scalability on GCP to handle large volumes of data and high throughput. Plan and execute the migration, ensuring minimal downtime and data integrity. Test and validate the migrated Kafka environment to ensure it meets performance and reliability standards. Ensure Kafka security on GCP by implementing authentication, authorization, and encryption mechanisms. Troubleshoot and resolve issues related to Kafka infrastructure and applications on GCP. Ensure seamless data flow between Kafka and other data sources/sinks. Implement monitoring and alerting mechanisms to ensure the health and performance of Kafka clusters. Stay up to date with Kafka developments and GCP services to recommend and implement new features and improvements. Requirements: Bachelors degree in computer science, Engineering, or related field (Masters degree preferred). Proven experience as a Kafka Architect or similar role, with a minimum of [5] years of experience. Deep knowledge of Kafka internals and ecosystem, including Kafka Connect, Kafka Streams, and KSQL. In-depth knowledge of Apache Kafka architecture, internals, and ecosystem components. Proficiency in scripting and automation for Kafka management and migration. Hands-on experience with Kafka administration, including cluster setup, configuration, and tuning. Proficiency in Kafka APIs, including Producer, Consumer, Streams, and Connect. Strong programming skills in Java, Scala, or Python. Experience with Kafka monitoring and management tools such as Confluent Control Center, Kafka Manager, or similar. Solid understanding of distributed systems, data pipelines, and stream processing. Experience leading migration projects to Google Cloud Platform (GCP), including migrating Kafka workloads. Familiarity with GCP services such as Google Kubernetes Engine (GKE), Google Cloud Storage, Google Cloud Pub/Sub, and Big Query. Excellent communication and collaboration skills. Ability to work independently and manage multiple tasks in a fast-paced environment.
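By way of illustration, here is a minimal kafka-python round trip of the sort used to smoke-test a migrated cluster (for example, Kafka running on GKE after a move from AWS). The broker address and topic are placeholders, not taken from the posting.

```python
# Sketch: produce and consume one message to validate a Kafka cluster end to end.
import json
from kafka import KafkaConsumer, KafkaProducer  # pip install kafka-python

BOOTSTRAP = "kafka-broker.example.internal:9092"  # placeholder broker
TOPIC = "migration-smoke-test"                    # placeholder topic

producer = KafkaProducer(
    bootstrap_servers=BOOTSTRAP,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send(TOPIC, {"check": "post-migration", "ok": True})
producer.flush()

consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BOOTSTRAP,
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,  # stop iterating after 5s of silence
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for record in consumer:
    print(f"offset={record.offset} value={record.value}")
consumer.close()
```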

Technical Project Manager

Pune

10 - 13 years

INR 11.0 - 16.0 Lacs P.A.

Work from Office

Full Time

Mandate Skills: Hands-on Coding, ETL, GCP, SQL, Resource planning, Project Management. Mandatory Skills: Hands-on experience in design, development and managing data integration, ETL, GCP, SQL, Resource planning, Project Management. Experience in managing projects in the area of Data warehousing, Business Intelligence using open source or top-of-the-line tools and technologies Good knowledge of Dimensional Modeling Experience in managing medium to large projects Proven experience in project planning, estimation, execution and implementation of medium to large projects Proficient with various development methodologies like waterfall, agile/scrum and iterative Good Interpersonal skills and excellent communication skills Advanced level Microsoft Project, PowerPoint, Visio, Excel and Word. Responsibilities: Responsible to work closely with customers to understand the requirements, discuss and define various use cases Liaise with key stakeholders to define, a solutions roadmap, prioritize the deliverables Responsible for end-to-end project delivery from project estimations, project planning, resourcing and support perspective Drive and participate in requirements gathering workshops, estimation discussions, design meetings and status review meetings Participate and contribute in Solution Design implementation Projects Monitor and review the status of the project and ensure that the deliverables are on track with respect to scope, budget and time Transparently communicate the status of the project to all the stakeholders on a regular basis Identify and manage risks/issues related to deliverables and arrive at mitigation plans to resolve the issues and risks Seek proactive feedback continuously to identify areas of improvement Ensure the team is creating and maintaining the knowledge artifacts with reference to the project deliverables.

Cloud Engineer

Hyderabad

4 - 9 years

INR 4.0 - 8.0 Lacs P.A.

Work from Office

Full Time

Data Transformation: Utilize Data Build Tool (dbt) to transform raw data into curated data models according to business requirements. Implement data transformations and aggregations to support analytical and reporting needs. Orchestration and Automation: Design and implement automated workflows using Google Cloud Composer to orchestrate data pipelines and ensure timely data delivery. Monitor and troubleshoot data pipelines, identifying and resolving issues proactively. Develop and maintain documentation for data pipelines and workflows. GCP Expertise: Leverage GCP services, including BigQuery, Cloud Storage, and Pub/Sub, to build a robust and scalable data platform. Optimize BigQuery performance and cost through efficient query design and data partitioning. Implement data security and access controls in accordance with banking industry standards. Collaboration and Communication: Collaborate with Solution Architect and Data Modeler to understand data requirements and translate them into technical solutions. Communicate effectively with team members and stakeholders, providing regular updates on project progress. Participate in code reviews and contribute to the development of best practices. Data Pipeline Development: Design, develop, and maintain scalable and efficient data pipelines using Google Cloud Dataflow to ingest data from various sources, including relational databases (RDBMS), data streams, and files. Implement data quality checks and validation processes to ensure data accuracy and consistency. Optimize data pipelines for performance and cost-effectiveness. Banking Domain Knowledge (Preferred): Understanding of banking data domains, such as customer data, transactions, and financial products. Familiarity with regulatory requirements and data governance standards in the banking industry. Required Experience: Bachelor's degree in computer science, Engineering, or a related field. ETL Knowledge. 4-9 years of experience in data engineering, with a focus on building data pipelines and data transformations. Strong proficiency in SQL and experience working with relational databases. Hands-on experience with Google Cloud Platform (GCP) services, including Dataflow, BigQuery, Cloud Composer, and Cloud Storage. Experience with data transformation tools, preferably Data Build Tool (dbt). Proficiency in Python or other scripting languages is a plus. Experience with data orchestration and automation. Strong problem-solving and analytical skills. Excellent communication and collaboration skills. Experience with data streams like Pub/Sub or similar. Experience in working with files such as CSV, JSON and Parquet. Primary Skills: GCP, Dataflow, BigQuery, Cloud Composer, Cloud Storage, Data Pipeline, Composer, SQL, DBT, DWH Concepts. Secondary Skills: Python, Banking Domain knowledge, pub/sub, Cloud certifications (e.g. Data engineer), Git or any other version control system.
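A minimal Cloud Composer (Airflow 2.x) DAG sketch for the dbt-plus-Composer workflow described above: run the dbt transformations, then the dbt tests. The project path, target name, and schedule are assumptions, not part of the posting.

```python
# Sketch: daily dbt build orchestrated from Cloud Composer / Airflow.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_build",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 6 * * *",  # 06:00 UTC daily (assumed)
    catchup=False,
    tags=["dbt", "bigquery"],
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /home/airflow/gcs/dags/dbt_project && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /home/airflow/gcs/dags/dbt_project && dbt test --target prod",
    )
    dbt_run >> dbt_test
```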

AI / ML Developer

Hyderabad, Pune

3 - 6 years

INR 8.0 - 12.0 Lacs P.A.

Work from Office

Full Time

We are looking for a dynamic and innovative AI/ML Engineer with expertise in Generative AI and hands-on experience in GCP or other cloud platforms. The ideal candidate will have proven experience in developing, training, fine-tuning, and deploying advanced AI/ML models. You will play a pivotal role in building scalable, production-ready solutions involving large datasets, NLP techniques, and cutting-edge frameworks such as LangChain, Retrieval-Augmented Generation (RAG), and REACT (Retrieve, Extract, Adapt, Construct, Think). This role requires a solid foundation in Python, SQL, and AI/ML development pipelines, combined with a passion for solving real-world problems using AI.

Roles & Responsibilities:

Model Development & Training:
- Design, train, and fine-tune AI/ML models, especially Generative AI and Large Language Models (LLMs), to address specific use cases.
- Build conversational AI solutions and chatbots using frameworks such as LangChain, RAG (Retrieval-Augmented Generation), and Chain-of-Thought (COT) prompting.
- Apply advanced techniques, including embeddings, fine-tuning, and custom prompting strategies.
- Incorporate REACT (Retrieve, Extract, Adapt, Construct, Think) methods to enhance model capabilities.
- Develop scalable AI solutions that integrate seamlessly into production environments.

Data Handling:
- Manage large-scale datasets for AI/ML applications, ensuring data quality, transformation, and normalization.
- Conduct data analysis, preprocessing, and munging to extract valuable insights.
- Implement scalable data engineering workflows for model development and production.

Cloud AI/ML Deployment:
- Deploy, manage, and optimize AI models on Google Cloud Platform (GCP) or other cloud platforms like AWS/Azure.
- Leverage GCP services such as Vertex AI, BigQuery, Cloud Functions, and Dataflow for AI workflows.

Collaboration & Solutioning:
- Collaborate with cross-functional teams, including product managers, data scientists, and software engineers, to deliver AI-driven solutions.
- Integrate models with client-facing applications, ensuring end-to-end implementation.
- Support scalable development through Docker for containerization (a plus).

Continuous Improvement:
- Stay updated with the latest advancements in AI, ML, and Generative AI frameworks, tools, and methodologies.
- Proactively learn new technologies and apply them to improve processes and solutions.

Web Scraping (Optional but Preferred):
- Implement web scraping solutions to gather data from unstructured sources for model training and validation.

Required Skills & Qualifications:
- 3-6 years of experience in AI/ML model development, training, and fine-tuning.
- Strong programming skills in Python and SQL.
- Hands-on experience with Generative AI, LLMs, and NLP techniques.
- Experience working with LangChain, RAG frameworks, and advanced prompting strategies.
- Proficiency in embeddings and fine-tuning models for specific tasks.
- Strong understanding of machine learning algorithms and statistical analysis.
- Experience working with large datasets and scalable data processing workflows.
- Hands-on experience with GCP (Vertex AI, BigQuery) or other cloud platforms (AWS/Azure).
- Knowledge of Docker for deployment and containerization.
- Solid skills in data cleaning, transformation, and normalization for data integrity.

Preferred Skills:
- Familiarity with Reinforcement Learning from Human Feedback (RLHF).
- Understanding of COT (Chain-of-Thought) prompting.
- Proficiency in REACT (Retrieve, Extract, Adapt, Construct, Think) frameworks.
- Experience in web scraping techniques.

Key Attributes:
- Strong analytical and problem-solving abilities.
- Ability to work independently as well as collaboratively in a team environment.
- Excellent communication skills to interact with stakeholders and cross-functional teams.
- Proactive attitude to learn and adopt new AI technologies and frameworks.

Why Join Us?
- Opportunity to work on cutting-edge AI/ML solutions.
- Collaborative and innovative work culture.
- Access to state-of-the-art tools, frameworks, and GCP resources.
- Growth opportunities with a focus on professional development in AI/ML.
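To make the retrieval step of a RAG pipeline concrete, here is a tiny, self-contained sketch that ranks documents by embedding similarity before building an LLM prompt. The `embed` function below is a random stand-in for whatever embedding model is actually used (for example, a Vertex AI embedding model); everything else is a generic illustration, not the team's implementation.

```python
# Sketch: rank documents by cosine similarity and assemble a grounded prompt.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Placeholder embedding; swap in a real embedding model in practice."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(64)

def top_k(query: str, docs: list[str], k: int = 2) -> list[str]:
    q = embed(query)
    scores = [float(q @ embed(d) / (np.linalg.norm(q) * np.linalg.norm(embed(d))))
              for d in docs]
    order = np.argsort(scores)[::-1][:k]
    return [docs[i] for i in order]

if __name__ == "__main__":
    corpus = ["refund policy", "shipping times", "warranty terms"]
    context = top_k("how long does delivery take?", corpus)
    prompt = "Answer using only this context:\n" + "\n".join(context)
    print(prompt)  # this prompt would then be passed to the LLM of choice
```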

Druid Developer

Pune, Bengaluru

3 - 8 years

INR 3.0 - 6.0 Lacs P.A.

Work from Office

Full Time

We are seeking a skilled and experienced Druid Developer to design, develop, and maintain real-time data analytics solutions using Apache Druid. The ideal candidate will have hands-on experience working with Druid, a deep understanding of distributed systems, and a passion for processing large-scale datasets. You will play a pivotal role in creating scalable, high-performance systems that enable real-time decision-making. Technical Skills: Strong experience with Apache Druid, including ingestion, query optimizations, and cluster management. Proficiency in real-time data streaming technologies (e.g., Apache Kafka, AWS Kinesis). Experience with data transformation and ETL processes. Knowledge of relational and NoSQL databases (e.g., PostgreSQL, MongoDB). Hands-on experience with cloud platforms (AWS, GCP, Azure) for deploying Druid clusters. Proficiency in programming languages like Java, Python, or Scala. Familiarity with containerization tools like Docker and orchestration tools like Kubernetes.
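For illustration, a short sketch of querying Druid through its SQL HTTP endpoint, one of the basic operations a Druid developer automates. The router URL, datasource, and time bucketing are placeholders rather than anything specified in the posting.

```python
# Sketch: run a SQL query against Druid's HTTP SQL endpoint.
import requests

DRUID_SQL = "http://druid-router.example.internal:8888/druid/v2/sql"  # placeholder

def recent_event_counts(datasource: str) -> list[dict]:
    query = (
        f'SELECT FLOOR(__time TO HOUR) AS hr, COUNT(*) AS events '
        f'FROM "{datasource}" '
        f"WHERE __time >= CURRENT_TIMESTAMP - INTERVAL '1' DAY "
        f"GROUP BY FLOOR(__time TO HOUR) "
        f"ORDER BY FLOOR(__time TO HOUR) DESC"
    )
    resp = requests.post(DRUID_SQL, json={"query": query}, timeout=30)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    for row in recent_event_counts("clickstream"):
        print(row)
```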

Infosec Analyst - Lead

Pune, Maharashtra, India

0 years

Not disclosed

On-site

Full Time

Position: Infosec Analyst – Audit & Compliance (Lead or AM)

Key Responsibility Areas (KRA):
- Regulatory Compliance & Governance: Ensure adherence to ISO 27001, NIST, SOC 2, GDPR, and HIPAA, and enforce security policies.
- Audit & Risk Management: Lead internal/external audits, manage compliance assessments, and drive risk mitigation.
- Incident Response & Compliance Monitoring: Work with Security Operations to monitor incidents, ensure compliance, and support investigations.
- Security Awareness & Training: Develop and implement training programs to strengthen cybersecurity culture.
- Vendor & Third-Party Security: Assess vendor security risks, ensure contract compliance, and enforce security standards.
- Business Continuity & Disaster Recovery (BCDR): Support security-related aspects of BCDR, ensuring compliance with recovery objectives.
- Critical Coordination & Availability: Be available during US business hours for audits, compliance discussions, and security escalations.

Roles & Responsibilities:
- Lead security audits, compliance initiatives, and regulatory assessments.
- Maintain security policies, documentation, and reporting for compliance readiness.
- Serve as the primary contact for auditors, legal teams, and regulatory bodies.
- Oversee remediation efforts for vulnerabilities and drive timely risk mitigation.
- Monitor security controls, drive continuous improvement, and align compliance with business objectives.
- Support security incidents and investigations related to compliance risks.
- Ensure availability for critical discussions, escalations, and audits during US hours.

Sales Specialist Engineer

Pune, Maharashtra, India

1 - 3 years

Not disclosed

On-site

Full Time

Job Description: Sales Specialist Engineer Experience: 1-3 years Position: Sales Specialist Engineer Department: Inside Sales Team Location: Pune, India Shift: US Shift (Night Shift) Employment Type: Full-Time Role Overview: We are seeking a Sales Specialist Engineer to join our dynamic Inside Sales Team . This individual will serve as a technical bridge between prospects and the Inside Sales team, playing a crucial role in technical conversations, product demonstrations, and opportunity qualification . The ideal candidate should have a strong grasp of Onix products and services , along with the ability to articulate their technical value to prospects during first-level phone conversations . This role requires a proactive learner who understands data analytics, AI basics , and can thrive in a fast-paced, entrepreneurial environment. Key Responsibilities: Understand Onix products and services in-depth to effectively communicate their value to prospects. Participate in first-level conversations with prospects alongside Inside Sales representatives. Lead technical discussions, addressing prospect queries, identifying pain points, and explaining solutions. Deliver impactful technical demos of Onix products to prospects and customers. Tailor demonstrations to the specific needs and challenges of prospects, showcasing relevant use cases. Answer technical questions during product demos and clarify complex concepts in simple terms. Proactively probe and qualify opportunities by asking the right questions during conversations. Evaluate technical requirements to determine fit with Onix’s products and services. Work closely with the Inside Sales team to prioritize high-quality opportunities. Collaboration & Support: Collaborate with the Inside Sales team to support pipeline development and opportunity management. Provide feedback to the product and marketing teams based on prospect interactions to improve positioning and messaging. Support customer meetings and presentations during US working hours. Learning & Development: Stay up-to-date with Onix’s product updates, industry trends, and competitive landscape. Continuously enhance knowledge in data analytics, AI fundamentals, and related technologies. Share insights and technical expertise with the Inside Sales team to improve overall effectiveness. Key Qualifications & Skills: Technical Expertise: Strong understanding of data analytics, AI fundamentals, and the ability to explain them to non-technical audiences. Experience conducting technical product demos or presenting solutions to prospects. Familiarity with cloud technologies, SaaS platforms, or enterprise software is a plus. Exceptional probing skills to identify prospect needs and qualify opportunities effectively. Ability to simplify complex technical concepts and communicate their value clearly. Comfortable engaging in technical conversations over the phone or virtual meetings. General Requirements: 1-3 years of experience in a technical sales, pre-sales, or solutions engineering role. An entrepreneurial mindset with a strong sense of ownership and initiative. A go-getter attitude with a passion for learning and self-improvement. Strong collaboration skills to work effectively within cross-functional teams. Willingness to work in US shifts to align with Inside Sales and customer schedules. Why Join Us? Be part of a high-impact Inside Sales team, driving meaningful technical conversations with prospects. Gain hands-on experience with cutting-edge technologies in data analytics and AI. 
Enjoy opportunities to grow into solutions engineering, pre-sales leadership, or other technical sales roles. Thrive in an entrepreneurial, fast-paced environment where your contributions are valued.

Onix is Hiring MDM Informatica Developer

Hyderabad, Pune

3 - 8 years

INR 20.0 - 30.0 Lacs P.A.

Hybrid

Full Time

Job Summary: We are seeking a highly skilled Informatica MDM Developer to join our data integration and management team. The ideal candidate will have extensive experience in Informatica Master Data Management (MDM) solutions and a deep understanding of data quality, data governance, and master data modeling. Key Responsibilities: Design, develop, and deploy Informatica MDM solutions (including Hub, IDD, SIF, and MDM Hub configurations). Work closely with data architects, business analysts, and stakeholders to understand master data requirements. Configure and manage Trust, Merge, Survivorship rules, and Match/Merge logic. Implement data quality (DQ) checks and profiling using Informatica DQ tools. Develop batch and real-time integration using Informatica MDM SIF APIs and ETL tools (e.g., Informatica PowerCenter). Monitor and optimize MDM performance and data processing. Document MDM architecture, data flows, and integration touchpoints. Troubleshoot and resolve MDM issues across environments (Dev, Test, UAT, Prod). Support data governance and metadata management initiatives. Required Skills: Strong hands-on experience with Informatica MDM (10.x or later) . Proficient in match/merge rules , data stewardship , hierarchy management , and SIF APIs . Experience with Informatica Data Quality (IDQ) is a plus. Solid understanding of data modeling , relational databases , and SQL . Familiarity with REST/SOAP APIs , web services , and real-time data integration. Experience in Agile/Scrum environments. Excellent problem-solving and communication skills.

AI/ML Architect

Pune, Maharashtra, India

10 years

Not disclosed

On-site

Full Time

We are seeking an experienced AI/ML Architect to lead the design, development, and deployment of Generative AI solutions. This role requires a deep understanding of AI/ML architectures, technical leadership, and the ability to design robust, scalable, and production-ready systems. The ideal candidate will have extensive experience in cloud platforms like GCP and optionally AWS, Azure, or equivalent tools, combined with hands-on expertise in MLOps, containerization, data processing, and advanced model optimization. You will work closely with cross-functional teams, technical leadership, and stakeholders to implement state-of-the-art AI solutions that solve real-world challenges and drive business value. Roles & Responsibilities Technical Leadership Lead the technical design and architecture of complex Generative AI systems. Ensure solutions align with business objectives, scalability requirements, and technical feasibility. Guide development teams through best practices, architecture reviews, and technical decision-making processes. Solution Architecture Design and develop end-to-end Generative AI solutions, including data pipelines, model training, deployment, and real-time monitoring. Utilize MLOps tools and frameworks to automate workflows, ensuring scalable and repeatable deployments. Architect robust solutions using GCP and optionally AWS, Azure, or open-source frameworks. Design, train, and fine-tune AI/ML models, especially Generative AI and Large Language Models (LLMs), to address specific use cases. Build conversational AI solutions and chatbots using frameworks such as LangChain, RAG (Retrieval-Augmented Generation), and Chain-of-Thought (COT) prompting. Production Deployment Lead the deployment of Generative AI models into production environments. Optimize deployment pipelines leveraging tools like Docker, Kubernetes, and cloud-native services for orchestration. Ensure seamless integration of GenAI solutions into existing CI/CD pipelines. Data Processing & Feature Engineering Build scalable ETL workflows for managing structured, semi-structured, and unstructured data. Implement data wrangling, preprocessing, and feature engineering pipelines to prepare data for Generative AI applications. Optimize workflows to extract meaningful insights from large datasets. Model Optimization Identify and implement optimization strategies such as hyperparameter tuning, feature engineering, and model selection for performance enhancement. Focus on computational efficiency and scaling models to production-level performance. Pilot/POCs Development Drive the design and development of Proof of Concepts (POCs) and pilot projects to address customer requirements. Collaborate with delivery and product teams to scale successful pilots to production-grade solutions. Evangelization Promote and drive the adoption of Generative AI solutions across customer and delivery teams. Provide technical leadership and mentorship to teams working on GenAI projects. Conduct workshops, training sessions, and knowledge-sharing initiatives to enable stakeholders. Continuous Improvement Stay at the forefront of AI advancements, frameworks, and tools, including emerging concepts in Generative AI. Explore and evaluate techniques like Reinforcement Learning from Human Feedback (RLHF) and REACT (Retrieve, Extract, Adapt, Construct, Think) frameworks to enhance GenAI applications. Required Skills & Qualifications 10+ years of experience in AI/ML architecture, model development, and production deployment. 
Proven expertise in designing, implementing, and scaling Generative AI and LLM-based solutions. Hands-on experience with frameworks like LangChain, Retrieval-Augmented Generation (RAG), and advanced prompting techniques. Proficiency in advanced techniques such as embeddings and Chain-of-Thought (COT) prompting. Experience working with cloud platforms, primarily GCP, with optional experience in AWS or Azure. Strong understanding of MLOps tools, pipelines, and model monitoring in production. Proficiency in Python and SQL for model development and data processing. Experience with data preprocessing, ETL workflows, and feature engineering for AI applications. Strong knowledge of containerization tools like Docker and orchestration platforms like Kubernetes. Solid understanding of CI/CD pipelines for continuous deployment and integration of AI solutions. Experience working with large datasets for structured and unstructured AI applications. Deep experience in model optimization, including hyperparameter tuning and computational efficiency strategies. Proven track record of leading POCs/pilots and scaling them to production-grade deployments. Preferred Skills Familiarity with Reinforcement Learning from Human Feedback (RLHF). Experience with REACT (Retrieve, Extract, Adapt, Construct, Think) frameworks. Strong understanding of orchestration for large-scale production environments. Key Attributes Strong technical leadership and mentorship abilities. Excellent communication and stakeholder management skills. Strategic thinking with the ability to architect scalable and future-ready AI systems. Passion for solving business challenges using state-of-the-art AI techniques. Commitment to staying updated with the latest advancements in AI/ML technologies. Why Join Us? Lead the development of cutting-edge Generative AI solutions for real-world applications. Be part of a collaborative, innovative, and technology-driven team. Opportunity to work with advanced AI/ML tools and frameworks. Drive innovation through technical leadership, mentorship, and solution evangelization. Continuous professional growth with access to the latest AI/ML technologies and frameworks.

GCP Hadoop Developer/Lead

Pune, Bengaluru

5 - 10 years

INR 0.5 - 0.5 Lacs P.A.

Hybrid

Full Time

Job Summary We are seeking a highly skilled Hadoop Developer / Lead Data Engineer to join our data engineering team based in Bangalore or Pune. The ideal candidate will have extensive experience with Hadoop ecosystem technologies and cloud-based big data platforms, particularly on Google Cloud Platform (GCP). This role involves designing, developing, and maintaining scalable data ingestion, processing, and transformation frameworks to support enterprise data needs. Minimum Qualifications Bachelor's degree in computer science, Computer Information Systems, or related technical field. 5-10 years of experience in software engineering or data engineering, with a strong focus on big data technologies. Proven experience in implementing software development life cycles (SDLC) in enterprise environments. Technical Skills & Expertise Big Data Technologies: Expertise in Hadoop platform, Hive , and related ecosystem tools. Strong experience with Apache Spark (using SQL, Scala, and/or Java). Experience with real-time data streaming using Kafka . Programming Languages & Frameworks: Proficient in PySpark and SQL for data processing and transformation. Strong coding skills in Python . Cloud Technologies (Google Cloud Platform): Experience with BigQuery for data warehousing and analytics. Familiarity with Cloud Composer (Airflow) for workflow orchestration. Hands-on with DataProc for managed Spark and Hadoop clusters. Responsibilities Design, develop, and implement scalable data ingestion and transformation pipelines using Hadoop and GCP services. Build real-time and batch data processing solutions leveraging Spark, Kafka, and related technologies. Ensure data quality, governance, and lineage by implementing automated validation and classification frameworks. Collaborate with cross-functional teams to deploy and operationalize data analytics tools at enterprise scale. Participate in production support and on-call rotations to maintain system reliability. Follow established SDLC practices to deliver high-quality, maintainable solutions. Preferred Qualifications Experience leading or mentoring data engineering teams. Familiarity with CI/CD pipelines and DevOps best practices for big data environments. Strong communication skills with an ability to collaborate across teams.
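As a rough sketch of the real-time ingestion described above, the following Spark Structured Streaming job reads a Kafka topic and lands Parquet on Cloud Storage (for example, from a Dataproc cluster). The topic, bucket, and checkpoint paths are assumptions, and the spark-sql-kafka connector must be available on the classpath.

```python
# Sketch: stream Kafka events to Parquet on GCS with Spark Structured Streaming.
# Requires the connector, e.g. --packages org.apache.spark:spark-sql-kafka-0-10_2.12:<version>
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka_to_gcs").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")  # placeholder broker
    .option("subscribe", "orders")                       # placeholder topic
    .option("startingOffsets", "latest")
    .load()
    .select(col("key").cast("string"), col("value").cast("string"), "timestamp")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "gs://data-lake/raw/orders/")
    .option("checkpointLocation", "gs://data-lake/checkpoints/orders/")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```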

Cloud Native Architect

Pune, Maharashtra, India

13 years

Not disclosed

Remote

Full Time

We are seeking a highly experienced Cloud Native Architect to lead the design and implementation of cloud-native platforms and enterprise-grade solutions. The ideal candidate will have over 13 years of experience in IT, with a strong focus on cloud-native technologies, microservices architecture, Kubernetes, and DevOps practices. This role demands deep architectural expertise, strong leadership, and hands-on experience with building scalable, resilient, and secure applications in cloud environments, particularly GCP.

Key Responsibilities:
- Architect and design cloud-native solutions that are scalable, secure, and resilient across public, private, and hybrid clouds.
- Lead the development of microservices-based platforms using containers, Kubernetes, and service mesh technologies.
- Define and enforce cloud-native standards, best practices, and reusable patterns across teams.
- Oversee implementation of CI/CD pipelines and DevSecOps processes to support modern software delivery.
- Collaborate with enterprise architects, product owners, and engineering leads to ensure alignment with business and technical goals.
- Perform architecture reviews, cloud cost optimizations, and performance tuning.
- Stay current with cloud technologies and provide strategic input for adopting emerging trends like serverless, edge computing, and AI/ML in the cloud.
- Mentor and guide junior architects, engineers, and DevOps teams.

Required Skills & Qualifications:
- 13+ years of total experience in IT, with at least 5-7 years in cloud-native architecture.
- Deep expertise in Kubernetes, Docker, Helm, Istio, or other container and orchestration tools.
- Strong experience with at least one major cloud provider (AWS, GCP, Azure); multi-cloud knowledge is a plus.
- Proficient in DevOps practices, CI/CD pipelines (Jenkins, ArgoCD, GitOps), and Infrastructure as Code (Terraform, CloudFormation).
- Deep knowledge of distributed systems, event-driven architecture, and API management.
- Understanding of cloud security, IAM, compliance, and governance frameworks.
- Strong programming/scripting knowledge (e.g., Go, Python, Java, Bash).
- Experience leading large-scale migrations to the cloud and building greenfield cloud-native applications.
- Excellent communication and stakeholder management skills.

Preferred Certifications:
- Google Professional Cloud Architect
- Certified Kubernetes Administrator (CKA) / CKAD

Why Join Us:
- Work on cutting-edge cloud-native initiatives with global impact.
- Be part of a culture that promotes continuous learning and innovation.
- Competitive compensation and remote flexibility.
- Opportunity to influence strategic technology direction.
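Purely as an illustration of the kind of platform health check such a role standardises, here is a minimal sketch using the official Kubernetes Python client to flag deployments with unavailable replicas. Cluster access and the "unhealthy" criterion are assumptions.

```python
# Sketch: list deployments whose ready replica count is below the desired count.
from kubernetes import client, config  # pip install kubernetes

def unhealthy_deployments() -> list[str]:
    config.load_kube_config()  # use config.load_incluster_config() inside a pod
    apps = client.AppsV1Api()
    problems = []
    for dep in apps.list_deployment_for_all_namespaces().items:
        desired = dep.spec.replicas or 0
        ready = dep.status.ready_replicas or 0
        if ready < desired:
            problems.append(
                f"{dep.metadata.namespace}/{dep.metadata.name}: {ready}/{desired} ready"
            )
    return problems

if __name__ == "__main__":
    for line in unhealthy_deployments():
        print(line)
```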

GCP SRE Engineer

Hyderabad, Pune

5 - 8 years

INR 0.5 - 0.5 Lacs P.A.

Work from Office

Full Time

Job Overview: We are looking for a skilled and proactive SRE (Site Reliability Engineer) to manage, maintain, and troubleshoot cloud data pipelines across our infrastructure. The ideal candidate is a data engineering expert with deep knowledge of cloud services and data pipeline architecture, and a software engineering mindset to optimize performance, reliability, and cost-efficiency. This role demands strong problem-solving abilities, hands-on experience with any cloud platform (preferably GCP), and the capability to work independently in a fast-paced environment.

Key Responsibilities:
- Manage and support cloud data pipelines and associated infrastructure.
- Monitor the performance and reliability of pipelines, including Informatica ETL workflows, MDM, and Control-M jobs.
- Troubleshoot and resolve complex issues related to data pipelines and data processing systems.
- Optimize data pipeline efficiency to reduce operational costs and failure rates.
- Automate repetitive tasks and streamline data pipeline management processes.
- Conduct post-incident reviews and implement improvements for future reliability.
- Perform SLA-oriented monitoring and recommend enhancements to ensure compliance (see the sketch below).
- Collaborate with cross-functional teams to improve and document systems and workflows.
- Support real-time monitoring and alerting for mission-critical data processes.
- Continuously improve systems based on proactive testing and performance insights.

Required Skills and Qualifications:
- 5+ years of experience in data engineering support and enhancement.
- Proficiency in Python for data processing and automation.
- Strong SQL skills and experience working with relational databases.
- Solid understanding of data pipeline architectures and ETL processes.
- Hands-on experience with any cloud platform (GCP, Azure, or AWS; GCP preferred).
- Familiarity with version control systems like Git.
- Experience in monitoring and alerting solutions for data systems.
- Skilled in conducting post-incident analysis and reliability improvements.
- Exposure to data visualization tools such as Google Looker Studio, Tableau, Domo, or Power BI is a plus.
- Strong analytical and problem-solving skills.
- Excellent verbal and written communication abilities.
- Ability to work in a 24x7 shift environment.

Preferred Qualifications:
- Bachelor's degree in computer science, engineering, or a related technical field.
- Professional cloud certification (e.g., GCP Professional Data Engineer) is a plus.
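A hedged example of the SLA-oriented monitoring mentioned above: check the freshness of a pipeline's BigQuery target table and flag a breach. The project, table, timestamp column, and threshold are placeholders, not details from the posting.

```python
# Sketch: freshness/SLA check against a BigQuery table's latest load timestamp.
from datetime import datetime, timedelta, timezone

from google.cloud import bigquery  # pip install google-cloud-bigquery

def check_freshness(table: str, max_lag: timedelta) -> bool:
    client = bigquery.Client()
    sql = f"SELECT MAX(load_ts) AS last_load FROM `{table}`"
    row = list(client.query(sql).result())[0]
    lag = datetime.now(timezone.utc) - row.last_load
    print(f"{table}: last load {row.last_load}, lag {lag}")
    return lag <= max_lag

if __name__ == "__main__":
    ok = check_freshness("my-project.analytics.fact_orders", timedelta(hours=2))
    if not ok:
        print("SLA breach: raise an alert / page the on-call engineer")
```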

Onix is Hiring MDM ETL Developer

Hyderabad, Pune

3 - 8 years

INR 20.0 - 30.0 Lacs P.A.

Hybrid

Full Time

Job Summary: Join our team and see what we'll accomplish together. As an MDM Developer, you will be responsible for implementing and managing Master Data Management (MDM) projects. The ideal candidate will have extensive experience with Informatica MDM and proficiency in configuring MDM tools and integrating them with cloud environments. You will utilize your expertise in data engineering to build and maintain data pipelines, optimize performance, and ensure data quality. You will be working as part of a friendly, cross-discipline agile team that helps each other solve problems across all functions. As a custodian of customer trust, you will employ best practices in development, security, accessibility, and design to achieve the highest quality of service for our customers. Our development team uses a range of technologies to get the job done, including ETL and data quality tools from Informatica, streaming via Apache NiFi, and Google-native tools on GCP (Dataflow, Composer, BigQuery, etc.). We also do some API design and development with Postman and Node.js. You will be part of the team building data pipelines that support our marketing, finance, campaign, and executive leadership teams, as well as implementing Informatica Master Data Management (MDM) hosted on Amazon Web Services (AWS). Specifically, you'll be building pipelines that support insights to enable our business partners' analytics and campaigns. You are a fast learner and a highly technical, passionate person looking to work within a team of multidisciplinary experts to improve your craft and contribute to the data development practice.

Here's how:
- Learn new skills and advance your data development practice.
- Analyze and profile data.
- Design, develop, test, deploy, maintain, and improve batch and real-time data pipelines.
- Assist with design and development of solution prototypes.
- Support consumers with understanding the data outcomes and technical design.
- Collaborate closely with multiple teams in an agile environment.

What you bring:
- You are a senior developer with 3+ years of experience in IT platform implementation in a technical capacity.
- Bachelor of Computer Science, Engineering, or equivalent.
- Extensive experience with Informatica MDM (Multi-Domain Edition) version 10.
- Proficiency in MDM configuration, including Provisioning Tool, Business Entity Services, Customer 360, data modeling, match rules, cleanse rules, and metadata analysis.
- Expertise in configuring data models, match and merge rules, database schemas, and trust and validation settings.
- Understanding of data warehouse/cloud architectures and ETL processes.
- Working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases.
- Experience with the Google Cloud Platform (GCP) and its related technologies (Kubernetes, CloudSQL, Pub/Sub, Storage, Logging, Dashboards, Airflow, BigQuery, BigTable, Python, BQ SQL, Dataplex, Datastream, etc.).
- Experience with Informatica IDQ/PowerCenter/IICS, Apache NiFi, and other related ETL tools.
- Experience with Informatica MDM (preferred), but strong skills in other MDM tools are still an asset.
- Experience working with message queues like JMS, Kafka, and Pub/Sub.
- A passion for data quality.

Great-to-haves:
- Experience with Informatica MDM SaaS.
- Experience with Python and software engineering best practices.
- API development using Node.js and testing using Postman/SoapUI.
- Understanding of TMF standards.

GCP Infra Lead

Pune, Maharashtra, India

6 years

Not disclosed

On-site

Full Time

GCP Infrastructure Lead Location: Bangalore, Pune Exp: 6+ Years Responsibilities: 5+ years of demonstrated relevant experience deploying and supporting public cloud Infrastructure (GCP as primary) IaaS and PaaS. Experience in configuring and managing the GCP infrastructure environment components Foundation components – Networking (VPC, VPN, Interconnect, Firewall and Routes), IAM, Folder Structure, Organization Policy, VPC Service Control, Security Command Center etc. Application Components - BigQuery, Cloud Composer, Cloud Storage, Google Kubernetes Engine (GKE), Compute Engine, Cloud SQL, Cloud Monitoring, Dataproc, Data Fusion, Big Table, Dataflow etc. Design and implement Identity and Access Management (IAM) policies, custom roles, and service accounts across GCP projects and organizations. Implement and maintain Workload Identity Federation, IAM Conditions, and least-privilege access models. Integrate Google Cloud audit logs, access logs, and security logs with enterprise SIEM tools (e.g., Splunk, Chronicle, QRadar, or Exabeam). Configure Cloud Logging, Cloud Audit Logs, and Pub/Sub pipelines for log export to SIEM. Collaborate with the Security Operations Center (SOC) to define alerting rules and dashboards based on IAM events and anomalies. Participate in threat modeling and incident response planning involving IAM and access events. Maintain compliance with regulatory and internal security standards (e.g., CIS GCP Benchmark, NIST, ISO 27001). Monitor and report on IAM posture, access drift, and misconfigurations. Support periodic access reviews and identity governance requirements. Required Skills and Abilities: Mandatory Skills – GCP Networking (VPC, Firewall, Routes & VPN),CI/CD Pipelines, Terraform, Shell Scripting/Python Scripting Secondary Skills – Composer, BigQuery, GKE, Dataproc Good To Have - Certifications in any of the following: GCP Professional Cloud Architect, Cloud Devops Engineer, Cloud Security Engineer, Cloud Network Engineer Participate in incident discussions and work with the Team towards resolving platform issues. Good verbal and written communication skills. Ability to communicate with customers, developers, and other stakeholders. Mentor and guide team members Good Presentation skills Strong Team Player About Us: We are a global Leader in the Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging Automation. We have our own products! Eagle – Data warehouse Assessment & Migration Planning Product Raven – Automated Workload Conversion Product Pelican – Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.
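To illustrate the IAM-event monitoring responsibilities above, here is a rough sketch that pulls recent IAM policy changes from Cloud Audit Logs via `gcloud logging read` and prints who changed what. The project ID and lookback window are assumptions.

```python
# Sketch: surface recent SetIamPolicy events from Cloud Audit Logs for review/SIEM triage.
import json
import subprocess

def recent_iam_changes(project: str, limit: int = 20) -> list[dict]:
    log_filter = 'protoPayload.methodName="SetIamPolicy"'
    out = subprocess.run(
        [
            "gcloud", "logging", "read", log_filter,
            "--project", project,
            "--limit", str(limit),
            "--format", "json",
            "--freshness", "1d",
        ],
        check=True,
        capture_output=True,
        text=True,
    ).stdout
    return json.loads(out)

if __name__ == "__main__":
    for entry in recent_iam_changes("my-gcp-project"):
        payload = entry.get("protoPayload", {})
        who = payload.get("authenticationInfo", {}).get("principalEmail", "unknown")
        what = payload.get("resourceName", "unknown resource")
        print(f"{entry.get('timestamp')}: {who} changed IAM on {what}")
```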

GCP Data Engineer

Pune

4 - 7 years

INR 0.5 - 0.5 Lacs P.A.

Hybrid

Full Time

Job Title: GCP Data Engineer
Location: Pune, India
Experience: 4 to 7 Years
Job Type: Full-Time

Job Summary: We are looking for a highly skilled GCP Data Engineer with 4 to 7 years of experience to join our data engineering team in Pune. The ideal candidate should have strong experience working with Google Cloud Platform (GCP), including Dataproc and Cloud Composer (Apache Airflow), and must be proficient in Python, SQL, and Apache Spark. The role involves designing, building, and optimizing data pipelines and workflows to support enterprise-grade analytics and data science initiatives.

Key Responsibilities:
- Design and implement scalable and efficient data pipelines on GCP, leveraging Dataproc, BigQuery, Cloud Storage, and Pub/Sub.
- Develop and manage ETL/ELT workflows using Apache Spark, SQL, and Python.
- Orchestrate and automate data workflows using Cloud Composer (Apache Airflow) (see the sketch below).
- Build batch and streaming data processing jobs that integrate data from various structured and unstructured sources.
- Optimize pipeline performance and ensure cost-effective data processing.
- Collaborate with data analysts, scientists, and business teams to understand data requirements and deliver high-quality solutions.
- Implement and monitor data quality checks, validation, and transformation logic.

Required Skills:
- Strong hands-on experience with Google Cloud Platform (GCP).
- Proficiency with Dataproc for big data processing and Apache Spark.
- Expertise in Python and SQL for data manipulation and scripting.
- Experience with Cloud Composer / Apache Airflow for workflow orchestration.
- Knowledge of data modeling, warehousing, and pipeline best practices.
- Solid understanding of ETL/ELT architecture and implementation.
- Strong troubleshooting and problem-solving skills.

Preferred Qualifications:
- GCP Data Engineer or Cloud Architect certification.
- Familiarity with BigQuery, Dataflow, and Pub/Sub.
- Experience with CI/CD and DevOps tools in data engineering workflows.
- Exposure to Agile methodologies and team collaboration tools.
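A sketch, under assumptions, of the orchestration pattern this role describes: a Cloud Composer DAG submitting a PySpark job to Dataproc. The project, region, cluster, and GCS paths are placeholders, and the Google provider package must be installed in the Composer environment.

```python
# Sketch: Cloud Composer DAG that submits a PySpark job to an existing Dataproc cluster.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

PYSPARK_JOB = {
    "reference": {"project_id": "my-project"},            # placeholder project
    "placement": {"cluster_name": "etl-cluster"},         # placeholder cluster
    "pyspark_job": {"main_python_file_uri": "gs://my-bucket/jobs/transform_orders.py"},
}

with DAG(
    dag_id="dataproc_daily_transform",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_transform = DataprocSubmitJobOperator(
        task_id="run_transform",
        job=PYSPARK_JOB,
        region="us-central1",
        project_id="my-project",
    )
```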

GCP Data Engineer

Pune

4 - 7 years

INR 18.0 - 20.0 Lacs P.A.

Hybrid

Full Time

Job Title: GCP Data Engineer
Location: Pune, India
Experience: 4 to 7 Years
Job Type: Full-Time

Job Summary: We are looking for a highly skilled GCP Data Engineer with 4 to 7 years of experience to join our data engineering team in Pune. The ideal candidate should have strong experience working with Google Cloud Platform (GCP), including Dataproc and Cloud Composer (Apache Airflow), and must be proficient in Python, SQL, and Apache Spark. The role involves designing, building, and optimizing data pipelines and workflows to support enterprise-grade analytics and data science initiatives.

Key Responsibilities:
- Design and implement scalable and efficient data pipelines on GCP, leveraging Dataproc, BigQuery, Cloud Storage, and Pub/Sub.
- Develop and manage ETL/ELT workflows using Apache Spark, SQL, and Python.
- Orchestrate and automate data workflows using Cloud Composer (Apache Airflow).
- Build batch and streaming data processing jobs that integrate data from various structured and unstructured sources.
- Optimize pipeline performance and ensure cost-effective data processing.
- Collaborate with data analysts, scientists, and business teams to understand data requirements and deliver high-quality solutions.
- Implement and monitor data quality checks, validation, and transformation logic.

Required Skills:
- Strong hands-on experience with Google Cloud Platform (GCP).
- Proficiency with Dataproc for big data processing and Apache Spark.
- Expertise in Python and SQL for data manipulation and scripting.
- Experience with Cloud Composer / Apache Airflow for workflow orchestration.
- Knowledge of data modeling, warehousing, and pipeline best practices.
- Solid understanding of ETL/ELT architecture and implementation.
- Strong troubleshooting and problem-solving skills.

Preferred Qualifications:
- GCP Data Engineer or Cloud Architect certification.
- Familiarity with BigQuery, Dataflow, and Pub/Sub.

Interested candidates can send your resume to pranitathapa@onixnet.com.

Onix

Information Technology & Services

Cleveland

200+ Employees

22 Jobs
