Hyderabad
INR 10.0 - 20.0 Lacs P.A.
Hybrid
Full Time
Job Description: We are seeking a highly motivated and experienced ML Engineer/Data Scientist to join our growing ML/GenAI team. You will play a key role in designing, developing, and productionizing ML applications by evaluating, training, and/or fine-tuning models, and you will be central to building GenAI-based solutions for our customers. As a senior member of the team, you will take ownership of projects, collaborating with engineers and stakeholders to ensure successful project delivery.

What we're looking for:
- At least 3 years of experience designing and building AI applications for customers and deploying them into production.
- At least 5 years of software engineering experience building secure, scalable, and performant applications for customers.
- Experience with AI-based document extraction, conversational AI, vision AI, NLP, or GenAI.
- Design, develop, and operationalize existing ML models by fine-tuning and personalizing them.
- Evaluate machine learning models and perform the necessary tuning.
- Develop prompts that instruct an LLM to generate relevant and accurate responses.
- Collaborate with data scientists and engineers to analyze and preprocess datasets for prompt development, including data cleaning, transformation, and augmentation.
- Conduct thorough analysis to evaluate LLM responses and iteratively modify prompts to improve LLM performance.
- Hands-on customer experience with RAG solutions or fine-tuning of LLM models.
- Build and deploy scalable machine learning pipelines on GCP or an equivalent cloud platform, involving data warehouses, machine learning platforms, dashboards, or CRM tools.
- Experience with the end-to-end workflow, including but not limited to data cleaning, exploratory data analysis, handling outliers and class imbalances, analyzing data distributions (univariate, bivariate, multivariate), transforming numerical and categorical data into features, feature selection, model selection, model training, and deployment.
- Proven experience building and deploying machine learning models in production environments for real-life applications.
- Good understanding of natural language processing, computer vision, or other deep learning techniques.
- Expertise in Python, NumPy, Pandas, and various ML libraries (e.g., XGBoost, TensorFlow, PyTorch, Scikit-learn, LangChain).
- Familiarity with Google Cloud or another cloud platform and its machine learning services.
- Excellent communication, collaboration, and problem-solving skills.
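The RAG pattern named in this posting can be illustrated with a minimal sketch: retrieve the most relevant document for a query, then build a grounded prompt for an LLM. Everything here is an illustrative assumption (the toy corpus, the naive word-overlap scorer, the prompt wording); a real system would use embeddings and an actual LLM call.

```python
def score(query: str, doc: str) -> int:
    """Naive relevance score: number of shared lowercase words (illustrative only)."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, corpus: list) -> str:
    """Return the corpus document with the highest overlap score."""
    return max(corpus, key=lambda doc: score(query, doc))

def build_prompt(query: str, context: str) -> str:
    """Compose a grounded prompt instructing the LLM to answer only from context."""
    return (
        "Answer the question using only the context below.\n"
        f"Context: {context}\n"
        f"Question: {query}\n"
    )

# Hypothetical two-document corpus standing in for a vector store.
corpus = [
    "Invoices are processed within 30 days of receipt.",
    "Refunds require manager approval above 500 USD.",
]
query = "invoices processed within how many days"
context = retrieve(query, corpus)
prompt = build_prompt(query, context)
```

In production the `score`/`retrieve` pair would be replaced by embedding similarity search, and `prompt` would be sent to the LLM; the prompt-iteration work described above is then tuning `build_prompt` against evaluated responses.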
Hyderabad
INR 10.0 - 16.0 Lacs P.A.
Hybrid
Full Time
About Company: Egen is a fast-growing and entrepreneurial company with a data-first mindset. We bring together the best engineering talent working with the most advanced technology platforms, including Google Cloud and Salesforce, to help clients drive action and impact through data and insights. We are committed to being a place where the best people choose to work so they can apply their engineering and technology expertise to envision what is next for how data and platforms can change the world for the better. We are dedicated to learning, thrive on solving tough problems, and continually innovate to achieve fast, effective results.

Job Summary: We are seeking a talented and passionate Python Developer to join our dynamic team. In this role, you will be instrumental in designing, developing, and deploying scalable and efficient applications on the Google Cloud Platform. You will have the opportunity to work on exciting projects and contribute to the growth and innovation of our products and services. You will also provide mentorship to other engineers and engage with clients to understand their needs and deliver effective solutions.

Responsibilities:
- Design, develop, and maintain robust and scalable applications using Python.
- Build and consume RESTful APIs using FastAPI.
- Deploy and manage applications on the Google Cloud Platform (GCP).
- Collaborate effectively with cross-functional teams, including product managers, designers, and other engineers.
- Write clean, well-documented, and testable code.
- Participate in code reviews to ensure code quality and adherence to best practices.
- Troubleshoot and debug issues in development and production environments.
- Create clear and effective documentation.
- Stay up to date with the latest industry trends and technologies.
- Assist junior team members.

Required Skills and Experience:
- Proven experience as a Python Developer.
- Solid understanding and practical experience with the FastAPI framework.
- Hands-on experience with the Google Cloud Platform (GCP) and its core services.
- A minimum of 4.5 to 6 years of relevant work experience in software development.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration abilities.
- Bachelor's degree in Computer Science or a related field (or equivalent practical experience).
- Ability to write and execute unit test cases.
- Ability to discuss and propose architectural changes.
- Knowledge of security best practices.

Optional Skills (a plus):
- Experience with a front-end framework such as Angular, React, Vue.js, etc.
- Familiarity with DevOps principles and practices.
- Experience with infrastructure-as-code tools like Terraform.
- Knowledge of containerization technologies such as Docker and Kubernetes.
- Experience with CI/CD pipelines.
Hyderabad
INR 4.5 - 8.0 Lacs P.A.
Hybrid
Full Time
About the Role: We are looking for a creative and detail-oriented Motion Graphic Designer to join our team. In this role, the designer will play a key part in strengthening our brand storytelling and enhancing our sales enablement efforts through high-quality motion design.

What will they do:
- Strengthen brand storytelling: Develop compelling, high-quality motion graphics that communicate our brand's message in a visually impactful way, helping to foster trust, credibility, and loyalty among our audience.
- Support sales enablement: Create visually engaging sales demos, explainer videos, and client story content that empower our sales team to communicate value, drive engagement, and close deals more effectively.
- Collaborate with the team: Work closely with the Marketing team.
- End-to-end production: Manage motion graphic projects from concept to completion, including storyboarding, animation, editing, and final delivery.

What are we looking for:
- 2-5 years of experience in motion design, with a strong portfolio showcasing animation, storytelling, and branding skills.
- Proficiency in Adobe Creative Suite (After Effects, Premiere Pro, Illustrator, Photoshop).
- Strong understanding of visual hierarchy, typography, and brand aesthetics.
- Ability to manage multiple projects with a high level of attention to detail and timeliness.
- Self-motivated, collaborative, and open to feedback.
Hyderabad
INR 14.0 - 24.0 Lacs P.A.
Hybrid
Full Time
Job Overview: We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.

Key Responsibilities:
- Design, develop, test, and maintain scalable ETL data pipelines using Python.
- Work extensively with Google Cloud Platform (GCP) services such as:
  - Dataflow for real-time and batch data processing
  - Cloud Functions for lightweight serverless compute
  - BigQuery for data warehousing and analytics
  - Cloud Composer (based on Apache Airflow) for orchestration of data workflows
  - Google Cloud Storage (GCS) for managing data at scale
  - IAM for access control and security
  - Cloud Run for containerized applications
- Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
- Implement and enforce data quality checks, validation rules, and monitoring.
- Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
- Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
- Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
- Document pipeline designs, data flow diagrams, and operational support procedures.

Required Skills:
- 4-6 years of hands-on experience in Python for backend or data engineering projects.
- Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience working with version control systems like GitHub and knowledge of CI/CD practices.
- Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).

Good to Have (Optional Skills):
- Experience working with the Snowflake cloud data platform.
- Hands-on knowledge of Databricks for big data processing and analytics.
- Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools.

Additional Details:
- Excellent problem-solving and analytical skills.
- Strong communication skills and ability to collaborate in a team environment.
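The transformation, cleansing, and data quality checks this role describes can be sketched in plain Python, independent of any GCP service. The field names and validation rules below are hypothetical examples, not requirements from the posting.

```python
# Minimal sketch of a per-record cleanse/validate step in an ETL pipeline.
from datetime import datetime
from typing import Optional

def clean_record(raw: dict) -> Optional[dict]:
    """Validate and normalize one record; return None if it fails quality checks."""
    # Data quality check: required fields must be present and non-empty.
    if not raw.get("id") or not raw.get("amount"):
        return None
    try:
        # Transformation: coerce types; malformed values fail validation.
        amount = float(raw["amount"])
        ts = datetime.strptime(raw["date"], "%Y-%m-%d")
    except (ValueError, KeyError):
        return None
    # Cleansing: trim strings, normalize precision and date format.
    return {
        "id": str(raw["id"]).strip(),
        "amount": round(amount, 2),
        "date": ts.date().isoformat(),
    }

rows = [
    {"id": " 1 ", "amount": "10.5", "date": "2024-01-05"},
    {"id": "", "amount": "3.0", "date": "2024-01-06"},  # rejected: empty id
]
cleaned = [c for r in rows if (c := clean_record(r))]
```

In a Dataflow pipeline this function would become the body of a `ParDo`/`Map` step, with rejected records routed to a dead-letter output for monitoring instead of being silently dropped.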
Hyderabad
INR 5.0 - 8.5 Lacs P.A.
Hybrid
Full Time
Job Summary: We are seeking an Accounting Specialist to support our finance operations, ensuring accurate financial reporting, compliance with US GAAP, and seamless collaboration with international teams. The ideal candidate will be responsible for various accounting functions, including accounts receivable, accounts payable, bank reconciliations, and intercompany transactions. This role requires a strong accounting background and proficiency in MS Office and ERP systems.

Roles & Responsibilities:
• Invoice Processing: Prepare invoices to be sent out to customers.
• Payment Processing: Receive and record customer payments accurately.
• Process vendor invoices and reconciliations.
• Handle month-end closing activities, including reconciliations and accruals.
• Collaborate with US-based teams for approvals and vendor communications.
• Perform bank reconciliations and resolve discrepancies.

Requirements:
• Strong accounting background with expertise in AR, AP, and reconciliations.
• Proficiency in MS Office (Excel, Word, etc.).
• Excellent verbal and written communication skills for effective interaction with US teams.
• Experience in ERP systems (FinancialForce or similar preferred).
Hyderabad
INR 15.0 - 30.0 Lacs P.A.
Hybrid
Full Time
Role & responsibilities:
- Analyze business requirements and translate them into customized solutions using the Salesforce platform.
- Implement integrations and automations using Mulesoft, following best practices.
- Implement Salesforce solutions that adhere to platform best practices and perform peer code reviews, including custom platform development (Apex, Lightning Components, apps, mobile, and custom front ends, etc.), integrations with other systems (often through middleware), and complex data migrations.
- Customize solutions while adhering to Salesforce Governor Limits.
- Participate in sprint planning, designing steps and modules, and defining timelines.
- Prepare Solution Design Documents.
- Prepare test cases; test the stability and functionality of applications.
- Troubleshoot and fix bugs.
- Design and develop custom objects, components, triggers, flows, and pages.
- Maintain the security and integrity of application software.
- Mentor the team and conduct training as needed.

Preferred Credentials:
- Salesforce (Mulesoft) Hyperautomation Specialist
- Salesforce Administrator
- Salesforce Platform App Builder
- Salesforce Platform Developer I
- Salesforce Agentforce Specialist (good to have)

Requirements:
- Bachelor's degree in Computer Science or Software Engineering.
- 8+ years of experience with application and software development.
- Experience in developing customer-facing interfaces.
- Advanced knowledge of Salesforce CRM platforms, with experience working on Sales, Service, and Experience Clouds.
- 4+ years of experience implementing integrations and automations using Mulesoft, with at least 2 customer implementations using RPA.
- Experience working on Agentforce implementations or accelerators (good to have).
- Experience working on Salesforce Industry Clouds (good to have).
- Good knowledge of Lightning Web Components (LWC).
- Proficiency in MySQL, Apex, Flows, JavaScript, Native, and Visualforce.
- Good communication skills.
- Ability to solve high-level software and application issues.
- Good understanding of SFDC, SCRUM, and the Salesforce Project Execution process.
- Active participation in Salesforce and Mulesoft communities and an interest in upskilling.
Hyderabad
INR 10.0 - 20.0 Lacs P.A.
Hybrid
Full Time
Job Summary: We are seeking a highly skilled and experienced Senior Infrastructure Engineer to join our dynamic team. The ideal candidate will be passionate about building and maintaining complex systems, with a holistic approach to architecture. You will play a key role in designing, implementing, and managing cloud infrastructure, ensuring scalability, availability, security, and optimal performance. You will also provide mentorship to other engineers and engage with clients to understand their needs and deliver effective solutions.

Responsibilities:
- Design, architect, and implement scalable, highly available, and secure infrastructure solutions, primarily on Amazon Web Services (AWS).
- Develop and maintain Infrastructure as Code (IaC) using Terraform or AWS CDK for enterprise-scale maintainability and repeatability.
- Implement robust access control via IAM roles and policy orchestration, ensuring least privilege and auditability across multi-environment deployments.
- Contribute to secure, scalable identity and access patterns, including OAuth2-based authorization flows and dynamic IAM role mapping across environments.
- Support deployment of infrastructure Lambda functions.
- Troubleshoot issues and collaborate with cloud vendors on managed service reliability and roadmap alignment.
- Utilize Kubernetes deployment tools such as Helm/Kustomize in combination with GitOps tools such as ArgoCD for container orchestration and management.
- Design and implement CI/CD pipelines using platforms like GitHub, GitLab, Bitbucket, Cloud Build, Harness, etc., with a focus on rolling deployments, canaries, and blue/green deployments. Ensure auditability and observability of pipeline states.
- Implement security best practices, audit, and compliance requirements within the infrastructure.
- Engage with clients to understand their technical and business requirements, and provide tailored solutions.
- If needed, lead agile ceremonies and project planning, including developing agile boards and backlogs with support from our Service Delivery Leads.
- Troubleshoot and resolve complex infrastructure issues.

Qualifications:
- 6+ years of experience in an Infrastructure Engineering or similar role.
- Extensive experience with Amazon Web Services (AWS).
- Proven ability to architect for scale, availability, and high-performance workloads.
- Deep knowledge of Infrastructure as Code (IaC) with Terraform.
- Strong experience with Kubernetes and related tools (Helm, Kustomize, ArgoCD).
- Solid understanding of Git, branching models, CI/CD pipelines, and deployment strategies.
- Experience with security, audit, and compliance best practices.
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills, with the ability to engage with both technical and non-technical stakeholders.
- Experience in technical mentoring, team-forming, and fostering self-organization and ownership.
- Experience with client relationship management and project planning.
- Relevant certifications (e.g., Certified Kubernetes Administrator, AWS Certified Machine Learning Engineer - Associate, AWS Certified Data Engineer - Associate, AWS Certified Developer - Associate, etc.).
- Software development experience (e.g., Terraform, Python).
- Experience/exposure with machine learning infrastructure.

Education: B.Tech/BE in computer science, a related field, or equivalent experience.
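The least-privilege IAM work described in this posting can be illustrated by building an AWS IAM policy document programmatically. The bucket name and the specific read-only actions are hypothetical examples; a real deployment would generate such documents via Terraform or the CDK rather than by hand.

```python
# Sketch: a least-privilege, read-only S3 policy document as a Python dict.
import json

def read_only_bucket_policy(bucket: str) -> dict:
    """Return an IAM policy granting only the read actions needed for one bucket."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                # Only the actions actually required -- no wildcards.
                "Action": ["s3:GetObject", "s3:ListBucket"],
                "Resource": [
                    f"arn:aws:s3:::{bucket}",      # the bucket (for ListBucket)
                    f"arn:aws:s3:::{bucket}/*",    # its objects (for GetObject)
                ],
            }
        ],
    }

policy_json = json.dumps(read_only_bucket_policy("example-data-bucket"), indent=2)
```

Scoping `Action` and `Resource` this tightly, per environment and per role, is what makes the "least-privilege and auditability across multi-environment deployments" requirement tractable.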
Hyderabad
INR 20.0 - 35.0 Lacs P.A.
Hybrid
Full Time
Job Summary: We are seeking a highly skilled and experienced Lead Infrastructure Engineer to join our dynamic team. The ideal candidate will be passionate about building and maintaining complex systems, with a holistic approach to architecting infrastructure that survives and thrives in production. You will play a key role in designing, implementing, and managing cloud infrastructure, ensuring scalability, availability, security, and optimal performance versus spend. You will also provide technical leadership and mentorship to other engineers, and engage with clients to understand their needs and deliver effective solutions.

Responsibilities:
- Design, architect, and implement scalable, highly available, and secure infrastructure solutions, primarily on Amazon Web Services (AWS).
- Develop and maintain Infrastructure as Code (IaC) using Terraform or AWS CDK for enterprise-scale maintainability and repeatability.
- Implement robust access control via IAM roles and policy orchestration, ensuring least privilege and auditability across multi-environment deployments.
- Contribute to secure, scalable identity and access patterns, including OAuth2-based authorization flows and dynamic IAM role mapping across environments.
- Support deployment of infrastructure Lambda functions.
- Troubleshoot issues and collaborate with cloud vendors on managed service reliability and roadmap alignment.
- Utilize Kubernetes deployment tools such as Helm/Kustomize in combination with GitOps tools such as ArgoCD for container orchestration and management.
- Design and implement CI/CD pipelines using platforms like GitHub, GitLab, Bitbucket, Cloud Build, Harness, etc., with a focus on rolling deployments, canaries, and blue/green deployments. Ensure auditability and observability of pipeline states.
- Implement security best practices, audit, and compliance requirements within the infrastructure.
- Provide technical leadership, mentorship, and training to engineering staff.
- Engage with clients to understand their technical and business requirements, and provide tailored solutions.
- If needed, lead agile ceremonies and project planning, including developing agile boards and backlogs with support from our Service Delivery Leads.
- Troubleshoot and resolve complex infrastructure issues.
- Potentially participate in pre-sales activities and provide technical expertise to sales teams.

Qualifications:
- 10+ years of experience in an Infrastructure Engineer or similar role.
- Extensive experience with Amazon Web Services (AWS).
- Proven ability to architect for scale, availability, and high-performance workloads.
- Ability to plan and execute zero-disruption migrations.
- Experience with enterprise IAM and familiarity with authentication technologies such as OAuth2 and OIDC.
- Deep knowledge of Infrastructure as Code (IaC) with Terraform and/or AWS CDK.
- Strong experience with Kubernetes and related tools (Helm, Kustomize, ArgoCD).
- Solid understanding of Git, branching models, CI/CD pipelines, and deployment strategies.
- Experience with security, audit, and compliance best practices.
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills, with the ability to engage with both technical and non-technical stakeholders.
- Experience in technical leadership, mentoring, team-forming, and fostering self-organization and ownership.
- Experience with client relationship management and project planning.
- Relevant certifications (for example, Certified Kubernetes Administrator, AWS Certified Solutions Architect - Professional, AWS Certified DevOps Engineer - Professional, etc.).
- Software development experience (for example, Terraform, Python).
- Experience with machine learning infrastructure.

Education: B.Tech/BE in computer science, a related field, or equivalent experience.
Hyderabad
INR 10.0 - 18.0 Lacs P.A.
Hybrid
Full Time
About the Role: We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.

Key Responsibilities:
- Design, develop, test, and maintain scalable ETL data pipelines using Python.
- Work extensively with Google Cloud Platform (GCP) services such as:
  - Dataflow for real-time and batch data processing
  - Cloud Functions for lightweight serverless compute
  - BigQuery for data warehousing and analytics
  - Cloud Composer (based on Apache Airflow) for orchestration of data workflows
  - Google Cloud Storage (GCS) for managing data at scale
  - IAM for access control and security
  - Cloud Run for containerized applications
- Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
- Implement and enforce data quality checks, validation rules, and monitoring.
- Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
- Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
- Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
- Document pipeline designs, data flow diagrams, and operational support procedures.

Required Skills:
- 4-6 years of hands-on experience in Python for backend or data engineering projects.
- Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience working with version control systems like GitHub and knowledge of CI/CD practices.
- Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).

Good to Have (Optional Skills):
- Experience working with the Snowflake cloud data platform.
- Hands-on knowledge of Databricks for big data processing and analytics.
- Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools.
Hyderabad
INR 16.0 - 30.0 Lacs P.A.
Hybrid
Full Time
About the Role: Egen is seeking a proactive and versatile Technical Project Manager to lead multiple internal IT Applications teams. The IT Applications Technical Project Manager is responsible for managing the critical applications, systems, and tools used by 500+ consultants delivering 100+ concurrent AI and Data Analytics professional services projects. This role requires the ability to act as a Product Owner to interface with business owners, understand business use cases, and define appropriate upgrades and new features; as a Technical Project Manager to provide technical oversight, direct software design and development, and guide technical direction; and as a Scrum Master to plan and manage multiple teams implementing biweekly sprints and releases to production.

Key Responsibilities:

Product Ownership:
- Understand how critical business applications function to meet business needs.
- Work closely with internal stakeholders and business owners to understand their application needs, translate them into clear and actionable user stories, define acceptance criteria, and drive innovation and continuous improvement.
- Design new processes within existing business applications to meet key business priorities.

Scrum Master:
- Lead multiple scrum teams using Agile best practices in requirements gathering, user story documentation, quality assurance, and task lifecycle management.
- Plan biweekly sprints to build key functionality to meet business needs.
- Facilitate Scrum ceremonies (e.g., daily stand-ups, sprint planning, retrospectives) to ensure maximum team effectiveness.
- Manage and prioritize application backlogs, including bug fixes, enhancement requests, and operational improvements, ensuring they align with business needs and user impact.

Project Management:
- Create and manage project plans, resource allocation, timelines, and deliverables.
- Oversee project planning, execution, and delivery for longer-term system upgrades and implementations.
- Manage multiple concurrent teams delivering features to multiple business owners.
- Identify and communicate potential risks and issues, develop mitigation strategies, and manage them to closure.
- Create status reports as needed.

Stakeholder Communication:
- Coordinate with different departments (e.g., Operations, Finance, HR, Delivery) to communicate progress, manage expectations, and gather feedback.
- Communicate project status and issues to stakeholders.
- Develop and communicate short- and long-term roadmaps.
- Manage executive prioritization of the backlog.
- Coordinate business users developing test suites and executing UAT of candidate releases.

Resource Management:
- Allocate resources to projects and balance workload among team members.

Helpdesk Support:
- Oversee the IT helpdesk process and ensure SLAs are met.

Reporting and Analytics:
- Define and measure performance KPIs.
- Create and distribute regular reports on project status, resource utilization, financial performance, etc.
- Analyze data to identify trends, issues, and opportunities for improvement.
- Provide insights and recommendations to senior management.

Process Improvement:
- Identify areas for process improvement within the PSA system.
- Implement best practices and new tools to enhance efficiency.
- Train team members on new processes and tools.

Required Skills and Qualifications:
- Project Management: Strong project management skills and experience with project management tools (e.g., Asana, Smartsheet, Jira, MS Project).
- Agile Development: Strong experience planning and managing Agile sprints, defining user stories, managing team capacity, and managing backlogs.
- Analytical Skills: Ability to analyze data, generate reports, and provide actionable insights.
- Technical Proficiency: Experience managing Salesforce implementations; familiarity with PSA software (e.g., Mavenlink, NetSuite OpenAir), FinancialForce PSA (Certinia) preferred; experience using reporting and analytics tools (Looker, Tableau, CRMA); experience with Google Cloud Platform is a plus.
- Communication: Excellent verbal and written communication skills.
- Organizational Skills: Strong organizational skills and attention to detail.
- Problem-Solving: Ability to identify issues and develop effective solutions.
- Collaboration: Ability to work effectively in a team environment and collaborate with various stakeholders.
- Adaptability: Ability to adapt to changing priorities and manage multiple tasks simultaneously.
- Global Collaboration: Ability to work synchronously and asynchronously with stakeholders around the globe.