
16 CI/CD Workflows Jobs

JobPe aggregates these listings for easy access, but applications are submitted directly on the original job portal.

2.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You will be working as a .NET Core & API Developer at Brainmagic, a renowned Mobile Application and Web Application Development company with over two decades of experience in delivering technology solutions. Your primary responsibilities will involve back-end web development, software development, applying object-oriented programming principles, and working with databases. You will be expected to deliver high-quality code and effectively integrate .NET Core technologies into a variety of projects.

To excel in this role, you should have 2 to 3 years of experience in .NET Core Web API development and a strong command of C#, Entity Framework Core, and LINQ. Proficiency in object-oriented programming (OOP) principles, a solid understanding of databases, and hands-on experience with RESTful API design and integration are essential. You should also have sound knowledge of SQL Server, database design principles, HTTP protocols, JSON, and secure API practices such as JWT and OAuth. Familiarity with Git version control and CI/CD workflows is an added advantage.

As part of the team at Brainmagic, you will work on customized projects for global clients in a collaborative, learning-oriented environment. We offer a competitive salary with performance-based growth and long-term career prospects in a stable and innovative company. If you hold a Bachelor's degree in Computer Science, Engineering, or a related field, have excellent problem-solving and analytical skills, and can work effectively in a team-oriented environment, we encourage you to apply and join our dynamic team at Brainmagic.
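For illustration only: the "secure API practices such as JWT" mentioned above boil down to issuing a signed, short-lived token and verifying its signature and expiry on every request. The sketch below shows that flow in Python with the PyJWT library (the role itself is .NET/C#, so this is a language-neutral illustration, not Brainmagic's code); the secret, claims, and expiry are placeholder values.

```python
# Illustrative only: a minimal JWT issue/verify flow using the PyJWT library.
# The secret, claims, and expiry are placeholders, not any company's configuration.
import datetime

import jwt  # pip install PyJWT

SECRET_KEY = "replace-with-a-strong-secret"


def issue_token(user_id: str) -> str:
    """Create a short-lived signed token for an authenticated user."""
    payload = {
        "sub": user_id,
        "exp": datetime.datetime.utcnow() + datetime.timedelta(minutes=30),
    }
    return jwt.encode(payload, SECRET_KEY, algorithm="HS256")


def verify_token(token: str) -> dict:
    """Validate signature and expiry; raises jwt.InvalidTokenError on failure."""
    return jwt.decode(token, SECRET_KEY, algorithms=["HS256"])


if __name__ == "__main__":
    token = issue_token("user-123")
    print(verify_token(token)["sub"])  # -> user-123
```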

Posted 2 days ago

3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

The contextualization platform enables large-scale data integration and entity matching across heterogeneous sources. The current engineering focus is to modernize the architecture for better scalability and orchestration compatibility, refactor core services, and lay the foundation for future AI-based enhancements. This is a pivotal development initiative with clear roadmap milestones and direct alignment with a multi-year digital transformation strategy.

We are looking for a skilled and motivated Senior Backend Engineer with strong expertise in Kotlin to join a newly established scrum team responsible for enhancing a core data contextualization platform. This service plays a central role in associating and matching data from diverse sources - time series, equipment, documents, 3D objects - into a unified data model. You will lead backend development efforts to modernize and scale the platform by integrating with an updated data architecture and orchestration framework. This is a high-impact role contributing to a long-term roadmap focused on scalable, maintainable, and secure industrial software.

Key Responsibilities:
- Design, develop, and maintain scalable, API-driven backend services using Kotlin.
- Align backend systems with modern data modeling and orchestration standards.
- Collaborate with engineering, product, and design teams to ensure seamless integration across the broader data platform.
- Implement and refine RESTful APIs following established design guidelines.
- Participate in architecture planning, technical discovery, and integration design for improved platform compatibility and maintainability.
- Conduct load testing, improve unit test coverage, and contribute to reliability engineering efforts.
- Drive software development best practices, including code reviews, documentation, and CI/CD process adherence.
- Ensure compliance with multi-cloud design standards and the use of infrastructure-as-code tooling (Kubernetes, Terraform).

Qualifications:
- 3+ years of backend development experience, with a strong focus on Kotlin.
- Proven ability to design and maintain robust, API-centric microservices.
- Hands-on experience with Kubernetes-based deployments, cloud-agnostic infrastructure, and modern CI/CD workflows.
- Solid knowledge of PostgreSQL, Elasticsearch, and object storage systems.
- Strong understanding of distributed systems, data modeling, and software scalability principles.
- Excellent communication skills and the ability to work in a cross-functional, English-speaking environment.
- Bachelor's or Master's degree in Computer Science or a related discipline.

Bonus Qualifications:
- Experience with Python for auxiliary services, data processing, or SDK usage.
- Knowledge of data contextualization or entity resolution techniques.
- Familiarity with 3D data models, industrial data structures, or hierarchical asset relationships.
- Exposure to LLM-based matching or AI-enhanced data processing (not required, but a plus).
- Experience with Terraform, Prometheus, and scalable backend performance testing.

About the role and key responsibilities: You will develop Data Fusion - a robust, state-of-the-art SaaS for industrial data - and solve concrete industrial data problems by designing and implementing delightful APIs and robust services on top of it. Examples include integrating data sources into the platform in a secure and scalable way and enabling high-performance data science pipelines. You will work with application teams to ensure a delightful user experience that helps users solve complex real-world problems that have not been solved before, work with distributed open-source software such as Kubernetes, Kafka, and Spark to build scalable and performant solutions, work with databases and storage systems such as PostgreSQL, Elasticsearch, and S3-API-compatible blob stores, and help shape the culture and methodology of a rapidly growing company.

At GlobalLogic, we offer a culture of caring, learning and development opportunities, interesting and meaningful work, balance and flexibility, and a high-trust organization. Join us and be part of a trusted digital engineering partner to the world's largest and most forward-thinking companies, collaborating to transform businesses and redefine industries through intelligent products, platforms, and services.
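As a rough illustration of what "entity matching across heterogeneous sources" can mean in practice - and since the posting lists Python data processing as a bonus skill - here is a hypothetical, standard-library-only sketch that associates time-series tags with equipment records by fuzzy name similarity. It is not the platform's actual matching algorithm; all names and thresholds are made up.

```python
# Hypothetical sketch of name-based entity matching, not the platform's actual logic.
# Associates time-series tags with equipment records by fuzzy string similarity.
from difflib import SequenceMatcher

equipment = ["Pump-101A", "Compressor-23", "Valve-7B"]
timeseries_tags = ["PUMP_101A_PRESSURE", "COMP_23_TEMP", "VALVE7B_FLOW"]


def similarity(a: str, b: str) -> float:
    """Ratio in [0, 1] after normalising case and separators."""
    norm = lambda s: s.lower().replace("-", "").replace("_", "")
    return SequenceMatcher(None, norm(a), norm(b)).ratio()


def match(tag: str, candidates: list[str], threshold: float = 0.5) -> str | None:
    """Return the best-matching equipment name, or None if nothing clears the threshold."""
    best = max(candidates, key=lambda c: similarity(tag, c))
    return best if similarity(tag, best) >= threshold else None


for tag in timeseries_tags:
    print(tag, "->", match(tag, equipment))
```

A production contextualization service would of course combine many more signals (units, hierarchies, document references, learned models) than a string ratio, but the shape of the problem - scoring candidate pairs and keeping those above a threshold - is the same.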

Posted 2 days ago

3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

The contextualization platform enables large-scale data integration and entity matching across heterogeneous sources. The current engineering focus is to modernize the architecture for better scalability and orchestration compatibility, refactor core services, and lay the foundation for future AI-based enhancements. This is a pivotal development initiative with clear roadmap milestones and direct alignment with a multi-year digital transformation strategy.

We are looking for a skilled and motivated Senior Backend Engineer with strong expertise in Kotlin to join a newly established scrum team responsible for enhancing a core data contextualization platform. This service plays a central role in associating and matching data from diverse sources - time series, equipment, documents, 3D objects - into a unified data model. You will lead backend development efforts to modernize and scale the platform by integrating with an updated data architecture and orchestration framework. This is a high-impact role contributing to a long-term roadmap focused on scalable, maintainable, and secure industrial software.

Key Responsibilities:
- Design, develop, and maintain scalable, API-driven backend services using Kotlin.
- Align backend systems with modern data modeling and orchestration standards.
- Collaborate with engineering, product, and design teams to ensure seamless integration across the broader data platform.
- Implement and refine RESTful APIs following established design guidelines.
- Participate in architecture planning, technical discovery, and integration design for improved platform compatibility and maintainability.
- Conduct load testing, improve unit test coverage, and contribute to reliability engineering efforts.
- Drive software development best practices, including code reviews, documentation, and CI/CD process adherence.
- Ensure compliance with multi-cloud design standards and the use of infrastructure-as-code tooling (Kubernetes, Terraform).

Qualifications:
- 3+ years of backend development experience, with a strong focus on Kotlin.
- Proven ability to design and maintain robust, API-centric microservices.
- Hands-on experience with Kubernetes-based deployments, cloud-agnostic infrastructure, and modern CI/CD workflows.
- Solid knowledge of PostgreSQL, Elasticsearch, and object storage systems.
- Strong understanding of distributed systems, data modeling, and software scalability principles.
- Excellent communication skills and the ability to work in a cross-functional, English-speaking environment.
- Bachelor's or Master's degree in Computer Science or a related discipline.

Bonus Qualifications:
- Experience with Python for auxiliary services, data processing, or SDK usage.
- Knowledge of data contextualization or entity resolution techniques.
- Familiarity with 3D data models, industrial data structures, or hierarchical asset relationships.
- Exposure to LLM-based matching or AI-enhanced data processing (not required, but a plus).
- Experience with Terraform, Prometheus, and scalable backend performance testing.

About the role and key responsibilities:
- Develop Data Fusion - a robust, state-of-the-art SaaS for industrial data.
- Solve concrete industrial data problems by designing and implementing delightful APIs and robust services on top of Data Fusion.
- Work with distributed open-source software such as Kubernetes, Kafka, Spark, and similar to build scalable and performant solutions.
- Help shape the culture and methodology of a rapidly growing company.

Posted 2 days ago

5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

The contextualization platform enables large-scale data integration and entity matching across heterogeneous sources. The current engineering focus is to modernize the architecture for better scalability and orchestration compatibility, refactor core services, and lay the foundation for future AI-based enhancements. This is a pivotal development initiative with clear roadmap milestones and direct alignment with a multi-year digital transformation strategy.

We are looking for a skilled and motivated Senior Backend Engineer with strong expertise in Kotlin to join a newly established scrum team responsible for enhancing a core data contextualization platform. This service plays a central role in associating and matching data from diverse sources - time series, equipment, documents, 3D objects - into a unified data model. You will lead backend development efforts to modernize and scale the platform by integrating with an updated data architecture and orchestration framework. This is a high-impact role contributing to a long-term roadmap focused on scalable, maintainable, and secure industrial software.

Key Responsibilities:
- Design, develop, and maintain scalable, API-driven backend services using Kotlin.
- Align backend systems with modern data modeling and orchestration standards.
- Collaborate with engineering, product, and design teams to ensure seamless integration across the broader data platform.
- Implement and refine RESTful APIs following established design guidelines.
- Participate in architecture planning, technical discovery, and integration design for improved platform compatibility and maintainability.
- Conduct load testing, improve unit test coverage, and contribute to reliability engineering efforts.
- Drive software development best practices, including code reviews, documentation, and CI/CD process adherence.
- Ensure compliance with multi-cloud design standards and the use of infrastructure-as-code tooling (Kubernetes, Terraform).

Qualifications:
- 5+ years of backend development experience, with a strong focus on Kotlin.
- Proven ability to design and maintain robust, API-centric microservices.
- Hands-on experience with Kubernetes-based deployments, cloud-agnostic infrastructure, and modern CI/CD workflows.
- Solid knowledge of PostgreSQL, Elasticsearch, and object storage systems.
- Strong understanding of distributed systems, data modeling, and software scalability principles.
- Excellent communication skills and the ability to work in a cross-functional, English-speaking environment.
- Bachelor's or Master's degree in Computer Science or a related discipline.

Bonus Qualifications:
- Experience with Python for auxiliary services, data processing, or SDK usage.
- Knowledge of data contextualization or entity resolution techniques.
- Familiarity with 3D data models, industrial data structures, or hierarchical asset relationships.
- Exposure to LLM-based matching or AI-enhanced data processing (not required, but a plus).
- Experience with Terraform, Prometheus, and scalable backend performance testing.

About the role and key responsibilities:
- Develop Data Fusion - a robust, state-of-the-art SaaS for industrial data.
- Solve concrete industrial data problems by designing and implementing delightful APIs and robust services on top of Data Fusion. Examples include integrating data sources into our platform in a secure and scalable way and enabling high-performance data science pipelines.
- Work with application teams to ensure a delightful user experience that helps users solve complex real-world problems that have not been solved before.
- Work with distributed open-source software such as Kubernetes, Kafka, Spark, and similar to build scalable and performant solutions.
- Work with databases or storage systems such as PostgreSQL, Elasticsearch, or S3-API-compatible blob stores.
- Help shape the culture and methodology of a rapidly growing company.
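The responsibilities above mention load testing. Below is a minimal, hypothetical sketch of the idea - measuring latency percentiles under concurrent requests - using Python's standard library plus the requests package. The endpoint, concurrency, and request counts are placeholders; a production effort would typically use a dedicated tool such as Locust or k6.

```python
# Minimal load-test sketch (placeholder URL); only illustrates the idea of
# measuring latency percentiles under concurrent requests.
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

import requests  # pip install requests

URL = "https://example.invalid/api/v1/assets"  # placeholder endpoint
CONCURRENCY, TOTAL_REQUESTS = 20, 200


def timed_get(_: int) -> float:
    """Issue one GET and return its wall-clock latency in seconds."""
    start = time.perf_counter()
    requests.get(URL, timeout=10)
    return time.perf_counter() - start


if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
        latencies = sorted(pool.map(timed_get, range(TOTAL_REQUESTS)))
    print(f"p50={statistics.median(latencies):.3f}s "
          f"p95={latencies[int(0.95 * len(latencies))]:.3f}s")
```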

Posted 2 days ago

5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

The contextualization platform enables large-scale data integration and entity matching across heterogeneous sources. Our current engineering focus is on modernizing the architecture for better scalability and orchestration compatibility, refactoring core services, and laying the foundation for future AI-based enhancements. This pivotal development initiative aligns directly with a multi-year digital transformation strategy and has clear roadmap milestones.

We are searching for a skilled and motivated Senior Backend Engineer with strong expertise in Kotlin to join our newly established scrum team responsible for enhancing a core data contextualization platform. This service is crucial in associating and matching data from diverse sources such as time series, equipment, documents, and 3D objects into a unified data model. As a Senior Backend Engineer, you will lead backend development efforts to modernize and scale the platform by integrating with an updated data architecture and orchestration framework. This role is high-impact, contributing to a long-term roadmap focused on scalable, maintainable, and secure industrial software.

Your key responsibilities will include designing, developing, and maintaining scalable, API-driven backend services using Kotlin; aligning backend systems with modern data modeling and orchestration standards; collaborating with engineering, product, and design teams for seamless integration; implementing and refining RESTful APIs; participating in architecture planning, technical discovery, and integration design; conducting load testing and improving unit test coverage; driving software development best practices; and ensuring compliance with multi-cloud design standards.

To qualify for this role, you should have at least 5 years of backend development experience with a strong focus on Kotlin; the ability to design and maintain robust, API-centric microservices; hands-on experience with Kubernetes-based deployments, cloud-agnostic infrastructure, and modern CI/CD workflows; solid knowledge of PostgreSQL, Elasticsearch, and object storage systems; a strong understanding of distributed systems, data modeling, and software scalability principles; excellent communication skills; and a degree in Computer Science or a related discipline. Bonus qualifications include experience with Python, knowledge of data contextualization or entity resolution techniques, familiarity with 3D data models, industrial data structures, or hierarchical asset relationships, exposure to LLM-based matching or AI-enhanced data processing, and experience with Terraform, Prometheus, and scalable backend performance testing.

In this role, you will develop Data Fusion, a robust SaaS for industrial data, and work on solving concrete industrial data problems by designing and implementing APIs and services on top of Data Fusion. You will collaborate with application teams to ensure a delightful user experience and work with open-source software such as Kubernetes, Kafka, and Spark, databases such as PostgreSQL and Elasticsearch, and storage systems such as S3-API-compatible blob stores.

At GlobalLogic, we offer a culture of caring, learning and development opportunities, interesting and meaningful work, balance and flexibility, and a high-trust organization where integrity is key. Join us as we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.

Posted 2 days ago

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You will be working as a Data Schema Designer in Chennai, focusing on designing clean, extensible, and high-performance schemas for GCP data platforms. The role is crucial in standardizing data design, enabling scalability, and ensuring cross-system consistency. Your responsibilities will include creating and maintaining unified data schema standards across BigQuery, CloudSQL, and AlloyDB; collaborating with engineering and analytics teams to identify modeling best practices; ensuring schema alignment with ingestion pipelines, transformations, and business rules; developing entity relationship diagrams and schema documentation templates; and assisting in the automation of schema deployments and version control.

To excel in this role, you must have expert knowledge of schema design principles for GCP platforms, proficiency with schema documentation tools such as DBSchema and dbt docs, a deep understanding of data normalization, denormalization, and indexing strategies, and hands-on experience with both OLTP and OLAP schemas. Preferred skills include exposure to CI/CD workflows and Git-based schema management, and experience in metadata governance and data cataloging. Soft skills such as precision and clarity in technical documentation and a collaborative mindset with attention to performance and quality are also valued.

In this role, you will be the backbone of reliable and scalable data systems, influence architectural decisions through thoughtful schema design, and work with modern cloud data stacks and enterprise data teams. Key skills: GCP, schema design principles for GCP platforms, data normalization and denormalization, indexing strategies, OLTP and OLAP schemas, schema documentation tools (e.g., DBSchema, dbt docs), Git-based schema management, CI/CD workflows, metadata governance, data cataloging, analytics, collaboration, and technical documentation.
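To make the normalization-versus-denormalization trade-off concrete, here is a small, hypothetical example of declaring a denormalized (nested, repeated) BigQuery table schema with the google-cloud-bigquery client. The project, dataset, table, and field names are placeholders, not part of the posting.

```python
# Hypothetical example of defining a denormalized BigQuery schema (nested,
# repeated fields) with the official google-cloud-bigquery client.
from google.cloud import bigquery  # pip install google-cloud-bigquery

schema = [
    bigquery.SchemaField("order_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("order_ts", "TIMESTAMP", mode="REQUIRED"),
    # Denormalized line items avoid a join at query time (an OLAP-friendly trade-off);
    # an OLTP design would normally keep them in a separate, normalized table.
    bigquery.SchemaField(
        "line_items", "RECORD", mode="REPEATED",
        fields=[
            bigquery.SchemaField("sku", "STRING", mode="REQUIRED"),
            bigquery.SchemaField("qty", "INTEGER", mode="REQUIRED"),
            bigquery.SchemaField("unit_price", "NUMERIC", mode="NULLABLE"),
        ],
    ),
]

client = bigquery.Client()  # uses application-default credentials
table = bigquery.Table("my-project.sales.orders", schema=schema)  # placeholder IDs
table = client.create_table(table, exists_ok=True)
print(f"Created {table.full_table_id} with {len(table.schema)} top-level fields")
```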

Posted 2 days ago

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Backend Developer, you will be responsible for developing and maintaining backend APIs and services using FastAPI and Flask-RESTX. You will design scalable, modular, and maintainable microservices-based solutions. Your role will involve working with PostgreSQL and MongoDB to create robust data models and efficient queries. Additionally, you will implement messaging and task workflows using RabbitMQ and integrate secure authentication and authorization flows with Auth0.

In this position, you will monitor and debug production systems using Elasticsearch and APM tools. Writing clean, testable code and actively participating in design and code reviews will also be part of your responsibilities. Collaboration with cross-functional teams across engineering, DevOps, and product departments is crucial to the success of the projects.

The ideal candidate has strong hands-on experience in Python backend development and practical knowledge of FastAPI, Flask, or Flask-RESTX. A solid understanding of, and real-world experience with, microservices architecture is essential. Proficiency in either MongoDB or PostgreSQL, along with experience in RabbitMQ for async messaging and job queues, is required, as is familiarity with API security and integration using Auth0 or similar services. The candidate should also understand observability practices using Elasticsearch and APM tools; strong debugging, performance tuning, and optimization skills are highly valued.

Nice-to-have skills include experience with SQLAlchemy and Alembic for ORM and migrations, exposure to PostgREST or GraphQL APIs, and knowledge of containerized development with Docker. Familiarity with CI/CD workflows and Git-based version control, and prior experience with event-driven, large-scale data processing systems, would be advantageous.
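As a rough sketch of the stack described above, the snippet below shows a minimal FastAPI endpoint protected by a bearer-token dependency. The token check is a deliberate placeholder: a real Auth0 integration would validate RS256-signed JWTs against the tenant's JWKS and check the audience and issuer claims.

```python
# Minimal FastAPI sketch of a protected endpoint; the token check is a placeholder,
# not a real Auth0 integration.
from fastapi import Depends, FastAPI, HTTPException
from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer

app = FastAPI()
bearer = HTTPBearer()


def current_user(creds: HTTPAuthorizationCredentials = Depends(bearer)) -> str:
    """Resolve the caller from the Authorization: Bearer <token> header."""
    token = creds.credentials
    # Placeholder check only -- replace with real signature/claims validation.
    if token != "demo-token":
        raise HTTPException(status_code=401, detail="Invalid or expired token")
    return "demo-user"


@app.get("/orders/{order_id}")
def read_order(order_id: int, user: str = Depends(current_user)) -> dict:
    """Return a dummy order owned by the authenticated user."""
    return {"order_id": order_id, "owner": user}

# Run with: uvicorn app:app --reload
# Then call with header:  Authorization: Bearer demo-token
```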

Posted 2 days ago

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

You will be responsible for designing and implementing scalable Snowflake data warehouse architectures, including schema modeling and data partitioning. You will lead or support data migration projects from on-premise or legacy cloud platforms to Snowflake, and develop ETL/ELT pipelines and data integrations using tools such as DBT, Fivetran, Informatica, and Airflow. It will be part of your role to define and implement best practices for data modeling, query optimization, and storage efficiency in Snowflake.

Collaboration with cross-functional teams, including data engineers, analysts, BI developers, and stakeholders, to align architectural solutions will be essential. You will ensure data governance, compliance, and security by implementing RBAC, masking policies, and access control within Snowflake, and work with DevOps teams to enable CI/CD pipelines, monitoring, and infrastructure as code for Snowflake environments. Optimizing resource utilization, monitoring workloads, and managing the cost-effectiveness of the platform will also be under your purview, as will staying up to date with Snowflake features, cloud vendor offerings, and best practices.

Qualifications & Skills:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- X years of experience in data engineering, data warehousing, or analytics architecture.
- 3+ years of hands-on experience in Snowflake architecture, development, and administration.
- Strong knowledge of cloud platforms (AWS, Azure, or GCP).
- Solid understanding of SQL, data modeling, and data transformation principles.
- Experience with ETL/ELT tools, orchestration frameworks, and data integration.
- Familiarity with data privacy regulations (GDPR, HIPAA, etc.) and compliance.

Additional Qualifications:
- Snowflake certification (SnowPro Core / Advanced).
- Experience in building data lakes, data mesh architectures, or streaming data platforms.
- Familiarity with tools like Power BI, Tableau, or Looker for downstream analytics.
- Experience with Agile delivery models and CI/CD workflows.
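To illustrate the kind of governance controls mentioned above (RBAC and masking policies), here is a hypothetical sketch that applies a dynamic data masking policy and least-privilege grants through the Snowflake Python connector. Account, role, table, and column names are placeholders, and dynamic masking policies require Snowflake Enterprise Edition or higher.

```python
# Hypothetical sketch of applying a masking policy and RBAC grants in Snowflake
# via the Python connector; all identifiers and credentials are placeholders.
import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="***",  # placeholders
    warehouse="ADMIN_WH", database="ANALYTICS", schema="PUBLIC",
)

statements = [
    # Only the ANALYST role sees clear-text email addresses; everyone else sees a mask.
    """
    CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('ANALYST') THEN val ELSE '*** MASKED ***' END
    """,
    "ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask",
    # Least-privilege access for a reporting role.
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE REPORTING",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.PUBLIC TO ROLE REPORTING",
]

cur = conn.cursor()
try:
    for sql in statements:
        cur.execute(sql)
finally:
    cur.close()
    conn.close()
```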

Posted 3 days ago

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

The Solutions Engineer - Pre-Sales at ReleaseOwl plays a pivotal role in the Pre-Sales team by demonstrating the capabilities of ReleaseOwl through engaging product demos, developing Proof of Concepts (POCs), and aiding in pilot implementations. As a Solutions Engineer, you will leverage your expertise in SAP Basis, Transport Management, and SAP BTP and Integration Suite DevOps practices to support sales opportunities effectively.

Your responsibilities will include delivering tailored product demos that address customer pain points, creating and managing customer-specific POCs and pilot environments, collaborating with Sales, Product, and Engineering teams to craft value-driven solutions, providing early-stage implementation guidance, and communicating the technical advantages of ReleaseOwl to both technical and business stakeholders. Additionally, you will serve as a trusted advisor during pre-sales engagements and discovery sessions.

To excel in this role, you should have a minimum of 3 years of experience in SAP Basis or SAP landscape operations, a strong background in Transport Management, cTMS, and Solution Manager (Solman), and hands-on proficiency in SAP DevOps for BTP and the SAP Integration Suite. Familiarity with DevOps tools, CI/CD workflows, and SAP's transport architecture is crucial. Effective communication and presentation skills are essential, and prior experience in pre-sales, customer-facing roles, or consulting is a strong advantage.

By joining ReleaseOwl, you will have the opportunity to work on innovative SAP DevOps solutions that are reshaping enterprise automation. You will collaborate with a driven team focused on developing top-tier SaaS products and enjoy competitive compensation in a global work environment. This role also offers the potential to grow into solution architecture, product, or customer success positions.

Posted 3 days ago

5.0 - 9.0 years

0 Lacs

Kochi, Kerala

On-site

As a highly skilled Senior Machine Learning Engineer, you will leverage your expertise in Deep Learning, Large Language Models (LLMs), and MLOps/LLMOps to design, optimize, and deploy cutting-edge AI solutions. Your responsibilities will include developing and scaling deep learning models, fine-tuning LLMs (e.g., GPT, Llama), and implementing robust deployment pipelines for production environments.

You will design, train, fine-tune, and optimize deep learning models (CNNs, RNNs, Transformers) for applications such as NLP, computer vision, or multimodal tasks, and will fine-tune and adapt LLMs for domain-specific tasks such as text generation, summarization, and semantic similarity. Experimenting with RLHF (Reinforcement Learning from Human Feedback) and alignment techniques will also be part of your role.

In the realm of deployment and scalability (MLOps/LLMOps), you will build and maintain end-to-end ML pipelines for training, evaluation, and deployment. Deploying LLMs and deep learning models in production environments using frameworks such as FastAPI, vLLM, or TensorRT is crucial. You will optimize models for low-latency, high-throughput inference and implement CI/CD workflows for ML systems using tools like MLflow and Kubeflow.

Monitoring and optimization will involve setting up logging, monitoring, and alerting for model performance metrics such as drift, latency, and accuracy. Collaborating with DevOps teams to ensure the scalability, security, and cost-efficiency of deployed models will also be part of your responsibilities.

The ideal candidate has 5-7 years of hands-on experience in Deep Learning, NLP, and LLMs; strong proficiency in Python, PyTorch, TensorFlow, Hugging Face Transformers, and LLM frameworks; experience with model deployment tools such as Docker, Kubernetes, and FastAPI; knowledge of MLOps/LLMOps best practices; and familiarity with cloud platforms (AWS, GCP, Azure). Preferred qualifications include contributions to open-source LLM projects, showcasing your commitment to advancing the field of machine learning.
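As a small, self-contained illustration of the Hugging Face tooling the posting names, the sketch below loads a small causal language model and generates text. The model choice and decoding parameters are arbitrary examples, not the team's configuration; real work on this roadmap would involve fine-tuning larger LLMs and serving them behind FastAPI, vLLM, or TensorRT.

```python
# Illustrative sketch: load a small causal LM and generate text with
# Hugging Face Transformers. The model name is only an example.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "distilgpt2"  # tiny example model; a real project might fine-tune a Llama variant
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

prompt = "Summarize: industrial sensors stream time-series data that"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=40,
        do_sample=True,
        temperature=0.8,
        pad_token_id=tokenizer.eos_token_id,  # distilgpt2 has no dedicated pad token
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```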

Posted 3 days ago

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

Arista Networks is an industry leader in data-driven, client-to-cloud networking for large data center, campus, and routing environments. With over $7 billion in revenue, Arista has established itself as a profitable company. Arista's award-winning platforms, with Ethernet speeds up to 800 gigabits per second, redefine scalability, agility, and resilience. As a founding member of the Ultra Ethernet consortium, Arista has shipped over 20 million cloud networking ports worldwide. The company is committed to open standards, and its products are available globally, both directly and through partners.

At Arista, diversity of thought and perspectives is highly valued. Fostering an inclusive environment where individuals from diverse backgrounds feel welcome is key to driving creativity and innovation within the company. This commitment to excellence has been recognized with prestigious awards, including the Great Place to Work Survey for Best Engineering Team and Best Company for Diversity, Compensation, and Work-Life Balance. Arista takes pride in its successful track record and is dedicated to maintaining the highest quality and performance standards.

As a Software Tools Development Engineer at Arista, you will work with the Hardware Team to design the hardware and software components of the company's products. You will have the opportunity to lead your own projects and think innovatively. Collaborating with multi-disciplinary engineers, you will develop tools that enhance Arista's hardware development workflow, improving quality and delivering top-notch products to customers.

In this role, your responsibilities will include:
- Creating stress tests to validate hardware conceptual designs
- Drafting functional specifications to communicate intentions with the team
- Debugging complex issues in multi-server execution environments
- Reviewing peers' code for adherence to best practices and target architectures
- Developing unit-test code for validation and creating new tests
- Generating documentation templates and test reports to communicate testing results to the hardware team
- Identifying and addressing unexpected issues with multi-layered patches
- Contributing to the overall priorities of the hardware tools team
- Learning various programming languages to support existing software suites

Qualifications for this position include a B.S. in Electrical Engineering and/or Computer Engineering, 3-5 years of relevant experience in software engineering for tools development, self-motivation, a passion for developing high-quality software solutions, a continuous-learning mindset, strong communication skills, experience with CI/CD workflows, knowledge of networking protocols, and enthusiasm for collaborative work within a multidisciplinary team.

Arista is known for its engineering-centric approach, with leadership consisting of engineers who prioritize sound software engineering principles. The company offers a flat and streamlined management structure, giving engineers complete ownership of their projects. Arista emphasizes the development and use of test automation tools and offers global opportunities for engineers to work across various domains. Headquartered in Santa Clara, California, Arista has development offices in multiple countries, fostering a diverse and inclusive work environment. Regardless of location, all R&D centers are considered equal in stature.

Join Arista to shape the future of networking and be part of a culture that values invention, quality, respect, and fun.
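As a generic illustration of the stress-testing responsibility listed above (this is not Arista tooling), here is a minimal Python harness that runs a placeholder command many times in parallel and reports failures; real hardware validation would exercise device-specific interfaces and collect telemetry.

```python
# Generic stress-harness sketch (hypothetical command and counts); real hardware
# validation tooling would target specific interfaces and metrics.
import subprocess
from concurrent.futures import ThreadPoolExecutor, as_completed

COMMAND = ["ping", "-c", "1", "127.0.0.1"]  # placeholder for a device-exercising step
ITERATIONS, WORKERS = 100, 8


def run_once(i: int) -> tuple[int, bool]:
    """Run one iteration and report whether the command succeeded."""
    proc = subprocess.run(COMMAND, capture_output=True, text=True)
    return i, proc.returncode == 0


if __name__ == "__main__":
    failures = []
    with ThreadPoolExecutor(max_workers=WORKERS) as pool:
        futures = [pool.submit(run_once, i) for i in range(ITERATIONS)]
        for fut in as_completed(futures):
            i, ok = fut.result()
            if not ok:
                failures.append(i)
    print(f"{ITERATIONS - len(failures)}/{ITERATIONS} iterations passed; failures: {failures}")
```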

Posted 4 days ago

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

The Data Analytics Engineer role at Rightpoint involves being a crucial part of client projects to develop and deliver decisioning intelligence solutions. Working collaboratively with other team members and with various business and technical stakeholders on the client side is a key aspect of this role. As a member of a modern data team, your primary responsibility will be to bridge the gap between enterprise data engineers and business-focused data and visualization analysts, transforming raw data into clean, organized, and reusable datasets that support effective analysis and decisioning intelligence data products.

Key Responsibilities:
- Design, develop, and maintain clean, scalable data models to support analytics and business intelligence needs. Define rules and requirements for the data to serve business analysis objectives.
- Collaborate with data analysts and business stakeholders to define data requirements, ensure data consistency across platforms, and promote self-service analytics.
- Build, optimize, and document transformation pipelines into visualization and analysis environments to ensure high data quality and integrity.
- Implement data transformation best practices using modern tools like dbt, SQL, and cloud data warehouses (e.g., Azure Synapse, BigQuery, Azure Databricks).
- Monitor and troubleshoot data quality issues, ensuring accuracy, completeness, and reliability.
- Define and maintain data quality metrics and data formats, and adopt automated methods to cleanse and improve data quality.
- Optimize data performance to ensure query efficiency for large datasets.
- Establish and maintain analytics platform best practices for the team, including version control, data unit testing, CI/CD, and documentation.
- Collaborate with other team members, including data engineers, business and visualization analysts, and data scientists, to align data assets with business analysis objectives.
- Work closely with data engineering teams to integrate new data sources into the data lake and optimize performance.
- Act as a consultant within cross-functional teams to understand business needs and develop appropriate data solutions.
- Communicate clearly and professionally, both in writing and verbally.
- Take initiative, be proactive, anticipate needs, and complete projects comprehensively.
- Be willing to continuously learn, problem-solve, and assist others.

Desired Qualifications:
- Strong knowledge of SQL and Python.
- Familiarity with cloud platforms like Azure, Azure Databricks, and Google BigQuery.
- Understanding of schema design and data modeling methodologies.
- Hands-on experience with dbt for data transformation and modeling.
- Experience with version control systems like Git and CI/CD workflows.
- Passion for continuous improvement, learning, and applying new technologies to everyday activities.
- Ability to translate technical concepts for non-technical stakeholders.
- An analytical mindset for addressing business challenges through data design.
- Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field.
- Strong problem-solving skills and attention to detail.

By joining Rightpoint, you will work with cutting-edge business and data technologies in a collaborative and innovative environment. A competitive salary and benefits package, along with career growth opportunities in a data-driven organization, are some of the perks of working at Rightpoint. If you are passionate about data and enjoy creating efficient, scalable data solutions, we would love to hear from you!

Benefits and perks at Rightpoint include 30 paid leaves, public holidays, a casual and open office environment, a flexible work schedule, family medical insurance, life insurance, accidental insurance, regular cultural and social events, and continuous training, certifications, and learning opportunities. Rightpoint is committed to bringing people together from diverse backgrounds and experiences to create phenomenal work, making it an inclusive and welcoming workplace for all.

EEO Statement: Rightpoint is an equal opportunity employer and is committed to providing a workplace that is free from any form of discrimination.
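To give a concrete flavour of the data-quality work described in this posting, here is a small, hypothetical Python check of the kind an analytics engineer might automate alongside dbt tests; the dataframe, column names, and threshold are placeholders.

```python
# Hypothetical data-quality check: completeness and uniqueness metrics of the kind
# an analytics engineer might wire into a transformation pipeline or CI job.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [1, 2, 2, None],
    "email": ["a@x.com", None, "b@x.com", "c@x.com"],
})

metrics = {
    "row_count": len(df),
    "customer_id_completeness": df["customer_id"].notna().mean(),
    "customer_id_uniqueness": df["customer_id"].dropna().is_unique,
    "email_completeness": df["email"].notna().mean(),
}

# Surface a pass/fail signal the orchestrator (dbt test, Airflow task, CI) can act on.
threshold = 0.95
status = "PASS" if metrics["customer_id_completeness"] >= threshold else "FAIL"
print(status, metrics)
```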

Posted 4 days ago

3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

The contextualization platform enables large-scale data integration and entity matching across heterogeneous sources. The current engineering focus is to modernize the architecture for better scalability and orchestration compatibility, refactor core services, and lay the foundation for future AI-based enhancements. This is a pivotal development initiative with clear roadmap milestones and direct alignment with a multi-year digital transformation strategy.

We are looking for a skilled and motivated Senior Backend Engineer with strong expertise in Kotlin to join a newly established scrum team responsible for enhancing a core data contextualization platform. This service plays a central role in associating and matching data from diverse sources - time series, equipment, documents, 3D objects - into a unified data model. As a Senior Backend Engineer, you will lead backend development efforts to modernize and scale the platform by integrating with an updated data architecture and orchestration framework. This role is high-impact, contributing to a long-term roadmap focused on scalable, maintainable, and secure industrial software.

Your key responsibilities will include designing, developing, and maintaining scalable, API-driven backend services using Kotlin; aligning backend systems with modern data modeling and orchestration standards; collaborating with engineering, product, and design teams for seamless integration across the broader data platform; implementing and refining RESTful APIs following established design guidelines; participating in architecture planning, technical discovery, and integration design for improved platform compatibility and maintainability; conducting load testing, improving unit test coverage, and contributing to reliability engineering efforts; driving software development best practices, including code reviews, documentation, and CI/CD process adherence; and ensuring compliance with multi-cloud design standards and the use of infrastructure-as-code tooling such as Kubernetes and Terraform.

Qualifications:
- 3+ years of backend development experience, with a strong focus on Kotlin.
- Proven ability to design and maintain robust, API-centric microservices.
- Hands-on experience with Kubernetes-based deployments, cloud-agnostic infrastructure, and modern CI/CD workflows.
- Solid knowledge of PostgreSQL, Elasticsearch, and object storage systems.
- Strong understanding of distributed systems, data modeling, and software scalability principles.
- Excellent communication skills and the ability to work in a cross-functional, English-speaking environment.
- Bachelor's or Master's degree in Computer Science or a related discipline.

Bonus Qualifications:
- Experience with Python for auxiliary services, data processing, or SDK usage.
- Knowledge of data contextualization or entity resolution techniques.
- Familiarity with 3D data models, industrial data structures, or hierarchical asset relationships.
- Exposure to LLM-based matching or AI-enhanced data processing (not required, but a plus).
- Experience with Terraform, Prometheus, and scalable backend performance testing.

In this role, you will develop Data Fusion, a robust, state-of-the-art SaaS for industrial data. You will solve concrete industrial data problems by designing and implementing delightful APIs and robust services on top of Data Fusion, work with application teams to ensure a delightful user experience, and work with distributed open-source software and databases/storage systems such as Kubernetes, Kafka, Spark, PostgreSQL, Elasticsearch, and S3-API-compatible blob stores, among others.

At GlobalLogic, we offer a culture of caring, learning and development opportunities, interesting and meaningful work on impactful projects, balance and flexibility in work arrangements, and a high-trust organization committed to integrity and trust.

Posted 4 days ago

3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

The contextualization platform enables large-scale data integration and entity matching across heterogeneous sources. The current engineering focus is to modernize the architecture for better scalability and orchestration compatibility, refactor core services, and lay the foundation for future AI-based enhancements. This is a pivotal development initiative with clear roadmap milestones and direct alignment with a multi-year digital transformation strategy.

We are looking for a skilled and motivated Senior Backend Engineer with strong expertise in Kotlin to join a newly established scrum team responsible for enhancing a core data contextualization platform. This service plays a central role in associating and matching data from diverse sources - time series, equipment, documents, 3D objects - into a unified data model. As a Senior Backend Engineer, you will lead backend development efforts to modernize and scale the platform by integrating with an updated data architecture and orchestration framework. This is a high-impact role contributing to a long-term roadmap focused on scalable, maintainable, and secure industrial software.

Key Responsibilities:
- Design, develop, and maintain scalable, API-driven backend services using Kotlin.
- Align backend systems with modern data modeling and orchestration standards.
- Collaborate with engineering, product, and design teams to ensure seamless integration across the broader data platform.
- Implement and refine RESTful APIs following established design guidelines.
- Participate in architecture planning, technical discovery, and integration design for improved platform compatibility and maintainability.
- Conduct load testing, improve unit test coverage, and contribute to reliability engineering efforts.
- Drive software development best practices, including code reviews, documentation, and CI/CD process adherence.
- Ensure compliance with multi-cloud design standards and the use of infrastructure-as-code tooling (Kubernetes, Terraform).

Qualifications:
- 3+ years of backend development experience, with a strong focus on Kotlin.
- Proven ability to design and maintain robust, API-centric microservices.
- Hands-on experience with Kubernetes-based deployments, cloud-agnostic infrastructure, and modern CI/CD workflows.
- Solid knowledge of PostgreSQL, Elasticsearch, and object storage systems.
- Strong understanding of distributed systems, data modeling, and software scalability principles.
- Excellent communication skills and the ability to work in a cross-functional, English-speaking environment.
- Bachelor's or Master's degree in Computer Science or a related discipline.

Bonus Qualifications:
- Experience with Python for auxiliary services, data processing, or SDK usage.
- Knowledge of data contextualization or entity resolution techniques.
- Familiarity with 3D data models, industrial data structures, or hierarchical asset relationships.
- Exposure to LLM-based matching or AI-enhanced data processing (not required, but a plus).
- Experience with Terraform, Prometheus, and scalable backend performance testing.

About the role and key responsibilities:
- Develop Data Fusion - a robust, state-of-the-art SaaS for industrial data.
- Solve concrete industrial data problems by designing and implementing delightful APIs and robust services on top of Data Fusion.
- Work with distributed open-source software such as Kubernetes, Kafka, Spark, and similar to build scalable and performant solutions.
- Help shape the culture and methodology of a rapidly growing company.

At GlobalLogic, we prioritize a culture of caring where people come first, offering continuous learning and development opportunities, interesting and meaningful work, balance and flexibility, and a high-trust organization. Join us to be part of a trusted digital engineering partner to the world's largest companies, collaborating to transform businesses and redefine industries through intelligent products, platforms, and services.

Posted 4 days ago

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Engineer specializing in Snowflake architecture, you will be responsible for designing and implementing scalable data warehouse architectures, including schema modeling and data partitioning. Your role will involve leading or supporting data migration projects to Snowflake from on-premise or legacy cloud platforms. You will develop ETL/ELT pipelines and integrate data using tools such as DBT, Fivetran, Informatica, and Airflow, and will define and implement best practices for data modeling, query optimization, and storage efficiency within Snowflake.

Collaboration with cross-functional teams, including data engineers, analysts, BI developers, and stakeholders, will be crucial to align architectural solutions effectively. Ensuring data governance, compliance, and security by implementing RBAC, masking policies, and access control within Snowflake will also be part of your responsibilities, as will working closely with DevOps teams to enable CI/CD pipelines, monitoring, and infrastructure as code for Snowflake environments. Your role will involve optimizing resource utilization, monitoring workloads, and managing the cost-effectiveness of the platform, and staying updated with Snowflake features, cloud vendor offerings, and best practices to drive continuous improvement in data architecture.

Qualifications & Skills:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 5+ years of experience in data engineering, data warehousing, or analytics architecture.
- 3+ years of hands-on experience in Snowflake architecture, development, and administration.
- Strong knowledge of cloud platforms such as AWS, Azure, or GCP.
- Solid understanding of SQL, data modeling, and data transformation principles.
- Experience with ETL/ELT tools, orchestration frameworks, and data integration.
- Familiarity with data privacy regulations (GDPR, HIPAA, etc.) and compliance.

Additional Qualifications:
- Snowflake certification (SnowPro Core / Advanced).
- Experience in building data lakes, data mesh architectures, or streaming data platforms.
- Familiarity with tools like Power BI, Tableau, or Looker for downstream analytics.
- Experience with Agile delivery models and CI/CD workflows.

This role offers an exciting opportunity to work on cutting-edge data architecture projects and to collaborate with diverse teams to drive impactful business outcomes.
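To make the cost-management responsibility concrete, here is a hypothetical sketch that sets up a resource monitor and auto-suspend policy through the Snowflake Python connector. The account, warehouse name, and credit quota are placeholders, and creating resource monitors normally requires the ACCOUNTADMIN role (or delegated privileges).

```python
# Hypothetical cost-control sketch: a resource monitor plus auto-suspend settings,
# issued through the Snowflake Python connector; names and quotas are placeholders.
import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="***",  # placeholders
    role="ACCOUNTADMIN",
)

statements = [
    # Cap monthly credit consumption; notify at 80%, suspend the warehouse at 100%.
    "CREATE OR REPLACE RESOURCE MONITOR monthly_cap WITH CREDIT_QUOTA = 100 "
    "TRIGGERS ON 80 PERCENT DO NOTIFY ON 100 PERCENT DO SUSPEND",
    "ALTER WAREHOUSE ANALYTICS_WH SET RESOURCE_MONITOR = monthly_cap",
    # Idle warehouses stop billing after 60 seconds and resume on demand.
    "ALTER WAREHOUSE ANALYTICS_WH SET AUTO_SUSPEND = 60 AUTO_RESUME = TRUE",
]

cur = conn.cursor()
try:
    for sql in statements:
        cur.execute(sql)
finally:
    cur.close()
    conn.close()
```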

Posted 4 days ago

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

In this role, you will contribute to a critical and highly visible function within the Esper business. You will have the opportunity to autonomously drive the technical direction of the service and the feature roadmap. Working with extraordinary talent, you will deliver end-to-end features, improve platform quality, and act as a technical leader. If you are excited about making a significant impact on Esper and the device industry, you will find this role engaging, challenging, and full of opportunities to learn and grow.

You will be responsible for the end-to-end implementation and maintenance of features, fixes, and enhancements to the platform. Your contributions will directly and immediately enhance the experience of our customers. This role offers the chance to work with cutting-edge technologies and solve the scalability issues associated with managing millions of devices. Each project you undertake will expand the scope of your impact on the platform.

Your responsibilities will include improving the Esper Platform by planning, recommending, and executing strategic projects. Using metrics and data, you will provide insights on customer usage, bottlenecks, future requirements, security, and scalability of the platform. You will establish standards, guidelines, sample projects, and demos to influence engineering teams to write stable, secure, maintainable, and high-quality code. Collaboration with distributed teams will be essential to drive changes, write root cause analyses (RCAs), and coordinate resolutions for production incidents. Additionally, you will objectively assess new technologies, tools, frameworks, and design patterns for adoption into the Esper Platform. You will become the Subject Matter Expert (SME) for the Platform SRE team and take on a range of SRE tasks, including performance testing, API test automation, maintaining Kubernetes clusters, automation, and release-related work.

The ideal candidate has at least 5 years of experience, with hands-on experience building and managing cloud systems on one or more providers such as AWS, GCP, or Azure. Knowledge of Computer Science fundamentals such as data structures, algorithms, operating systems, and networks is essential, as is experience designing, developing, and deploying at least one customer-facing project. Proficiency in scripting or a modern programming language is necessary, along with experience developing and deploying on UNIX/Linux-based systems. Hands-on experience with performance optimization using multiple metrics, and familiarity with microservices and container technologies such as Docker, Kubernetes, and OpenShift, are important. An understanding of security best practices for implementing Infrastructure as Code (IaC), automation, and CI/CD workflows is a plus, and familiarity with tools such as Jenkins and Buildkite, as well as knowledge of performance testing and automation testing, will be advantageous.
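As an illustration of the API test automation mentioned above (not Esper's actual suite), here is a small pytest-style smoke test that asserts status codes and a latency budget; the base URL, endpoints, and budget are placeholders.

```python
# Hypothetical API smoke test of the sort an SRE team might wire into CI;
# the base URL, endpoints, and latency budget are placeholders.
import time

import requests  # pip install requests

BASE_URL = "https://api.example.invalid"  # placeholder
LATENCY_BUDGET_S = 0.5


def check_endpoint(path: str) -> None:
    """Assert the endpoint responds 200 within the latency budget."""
    start = time.perf_counter()
    resp = requests.get(f"{BASE_URL}{path}", timeout=5)
    elapsed = time.perf_counter() - start
    assert resp.status_code == 200, f"{path}: unexpected status {resp.status_code}"
    assert elapsed <= LATENCY_BUDGET_S, f"{path}: {elapsed:.3f}s exceeds budget"


def test_health() -> None:
    check_endpoint("/healthz")


def test_device_list() -> None:
    check_endpoint("/v1/devices")

# Run with: pytest -q test_smoke.py
```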

Posted 1 week ago