
6 GitHub Workflows Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

As a DataOps Engineer, you will work at the intersection of software engineering, DevOps, and data analytics within our data engineering team. Your primary responsibility will be building and managing secure, scalable, production-ready data pipelines and infrastructure that support advanced analytics, machine learning, and real-time decision-making for our clients.

Your key responsibilities will include:

- Designing, developing, and maintaining robust, scalable, and efficient ETL/ELT pipelines using Python and modern DataOps methodologies.
- Implementing data quality checks, pipeline monitoring, and error handling.
- Building data solutions with cloud-native AWS services such as S3, ECS, Lambda, and CloudWatch.
- Containerizing applications with Docker and orchestrating them with Kubernetes for scalable deployments.
- Using infrastructure-as-code tools and CI/CD pipelines to automate deployments.
- Designing and optimizing data models using PostgreSQL, Redis, and PGVector, ensuring high-performance storage and retrieval, and supporting feature stores and vector-based storage for AI/ML applications.
- Driving Agile ceremonies (daily stand-ups, sprint planning, retrospectives) to ensure successful sprint delivery.
- Reviewing pull requests (PRs), conducting code reviews, and upholding security and performance standards.
- Collaborating with product owners, analysts, and architects to refine user stories and technical requirements.

To excel in this role, you should have:

- 10+ years of experience in Data Engineering, DevOps, or Software Engineering roles focused on data products.
- Proficiency in Python, Docker, Kubernetes, and AWS (specifically S3 and ECS).
- Strong knowledge of relational and NoSQL databases such as PostgreSQL and Redis; experience with PGVector is a plus.
- A deep understanding of CI/CD pipelines, GitHub workflows, and modern source control practices.
- Experience in Agile/Scrum environments, with excellent collaboration and communication skills.
- A passion for clean, well-documented, scalable code, and familiarity with DataOps principles: automation, testing, monitoring, and deployment of data pipelines.
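
The quality-check and error-handling duties described above are standard DataOps patterns rather than anything specific to this employer. A minimal, illustrative Python sketch (the function names, fields, and pipeline shape are hypothetical, not from the listing):

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")


def extract(rows):
    """Stand-in for an extract step; a real pipeline would read from S3, a DB, etc."""
    return list(rows)


def quality_check(rows, required_keys=("id", "value")):
    """Fail fast when records are missing required fields: a basic DataOps gate."""
    bad = [r for r in rows if not all(k in r for k in required_keys)]
    if bad:
        raise ValueError(f"{len(bad)} record(s) failed the quality check")
    return rows


def transform(rows):
    """Trivial transform: double each record's value."""
    return [{**r, "value": r["value"] * 2} for r in rows]


def run_pipeline(rows):
    """Tiny ETL run with monitoring-style logging and explicit error handling."""
    try:
        clean = quality_check(extract(rows))
        out = transform(clean)
        log.info("pipeline ok: %d records", len(out))
        return out
    except ValueError as exc:
        log.error("pipeline failed: %s", exc)
        raise


result = run_pipeline([{"id": 1, "value": 10}, {"id": 2, "value": 20}])
print(result)
```

In production the same gate-then-transform shape is usually expressed with tools like Great Expectations and an orchestrator, but the failure-isolation idea is the same: reject bad records before they propagate downstream.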

Posted 1 day ago


3.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a GCP CloudOps Engineer, you will deploy, integrate, and test solutions using Infrastructure as Code (IaC) and DevSecOps techniques. With over 8 years of experience in infrastructure design and delivery, including 5 years of hands-on experience with Google Cloud technologies, you will ensure continuous, repeatable, secure, and automated deployment processes.

Your responsibilities will include:

- Using monitoring tools such as Datadog, New Relic, or Splunk for performance analysis and troubleshooting.
- Implementing container orchestration with Docker or Kubernetes, preferably GKE.
- Collaborating with diverse teams across different time zones and cultures.
- Maintaining comprehensive documentation, including principles, standards, practices, and project plans.
- Building data warehouses using Databricks and IaC patterns with tools such as Terraform, Jenkins, Spinnaker, and CircleCI.
- Improving platform observability and tuning monitoring and alerting tools for better performance.
- Developing CI/CD frameworks to streamline application deployment.
- Contributing to Cloud strategy discussions and implementing Cloud best practices.

Your role will involve proactive collaboration, automating long-term solutions, and following incident, problem, and change management best practices. You will also debug applications, improve deployment architectures, and measure the cost and performance of cloud services to drive informed decision-making.

Preferred qualifications include experience with Databricks, multicloud environments (GCP, AWS, Azure), GitHub, and GitHub Actions. Strong communication skills, a proactive approach to problem-solving, and a deep understanding of Cloud technologies and tools are essential for success in this position.

Key Skills: Splunk, Terraform, Google Cloud Platform, GitHub Workflows, AWS, Datadog, Python, Azure DevOps, Infrastructure as Code (IaC), Data Warehousing (Databricks), New Relic, CircleCI, Container Orchestration (Docker, Kubernetes, GKE), Spinnaker, DevSecOps, Jenkins.
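
The observability and alerting work described above follows the same pattern regardless of vendor (Datadog, New Relic, CloudWatch all express it as alert policies). A vendor-neutral sketch of the underlying threshold logic, with hypothetical metric names:

```python
def evaluate_alerts(metrics, thresholds):
    """Return the metric names whose latest value breaches its threshold.

    Monitoring tools implement this comparison as managed alert policies;
    this function only illustrates the core logic.
    """
    breaches = []
    for name, value in metrics.items():
        limit = thresholds.get(name)
        if limit is not None and value > limit:
            breaches.append(name)
    return sorted(breaches)


metrics = {"cpu_percent": 92.0, "error_rate": 0.01, "p95_latency_ms": 180.0}
thresholds = {"cpu_percent": 85.0, "error_rate": 0.05, "p95_latency_ms": 250.0}
print(evaluate_alerts(metrics, thresholds))  # only cpu_percent breaches its limit
```

In practice the thresholds themselves would live in IaC (e.g. Terraform-managed monitor resources) so that alerting configuration is versioned and reviewed like any other code.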

Posted 2 days ago


5.0 - 9.0 years

0 Lacs

Varanasi, Uttar Pradesh

On-site

Join NEXUSsoft, a team that has been automating complex business processes since 2003. Our mission is to help Australian mid-sized enterprises transform chaotic, multi-system workflows into a unified, reliable source of truth. At the core of our operations is the iCERP platform, which coordinates data and tasks from initial interaction to final billing, an approach analysts recognize as Intelligent Process Automation. As demand for our continuous-improvement approach grows, we are expanding our Varanasi engineering hub and seeking skilled, ambitious individuals who take ownership, collaborate well, and deliver tangible outcomes. If that sounds like you, read on.

As a Senior Engineer at NEXUSsoft, you should have deep proficiency in PHP, web development, and MySQL, along with a strong background in troubleshooting, testing, DevOps practices, and GitHub workflows. You will deliver high-quality web applications, facilitate deployments, and contribute to strategic technical decision-making while collaborating closely with diverse teams.

Your responsibilities will include:

- Developing, refining, and improving web applications using PHP and modern frameworks.
- Designing and optimizing MySQL databases for performance and scalability.
- Writing clean, efficient, well-documented code and actively participating in code reviews.
- Improving DevOps processes, including CI/CD pipelines, server configurations, and deployments.
- Using GitHub for source control: branching, pull requests, and version management.
- Diagnosing and resolving bugs and performance issues across the entire technology stack.
- Collaborating with QA teams on testing strategies, and working with project managers to deliver features on time and to a high standard.

The ideal candidate should have:

- 5+ years of hands-on PHP development experience, with expertise in Laravel, Symfony, or similar frameworks.
- A solid grasp of web technologies, including HTML, CSS, JavaScript, and REST APIs.
- Extensive experience in MySQL database design and optimization.
- Familiarity with DevOps tools and a strong understanding of Git/GitHub workflows.
- Proficiency in troubleshooting and debugging web applications, plus excellent communication and problem-solving skills.

Desirable skills include familiarity with cloud services (especially Azure) and a keen interest in improving DevOps practices. Experience with Node.js is a plus.

Posted 6 days ago


3.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

Are you passionate about building scalable BI solutions and leading innovation with Microsoft Fabric? AmplifAI is looking for a Power BI Architect to lead our analytics strategy, mentor a growing team, and drive enterprise-wide reporting transformation.

As a Power BI Architect at AmplifAI, you will define scalable data models, pipelines, and reporting structures using OneLake, Direct Lake, and Dataflows Gen2. You will lead the architecture and migration from Power BI Pro/Premium to Microsoft Fabric, integrating structured and semi-structured data for unified analysis. You will also manage and mentor a team of Power BI Analysts, evangelize best practices across semantic modeling, performance tuning, and data governance, and drive governance and CI/CD using GitHub-based workflows.

The ideal candidate has 8+ years of experience in Power BI and enterprise analytics, 5+ years of SQL expertise, and 3+ years in a leadership role. Proven experience with Microsoft Fabric, hands-on experience with GitHub workflows and version control, and strong communication, critical thinking, and problem-solving skills are essential for success in this position.

At AmplifAI, you will work on cutting-edge enterprise AI & BI solutions, join a diverse, inclusive, and globally distributed team, and shape the future of analytics in CX and performance management. If you are ready to lead data-driven transformation and make a significant impact, apply now to join AmplifAI as a Power BI Architect!

Posted 1 week ago


3.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

Are you passionate about building scalable BI solutions and leading innovation with Microsoft Fabric? AmplifAI is looking for a Power BI Architect to lead our analytics strategy, mentor a growing team, and drive enterprise-wide reporting transformation. The position is based in Hyderabad, with work hours from 9 AM to 6 PM EST (US time).

As a Power BI Architect at AmplifAI, you will lead the architecture and migration from Power BI Pro/Premium to Microsoft Fabric and define scalable data models, pipelines, and reporting structures using OneLake, Direct Lake, and Dataflows Gen2. You will manage and mentor a team of Power BI Analysts and build engaging dashboards for platform insights, contact center KPIs, auto QA, and sentiment analysis. Key responsibilities also include integrating structured and semi-structured data for unified analysis, driving governance and CI/CD using GitHub-based workflows, and evangelizing best practices across semantic modeling, performance tuning, and data governance.

The ideal candidate has 8+ years of experience in Power BI and enterprise analytics, 5+ years of SQL expertise, and at least 3 years in a leadership role. Proven experience with Microsoft Fabric, hands-on experience with GitHub workflows and version control, and strong communication, critical thinking, and problem-solving skills are essential.

At AmplifAI, you will work on cutting-edge enterprise AI & BI solutions, join a diverse, inclusive, and globally distributed team, and help shape the future of analytics in CX and performance management. If you are ready to lead data-driven transformation at AmplifAI, apply now!

Posted 1 week ago


3.0 - 5.0 years

0 Lacs

Remote, India

Remote

Req ID: 326959

NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Java Developer - Digital Engineering Sr. Engineer to join our team in Remote, Telangana (IN-TG), India (IN).

Role: Java Engineer (3-5 Years Experience)

Description: We are looking for a skilled Java Engineer with 3-5 years of experience in application development on any cloud platform (AWS, Azure, GCP, etc.). The ideal candidate should have:

- Strong proficiency in Java programming and object-oriented design
- A solid understanding of SQL and experience working with relational databases
- Hands-on experience with CI/CD pipelines and GitHub workflows
- Proven ability in troubleshooting, debugging, and resolving performance issues
- Familiarity with building scalable, cloud-native applications
- Exposure to microservices architecture
- Experience with monitoring/logging tools
- Understanding of containerization (Docker/Kubernetes)

This aligns with our current approach, in which developers actively contribute on the automation front as part of their extended DevOps responsibilities. A junior developer with the right attitude and mentorship could be highly effective and productive in supporting development and automation needs.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize, and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, and the development, implementation, and management of applications, infrastructure, and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at .

NTT DATA endeavors to make accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications.

NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status. For our EEO Policy Statement, please click . If you'd like more information on your EEO rights under the law, please click . For Pay Transparency information, please click .

Posted 1 month ago
