6.0 - 11.0 years
9 - 13 Lacs
Hyderabad
Work from Office
GCP Data Engineer: BigQuery, SQL, Python, Talend ETL programming on GCP or any cloud technology. Experienced GCP data engineer with BigQuery, SQL, Python, and Talend ETL skills on GCP or any cloud technology. Good experience in building pipelines of GCP components to load data into BigQuery and into Cloud Storage buckets. Excellent data analysis skills. Good written and oral communication skills. Self-motivated and able to work independently.
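The posting's core task, loading files from Cloud Storage buckets into BigQuery, usually stages data as newline-delimited JSON. Below is a minimal, hypothetical sketch of just the transform step (the column names and sample rows are invented); a real pipeline would hand the result to a `google-cloud-bigquery` load job.

```python
import csv
import io
import json

# Toy schema mirroring a BigQuery table: column name -> type converter.
# These column names and sample rows are hypothetical.
SCHEMA = {"order_id": int, "customer": str, "amount": float}

def csv_to_ndjson(csv_text: str) -> str:
    """Convert CSV rows into newline-delimited JSON, the file format
    BigQuery load jobs accept when reading from Cloud Storage."""
    rows = []
    for record in csv.DictReader(io.StringIO(csv_text)):
        rows.append({col: cast(record[col]) for col, cast in SCHEMA.items()})
    return "\n".join(json.dumps(r) for r in rows)

raw = "order_id,customer,amount\n1,Asha,250.5\n2,Ravi,99.0"
print(csv_to_ndjson(raw))
```

Keeping the transform pure like this makes it easy to unit-test before wiring it into a pipeline.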
Posted 13 hours ago
6.0 - 11.0 years
9 - 14 Lacs
Hyderabad
Work from Office
1. Oracle Transportation Management (OTM) Cloud techno-functional consultant with 8 years of expert domain knowledge covering planning, execution, and settlement. Must have been part of at least 1-3 end-to-end OTM implementations.
2. In-depth understanding of OTM Cloud business processes and their data flow.
3. Should have held client-facing roles and interacted with customers in requirement-gathering workshops, design, configuration, testing, and go-live.
4. Strong written and verbal communication skills, personal drive, flexibility, teamwork, problem-solving, continuous improvement, and client management.
5. Assist in identifying, assessing, and resolving complex functional issues/problems; interact with the client frequently around specific work efforts/deliverables.
6. Configure and manage technical activities.
7. Configuration of agents, SQL, and VPD profiles, and execution as per requirements.
8. Identify areas of improvement and recommend process modifications to enhance operational efficiency.
9. Configuration of agents, milestones, rates, locations, and itineraries.
10. Manage the integration of OTM with ERPs such as SAP/Oracle.
Posted 14 hours ago
3.0 - 8.0 years
4 - 8 Lacs
Chennai
Work from Office
We are looking to fill immediate job openings for a Neo4j Graph Database role.
Skills: Neo4j Graph Database | Experience: 3+ years | Location: Chennai | Notice period: Immediate | Employment type: Contract
Job Description:
- Build knowledge graph solutions leveraging large-scale datasets
- Design and build graph database schemas to support various use cases, including knowledge graphs
- Design and develop a Neo4j data model for a new application as per the use cases
- Design and build graph database load processes to efficiently populate the knowledge graphs
- Migrate an existing relational database (BigQuery) to Neo4j
- Build design/integration patterns for both batch and real-time update processes to keep the knowledge graphs in sync
- Work with stakeholders to understand the requirements and translate them into technical architecture
- Select and configure appropriate Neo4j features and capabilities for the given use case(s)
- Optimize the performance of a Neo4j-based recommendation engine
- Set up a Neo4j cluster in the cloud
- Configure Neo4j security features to protect sensitive data
- Ensure the security and reliability of Neo4j deployments
- Provide guidance and support to other developers on Neo4j best practices
Qualifications:
- Minimum 3+ years of working experience with knowledge graphs/graph databases
- Expertise with graph database technology, especially Neo4j
- Expertise with Python and related software engineering platforms/frameworks
- Experience designing and building highly scalable knowledge graphs in production
- Experience developing APIs leveraging knowledge graph data
- Experience querying knowledge graphs using a graph query language (e.g., Cypher)
- Experience working with end-to-end CI/CD pipelines
The ideal candidate will have strong knowledge of graph solutions (especially Neo4j) and Python, and experience working with massive amounts of data in the retail space.
Candidate must have a strong curiosity for data and a proven track record of successfully implementing graph database solutions with proficiency in software engineering.
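One migration task listed above, moving relational rows (e.g. exported from BigQuery) into Neo4j, can be sketched as generating one idempotent Cypher `MERGE` statement per row. The labels, properties, and sample data below are illustrative assumptions, not taken from the posting.

```python
def row_to_cypher(row: dict) -> str:
    """Render one relational row as a Cypher MERGE statement for a
    Neo4j load process. MERGE (rather than CREATE) keeps repeated
    loads idempotent. Labels and properties here are hypothetical."""
    return (
        f"MERGE (p:Product {{sku: '{row['sku']}'}}) "
        f"MERGE (c:Category {{name: '{row['category']}'}}) "
        f"MERGE (p)-[:IN_CATEGORY]->(c)"
    )

stmt = row_to_cypher({"sku": "A-1", "category": "Grocery"})
print(stmt)
```

In production you would pass `row` as query parameters through the official `neo4j` Python driver instead of interpolating strings, which avoids Cypher injection and lets the server cache the query plan.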
Posted 15 hours ago
10.0 - 15.0 years
12 - 17 Lacs
Noida
Work from Office
Company Overview
With 80,000 customers across 150 countries, UKG is the largest U.S.-based private software company in the world. And we're only getting started. Ready to bring your bold ideas and collaborative mindset to an organization that still has so much more to build and achieve? Read on. Here, we know that you're more than your work. That's why our benefits help you thrive personally and professionally, from wellness programs and tuition reimbursement to U Choose, a customizable expense reimbursement program that can be used for more than 200 needs that best suit you and your family, from student loan repayment to childcare to pet insurance. Our inclusive culture, active and engaged employee resource groups, and caring leaders value every voice and support you in doing the best work of your career. If you're passionate about our purpose, people, then we can't wait to support whatever gives you purpose. We're united by purpose, inspired by you.
Description
UKG Ready is seeking a strategic and hands-on Manager of Product Management to lead initiatives within our Data Sub-Stream. This role will oversee two high-impact areas: the evolution of our GenAI and Agent capabilities (including LLM-based features, evaluations, and intelligent experiences), and the continued growth of our analytics platform, powered by BigQuery. This is a leadership role that involves close collaboration with senior and executive management to define vision, drive execution, and mentor product team members.
Success in this role will be measured by your ability to work cross-functionally with engineering, AI research, UX, and other AI pillars across UKG to deliver scalable, responsible, and user-centric data and AI products. If you thrive at the intersection of data, intelligence, and user experience, and enjoy growing a young team while shaping both strategic roadmaps and day-to-day product development, we'd love to connect.
Responsibilities
Strategy & Leadership: Define and own the product strategy for UKG Ready's data platform. Lead a team of product managers across multiple geographies, providing mentorship, prioritization, and execution support. Represent GenAI and Analytics in cross-functional planning, executive updates, and stakeholder alignment. Champion a user-centric and ethically responsible approach to LLM-powered features. Align AI-powered product experiences with the underlying analytics infrastructure.
GenAI & LLM: Oversee the development of LLM-powered features (e.g., summarization, chat, intelligent insights). Guide prompt engineering strategies, evaluation frameworks, and RAG pipelines. Ensure robust infrastructure for safe, high-performance AI interactions. Monitor real-world performance and quality of generative experiences, driving continuous improvement.
Analytics Platform: Drive the roadmap for the analytics platform and reporting experiences, including dashboards and data exploration tools. Guide the evolution of our BigQuery architecture and data products to support scalability and cross-suite reporting. Collaborate with data engineering and architecture teams to ensure clean, performant, and accessible data for all personas. Align KPIs, dashboards, and self-service tools with both internal and customer-facing needs.
Execution: Support agile ceremonies across teams: planning, grooming, story definition, and backlog management. Translate customer feedback, usage data, and market trends into actionable priorities.
Balance short-term delivery with long-term vision to ensure sustainable product development. Define and track success metrics across both GenAI and Analytics initiatives.
Qualifications
- 6-10 years of product management experience, including 2+ years in a leadership or mentoring role.
- Proven experience owning product strategy and execution in one or more of the following areas: generative AI/LLMs, analytics platforms, or data products.
- Demonstrated success leading cross-functional initiatives across engineering, UX, and data science.
- Strong working knowledge of LLM concepts (prompting, embeddings, RAG, evaluation), preferably in production environments.
- Hands-on familiarity with cloud data platforms; BigQuery experience strongly preferred.
- Excellent communication skills, with the ability to distill complexity into clear direction for both executives and teams.
- Deep user empathy and a data-driven decision-making mindset.
- Bonus: Experience with vector databases, LangChain/LlamaIndex, dbt, or Looker.
- Bonus: Background in enterprise SaaS, HR tech, or workflow platforms.
Where we're going
UKG is on the cusp of something truly special. Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today. Yet it's our AI-powered product portfolio designed to support customers of all sizes, industries, and geographies that will propel us into an even brighter tomorrow!
Disability Accommodation in the Application and Interview Process: UKGCareers@ukg.com
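The LLM concepts the role expects (embeddings, RAG) reduce, at their core, to nearest-neighbour search over vectors: retrieve the documents whose embeddings are most similar to the query's, then feed them to the model. A toy sketch of just that retrieval step, with invented three-dimensional "embeddings" standing in for a real model's output:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hand-made toy vectors; a real system would use a model's embeddings.
docs = {
    "timesheets": [0.9, 0.1, 0.0],
    "scheduling": [0.1, 0.9, 0.2],
    "payroll":    [0.2, 0.2, 0.9],
}

def retrieve(query_vec, k=1):
    """Return the k documents most similar to the query vector,
    i.e. the retrieval half of a RAG pipeline."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
    return ranked[:k]

print(retrieve([0.85, 0.15, 0.05]))
```

Production systems replace the dictionary with a vector database and the brute-force sort with an approximate nearest-neighbour index, but the similarity computation is the same.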
Posted 15 hours ago
10.0 - 15.0 years
12 - 17 Lacs
Noida
Work from Office
UKG is hiring an Architect for the UKG Scheduling product. In this role you will be responsible for leading and driving the architectural design and implementation of key modules within the product. The role requires a strong technical background and a passion for driving innovation and efficiency. Architects participate in how we define, implement, and enforce an overall architecture practice, including principles and standards. They are focused on improving the efficiency and effectiveness of the business through architectural designs that align with and fit within the technical ecosystem. They are the primary drivers of design, while also being technically hands-on to support the teams and ensure that ongoing feature deliveries are well-designed, resilient, and optimized for performance. This role will work as a thought leader in our WFM pillar to help define and govern the processes spanning the whole software development lifecycle.
The ideal candidate is an experienced software developer or current architect with experience in cloud infrastructure, enterprise architecture, and software development principles.
1. Provide technical leadership and support to UKG Scheduling engineering teams:
- Develop, design, and communicate a clear architectural vision and design for the teams that is aligned with the organization's goals and objectives, while keeping a product-suite mindset at the forefront.
- Understand product requirements, drive for clarity of requirements to ensure the design is fit for purpose, and ensure the design supports the architecture strategy.
- Develop technical roadmaps and ensure that services for the suite meet established architectural guidelines and standards.
- Deep-dive into the code to confirm design integrity and maintain a cost-effective, straightforward design throughout the teams.
- Collaborate with cross-functional teams, including developers, operations, and product managers, to gather requirements and ensure the architectural design meets the needs of internal and external stakeholders.
- Collaborate with peers and technical leaders to define and articulate constraints and guidelines.
- Define and enforce CI/CD standards, development methodologies, and quality assurance processes.
- Identify and mitigate risks associated with architectural decisions.
- Ensure effective adoption of observability tools for proactive alerting of production performance issues, adopt a service-owner mindset to ensure quick recovery from problems, and constantly seek opportunities to improve the resilience of services against failures.
- Leverage artificial intelligence tools to identify productivity improvements for engineering teams, as well as value-add features for our users.
2. Document and maintain the product and service architecture.
3. Drive strategic architecture vision and innovation:
- Identify and evaluate emerging technologies, industry trends, and best practices to ensure the value streams' scalability, security, and performance.
- Provide architecture leadership, focusing on creating and maintaining cross-product and multi-year architecture visions.
- Identify architecture risks, develop mitigation strategies, and maintain architectural opportunities for all stakeholders.
- Understand how architecture is done across the industry; research new technology trends; identify innovations that can drive a competitive advantage for UKG products.
Qualifications:
- Bachelor's/Master's in Engineering/Computer Science or equivalent experience.
- 10 years of software development experience in a fast-paced environment, working through all phases of the software development life cycle.
- Proven experience as a lead software developer or in a similar role, driving the architecture and implementation of complex software solutions.
- An extensive design portfolio showing high proficiency in Java-based development technologies for SaaS and multi-tenant systems.
- Experience leveraging observability tools such as Datadog and Grafana for production monitoring.
- Experience with modern cloud technology (GCP, AWS, Azure, Kubernetes, etc.) and the ability to design a solution that operates optimally in a cloud environment, including cost optimization, leveraging managed services, observability, etc.
- Strong exposure to highly reliable, scalable, secure, and decoupled solutions.
- Strong exposure to Continuous Integration and Continuous Delivery (CI/CD) processes.
- Experience in detailed analysis, feasibility studies, performance analysis, and prototyping.
- Experience developing software applications for multiplatform environments.
- Experience in object-oriented programming and design, service-oriented architecture, and design patterns.
- Excellent leadership, communication, and interpersonal skills, with the ability to influence and collaborate with stakeholders at all levels of the organization.
- Excellent problem-solving skills, with the ability to handle the most complex issues.
- Experience with relational and non-relational database technologies (SQL Server, Postgres, MySQL, MongoDB, Cassandra, etc.).
- Experience with modern quality practices to effectively automate testing and eliminate manual test processes preferred.
- Experience with artificial intelligence and machine learning techniques.
- Experience with modern analytics technology (BigQuery, Snowflake, Tableau, Looker, etc.).
- Experience with messaging and event streaming solutions (Kafka, RabbitMQ, Apache Beam, Spark, etc.).
- Experience with an industry-leading integration platform (such as Boomi).
Disability Accommodation in the Application and Interview Process: UKGCareers@ukg.com
Posted 15 hours ago
2.0 - 7.0 years
4 - 9 Lacs
Pune
Work from Office
UKG is a leader in the HCM space and is at the forefront of artificial intelligence innovation, dedicated to developing cutting-edge generative AI solutions that transform the HR/HCM industry and enhance user experiences. We are seeking talented and motivated AI engineers to join our dynamic team and contribute to the development of next-generation AI/GenAI-based products and solutions. This role will provide you with the opportunity to work on cutting-edge SaaS technologies and impactful projects that are used by enterprises and users worldwide. As a Senior Software Engineer, you will be involved in the design, development, testing, deployment, and maintenance of software solutions. You will work in a collaborative environment, contributing to the technical foundation behind our flagship products and services.
Responsibilities:
- Software Development: Write clean, maintainable, and efficient code for various software applications and systems.
- GenAI Product Development: Participate in the entire AI development lifecycle, including data collection, preprocessing, model training, evaluation, and deployment. Assist in researching and experimenting with state-of-the-art generative AI techniques to improve model performance and capabilities.
- Design and Architecture: Participate in design reviews with peers and stakeholders.
- Code Review: Review code developed by other developers, providing feedback that adheres to industry-standard best practices such as coding guidelines.
- Testing: Build testable software, define tests, participate in the testing process, and automate tests using tools (e.g., JUnit, Selenium) and design patterns, leveraging the test automation pyramid as the guide.
- Debugging and Troubleshooting: Triage defects or customer-reported issues, then debug and resolve them in a timely and efficient manner.
- Service Health and Quality: Contribute to the health and quality of services and incidents, promptly identifying and escalating issues. Collaborate with the team in utilizing service health indicators and telemetry for action. Assist in conducting root cause analysis and implementing measures to prevent future recurrences.
- DevOps Model: Understand working in a DevOps model. Begin to take ownership of working with product management on requirements to design, develop, test, deploy, and maintain the software in production.
- Documentation: Properly document new features, enhancements, or fixes to the product, and contribute to training materials.
Basic Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent practical experience.
- 2+ years of professional software development experience.
- Proficiency as a developer using Python, FastAPI, PyTest, Celery, and other Python frameworks.
- Experience with software development practices and design patterns.
- Familiarity with version control systems like Git/GitHub and bug/work tracking systems like JIRA.
- Basic understanding of cloud technologies and DevOps principles.
- Strong analytical and problem-solving skills, with a proven track record of building and shipping successful software products and services.
Preferred Qualifications:
- Experience with object-oriented programming, concurrency, design patterns, and REST APIs.
- Experience with CI/CD tooling such as Terraform and GitHub Actions.
- High-level familiarity with AI/ML, GenAI, and MLOps concepts.
- Familiarity with frameworks like LangChain and LangGraph.
- Experience with SQL and NoSQL databases such as MongoDB, MSSQL, or Postgres.
- Experience with testing tools such as PyTest, PyMock, xUnit, mocking frameworks, etc.
- Experience with GCP technologies such as Vertex AI, BigQuery, GKE, GCS, Dataflow, and Kubeflow.
- Experience with Docker and Kubernetes.
- Experience with Java and Scala a plus.
Disability Accommodation: UKGCareers@ukg.com
Posted 15 hours ago
4.0 - 9.0 years
6 - 11 Lacs
Pune
Work from Office
UKG is a leader in the HCM space and is at the forefront of artificial intelligence innovation, dedicated to developing cutting-edge generative AI solutions that transform the HR/HCM industry and enhance user experiences. We are seeking talented and motivated AI engineers to join our dynamic team and contribute to the development of next-generation AI/GenAI-based products and solutions. This role will provide you with the opportunity to work on cutting-edge SaaS technologies and impactful projects that are used by enterprises and users worldwide. As a Lead Software Engineer, you will be involved in the design, development, testing, deployment, and maintenance of software solutions. You will work in a collaborative environment, contributing to the technical foundation behind our flagship products and services.
Responsibilities:
- Software Development: Write clean, maintainable, and efficient code for various software applications and systems.
- GenAI Product Development: Participate in the entire AI development lifecycle, including data collection, preprocessing, model training, evaluation, and deployment. Assist in researching and experimenting with state-of-the-art generative AI techniques to improve model performance and capabilities.
- Design and Architecture: Participate in design reviews with peers and stakeholders.
- Code Review: Review code developed by other developers, providing feedback that adheres to industry-standard best practices such as coding guidelines.
- Testing: Build testable software, define tests, participate in the testing process, and automate tests using tools (e.g., JUnit, Selenium) and design patterns, leveraging the test automation pyramid as the guide.
- Debugging and Troubleshooting: Triage defects or customer-reported issues, then debug and resolve them in a timely and efficient manner.
- Service Health and Quality: Contribute to the health and quality of services and incidents, promptly identifying and escalating issues. Collaborate with the team in utilizing service health indicators and telemetry for action. Assist in conducting root cause analysis and implementing measures to prevent future recurrences.
- DevOps Model: Understand working in a DevOps model. Begin to take ownership of working with product management on requirements to design, develop, test, deploy, and maintain the software in production.
- Documentation: Properly document new features, enhancements, or fixes to the product, and contribute to training materials.
Basic Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent practical experience.
- 4+ years of professional software development experience.
- Proficiency as a developer using Python, FastAPI, PyTest, Celery, and other Python frameworks.
- Experience with software development practices and design patterns.
- Familiarity with version control systems like Git/GitHub and bug/work tracking systems like JIRA.
- Basic understanding of cloud technologies and DevOps principles.
- Strong analytical and problem-solving skills, with a proven track record of building and shipping successful software products and services.
Preferred Qualifications:
- Experience with object-oriented programming, concurrency, design patterns, and REST APIs.
- Experience with CI/CD tooling such as Terraform and GitHub Actions.
- High-level familiarity with AI/ML, GenAI, and MLOps concepts.
- Familiarity with frameworks like LangChain and LangGraph.
- Experience with SQL and NoSQL databases such as MongoDB, MSSQL, or Postgres.
- Experience with testing tools such as PyTest, PyMock, xUnit, mocking frameworks, etc.
- Experience with GCP technologies such as Vertex AI, BigQuery, GKE, GCS, Dataflow, and Kubeflow.
- Experience with Docker and Kubernetes.
- Experience with Java and Scala a plus.
Disability Accommodation: UKGCareers@ukg.com
Posted 15 hours ago
3.0 - 8.0 years
5 - 10 Lacs
Bengaluru
Work from Office
Capgemini Invent
Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design to help CxOs envision and build what's next for their businesses.
Your Role:
- Solution Design & Architecture
- Implementation & Deployment
- Technical Leadership & Guidance
- Client Engagement & Collaboration
- Performance Monitoring & Optimization
Your Profile:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 3-8 years of experience in designing, implementing, and managing data solutions.
- 3-8 years of hands-on experience working with Google Cloud Platform (GCP) data services.
- Strong expertise in core GCP data services, including: BigQuery (data warehousing), Cloud Storage (data lake), Dataflow (ETL/ELT), Cloud Composer (workflow orchestration with Apache Airflow), Pub/Sub and Dataflow (streaming data), Cloud Data Fusion (graphical data integration), and Dataproc (managed Hadoop and Spark).
- Proficiency in SQL and experience with data modeling techniques.
- Experience with at least one programming language (e.g., Python, Java, Scala).
- Experience with Infrastructure-as-Code (IaC) tools such as Terraform or Cloud Deployment Manager.
- Understanding of data governance, security, and compliance principles in a cloud environment.
- Experience with CI/CD pipelines and DevOps practices.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and as part of a collaborative team.
What you will love about working here
We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities.
Equip yourself with valuable certifications in the latest technologies such as generative AI.
About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over-55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
Posted 15 hours ago
1.0 - 4.0 years
6 - 10 Lacs
Pune
Work from Office
Ensure successful initiation, planning, execution, control, and completion of the project by guiding team members on technical aspects and conducting reviews of technical documents and artefacts.
- Lead project development, production support, and maintenance activities.
- Ensure timesheets are completed and the invoicing process is finished on or before the deadline.
- Lead the customer interface for the project on an everyday basis, proactively addressing any issues before they are escalated.
- Create functional and technical specification documents.
- Track open tickets/incidents in the queue, allocate tickets to resources, and ensure that tickets are closed within the deadlines.
- Ensure analysts adhere to SLAs/KPIs/OLAs.
- Ensure that everyone in the delivery team, including yourself, is constantly thinking of ways to do things faster, better, or more economically.
- Lead the project and ensure it complies with software quality processes and stays within timelines.
- Review functional and technical specification documents.
- Serve as the single point of contact for the team to the project stakeholders.
- Promote teamwork; motivate, mentor, and develop subordinates.
- Provide application production support as per the process/RACI (Responsible, Accountable, Consulted, Informed) matrix.
Posted 16 hours ago
6.0 - 11.0 years
20 - 35 Lacs
Kolkata, Bengaluru, Mumbai (All Areas)
Hybrid
We are looking for a skilled and proactive GCP Big Data Engineer with hands-on experience in building and maintaining scalable data pipelines using Google Cloud Platform (GCP) services. The ideal candidate must have a strong foundation in BigQuery, Python, and data engineering best practices. You will work closely with data analysts, architects, and business stakeholders to design efficient data solutions that drive business value.
Key Responsibilities:
- Design, develop, and maintain scalable ETL/ELT data pipelines on GCP.
- Work with BigQuery, Cloud Storage, Cloud Composer, and other GCP tools to ingest, transform, and load data.
- Write efficient, reusable, and modular Python scripts for data processing and automation.
- Optimize data workflows for performance and cost efficiency.
- Ensure data quality, validation, and governance across pipelines.
- Collaborate with data scientists and analysts to understand business requirements and translate them into technical solutions.
- Monitor and troubleshoot data pipeline issues in production environments.
- Implement CI/CD practices for data engineering workflows.
Required Skills:
- 5+ years of experience in data engineering, with at least 4 years on GCP.
- Strong expertise in BigQuery and SQL performance tuning.
- Proficiency in Python for data manipulation, automation, and orchestration.
- Experience with Cloud Composer (Apache Airflow) for workflow management.
- Familiarity with data modeling, partitioning, clustering, and query optimization.
- Strong understanding of data warehouse concepts and best practices.
- Experience with version control (Git) and DevOps tools.
- Excellent problem-solving, communication, and collaboration skills.
Preferred Qualifications:
- GCP Professional Data Engineer certification.
- Experience with other GCP services like Pub/Sub, Dataflow, and Dataproc.
- Exposure to data security and compliance practices.
- Knowledge of additional programming languages such as Java or Scala is a plus.
Education: Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
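The listing above calls for familiarity with BigQuery partitioning, clustering, and query optimization. As a minimal illustration (the table and column names here are hypothetical), a date-partitioned, clustered table can be declared so that filtered queries scan only the relevant partitions:

```python
def build_partitioned_table_ddl(table, columns, partition_col, cluster_cols):
    """Build a BigQuery CREATE TABLE statement that partitions on a DATE
    column and clusters on common filter columns, so queries filtering on
    them scan (and bill for) less data."""
    cols = ",\n  ".join(f"{name} {typ}" for name, typ in columns)
    return (
        f"CREATE TABLE IF NOT EXISTS {table} (\n  {cols}\n)\n"
        f"PARTITION BY {partition_col}\n"
        f"CLUSTER BY {', '.join(cluster_cols)}"
    )

# Hypothetical events table, partitioned by day and clustered by user
ddl = build_partitioned_table_ddl(
    "analytics.events",
    [("event_id", "STRING"), ("user_id", "STRING"), ("event_date", "DATE")],
    "event_date",
    ["user_id"],
)
print(ddl)
```

A query with a `WHERE event_date = ...` predicate against such a table prunes all other partitions before clustering narrows the scan further.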
Posted 16 hours ago
8.0 - 13.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Skills:
- Extensive experience with Google data products (Cloud Data Fusion, BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Dataprep, etc.).
- Expertise in Cloud Data Fusion, BigQuery, and Dataproc.
- Experience with MDM, metadata management, data quality, and data lineage tools.
- End-to-end data engineering and lifecycle management (including non-functional requirements and operations).
- Experience with SQL and NoSQL modern data stores.
- End-to-end solution design skills: prototyping, usability testing, and data visualization literacy.
- Excellent knowledge of the software development life cycle.
Posted 17 hours ago
4.0 - 9.0 years
10 - 20 Lacs
Bengaluru
Remote
Job Title: Software Engineer - GCP Data Engineering
Work Mode: Remote
Base Location: Bengaluru
Experience Required: 4 to 6 Years

Job Summary:
We are seeking a Software Engineer with a strong background in GCP data engineering and a solid understanding of how to build scalable data processing frameworks. The ideal candidate is proficient in data ingestion, transformation, and orchestration using modern cloud-native tools and technologies. This role requires hands-on experience in designing and optimizing ETL pipelines, managing big data workloads, and supporting data quality initiatives.

Key Responsibilities:
- Design and develop scalable data processing solutions using Apache Beam, Spark, and other modern frameworks.
- Build and manage data pipelines on Google Cloud Platform (GCP) using services such as Dataflow, Dataproc, Composer (Airflow), and BigQuery.
- Collaborate with data architects and analysts to understand data models and implement efficient ETL solutions.
- Apply DevOps and CI/CD best practices for code management, testing, and deployment using tools such as GitHub and Cloud Build.
- Ensure data quality, performance tuning, and reliability of data processing systems.
- Work with cross-functional teams to understand business requirements and deliver robust data infrastructure to support analytical use cases.

Required Skills:
- 4 to 6 years of professional experience as a Data Engineer working on cloud platforms, preferably GCP.
- Proficiency in Java and Python with strong problem-solving and analytical skills.
- Hands-on experience with Apache Beam, Apache Spark, Dataflow, Dataproc, Composer (Airflow), and BigQuery.
- Strong understanding of data warehousing concepts and ETL pipeline optimization techniques.
- Experience with cloud-based architectures and DevOps practices.
- Familiarity with version control (GitHub) and CI/CD pipelines.

Preferred Skills:
- Exposure to modern ETL tools and data integration platforms.
- Experience with data governance, data quality frameworks, and metadata management.
- Familiarity with performance tuning in distributed data processing systems.

Tech Stack:
- Cloud: GCP (Dataflow, BigQuery, Dataproc, Composer)
- Programming: Java, Python
- Frameworks: Apache Beam, Apache Spark
- DevOps: GitHub, CI/CD tools, Composer (Airflow)
- ETL/Data Tools: Data ingestion, transformation, and warehousing on GCP
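The role above centers on Apache Beam pipelines running on Dataflow. Beam itself is not imported below; this stdlib-only sketch just mirrors the Map → Filter → GroupByKey → Combine shape of a typical pipeline, over hypothetical "user,amount" records:

```python
from collections import defaultdict

def run_pipeline(records):
    """Mimic a Beam-style pipeline in plain Python:
    Map (parse) -> Filter (positive amounts) -> GroupByKey -> Combine (sum).
    In a real Dataflow job each stage would be a beam.Map / beam.Filter /
    beam.CombinePerKey transform; the per-record logic is the same."""
    # Map: parse "user,amount" strings into (key, value) pairs
    parsed = (line.split(",") for line in records)
    # Filter: keep only rows with a positive amount
    pairs = [(user, float(amt)) for user, amt in parsed if float(amt) > 0]
    # GroupByKey + Combine: sum amounts per user
    totals = defaultdict(float)
    for user, amt in pairs:
        totals[user] += amt
    return dict(totals)

result = run_pipeline(["alice,10.0", "bob,5.5", "alice,2.5", "bob,-1.0"])
print(result)  # {'alice': 12.5, 'bob': 5.5}
```

The value of Beam is that the same stage graph runs batch or streaming on Dataflow without rewriting the per-record logic.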
Posted 18 hours ago
8.0 - 12.0 years
20 - 30 Lacs
Pune
Remote
- 8+ years of hands-on experience in software development, with 3+ years in data engineering practices and tooling.
- 3+ years working with AWS managed services and cloud-native development.
- At least 2 years working with Spark, Python, and SQL using modern data management and orchestration tooling (e.g., Airflow, Databricks, dbt).
- Professional experience with data structures, relational databases, non-relational/NoSQL databases, ETL processes, and complex relational queries.
- Experience developing SaaS (Software as a Service) products.
- Experience with containerization and orchestration technologies (e.g., Docker, Kubernetes).
- Exposure to API development and productization/data delivery at scale through APIs.
- Exceptional problem-solving and analytical skills.
- Excellent communication and teamwork abilities.
- Bachelor's degree in Computer Science, Software Engineering, or a related field (or equivalent practical experience).
Posted 18 hours ago
6.0 - 11.0 years
6 - 9 Lacs
Hyderabad
Work from Office
- At least 8+ years of experience in any of the ETL tools (e.g., Prophecy, DataStage 11.5/11.7, Pentaho).
- At least 3 years of experience in PySpark with GCP (Airflow, Dataproc, BigQuery), capable of configuring data pipelines.
- Strong experience writing complex SQL queries to perform data analysis on databases such as SQL Server, Oracle, and Hive.
- Technical skills: SQL, Python, PySpark, Hive, ETL, Unix, Control-M (or similar scheduling tools).
- Ability to work independently on specialized assignments within the context of project deliverables.
- Take ownership of providing solutions and tools that iteratively increase engineering efficiency. Designs should help embed standard processes, systems, and operational models into the BAU approach for end-to-end execution of data pipelines.
- Proven problem-solving and analytical abilities, including the ability to critically evaluate information gathered from multiple sources, reconcile conflicts, decompose high-level information into details, and apply sound business and technical domain knowledge.
- Communicate openly and honestly. Advanced oral, written, and visual communication and presentation skills; the ability to communicate efficiently at a global level is paramount.
- Ability to deliver materials of the highest quality to management against tight deadlines.
- Ability to work effectively under pressure with competing and rapidly changing priorities.
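The posting above pairs PySpark pipelines with schedulers such as Airflow and Control-M. One recurring task in that setup is backfilling a date-partitioned table; the sketch below (scheduler-agnostic, stdlib only, dates purely illustrative) enumerates the daily run keys a backfill would iterate over:

```python
from datetime import date, timedelta

def daily_run_dates(start: date, end: date) -> list:
    """Enumerate the daily partition keys (YYYY-MM-DD) a scheduler such as
    Airflow or Control-M would trigger between two dates, inclusive; the
    usual input to an idempotent backfill of a date-partitioned table."""
    days = (end - start).days
    return [(start + timedelta(days=i)).isoformat() for i in range(days + 1)]

runs = daily_run_dates(date(2024, 1, 30), date(2024, 2, 2))
print(runs)  # ['2024-01-30', '2024-01-31', '2024-02-01', '2024-02-02']
```

Each run key would typically parameterize one PySpark job that overwrites exactly its own partition, so re-running a day is safe.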
Posted 18 hours ago
8.0 - 13.0 years
15 - 30 Lacs
Pune
Work from Office
Hi, greetings from Peoplefy Infosolutions!

We are hiring for one of our reputed MNC clients based in Pune. We are looking for candidates with 8+ years of experience working as a Data Engineer.

Job Description:
- Design, develop, implement, test, and maintain scalable and efficient data pipelines for large-scale structured and unstructured datasets, including document, image, and event data used in GenAI and ML use cases.
- Collaborate closely with data scientists, AI/ML engineers, MLOps engineers, and Product Owners to understand data requirements and ensure data availability and quality.
- Build and optimize data architectures for both batch and real-time processing.
- Develop and maintain data warehouses and data lakes to store and manage large volumes of structured and unstructured data.
- Implement data validation and monitoring processes to ensure data integrity.
- Implement and manage vector databases (e.g., pgvector, Pinecone, FAISS) and embedding pipelines to support retrieval-augmented architectures.
- Support data sourcing and ingestion strategies, including APIs, data lakes, and message queues.
- Enforce data quality, lineage, observability, and governance standards for AI workloads.
- Work with cross-functional IT and business teams in an Agile environment to deliver successful data solutions.
- Help foster a data-driven culture via information sharing, design for scalability, and operational efficiency.
- Stay updated with the latest trends and best practices in data engineering and big data technologies.

Interested candidates for the above position, kindly share your CV at sneh.ne@peoplefy.com with the following details:
- Experience:
- CTC:
- Expected CTC:
- Notice Period:
- Location:
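The description above mentions vector databases (pgvector, Pinecone, FAISS) and embedding pipelines for retrieval-augmented architectures. The core operation those stores provide is top-k similarity search; a toy stdlib version over hypothetical two-dimensional embeddings looks like this (production stores use approximate indexes rather than this full scan):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def top_k(query, docs, k=2):
    """Return the ids of the k stored embeddings most similar to the query,
    the retrieval step of a retrieval-augmented (RAG) pipeline."""
    scored = sorted(docs.items(), key=lambda kv: cosine(query, kv[1]), reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]

# Hypothetical 2-D embeddings; real ones have hundreds of dimensions
embeddings = {"doc_a": [1.0, 0.0], "doc_b": [0.7, 0.7], "doc_c": [0.0, 1.0]}
print(top_k([1.0, 0.1], embeddings, k=2))  # ['doc_a', 'doc_b']
```

In pgvector this whole function collapses to an `ORDER BY embedding <=> query LIMIT k` query; the embedding pipeline's job is keeping `docs` populated and fresh.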
Posted 18 hours ago
1.0 - 4.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Step into the world of AI innovation with the Experts Community of Soul AI (by Deccan AI). We are looking for India's top 1% Data Engineers for a unique opportunity to work with industry leaders.

Who can be a part of the community?
We are looking for top-tier Data Engineers who are proficient in designing, building, and optimizing data pipelines. If you have experience in this field, this is your chance to collaborate with industry leaders.

What's in it for you?
- Pay above market standards.
- The role is contract based, with project timelines from 2 to 12 months, or freelancing.
- Be a part of an elite community of professionals who can solve complex AI challenges.
- Work location could be: remote (highly likely), onsite at a client location, or Deccan AI's office in Hyderabad or Bangalore.

Responsibilities:
- Design and architect enterprise-scale data platforms, integrating diverse data sources and tools.
- Develop real-time and batch data pipelines to support analytics and machine learning.
- Define and enforce data governance strategies to ensure security, integrity, and compliance, while optimizing data pipelines for high performance, scalability, and cost efficiency in cloud environments.
- Implement solutions for real-time streaming data (Kafka, AWS Kinesis, Apache Flink) and adopt DevOps/DataOps best practices.

Required Skills:
- Strong experience in designing scalable, distributed data systems and programming (Python, Scala, Java), with expertise in Apache Spark, Hadoop, Flink, Kafka, and cloud platforms (AWS, Azure, GCP).
- Proficiency in data modeling, governance, warehousing (Snowflake, Redshift, BigQuery), and security/compliance standards (GDPR, HIPAA).
- Hands-on experience with CI/CD (Terraform, CloudFormation, Airflow, Kubernetes) and data infrastructure optimization (Prometheus, Grafana).

Nice to Have:
- Experience with graph databases, machine learning pipeline integration, real-time analytics, and IoT solutions.
- Contributions to open-source data engineering communities.

What are the next steps?
1. Register on the Soul AI website.
2. Our team will review your profile.
3. Clear all the screening rounds: complete the assessments once you are shortlisted.
4. Profile matching and project allocation: be patient while we align your skills and preferences with an available project.

Skip the noise. Focus on opportunities built for you!
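Real-time streaming with Kafka, Kinesis, or Flink, as listed above, usually reduces to windowed aggregation. A tumbling-window count can be sketched in plain Python (a real Flink job would run this continuously over an unbounded stream; the timestamps and keys here are illustrative):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Assign each (timestamp, key) event to a fixed-size tumbling window
    and count events per (window_start, key). This is the core of the
    streaming aggregations a Flink or Kinesis Analytics job runs, minus
    watermarks and late-data handling."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(5, "click"), (42, "click"), (61, "view"), (119, "click")]
print(tumbling_window_counts(events))
# {(0, 'click'): 2, (60, 'view'): 1, (60, 'click'): 1}
```

What the frameworks add on top of this arithmetic is state management, event-time watermarks, and fault tolerance across workers.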
Posted 19 hours ago
2.0 - 5.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Step into the world of AI innovation with the Deccan AI Experts Community (by Soul AI), where you become a creator, not just a consumer. We are reaching out to the top 1% of Soul AI's Data Visualization Engineers like you for a unique opportunity to work with industry leaders.

What's in it for you?
- Pay above market standards.
- The role is contract based, with project timelines from 2 to 6 months, or freelancing.
- Be a part of an elite community of professionals who can solve complex AI challenges.
- Work location could be: remote, onsite at a client location (US, UAE, UK, India, etc.), or Deccan AI's office in Hyderabad or Bangalore.

Responsibilities:
- Architect and implement enterprise-level BI solutions to support strategic decision-making, along with data democratization by enabling self-service analytics for non-technical users.
- Lead data governance and data quality initiatives to ensure consistency, and design data pipelines and automated reporting solutions using SQL and Python.
- Optimize big data queries and analytics workloads for cost efficiency, and implement real-time analytics dashboards and interactive reports.
- Mentor junior analysts and establish best practices for data visualization.

Required Skills:
- Advanced SQL, Python (Pandas, NumPy), and BI tools (Tableau, Power BI, Looker).
- Expertise in AWS (Athena, Redshift), GCP (BigQuery), or Snowflake.
- Experience with data governance, lineage tracking, and big data tools (Spark, Kafka).
- Exposure to machine learning and AI-powered analytics.

Nice to Have:
- Experience with graph analytics, geospatial data, and visualization libraries (D3.js, Plotly).
- Hands-on experience with BI automation and AI-driven analytics.

Who can be a part of the community?
We are looking for top-tier Data Visualization Engineers with expertise in analyzing and visualizing complex datasets. Proficiency in SQL, Tableau, Power BI, and Python (Pandas, NumPy, Matplotlib) is a plus. If you have experience in this field, this is your chance to collaborate with industry leaders.

What are the next steps?
1. Register on the Soul AI website.
2. Our team will review your profile.
3. Clear all the screening rounds: complete the assessments once you are shortlisted.
4. Profile matching: be patient while we align your skills and preferences with an available project.
5. Project allocation: you'll be deployed on your preferred project!

Skip the noise. Focus on opportunities built for you!
Posted 19 hours ago
3.0 - 8.0 years
13 - 18 Lacs
Hyderabad
Work from Office
Step into the world of AI innovation with the Experts Community of Soul AI (by Deccan AI). We are looking for India's top 1% Data Architects for a unique opportunity to work with industry leaders.

Who can be a part of the community?
We are looking for top-tier Data Architects who are proficient in designing, building, and optimizing data pipelines. If you have experience in this field, this is your chance to collaborate with industry leaders.

What's in it for you?
- Pay above market standards.
- The role is contract based, with project timelines from 2 to 12 months, or freelancing.
- Be a part of an elite community of professionals who can solve complex AI challenges.
- Work location could be: remote (highly likely), onsite at a client location, or Deccan AI's office in Hyderabad or Bangalore.

Responsibilities:
- Design and architect enterprise-scale data platforms, integrating diverse data sources and tools.
- Develop real-time and batch data pipelines to support analytics and machine learning.
- Define and enforce data governance strategies to ensure security, integrity, and compliance, while optimizing data pipelines for high performance, scalability, and cost efficiency in cloud environments.
- Implement solutions for real-time streaming data (Kafka, AWS Kinesis, Apache Flink) and adopt DevOps/DataOps best practices.

Required Skills:
- Strong experience in designing scalable, distributed data systems and programming (Python, Scala, Java), with expertise in Apache Spark, Hadoop, Flink, Kafka, and cloud platforms (AWS, Azure, GCP).
- Proficiency in data modeling, governance, warehousing (Snowflake, Redshift, BigQuery), and security/compliance standards (GDPR, HIPAA).
- Hands-on experience with CI/CD (Terraform, CloudFormation, Airflow, Kubernetes) and data infrastructure optimization (Prometheus, Grafana).

Nice to Have:
- Experience with graph databases, machine learning pipeline integration, real-time analytics, and IoT solutions.
- Contributions to open-source data engineering communities.

What are the next steps?
1. Register on the Soul AI website.
2. Our team will review your profile.
3. Clear all the screening rounds: complete the assessments once you are shortlisted.
4. Profile matching and project allocation: be patient while we align your skills and preferences with an available project.

Skip the noise. Focus on opportunities built for you!
Posted 19 hours ago
1.0 - 4.0 years
6 - 10 Lacs
Mumbai
Work from Office
Step into the world of AI innovation with the Experts Community of Soul AI (by Deccan AI). We are looking for India's top 1% Data Engineers for a unique opportunity to work with industry leaders.

Who can be a part of the community?
We are looking for top-tier Data Engineers who are proficient in designing, building, and optimizing data pipelines. If you have experience in this field, this is your chance to collaborate with industry leaders.

What's in it for you?
- Pay above market standards.
- The role is contract based, with project timelines from 2 to 12 months, or freelancing.
- Be a part of an elite community of professionals who can solve complex AI challenges.
- Work location could be: remote (highly likely), onsite at a client location, or Deccan AI's office in Hyderabad or Bangalore.

Responsibilities:
- Design and architect enterprise-scale data platforms, integrating diverse data sources and tools.
- Develop real-time and batch data pipelines to support analytics and machine learning.
- Define and enforce data governance strategies to ensure security, integrity, and compliance, while optimizing data pipelines for high performance, scalability, and cost efficiency in cloud environments.
- Implement solutions for real-time streaming data (Kafka, AWS Kinesis, Apache Flink) and adopt DevOps/DataOps best practices.

Required Skills:
- Strong experience in designing scalable, distributed data systems and programming (Python, Scala, Java), with expertise in Apache Spark, Hadoop, Flink, Kafka, and cloud platforms (AWS, Azure, GCP).
- Proficiency in data modeling, governance, warehousing (Snowflake, Redshift, BigQuery), and security/compliance standards (GDPR, HIPAA).
- Hands-on experience with CI/CD (Terraform, CloudFormation, Airflow, Kubernetes) and data infrastructure optimization (Prometheus, Grafana).

Nice to Have:
- Experience with graph databases, machine learning pipeline integration, real-time analytics, and IoT solutions.
- Contributions to open-source data engineering communities.

What are the next steps?
1. Register on the Soul AI website.
2. Our team will review your profile.
3. Clear all the screening rounds: complete the assessments once you are shortlisted.
4. Profile matching and project allocation: be patient while we align your skills and preferences with an available project.

Skip the noise. Focus on opportunities built for you!
Posted 19 hours ago
2.0 - 5.0 years
7 - 11 Lacs
Mumbai
Work from Office
Step into the world of AI innovation with the Deccan AI Experts Community (by Soul AI), where you become a creator, not just a consumer. We are reaching out to the top 1% of Soul AI's Data Visualization Engineers like you for a unique opportunity to work with industry leaders.

What's in it for you?
- Pay above market standards.
- The role is contract based, with project timelines from 2 to 6 months, or freelancing.
- Be a part of an elite community of professionals who can solve complex AI challenges.
- Work location could be: remote, onsite at a client location (US, UAE, UK, India, etc.), or Deccan AI's office in Hyderabad or Bangalore.

Responsibilities:
- Architect and implement enterprise-level BI solutions to support strategic decision-making, along with data democratization by enabling self-service analytics for non-technical users.
- Lead data governance and data quality initiatives to ensure consistency, and design data pipelines and automated reporting solutions using SQL and Python.
- Optimize big data queries and analytics workloads for cost efficiency, and implement real-time analytics dashboards and interactive reports.
- Mentor junior analysts and establish best practices for data visualization.

Required Skills:
- Advanced SQL, Python (Pandas, NumPy), and BI tools (Tableau, Power BI, Looker).
- Expertise in AWS (Athena, Redshift), GCP (BigQuery), or Snowflake.
- Experience with data governance, lineage tracking, and big data tools (Spark, Kafka).
- Exposure to machine learning and AI-powered analytics.

Nice to Have:
- Experience with graph analytics, geospatial data, and visualization libraries (D3.js, Plotly).
- Hands-on experience with BI automation and AI-driven analytics.

Who can be a part of the community?
We are looking for top-tier Data Visualization Engineers with expertise in analyzing and visualizing complex datasets. Proficiency in SQL, Tableau, Power BI, and Python (Pandas, NumPy, Matplotlib) is a plus. If you have experience in this field, this is your chance to collaborate with industry leaders.

What are the next steps?
1. Register on the Soul AI website.
2. Our team will review your profile.
3. Clear all the screening rounds: complete the assessments once you are shortlisted.
4. Profile matching: be patient while we align your skills and preferences with an available project.
5. Project allocation: you'll be deployed on your preferred project!

Skip the noise. Focus on opportunities built for you!
Posted 19 hours ago
1.0 - 4.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Step into the world of AI innovation with the Experts Community of Soul AI (by Deccan AI). We are looking for India's top 1% Data Engineers for a unique opportunity to work with industry leaders.

Who can be a part of the community?
We are looking for top-tier Data Engineers who are proficient in designing, building, and optimizing data pipelines. If you have experience in this field, this is your chance to collaborate with industry leaders.

What's in it for you?
- Pay above market standards.
- The role is contract based, with project timelines from 2 to 12 months, or freelancing.
- Be a part of an elite community of professionals who can solve complex AI challenges.
- Work location could be: remote (highly likely), onsite at a client location, or Deccan AI's office in Hyderabad or Bangalore.

Responsibilities:
- Design and architect enterprise-scale data platforms, integrating diverse data sources and tools.
- Develop real-time and batch data pipelines to support analytics and machine learning.
- Define and enforce data governance strategies to ensure security, integrity, and compliance, while optimizing data pipelines for high performance, scalability, and cost efficiency in cloud environments.
- Implement solutions for real-time streaming data (Kafka, AWS Kinesis, Apache Flink) and adopt DevOps/DataOps best practices.

Required Skills:
- Strong experience in designing scalable, distributed data systems and programming (Python, Scala, Java), with expertise in Apache Spark, Hadoop, Flink, Kafka, and cloud platforms (AWS, Azure, GCP).
- Proficiency in data modeling, governance, warehousing (Snowflake, Redshift, BigQuery), and security/compliance standards (GDPR, HIPAA).
- Hands-on experience with CI/CD (Terraform, CloudFormation, Airflow, Kubernetes) and data infrastructure optimization (Prometheus, Grafana).

Nice to Have:
- Experience with graph databases, machine learning pipeline integration, real-time analytics, and IoT solutions.
- Contributions to open-source data engineering communities.

What are the next steps?
1. Register on the Soul AI website.
2. Our team will review your profile.
3. Clear all the screening rounds: complete the assessments once you are shortlisted.
4. Profile matching and project allocation: be patient while we align your skills and preferences with an available project.

Skip the noise. Focus on opportunities built for you!
Posted 19 hours ago
2.0 - 5.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Step into the world of AI innovation with the Deccan AI Experts Community (by Soul AI), where you become a creator, not just a consumer. We are reaching out to the top 1% of Soul AI's Data Visualization Engineers like you for a unique opportunity to work with industry leaders.

What's in it for you?
- Pay above market standards.
- The role is contract based, with project timelines from 2 to 6 months, or freelancing.
- Be a part of an elite community of professionals who can solve complex AI challenges.
- Work location could be: remote, onsite at a client location (US, UAE, UK, India, etc.), or Deccan AI's office in Hyderabad or Bangalore.

Responsibilities:
- Architect and implement enterprise-level BI solutions to support strategic decision-making, along with data democratization by enabling self-service analytics for non-technical users.
- Lead data governance and data quality initiatives to ensure consistency, and design data pipelines and automated reporting solutions using SQL and Python.
- Optimize big data queries and analytics workloads for cost efficiency, and implement real-time analytics dashboards and interactive reports.
- Mentor junior analysts and establish best practices for data visualization.

Required Skills:
- Advanced SQL, Python (Pandas, NumPy), and BI tools (Tableau, Power BI, Looker).
- Expertise in AWS (Athena, Redshift), GCP (BigQuery), or Snowflake.
- Experience with data governance, lineage tracking, and big data tools (Spark, Kafka).
- Exposure to machine learning and AI-powered analytics.

Nice to Have:
- Experience with graph analytics, geospatial data, and visualization libraries (D3.js, Plotly).
- Hands-on experience with BI automation and AI-driven analytics.

Who can be a part of the community?
We are looking for top-tier Data Visualization Engineers with expertise in analyzing and visualizing complex datasets. Proficiency in SQL, Tableau, Power BI, and Python (Pandas, NumPy, Matplotlib) is a plus. If you have experience in this field, this is your chance to collaborate with industry leaders.

What are the next steps?
1. Register on the Soul AI website.
2. Our team will review your profile.
3. Clear all the screening rounds: complete the assessments once you are shortlisted.
4. Profile matching: be patient while we align your skills and preferences with an available project.
5. Project allocation: you'll be deployed on your preferred project!

Skip the noise. Focus on opportunities built for you!
Posted 19 hours ago
3.0 - 8.0 years
13 - 18 Lacs
Mumbai
Work from Office
Step into the world of AI innovation with the Experts Community of Soul AI (by Deccan AI). We are looking for India's top 1% Data Architects for a unique opportunity to work with industry leaders.

Who can be a part of the community?
We are looking for top-tier Data Architects who are proficient in designing, building, and optimizing data pipelines. If you have experience in this field, this is your chance to collaborate with industry leaders.

What's in it for you?
- Pay above market standards.
- The role is contract based, with project timelines from 2 to 12 months, or freelancing.
- Be a part of an elite community of professionals who can solve complex AI challenges.
- Work location could be: remote (highly likely), onsite at a client location, or Deccan AI's office in Hyderabad or Bangalore.

Responsibilities:
- Design and architect enterprise-scale data platforms, integrating diverse data sources and tools.
- Develop real-time and batch data pipelines to support analytics and machine learning.
- Define and enforce data governance strategies to ensure security, integrity, and compliance, while optimizing data pipelines for high performance, scalability, and cost efficiency in cloud environments.
- Implement solutions for real-time streaming data (Kafka, AWS Kinesis, Apache Flink) and adopt DevOps/DataOps best practices.

Required Skills:
- Strong experience in designing scalable, distributed data systems and programming (Python, Scala, Java), with expertise in Apache Spark, Hadoop, Flink, Kafka, and cloud platforms (AWS, Azure, GCP).
- Proficiency in data modeling, governance, warehousing (Snowflake, Redshift, BigQuery), and security/compliance standards (GDPR, HIPAA).
- Hands-on experience with CI/CD (Terraform, CloudFormation, Airflow, Kubernetes) and data infrastructure optimization (Prometheus, Grafana).

Nice to Have:
- Experience with graph databases, machine learning pipeline integration, real-time analytics, and IoT solutions.
- Contributions to open-source data engineering communities.

What are the next steps?
1. Register on the Soul AI website.
2. Our team will review your profile.
3. Clear all the screening rounds: complete the assessments once you are shortlisted.
4. Profile matching and project allocation: be patient while we align your skills and preferences with an available project.

Skip the noise. Focus on opportunities built for you!
Posted 19 hours ago
1.0 - 4.0 years
6 - 10 Lacs
Kolkata
Work from Office
Step into the world of AI innovation with the Experts Community of Soul AI (by Deccan AI). We are looking for India's top 1% Data Engineers for a unique opportunity to work with industry leaders.

Who can be a part of the community?
We are looking for top-tier Data Engineers who are proficient in designing, building, and optimizing data pipelines. If you have experience in this field, this is your chance to collaborate with industry leaders.

What's in it for you?
- Pay above market standards.
- The role is contract based, with project timelines from 2 to 12 months, or freelancing.
- Be a part of an elite community of professionals who can solve complex AI challenges.
- Work location could be: remote (highly likely), onsite at a client location, or Deccan AI's office in Hyderabad or Bangalore.

Responsibilities:
- Design and architect enterprise-scale data platforms, integrating diverse data sources and tools.
- Develop real-time and batch data pipelines to support analytics and machine learning.
- Define and enforce data governance strategies to ensure security, integrity, and compliance, while optimizing data pipelines for high performance, scalability, and cost efficiency in cloud environments.
- Implement solutions for real-time streaming data (Kafka, AWS Kinesis, Apache Flink) and adopt DevOps/DataOps best practices.

Required Skills:
- Strong experience in designing scalable, distributed data systems and programming (Python, Scala, Java), with expertise in Apache Spark, Hadoop, Flink, Kafka, and cloud platforms (AWS, Azure, GCP).
- Proficiency in data modeling, governance, warehousing (Snowflake, Redshift, BigQuery), and security/compliance standards (GDPR, HIPAA).
- Hands-on experience with CI/CD (Terraform, CloudFormation, Airflow, Kubernetes) and data infrastructure optimization (Prometheus, Grafana).

Nice to Have:
- Experience with graph databases, machine learning pipeline integration, real-time analytics, and IoT solutions.
- Contributions to open-source data engineering communities.

What are the next steps?
1. Register on the Soul AI website.
2. Our team will review your profile.
3. Clear all the screening rounds: complete the assessments once you are shortlisted.
4. Profile matching and project allocation: be patient while we align your skills and preferences with an available project.

Skip the noise. Focus on opportunities built for you!
Posted 19 hours ago
2.0 - 5.0 years
7 - 11 Lacs
Kolkata
Work from Office
Step into the world of AI innovation with the Deccan AI Experts Community (by Soul AI), where you become a creator, not just a consumer. We are reaching out to the top 1% of Soul AI's Data Visualization Engineers like you for a unique opportunity to work with industry leaders.

What's in it for you?
- Pay above market standards.
- The role is contract based, with project timelines from 2 to 6 months, or freelancing.
- Be a part of an elite community of professionals who can solve complex AI challenges.
- Work location could be: remote, onsite at a client location (US, UAE, UK, India, etc.), or Deccan AI's office in Hyderabad or Bangalore.

Responsibilities:
- Architect and implement enterprise-level BI solutions to support strategic decision-making, along with data democratization by enabling self-service analytics for non-technical users.
- Lead data governance and data quality initiatives to ensure consistency, and design data pipelines and automated reporting solutions using SQL and Python.
- Optimize big data queries and analytics workloads for cost efficiency, and implement real-time analytics dashboards and interactive reports.
- Mentor junior analysts and establish best practices for data visualization.

Required Skills:
- Advanced SQL, Python (Pandas, NumPy), and BI tools (Tableau, Power BI, Looker).
- Expertise in AWS (Athena, Redshift), GCP (BigQuery), or Snowflake.
- Experience with data governance, lineage tracking, and big data tools (Spark, Kafka).
- Exposure to machine learning and AI-powered analytics.

Nice to Have:
- Experience with graph analytics, geospatial data, and visualization libraries (D3.js, Plotly).
- Hands-on experience with BI automation and AI-driven analytics.

Who can be a part of the community?
We are looking for top-tier Data Visualization Engineers with expertise in analyzing and visualizing complex datasets. Proficiency in SQL, Tableau, Power BI, and Python (Pandas, NumPy, Matplotlib) is a plus. If you have experience in this field, this is your chance to collaborate with industry leaders.

What are the next steps?
1. Register on the Soul AI website.
2. Our team will review your profile.
3. Clear all the screening rounds: complete the assessments once you are shortlisted.
4. Profile matching: be patient while we align your skills and preferences with an available project.
5. Project allocation: you'll be deployed on your preferred project!

Skip the noise. Focus on opportunities built for you!
Posted 19 hours ago
BigQuery, a powerful cloud-based data warehouse provided by Google Cloud, is in high demand in the job market in India. Companies are increasingly relying on BigQuery to analyze and manage large datasets, driving the need for skilled professionals in this area.
The average salary range for BigQuery professionals in India varies based on experience level. Entry-level positions may start at around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.
In the field of BigQuery, a typical career progression may include roles such as Junior Developer, Developer, Senior Developer, Tech Lead, and eventually moving into managerial positions such as Data Architect or Data Engineering Manager.
Alongside BigQuery, professionals in this field often benefit from having skills in SQL, data modeling, data visualization tools like Tableau or Power BI, and cloud platforms like Google Cloud Platform or AWS.
As you explore opportunities in the BigQuery job market in India, remember to continuously upskill and stay updated with the latest trends in data analytics and cloud computing. Prepare thoroughly for interviews by practicing common BigQuery concepts and showcase your hands-on experience with the platform. With dedication and perseverance, you can excel in this dynamic field and secure rewarding career opportunities. Good luck!
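As a quick illustration of the kind of BigQuery concept interviewers often probe, the sketch below pairs a typical BigQuery standard SQL aggregation with its pandas equivalent. The project, table, and column names are hypothetical, and the pandas sample stands in for the query result so the logic can be checked locally.

```python
import pandas as pd

# A common interview-style BigQuery standard SQL aggregation
# (hypothetical project and table names):
QUERY = """
SELECT department, COUNT(*) AS n_employees, AVG(salary) AS avg_salary
FROM `my-project.hr.employees`
GROUP BY department
HAVING COUNT(*) > 1
"""

# The same GROUP BY / HAVING logic expressed in pandas on a small sample:
employees = pd.DataFrame({
    "department": ["eng", "eng", "sales", "sales", "hr"],
    "salary": [100, 120, 80, 90, 70],
})

summary = (
    employees.groupby("department")
             .agg(n_employees=("salary", "size"),
                  avg_salary=("salary", "mean"))
             .query("n_employees > 1")   # mirrors the HAVING clause
             .reset_index()
)
print(summary)
```

Being able to translate between SQL and pandas like this is a common expectation in BigQuery-focused interviews.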