5.0 - 15.0 years
0 Lacs
karnataka
On-site
As an experienced data engineer with a focus on Google Cloud Platform (GCP), you will be responsible for designing and implementing end-to-end data engineering architectures tailored to meet both business and technical requirements. Your key responsibilities will include leading the implementation of these architectures, ensuring adherence to best practices and compliance standards.

A significant aspect of your role will involve driving Kafka migration strategies, encompassing tasks such as topic identification, optimization, and migration planning. You will be expected to develop and execute a framework for Kafka topic selection, blueprinting, and migration to GCP. Additionally, you will play a pivotal role in identifying and transitioning non-compliant Kafka topics to alternative, compliant patterns.

To enhance operational efficiency, you will work towards standardizing infrastructure and streaming patterns across different tenants and environments. Collaboration with engineering, DevOps, and business teams will be crucial to ensure seamless project delivery. Your expertise will also be instrumental in providing architectural governance, comprehensive documentation, and technical leadership throughout the project lifecycle. Continuous evaluation and enhancement of the cloud architecture for improved performance, cost efficiency, and scalability will be a key focus area, so the role demands a thorough understanding of GCP services such as BigQuery, Dataflow, Pub/Sub, Cloud Storage, Cloud Composer, and IAM.

To excel in this role, you should possess a minimum of 12-15 years of IT experience, with at least 5 years specifically in GCP data engineering and architecture. Demonstrable experience in Kafka migrations, proficiency in data pipeline design, real-time and batch processing, cloud-native integration patterns, and blueprinting frameworks for Kafka topic migration is essential. A deep understanding of cloud security, networking, and infrastructure standardization is also imperative, and strong communication skills, adept documentation capabilities, and effective stakeholder management will further bolster your success in this position.

While not mandatory, GCP certifications such as Professional Cloud Architect or Professional Data Engineer, familiarity with Terraform, CI/CD pipelines, and DevOps practices, and experience with Agile methodologies and multi-cloud environments will be advantageous. Exposure to governance frameworks and compliance-driven architectures will also be beneficial.
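For illustration, a minimal sketch of one migration pattern this listing implies (bridging a Kafka topic into Pub/Sub during cutover), assuming the kafka-python and google-cloud-pubsub client libraries; the project, broker, and topic names are placeholders, not details from the posting:

```python
# Minimal Kafka-to-Pub/Sub bridge sketch; all names are hypothetical.
from kafka import KafkaConsumer
from google.cloud import pubsub_v1

PROJECT_ID = "my-project"          # hypothetical GCP project
KAFKA_TOPIC = "orders"             # hypothetical source topic
PUBSUB_TOPIC = "orders-migrated"   # hypothetical destination topic

consumer = KafkaConsumer(
    KAFKA_TOPIC,
    bootstrap_servers=["kafka-broker:9092"],
    auto_offset_reset="earliest",
)
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, PUBSUB_TOPIC)

for record in consumer:
    # Re-publish each Kafka record to Pub/Sub, carrying the original
    # partition/offset as attributes for reconciliation during cutover.
    future = publisher.publish(
        topic_path,
        data=record.value,
        source_partition=str(record.partition),
        source_offset=str(record.offset),
    )
    future.result()  # block for the ack; batch publishes in production
```

In practice such a bridge would batch publishes and run as a managed job (e.g., on Dataflow), but the shape of the topic cutover is the same.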
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
As a Principal Architect/Cloud Evangelist at Niveus Solutions, a distinguished Google Cloud Premier Partner, you will play a crucial role in empowering businesses with transformative cloud solutions. Your responsibilities will demand strategic thought leadership, deep technical mastery, and outstanding communication abilities to drive the adoption of Google Cloud solutions and secure high-impact client engagements.

With a minimum of 18 years of overall IT experience, including at least 5 years in a dedicated senior architect or lead technical role focused on Google Cloud Platform, you will have a demonstrable track record of success in pre-sales or solutions architecture. Your expertise will cover a wide range of Google Cloud services, including compute, networking, storage, data & analytics, AI/ML, security, operations, and identity. You will need to hold multiple current Google Cloud certifications: Professional Cloud Architect is mandatory, while Professional Data Engineer, Professional Cloud Security Engineer, or Professional Machine Learning Engineer are highly desirable. Your exceptional written and verbal communication skills will be vital in articulating complex technical concepts clearly to both technical and non-technical audiences.

Being part of Niveus Solutions will allow you to work on groundbreaking Google Cloud projects, collaborate with a vibrant team of cloud experts, and benefit from continuous learning opportunities and exposure to the latest cloud technologies. The organization offers a competitive compensation package, comprehensive benefits, and a dynamic work environment focused on innovation and operational excellence. If you are a visionary architect with a passion for Google Cloud and the ability to inspire and lead, we encourage you to apply and contribute to the success story of Niveus Solutions.
Posted 2 days ago
7.0 - 11.0 years
0 Lacs
kochi, kerala
On-site
As a Data Architect with over 10 years of experience, you will be responsible for designing and implementing scalable, secure, and cost-effective data architectures using Google Cloud Platform (GCP). Your primary duties will involve leading the design and development of data pipelines with tools such as BigQuery, Dataflow, and Cloud Storage. Additionally, you will architect and implement data lakes, data warehouses, and real-time data processing solutions on GCP while ensuring alignment with business goals, governance, and compliance requirements.

Collaboration with stakeholders to define the data strategy and roadmap will be a key aspect of your role, along with designing and deploying BigQuery solutions optimized for performance and cost efficiency. Building and maintaining ETL/ELT pipelines for large-scale data processing, leveraging tools like Cloud Pub/Sub, Dataflow, and Cloud Functions for real-time data integration, and implementing best practices for data security, privacy, and compliance in cloud environments are also significant responsibilities.

Your expertise will be utilized to integrate machine learning workflows with data pipelines and analytics tools, define data governance frameworks, manage data lineage, and lead data modeling efforts to ensure consistency, accuracy, and performance across systems. Optimizing cloud infrastructure for scalability, performance, and reliability, as well as mentoring junior team members and ensuring adherence to architectural standards, will be essential to your role. You will also collaborate with DevOps teams to implement Infrastructure as Code using tools like Terraform and Cloud Deployment Manager, ensure high availability and disaster recovery in data systems, conduct technical reviews, audits, and performance tuning, and design solutions for multi-region and multi-cloud data architectures.

Staying updated on emerging technologies and trends in data engineering and GCP, driving innovation in data architecture by recommending new tools and services, and holding a Google Cloud certification are preferred qualifications. Strong skills in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, SQL, Python, data warehousing, data lakes, and real-time data pipelines, along with experience in cloud security, data governance, and compliance frameworks, are also required. Leadership experience, the ability to mentor technical teams, and excellent communication and collaboration skills are essential for success in this role.
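As a concrete illustration of the performance-and-cost optimization work this role describes, the sketch below creates a partitioned, clustered BigQuery table via the Python client; the dataset, table, and column names are assumptions for illustration only:

```python
# Sketch: date-partitioned, clustered BigQuery table so queries scan less data.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

ddl = """
CREATE TABLE IF NOT EXISTS analytics.events_optimized
PARTITION BY DATE(event_ts)        -- prune partitions by date predicate
CLUSTER BY user_id, event_name     -- co-locate rows for common filters
AS SELECT * FROM analytics.events_raw
"""
client.query(ddl).result()

# Downstream queries filtering on event_ts and user_id now scan only the
# relevant partitions/blocks, reducing both latency and cost.
```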
Posted 2 days ago
10.0 - 15.0 years
0 Lacs
delhi
On-site
As a Data Architect at Wingify, you will lead and mentor a team of data engineers to ensure high performance and career growth. Your primary focus will be on architecting and optimizing scalable data infrastructure to guarantee high availability and reliability. Additionally, you will drive the development and implementation of data governance frameworks and best practices, working closely with cross-functional teams to define and execute a data roadmap. Your role will also involve optimizing data processing workflows for performance and cost efficiency, ensuring data security, compliance, and quality across all data platforms, and fostering a culture of innovation and technical excellence within the data team.

The ideal candidate should have 10+ years of experience in software/data engineering, with at least 3 years in a leadership role. You should have expertise in backend development using programming languages such as Java, PHP, Python, Node.js, GoLang, and JavaScript, plus HTML and CSS. Proficiency in SQL, Python, and Scala for data processing and analytics is essential, along with a strong understanding of cloud platforms such as AWS, GCP, or Azure and their data services. You should also possess a strong foundation in HLD and LLD as well as design patterns, preferably using Spring Boot or Google Guice. Experience with big data technologies such as Spark, Hadoop, Kafka, and distributed computing frameworks, as well as hands-on experience with data warehousing solutions like Snowflake, Redshift, or BigQuery, will be advantageous. Deep knowledge of data governance, security, and compliance standards such as GDPR and SOC 2 is also required.

Preferred qualifications include experience in machine learning infrastructure or MLOps, exposure to real-time data processing and analytics, an interest in data structures, algorithm analysis and design, multicore programming, and scalable architecture, as well as prior experience in a SaaS or high-growth tech company.
Posted 2 days ago
1.0 - 5.0 years
0 Lacs
indore, madhya pradesh
On-site
We are searching for a Python web developer to act as a bridge between front-end development and machine learning infrastructure. Your main responsibility will involve constructing dynamic web interfaces, integrating with real-time ML APIs, and helping deliver data-driven experiences to end users.

As a Python web developer with 1-3 years of experience, your key responsibilities will include designing, building, and maintaining scalable web applications that interact with machine learning models. You will develop REST APIs and services using FastAPI for seamless integration with real-time ML models, and handle the consumption and serving of predictions from models using Python while integrating them with front-end components.

You will also work on real-time data pipelines and backend services, collaborating with data teams to establish infrastructure for training, deployment, and monitoring of ML models. Managing large-scale data flows using Airflow, Redis, and PostgreSQL will be part of your routine tasks, as will deploying containerized applications with Docker and overseeing them on AWS. The role further entails integrating models developed with PyTorch, TensorFlow, and Hugging Face; participating in LLM fine-tuning, embeddings-based search, RAG pipelines, and multi-modal model integrations; tracking and managing experiments using MLflow; developing and supporting systems using Ray for scalable and distributed processing; and collaborating on projects using LangChain for chaining LLM operations.

This is a full-time position based in Indore, Madhya Pradesh, and requires in-person work.

Required Experience:
- Python: 1 year

Location:
- Indore, Madhya Pradesh
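A minimal sketch of the FastAPI-plus-model serving pattern described above might look like this; the model file name and feature schema are illustrative assumptions, not details from the posting (run with `uvicorn main:app`):

```python
# Sketch: serve real-time predictions from a pre-trained model via FastAPI.
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.pkl")  # hypothetical pre-trained scikit-learn model

class Features(BaseModel):
    values: list[float]  # assumed flat numeric feature vector

@app.post("/predict")
def predict(features: Features):
    # Run a single real-time prediction with the loaded model.
    prediction = model.predict([features.values])
    return {"prediction": prediction.tolist()}
```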
Posted 2 days ago
5.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
The role of Data Engineer at Atgeir Solutions in Pune requires a Bachelor's or Master's degree in Computer Science, Engineering, or a related field, with a minimum of 5-10 years of experience in IT consulting & services. As a Data Engineer, you will design, develop, and optimize data infrastructure and workflows to drive insightful analytics, enhance AI/ML models, and facilitate data-driven decision-making for clients.

Your primary responsibilities will include designing and implementing scalable data architectures, ETL pipelines, and data workflows to process, store, and analyze large datasets effectively. You will integrate and consolidate data from various sources, ensuring data integrity and quality while implementing data governance practices and complying with information security standards. Collaborating closely with data scientists, software engineers, and business analysts, you will align data engineering solutions with business goals and requirements. Optimizing database performance, query execution, and storage solutions for efficiency is also a crucial part of the role, and you will contribute to innovative projects and mentor junior team members, fostering a culture of continuous learning and development.

The ideal candidate should possess proficiency in programming languages such as Python, Java, or Scala, along with hands-on experience in big data technologies like Hadoop and Spark. Strong knowledge of SQL and NoSQL databases, familiarity with data warehousing tools, and experience with cloud platforms such as GCP, AWS, or Azure are essential, as is expertise in ETL/ELT processes, data modeling, and data lake architectures, plus knowledge of data visualization tools. Academic experience teaching relevant technical subjects and the ability to explain complex concepts to diverse audiences are beneficial. Soft skills including excellent analytical and problem-solving abilities, strong interpersonal skills, and the capacity to collaborate effectively in cross-functional teams are highly valued.

Preferred qualifications include certifications in cloud technologies, familiarity with AI/ML tools and frameworks, and contributions to research papers, technical blogs, or open-source projects.
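To make the ETL responsibilities concrete, here is a hedged PySpark skeleton of an extract-transform-load flow of the kind the role describes; the bucket paths and column names are hypothetical:

```python
# Illustrative PySpark ETL skeleton; all paths and columns are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw JSON landed in object storage.
raw = spark.read.json("gs://raw-bucket/orders/")  # assumed GCS path

# Transform: enforce basic quality rules and derive analytics-friendly columns.
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("created_at"))
)

# Load: write partitioned Parquet for the curated lake/warehouse layer.
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "gs://curated-bucket/orders/"
)
```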
Posted 3 days ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
The purpose of this role is to create and support business intelligence solutions, driving BI requirements with clients and ensuring accurate reporting documentation. You will have a strong understanding of BI tools and technologies, directly interfacing with clients, third-party entities, and other teams.

Key responsibilities include possessing a higher-level understanding of database and data management concepts, communicating effectively with both internal and external stakeholders, defining best practices for the design and development of data layers, and participating in client engagements as the reporting Subject Matter Expert (SME). You will collaborate with database administrators, developers, data stewards, and others to ensure accurate collection and analysis of reporting requirements, develop complex worksheets and dashboards for effective storytelling, and create ad-hoc data sets for end users. Additionally, you will develop custom tables/views and data models across various databases such as SQL Server, Snowflake, BigQuery, and/or Redshift.

This position is based at DGS India in Chennai at Anna Nagar Tyche Towers, under the brand Paragon. It is a full-time role with a permanent contract.
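As a small illustration of the custom-view work described above, the sketch below creates a reusable reporting view in BigQuery through the Python client; the dataset, table, and metric names are assumptions:

```python
# Sketch: a reporting view that BI dashboards can query directly.
from google.cloud import bigquery

client = bigquery.Client()
client.query("""
CREATE OR REPLACE VIEW reporting.daily_sales AS
SELECT
  DATE(order_ts)  AS order_date,
  region,
  COUNT(*)        AS orders,
  SUM(net_amount) AS revenue
FROM sales.orders
GROUP BY order_date, region
""").result()
# Worksheets and dashboards can now read reporting.daily_sales instead of
# repeating the aggregation logic in every report.
```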
Posted 3 days ago
6.0 - 10.0 years
0 Lacs
haryana
On-site
Join GlobalLogic and be a valuable part of the team working on a significant software project for a world-class company providing M2M/IoT 4G/5G modules to industries such as automotive, healthcare, and logistics. As part of our engagement, you will contribute to developing end-user modules' firmware, implementing new features, maintaining compatibility with the latest telecommunication and industry standards, and analyzing and estimating customer requirements.

Requirements:
- BA/BS degree in Computer Science, Mathematics, or a related technical field, or equivalent practical experience.
- Experience in Cloud SQL and Cloud Bigtable.
- Familiarity with Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub, and Genomics.
- Experience in Google Transfer Appliance, Cloud Storage Transfer Service, and BigQuery Data Transfer.
- Proficiency in data processing software (e.g., Hadoop, Kafka, Spark, Pig, Hive) and data processing algorithms (MapReduce, Flume).
- Experience working with technical customers.
- Proficiency in writing software in languages such as Java or Python.
- 6-10 years of relevant consulting, industry, or technology experience.
- Strong problem-solving and troubleshooting skills.
- Strong communication skills.

Responsibilities:
- Work with data warehouses, including technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools.
- Provide technical consulting.
- Architect and develop software or internet-scale, production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure (good to have).
- Work with big data, information retrieval, data mining, or machine learning, and build multi-tier high-availability applications with modern web technologies.
- Apply knowledge of ITIL and/or agile methodologies.
- Google Data Engineer certified.

What we offer:
At GlobalLogic, we prioritize a culture of caring, continuous learning and development, interesting & meaningful work, balance and flexibility, and integrity. You'll have the opportunity to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders, learn and grow daily, work on impactful projects, achieve work-life balance, and be part of a high-trust organization committed to integrity.

About GlobalLogic:
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world's largest and most forward-thinking companies. Since 2000, we've been helping create innovative digital products and experiences, collaborating with clients to transform businesses and redefine industries through intelligent products, platforms, and services.
Posted 3 days ago
7.0 - 11.0 years
0 Lacs
maharashtra
On-site
As a Staff Engineer - Data, Digital Business at SonyLIV, your primary focus will be on leading the data engineering strategy. You will be responsible for architecting a scalable data infrastructure, driving innovation in data processing, ensuring operational excellence, and fostering a high-performance team. Your role is crucial in enabling data-driven insights for OTT content and user engagement.

Your responsibilities will include defining the technical vision for SonyLIV's data and analytics platform, leading innovation in data processing and architecture, ensuring operational excellence in data systems, building and mentoring a high-caliber data engineering team, collaborating with cross-functional teams, and driving data quality and business insights. You will work closely with Data Scientists, Software Engineers, and Product Managers to deliver scalable data solutions for personalization algorithms, recommendation engines, and content analytics.

To be successful in this role, you should have a minimum of 7+ years of progressive experience in data engineering, business intelligence, and data warehousing. You must have a proven track record in building, scaling, and managing large data engineering teams, as well as expertise in designing and implementing scalable data architectures using modern technologies like Spark, Kafka, Snowflake, and cloud services. Proficiency in SQL and experience with object-oriented programming languages such as Python or Java will be essential for custom data solutions and pipeline optimization. Additionally, you should have strong experience establishing and enforcing SLAs for data availability, accuracy, and latency, and extensive knowledge of A/B testing methodologies and statistical analysis. Skills in data governance, data privacy, and compliance are also required, along with experience implementing security protocols and controls within large data ecosystems.

A Bachelor's or Master's degree in Computer Science, Mathematics, Physics, or a related technical field, and experience managing the end-to-end data engineering lifecycle, would be considered a plus. Familiarity with automated data lineage and data auditing tools, as well as expertise with BI and visualization tools and advanced processing frameworks, are also preferred qualifications.

Joining the SonyLIV team will give you the opportunity to drive the future of data-driven entertainment. You will collaborate with some of the brightest minds in the industry, access a comprehensive data set, and leverage cutting-edge technology to make a tangible impact on the products delivered and the viewers engaged. If you are passionate about working with big data and shaping the direction of products that impact millions of users daily, we look forward to connecting with you.
Posted 3 days ago
2.0 - 14.0 years
0 Lacs
kochi, kerala
On-site
As an experienced professional with 2.5 to 14 years of experience, you will work on development projects in data warehousing on GCP platforms. You should have a strong understanding of dimensional modeling and expertise in MySQL, PL/SQL, GCP, and BigQuery; a GCP certification would be an added advantage for this role. Your work will involve Cloud Composer, DAGs, and Airflow, as well as developing REST APIs, and you should have a proven track record in analytical thinking and problem-solving, along with excellent communication skills.

In addition to your technical skills, you will be expected to demonstrate leadership by leading team members, translating requirements into technical solutions, assigning tasks, coordinating the team, and reviewing deliverables to ensure they stay on schedule.

If you are based in Kochi or willing to relocate for a walk-in interview on 1st February, and you meet the above requirements, we encourage you to apply for this exciting opportunity.
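Where the listing mentions Cloud Composer, DAGs, and Airflow, a minimal DAG sketch looks like the following; the task logic, IDs, and schedule are illustrative assumptions:

```python
# Minimal Airflow DAG of the kind run on Cloud Composer.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from source")  # placeholder task body

def load():
    print("load data into BigQuery")  # placeholder task body

with DAG(
    dag_id="daily_dwh_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # simple linear dependency
```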
Posted 4 days ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
As a Senior Data Engineer at Notified in Bangalore, you will play a crucial role in designing and implementing complex ETL processes using Informatica Intelligent Cloud Services (IICS). Your primary responsibility will be to extract, transform, and load data efficiently across multiple sources and destinations. You will work closely with cross-functional teams to gather project requirements, develop ETL specifications, and ensure successful execution of data integration strategies. Your expertise in data warehousing concepts and techniques will be essential in designing and maintaining scalable, high-performing data models and schemas.

In this role, you will also be responsible for ensuring real-time data synchronization to optimize system performance and enhance data-driven decision-making. Your ability to work with various data warehouses such as Snowflake, MSSQL, MySQL, and BigQuery will be key. Additionally, you will handle ongoing platform administration tasks including user management, data security, performance tuning, and system upgrades to ensure optimal performance and data reliability.

As part of the team, you will continuously explore and adapt to new technologies and methodologies in the data engineering space, fostering a culture of innovation and continuous improvement. Troubleshooting and resolving integration issues within the Informatica platform while maintaining the highest level of data quality and integrity will also be part of your responsibilities, as will maintaining and optimizing existing ETL processes, identifying areas for improvement, and aligning them with the organization's strategic goals.

To be successful in this role, you should have 6-8 years of experience in Informatica application integration (IICS), data integration, ETL jobs, Snowflake, and SQL, preferably in large and complex data environments. Strong knowledge of Informatica PowerCenter, IDQ, and IICS is mandatory, and you should have experience in SQL query writing and building data warehousing models, data marts, stored procedures, and views for front-end reporting. A bachelor's degree in computer science, engineering, or a related field is required, with a master's degree preferred. Basic knowledge of Snowflake DB, experience with Salesforce, ServiceNow, or ConnectWise data, and real-time data synchronization are good to have, as is familiarity with Tableau/Power BI and Snowflake/BigQuery tools. Strong leadership, communication, and problem-solving skills are essential.

At Notified, we are committed to creating a more connected world by providing the tools needed to amplify stories. We believe in helping people and brands share their stories globally and strive to deliver wisdom and insight to our clients. As an equal opportunities employer, we celebrate and support differences to build success.
Posted 4 days ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
The team member will work directly with the client in a fast-paced environment to address their day-to-day analytics needs. You will incorporate business context to design and develop solutions leveraging statistical and advanced analytics methodologies, perform root cause analysis on internal and external data and processes to answer specific business questions and identify improvement opportunities, and build efficiencies through standardization and automation.

The indicative areas of work include Digital Analytics, where you will implement, audit, and leverage GA4 to measure and enhance product usage, using GTM to track custom events and visualizing data through Looker Studio via BigQuery. User Analytics will involve understanding user behavior in the app, feature adoption, retention and churn drivers, transaction behavior across features, offer and scratch card redemption, and the impact of email and notifications. Marketing measurement will entail evaluating the effectiveness of marketing campaigns across channels such as ATL and performance media; a hedged example of querying the GA4 export follows this listing.

If you enjoy working in a dynamic environment with passionate colleagues and thrive on addressing analytical challenges with creativity and innovation, this role offers an exciting opportunity to contribute to the success of our clients and the organization as a whole.
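As flagged above, here is a hedged example of querying the GA4 BigQuery export to measure feature adoption; the project and dataset IDs are placeholders, while the `events_*` wildcard schema follows GA4's documented export format:

```python
# Sketch: distinct users per event over January, from the GA4 export tables.
from google.cloud import bigquery

client = bigquery.Client()
query = """
SELECT
  event_name,
  COUNT(DISTINCT user_pseudo_id) AS users
FROM `my-project.analytics_123456.events_*`   -- hypothetical dataset
WHERE _TABLE_SUFFIX BETWEEN '20240101' AND '20240131'
GROUP BY event_name
ORDER BY users DESC
"""
for row in client.query(query).result():
    print(row.event_name, row.users)
```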
Posted 4 days ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
As a Cloud Full Stack Developer (Team Lead) at NGX Systems in Bangalore, India, with 6-8+ years of experience, you will play a vital role in our dynamic team. Your primary focus will be on cloud-native development and full stack technologies, particularly Java or Node.js. Your expertise in building, deploying, and maintaining scalable microservices on the Google Cloud Platform (GCP) will be crucial, and you will lead a small team of junior and mid-level engineers, ensuring code quality and delivering robust, secure, and high-performing applications.

Responsibilities include independently developing, designing, and maintaining full stack applications; guiding junior and mid-level developers through mentoring and code reviews; leveraging GCP for cloud-native development with services such as Firebase, Firestore, and BigQuery; developing scalable and secure REST APIs and microservices using Java or Node.js; parsing and managing streaming JSON data efficiently for real-time applications; implementing containerized applications using Docker and managing deployments with Kubernetes; collaborating on UI development with React.js; and addressing Cross-Site Scripting (XSS) issues. You will actively participate in Agile development processes, ensuring security best practices and compliance at all stages of the software development lifecycle.

Required skills include strong experience in Java or Node.js programming, hands-on experience with GCP/AWS, proficiency in REST API development and microservices architecture, experience deploying applications using Docker and Kubernetes, experience with streaming JSON data parsing and real-time data management, solid knowledge of software design principles and best practices, familiarity with React.js or similar UI libraries, understanding of Agile methodologies and practices, and knowledge of security protocols and data compliance standards. Experience with front-end development frameworks like React.js and familiarity with data governance, encryption, and regulatory compliance would be beneficial.

This is a full-time, hybrid position where your expertise and leadership will be instrumental in driving the success of our projects.
Posted 4 days ago
10.0 - 14.0 years
0 Lacs
pune, maharashtra
On-site
One of our prestigious clients, a TOP MNC Giant with a global presence, is currently seeking a Lead Enterprise Architect to join their team in Pune, Mumbai, or Bangalore.

**Qualifications and Certifications:**

**Education:**
- Bachelor's or master's degree in Computer Science, Information Technology, Engineering, or a related field.

**Experience:**
- A minimum of 10+ years of experience in data engineering, with at least 4 years of hands-on experience with GCP cloud platforms.
- Proven track record in designing and implementing data workflows using GCP services such as BigQuery, Dataform, Cloud Dataflow, Cloud Pub/Sub, and Cloud Composer.

**Certifications:**
- Google Cloud Professional Data Engineer certification is preferred.

**Key Skills:**

**Mandatory Skills:**
- Advanced proficiency in Python for developing data pipelines and automation.
- Strong SQL skills for querying, transforming, and analyzing large datasets.
- Hands-on experience with various GCP services including Cloud Storage, Dataflow, Cloud Pub/Sub, Cloud SQL, BigQuery, Dataform, Compute Engine, and Kubernetes Engine (GKE).
- Familiarity with CI/CD tools like Jenkins, GitHub, or Bitbucket.
- Proficiency in Docker, Kubernetes, Terraform, or Ansible for containerization, orchestration, and infrastructure as code (IaC).
- Knowledge of workflow orchestration tools such as Apache Airflow or Cloud Composer.
- Strong understanding of Agile/Scrum methodologies.

**Nice-to-Have Skills:**
- Experience with other cloud platforms like AWS or Azure.
- Familiarity with data visualization tools such as Power BI, Looker, or Tableau.
- Understanding of machine learning workflows and their integration with data pipelines.

**Soft Skills:**
- Strong problem-solving and critical-thinking abilities.
- Excellent communication skills to collaborate effectively with both technical and non-technical stakeholders.
- Proactive attitude towards innovation and continuous learning.
- Ability to work independently and as part of a collaborative team.

If you are interested in this opportunity, please reply with your updated CV and the following details:
- Total experience:
- Relevant experience in data engineering:
- Relevant experience with GCP cloud platforms:
- Relevant experience as an Enterprise Architect:
- Availability to join (ASAP):
- Preferred location (Pune / Mumbai / Bangalore):

We will contact you once we receive your CV along with the above details.

Thank you,
Kavita.A
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
haryana
On-site
OakNorth is a profitable business that has supported the growth of thousands of businesses. We help entrepreneurs scale quickly, realise their ambitions and make data-driven decisions. We're looking for engineers who are particularly passionate about data analytics and data engineering to join our team. You'd use both your generalist and specialist skills to better our products and our team, joining our data platform squad as an immediate contributor.

As an Analytics Engineer, you will work with our Finance teams to transform raw data into meaningful insights using tools like DBT, BigQuery, and Tableau. You should have 4-8 years of relevant hands-on experience and be proficient in developing and maintaining data models using DBT. Your responsibilities will include writing and optimizing SQL queries to transform raw data into structured formats, developing and maintaining interactive dashboards and reports in Tableau, collaborating with stakeholders to gather requirements and translate them into analytical solutions, and working well cross-functionally while earning trust from co-workers at all levels.

You should deeply care about mentorship and growing your colleagues, prefer simple solutions and designs over complex ones, enjoy working with a diverse group of people with different areas of expertise, challenge the existing approach when necessary, and stay organized amidst chaos. You should also be a broad thinker, able to see the potential impact of decisions across the wider business.

In our cross-functional, mission-driven, and autonomous squads, you will have the opportunity to work on specific user and business problems. Initially, you will be upskilling within the Data Platform squad, which looks after all internal data products and the data warehouse, driving the bank's data strategy with various exciting greenfield projects. Our technology stack includes Python, DBT, Tableau, PostgreSQL, BigQuery, MySQL, pytest, AWS, GCP, Docker, Terraform, GitHub, and Git. We are pragmatic about our technology choices and focus on outcomes over outputs to solve user problems that translate to business results. We expect you to collaborate effectively, focus on continuous improvement, seek to understand our users, embrace continuous deployment, test outside-in, practice DevOps culture, and be cloud-native. Your behaviours at work should reflect and actively sustain a healthy engineering environment where a wide range of voices are heard, teams are happy and engaged, people feel safe to voice an opinion, and egos are left behind.

At OakNorth Bank, we empower entrepreneurs to realise their ambitions, understand their markets, and apply data intelligence to scale successfully at pace. We believe in barrier-free banking and strive to create an inclusive and diverse workplace where everyone can thrive. Join us in our mission to revolutionise the banking industry and empower businesses to thrive.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
We are looking for a highly skilled MLOps Engineer to design, deploy, and manage machine learning pipelines on Google Cloud Platform (GCP). Your responsibilities will include automating ML workflows, optimizing model deployment, ensuring model reliability, and implementing CI/CD pipelines for ML systems. You will collaborate with technical teams to develop cutting-edge machine learning systems that drive business value.

In this role, you will manage the deployment and maintenance of machine learning models in production environments, ensuring seamless integration with existing systems. You will monitor model performance using metrics such as accuracy, precision, recall, and F1 score, addressing issues like performance degradation, drift, or bias; a small illustration of such a check follows this listing. Troubleshooting problems, maintaining documentation, and managing model versions for audit and rollback will be part of your routine tasks. You will also proactively analyze monitoring data to identify potential issues, provide regular performance reports to stakeholders, optimize queries and pipelines, and modernize applications when necessary.

To qualify for this role, you should have expertise in programming languages like Python and SQL, along with a solid understanding of MLOps best practices for deploying enterprise-level ML systems. Familiarity with machine learning concepts, models, and algorithms, such as regression, clustering, and neural networks, including deep learning and transformers, is essential. Experience with GCP tools like BigQuery ML, Vertex AI Pipelines, Model Versioning & Registry, Cloud Monitoring, and Kubernetes is preferred.

Strong communication skills, both written and oral, are crucial, as you will prepare detailed technical documentation for new and existing applications. You should demonstrate strong ownership and collaborative qualities in your domain, taking the initiative to identify and drive opportunities for improvement and process streamlining. A Bachelor's degree in a quantitative field or equivalent job experience is required for this position.

Experience with Azure MLOps, familiarity with cloud billing, setting up or supporting NLP, GenAI, and LLM applications with MLOps features, and working in an Agile environment are considered bonus qualifications. If you are passionate about MLOps, have a knack for problem-solving, and enjoy working in a collaborative environment to deliver innovative machine learning solutions, we would like to hear from you.
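As referenced above, a small illustration of the metric checks such monitoring typically runs, using scikit-learn; the F1 threshold used as the degradation flag is an arbitrary assumption:

```python
# Sketch: compute monitoring metrics and flag degradation for alerting.
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

def evaluate(y_true, y_pred, f1_floor=0.80):
    metrics = {
        "accuracy": accuracy_score(y_true, y_pred),
        "precision": precision_score(y_true, y_pred),
        "recall": recall_score(y_true, y_pred),
        "f1": f1_score(y_true, y_pred),
    }
    # Flag degradation so an alert or retraining job can be triggered.
    metrics["degraded"] = metrics["f1"] < f1_floor
    return metrics

print(evaluate([1, 0, 1, 1, 0], [1, 0, 0, 1, 0]))
```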
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
As a Principal Engineer at our company, you will play a crucial role in leading the architecture and execution of our GenAI-powered, self-serve marketing platforms. Working directly with the CEO, you will have the opportunity to shape, build, and scale products that revolutionize how marketers interact with data and AI. This is not a sandbox innovation lab but a real-world product with traction, velocity, and high impact.

Your responsibilities will include co-owning the product architecture and direction alongside the CEO and building GenAI-native, full-stack platforms from MVP to scale using LLMs, agents, and predictive AI. You will own the full stack: React for the frontend, Node.js/Python for the backend, GCP for infrastructure, BigQuery for data, and vector databases for AI. Leading a lean, high-caliber team, you will need a hands-on, unblock-and-coach mindset to drive rapid iteration with rigor, balancing short-term delivery with long-term resilience.

To excel in this role, you should bring 8-12 years of experience building and scaling full-stack, data-heavy, or AI-driven products. Proficiency in React, Node.js, and Google Cloud (Functions, BigQuery, Cloud SQL, Airflow, etc.) is essential; hands-on experience with GenAI tools like LangChain, OpenAI APIs, and LlamaIndex is a bonus. A proven track record of shipping products from ambiguity to impact, architecting and building scalable microservices and APIs, driving adoption of engineering best practices, and collaborating closely with cross-functional teams are key attributes we are looking for.

Additionally, you will mentor and guide junior and mid-level engineers, proactively identify tech debt and propose scalable solutions, ensure strong engineering processes through Agile practices, CI/CD pipelines, and DevOps automation, stay updated with emerging technology trends, and contribute to hiring and talent development within the engineering team. If you are a dynamic individual with a passion for innovation, a strong technical background, and a desire to make a significant impact in the field of marketing technology, we encourage you to apply for this exciting opportunity.
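For flavor, a tiny sketch of the kind of GenAI call this stack implies, using the OpenAI Python client (v1+); the model name and prompts are placeholder assumptions, not details from the posting:

```python
# Sketch: a single chat-completion call of the sort a marketing copilot makes.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
resp = client.chat.completions.create(
    model="gpt-4o-mini",  # hypothetical model choice
    messages=[
        {"role": "system", "content": "You are a marketing-analytics copilot."},
        {"role": "user", "content": "Summarize last week's campaign KPIs."},
    ],
)
print(resp.choices[0].message.content)
```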
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
navi mumbai, maharashtra
On-site
As an IT professional with over 5 years of experience, you should have a good understanding of analytics tools to analyze data effectively, and your previous roles may have involved working in production deployment and production support teams. You must be familiar with big data tools such as Hadoop, Spark, Apache Beam, and Kafka, and with object-oriented and functional scripting languages like Python, Java, C++, and Scala. Experience with data warehousing tools like BigQuery, Redshift, Synapse, or Snowflake is essential. You should also be well-versed in ETL processes and have a strong understanding of relational and non-relational databases including MySQL, MS SQL Server, Postgres, MongoDB, and Cassandra. Familiarity with cloud platforms like AWS, GCP, and Azure is also required, along with experience in workflow management using tools like Apache Airflow.

In this role, you will develop high-performance and scalable solutions using GCP for extracting, transforming, and loading big data. You will design and build production-grade data solutions from ingestion to consumption using Java or Python, and optimize data models on GCP with data stores such as BigQuery. You should be capable of handling the deployment process, optimizing data pipelines for performance and cost in large-scale data lakes, and writing complex queries across large data sets. Collaboration with Data Engineers to identify the right tools for delivering product features is essential, as is researching new use cases for existing data.

Preferred qualifications include awareness of design best practices for OLTP and OLAP systems, participation in designing the database and pipeline, exposure to load testing methodologies, debugging pipelines, and handling delta loads in heterogeneous migration projects; a sketch of one common delta-load pattern follows this listing. Overall, you should be a collaborative team player who interacts effectively with business stakeholders, BAs, and other Data/ML engineers to drive innovation and deliver impactful solutions.
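As noted above, here is a hedged sketch of one common delta-load pattern: an incremental MERGE into BigQuery via the Python client. The table and key names are assumptions for illustration:

```python
# Sketch: upsert a staged delta into the warehouse target table.
from google.cloud import bigquery

client = bigquery.Client()
client.query("""
MERGE warehouse.customers AS tgt
USING staging.customers_delta AS src
ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN
  UPDATE SET tgt.email = src.email, tgt.updated_at = src.updated_at
WHEN NOT MATCHED THEN
  INSERT (customer_id, email, updated_at)
  VALUES (src.customer_id, src.email, src.updated_at)
""").result()
```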
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
You should have strong expertise in Looker and LookML, advanced SQL skills including experience in query optimization, and proficiency in Data Warehouse (DWH) concepts and BigQuery. Excellent communication and team leadership abilities are also essential, as effective collaboration within the team is central to this role.

You should hold a Bachelor's degree in Engineering and bring strong communication skills to excel in this position.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
As a member of the Platform Observability Engineering team within Ford's Data Platforms and Engineering (DP&E) organization, you will contribute to building and maintaining a top-tier platform for monitoring and observability. This platform focuses on the four golden signals (latency, traffic, errors, and saturation), providing essential data to support operations, root cause analysis, continuous improvement, and cost optimization; a toy computation of these signals is sketched after this listing.

You will collaborate with platform architects to help design, develop, and maintain a scalable and reliable platform, ensuring smooth integration with systems used across various teams. Your contributions will be key in improving MTTR and MTTX through increased visibility into system performance. Working with stakeholders, you will integrate observability data into their workflows, develop insightful dashboards and reports, continuously improve platform performance and reliability, optimize costs, and stay updated with industry best practices and technologies. The role focuses on building and maintaining a robust platform rather than developing individual monitoring tools, creating a centralized, reliable source of observability data that empowers data-driven decisions and accelerates incident response across the organization.

Responsibilities:
- Design and Build Data Pipelines: architect, develop, and maintain scalable data pipelines and microservices supporting real-time and batch processing on GCP.
- Service-Oriented Architecture (SOA) and Microservices: design and implement SOA and microservices-based architectures for modular, flexible, and maintainable data solutions.
- Full-Stack Integration: contribute to the seamless integration of front-end and back-end components, ensuring robust data access and UI-driven data exploration.
- Data Ingestion and Integration: lead the ingestion and integration of data from various sources into the data platform, ensuring standardized and optimized data for analytics.
- GCP Data Solutions: utilize GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.) to build and manage data platforms meeting business needs.
- Data Governance and Security: implement and manage data governance, access controls, and security best practices while leveraging GCP's native security features.
- Performance Optimization: continuously monitor and improve the performance, scalability, and efficiency of data pipelines and storage solutions.
- Collaboration and Best Practices: define best practices, design patterns, and frameworks for cloud data engineering by working closely with data architects, software engineers, and cross-functional teams.
- Automation and Reliability: automate data platform processes to enhance reliability, reduce manual intervention, and improve operational efficiency.

Qualifications:
- Technical Skills: proficiency in Java, Angular, or any JavaScript technology, with experience designing and deploying cloud-based data pipelines and microservices using GCP tools like BigQuery, Dataflow, and Dataproc.
- Service-Oriented Architecture and Microservices: strong understanding of SOA, microservices, and their application within a cloud data platform context; ability to develop robust, scalable services using Java Spring Boot, Python, Angular, and GCP technologies.
- Full-Stack Development: knowledge of front-end and back-end technologies (e.g., React, Node.js) enabling collaboration on data access and visualization layers; design and development of RESTful APIs for seamless integration across platform services; robust unit and functional tests to maintain high standards of test coverage and quality.
- Database Management: experience with relational (e.g., PostgreSQL, MySQL) and NoSQL databases, as well as columnar databases like BigQuery.
- Data Governance and Security: understanding of data governance frameworks and implementing RBAC, encryption, and data masking in cloud environments.
- CI/CD and Automation: familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools like Terraform, and automation frameworks; managing code changes with GitHub and troubleshooting and resolving application defects efficiently; adherence to SDLC best practices, independently managing feature design, coding, testing, and production releases.
- Problem-Solving: strong analytical skills with the ability to troubleshoot complex data platform and microservices issues.

Certifications (Preferred): GCP Data Engineer, GCP Professional Cloud
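As mentioned in the listing above, a toy computation of the four golden signals from a window of request records might look like this; the field names and the saturation proxy (observed rate over an assumed capacity) are illustrative assumptions:

```python
# Sketch: derive the four golden signals from a hypothetical request-log window.
requests = [
    {"latency_ms": 120, "status": 200},
    {"latency_ms": 340, "status": 200},
    {"latency_ms": 95,  "status": 500},
]
capacity_rps, window_s = 100, 60  # assumed service capacity and window length

latencies = sorted(r["latency_ms"] for r in requests)
signals = {
    # Latency: p95 by nearest-rank over the sorted sample.
    "latency_p95_ms": latencies[int(0.95 * (len(latencies) - 1))],
    # Traffic: requests per second over the window.
    "traffic_rps": len(requests) / window_s,
    # Errors: fraction of 5xx responses.
    "error_rate": sum(r["status"] >= 500 for r in requests) / len(requests),
    # Saturation: observed rate as a fraction of assumed capacity.
    "saturation": (len(requests) / window_s) / capacity_rps,
}
print(signals)
```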
Posted 1 week ago
6.0 - 12.0 years
0 Lacs
chennai, tamil nadu
On-site
As an R Shiny Full Stack Developer with 6 to 12 years of experience, you will be responsible for designing, developing, and maintaining the user interface (UI) of the MART application using R Shiny, JavaScript, and CSS to ensure a seamless and intuitive user experience. You will also develop and maintain efficient and scalable ETL pipelines using GCP Dataform and BigQuery for extracting, transforming, and loading data from various on-premise (Oracle) and cloud-based sources, including leveraging Big R query for accessing on-premise Oracle data.

You will implement data manipulation and transformation logic in R to create a longitudinal data format with unique member identifiers, develop comprehensive logging and monitoring using Splunk, and collaborate with other developers, data scientists, and stakeholders to ensure the timely delivery of high-quality software. Your role will involve participating in all phases of the software development lifecycle, from requirements gathering and design to testing and deployment, as well as contributing to the maintenance and improvement of existing application functionality. You will work within a Git-based version control system, manage data in a dedicated GCP project while adhering to best practices for cloud security and scalability, and contribute to the creation of summary statistics for groups via the Population Assessment Tool (PAT).

Required Skillsets:
- Strong proficiency in R Shiny for UI development.
- Strong proficiency in JavaScript, CSS, and HTML for front-end development.
- Proven experience in designing, developing, and maintaining ETL pipelines, preferably using GCP Dataform and BigQuery.
- Experience with data manipulation and transformation in R, including creating longitudinal datasets.
- Experience working with on-premise databases (Oracle), preferably using Big R query for data access.
- Experience with Git for version control.
- Experience with Splunk for logging and monitoring.
- Experience working with cloud platforms, specifically Google Cloud Platform (GCP).
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration skills.

Good to Have Skillsets:
- Experience with Tableau for dashboarding and data visualization.
- Experience with advanced data visualization techniques.
- Experience working in an Agile development environment.

If you are interested in this role, please share your updated resume with Femina.Periyanayagam@thryvedigital.com. Referrals are welcome.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
You will be joining our team as a Looker Enterprise Dashboarding Specialist. Your main responsibility will be to design, develop, and optimize Looker dashboards that extract actionable insights from complex datasets. To excel in this role, you should have a solid understanding of LookML, data modeling, SQL, and data visualization best practices, and you will collaborate with data analysts, engineers, and business stakeholders to create impactful reports and dashboards.

Your key responsibilities will include designing, developing, and maintaining Looker dashboards and reports to support business decision-making; building and optimizing LookML models, explores, and views to ensure efficient data querying; collaborating with data engineering teams to enhance data pipelines and model performance; working closely with business stakeholders to understand reporting needs and convert them into scalable Looker solutions; implementing best practices for data visualization to ensure clear and effective storytelling; optimizing dashboard performance; developing and maintaining data governance standards for Looker usage; conducting training sessions for internal teams to increase self-service analytics adoption; and staying abreast of Looker updates, new features, and industry best practices.

To qualify for this position, you should have 3-5 years of experience in data visualization, business intelligence, or analytics. Strong expertise in Looker, LookML, and SQL is a must, along with experience in data modeling, familiarity with BigQuery or other cloud data warehouses, an understanding of data governance, security, and role-based access control in Looker, the ability to optimize dashboards for performance and usability, strong problem-solving and analytical skills with attention to detail, and excellent communication and stakeholder management skills.

Preferred qualifications include experience working with ETL pipelines and data transformation processes, familiarity with Python or other scripting languages for data automation, exposure to Google Cloud Platform (GCP) and data engineering concepts, and certifications in Looker, Google Cloud, or related BI tools.
Posted 1 week ago
10.0 - 15.0 years
0 Lacs
delhi
On-site
As a seasoned data engineering professional with 10+ years of experience, you will lead and mentor a team of data engineers to ensure high performance and career growth. Your primary responsibility will be to architect and optimize scalable data infrastructure, guaranteeing high availability and reliability. You will also drive the development and implementation of data governance frameworks and best practices, collaborating closely with cross-functional teams to define and execute a data roadmap, while ensuring data security, compliance, and quality across all data platforms and optimizing data processing workflows for performance and cost efficiency.

Your expertise in backend development using languages like Java, PHP, Python, Node.js, GoLang, JavaScript, HTML, and CSS will be crucial, and proficiency in SQL, Python, and Scala for data processing and analytics is a must. In-depth knowledge of cloud platforms such as AWS, GCP, or Azure is required, along with hands-on experience in big data technologies like Spark, Hadoop, Kafka, and distributed computing frameworks. A strong foundation in High-Level Design (HLD) and Low-Level Design (LLD), as well as design patterns, preferably using Spring Boot or Google Guice, is necessary, and experience with data warehousing solutions like Snowflake, Redshift, or BigQuery will be beneficial. Your role will also involve working with NoSQL databases such as Redis, Cassandra, MongoDB, and TiDB, as well as automation and DevOps tools like Jenkins, Ansible, Docker, Kubernetes, Chef, Grafana, and ELK. A proven ability to drive technical strategy aligned with business objectives and strong leadership, communication, and stakeholder management skills are essential for this position.

Candidates from Tier 1 colleges/universities with a background in product startups and experience implementing data engineering systems from an early stage in a company are preferred. Additionally, experience in machine learning infrastructure or MLOps, exposure to real-time data processing and analytics, and an interest in data structures, algorithm analysis and design, multicore programming, and scalable architecture will be advantageous, as will prior experience in a SaaS or high-growth tech company. If you are a highly skilled data engineer with a passion for innovation and technical excellence, we invite you to apply for this challenging and rewarding opportunity.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As a highly experienced and strategic Senior Software Developer with deep expertise in full stack development and cloud-native solutions on Google Cloud Platform (GCP) or any ETL & data warehousing platform, your role will be instrumental in shaping the engineering direction of the organization. You will be responsible for driving architectural decisions, mentoring senior engineers, and ensuring the delivery of scalable, secure, and high-performing solutions across the platform. Your responsibilities will span the entire stack, from crafting backend experiences and building robust backend APIs to designing cloud infrastructure, and you will play a critical role in influencing the technical vision, driving innovation, and aligning engineering efforts with business goals.

Working within the Marketplace Seller Acquisition and Onboarding team, you will be at the forefront of building core platforms and services that enable Walmart to deliver a vast selection at competitive prices with a superior seller onboarding experience, allowing third-party sellers to list, sell, and manage their products on walmart.com. Your focus will be on managing the entire seller lifecycle, monitoring customer experience, and providing valuable insights to sellers for assortment planning, pricing, and inventory management.

Key responsibilities include leading the design and development of end-to-end ETL applications with high scalability and resilience, architecting complex cloud-native systems utilizing GCP services, setting technical direction, defining best practices, and driving engineering excellence. Additionally, you will guide the adoption of serverless and container-based architectures, champion CI/CD pipelines and Infrastructure as Code (IaC), drive code quality through design reviews and automated testing, and collaborate cross-functionally to translate business requirements into scalable tech solutions.

To excel in this role, you should bring at least 5 years of experience in ETL development, deep proficiency in JavaScript/TypeScript, Python, or Go, strong experience with modern frontend frameworks (React preferred), expertise in designing and operating cloud-native systems, proficiency with microservices architecture, Docker, Kubernetes, and event-driven systems, extensive experience with CI/CD and DevOps practices, familiarity with SQL and NoSQL databases, exceptional communication, leadership, and collaboration skills, experience with serverless platforms, and exposure to large-scale data processing pipelines or ML workflows on GCP.

Joining Walmart Global Tech offers you the opportunity to work in an environment where your contributions can impact the lives of millions of people. The team consists of software engineers, data scientists, cybersecurity experts, and service professionals who are driving the next wave of retail disruption. By fostering a people-led and tech-empowered culture, Walmart Global Tech provides opportunities for career growth and innovation at scale. In addition to a competitive compensation package, benefits include incentive awards, maternity and parental leave, PTO, health benefits, and more.

Walmart aims to create a culture where every associate feels valued and respected, fostering a sense of belonging and creating opportunities for all associates, customers, and suppliers. As an Equal Opportunity Employer, Walmart, Inc. values diversity and inclusivity in the workplace. By understanding, respecting, and valuing unique experiences and identities, Walmart creates a welcoming environment where all individuals can thrive and contribute to the success of the organization.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You will be responsible for leading the delivery of complex solutions by coding larger features from start to finish. Actively participating in planning and performing code and architecture reviews of your team's product will be a crucial aspect of your role. You will help ensure the quality and integrity of the Software Development Life Cycle (SDLC) for your team by identifying opportunities to improve how the team works through the use of recommended tools and practices, and you will lead the triage of complex production issues across systems, demonstrating creativity and initiative in solving complex problems. As a high performer, you will consistently deliver a high volume of story points relative to your team.

Being aware of the technology landscape, you will plan the delivery of coarse-grained business needs spanning multiple applications, influence technical peers outside your team, and set a consistent example of agile development practices. Coaching other engineers to work as a team with Product and UX will be part of your responsibilities. Furthermore, you will create and enhance internal libraries and tools, provide technical leadership on the product, and determine the technical approach. You will proactively communicate status and issues to your manager, collaborate with other teams to find creative solutions to customer issues, and show a commitment to delivery deadlines, especially seasonal and vendor partner deadlines that are critical to Best Buy's continued success.

Basic Qualifications:
- 5+ years of relevant technical professional experience with a bachelor's degree, OR equivalent professional experience.
- 2+ years of experience with Google Cloud services including Dataflow, BigQuery, and Looker.
- 1+ years of experience with Adobe Analytics, Content Square, or similar technologies.
- Hands-on experience with data engineering and visualization tools like SQL, Airflow, DBT, Power BI, Tableau, and Looker.
- Strong understanding of real-time data processing and issue detection.
- Expertise in data architecture, database design, data quality standards/implementation, and data modeling.

Preferred Qualifications:
- Experience working in an omni-channel retail environment.
- Experience connecting technical issues with business performance metrics.
- Experience with Forsta or similar customer feedback systems.
- Certification in Google Cloud Platform services.
- Good understanding of data governance, data privacy laws & regulations, and best practices.

About Best Buy:
BBY India is a service provider to Best Buy, and as part of the team working on Best Buy projects and initiatives, you will help fulfill Best Buy's purpose to enrich lives through technology. Every day, you will humanize and personalize tech solutions for every stage of life in Best Buy stores, online, and in Best Buy customers' homes. Best Buy is a place where techies can make technology more meaningful in the lives of millions of people. The unique culture at Best Buy unleashes the power of its people and provides fast-moving, collaborative, and inclusive experiences that empower employees of all backgrounds to make a difference, learn, and grow every day. Best Buy's culture is built on deeply supporting and valuing its amazing employees, and the company is committed to being a great place to work where you can unlock unique career possibilities. Above all, Best Buy aims to provide a place where people can bring their full, authentic selves to work now and into the future. Tomorrow works here.
Posted 1 week ago
BigQuery, a powerful cloud-based data warehouse provided by Google Cloud, is in high demand in the job market in India. Companies are increasingly relying on BigQuery to analyze and manage large datasets, driving the need for skilled professionals in this area.
The average salary range for BigQuery professionals in India varies based on experience level. Entry-level positions may start at around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.
In the field of BigQuery, a typical career progression may include roles such as Junior Developer, Developer, Senior Developer, Tech Lead, and eventually moving into managerial positions such as Data Architect or Data Engineering Manager.
Alongside BigQuery, professionals in this field often benefit from having skills in SQL, data modeling, data visualization tools like Tableau or Power BI, and cloud platforms like Google Cloud Platform or AWS.
As you explore opportunities in the BigQuery job market in India, remember to continuously upskill and stay updated with the latest trends in data analytics and cloud computing. Prepare thoroughly for interviews by practicing common BigQuery concepts and showcase your hands-on experience with the platform. With dedication and perseverance, you can excel in this dynamic field and secure rewarding career opportunities. Good luck!