
732 BigQuery Jobs - Page 22

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

10.0 - 14.0 years

10 - 16 Lacs

Pune

Work from Office


Role Overview:
The Senior Tech Lead - GCP Data Engineering leads the design, development, and optimization of advanced data solutions. The jobholder has extensive experience with GCP services, data architecture, and team leadership, with a proven ability to deliver scalable and secure data systems (a small BigQuery query sketch follows the requirements below).

Responsibilities:
- Lead the design and implementation of GCP-based data architectures and pipelines.
- Architect and optimize data solutions using GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
- Provide technical leadership and mentorship to a team of data engineers.
- Collaborate with stakeholders to define project requirements and ensure alignment with business goals.
- Ensure best practices in data security, governance, and compliance.
- Troubleshoot and resolve complex technical issues in GCP data environments.
- Stay updated on the latest GCP technologies and industry trends.

Key Technical Skills & Responsibilities:
- Overall 10+ years of experience with GCP and data warehousing concepts: coding, reviewing, testing, and debugging.
- Experience as an architect on GCP implementation or migration data projects.
- Strong understanding of data lakes and data lake architectures, and best practices for storing, loading, and retrieving data from data lakes.
- Experience developing and maintaining pipelines on the GCP platform, including best practices for bringing on-prem data to the cloud: file loading, compression, parallelization of loads, optimization, etc.
- Working knowledge of and/or experience with Google Data Studio, Looker, and other visualization tools.
- Working knowledge of Hadoop and Python/Java is an added advantage.
- Experience designing and planning BI solutions; debugging, monitoring, and troubleshooting BI solutions; creating and deploying reports; and writing relational and multidimensional database queries.
- Any experience in a NoSQL environment is a plus.
- Must be proficient in Python and PySpark for data pipeline building.
- Must have experience working with streaming data sources and Kafka.
- GCP services: Cloud Storage, BigQuery, Bigtable, Cloud Spanner, Cloud SQL, Datastore/Firestore, Dataflow, Dataproc, Data Fusion, Dataprep, Pub/Sub, Data Studio, Looker, Data Catalog, Cloud Composer, Cloud Scheduler, Cloud Functions.

Eligibility Criteria:
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- Extensive experience with GCP data services and tools.
- GCP certification (e.g., Professional Data Engineer, Professional Cloud Architect).
- Experience with machine learning and AI integration in GCP environments.
- Strong understanding of data modeling, ETL/ELT processes, and cloud integration.
- Proven leadership experience in managing technical teams.
- Excellent problem-solving and communication skills.
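
As a concrete illustration of the BigQuery work this role centres on, here is a minimal sketch of running an analytical query from Python with the google-cloud-bigquery client. The project, dataset, and table names are hypothetical placeholders, not from the posting.

```python
# Minimal sketch (illustrative only): querying BigQuery from Python.
# Assumes google-cloud-bigquery is installed and default credentials are
# configured; project/dataset/table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")

query = """
    SELECT user_id, COUNT(*) AS events
    FROM `my-gcp-project.analytics.events`
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY user_id
    ORDER BY events DESC
    LIMIT 100
"""

for row in client.query(query).result():  # result() blocks until the job finishes
    print(row.user_id, row.events)
```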

Posted 1 month ago

Apply

5.0 - 8.0 years

17 - 20 Lacs

Kolkata

Work from Office


Key Responsibilities:
- Architect and implement scalable data solutions using GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage, Composer, etc.) and Snowflake.
- Lead the end-to-end data architecture, including ingestion, transformation, storage, governance, and consumption layers.
- Collaborate with business stakeholders, data scientists, and engineering teams to define and deliver the enterprise data strategy.
- Design robust data pipelines (batch and real-time) ensuring high data quality, security, and availability.
- Define and enforce data governance, data cataloging, and metadata management best practices.
- Evaluate and select appropriate tools and technologies to optimize data architecture and cost efficiency.
- Mentor junior architects and data engineers, guiding them on design best practices and technology standards.
- Collaborate with DevOps teams to ensure smooth CI/CD pipelines and infrastructure automation for data.

Skills & Qualifications:
- 3+ years of experience in data architecture, data engineering, or enterprise data platform roles.
- 3+ years of hands-on experience with Google Cloud Platform (especially BigQuery, Dataflow, Cloud Composer, Data Catalog).
- 3+ years of experience designing and implementing Snowflake-based data solutions.
- Deep understanding of modern data architecture principles (Data Lakehouse, ELT/ETL, Data Mesh, etc.).
- Proficient in Python, SQL, and orchestration tools like Airflow / Cloud Composer.
- Experience in data modeling (3NF, Star, and Snowflake schemas) and designing data marts and warehouses.
- Strong understanding of data privacy, compliance (GDPR, HIPAA), and security principles in cloud environments.
- Familiarity with tools like dbt, Apache Beam, Looker, Tableau, or Power BI is a plus.
- Excellent communication and stakeholder management skills.
- GCP or Snowflake certification preferred (e.g., GCP Professional Data Engineer, SnowPro).

Additional Qualifications:
- Experience working with hybrid or multi-cloud data strategies.
- Exposure to ML/AI pipelines and support for data science workflows.
- Prior experience leading architecture reviews, PoCs, and technology roadmaps.

Posted 1 month ago

Apply

1.0 - 5.0 years

3 - 7 Lacs

Chandigarh

Work from Office


Key Responsibilities:
- Assist in building and maintaining data pipelines on GCP using services like BigQuery, Dataflow, Pub/Sub, Cloud Storage, etc. (see the load-job sketch below).
- Support data ingestion, transformation, and storage processes for structured and unstructured datasets.
- Participate in performance tuning and optimization of existing data workflows.
- Collaborate with data analysts, engineers, and stakeholders to ensure reliable data delivery.
- Document code, processes, and architecture for reproducibility and future reference.
- Debug issues in data pipelines and contribute to their resolution.
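
For a feel of the ingestion tasks listed above, here is a minimal sketch of batch-loading a CSV from Cloud Storage into BigQuery with the Python client. The bucket, file, and table names are hypothetical assumptions for illustration.

```python
# Minimal sketch (illustrative only): batch-loading a CSV from GCS into
# BigQuery. Assumes google-cloud-bigquery and default credentials;
# resource names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # infer schema here; production pipelines usually pin one
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://my-bucket/raw/orders_2024.csv",  # hypothetical source file
    "my-project.staging.orders",           # hypothetical target table
    job_config=job_config,
)
load_job.result()  # wait for completion; raises on failure
print(client.get_table("my-project.staging.orders").num_rows, "rows loaded")
```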

Posted 1 month ago

Apply

10.0 - 15.0 years

11 - 15 Lacs

Jhagadia

Work from Office


- Develop, implement, and maintain the organization's MIS to ensure accurate, real-time reporting of key business metrics.
- Oversee the preparation and distribution of daily, weekly, and monthly reports to various departments and senior management.
- Ensure data accuracy, integrity, and consistency across all reporting platforms.
- Design and maintain dashboards for business performance monitoring.
- Analyze data trends and provide insights to management for informed decision-making.
- Establish and maintain cost accounting systems and procedures for accurate tracking of material, labor, and overhead costs.
- Review and update cost standards, analyzing variances and taking corrective actions when necessary.
- Collaborate with other departments to monitor and control project costs, ensuring alignment with budget and financial goals.
- Perform cost analysis and prepare cost reports to monitor financial performance and support pricing decisions.
- Conduct regular audits to ensure compliance with costing policies and industry standards.
- Provide regular cost analysis reports, highlighting variances between actual and budgeted figures, and recommend corrective actions.
- Support financial forecasting and budgeting processes by providing relevant data and insights.
- Assist in month-end and year-end closing processes by ensuring accurate costing and reporting entries.
- Review profitability analysis reports and identify areas for cost optimization.

Posted 1 month ago

Apply

1.0 - 5.0 years

3 - 7 Lacs

Gurugram

Work from Office


Key Responsibilities:
- Assist in building and maintaining data pipelines on GCP using services like BigQuery, Dataflow, Pub/Sub, Cloud Storage, etc.
- Support data ingestion, transformation, and storage processes for structured and unstructured datasets.
- Participate in performance tuning and optimization of existing data workflows.
- Collaborate with data analysts, engineers, and stakeholders to ensure reliable data delivery.
- Document code, processes, and architecture for reproducibility and future reference.
- Debug issues in data pipelines and contribute to their resolution.

Posted 1 month ago

Apply

4.0 - 6.0 years

7 - 9 Lacs

Chennai

Work from Office


What you'll be doing:
We're seeking a skilled Data Engineering Analyst to join our high-performing team and propel our telecom business forward. You'll contribute to building cutting-edge data products and assets for our wireless and wireline operations, spanning areas like consumer analytics, network performance, and service assurance. In this role, you will develop deep expertise in various telecom domains. As part of the Data Architecture Strategy team, you'll collaborate closely with IT and business stakeholders to design and implement user-friendly, robust data product solutions, incorporating data classification and governance principles.

Your responsibilities encompass:
- Collaborate with stakeholders to understand data requirements and translate them into efficient data models.
- Design, develop, and implement data architecture solutions on GCP and Teradata to support our telecom business.
- Design data ingestion for both real-time and batch processing, ensuring efficient and scalable data acquisition for an effective data warehouse.
- Maintain meticulous documentation, including data design specifications, functional test cases, data lineage, and other relevant artifacts for all data product solution assets.
- Implement data architecture standards, as set by the data architecture team.
- Proactively identify opportunities for automation and performance optimization within your scope of work.
- Collaborate effectively within a product-oriented organization, providing data expertise and solutions across multiple business units.
- Cultivate strong cross-functional relationships and establish yourself as a subject matter expert in data and analytics within the organization.

What we're looking for...
You're curious about new technologies and the game-changing possibilities they create. You like to stay up to date with the latest trends and apply your technical expertise to solving business problems.

You'll need to have:
- Bachelor's degree with four or more years of work experience.
- Four or more years of relevant work experience.
- Expertise in building complex SQL for data analysis, to understand and design data solutions (a sample query sketch follows below).
- Experience with ETL, data warehouse concepts, and the data management life cycle.
- Experience creating technical documentation such as source-to-target mappings, source contracts, SLAs, etc.
- Experience in any DBMS, preferably GCP/BigQuery.
- Experience creating data models using the Erwin tool.
- Experience in shell scripting and Python.
- Understanding of Git version control and basic Git commands.
- Understanding of data quality concepts.

Even better if you have one or more of the following:
- Certification as a GCP Data Engineer.
- Understanding of NoSQL databases like Cassandra, MongoDB, etc.
- Accuracy and attention to detail.
- Good problem-solving, analytical, and research capabilities.
- Good verbal and written communication.
- Experience presenting to leaders and influencing stakeholders.
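
The sketch referenced above: since the role leans heavily on complex analytical SQL, here is a minimal example of the flavor it describes, deduplicating a staging feed and ranking usage with window functions. The schema, table, and column names are hypothetical.

```python
# Minimal sketch (illustrative only) of the analytical SQL this role centres
# on: deduplicate a staging feed, then rank daily data usage with window
# functions. Names are hypothetical; the string would be submitted through
# a BigQuery or Teradata client.
DAILY_USAGE_SQL = """
WITH deduped AS (
    SELECT
        subscriber_id,
        usage_date,
        data_mb,
        ROW_NUMBER() OVER (
            PARTITION BY subscriber_id, usage_date
            ORDER BY load_ts DESC
        ) AS rn
    FROM telecom_stg.daily_usage
)
SELECT
    subscriber_id,
    usage_date,
    data_mb,
    RANK() OVER (PARTITION BY usage_date ORDER BY data_mb DESC) AS usage_rank
FROM deduped
WHERE rn = 1
"""

print(DAILY_USAGE_SQL)
```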

Posted 1 month ago

Apply

5.0 - 7.0 years

25 - 30 Lacs

Bengaluru

Work from Office


Job Title: Senior Data Engineer / Technical Lead
Location: Bangalore
Employment Type: Full-time

Role Summary:
We are seeking a highly skilled and motivated Senior Data Engineer / Technical Lead to take ownership of the end-to-end delivery of a key project involving data lake transitions, data warehouse maintenance, and enhancement initiatives. The ideal candidate will bring strong technical leadership, excellent communication skills, and hands-on expertise with modern data engineering tools and platforms. Experience with Databricks and JIRA is highly desirable. Knowledge of the supply chain and finance domains is a plus; otherwise, a willingness to quickly ramp up in these areas is expected.

Key Responsibilities:
Delivery Management
- Lead and manage data lake transition initiatives under the Gold framework.
- Oversee delivery of enhancements and defect fixes related to the enterprise data warehouse.
Technical Leadership
- Design and develop efficient, scalable data pipelines using Python, PySpark, and SQL (see the sketch below).
- Ensure adherence to coding standards, performance benchmarks, and data quality goals.
- Conduct performance tuning and infrastructure optimization for data solutions.
- Provide code reviews, mentorship, and technical guidance to the engineering team.
Collaboration & Stakeholder Engagement
- Collaborate with business stakeholders (particularly the Laboratory Products team) to gather, interpret, and refine requirements.
- Communicate technical solutions and project progress clearly to both technical and non-technical audiences.
Tooling and Technology Use
- Leverage tools such as Databricks, Informatica, AWS Glue, Google Dataproc, and Airflow for ETL and data integration.
- Use JIRA to manage project workflows, track defects, and report progress.
Documentation and Best Practices
- Create and review documentation including architecture, design, testing, and deployment artifacts.
- Define and promote reusable templates, checklists, and best practices for data engineering tasks.
Domain Adaptation
- Apply or gain knowledge in the supply chain and finance domains to enhance project outcomes and align with business needs.

Skills and Qualifications:
Technical Proficiency
- Strong hands-on experience in Python, PySpark, and SQL.
- Expertise with ETL tools such as Informatica, AWS Glue, Databricks, and Google Cloud Dataproc.
- Deep understanding of data warehousing solutions (e.g., Snowflake, BigQuery, Delta Lake, Lakehouse architectures).
- Familiarity with performance tuning, cost optimization, and data modeling best practices.
Platform & Tools
- Proficient in working with cloud platforms like AWS, Azure, or Google Cloud.
- Experience in version control and configuration management practices.
- Working knowledge of JIRA and Agile methodologies.
Certifications (preferred but not required)
- Certifications in cloud technologies, ETL platforms, or a relevant domain (e.g., AWS Data Engineer, Databricks Data Engineer, supply chain certification).

Expected Outcomes:
- Timely and high-quality delivery of data engineering solutions.
- Reduction in production defects and improved pipeline performance.
- Increased team efficiency through reuse of components and automation.
- Positive stakeholder feedback and high team engagement.
- Consistent adherence to SLAs, security policies, and compliance guidelines.

Performance Metrics:
- Adherence to project timelines and engineering standards.
- Reduction in post-release defects and production issues.
- Improvement in data pipeline efficiency and resource utilization.
- Resolution time for pipeline failures and data issues.
- Completion of required certifications and training.

Preferred Background:
- Background or exposure to the supply chain or finance domains.
- Willingness to work during morning US East hours.
- Ability to work independently and drive initiatives with minimal oversight.

Required Skills: Databricks, Data Warehousing, ETL, SQL
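
The sketch referenced above: a small PySpark job of the kind this role describes, joining a fact table to a broadcast dimension and writing an aggregate back to the warehouse. Table and column names are hypothetical; this is illustrative, not the project's actual pipeline.

```python
# Minimal sketch (illustrative only): a PySpark aggregation with a broadcast
# join, a common performance-tuning technique. Table names are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("dw-enhancement").getOrCreate()

facts = spark.read.table("edw.sales_fact")    # hypothetical fact table
dims = spark.read.table("edw.product_dim")    # hypothetical small dimension

# Broadcasting the small dimension avoids a shuffle join on the large fact.
enriched = facts.join(broadcast(dims), "product_id")

daily = (
    enriched
    .groupBy("sale_date", "category")
    .agg(F.sum("amount").alias("revenue"))
)

daily.write.mode("overwrite").saveAsTable("edw.daily_category_revenue")
```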

Posted 1 month ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Pune

Work from Office


New Opportunity: Full Stack Engineer
Location: Pune (Onsite)
Company: Apptware Solutions
Experience: 4+ years

We're looking for a skilled Full Stack Engineer to join our team. If you have experience building scalable applications and working with modern technologies, this role is for you.

Role & Responsibilities:
- Develop product features to help customers easily transform data.
- Design, implement, deploy, and support client-side and server-side architectures, including web applications, CLI, and SDKs.

Minimum Requirements:
- 4+ years of experience as a Full Stack Developer or in a similar role.
- Hands-on experience in a distributed engineering role with direct operational responsibility (on-call experience preferred).
- Proficiency in at least one back-end language (Node.js, TypeScript, Python, or Go).
- Front-end development experience with Angular or React, HTML, and CSS.
- Strong understanding of web applications, backend APIs, CI/CD pipelines, and testing frameworks.
- Familiarity with NoSQL databases (e.g., DynamoDB) and AWS services (Lambda, API Gateway, Cognito, etc.).
- Bachelor's degree in Computer Science, Engineering, Math, or equivalent experience.
- Strong written and verbal communication skills.

Preferred Skills:
- Experience with AWS Glue, Spark, or Athena.
- Strong understanding of SQL and data engineering best practices.
- Exposure to analytical EDWs (Snowflake, Databricks, BigQuery, Cloudera, Teradata).
- Experience in B2B applications, SaaS offerings, or startups is a plus.

Posted 1 month ago

Apply

5.0 - 8.0 years

8 - 14 Lacs

Bengaluru

Remote


Job Overview:
We are looking for an experienced GCP Data Engineer with deep expertise in BigQuery, Dataflow, Dataproc, Pub/Sub, and GCS to build, manage, and optimize large-scale data pipelines. The ideal candidate should have a strong background in cloud data storage, real-time data streaming, and orchestration.

Key Responsibilities:
Data Storage & Management:
- Manage Google Cloud Storage (GCS) buckets, set up permissions, and optimize storage solutions for handling large datasets.
- Ensure data security, access control, and lifecycle management.
Data Processing & Analytics:
- Design and optimize BigQuery for data warehousing, querying large datasets, and performance tuning.
- Implement ETL/ELT pipelines for structured and unstructured data.
- Work with Dataproc (Apache Spark, Hadoop) for batch processing of large datasets.
Real-Time Data Streaming:
- Use Pub/Sub for building real-time, event-driven streaming pipelines.
- Implement Dataflow (Apache Beam) for real-time and batch data processing (see the sketch below).
Workflow Orchestration & Automation:
- Use Cloud Composer (Apache Airflow) for scheduling and automating data workflows.
- Build monitoring solutions to ensure data pipeline health and performance.
Cloud Infrastructure & DevOps:
- Implement Terraform for provisioning and managing cloud infrastructure.
- Work with Google Kubernetes Engine (GKE) for container orchestration and managing distributed applications.
Advanced SQL & Data Engineering:
- Write efficient SQL queries for data transformation, aggregation, and analysis.
- Optimize query performance and cost efficiency in BigQuery.

Required Skills & Qualifications:
- 4-8 years of experience in GCP data engineering.
- Strong expertise in BigQuery, Dataflow, Dataproc, Pub/Sub, and GCS.
- Experience in SQL, Python, or Java for data processing and transformation.
- Proficiency in Airflow (Cloud Composer) for scheduling workflows.
- Hands-on experience with Terraform for cloud infrastructure automation.
- Familiarity with NoSQL databases like Bigtable for high-scale data handling.
- Knowledge of GKE for containerized applications and distributed processing.

Preferred Qualifications:
- Experience with CI/CD pipelines for data deployment.
- Familiarity with Cloud Functions or Cloud Run for serverless execution.
- Understanding of data governance, security, and compliance.

Why Join Us?
- Work on cutting-edge GCP data projects in a cloud-first environment.
- Competitive salary and career growth opportunities.
- Collaborative and innovative work culture.
- Exposure to big data, real-time streaming, and advanced analytics.
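
The sketch referenced above: a minimal Apache Beam pipeline of the kind this role involves, streaming messages from Pub/Sub into BigQuery. The subscription and table names are hypothetical assumptions.

```python
# Minimal sketch (illustrative only): streaming Pub/Sub -> BigQuery with
# Apache Beam. Assumes apache-beam[gcp] is installed; resource names are
# hypothetical. On GCP this would typically run on the Dataflow runner.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            subscription="projects/my-proj/subscriptions/events-sub")
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "my-proj:analytics.events",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
    )
```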

Posted 1 month ago

Apply

4.0 - 7.0 years

5 - 8 Lacs

Gurugram

Work from Office


Key Responsibilities:
- Gather and analyze data from a variety of sources, including SQL databases, BigQuery, Excel, Power BI, and Python. Good SQL coding skills are a must.
- Work with stakeholders to understand their needs and translate them into data-driven solutions.
- Communicate effectively with stakeholders, both verbally and in writing.
- Lead and manage a team of business analysts.
- Must be a self-starter, able to manage multiple tasks and projects simultaneously, own deliverables end to end, prioritize workload effectively, and thrive in a dynamic environment.
- Must be a problem solver with outstanding discovery skills and a proven ability to translate underlying business needs into actionable insights.
- Works well under pressure, can work within stringent timelines, and collaborates with teams to achieve results.

Desired Profile:
- 4-6 years of relevant experience in the field of analytics.
- Hands-on technical capabilities: SQL, Advanced Excel, Power BI, BigQuery, R/Python (good to have).
- Strong analytical skills and sharp points of view shared with the organization.
- A penchant for business, curiosity about numbers, and persistence in working with data to generate insights.
- Provides customized knowledge for client work; prepares accurate, well-developed client deliverables.
- Experience in app analytics preferred.
- Experience with e-commerce or retail businesses preferred.

Posted 1 month ago

Apply

1.0 - 5.0 years

6 - 10 Lacs

Bengaluru

Work from Office


Job Title: Data Engineer
Experience: 5-8 Years
Location: Delhi, Pune, Bangalore (Hyderabad & Chennai also acceptable)
Time Zone: Aligned with UK time zone
Notice Period: Immediate joiners only

Role Overview:
We are seeking experienced Data Engineers to design, develop, and optimize large-scale data processing systems. You will play a key role in building scalable, efficient, and reliable data pipelines in a cloud-native environment, leveraging your expertise in GCP, BigQuery, Dataflow, Dataproc, and more.

Key Responsibilities:
- Design, build, and manage scalable and reliable data pipelines for real-time and batch processing.
- Implement robust data processing solutions using GCP services and open-source technologies.
- Create efficient data models and write high-performance analytics queries.
- Optimize pipelines for performance, scalability, and cost-efficiency.
- Collaborate with data scientists, analysts, and engineering teams to ensure smooth data integration and transformation.
- Maintain high data quality, enforce validation rules, and set up monitoring and alerting.
- Participate in code reviews, deployment activities, and production support.

Technical Skills Required:
- Cloud platforms: GCP (Google Cloud Platform) - mandatory.
- Key GCP services: Dataproc, BigQuery, Dataflow.
- Programming languages: Python, Java, PySpark.
- Data engineering concepts: data ingestion, Change Data Capture (CDC), ETL/ELT pipeline design (a CDC sketch follows below).
- Strong understanding of distributed computing, data structures, and performance tuning.

Required Qualifications & Attributes:
- 5-8 years of hands-on experience in data engineering roles.
- Proficiency in building and optimizing distributed data pipelines.
- Solid grasp of data governance and security best practices in cloud environments.
- Strong analytical and problem-solving skills.
- Effective verbal and written communication skills.
- Proven ability to work independently and in cross-functional teams.
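
The CDC sketch referenced above: a minimal upsert expressed as a BigQuery MERGE run from Python. The project, dataset, column names, and the `op` change-flag convention are all hypothetical assumptions for illustration.

```python
# Minimal sketch (illustrative only): a Change Data Capture upsert as a
# BigQuery MERGE. Assumes a staging table carrying an 'op' flag ('D' for
# deletes) populated by an upstream CDC feed; all names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

merge_sql = """
MERGE `my-proj.dw.customers` AS t
USING `my-proj.staging.customers_cdc` AS s
ON t.customer_id = s.customer_id
WHEN MATCHED AND s.op = 'D' THEN
  DELETE
WHEN MATCHED THEN
  UPDATE SET t.name = s.name, t.email = s.email
WHEN NOT MATCHED AND s.op != 'D' THEN
  INSERT (customer_id, name, email) VALUES (s.customer_id, s.name, s.email)
"""

client.query(merge_sql).result()  # blocks until the merge completes
```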

Posted 1 month ago

Apply

1.0 - 5.0 years

6 - 10 Lacs

Pune

Work from Office


We're Hiring: Python Developer!
We are looking for an experienced Python Developer to join our dynamic team in Pune, India. The ideal candidate will possess a strong background in software development and be proficient in writing efficient, reusable code. You will play a key role in designing and implementing scalable applications while collaborating with cross-functional teams.

Location: Pune, India
Work Mode: Hybrid
Role: Python Developer
Experience: 5+ years

What We're Looking For:
- Proven experience designing, building, and operating data-oriented solutions in a high-volume, transactional, global industry. Experience with advertising technology (AdTech) highly desired.
- Proven experience developing simple, scalable, reliable architectures; building and operating concurrent, distributed systems; and solving difficult and novel problems.
- Proven experience developing data structures and algorithms, including experience working with ML/AI solutions.
- Proven experience and a passion for developing and operating data-oriented and/or full stack solutions using Python, JavaScript/TypeScript, Airflow/Composer, Node, Kafka, Snowflake, BigQuery, and a mix of data platforms such as Spark, Hadoop, AWS Athena, Postgres, and Redis.
- Excellent SQL development, query optimization, and data pipeline development skills required.
- Strong experience using public cloud platforms, including AWS and GCP, is required; experience with Docker and Kubernetes strongly preferred.
- Experience supporting ML/AI highly desirable.
- Proven experience in modern software development and testing practices, with a willingness to share, partner with, support, and coach other engineers, product people, and operations. Experience employing TDD, BDD, or ATDD highly desirable.
- Proven experience contributing to the development of principles, practices, and tooling supporting agile, testing/QA, DevSecOps, automation, and SRE. Experience in Trunk Based Development, XP, and implementing CI/CD highly desirable.
- Experience in SaaS product engineering and operations highly desirable.
- A focus on continuous learning and improvement, both technically and professionally, in your industry, for you and your teams.
- Demonstrated resilience, with experience working in ambiguous situations.

What You'll Do:
- Develop software as a member of one of our engineering teams, participating in all stages of development, delivery, and operations, together with your tech lead, colleagues, and Product, Data Science, and Design leaders.
- Develop solutions that are simple, scalable, reliable, secure, maintainable, and make a measurable impact.
- Develop and deliver new features, maintain our product, and drive growth to hit team KPIs.
- Employ modern, pragmatic engineering principles, practices, and tooling, including TDD/BDD/ATDD, XP, QA Engineering, Trunk Based Development, Continuous Delivery, automation, DevSecOps, and Site Reliability Engineering.
- Contribute to driving ongoing improvements to our engineering principles, practices, and tooling.
- Provide support and mentorship to junior engineers, prioritizing continuous learning and development.
- Develop and maintain a contemporary understanding of AdTech developments, industry standards, partner and competitor platform developments, and commercial models from an engineering perspective. Combine these insights with technical expertise to contribute to our strategy and plans, influence product design, shape our roadmap, and help plan delivery.

Ready to take your career to the next level? Apply now and join us on this exciting journey!

Posted 1 month ago

Apply

10.0 - 18.0 years

25 - 30 Lacs

Noida

Work from Office


Responsibilities:
- Collaborate with the sales team to understand customer challenges and business objectives, and propose solutions, POCs, etc.
- Develop and deliver impactful technical presentations and demos showcasing the capabilities of GCP Data and AI, and GenAI solutions.
- Conduct technical proof-of-concepts (POCs) to validate the feasibility and value proposition of GCP solutions.
- Collaborate with technical specialists and solution architects from the COE team to design and configure tailored cloud solutions.
- Manage and qualify sales opportunities, working closely with the sales team to progress deals through the sales funnel.
- Stay up to date on the latest GCP offerings, trends, and best practices.

Experience:
- Design and implement a comprehensive strategy for migrating and modernizing existing relational on-premise databases to scalable and cost-effective solutions on Google Cloud Platform (GCP).
- Design and architect solutions for DWH modernization, with experience building data pipelines in GCP.
- Strong experience in BI reporting tools (Looker, Power BI, and Tableau).
- In-depth knowledge of Google Cloud Platform (GCP) services, particularly Cloud SQL, Postgres, AlloyDB, BigQuery, Looker, Vertex AI, and Gemini (GenAI).
- Strong knowledge and experience providing solutions to process massive datasets in real time and in batch using cloud-native/open-source orchestration techniques.
- Build and maintain data pipelines using Cloud Dataflow to orchestrate real-time and batch data processing for streaming and historical data.
- Strong knowledge of and experience with best practices for data governance, security, and compliance.
- Excellent communication and presentation skills, with the ability to tailor technical information to customer needs.
- Strong analytical and problem-solving skills.
- Ability to work independently and as part of a team.

Posted 1 month ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Gurugram

Work from Office


Key Responsibilities:
- Gather and analyze data from a variety of sources, including SQL databases, BigQuery, Excel, Power BI, and Python. Good SQL coding skills are a must.
- Work with stakeholders to understand their needs and translate them into data-driven solutions.
- Communicate effectively with stakeholders, both verbally and in writing.
- Lead and manage a team of business analysts.
- Must be a self-starter, able to manage multiple tasks and projects simultaneously, own deliverables end to end, prioritize workload effectively, and thrive in a dynamic environment.
- Must be a problem solver with outstanding discovery skills and a proven ability to translate underlying business needs into actionable insights.
- Works well under pressure, can work within stringent timelines, and collaborates with teams to achieve results.

Desired Profile:
- 4-7 years of relevant experience in the field of analytics.
- Hands-on technical capabilities: SQL, Advanced Excel, Power BI, BigQuery, R/Python (good to have).
- Strong analytical skills and sharp points of view shared with the organization.
- A penchant for business, curiosity about numbers, and persistence in working with data to generate insights.
- Provides customized knowledge for client work; prepares accurate, well-developed client deliverables.
- Experience in app analytics preferred.
- Experience with e-commerce or retail businesses preferred.

Posted 1 month ago

Apply

1.0 - 4.0 years

10 - 14 Lacs

Pune

Work from Office


Overview:
Design, develop, and maintain data pipelines and ETL/ELT processes using PySpark/Databricks. Optimize performance for large datasets through techniques such as partitioning, indexing, and Spark optimization (see the sketch below). Collaborate with cross-functional teams to resolve technical issues and gather requirements.

Responsibilities:
- Ensure data quality and integrity through data validation and cleansing processes.
- Analyze existing SQL queries, functions, and stored procedures for performance improvements.
- Develop database routines like procedures, functions, and views.
- Participate in data migration projects and understand technologies like Delta Lake/warehouse.
- Debug and solve complex problems in data pipelines and processes.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Strong understanding of distributed data processing platforms like Databricks and BigQuery.
- Proficiency in Python, PySpark, and SQL programming languages.
- Experience with performance optimization for large datasets.
- Strong debugging and problem-solving skills.
- Fundamental knowledge of cloud services, preferably Azure or GCP.
- Excellent communication and teamwork skills.

Nice to Have:
- Experience in data migration projects.
- Understanding of technologies like Delta Lake/warehouse.

What we offer you:
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients.
- A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles.
- An environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose: to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards, and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer.
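
The sketch referenced in the overview: a minimal PySpark example of the partitioning technique it names, deduplicating a raw extract and writing it as a date-partitioned Delta table. Paths and column names are hypothetical assumptions.

```python
# Minimal sketch (illustrative only): cleaning a raw extract and writing a
# date-partitioned Delta table. Assumes a Databricks/Delta Lake environment;
# paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

raw = spark.read.parquet("/mnt/raw/orders")  # hypothetical raw landing path
cleaned = (
    raw
    .dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("amount") > 0)
)

(
    cleaned
    .repartition("order_date")        # co-locate rows for each output partition
    .write.mode("overwrite")
    .partitionBy("order_date")        # enables partition pruning on reads
    .format("delta")
    .save("/mnt/silver/orders")
)
```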
It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com

Posted 1 month ago

Apply

4.0 - 8.0 years

16 - 25 Lacs

Bengaluru

Hybrid


Required Skills:
Successful candidates will have demonstrated the following skills and characteristics:

Must Have:
- Proven expertise in supply chain analytics across domains such as demand forecasting, inventory optimization, logistics, segmentation, and network design.
- Well-versed, hands-on experience with optimization methods like linear programming, mixed integer programming, and scheduling optimization. An understanding of third-party optimization solvers like Gurobi is an added advantage.
- Proficiency in forecasting techniques (e.g., Holt-Winters, ARIMA, ARIMAX, SARIMA, SARIMAX, FBProphet, NBeats) and machine learning techniques (supervised and unsupervised); a small forecasting sketch follows below.
- Strong command of statistical modeling, testing, and inference.
- Proficient in using GCP tools: BigQuery, Vertex AI, Dataflow, Looker.
- Experience building data pipelines and models for forecasting, optimization, and scenario planning.
- Strong SQL and Python programming skills; experience deploying models in a GCP environment.
- Knowledge of orchestration tools like Cloud Composer (Airflow).

Nice to Have:
- Familiarity with MLOps, containerization (Docker, Kubernetes), and orchestration tools (e.g., Cloud Composer).
- Strong communication and stakeholder engagement skills at the executive level.

Roles and Responsibilities:
- Assist analytics projects within the supply chain domain, driving design, development, and delivery of data science solutions.
- Develop and execute project and analysis plans under the guidance of the Project Manager.
- Interact with and advise consultants/clients in the US as a subject matter expert, formalizing the data sources to be used and datasets to be acquired, and resolving the data and use case clarifications needed to get a strong hold on the data and the business problem to be solved.
- Drive and conduct analysis using advanced analytics tools and coach junior team members.
- Implement necessary quality control measures to ensure deliverable integrity, including data quality, model robustness, and explainability for deployments.
- Validate analysis outcomes and recommendations with all stakeholders, including the client team.
- Build storylines and make presentations to the client team and/or PwC project leadership.
- Contribute to knowledge- and firm-building activities.
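
The forecasting sketch referenced above applies one of the named techniques, Holt-Winters exponential smoothing via statsmodels, to a synthetic monthly demand series; the data and parameters are illustrative assumptions.

```python
# Minimal sketch (illustrative only): Holt-Winters demand forecasting with
# statsmodels on a synthetic monthly series with trend and yearly seasonality.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

months = pd.date_range("2022-01-01", periods=36, freq="MS")
demand = pd.Series(
    100 + 2 * np.arange(36) + 15 * np.sin(2 * np.pi * np.arange(36) / 12),
    index=months,
)

model = ExponentialSmoothing(
    demand,
    trend="add",            # additive trend
    seasonal="add",         # additive seasonality
    seasonal_periods=12,    # yearly cycle on monthly data
).fit()

print(model.forecast(6))  # next six months of projected demand
```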

Posted 1 month ago

Apply

7.0 - 10.0 years

15 - 18 Lacs

Hyderabad, Bengaluru

Hybrid


We are seeking an experienced MuleSoft Developer who will be responsible for designing, developing, and maintaining integrations between various enterprise applications and systems. You will work with business and technical teams to ensure smooth data flow across systems, leveraging MuleSoft's Anypoint Platform to implement end-to-end integration solutions.

Responsibilities:
- Design and develop integration solutions using MuleSoft's Anypoint Platform, including Mule ESB, CloudHub, and API management tools.
- Work closely with business analysts, stakeholders, and system architects to gather integration requirements and transform them into technical solutions.
- Create and maintain reusable MuleSoft APIs and connectors for internal and external integrations.
- Experience integrating on-prem databases with the Salesforce ecosystem.
- Detailed knowledge of VPN implementation for on-prem data movement.
- Develop, test, deploy, and monitor APIs and integration services.
- Troubleshoot and resolve issues related to integrations, APIs, and MuleSoft components.
- Collaborate with cross-functional teams to support API lifecycle management and promote best practices for API design and development.
- Ensure high availability, scalability, and security of integration services.
- Integrate with different legacy systems and databases like MySQL and SQL Server.
- Ability to work with BigQuery and Oracle connectors and the MinIO API.
- Ability to perform end-to-end integration from on-prem and cloud databases to Salesforce Data Cloud and Marketing Cloud.
- Ability to work with Parquet files.
- Extensive experience designing, implementing, and optimizing REST, SOAP, and Bulk APIs.
- Conduct unit tests, code reviews, and quality assurance checks to ensure integration solutions meet high standards.
- Develop technical design documents.
- Experience implementing large-scale software (high transaction volume, high-availability concepts).
- Good programming skills and experience troubleshooting Mule ESB, including working with debuggers, flow analyzers, and configuration tools.
- Experience with service-oriented architecture and web services development.
- 2-3 years of experience working on an integration platform.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Business, Engineering, or a related field.
- Minimum of 4-5 years of experience in integration development, with at least 2-3 years of hands-on experience with MuleSoft.
- Experienced in building integration projects using Mule ESB, Mule API, and Mule CloudHub.
- Strong ability to manage and communicate with both technical and non-technical stakeholders.
- Solid understanding of software development, systems integration, and cloud technologies.
- Strong strategic thinking and planning skills.
- Ability to work in a fast-paced, dynamic environment and manage multiple priorities.
- Experience with Agile methodologies and version control systems like Git.

Preferred:
- MBA or advanced degree in a related field.
- MuleSoft certification (e.g., MuleSoft Certified Developer - Level 1).
- Knowledge of cloud environments like AWS, Azure, or Google Cloud Platform.
- Experience with API management, OAuth, JWT, and OpenID Connect.
- Experience in API integration with legacy systems.

Posted 1 month ago

Apply

10.0 - 15.0 years

30 - 40 Lacs

Bhopal, Pune, Gurugram

Hybrid


Job Title: Senior Data Engineer - GCP | Big Data | Airflow | dbt
Company: Xebia
Location: All Xebia locations
Experience: 10+ Years
Employment Type: Full Time
Notice Period: Immediate to max 30 days only

Job Summary:
Join the digital transformation journey of one of the world's most iconic global retail brands! As a Senior Data Engineer, you'll be part of a dynamic Digital Technology organization, helping build modern, scalable, and reliable data products to power business decisions across the Americas. You'll work in the Operations Data Domain, focused on ingesting, processing, and optimizing high-volume data pipelines using Google Cloud Platform (GCP) and other modern tools.

Key Responsibilities:
- Design, develop, and maintain highly scalable big data pipelines (batch & streaming).
- Collaborate with cross-functional teams to understand data needs and deliver efficient solutions.
- Architect robust data solutions using GCP-native services (BigQuery, Pub/Sub, Cloud Functions, etc.).
- Build and manage modern Data Lake/Lakehouse platforms.
- Create frameworks and reusable components for scalable ingestion and processing.
- Implement data governance and security, and ensure regulatory compliance.
- Mentor junior engineers and lead an offshore team of 8+ engineers.
- Monitor pipeline performance, troubleshoot bottlenecks, and ensure data quality.
- Engage in code reviews, CI/CD deployments, and agile product releases.
- Contribute to internal best practices and engineering standards.

Must-Have Skills & Qualifications:
- 8+ years in data engineering with strong hands-on experience in production-grade pipelines.
- Expertise in GCP data services: BigQuery, Vertex AI, Pub/Sub, etc.
- Proficiency in dbt (Data Build Tool) for data transformation.
- Strong programming skills in Python, Java, or Scala.
- Advanced SQL & NoSQL knowledge.
- Experience with Apache Airflow for orchestration (see the orchestration sketch below).
- Hands-on with Git, GitHub Actions, and Jenkins for CI/CD.
- Solid understanding of data warehousing (BigQuery, Snowflake, Redshift).
- Exposure to tools like Hadoop, Spark, Kafka, Databricks (nice to have).
- Familiarity with BI tools like Tableau, Power BI, or Looker (optional).
- Strong leadership qualities to manage offshore engineering teams.
- Excellent communication skills and stakeholder management experience.

Preferred Education:
Bachelor's or Master's degree in Computer Science, Engineering, Mathematics, or a related field.

Notice Period Requirement:
Only immediate joiners or candidates with a maximum 30-day notice period will be considered.

How to Apply:
If you are passionate about solving real-world data problems and want to be part of a global data-driven transformation, apply now by sending your resume to vijay.s@xebia.com with the subject line: "Sr Data Engineer Application - [Your Name]". Kindly include the following details in your email: Full Name, Total Experience, Current CTC, Expected CTC, Current Location, Preferred Location, Notice Period / Last Working Day, Key Skills. Please do not apply if you are currently in process with any other role at Xebia or have recently interviewed.
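
The orchestration sketch referenced above: a minimal Airflow DAG that chains a hypothetical ingestion script into a dbt run. The DAG id, schedule, paths, and dbt target are assumptions for illustration (Airflow 2.x BashOperator).

```python
# Minimal sketch (illustrative only): an Airflow 2.x DAG running ingestion
# then a dbt transformation. All ids, paths, and targets are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="ops_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 4 * * *",  # daily at 04:00
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="ingest_raw",
        bash_command="python /opt/pipelines/ingest.py",  # hypothetical script
    )
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/ops --target prod",
    )

    ingest >> transform  # dbt runs only after ingestion succeeds
```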

Posted 1 month ago

Apply

5.0 - 8.0 years

27 - 42 Lacs

Bengaluru

Work from Office


Years of Experience: 5-12 years
Location: PAN India

OFSAA Data Modeler:
- Experience in designing, building, and customizing the OFSAA data model, and validating the data model.
- Excellent knowledge of data model guidelines for staging, processing, and reporting tables.
- Knowledge of data model support for configuring UDPs and subtype/supertype relationship enhancements.
- Experience on the OFSAA platform (OFSAAI) with one or more of the following OFSAA modules:
  - OFSAA Financial Services Data Foundation (preferred)
  - OFSAA Data Integration Hub (optional)
- Good in SQL and PL/SQL.
- Strong in data warehouse principles and ETL/data flow tools.
- Excellent analytical and communication skills.

OFSAA Integration SME - DIH/batch run framework:
- Experience in ETL processes; familiar with OFSAA.
- DIH setup in EDS, EDD, T2T, etc.
- Familiar with different seeded tables, SCD, DIM, hierarchies, lookups, etc.
- Has worked with FSDF, knowing the STG, CSA, and FACT table structures.
- Has worked with different APIs, out-of-the-box connectors, etc.
- Familiar with Oracle patching and SRs.

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 27 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid


Job Title: Data Engineer - GCP
Company: Xebia
Location: Hybrid - any Xebia location
Experience: 5+ Years
Salary: As per industry standards
Job Type: Full Time

About the Role:
Xebia is hiring a seasoned Data Engineer (L4) to join a high-impact team building scalable data platforms using GCP, Databricks, and Airflow. If you thrive on architecting future-ready solutions and have strong experience in big data transformations, we'd love to hear from you.

Project Overview:
We currently manage 1000+ data pipelines using Databricks clusters for end-to-end data transformation (Raw -> Silver -> Gold), with orchestration handled via Airflow, all on Google Cloud Platform (GCP). Curated datasets are delivered through BigQuery and Databricks Notebooks. Our roadmap includes migrating to a GCP-native data processing framework optimized for Spark workloads. (A sketch of one such transformation hop follows below.)

Key Responsibilities:
- Design and implement a GCP-native data processing framework.
- Analyze and plan migration of existing workloads to a cloud-native architecture.
- Ensure data availability, integrity, and consistency.
- Build reusable tools and standards for the Data Engineering team.
- Collaborate with stakeholders and document processes thoroughly.

Required Experience:
- 5+ years in data engineering with strong data architecture experience.
- Hands-on expertise in Databricks, Airflow, BigQuery, and PySpark.
- Deep knowledge of GCP services for data processing (Dataflow, Dataproc, etc.).
- Familiarity with data lake table formats like Delta and Iceberg.
- Experience with orchestration tools (Airflow, Dagster, or similar).

Key Skills:
- Python programming.
- Strong understanding of data lake architectures and cloud-native best practices.
- Excellent problem-solving and communication skills.

Notice Period Requirement:
Only immediate joiners or candidates with a maximum 30-day notice period will be considered.

How to Apply:
Interested candidates can share their details and updated resume with vijay.s@xebia.com in the following format: Full Name; Total Experience (must be 5+ years); Current CTC; Expected CTC; Current Location; Preferred Location; Notice Period / Last Working Day (if serving notice); Primary Skill Set. Note: Please apply only if you have not recently applied or interviewed for any open roles at Xebia.
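
The transformation sketch referenced in the project overview: one hypothetical Raw -> Silver -> Gold hop in PySpark on Delta tables. Paths, columns, and the metric are illustrative assumptions, not the project's actual pipeline.

```python
# Minimal sketch (illustrative only): one medallion-architecture hop on
# Databricks with Delta tables. Paths and schema are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Raw -> Silver: enforce types and drop malformed rows.
raw = spark.read.format("delta").load("/mnt/raw/clickstream")
silver = (
    raw
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .dropna(subset=["user_id", "event_ts"])
)
silver.write.format("delta").mode("overwrite").save("/mnt/silver/clickstream")

# Silver -> Gold: a curated daily aggregate for downstream consumers.
gold = (
    silver
    .groupBy(F.to_date("event_ts").alias("event_date"))
    .agg(F.countDistinct("user_id").alias("daily_active_users"))
)
gold.write.format("delta").mode("overwrite").save("/mnt/gold/daily_active_users")
```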

Posted 1 month ago

Apply

1.0 - 3.0 years

10 - 15 Lacs

Kolkata, Gurugram, Bengaluru

Hybrid


Salary: 10 to 16 LPA
Experience: 1 to 3 years
Location: Gurgaon / Bangalore / Kolkata (Hybrid)
Notice: Immediate to 30 days
Key Skills: GCP, Cloud, Pub/Sub, Data Engineer

Posted 1 month ago

Apply

3.0 - 8.0 years

15 - 30 Lacs

Gurugram, Bengaluru

Hybrid


Salary: 15 to 30 LPA
Experience: 3 to 8 years
Location: Gurgaon / Bangalore (Hybrid)
Notice: Immediate to 30 days
Key Skills: GCP, Cloud, Pub/Sub, Data Engineer

Posted 1 month ago

Apply

13.0 - 15.0 years

35 - 50 Lacs

Hyderabad

Work from Office


We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in!

REQUIREMENTS:
- Total experience: 13+ years.
- Hands-on experience in big data engineering and data architecture.
- Strong working knowledge of Google Cloud Platform services.
- Strong working experience in Google BigQuery, Google Cloud Storage, Cloud Dataflow, Cloud Composer, and Dataproc.
- Advanced proficiency in Python and SQL for data engineering and automation.
- Solid understanding of data modeling, data warehousing, and distributed computing frameworks like Apache Spark.
- Deep knowledge of data governance, security controls, and regulatory compliance within cloud ecosystems.
- Strong leadership, problem-solving, and communication skills to drive technical direction and stakeholder engagement.
- Experience working in agile development environments.
- Strong communication and coordination skills to work with cross-functional and globally distributed teams.

RESPONSIBILITIES:
- Writing and reviewing great quality code.
- Understanding the project's functional and non-functional requirements and the business context of the application being developed.
- Understanding and documenting requirements validated by the SMEs.
- Interacting with clients to identify the scope of testing, expectations, acceptance criteria, and the availability of test data and environments.
- Working closely with the product owner in defining and refining acceptance criteria.
- Preparing the test plan/strategy.
- Estimating the test effort and preparing schedules for testing activities, assigning tasks, and identifying constraints and dependencies.
- Risk management: identifying, mitigating, and resolving business and technical risks; determining the potential causes of problems and analyzing multiple alternatives.
- Designing and developing a framework for automated testing following the project's design and coding guidelines.
- Setting up best practices for test automation.
- Preparing test reports to summarize the outcome of the testing phase and recommending whether the application is in a shippable state.
- Communicating measurable quality metrics, with the ability to highlight problem areas and suggest solutions.
- Participating in retrospective meetings, helping identify the root cause of any quality-related issues, and identifying ways to continuously improve the testing process.
- Conducting demos of the application for internal and external stakeholders.
- Reviewing all testing artifacts prepared by the team and ensuring that defects found during review are tracked to closure.
- Working with the team and stakeholders to triage and prioritize defects for resolution.
- Giving constructive feedback to team members and setting clear expectations.

Posted 1 month ago

Apply

0.0 - 1.0 years

6 - 10 Lacs

Pune

Work from Office


Job Title: Software Engineer
Location: Pune, Maharashtra
Company: Zerebral IT Solutions Pvt. Ltd.
Website: www.zerebral.co.in
Experience Level: 0-1 years
Employment Type: Full-time

About Zerebral:
Founded in 2011 and headquartered in Pune, Zerebral IT Solutions is a privately held technology company specializing in building scalable products, big data platforms, and vertical search engines. We focus on large-scale web data extraction, massive-scale APIs, and cloud-native microservices. Our team thrives on solving complex engineering challenges and delivering real-world impact through innovation.

Role Overview:
We are seeking a passionate and experienced Software Engineer to join our dynamic team. In this role, you will be instrumental in designing, developing, and optimising high-performance systems that handle large-scale data processing and API integrations. You will collaborate with cross-functional teams to deliver robust solutions that align with our commitment to excellence and innovation.

Key Responsibilities:
- Design, develop, and maintain scalable APIs and microservices for data-intensive applications.
- Implement large-scale web data extraction and aggregation systems.
- Optimize backend services for performance, scalability, and reliability.
- Collaborate with product managers, data scientists, and frontend developers to deliver end-to-end solutions.
- Ensure code quality through code reviews, testing, and adherence to best practices.
- Monitor system performance and troubleshoot issues proactively.

Why Join Zerebral?
- Work on cutting-edge technologies and challenging projects.
- Collaborate with a team of passionate and skilled professionals.
- Opportunities for continuous learning and career growth.
- Flexible work environment with a focus on work-life balance.
- Competitive compensation and benefits package.

Educational Qualification (minimum cut-off percentages):
- HSC: 70% and above
- SSC: 80% and above
- B.E. (Computer Science/IT/Electronics and related branches): 60% and above

Posted 1 month ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office


We are looking to hire a Senior Manager, Customer Analytics to join our team in Bangalore! Reporting to the Director of Customer Insights & Analytics, you'll lead a high-performing team of analysts and subject matter experts to deliver customer-centric insights, business intelligence, and data-driven strategies. In this role, you'll act as a strategic partner to global collaborators across Marketing, Sales, and CRM, helping drive improvements in customer experience, marketing performance, and commercial outcomes. You'll also play a key role in mentoring talent, crafting analytics processes, and scaling insight capabilities across multiple business units.

What You'll Do:
- Lead and mentor a team of 6-8 analysts and data professionals, encouraging a collaborative, high-performance culture.
- Drive customer analytics initiatives including segmentation, cohort analysis, and performance tracking (see the sketch below).
- Partner with global Sales, Marketing, CRM, and Data leaders to align on priorities and deliver actionable insights.
- Translate complex data into clear recommendations and compelling narratives for non-technical collaborators.
- Influence senior-level decision-making through data-driven storytelling and strategic insight.
- Coordinate end-to-end delivery of analytics projects across divisions, ensuring quality and business relevance.
- Improve and standardize analytics tools, processes, and documentation for scalability and consistency.
- Collaborate with Tech and Data Engineering teams to improve customer data infrastructure and analytics capabilities.
- Serve as a trusted advisor to senior leadership on leveraging customer intelligence for business growth.
- Leverage AI to improve team productivity and analytics efficiency.

What You'll Bring:
- Extensive experience in data analytics, customer insights, or marketing analytics, with a proven record in team leadership and management.
- A track record of leading customer-centric analytics initiatives within global or matrixed organizations.
- Advanced analytical skills with hands-on expertise in SQL and tools like Python, R, or similar statistical platforms.
- Strong proficiency in BI tools such as Power BI and Google Data Studio.
- Familiarity with cloud-based data platforms like Google BigQuery and similar analytics stacks.
- Skill in communicating complex analytical insights clearly and persuasively to senior collaborators.
- A business-savvy approach with the ability to translate data into impactful strategic actions.
- Proven success in mentoring and developing high-performing analytics teams.
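
The sketch referenced above: a minimal cohort retention computation in pandas on a tiny synthetic orders extract, illustrating the cohort-analysis work the role names. The data and column names are hypothetical.

```python
# Minimal sketch (illustrative only): cohort retention in pandas.
# Each customer is assigned to the cohort of their first order month;
# retention is the share of the cohort active in each later month.
import pandas as pd

orders = pd.DataFrame({
    "customer_id": [1, 1, 2, 2, 3],
    "order_date": pd.to_datetime(
        ["2024-01-05", "2024-02-10", "2024-01-20", "2024-03-02", "2024-02-14"]
    ),
})

orders["order_month"] = orders["order_date"].dt.to_period("M")
orders["cohort"] = orders.groupby("customer_id")["order_month"].transform("min")

counts = (
    orders.groupby(["cohort", "order_month"])["customer_id"]
    .nunique()
    .unstack(fill_value=0)
)
cohort_sizes = orders.groupby("cohort")["customer_id"].nunique()
retention = counts.div(cohort_sizes, axis=0)  # row-normalize by cohort size

print(retention.round(2))
```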

Posted 1 month ago

Apply