Home
Jobs

731 BigQuery Jobs - Page 29

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

6 - 10 years

16 - 25 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Source: Naukri

Hiring! Role: Sr. Power BI Developer
Client: MNC (full-time, permanent)
Experience: 6-12 years
Notice period: Immediate / serving / 15 days
Location: Pan India
Skills: Power BI dashboards, Power Query, GCP (BigQuery), SQL Server, Power Apps

Please fill in the details below and share your updated CV to mansoor@burgeonits.com:
- Name as per Aadhaar card
- Mobile no. & alternate no.
- Email ID & alternate email
- Date of birth
- PAN card no. (mandatory, for client upload)
- Total experience
- Relevant experience
- Current company (and payroll company name, if any)
- Notice period (if serving, mention last working day)
- Current CTC / expected CTC
- Any offers (yes/no); if yes, offer amount and joining date
- Current location & preferred location
- Happy to relocate (yes/no)
- Available interview time slots

Posted 1 month ago

Apply

11 - 20 years

45 - 75 Lacs

Gurugram

Work from Office


Role & responsibilities:
- Proven success architecting and scaling complex software solutions; familiarity with interface design.
- Experience and ability to drive a project/module independently from an execution standpoint.
- Prior experience with scalable architecture and distributed processing.
- Strong programming expertise in Python, SQL, and Scala.
- Hands-on experience with a major big data stack such as Spark, Kafka, or Hive.
- Strong data management skills: ETL, DWH, data quality, and data governance.
- Hands-on experience with microservices architecture, Docker, and Kubernetes as orchestration.
- Experience with cloud-based data stores such as Redshift and BigQuery.
- Experience in cloud solution architecture.
- Experience architecting Spark jobs on Kubernetes and optimizing Spark jobs.
- Experience with MLOps architecture/tools/orchestrators such as Kubeflow and MLflow.
- Experience with logging, metrics, and distributed tracing systems (e.g. Prometheus/Grafana/Kibana).
- Experience with CI/CD using Octopus/TeamCity/Jenkins.

Interested candidates can share their updated resume at surinder.kaur@mounttalent.com
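The data quality and data governance skills the listing above asks for often come down to codified validation rules applied in an ETL step. A minimal stdlib-only sketch; the rule names and record fields are invented for illustration, not from any employer's stack:

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical row-level data-quality rules of the kind a governance
# layer might enforce before loading into a warehouse.

@dataclass
class Rule:
    name: str
    check: Callable[[dict], bool]

RULES = [
    Rule("non_null_id", lambda r: r.get("id") is not None),
    Rule("amount_non_negative", lambda r: r.get("amount", 0) >= 0),
    Rule("known_currency", lambda r: r.get("currency") in {"INR", "USD", "EUR"}),
]

def validate(rows):
    """Split rows into (clean, rejected), attaching the failed rule names."""
    clean, rejected = [], []
    for row in rows:
        failures = [rule.name for rule in RULES if not rule.check(row)]
        (clean if not failures else rejected).append((row, failures))
    return [r for r, _ in clean], rejected

rows = [
    {"id": 1, "amount": 250.0, "currency": "INR"},
    {"id": None, "amount": -5.0, "currency": "XYZ"},
]
good, bad = validate(rows)
print(len(good), len(bad))  # one clean row, one rejected row
print(bad[0][1])            # names of the rules the bad row failed
```

In a real pipeline the rejected rows would typically land in a quarantine table with the failure reasons, so governance reporting can track rule hit rates over time.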

Posted 1 month ago

Apply

5 - 10 years

20 - 25 Lacs

Bengaluru

Work from Office


About The Role

Job Title: Transformation Principal Change Analyst
Corporate Title: AVP
Location: Bangalore, India

Role Description
We are looking for an experienced Change Manager to lead a variety of regional/global change initiatives. Utilizing the tenets of PMI, you will lead cross-functional initiatives that transform the way we run our operations. If you like to solve complex problems, have a get-things-done attitude, and are looking for a highly visible, dynamic role where your voice is heard and your experience is appreciated, come talk to us.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 and above

Your key responsibilities
- Responsible for change-management planning, execution, and reporting, adhering to governance standards and ensuring transparency around progress status; using data to tell the story, maintain risk-management controls and monitor and communicate initiative risks.
- Collaborate with other departments as required to execute on timelines and meet strategic goals.
- As part of the larger team, accountable for the delivery and adoption of the global change portfolio, including but not limited to business-case development/analysis, reporting, measurement of adoption success measures, and continuous improvement.
- As required, using data to tell the story, participate in Working Group and Steering Committee meetings to achieve the right level of decision-making and progress transparency, establishing strong partnerships and collaborative relationships with stakeholder groups to remove constraints to success and carry lessons forward to future projects.
- As required, develop and document end-to-end roles and responsibilities, including process flows, operating procedures, and required controls; gather and document business requirements (user stories), including liaising with end users and analyzing gathered data.
- Heavily involved in the product development journey.

Your skills and experience
- Overall experience of at least 7-10 years leading complex change programs/projects, communicating and driving transformation initiatives using the tenets of PMI in a highly matrixed environment.
- Banking/finance/regulated-industry experience, of which at least 2 years in the change/transformation space or associated with change/transformation initiatives, is a plus.
- Knowledge of client lifecycle processes and procedures, and experience with KYC data structures/data flows, is preferred.
- Experience working with management reporting is preferred.
- Bachelor's degree.

How we'll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 1 month ago

Apply

1 - 5 years

6 - 11 Lacs

Pune

Work from Office


About The Role

Job Title: Data Engineer for Private Bank One Data Platform on Google Cloud
Corporate Title: Associate
Location: Pune, India

Role Description
As part of one of the internationally staffed agile teams of the Private Bank One Data Platform, you are part of the "TDI PB Germany Enterprise & Data" division. The focus is on the development, design, and provision of different solutions in the field of data warehousing, reporting, and analytics for the Private Bank, ensuring that the necessary data is provided for operational and analytical purposes. The PB One Data Platform is the new strategic data platform of the Private Bank and uses the Google Cloud Platform as its basis. With Google as a close partner, we are following Deutsche Bank's cloud strategy with the aim of transferring or rebuilding a significant share of today's on-prem applications to the Google Cloud Platform.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 and above

Your key responsibilities
- Work within software development applications as a Data Engineer to provide fast and reliable data solutions for warehousing, reporting, and Customer and Business Intelligence solutions.
- Partner with service/backend engineers to integrate data provided by legacy IT solutions into the databases you design, and make it accessible to the services consuming that data.
- Focus on the design and setup of databases, data models, and data transformations (ETL) for critical online-banking business processes in the context of Customer Intelligence, financial reporting, and performance controlling.
- Contribute to data harmonization as well as data cleansing.
- Bring a passion for constantly learning and applying new technologies and programming languages in a constantly evolving environment.
- Build solutions that are highly scalable and can be operated flawlessly under high-load scenarios.
- Together with your team, run and develop your application self-sufficiently.
- Collaborate with Product Owners and team members on the design and implementation of data analytics solutions, and act as support during the conception of products and solutions.
- When you see a process running with high manual effort, fix it to run automated, optimizing not only our operating model but also giving yourself more time for development.

Your skills and experience

Mandatory skills:
- Hands-on development work building scalable data engineering pipelines and other data engineering/modelling work using Java/Python.
- Excellent knowledge of SQL and NoSQL databases.
- Experience working in a fast-paced and agile work environment.
- Working knowledge of a public cloud environment.

Preferred skills:
- Experience in Dataflow (Apache Beam), Cloud Functions, Cloud Run.
- Knowledge of workflow management tools such as Apache Airflow/Composer.
- Demonstrated ability to write clear, well-documented code stored in a version control system (GitHub).
- Knowledge of GCS buckets, Google Pub/Sub, BigQuery.
- Knowledge of ETL processes in a Data Warehouse/Data Lake environment and how to automate them.

Nice to have:
- Knowledge of provisioning cloud resources using Terraform.
- Knowledge of shell scripting.
- Experience with Git, CI/CD pipelines, Docker, and Kubernetes.
- Knowledge of Google Cloud Monitoring & Alerting.
- Knowledge of Cloud Run, Dataform, Cloud Spanner.
- Knowledge of the Data Vault 2.0 data warehouse methodology.
- Knowledge of New Relic.
- Excellent analytical and conceptual thinking.
- Excellent communication skills, strong independence and initiative, and the ability to work in agile delivery teams.
- Good communication and experience working with distributed teams (especially Germany + India).

How we'll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
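The data harmonization and cleansing responsibilities in the listing above typically mean mapping legacy feeds onto a target schema. A minimal illustrative sketch in Python; the legacy field names, date format, and decimal-comma convention are invented assumptions, not details of the bank's actual feeds:

```python
from datetime import datetime

# Illustrative harmonization step: mapping records from a hypothetical legacy
# feed onto a hypothetical target warehouse schema (all names are invented).

LEGACY_TO_TARGET = {"KUNDE_NR": "customer_id", "BETRAG": "amount", "DATUM": "booking_date"}

def harmonize(record: dict) -> dict:
    """Rename legacy fields, then normalize date and number formats."""
    out = {target: record.get(legacy) for legacy, target in LEGACY_TO_TARGET.items()}
    # Normalize a German-style DD.MM.YYYY date to ISO-8601.
    out["booking_date"] = datetime.strptime(out["booking_date"], "%d.%m.%Y").date().isoformat()
    # Normalize a decimal-comma amount to a float.
    out["amount"] = float(str(out["amount"]).replace(",", "."))
    return out

row = harmonize({"KUNDE_NR": "42", "BETRAG": "1234,50", "DATUM": "31.01.2024"})
print(row)  # {'customer_id': '42', 'amount': 1234.5, 'booking_date': '2024-01-31'}
```

At scale the same per-record function would run inside a Dataflow/Beam `Map` step or a SQL transformation rather than a plain loop; the point is that harmonization rules live in one auditable place.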

Posted 1 month ago

Apply

3 - 7 years

10 - 14 Lacs

Pune

Work from Office


About The Role

Job Title: GCP Data Engineer, AS
Location: Pune, India

Role Description
An Engineer is responsible for designing and developing entire engineering solutions to accomplish business goals. Key responsibilities of this role include ensuring that solutions are well architected, with maintainability and ease of testing built in from the outset, and that they can be integrated successfully into the end-to-end business process flow. They will have gained significant experience through multiple implementations and have begun to develop both depth and breadth in several engineering competencies. They have extensive knowledge of design and architectural patterns. They will provide engineering thought leadership within their teams and play a role in mentoring and coaching less experienced engineers.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 and above

Your key responsibilities
- Design, develop, and maintain data pipelines using Python and SQL on GCP.
- Apply Agile methodologies and ETL, ELT, data movement, and data processing skills.
- Work with Cloud Composer to manage and process batch data jobs efficiently.
- Develop and optimize complex SQL queries for data analysis, extraction, and transformation.
- Develop and deploy Google Cloud services using Terraform.
- Implement CI/CD pipelines using GitHub Actions.
- Consume and host REST APIs using Python.
- Monitor and troubleshoot data pipelines, resolving any issues in a timely manner.
- Ensure team collaboration using Jira, Confluence, and other tools.
- Quickly learn new and existing technologies; bring strong problem-solving skills.
- Write advanced SQL and Python scripts.
- Google Cloud Professional Data Engineer certification is an added advantage.

Your skills and experience
- 6+ years of IT experience as a hands-on technologist.
- Proficient in Python for data engineering; proficient in SQL.
- Hands-on experience with GCP Cloud Composer, Dataflow, BigQuery, Cloud Functions, and Cloud Run; GKE is good to have.
- Hands-on experience hosting and consuming REST APIs.
- Proficient in Terraform (HashiCorp).
- Experienced with GitHub and GitHub Actions; experienced in CI/CD.
- Experience automating ETL testing using Python and SQL.
- APIGEE and Bitbucket are good to have.

How we'll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs
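The "optimize complex SQL queries" point in the listing above usually hinges on partition pruning in BigQuery: filtering on the partition column so a daily job scans one partition rather than the whole table. A hedged sketch of composing such a query in Python; the project, dataset, table, and column names are invented, and a real pipeline should prefer BigQuery query parameters over string interpolation:

```python
# Illustrative only: composing a partition-pruned BigQuery SQL statement of
# the kind a Cloud Composer task might submit. All names are invented.

def daily_extract_sql(project: str, dataset: str, table: str, ds: str) -> str:
    """Build a query filtered on the ingestion-time partition column so
    BigQuery scans only one day's partition instead of the full table."""
    return (
        f"SELECT customer_id, SUM(amount) AS total_amount\n"
        f"FROM `{project}.{dataset}.{table}`\n"
        f"WHERE DATE(_PARTITIONTIME) = DATE('{ds}')\n"
        f"GROUP BY customer_id"
    )

sql = daily_extract_sql("my-project", "sales", "transactions", "2024-01-31")
print(sql)
```

In Composer/Airflow the `ds` argument would come from the DAG's execution date, making the query idempotent per day; re-running a date reprocesses exactly that partition.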

Posted 1 month ago

Apply

4 - 9 years

14 - 19 Lacs

Pune

Work from Office


About The Role
We are looking for a passionate and self-motivated Technology Leader to join our team in the Accounting domain. Being part of a diverse, multi-disciplinary global team, you will collaborate with other disciplines to shape technology strategy, drive engineering excellence, and deliver business outcomes.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
* Best-in-class leave policy
* Gender-neutral parental leave
* 100% reimbursement under the childcare assistance benefit (gender neutral)
* Sponsorship for industry-relevant certifications and education
* Employee Assistance Program for you and your family members
* Comprehensive hospitalization insurance for you and your dependents
* Accident and term life insurance
* Complimentary health screening for those 35 and above

This role is responsible for the design and implementation of high-quality technology solutions. The candidate should have demonstrated technical expertise and excellent problem-solving skills. The candidate is expected to:
* be a hands-on engineering lead involved in analysis, design, design/code reviews, coding, and release activities;
* champion engineering best practices and guide/mentor the team to achieve high performance;
* work closely with business stakeholders, the Tribe Lead, the Product Owner, and the Lead Architect to successfully deliver business outcomes;
* acquire functional knowledge of the business capability being digitized/re-engineered;
* demonstrate ownership, inspire others, show innovative thinking and a growth mindset, and collaborate for success;
* focus on upskilling people, team building, and career development;
* keep up to date with industry trends and developments.

Your Skills & Experience:
* Minimum 15 years of IT industry experience in full-stack development.
* Expert in Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS.
* Strong experience in big data processing: Apache Spark, Hadoop, BigQuery, Dataproc, Dataflow, etc.
* Strong experience with Kubernetes and the OpenShift container platform.
* Experience with databases (Oracle, PostgreSQL, MongoDB, Redis/Hazelcast); should understand data modeling, normalization, and performance optimization.
* Experience with message queues (RabbitMQ/IBM MQ, JMS) and data streaming, i.e. Kafka, Pub/Sub, etc.
* Experience working on public cloud: GCP preferred; AWS or Azure.
* Knowledge of various distributed/multi-tiered architecture styles: microservices, data mesh, integration patterns, etc.
* Experience with modern software product delivery practices, processes, and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, GitHub Actions, etc.
* Experience designing solutions based on DDD and implementing Clean/Hexagonal Architecture: efficient systems that can handle large-scale operation.
* Experience leading teams and mentoring developers.
* Focus on quality: experience with TDD, BDD, stress and contract tests.
* Proficient in working with APIs (Application Programming Interfaces) and data formats like JSON, XML, YAML, Parquet, etc.

Advantageous:
* Prior experience in the banking/finance domain
* Experience with hybrid cloud solutions, preferably using GCP
* Experience in product development

How we'll support you:
* Training and development to help you excel in your career
* Coaching and support from experts in your team
* A culture of continuous learning to aid progression
* A range of flexible benefits that you can tailor to suit your needs

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day.
This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
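The Clean/Hexagonal Architecture item in the skills list above can be illustrated with a minimal ports-and-adapters sketch: the domain core depends only on an abstract "port", and infrastructure supplies interchangeable adapters. The ledger use case and all names below are invented for illustration, not the employer's design:

```python
from dataclasses import dataclass
from typing import Protocol

# Minimal ports-and-adapters (hexagonal) sketch around an invented
# "post a ledger entry" use case.

@dataclass
class LedgerEntry:
    account: str
    amount: float

class LedgerRepository(Protocol):   # the "port": all the core needs to know
    def save(self, entry: LedgerEntry) -> None: ...

class PostEntry:                    # the domain core, framework-free
    def __init__(self, repo: LedgerRepository):
        self.repo = repo

    def __call__(self, account: str, amount: float) -> LedgerEntry:
        if amount == 0:
            raise ValueError("zero-amount entries are not posted")
        entry = LedgerEntry(account, amount)
        self.repo.save(entry)
        return entry

class InMemoryRepo:                 # one adapter; a DB or REST adapter is another
    def __init__(self):
        self.rows = []

    def save(self, entry: LedgerEntry) -> None:
        self.rows.append(entry)

repo = InMemoryRepo()
post = PostEntry(repo)
post("4711", 99.5)
print(len(repo.rows))  # 1
```

The payoff for testability (the TDD/contract-test point above) is that the core is exercised with the in-memory adapter, while the same contract tests run against the real database adapter.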

Posted 1 month ago

Apply

4 - 9 years

16 - 20 Lacs

Pune

Work from Office


About The Role

Job Title: IT Application Owner, AS
Location: Pune, India

Role Description
Deutsche Bank's Strategy & Innovation Engineering team identifies, evaluates, and incubates cutting-edge technical innovation. It is part of the Chief Strategy Office of the bank's Technology, Data & Innovation (TDI) function and works globally with all business lines and infrastructure functions of the bank. A focus of the team is creating value for clients and the bank using Artificial Intelligence, Large Language Models (LLMs), and other advanced data-driven technologies. As an ITAO, you will join the innovation engineering team and contribute to supporting and managing new AI products and services for the entire Deutsche Bank Group. We require technical specialists to help research, design, and implement state-of-the-art AI services, with a particular focus on performing technology evaluations of AI products. You will make a real difference for senior stakeholders across core banking functions, where computational, complexity, and efficiency challenges abound, through your own delivery and through the promotion of modern AI development best practices and techniques.

Overview
We are seeking a talented and experienced AI Engineer to join our team. The ideal candidate will be hands-on and drive the design, development, and implementation of AI-based solutions for CB Tech. This role involves working with large datasets, conducting experiments, and staying updated with the latest advancements in AI and Machine Learning. This person is expected to innovate and support the Innovation team's efforts to modernize the engineering landscape by identifying AI use cases, and to provide local support by owning the ITAO role for the bank's AI Platform. If you have an engineering mindset, a passion for AI, and want to be part of developing innovative products, then apply today.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 and above

Your key responsibilities
- The IT Application Owner (ITAO) is responsible for application management and must ensure that applications are enhanced and maintained in accordance with the bank's IT Policy requirements on application lifecycle governance.
- Design, develop, and deploy solutions using advanced analytics, Machine Learning/AI, and cloud technologies that fulfil Deutsche Bank's innovation strategy.
- Contribute to effective and efficient technical research and experiments through technology evaluations, publishing AI research reports, and building proofs of concept (POCs).
- Engage with business stakeholders to identify and evaluate opportunities to create value through innovative solutions.
- Foster adoption of AI and ML by collaborating with cross-functional teams and educating stakeholders on AI-driven solutions.
- Stay up to date with the latest advancements in AI and data science.
- Ongoing enhancement and maintenance of the application, including management of scope; ensure that changes to the applications in scope are fully aligned with DB standards and regulations. The main focus is to guarantee system stability and to ensure a smooth and successful transition to a production, steady-state environment.
- Conduct strategic planning for the application; manage strategic capacity, consumption, and performance (forecasting and management based on business plans); ensure policy compliance for the application.
- Facilitate and contribute to audit activities; manage software licenses, security certificates, and contracts with service providers; ensure documentation availability.
- Identify and manage technical projects necessary to ensure that required and established service levels are maintained.
- Work with the development center team to estimate work effort throughout the different phases of the functional-domain deliverables.
- Assist with the development of configuration/monitoring/packaging/deployment automation for AI platforms.
- Identify, document, and communicate risks and issues discovered during the delivery cycle.

Your skills and experience
- Excellent communication and presentation skills; highly organized and disciplined.
- Experienced in working with multiple stakeholders; able to build and naturally maintain good business relationships with all stakeholders.
- IT Service Management, IT Governance, or IT Project Management background; awareness of the ITAO and TISO roles, and of compliance, risk, and governance concepts in the financial industry.
- Comfortable working in VUCA (volatility, uncertainty, complexity, ambiguity) and highly dynamic environments.
- An ITAO will typically have rather limited hands-on technical involvement; a high-level understanding of the products/technologies below is welcomed:
  - Google Cloud: GKE, Terraform, IAM, BigQuery, Cloud Shell, Cloud Storage
  - AI/ML: AI agents, AI/ML concepts, ML models, Vertex AI, AutoML, BigQuery ML
  - MLOps and CI/CD pipelines: Kubeflow, Vertex AI Pipelines
  - Designing, deploying, and managing AI agents, e.g. chatbots and virtual assistants
  - GCP networking: networking protocols, security concepts, VPCs, load balancers
  - Very basic Unix server administration
  - Python, shell scripting, SQL
  - Familiarity with fine-tuning and deploying large language models on GCP
  - Understanding of security best practices, including data governance, encryption, and compliance with AI-related regulations
  - GCP Cloud Logging, Cloud Monitoring, and AI model performance tracking
- 6+ years of work experience in IT (6+ for AVP, 4+ for Associate).
- Strong problem-solving skills and a passion for AI research.
- Good interpersonal skills, with the ability to cooperate and collaborate with other teams.

Educational qualifications
- B.E./B.Tech./Master's degree in computer science or equivalent.
- Added advantage: GCP certifications, Kubernetes certifications, AI/ML educational background, certifications, or higher qualifications.

How we'll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

Posted 1 month ago

Apply

2 - 5 years

5 - 10 Lacs

Chennai

Work from Office


Req ID: 320304

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Lead Developer to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).

Lead .NET Developer - Remote

Who We Are
NTT DATA America strives to hire exceptional, innovative and passionate individuals who want to grow with us. Launch by NTT DATA is the culmination of the company's strategy to acquire and integrate the skills, experience, and technology of leading digital companies, backed by NTT DATA's core capabilities, global reach, and depth.

How You'll Help Us
A Lead Application Developer is first and foremost a software developer who specializes in .NET C# development. You'll be part of a team focused on delivering quality software for our clients.

How We Will Help You
Joining our Microsoft practice is not only a job, but a chance to grow your career. We will make sure to equip you with the skills you need to produce robust applications that you can be proud of. Whether it is providing you with training on a new programming language or helping you get certified in a new technology, we will help you grow your skills so you can continue to deliver increasingly valuable work.

Once You Are Here, You Will
The Lead Application Developer provides leadership in full systems life-cycle management (e.g., analysis, technical requirements, design, coding, testing, implementation of systems and applications software, etc.) to ensure delivery is on time and within budget. You will direct component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements and ensure compliance. This position develops and leads AD project activities and integrations. The Lead Applications Developer guides teams to ensure effective communication and achievement of objectives, in addition to researching and supporting the integration of emerging technologies. This position provides knowledge and support for applications' development, integration, and maintenance. The Lead Applications Developer will lead junior team members in project-related activities and tasks. Additionally, you will guide and influence the department and project teams, and facilitate collaboration with stakeholders.

Basic Qualifications
- 5+ years of experience with Angular/ExtJS
- 8+ years of experience developing .NET applications in C#
- 8+ years of experience designing and developing RESTful web services leveraging microservice design and implementation patterns
- 8+ years of experience with SQL Server
- 8+ years of experience with PL/SQL scripting
- 8+ years of experience with DB reporting leveraging tools such as SSIS
- 3+ years of experience as a tech lead, having mentored and coached less senior resources
- 3+ years of experience with software architectural design

Preferred
- Experience with GCP in web services
- Experience with GCP Big Data
- Experience with GCP BigQuery
- Experience with Power BI

Ideal Mindset
- Lifelong Learner: You are always seeking to improve your technical and nontechnical skills.
- Team Player: You are someone who wants to see everyone on the team succeed and is willing to go the extra mile to help a teammate in need.
- Communicator: You know how to communicate your design ideas to both technical and nontechnical stakeholders, prioritizing critical information and leaving out extraneous details.

Please note the shift timing requirement: 1:30 pm IST - 10:30 pm IST. #Launchjobs #LaunchEngineering

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Job Segment: .NET, Application Developer, Testing, Developer, Information Technology, Technology

Posted 1 month ago

Apply

2 - 5 years

6 - 10 Lacs

Hyderabad

Work from Office


Req ID: 319692

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Business Intelligence Specialist to join our team in Hyderabad, Telangana (IN-TG), India (IN).

Senior Developer
Mandatory skills: GCP, BigQuery, Linux shell scripting, SQL Server
Desired skills: ETL/ELT, Python, Agile, MS Office

JD: Senior developers with solid Linux shell scripting and SQL experience, with knowledge of ETL/ELT application design.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Job Segment: Business Intelligence, Database, SQL, Linux, Consulting, Technology

Posted 1 month ago

Apply

5 - 10 years

9 - 19 Lacs

Bengaluru

Work from Office


Key Responsibilities: Design, develop, and optimize interactive dashboards using Looker and LookML. Work with BigQuery to create efficient data models and queries for visualization. Develop LookML models, explores, and derived tables to support business intelligence needs. Optimize dashboard performance by implementing best practices in data aggregation and visualization. Collaborate with data engineers, analysts, and business teams to understand requirements and translate them into actionable insights. Implement security and governance policies within Looker to ensure data integrity and controlled access. Leverage Google Cloud Platform (GCP) services to build scalable and reliable data solutions. Maintain documentation and provide training to stakeholders on using Looker dashboards effectively. Troubleshoot and resolve issues related to dashboard performance, data accuracy, and visualization constraints. Maintain and optimize existing Looker dashboards and reports to ensure continuity and alignment with business KPIs. Understand, audit, and enhance existing LookML models to ensure data integrity and performance. Build new dashboards and data visualizations based on business requirements and stakeholder input. Collaborate with data engineers to define and validate data pipelines required for dashboard development and ensure the timely availability of clean, structured data. Document existing and new Looker assets and processes to support knowledge transfer, scalability, and maintenance. Support the transition/handover process by acquiring detailed knowledge of legacy implementations and ensuring a smooth takeover. Required Skills & Experience: 5-8 years of experience in data visualization and business intelligence using Looker and LookML. Strong proficiency in writing and optimizing SQL queries, especially for BigQuery. Experience in Google Cloud Platform (GCP), particularly with BigQuery and related data services. 
Solid understanding of data modeling, ETL processes, and database structures. Familiarity with data governance, security, and access controls in Looker. Strong analytical skills and the ability to translate business requirements into technical solutions. Excellent communication and collaboration skills. Expertise in Looker and LookML, including Explore creation, Views, and derived tables. Strong SQL skills for data exploration, transformation, and validation. Experience in BI solution lifecycle management (build, test, deploy, maintain). Excellent documentation and stakeholder communication skills for handovers and ongoing alignment. Strong data visualization and storytelling abilities, focusing on user-centric design and clarity. Preferred Qualifications: Experience working in the media industry (OTT, DTH, Web) and handling large-scale media datasets. Knowledge of other BI tools like Tableau, Power BI, or Data Studio is a plus. Experience with Python or scripting languages for automation and data processing. Understanding of machine learning or predictive analytics is an advantage.

Posted 1 month ago

Apply

3 - 7 years

20 - 25 Lacs

Pune

Remote


1. Extract and transform data from Google BigQuery and other relevant data sources. 2. Utilize Python and libraries such as Pandas and NumPy to manipulate, clean, and analyze large datasets. 3. Develop and implement Python scripts to automate data extraction, processing, and analysis for comparison reports. 4. Design and execute queries in BigQuery to retrieve specific data sets required for comparison analysis.
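The comparison-report work described above can be sketched in plain Python. This is an illustrative example only: in practice the row sets would be fetched from BigQuery (e.g. via the google-cloud-bigquery client), and the `id`/`amount` fields here are hypothetical.

```python
# Illustrative sketch: compare two result sets (e.g. rows fetched from
# BigQuery) and produce a field-level difference report.
# Field names ("id", "amount") are hypothetical.

def compare_rows(source_rows, target_rows, key="id"):
    """Return per-key field differences between two row sets."""
    src = {r[key]: r for r in source_rows}
    tgt = {r[key]: r for r in target_rows}
    report = {"missing_in_target": sorted(src.keys() - tgt.keys()),
              "missing_in_source": sorted(tgt.keys() - src.keys()),
              "mismatches": {}}
    for k in src.keys() & tgt.keys():
        # Collect fields present in both rows whose values differ.
        diffs = {f: (src[k][f], tgt[k][f])
                 for f in src[k].keys() & tgt[k].keys()
                 if src[k][f] != tgt[k][f]}
        if diffs:
            report["mismatches"][k] = diffs
    return report

a = [{"id": 1, "amount": 100}, {"id": 2, "amount": 250}]
b = [{"id": 1, "amount": 100}, {"id": 3, "amount": 75}]
print(compare_rows(a, b))
```

The same key-based diff generalizes to DataFrames (e.g. `pandas.merge` with an indicator column) once the datasets grow beyond what plain dicts handle comfortably.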

Posted 1 month ago

Apply

10 - 15 years

0 - 0 Lacs

Chennai

Work from Office


About the Role As a Senior Data Engineer you’ll be a core part of our engineering team. You will bring your valuable experience and knowledge, improving the technical quality of our data-focused products. This is a key role in helping us become more mature, deliver innovative new products and unlock further business growth. This role will be part of a newly formed team that will collaborate alongside data team members based in Ireland, USA and India. Following the successful delivery of some fantastic products in 2024, we have embarked upon a data-driven strategy in 2025. We have a huge amount of data and are keen to accelerate unlocking its value to delight our customers and colleagues. You will be tasked with delivering new data pipelines, actionable insights in automated ways and enabling innovative new product features. Reporting to our Team Lead, you will be collaborating with the engineering and business teams. You’ll work across all our brands, helping to shape their future direction. Working as part of a team, you will help shape the technical design of our platforms and solve complicated problems in elegant ways that are robust, scalable, and secure. We don’t get everything right first time, but you will help us reflect, adjust and be better next time around. We are looking for people who are inquisitive, confident exploring unfamiliar problems, and have a passion for learning. We don’t have all the answers and don’t expect you to know everything either. Our team culture is open, inclusive, and collaborative – we tackle goals together. Seeking the best solution to a problem, we actively welcome ideas and opinions from everyone in the team. Our Technologies We are continuously evolving our products and exploring new opportunities. We are focused on selecting the right technologies to solve the problem at hand. We know the technologies we’ll be using in 3 years’ time will probably be quite different to what we’re using today. 
You’ll be a key contributor to evolving our tech stack over time. Our data pipelines are currently based upon Google BigQuery, Fivetran and dbt Cloud. These involve advanced SQL alongside Python in a variety of areas. We don’t need you to be an expert with these technologies, but it will help if you’re strong with something similar. Your Skills and Experience This is an important role for us as we scale up the team and we are looking for someone who has existing experience at this level. You will have worked with data-driven platforms that involve some kind of transaction, such as eCommerce, trading platforms or advertising lead generation. Your broad experience and knowledge of data engineering methods mean you’re able to build high quality products regardless of the language used – solutions that avoid common pitfalls impacting the platform’s technical performance. You can apply automated approaches for tracking and measuring quality throughout the whole lifecycle, through to the production environments. You are comfortable working with complex and varied problems. As a strong communicator, you work well with product owners and business stakeholders. You’re able to influence and persuade others by listening to their views, explaining your own thoughts, and working to achieve agreement. We have many automotive industry experts within our team already and they are eager to teach you everything you need to know for this role. Any existing industry knowledge is a bonus but is not necessary. This is a full-time role based in our India office on a semi-flexible basis. Our engineering team is globally distributed but we’d like you to be accessible to the office for ad-hoc meetings and workshops.

Posted 1 month ago

Apply

4 - 8 years

0 - 0 Lacs

Chennai

Work from Office


Who we are looking for: Monk is leading the way in AI-driven innovation with its advanced damage detection technology for vehicles, enabling seamless integration into web and native applications via APIs. Our machine learning expertise is trusted by global leaders like our parent company ACV Auctions (USA), CAT Logistics (Europe), Getaround (France), Autobiz (Europe), and Hgreg (Canada). We are looking for a Machine Learning Scientist in Computer Vision who thrives on taking ownership, delivering impactful results, and driving innovation. If you're someone who can align with business needs, ensure timely delivery, and elevate the team's capabilities, we’d love to have you on board. This role is your opportunity to lead game-changing projects, create tangible value, and shape the future of AI technology in the automotive space. What you will do: Own the end-to-end development of machine learning models and datasets to solve critical business challenges and meet product needs. Drive innovation by continuously tracking and integrating state-of-the-art advancements in computer vision and deep learning into Monk’s AI solutions. Identify and solve key business problems with data-driven, actionable insights. Deliver high-impact results through meticulous planning, rigorous testing, and smooth deployment of machine learning solutions. Mentor and support junior team members, cultivating a culture of excellence and collaboration. Collaborate with product, business, and technical teams to align on priorities, estimate efforts, and adapt to feedback. Ensure long-term reliability and scalability through robust documentation and testing practices. 
Tech Stack Utilized: Languages: Python, SQL Libraries/Frameworks: PyTorch ecosystem, a touch of OpenCV and Scikit-learn Tools: Jupyter, DBT, Docker Infrastructure: GCP, Kubernetes, BigQuery Version Control: GitHub Background and Skills: Required Skills 5+ years of experience in computer vision, data science or related fields, with a proven track record of delivering impactful results. Strong foundation in machine learning, computer science, statistical modeling, and data processing pipelines. Proficiency in Python and SQL for data manipulation and model development. Solid experience deploying machine learning models into production environments. A proactive approach to aligning with business needs and driving team-wide innovation. Strong communication skills to explain technical concepts to non-technical stakeholders. Fluent in English, enabling effective collaboration across global teams. Desired Background Master’s or Ph.D. in Data Science, Computer Science, or a related field. Experience in B2B SaaS, automotive technology, or AI-driven startups. Knowledge of working with distributed teams across time zones. A proactive mindset, with the ability to work autonomously in fast-paced environments.
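As a flavour of the evaluation work a detection role like this involves, below is a minimal intersection-over-union (IoU) computation for comparing predicted and ground-truth bounding boxes, a standard metric in object-detection tasks. It is a generic sketch, not tied to Monk's actual pipeline.

```python
# Intersection-over-Union (IoU) between two axis-aligned boxes, a standard
# metric when evaluating detection models. Boxes are (x1, y1, x2, y2).
# Illustrative only.

def iou(box_a, box_b):
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Width/height of the overlap rectangle (zero if the boxes are disjoint).
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = ((ax2 - ax1) * (ay2 - ay1)
             + (bx2 - bx1) * (by2 - by1) - inter)
    return inter / union if union > 0 else 0.0

print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # partial overlap -> 25/175
```

In practice the same computation is done in batch on tensors (e.g. torchvision's `box_iou`), but the scalar version above is what a detection threshold like "IoU > 0.5 counts as a match" refers to.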

Posted 1 month ago

Apply

5 - 8 years

7 - 17 Lacs

Bengaluru

Work from Office


EXP: 5+ yrs Data Visualization, Data Modelling, GCP, BigQuery, SQL, Looker Dashboards

Posted 1 month ago

Apply

6 - 11 years

0 - 1 Lacs

Pune, Chennai, Bengaluru

Work from Office


Role: GCP Data Engineer Experience: 6-12 yrs Location: Chennai, Hyderabad, Bangalore, Pune, Gurgaon Required Skillset =>Should have experience in BigQuery, Dataflow, Cloud SQL, Cloud Composer =>Should have experience in Python, Vertex AI and Dataflow Interested candidates can send resume to jegadheeswari.m@spstaffing.in or reach me @9566720836

Posted 1 month ago

Apply

11 - 21 years

25 - 40 Lacs

Hyderabad, Pune, Bengaluru

Hybrid


Warm Greetings from SP Staffing Services Private Limited!! We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested. Relevant Experience: 3 - 20 Yrs Location- Pan India Job Description : - Skills: GCP, BigQuery, Cloud Composer, Cloud Data Fusion, Python, SQL 5-20 years of overall experience mainly in the data engineering space, 2+ years of hands-on experience in GCP cloud data implementation, Experience of working in client facing roles in technical capacity as an Architect. Must have implementation experience of GCP based cloud data projects/programs as a solution architect, Proficiency in using the Google Cloud Architecture Framework in a data context. Expert knowledge and experience of core GCP data stack including BigQuery, Dataproc, Dataflow, Cloud Composer etc. Exposure to overall Google tech stack of Looker/Vertex-AI/Dataplex etc. Expert level knowledge of Spark. Extensive hands-on experience working with data using SQL and Python. Strong experience and understanding of very large-scale data architecture, solutioning, and operationalization of data warehouses, data lakes, and analytics platforms (both Cloud and On-Premise). Excellent communication skills with the ability to clearly present ideas, concepts, and solutions. If interested please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in or you can reach me @ 8939853050 With Regards, Sankar G Sr. Executive - IT Recruitment

Posted 1 month ago

Apply

15 - 21 years

15 - 19 Lacs

Bengaluru

Work from Office


Overview We are seeking a highly skilled Senior Architect with extensive experience in leading Hadoop to BigQuery migration projects. As a key member of our team, you will be responsible for designing and implementing scalable solutions that leverage Google Cloud Platform (GCP) technologies, specifically focusing on BigQuery. Your expertise in Big Data technologies, Python scripting, and database migration will be crucial in ensuring the successful transition from Hadoop to BigQuery Responsibilities Key Responsibilities: Lead the architecture and design phases of the Hadoop to BigQuery migration project. Develop strategies and solutions for efficient data ingestion, processing, and storage on GCP. Collaborate with cross-functional teams to ensure alignment with business requirements and technical specifications. Implement best practices for data governance, security, and performance optimization. Provide technical guidance and mentorship to junior team members. Mandatory Skills: GCP Data Engineer Certification Extensive experience with Big Data technologies and frameworks (e.g., Hadoop, BigQuery). Proficiency in Python for scripting and automation. Hands-on experience with Apache Airflow for workflow orchestration. Proven track record in database migration projects, particularly from Hadoop to BigQuery. Optional Skills: Experience with Scala, PySpark, and Spark SQL. Proficiency in Java programming. Familiarity with Informatica or similar ETL tools. Requirements A Bachelor's degree in any field, with a desired MSc/BE/Masters with 15+ years of experience in software development and architecture roles. Strong analytical and problem-solving skills. Excellent communication and interpersonal skills. Experience working in multi-channel delivery projects

Posted 1 month ago

Apply

5 - 10 years

9 - 13 Lacs

Chennai

Work from Office


Overview GCP all services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) + Teradata Responsibilities GCP all services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) + Teradata Requirements GCP all services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) + Teradata

Posted 1 month ago

Apply

8 - 13 years

13 - 23 Lacs

Jaipur

Hybrid


Your skills and experience We are looking for talents with a Degree (or equivalent) in Engineering, Mathematics, Statistics, Sciences from an accredited college or university (or equivalent) to develop analytical solutions for our stakeholders to support strategic decision making. Any professional certification in Advanced Analytics, Data Visualisation and Data Science related domains is a plus. You have a natural curiosity for numbers and have strong quantitative & logical thinking skills. You ensure results are of high data quality and accuracy. You have working experience on Google Cloud and have worked with cross-functional teams to enable data source and process migration to GCP, and you have working experience with SQL. You are adaptable to emerging technologies like leveraging Machine Learning and AI to drive innovation. Procurement experience (useful, though not essential) across vendor management, sourcing, risk, contracts and purchasing, preferably within a global and complex environment. You have the aptitude to understand stakeholders' requirements, identify relevant data sources, integrate data, perform analysis and interpret the results by identifying trends and patterns. You enjoy the problem-solving process, think out of the box and break down a problem into its constituent parts with a view to developing an end-to-end solution. You display enthusiasm to work in the data analytics domain and strive for continuous learning and improvement of your technical and soft skills. You demonstrate working knowledge of different analytical tools like Tableau, Databases, Alteryx, Pentaho, Looker, BigQuery in order to work with large datasets and derive insights for decision making. You enjoy working in a team and your language skills in English are convincing, making it easy for you to work in an international environment and with global, virtual teams

Posted 1 month ago

Apply

2 - 6 years

7 - 11 Lacs

Hyderabad

Work from Office


As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include: Comprehensive Feature Development and Issue Resolution: working on end-to-end feature development and solving challenges faced in the implementation. Stakeholder Collaboration and Issue Resolution: collaborating with key stakeholders, internal and external, to understand problems and issues with the product and features, and solving them as per defined SLAs. Continuous Learning and Technology Integration: being eager to learn new technologies and implementing them in feature development. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc, Cloud Functions. Cloud data engineers with GCP PDE certification and working experience with GCP. Building end-to-end data pipelines in GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc, Cloud Functions. Experience in logging and monitoring of GCP services and experience in Terraform and infrastructure automation. Expertise in the Python coding language. Develops data engineering solutions on the Google Cloud ecosystem and supports and maintains data engineering solutions on the Google Cloud ecosystem. Preferred technical and professional experience Stay updated with the latest trends and advancements in cloud technologies, frameworks, and tools. Conduct code reviews and provide constructive feedback to maintain code quality and ensure adherence to best practices. Troubleshoot and debug issues, and deploy applications to the cloud platform
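As a small illustration of the Pub/Sub work such pipelines involve: push deliveries wrap the publisher's payload as base64-encoded data inside a JSON envelope, so a handler (e.g. on Cloud Run or Cloud Functions) typically decodes it before loading rows into BigQuery. A minimal stdlib sketch, with a hypothetical event shape:

```python
import base64
import json

# Decode a Pub/Sub push delivery, as a Cloud Run / Cloud Functions handler
# would before writing rows to BigQuery. Pub/Sub delivers the publisher's
# bytes base64-encoded inside a JSON envelope.

def decode_push_envelope(envelope: dict) -> dict:
    data = envelope["message"]["data"]            # base64-encoded payload
    payload = base64.b64decode(data).decode("utf-8")
    return json.loads(payload)                    # assumes JSON events

# Simulated envelope, shaped like a push-subscription delivery.
event = {"order_id": 42, "status": "shipped"}
envelope = {"message": {
    "data": base64.b64encode(json.dumps(event).encode()).decode(),
    "messageId": "12345"}}
print(decode_push_envelope(envelope))
```

The `order_id`/`status` fields are invented for the example; the envelope structure (`message.data` as base64) is the part that carries over to real handlers.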

Posted 1 month ago

Apply

2 - 5 years

4 - 8 Lacs

Pune

Work from Office


About The Role Process Manager - GCP Data Engineer Mumbai/Pune | Full-time (FT) | Technology Services Shift Timings - EMEA (1pm-9pm) | Management Level - PM | Travel Requirements - NA The ideal candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. The role enables to identify discrepancies and propose optimal solutions by using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, the candidate must be self-directed, proactive, and seize every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors. Process Manager Roles and responsibilities: Participate in stakeholder interviews, workshops, and design reviews to define data models, pipelines, and workflows. Analyse business problems and propose data-driven solutions that meet stakeholder objectives. Experience of working on premise as well as on cloud platforms (AWS/GCP/Azure). Should have extensive experience in GCP with a strong focus on BigQuery, and will be responsible for designing, developing, and maintaining robust data solutions to support analytics and business intelligence needs. (GCP is preferable over AWS & Azure) Design and implement robust data models to efficiently store, organize, and access data for diverse use cases. Design and build robust data pipelines (Informatica / Fivetran / Matillion / Talend) for ingesting, transforming, and integrating data from diverse sources. 
Implement data processing pipelines using various technologies, including cloud platforms, big data tools, and streaming frameworks (optional). Develop and implement data quality checks and monitoring systems to ensure data accuracy and consistency. Technical and Functional Skills: Bachelor's Degree with 5+ years of experience, with 3+ years of relevant hands-on experience in GCP with BigQuery. Good knowledge of any one database scripting platform (Oracle preferable). Work would involve analysis, development of code/pipelines at a modular level, reviewing peers' code, performing unit testing and owning push-to-prod activities. With 5+ years of work experience and having worked as an individual contributor for 5+ years. Direct interaction and deep diving with VPs of deployment. Should work with cross-functional teams/stakeholders. Participate in backlog grooming and prioritizing tasks. Worked on Scrum Methodology. GCP certification desired. About eClerx eClerx is a global leader in productized services, bringing together people, technology and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience. About eClerx Technology eClerx's Technology Group collaboratively delivers Analytics, RPA, AI, and Machine Learning digital technologies that enable our consultants to help businesses thrive in a connected world. 
Our consultants and specialists partner with our global clients and colleagues to build and implement digital solutions through a broad spectrum of activities. To know more about us, visit https://eclerx.com eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law

Posted 1 month ago

Apply

2 - 5 years

4 - 8 Lacs

Mumbai

Work from Office


About The Role The ideal candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. The role enables to identify discrepancies and propose optimal solutions by using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, the candidate must be self-directed, proactive, and seize every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors. Process Manager Role and responsibilities: Set up user journey dashboards across customer touchpoints (web and mobile app) in Adobe Analytics, GA4 and Amplitude and identify pain points; familiarity with auditing tags using Omnibug, GA debugger and other relevant tools. Understanding and familiarity with cross-device analyses using combined reporting suites, virtual report suites, and familiarity with people metrics and data warehouse. Building complex segments in analytics tools by going through online user journeys, self-serving tag audits to build segments. Analysis of customer journeys and recommending personalization tests on digital properties using Adobe Analytics, GA4, Amplitude or any equivalent tool; walking through analysis outcomes and coming up with ideas to optimize the digital user experience. Website and mobile app optimization consulting for client accounts across industries (customer journey analyses and personalization). 
Familiarity with website measurement strategy, identifying key KPIs and defining goals, integrating online and offline data and segmentation strategies. Connect with clients for business requirements, walk through analysis outcomes and come up with ideas for optimization of the digital properties. Build analytical reports and dashboards using visualization tools like Looker Studio or Power BI. Technical and Functional Skills: Bachelor's Degree with overall experience of 6-10 years in digital analytics and optimization (Adobe Analytics, GA4, AppsFlyer and Amplitude). Specialism - Adobe Analytics or GA4 and app analytics tools like Amplitude, AppsFlyer; Visualization tools Expert - Looker Studio or Power BI. Certification in Adobe Analytics Business Practitioner preferred. Ability to drive business inference from quantitative and qualitative datasets. Ability to collaborate with stakeholders across the globe. Strong communication, creative and innovation skills to help develop offerings to meet market needs.
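The user-journey analyses this role describes often reduce to funnel conversion between touchpoints. Below is a rough stdlib sketch of a drop-off report; the step names and events are hypothetical, and tools like Adobe Analytics, GA4 or Amplitude compute this natively.

```python
# Funnel drop-off report over user-journey events: for each step, count
# distinct users reaching it and the conversion rate from the prior step.
# Step names and the events list are invented for illustration.

FUNNEL = ["view_product", "add_to_cart", "checkout", "purchase"]

def funnel_report(events):
    """events: iterable of (user_id, step). Returns (step, users, rate) rows."""
    reached = {step: set() for step in FUNNEL}
    for user, step in events:
        if step in reached:
            reached[step].add(user)
    report = []
    for prev, step in zip([None] + FUNNEL, FUNNEL):
        users = len(reached[step])
        base = len(reached[prev]) if prev else users  # first step vs itself
        rate = users / base if base else 0.0
        report.append((step, users, round(rate, 2)))
    return report

events = [("u1", "view_product"), ("u2", "view_product"), ("u1", "add_to_cart"),
          ("u1", "checkout"), ("u1", "purchase"), ("u3", "view_product")]
for row in funnel_report(events):
    print(row)
```

Here the add-to-cart step converts 1 of 3 product viewers, which is exactly the kind of pain point a personalization test would then target.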

Posted 1 month ago

Apply

7 - 12 years

0 - 1 Lacs

Bengaluru

Work from Office


Job Description- GCP Data Engineer TECH STACK GCP data integration and resource management BigQuery Dataform Python (good to have) Terraform (nice to have) DevOps (is a plus) Experience working with Analytics products is a plus Experience working with Security and Compliance teams and/or on DevOps Qualifications: Formal qualifications in computer science, software engineering, or any engineering equivalent Minimum 9+ years (Expert SWE) or 7 years (Senior SWE) of professional experience as a software engineer with a similar level of experience in the specific tech stack for the area Minimum 5 years (Expert SWE) or 3 years (Senior SWE) of experience working in agile/iterative software development teams with a DevOps working setup and with an emphasis on self-organisation and delivery to agreed commitments Demonstrable experience with cloud computing environments Excellent written and verbal English communication skills

Posted 1 month ago

Apply

8 - 13 years

10 - 15 Lacs

Jaipur, Rajasthan

Work from Office


Job Summary Auriga is looking for a Data Engineer to design and maintain cloud-native data pipelines supporting real-time analytics and machine learning. You'll work with cross-functional teams to build scalable, secure data solutions using GCP (BigQuery, Looker), SQL, Python, and orchestration tools like Dagster and DBT. Mentoring junior engineers and ensuring data best practices will also be part of your role. WHAT YOU'LL DO: Design, build, and maintain scalable data pipelines and architectures to support analytical and operational workloads. Develop and optimize ETL/ELT pipelines, ensuring efficient data extraction, transformation, and loading from various sources. Work closely with backend and platform engineers to integrate data pipelines into cloud-native applications. Manage and optimize cloud data warehouses, primarily BigQuery, ensuring performance, scalability, and cost efficiency. Implement data governance, security, and privacy best practices, ensuring compliance with company policies and regulations. Collaborate with analytics teams to define data models and enable self-service reporting and BI capabilities. Develop and maintain data documentation, including data dictionaries, lineage tracking, and metadata management. Monitor, troubleshoot, and optimize data pipelines, ensuring high availability and reliability. Stay up to date with emerging data engineering technologies and best practices, continuously improving our data infrastructure. WHAT WE'RE LOOKING FOR: Strong proficiency in English (written and verbal communication) is required. Experience working with remote teams across North America and Latin America, ensuring smooth collaboration across time zones. 5+ years of experience in data engineering, with expertise in building scalable data pipelines and cloud-native data architectures. Strong proficiency in SQL for data modeling, transformation, and performance optimization. 
Experience with BI and data visualization tools (e.g., Looker, Tableau, or Google Data Studio). Expertise in Python for data processing, automation, and pipeline development. Experience with cloud data platforms, particularly Google Cloud Platform (GCP). Hands-on experience with Google BigQuery, Cloud Storage, and Pub/Sub. Strong knowledge of ETL/ELT frameworks such as DBT, Dataflow, or Apache Beam. Familiarity with workflow orchestration tools like Dagster, Apache Airflow or Google Cloud Workflows. Understanding of data privacy, security, and compliance best practices. Strong problem-solving skills, with the ability to debug and optimize complex data workflows. Excellent communication and collaboration skills. NICE TO HAVES: Experience with real-time data streaming solutions (e.g., Kafka, Pub/Sub, or Kinesis). Familiarity with machine learning workflows and MLOps best practices. Knowledge of Terraform for Infrastructure as Code (IaC) in data environments. Familiarity with data integrations involving Contentful, Algolia, Segment, and .
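The data-quality work mentioned above can be as simple as batch-level assertions, the kind of validation a DBT test or a Dagster asset check would schedule. A sketch with hypothetical column names and thresholds:

```python
# Batch-level data-quality checks: null-rate on required columns and
# duplicate primary keys. Column names and the 10% threshold are
# illustrative, not tied to any particular pipeline.

def quality_report(rows, key="id", required=("id", "email"), max_null_rate=0.1):
    n = len(rows)
    issues = []
    for col in required:
        nulls = sum(1 for r in rows if r.get(col) is None)
        if n and nulls / n > max_null_rate:
            issues.append(f"{col}: null rate {nulls / n:.0%} exceeds {max_null_rate:.0%}")
    keys = [r.get(key) for r in rows]
    dupes = len(keys) - len(set(keys))
    if dupes:
        issues.append(f"{key}: {dupes} duplicate value(s)")
    return issues  # empty list means the batch passed

batch = [{"id": 1, "email": "a@x.com"}, {"id": 1, "email": None},
         {"id": 2, "email": None}]
print(quality_report(batch))
```

In an orchestrated pipeline the same checks would run as a gate between ingestion and publication, failing the run (or routing rows to quarantine) instead of printing.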

Posted 1 month ago

Apply

8 - 12 years

20 - 25 Lacs

Gandhinagar

Remote


Requirement : 8+ years of professional experience as a data engineer and 2+ years of professional experience as a senior data engineer Must have strong working experience in Python and its various data analysis packages Pandas / NumPy Must have strong understanding of prevalent cloud ecosystems and experience in one of the cloud platforms AWS / Azure / GCP. Must have strong working experience in one of the leading MPP databases Snowflake / Amazon Redshift / Azure Synapse / Google BigQuery Must have strong working experience in one of the leading data orchestration tools in cloud – Azure Data Factory / AWS Glue / Apache Airflow Must have experience working with Agile methodologies, Test Driven Development, and implementing CI/CD pipelines using one of the leading services – GitLab / Azure DevOps / Jenkins / AWS CodePipeline / Google Cloud Build Must have Data Governance / Data Management / Data Quality project implementation experience Must have experience in big data processing using Spark Must have strong experience with SQL databases (SQL Server, Oracle, Postgres etc.) 
Must have stakeholder management experience and very good communication skills Must have working experience on end-to-end project delivery including requirement gathering, design, development, testing, deployment, and warranty support Must have working experience with various testing levels, such as unit testing, integration testing and system testing Working experience with large, heterogeneous datasets in building and optimizing data pipelines and pipeline architectures Nice to have Skills : Working experience in Databricks notebooks and managing Databricks clusters Experience in a data modelling tool such as Erwin or ER Studio Experience in one of the data architectures, such as Data Mesh or Data Fabric Has handled real-time or near real-time data Experience in one of the leading reporting & analysis tools, such as Power BI, Qlik, Tableau or Amazon QuickSight Working experience with API integration General insurance / banking / finance domain understanding
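As an illustration of the warehouse-style SQL these roles call for, here is a portable aggregate query; Python's stdlib sqlite3 stands in for an MPP engine like Snowflake, Redshift, Synapse or BigQuery, and the `orders` table is hypothetical.

```python
import sqlite3

# A typical warehouse aggregate: revenue and order counts per region.
# sqlite3 is used only so the example is self-contained; the SQL itself
# is the portable part.

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, region TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'APAC', 120.0), (2, 'APAC', 80.0), (3, 'EMEA', 250.0);
""")
rows = conn.execute("""
    SELECT region,
           COUNT(*)    AS order_count,
           SUM(amount) AS revenue
    FROM orders
    GROUP BY region
    ORDER BY revenue DESC
""").fetchall()
print(rows)  # [('EMEA', 1, 250.0), ('APAC', 2, 200.0)]
```

On a real MPP engine the interesting part is not the syntax but cost and layout: partitioning/clustering the table so that `GROUP BY` scans only the data it needs.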

Posted 1 month ago

Apply