
779 BigQuery Jobs - Page 19

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 - 13.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Data Analyst

Location: Bangalore
Experience: 8 - 15 Yrs
Type: Full-time

Role Overview
We are seeking a skilled Data Analyst to support our platform powering operational intelligence across airports and similar sectors. The ideal candidate will have experience working with time-series datasets and operational information to uncover trends, anomalies, and actionable insights. This role works closely with data engineers, ML teams, and domain experts to turn raw data into meaningful intelligence for business and operations stakeholders.

Key Responsibilities
- Analyze time-series and sensor data from various sources (see the anomaly-detection sketch after this posting).
- Develop and maintain dashboards, reports, and visualizations to communicate key metrics and trends.
- Correlate data from multiple systems (vision, weather, flight schedules, etc.) to provide holistic insights.
- Collaborate with AI/ML teams to support model validation and interpret AI-driven alerts (e.g., anomalies, intrusion detection).
- Prepare and clean datasets for analysis and modeling; ensure data quality and consistency.
- Work with stakeholders to understand reporting needs and deliver business-oriented outputs.

Qualifications & Required Skills
- Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Engineering, or a related field.
- 5+ years of experience in a data analyst role, ideally in a technical/industrial domain.
- Strong SQL skills and proficiency with BI/reporting tools (e.g., Power BI, Tableau, Grafana).
- Hands-on experience analyzing structured and semi-structured data (JSON, CSV, time series).
- Proficiency in Python or R for data manipulation and exploratory analysis.
- Understanding of time-series databases or streaming data (e.g., InfluxDB, Kafka, Kinesis).
- Solid grasp of statistical analysis and anomaly detection methods.
- Experience working with data from industrial systems or large-scale physical infrastructure.

Good-to-Have Skills
- Domain experience in airports, smart infrastructure, transportation, or logistics.
- Familiarity with data platforms (Snowflake, BigQuery, or custom-built on open source).
- Exposure to tools like Airflow, Jupyter Notebooks, and data quality frameworks.
- Basic understanding of AI/ML workflows and data preparation requirements.
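As a concrete illustration of the time-series anomaly-detection work this posting describes, here is a minimal sketch in Python, assuming hourly sensor readings in a pandas Series; the window size and threshold are illustrative defaults, not values from the posting:

```python
import numpy as np
import pandas as pd

def flag_anomalies(readings: pd.Series, window: int = 48, threshold: float = 3.0) -> pd.Series:
    """Flag points whose rolling z-score exceeds the threshold.

    `window` and `threshold` are illustrative, not from the posting.
    """
    rolling_mean = readings.rolling(window, min_periods=window).mean()
    rolling_std = readings.rolling(window, min_periods=window).std()
    z_scores = (readings - rolling_mean) / rolling_std
    return z_scores.abs() > threshold  # NaN warm-up rows compare as False

# Example: hourly sensor data with one injected spike
idx = pd.date_range("2024-01-01", periods=500, freq="h")
values = pd.Series(np.random.default_rng(0).normal(20.0, 1.0, 500), index=idx)
values.iloc[300] += 15  # simulated anomaly
print(values[flag_anomalies(values)])
```

A rolling z-score is a deliberately simple baseline; a production system of the kind described here would layer seasonality handling and model-driven detection on top of it.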

Posted 3 weeks ago

Apply

8.0 - 13.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Data Engineer

Location: Bangalore - Onsite
Experience: 8 - 15 years
Type: Full-time

Role Overview
We are seeking an experienced Data Engineer to build and maintain scalable, high-performance data pipelines and infrastructure for our next-generation data platform. The platform ingests and processes real-time and historical data from diverse industrial sources such as airport systems, sensors, cameras, and APIs. You will work closely with AI/ML engineers, data scientists, and DevOps to enable reliable analytics, forecasting, and anomaly detection use cases.

Key Responsibilities
- Design and implement real-time (Kafka, Spark/Flink) and batch (Airflow, Spark) pipelines for high-throughput data ingestion, processing, and transformation (a minimal Airflow sketch follows this posting).
- Develop data models and manage data lakes and warehouses (Delta Lake, Iceberg, etc.) to support both analytical and ML workloads.
- Integrate data from diverse sources: IoT sensors, databases (SQL/NoSQL), REST APIs, and flat files.
- Ensure pipeline scalability, observability, and data quality through monitoring, alerting, validation, and lineage tracking.
- Collaborate with AI/ML teams to provision clean, ML-ready datasets for training and inference.
- Deploy, optimize, and manage pipelines and data infrastructure across on-premise and hybrid environments.
- Participate in architectural decisions to ensure resilient, cost-effective, and secure data flows.
- Contribute to infrastructure-as-code and automation for data deployment using Terraform, Ansible, or similar tools.

Qualifications & Required Skills
- Bachelor's or Master's in Computer Science, Engineering, or a related field.
- 6+ years in data engineering roles, with at least 2 years handling real-time or streaming pipelines.
- Strong programming skills in Python/Java and SQL.
- Experience with Apache Kafka, Apache Spark, or Apache Flink for real-time and batch processing.
- Hands-on experience with Airflow, dbt, or other orchestration tools.
- Familiarity with data modeling (OLAP/OLTP), schema evolution, and format handling (Parquet, Avro, ORC).
- Experience with hybrid/on-prem and cloud platform (AWS/GCP/Azure) deployments.
- Proficient in working with data lakes/warehouses like Snowflake, BigQuery, Redshift, or Delta Lake.
- Knowledge of DevOps practices, Docker/Kubernetes, and Terraform or Ansible.
- Exposure to data observability, data cataloging, and quality tools (e.g., Great Expectations, OpenMetadata).

Good-to-Have
- Experience with time-series databases (e.g., InfluxDB, TimescaleDB) and sensor data.
- Prior experience in domains such as aviation, manufacturing, or logistics is a plus.
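For the batch side of the pipelines described above, a minimal hedged sketch of an Airflow DAG (assuming Airflow 2.4+; the DAG name and task bodies are placeholders, not details from the posting):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull from a source system (API, database, sensor feed)
    ...

def transform():
    # Placeholder: clean and reshape into the warehouse schema
    ...

def load():
    # Placeholder: write to the lake/warehouse (e.g., Delta Lake, BigQuery)
    ...

with DAG(
    dag_id="daily_sensor_batch",       # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_transform >> t_load
```

In practice each callable would hand data off via external storage rather than in-process; the linear `extract >> transform >> load` chain is simply the smallest useful dependency shape.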

Posted 3 weeks ago

Apply

2.0 - 5.0 years

9 - 13 Lacs

Coimbatore

Work from Office

Overview
Role: Data Support - Sr. Analyst
Skills: SQL, AWS, Meta Ads, Google Ads, or TikTok Ads
Location: Bangalore, Hyderabad, Chennai & Coimbatore
Work Model: Hybrid

Annalect is continuously evolving its Technology Operations function, and as part of this expansion we are seeking a motivated and dynamic individual to fill the Senior Analyst role. In this position, you will provide technical support for multiple applications, both developed by our internal Annalect Technology team and integrated into Annalect's ecosystem as SaaS and PaaS solutions. This role is essential for the continued daily support demands of our global user community and is critical to the overall success of the organization in deploying and supporting its technical stack across the company.

Responsibilities
- Maintain and document Technology/Application Support processes and Service Level Agreements.
- Manage user access and onboarding for platforms, including Active Directory (AD) and cloud-based tools.
- Develop, implement, and manage IT solutions to improve visibility, automate workflows, and optimize IT operations.
- Work closely with onshore/offshore/cross-functional teams, providing ongoing support for the Annalect technology and business teams using existing or to-be-built tools and applications.
- Demonstrate a strong understanding of ad platforms such as Google Ads, Meta, TikTok, Amazon DSP, DV360, The Trade Desk, etc.
- Apply strong QA skills to compare key advertising metrics (clicks, impressions, cost, etc.) between the platform and the destination data (see the sketch after this posting).
- Document, implement, and manage statistics to ensure that the AOS team operates at a highly efficient and effective level.
- Monitor and handle incident response for the infrastructure, platforms, and core engineering services.
- Troubleshoot infrastructure, network, and application issues; help identify and resolve problems within the environment.
- Work both independently and as a productive member of a team.
- Lead team projects and activities using project management methodology. It is expected that this position will require 40% process/procedure and 60% technology skills.
- Be available 24x7 for occasional technology/application-related issues.
- Support robots encountering issues by taking on tickets and performing root cause analysis.
- Maintain and update documentation whenever changes are made to the robots.
- Be knowledgeable about and able to support RPA infrastructure and architecture.

Qualifications / Required Skills
- 5-7 years of relevant and progressive experience as a Technology Operations Analyst or in a similar role.
- Self-motivated and action-driven, with the ability to take initiative, execute, and follow through.
- Ability to clearly articulate technical and functional information to various audiences, both verbally and in writing.
- High degree of organizational skill and ability to reprioritize based on business needs.
- Excellent written and verbal communication skills.
- Strong understanding of ad platform ecosystems, including campaign management, Ad Manager and Business Manager, tracking methodologies, data ingestion, and reporting workflows.
- Knowledge of ad operations, audience targeting, and attribution models.
- Proficient in Excel, with demonstrated ability to organize and consolidate multiple data sources for analysis.
- Critical thinking and problem-solving skills for technical and software-related issues.
- Experience with a single sign-on platform, including user and application setup/support.
- Good understanding of methodologies such as DevOps, CI/CD (Continuous Integration, Continuous Delivery), Agile/Kanban, and AWS.
- Good working knowledge of Microsoft tools (Office, SharePoint), CRM tools (JIRA, HubSpot), and reporting tools (Power BI, Tableau, etc.).
- Proficiency in SQL, Google BigQuery, and Starburst for querying and analyzing large datasets.
- Strong understanding of APIs and API troubleshooting.
- Exposure to Generative AI models (e.g., OpenAI, GPT-based models).
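As a sketch of the metric-comparison QA described above, the following assumes the google-cloud-bigquery client and two hypothetical daily tables (one platform export, one warehouse destination); it surfaces campaigns whose clicks, impressions, or cost diverge beyond a 1% tolerance:

```python
import datetime

from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical dataset/table names; substitute your project's own.
sql = """
SELECT
  p.campaign_id,
  p.clicks      AS platform_clicks,
  d.clicks      AS destination_clicks,
  p.impressions AS platform_impressions,
  d.impressions AS destination_impressions,
  SAFE_DIVIDE(ABS(p.cost - d.cost), p.cost) AS cost_pct_diff
FROM `my_project.ads.platform_daily` AS p
JOIN `my_project.ads.warehouse_daily` AS d
  USING (campaign_id, report_date)
WHERE report_date = @report_date
  AND (p.clicks != d.clicks
       OR p.impressions != d.impressions
       OR SAFE_DIVIDE(ABS(p.cost - d.cost), p.cost) > 0.01)
"""
job = client.query(
    sql,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter(
                "report_date", "DATE", datetime.date(2024, 1, 31)
            )
        ]
    ),
)
for row in job:
    print(dict(row))  # each row is a mismatched campaign/day to investigate
```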

Posted 3 weeks ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Hyderabad

Work from Office

Key Responsibilities:
1. Cloud Infrastructure Management:
- Design, deploy, and manage scalable and secure infrastructure on Google Cloud Platform (GCP).
- Implement best practices for GCP IAM, VPCs, Cloud Storage, ClickHouse and Apache Superset tool onboarding, and other GCP services.
2. Kubernetes and Containerization:
- Manage and optimize Google Kubernetes Engine (GKE) clusters for containerized applications.
- Implement Kubernetes best practices, including pod scaling, resource allocation, and security policies.
3. CI/CD Pipelines:
- Build and maintain CI/CD pipelines using tools like Cloud Build, Stratus, GitLab CI/CD, or ArgoCD.
- Automate deployment workflows for containerized and serverless applications.
4. Security and Compliance:
- Ensure adherence to security best practices for GCP, including IAM policies, network security, and data encryption.
- Conduct regular audits to ensure compliance with organizational and regulatory standards.
5. Collaboration and Support:
- Work closely with development teams to containerize applications and ensure smooth deployment on GCP.
- Provide support for troubleshooting and resolving infrastructure-related issues.
6. Cost Optimization:
- Monitor and optimize GCP resource usage to ensure cost efficiency (see the sketch after this posting).
- Implement strategies to reduce cloud spend without compromising performance.

Required Skills and Qualifications:
1. Certifications:
- Must hold a Google Cloud Professional DevOps Engineer or Google Cloud Professional Cloud Architect certification.
2. Cloud Expertise:
- Strong hands-on experience with Google Cloud Platform (GCP) services, including GKE, Cloud Functions, Cloud Storage, BigQuery, and Cloud Pub/Sub.
3. DevOps Tools:
- Proficiency in DevOps tools like Terraform, Ansible, Stratus, GitLab CI/CD, or Cloud Build.
- Experience with containerization tools like Docker.
4. Kubernetes Expertise:
- In-depth knowledge of Kubernetes concepts such as pods, deployments, services, ingress, config maps, and secrets.
- Familiarity with Kubernetes tools like kubectl, Helm, and Kustomize.
5. Programming and Scripting:
- Strong scripting skills in Python, Bash, or Go.
- Familiarity with YAML and JSON for configuration management.
6. Monitoring and Logging:
- Experience with monitoring tools like Prometheus, Grafana, or Google Cloud Operations Suite.
7. Networking:
- Understanding of cloud networking concepts, including VPCs, subnets, firewalls, and load balancers.
8. Soft Skills:
- Strong problem-solving and troubleshooting skills.
- Excellent communication and collaboration abilities.
- Ability to work in an agile, fast-paced environment.
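One common starting point for the cost-optimization work above is analyzing spend by service. A hedged sketch, assuming Cloud Billing export to BigQuery is enabled; the dataset and table names below are illustrative (billing export tables follow the pattern `gcp_billing_export_v1_<BILLING_ACCOUNT_ID>`):

```python
from google.cloud import bigquery

client = bigquery.Client()

# Top ten services by cost over the last 30 days.
sql = """
SELECT
  service.description AS service,
  ROUND(SUM(cost), 2) AS total_cost
FROM `my_project.billing.gcp_billing_export_v1_XXXXXX`
WHERE usage_start_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
GROUP BY service
ORDER BY total_cost DESC
LIMIT 10
"""
for row in client.query(sql):
    print(f"{row.service}: ${row.total_cost}")
```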

Posted 3 weeks ago

Apply

8.0 - 10.0 years

9 - 13 Lacs

Bengaluru

Work from Office

What you'll be doing:
- Assist in developing machine learning models based on project requirements.
- Work with datasets by preprocessing, selecting appropriate data representations, and ensuring data quality.
- Perform statistical analysis and fine-tuning using test results.
- Support training and retraining of ML systems as needed.
- Help build data pipelines for collecting and processing data efficiently.
- Follow coding and quality standards while developing AI/ML solutions.
- Contribute to frameworks that help operationalize AI models.

What we seek in you:
- 8+ years of experience in the IT industry.
- Strong in programming languages like Python.
- Hands-on experience with one cloud (GCP preferred).
- Experience working with Docker.
- Environment management (e.g., venv, pip, poetry).
- Experience with orchestrators like Vertex AI Pipelines, Airflow, etc.
- Understanding of the full ML lifecycle end-to-end.
- Data engineering and feature engineering techniques.
- Experience with ML modelling and evaluation metrics (see the sketch after this posting).
- Experience with TensorFlow, PyTorch, or another framework.
- Experience with model monitoring.
- Advanced SQL knowledge.
- Awareness of streaming concepts like windowing, late arrival, triggers, etc.
- Storage: Cloud SQL, Cloud Storage, Cloud Bigtable, BigQuery, Cloud Spanner, Cloud Datastore, vector databases.
- Ingest: Pub/Sub, Cloud Functions, App Engine, Kubernetes Engine, Kafka, microservices.
- Schedule: Cloud Composer, Airflow.
- Processing: Cloud Dataproc, Cloud Dataflow, Apache Spark, Apache Flink.
- CI/CD: Bitbucket + Jenkins / GitLab; infrastructure as code: Terraform.

Life at Next:
At our core, we're driven by the mission of tailoring growth for our customers by enabling them to transform their aspirations into tangible outcomes. We're dedicated to empowering them to shape their futures and achieve ambitious goals. To fulfil this commitment, we foster a culture defined by agility, innovation, and an unwavering commitment to progress. Our organizational framework is both streamlined and vibrant, characterized by a hands-on leadership style that prioritizes results and fosters growth.

Perks of working with us:
- Clear objectives to ensure alignment with our mission, fostering your meaningful contribution.
- Abundant opportunities for engagement with customers, product managers, and leadership.
- Guidance along progressive paths, with insightful feedback from managers through ongoing feedforward sessions.
- Robust connections within diverse communities of interest to cultivate and leverage.
- A mentor of your choosing to navigate your current endeavors and steer your future trajectory.
- Continuous learning and upskilling opportunities through Nexversity.
- Flexibility to explore various functions, develop new skills, and adapt to emerging technologies.
- A hybrid work model promoting work-life balance.
- Comprehensive family health insurance coverage, prioritizing the well-being of your loved ones.
- Accelerated career paths to actualize your professional aspirations.

Who we are:
We enable high-growth enterprises to build hyper-personalized solutions that transform their vision into reality. With a keen eye for detail, we apply creativity, embrace new technology, and harness the power of data and AI to co-create solutions tailor-made to meet our customers' unique needs. Join our passionate team and tailor your growth with us!
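Touching the "ML modelling and evaluation metrics" requirement above, a minimal hedged sketch using scikit-learn, with synthetic data standing in for a real feature set:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report, roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic data stands in for a real feature set.
X, y = make_classification(n_samples=5_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Evaluation: per-class precision/recall/F1 plus ranking quality (ROC AUC).
print(classification_report(y_test, model.predict(X_test)))
print("ROC AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```

Threshold-based metrics (precision/recall/F1) and threshold-free metrics (AUC) answer different questions, which is why an evaluation usually reports both.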

Posted 3 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Gurugram

Hybrid

We are looking for a highly skilled Engineer with solid experience building Big Data, GCP Cloud-based marketing ODL applications. The Engineer will play a crucial role in designing, implementing, and optimizing data solutions to support our organization's data-driven initiatives. This role requires expertise in data engineering, strong problem-solving abilities, and a collaborative mindset to work effectively with various stakeholders.

Technical Skills
1. Core Data Engineering Skills
- Proficiency with GCP's big data tools, including BigQuery (data warehousing and SQL analytics) and Dataproc (running Spark and Hadoop clusters).
- Expertise in building automated, scalable, and reliable pipelines using custom Python/Scala solutions or Cloud Data Functions (see the load sketch after this posting).
2. Programming and Scripting
- Strong coding skills in SQL and Java.
- Familiarity with APIs and SDKs for GCP services to build custom data solutions.
3. Cloud Infrastructure
- Understanding of GCP services such as Cloud Storage, Compute Engine, and Cloud Functions.
- Familiarity with Kubernetes (GKE) and containerization for deploying data pipelines (optional, but good to have).
4. DevOps and CI/CD
- Experience setting up CI/CD pipelines using Cloud Build, GitHub Actions, or other tools.
- Monitoring and logging tools like Cloud Monitoring and Cloud Logging for production workflows.

Soft Skills
1. Innovation and Problem-Solving
- Ability to think creatively and design innovative solutions for complex data challenges.
- Experience prototyping and experimenting with cutting-edge GCP tools or third-party integrations.
- Strong analytical mindset to transform raw data into actionable insights.
2. Collaboration
- Teamwork: ability to collaborate effectively with data analysts and business stakeholders.
- Communication: strong verbal and written communication skills to explain technical concepts to non-technical audiences.
3. Adaptability and Continuous Learning
- Open to exploring new GCP features and rapidly adapting to changes in cloud technology.
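As an illustration of the pipeline work described above, a minimal hedged sketch loading Parquet files from Cloud Storage into BigQuery with the google-cloud-bigquery client; the bucket, dataset, and table names are hypothetical:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Append Parquet files from Cloud Storage into a BigQuery table.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)
load_job = client.load_table_from_uri(
    "gs://my-bucket/marketing/events/*.parquet",   # hypothetical bucket/path
    "my_project.marketing_odl.events",             # hypothetical table
    job_config=job_config,
)
load_job.result()  # block until the load job completes

table = client.get_table("my_project.marketing_odl.events")
print(f"Loaded; table now has {table.num_rows} rows")
```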

Posted 3 weeks ago

Apply

7.0 - 12.0 years

25 - 37 Lacs

Hyderabad, Pune

Work from Office

GCP Data Engineer (BigQuery + SQL + ETL knowledge + Python, Dataflow, Pub/Sub, CI/CD). Contact: KASHIF@D2NSOLUTIONS.COM

Posted 3 weeks ago

Apply

4.0 - 9.0 years

10 - 15 Lacs

Gurugram

Hybrid

Department Overview
AutomotiveMastermind provides U.S. automotive dealers with AI/behavior-prediction analytics software and marketing solutions that improve the vehicle purchase process and results. The company's cloud-based technology helps dealers precisely predict automobile-buying behavior and automates the creation of micro-targeted customer communications, leading to proven higher sales and more consistent customer retention.

Responsibilities:
- Work closely with Product Management and Data Strategy leadership to understand short- and long-term roadmaps and the overall Data product strategy.
- Drive the backlog grooming agile sprint ceremony, acting as a bridge between business needs and technical implementation.
- Present on behalf of agile teams in sprint review, reiterating the business value delivered with each completed work increment.
- Develop expertise on the existing aM ecosystem of integrations and data available within the system.
- Collaborate with data analysts, data management, data science, and engineering teams to develop short- and long-term solutions that meet business needs and solve distinct problems.
- Apply deep, creative, rigorous thinking to solve broad, platform-wide technical and/or business problems.
- Identify key value drivers and key opportunities for, and sources of, error across products and processes.
- Develop short-term preventive or detective measures, and lead medium/long-term product improvement initiatives arrived at via close collaboration with engineering, QA, and data support.
- Coordinate with data engineers as appropriate to design and enable repeatable processes and generate deliverables that answer routine business questions.

What We're Looking For:
Basic Required Qualifications:
- Minimum 4 years of working experience as a Product Owner or Product Manager in an Agile scrum framework.
- Experience using data and analytical processes to drive decision making, with the ability to explain the analysis to an executive audience.
- Strong knowledge of the Agile development framework, with practical experience to support flexible application of its principles.
- Strong conceptual understanding of data integration technologies and standards.
- Working familiarity with road-mapping and issue-tracking software (Aha!, MS Azure DevOps, Salesforce).
- Familiarity with Microsoft Excel, SQL, BigQuery, MongoDB, and Postman preferred.
- An advocate for the importance of leveraging data, a supporter of the use of data analysis in decision-making, and a fierce promoter of data and engineering best practices throughout the organization. Passionate about empirical research.
- A team player who is comfortable working with a globally distributed team across time zones.
- A solid communicator, both with technology teams and with non-technical stakeholders.

Preferred:
- Experience with, or awareness of and interest in, dimensional data modeling concepts.
- B.Tech/M.Tech qualified.

Grade: 9
Location: Gurgaon
Hybrid Mode: twice a week work from office
Shift Time: 12 pm to 9 pm IST

Posted 3 weeks ago

Apply

4.0 - 9.0 years

10 - 17 Lacs

Ahmedabad

Work from Office

Job Summary / Key Responsibilities:
- Design, develop, and maintain interactive and user-friendly Power BI dashboards and reports.
- Translate business requirements into functional and technical specifications.
- Perform data modeling, DAX calculations, and Power Query transformations.
- Integrate data from multiple sources, including SQL Server, Excel, SharePoint, and APIs.
- Optimize Power BI datasets, reports, and dashboards for performance and usability.
- Collaborate with business analysts, data engineers, and stakeholders to ensure data accuracy and relevance.
- Ensure security and governance best practices in Power BI workspaces and datasets.
- Provide ongoing support and troubleshooting for existing Power BI solutions.
- Stay updated on Power BI releases, best practices, and industry trends.

Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Data Analytics, or a related field.
- 4+ years of professional experience in data analytics or business intelligence.
- 3+ years of hands-on experience with Power BI (Power BI Desktop, Power BI Service).
- Strong expertise in DAX, Power Query (M language), and data modeling (star/snowflake schema).
- Proficiency in writing complex SQL queries and optimizing them for performance.
- Experience working with large and complex datasets.
- Experience with BigQuery, MySQL, and Looker Studio is a plus.
- E-commerce industry experience is an added advantage.
- Solid understanding of data warehousing concepts and ETL processes.
- Experience with Power Apps & Power Automate would be a plus.

Preferred Qualifications:
- Microsoft Power BI Certification (PL-300 or equivalent) is a plus.
- Experience with Azure Data Services (Azure Data Factory, Azure SQL, Synapse).
- Knowledge of other BI tools (Tableau, Qlik) is a plus.
- Familiarity with scripting languages (Python, R) for data analysis is a bonus.
- Experience integrating Power BI into web portals using Power BI Embedded.

Posted 3 weeks ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Gurugram

Work from Office

About the Role:
Grade Level (for internal use): 09

The Team: Automotive Mastermind was founded on the idea that there are patterns in people's behavior that, with the right logic, can be used to predict future outcomes. Our software helps automotive dealerships and sales teams better understand and predict exactly which customers are ready to buy, the reasons why, and the key offers and incentives most likely to close the sale. Our culture is creative and entrepreneurial, where everyone contributes to company goals in a very real way. We are a hardworking group, but we have a lot of fun with what we do and are looking for new people with a similar mindset to join the organization.

The Impact: As a Quality Engineer you will collaborate with members of both Product and Development Teams to help them make informed decisions on releases of one of the best tools there is for car dealerships in the United States.

What's in it for you:
- The possibility to work on a project in a very interesting domain - the automotive industry in the United States - and influence the quality of one of the best tools there is for car dealerships.
- The chance to shape the processes and tools used for Quality Engineering. Our team has a high degree of autonomy within the automotiveMastermind organization to decide what tools and processes we use.

Responsibilities:
- Own and be responsible for testing and delivery of the product or core modules.
- Assess the quality, usability, and functionality of each release.
- Review software requirements and prepare test scenarios for complex business rules.
- Interact with stakeholders to understand detailed requirements and expectations.
- Gain technical knowledge and aim to be a quality SME in core functional components.
- Develop and organize QA processes for assigned projects to align with overall QA goals.
- Design and implement a test automation strategy supporting multiple product development teams.
- Lead efforts for related automation projects, design, and code reviews.
- Produce regular reports on the status and quality of software releases, and be prepared to speak to findings in an informative way to all levels of audience.

What We're Looking For:
- Participate in and improve the whole lifecycle of services, from inception and design through deployment, operation, and refinement.
- Participate in the release planning process to review functional specifications and create release plans.
- Collaborate with software engineers to design verification test plans.
- Design regression test suites and review them with engineering, applications, and the field organization.
- Produce regular reports on the status and quality of software releases.
- Assess the quality, usability, and functionality of each release.
- Develop and organize QA processes for assigned projects to align with overall QA goals.
- Lead and train a dynamically changing team of colleagues who participate in testing processes.
- Exhibit expertise in handling large-scale programs/projects that involve multiple stakeholders (Product, Dev, DevOps).
- Maintain a leading-edge understanding of QA best practices as they relate to interactive technologies.
- Design and implement a test automation strategy for multiple product development teams at the onset of the project.
- Work closely with leadership and IT to provide input into the design and implementation of the automation framework.
- Work with Architecture, Engineering, Quality Engineering, IT, and Product Operations leaders to create and implement processes that accelerate the delivery of new features and products with high quality and at scale.
- Develop and contribute to a culture of high performance, transparency, and continuous improvement as it relates to infrastructure services and streamlining of the development pipeline.
- Participate in a diverse team of talented engineers globally, providing guidance, support, and clear priorities.

Who you are:
- Total experience: 2 to 6 years.
- Hands-on experience with at least 2 leading testing tools/frameworks, such as Playwright, Robot Framework, K6, or JMeter.
- Hands-on experience working with Python.
- Experience with SQL/NoSQL databases.
- Experience working on cloud-native applications.
- Hands-on experience with Google Cloud services like Kubernetes, Composer, Dataplex, Pub/Sub, BigQuery, AlloyDB, Cloud SQL, Looker Studio, etc.
- Strong analytical skills and the ability to solve complex technical problems.
- API testing: must understand RESTful design and best practices; hands-on experience testing APIs and using test tools (see the pytest sketch after this posting).
- Experience with load/stress/performance testing and tools.
- Experience with Azure DevOps (or other similar issue/bug tracking systems) is required.
- Ability to think abstractly: conforming to the norm does not find bugs quickly.
- Experience working in an Agile software development organization.
- Experience supporting development and product teams.
- Excellent verbal, written, and interpersonal communication skills; ability to interact with all levels of an organization.
- Ability to work in an advisory capacity to identify key technical and business problems, then develop and evaluate solutions.

Grade: 08 / 09
Job Location: Gurugram
Hybrid Mode: twice a week work from office.
Shift Time: 12 pm to 9 pm IST.
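As a sketch of the API testing described above, a hedged pytest example using the requests library; the base URL, auth token, endpoint, and response shape are all hypothetical:

```python
import pytest
import requests

BASE_URL = "https://api.example.com"  # hypothetical service under test

@pytest.fixture
def session():
    s = requests.Session()
    s.headers.update({"Authorization": "Bearer TEST_TOKEN"})  # placeholder auth
    return s

def test_dealer_lookup_returns_expected_shape(session):
    resp = session.get(f"{BASE_URL}/v1/dealers/123")
    assert resp.status_code == 200
    body = resp.json()
    # Contract checks: required fields exist and have the right types.
    assert {"id", "name", "region"} <= body.keys()
    assert isinstance(body["id"], int)

def test_unknown_dealer_returns_404(session):
    resp = session.get(f"{BASE_URL}/v1/dealers/does-not-exist")
    assert resp.status_code == 404
```

Asserting on response shape rather than exact payloads keeps contract tests stable while still catching breaking API changes.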

Posted 3 weeks ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Hyderabad

Work from Office

Greetings from TechnoGen!

Thank you for taking the time to tell us about your competencies and skills, and for allowing us an opportunity to introduce TechnoGen; we believe your experience and expertise are relevant to a current opening with our client.

TechnoGen Brief Overview:
TechnoGen, Inc. is an ISO 9001:2015, ISO 20000-1:2011, ISO 27001:2013, and CMMI Level 3 Global IT Services Company headquartered in Chantilly, Virginia. TechnoGen, Inc. (TGI) is a Minority & Women-Owned Small Business with over 20 years of experience providing end-to-end IT Services and Solutions to the public and private sectors. TGI provides highly skilled and certified professionals and has successfully executed more than 345 projects. TechnoGen is committed to helping our clients solve complex problems and achieve their goals, on time and under budget.
LinkedIn: https://www.linkedin.com/company/technogeninc/about/

Job Title: Senior Data Engineer
Location: Hyderabad
Required Experience: 5+ years

Job Summary:
Seeking a Senior Data Engineer to design and optimize scalable data pipeline architectures and support analytics needs across cross-functional teams.

Key Responsibilities:
- Design, build, and maintain data pipelines (ETL/ELT) using BigQuery, Python, and SQL.
- Optimize data flow, automate processes, and scale infrastructure.
- Develop and manage workflows in Airflow/Cloud Composer and Ascend (or similar ETL tools).
- Implement data quality checks and testing strategies.
- Support CI/CD (DevSecOps) processes, conduct code reviews, and mentor junior engineers.
- Collaborate with QA/business teams and troubleshoot issues across environments.

Core Skills:
- BigQuery, Python, SQL, Collibra, Airflow/Cloud Composer, Ascend or similar ETL tools.
- Data integration, warehousing, and pipeline orchestration.
- Data quality frameworks and incremental load strategies (see the MERGE sketch after this posting).
- Strong experience with GCP or AWS serverless data warehouse environments.

Preferred Skills:
- dbt for transformation.
- Collibra for data governance.
- Working with unstructured datasets.

Qualifications:
- 5+ years in data engineering.
- Graduate degree in CS, Statistics, or a related field.
- Strong analytical and SQL expertise.

Best Regards,
Rampelli Kiran Kumar | Sr. IT Recruiter
kiran.r@technogenindia.com
www.technogenindia.com | Follow us on LinkedIn
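As a sketch of the incremental-load strategies listed above, assuming the google-cloud-bigquery client; the MERGE upserts a staging delta into a target table (all table and column names are illustrative):

```python
from google.cloud import bigquery

client = bigquery.Client()

# Incremental (upsert) load pattern: merge a staging delta into the target.
merge_sql = """
MERGE `my_project.dw.orders` AS target
USING `my_project.staging.orders_delta` AS source
ON target.order_id = source.order_id
WHEN MATCHED THEN
  UPDATE SET status = source.status, updated_at = source.updated_at
WHEN NOT MATCHED THEN
  INSERT (order_id, status, created_at, updated_at)
  VALUES (source.order_id, source.status, source.created_at, source.updated_at)
"""
client.query(merge_sql).result()  # wait for the DML job to finish
```

Compared with truncate-and-reload, a MERGE touches only changed rows, which is what makes incremental loads cheap on large tables.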

Posted 3 weeks ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Ahmedabad

Work from Office

Project Role: Program/Project Management Lead
Project Role Description: Lead business and technology outcomes for the assigned program, project, or contracted service. Leverage standard tools, methodologies, and processes to deliver, monitor, and control service level agreements.
Must-have skills: Google Cloud Platform Architecture
Good-to-have skills: Google Kubernetes Engine
Minimum 7.5 year(s) of experience is required
Educational Qualification: M Tech

Summary: As a Program/Project Management Lead, you will be responsible for leading business and technology outcomes for assigned programs, projects, or contracted services. Your typical day will involve leveraging standard tools, methodologies, and processes to deliver, monitor, and control service level agreements.

Roles & Responsibilities:
- Lead the planning and execution of programs and projects, ensuring adherence to timelines, budgets, and quality standards.
- Collaborate with cross-functional teams to identify and prioritize project requirements, risks, and dependencies.
- Develop and maintain project plans, status reports, and other project-related documentation.
- Manage project budgets, forecasts, and financial reporting, ensuring accurate and timely delivery of financial information.
- Provide leadership and guidance to project team members, ensuring effective communication and collaboration throughout the project lifecycle.

Professional & Technical Skills:
- Must-Have Skills: Expertise in Google Cloud Platform Architecture.
- Good-To-Have Skills: Experience with Google Kubernetes Engine.
- Strong understanding of program and project management methodologies and tools.
- Experience managing large-scale, complex programs and projects.
- Excellent communication, leadership, and stakeholder management skills.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Google Cloud Platform Architecture.
- The ideal candidate will possess a strong educational background in computer science, engineering, or a related field, along with a proven track record of delivering successful programs and projects.
- This position is based at our Bengaluru office.

Qualifications: M Tech

Posted 3 weeks ago

Apply

13.0 - 18.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Skill required: Tech for Operations - Agile Project Management
Designation: AI/ML Computational Science Manager
Qualifications: Any Graduation
Years of Experience: 13 to 18 years

What would you do?
You will be part of the Technology for Operations team that acts as a trusted advisor and partner to Accenture Operations. The team provides innovative and secure technologies to help clients build an intelligent operating model, driving exceptional results, and works closely with the sales, offering, and delivery teams to identify and build innovative solutions. Major sub-deals include AHO (Application Hosting Operations), ISMT (Infrastructure Management), and Intelligent Automation. Agile project management is an iterative, incremental method of managing the design and build activities of engineering, information technology, and other business areas that aims to provide new product or service development in a highly flexible and interactive manner. It requires individuals and interactions from the relevant business to respond to change, customer collaboration, and management openness to non-hierarchical forms of leadership.

What are we looking for?
- Results orientation
- Problem-solving skills
- Ability to perform under pressure
- Strong analytical skills
- Written and verbal communication

Roles and Responsibilities:
- Identify and assess complex problems for your area of responsibility.
- Create solutions in situations where analysis requires an in-depth evaluation of variable factors.
- Adhere to the strategic direction set by senior management when establishing near-term goals.
- Interact with senior management at a client and/or within Accenture on matters that may require acceptance of an alternate approach.
- Exercise some latitude in decision-making; act independently to determine methods and procedures on new assignments.
- Make decisions with a major day-to-day impact on the area of responsibility.
- Manage large to medium-sized teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
- Please note that this role may require you to work in rotational shifts.

Qualification: Any Graduation

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: Teradata BI
Minimum 5 year(s) of experience is required
Educational Qualification: minimum 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead application development projects.
- Conduct code reviews and ensure coding standards are met.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Google BigQuery.
- Strong understanding of data warehousing concepts.
- Experience with cloud-based data platforms.
- Hands-on experience in SQL and database management.
- Good-To-Have Skills: Experience with Teradata BI.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google BigQuery.
- This position is based at our Mumbai office.
- A minimum of 15 years full-time education is required.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Pune

Work from Office

Project Role: Cloud Services Engineer
Project Role Description: Act as liaison between the client and Accenture operations teams for support and escalations. Communicate service delivery health to all stakeholders and explain any performance issues or risks. Ensure Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime. Hold performance meetings to share performance and consumption data and trends.
Must-have skills: Managed File Transfer
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a Cloud Services Engineer, you will act as a liaison between the client and Accenture operations teams for support and escalations. You will communicate service delivery health to all stakeholders and explain any performance issues or risks, ensure the Cloud orchestration and automation capability operates within target SLAs with minimal downtime, and hold performance meetings to share performance and consumption data and trends.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Ensure effective communication between client and operations teams.
- Analyze service delivery health and address performance issues.
- Conduct performance meetings to share data and trends.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Managed File Transfer.
- Strong understanding of cloud orchestration and automation.
- Experience in SLA management and performance analysis.
- Knowledge of IT service delivery and escalation processes.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Managed File Transfer.
- This position is based at our Pune office.
- 15 years full-time education is required.

Posted 3 weeks ago

Apply

7.0 - 11.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Skill required: Delivery - Customer Insight & Marketing Analytics
Designation: I&F Decision Sci Practitioner Specialist
Qualifications: Any Graduation
Years of Experience: 7 to 11 years

What would you do?
Data & AI: the process by which data from customer behavior is used to help make key business decisions via market segmentation and predictive analytics. This information is used by businesses for direct marketing, site selection, and customer relationship management.

What are we looking for?
- Data analytics experience, with a specialization in the marketing domain.
- Ability and experience working with paid media, CRM, and digital advertising analytics.
- Knowledge of website clickstream data and GA4.
- Highly experienced with SQL, Python, and BigQuery for exploring large datasets.
- Data storytelling; familiarity with Tableau and Looker is a plus.
- Problem-solving skills.
- Ability to establish strong client relationships.
- Ability to manage multiple stakeholders.

Roles and Responsibilities:
- Analyze and solve moderately complex problems.
- Create new solutions, leveraging and, where needed, adapting existing methods and procedures.
- Understand the strategic direction set by senior management as it relates to team goals.
- Primary upward interaction is with your direct supervisor; you may interact with peers and/or management levels at a client and/or within Accenture.
- Guidance is provided when determining methods and procedures on new assignments.
- Decisions you make will often impact the team in which you reside.
- Manage small teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
- Please note that this role may require you to work in rotational shifts.

Qualification: Any Graduation

Posted 3 weeks ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Bengaluru

Work from Office

Skill required: Delivery - Marketing Analytics and Reporting
Designation: I&F Decision Sci Practitioner Sr Analyst
Qualifications: Any Graduation
Years of Experience: 5 to 8 years

What would you do?
Data & AI: analytical processes and technologies applied to marketing-related data to help businesses understand and deliver relevant experiences for their audiences, understand their competition, measure and optimize marketing campaigns, and optimize their return on investment.

What are we looking for?
Data analytics experience with a specialization in the marketing domain.

Domain-specific skills:
- Familiarity with ad tech and B2B sales.

Technical skills:
- Proficiency in SQL and Python.
- Experience in efficiently building, publishing, and maintaining robust data models and warehouses for self-serve querying and advanced data science and ML analytics.
- Experience conducting ETL/ELT with very large and complicated datasets and handling DAG data dependencies.
- Strong proficiency with SQL dialects on distributed or data-lake-style systems (Presto, BigQuery, Spark/Hive SQL, etc.), including SQL-based experience with nested data structure manipulation, windowing functions, query optimization, and data partitioning techniques (see the windowing sketch after this posting). Knowledge of Google BigQuery optimization is a plus.
- Experience in schema design and data modeling strategies (e.g., dimensional modeling, data vault, etc.).
- Significant experience with dbt (or similar tools) and Spark-based (or similar) data pipelines.
- General knowledge of Jinja templating in Python.
- Hands-on experience with cloud provider integration and automation via CLIs and APIs.

Soft skills:
- Ability to work well in a team.
- Agility for quick learning.
- Written and verbal communication.

Roles and Responsibilities:
- Analyze and solve increasingly complex problems.
- Your day-to-day interactions are with peers within Accenture; you are likely to have some interaction with clients and/or Accenture management.
- You will be given minimal instruction on daily work/tasks and a moderate level of instruction on new assignments.
- Decisions you make impact your own work and may impact the work of others.
- You would be an individual contributor and/or oversee a small work effort and/or team.
- Please note that this role may require you to work in rotational shifts.

Qualification: Any Graduation
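As a sketch of the windowing-function work listed above, assuming the google-cloud-bigquery client and an illustrative events table; the query keeps each user's latest event per day via ROW_NUMBER():

```python
from google.cloud import bigquery

client = bigquery.Client()

# Window-function pattern over event data: latest event per user per day.
# Dataset/table and column names are illustrative.
sql = """
SELECT user_id, event_date, event_name
FROM (
  SELECT
    user_id,
    DATE(event_timestamp) AS event_date,
    event_name,
    ROW_NUMBER() OVER (
      PARTITION BY user_id, DATE(event_timestamp)
      ORDER BY event_timestamp DESC
    ) AS rn
  FROM `my_project.analytics.events`
)
WHERE rn = 1
"""
df = client.query(sql).to_dataframe()  # requires pandas + db-dtypes installed
print(df.head())
```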

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Coimbatore

Work from Office

Project Role: Infrastructure Architect
Project Role Description: Lead the definition, design, and documentation of technical environments. Deploy solution architectures, conduct analysis of alternative architectures, create architectural standards, define processes to ensure conformance with standards, institute solution-testing criteria, define a solution's cost of ownership, and promote a clear and consistent business vision through technical architectures.
Must-have skills: Microsoft 365 Security & Compliance
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a Cloud Services Engineer in the Security Delivery job family group, you will be responsible for ensuring the Cloud orchestration and automation capability operates within target SLAs with minimal downtime. Your typical day will involve acting as a liaison between the client and Accenture operations teams for support and escalations, communicating service delivery health to all stakeholders, and holding performance meetings to share performance and consumption data and trends.

Roles & Responsibilities:
- Act as a liaison between the client and Accenture operations teams for support and escalations.
- Communicate service delivery health to all stakeholders and explain any performance issues or risks.
- Ensure the Cloud orchestration and automation capability operates within target SLAs with minimal downtime.
- Hold performance meetings to share performance and consumption data and trends.

Professional & Technical Skills:
- Must-Have Skills: Experience in Microsoft 365 Security & Compliance - Defender for O365, Defender for Identity, Defender for Endpoints, Defender for Cloud Apps, Defender for Cloud, Microsoft Purview, DLP, eDiscovery, Microsoft Priva, Microsoft Sentinel.
- Good-To-Have Skills: Experience in Cloud orchestration and automation.
- Strong understanding of Cloud technologies and security principles.
- Experience in managing and monitoring Cloud infrastructure.
- Experience in incident management and problem resolution.

Additional Information:
- The candidate should have a minimum of 6 years of experience in Microsoft 365 Security & Compliance.

Qualifications: 15 years full-time education

Posted 3 weeks ago

Apply

5.0 - 10.0 years

12 - 17 Lacs

Hyderabad, Pune

Hybrid

Role 1 - GCP Data Engineer. Must-have/mandatory skills: GCP, BigQuery, Dataflow, Cloud Composer
Role 2 - Big Data Engineer. Must-have/mandatory skills: Big Data, PySpark, Scala, Python
Role 3 - GCP DevOps Engineer. Must-have/mandatory skills: GCP DevOps

Experience Range: 5+ years
Location: Pune & Hyderabad only. If you are applying from outside Pune or Hyderabad, you will have to relocate to Pune or Hyderabad.
Work Mode: a minimum of 2 days of work from home is mandatory.
Salary: 12-16 LPA

Points to remember:
- Please fill in the Candidate Summary Sheet.
- Notice periods longer than 30 days will not be considered.

Highlights of this role:
- It's a long-term role.
- High possibility of conversion within 6 months, or after 6 months, if you perform well.
- Interview: two rounds in total (both virtual), but one face-to-face meeting is mandatory at any of these locations: Pune/Hyderabad/Bangalore/Chennai. Otherwise we cannot onboard you.
- UAN verification will be done in the background check; any overlap in past employment will eliminate you.
- Continuous PF deduction over the last 4 years is mandatory.

Client Company: one of the leading technology consulting firms.
Payroll Company: one of the leading IT services and staffing companies, with a presence in India, UK, Europe, Australia, New Zealand, US, Canada, Singapore, Indonesia, and the Middle East.

Do not change the subject line or create a new email while applying for this position; please reply on this email thread only.

Role 1 - GCP Data Engineer
Must-have/mandatory skills: GCP, BigQuery, Dataflow, Cloud Composer
About the Role: We are seeking a highly skilled and passionate GCP Data Engineer to join our growing data team. In this role, you will be instrumental in designing, building, and maintaining scalable and robust data pipelines and solutions on Google Cloud Platform (GCP). You will work closely with data scientists, analysts, and other stakeholders to translate business requirements into efficient data architectures, enabling data-driven decision-making across the organization.
Qualifications:
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related quantitative field.
- Minimum 5+ years of experience (e.g., 3-8 years) as a Data Engineer, with a strong focus on Google Cloud Platform (GCP).
- Mandatory hands-on experience with core GCP data services: BigQuery (advanced SQL, data modeling, query optimization); Dataflow (Apache Beam, Python/Java SDK - a minimal Beam sketch follows this posting); Cloud Composer / Apache Airflow for workflow orchestration; Cloud Storage (GCS); Cloud Pub/Sub for messaging/streaming.
- Strong programming skills in Python (preferred) or Java/Scala for data manipulation and pipeline development.
- Proficiency in SQL and experience with relational and NoSQL databases.
- Experience with data warehousing concepts, ETL/ELT processes, and data modeling techniques.
- Understanding of distributed systems and big data technologies (e.g., Spark, Hadoop concepts, Kafka).
- Familiarity with CI/CD practices and tools.

Role 2 - Big Data Engineer
Must-have/mandatory skills: Big Data, PySpark, Scala, Python
About the Role: We are looking for an experienced and passionate Big Data Engineer to join our dynamic team. In this role, you will be responsible for designing, building, and maintaining scalable, high-performance data processing systems and pipelines capable of handling vast volumes of structured and unstructured data. You will play a crucial role in enabling our data scientists, analysts, and business teams to derive actionable insights from complex datasets.
Qualifications:
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- Minimum 5 years of proven experience as a Big Data Engineer or in a similar role.
- Extensive hands-on experience with Apache Spark (PySpark, Scala) for data processing.
- Strong expertise in the Hadoop ecosystem (HDFS, Hive, MapReduce).
- Proficiency in Python and/or Scala/Java.
- Solid SQL skills and experience with relational databases.
- Experience designing and building complex ETL/ELT pipelines.
- Familiarity with data warehousing concepts and data modeling techniques (star schema, snowflake, data vault).
- Understanding of distributed computing principles.
- Excellent problem-solving, analytical, and communication skills.

Role 3 - GCP DevOps Engineer
Must-have/mandatory skills: GCP DevOps
We are seeking a highly motivated and experienced GCP DevOps Engineer to join our innovative engineering team. You will be responsible for designing, implementing, and maintaining robust, scalable, and secure cloud infrastructure and automation pipelines on Google Cloud Platform (GCP). This role involves working closely with development, operations, and QA teams to streamline the software delivery lifecycle, enhance system reliability, and promote a culture of continuous improvement.
Qualifications:
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- 5 years of experience in a DevOps or SRE role, with significant hands-on experience on Google Cloud Platform (GCP).
- Strong expertise in core GCP services relevant to DevOps: Compute Engine, GKE, Cloud SQL, Cloud Storage, VPC, Cloud Load Balancing, IAM.
- Proficiency with Infrastructure as Code (IaC) tools, especially Terraform.
- Extensive experience designing and implementing CI/CD pipelines using tools like Cloud Build, Jenkins, or GitLab CI.
- Hands-on experience with containerization (Docker) and container orchestration (Kubernetes/GKE).
- Strong scripting skills in Python and Bash/Shell.
- Experience with monitoring and logging tools (Cloud Monitoring, Prometheus, Grafana, ELK stack).
- Solid understanding of networking concepts (TCP/IP, DNS, load balancers, VPNs) in a cloud environment.
- Familiarity with database concepts and experience managing cloud databases (e.g., Cloud SQL, Firestore).

*** Mandatory to share: Candidate Summary Sheet ***
Interested parties can share their resume at shant@harel-consulting.com along with the details below.

- Applying for which role (please mention the role name):
- Your name:
- Contact no.:
- Email ID:
- Do you have a valid passport:
- Total experience:

Role 1:
- Experience in GCP:
- Experience in BigQuery:
- Experience in Dataflow:
- Experience in Cloud Composer:
- Experience in Apache Airflow:
- Experience in Python OR Java OR Scala, and how much:

Role 2:
- Experience in Big Data:
- Experience in Hive:
- Experience in Python OR Java OR Scala, and how much:
- Experience in PySpark:

Role 3:
- Experience in GCP DevOps:
- Experience in Python:

General:
- Current CTC:
- Expected CTC:
- Notice period in your current company:
- Are you currently working or not:
- If not working, when did you leave your last company:
- Current location:
- Preferred location:
- It's a contract-to-hire role; are you OK with that:
- Highest qualification:
- Current employer (payroll company name):
- Previous employer (payroll company name):
- 2nd previous employer (payroll company name):
- 3rd previous employer (payroll company name):
- Are you holding any offer:
- Are you expecting any offer:
- Are you open to considering a contract-to-hire (C2H) role:
- PF deduction is happening in current company:
- PF deduction happened in 2nd-last employer:
- PF deduction happened in 3rd-last employer:
- Latest photo:

If you are working with a company whose employee strength is less than 2,000 employees, it is mandatory to share your UAN service history.

BR
Shantpriya
Harel Consulting
shant@harel-consulting.com
9582718094
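For the Dataflow/Apache Beam experience that Role 1 calls for, a minimal hedged sketch of a Beam pipeline; the file paths are illustrative, and it runs locally on the DirectRunner unless Dataflow options (runner, project, region) are supplied:

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Pass --runner=DataflowRunner plus project/region options to run on Dataflow;
# with no options this executes locally on the DirectRunner.
def run():
    options = PipelineOptions()
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://my-bucket/input/*.csv")
            | "Parse" >> beam.Map(lambda line: line.split(","))
            | "FilterValid" >> beam.Filter(lambda rec: len(rec) == 3)
            | "Format" >> beam.Map(lambda rec: ",".join(rec))
            | "Write" >> beam.io.WriteToText("gs://my-bucket/output/part")
        )

if __name__ == "__main__":
    run()
```

The same pipeline code serves both batch and Dataflow execution, which is the portability Beam is built around.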

Posted 3 weeks ago

Apply

2.0 - 4.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Overview
We are an integral part of Annalect Global and Omnicom Group, one of the largest media and advertising agency holding companies in the world. Omnicom's branded networks and numerous specialty firms provide advertising, strategic media planning and buying, digital and interactive marketing, direct and promotional marketing, public relations, and other specialty communications services. Our agency brands are consistently recognized as being among the world's creative best. Annalect India plays a key role for our group companies and global agencies by providing stellar products and services in the areas of Creative Services, Technology, Marketing Science (data & analytics), Market Research, Business Support Services, Media Services, and Consulting & Advisory Services. We currently have 4000+ awesome colleagues (in Annalect India) who are committed to solving our clients' pressing business issues. We are growing rapidly and looking for talented professionals like you to be part of this journey. Let us build this, together.

Responsibilities
- Collaborate internally between departments and act as a data facilitator to identify potentially erroneous data, then report and fix identified issues.
- Act as a data entry specialist while maintaining speed and accuracy in day-to-day operations.
- Provide support to internal members with the agency's Hyperlocal platform.
- Ensure the security, integrity, and data governance of all stored information.
- Possess and maintain awareness of best practices related to data acumen, business trends, and evolving technologies.
- Develop a strong understanding of internal and external data sources.
- Be a strong, honest, and proactive communicator, acting as a collaborative liaison between business and technology teams.
- Assist the Retail Tech Data team in regular data audits.
- Knowledge of AdTech, MarTech, CRM metrics, and related business concepts is a big plus.
- Understand best data practices, normalization, and data governance.
- Effectively and efficiently explain and understand the agency's basic data needs.

Qualifications
- B.A./B.S. degree or equivalent in Information Systems, Statistics, or a comparable field of study.
- Hands-on experience working with data, data integration technologies, and databases.
- Experience with data governance rules and models.
- Comfortable with new technologies and able to iterate quickly.
- Able to balance multiple concurrent projects.
- Experience with BigQuery, ETL pipelines, API requirements, and BI tools is a plus.
- Strong attention to detail and communication skills when validating data and reporting on data quality and integrity.

Posted 3 weeks ago

Apply

6.0 - 9.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Overview
We are looking for an experienced GCP BigQuery Lead to architect, develop, and optimize data solutions on Google Cloud Platform, with a strong focus on BigQuery. The role involves leading warehouse setup initiatives, collaborating with stakeholders, and ensuring scalable, secure, and high-performance data infrastructure.

Responsibilities
- Lead the design and implementation of data pipelines using BigQuery, Datorama, Dataflow, and other GCP services.
- Architect and optimize data models and schemas to support analytics and reporting use cases.
- Implement best practices for performance tuning, partitioning, and cost optimization in BigQuery (see the partitioning sketch after this posting).
- Collaborate with business stakeholders to translate requirements into scalable data solutions.
- Ensure data quality, governance, and security across all BigQuery data assets.
- Automate workflows using orchestration tools.
- Mentor junior resources and lead script reviews, documentation, and knowledge sharing.

Qualifications
- 6+ years of experience in data analytics, with 3+ years on GCP and BigQuery.
- Strong proficiency in SQL, with experience writing complex queries and optimizing performance.
- Hands-on experience with ETL/ELT tools and frameworks.
- Deep understanding of data warehousing, dimensional modeling, and data lake architectures.
- Good exposure to data governance, lineage, and metadata management.
- GCP Data Engineer certification is a plus.
- Experience with BI tools (e.g., Looker, Power BI).
- Good communication and team leadership skills.
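As a sketch of the partitioning and cost-optimization practices mentioned above, assuming the google-cloud-bigquery client; the table, columns, and expiration value are illustrative:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Partition-plus-cluster pattern for cost control: queries that filter on
# event_date scan only the matching partitions; clustering by user_id
# further prunes data within each partition.
ddl = """
CREATE TABLE IF NOT EXISTS `my_project.analytics.page_views`
(
  event_date DATE,
  user_id STRING,
  page STRING,
  latency_ms INT64
)
PARTITION BY event_date
CLUSTER BY user_id
OPTIONS (partition_expiration_days = 365)
"""
client.query(ddl).result()
```

Because BigQuery bills on-demand queries by bytes scanned, pruning partitions is usually the single biggest lever for cost tuning.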

Posted 3 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Hyderabad

Hybrid

Key Responsibilities:
1. Cloud Infrastructure Management:
Design, deploy, and manage scalable and secure infrastructure on Google Cloud Platform (GCP).
Implement best practices for GCP IAM, VPCs, Cloud Storage, and other GCP services, including onboarding of ClickHouse and Apache Superset.
2. Kubernetes and Containerization:
Manage and optimize Google Kubernetes Engine (GKE) clusters for containerized applications.
Implement Kubernetes best practices, including pod scaling, resource allocation, and security policies (see the sketch following this listing).
3. CI/CD Pipelines:
Build and maintain CI/CD pipelines using tools like Cloud Build, Stratus, GitLab CI/CD, or ArgoCD.
Automate deployment workflows for containerized and serverless applications.
4. Security and Compliance:
Ensure adherence to security best practices for GCP, including IAM policies, network security, and data encryption.
Conduct regular audits to ensure compliance with organizational and regulatory standards.
5. Collaboration and Support:
Work closely with development teams to containerize applications and ensure smooth deployment on GCP.
Provide support for troubleshooting and resolving infrastructure-related issues.
6. Cost Optimization:
Monitor and optimize GCP resource usage to ensure cost efficiency.
Implement strategies to reduce cloud spend without compromising performance.
Required Skills and Qualifications:
1. Certifications:
Must hold a Google Cloud Professional DevOps Engineer or Google Cloud Professional Cloud Architect certification.
2. Cloud Expertise:
Strong hands-on experience with Google Cloud Platform (GCP) services, including GKE, Cloud Functions, Cloud Storage, BigQuery, and Cloud Pub/Sub.
3. DevOps Tools:
Proficiency in DevOps tools like Terraform, Ansible, Stratus, GitLab CI/CD, or Cloud Build.
Experience with containerization tools like Docker.
4. Kubernetes Expertise:
In-depth knowledge of Kubernetes concepts such as pods, deployments, services, ingress, config maps, and secrets.
Familiarity with Kubernetes tools like kubectl, Helm, and Kustomize.
5. Programming and Scripting:
Strong scripting skills in Python, Bash, or Go.
Familiarity with YAML and JSON for configuration management.
6. Monitoring and Logging:
Experience with monitoring tools like Prometheus, Grafana, or Google Cloud Operations Suite.
7. Networking:
Understanding of cloud networking concepts, including VPCs, subnets, firewalls, and load balancers.
8. Soft Skills:
Strong problem-solving and troubleshooting skills.
Excellent communication and collaboration abilities.
Ability to work in an agile, fast-paced environment.
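As an illustration of the pod-scaling practices mentioned above, here is a minimal sketch that creates a CPU-based HorizontalPodAutoscaler with the official Kubernetes Python client. The deployment name, namespace, and replica bounds are hypothetical examples, not values from the listing.

# A minimal sketch: scale a Deployment between 2 and 10 replicas,
# targeting 70% average CPU utilization (autoscaling/v2 API).
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod

autoscaler = client.V2HorizontalPodAutoscaler(
    metadata=client.V1ObjectMeta(name="web-hpa", namespace="default"),
    spec=client.V2HorizontalPodAutoscalerSpec(
        scale_target_ref=client.V2CrossVersionObjectReference(
            api_version="apps/v1", kind="Deployment", name="web"),
        min_replicas=2,
        max_replicas=10,
        metrics=[client.V2MetricSpec(
            type="Resource",
            resource=client.V2ResourceMetricSource(
                name="cpu",
                target=client.V2MetricTarget(
                    type="Utilization", average_utilization=70)))],
    ),
)
client.AutoscalingV2Api().create_namespaced_horizontal_pod_autoscaler(
    namespace="default", body=autoscaler)

In practice the same object is usually kept as YAML in version control and applied through the CI/CD pipeline; the client call above just makes the structure explicit.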

Posted 3 weeks ago

Apply

5.0 - 7.0 years

6 - 10 Lacs

Chennai

Work from Office

Role Summary:
This role calls for experienced developers to create detailed member-level information supporting development of the MART application, which involves application (UX) development, integration with data sources, and ETL pipeline development.
Responsibilities:
Design, develop, and maintain the user interface (UI) of the MART application using R Shiny, JavaScript, and CSS, ensuring a seamless and intuitive user experience.
Develop and maintain efficient and scalable ETL pipelines using GCP Dataform and BigQuery to extract, transform, and load data from various on-premise (Oracle) and cloud-based sources, including leveraging bigrquery for accessing on-premise Oracle data.
Develop and implement data manipulation and transformation logic in R, creating a longitudinal data format with unique member identifiers (see the sketch following this listing).
Develop and implement comprehensive logging and monitoring using Splunk.
Collaborate with other developers, data scientists, and stakeholders to ensure the timely delivery of high-quality software.
Participate in all phases of the software development lifecycle, from requirements gathering and design to testing and deployment.
Contribute to the maintenance and improvement of existing application functionality.
Work within a Git-based version control system.
Manage data in a dedicated GCP project, adhering to best practices for cloud security and scalability.
Contribute to the creation of summary statistics for groups via the Population Assessment Tool (PAT).
Required Skillsets:
Strong proficiency in R Shiny for UI development.
Strong proficiency in JavaScript, CSS, and HTML for front-end development.
Proven experience in designing, developing, and maintaining ETL pipelines, preferably using GCP Dataform and BigQuery.
Experience with data manipulation and transformation in R, including creating longitudinal datasets.
Experience working with on-premise databases (Oracle), preferably using bigrquery for data access.
Experience with Git for version control.
Experience with Splunk for logging and monitoring.
Experience working with cloud platforms, specifically Google Cloud Platform (GCP).
Strong analytical and problem-solving skills.
Excellent communication and collaboration skills.
Good to Have Skillsets:
Experience with Tableau for dashboarding and data visualization.
Experience with advanced data visualization techniques.
Experience working in an Agile development environment.
Shift Requirement: 3 PM to 12 midnight
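As a language-agnostic illustration of the longitudinal reshaping described above, here is a minimal sketch in pandas; the role itself performs this transformation in R, and the member IDs, metrics, and values below are hypothetical examples.

# A minimal sketch: arrange per-visit records into a longitudinal layout
# keyed on a unique member identifier and ordered by time.
import pandas as pd

visits = pd.DataFrame({
    "member_id": ["M001", "M001", "M002"],
    "visit_date": ["2024-01-05", "2024-03-10", "2024-02-20"],
    "metric": ["hba1c", "hba1c", "bp_systolic"],
    "value": [6.8, 6.4, 128.0],
})

# One row per member/date/metric, sorted by member and time: each member's
# history can then be read off as an ordered series.
longitudinal = (
    visits.assign(visit_date=pd.to_datetime(visits["visit_date"]))
          .sort_values(["member_id", "visit_date"])
          .set_index(["member_id", "visit_date", "metric"]))
print(longitudinal)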

Posted 3 weeks ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office

A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.
In your role, you will be responsible for:
Working across multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. (see the sketch following this listing).
Applying Python and SQL work experience; being proactive and collaborative, with the ability to respond to critical situations.
Analysing data for functional business requirements and interfacing directly with customers.
Required education
Bachelor's Degree
Preferred education
Master's Degree
Required technical and professional expertise
5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
End-to-end functional knowledge of the data pipeline/transformation implementations the candidate has delivered, including the purpose/KPIs for which each data transformation was done.
Preferred technical and professional experience
Experience with AEM Core Technologies: OSGi Services, Apache Sling, Granite Framework, Java Content Repository API, Java 8+, Localization.
Familiarity with build tools such as Jenkins and Maven.
Knowledge of version control tools, especially Git.
Knowledge of patterns and good practices to design and develop quality, clean code.
Knowledge of HTML, CSS, JavaScript, and jQuery.
Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.
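As an illustration of the Pub/Sub work this role touches, here is a minimal sketch of publishing and synchronously pulling messages with the google-cloud-pubsub client; the project, topic, and subscription names are hypothetical examples.

# A minimal sketch: publish one message, then pull and acknowledge a batch.
from google.cloud import pubsub_v1

project_id = "my-project"  # assumed project ID
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(project_id, "events")

# publish() returns a future that resolves to the server-assigned message ID.
future = publisher.publish(topic_path, data=b'{"order_id": 42}')
print("Published message", future.result())

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(project_id, "events-sub")

# Pull a small batch synchronously, then acknowledge so it is not redelivered.
response = subscriber.pull(subscription=subscription_path, max_messages=10)
for msg in response.received_messages:
    print(msg.message.data)
subscriber.acknowledge(
    subscription=subscription_path,
    ack_ids=[m.ack_id for m in response.received_messages],
)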

Posted 3 weeks ago

Apply

3.0 - 8.0 years

9 - 13 Lacs

Gurugram

Work from Office

About the Role:
Grade Level (for internal use): 10
The Team
As a member of the Data Transformation team you will work on building ML-powered products and capabilities to power natural language understanding, data extraction, information retrieval, and data sourcing solutions for S&P Global Market Intelligence and our clients. You will spearhead development of production-ready AI products and pipelines while leading by example in a highly engaging work environment. You will work in a (truly) global team and be encouraged toward thoughtful risk-taking and self-initiative.
The Impact
The Data Transformation team has already delivered breakthrough products and significant business value over the last 3 years. In this role you will be developing our next generation of new products while enhancing existing ones, aiming at solving high-impact business problems.
What's in it for you
Be a part of a global company and build solutions at enterprise scale
Collaborate with a highly skilled and technically strong team
Contribute to solving high-complexity, high-impact problems
Key Responsibilities
Build production-ready data acquisition and transformation pipelines from ideation to deployment
Be a hands-on problem solver and developer, helping to extend and manage the data platforms
Apply best practices in data modeling and building ETL pipelines (streaming and batch) using cloud-native solutions (see the sketch following this listing)
What We're Looking For
3-5 years of professional software work experience
Expertise in Python and Apache Spark
OOP design patterns, test-driven development, and enterprise system design
Experience building data processing workflows and APIs using frameworks such as FastAPI, Flask, etc.
Proficiency in API integration; experience working with REST APIs and integrating external and internal data sources
SQL (any variant; bonus if it is a big data variant)
Linux OS (e.g., bash toolset and other utilities)
Version control system experience with Git, GitHub, or Azure DevOps
Problem-solving and debugging skills
Software craftsmanship, adherence to Agile principles, and taking pride in writing good code
Techniques to communicate change to non-technical people
Nice to have
Core Java 17+, preferably Java 21+, and associated toolchain
DevOps with a keen interest in automation
Apache Avro
Apache Kafka
Kubernetes
Cloud expertise (AWS and GCP preferably)
Other JVM-based languages - e.g., Kotlin, Scala
C# - in particular .NET Core
Data warehouses (e.g., Redshift, Snowflake, BigQuery)
What's In It For You
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology - the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence(R), pinpointing risks and opening possibilities. We Accelerate Progress.
Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.
Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards - small perks can make a big difference.
For more information on benefits by country visit https://spgbenefits.com/benefit-summaries
Global Hiring and Opportunity at S&P Global:
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.
If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf
20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority - Ratings - (Strategic Workforce Planning)
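As an illustration of the batch ETL work described above, here is a minimal PySpark sketch covering extract, transform, and load; the bucket paths, dataset, and column names are hypothetical examples, not details of S&P Global's pipelines.

# A minimal sketch of a batch ETL step with PySpark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("filings-etl").getOrCreate()

# Extract: read raw records landed as JSON (assumed path).
raw = spark.read.json("s3://raw-bucket/filings/2024/*.json")

# Transform: drop malformed rows, normalize types, derive a reporting date.
clean = (
    raw.where(F.col("company_id").isNotNull())
       .withColumn("filed_at", F.to_timestamp("filed_at"))
       .withColumn("report_date", F.to_date("filed_at"))
)

# Load: write partitioned Parquet for downstream analytics.
clean.write.mode("overwrite").partitionBy("report_date").parquet(
    "s3://curated-bucket/filings/")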

Posted 3 weeks ago

Apply