
1086 BigQuery Jobs - Page 16

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 5.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Collibra Data Quality & Observability
Good to have skills: Collibra Data Governance
Minimum experience: 7.5 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications function optimally. You will also engage in problem-solving activities, providing support and enhancements to existing applications while maintaining a focus on quality and efficiency.

Key Responsibilities:
- Configure and implement Collibra Data Quality (CDQ) rules, workflows, dashboards, and data quality scoring metrics.
- Collaborate with data stewards, data owners, and business analysts to define data quality KPIs and thresholds.
- Develop data profiling and rule-based monitoring using CDQ's native rule engine or integrations (e.g., with Informatica, Talend, or BigQuery).
- Build and maintain Data Quality Dashboards and Issue Management workflows within Collibra.
- Integrate CDQ with Collibra Data Intelligence Cloud for end-to-end governance visibility.
- Drive root cause analysis and remediation plans for data quality issues.
- Support metadata and lineage enrichment to improve data traceability.
- Document standards, rule logic, and DQ policies in the Collibra Catalog.
- Conduct user training and promote data quality best practices across teams.

Required Skills and Experience:
- 3+ years of experience in data quality, metadata management, or data governance.
- Hands-on experience with the Collibra Data Quality & Observability (CDQ) platform.
- Knowledge of Collibra Data Intelligence Cloud, including Catalog, Glossary, and Workflow Designer.
- Proficiency in SQL and understanding of data profiling techniques.
- Experience integrating CDQ with enterprise data sources (Snowflake, BigQuery, Databricks, etc.).
- Familiarity with data governance frameworks and data quality dimensions (accuracy, completeness, consistency, etc.).
- Excellent analytical, problem-solving, and communication skills.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Collibra Data Quality & Observability.
- This position is based in Mumbai.
- A 15 years full time education is required.
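To illustrate the data quality dimensions this listing names (completeness, thresholds, rule scoring), here is a minimal sketch in plain Python. This is not the Collibra CDQ API; the rule shape, column names, and thresholds are hypothetical.

```python
# Illustrative rule-based data quality checks over an in-memory dataset.
# NOT the Collibra CDQ API; rule names and thresholds are hypothetical.

def completeness(rows, column):
    """Fraction of rows with a non-null, non-empty value in `column`."""
    if not rows:
        return 0.0
    filled = sum(1 for r in rows if r.get(column) not in (None, ""))
    return filled / len(rows)

def evaluate_rules(rows, rules):
    """Score each (column -> threshold) rule; flag those below threshold."""
    results = {}
    for column, threshold in rules.items():
        score = completeness(rows, column)
        results[column] = {"score": round(score, 2), "passed": score >= threshold}
    return results

customers = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 3, "email": "c@example.com"},
]
print(evaluate_rules(customers, {"id": 1.0, "email": 0.9}))
```

In a real CDQ deployment these rules would typically be expressed in SQL against the source system; the scoring-against-threshold pattern is the same.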

Posted 3 weeks ago


3.0 - 8.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Google BigQuery
Good to have skills: NA
Minimum experience: 3 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly within the existing infrastructure. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application design and functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Google BigQuery.
- Good To Have Skills: Experience with data modeling and ETL processes.
- Strong understanding of SQL and database management.
- Familiarity with application development frameworks and methodologies.
- Experience in cloud computing environments and services.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google BigQuery.
- This position is based in Pune.
- A 15 years full time education is required.
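The "ETL processes" skill above can be sketched as three small functions. This is a toy in plain Python under made-up field names; a real pipeline would load into a warehouse such as BigQuery rather than a list.

```python
# A minimal extract-transform-load (ETL) sketch in plain Python.
# Field names are illustrative; the target here is an in-memory list.

def extract(source_rows):
    """Extract: read raw records (an in-memory stand-in for a source)."""
    return list(source_rows)

def transform(rows):
    """Transform: normalize city names and drop records missing an id."""
    out = []
    for r in rows:
        if r.get("id") is None:
            continue
        out.append({"id": r["id"], "city": r.get("city", "").strip().title()})
    return out

def load(rows, target):
    """Load: append transformed rows to the target store."""
    target.extend(rows)
    return len(rows)

warehouse = []
raw = [{"id": 1, "city": " pune "}, {"id": None, "city": "mumbai"}]
loaded = load(transform(extract(raw)), warehouse)
print(loaded, warehouse)  # 1 [{'id': 1, 'city': 'Pune'}]
```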

Posted 3 weeks ago


4.0 - 9.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Responsibilities:

Data Quality Implementation & Monitoring (Acceldata & DemandTools):
- Design, develop, and implement data quality rules and checks using Acceldata to monitor data accuracy, completeness, consistency, and timeliness.
- Configure and utilize Acceldata to profile data, identify anomalies, and establish data quality thresholds.
- Investigate and resolve data quality issues identified by Acceldata, working with relevant teams for remediation.
- Leverage DemandTools within our Salesforce environment to identify, merge, and prevent duplicate records across Leads, Contacts, and Accounts.
- Implement data standardization and cleansing processes within Salesforce using DemandTools.
- Develop and maintain data quality dashboards and reports using Acceldata to provide visibility into data health.

Data Onboarding & Integration Quality:
- Collaborate with engineers and platform teams to understand data sources and pipelines built using Fivetran or another ingestion tool.
- Validate data transformations within Fivetran to maintain data integrity and quality.
- Develop and execute test plans and test cases to validate the successful and accurate onboarding of data into our Snowflake environment.

Metadata Management & Data Governance:
- Work with the Atlan platform to understand and contribute to the establishment of a comprehensive data catalog.
- Assist in defining and implementing data governance policies and standards within Atlan.
- Validate the accuracy and completeness of metadata within Atlan to ensure data discoverability and understanding.
- Collaborate on data lineage tracking and impact analysis using Atlan.

Collaboration & Communication:
- Work closely with data engineers, the platform team, data analysts, business stakeholders, and Salesforce administrators.
- Clearly communicate data quality findings, risks, and remediation steps.
- Participate in data governance meetings and contribute to the development of data quality best practices.
- Document data quality rules, processes, and monitoring procedures.

Required Skills & Experience:
- Proven experience (e.g., 3+ years) as a Data Quality Engineer or in a similar role.
- Hands-on experience with Fivetran or another data ingestion application, and an understanding of its data transformation capabilities.
- Familiarity with Atlan or other modern data catalog and metadata management tools.
- Strong practical experience with Acceldata or similar data quality monitoring and observability platforms.
- Familiarity with DemandTools for data quality management within Salesforce.
- Solid understanding of data quality principles, methodologies, and best practices.
- Strong SQL skills for data querying and analysis.
- Experience with data profiling and data analysis techniques.
- Excellent analytical, problem-solving, and troubleshooting skills.
- Strong communication and collaboration skills.
- Ability to work independently and manage tasks effectively in a remote environment.

Preferred Skills & Experience:
- Experience with other data quality tools or frameworks.
- Knowledge of data warehousing concepts and technologies (e.g., Snowflake, BigQuery).
- Experience with scripting languages like Python for data manipulation and automation.
- Familiarity with the Salesforce data model and administration.
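The duplicate-record responsibility above boils down to match-key normalization. Here is a plain-Python sketch of that idea; it is not DemandTools itself, and the record shape and IDs are made up.

```python
# Illustrative duplicate detection for contact records in plain Python.
# This sketches the matching a tool like DemandTools automates; it is
# not DemandTools, and the field names are hypothetical.

from collections import defaultdict

def normalize_email(email):
    """Case-fold and trim so 'A@X.com ' and 'a@x.com' match."""
    return (email or "").strip().lower()

def find_duplicates(contacts):
    """Group contacts by normalized email; return groups with > 1 record."""
    groups = defaultdict(list)
    for c in contacts:
        key = normalize_email(c.get("email"))
        if key:
            groups[key].append(c["id"])
    return {k: ids for k, ids in groups.items() if len(ids) > 1}

contacts = [
    {"id": "003A", "email": "Jane.Doe@Example.com"},
    {"id": "003B", "email": "jane.doe@example.com "},
    {"id": "003C", "email": "sam@example.com"},
]
print(find_duplicates(contacts))
```

Production matching usually combines several keys (name, phone, domain) with fuzzy comparison, but the normalize-then-group structure is the same.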

Posted 3 weeks ago


5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Google BigQuery
Good to have skills: Microsoft SQL Server
Minimum experience: 5 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead application development projects.
- Conduct code reviews and ensure coding standards are met.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Google BigQuery.
- Good To Have Skills: Experience with Microsoft SQL Server.
- Strong understanding of database management systems.
- Knowledge of data modeling and optimization techniques.
- Experience in developing scalable and efficient applications.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google BigQuery.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Posted 3 weeks ago


7.0 - 9.0 years

11 - 16 Lacs

Bengaluru

Work from Office

Educational Qualification: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities:
A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to ensure effective Design, Development, Validation and Support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand client requirements in detail and translate them into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Knowledge of design principles and fundamentals of architecture
- Understanding of performance engineering
- Knowledge of quality processes and estimation techniques
- Basic understanding of the project domain
- Ability to translate functional/nonfunctional requirements into system requirements
- Ability to design and code complex programs
- Ability to write test cases and scenarios based on the specifications
- Good understanding of SDLC and agile methodologies
- Awareness of the latest technologies and trends
- Logical thinking and problem-solving skills along with an ability to collaborate

Technical and Professional Requirements:
- Technology - Cloud Platform - GCP Database - Google BigQuery

Preferred Skills:
- Technology - Cloud Platform - GCP Core Services
- Technology - Cloud Platform - Azure DevOps - Data on Cloud - GCP
- Technology - Cloud Platform - GCP App Development

Posted 3 weeks ago


5.0 - 7.0 years

22 - 25 Lacs

Bengaluru

Work from Office

Role Overview:
We are looking for a skilled Data Visualization Software Development Engineer with 6-8 years of experience in developing interactive dashboards and data-driven solutions using Looker and LookML. The ideal candidate will have expertise in Google Cloud Platform (GCP) and BigQuery and a strong understanding of data visualization best practices. Experience in the media domain (OTT, DTH, Web) will be a plus.

Key Responsibilities:
- Design, develop, and optimize interactive dashboards using Looker and LookML.
- Work with BigQuery to create efficient data models and queries for visualization.
- Develop LookML models, explores, and derived tables to support business intelligence needs.
- Optimize dashboard performance by implementing best practices in data aggregation and visualization.
- Collaborate with data engineers, analysts, and business teams to understand requirements and translate them into actionable insights.
- Implement security and governance policies within Looker to ensure data integrity and controlled access.
- Leverage Google Cloud Platform (GCP) services to build scalable and reliable data solutions.
- Maintain documentation and provide training to stakeholders on using Looker dashboards effectively.
- Troubleshoot and resolve issues related to dashboard performance, data accuracy, and visualization constraints.
- Maintain and optimize existing Looker dashboards and reports to ensure continuity and alignment with business KPIs.
- Understand, audit, and enhance existing LookML models to ensure data integrity and performance.
- Build new dashboards and data visualizations based on business requirements and stakeholder input.
- Collaborate with data engineers to define and validate the data pipelines required for dashboard development, and ensure the timely availability of clean, structured data.
- Document existing and new Looker assets and processes to support knowledge transfer, scalability, and maintenance.
- Support the transition/handover process by acquiring detailed knowledge of legacy implementations and ensuring a smooth takeover.

Required Skills & Experience:
- 6-8 years of experience in data visualization and business intelligence using Looker and LookML.
- Strong proficiency in writing and optimizing SQL queries, especially for BigQuery.
- Experience with Google Cloud Platform (GCP), particularly BigQuery and related data services.
- Solid understanding of data modeling, ETL processes, and database structures.
- Familiarity with data governance, security, and access controls in Looker.
- Strong analytical skills and the ability to translate business requirements into technical solutions.
- Excellent communication and collaboration skills.
- Expertise in Looker and LookML, including Explore creation, Views, and derived tables.
- Strong SQL skills for data exploration, transformation, and validation.
- Experience in BI solution lifecycle management (build, test, deploy, maintain).
- Excellent documentation and stakeholder communication skills for handovers and ongoing alignment.
- Strong data visualization and storytelling abilities, focusing on user-centric design and clarity.

Preferred Qualifications:
- Experience working in the media industry (OTT, DTH, Web) and handling large-scale media datasets.
- Knowledge of other BI tools like Tableau, Power BI, or Data Studio is a plus.
- Experience with Python or scripting languages for automation and data processing.
- Understanding of machine learning or predictive analytics is an advantage.
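For orientation, a LookML view (the building block behind "LookML models, explores, and derived tables") looks roughly like the following sketch. The project, table, and field names are hypothetical, not from any real model.

```lookml
# Minimal LookML view sketch; table and field names are hypothetical
# and would come from the actual BigQuery schema.
view: orders {
  sql_table_name: `my_project.sales.orders` ;;

  dimension: order_id {
    primary_key: yes
    type: string
    sql: ${TABLE}.order_id ;;
  }

  dimension_group: created {
    type: time
    timeframes: [date, week, month]
    sql: ${TABLE}.created_at ;;
  }

  measure: total_revenue {
    type: sum
    sql: ${TABLE}.amount ;;
    value_format_name: usd
  }
}
```

Explores then join such views, and dashboards query the explores; performance tuning mostly happens in the `sql` definitions and in BigQuery itself.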

Posted 3 weeks ago


5.0 - 10.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Experience Level: Software Developer II (4-6 yrs)

Technical Skill Set:
- Development experience using Python (Python is a must-have)
- Expertise in JavaScript/TypeScript and UI frameworks (preferably AngularJS)
- Hands-on experience with solution design, customization, and deployment on Google Cloud Platform (GCP)
- Data gathering with BigQuery SQL (not a must-have)
- Data visualization with Looker Studio or Tableau (not a must-have)
- Expertise with SDLC tools and processes, including GitHub, Jenkins, and Maven or Gradle
- Minimum of 1+ years of experience in refactoring code, debugging, and building tools
- Excellent understanding of computer science fundamentals: data structures and algorithms
- Development/consumption experience with RESTful Web Services or other open-source APIs
- Experience in understanding system design and implementation
- Write unit tests and documentation
- Working knowledge of Agile/Scrum and Jira (or any similar tool)
- Overlap required with EST hours (at least until mid-day EST)
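In the spirit of the "data structures and algorithms" and "write unit tests" items above, here is a small self-contained example: an iterative binary search with assertion-style unit tests.

```python
# Iterative binary search with simple assertion-style unit tests.

def binary_search(items, target):
    """Return the index of `target` in sorted `items`, or -1 if absent."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

# Minimal unit tests, runnable with plain `python` or under pytest.
assert binary_search([1, 3, 5, 7, 9], 7) == 3
assert binary_search([1, 3, 5, 7, 9], 4) == -1
assert binary_search([], 1) == -1
```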

Posted 3 weeks ago


3.0 - 8.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must have skills: Python (Programming Language)
Good to have skills: Oracle Procedural Language Extensions to SQL (PL/SQL), Google BigQuery
Minimum experience: 3 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. A typical day involves collaborating with cross-functional teams to gather insights, analyzing user needs, and translating them into functional specifications. You will engage in brainstorming sessions to develop innovative solutions and ensure that the applications align with business objectives while maintaining a user-centric approach. Your role will also include testing and validating designs to ensure they meet the specified requirements, ultimately contributing to the successful delivery of high-quality applications.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with stakeholders to gather and analyze requirements for application design.
- Develop and document application specifications and design documents.
- Participate in code reviews and provide constructive feedback to peers.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Python (Programming Language).
- Good To Have Skills: Experience with Google BigQuery and Oracle PL/SQL.
- Strong understanding of application design principles and methodologies.
- Experience with the software development life cycle and agile methodologies.
- Familiarity with database management and data modeling techniques.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Python (Programming Language).
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Posted 3 weeks ago


5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must have skills: Google BigQuery
Good to have skills: NA
Minimum experience: 5 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. Your typical day will involve collaborating with teams to design innovative solutions and contribute to key decisions in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the design and development of scalable applications.
- Conduct regular code reviews and provide technical guidance to team members.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Google BigQuery.
- Strong understanding of cloud-based data warehousing solutions.
- Experience in designing and implementing complex data models.
- Hands-on experience with ETL processes and data integration.
- Knowledge of SQL and data visualization tools.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google BigQuery.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 3 weeks ago


6.0 - 11.0 years

7 - 11 Lacs

Hyderabad

Work from Office

Experience:
- Good experience in API design, development, and implementation.
- 3 years of experience with cloud platform services (preferably GCP).
- Hands-on experience in designing, implementing, and maintaining APIs that meet the highest standards of performance, security, and scalability.
- Hands-on experience designing, developing, and implementing microservices architectures and solutions using industry best practices and design patterns.
- Hands-on experience with cloud computing and services.
- Proficiency in programming languages like Java, Python, and JavaScript.
- Hands-on experience with API gateway and management tools such as Apigee and Kong.
- Hands-on experience integrating APIs with a variety of systems, applications, microservices, and infrastructure.
- Deployment experience in a cloud environment (preferably GCP).
- Experience in TDD/DDD and unit testing.
- Hands-on CI/CD experience automating the build, test, and deployment processes to ensure rapid and reliable delivery of API updates.

Technical Skills:
- Programming & Languages: Java, GraphQL, SQL
- API gateway and management tools: Apigee, API Gateway
- Database Tech: Oracle, Spanner, BigQuery, Cloud Storage
- Operating Systems: Linux
- Expert in API design principles, specifications, and architectural styles such as REST, GraphQL, and gRPC.
- Proficiency in API lifecycle management, advanced security measures, and performance optimization.
- Good knowledge of security best practices and compliance awareness.
- Good knowledge of messaging patterns and distributed systems.
- Well-versed in protocols and data formats.
- Strong development knowledge of microservice design, architectural patterns, frameworks, and libraries.
- Knowledge of SQL and NoSQL databases, and how to interact with them through APIs.
- Good to have: knowledge of data modeling and database management; ability to design database schemas that efficiently store and retrieve data.
- Scripting and configuration (e.g., YAML) knowledge.
- Strong testing and debugging skills: writing unit tests and familiarity with the tools and techniques to fix issues.
- DevOps knowledge: CI/CD practices and tools.
- Familiarity with monitoring and observability platforms for real-time insights into application performance.
- Understanding of version control systems like Git.
- Familiarity with API documentation standards such as OpenAPI.
- Problem-solving skills and the ability to work independently in a fast-paced environment.
- Effective communication: negotiate and communicate effectively to ensure API solutions meet the needs of both technical and non-technical stakeholders.

Posted 3 weeks ago


3.0 - 7.0 years

37 - 40 Lacs

Bengaluru

Work from Office

Job Title: DevOps Engineer, AS
Location: Bangalore, India

Role Description:
Deutsche Bank has set itself ambitious goals in the areas of Sustainable Finance, ESG Risk Mitigation, and Corporate Sustainability. As climate change brings new challenges and opportunities, the Bank has set out to invest in developing a Sustainability Technology Platform, sustainability data products, and various sustainability applications that will aid the Bank's goals. As part of this initiative, we are building an exciting global team of technologists who are passionate about climate change and want to contribute to the greater good by leveraging their technology skill set in cloud/hybrid architecture. In this role, we are seeking a highly skilled and experienced DevOps Engineer to join our growing team. You will play a pivotal role in managing and optimizing cloud infrastructure, facilitating continuous integration and delivery, and ensuring system reliability.

What we'll offer you:
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities:
- Create, implement, and oversee scalable, secure, and cost-efficient cloud infrastructures on Google Cloud Platform (GCP).
- Utilize Infrastructure as Code (IaC) methodologies with tools such as Terraform, Deployment Manager, or alternatives.
- Implement robust security measures to ensure data access control and compliance with regulations.
- Adopt security best practices, establish IAM policies, and ensure adherence to both organizational and regulatory requirements.
- Set up and manage Virtual Private Clouds (VPCs), subnets, firewalls, VPNs, and interconnects to facilitate secure cloud networking.
- Establish continuous integration and continuous deployment (CI/CD) pipelines using Jenkins, GitHub Actions, or comparable tools for automated application deployments.
- Implement monitoring and alerting solutions through Stackdriver (Cloud Operations), Prometheus, or other third-party applications.
- Evaluate and optimize cloud expenditures by utilizing committed use discounts, autoscaling features, and resource rightsizing.
- Manage and deploy containerized applications through Google Kubernetes Engine (GKE) and Cloud Run.
- Deploy and manage GCP databases such as Cloud SQL and BigQuery.

Your skills and experience:
- A minimum of 5+ years of experience in DevOps or similar roles, with hands-on experience in GCP.
- In-depth knowledge of Google Cloud services (e.g., GCE, GKE, Cloud Functions, Cloud Run, Pub/Sub, BigQuery, Cloud Storage) and the ability to architect, deploy, and manage cloud-native applications.
- Proficiency with tools like Jenkins, GitLab, Terraform, Ansible, Docker, and Kubernetes.
- Experience with Infrastructure as Code (IaC) tools like Terraform, CloudFormation, or GCP-native Deployment Manager.
- Solid understanding of security protocols, IAM, networking, and compliance requirements within cloud environments.
- Strong problem-solving skills and the ability to troubleshoot cloud-based infrastructure.
- Google Cloud certifications (e.g., Associate Cloud Engineer, Professional Cloud Architect, or Professional DevOps Engineer) are a plus.

About us and our teams:
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
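For context on the Infrastructure as Code responsibility, a minimal Terraform sketch for GCP might look like the following. The project id, bucket name, dataset id, and region are placeholders, not values from this posting.

```hcl
# Minimal Terraform (IaC) sketch for GCP; all identifiers are placeholders.
provider "google" {
  project = "my-sustainability-project"
  region  = "asia-south1"
}

resource "google_storage_bucket" "data_lake" {
  name                        = "my-sustainability-data-lake"
  location                    = "ASIA-SOUTH1"
  uniform_bucket_level_access = true
}

resource "google_bigquery_dataset" "reporting" {
  dataset_id = "sustainability_reporting"
  location   = "asia-south1"
}
```

In practice such configurations are applied through a CI/CD pipeline (`terraform plan` reviewed, then `terraform apply`), with state stored remotely.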

Posted 3 weeks ago


6.0 - 11.0 years

4 - 8 Lacs

Bengaluru

Work from Office

React Developer

Responsibilities:
- Develop new user-facing features using React.js
- Build reusable components and front-end libraries for future use
- Translate user stories and wireframes into high-quality code
- Create applications that provide fantastic UI/UX and responsive design
- Integrate apps with third-party APIs and Cloud APIs
- Apply core computer science concepts to improve consumer web apps
- Profile and improve our frontend performance
- Design for scalability and adherence to standards

Required Skills:
- Should be excellent in UI development using the React framework
- Should be strong in Redux or Flux
- Should be strong in JavaScript (ES6 and above standards)

Posted 3 weeks ago


10.0 - 15.0 years

12 - 16 Lacs

Pune

Work from Office

To be successful in this role, you should meet the following requirements (must have):
- Payments and banking experience is a must.
- Experience in implementing and monitoring data governance using standard methodology throughout the data life cycle, within a large organisation.
- Up-to-date knowledge of data governance theory, standard methodology and the practical considerations.
- Knowledge of data governance industry standards and tools.
- Overall experience of 10+ years in data governance, encompassing data quality management, master data management, data privacy & compliance, data cataloguing and metadata management, data security, maturity and lineage.
- Prior experience in implementing an end-to-end data governance framework.
- Experience in automating data cataloguing, ensuring accurate, consistent metadata and making data easily discoverable and usable.
- Domain experience across the payments and banking lifecycle.
- An analytical mind and an inclination for problem-solving, with attention to detail.
- Ability to effectively navigate and deliver transformation programmes in large global financial organisations, amidst the challenges posed by bureaucracy, globally distributed teams and local data regulations.
- Strong communication skills coupled with presentation skills for complex information and data.
- A first-class degree in Engineering or a relevant field, with 2 or more of the following subjects as a major: Mathematics, Computer Science, Statistics, Economics.

The successful candidate will also meet the following requirements (good to have):
- Database types: Relational, NoSQL, DocumentDB
- Databases: Oracle, PostgreSQL, BigQuery, Bigtable, MongoDB, Neo4j
- Experience in conceptual/logical/physical data modeling.
- Experience in Agile methodology and leading agile delivery, aligned with organisational needs.
- An effective leader as well as a team player with a strong commitment to quality and efficiency.
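Automated data cataloguing, mentioned above, builds on metadata extraction from the data itself. Here is a toy sketch in plain Python: profile sample records into per-column metadata. The field names are illustrative, and a real governance platform stores far richer metadata (owners, lineage, classifications).

```python
# Profile sample records into simple column metadata (inferred type,
# null count) - the raw material of an automated data catalog entry.

def profile_columns(rows):
    """Return {column: {"type": ..., "nulls": ...}} from sample records."""
    catalog = {}
    for row in rows:
        for column, value in row.items():
            entry = catalog.setdefault(column, {"type": None, "nulls": 0})
            if value is None:
                entry["nulls"] += 1
            elif entry["type"] is None:
                entry["type"] = type(value).__name__
    return catalog

payments = [
    {"payment_id": "P1", "amount": 120.5, "iban": None},
    {"payment_id": "P2", "amount": 80.0, "iban": "DE89370400440532013000"},
]
print(profile_columns(payments))
```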

Posted 3 weeks ago


8.0 - 13.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Key Responsibilities:
- Set up and maintain monitoring dashboards for ETL jobs using Datadog, including metrics, logs, and alerts.
- Monitor daily ETL workflows and proactively detect and resolve data pipeline failures or performance issues.
- Create Datadog monitors for job status (success/failure), job duration, resource utilization, and error trends.
- Work closely with Data Engineering teams to onboard new pipelines and ensure observability best practices.
- Integrate Datadog with related tools.
- Conduct root cause analysis of ETL failures and performance bottlenecks.
- Tune thresholds, baselines, and anomaly detection settings in Datadog to reduce false positives.
- Document incident handling procedures and contribute to improving overall ETL monitoring maturity.
- Participate in on-call rotations or scheduled support windows to manage ETL health.

Required Skills & Qualifications:
- 3+ years of experience in ETL/data pipeline monitoring, preferably in a cloud or hybrid environment.
- Proficiency in using Datadog for metrics, logging, alerting, and dashboards.
- Strong understanding of ETL concepts and tools (e.g., Airflow, Informatica, Talend, AWS Glue, or dbt).
- Familiarity with SQL and querying large datasets.
- Experience working with Python, shell scripting, or Bash for automation and log parsing.
- Understanding of cloud platforms (AWS/GCP/Azure) and services like S3, Redshift, BigQuery, etc.
- Knowledge of CI/CD and DevOps principles related to data infrastructure monitoring.

Preferred Qualifications:
- Experience with distributed tracing and APM in Datadog.
- Prior experience monitoring Spark, Kafka, or streaming pipelines.
- Familiarity with ticketing tools (e.g., Jira, ServiceNow) and incident management workflows.
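The threshold-tuning responsibility above can be made concrete with a simple baseline rule: flag an ETL run whose duration exceeds the recent mean by k standard deviations. This is plain Python, not the Datadog API; the multiplier `k` is the knob you would tune to trade sensitivity against false positives.

```python
# Baseline anomaly check for ETL job durations (mean + k * stdev).
from statistics import mean, stdev

def is_anomalous(history, latest, k=3.0):
    """True if `latest` exceeds mean(history) + k * stdev(history)."""
    if len(history) < 2:
        return False  # not enough data to form a baseline
    threshold = mean(history) + k * stdev(history)
    return latest > threshold

durations = [300, 310, 295, 305, 290]  # seconds, recent successful runs
print(is_anomalous(durations, 320))  # False: within normal variation
print(is_anomalous(durations, 600))  # True: well beyond the baseline
```

Datadog's built-in anomaly monitors use more sophisticated seasonal baselines, but the same idea applies: a wider band (larger `k`) means fewer false positives and slower detection.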

Posted 3 weeks ago


10.0 - 15.0 years

12 - 16 Lacs

Pune

Work from Office

To be successful in this role, you should meet the following requirements (must have):
- Expertise in conceptual/logical/physical data modeling.
- Payments and banking experience is a must.
- Database design experience. Database types: Relational, NoSQL, DocumentDB. Databases: Oracle, PostgreSQL, BigQuery, Bigtable, MongoDB, Neo4j. Tools: Erwin, Visual Paradigm.
- Solid experience in PL/SQL, Python, Unix shell scripting, and Java.
- Domain experience across the payments and banking lifecycle.
- An analytical mind and an inclination for problem-solving, with attention to detail.
- Sound knowledge of payments workflows and statuses across various systems within a large global bank.
- Experience in collecting large data sets and identifying patterns and trends in data sets.
- Overall experience of 10+ years, with considerable experience in Big Data and relational databases.
- Prior experience across requirements gathering, build and implementation, stakeholder co-ordination, release management and production support.
- Ability to effectively navigate and deliver transformation programmes in large global financial organisations, amidst the challenges posed by bureaucracy, globally distributed teams and local data regulations.
- Strong communication skills coupled with presentation skills for complex information and data.
- A first-class degree in Engineering or a relevant field, with 2 or more of the following subjects as a major: Mathematics, Computer Science, Statistics, Economics.

The successful candidate will also meet the following requirements (good to have):
- Understanding of DevOps and CI tools (Jenkins, Git, Grunt, Bamboo, Artifactory) would be an added advantage.
- Experience in Agile methodology and leading agile delivery, aligned with organisational needs.
- An effective leader as well as a team player with a strong commitment to quality and efficiency.
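The "payments workflows and statuses" knowledge above is often captured, at the modeling level, as an allowed-transitions table. The status names below are generic examples, not any specific bank's scheme.

```python
# A payment status workflow modeled as an allowed-transitions table.
# Status names are generic illustrations.

ALLOWED_TRANSITIONS = {
    "initiated": {"validated", "rejected"},
    "validated": {"settled", "rejected"},
    "settled": set(),   # terminal
    "rejected": set(),  # terminal
}

def can_transition(current, nxt):
    """True if a payment may move from `current` status to `nxt`."""
    return nxt in ALLOWED_TRANSITIONS.get(current, set())

print(can_transition("initiated", "validated"))  # True
print(can_transition("settled", "initiated"))    # False
```

In a physical data model, the same table becomes a reference/lookup table that a status-change trigger or service validates against.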

Posted 3 weeks ago


2.0 - 7.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Experience level: Software Developer II (4-6 years)

Technical skill set:
- Development experience using Python (Python is a must-have)
- Expertise in JavaScript/TypeScript and UI frameworks (preferably AngularJS)
- Hands-on experience with solution design, customization, and deployment on Google Cloud Platform (GCP)
- Data gathering with BigQuery SQL (not a must-have)
- Data visualization with Looker Studio or Tableau (not a must-have)
- Expertise with SDLC tools and processes, including GitHub, Jenkins, Maven or Gradle
- Minimum of 1+ year of experience in refactoring code, debugging, and building tools
- Excellent understanding of computer science fundamentals, data structures, and algorithms
- Development/consumption experience with RESTful web services or other open-source APIs
- Experience with system design and implementation
- Writes unit tests and documentation
- Working knowledge of Agile/Scrum/Jira (or a similar tool)
- Overlap required with EST hours (at least until mid-day EST)
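The RESTful-services requirement above can be made concrete with a minimal sketch: a helper that parses a JSON API response body and surfaces errors cleanly. The `data`/`error` envelope is a common REST convention assumed for illustration, not any specific API named in the posting.

```python
import json

def parse_user_response(raw: str) -> dict:
    """Parse a JSON API response body, raising a clear error on bad payloads.

    The 'data'/'error' field names are an illustrative convention only.
    """
    try:
        payload = json.loads(raw)
    except json.JSONDecodeError as exc:
        raise ValueError(f"response is not valid JSON: {exc}") from exc
    if "error" in payload:
        # Surface server-reported failures as exceptions, not silent Nones.
        raise RuntimeError(f"API returned an error: {payload['error']}")
    if "data" not in payload:
        raise ValueError("response missing 'data' field")
    return payload["data"]
```

In practice the raw string would come from an HTTP client; separating parsing from transport like this keeps the parsing logic unit-testable without a network.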

Posted 3 weeks ago


5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must have skills: Google BigQuery
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. Your typical day will involve collaborating with teams to design and develop innovative solutions for business needs.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead design discussions and provide technical guidance to the team
- Conduct code reviews and ensure adherence to coding standards
- Stay updated on industry trends and technologies to drive innovation

Professional & Technical Skills:
- Must have: proficiency in Google BigQuery
- Strong understanding of data modeling and database design
- Experience with cloud-based data warehousing solutions
- Hands-on experience with ETL processes and data integration
- Knowledge of SQL and query optimization techniques

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google BigQuery
- This position is based at our Bengaluru office
- A 15 years full time education is required
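The ETL and data-integration skills this role asks for can be sketched minimally: extract rows from a CSV source, filter out malformed records, and load the survivors into a warehouse table. The sample data is invented, and SQLite stands in for BigQuery here purely so the sketch is self-contained.

```python
import csv
import io
import sqlite3

# Extract: read rows from a CSV source (an in-memory sample here).
raw = "order_id,amount\n1,10.50\n2,not_a_number\n3,7.25\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: keep only rows whose fields parse into the target types.
def clean(row):
    try:
        return (int(row["order_id"]), float(row["amount"]))
    except ValueError:
        return None  # malformed row - would be routed to an error table in a real pipeline

good = [r for r in (clean(row) for row in rows) if r is not None]

# Load: insert into a target table (SQLite standing in for BigQuery).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)", good)
total = con.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
```

The same extract/transform/load split applies whatever the engine; only the load step changes when the target is a cloud warehouse.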

Posted 3 weeks ago


5.0 - 10.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Experience:
- Good experience in API design, development, and implementation
- 3 years of experience with cloud platform services (preferably GCP)
- Hands-on experience designing, implementing, and maintaining APIs that meet the highest standards of performance, security, and scalability
- Hands-on experience designing, developing, and implementing microservices architectures and solutions using industry best practices and design patterns
- Hands-on experience with cloud computing and services
- Proficiency in programming languages such as Java, Python, JavaScript, etc.
- Hands-on experience with API gateway and management tools such as Apigee, Kong, API Gateway
- Hands-on experience integrating APIs with a variety of systems, applications, microservices, and infrastructure
- Deployment experience in a cloud environment (preferably GCP)
- Experience in TDD/DDD and unit testing
- Hands-on CI/CD experience automating the build, test, and deployment processes to ensure rapid and reliable delivery of API updates

Technical Skills:
- Programming & languages: Java, GraphQL, SQL; API gateway and management tools: Apigee, API Gateway
- Database tech: Oracle, Spanner, BigQuery, Cloud Storage
- Operating systems: Linux
- Expert in API design principles, specifications, and architectural styles such as REST, GraphQL, and gRPC; proficiency in API lifecycle management, advanced security measures, and performance optimization
- Good knowledge of security best practices and compliance awareness
- Good knowledge of messaging patterns and distributed systems
- Well-versed in protocols and data formats
- Strong development knowledge of microservice design, architectural patterns, frameworks, and libraries
- Knowledge of SQL and NoSQL databases, and how to interact with them through APIs
- Good to have: knowledge of data modeling and database design, to design database schemas that efficiently store and retrieve data
- Scripting and configuration (e.g., YAML) knowledge
- Strong testing and debugging skills: writing unit tests and familiarity with the tools and techniques needed to fix issues
- DevOps knowledge: CI/CD practices and tools
- Familiarity with monitoring and observability platforms for real-time insights into application performance
- Understanding of version control systems like Git
- Familiarity with API documentation standards such as OpenAPI
- Problem-solving skills and the ability to work independently in a fast-paced environment
- Effective communication: negotiate and communicate effectively to ensure API solutions meet the needs of both technical and non-technical stakeholders

Posted 3 weeks ago


12.0 - 17.0 years

4 - 8 Lacs

Hyderabad

Work from Office

- Design and develop conceptual, logical, and physical data models for enterprise and application-level databases.
- Translate business requirements into well-structured data models that support analytics, reporting, and operational systems.
- Define and maintain data standards, naming conventions, and metadata for consistency across systems.
- Collaborate with data architects, engineers, and analysts to implement models into databases and data warehouses/lakes.
- Analyze existing data systems and provide recommendations for optimization, refactoring, and improvements.
- Create entity relationship diagrams (ERDs) and data flow diagrams to document data structures and relationships.
- Support data governance initiatives including data lineage, quality, and cataloging.
- Review and validate data models with business and technical stakeholders.
- Provide guidance on normalization, denormalization, and performance tuning of database designs.
- Ensure models comply with organizational data policies, security, and regulatory requirements.
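The logical-to-physical modeling work above can be made concrete with a small sketch: a one-to-many Customer-Order relationship from an ERD, translated into normalized DDL with a foreign key and a supporting index. Table and column names are invented, and SQLite is used only so the sketch runs anywhere.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # enforce the modeled relationship

# Physical model for a one-to-many Customer -> Order relationship (3NF):
# the customer's name lives in one place; orders reference it by key.
con.executescript("""
CREATE TABLE customer (
  customer_id INTEGER PRIMARY KEY,
  name        TEXT NOT NULL
);
CREATE TABLE orders (
  order_id    INTEGER PRIMARY KEY,
  customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
  amount      REAL NOT NULL
);
CREATE INDEX idx_orders_customer ON orders(customer_id);  -- performance tuning for the join
""")

con.execute("INSERT INTO customer VALUES (1, 'Acme')")
con.execute("INSERT INTO orders VALUES (10, 1, 99.0)")
row = con.execute("""
  SELECT c.name, o.amount
  FROM orders o
  JOIN customer c ON c.customer_id = o.customer_id
""").fetchone()
```

Denormalization would copy `name` into `orders` to avoid the join; the trade-off (faster reads vs. update anomalies) is exactly the guidance this role is asked to provide.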

Posted 3 weeks ago


5.0 - 10.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Experience:
- 8+ years of data engineering experience
- 3+ years of experience with cloud platform services (preferably GCP)
- 2+ years of hands-on experience with Pentaho
- Hands-on experience building and optimizing data pipelines and data sets
- Hands-on experience with data extraction and transformation tasks, taking care of data security, error handling, and pipeline performance
- Hands-on experience with relational SQL (Oracle, SQL Server or MySQL) and NoSQL databases
- Advanced SQL experience: creating and debugging stored procedures, functions, triggers, and object types in PL/SQL
- Hands-on experience with programming languages: Java (mandatory), Go, Python
- Hands-on experience unit testing data pipelines
- Experience using Pentaho Data Integration (Kettle/Spoon) and debugging issues
- Experience supporting and working with cross-functional teams in a dynamic environment

Technical Skills:
- Programming & languages: Java
- Database tech: Oracle, Spanner, BigQuery, Cloud Storage
- Operating systems: Linux
- Good knowledge and understanding of cloud-based ETL frameworks and tools
- Good understanding and working knowledge of batch and streaming data processing
- Good understanding of data warehousing architecture
- Knowledge of open table and file formats (e.g., Delta, Hudi, Iceberg, Avro, Parquet, JSON, CSV)
- Strong analytic skills related to working with unstructured datasets
- Excellent numerical and analytical skills

Responsibilities:
- Design and develop standard, reusable ETL jobs and pipelines
- Work with the team to extract data from different data sources such as Oracle, cloud storage, and flat files
- Work with database objects including tables, views, indexes, schemas, stored procedures, functions, and triggers
- Work with the team to troubleshoot and resolve issues in job logic as well as performance
- Write ETL validations based on design specifications for unit testing
- Work with the BAs and DBAs on requirements gathering, analysis, testing, metrics, and project coordination

Posted 3 weeks ago


5.0 - 10.0 years

7 - 12 Lacs

Pune

Work from Office

Tech stack: GCP Data Fusion, BigQuery, Dataproc, SQL/T-SQL, Cloud Run, Secret Manager, Git, Ansible Tower / Ansible scripts, Jenkins, Java, Python, Terraform, Cloud Composer/Airflow

Experience and Skills

Must have:
- Proven (3+ years) hands-on experience designing, testing, and implementing data ingestion pipelines on GCP Data Fusion, CDAP, or similar tools, including ingestion, parsing, and wrangling of CSV-, JSON-, XML-, etc. formatted data from RESTful and SOAP APIs, SFTP servers, etc.
- In-depth understanding of modern data contract best practices, with proven experience (3+ years) independently directing, negotiating, and documenting best-in-class data contracts
- Java (2+ years) experience in development, testing, and deployment (ideally custom plugins for Data Fusion)
- Proficiency with continuous integration (CI), continuous delivery (CD), and continuous testing tools, ideally for cloud-based data solutions
- Experience working in an Agile environment and toolset
- Strong problem-solving and analytical skills
- Enthusiastic willingness to learn and develop technical and soft skills rapidly and independently, as needs require
- Strong organisational and multi-tasking skills
- Good team player who embraces teamwork and mutual support

Nice to have:
- Hands-on experience with Cloud Composer/Airflow, Cloud Run, Pub/Sub
- Hands-on development in Python, Terraform
- Strong SQL skills for data transformation, querying, and optimization in BigQuery, with a focus on cost- and time-effective SQL coding and concurrency/data integrity (ideally in the BigQuery dialect)
- Development, testing, and implementation of data transformation/ETL/ELT pipelines, ideally in BigQuery
- Experience working in a DataOps model
- Experience with Data Vault modelling and usage
- Proficiency with Git for version control and collaboration
- Proficiency designing, creating, and maintaining CI/CD processes/pipelines in DevOps tools like Ansible/Jenkins for cloud-based applications (ideally GCP)
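For the Data Vault modelling item above, one core mechanic is deriving a hub's hash key deterministically from its business key(s). A minimal sketch, assuming the common (but not universally mandated) convention of trimming, upper-casing, and `||`-delimiting the keys before hashing:

```python
import hashlib

def hub_hash_key(*business_keys: str) -> str:
    """Deterministic hash key for a Data Vault hub.

    Normalisation (trim + upper-case) makes ' cust-42 ' and 'CUST-42' load as
    the same hub row; the '||' delimiter keeps ('a','b') distinct from ('ab').
    Conventions vary by site - this is one common choice, not a standard.
    """
    normalised = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()
```

Because the key is a pure function of the business key, hubs, links, and satellites can be loaded in parallel from independent sources and still join correctly.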

Posted 3 weeks ago


7.0 - 12.0 years

4 - 8 Lacs

Pune

Work from Office

1. ETL: Hands-on experience building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend, and Informatica.
2. Big Data: Experience with big data platforms such as Hadoop, Hive, or Snowflake for data storage and processing.
3. Data Warehousing & Database Management: Understanding of data warehousing concepts, and of relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design.
4. Data Modeling & Design: Good exposure to data modeling techniques; design, optimization, and maintenance of data models and data structures.
5. Languages: Proficient in one or more programming languages commonly used in data engineering, such as Python, Java, or Scala.
6. DevOps: Exposure to concepts and enablers such as CI/CD platforms, version control, and automated quality control management.

Ab Initio: Experience developing Co>Op graphs and the ability to tune them for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>It, Data Profiler, Conduct>It, Control>Center, Continuous Flows.
Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of the underlying architectures and trade-offs.
Data Quality & Controls: Exposure to data validation, cleansing, enrichment, and data controls.
Containerization: Fair understanding of containerization platforms like Docker and Kubernetes.
File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, and Delta.
Others: Basics of job schedulers like Autosys; basics of entitlement management.
Certification in any of the above topics would be an advantage.
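The file-formats point above largely comes down to whether a format carries types and a schema. A small stdlib-only sketch contrasting JSON (types survive a round trip) with CSV (everything comes back as a string), which is part of why schema-carrying formats like Avro and Parquet are preferred in pipelines:

```python
import csv
import io
import json

record = {"id": 7, "price": 19.99, "active": True}

# JSON round trip: ints, floats, and booleans survive intact.
assert json.loads(json.dumps(record)) == record

# CSV round trip: every value is flattened to a string, so the reader
# must reapply a schema ("schema on read") to recover the types.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=record)
writer.writeheader()
writer.writerow(record)
row = next(csv.DictReader(io.StringIO(buf.getvalue())))
assert row == {"id": "7", "price": "19.99", "active": "True"}
```

Avro and Parquet go further still: they embed the schema with the data and (for Parquet) store it columnarly, so warehouses can validate types and prune columns at read time.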

Posted 3 weeks ago


9.0 - 14.0 years

11 - 16 Lacs

Hyderabad

Work from Office

We are seeking a skilled and proactive DevOps Engineer with deep expertise in Google Cloud Platform (GCP), Google Kubernetes Engine (GKE), and on-premises Kubernetes platforms like OpenShift. The ideal candidate will have a strong foundation in Infrastructure as Code (IaC) using Terraform, and a solid understanding of cloud-native networking, service meshes (e.g., Istio), and CI/CD pipelines. Experience with DevSecOps practices and security tools is highly desirable.

Key Responsibilities:
- Design, implement, and manage scalable infrastructure on GCP (especially GKE) and on-prem Kubernetes (OpenShift).
- Develop and maintain Terraform modules for infrastructure provisioning and configuration.
- Troubleshoot and resolve complex issues related to networking, Istio, and Kubernetes clusters.
- Build and maintain CI/CD pipelines using tools such as Jenkins, Codefresh, or GitHub Actions.
- Integrate and manage DevSecOps tools such as Black Duck, Checkmarx, Twistlock, and Dependabot to ensure secure software delivery.
- Collaborate with development and security teams to enforce security best practices across the SDLC.
- Support and configure WAFs and on-prem load balancers as needed.

Required Skills & Qualifications:
- 5+ years of experience in a DevOps or Site Reliability Engineering role.
- Proficiency in GCP and GKE, with hands-on experience in OpenShift or similar on-prem Kubernetes platforms.
- Strong experience with Terraform and managing cloud infrastructure as code.
- Solid understanding of Kubernetes networking, Istio, and service mesh architectures.
- Experience with at least one CI/CD tool: Jenkins, Codefresh, or GitHub Actions.
- Familiarity with DevSecOps tools such as Black Duck, Checkmarx, Twistlock, and Dependabot.
- Strong Linux administration and scripting skills.

Nice to Have:
- Experience with WAFs and on-prem load balancers.
- Familiarity with monitoring and logging tools (e.g., Prometheus, ELK stack, Dynatrace, Splunk).
- Knowledge of container security and vulnerability-scanning best practices.
- Familiarity with GenAI and Google Vertex AI on Google Cloud.

Posted 3 weeks ago


15.0 - 20.0 years

10 - 14 Lacs

Pune

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Databricks Unified Data Analytics Platform, Engineering Expertise, Data Lakehouse Development
Good to have skills: Google BigQuery
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive project success. You will also engage in problem-solving activities, providing guidance and support to your team while ensuring that best practices are followed throughout the development process. Your role will be pivotal in shaping the direction of application projects and ensuring that they meet the needs of the organization and its clients.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and implement necessary adjustments to ensure timely delivery.

Professional & Technical Skills:
- Must have: proficiency in Databricks Unified Data Analytics Platform.
- Good to have: experience with Google BigQuery.
- Strong understanding of data integration and ETL processes.
- Experience with cloud-based data solutions and architecture.
- Familiarity with data governance and compliance standards.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based in Pune.
- A 15 years full time education is required.

Posted 3 weeks ago


7.0 - 12.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Google BigQuery
Good to have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process, coordinating with team members, and ensuring project milestones are met.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the application development process effectively.
- Ensure timely delivery of project milestones.
- Provide guidance and mentorship to team members.

Professional & Technical Skills:
- Must have: proficiency in Google BigQuery.
- Strong understanding of data analytics and visualization.
- Experience with cloud-based data warehousing solutions.
- Hands-on experience in designing and implementing scalable applications.
- Knowledge of data security and compliance standards.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Google BigQuery.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Posted 3 weeks ago
