Home
Jobs

646 Dataflow Jobs

Filter Interviews
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Setup a job Alert
Filter
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Job Description The Security Platform Engineering team, EPEO is looking for a passionate, experienced DevOps Engineer who is excited to foray into any technology and create innovative products & services. The interested candidate should have experience in designing and implementing frontend technologies, constructing backend APIs, database, setting up infrastructure in cloud and automating repeatable tasks. Also, the engineer should be a team player working with the team of developers from conception to the final product stage. Responsibilities YOUR TYPICAL DAY HERE WOULD BE: Design/Develop APIs using Java or Python and deploy using GCP Services Design, build, and maintain robust and scalable data pipelines to ingest, process, and transform data from various sources using GCP Services Contribute to the design and architecture of our data infrastructure and Automate data pipeline deployment and management Create websites using Angular, CSS, Hugo, JavaScript/TypeScript Automating repeatable tasks, workflows to improve efficiency of processes. Design, build, observability dashboards using Dynatrace, Grafana, Looker etc. Qualifications WHAT YOUR SKILLSET LOOKS LIKE: A relevant Bachelor's or Master’s Degree in computer science / engineering 3+ Experience in developing RESTful endpoints (Python or Java), websites and deploying using GCP Services Proficiency in using GCP services, including Cloud Run, BigQuery, Dataflow, and Google Cloud Storage (GCS). Experience working in DevOps or Agile development team Deep understanding of SRE concepts, including monitoring, alerting, automation, and incident management WOULD BE GREAT IF YOU ALSO BRING: GCP Certification

Posted 4 hours ago

Apply

3.0 - 4.0 years

0 Lacs

Hyderābād

On-site

GlassDoor logo

Job description Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Senior Software Engineer In this role, you will: Responsible for performing system development work around ETL, which can include both the development of new function and facilities, and the on-going systems support of live systems. Responsible for the documentation, coding, and maintenance of new and existing Extract, Transform, and Load (ETL) processes within the Enterprise Data Warehouse. Investigate live systems faults, diagnose problems, and propose and provide solutions. Work closely with various teams to design, build, test, deploy and maintain insightful MI reports. Support System Acceptance Testing, System Integration and Regression Testing. Identify any issues that may arise to delivery risk, formulate preventive actions or corrective measures, and timely escalate major project risks & issues to service owner. Execute test cases and log defects. Should be proactive in understanding the existing system, identifying areas for improvement, and taking ownership of assigned tasks. Ability to work independently with minimal supervision while ensuring timely delivery of tasks. Requirements To be successful in this role, you should meet the following requirements: 3-4 years of experience in Data Warehousing specialized in ETL. Given the current team is highly technical in nature, the expectation is that the candidate has experience in technologies like DataStage, Teradata Vantage, Unix Scripting and scheduling using Control-M and DevOps tools. Candidate should possess good knowledge on SQL and demonstrate the ability to write efficient and optimized queries effectively. Hands on experience or knowledge on GCP’s Data Storage and Processing services such as BigQuery, Dataflow, Bigtable, Cloud spanner, Cloud SQL would be an added advantage. Hands-on experience with Unix, Git and Jenkins and would be added advantage. This individual should be able to develop and implement solutions on both on-prem and google cloud platform (GCP). Conducting migration, where necessary, to bring tools and other elements into the cloud and software upgrades. Should have proficiency in using JIRA and Confluence and experienced in working in projects that have followed Agile methodologies. You’ll achieve more when you join HSBC. www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSBC Software Development India

Posted 4 hours ago

Apply

4.0 years

2 - 5 Lacs

Hyderābād

On-site

GlassDoor logo

About this role: Wells Fargo is seeking a Senior Cloud Platform engineer IaaC (Infrastructure as a Code) tools such as Terraform, Docker/OCI image creation, Kubernetes, Helm Charts, Python skills. In this role, you will: Understanding of Cloud Platform Technologies (GCP preferred) in the big data and data warehousing space (BigQuery, Dataproc, Dataflow, Data Catalog, Cloud Composer/Airflow, GKE/Anthos). Hands-on experience in IaaC (Infrastructure as a Code) tools such as Terraform, Docker/OCI image creation, Kubernetes, Helm Charts, Self-healing mechanisms, Load-balancing, API Gateway. In-depth knowledge of Cloud tools/solutions such as Cloud Pub/Sub, GKE, IAM, Scalability, Fault-tolerant design, Availability, BCP. Ability to quickly learn and adapt to the new cloud platforms / technologies Strong development experience in Python Extensive experience in working with Python API based solution design and integration" Required Qualifications, International: 4+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education Bachelors or Masters Degree in Comp. Science or equivalent Desired Qualifications: GCP DevOps, Terraform and K8s Certification Posting End Date: 2 Jul 2025 *Job posting may come down early due to volume of applicants. We Value Equal Opportunity Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic. Employees support our focus on building strong customer relationships balanced with a strong risk mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements. Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process. Applicants with Disabilities To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo . Drug and Alcohol Policy Wells Fargo maintains a drug free workplace. Please see our Drug and Alcohol Policy to learn more. Wells Fargo Recruitment and Hiring Requirements: a. Third-Party recordings are prohibited unless authorized by Wells Fargo. b. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.

Posted 4 hours ago

Apply

3.0 years

3 - 10 Lacs

Vadodara

On-site

GlassDoor logo

M3J Technical Services is seeking a Data Integration & Reporting Analyst adept at automating reports, extracting and cleansing data, and crafting impactful visualizations for KPI’s using tools like Power BI and Excel. You'll develop data-driven applications for desktop, web, and mobile platforms, ensuring our business remains agile and poised for growth. If you're passionate about leveraging data to drive strategic solutions, join our team! Local candidates based in Vadodara, Gujarat preferred. Responsibilities: Collaborate with stakeholders to design and publish Power BI reports aligned with business goals. Analyze and understand business processes to develop reports tailored to specific operational needs. Prepare and transform data from sources such as SQL Server, Excel, and SharePoint using Microsoft Fabric tools, including Dataflow Gen 2, Power Query, Lakehouse, and other related tools. Develop data models and optimize report performance, including row-level security. Maintain clear documentation and provide user training and support for Power BI. Actively contribute to process improvement initiatives by leveraging the Microsoft Power Platform (Power Apps, Power Automate, SharePoint) to enhance data collection and workflow automation. Qualifications: Bachelor’s degree in Computer Science, Industrial Engineering, Data Science, or Related field; or equivalent work experience. Solid understanding of BI concepts and data visualization best practices. 3+ years of hands-on experience with Power BI development. Strong skills in DAX, Power Query (M), and data modeling. Proficient in SQL and working with relational databases. 5+ years of working experience with Excel and Power Query. Experience with Fabric and other data integration tools. High attention to detail and the ability to work independently. Strong analytical and organizational skills. Excellent written and verbal communication skills. Results-oriented, proactive, and possessing a high level of integrity. Microsoft Certified: Power BI Data Analyst Associate is a plus Preferred Qualifications: Experience with Power BI Service administration (fabric, dataflows, workspaces, security, and dataset refresh). Familiarity with Microsoft Fabric, Power Automate, or SharePoint. Able to work independently and take initiative to improve data collection and reporting using modern tools and best practices. Language Requirement: Fluent in English Schedule: Monday to Friday Working Hours: 8am to 5pm Central US Time, or must be available to work at least 4 hours of the day during 8am to 4pm US Central Time Zone. Work Location: Vadodara, Gujarat, India (Preference will be given to candidates located in Vadodara, Gujarat.) Job Types: Full-time, Permanent Benefits: Flexible schedule Paid sick time Paid time off Schedule: Monday to Friday Supplemental Pay: Yearly bonus Application Question(s): Please share with us your desired salary. Have you implemented reuse of dataflows across multiple reports or workspaces? Would you be open to presenting a Power BI report you developed professionally — and explaining your approach to data connection, transformation, and solving performance or business logic challenges? Work Location: In person

Posted 5 hours ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

GCP Data Engineer (5+ Years Exp) | Hybrid - Hyderabad Location: Hyderabad, India Experience: 5+ Years We are looking for an experienced GCP Data Engineer to join our growing data team. If you're passionate about building scalable data pipelines, optimizing workflows, and working with modern cloud-native tech, we want to hear from you! 🚀 Key Responsibilities: Design, develop, and maintain robust data pipelines on Google Cloud Platform (GCP) Work with structured and unstructured data to support analytics, ML, and reporting use cases Integrate data sources using Python , APIs, and GCP-native services (BigQuery, Dataflow, Pub/Sub, etc.) Implement data quality and governance practices using DBT or Collibra Collaborate cross-functionally with data analysts, data scientists, and business stakeholders 🛠️ Must-Have Skills: 5+ years of experience in data engineering or related roles Strong proficiency in GCP data services (BigQuery, Cloud Storage, Dataflow, Composer, etc.) Excellent Python programming skills, especially for ETL development Hands-on experience with DBT or Collibra Strong SQL skills and familiarity with relational and cloud-native databases Solid understanding of data modeling, pipeline orchestration, and performance tuning ✅ Good to Have: Experience with CI/CD pipelines and version control (Git) Knowledge of data security and compliance in cloud environments Familiarity with Agile methodologies 💼 What We Offer: Competitive compensation Flexible hybrid working model Opportunity to work on cutting-edge cloud data projects Collaborative and growth-focused culture 📩 Interested? Apply directly via LinkedIn or send your resume to [sasidhar.m@technogenindia.com].

Posted 6 hours ago

Apply

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

We are seeking a skilled Data Engineer with over 6+ years of experience to design, build, and maintain scalable data pipelines and perform advanced data analysis to support business intelligence and data-driven decision-making. The ideal candidate will have a strong foundation in computer science principles, extensive experience with SQL and big data tools, and proficiency in cloud platforms and data visualization tools. Key Responsibilities: Design, develop, and maintain robust, scalable ETL pipelines using Apache Airflow, DBT, Composer, Control-M, Cron, Luigi, and similar tools. Build and optimize data architectures including data lakes and data warehouses. Integrate data from multiple sources ensuring data quality and consistency. Collaborate with data scientists, analysts, and stakeholders to translate business requirements into technical solutions. Analyze complex datasets to identify trends, generate actionable insights, and support decision-making. Develop and maintain dashboards and reports using Tableau, Power BI, and Jupyter Notebooks for visualization and pipeline validation. Manage and optimize relational and NoSQL databases such as MySQL, PostgreSQL, Oracle, MongoDB, and DynamoDB. Work with big data tools and frameworks including Hadoop, Spark, Hive, Kafka, Informatica, Talend, SSIS, and Dataflow. Utilize cloud data services and warehouses like AWS Glue, GCP Dataflow, Azure Data Factory, Snowflake, Redshift, and BigQuery. Support CI/CD pipelines and DevOps workflows using Git, Docker, Terraform, and related tools. Ensure data governance, security, and compliance standards are met. Participate in Agile and DevOps processes to enhance data engineering workflows. Required Qualifications: 6+ years of professional experience in data engineering and data analysis roles. Strong proficiency in SQL and experience with database management systems such as MySQL, PostgreSQL, Oracle, and MongoDB. Hands-on experience with big data tools like Hadoop and Apache Spark. Proficient in Python programming. Experience with data visualization tools such as Tableau, Power BI, and Jupyter Notebooks. Proven ability to design, build, and maintain scalable ETL pipelines using tools like Apache Airflow, DBT, Composer (GCP), Control-M, Cron, and Luigi. Familiarity with data engineering tools including Hive, Kafka, Informatica, Talend, SSIS, and Dataflow. Experience working with cloud data warehouses and services (Snowflake, Redshift, BigQuery, AWS Glue, GCP Dataflow, Azure Data Factory). Understanding of data modeling concepts and data lake/data warehouse architectures. Experience supporting CI/CD practices with Git, Docker, Terraform, and DevOps workflows. Knowledge of both relational and NoSQL databases, including PostgreSQL, BigQuery, MongoDB, and DynamoDB. Exposure to Agile and DevOps methodologies. Experience with Amazon Web Services (S3, Glue, Redshift, Lambda, Athena) Preferred Skills: Strong problem-solving and communication skills. Ability to work independently and collaboratively in a team environment. Experience with service development, REST APIs, and automation testing is a plus. Familiarity with version control systems and workflow automation.

Posted 9 hours ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

General Skills & Experience: Minimum 10-18 yrs of Experience • Expertise in Spark (Scala/Python), Kafka, and cloud-native big data services (GCP, AWS, Azure) for ETL, batch, and stream processing. • Deep knowledge of cloud platforms (AWS, Azure, GCP), including certification (preferred). • Experience designing and managing advanced data warehousing and lakehouse architectures (e.g., Snowflake, Databricks, Delta Lake, BigQuery, Redshift, Synapse). • Proven experience with building, managing, and optimizing ETL/ELT pipelines and data workflows for large-scale systems. • Strong experience with data lakes, storage formats (Parquet, ORC, Delta, Iceberg), and data movement strategies (cloud and hybrid). • Advanced knowledge of data modeling, SQL development, data partitioning, optimization, and database administration. • Solid understanding and experience with Master Data Management (MDM) solutions and reference data frameworks. • Proficient in implementing Data Lineage, Data Cataloging, and Data Governance solutions (e.g., AWS Glue Data Catalog, Azure Purview). • Familiar with data privacy, data security, compliance regulations (GDPR, CCPA, HIPAA, etc.), and best practices for enterprise data protection. • Experience with data integration tools and technologies (e.g. AWS Glue, GCP Dataflow , Apache Nifi/Airflow, etc.). • Expertise in batch and real-time data processing architectures; familiarity with event-driven, microservices, and message-driven patterns. • Hands-on experience in Data Analytics, BI & visualization tools (PowerBI, Tableau, Looker, Qlik, etc.) and supporting complex reporting use-cases. • Demonstrated capability with data modernization projects: migrations from legacy/on-prem systems to cloud-native architectures. • Experience with data quality frameworks, monitoring, and observability (data validation, metrics, lineage, health checks). • Background in working with structured, semi-structured, unstructured, temporal, and time series data at large scale. • Familiarity with Data Science and ML pipeline integration (DevOps/MLOps, model monitoring, and deployment practices). • Experience defining and managing enterprise metadata strategies.

Posted 10 hours ago

Apply

4.0 years

0 Lacs

India

Remote

Linkedin logo

We are looking for a Google Cloud Data Engineer who will help us build a highly scalable and reliable platform to match our exponential growth. As a Google Cloud Data Engineer, you will be responsible for building a solid back end infrastructure which will enable data delivery in near real-time using next-gen technologies. Title : Google Cloud Data Engineer Location : Remote Work Employment Type : Full Time Work Timings : 2PM to 11 PM No of Openings : 3 We are looking for a candidate with Google Cloud Data Engineering experience only who can join us within 15 days or less. Applications not meeting this requirement will not be considered. Roles and Responsibilities: Design, develop, and maintain scalable data pipelines on Google Cloud Platform (GCP). Implement data processing solutions using GCP services such as BigQuery, Dataflow, Data Proc, Pub/Sub, and Cloud Storage. Optimize data processing and storage for performance, cost, and scalability. Ensure data quality and integrity by implementing best practices for data governance and monitoring. Develop and maintain documentation for data pipelines, architectures, and processes. Stay up-to-date with the latest advancements in data engineering and GCP technologies. Qualifications: Bachelor’s degree in Computer Science, Engineering, or a related field. Proven experience as a Data Engineer with a focus on Google Cloud Platform (GCP). Proficiency in GCP services such as BigQuery, Dataflow, Pub/Sub, Cloud Storage , and others. Strong programming skills in Python, Java, or similar languages. Experience with SQL and relational databases. Familiarity with data modeling, ETL processes, and data warehousing concepts. Knowledge of best practices in data security and privacy. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills. Preferred Skills : Google Cloud Data Engineer certification. About Techolution: Techolution is a leading innovation consulting company on track to become one of the most admired brands in the world for "innovation done right". Our purpose is to harness our expertise in novel technologies to deliver more profits for our enterprise clients while helping them deliver a better human experience for the communities they serve. With that, we are now fully committed to helping our clients build the enterprise of tomorrow by making the leap from Lab Grade AI to Real World AI. In 2019, we won the prestigious Inc. 500 Fastest-Growing Companies in America award, only 4 years after its formation. In 2022, Techolution was honored with the “Best-in-Business” title by Inc. for “Innovation Done Right”. Most recently, we received the “AIConics” trophy for being the Top AI Solution Provider of the Year at the AI Summit in New York. Let’s give you more insights! Some videos you wanna watch! Life at Techolution GoogleNext 2023 Ai4 - Artificial Intelligence Conferences 2023 WaWa - Solving Food Wastage Saving lives - Brooklyn Hospital Innovation Done Right on Google Cloud Techolution featured on Worldwide Business with KathyIreland Techolution presented by ION World’s Greatest Visit us @ www.techolution.com : To know more about our revolutionary core practices and getting to know in detail about how we enrich the human experience with technology.

Posted 12 hours ago

Apply

5.0 - 7.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Linkedin logo

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. You are passionate about quality and how customers experience the products you test. You have the ability to create, maintain and execute test plans in order to verify requirements. As a Quality Engineer at Equifax, you will be a catalyst in both the development and the testing of high priority initiatives. You will develop and test new products to support technology operations while maintaining exemplary standards. As a collaborative member of the team, you will deliver QA services (code quality, testing services, performance engineering, development collaboration and continuous integration). You will conduct quality control tests in order to ensure full compliance with specified standards and end user requirements. You will execute tests using established plans and scripts; documents problems in an issues log and retest to ensure problems are resolved. You will create test files to thoroughly test program logic and verify system flow. You will identify, recommend and implement changes to enhance effectiveness of QA strategies. What You Will Do Independently develop scalable and reliable automated tests and frameworks for testing software solutions. Specify and automate test scenarios and test data for a highly complex business by analyzing integration points, data flows, personas, authorization schemes and environments Develop regression suites, develop automation scenarios, and move automation to an agile continuous testing model. Pro-actively and collaboratively taking part in all testing related activities while establishing partnerships with key stakeholders in Product, Development/Engineering, and Technology Operations. What Experience You Need Bachelor's degree in a STEM major or equivalent experience 5-7 years of software testing experience Able to create and review test automation according to specifications Ability to write, debug, and troubleshoot code in Java, Springboot, TypeScript/JavaScript, HTML, CSS Creation and use of big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others with respect to software validation Created test strategies and plans Led complex testing efforts or projects Participated in Sprint Planning as the Test Lead Collaborated with Product Owners, SREs, Technical Architects to define testing strategies and plans. Design and development of micro services using Java, Springboot, GCP SDKs, GKE/Kubeneties Deploy and release software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs Cloud Certification Strongly Preferred What Could Set You Apart An ability to demonstrate successful performance of our Success Profile skills, including: Attention to Detail - Define test case candidates for automation that are outside of product specifications. i.e. 
Negative Testing; Create thorough and accurate documentation of all work including status updates to summarize project highlights; validating that processes operate properly and conform to standards Automation - Automate defined test cases and test suites per project Collaboration - Collaborate with Product Owners and development team to plan and and assist with user acceptance testing; Collaborate with product owners, development leads and architects on functional and non-functional test strategies and plans Execution - Develop scalable and reliable automated tests; Develop performance testing scripts to assure products are adhering to the documented SLO/SLI/SLAs; Specify the need for Test Data types for automated testing; Create automated tests and tests data for projects; Develop automated regression suites; Integrate automated regression tests into the CI/CD pipeline; Work with teams on E2E testing strategies and plans against multiple product integration points Quality Control - Perform defect analysis, in-depth technical root cause analysis, identifying trends and recommendations to resolve complex functional issues and process improvements; Analyzes results of functional and non-functional tests and make recommendation for improvements; Performance / Resilience: Understanding application and network architecture as inputs to create performance and resilience test strategies and plans for each product and platform. Conducting the performance and resilience testing to ensure the products meet SLAs / SLOs Quality Focus - Review test cases for complete functional coverage; Review quality section of Production Readiness Review for completeness; Recommend changes to existing testing methodologies for effectiveness and efficiency of product validation; Ensure communications are thorough and accurate for all work documentation including status and project updates Risk Mitigation - Work with Product Owners, QE and development team leads to track and determine prioritization of defects fixes We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference! Who is Equifax? At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. 
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Posted 19 hours ago

Apply

170.0 years

0 Lacs

Mulshi, Maharashtra, India

On-site

Linkedin logo

Area(s) of responsibility About Birlasoft Birlasoft, a global leader at the forefront of Cloud, AI, and Digital technologies, seamlessly blends domain expertise with enterprise solutions. The company’s consultative and design-thinking approach empowers societies worldwide, enhancing the efficiency and productivity of businesses. As part of the multibillion-dollar diversified CKA Birla Group, Birlasoft with its 12,000+ professionals, is committed to continuing the Group’s 170-year heritage of building sustainable communities. About the Job – Ability to relate the product functionality to business processes, and thus offer implementation advice to customers on how to meet their various business scenarios. Job Title – GCP BigQuery Engineer Location: Pune/Bangalore/Mumbai Educational Background – BE/Btech Key Responsibilities – Must Have Skills Should have 4-8 Years of Exp Data Quality Management (DQM) Specialist - Quick Base Location:** Mumbai, Pune, Bangalore We are seeking a highly skilled and experienced Data Quality Management (DQM) Specialist with expertise in Quick Base to join our team. Hands on experience in implementing and managing data quality initiatives, with a strong focus on utilizing Quick Base as a data quality management tool. As a DQM Specialist, you will be responsible for developing, implementing, and maintaining data quality processes and standards using Quick Base to ensure the accuracy, completeness, and consistency of our organization's data assets Experience with data modeling, schema design, and optimization techniques in BigQuery. Hands-on experience with GCP services such as Cloud Dataflow, Cloud Storage, Data Transfer Service, and Data Studio.

Posted 21 hours ago

Apply

5.0 - 7.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Linkedin logo

You are passionate about quality and how customers experience the products you test. You have the ability to create, maintain and execute test plans in order to verify requirements. As a Quality Engineer at Equifax, you will be a catalyst in both the development and the testing of high priority initiatives. You will develop and test new products to support technology operations while maintaining exemplary standards. As a collaborative member of the team, you will deliver QA services (code quality, testing services, performance engineering, development collaboration and continuous integration). You will conduct quality control tests in order to ensure full compliance with specified standards and end user requirements. You will execute tests using established plans and scripts; documents problems in an issues log and retest to ensure problems are resolved. You will create test files to thoroughly test program logic and verify system flow. You will identify, recommend and implement changes to enhance effectiveness of QA strategies. What You Will Do Independently develop scalable and reliable automated tests and frameworks for testing software solutions. Specify and automate test scenarios and test data for a highly complex business by analyzing integration points, data flows, personas, authorization schemes and environments Develop regression suites, develop automation scenarios, and move automation to an agile continuous testing model. Pro-actively and collaboratively taking part in all testing related activities while establishing partnerships with key stakeholders in Product, Development/Engineering, and Technology Operations. What Experience You Need Bachelor's degree in a STEM major or equivalent experience 5-7 years of software testing experience Able to create and review test automation according to specifications Ability to write, debug, and troubleshoot code in Java, Springboot, TypeScript/JavaScript, HTML, CSS Creation and use of big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others with respect to software validation Created test strategies and plans Led complex testing efforts or projects Participated in Sprint Planning as the Test Lead Collaborated with Product Owners, SREs, Technical Architects to define testing strategies and plans. Design and development of micro services using Java, Springboot, GCP SDKs, GKE/Kubeneties Deploy and release software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs Cloud Certification Strongly Preferred What Could Set You Apart An ability to demonstrate successful performance of our Success Profile skills, including: Attention to Detail - Define test case candidates for automation that are outside of product specifications. i.e. 
Negative Testing; Create thorough and accurate documentation of all work including status updates to summarize project highlights; validating that processes operate properly and conform to standards Automation - Automate defined test cases and test suites per project Collaboration - Collaborate with Product Owners and development team to plan and and assist with user acceptance testing; Collaborate with product owners, development leads and architects on functional and non-functional test strategies and plans Execution - Develop scalable and reliable automated tests; Develop performance testing scripts to assure products are adhering to the documented SLO/SLI/SLAs; Specify the need for Test Data types for automated testing; Create automated tests and tests data for projects; Develop automated regression suites; Integrate automated regression tests into the CI/CD pipeline; Work with teams on E2E testing strategies and plans against multiple product integration points Quality Control - Perform defect analysis, in-depth technical root cause analysis, identifying trends and recommendations to resolve complex functional issues and process improvements; Analyzes results of functional and non-functional tests and make recommendation for improvements; Performance / Resilience: Understanding application and network architecture as inputs to create performance and resilience test strategies and plans for each product and platform. Conducting the performance and resilience testing to ensure the products meet SLAs / SLOs Quality Focus - Review test cases for complete functional coverage; Review quality section of Production Readiness Review for completeness; Recommend changes to existing testing methodologies for effectiveness and efficiency of product validation; Ensure communications are thorough and accurate for all work documentation including status and project updates Risk Mitigation - Work with Product Owners, QE and development team leads to track and determine prioritization of defects fixes

Posted 22 hours ago

Apply

3.0 years

0 Lacs

India

On-site

Linkedin logo

The client is looking to hire Nursing & Healthcare Assistants for their Rehabilitation Centre in Oman. Candidates willing to relocate to Oman may apply. Below are the vacancies we have for this project: ✓ Nursing Assistants Qualification: GNM (or) BSc – Nursing Experience: 3 years Positive dataflow reports are Mandatory ✓ Healthcare Assistant Qualification: GNM (or) BSc – Nursing Experience: 3 years Positive dataflow reports are Mandatory Other benefits: • Free Joining Ticket (Will be reimbursed after the 3 months Probation period) • 30 Days paid Annual leave after 1 year of service completion • Yearly Up and Down Air Ticket • Medical Insurance • Life Insurance • Accommodation (Charged at a nominal fee)

Posted 23 hours ago

Apply

3.0 years

0 Lacs

Vadodara, Gujarat

On-site

Indeed logo

M3J Technical Services is seeking a Data Integration & Reporting Analyst adept at automating reports, extracting and cleansing data, and crafting impactful visualizations for KPI’s using tools like Power BI and Excel. You'll develop data-driven applications for desktop, web, and mobile platforms, ensuring our business remains agile and poised for growth. If you're passionate about leveraging data to drive strategic solutions, join our team! Local candidates based in Vadodara, Gujarat preferred. Responsibilities: Collaborate with stakeholders to design and publish Power BI reports aligned with business goals. Analyze and understand business processes to develop reports tailored to specific operational needs. Prepare and transform data from sources such as SQL Server, Excel, and SharePoint using Microsoft Fabric tools, including Dataflow Gen 2, Power Query, Lakehouse, and other related tools. Develop data models and optimize report performance, including row-level security. Maintain clear documentation and provide user training and support for Power BI. Actively contribute to process improvement initiatives by leveraging the Microsoft Power Platform (Power Apps, Power Automate, SharePoint) to enhance data collection and workflow automation. Qualifications: Bachelor’s degree in Computer Science, Industrial Engineering, Data Science, or Related field; or equivalent work experience. Solid understanding of BI concepts and data visualization best practices. 3+ years of hands-on experience with Power BI development. Strong skills in DAX, Power Query (M), and data modeling. Proficient in SQL and working with relational databases. 5+ years of working experience with Excel and Power Query. Experience with Fabric and other data integration tools. High attention to detail and the ability to work independently. Strong analytical and organizational skills. Excellent written and verbal communication skills. Results-oriented, proactive, and possessing a high level of integrity. Microsoft Certified: Power BI Data Analyst Associate is a plus Preferred Qualifications: Experience with Power BI Service administration (fabric, dataflows, workspaces, security, and dataset refresh). Familiarity with Microsoft Fabric, Power Automate, or SharePoint. Able to work independently and take initiative to improve data collection and reporting using modern tools and best practices. Language Requirement: Fluent in English Schedule: Monday to Friday Working Hours: 8am to 5pm Central US Time, or must be available to work at least 4 hours of the day during 8am to 4pm US Central Time Zone. Work Location: Vadodara, Gujarat, India (Preference will be given to candidates located in Vadodara, Gujarat.) Job Types: Full-time, Permanent Benefits: Flexible schedule Paid sick time Paid time off Schedule: Monday to Friday Supplemental Pay: Yearly bonus Application Question(s): Please share with us your desired salary. Have you implemented reuse of dataflows across multiple reports or workspaces? Would you be open to presenting a Power BI report you developed professionally — and explaining your approach to data connection, transformation, and solving performance or business logic challenges? Work Location: In person

Posted 1 day ago

Apply

3.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Role Description Hiring Locations: Chennai, Trivandrum, Kochi Experience Range: 3 to 6 years Role Description The L1 Data Ops Analyst / Data Pipeline Developer is responsible for developing, testing, and maintaining robust data pipelines and monitoring operational dashboards to ensure smooth data flow. This role demands proficiency in data engineering tools, SQL, and cloud platforms, with the ability to work independently and in 24x7 shift environments. The candidate should be capable of analyzing data, troubleshooting issues using SOPs, and collaborating effectively across support levels. Key Responsibilities Development & Engineering: Design, code, test, and implement scalable and efficient data pipelines. Develop features in accordance with requirements and low-level design. Write optimized, clean code using Python, PySpark, SQL, and ETL tools. Conduct unit testing and validate data integrity. Maintain comprehensive documentation of work. Monitoring & Support Monitor dashboards, pipelines, and databases across assigned shifts. Identify, escalate, and resolve anomalies using defined SOPs. Collaborate with L2/L3 teams to ensure timely issue resolution. Analyze trends and anomalies using SQL and Excel. Process Adherence & Contribution Follow configuration and release management processes. Participate in estimation, knowledge sharing, and defect management. Adhere to SLA and compliance standards. Contribute to internal documentation and knowledge bases. Mandatory Skills Strong command of SQL for data querying and analysis. Proficiency in Python or PySpark for data manipulation. Experience in ETL tools (any of the following): Informatica, Talend, Apache Airflow, AWS Glue, Azure ADF, GCP DataProc/DataFlow. Experience working with cloud platforms (AWS, Azure, or GCP). Hands-on experience with data validation and performance tuning. Working knowledge of data schemas and data modeling. Good To Have Skills Certification in Azure, AWS, or GCP (foundational or associate level). Familiarity with monitoring tools and dashboard platforms. Understanding of data warehouse concepts. Exposure to BigQuery, ADLS, or similar services. Soft Skills Excellent written and verbal communication in English. Strong attention to detail and analytical skills. Ability to work in a 24x7 shift model, including night shifts. Ability to follow SOPs precisely and escalate issues appropriately. Self-motivated with minimal supervision. Team player with good interpersonal skills. Outcomes Expected Timely and error-free code delivery. Consistent adherence to engineering processes and release cycles. Documented and trackable issue handling with minimal escalations. Certification and training compliance. High availability and uptime of monitored pipelines and dashboards. Skills Sql,Data Analysis,Ms Excel,Dashboards

Posted 1 day ago

Apply

7.0 years

4 - 7 Lacs

Thiruvananthapuram

On-site

GlassDoor logo

Trivandrum India Technology Full time 6/23/2025 J00167993 Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. Equifax is seeking a Software Engineering Lead to spearhead innovative batch and data product development using Java and Google Cloud Platform. This exciting role offers the opportunity to lead cutting-edge projects, mentor a team of developers, and drive the implementation of high-impact data solutions. As a key player in Equifax's technology team, you'll leverage your expertise to shape the future of data engineering while working with the latest cloud and AI technologies. The position combines technical expertise with leadership skills to drive the development and deployment of advanced batch and data products. Key Responsibilities: Oversee the design, development, and deployment of innovative batch and data products Provide technical direction for Java and GCP implementations Lead and mentor a team of developers and engineers Collaborate with cross-functional teams to translate requirements into technical solutions Implement rigorous testing strategies and optimize performance Maintain documentation and ensure compliance with standards What experience you need: Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field Proficiency in Java and its ecosystems Extensive experience with Google Cloud Platform, including GKE, Cloud Storage, Dataflow, BigQuery Minimum of 7 years in software development, focusing on batch processing and data solutions Exceptional communication and teamwork skills Experience with securely handling sensitive data (PII / PHI) Proven track record of writing defect-free code At least 3 years in a lead role What could set you apart Ability to convey complex technical concepts to non-technical stakeholders Relevant certifications (e.g., Google Cloud Professional Developer, Professional Data Engineer) Experience with multiple cloud platforms (e.g., AWS, Azure) in addition to GCP Proficiency in both Java and Python for versatile development and capabilities Experience in optimizing performance and reducing costs in cloud environments We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference! Who is Equifax? At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. 
Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Posted 1 day ago

Apply

4.0 - 7.0 years

24 Lacs

India

Remote

GlassDoor logo

Vacancy with a company focused on digital transformation, specializing in intelligent automation, digitalization, data science & analytics, and mobile enablement. They help businesses improve cost efficiency, productivity, and agility by reducing turnaround time and errors. The company provides services and solutions including operations digital transformation consulting, next-gen shared services setup consulting, cognitive RPA deployment, and AI-enabled CX enhancement. Founded in 2020 ;with HQ in Gurugram, India; the Company is now operating from Noida, Mumbai, Hyderabad, and Bengaluru as well. Job Role: Bigdata, GCP Years Of Experience 4 to 7 Years The candidate should have extensive production experience (1-2 Years ) in GCP, Other cloud experience would be a strong bonus. - Strong background in Data engineering 4-5 Years of exp in Big Data technologies including, Hadoop, NoSQL, Spark, Kafka etc. - Exposure to Production application is a must and Operating knowledge of cloud computing platforms (GCP, especially Big Query, Dataflow, Dataproc, Storage, VMs, Networking, Pub Sub, Cloud Functions, Composer servics) Job Types: Full-time, Permanent Pay: Up to ₹2,464,248.21 per year Benefits: Cell phone reimbursement Internet reimbursement Life insurance Paid sick time Paid time off Work from home Work Location: In person

Posted 1 day ago

Apply

10.0 - 15.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Experience: 10 to 15 years Location: Bengaluru, Gurgaon, Pune About Us: AceNet Consulting is a fast-growing global business and technology consulting firm specializing in business strategy, digital transformation, technology consulting, product development, start-up advisory and fund-raising services to our global clients across banking & financial services, healthcare, supply chain & logistics, consumer retail, manufacturing, eGovernance and other industry sectors. We are looking for hungry, highly skilled and motivated individuals to join our dynamic team. If you’re passionate about technology and thrive in a fast-paced environment, we want to hear from you. Job Summary : The data architect is responsible for designing, creating, and managing an organization’s data architecture. This role is critical in establishing a solid foundation for data management within an organization, ensuring that data is organized, structured, accessible, secure, and aligned with business objectives. Key Responsibilities: *Interact & Influence business stakeholders to secure strong engagement and ensures that the data & analytical product delivery aligns with longer-term strategic roadmaps. *Design & contribute towards the structure and layout of lake house architecture optimizing data storage, and establishing data access controls and security measures. *Implement the long-term Data & Analytics strategy and deliver functional objectives. *Assess requirement feasibility, translates high-level business requirements into data requirements, appropriate metadata, test data, and data quality standards. *Explore Data Sources by working with Application owners to confirm datasets to be extracted. *Contribute to establishing and implementing database structure, including schema design, table definitions, column specifications, and naming conventions. *Design Data models for Source data products, Master data products & Insight data products. *Document Data Architecture artifacts for different Data Products and solutions and perform peer review across various functions. *Support Data Engineering and BI Engineering teams during the build phase. *Review Data models development, validate and provide deployment approval. *Work closely with data stewards and governance functions to continuously improve data quality and enhance the reliability of data model(s). *Simplify the existing data architecture, delivering reusable services and cost-saving opportunities in line with the policies and standards of the company. *Leads and participates in the peer review and quality assurance of project architectural artifacts across the Data Architecture group through governance forums. *Collaborate and contribute to the development and enhancement of standards, guidelines, and best practices within Data Architecture discipline. *Works with Product owners, Business stewards, Data Scientists and end users to understand data consumers’ needs and develop data products/ data solutions. *Evaluates and recommends emerging technologies for data management, storage, and analytics. Role Requirements and Qualifications: *A bachelor’s degree in computer science, data science, engineering, or related field. *At least five years of relevant experience in design and implementation of data models for enterprise data warehouse initiatives. *Translate business requirements and ability to guide solution design & architecture in developing Data Products. 
*Develop scalable, high-performance, and reusable data models that can be efficiently utilized across different data initiatives and help in generating actionable insights. *Work collaboratively with data stewardship and governance functions to continuously improve data quality and enhance the reliability of data models. *Ability to navigate and collaborate with cross-functional teams involving data scientists, business analysts, and stakeholders. *Strong Business Process and Functional understanding with an Analytical background. *CPG experience with knowledge in domain specific concepts is a plus. *Knowledge on Agile methodologies with experience working on tools such as Jira & Confluence. *Proficiency in the design and implementation of modern data architectures and concepts such as cloud services (AWS, Azure, GCP), real-time data distribution (Kafka, Dataflow), and modern data warehouse tools (Snowflake). *Experience with database technologies such as SQL, NoSQL, Snowflake, HANA. *Understanding of entity-relationship modeling, metadata systems, and data quality tools and techniques. *Experience building enterprise data models (Logical, Physical, Conceptual) and data modeling tool experience a plus (ERWIN, ER/STUDIO, etc.) *Strong Business Process and SAP functional understanding with an analytics background (preferred SAP ECC/S4, BW, HANA, BI, ARIBA experience) is a plus. *Expert-level SQL skills. *Experience with enterprise scale data engineering orchestration frameworks/ELT tools and common data engineering Python libraries (dbt, pandas, great expectations, etc.) is a plus. *Experience with business intelligence tools and technologies such as Power BI & Tableau. *Strong analytical and problem-solving skills. *Understanding of Data Governance principles and practices including Data Quality, Data Security and compliance. *Ability to think strategically on the use of data within the Organization that support both current and future needs. *Excellent communication and interpersonal skills for stakeholder management and cross-functional collaboration. Why Join Us: *Opportunities to work on transformative projects, cutting-edge technology and innovative solutions with leading global firms across industry sectors. *Continuous investment in employee growth and professional development with a strong focus on up & re-skilling. *Competitive compensation & benefits, ESOPs and international assignments. *Supportive environment with healthy work-life balance and a focus on employee well-being. *Open culture that values diverse perspectives, encourages transparent communication and rewards contributions.

Posted 1 day ago

Apply

10.0 years

0 Lacs

Kochi, Kerala, India

On-site

Linkedin logo

Immediate Joiner Kindly share your CV here: joinus@amussoft.com Qualifications Expertise in Data Architecture and Data Modeling 10+ years of experience in data architecture, with at least 3 years in GCP environments. Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services. Strong experience in data warehousing, data lakes, and real-time data pipelines. Proficiency in SQL, Python, or other data processing languages. Experience with cloud security, data governance, and compliance frameworks. Google Cloud Certification (Professional Data Engineer, Professional Cloud Architect) preferred. Excellent communication and collaboration skills.

Posted 1 day ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. Reporting to the A/NZ DSE Chapter Manager India PEC within Decision Sciences & Engineering, this role will own and be responsible for the data & analytic engineering chapter in India PEC. The Data Engineer is an essential part of the business that enables the team to support the ongoing acquisition and internal purposing of data, through to the fulfilment of products, insights and systems. As a Data Engineer, you will be responsible for working with our internal customers to ensure that data and systems are being designed and built to move and manipulate data in a scalable, reusable and efficient manner to suit the environment, project, security and requirements. What You’ll Do Design, architect, and implement scalable and secure data pipelines on GCP, utilizing services like Dataflow, Pub/Sub, and Cloud Storage. Develop and maintain data models, ensuring data quality, consistency, and accessibility for various internal stakeholders. Automate data processes and workflows using scripting languages like Python, leveraging technologies like Spark and Airflow. Monitor and troubleshoot data pipelines, identifying and resolving performance issues proactively. Stay up-to-date with the latest trends and advancements in GCP and related technologies, actively proposing and evaluating new solutions. Implement data governance best practices, including data security, access control, and lineage tracking. Lead security initiatives, design and implement security architecture. Lead data quality initiatives, design and implement monitoring dashboards. Mentor and guide junior data engineers, sharing knowledge and best practices to foster a high-performing team. Role requires a solid educational foundation and the ability to develop a strategic vision and roadmap for D&A’s transition to the cloud while balancing delivery of near-term results that are aligned with execution. What Experience You Need BS degree in a STEM major or equivalent discipline; Master’s Degree strongly preferred 8+ years of experience as a data engineer or related role, with experience demonstrating leadership capabilities Cloud certification strongly preferred Expert level skills using programming languages such as Python or SQL (Big Query) and advanced level experience with scripting languages. Demonstrated proficiency in all Google Cloud Services Experience building and maintaining complex data pipelines, troubleshooting complex issues, transforming and entering data into a data pipeline in order for the content to be digested and usable for future projects; Proficiency in Airflow strongly desired Experience designing and implementing advanced to complex data models and experience enabling advanced optimization to improve performance Experience leading a team with Git expertise strongly preferred Hands on Experience on Agile Methodoligies Working Knowledge of CI/CD What Could Set You Apart Self-starter that identifies/responds to priority shifts with minimal supervision. 
Strong communication and presentation skills Strong leadership qualities A well-balanced view of resource management, thinking creatively and effectively to deploy the team whilst building skills for the future Skilled in internal networking, negotiating and proactively developing individuals and teams to be the best they can be Strong communicator & presenter, bringing everyone on the journey. Knowledge of Big Data technology and tools with the ability to share ideas among a collaborative team and drive the team based on technical expertise and learning, sharing best practices Excellent communication skills to engage with senior management, internal customers and product management Sound understanding of regulations and security requirements governing access to data in Big Data systems Sound understanding of Insight delivery systems for batch and online Should be able to run Agile Scrum-based projects Demonstrated problem solving skills and the ability to resolve conflicts Experience creating and maintaining product and software roadmaps Experience overseeing yearly as well as product/project budgets Working in a highly regulated environment We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference! Who is Equifax? At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
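For candidates preparing for roles like this, here is a minimal, illustrative Airflow sketch of the kind of pipeline automation described above: a daily GCS-to-BigQuery load followed by a Python quality check. All project, bucket, and table names are hypothetical placeholders rather than the employer's actual stack, and the apache-airflow-providers-google package is assumed to be installed.

```python
# Hypothetical DAG sketch, not a production pipeline: load a daily CSV drop
# from Cloud Storage into BigQuery, then run a simple Python quality check.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)


def check_row_count(**context):
    # Placeholder check; a real task might query BigQuery and fail the run
    # if the loaded table is empty or violates expectations.
    print("row-count check would run here")


with DAG(
    dag_id="daily_gcs_to_bq_load",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",              # "schedule" in newer Airflow releases
    catchup=False,
) as dag:
    load_csv = GCSToBigQueryOperator(
        task_id="load_csv_to_bq",
        bucket="example-landing-bucket",                     # placeholder bucket
        source_objects=["exports/sales_*.csv"],              # placeholder objects
        destination_project_dataset_table="my-project.analytics.sales_raw",
        source_format="CSV",
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    )

    quality_check = PythonOperator(
        task_id="row_count_check",
        python_callable=check_row_count,
    )

    load_csv >> quality_check
```

In a Cloud Composer environment the same DAG file would simply be dropped into the environment's DAGs bucket; the structure (ingest task, then validation task) is what interviewers typically probe.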

Posted 1 day ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Job Title: GCP Data Engineer Location: Chennai (Hybrid Work Mode) Experience: 5+ Years with relevant 4 years in GCP Budget - 30+ LPA Joining: Immediate Joiners Preferred – July 1st or 2nd Week Employment Type: Full-Time Client Type: Product and Service-Based Clients
Job Description: We are looking for an experienced GCP Data Engineer to join our team for an engagement with a leading product and service-based client. The position is based in Chennai with a hybrid work model (3 days WFO). The ideal candidate should be able to join by the first or second week of July.
Key Responsibilities: Build, manage, and optimize robust data pipelines using GCP tools such as BigQuery, Dataflow, Composer, and Pub/Sub. Design scalable and efficient data models to support analytics and reporting. Develop data engineering solutions using SQL and Python. Implement workflow orchestration using Airflow or similar tools. Work closely with DevOps for CI/CD integration, version control, and cloud-native deployment. Ensure data integrity, security, and performance across distributed systems.
Required Skills: Hands-on experience with GCP services: BigQuery, Dataflow, Composer, Pub/Sub. Strong proficiency in SQL, Python, and data modeling. Experience with Airflow or similar orchestration tools. Familiarity with CI/CD pipelines, Git, and cloud-native architectures.
Nice to Have: Exposure to Terraform, Looker, or Kafka. Experience with Docker and DevOps practices. Background in MLOps or working with ML pipelines.
Educational Qualifications: Must have completed B.E / B.Tech. If the candidate holds a Postgraduate degree (PG), it should be in regular (full-time) mode only.
Additional Information: Location: Chennai (Hybrid – 3 days WFO) Client Type: Product and Service-Based Notice Period: Immediate joiners preferred (able to join by July 1st or 2nd week)
If you're passionate about building cloud-based data solutions and meet the above criteria, we invite you to apply and be part of a forward-thinking team. 📩 Interested? Drop your resume at gomathi@reveilletechnologies.com 📢 Tag your connections or share this post if you know someone who fits! #DataEngineer #GCP #BigQuery #CloudJobs #Python #ETL #Dataflow #HiringNow
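To give applicants a concrete sense of the BigQuery plus Python work this role describes, here is a small illustrative sketch that loads a CSV from Cloud Storage into BigQuery and then runs a SQL transformation. The project, bucket, and table names are made-up placeholders, and the google-cloud-bigquery client library with Application Default Credentials is assumed.

```python
# Illustrative ingest-and-transform step with the BigQuery Python client.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

# 1) Load a CSV export from Cloud Storage into a raw table.
load_job = client.load_table_from_uri(
    "gs://example-landing-bucket/exports/orders.csv",   # placeholder URI
    "my-project.analytics.orders_raw",                  # placeholder table
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    ),
)
load_job.result()  # wait for the load to finish

# 2) Transform into a reporting table with standard SQL.
transform_sql = """
CREATE OR REPLACE TABLE `my-project.analytics.orders_daily` AS
SELECT DATE(order_ts) AS order_date,
       COUNT(*)       AS orders,
       SUM(amount)    AS revenue
FROM `my-project.analytics.orders_raw`
GROUP BY order_date
"""
client.query(transform_sql).result()
```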

Posted 1 day ago

Apply

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Greetings from Getronics! We have multiple opportunities for Senior GCP Data Engineers for our automotive client in Chennai location.
Position Description: The Data Analytics team is seeking a GCP Data Engineer to create, deliver, and support custom data products, as well as enhance/expand team capabilities. They will work on analysing and manipulating large datasets supporting the enterprise by activating data assets to support Enabling Platforms and analytics. Google Cloud Data Engineers will be responsible for designing the transformation and modernization on the Google Cloud Platform. Experience with large-scale solutioning and operationalization of data warehouses, data lakes and analytics platforms on Google Cloud Platform is a must. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions with an appropriate combination of Google Cloud Platform and 3rd-party technologies for deploying on the Google Cloud Platform.
Skills Required: Work as part of an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment. Implement methods for automation of all parts of the pipeline to minimize labor in development and production. This includes designing and deploying a pipeline with automated data lineage. Identify, develop, evaluate and summarize Proof of Concepts to prove out solutions. Test and compare competing solutions and report out a point of view on the best solution. Integration between GCP Data Catalog and Informatica EDC. Design and build production data engineering solutions to deliver our pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
Experience Required: 6+ years of experience in Data Engineering, and a minimum of 3+ years on Google Cloud Platform is a must.
Education Required: Any Bachelor's degree (preferably Engineering Graduate)
Additional Information: Willing to work in Hybrid mode (3 days a week) in Chennai - Client location. Looking for Immediate to 30 days' notice candidates only. Candidate should be willing to attend a GCP coding assessment (1-hour online video coding) as the 1st level of interview. Interested candidates, please share your resume to abirami.rsk@getronics.com
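Since the role names Pub/Sub among its core services and screens candidates with a coding assessment, a short illustrative sketch of publishing a JSON event to a Cloud Pub/Sub topic with the Python client follows. The project and topic names are hypothetical, and the google-cloud-pubsub library is assumed to be installed and authenticated.

```python
# Illustrative publish of a single JSON event to a Pub/Sub topic.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
# Placeholder project and topic names, not real resources.
topic_path = publisher.topic_path("my-project", "vehicle-telemetry")

event = {"vin": "TEST123", "speed_kmph": 62, "ts": "2024-01-01T10:00:00Z"}

# Pub/Sub payloads are bytes; extra keyword args become string attributes.
future = publisher.publish(
    topic_path,
    data=json.dumps(event).encode("utf-8"),
    source="demo-script",
)
print("published message id:", future.result())
```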

Posted 1 day ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Linkedin logo

Skills: Python, Spark, Data Engineer, Cloudera, On-premise, Azure, Snowflake, Kafka
Overview Of The Company Jio Platforms Ltd. is a revolutionary Indian multinational tech company, often referred to as India's biggest startup, headquartered in Mumbai. Launched in 2019, it's the powerhouse behind Jio, India's largest mobile network with over 400 million users. But Jio Platforms is more than just telecom. It's a comprehensive digital ecosystem, developing cutting-edge solutions across media, entertainment, and enterprise services through popular brands like JioMart, JioFiber, and JioSaavn. Join us at Jio Platforms and be part of a fast-paced, dynamic environment at the forefront of India's digital transformation. Collaborate with brilliant minds to develop next-gen solutions that empower millions and revolutionize industries.
Team Overview The Data Platforms Team is the launchpad for a data-driven future, empowering the Reliance Group of Companies. We're a passionate group of experts architecting an enterprise-scale data mesh to unlock the power of big data, generative AI, and ML modelling across various domains. We don't just manage data; we transform it into intelligent actions that fuel strategic decision-making. Imagine crafting a platform that automates data flow, fuels intelligent insights, and empowers the organization: that's what we do. Join our collaborative and innovative team, and be a part of shaping the future of data for India's biggest digital revolution!
About the Role Title: Lead Data Engineer Location: Mumbai
Responsibilities End-to-End Data Pipeline Development: Design, build, optimize, and maintain robust data pipelines across cloud, on-premises, or hybrid environments, ensuring performance, scalability, and seamless data flow. Reusable Components & Frameworks: Develop reusable data pipeline components and contribute to the team's data pipeline framework evolution. Data Architecture & Solutions: Contribute to data architecture design, applying data modelling, storage, and retrieval expertise. Data Governance & Automation: Champion data integrity, security, and efficiency through metadata management, automation, and data governance best practices. Collaborative Problem Solving: Partner with stakeholders, data teams, and engineers to define requirements, troubleshoot, optimize, and deliver data-driven insights. Mentorship & Knowledge Transfer: Guide and mentor junior data engineers, fostering knowledge sharing and professional growth.
Qualification Details Education: Bachelor's degree or higher in Computer Science, Data Science, Engineering, or a related technical field. Core Programming: Excellent command of a primary data engineering language (Scala, Python, or Java) with a strong foundation in OOP and functional programming concepts. Big Data Technologies: Hands-on experience with data processing frameworks (e.g., Hadoop, Spark, Apache Hive, NiFi, Ozone, Kudu), ideally including streaming technologies (Kafka, Spark Streaming, Flink, etc.). Database Expertise: Excellent querying skills (SQL) and strong understanding of relational databases (e.g., MySQL, PostgreSQL). Experience with NoSQL databases (e.g., MongoDB, Cassandra) is a plus. End-to-End Pipelines: Demonstrated experience in implementing, optimizing, and maintaining complete data pipelines, integrating varied sources and sinks including streaming real-time data. Cloud Expertise: Knowledge of cloud technologies like Azure HDInsight, Synapse, Event Hubs and GCP Dataproc, Dataflow, BigQuery. 
CI/CD Expertise: Experience with CI/CD methodologies and tools, including strong Linux and shell scripting skills for automation. Desired Skills & Attributes Problem-Solving & Troubleshooting: Proven ability to analyze and solve complex data problems, troubleshoot data pipeline issues effectively. Communication & Collaboration: Excellent communication skills, both written and verbal, with the ability to collaborate across teams (data scientists, engineers, stakeholders). Continuous Learning & Adaptability: A demonstrated passion for staying up-to-date with emerging data technologies and a willingness to adapt to new tools.
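For context on the streaming skills this posting lists (Kafka plus Spark Streaming), here is a compact, illustrative PySpark Structured Streaming sketch that consumes JSON events from Kafka and lands them as Parquet. The broker address, topic, schema, and paths are placeholders, and the spark-sql-kafka connector is assumed to be available on the cluster.

```python
# Illustrative Kafka-to-Parquet streaming job with Spark Structured Streaming.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("kafka_events_demo").getOrCreate()

# Hypothetical event schema for the JSON payloads.
schema = StructType([
    StructField("user_id", StringType()),
    StructField("action", StringType()),
    StructField("event_ts", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
    .option("subscribe", "user-events")                 # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka values arrive as bytes; cast to string and parse the JSON payload.
events = (
    raw.select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "/data/events")                     # placeholder sink path
    .option("checkpointLocation", "/data/checkpoints/events")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```

The checkpoint location is what gives the job exactly-once file output across restarts, which is a common follow-up question for streaming roles like this one.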

Posted 1 day ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Linkedin logo

Skills: Data Engineer, Spark, Scala, Python, On-premise, Cloudera, Snowflake, Kafka
Overview Of The Company Jio Platforms Ltd. is a revolutionary Indian multinational tech company, often referred to as India's biggest startup, headquartered in Mumbai. Launched in 2019, it's the powerhouse behind Jio, India's largest mobile network with over 400 million users. But Jio Platforms is more than just telecom. It's a comprehensive digital ecosystem, developing cutting-edge solutions across media, entertainment, and enterprise services through popular brands like JioMart, JioFiber, and JioSaavn. Join us at Jio Platforms and be part of a fast-paced, dynamic environment at the forefront of India's digital transformation. Collaborate with brilliant minds to develop next-gen solutions that empower millions and revolutionize industries.
Team Overview The Data Platforms Team is the launchpad for a data-driven future, empowering the Reliance Group of Companies. We're a passionate group of experts architecting an enterprise-scale data mesh to unlock the power of big data, generative AI, and ML modelling across various domains. We don't just manage data; we transform it into intelligent actions that fuel strategic decision-making. Imagine crafting a platform that automates data flow, fuels intelligent insights, and empowers the organization: that's what we do. Join our collaborative and innovative team, and be a part of shaping the future of data for India's biggest digital revolution!
About the Role Title: Senior Data Engineer Location: Mumbai
Responsibilities End-to-End Data Pipeline Development: Design, build, optimize, and maintain robust data pipelines across cloud, on-premises, or hybrid environments, ensuring performance, scalability, and seamless data flow. Reusable Components & Frameworks: Develop reusable data pipeline components and contribute to the team's data pipeline framework evolution. Data Architecture & Solutions: Contribute to data architecture design, applying data modelling, storage, and retrieval expertise. Data Governance & Automation: Champion data integrity, security, and efficiency through metadata management, automation, and data governance best practices. Collaborative Problem Solving: Partner with stakeholders, data teams, and engineers to define requirements, troubleshoot, optimize, and deliver data-driven insights. Mentorship & Knowledge Transfer: Guide and mentor junior data engineers, fostering knowledge sharing and professional growth.
Qualification Details Education: Bachelor's degree or higher in Computer Science, Data Science, Engineering, or a related technical field. Core Programming: Excellent command of a primary data engineering language (Scala, Python, or Java) with a strong foundation in OOP and functional programming concepts. Big Data Technologies: Hands-on experience with data processing frameworks (e.g., Hadoop, Spark, Apache Hive, NiFi, Ozone, Kudu), ideally including streaming technologies (Kafka, Spark Streaming, Flink, etc.). Database Expertise: Excellent querying skills (SQL) and strong understanding of relational databases (e.g., MySQL, PostgreSQL). Experience with NoSQL databases (e.g., MongoDB, Cassandra) is a plus. End-to-End Pipelines: Demonstrated experience in implementing, optimizing, and maintaining complete data pipelines, integrating varied sources and sinks including streaming real-time data. Cloud Expertise: Knowledge of cloud technologies like Azure HDInsight, Synapse, Event Hubs and GCP Dataproc, Dataflow, BigQuery. 
CI/CD Expertise: Experience with CI/CD methodologies and tools, including strong Linux and shell scripting skills for automation. Desired Skills & Attributes Problem-Solving & Troubleshooting: Proven ability to analyze and solve complex data problems, troubleshoot data pipeline issues effectively. Communication & Collaboration: Excellent communication skills, both written and verbal, with the ability to collaborate across teams (data scientists, engineers, stakeholders). Continuous Learning & Adaptability: A demonstrated passion for staying up-to-date with emerging data technologies and a willingness to adapt to new tools.

Posted 1 day ago

Apply

7.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Linkedin logo

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. Equifax is seeking a Software Engineering Lead to spearhead innovative batch and data product development using Java and Google Cloud Platform. This exciting role offers the opportunity to lead cutting-edge projects, mentor a team of developers, and drive the implementation of high-impact data solutions. As a key player in Equifax's technology team, you'll leverage your expertise to shape the future of data engineering while working with the latest cloud and AI technologies. The position combines technical expertise with leadership skills to drive the development and deployment of advanced batch and data products. Key Responsibilities Oversee the design, development, and deployment of innovative batch and data products Provide technical direction for Java and GCP implementations Lead and mentor a team of developers and engineers Collaborate with cross-functional teams to translate requirements into technical solutions Implement rigorous testing strategies and optimize performance Maintain documentation and ensure compliance with standards What Experience You Need Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field Proficiency in Java and its ecosystems Extensive experience with Google Cloud Platform, including GKE, Cloud Storage, Dataflow, BigQuery Minimum of 7 years in software development, focusing on batch processing and data solutions Exceptional communication and teamwork skills Experience with securely handling sensitive data (PII / PHI) Proven track record of writing defect-free code At least 3 years in a lead role What could set you apart Ability to convey complex technical concepts to non-technical stakeholders Relevant certifications (e.g., Google Cloud Professional Developer, Professional Data Engineer) Experience with multiple cloud platforms (e.g., AWS, Azure) in addition to GCP Proficiency in both Java and Python for versatile development and capabilities Experience in optimizing performance and reducing costs in cloud environments We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference! Who is Equifax? At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. 
Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Posted 1 day ago

Apply

8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

About the Role: We are seeking a talented and experienced GCP Data Engineer with 5–8 years of overall experience in data engineering and strong hands-on expertise with Google Cloud Platform services. The ideal candidate will be responsible for building, optimizing, and maintaining scalable, robust, and secure data pipelines that support advanced analytics and business intelligence initiatives.
Key Responsibilities: Design, implement, and maintain scalable data pipelines and ETL/ELT workflows using GCP services like Dataflow, BigQuery, Pub/Sub, and Cloud Functions. Build and manage real-time and batch data processing pipelines with an emphasis on reliability, security, and performance. Leverage BigQuery for high-performance analytical queries and transformations. Implement streaming and event-driven architectures using Pub/Sub and Dataflow. Integrate data pipelines with third-party APIs, cloud storage, and internal systems. Collaborate with product teams, data analysts, and ML engineers to align data engineering solutions with business goals. Ensure data quality, governance, security, and compliance across the platform. Participate in code reviews, design sessions, and technical planning. Create and maintain clear technical documentation and operational guides.
Required Skills & Experience: 5 to 8 years of experience in data engineering, with at least 2+ years of hands-on GCP experience. Proficient with key GCP data services: BigQuery, Dataflow (Apache Beam), Pub/Sub, and Cloud Functions. Strong coding skills in Python or Java, and advanced SQL skills. Experience with REST APIs, data ingestion, streaming, and batch processing. Strong understanding of data warehousing, data lake architecture, and data modeling principles. Familiarity with CI/CD tools (e.g., Jenkins, GitHub Actions) and DevOps practices. Exposure to Docker and cloud deployment workflows. Experience working with PostgreSQL, MongoDB, or other relational/NoSQL databases.
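To illustrate the Pub/Sub and Dataflow (Apache Beam) streaming pattern this role calls for, a minimal Beam sketch follows: it reads JSON messages from a Pub/Sub subscription, applies fixed windows, and appends rows to BigQuery. The subscription, table, and schema are hypothetical, apache-beam[gcp] is assumed to be installed, and real Dataflow deployment options (runner, region, staging buckets) are omitted.

```python
# Illustrative streaming pipeline: Pub/Sub -> parse JSON -> window -> BigQuery.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms.window import FixedWindows

options = PipelineOptions(streaming=True)  # add Dataflow runner options to deploy

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadPubSub" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/orders-sub"  # placeholder
        )
        | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "Window" >> beam.WindowInto(FixedWindows(60))  # 60-second windows
        | "WriteBQ" >> beam.io.WriteToBigQuery(
            "my-project:analytics.orders_stream",        # placeholder table
            schema="order_id:STRING,amount:FLOAT,order_ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```

Being able to explain why the write disposition is append-only and what the windowing buys you (bounded state, late-data handling hooks) is exactly the kind of discussion this role's interviews tend to cover.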

Posted 1 day ago

Apply

Exploring Dataflow Jobs in India

The dataflow job market in India is currently experiencing a surge in demand for skilled professionals. With the increasing reliance on data-driven decision-making in various industries, the need for individuals proficient in managing and analyzing dataflow is on the rise. This article aims to provide job seekers with valuable insights into the dataflow job landscape in India.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Delhi

These cities are known for their thriving tech ecosystems and are home to numerous companies actively hiring for dataflow roles.

Average Salary Range

The average salary range for dataflow professionals in India varies based on experience level. Entry-level professionals can expect to earn between INR 4-6 lakhs per annum, while experienced professionals can command salaries upwards of INR 12-15 lakhs per annum.

Career Path

In the dataflow domain, a typical career path may involve starting as a Junior Data Analyst or Data Engineer, progressing to roles such as Senior Data Scientist or Data Architect, and eventually reaching positions like Tech Lead or Data Science Manager.

Related Skills

In addition to expertise in dataflow tools and technologies, dataflow professionals are often expected to have proficiency in programming languages such as Python or R, knowledge of databases like SQL, and familiarity with data visualization tools like Tableau or Power BI.

Interview Questions

  • What is dataflow and how is it different from data streaming? (basic)
  • Explain the difference between batch processing and real-time processing. (medium)
  • How do you handle missing or null values in a dataset? (basic; a pandas sketch follows this list)
  • Can you explain the concept of data lineage? (medium)
  • What is the importance of data quality in dataflow processes? (basic)
  • How do you optimize dataflow pipelines for performance? (medium)
  • Describe a time when you had to troubleshoot a dataflow issue. (medium)
  • What are some common challenges faced in dataflow projects? (medium)
  • How do you ensure data security and compliance in dataflow processes? (medium)
  • What are the key components of a dataflow architecture? (medium)
  • Explain the concept of data partitioning in dataflow. (advanced)
  • How would you handle a sudden increase in data volume in a dataflow pipeline? (advanced)
  • What role does data governance play in dataflow processes? (medium)
  • Can you discuss the advantages and disadvantages of using cloud-based dataflow solutions? (medium)
  • How do you stay updated with the latest trends and technologies in dataflow? (basic)
  • What is the significance of metadata in dataflow management? (medium)
  • Walk us through a dataflow project you have worked on from start to finish. (medium)
  • How do you ensure data quality and consistency across different data sources in a dataflow pipeline? (medium)
  • What are some best practices for monitoring and troubleshooting dataflow pipelines? (medium)
  • How do you handle data transformations and aggregations in a dataflow process? (basic)
  • What are the key performance indicators you would track in a dataflow project? (medium)
  • How do you collaborate with cross-functional teams in a dataflow project? (basic)
  • Can you explain the concept of data replication in dataflow management? (advanced)
  • How do you approach data modeling in a dataflow project? (medium)
  • Describe a challenging dataflow problem you encountered and how you resolved it. (advanced)
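
For the missing/null-value question flagged above, here is a short pandas sketch of one defensible approach: profile the gaps, drop rows only when the key field is missing, and impute the rest. The column names and values are invented purely for illustration.

```python
# Illustrative null-handling workflow with pandas.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [1, 2, 3, 4],
    "region": ["south", None, "west", "north"],
    "spend": [120.0, 95.5, None, 40.0],
})

# 1) Profile the gaps before deciding how to treat them.
print(df.isna().sum())

# 2) Drop rows only when the essential key field is missing.
df = df.dropna(subset=["customer_id"])

# 3) Impute the rest with a defensible default: median for numeric fields,
#    an explicit "unknown" label for categorical fields.
df["spend"] = df["spend"].fillna(df["spend"].median())
df["region"] = df["region"].fillna("unknown")

print(df)
```

In an interview, the stronger answer is the reasoning order (profile, then drop, then impute) and when each choice is appropriate, not any single pandas call.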

Closing Remark

As you navigate the dataflow job market in India, remember to showcase your skills and experiences confidently during interviews. Stay updated with the latest trends in dataflow and continuously upskill to stand out in a competitive job market. Best of luck in your job search journey!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies