3.0 - 5.0 years
5 - 7 Lacs
Chennai, Tamil Nadu
Work from Office
Duration: 12 Months | Work Type: Onsite

Position Description:
We are seeking an experienced GCP Data Engineer who can build a cloud analytics platform to meet ever-expanding business requirements with speed and quality using lean Agile practices. You will analyze and manipulate large datasets supporting the enterprise by activating data assets to support Enabling Platforms and Analytics on Google Cloud Platform (GCP). You will be responsible for designing the transformation and modernization on GCP, as well as landing data from source applications to GCP. Experience with large-scale solutions and operationalization of data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions with an appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.

Skills Required:
- Experience working in an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment.
- Implement methods for automation of all parts of the pipeline to minimize labor in development and production.
- Experience analyzing complex data, organizing raw data, and integrating massive datasets from multiple data sources to build subject areas and reusable data products.
- Experience working with architects to evaluate and productionalize appropriate GCP tools for data ingestion, integration, presentation, and reporting.
- Experience working with all stakeholders to formulate business problems as technical data requirements, and to identify and implement technical solutions while ensuring key business drivers are captured in collaboration with product management.
- Proficient in Machine Learning model architecture, data pipeline interaction, and metrics interpretation, including designing and deploying a pipeline with automated data lineage.
- Identify, develop, evaluate, and summarize Proofs of Concept to prove out solutions; test and compare competing solutions and report a point of view on the best solution.
- Integration between GCP Data Catalog and Informatica EDC.
- Design and build production data engineering solutions to deliver pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine (a minimal pipeline sketch follows this listing).

Skills Preferred:
- Strong drive for results and ability to multi-task and work independently
- Self-starter with proven innovation skills
- Ability to communicate and work with cross-functional teams and all levels of management
- Demonstrated commitment to quality and project timing
- Demonstrated ability to document complex systems
- Experience in creating and executing detailed test plans

Experience Required: 3 to 5 Yrs
Education Required: BE or Equivalent
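To make the pipeline-pattern requirement above concrete, here is a minimal, illustrative Apache Beam sketch of a batch load from Cloud Storage into BigQuery, the kind of job that could run on Dataflow. The bucket, project, dataset, and schema names are hypothetical placeholders, not part of the listing.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Minimal sketch: read CSV rows from Cloud Storage, parse them, and append to BigQuery.
# Swap "DirectRunner" for "DataflowRunner" (plus project/region/temp_location) to run on GCP.
options = PipelineOptions(runner="DirectRunner")

def parse_line(line: str) -> dict:
    # Assumes a simple three-column CSV: order_id,amount,order_date
    order_id, amount, order_date = line.split(",")
    return {"order_id": order_id, "amount": float(amount), "order_date": order_date}

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadCsv" >> beam.io.ReadFromText("gs://example-landing/orders.csv", skip_header_lines=1)
        | "Parse" >> beam.Map(parse_line)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:analytics.orders",
            schema="order_id:STRING,amount:FLOAT,order_date:DATE",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```

In practice the same pipeline shape extends to Pub/Sub sources for streaming or to Data Fusion/Dataproc for heavier transformations.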
Posted 2 months ago
4.0 years
0 Lacs
Greater Kolkata Area
On-site
Job Description
As a Technical Lead, you will work on both offshore and onsite client projects involving Oracle BI Applications/FAW or OBIEE/OAC/ODI implementations. You will interact with clients to understand and gather requirements, and you will be responsible for technical design, development, and system/integration testing using Oracle methodologies.

Desired Profile
- End-to-end ODI, OAC, and Oracle BI Applications/FAW implementation experience
- Expert knowledge of BI Applications/FAW, including basic and advanced configurations with Oracle eBS Suite/Fusion as the source system
- Expert knowledge of OBIEE/OAC RPD design and reports design
- Expert knowledge of ETL (ODI) design / OCI DI / OCI Dataflow
- Mandatory to have one of these skills: PL/SQL, BI Publisher, or BI Apps
- Good to have EDQ and PySpark skills
- Architectural solution definition
- Any industry-standard certifications will be a plus
- Good knowledge of Oracle database and development; experience in database applications
- Creativity, personal drive, influencing and negotiating, problem solving
- Building effective relationships, customer focus, effective communication, coaching
- Ready to travel as and when required by the project

Experience
- 8-12 yrs of data warehousing and business intelligence project experience
- 4-6 years of project experience on BI Applications/FAW and OBIEE/OAC/ODI/OCI DI, with at least 2 complete lifecycle implementations
- 4-6 yrs of specialized BI Applications and OBIEE/OAC/ODI/OCI DI customization and solution architecture experience
- Worked on Financial, SCM, or HR Analytics recently in implementation and configuration

Career Level - IC3

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges.
We've partnered with industry leaders in almost every sector, and we continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 2 months ago
4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description
As a Technical Lead, you will work on both offshore and onsite client projects involving Oracle BI Applications/FAW or OBIEE/OAC/ODI implementations. You will interact with clients to understand and gather requirements, and you will be responsible for technical design, development, and system/integration testing using Oracle methodologies.

Desired Profile
- End-to-end ODI, OAC, and Oracle BI Applications/FAW implementation experience
- Expert knowledge of BI Applications/FAW, including basic and advanced configurations with Oracle eBS Suite/Fusion as the source system
- Expert knowledge of OBIEE/OAC RPD design and reports design
- Expert knowledge of ETL (ODI) design / OCI DI / OCI Dataflow
- Mandatory to have one of these skills: PL/SQL, BI Publisher, or BI Apps
- Good to have EDQ and PySpark skills
- Architectural solution definition
- Any industry-standard certifications will be a plus
- Good knowledge of Oracle database and development; experience in database applications
- Creativity, personal drive, influencing and negotiating, problem solving
- Building effective relationships, customer focus, effective communication, coaching
- Ready to travel as and when required by the project

Experience
- 8-12 yrs of data warehousing and business intelligence project experience
- 4-6 years of project experience on BI Applications/FAW and OBIEE/OAC/ODI/OCI DI, with at least 2 complete lifecycle implementations
- 4-6 yrs of specialized BI Applications and OBIEE/OAC/ODI/OCI DI customization and solution architecture experience
- Worked on Financial, SCM, or HR Analytics recently in implementation and configuration

Career Level - IC3

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges.
We've partnered with industry leaders in almost every sector, and we continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 2 months ago
4.0 - 7.0 years
8 - 14 Lacs
Noida
Hybrid
Data Engineer (L3) || GCP Certified
Employment Type: Full-Time | Work Mode: In-office/Hybrid | Notice: Immediate joiners

As a Data Engineer, you will design, develop, and support data pipelines and related data products and platforms. Your primary responsibilities include designing and building data extraction, loading, and transformation pipelines across on-prem and cloud platforms. You will perform application impact assessments, requirements reviews, and develop work estimates. Additionally, you will develop test strategies and site reliability engineering measures for data products and solutions, participate in agile development "scrums" and solution reviews, mentor junior Data Engineering Specialists, lead the resolution of critical operations issues, and perform technical data stewardship tasks, including metadata management, security, and privacy by design.

Required Skills:
- Design, develop, and support data pipelines and related data products and platforms.
- Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.
- Perform application impact assessments, requirements reviews, and develop work estimates.
- Develop test strategies and site reliability engineering measures for data products and solutions.
- Participate in agile development "scrums" and solution reviews.
- Mentor junior Data Engineers.
- Lead the resolution of critical operations issues, including post-implementation reviews.
- Perform technical data stewardship tasks, including metadata management, security, and privacy by design.
- Design and build data extraction, loading, and transformation pipelines using Python and other GCP data technologies.
- Demonstrate SQL and database proficiency in various data engineering tasks.
- Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect (see the sketch after this listing).
- Develop Unix scripts to support various data operations.
- Model data to support business intelligence and analytics initiatives.
- Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation.
- Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, and Dataproc (good to have).

Keywords: data pipelines, agile development, scrums, GCP data technologies, Python, DAGs, Control-M, Apache Airflow, data solution architecture

Qualifications:
- Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or a related field.
- 4+ years of data engineering experience.
- 2 years of data solution architecture and design experience.
- GCP Certified Data Engineer (preferred).

Job Type: Full-time
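As an illustration of the DAG-based workflow automation this role describes, here is a minimal Apache Airflow sketch using the standard Google provider operators. The project, bucket, dataset, and table names are hypothetical placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Hypothetical identifiers, used only for illustration.
PROJECT = "example-project"
DATASET = "staging"

with DAG(
    dag_id="daily_sales_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load the day's CSV extract from Cloud Storage into a staging table.
    load_to_staging = GCSToBigQueryOperator(
        task_id="load_to_staging",
        bucket="example-landing-bucket",
        source_objects=["sales/{{ ds }}/*.csv"],
        destination_project_dataset_table=f"{PROJECT}.{DATASET}.sales_raw",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform staged rows into the curated table with a SQL job.
    transform = BigQueryInsertJobOperator(
        task_id="transform_to_curated",
        configuration={
            "query": {
                "query": f"""
                    INSERT INTO `{PROJECT}.curated.sales`
                    SELECT order_id, CAST(amount AS NUMERIC) AS amount, DATE('{{{{ ds }}}}') AS load_date
                    FROM `{PROJECT}.{DATASET}.sales_raw`
                """,
                "useLegacySql": False,
            }
        },
    )

    load_to_staging >> transform
```

The same structure carries over to Control-M or Prefect; only the scheduler and operator layer changes.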
Posted 2 months ago
15.0 years
0 Lacs
India
Remote
Job Title: Lead Data Engineering Manager – GCP Cloud Migration
Location: [Remote]
Experience: 12–15 Years (5+ Years Leading Data Teams, 8+ Years in Data Engineering)
Employment Type: Full-Time

About the Role:
We are seeking an experienced Lead Data Engineering Manager to drive the end-to-end migration of enterprise data platforms, ETL pipelines, and data warehouses to the cloud, with a focus on Google Cloud Platform (GCP). This role will lead high-performing engineering teams and collaborate with cross-functional stakeholders to architect and execute scalable, secure, and modern data solutions using BigQuery, Dataform, Dataplex, and other cloud-native tools. A background in premium consulting or strategic technology advisory is highly preferred, as this role will engage with executive stakeholders and contribute to data transformation strategies at the enterprise level.

Key Responsibilities:
- Lead and mentor Data Engineering teams across design, development, and deployment of modern cloud data architectures.
- Drive cloud migration initiatives, including re-platforming legacy ETL workflows and on-prem DWHs to GCP-based solutions.
- Architect and implement scalable data pipelines using BigQuery, Dataform, and orchestration tools.
- Ensure robust data governance and cataloging practices leveraging Dataplex and other GCP services.
- Collaborate with data analysts, data scientists, and business stakeholders to enable advanced analytics and ML capabilities.
- Establish and enforce engineering best practices, CI/CD pipelines, and monitoring strategies.
- Provide technical leadership, project planning, and resource management to deliver projects on time and within scope.
- Represent the data engineering function in client or leadership meetings, especially in a consulting or multi-client context.

Required Skills & Qualifications:
- 12–15 years of total experience, with 8+ years in data engineering and 5+ years in team leadership roles.
- Proven expertise in cloud-based data platforms, especially GCP (BigQuery, Dataflow, Dataform, Dataplex, Cloud Composer, Pub/Sub).
- Strong knowledge of modern ETL/ELT practices, data modeling, and pipeline orchestration.
- Experience with data warehouse modernization and migration from platforms like Teradata, Oracle, or Hadoop to GCP.
- Familiarity with data governance, metadata management, and data cataloging.
- Background in consulting or strategic data advisory with Fortune 500 clients preferred.
- Hands-on skills in SQL, Python, and cloud infrastructure-as-code (e.g., Terraform).
- Strong communication, stakeholder engagement, and leadership presence.

Preferred Qualifications:
- GCP Data Engineer or Architect Certification.
- Experience with agile methodologies and DevOps practices.
- Prior work with multi-cloud or hybrid environments is a plus.
- Experience in regulated industries (finance, healthcare, etc.) is advantageous.
Posted 2 months ago
4.0 - 7.0 years
10 - 14 Lacs
Noida
Work from Office
Location: Noida (In-office/Hybrid; client site if required)
Type: Full-Time | Immediate Joiners Preferred

Must-Have Skills:
- GCP (BigQuery, Dataflow, Dataproc, Cloud Storage)
- PySpark / Spark
- Distributed computing expertise
- Apache Iceberg (preferred), Hudi, or Delta Lake

Role Overview:
Be part of a high-impact Data Engineering team focused on building scalable, cloud-native data pipelines. You'll support and enhance EMR platforms using DevOps principles, helping deliver real-time health alerts and diagnostics for platform performance.

Key Responsibilities:
- Provide data engineering support to EMR platforms
- Design and implement cloud-native, automated data solutions
- Collaborate with internal teams to deliver scalable systems
- Continuously improve infrastructure reliability and observability

Technical Environment:
- Databases: Oracle, MySQL, MSSQL, MongoDB
- Distributed Engines: Spark/PySpark, Presto, Flink/Beam
- Cloud Infra: GCP (preferred), AWS (nice-to-have), Terraform
- Big Data Formats: Iceberg, Hudi, Delta
- Tools: SQL, Data Modeling, Palantir Foundry, Jenkins, Confluence
- Bonus: Stats/math tools (NumPy, PyMC3), Linux scripting

Ideal for engineers with cloud-native, real-time data platform experience, especially those who have worked with EMR and modern lakehouse stacks (a PySpark-to-Iceberg sketch follows this listing).
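To give a flavor of the lakehouse work listed above, here is a minimal, hedged PySpark sketch that writes an Apache Iceberg table. The catalog name, warehouse bucket, paths, and table names are hypothetical, and it assumes the Iceberg Spark runtime package is available on the cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal sketch: aggregate raw events and land them in an Iceberg table.
spark = (
    SparkSession.builder.appName("iceberg-demo")
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "hadoop")
    .config("spark.sql.catalog.lake.warehouse", "gs://example-lakehouse/warehouse")
    .getOrCreate()
)

# Read raw events (e.g. Parquet landed by an upstream pipeline) and roll up per day/device.
raw = spark.read.parquet("gs://example-landing/events/")
daily = (
    raw.withColumn("event_date", F.to_date("event_ts"))
       .groupBy("event_date", "device_id")
       .agg(F.count("*").alias("event_count"))
)

# Create or replace an Iceberg table partitioned by event_date.
daily.writeTo("lake.analytics.daily_device_events") \
     .partitionedBy(F.col("event_date")) \
     .createOrReplace()
```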
Posted 2 months ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
- Participates in review and gathering of business requirements for various HR system updates and/or functionality; develops specification documentation.
- Creates test plans and/or related documentation used to manage the testing process.
- Performs system troubleshooting; leads review, testing, and implementation of system enhancements.
- Collaborates with IT staff to coordinate updates and/or fixes and documents the process and results.
- Provides production support, including researching and resolving HR technology issues, unexpected results, or process flaws.
- Conducts regular data audits and implements validation measures to ensure data integrity.
- Makes recommendations to help re-engineer and improve the effectiveness of various HR processes.
- Familiarity working with large data sets.
- Proficient in Excel formulas and macros.
- Understanding of SQL and OTBI/BI Publisher in order to analyze data and set requirements for IT.
- Ability to draft business requirement documents (BRDs) based on Verisk's standard format; drafts and executes test scripts.
- Ability to understand technical design requirements and communicate those requirements to both IT and business partners.

Qualifications
- 5+ years of experience with Oracle HCM platforms.
- 3+ years of experience with Oracle Fusion.
- Experience with customizing pages with Oracle Visual Builder Studio (VBS); Oracle Redwood experience a plus.
- Oracle work structure configuration experience.
- Coordinating with Oracle and creating SRs a plus.
- Understanding of and ability to create Oracle HDL load templates for business objects and employee data loads.
- Experience with technologies supporting any combination of the following HR disciplines: HRIS (HR Job Management, HR Org Management, HR Master Data, Compensation, Benefits, etc.), Talent Acquisition, Talent Development (Learning and Performance Management), Oracle Cloud Recruiting.
- Experience with and proficiency around Oracle data structures such as approval workflows, and setup and maintenance of data elements, locations, departments, department trees, jobs, legal employers, etc.
- Proficiency in the Microsoft productivity suite.
- Can project-manage and follow through on deliverables independently with stakeholders.
- Solid understanding of HR functional areas and how downstream dataflow from HRIS supports each function.
- Stakeholder management, managing expectations, prioritization skills, conflict management, decision-making quality.
- Strong experience in industry and/or similar positions supporting back-end HR platforms.

About Us
For over 50 years, Verisk has been the leading data analytics and technology partner to the global insurance industry by delivering value to our clients through expertise and scale. We empower communities and businesses to make better decisions on risk, faster. At Verisk, you'll have the chance to use your voice and build a rewarding career that's as unique as you are, with work flexibility and the support, coaching, and training you need to succeed. For the eighth consecutive year, Verisk is proudly recognized as a Great Place to Work® for outstanding workplace culture in the US, fourth consecutive year in the UK, Spain, and India, and second consecutive year in Poland. We value learning, caring and results and make inclusivity and diversity a top priority.
In addition to our Great Place to Work® Certification, we've been recognized by The Wall Street Journal as one of the Best-Managed Companies and by Forbes as a World's Best Employer and Best Employer for Women, testaments to the value we place on workplace culture. We're 7,000 people strong. We relentlessly and ethically pursue innovation. And we are looking for people like you to help us translate big data into big ideas. Join us and create an exceptional experience for yourself and a better tomorrow for future generations.

Verisk Businesses
- Underwriting Solutions — provides underwriting and rating solutions for auto and property, general liability, and excess and surplus to assess and price risk with speed and precision
- Claims Solutions — supports end-to-end claims handling with analytic and automation tools that streamline workflow, improve claims management, and support better customer experiences
- Property Estimating Solutions — offers property estimation software and tools for professionals in estimating all phases of building and repair to make day-to-day workflows the most efficient
- Extreme Event Solutions — provides risk modeling solutions to help individuals, businesses, and society become more resilient to extreme events
- Specialty Business Solutions — provides an integrated suite of software for full end-to-end management of insurance and reinsurance business, helping companies manage their businesses through efficiency, flexibility, and data governance
- Marketing Solutions — delivers data and insights to improve the reach, timing, relevance, and compliance of every consumer engagement
- Life Insurance Solutions — offers end-to-end, data insight-driven core capabilities for carriers, distribution, and direct customers across the entire policy lifecycle of life and annuities for both individual and group
- Verisk Maplecroft — provides intelligence on sustainability, resilience, and ESG, helping people, business, and societies become stronger

Verisk Analytics is an equal opportunity employer. All members of the Verisk Analytics family of companies are equal opportunity employers. We consider all qualified applicants for employment without regard to race, religion, color, national origin, citizenship, sex, gender identity and/or expression, sexual orientation, veteran's status, age or disability. Verisk's minimum hiring age is 18 except in countries with a higher age limit subject to applicable law. https://www.verisk.com/company/careers/

Unsolicited resumes sent to Verisk, including unsolicited resumes sent to a Verisk business mailing address, fax machine or email address, or directly to Verisk employees, will be considered Verisk property. Verisk will NOT pay a fee for any placement resulting from the receipt of an unsolicited resume. Verisk Employee Privacy Notice
Posted 2 months ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
A Platform Software Engineer is a versatile developer with expertise in Java or Python and a strong foundation in cloud platforms to build and manage applications at scale. Generally, platform engineers fall into two categories: backend engineers, who design and implement microservices with robust APIs, and full-stack engineers, who deliver native UI/UX solutions, along with the ability to develop frameworks and services that enable an enterprise data platform. With a solid understanding of the SDLC and hands-on experience in Git and CI/CD, platform engineers can independently design, code, test, and release features to production efficiently.

Responsibilities
- Design and Build Data Pipelines: Architect, develop, and maintain scalable data pipelines and microservices that support real-time and batch processing on GCP.
- Service-Oriented Architecture (SOA) and Microservices: Design and implement SOA and microservices-based architectures to ensure modular, flexible, and maintainable data solutions.
- Full-Stack Integration: Leverage your full-stack expertise to contribute to the seamless integration of front-end and back-end components, ensuring robust data access and UI-driven data exploration.
- Data Ingestion and Integration: Lead the ingestion and integration of data from various sources into the data platform, ensuring data is standardized and optimized for analytics.
- GCP Data Solutions: Utilize GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.) to build and manage data platforms that meet business needs (a Pub/Sub sketch follows this listing).
- Data Governance and Security: Implement and manage data governance, access controls, and security best practices while leveraging GCP's native row- and column-level security features.
- Performance Optimization: Continuously monitor and improve the performance, scalability, and efficiency of data pipelines and storage solutions.
- Collaboration and Best Practices: Work closely with data architects, software engineers, and cross-functional teams to define best practices, design patterns, and frameworks for cloud data engineering.
- Automation and Reliability: Automate data platform processes to enhance reliability, reduce manual intervention, and improve operational efficiency.

Qualifications
- Education: Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a related field. Master's degree or equivalent experience preferred.
- Experience: Minimum 5 years of experience as a Software Engineer.
- Technical Skills: Proficient in Java, Angular, or other JavaScript technologies, with experience designing and deploying cloud-based data pipelines and microservices using GCP tools like BigQuery, Dataflow, and Dataproc. Ability to leverage best-in-class data platform technologies to deliver platform features, and to design and orchestrate platform services to deliver data platform capabilities.
- Service-Oriented Architecture and Microservices: Strong understanding of SOA, microservices, and their application within a cloud data platform context. Develop robust, scalable services using Java Spring Boot, Python, Angular, and GCP technologies.
- Full-Stack Development: Knowledge of front-end and back-end technologies, enabling collaboration on data access and visualization layers (e.g., React, Node.js). Design and develop RESTful APIs for seamless integration across platform services. Implement robust unit and functional tests to maintain high standards of test coverage and quality.
- Database Management: Experience with relational (e.g., PostgreSQL, MySQL) and NoSQL databases, as well as columnar databases like BigQuery.
- Data Governance and Security: Understanding of data governance frameworks and implementing RBAC, encryption, and data masking in cloud environments.
- CI/CD and Automation: Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools like Terraform, and automation frameworks. Manage code changes with GitHub and troubleshoot and resolve application defects efficiently. Ensure adherence to SDLC best practices, independently managing feature design, coding, testing, and production releases.
- Problem-Solving: Strong analytical skills with the ability to troubleshoot complex data platform and microservices issues.
- Certifications (Preferred): GCP Data Engineer, GCP Professional Cloud
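As a small illustration of the event-driven microservice work mentioned in this listing, here is a hedged Python sketch that publishes a JSON message to Google Cloud Pub/Sub. The project, topic, and attribute names are hypothetical placeholders.

```python
import json

from google.cloud import pubsub_v1

# Hypothetical project and topic names, for illustration only.
PROJECT_ID = "example-project"
TOPIC_ID = "order-events"

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)

def publish_event(order_id: str, amount: float) -> str:
    """Publish a small JSON event; returns the server-assigned message ID."""
    payload = json.dumps({"order_id": order_id, "amount": amount}).encode("utf-8")
    # Attributes (here "source") must be strings; data must be bytes.
    future = publisher.publish(topic_path, data=payload, source="checkout-service")
    return future.result()  # blocks until Pub/Sub acknowledges the message

if __name__ == "__main__":
    print(publish_event("ORD-1001", 249.99))
```

A downstream consumer (for example a Cloud Function or a Dataflow job) would subscribe to the same topic and land the events in BigQuery or another sink.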
Posted 2 months ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About Eucloid
At Eucloid, innovation meets impact. As a leader in AI and Data Science, we create solutions that redefine industries, from Hi-tech and D2C to Healthcare and SaaS. With partnerships with giants like Databricks, Google Cloud, and Adobe, we're pushing boundaries and building next-gen technology. Join our talented team of engineers, scientists, and visionaries from top institutes like IITs, IIMs, and NITs. At Eucloid, growth is a promise, and your work will drive transformative results for Fortune 100 clients.

What You'll Do
- Design, build, and optimize AI platforms and pipelines on Google Cloud Platform (GCP).
- Implement AI Ops practices and DevOps workflows for end-to-end machine learning lifecycle management (model training, deployment, monitoring).
- Develop automation scripts for deployment and manage cluster deployment on Kubernetes (GKE).
- Build and maintain scalable CI/CD pipelines using tools like Jenkins, Cloud Build, or GitLab CI/CD for AI workflows.
- Architect GCP-based infrastructure using Terraform (Infrastructure as Code) for AI workloads (Vertex AI, BigQuery, Dataflow).
- Collaborate with data scientists to operationalize AI models and ensure seamless integration into production environments.
- Optimize cloud infrastructure for cost, performance, and security, focusing on automation and scalability.

What Makes You a Fit
- Academic Background: Bachelor's or Master's in Computer Science, Data Science, Engineering, or a related field.
- Technical Expertise: 5+ years of experience in AI engineering, DevOps, or cloud platforms, with a focus on Google Cloud (GCP).
- Hands-on experience with GCP services: Vertex AI, Kubernetes Engine (GKE), BigQuery, Dataflow, and Terraform.
- Proficiency in CI/CD pipeline tools: Jenkins, GitLab CI/CD, or Cloud Build.
- Strong coding skills in Python for scripting and automation.
- Expertise in Kubernetes for cluster deployment and orchestration.
- Experience with Infrastructure as Code (IaC) using Terraform or Google Deployment Manager.
- Knowledge of automation scripts for deployment and containerization (Docker).
- Extra Skills: Professional Cloud DevOps Engineer or related certification; familiarity with Gen-AI tools (LangChain, Vertex AI Generative AI); exposure to monitoring tools like Cloud Monitoring, Prometheus, or Grafana.

Why You'll Love It Here
- Innovate with the Best Tech: Work on groundbreaking projects using AI, GenAI, LLMs, and massive-scale data platforms. Tackle challenges that push the boundaries of innovation.
- Impact Industry Giants: Deliver business-critical solutions for Fortune 100 clients across Hi-tech, D2C, Healthcare, SaaS, and Retail. Partner with platforms like Databricks, Google Cloud, and Adobe to create high-impact products.
- Collaborate with a World-Class Team: Join exceptional professionals from IITs, IIMs, NITs, and global leaders like Walmart, Amazon, Accenture, and ZS. Learn, grow, and lead in a team that values expertise and collaboration.
- Accelerate Your Growth: Access our Centres of Excellence to upskill and work on industry-leading innovations. Your professional development is a top priority.
- Work in a Culture of Excellence: Be part of a dynamic workplace that fosters creativity, teamwork, and a passion for building transformative solutions. Your contributions will be recognized and celebrated.

About Our Leadership
Anuj Gupta – Former Amazon leader with over 22 years of experience in building and managing large engineering teams (B.Tech, IIT Delhi; MBA, ISB Hyderabad).
Raghvendra Kushwah – Business consulting expert with 21+ years at Accenture and Cognizant (B.Tech, IIT Delhi; MBA, IIM Lucknow).

Key Benefits
- Competitive salary and performance-based bonus.
- Comprehensive benefits package, including health insurance and flexible work hours.
- Opportunities for professional development and career growth.

Location: Gurugram
Submit your resume to saurabh.bhaumik@eucloid.com with the subject line "Application: Platform Engineer (GCP)."
Eucloid is an equal-opportunity employer. We celebrate diversity and are committed to creating an inclusive environment.
Posted 2 months ago
8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Title: Data Architect – Data Integration & Engineering
Location: Hybrid
Experience: 8+ years

Job Summary:
We are seeking an experienced Data Architect specializing in data integration, data engineering, and hands-on coding to design, implement, and manage scalable and high-performance data solutions. The ideal candidate should have expertise in ETL/ELT, cloud data platforms, big data technologies, and enterprise data architecture.

Key Responsibilities:
1. Data Architecture & Design: Develop enterprise-level data architecture solutions, ensuring scalability, performance, and reliability. Design data models (conceptual, logical, physical) for structured and unstructured data. Define and implement data integration frameworks using industry-standard tools. Ensure compliance with data governance, security, and regulatory policies (GDPR, HIPAA, etc.).
2. Data Integration & Engineering: Implement ETL/ELT pipelines using Informatica, Talend, Apache NiFi, or dbt. Work with batch and real-time data processing tools such as Apache Kafka, Kinesis, and Apache Flink (a streaming sketch follows this listing). Integrate and optimize data lakes, data warehouses, and NoSQL databases.
3. Hands-on Coding & Development: Write efficient and scalable code in Python, Java, or Scala for data transformation and processing. Optimize SQL queries, stored procedures, and indexing strategies for performance tuning. Build and maintain Spark-based data processing solutions in Databricks and Cloudera ecosystems. Develop workflow automation using Apache Airflow, Prefect, or similar tools.
4. Cloud & Big Data Technologies: Work with cloud platforms such as AWS (Redshift, Glue), Azure (Data Factory, Synapse), and GCP (BigQuery, Dataflow). Manage big data processing using Cloudera, Hadoop, HBase, and Apache Spark. Deploy containerized data services using Kubernetes and Docker. Automate infrastructure using Terraform and CloudFormation.
5. Governance, Security & Compliance: Implement data security, masking, and encryption strategies. Define RBAC (Role-Based Access Control) and IAM policies for data access. Work on metadata management, data lineage, and cataloging.

Required Skills & Technologies:
- Data Engineering & Integration: ETL/ELT tools (Informatica, Talend, Apache NiFi, dbt); big data ecosystem (Cloudera, HBase, Apache Hadoop, Spark); data streaming (Apache Kafka, AWS Kinesis, Apache Flink); data warehouses (Snowflake, AWS Redshift, Google BigQuery, Azure Synapse); databases (PostgreSQL, MySQL, MongoDB, Cassandra).
- Programming & Scripting: Languages (Python, Java, Scala); scripting (Shell, PowerShell, Bash); frameworks (PySpark, Spark SQL).
- Cloud & DevOps: Cloud platforms (AWS, Azure, GCP); containerization and orchestration (Kubernetes, Docker); CI/CD pipelines (Jenkins, GitHub Actions, Terraform, CloudFormation).
- Security & Governance: Compliance standards (GDPR, HIPAA, SOC 2); data cataloging (Collibra, Alation); access controls (IAM, RBAC, ABAC).

Preferred Certifications:
- AWS Certified Data Analytics – Specialty
- Microsoft Certified: Azure Data Engineer Associate
- Google Professional Data Engineer
- Databricks Certified Data Engineer Associate/Professional
- Cloudera Certified Data Engineer
- Informatica Certified Professional

Education & Experience:
- Bachelor's/Master's degree in Computer Science/MCA, Data Engineering, or a related field.
- 8+ years of experience in data architecture, integration, and engineering.
- Proven expertise in designing and implementing enterprise-scale data solutions.
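For the real-time processing side of this role, here is a minimal, illustrative Spark Structured Streaming sketch that consumes JSON events from Kafka and lands them as Parquet. The broker address, topic, schema, and paths are hypothetical, and it assumes the spark-sql-kafka connector is available on the cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

# Expected shape of each JSON message on the topic (hypothetical).
schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_ts", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")
    .option("subscribe", "orders")
    .option("startingOffsets", "latest")
    .load()
    # Kafka delivers bytes; decode and parse the JSON payload.
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "/data/lake/orders/")
    .option("checkpointLocation", "/data/checkpoints/orders/")
    .outputMode("append")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```

The same pattern swaps cleanly between sinks (Parquet, Delta, a warehouse connector) without changing the reading and parsing stages.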
Posted 2 months ago
4.0 - 6.0 years
4 - 7 Lacs
Pune
Work from Office
Job Summary
We are looking for a Data Quality Engineer who will safeguard the integrity of our cloud-native data assets. You will design and execute automated and manual data-quality checks across structured and semi-structured sources on Azure and GCP, validating that our data pipelines deliver accurate, complete, and consistent datasets for analytics, reporting, and AI initiatives.

Key Responsibilities
- Define, build, and maintain data-quality frameworks that measure accuracy, completeness, timeliness, consistency, and validity of data ingested through ETL/ELT pipelines.
- Develop automated tests using SQL, Python, or similar tools; supplement with targeted manual validation where required (a small automated-check sketch follows this listing).
- Collaborate with data engineers to embed data-quality gates into CI/CD pipelines on Azure Data Factory / Synapse / Fabric and GCP Dataflow / Cloud Composer.
- Profile new data sources (structured and semi-structured: JSON, Parquet, Avro) to establish baselines, detect anomalies, and recommend cleansing or transformation rules.
- Monitor data-quality KPIs and publish dashboards/alerts that surface issues to stakeholders in near-real time.
- Conduct root-cause analysis for data-quality defects, propose remediation strategies, and track resolution to closure.
- Maintain comprehensive documentation of test cases, data-quality rules, lineage, and issue logs for audit and governance purposes.
- Partner with data governance, security, and compliance teams to ensure adherence to regulatory requirements.

Must-Have Skills
- 4-6 years of experience in data quality, data testing, or data engineering roles within cloud environments.
- Hands-on expertise with at least one major cloud data stack: Azure (Data Factory, Synapse, Databricks/Fabric) or GCP (BigQuery, Dataflow, Cloud Composer).
- Strong SQL skills and proficiency in a scripting language such as Python for building automated validation routines.
- Solid understanding of data-modeling concepts (dimensional, 3NF, data vault) and how they impact data-quality rules.
- Experience testing semi-structured data formats (JSON, XML, Avro, Parquet) and streaming/near-real-time pipelines.
- Excellent analytical and communication skills; able to translate complex data issues into clear, actionable insights for technical and business stakeholders.

Nice-to-Have Skills
- Familiarity with BI/reporting tools (Power BI, Looker, Tableau) for surfacing data-quality metrics.

Preferred Certifications
- Google Professional Data Engineer or Associate Cloud Engineer (GCP track), or
- Microsoft Certified: Azure Data Engineer Associate

Education
Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, Mathematics, or a related field. Comparable professional experience will also be considered.

Why Join Us?
You will be the guardian of our data's trustworthiness, enabling decision-makers to rely on insights with confidence. If you are passionate about building automated, scalable data-quality solutions in a modern cloud environment, we'd love to meet you.
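As an illustration of the automated validation routines this role calls for, here is a hedged Python sketch that runs a few SQL-based data-quality checks against a BigQuery table. The project, dataset, table, and column names are hypothetical placeholders.

```python
from google.cloud import bigquery

# Minimal sketch of automated data-quality checks for a curated table.
TABLE = "example-project.curated.orders"

CHECKS = {
    # Each query returns a single row whose "bad" column counts offending records.
    "table_not_empty": f"SELECT IF(COUNT(*) = 0, 1, 0) AS bad FROM `{TABLE}`",
    "no_null_order_ids": f"SELECT COUNT(*) AS bad FROM `{TABLE}` WHERE order_id IS NULL",
    "no_duplicate_order_ids": f"""
        SELECT COUNT(*) AS bad FROM (
            SELECT order_id FROM `{TABLE}` GROUP BY order_id HAVING COUNT(*) > 1
        )
    """,
    "amount_non_negative": f"SELECT COUNT(*) AS bad FROM `{TABLE}` WHERE amount < 0",
}

def run_checks() -> dict:
    """Run each check and return a map of check name -> number of offending rows."""
    client = bigquery.Client()
    results = {}
    for name, sql in CHECKS.items():
        rows = list(client.query(sql).result())
        results[name] = rows[0].bad if rows else 0
    return results

if __name__ == "__main__":
    failures = {name: count for name, count in run_checks().items() if count}
    print("All checks passed" if not failures else f"Failed checks: {failures}")
```

A script like this is easy to wrap as a task in Cloud Composer or an Azure DevOps stage so that failed checks block the downstream pipeline.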
Posted 2 months ago
4.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We are hiring for Power BI Developer.

- Should have basic knowledge of data warehouse concepts
- Develop and enhance Power BI reports and dashboards
- Experienced in data modelling in Power BI, including M Query and DAX
- Experienced in Power BI features such as RLS, incremental data load, dataflows, etc.
- Should have good exposure working with a diverse set of visuals and various data sources such as SQL Server, BigQuery, etc.
- Proficient in T-SQL

Job description
- Design, analyze, develop, test, and debug Power BI reports and dashboards to satisfy business requirements
- Ability to translate business requirements into technical solutions
- Collaborate with other teams within the organization and be able to devise the technical solution as it relates to the business and technical requirements
- Experienced in client communication; excellent communication skills; should be a team player
- Maintain documentation for all processes implemented
- Adhere to and suggest improvements to coding standards and best practices, and contribute to the improvement of these best practices

Experience: 4-8 years
Location: Hyderabad & Mumbai
Primary Skill: Power BI, SQL
Working Days: Hybrid
Joining time: Immediate to 30 days

If the above criteria match your profile, please share your profile with swathi.gangu@ltimindtree.com with the details below:
- Relevant experience in Power BI
- Relevant experience in SQL
- Current CTC
- Expected CTC
- Current location
- Preferred location
- Offer in hand, if any
- PAN card number
- Notice period / how soon you can join

Skills
Mandatory Skills: ANSI-SQL, Dimensional Data Modeling, Power BI

Regards
Swathi
LTIM
Posted 2 months ago
6.0 - 8.0 years
8 - 10 Lacs
Pune
Work from Office
Job Summary
We are looking for a seasoned Data Modeler / Data Analyst to design and implement scalable, reusable logical and physical data models on Google Cloud Platform, primarily BigQuery. You will partner closely with data engineers, analytics teams, and business stakeholders to translate complex business requirements into performant data models that power reporting, self-service analytics, and advanced data science workloads.

Key Responsibilities
- Gather and analyze business requirements to translate them into conceptual, logical, and physical data models on GCP (BigQuery, Cloud SQL, Cloud Spanner, etc.).
- Design star/snowflake schemas, data vaults, and other modeling patterns that balance performance, flexibility, and cost.
- Implement partitioning, clustering, and materialized views in BigQuery to optimize query performance and cost efficiency (a small DDL sketch follows this listing).
- Establish and maintain data modeling standards, naming conventions, and metadata documentation to ensure consistency across analytic and reporting layers.
- Collaborate with data engineers to define ETL/ELT pipelines and ensure data models align with ingestion and transformation strategies (Dataflow, Cloud Composer, Dataproc, dbt).
- Validate data quality and lineage; work with BI developers and analysts to troubleshoot performance issues or data anomalies.
- Conduct impact assessments for schema changes and guide version-control processes for data models.
- Mentor junior analysts/engineers on data modeling best practices and participate in code/design reviews.
- Contribute to capacity planning and cost-optimization recommendations for BigQuery datasets and reservations.

Must-Have Skills
- 6-8 years of hands-on experience in data modeling, data warehousing, or database design, including at least 2 years on GCP BigQuery.
- Proficiency in dimensional modeling, 3NF, and modern patterns such as data vault.
- Expert SQL skills with demonstrable ability to optimize complex analytical queries on BigQuery (partitioning, clustering, sharding strategies).
- Strong understanding of ETL/ELT concepts and experience working with tools such as Dataflow, Cloud Composer, or dbt.
- Familiarity with BI/reporting tools (Looker, Tableau, Power BI, or similar) and how model design impacts dashboard performance.
- Experience with data governance practices: data cataloging, lineage, and metadata management (e.g., Data Catalog).
- Excellent communication skills to translate technical concepts into business-friendly language and collaborate across functions.

Good to Have
- Experience working on Azure Cloud (Fabric, Synapse, Delta Lake)

Education
Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, Statistics, or a related field. Equivalent experience will be considered.
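To make the partitioning/clustering responsibility concrete, here is a hedged sketch that uses the BigQuery Python client to create a date-partitioned, clustered fact table and a materialized view on top of it. The project, dataset, table, and column names are hypothetical placeholders.

```python
from google.cloud import bigquery

# Minimal sketch: DDL for a partitioned, clustered fact table plus a rollup view.
client = bigquery.Client(project="example-project")

ddl = """
CREATE TABLE IF NOT EXISTS `example-project.analytics.fact_sales`
(
  order_id     STRING,
  customer_id  STRING,
  order_date   DATE,
  amount       NUMERIC
)
PARTITION BY order_date               -- prunes scans to the dates a query touches
CLUSTER BY customer_id, order_id;     -- co-locates rows commonly filtered together

CREATE MATERIALIZED VIEW IF NOT EXISTS `example-project.analytics.mv_daily_sales` AS
SELECT order_date, SUM(amount) AS total_amount, COUNT(*) AS order_count
FROM `example-project.analytics.fact_sales`
GROUP BY order_date;
"""

# Both statements run as one standard-SQL script in a single query job.
client.query(ddl).result()
print("fact_sales and mv_daily_sales are in place")
```

Partitioning by the date column and clustering by the most common filter keys is the usual first lever for both query performance and slot/bytes cost in BigQuery.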
Posted 2 months ago
0.0 - 5.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
Data Engineer – (L3) || GCP Certified Experience Level: 4-7 years Location: Noida Office or at Client Site as Required Employment Type: Full-Time Work Mode: In-office/ Hybrid Notice: Immediate joiners Client Profile: A leading technology company As a Data Engineer, you will design, develop, and support data pipelines and related data products and platforms. Your primary responsibilities include designing and building data extraction, loading, and transformation pipelines across on-prem and cloud platforms. You will perform application impact assessments, requirements reviews, and develop work estimates. Additionally, you will develop test strategies and site reliability engineering measures for data products and solutions, participate in agile development "scrums" and solution reviews, mentor junior Data Engineering Specialists, lead the resolution of critical operations issues, and perform technical data stewardship tasks, including metadata management, security, and privacy by design. Required Skills: Design, develop, and support data pipelines and related data products and platforms. Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms. Perform application impact assessments, requirements reviews, and develop work estimates. Develop test strategies and site reliability engineering measures for data products and solutions. Participate in agile development "scrums" and solution reviews. Mentor junior Data Engineers. Lead the resolution of critical operations issues, including post-implementation reviews. Perform technical data stewardship tasks, including metadata management, security, and privacy by design. Design and build data extraction, loading, and transformation pipelines using Python and other GCP Data Technologies Demonstrate SQL and database proficiency in various data engineering tasks. Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect. Develop Unix scripts to support various data operations. Model data to support business intelligence and analytics initiatives. Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation. Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion and Dataproc (good to have). Qualifications: Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or related field. 4+ years of data engineering experience. 2 years of data solution architecture and design experience. GCP Certified Data Engineer (preferred). Job Type: Full-time Pay: Up to ₹1,400,000.00 per year Application Question(s): What is your notice period (in days)? What is your current annual salary (in INR)? What is your expected annual salary (in INR)? Experience: designing, developing, and supporting data pipelines : 4 years (Required) developing test strategies & measures for data products : 5 years (Required) GCP Data Technologies: 4 years (Required) SQL and database : 5 years (Required) agile development "scrums" and solution reviews: 4 years (Required) automation of data workflow by setting up DAGs : 5 years (Required) Location: Noida, Uttar Pradesh (Required) Work Location: In person
Posted 2 months ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
🚀 We're Hiring: Senior GCP Data Engineer (7+ Years Experience)
📍 Location: Hyderabad (Work from Office - Mandatory)
📧 Apply Now: sasidhar.m@technogenindia.com

Are you a passionate Data Engineer with deep expertise in Google Cloud Platform (GCP) and strong hands-on experience in data migration projects? Do you bring solid knowledge of Oracle to the table and thrive in a fast-paced, collaborative environment? TechnoGen India is looking for a Senior GCP Data Engineer to join our Hyderabad team. This is a full-time, on-site opportunity designed for professionals ready to take on challenging migration projects and deliver impactful solutions.

🔍 What We're Looking For:
✅ 7+ years of experience in Data Engineering
✅ Strong expertise in GCP (BigQuery, Dataflow, Pub/Sub, etc.)
✅ Proven experience in complex GCP migration projects
✅ Solid Oracle background (data extraction, transformation, and optimization)
✅ Ability to work full-time from our Hyderabad office

🎯 If you're ready to bring your skills to a growing team that values innovation and excellence, we want to hear from you!
👉 Share your updated resume with us on WhatsApp only (9177108771).

Let's build the future of data together! 💡
#Hiring #GCP #DataEngineer #Oracle #HyderabadJobs #WorkFromOffice #TechJobsIndia #DataMigration #TechnoGenIndia
Posted 2 months ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Overview
We are looking for an enthusiastic Product Manager to drive the product lifecycle, from requirement synthesis to release planning and performance evaluation. The role requires close collaboration with senior stakeholders, cross-functional teams, and clients to elicit requirements, analyze market trends, and identify functional gaps. It emphasizes maintaining an updated product roadmap, monitoring development, and validating requirements to create detailed specifications.

Role: Product Manager
Reporting to: Director of Product
Experience Range: 4-8 yrs

Responsibilities

Requirements Synthesis & Product Road Mapping:
- Elicit requirements for the product through regular interactions with the Vice-President, Onsite Director, and Product Manager, as well as the onsite and offshore Customer Success and Sales teams and clients
- Study the current solution as used by customers and as offered by competitors to understand functional gaps
- Stay up to date with current industry developments, both in technology and in the product's target market, for the next big idea
- Enhance the product roadmap, maintain the backlog, and map out the product's release planning based on stakeholder priorities, competitive outlook, market demands, and client requests
- Evaluate and validate requirements gathered from multiple sources, reconcile conflicts, convert business/product feature ideas into detailed functional specifications, abstract high-level understanding from low-level technicalities, and distinguish user requests from their underlying true needs

Monitoring & Leveraging Key Metrics:
- In an agile working environment, monitor development, testing, and documentation to ensure that the product is built to the defined specifications
- Identify and track KPIs that measure product success, using explicit and implicit data sources such as feedback from clients, the field force, and support, and application metrics collected through analytics tools and database querying on product performance, usage, latency, etc.
Documentation:
- Elicit requirements in business documents for consumption by each of the teams: Business, Engineering, and QA
- Produce high-level business presentations for stakeholders and business teams
- Develop business cases, use cases, user stories, and personas, as well as detailed functional diagrams, dataflow and deployment diagrams, and flowcharts for consumption by the various teams involved with the product
- Create and maintain the product's API web services
- On-the-job experience using UI prototyping, wireframe, and mock-up tools

Project & Team Management:
- Work closely with Engineering, QA, DevOps, Sales, and Customer Success teams, enable a symbiotic ecosystem, and provide direction to meet common goals and timelines
- Interact with the Engineering team to translate requirements into an implementation plan
- Interact with the QA team to communicate expected workflows of new enhancements and feature development to enable certified code releases through SIT and UAT

Sales & Marketing Support:
- Conduct demo sessions for the Marketing, Sales, and Support Engineers teams on newly released product features to facilitate their operations
- Facilitate creation of knowledge bases, user manuals, and marketing material

Qualifications
- B.E/B.Tech graduate; an MBA in combination will be an additional advantage
- Knowledge of software languages, RDBMS, software tools, software design, software documentation, software development process (especially Agile methodologies), software requirements, software maintenance, quality assurance, UI prototyping, and analytics
- Experience working with web and mobile applications
- Experience working with B2B enterprise products
- Understanding of online marketing concepts and the retail industry
- Certification in agile/scrum practices preferable
- Ability to: plan and complete projects within deadlines; generate ideas that extend and enhance the product feature set; ensure quality in the product; promote process improvement

About The Company
OptCulture is at the forefront of helping brands elevate their customer relationships through cutting-edge retention strategies. We don't just connect the dots; we create journeys that keep customers coming back for more! Think about the brands you admire - IKEA, Marks & Spencer, GUESS, Style Union. At OptCulture, we're the behind-the-scenes marketing technology enabling them to understand and engage with their customers in deeply meaningful ways. It's not just about sales; it's about fostering loyalty that lasts.

What Makes Us Unique?
OptCulture isn't your typical tech company. We're a bootstrapped powerhouse, driven by relentless innovation and determination. From Houston to Dubai to Hyderabad, our solutions are redefining customer retention on a global scale. And here's the kicker: we're growing! OptCulture aims to hire thinkers and achievers. We believe in providing an environment for fast-paced growth as an individual, team, and organization. We encourage a culture of independence, collaboration, trust, and balance.
Posted 2 months ago
6.0 - 12.0 years
0 Lacs
Greater Kolkata Area
On-site
Job Description
We are looking for a highly skilled GCP Technical Lead with 6 to 12 years of experience to join our dynamic team. In this role, you will be responsible for designing and implementing scalable, secure, and highly available cloud infrastructure solutions on Google Cloud Platform (GCP). You will lead the architecture and development of cloud-native applications and ensure that infrastructure and applications are optimized for performance, security, and scalability. Your expertise will play a key role in the design and execution of workload migrations, CI/CD pipelines, and infrastructure.

Responsibilities:
- Cloud Architecture and Design: Lead the design and implementation of scalable, secure, and highly available cloud infrastructure solutions on GCP using services like Compute Engine, Kubernetes Engine, Cloud Storage, Cloud SQL, and Cloud Load Balancing.
- Cloud-Native Application Design: Develop architecture designs and guidelines for the development, deployment, and lifecycle management of cloud-native applications, ensuring optimization for security, performance, and scalability with services such as App Engine, Cloud Functions, and Cloud Run.
- API Management: Implement secure API interfaces and granular access control using IAM, RBAC, and API Gateway for workloads running on GCP.
- Workload Migration: Lead the migration of on-premises workloads to GCP, ensuring minimal downtime, data integrity, and smooth transitions.
- CI/CD: Design and implement CI/CD pipelines using Cloud Build, Cloud Source Repositories, and Artifact Registry to automate development and deployment processes.
- Infrastructure as Code (IaC): Automate cloud infrastructure provisioning and management using Terraform.
- Collaboration: Collaborate closely with cross-functional teams to define requirements, design solutions, and ensure successful project delivery, utilizing tools like Google Workspace and Jira.
- Monitoring and Optimization: Continuously monitor cloud environments to ensure optimal performance, availability, and security, and perform regular audits and tuning.
- Documentation: Prepare and maintain comprehensive documentation for cloud infrastructure, configurations, and procedures using Google Docs.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 6-12 years of relevant experience in cloud engineering and architecture.
- Google Cloud Professional Cloud Architect certification.
- Experience with Kubernetes.
- Familiarity with DevOps methodologies.
- Strong problem-solving and analytical skills.
- Excellent communication skills.

Required Skills
Google Cloud Platform (GCP) services: Compute Engine, Google Kubernetes Engine (GKE), Cloud Storage, Cloud SQL, Cloud Load Balancing, Identity and Access Management (IAM), Google Workflows, Google Cloud Pub/Sub, App Engine, Cloud Functions, Cloud Run, API Gateway, Cloud Build, Cloud Source Repositories, Artifact Registry, Google Cloud Monitoring, Logging and Error Reporting, Python, Terraform, Google Cloud Firestore, GraphQL, MongoDB, Cassandra, Neo4j, ETL (Extract, Transform, Load) paradigms, Google Cloud Dataflow, Apache Beam, BigQuery, Service Mesh, Content Delivery Network (CDN), Stackdriver, Google Cloud Trace
Posted 2 months ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Skills: Data Engineer, Python, Spark, Cloudera, on-premise, Azure, Snowflake, Kafka.
Overview Of The Company: Jio Platforms Ltd. is a revolutionary Indian multinational tech company, often referred to as India's biggest startup, headquartered in Mumbai. Launched in 2019, it's the powerhouse behind Jio, India's largest mobile network with over 400 million users. But Jio Platforms is more than just telecom. It's a comprehensive digital ecosystem, developing cutting-edge solutions across media, entertainment, and enterprise services through popular brands like JioMart, JioFiber, and JioSaavn. Join us at Jio Platforms and be part of a fast-paced, dynamic environment at the forefront of India's digital transformation. Collaborate with brilliant minds to develop next-gen solutions that empower millions and revolutionize industries.
Team Overview: The Data Platforms Team is the launchpad for a data-driven future, empowering the Reliance Group of Companies. We're a passionate group of experts architecting an enterprise-scale data mesh to unlock the power of big data, generative AI, and ML modelling across various domains. We don't just manage data; we transform it into intelligent actions that fuel strategic decision-making. Imagine crafting a platform that automates data flow, fuels intelligent insights, and empowers the organization: that's what we do. Join our collaborative and innovative team, and be a part of shaping the future of data for India's biggest digital revolution!
About the role: Title: Lead Data Engineer. Location: Mumbai.
Responsibilities:
End-to-End Data Pipeline Development: Design, build, optimize, and maintain robust data pipelines across cloud, on-premises, or hybrid environments, ensuring performance, scalability, and seamless data flow.
Reusable Components & Frameworks: Develop reusable data pipeline components and contribute to the evolution of the team's data pipeline framework.
Data Architecture & Solutions: Contribute to data architecture design, applying data modelling, storage, and retrieval expertise.
Data Governance & Automation: Champion data integrity, security, and efficiency through metadata management, automation, and data governance best practices.
Collaborative Problem Solving: Partner with stakeholders, data teams, and engineers to define requirements, troubleshoot, optimize, and deliver data-driven insights.
Mentorship & Knowledge Transfer: Guide and mentor junior data engineers, fostering knowledge sharing and professional growth.
Qualification Details:
Education: Bachelor's degree or higher in Computer Science, Data Science, Engineering, or a related technical field.
Core Programming: Excellent command of a primary data engineering language (Scala, Python, or Java) with a strong foundation in OOPS and functional programming concepts.
Big Data Technologies: Hands-on experience with data processing frameworks (e.g., Hadoop, Spark, Apache Hive, NiFi, Ozone, Kudu), ideally including streaming technologies (Kafka, Spark Streaming, Flink, etc.).
Database Expertise: Excellent querying skills (SQL) and strong understanding of relational databases (e.g., MySQL, PostgreSQL). Experience with NoSQL databases (e.g., MongoDB, Cassandra) is a plus.
End-to-End Pipelines: Demonstrated experience in implementing, optimizing, and maintaining complete data pipelines, integrating varied sources and sinks, including streaming real-time data.
Cloud Expertise: Knowledge of cloud technologies like Azure HDInsight, Synapse, Event Hubs and GCP Dataproc, Dataflow, BigQuery.
CI/CD Expertise: Experience with CI/CD methodologies and tools, including strong Linux and shell scripting skills for automation.
Desired Skills & Attributes:
Problem-Solving & Troubleshooting: Proven ability to analyze and solve complex data problems and troubleshoot data pipeline issues effectively.
Communication & Collaboration: Excellent communication skills, both written and verbal, with the ability to collaborate across teams (data scientists, engineers, stakeholders).
Continuous Learning & Adaptability: A demonstrated passion for staying up-to-date with emerging data technologies and a willingness to adapt to new tools.
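As a rough illustration of the streaming pipelines this role describes (Kafka plus Spark Streaming), here is a minimal PySpark sketch; it is not taken from the posting, and the broker address, topic name and storage paths are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

# Requires the spark-sql-kafka connector on the Spark classpath.
spark = SparkSession.builder.appName("kafka-to-parquet").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")  # hypothetical broker
    .option("subscribe", "clickstream")                   # hypothetical topic
    .option("startingOffsets", "latest")
    .load()
    # Kafka delivers bytes; cast key/value to strings before downstream parsing.
    .select(col("key").cast("string"), col("value").cast("string"), col("timestamp"))
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "/data/lake/clickstream/")
    .option("checkpointLocation", "/data/checkpoints/clickstream/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```

The checkpoint location is what lets the job resume cleanly and avoid duplicate files across restarts, which is the kind of reliability concern the responsibilities above call out.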
Posted 2 months ago
2.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.
What You'll Do: As a Senior Developer at Equifax, you will oversee and steer the delivery of innovative batch and data products, primarily leveraging Java and the Google Cloud Platform (GCP). Your expertise will ensure the efficient and timely deployment of high-quality data solutions that support our business objectives and client needs.
Project Development: Development and deployment of batch and data products, ensuring alignment with business goals and technical requirements.
Technical Oversight: Provide technical direction and oversee the implementation of solutions using Java and GCP, ensuring best practices in coding, performance, and security.
Team Management: Mentor and guide junior developers and engineers, fostering a collaborative and high-performance environment.
Stakeholder Collaboration: Work closely with cross-functional teams including product managers, business analysts, and other stakeholders to gather requirements and translate them into technical solutions.
Documentation and Compliance: Maintain comprehensive documentation for all technical processes, ensuring compliance with internal and external standards.
Continuous Improvement: Advocate for and implement continuous improvement practices within the team, staying abreast of emerging technologies and methodologies.
What Experience You Need:
Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
Experience: Minimum of 2-5 years of relevant experience in software development, with a focus on batch processing and data solutions.
Technical Skills: Proficiency in Java, with a strong understanding of its ecosystems. 2+ years of relevant experience with Java, Spring, Spring Boot, REST, microservices, Hibernate, JPA and RDBMS. Minimum 2 years with Git, CI/CD pipelines and Jenkins. Experience with Google Cloud Platform, including BigQuery, Cloud Storage, Dataflow, Bigtable and other GCP tools. Familiarity with ETL processes, data modeling, and SQL.
Soft Skills: Strong problem-solving abilities and a proactive approach to project management. Effective communication and interpersonal skills, with the ability to convey technical concepts to non-technical stakeholders.
What Could Set You Apart: Certification in Google Cloud (e.g., Associate Cloud Engineer). Experience with other cloud platforms (AWS, Azure) is a plus. Understanding of data privacy regulations and compliance frameworks.
We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference! Who is Equifax? At Equifax, we believe knowledge drives progress.
As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life's pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
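For a concrete (though purely illustrative) sense of the batch data products this role describes, here is a minimal Apache Beam pipeline that could run on Google Cloud Dataflow. It is written in Python rather than the role's primary Java, and the bucket, project, dataset and schema names are hypothetical.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line: str) -> dict:
    """Turn a CSV row 'id,name,country' into a BigQuery-ready dict."""
    record_id, name, country = line.split(",")
    return {"id": record_id, "name": name, "country": country}

def run() -> None:
    # With --runner=DataflowRunner (plus project/region/temp_location options)
    # the same pipeline executes on Google Cloud Dataflow instead of locally.
    with beam.Pipeline(options=PipelineOptions()) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/customers.csv", skip_header_lines=1)
            | "Parse" >> beam.Map(parse_line)
            | "OnlyIndia" >> beam.Filter(lambda row: row["country"] == "IN")
            | "WriteBQ" >> beam.io.WriteToBigQuery(
                "example-project:analytics.customers",
                schema="id:STRING,name:STRING,country:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )

if __name__ == "__main__":
    run()
```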
Posted 2 months ago
7.0 years
0 Lacs
India
Remote
Cloud Data Engineer - Scala / Databricks | 100% remote working | IMMEDIATE JOINER REQUIRED
A Cloud Data Engineer is required ASAP by our global, market-leading IT consultancy client! With a strong background in AWS, Azure, and GCP, the ideal candidate will have extensive experience with cloud-native ETL tools such as AWS DMS, AWS Glue, Kafka, Azure Data Factory, GCP Dataflow, and other ETL tools like Informatica, SAP Data Intelligence, etc.
Key responsibilities: Your main responsibility will be designing, implementing, and maintaining robust data pipelines and building scalable data lakes, broken down into the following:
Design and Development: Design, develop, and maintain scalable ETL pipelines using cloud-native tools (AWS DMS, AWS Glue, Kafka, Azure Data Factory, GCP Dataflow, etc.). Architect and implement data lakes and data warehouses on cloud platforms (AWS, Azure, GCP). Develop and optimize data ingestion, transformation, and loading processes using Databricks, Snowflake, Redshift, BigQuery and Azure Synapse. Implement ETL processes using tools like Informatica, SAP Data Intelligence, and others. Develop and optimize data processing jobs using Spark Scala.
Data Integration and Management: Integrate various data sources, including relational databases, APIs, unstructured data, and ERP systems, into the data lake. Ensure data quality and integrity through rigorous testing and validation. Perform data extraction from SAP or ERP systems when necessary.
Performance Optimization: Monitor and optimize the performance of data pipelines and ETL processes. Implement best practices for data management, including data governance, security, and compliance.
Collaboration and Communication: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions. Collaborate with cross-functional teams to design and implement data solutions that meet business needs.
Documentation and Maintenance: Document technical solutions, processes, and workflows. Maintain and troubleshoot existing ETL pipelines and data integrations.
Essential previous experience must include: 7+ years of experience as a Data Engineer in a similar role. Minimum 3 years of experience specifically working with Databricks on AWS (must have). Strong hands-on coding and platform development in Apache Spark / Scala / Databricks. Experience with data extraction from SAP or ERP systems. Experience with various data platforms such as Amazon Redshift / Snowflake / Synapse. Proficiency in SQL and query optimization techniques. Familiarity with data modeling, ETL/ELT processes, and data warehousing concepts. Knowledge of data governance, security, and compliance best practices.
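As one way to picture the Spark-on-Databricks work this role centres on, here is a minimal batch-ETL sketch. The posting emphasises Spark Scala; Python is used here only to keep the examples on this page in one language, and the S3 paths and column names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession is provided as `spark`; getOrCreate() also makes
# the script runnable as a plain spark-submit job.
spark = SparkSession.builder.appName("orders-daily-etl").getOrCreate()

# Hypothetical raw landing zone.
raw = spark.read.option("header", "true").csv("s3://example-raw-bucket/orders/2024-01-01/")

cleaned = (
    raw
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .dropDuplicates(["order_id"])
    .filter(F.col("amount") > 0)
)

daily_revenue = (
    cleaned
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
)

# Delta is the default table format on Databricks; partitioning by date keeps
# downstream queries cheap.
(daily_revenue.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save("s3://example-curated-bucket/daily_revenue/"))
```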
Posted 2 months ago
2.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
What you'll do: As a Senior Developer at Equifax, you will oversee and steer the delivery of innovative batch and data products, primarily leveraging Java and the Google Cloud Platform (GCP). Your expertise will ensure the efficient and timely deployment of high-quality data solutions that support our business objectives and client needs.
Project Development: Development and deployment of batch and data products, ensuring alignment with business goals and technical requirements.
Technical Oversight: Provide technical direction and oversee the implementation of solutions using Java and GCP, ensuring best practices in coding, performance, and security.
Team Management: Mentor and guide junior developers and engineers, fostering a collaborative and high-performance environment.
Stakeholder Collaboration: Work closely with cross-functional teams including product managers, business analysts, and other stakeholders to gather requirements and translate them into technical solutions.
Documentation and Compliance: Maintain comprehensive documentation for all technical processes, ensuring compliance with internal and external standards.
Continuous Improvement: Advocate for and implement continuous improvement practices within the team, staying abreast of emerging technologies and methodologies.
What experience you need:
Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
Experience: Minimum of 2-5 years of relevant experience in software development, with a focus on batch processing and data solutions.
Technical Skills: Proficiency in Java, with a strong understanding of its ecosystems. 2+ years of relevant experience with Java, Spring, Spring Boot, REST, microservices, Hibernate, JPA and RDBMS. Minimum 2 years with Git, CI/CD pipelines and Jenkins. Experience with Google Cloud Platform, including BigQuery, Cloud Storage, Dataflow, Bigtable and other GCP tools. Familiarity with ETL processes, data modeling, and SQL.
Soft Skills: Strong problem-solving abilities and a proactive approach to project management. Effective communication and interpersonal skills, with the ability to convey technical concepts to non-technical stakeholders.
What could set you apart: Certification in Google Cloud (e.g., Associate Cloud Engineer). Experience with other cloud platforms (AWS, Azure) is a plus. Understanding of data privacy regulations and compliance frameworks.
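For a rough feel of the BigQuery side of this batch and data work, here is a minimal sketch using the google-cloud-bigquery Python client. It is illustrative only: the role itself is Java-centric, and the project, dataset and table names are hypothetical.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client(project="example-project")

# Parameterised query over a hypothetical events table.
sql = """
    SELECT country, COUNT(*) AS events
    FROM `example-project.analytics.events`
    WHERE event_date = @event_date
    GROUP BY country
    ORDER BY events DESC
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("event_date", "DATE", "2024-01-01"),
    ]
)

rows = client.query(sql, job_config=job_config).result()  # blocks until the job finishes
for row in rows:
    print(f"{row.country}: {row.events}")
```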
Posted 2 months ago
2.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
EY-Advisory - Data and Analytics - Staff - Data Engineer (Scala)
EY's Advisory Services is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional and technical capabilities and product knowledge. EY's financial services practice provides integrated advisory services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 companies. Within EY's Advisory Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients' decision-making.
The opportunity: We're looking for Senior Big Data experts with expertise in the Financial Services domain and hands-on experience with the Big Data ecosystem.
Primary Skills And Key Responsibilities: Strong knowledge of Spark, good understanding of the Spark framework, and performance tuning. Proficiency in Scala & SQL. Good exposure to one of the cloud technologies - GCP/Azure/AWS. Hands-on experience in designing, building, and maintaining scalable data pipelines and solutions to manage and process large datasets efficiently. Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent communicator (written and verbal, formal and informal). Ability to multi-task under pressure and work independently with minimal supervision. Strong verbal and written communication skills. Must be a team player and enjoy working in a cooperative and collaborative team environment. Adaptable to new technologies and standards. Participate in all aspects of the Big Data solution delivery life cycle including analysis, design, development, testing, production deployment, and support.
Nice To Have Skills: Design, develop, and deploy robust and scalable data pipelines using GCP services such as BigQuery, Dataflow, Data Composer/Cloud Composer (Airflow) and related technologies. Good experience in GCP technology areas such as Datastore, BigQuery, Cloud Storage, Persistent Disk, IAM, roles, projects and organizations. Understanding of and familiarity with all Hadoop ecosystem components and Hadoop administrative fundamentals. Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB. Experience in HDFS, Hive, Impala. Experience in schedulers like Airflow, NiFi, etc. Experienced in Hadoop clustering and auto-scaling. Develop standardized practices for delivering new products and capabilities using Big Data technologies, including data acquisition, transformation, and analysis. Define and develop client-specific best practices around data management within a Hadoop environment on Azure cloud.
To qualify for the role, you must have: BE/BTech/MCA/MBA. Minimum 2 years hands-on experience in one or more relevant areas.
Total of 1-3 years industry experience.
Ideally, you'll also have: Experience in the Banking and Capital Markets domains.
Skills And Attributes For Success: Use an issue-based approach to deliver growth, market and portfolio strategy engagements for corporates. Strong communication, presentation and team-building skills and experience in producing high-quality reports, papers, and presentations. Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint.
What We Look For: A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment. An opportunity to be a part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Advisory practices globally with leading businesses across a range of industries.
What Working At EY Offers: At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that's right for you.
EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
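The "Nice To Have" skills above mention Cloud Composer/Airflow orchestration; as a hedged illustration (not part of the posting), here is a minimal Airflow DAG sketch that loads files into BigQuery and then rebuilds a reporting table. The dataset, bucket and task names are hypothetical, and the bq CLI is used to keep the sketch dependency-free.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_bq_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",  # run at 02:00 every day
    catchup=False,
) as dag:
    # Load the day's Parquet drop from Cloud Storage into a raw table.
    load_raw = BashOperator(
        task_id="load_raw",
        bash_command=(
            "bq load --source_format=PARQUET "
            "analytics.raw_events 'gs://example-bucket/events/{{ ds }}/*.parquet'"
        ),
    )

    # Rebuild the aggregate that downstream reports read from.
    build_report = BashOperator(
        task_id="build_report",
        bash_command=(
            "bq query --use_legacy_sql=false "
            "'CREATE OR REPLACE TABLE analytics.daily_report AS "
            "SELECT event_date, COUNT(*) AS events "
            "FROM analytics.raw_events GROUP BY event_date'"
        ),
    )

    load_raw >> build_report
```

On Cloud Composer the same DAG would more typically use the Google provider operators, but the task-dependency shape is the point of the sketch.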
Posted 2 months ago
2.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
EY-Advisory - Data and Analytics - Staff - Data Engineer (Scala)
EY's Advisory Services is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional and technical capabilities and product knowledge. EY's financial services practice provides integrated advisory services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 companies. Within EY's Advisory Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients' decision-making.
The opportunity: We're looking for Senior Big Data experts with expertise in the Financial Services domain and hands-on experience with the Big Data ecosystem.
Primary Skills And Key Responsibilities: Strong knowledge of Spark, good understanding of the Spark framework, and performance tuning. Proficiency in Scala & SQL. Good exposure to one of the cloud technologies - GCP/Azure/AWS. Hands-on experience in designing, building, and maintaining scalable data pipelines and solutions to manage and process large datasets efficiently. Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent communicator (written and verbal, formal and informal). Ability to multi-task under pressure and work independently with minimal supervision. Strong verbal and written communication skills. Must be a team player and enjoy working in a cooperative and collaborative team environment. Adaptable to new technologies and standards. Participate in all aspects of the Big Data solution delivery life cycle including analysis, design, development, testing, production deployment, and support.
Nice To Have Skills: Design, develop, and deploy robust and scalable data pipelines using GCP services such as BigQuery, Dataflow, Data Composer/Cloud Composer (Airflow) and related technologies. Good experience in GCP technology areas such as Datastore, BigQuery, Cloud Storage, Persistent Disk, IAM, roles, projects and organizations. Understanding of and familiarity with all Hadoop ecosystem components and Hadoop administrative fundamentals. Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB. Experience in HDFS, Hive, Impala. Experience in schedulers like Airflow, NiFi, etc. Experienced in Hadoop clustering and auto-scaling. Develop standardized practices for delivering new products and capabilities using Big Data technologies, including data acquisition, transformation, and analysis. Define and develop client-specific best practices around data management within a Hadoop environment on Azure cloud.
To qualify for the role, you must have: BE/BTech/MCA/MBA. Minimum 2 years hands-on experience in one or more relevant areas.
Total of 1-3 years industry experience.
Ideally, you'll also have: Experience in the Banking and Capital Markets domains.
Skills And Attributes For Success: Use an issue-based approach to deliver growth, market and portfolio strategy engagements for corporates. Strong communication, presentation and team-building skills and experience in producing high-quality reports, papers, and presentations. Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint.
What We Look For: A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment. An opportunity to be a part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Advisory practices globally with leading businesses across a range of industries.
What Working At EY Offers: At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that's right for you.
EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 2 months ago
2.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
EY-Advisory - Data and Analytics - Staff - Data Engineer (Scala)
EY's Advisory Services is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional and technical capabilities and product knowledge. EY's financial services practice provides integrated advisory services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 companies. Within EY's Advisory Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients' decision-making.
The opportunity: We're looking for Senior Big Data experts with expertise in the Financial Services domain and hands-on experience with the Big Data ecosystem.
Primary Skills And Key Responsibilities: Strong knowledge of Spark, good understanding of the Spark framework, and performance tuning. Proficiency in Scala & SQL. Good exposure to one of the cloud technologies - GCP/Azure/AWS. Hands-on experience in designing, building, and maintaining scalable data pipelines and solutions to manage and process large datasets efficiently. Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent communicator (written and verbal, formal and informal). Ability to multi-task under pressure and work independently with minimal supervision. Strong verbal and written communication skills. Must be a team player and enjoy working in a cooperative and collaborative team environment. Adaptable to new technologies and standards. Participate in all aspects of the Big Data solution delivery life cycle including analysis, design, development, testing, production deployment, and support.
Nice To Have Skills: Design, develop, and deploy robust and scalable data pipelines using GCP services such as BigQuery, Dataflow, Data Composer/Cloud Composer (Airflow) and related technologies. Good experience in GCP technology areas such as Datastore, BigQuery, Cloud Storage, Persistent Disk, IAM, roles, projects and organizations. Understanding of and familiarity with all Hadoop ecosystem components and Hadoop administrative fundamentals. Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB. Experience in HDFS, Hive, Impala. Experience in schedulers like Airflow, NiFi, etc. Experienced in Hadoop clustering and auto-scaling. Develop standardized practices for delivering new products and capabilities using Big Data technologies, including data acquisition, transformation, and analysis. Define and develop client-specific best practices around data management within a Hadoop environment on Azure cloud.
To qualify for the role, you must have: BE/BTech/MCA/MBA. Minimum 2 years hands-on experience in one or more relevant areas.
Total of 1-3 years industry experience.
Ideally, you'll also have: Experience in the Banking and Capital Markets domains.
Skills And Attributes For Success: Use an issue-based approach to deliver growth, market and portfolio strategy engagements for corporates. Strong communication, presentation and team-building skills and experience in producing high-quality reports, papers, and presentations. Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint.
What We Look For: A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment. An opportunity to be a part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Advisory practices globally with leading businesses across a range of industries.
What Working At EY Offers: At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that's right for you.
EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 2 months ago
0.0 - 7.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
Data Engineer (SaaS-Based) || 5-7 years || NOIDA || 3 PM-12 AM IST shift
Location: Noida (In-office/Hybrid; Client site if required) Experience: 5–7 years Type: Full-Time | Immediate Joiners Preferred Shift: 3 PM to 12 AM IST Client: Leading Canadian-based Tech Company Good to have: GCP Certified Data Engineer
Overview of the role: As a GCP Data Engineer, you'll focus on solving problems and creating value for the business by building solutions that are reliable and scalable enough to work with the size and scope of the company. You will be tasked with creating custom-built pipelines as well as migrating on-prem data pipelines to the GCP stack. You will be part of a team tackling intricate problems by designing and deploying reliable and scalable solutions tailored to the company's data landscape.
Required Skills:
• 5+ years of industry experience in software development, data engineering, business intelligence, or a related field, with experience in manipulating, processing, and extracting value from datasets.
• Extensive experience in requirement discovery, analysis and data pipeline solution design.
• Design, build and deploy internal applications to support our technology life cycle, collaboration and spaces, service delivery management, and data and business intelligence, among others.
• Build modular code for reusable pipelines or complex ingestion frameworks that simplify loading data into the data lake or data warehouse from multiple sources.
• Work closely with analysts and business process owners to translate business requirements into technical solutions.
• Coding experience in scripting and languages (Python, SQL, PySpark).
• Expertise in Google Cloud Platform (GCP) technologies in the data warehousing space (BigQuery, GCP Workflows, Cloud Scheduler, Secret Manager, Batch, Cloud Logging, Cloud SDK, Google Cloud Storage, IAM).
• Exposure to Google Dataproc and Dataflow.
• Maintain the highest levels of development practice, including: technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, and writing clean, modular and self-sustaining code with repeatable quality and predictability.
• Understanding of CI/CD processes using Pulumi, GitHub, Cloud Build, Cloud SDK, Docker.
• Experience with SAS/SQL Server/SSIS is an added advantage.
Qualifications:
• Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience.
• GCP Certified Data Engineer (preferred).
• Excellent verbal and written communication skills with the ability to effectively advocate technical solutions to other engineering teams and business audiences.
Job Type: Full-time Pay: Up to ₹1,400,000.00 per year Schedule: UK shift
Application Question(s): What is your notice period (in days)? What is your current annual salary (in INR)? What is your expected annual salary (in INR)? This is a 3 PM to 12 AM IST shift job. Please apply ONLY IF you are willing to work in this shift.
Experience:
Coding in Python, SQL, PySpark: 7 years (Required)
Google Dataproc and Dataflow: 7 years (Required)
GCP technologies in the data warehousing space: 7 years (Required)
Pulumi, GitHub, Cloud Build, Cloud SDK, Docker: 7 years (Required)
Building or maintaining data pipelines: 7 years (Required)
BigQuery, GCP Workflows, IAM: 7 years (Required)
SAS/SQL Server/SSIS: 7 years (Required)
License/Certification: GCP Certified Data Engineer (Required)
Location: Noida, Uttar Pradesh (Required)
Shift availability: Night Shift (Required)
Work Location: In person
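To illustrate the ingestion pattern this role describes (landing files from Cloud Storage into BigQuery as part of a reusable framework), here is a minimal, assumption-laden Python sketch using the google-cloud-bigquery client; the bucket, project and table names are hypothetical.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

def load_parquet_to_bq(gcs_uri: str, table_id: str) -> int:
    """Append Parquet files from GCS into a BigQuery table; return total rows after the load."""
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )
    load_job = client.load_table_from_uri(gcs_uri, table_id, job_config=job_config)
    load_job.result()  # blocks until the load job finishes
    return client.get_table(table_id).num_rows

if __name__ == "__main__":
    # Hypothetical source path and destination table.
    total = load_parquet_to_bq(
        "gs://example-landing-bucket/sales/2024-01-01/*.parquet",
        "example-project.analytics.sales",
    )
    print(f"Table now has {total} rows")
```

Wrapping calls like this behind small, parameterised helpers is one way the modular ingestion framework the posting asks for tends to start.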
Posted 2 months ago