
961 Dataflow Jobs - Page 37

JobPe aggregates these listings for easy access; applications are submitted directly on the original job portal.

0.0 - 5.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

Data Engineer – (L3) || GCP Certified
Experience Level: 4-7 years
Location: Noida Office or at Client Site as Required
Employment Type: Full-Time
Work Mode: In-office / Hybrid
Notice: Immediate joiners
Client Profile: A leading technology company

As a Data Engineer, you will design, develop, and support data pipelines and related data products and platforms. Your primary responsibilities include designing and building data extraction, loading, and transformation pipelines across on-prem and cloud platforms. You will perform application impact assessments, requirements reviews, and develop work estimates. Additionally, you will develop test strategies and site reliability engineering measures for data products and solutions, participate in agile development "scrums" and solution reviews, mentor junior Data Engineering Specialists, lead the resolution of critical operations issues, and perform technical data stewardship tasks, including metadata management, security, and privacy by design.

Required Skills:
Design, develop, and support data pipelines and related data products and platforms. Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms. Perform application impact assessments, requirements reviews, and develop work estimates. Develop test strategies and site reliability engineering measures for data products and solutions. Participate in agile development "scrums" and solution reviews. Mentor junior Data Engineers. Lead the resolution of critical operations issues, including post-implementation reviews. Perform technical data stewardship tasks, including metadata management, security, and privacy by design. Design and build data extraction, loading, and transformation pipelines using Python and other GCP data technologies. Demonstrate SQL and database proficiency in various data engineering tasks. Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect. Develop Unix scripts to support various data operations. Model data to support business intelligence and analytics initiatives. Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation. Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion and Dataproc (good to have).

Qualifications:
Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or related field. 4+ years of data engineering experience. 2 years of data solution architecture and design experience. GCP Certified Data Engineer (preferred).

Job Type: Full-time
Pay: Up to ₹1,400,000.00 per year
Application Question(s): What is your notice period (in days)? What is your current annual salary (in INR)? What is your expected annual salary (in INR)?

Experience:
designing, developing, and supporting data pipelines: 4 years (Required)
developing test strategies & measures for data products: 5 years (Required)
GCP Data Technologies: 4 years (Required)
SQL and database: 5 years (Required)
agile development "scrums" and solution reviews: 4 years (Required)
automation of data workflow by setting up DAGs: 5 years (Required)

Location: Noida, Uttar Pradesh (Required)
Work Location: In person
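
For readers unfamiliar with the DAG-based workflow automation this posting asks for, a minimal, hypothetical Airflow sketch of a GCS-to-BigQuery load is shown below; the bucket, dataset, and schedule are illustrative assumptions, not details from the posting.

```python
# Minimal sketch of a daily ingestion DAG (GCS -> BigQuery).
# Bucket, dataset, and table names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_sales_ingest",            # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_to_bq = GCSToBigQueryOperator(
        task_id="load_sales_csv_to_bq",
        bucket="example-landing-bucket",                 # assumed GCS bucket
        source_objects=["sales/{{ ds }}/*.csv"],         # files partitioned by run date
        destination_project_dataset_table="example_project.analytics.sales",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_APPEND",
        autodetect=True,                                 # infer schema from the files
    )
```

The same DAG structure transfers directly to Control-M or Prefect; only the operator/task layer changes.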

Posted 2 months ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

🚀 We’re Hiring: Senior GCP Data Engineer (7+ Years Experience)
📍 Location: Hyderabad (Work from Office - Mandatory)
📧 Apply Now: sasidhar.m@technogenindia.com

Are you a passionate Data Engineer with deep expertise in Google Cloud Platform (GCP) and strong hands-on experience in data migration projects? Do you bring solid knowledge of Oracle to the table and thrive in a fast-paced, collaborative environment? TechnoGen India is looking for a Senior GCP Data Engineer to join our Hyderabad team. This is a full-time, on-site opportunity designed for professionals ready to take on challenging migration projects and deliver impactful solutions.

🔍 What We’re Looking For:
✅ 7+ years of experience in Data Engineering
✅ Strong expertise in GCP (BigQuery, Dataflow, Pub/Sub, etc.)
✅ Proven experience in complex GCP migration projects
✅ Solid Oracle background (data extraction, transformation, and optimization)
✅ Ability to work full-time from our Hyderabad office

🎯 If you’re ready to bring your skills to a growing team that values innovation and excellence, we want to hear from you! 👉 Share your updated resume with us on WhatsApp only (9177108771). Let’s build the future of data together! 💡

#Hiring #GCP #DataEngineer #Oracle #HyderabadJobs #WorkFromOffice #TechJobsIndia #DataMigration #TechnoGenIndia

Posted 2 months ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Overview: We are looking for an enthusiastic Product Manager to drive the product lifecycle, from requirement synthesis to release planning and performance evaluation. The role requires close collaboration with senior stakeholders, cross-functional teams, and clients to elicit requirements, analyze market trends, and identify functional gaps. It emphasizes maintaining an updated product roadmap, monitoring development, and validating requirements to create detailed specifications.

Role: Product Manager
Reporting to: Director of Product Experience
Experience Range: 4-8 years

Responsibilities
Requirements Synthesis & Product Road Mapping: Elicit requirements for the product through regular interactions with the Vice-President, Onsite Director and Product Manager, and the onsite and offshore Customer Success, Sales teams and clients. Study the current solution as used by customers and as offered by competitors to understand functional gaps. Stay up-to-date with current industry developments, both in technology and in the target market, for the next big idea for the product. Enhance the product roadmap, maintain the backlog and map out the product's release planning based on stakeholder priorities, competitive outlook, market demands and client requests. Evaluate and validate requirements gathered from multiple sources, reconcile conflicts, convert business/product feature ideas into detailed functional specifications, abstract high-level understanding from low-level technicalities, and distinguish user requests from their underlying true needs.
Monitoring & Leveraging Key Metrics: In an agile working environment, monitor development, testing and documentation to ensure that the product is built to the defined specifications. Identify and track KPIs that measure product success by making use of explicit and implicit data sources such as feedback from clients, the field force and support, and application metrics collected through analytics tools and database querying on product performance, usage, latency, etc.
Documentation: Elicit requirements in business documents for consumption by each of the Business, Engineering and QA teams. Produce high-level business presentations for stakeholders and business teams. Develop business cases, use cases, user stories and personas, as well as detailed functional diagrams, dataflow and deployment diagrams, and flowcharts for consumption by the various teams involved with the product. Create and maintain the product's API web services. On-the-job experience using UI prototyping, wireframe and mock-up tools.
Project & Team Management: Work closely with Engineering, QA, DevOps, Sales, and Customer Success teams, enable a symbiotic ecosystem and provide direction to meet common goals and timelines. Interact with the Engineering team to translate requirements into an implementation plan. Interact with the QA team to communicate expected workflows of new enhancements and feature development to enable certified code releases through SIT and UAT.
Sales & Marketing Support: Conduct demo sessions for the Marketing, Sales and Support Engineer teams on newly released product features to facilitate their operations. Facilitate creation of the knowledge base, user manuals and marketing material.

Qualifications
B.E/B.Tech graduate; an MBA in combination will be an additional advantage.
Knowledge of software languages, RDBMS, software tools, software design, software documentation, software development processes (especially Agile methodologies), software requirements, software maintenance, quality assurance, UI prototyping and analytics.
Experience working with web and mobile applications.
Experience working with B2B Enterprise products.
Understanding of online marketing concepts and the retail industry.
Certification in agile/scrum practices preferable.
Ability to: plan and complete projects within deadlines; generate ideas that extend and enhance the product feature set; ensure quality in the product; promote process improvement.

About The Company
OptCulture is at the forefront of helping brands elevate their customer relationships through cutting-edge retention strategies. We don’t just connect the dots; we create journeys that keep customers coming back for more! Think about the brands you admire - IKEA, Marks & Spencer, GUESS, Style Union. At OptCulture, we’re the behind-the-scenes marketing technology enabling them to understand and engage with their customers in deeply meaningful ways. It’s not just about sales; it’s about fostering loyalty that lasts.

What Makes Us Unique?
OptCulture isn’t your typical tech company. We’re a bootstrapped powerhouse, driven by relentless innovation and determination. From Houston to Dubai to Hyderabad, our solutions are redefining customer retention on a global scale. And here’s the kicker—we’re growing! OptCulture aims to hire thinkers and achievers. We believe in providing an environment for fast-paced growth as an individual, team, and organization. We encourage a culture of independence, collaboration, trust, and balance.

Posted 2 months ago

Apply

6.0 - 12.0 years

0 Lacs

Greater Kolkata Area

On-site

Job Description
We are looking for a highly skilled GCP Technical Lead with 6 to 12 years of experience to join our dynamic team. In this role, you will be responsible for designing and implementing scalable, secure, and highly available cloud infrastructure solutions on Google Cloud Platform (GCP). You will lead the architecture and development of cloud-native applications and ensure that infrastructure and applications are optimized for performance, security, and scalability. Your expertise will play a key role in the design and execution of workload migrations, CI/CD pipelines, and infrastructure automation.

Responsibilities:
Cloud Architecture and Design: Lead the design and implementation of scalable, secure, and highly available cloud infrastructure solutions on GCP using services like Compute Engine, Kubernetes Engine, Cloud Storage, Cloud SQL, and Cloud Load Balancing.
Cloud-Native Applications Design: Develop architecture design and guidelines for the development, deployment, and lifecycle management of cloud-native applications, ensuring optimization for security, performance, and scalability with services such as App Engine, Cloud Functions, and Cloud Run.
API Management: Implement secure API interfaces and granular access control using IAM, RBAC, and API Gateway for workloads running on GCP.
Workload Migration: Lead the migration of on-premises workloads to GCP, ensuring minimal downtime, data integrity, and smooth transitions.
CI/CD: Design and implement CI/CD pipelines using Cloud Build, Cloud Source Repositories, and Artifact Registry to automate development and deployment processes.
Infrastructure as Code (IaC): Automate cloud infrastructure provisioning and management using Terraform.
Collaboration: Collaborate closely with cross-functional teams to define requirements, design solutions, and ensure successful project delivery, utilizing tools like Google Workspace and Jira.
Monitoring and Optimization: Continuously monitor cloud environments to ensure optimal performance, availability, and security, and perform regular audits and tuning.
Documentation: Prepare and maintain comprehensive documentation for cloud infrastructure, configurations, and procedures using Google Docs.

Qualifications:
Bachelor's degree in Computer Science, Information Systems, or a related field. 6-12 years of relevant experience in cloud engineering and architecture. Google Cloud Professional Cloud Architect certification. Experience with Kubernetes. Familiarity with DevOps methodologies. Strong problem-solving and analytical skills. Excellent communication skills.

Required Skills
Google Cloud Platform (GCP) Services, Compute Engine, Google Kubernetes Engine (GKE), Cloud Storage, Cloud SQL, Cloud Load Balancing, Identity and Access Management (IAM), Google Workflows, Google Cloud Pub/Sub, App Engine, Cloud Functions, Cloud Run, API Gateway, Cloud Build, Cloud Source Repositories, Artifact Registry, Google Cloud Monitoring, Logging and Error Reporting, Python, Terraform, Google Cloud Firestore, GraphQL, MongoDB, Cassandra, Neo4j, ETL (Extract, Transform, Load) Paradigms, Google Cloud Dataflow, Apache Beam, BigQuery, Service Mesh, Content Delivery Network (CDN), Stackdriver, Google Cloud Trace
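
The required skills list ends with Google Cloud Dataflow, Apache Beam, and BigQuery; as a neutral illustration of that ETL paradigm (not part of the posting), a minimal Beam pipeline in Python might look like the following. Bucket, project, dataset, and schema names are assumptions; swapping DirectRunner for DataflowRunner (with project, region, and temp_location options) runs it on Dataflow.

```python
# Minimal Apache Beam sketch of the GCS -> transform -> BigQuery pattern.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(line: str) -> dict:
    """Parse one JSON line into the target table's row shape."""
    record = json.loads(line)
    return {"user_id": record["user_id"], "event_type": record["event"]}


options = PipelineOptions(runner="DirectRunner")  # DataflowRunner in production

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadEvents" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
        | "ParseJson" >> beam.Map(parse_event)
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "example_project:analytics.events",
            schema="user_id:STRING,event_type:STRING",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            custom_gcs_temp_location="gs://example-bucket/tmp",
        )
    )
```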

Posted 2 months ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Skills: Data Engineer, Python, Spark, Cloudera, on-premise, Azure, Snowflake, Kafka

Overview Of The Company
Jio Platforms Ltd. is a revolutionary Indian multinational tech company, often referred to as India's biggest startup, headquartered in Mumbai. Launched in 2019, it's the powerhouse behind Jio, India's largest mobile network with over 400 million users. But Jio Platforms is more than just telecom. It's a comprehensive digital ecosystem, developing cutting-edge solutions across media, entertainment, and enterprise services through popular brands like JioMart, JioFiber, and JioSaavn. Join us at Jio Platforms and be part of a fast-paced, dynamic environment at the forefront of India's digital transformation. Collaborate with brilliant minds to develop next-gen solutions that empower millions and revolutionize industries.

Team Overview
The Data Platforms Team is the launchpad for a data-driven future, empowering the Reliance Group of Companies. We're a passionate group of experts architecting an enterprise-scale data mesh to unlock the power of big data, generative AI, and ML modelling across various domains. We don't just manage data; we transform it into intelligent actions that fuel strategic decision-making. Imagine crafting a platform that automates data flow, fuels intelligent insights, and empowers the organization: that's what we do. Join our collaborative and innovative team, and be a part of shaping the future of data for India's biggest digital revolution!

About The Role
Title: Lead Data Engineer
Location: Mumbai

Responsibilities
End-to-End Data Pipeline Development: Design, build, optimize, and maintain robust data pipelines across cloud, on-premises, or hybrid environments, ensuring performance, scalability, and seamless data flow.
Reusable Components & Frameworks: Develop reusable data pipeline components and contribute to the team's data pipeline framework evolution.
Data Architecture & Solutions: Contribute to data architecture design, applying data modelling, storage, and retrieval expertise.
Data Governance & Automation: Champion data integrity, security, and efficiency through metadata management, automation, and data governance best practices.
Collaborative Problem Solving: Partner with stakeholders, data teams, and engineers to define requirements, troubleshoot, optimize, and deliver data-driven insights.
Mentorship & Knowledge Transfer: Guide and mentor junior data engineers, fostering knowledge sharing and professional growth.

Qualification Details
Education: Bachelor's degree or higher in Computer Science, Data Science, Engineering, or a related technical field.
Core Programming: Excellent command of a primary data engineering language (Scala, Python, or Java) with a strong foundation in OOP and functional programming concepts.
Big Data Technologies: Hands-on experience with data processing frameworks (e.g., Hadoop, Spark, Apache Hive, NiFi, Ozone, Kudu), ideally including streaming technologies (Kafka, Spark Streaming, Flink, etc.).
Database Expertise: Excellent querying skills (SQL) and a strong understanding of relational databases (e.g., MySQL, PostgreSQL). Experience with NoSQL databases (e.g., MongoDB, Cassandra) is a plus.
End-to-End Pipelines: Demonstrated experience in implementing, optimizing, and maintaining complete data pipelines, integrating varied sources and sinks including streaming real-time data.
Cloud Expertise: Knowledge of cloud technologies like Azure HDInsight, Synapse, Event Hubs and GCP Dataproc, Dataflow, BigQuery.
CI/CD Expertise: Experience with CI/CD methodologies and tools, including strong Linux and shell scripting skills for automation.

Desired Skills & Attributes
Problem-Solving & Troubleshooting: Proven ability to analyze and solve complex data problems, troubleshoot data pipeline issues effectively.
Communication & Collaboration: Excellent communication skills, both written and verbal, with the ability to collaborate across teams (data scientists, engineers, stakeholders).
Continuous Learning & Adaptability: A demonstrated passion for staying up-to-date with emerging data technologies and a willingness to adapt to new tools.
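
The posting calls out Kafka and Spark Streaming among its big data technologies; a minimal PySpark Structured Streaming sketch of that ingestion pattern follows. The broker address, topic, schema, and sink paths are hypothetical, and the spark-sql-kafka connector is assumed to be available on the cluster.

```python
# Minimal Kafka -> data lake streaming ingest sketch (hypothetical names).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType, StructField, StructType

spark = SparkSession.builder.appName("clickstream-kafka-ingest").getOrCreate()

schema = StructType([
    StructField("user_id", StringType()),
    StructField("event", StringType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")   # assumed broker
    .option("subscribe", "clickstream-events")            # assumed topic
    .option("startingOffsets", "latest")
    .load()
    # Kafka delivers raw bytes; decode the value and parse the JSON payload.
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "/data/lake/clickstream")              # assumed sink path
    .option("checkpointLocation", "/data/checkpoints/clickstream")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```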

Posted 2 months ago

Apply

2.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

What You’ll Do
As a Senior Developer at Equifax, you will oversee and steer the delivery of innovative batch and data products, primarily leveraging Java and the Google Cloud Platform (GCP). Your expertise will ensure the efficient and timely deployment of high-quality data solutions that support our business objectives and client needs. Project Development: Development and deployment of batch and data products, ensuring alignment with business goals and technical requirements. Technical Oversight: Provide technical direction and oversee the implementation of solutions using Java and GCP, ensuring best practices in coding, performance, and security. Team Management: Mentor and guide junior developers and engineers, fostering a collaborative and high-performance environment. Stakeholder Collaboration: Work closely with cross-functional teams including product managers, business analysts, and other stakeholders to gather requirements and translate them into technical solutions. Documentation and Compliance: Maintain comprehensive documentation for all technical processes, ensuring compliance with internal and external standards. Continuous Improvement: Advocate for and implement continuous improvement practices within the team, staying abreast of emerging technologies and methodologies.

What Experience You Need
Education: Bachelor's degree in Computer Science, Information Technology, or a related field. Experience: Minimum of 2-5 years of relevant experience in software development, with a focus on batch processing and data solutions. Technical Skills: Proficiency in Java, with a strong understanding of its ecosystems. 2+ years of relevant Java, Spring, Spring Boot, REST, Microservices, Hibernate, JPA and RDBMS experience. Minimum 2 years with Git, CI/CD Pipelines, Jenkins. Experience with Google Cloud Platform, including BigQuery, Cloud Storage, Dataflow, Bigtable and other GCP tools. Familiarity with ETL processes, data modeling, and SQL. Soft Skills: Strong problem-solving abilities and a proactive approach to project management. Effective communication and interpersonal skills, with the ability to convey technical concepts to non-technical stakeholders.

What Could Set You Apart
Technical Skills: Proficiency in Java, with a strong understanding of its ecosystems. 2+ years of relevant Java, Spring, Spring Boot, REST, Microservices, Hibernate, JPA and RDBMS experience. Minimum 2 years with Git, CI/CD Pipelines, Jenkins. Experience with Google Cloud Platform, including BigQuery, Cloud Storage, Dataflow, Bigtable and other GCP tools. Familiarity with ETL processes, data modeling, and SQL. Certification in Google Cloud (e.g., Associate Cloud Engineer). Experience with other cloud platforms (AWS, Azure) is a plus. Understanding of data privacy regulations and compliance frameworks.

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax? At Equifax, we believe knowledge drives progress.
As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Posted 2 months ago

Apply

7.0 years

0 Lacs

India

Remote

Cloud Data Engineer - Scala / Databricks is required.
100% remote working. IMMEDIATE JOINER REQUIRED.

A Cloud Data Engineer is required ASAP by our global market-leading IT Consultancy client! With a strong background in AWS, Azure, and GCP, the ideal candidate will have extensive experience with cloud-native ETL tools such as AWS DMS, AWS Glue, Kafka, Azure Data Factory, GCP Dataflow, and other ETL tools like Informatica, SAP Data Intelligence, etc.

Key responsibilities
Your main responsibility will be designing, implementing, and maintaining robust data pipelines and building scalable data lakes, broken down into the following:
Design and Development: Design, develop, and maintain scalable ETL pipelines using cloud-native tools (AWS DMS, AWS Glue, Kafka, Azure Data Factory, GCP Dataflow, etc.). Architect and implement data lakes and data warehouses on cloud platforms (AWS, Azure, GCP). Develop and optimize data ingestion, transformation, and loading processes using Databricks, Snowflake, Redshift, BigQuery and Azure Synapse. Implement ETL processes using tools like Informatica, SAP Data Intelligence, and others. Develop and optimize data processing jobs using Spark Scala.
Data Integration and Management: Integrate various data sources, including relational databases, APIs, unstructured data, and ERP systems into the data lake. Ensure data quality and integrity through rigorous testing and validation. Perform data extraction from SAP or ERP systems when necessary.
Performance Optimization: Monitor and optimize the performance of data pipelines and ETL processes. Implement best practices for data management, including data governance, security, and compliance.
Collaboration and Communication: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions. Collaborate with cross-functional teams to design and implement data solutions that meet business needs.
Documentation and Maintenance: Document technical solutions, processes, and workflows. Maintain and troubleshoot existing ETL pipelines and data integrations.

Essential previous experience must include
7+ years of experience as a Data Engineer in a similar role. Minimum 3 years of experience specifically working with "Databricks on AWS" (MUST HAVE). Strong hands-on coding and platform development in Apache Spark / Scala / Databricks. Experience with data extraction from SAP or ERP systems. Experience with various data platforms such as Amazon Redshift / Snowflake / Synapse. Proficient in SQL and query optimization techniques. Familiarity with data modeling, ETL/ELT processes, and data warehousing concepts. Knowledge of data governance, security, and compliance best practices.
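
The role asks for Spark development in Scala on Databricks; purely to illustrate the batch ETL pattern it describes (raw files on S3 cleaned into a curated Delta table), an equivalent PySpark sketch is shown below. Bucket paths and column names are assumptions, and Delta support is preconfigured on Databricks clusters.

```python
# Minimal batch ETL sketch: raw CSV on S3 -> cleaned, partitioned Delta table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-batch-etl").getOrCreate()

raw = (
    spark.read.option("header", True)
    .csv("s3://example-raw-bucket/orders/2024-01-01/")    # assumed landing path
)

cleaned = (
    raw.dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount") > 0)                           # drop obviously bad rows
)

(
    cleaned.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save("s3://example-curated-bucket/delta/orders")      # assumed curated zone
)
```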

Posted 2 months ago

Apply

2.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

What you’ll do:
As a Senior Developer at Equifax, you will oversee and steer the delivery of innovative batch and data products, primarily leveraging Java and the Google Cloud Platform (GCP). Your expertise will ensure the efficient and timely deployment of high-quality data solutions that support our business objectives and client needs. Project Development: Development and deployment of batch and data products, ensuring alignment with business goals and technical requirements. Technical Oversight: Provide technical direction and oversee the implementation of solutions using Java and GCP, ensuring best practices in coding, performance, and security. Team Management: Mentor and guide junior developers and engineers, fostering a collaborative and high-performance environment. Stakeholder Collaboration: Work closely with cross-functional teams including product managers, business analysts, and other stakeholders to gather requirements and translate them into technical solutions. Documentation and Compliance: Maintain comprehensive documentation for all technical processes, ensuring compliance with internal and external standards. Continuous Improvement: Advocate for and implement continuous improvement practices within the team, staying abreast of emerging technologies and methodologies.

What experience you need:
Education: Bachelor's degree in Computer Science, Information Technology, or a related field. Experience: Minimum of 2-5 years of relevant experience in software development, with a focus on batch processing and data solutions. Technical Skills: Proficiency in Java, with a strong understanding of its ecosystems. 2+ years of relevant Java, Spring, Spring Boot, REST, Microservices, Hibernate, JPA and RDBMS experience. Minimum 2 years with Git, CI/CD Pipelines, Jenkins. Experience with Google Cloud Platform, including BigQuery, Cloud Storage, Dataflow, Bigtable and other GCP tools. Familiarity with ETL processes, data modeling, and SQL. Soft Skills: Strong problem-solving abilities and a proactive approach to project management. Effective communication and interpersonal skills, with the ability to convey technical concepts to non-technical stakeholders.

What could set you apart:
Technical Skills: Proficiency in Java, with a strong understanding of its ecosystems. 2+ years of relevant Java, Spring, Spring Boot, REST, Microservices, Hibernate, JPA and RDBMS experience. Minimum 2 years with Git, CI/CD Pipelines, Jenkins. Experience with Google Cloud Platform, including BigQuery, Cloud Storage, Dataflow, Bigtable and other GCP tools. Familiarity with ETL processes, data modeling, and SQL. Certification in Google Cloud (e.g., Associate Cloud Engineer). Experience with other cloud platforms (AWS, Azure) is a plus. Understanding of data privacy regulations and compliance frameworks.

Posted 2 months ago

Apply

2.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY-Advisory - Data and Analytics – Staff – Data Engineer (Scala)
EY's Advisory Services is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional and technical capabilities and product knowledge. EY’s financial services practice provides integrated advisory services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 Companies. Within EY’s Advisory Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard the businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients’ decision-making.

The opportunity
We’re looking for Senior Big Data Experts with expertise in the Financial Services domain and hands-on experience with the Big Data ecosystem.

Primary Skills And Key Responsibilities
Strong knowledge of Spark, good understanding of the Spark framework and performance tuning. Proficiency in Scala & SQL. Good exposure to one of the Cloud technologies - GCP/Azure/AWS. Hands-on experience in designing, building, and maintaining scalable data pipelines and solutions to manage and process large datasets efficiently. Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent communicator (written and verbal, formal and informal). Ability to multi-task under pressure and work independently with minimal supervision. Strong verbal and written communication skills. Must be a team player and enjoy working in a cooperative and collaborative team environment. Adaptable to new technologies and standards. Participate in all aspects of the Big Data solution delivery life cycle including analysis, design, development, testing, production deployment, and support.

Nice To Have Skills
Design, develop, and deploy robust and scalable data pipelines using GCP services such as BigQuery, Dataflow, Data Composer/Cloud Composer (Airflow) and related technologies. Good experience in GCP technology areas such as Datastore, BigQuery, Cloud Storage, Persistent Disk, IAM, Roles, Projects and Organization. Understanding of and familiarity with all Hadoop ecosystem components and Hadoop administrative fundamentals. Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB. Experience in HDFS, Hive, Impala. Experience in schedulers like Airflow, NiFi, etc. Experienced in Hadoop clustering and auto-scaling. Develop standardized practices for delivering new products and capabilities using Big Data technologies, including data acquisition, transformation, and analysis. Define and develop client-specific best practices around data management within a Hadoop environment on Azure cloud.

To qualify for the role, you must have
BE/BTech/MCA/MBA
Minimum 2 years hands-on experience in one or more relevant areas.
Total of 1-3 years industry experience.

Ideally, you’ll also have
Experience in Banking and Capital Markets domains.

Skills And Attributes For Success
Use an issue-based approach to deliver growth, market and portfolio strategy engagements for corporates. Strong communication, presentation and team-building skills and experience in producing high-quality reports, papers, and presentations. Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint.

What We Look For
A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment. An opportunity to be a part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Advisory practices globally with leading businesses across a range of industries.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; the freedom and flexibility to handle your role in a way that’s right for you.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 2 months ago

Apply

2.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY-Advisory - Data and Analytics – Staff – Data Engineer (Scala)
EY's Advisory Services is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional and technical capabilities and product knowledge. EY’s financial services practice provides integrated advisory services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 Companies. Within EY’s Advisory Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard the businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients’ decision-making.

The opportunity
We’re looking for Senior Big Data Experts with expertise in the Financial Services domain and hands-on experience with the Big Data ecosystem.

Primary Skills And Key Responsibilities
Strong knowledge of Spark, good understanding of the Spark framework and performance tuning. Proficiency in Scala & SQL. Good exposure to one of the Cloud technologies - GCP/Azure/AWS. Hands-on experience in designing, building, and maintaining scalable data pipelines and solutions to manage and process large datasets efficiently. Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent communicator (written and verbal, formal and informal). Ability to multi-task under pressure and work independently with minimal supervision. Strong verbal and written communication skills. Must be a team player and enjoy working in a cooperative and collaborative team environment. Adaptable to new technologies and standards. Participate in all aspects of the Big Data solution delivery life cycle including analysis, design, development, testing, production deployment, and support.

Nice To Have Skills
Design, develop, and deploy robust and scalable data pipelines using GCP services such as BigQuery, Dataflow, Data Composer/Cloud Composer (Airflow) and related technologies. Good experience in GCP technology areas such as Datastore, BigQuery, Cloud Storage, Persistent Disk, IAM, Roles, Projects and Organization. Understanding of and familiarity with all Hadoop ecosystem components and Hadoop administrative fundamentals. Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB. Experience in HDFS, Hive, Impala. Experience in schedulers like Airflow, NiFi, etc. Experienced in Hadoop clustering and auto-scaling. Develop standardized practices for delivering new products and capabilities using Big Data technologies, including data acquisition, transformation, and analysis. Define and develop client-specific best practices around data management within a Hadoop environment on Azure cloud.

To qualify for the role, you must have
BE/BTech/MCA/MBA
Minimum 2 years hands-on experience in one or more relevant areas.
Total of 1-3 years industry experience.

Ideally, you’ll also have
Experience in Banking and Capital Markets domains.

Skills And Attributes For Success
Use an issue-based approach to deliver growth, market and portfolio strategy engagements for corporates. Strong communication, presentation and team-building skills and experience in producing high-quality reports, papers, and presentations. Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint.

What We Look For
A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment. An opportunity to be a part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Advisory practices globally with leading businesses across a range of industries.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; the freedom and flexibility to handle your role in a way that’s right for you.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 2 months ago

Apply

2.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY-Advisory - Data and Analytics – Staff – Data Engineer (Scala)
EY's Advisory Services is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional and technical capabilities and product knowledge. EY’s financial services practice provides integrated advisory services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 Companies. Within EY’s Advisory Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard the businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients’ decision-making.

The opportunity
We’re looking for Senior Big Data Experts with expertise in the Financial Services domain and hands-on experience with the Big Data ecosystem.

Primary Skills And Key Responsibilities
Strong knowledge of Spark, good understanding of the Spark framework and performance tuning. Proficiency in Scala & SQL. Good exposure to one of the Cloud technologies - GCP/Azure/AWS. Hands-on experience in designing, building, and maintaining scalable data pipelines and solutions to manage and process large datasets efficiently. Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent communicator (written and verbal, formal and informal). Ability to multi-task under pressure and work independently with minimal supervision. Strong verbal and written communication skills. Must be a team player and enjoy working in a cooperative and collaborative team environment. Adaptable to new technologies and standards. Participate in all aspects of the Big Data solution delivery life cycle including analysis, design, development, testing, production deployment, and support.

Nice To Have Skills
Design, develop, and deploy robust and scalable data pipelines using GCP services such as BigQuery, Dataflow, Data Composer/Cloud Composer (Airflow) and related technologies. Good experience in GCP technology areas such as Datastore, BigQuery, Cloud Storage, Persistent Disk, IAM, Roles, Projects and Organization. Understanding of and familiarity with all Hadoop ecosystem components and Hadoop administrative fundamentals. Experience working with NoSQL in at least one of the data stores - HBase, Cassandra, MongoDB. Experience in HDFS, Hive, Impala. Experience in schedulers like Airflow, NiFi, etc. Experienced in Hadoop clustering and auto-scaling. Develop standardized practices for delivering new products and capabilities using Big Data technologies, including data acquisition, transformation, and analysis. Define and develop client-specific best practices around data management within a Hadoop environment on Azure cloud.

To qualify for the role, you must have
BE/BTech/MCA/MBA
Minimum 2 years hands-on experience in one or more relevant areas.
Total of 1-3 years industry experience.

Ideally, you’ll also have
Experience in Banking and Capital Markets domains.

Skills And Attributes For Success
Use an issue-based approach to deliver growth, market and portfolio strategy engagements for corporates. Strong communication, presentation and team-building skills and experience in producing high-quality reports, papers, and presentations. Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint.

What We Look For
A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment. An opportunity to be a part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Advisory practices globally with leading businesses across a range of industries.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; the freedom and flexibility to handle your role in a way that’s right for you.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 2 months ago

Apply

0.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

Data Engineer (SaaS-Based) || 5-7 years || NOIDA || 3 PM-12 AM IST shift
Location: Noida (In-office/Hybrid; client site if required)
Experience: 5–7 years
Type: Full-Time | Immediate Joiners Preferred
Shift: 3 PM to 12 AM IST
Client: Leading Canadian-based Tech Company
Good to have: GCP Certified Data Engineer

Overview of the role: As a GCP Data Engineer, you'll focus on solving problems and creating value for the business by building solutions that are reliable and scalable to work with the size and scope of the company. You will be tasked with creating custom-built pipelines as well as migrating on-prem data pipelines to the GCP stack. You will be part of a team tackling intricate problems by designing and deploying reliable and scalable solutions tailored to the company's data landscape.

Required Skills:
• 5+ years of industry experience in software development, data engineering, business intelligence, or a related field, with experience in manipulating, processing, and extracting value from datasets.
• Extensive experience in requirement discovery, analysis and data pipeline solution design.
• Design, build and deploy internal applications to support our technology life cycle, collaboration and spaces, service delivery management, and data and business intelligence, among others.
• Build modular code for reusable pipelines or any kind of complex ingestion framework that eases loading data into the data lake or data warehouse from multiple sources.
• Work closely with analysts and business process owners to translate business requirements into technical solutions.
• Coding experience in scripting and languages (Python, SQL, PySpark).
• Expertise in Google Cloud Platform (GCP) technologies in the data warehousing space (BigQuery, GCP Workflows, Cloud Scheduler, Secret Manager, Batch, Cloud Logging, Cloud SDK, Google Cloud Storage, IAM).
• Exposure to Google Dataproc and Dataflow.
• Maintain the highest levels of development practice, including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, and writing clean, modular and self-sustaining code, with repeatable quality and predictability.
• Understanding of CI/CD processes using Pulumi, GitHub, Cloud Build, Cloud SDK, Docker.
• Experience with SAS/SQL Server/SSIS is an added advantage.

Qualifications:
• Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience.
• GCP Certified Data Engineer (preferred).
• Excellent verbal and written communication skills with the ability to effectively advocate technical solutions to other engineering teams and business audiences.

Job Type: Full-time
Pay: Up to ₹1,400,000.00 per year
Schedule: UK shift
Application Question(s): What is your notice period (in days)? What is your current annual salary (in INR)? What is your expected annual salary (in INR)? This is a 3 PM to 12 AM IST shift job. Please apply ONLY IF you are willing to work in this shift.
Experience:
coding in Python, SQL, PySpark: 7 years (Required)
Google Dataproc and Dataflow: 7 years (Required)
GCP technologies in the data warehousing space: 7 years (Required)
Pulumi, GitHub, Cloud Build, Cloud SDK, Docker: 7 years (Required)
building or maintaining data pipelines: 7 years (Required)
BigQuery, GCP Workflows, IAM: 7 years (Required)
SAS/SQL Server/SSIS: 7 years (Required)
License/Certification: GCP Certified Data Engineer (Required)
Location: Noida, Uttar Pradesh (Required)
Shift availability: Night Shift (Required)
Work Location: In person
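
For context on the ingestion-framework work this posting describes (loading files from Google Cloud Storage into BigQuery), a minimal sketch of the reusable load step using the google-cloud-bigquery client is shown below; the project, bucket, and table names are hypothetical.

```python
# Minimal reusable "load files into the warehouse" helper (hypothetical names).
from google.cloud import bigquery


def load_gcs_to_bigquery(uri: str, table_id: str) -> None:
    """Load newline-delimited JSON files from GCS into a BigQuery table."""
    client = bigquery.Client(project="example-project")
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,                                   # infer schema from the files
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )
    load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
    load_job.result()  # block until the load job finishes
    table = client.get_table(table_id)
    print(f"Loaded {table.num_rows} rows into {table_id}")


if __name__ == "__main__":
    load_gcs_to_bigquery(
        "gs://example-landing-bucket/events/2024-01-01/*.json",
        "example-project.analytics.events",
    )
```

Wrapping a call like this in a scheduler task (Cloud Scheduler, GCP Workflows, or an Airflow DAG) is the typical way such a modular ingestion step is reused across sources.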

Posted 2 months ago

Apply

0.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

Data Engineer – GCP & PySpark (Iceberg/BigQuery) || 4-7 years
Location: Noida (In-office/Hybrid; client site if required)
Experience: 4–7 years
Type: Full-Time | Immediate Joiners Preferred
Client: Leading Canadian-based Tech Company

Must-Have Skills: GCP (BigQuery, Dataflow, Dataproc, Cloud Storage); PySpark / Spark – distributed computing expertise; Apache Iceberg (preferred), Hudi, or Delta Lake

Role Overview: Be part of a high-impact Data Engineering team focused on building scalable, cloud-native data pipelines. You'll support and enhance EMR platforms using DevOps principles, helping deliver real-time health alerts and diagnostics for platform performance.

Key Responsibilities: Provide data engineering support to EMR platforms. Design and implement cloud-native, automated data solutions. Collaborate with internal teams to deliver scalable systems. Continuously improve infrastructure reliability and observability.

Technical Environment:
Databases: Oracle, MySQL, MSSQL, MongoDB
Distributed Engines: Spark/PySpark, Presto, Flink/Beam
Cloud & Infra: GCP (preferred), AWS (nice-to-have), Terraform
Big Data Formats: Iceberg, Hudi, Delta
Tools: SQL, Data Modeling, Palantir Foundry, Jenkins, Confluence
Bonus: Stats/math tools (NumPy, PyMC3), Linux scripting

Ideal for engineers with cloud-native, real-time data platform experience — especially those who have worked with EMR and modern lakehouse stacks.

Job Type: Full-time
Pay: Up to ₹1,000,000.00 per year
Application Question(s): What is your notice period (in Days)? What is your current annual compensation (in INR)? What is your expected annual salary (in INR)?
Experience:
GCP services like BigQuery, Dataflow, or Dataproc: 7 years (Required)
developing or maintaining Spark or PySpark jobs: 7 years (Required)
Apache Iceberg: 7 years (Required)
EMR or similar big data platforms: 7 years (Required)
Terraform projects: 7 years (Required)
SQL development and data modeling: 7 years (Required)
Location: Noida, Uttar Pradesh (Required)
Work Location: In person
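
Since Apache Iceberg is the headline skill here, a minimal PySpark sketch of writing to an Iceberg table follows; the catalog name, warehouse path, and table are assumptions, and the iceberg-spark-runtime package must be available on the cluster.

```python
# Minimal Iceberg-on-Spark sketch (hypothetical catalog, warehouse, and table).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder.appName("iceberg-demo")
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "hadoop")
    .config("spark.sql.catalog.lake.warehouse", "gs://example-bucket/warehouse")
    .getOrCreate()
)

df = spark.createDataFrame(
    [("u1", "click", "2024-01-01"), ("u2", "view", "2024-01-01")],
    ["user_id", "event", "event_date"],
)

# writeTo + createOrReplace builds the Iceberg table with its partition metadata.
(
    df.writeTo("lake.analytics.events")
    .using("iceberg")
    .partitionedBy(F.col("event_date"))
    .createOrReplace()
)

# The table can then be queried through the catalog like any SQL table.
spark.sql("SELECT event, COUNT(*) FROM lake.analytics.events GROUP BY event").show()
```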

Posted 2 months ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

We are seeking a skilled Senior Data Engineer to become a part of our dynamic team. In this role as a Senior Data Engineer, you will focus on projects involving data integration and ETL processes tailored for cloud-based environments. Your main tasks will include crafting and executing sophisticated data structures, while ensuring the integrity, accuracy, and accessibility of data.

Responsibilities
Design and execute sophisticated data structures for cloud environments
Develop ETL workflows utilizing SQL, Python, and other pertinent technologies
Maintain data integrity, reliability, and accessibility for all relevant parties
Work with diverse teams to comprehend data integration needs and specifications
Create and manage documentation such as technical details, data flow charts, and data mappings
Enhance and monitor data integration workflows to boost performance and efficiency while maintaining data accuracy and integrity

Requirements
Bachelor’s degree in Computer Science, Electrical Engineering, or a related field
5-8 years of experience in data engineering
Proficiency in cloud-native or Spark-based ETL tools like AWS Glue, Azure Data Factory, or GCP Dataflow
Strong understanding of SQL for data querying and manipulation
Familiarity with Snowflake for data warehousing
Background in cloud platforms such as AWS, GCP, or Azure for data storage and processing
Excellent problem-solving skills and attention to detail
Good verbal and written communication skills in English at a B2 level

Nice to have
Background in ETL using Python

Posted 2 months ago

Apply

5 years

0 Lacs

Gurgaon, Haryana, India

On-site

We are looking for an experienced Data Engineer to design, build, and maintain scalable data pipelines for processing clickstream data in Google Cloud Platform (GCP). The ideal candidate will have 5+ years of experience working with GCP, particularly in eCommerce, and possess a deep understanding of BigQuery and data pipeline automation. This role will involve building robust data workflows using Airflow, handling large data volumes, and ensuring smooth integration with Google Analytics BigQuery data exports.

Key Responsibilities
Pipeline Development: Design, implement, and maintain automated data pipelines using Airflow to process clickstream data in GCP, ensuring efficiency and reliability.
BigQuery Expertise: Leverage BigQuery for data storage, querying, and optimizing the performance of large data sets, ensuring fast query performance on 100B+ row tables.
Data Integration: Work closely with the team to integrate clickstream data from various sources, particularly focusing on Google Analytics BigQuery exports, into the data pipeline.
Automation & Monitoring: Automate data processing workflows and establish robust monitoring processes to ensure seamless data flow and timely delivery of data to end-users.
Data Quality & Optimization: Ensure high-quality data with proper transformations and aggregations. Optimize large data queries to reduce latency and improve processing efficiency.
Collaboration: Work closely with cross-functional teams (Data Science, Analytics, Product, and Business teams) to understand their data requirements and deliver solutions.
Documentation & Best Practices: Document processes, workflows, and pipeline architecture. Promote best practices for data pipeline development and GCP services usage.
Scalability: Design and implement scalable systems that can handle growing volumes of clickstream data in eCommerce applications.

Skills & Qualifications
Experience: 5+ years of experience in data engineering with a focus on GCP, particularly in eCommerce or digital analytics environments.
GCP Expertise: Extensive experience with Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, Cloud Functions, Pub/Sub, and Dataflow.
BigQuery Mastery: Deep understanding of BigQuery for large-scale data processing and optimization, including partitioning, clustering, and query optimization for massive datasets.
Data Pipelines: Hands-on experience automating, scheduling, and monitoring data pipelines using Airflow.
Handling Large Data Volumes: Experience working with very large datasets (e.g., 100B+ rows) and optimizing data storage and processing for high performance.
Clickstream Data: Familiarity with working with clickstream data and integrating it into data pipelines.
Google Analytics BigQuery Export: Ideally, experience working with Google Analytics BigQuery data exports and integrating analytics data into a centralized data warehouse.
Programming: Strong proficiency in Python for data processing, pipeline orchestration, and automation.
SQL Skills: Proficient in writing complex SQL queries for data extraction, transformation, and analysis.
Problem Solving: Strong analytical and problem-solving skills, with an ability to troubleshoot and resolve data pipeline and performance issues.
Collaboration & Communication: Excellent communication skills to work across teams and explain complex technical concepts to non-technical stakeholders.
Preferred Qualifications
Experience with eCommerce Analytics: Previous experience working with eCommerce data, particularly clickstream and transactional data.
Monitoring & Alerts: Familiarity with setting up monitoring, logging, and alerts to ensure data pipelines are running smoothly and issues are flagged promptly.
Cloud Security: Knowledge of data security and access control best practices in the cloud environment (IAM, VPC, etc.).
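
The posting stresses partitioning, clustering, and query optimization for 100B+ row BigQuery tables; as an illustration (not taken from the posting), a sketch of that setup with the google-cloud-bigquery client appears below, with all project, dataset, and column names hypothetical.

```python
# Partitioned + clustered clickstream table, and a partition-pruned query.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

table = bigquery.Table(
    "example-project.analytics.clickstream",
    schema=[
        bigquery.SchemaField("event_ts", "TIMESTAMP"),
        bigquery.SchemaField("user_id", "STRING"),
        bigquery.SchemaField("page", "STRING"),
    ],
)
# Day partitioning on event_ts plus clustering on user_id keeps scans on a
# very large table limited to the partitions and blocks a query actually needs.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_ts"
)
table.clustering_fields = ["user_id"]
client.create_table(table, exists_ok=True)

# Queries should always filter on the partition column so BigQuery can prune.
sql = """
    SELECT page, COUNT(*) AS views
    FROM `example-project.analytics.clickstream`
    WHERE event_ts >= TIMESTAMP('2024-01-01') AND event_ts < TIMESTAMP('2024-01-02')
    GROUP BY page
"""
for row in client.query(sql).result():
    print(row.page, row.views)
```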

Posted 2 months ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Role: Senior Azure Data Engineer
Location: Kolkata
Experience: 7+ years

Must-Have
Build the solution for optimal extraction, transformation, and loading of data from a wide variety of data sources using Azure data ingestion and transformation components. The following technology skills are required:
Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases.
Experience with ADF and Dataflow.
Experience with big data tools like Delta Lake and Azure Databricks.
Experience with Synapse.
Skills in designing an Azure data solution.
Ability to assemble large, complex data sets that meet functional and non-functional business requirements.

Good-to-Have
Working knowledge of Azure DevOps.

Responsibilities / Expectations from the Role
Customer Centric: Work closely with client teams to understand project requirements and translate them into technical design; experience working in scrum or with scrum teams.
Internal Collaboration: Work with project teams and guide the end-to-end project lifecycle, resolve technical queries, and work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data needs.
Soft Skills: Good communication skills and the ability to interact with various internal groups and CoEs.
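As an illustration of the Delta Lake and Azure Databricks skills listed above, here is a minimal PySpark sketch of an upsert (MERGE) into a Delta table. The storage paths and the key column are hypothetical, not part of the posting.

```python
# Minimal sketch: upsert a batch of customer updates into a Delta Lake table
# on Azure Databricks. The ADLS paths and column names are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# New or changed records from the raw (bronze) zone.
updates = spark.read.parquet(
    "abfss://raw@mydatalake.dfs.core.windows.net/customers/2024-06-01/"
)

# Curated (silver) Delta table that receives the upsert.
target = DeltaTable.forPath(
    spark, "abfss://silver@mydatalake.dfs.core.windows.net/customers/"
)

(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()      # update existing customers
    .whenNotMatchedInsertAll()   # insert new customers
    .execute()
)
```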

Posted 2 months ago

Apply

0 years

0 Lacs

Tamil Nadu, India

On-site

Job Title: Senior Data Engineer (AWS & GCP)
Experience: 6+ Years

We are seeking an experienced Senior Data Engineer with expertise in AWS and preferably GCP to join our data engineering team. The ideal candidate will be skilled in building, optimizing, and managing data pipelines and infrastructure in cloud environments. You’ll work closely with cross-functional teams including data scientists, analysts, and architects to ensure efficient and secure data operations.

Key Responsibilities:
Design, develop, and maintain robust ETL/ELT pipelines using AWS services like Glue, Lambda, and EMR and/or GCP equivalents such as Dataflow, Cloud Functions, and BigQuery.
Build scalable and efficient data storage and warehousing solutions using AWS S3, Redshift, and RDS or GCP Cloud Storage, BigQuery, and Cloud SQL.
Optimize data architecture for performance and cost across cloud platforms.
Implement and manage data governance, security policies, and access controls using IAM and cloud-native tools.
Collaborate with analytics and business intelligence teams to ensure data availability and reliability.
Monitor and manage cloud costs, resource utilization, and performance.
Troubleshoot and resolve issues related to data ingestion, transformation, and performance bottlenecks.

Qualifications:
6+ years of experience in data engineering with at least 4+ years on AWS and familiarity or hands-on experience with GCP (preferred).
Proficiency in Python, SQL, and data modeling best practices.
Strong experience with ETL tools, data pipelines, and cloud-native services.
Working knowledge of data warehousing, distributed computing, and data lakes.
Experience with Infrastructure-as-Code tools like Terraform or CloudFormation (a plus).
AWS Certification required; GCP Certification is a plus.
Strong problem-solving skills and ability to work in a fast-paced environment.
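As a small, hedged illustration of the AWS side of this role, here is a sketch of a Lambda handler that starts a Glue ETL job when a new file lands in S3, a common event-driven ingestion pattern. The Glue job name and the argument key are hypothetical.

```python
# Minimal sketch of a Lambda handler that kicks off a Glue ETL job when a new
# object lands in S3. The Glue job name and argument key are hypothetical.
import boto3

glue = boto3.client("glue")

def handler(event, context):
    # S3 put-object notifications arrive as a list of records.
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        response = glue.start_job_run(
            JobName="clickstream_to_redshift",        # hypothetical Glue job
            Arguments={"--input_path": f"s3://{bucket}/{key}"},
        )
        print(f"Started Glue run {response['JobRunId']} for s3://{bucket}/{key}")
```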

Posted 2 months ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

We are looking for a skilled Lead Data Engineer to become an integral part of our vibrant team. In this role, you will take charge of designing, developing, and maintaining data integration solutions tailored to our clients' needs. You will oversee a team of engineers, ensuring the delivery of high-quality, scalable, and efficient data integration solutions. This role presents a thrilling challenge for a seasoned data integration expert who is passionate about technology and excels in a fast-paced, dynamic setting.

Responsibilities
Design, develop, and maintain client-specific data integration solutions
Oversee a team of engineers to guarantee high-quality, scalable, and efficient delivery of data integration solutions
Work with cross-functional teams to comprehend business requirements and create suitable data integration solutions
Ensure the security, reliability, and efficiency of data integration solutions
Create and update documentation, including technical specifications, data flow diagrams, and data mappings
Stay informed and up-to-date with the latest data integration methods and tools

Requirements
Bachelor’s degree in Computer Science, Information Systems, or a related field
8-13 years of experience in data engineering, data integration, or related fields
Proficiency in cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow
Strong knowledge of SQL for data querying and manipulation
Background in Snowflake for cloud data warehousing
Familiarity with at least one cloud platform such as AWS, Azure, or GCP
Experience in leading a team of engineers on data integration projects
Good verbal and written communication skills in English at a B2 level

Nice to have
Background in ETL using Python
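For context on the Snowflake requirement above, here is a minimal sketch of loading staged files into a Snowflake table with the Python connector. The account, credentials, stage, and table names are hypothetical placeholders.

```python
# Minimal sketch of loading staged Parquet files into Snowflake with the
# Python connector. Connection parameters, stage, and table are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="********",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # COPY INTO pulls files from an external stage (e.g. S3/GCS) into a table.
    cur.execute("""
        COPY INTO RAW.ORDERS
        FROM @raw_stage/orders/
        FILE_FORMAT = (TYPE = 'PARQUET')
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
    print(cur.fetchall())   # one status row per loaded file
finally:
    conn.close()
```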

Posted 2 months ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

We are seeking an experienced Lead Data Engineer to join our dynamic team. As a Lead Data Engineer, you will be responsible for designing, developing, and maintaining data integration solutions for our clients. You will lead a team of engineers to ensure the delivery of high-quality, scalable, and performant data integration solutions. This is an exciting opportunity for a seasoned data integration professional passionate about technology and who thrives in a fast-paced, dynamic environment.

Responsibilities
Design, develop, and maintain data integration solutions for clients
Lead a team of engineers to ensure the delivery of high-quality, scalable, and performant data integration solutions
Collaborate with cross-functional teams to understand business requirements and design data integration solutions that meet those requirements
Ensure data integration solutions are secure, reliable, and performant
Develop and maintain documentation, including technical specifications, data flow diagrams, and data mappings
Continuously learn and stay up-to-date with the latest data integration approaches and tools

Requirements
Bachelor's degree in Computer Science, Information Systems, or a related field
8-13 years of experience in data engineering, data integration, or a related field
Experience with cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow
Strong knowledge of SQL for querying and manipulating data
Experience with Snowflake for cloud data warehousing
Experience with at least one cloud platform such as AWS, Azure, or GCP
Experience leading a team of engineers on data integration projects
Good verbal and written communication skills in English at a B2 level

Nice to have
Experience with ETL using Python

Posted 2 months ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

We are looking for a skilled Lead Data Engineer to enhance our dynamic team. In this role, you will focus on designing, developing, and maintaining data integration solutions for our clients. Your leadership will guide a team of engineers in delivering scalable, high-quality, and efficient data integration solutions. This role is perfect for an experienced data integration expert who is passionate about technology and excels in a fast-paced, dynamic setting.

Responsibilities
Design, develop, and maintain data integration solutions for clients
Lead a team of engineers to ensure high-quality, scalable, and efficient delivery of data integration solutions
Collaborate with cross-functional teams to comprehend business requirements and design fitting data integration solutions
Ensure the security, reliability, and efficiency of data integration solutions
Develop and maintain documentation, including technical specifications, data flow diagrams, and data mappings
Continuously update knowledge on the latest data integration methods and tools

Requirements
Bachelor's degree in Computer Science, Information Systems, or a related field
8-13 years of experience in data engineering, data integration, or a related field
Proficiency in cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow
Strong knowledge of SQL for querying and manipulating data
Competency in Snowflake for cloud data warehousing
Familiarity with at least one cloud platform such as AWS, Azure, or GCP
Experience in leading a team of engineers on data integration projects
Good verbal and written communication skills in English at a B2 level

Nice to have
Background in ETL using Python

Posted 2 months ago

Apply

0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology. What You’ll Do Demonstrate a deep understanding of cloud native, distributed micro service based architectures Deliver solutions for complex business problems through software standard SDLC Build strong relationships with both internal and external stakeholders including product, business and sales partners Demonstrate excellent communication skills with the ability to both simplify complex problems and also dive deeper if needed Build and manage strong technical teams that deliver complex software solutions that scale Manage teams with cross functional skills that include software, quality, reliability engineers, project managers and scrum masters Provide deep troubleshooting skills with the ability to lead and solve production and customer issues under pressure Leverage strong experience in full stack software development and public cloud like GCP and AWS Mentor, coach and develop junior and senior software, quality and reliability engineers Lead with a data/metrics driven mindset with a maniacal focus towards optimizing and creating efficient solutions Ensure compliance with EFX secure software development guidelines and best practices and responsible for meeting and maintaining QE, DevSec, and FinOps KPIs Define, maintain and report SLA, SLO, SLIs meeting EFX engineering standards in partnership with the product, engineering and architecture teams Collaborate with architects, SRE leads and other technical leadership on strategic technical direction, guidelines, and best practices Drive up-to-date technical documentation including support, end user documentation and run books Lead Sprint planning, Sprint Retrospectives, and other team activity Responsible for implementation architecture decision making associated with Product features/stories, refactoring work, and EOSL decisions Create and deliver technical presentations to internal and external technical and non-technical stakeholders communicating with clarity and precision, and present complex information in a concise format that is audience appropriate What Experience You Need Bachelor's degree or equivalent experience 7+ years of software engineering experience 7+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS 7+ years experience with Cloud technology: GCP, AWS, or Azure 7+ years experience designing and developing cloud-native solutions 7+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes 7+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs What could set you apart Self-starter that identifies/responds to priority shifts with minimal supervision. 
Strong communication and presentation skills
Strong leadership qualities
Demonstrated problem solving skills and the ability to resolve conflicts
Experience creating and maintaining product and software roadmaps
Experience overseeing yearly as well as product/project budgets
Working in a highly regulated environment
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others
UI development (e.g. HTML, JavaScript, Angular, and Bootstrap)
Experience with backend technologies such as Java/J2EE, SpringBoot, SOA, and microservices
Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven and Gradle
Agile environments (e.g. Scrum, XP)
Relational databases (e.g. SQL Server, MySQL)
Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
Developing with a modern JDK (v1.7+)
Automated testing: JUnit, Selenium, LoadRunner, SoapUI

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax? At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
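As a hedged illustration of the Dataflow/Apache Beam and Pub/Sub experience called out above, here is a minimal streaming Beam pipeline sketch that reads JSON events from Pub/Sub and appends them to BigQuery. The project, subscription, bucket, and table names are hypothetical placeholders.

```python
# Minimal sketch of a streaming Beam pipeline (runnable on Dataflow) that
# reads JSON events from Pub/Sub and writes them to BigQuery. Project,
# subscription, bucket, and table names are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    streaming=True,
    project="my-project",
    runner="DataflowRunner",        # use DirectRunner for local testing
    region="us-central1",
    temp_location="gs://my-bucket/tmp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/events-sub")
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            table="my-project:analytics.events",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```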

Posted 2 months ago

Apply

0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology. What You’ll Do Design, develop, and operate high scale applications across the full engineering stack Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.) Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality. Manage sole project priorities, deadlines, and deliverables. Research, create, and develop software applications to extend and improve on Equifax Solutions Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activity What Experience You Need Bachelor's degree or equivalent experience 5+ years of software engineering experience 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS 5+ years experience with Cloud technology: GCP, AWS, or Azure 5+ years experience designing and developing cloud-native solutions 5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs What could set you apart Self-starter that identifies/responds to priority shifts with minimal supervision. Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others UI development (e.g. HTML, JavaScript, Angular and Bootstrap) Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices Source code control management systems (e.g. SVN/Git, Github) and build tools like Maven & Gradle. Agile environments (e.g. Scrum, XP) Relational databases (e.g. SQL Server, MySQL) Atlassian tooling (e.g. JIRA, Confluence, and Github) Developing with modern JDK (v1.7+) Automated Testing: JUnit, Selenium, LoadRunner, SoapUI We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference! Who is Equifax? At Equifax, we believe knowledge drives progress. 
As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Posted 2 months ago

Apply

3 years

0 Lacs

Hyderabad, Telangana, India

On-site

Role: SRE Manager - TechBlocks India
Location: Hyderabad & Ahmedabad
Full-Time, 3 days from office

The SRE Manager at TechBlocks India will lead the reliability engineering function, ensuring infrastructure resiliency and optimal operational performance. This hybrid role blends technical leadership with team mentorship and cross-functional coordination.

Requirements: 10+ years total experience, with 3+ years in a leadership role in SRE or Cloud Operations. Deep understanding of Kubernetes, GKE, Prometheus, and Terraform. Cloud: advanced GCP administration. CI/CD: Jenkins, Argo CD, GitHub Actions. Incident management: full lifecycle, with tools like OpsGenie. Knowledge of service mesh and observability stacks. Strong scripting skills (Python, Bash). BigQuery/Dataflow exposure for telemetry.

Responsibilities: Build and lead a team of SREs. Standardize practices for reliability, alerting, and response. Engage with Engineering and Product leaders. Establish and lead the implementation of organizational reliability strategies, aligning SLAs, SLOs, and error budgets with business goals and customer expectations. Develop and institutionalize incident response frameworks, including escalation policies, on-call scheduling, service ownership mapping, and RCA process governance. Lead technical reviews for infrastructure reliability design, high-availability architectures, and resiliency patterns across distributed cloud services. Champion observability and monitoring culture by standardizing tooling, alert definitions, dashboard templates, and telemetry data schemas across all product teams. Drive continuous improvement through operational maturity assessments, toil elimination initiatives, and SRE OKRs aligned with product objectives. Collaborate with cloud engineering and platform teams to introduce self-healing systems, capacity-aware autoscaling, and latency-optimized service mesh patterns. Act as the principal escalation point for reliability-related concerns and ensure incident retrospectives lead to measurable improvements in uptime and MTTR. Own runbook standardization, capacity planning, failure mode analysis, and production readiness reviews for new feature launches. Mentor and develop a high-performing SRE team, fostering a proactive ownership culture, encouraging cross-functional knowledge sharing, and establishing technical career pathways. Collaborate with leadership, delivery, and customer stakeholders to define reliability goals, track performance, and demonstrate ROI on SRE investments.

TechBlocks is a global digital product engineering company with 16+ years of experience helping Fortune 500 enterprises and high-growth brands accelerate innovation, modernize technology, and drive digital transformation. From cloud solutions and data engineering to experience design and platform modernization, we help businesses solve complex challenges and unlock new growth opportunities. At TechBlocks, we believe technology is only as powerful as the people behind it. We foster a culture of collaboration, creativity, and continuous learning, where big ideas turn into real impact. Whether you're building seamless digital experiences, optimizing enterprise platforms, or tackling complex integrations, you'll be part of a dynamic, fast-moving team that values innovation and ownership. Join us and shape the future of digital transformation.
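To illustrate the SLO and error-budget language used above, here is a small Python sketch of the basic arithmetic: the error budget is the complement of the SLO target applied to request volume, and burn is failures divided by that budget. The SLO target and request counts below are made-up example inputs.

```python
# Minimal sketch of the error-budget arithmetic behind an availability SLO.
# The SLO target and request counts are made-up example inputs.
def error_budget_report(slo_target: float, total_requests: int, failed_requests: int) -> dict:
    """Summarize how much of the error budget implied by an SLO has been consumed."""
    allowed_failure_ratio = 1.0 - slo_target                   # e.g. 0.001 for a 99.9% SLO
    budget_requests = allowed_failure_ratio * total_requests   # failures we can tolerate
    consumed = failed_requests / budget_requests if budget_requests else float("inf")
    return {
        "availability": round(1.0 - failed_requests / total_requests, 5),
        "error_budget_requests": round(budget_requests),
        "budget_consumed_pct": round(consumed * 100, 1),
    }

# Example: a 99.9% monthly SLO, 50M requests, 32,000 failures.
print(error_budget_report(0.999, 50_000_000, 32_000))
# -> roughly {'availability': 0.99936, 'error_budget_requests': 50000, 'budget_consumed_pct': 64.0}
```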

Posted 2 months ago

Apply

0 years

0 Lacs

Mumbai, Maharashtra, India

Remote

Position Title: Data Scientist II
Function/Group: R&D/Packaging
Location: Mumbai
Shift Timing: Regular
Role Reports to: Sr. Manager, Global Knowledge Solutions
Remote/Hybrid/In-Office: Hybrid

About General Mills
We make food the world loves: 100 brands. In 100 countries. Across six continents. With iconic brands like Cheerios, Pillsbury, Betty Crocker, Nature Valley, and Haagen-Dazs, we have been serving up food the world loves for 155 years (and counting). Each of our brands has a unique story to tell. How we make our food is as important as the food we make. Our values are baked into our legacy and continue to accelerate us into the future as an innovative force for good. General Mills was founded in 1866 when Cadwallader Washburn boldly bought the largest flour mill west of the Mississippi. That pioneering spirit lives on today through our leadership team who upholds a vision of relentless innovation while being a force for good. For more details check out http://www.generalmills.com

General Mills India Center (GIC) is our global capability center in Mumbai that works as an extension of our global organization delivering business value, service excellence and growth, while standing for good for our planet and people. With our team of 1800+ professionals, we deliver superior value across the areas of Supply Chain (SC), Digital & Technology (D&T), Innovation, Technology & Quality (ITQ), Consumer and Market Intelligence (CMI), Sales Strategy & Intelligence (SSI), Global Shared Services (GSS), Finance Shared Services (FSS) and Human Resources Shared Services (HRSS). For more details check out https://www.generalmills.co.in

We advocate for advancing equity and inclusion to create more equitable workplaces and a better tomorrow.

Job Overview
Function Overview: In partnership with our cross-functional partners, ITQ innovates and develops products that meet the ever-changing needs of our consumers and enables long-term business growth. We identify and develop technologies that shape and protect our businesses today and into the future. ITQ operates across three organizations: Global Applications, Capabilities COEs, and Shared Services & Operations. For more details about General Mills please visit this Link.

Purpose of the Role: The Global Knowledge Services (GKS) organization catalyzes the creation, transfer, and application of knowledge to ensure ITQ succeeds at its mission of driving internal and external innovation, developing differentiated technology, and engendering trust through food safety and quality. The scientists in the Statistics and Analytics Program Area will collaborate with US and India GKS team members to deliver high value statistical work that advances ITQ initiatives in consumer product research, health and nutrition science, research and development, and quality improvement. The Data Scientist II in this program area will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure to support advanced analytics, data science, and business intelligence across our organization, leveraging GCP services. This role requires close collaboration with statisticians, data scientists, and BI developers to ensure timely, reliable, and quality data delivery that drives insights and decision-making.

Key Accountabilities
70% of Time: Excellent Technical Work
Design, develop, and optimize data pipelines and ETL/ELT workflows using GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.)
Build and maintain data architecture that supports structured and unstructured data from multiple sources
Work closely with statisticians and data scientists to provision clean, transformed datasets for advanced modeling and analytics
Enable self-service BI through efficient data modeling and provisioning in tools like Looker, Power BI, or Tableau
Implement data quality checks, monitoring, and documentation to ensure high data reliability and accuracy (an illustrative sketch follows this posting)
Collaborate with DevOps/Cloud teams to ensure data infrastructure is secure, scalable, and cost-effective
Support and optimize workflows for data exploration, experimentation, and productization of models
Participate in data governance efforts, including metadata management, data cataloging, and access controls

15% of Time: Client Consultation and Business Partnering
Work effectively with clients to identify client needs and success criteria, and translate them into clear project objectives, timelines, and plans.
Be responsive and timely in sharing project updates, responding to client queries, and delivering on project commitments.
Clearly communicate analysis, insights, and conclusions to clients using written reports and real-time meetings.

10% of Time: Innovation, Continuous Improvement (CI), and Personal Development
Learn and apply a CI mindset to work, seeking opportunities for improvements in efficiency and client value.
Identify new resources, develop new methods, and seek external inspiration to drive innovations in our work processes.
Continually build skills and knowledge in the fields of statistics and the relevant sciences.

5% of Time: Administration
Participate in all required training (Safety, HR, Finance, CI, other) and actively participate in GKS and ITQ meetings, events, and activities.
Complete other administrative tasks as required.

Minimum Qualifications
Minimum Degree Requirements: Masters from an accredited university
Minimum 6 years of related experience required

Specific Job Experience or Skills Needed
6+ years of experience in data engineering roles, including strong hands-on GCP experience
Proficiency in GCP services like BigQuery, Cloud Storage, Cloud Composer (Airflow), Dataflow, and Pub/Sub
Strong SQL skills and experience working with large-scale data warehouses
Solid programming skills in Python and/or Java/Scala
Experience with data modeling, schema design, and performance tuning
Familiarity with CI/CD, Git, and infrastructure-as-code principles (Terraform preferred)
Strong communication and collaboration skills across cross-functional teams

For Global Knowledge Services
Ability to effectively work cross-functionally with internal/global team members.
High self-motivation, with the ability to work both independently and in teams.
Excels at driving projects to completion, with attention to detail.
Ability to exercise judgment in handling confidential and proprietary information.
Ability to effectively prioritize, multi-task, and execute tasks according to a plan.
Able to work on multiple priorities and projects simultaneously.
Demonstrated creative problem-solving abilities, attention to detail, and the ability to “think outside the box.”

Preferred Qualifications
Preferred Major Area of Study: Master’s degree in Computer Science, Engineering, Data Science, or a related field
Preferred Professional Certifications: GCP
Preferred 6 years of related experience

Company Overview
We exist to make food the world loves. But we do more than that.
Our company is a place that prioritizes being a force for good, a place to expand learning, explore new perspectives and reimagine new possibilities, every day. We look for people who want to bring their best — bold thinkers with big hearts who challenge one another and grow together. Because becoming the undisputed leader in food means surrounding ourselves with people who are hungry for what’s next.
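As an illustration of the data quality checks mentioned in the accountabilities above, here is a minimal sketch using the BigQuery Python client to validate a freshly loaded partition before downstream use. The project, dataset, table, and column names are hypothetical placeholders.

```python
# Minimal sketch of a post-load data quality check with the BigQuery client:
# fail loudly if a freshly loaded partition is empty or contains null keys.
# Project, dataset, table, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

CHECK_SQL = """
SELECT
  COUNT(*) AS row_count,
  COUNTIF(order_id IS NULL) AS null_keys
FROM `my-project.sales.orders`
WHERE load_date = @load_date
"""

job = client.query(
    CHECK_SQL,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[bigquery.ScalarQueryParameter("load_date", "DATE", "2024-06-01")]
    ),
)
result = list(job.result())[0]

if result.row_count == 0 or result.null_keys > 0:
    raise ValueError(
        f"Data quality check failed: rows={result.row_count}, null keys={result.null_keys}"
    )
```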

Posted 2 months ago

Apply

0 years

0 Lacs

Goregaon, Maharashtra, India

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. " Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Responsibilities Design and implement scalable, efficient, and secure data pipelines on GCP, utilizing tools such as BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage. Collaborate with cross-functional teams (data scientists, analysts, and software engineers) to understand business requirements and deliver actionable data solutions. Develop and maintain ETL/ELT processes to ingest, transform, and load data from various sources into GCP-based data warehouses. Build and manage data lakes and data marts on GCP to support analytics and business intelligence initiatives. Implement automated data quality checks, monitoring, and alerting systems to ensure data integrity.  Optimize and tune performance for large-scale data processing jobs in BigQuery, Dataflow, and other GCP tools. Create and maintain data pipelines to collect, clean, and transform data for analytics and machine learning purposes. 
Ensure data governance and compliance with organizational policies, including data security, privacy, and access controls. Stay up to date with new GCP services and features and make recommendations for improvements and new implementations.

Mandatory Skill Sets: GCP, BigQuery, Dataproc
Preferred Skill Sets: GCP, BigQuery, Dataproc, Airflow
Years of Experience Required: 4-7
Education Qualification: B.Tech / M.Tech / MBA / MCA
Degrees/Field of Study Required: Master of Engineering, Master of Business Administration, Bachelor of Engineering
Degrees/Field of Study Preferred: (not specified)
Certifications: (not specified)
Required Skills: Google Cloud Platform (GCP)
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline, Data Quality, Data Transformation, Data Validation (+ 18 more)
Desired Languages: (not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
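As a hedged illustration of the BigQuery and Dataproc skills listed above, here is a minimal PySpark sketch of the kind of job that runs on Dataproc: read raw files from Cloud Storage, aggregate them, and write the result to BigQuery via the spark-bigquery connector. The bucket, dataset, and column names are hypothetical placeholders.

```python
# Minimal sketch of a PySpark job for Dataproc: read raw CSVs from Cloud
# Storage, aggregate, and write to BigQuery using the spark-bigquery
# connector. Bucket, dataset, and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_summary").getOrCreate()

orders = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("gs://my-bucket/raw/orders/*.csv")
)

summary = (
    orders.groupBy("order_date", "region")
    .agg(F.count("*").alias("orders"), F.sum("amount").alias("revenue"))
)

(
    summary.write
    .format("bigquery")                               # spark-bigquery connector
    .option("table", "my-project.reporting.orders_daily")
    .option("temporaryGcsBucket", "my-bucket-tmp")    # staging bucket for the load
    .mode("overwrite")
    .save()
)
```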

Posted 2 months ago

Apply