
1096 Dataflow Jobs - Page 20

Set up a job alert

JobPe aggregates listings for easy access, but applications are submitted directly on each employer's job portal.

5.0 - 7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Introduction

In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role and Responsibilities

- Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc.
- Must have hands-on Python and SQL experience; proactive, collaborative, and able to respond to critical situations.
- Able to analyse data against functional business requirements and interface directly with customers.

Preferred Education

Master's Degree

Required Technical and Professional Expertise

- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred Technical and Professional Experience

- Intuitive individual with an ability to manage change and proven time management.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Keeps technical knowledge up to date by attending educational workshops and reviewing publications.
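The IBM role above pairs Python with SQL on BigQuery. As an illustrative sketch (not from the posting), an analyst can prototype the logical shape of a BigQuery aggregation locally with Python's built-in sqlite3 module before running it at scale; the `orders` table and its columns here are hypothetical:

```python
import sqlite3

# Prototype a BigQuery-style aggregation locally before porting it to GCP.
# The `orders` table and its columns are hypothetical placeholders.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("south", 120.0), ("south", 80.0), ("north", 250.0)],
)

# The same SELECT shape runs on BigQuery once the table lives there.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total "
    "FROM orders GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)  # [('north', 250.0), ('south', 200.0)]
```

BigQuery's Standard SQL dialect differs from SQLite in functions and types, so this only validates a query's logical shape, not its GCP behaviour.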

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Pune

Work from Office

Responsibilities

- Lead and mentor a team of data engineers, providing technical guidance, setting best practices, and overseeing task execution for the migration project.
- Design, develop, and architect scalable ETL processes to extract, transform, and load petabytes of data from on-premises SQL Server to GCP Cloud SQL for PostgreSQL.
- Oversee the comprehensive analysis of existing SQL Server schemas, data types, stored procedures, and complex data models, defining strategies for their optimal conversion and refactoring for PostgreSQL.
- Establish and enforce rigorous data validation, quality, and integrity frameworks throughout the migration lifecycle, ensuring accuracy and consistency.
- Collaborate strategically with database administrators, application architects, business stakeholders, and security teams to define migration scope, requirements, and cutover plans.
- Lead the development and maintenance of advanced scripts (primarily Python) for automating large-scale migration tasks, complex data transformations, and reconciliation processes.
- Proactively identify, troubleshoot, and lead the resolution of complex data discrepancies, performance bottlenecks, and technical challenges during migration.
- Define and maintain comprehensive documentation standards for migration strategies, data mapping, transformation rules, and post-migration validation procedures.
- Ensure data governance, security, and compliance standards are meticulously applied throughout the migration process, including data encryption and access controls within GCP.
- Implement schema conversion or custom schema-mapping strategies for the SQL Server to PostgreSQL shift.
- Refactor and translate complex stored procedures and T-SQL logic to PostgreSQL-compatible constructs while preserving functional equivalence.
- Develop and execute comprehensive data reconciliation strategies to ensure consistency and parity between legacy and migrated datasets post-cutover.
- Design fallback procedures and lead post-migration verification and support to ensure business continuity.
- Ensure metadata cataloging and data lineage tracking using GCP-native or integrated tools.

Must-Have Skills

- Expertise in data engineering, specifically for Google Cloud Platform (GCP).
- Deep understanding of relational database architecture, advanced schema design, data modeling, and performance tuning.
- Expert-level SQL proficiency, with extensive hands-on experience in both T-SQL (SQL Server) and PostgreSQL.
- Hands-on experience with data migration processes, including moving datasets from on-premises databases to cloud storage solutions.
- Proficiency in designing, implementing, and optimizing complex ETL/ELT pipelines for high-volume data movement, leveraging tools and custom scripting.
- Strong knowledge of GCP services: Cloud SQL, Dataflow, Pub/Sub, Cloud Storage, Dataproc, Cloud Composer, Cloud Functions, and BigQuery.
- Solid understanding of data governance, security, and compliance practices in the cloud, including the management of sensitive data during migration.
- Strong programming skills in Python or Java for building data pipelines and automating processes.
- Experience with real-time data processing using Pub/Sub, Dataflow, or similar GCP services.
- Experience with CI/CD practices and tools like Jenkins, GitLab, or Cloud Build for automating the data engineering pipeline.
- Knowledge of data modeling and best practices for structuring cloud data storage for optimal query performance and analytics in GCP.
- Familiarity with observability and monitoring tools in GCP (e.g., Stackdriver, Prometheus) for real-time data pipeline visibility and alerting.

Good-to-Have Skills

- Direct experience with GCP Database Migration Service, Storage Transfer Service, or similar cloud-native migration tools.
- Familiarity with data orchestration using tools like Cloud Composer (based on Apache Airflow) for managing workflows.
- Experience with containerization tools like Docker and Kubernetes for deploying data pipelines in a scalable manner.
- Exposure to DataOps tools and methodologies for managing data workflows.
- Experience with machine learning platforms like AI Platform in GCP to integrate with data pipelines.
- Familiarity with data lake architecture and the integration of BigQuery with Google Cloud Storage or Dataproc.
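The post-cutover reconciliation responsibility described above (verifying parity between legacy and migrated datasets) can be sketched in plain Python. This is a minimal illustration, not the project's actual tooling: in practice the rows would come from SQL Server and Cloud SQL PostgreSQL cursors, and the function and column layout here are hypothetical:

```python
import hashlib

def row_fingerprint(row):
    """Stable checksum of one row; column order must match on both sides."""
    joined = "|".join("" if v is None else str(v) for v in row)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

def reconcile(source_rows, target_rows, key_index=0):
    """Compare keyed rows from the legacy and migrated databases.

    Returns (keys_missing_in_target, keys_with_mismatched_values).
    Inputs are plain tuples as a DB cursor would return them.
    """
    src = {row[key_index]: row_fingerprint(row) for row in source_rows}
    tgt = {row[key_index]: row_fingerprint(row) for row in target_rows}
    missing = sorted(k for k in src if k not in tgt)
    mismatched = sorted(k for k in src if k in tgt and src[k] != tgt[k])
    return missing, mismatched

source = [(1, "alice", "2020-01-01"), (2, "bob", "2020-02-01"), (3, "carol", "2020-03-01")]
target = [(1, "alice", "2020-01-01"), (2, "bobby", "2020-02-01")]
print(reconcile(source, target))  # ([3], [2])
```

At petabyte scale the same idea would be pushed into SQL (per-partition counts and hash aggregates) rather than pulling rows to a client, but the count-plus-checksum shape is the same.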

Posted 1 month ago

Apply

0 years

5 - 9 Lacs

Hyderābād

On-site

Req ID: 327059

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a GCP Python PySpark Engineer to join our team in Hyderabad, Telangana (IN-TG), India (IN).

- Strong hands-on experience in designing and building data pipelines using Google Cloud Platform (GCP) services like BigQuery, Dataflow, and Cloud Composer.
- Proficient in Python for data processing, scripting, and automation in cloud and distributed environments.
- Solid working knowledge of Apache Spark / PySpark, with experience in large-scale data transformation and performance tuning.
- Familiar with CI/CD processes, version control (Git), and workflow orchestration tools such as Airflow or Composer.
- Ability to work independently in fast-paced Agile environments, with strong problem-solving and communication skills.
- Exposure to modern data architectures and real-time/streaming data solutions is an added advantage.

About NTT DATA

NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future.
Visit us at us.nttdata.com. NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.
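For the PySpark skills the NTT DATA role lists, the core transformation shape (deduplicate, then group and count) can be sketched without a Spark cluster. This is a hedged plain-Python illustration; the field names and the PySpark one-liner in the comment are illustrative, not code from the posting:

```python
from itertools import groupby
from operator import itemgetter

# The same logic one would express in PySpark as
#   df.dropDuplicates(["event_id"]).groupBy("user").count()
# sketched in plain Python; field names are illustrative.
events = [
    {"event_id": 1, "user": "a"},
    {"event_id": 1, "user": "a"},   # duplicate delivery
    {"event_id": 2, "user": "a"},
    {"event_id": 3, "user": "b"},
]

# Deduplicate on event_id, keeping the first occurrence.
seen, unique = set(), []
for e in events:
    if e["event_id"] not in seen:
        seen.add(e["event_id"])
        unique.append(e)

# Group by user and count, like groupBy("user").count().
unique.sort(key=itemgetter("user"))
counts = {u: len(list(g)) for u, g in groupby(unique, key=itemgetter("user"))}
print(counts)  # {'a': 2, 'b': 1}
```

In Spark the same operations run partitioned across executors; the local version is only for reasoning about correctness.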

Posted 1 month ago

Apply

5.0 years

4 - 7 Lacs

Thiruvananthapuram

On-site

Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What you'll do

- Design, develop, and operate high-scale applications across the full engineering stack.
- Design, develop, test, deploy, maintain, and improve software.
- Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
- Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset.
- Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
- Participate in a tight-knit, globally distributed engineering team.
- Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network or service operations and quality.
- Manage sole project priorities, deadlines, and deliverables.
- Research, create, and develop software applications to extend and improve on Equifax solutions.
- Collaborate on scalability issues involving access to data and information.
- Actively participate in Sprint planning, Sprint Retrospectives, and other team activities.

What experience you need

- Bachelor's degree or equivalent experience
- 5+ years of software engineering experience
- 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, Spring Boot, TypeScript/JavaScript, HTML, CSS
- 5+ years experience with cloud technology: GCP, AWS, or Azure
- 5+ years experience designing and developing cloud-native solutions
- 5+ years experience designing and developing microservices using Java, Spring Boot, GCP SDKs, GKE/Kubernetes
- 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understanding infrastructure-as-code concepts, Helm charts, and Terraform constructs

What could set you apart

- Self-starter who identifies/responds to priority shifts with minimal supervision
- Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others
- UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
- Experience with backend technologies such as Java/J2EE, Spring Boot, SOA and microservices
- Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven & Gradle
- Agile environments (e.g. Scrum, XP)
- Relational databases (e.g. SQL Server, MySQL)
- Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
- Developing with a modern JDK (v1.7+)
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI
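Several of these listings call out Dataflow/Apache Beam experience. One core Beam concept, fixed (tumbling) event-time windows, can be illustrated in plain Python; this sketches the idea only, and the event data and function name are hypothetical:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Count events per fixed (tumbling) event-time window.

    Mirrors what Apache Beam's FixedWindows does on Dataflow: each event
    is assigned to the window containing its timestamp. `events` is a
    list of (timestamp_secs, key) pairs; names are illustrative.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(5, "click"), (42, "click"), (61, "click"), (70, "view")]
print(tumbling_window_counts(events))
# {(0, 'click'): 2, (60, 'click'): 1, (60, 'view'): 1}
```

A real Beam pipeline adds watermarks and triggers to handle late, unbounded streams; this local version only shows the window-assignment arithmetic.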

Posted 1 month ago

Apply

5.0 years

5 - 9 Lacs

Thiruvananthapuram

On-site

Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What you'll do

- Design, develop, and operate high-scale RPA/AI applications across the full engineering stack.
- Design, develop, test, deploy, maintain, and improve software.
- Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
- Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset.
- Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
- Participate in a tight-knit, globally distributed engineering team.
- Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network or service operations and quality.
- Manage sole project priorities, deadlines, and deliverables.
- Research, create, and develop software applications to extend and improve on Equifax solutions.
- Collaborate on scalability issues involving access to data and information.
- Actively participate in Sprint planning, Sprint Retrospectives, and other team activities.

What experience you need

- Bachelor's degree or equivalent experience
- 5+ years of software engineering experience
- 5+ years experience writing, debugging, and troubleshooting code in mainstream RPA/AI tooling and Python
- 5+ years experience with cloud technology: GCP, AWS, or Azure
- 5+ years experience designing and developing cloud-native solutions
- 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understanding infrastructure-as-code concepts, Helm charts, and Terraform constructs

What could set you apart

- Self-starter who identifies/responds to priority shifts with minimal supervision
- Experience designing and developing UiPath RPA solutions using Dataflow/Apache Beam, SQL Server, BigQuery, Pub/Sub, GCS, and others
- Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven & Gradle
- Agile environments (e.g. Scrum, XP)
- Relational databases (e.g. SQL Server, MySQL)
- Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI

Posted 1 month ago

Apply

5.0 years

4 - 7 Lacs

Thiruvananthapuram

On-site

Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What you'll do

- Design, develop, and operate high-scale applications across the full engineering stack.
- Design, develop, test, deploy, maintain, and improve software.
- Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
- Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset.
- Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
- Participate in a tight-knit, globally distributed engineering team.
- Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network or service operations and quality.
- Manage sole project priorities, deadlines, and deliverables.
- Research, create, and develop software applications to extend and improve on Equifax solutions.
- Collaborate on scalability issues involving access to data and information.
- Actively participate in Sprint planning, Sprint Retrospectives, and other team activities.

What experience you need

- Bachelor's degree or equivalent experience
- 5+ years of software engineering experience
- 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, Spring Boot, TypeScript/JavaScript, HTML, CSS
- 5+ years experience with cloud technology: GCP, AWS, or Azure
- 5+ years experience designing and developing cloud-native solutions
- 5+ years experience designing and developing microservices using Java, Spring Boot, GCP SDKs, GKE/Kubernetes
- 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understanding infrastructure-as-code concepts, Helm charts, and Terraform constructs

What could set you apart

- Self-starter who identifies/responds to priority shifts with minimal supervision
- Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others
- UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
- Experience with backend technologies such as Java/J2EE, Spring Boot, SOA and microservices
- Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven & Gradle
- Agile environments (e.g. Scrum, XP)
- Relational databases (e.g. SQL Server, MySQL)
- Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
- Developing with a modern JDK (v1.7+)
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI

Posted 1 month ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You'll Do

- Design, develop, and operate high-scale applications across the full engineering stack.
- Design, develop, test, deploy, maintain, and improve software.
- Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
- Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset.
- Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
- Participate in a tight-knit, globally distributed engineering team.
- Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network or service operations and quality.
- Manage sole project priorities, deadlines, and deliverables.
- Research, create, and develop software applications to extend and improve on Equifax solutions.
- Collaborate on scalability issues involving access to data and information.
- Actively participate in Sprint planning, Sprint Retrospectives, and other team activities.

What Experience You Need

- Bachelor's degree or equivalent experience
- 5+ years of software engineering experience
- 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, Spring Boot, TypeScript/JavaScript, HTML, CSS
- 5+ years experience with cloud technology: GCP, AWS, or Azure
- 5+ years experience designing and developing cloud-native solutions
- 5+ years experience designing and developing microservices using Java, Spring Boot, GCP SDKs, GKE/Kubernetes
- 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understanding infrastructure-as-code concepts, Helm charts, and Terraform constructs

What could set you apart

- Self-starter who identifies/responds to priority shifts with minimal supervision
- Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others
- UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
- Experience with backend technologies such as Java/J2EE, Spring Boot, SOA and microservices
- Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven & Gradle
- Agile environments (e.g. Scrum, XP)
- Relational databases (e.g. SQL Server, MySQL)
- Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
- Developing with a modern JDK (v1.7+)
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI

Posted 1 month ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

About Us

Transcloud is a cloud technology services company that helps businesses adopt the cloud to empower them for the future.

Job Description

We are seeking a skilled and experienced Cloud Engineer to join our dynamic team. As a Cloud Engineer, you will play a crucial role in implementing and managing cloud architectures for our clients' software applications. Your strong expertise in Google Cloud Platform (GCP) implementations, programming languages, and cloud ecosystem design will contribute to the success of our cloud-based solutions. We offer a highly competitive salary commensurate with industry standards.

Minimum Qualifications:
- Demonstrated experience in implementing cloud architecture for software applications.
- Extensive expertise in GCP implementations.
- Proficiency in at least one programming/scripting language.
- Proficient in using Linux CLI commands and the Google Cloud SDK.
- Ability to design holistic cloud ecosystems with a focus on Google Cloud Platform capabilities and features.
- Familiarity with Cloud Shell and GCP commands such as gcloud and gsutil.
- Hands-on experience with Kubernetes, DevOps, and developing and managing CI/CD pipelines.
- Hands-on experience with GCP IaaS services such as GCE, GAE, GKE, VPC, DNS, Interconnect, VPN, CDN, Cloud Storage, Filestore, Firebase, Deployment Manager, and Stackdriver.
- Familiarity with GCP services including Cloud Endpoints, Dataflow, Dataproc, Datalab, Dataprep, Cloud Composer, Pub/Sub, and Cloud Functions.

Responsibilities:
- Troubleshoot issues, actively seeking out problems and providing effective solutions.
- Implement HA and DR solutions.
- Be an active participant in the running of the team, fostering a great place to work.
- Engage with the wider business to identify opportunities for future work for the team.
- Experiment with new technologies to help push the boundaries of what the team is building.

Requirements
- Professional certifications related to cloud platforms, specifically Google Cloud Platform.
- Knowledge of containerization technologies (e.g., Docker, Kubernetes).
- Familiarity with DevOps practices and tools.
- Understanding of basic network and security principles in cloud environments.
- Experience with automation and infrastructure-as-code tools, preferably Terraform.
- Experience with other cloud platforms such as AWS or Azure is good to have.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.

Benefits
✔ Health insurance for a worry-free lifestyle.
✔ Flexible work hours for better work-life balance.
✔ Informal dress code to express your individuality.
✔ Enjoy a 5-day work week to pursue your passions outside of work.
✔ Exposure to working directly with clients and growing your career.

Apply here: https://wetranscloud.zohorecruit.in/jobs/Careers/77118000003921192/Senior-Cloud-Engineer-GCP

Posted 1 month ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You'll Do

- Design, develop, and operate high-scale applications across the full engineering stack.
- Design, develop, test, deploy, maintain, and improve software.
- Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
- Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset.
- Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
- Participate in a tight-knit, globally distributed engineering team.
- Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network or service operations and quality.
- Manage sole project priorities, deadlines, and deliverables.
- Research, create, and develop software applications to extend and improve on Equifax solutions.
- Collaborate on scalability issues involving access to data and information.
- Actively participate in Sprint planning, Sprint Retrospectives, and other team activities.

What Experience You Need

- Bachelor's degree or equivalent experience
- 5+ years of software engineering experience
- 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, Spring Boot, TypeScript/JavaScript, HTML, CSS
- 5+ years experience with cloud technology: GCP, AWS, or Azure
- 5+ years experience designing and developing cloud-native solutions
- 5+ years experience designing and developing microservices using Java, Spring Boot, GCP SDKs, GKE/Kubernetes
- 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understanding infrastructure-as-code concepts, Helm charts, and Terraform constructs

What could set you apart

- Self-starter who identifies/responds to priority shifts with minimal supervision
- Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others
- UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
- Experience with backend technologies such as Java/J2EE, Spring Boot, SOA and microservices
- Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven & Gradle
- Agile environments (e.g. Scrum, XP)
- Relational databases (e.g. SQL Server, MySQL)
- Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
- Developing with a modern JDK (v1.7+)
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax?

At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life's pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real, and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference, and we are looking for talented team players to join us as we help people live their financial best.

Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Hybrid

Shift: (GMT+05:30) Asia/Kolkata (IST)

What do you need for this opportunity?

Must-have skills required: Data Engineering, Big Data Technologies, Hadoop, Spark, Hive, Presto, Airflow, Data Modeling, ETL Development, Data Lake Architecture, Python, Scala, GCP BigQuery, Dataproc, Dataflow, Cloud Composer, AWS, Big Data Stack, Azure, GCP

Wayfair is looking for:

About the job

The Data Engineering team within the SMART org supports development of large-scale data pipelines for machine learning and analytical solutions related to unstructured and structured data. You'll have the opportunity to gain hands-on experience on all kinds of systems in the data platform ecosystem. Your work will have a direct impact on all applications that our millions of customers interact with every day: search results, homepage content, emails, auto-complete searches, browse pages and product carousels. You will build and scale data platforms that enable us to measure the effectiveness of Wayfair's ad costs and media attribution, which informs both day-to-day and major marketing spend decisions.

About the Role:

As a Data Engineer, you will be part of the Data Engineering team, with this role being inherently multi-functional; the ideal candidate will work with Data Scientists, Analysts, and application teams across the company, as well as all other Data Engineering squads at Wayfair. We are looking for someone with a love for data, an ability to understand requirements clearly, and the ability to iterate quickly. Successful candidates will have strong engineering and communication skills and a belief that data-driven processes lead to phenomenal products.

What you'll do:
- Build and launch data pipelines and data products focused on the SMART org.
- Help teams push the boundaries of insights, creating new product features using data and powering machine learning models.
- Build cross-functional relationships to understand data needs, build key metrics and standardize their usage across the organization.
- Utilize current and leading-edge technologies in software engineering, big data, streaming, and cloud infrastructure.

What you'll need:
- Bachelor's/Master's degree in Computer Science or a related technical subject area, or an equivalent combination of education and experience.
- 3+ years relevant work experience in the Data Engineering field with web-scale data sets.
- Demonstrated strength in data modeling, ETL development, and data lake architecture.
- Data warehousing experience with big data technologies (Hadoop, Spark, Hive, Presto, Airflow, etc.).
- Coding proficiency in at least one modern programming language (Python, Scala, etc.).
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing, plus query performance tuning skills on large data sets.
- Industry experience as a Big Data Engineer working alongside cross-functional teams such as Software Engineering, Analytics, and Data Science, with a track record of manipulating, processing, and extracting value from large datasets.
- Strong business acumen.
- Experience leading large-scale data warehousing and analytics projects, including using GCP technologies (BigQuery, Dataproc, GCS, Cloud Composer, Dataflow) or related big data technologies in other cloud platforms like AWS, Azure, etc.
- Be a team player and introduce/follow best practices in the data engineering space.
- Ability to effectively communicate (both written and verbally) technical information and the results of engineering design at all levels of the organization.

Good to have:
- Understanding of NoSQL databases and Pub/Sub architecture setup.
- Familiarity with BI tools like Looker, Tableau, AtScale, PowerBI, or similar.
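The Wayfair role leans on Airflow/Cloud Composer for orchestration. At its core, a DAG is a set of tasks plus dependency edges executed in topological order, which the standard library can illustrate; the task names below are hypothetical and this is not an Airflow API example:

```python
from graphlib import TopologicalSorter

# The dependency ordering Airflow/Cloud Composer enforces for a DAG,
# sketched with the standard library. Each key maps a task to the set
# of tasks that must finish before it; task names are illustrative.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# static_order() yields tasks with all predecessors satisfied first,
# exactly the order a scheduler would dispatch them in.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'load', 'report']
```

In Airflow the same structure is declared with operators and `>>` dependencies, and the scheduler additionally handles retries, backfills, and parallelism; the topological ordering is the common core.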

Posted 1 month ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Description

About GlobalLogic – With a workforce of 30,000+ employees, GlobalLogic is a leader in digital product engineering. We help brands across the globe design and build innovative products, platforms, and digital experiences for the modern world. By integrating experience design, complex engineering, and data expertise, we help our clients imagine what's possible and accelerate their transition into tomorrow's digital businesses. Headquartered in Silicon Valley, GlobalLogic operates design studios and engineering centers around the world, extending our deep expertise to customers in the communications, financial services, automotive, healthcare and life sciences, technology, media and entertainment, manufacturing, and semiconductor industries. GlobalLogic is a Hitachi Group Company. Feel free to watch other videos on the GlobalLogic India YouTube channel to understand what makes GlobalLogic truly exceptional!

Requirements

Leadership & Strategy
Lead and mentor a team of cloud engineers, providing technical guidance and career development support
Define cloud architecture standards and best practices across the organization
Collaborate with senior leadership to develop a cloud strategy and roadmap aligned with business objectives
Drive technical decision-making for complex cloud infrastructure projects
Establish and maintain cloud governance frameworks and operational procedures

Leadership Experience
3+ years in technical leadership roles managing engineering teams
Proven track record of successfully delivering large-scale cloud transformation projects
Experience with budget management and resource planning
Strong presentation and communication skills for executive-level reporting

Certifications (Preferred)
Google Cloud Professional Cloud Architect
Google Cloud Professional Data Engineer
Additional relevant cloud or security certifications

Technical Excellence
10+ years of experience in designing and implementing enterprise-scale cloud solutions using GCP services
Architect and oversee the development of sophisticated cloud solutions using Python and advanced GCP services
Lead the design and deployment of solutions utilizing Cloud Functions, Docker containers, Dataflow, and other GCP services
Design complex integrations with multiple data sources and systems, ensuring optimal performance and scalability
Implement and enforce security best practices and compliance frameworks across all cloud environments
Troubleshoot and resolve complex technical issues while establishing preventive measures

Job responsibilities

Technical Skills
Programming Languages: Expert-level proficiency in Python, with experience in additional languages (Java, Go, or Scala preferred)
GCP Services: Deep expertise with the comprehensive GCP service portfolio, including Dataflow, Compute Engine, BigQuery, Cloud Functions, Cloud Run, Pub/Sub, GCS, IAM, VPC, and emerging services
Containerization & Orchestration: Advanced knowledge of Docker, Kubernetes (GKE), and container orchestration patterns
Cloud Security: Extensive experience implementing enterprise security frameworks, compliance standards (SOC 2, GDPR, HIPAA), and zero-trust architectures
Infrastructure as Code: Proficiency with Terraform, Cloud Deployment Manager, or similar tools
CI/CD: Experience with advanced deployment pipelines and GitOps practices

Cross-functional Collaboration
Partner with C-level executives, senior architects, and product leadership to translate business requirements into technical solutions
Lead cross-functional project teams and coordinate deliverables across multiple stakeholders
Present technical recommendations and project updates to executive leadership
Establish relationships with GCP technical account managers and solution architects

What we offer

Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first.
From day one, you'll experience an inclusive culture of acceptance and belonging, where you'll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders.

Learning and development. We are committed to your continuous learning and development. You'll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.

Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you'll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what's possible and bring new solutions to market. In the process, you'll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today.

Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way!

High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you're placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do.
About GlobalLogic GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world’s largest and most forward-thinking companies. Since 2000, we’ve been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.

Posted 1 month ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

About Company: Our client is a prominent Indian multinational corporation specializing in information technology (IT), consulting, and business process services. Headquartered in Bengaluru, it has gross revenue of ₹222.1 billion, a global workforce of 234,054, and a NASDAQ listing; it operates in over 60 countries and serves clients across various industries, including financial services, healthcare, manufacturing, retail, and telecommunications. The company consolidated its cloud, data, analytics, AI, and related businesses under the tech services business line. It has major delivery centers in India, in cities including Chennai, Pune, Hyderabad, Bengaluru, Kochi, Kolkata, and Noida.

Job Title: Data Analysis
Location: Pan India
Experience: 9-12 yrs
Job Type: Contract to hire
Notice Period: Immediate joiners

Mandatory Skills: CRM Analytics, Salesforce, ETL techniques, TCRM, UX design with advanced skills in binding, SAQL, and JSON

Job Description:
• Develop data recipes/dataflows in the CRM Analytics tool to support business requirements.
• Help establish best practices and standards for dataflow, data recipe, dashboard, and report/lens development and deployment.
• Perform data analysis, produce data samples/prototypes, and produce ad-hoc reports. Obtain and analyze data by accessing multiple sources, including ADL, EDW, and Salesforce.
• Partner and collaborate with IT, platform owners, and BI Analysts to provide input to project timelines, scope, and risks for transformational commission initiatives.
• Develop a clear understanding of the system landscape, data repositories, and their interrelationships.
• Develop reports and processes to continuously monitor data quality and integrity issues.
• Locate and define new process improvement opportunities.
• Perform extensive data unit testing and quality assurance (QA). Troubleshoot issues and work cross-functionally towards a solution.
• Respond to and resolve all assigned feedback (all priorities) within the set thresholds.
• Provide guidance/governance on best practices and approaches to design, implement, and sustain effective data-driven analytics solutions.

To succeed in this role, you'll need the following:
• A minimum of 8+ years in Salesforce and CRM Analytics, having successfully implemented at least one TCRM solution.
• Experience working with sales, service, and marketing data is highly preferred.
• Technical expertise building data models and ETL techniques (data integration using Sync, recipes, and dataflows).
• Expertise building Salesforce CRM Analytics dashboard/reporting capabilities and UX design, with advanced skills in binding, SAQL, and JSON.
• Experience building Einstein Discovery models and deploying/embedding the models in Sales Cloud page layouts.
• Familiarity with Einstein AI components like Next Best Action, Prediction Builder, and Insights is nice to have.
• Experience with CRM Analytics standard/templated apps.
• Experience working in an Agile/Scrum environment.
• Strong verbal & written communication skills. A self-starter who can operate with minimal supervision.
• Experience in problem solving, critical thinking, and priority setting.

Certifications:
• CRM Analytics & Einstein Discovery Certified (must)
• Salesforce Admin Certified (preferred)
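The data-quality monitoring described above boils down to scanning records for missing values and duplicate keys. A hypothetical, library-free sketch (the function name, fields, and records are invented for illustration and are not a CRM Analytics/TCRM API):

```python
def data_quality_report(records, key_field, required_fields):
    """Scan a list of dict records and report missing required values and
    duplicate keys -- a minimal stand-in for continuous data-quality checks."""
    seen, issues = set(), []
    for i, rec in enumerate(records):
        for field in required_fields:
            if rec.get(field) in (None, ""):
                issues.append((i, f"missing {field}"))
        key = rec.get(key_field)
        if key in seen:
            issues.append((i, f"duplicate {key_field}={key}"))
        seen.add(key)
    return issues

# Invented Salesforce-style rows for illustration
rows = [
    {"Id": "001", "Amount": 100},
    {"Id": "002", "Amount": None},
    {"Id": "001", "Amount": 50},
]
print(data_quality_report(rows, "Id", ["Amount"]))
# [(1, 'missing Amount'), (2, 'duplicate Id=001')]
```

In practice such checks would run inside a recipe/dataflow step or a scheduled job, with the issue list feeding a monitoring report.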

Posted 1 month ago

Apply

10.0 years

0 Lacs

Kochi, Kerala, India

On-site

Introduction

We are looking for candidates with 10+ years of experience in a data architect role.

Responsibilities include:
• Design and implement scalable, secure, and cost-effective data architectures using GCP.
• Lead the design and development of data pipelines with BigQuery, Dataflow, and Cloud Storage.
• Architect and implement data lakes, data warehouses, and real-time data processing solutions on GCP.
• Ensure data architecture aligns with business goals, governance, and compliance requirements.
• Collaborate with stakeholders to define data strategy and roadmap.
• Design and deploy BigQuery solutions for optimized performance and cost efficiency.
• Build and maintain ETL/ELT pipelines for large-scale data processing.
• Leverage Cloud Pub/Sub, Dataflow, and Cloud Functions for real-time data integration.
• Implement best practices for data security, privacy, and compliance in cloud environments.
• Integrate machine learning workflows with data pipelines and analytics tools.
• Define data governance frameworks and manage data lineage.
• Lead data modeling efforts to ensure consistency, accuracy, and performance across systems.
• Optimize cloud infrastructure for scalability, performance, and reliability.
• Mentor junior team members and ensure adherence to architectural standards.
• Collaborate with DevOps teams to implement Infrastructure as Code (Terraform, Cloud Deployment Manager).
• Ensure high availability and disaster recovery solutions are built into data systems.
• Conduct technical reviews, audits, and performance tuning for data solutions.
• Design solutions for multi-region and multi-cloud data architecture.
• Stay updated on emerging technologies and trends in data engineering and GCP.
• Drive innovation in data architecture, recommending new tools and services on GCP.

Certifications:
• Google Cloud certification is preferred.

Primary Skills:
• 7+ years of experience in data architecture, with at least 3 years in GCP environments.
• Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services.
• Strong experience in data warehousing, data lakes, and real-time data pipelines.
• Proficiency in SQL, Python, or other data processing languages.
• Experience with cloud security, data governance, and compliance frameworks.
• Strong problem-solving skills and ability to architect solutions for complex data environments.
• Google Cloud certification (Professional Data Engineer, Professional Cloud Architect) preferred.
• Leadership experience and ability to mentor technical teams.
• Excellent communication and collaboration skills.
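The ETL/ELT staging this role owns amounts to composing transform steps into one pipeline. A minimal, hypothetical sketch in plain Python (the stages, field names, and tax rate are invented; a real GCP pipeline would express these stages in Dataflow or BigQuery SQL):

```python
def compose(*stages):
    """Chain transform stages into a single pipeline callable -- a toy model
    of the extract -> transform -> load staging an ETL/ELT pipeline applies."""
    def pipeline(data):
        for stage in stages:
            data = stage(data)
        return data
    return pipeline

# Invented stages over raw order rows
extract = lambda rows: [r for r in rows if r.get("amount") is not None]  # drop bad rows
transform = lambda rows: [{**r, "amount": round(r["amount"] * 1.18, 2)} for r in rows]  # add 18% tax
load = lambda rows: sorted(rows, key=lambda r: r["id"])  # stand-in for a warehouse write

run = compose(extract, transform, load)
print(run([{"id": 2, "amount": 10.0}, {"id": 1, "amount": None}, {"id": 3, "amount": 5.0}]))
```

The design point is the same one Beam's PTransform chaining makes: each stage is independently testable, and the pipeline is just their composition.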

Posted 1 month ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description

This Software Engineer position is responsible for implementing and maintaining the application in an AWS SaaS environment. You will work closely with business analysts and stakeholders to ensure a robust and scalable solution to support the Finished Vehicle Logistics operation. We are seeking a skilled and motivated Software Engineer with strong experience in Google Cloud Platform (GCP) and Java programming in the Spring Boot framework. In this role, you will be responsible for designing, developing, and maintaining scalable and reliable cloud-based solutions, data pipelines, or applications on GCP, leveraging Java for scripting, automation, data processing, and service integration.

Responsibilities

Work closely with the product manager and business stakeholders to understand the business needs and associated systems requirements to meet the customization required in the SaaS solution.
Run and protect the SaaS solution in the AWS environment and troubleshoot production issues.
Actively participate in all team agile ceremonies; manage the daily deliverables in Jira with proper user stories and acceptance criteria.
Design, build, test, implement, and manage scalable, secure, and reliable infrastructure on Google Cloud Platform (GCP) using Infrastructure as Code (IaC) principles, primarily with Terraform.
Develop and manage APIs or backend services in Java deployed on GCP services such as Cloud Run, Cloud Functions, App Engine, or GKE.
Build and maintain robust CI/CD pipelines (e.g., using Cloud Build, Jenkins, GitHub) to enable frequent and reliable application deployments.
Build and maintain data products; design, develop, and maintain ETL/data pipelines for handling business and transformation rules.
Implement and manage monitoring, logging, and alerting solutions (e.g., Cloud Monitoring, Prometheus, Grafana, Cloud Logging) to ensure system health and performance.
Implement and enforce security best practices within the GCP environment (e.g., IAM policies, network security groups, security scanning).
Troubleshoot and resolve production issues across various services (applications) and infrastructure components (GCP).

Qualifications

Bachelor's degree in engineering (computer science or other streams)
3+ years of software development and support experience, including analysis, design, and testing
Strong proficiency in software development using Java & Spring Boot
Experience working with microservices, data ingestion tools, and APIs
Experience working with GCP cloud-based services & solutions
Experience working with GCP's data storage and services such as BigQuery, Dataflow, Pub/Sub
Hands-on experience designing, deploying, and managing resources and services on Google Cloud Platform (GCP)
Familiarity with database querying (SQL) and understanding of database concepts
Understanding of cloud architecture principles, including scalability, reliability, and security
Proven experience working effectively within an Agile development or operations team (e.g., Scrum, Kanban)
Experience using incident tracking and project management tools (e.g., Jira, ServiceNow, Azure DevOps)
Excellent problem-solving, communication, and organizational skills
Proven ability to work independently and with a team

Nice-to-Have Skills:
GCP certifications (e.g., Associate Cloud Engineer, Professional Cloud DevOps Engineer, Professional Cloud Architect)
Experience with other cloud providers (AWS, Azure)
Experience with containerization (Docker) and orchestration (Kubernetes)
Experience with database administration (e.g., PostgreSQL, MySQL)
Familiarity with security best practices and tools in a cloud environment (DevSecOps)
Experience with serverless technologies beyond Cloud Functions/Run
Contribution to open-source projects.

Posted 1 month ago

Apply

8.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

About Company: Our client is a prominent Indian multinational corporation specializing in information technology (IT), consulting, and business process services. Headquartered in Bengaluru, it has gross revenue of ₹222.1 billion, a global workforce of 234,054, and a NASDAQ listing; it operates in over 60 countries and serves clients across various industries, including financial services, healthcare, manufacturing, retail, and telecommunications. The company consolidated its cloud, data, analytics, AI, and related businesses under the tech services business line. It has major delivery centers in India, in cities including Chennai, Pune, Hyderabad, Bengaluru, Kochi, Kolkata, and Noida.

· Job Title: "Business Intelligence & Data Analysis"
· Location: Pan India
· Experience: 9+ yrs
· Job Type: Contract to hire
· Notice Period: Immediate joiners

JD: Business Intelligence and Data Analysis
• Discuss reporting needs with business users to identify critical KPIs and success metrics.
• Design dashboards and reports that will provide data visibility into KPIs for leadership and business groups.
• Review business reporting requirements and translate those requirements into technical reporting & dashboard specifications.
• Develop data recipes/dataflows in the CRM Analytics tool to support business requirements.
• Help establish best practices and standards for dataflow, data recipe, dashboard, and report/lens development and deployment.
• Perform data analysis, produce data samples/prototypes, and produce ad-hoc reports. Obtain and analyze data by accessing multiple sources, including ADL, EDW, and Salesforce.
• Partner and collaborate with IT, platform owners, and BI Analysts to provide input to project timelines, scope, and risks for transformational commission initiatives.
• Develop a clear understanding of the system landscape, data repositories, and their interrelationships.
• Develop reports and processes to continuously monitor data quality and integrity issues.
• Locate and define new process improvement opportunities.
• Perform extensive data unit testing and quality assurance (QA). Troubleshoot issues and work cross-functionally towards a solution.
• Respond to and resolve all assigned feedback (all priorities) within the set thresholds.
• Provide guidance/governance on best practices and approaches to design, implement, and sustain effective data-driven analytics solutions.

To succeed in this role, you'll need the following:
• A minimum of 8+ years in Salesforce and CRM Analytics, having successfully implemented at least one TCRM solution.
• Experience working with sales, service, and marketing data is highly preferred.
• Technical expertise building data models and ETL techniques (data integration using Sync, recipes, and dataflows).
• Expertise building Salesforce CRM Analytics dashboard/reporting capabilities and UX design, with advanced skills in binding, SAQL, and JSON.
• Experience building Einstein Discovery models and deploying/embedding the models in Sales Cloud page layouts.
• Familiarity with Einstein AI components like Next Best Action, Prediction Builder, and Insights is nice to have.
• Experience with CRM Analytics standard/templated apps.
• Experience working in an Agile/Scrum environment.
• Strong verbal & written communication skills. A self-starter who can operate with minimal supervision.
• Experience in problem solving, critical thinking, and priority setting.

Certifications:
• CRM Analytics & Einstein Discovery Certified (must)
• Salesforce Admin Certified (preferred)

Posted 1 month ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You'll Do
Design, develop, and operate high-scale applications across the full engineering stack
Design, develop, test, deploy, maintain, and improve software
Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.)
Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset
Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality
Participate in a tight-knit, globally distributed engineering team
Triage product or system issues and debug/track/resolve by analyzing the sources of issues and their impact on network or service operations and quality
Manage sole project priorities, deadlines, and deliverables
Research, create, and develop software applications to extend and improve on Equifax solutions
Collaborate on scalability issues involving access to data and information
Actively participate in Sprint planning, Sprint Retrospectives, and other team activities

What Experience You Need
Bachelor's degree or equivalent experience
5+ years of software engineering experience
5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS, REST APIs
5+ years experience with cloud technology: GCP, AWS, or Azure
5+ years experience designing and developing cloud-native solutions
5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes
5+ years experience deploying and releasing software using Jenkins CI/CD pipelines; understanding of infrastructure-as-code concepts, Helm Charts, and Terraform constructs

What could set you apart
Self-starter who identifies/responds to priority shifts with minimal supervision
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others
UI development (e.g. HTML, JavaScript, Angular, and Bootstrap)
Experience with backend technologies such as Java/J2EE, SpringBoot, SOA, and microservices
Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven & Gradle
Agile environments (e.g. Scrum, XP)
Relational databases (e.g. SQL Server, MySQL)
Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
Developing with a modern JDK (v1.7+)
Automated testing: JUnit, Selenium, LoadRunner, SoapUI

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax? At Equifax, we believe knowledge drives progress.
As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
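The triage and production-support work described in the posting above often leans on retry-with-backoff around transient failures in distributed services. A generic, hypothetical sketch (not Equifax code; the function and the flaky call are invented):

```python
import time

def retry_with_backoff(fn, attempts=4, base_delay=0.01):
    """Call fn, retrying with exponential backoff on failure -- a common
    resilience pattern in cloud-native microservices."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: surface the error
            time.sleep(base_delay * (2 ** attempt))

# Simulated transient failure: succeeds on the third call
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient")
    return "ok"

print(retry_with_backoff(flaky))
```

Production code would typically narrow the caught exception types and add jitter to the delays; this sketch only shows the core loop.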

Posted 1 month ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You'll Do
Design, develop, and operate high-scale applications across the full engineering stack
Design, develop, test, deploy, maintain, and improve software
Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.)
Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset
Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality
Participate in a tight-knit, globally distributed engineering team
Triage product or system issues and debug/track/resolve by analyzing the sources of issues and their impact on network or service operations and quality
Manage sole project priorities, deadlines, and deliverables
Research, create, and develop software applications to extend and improve on Equifax solutions
Collaborate on scalability issues involving access to data and information
Actively participate in Sprint planning, Sprint Retrospectives, and other team activities

What Experience You Need
Bachelor's degree or equivalent experience
5+ years of software engineering experience
5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS, REST APIs
5+ years experience with cloud technology: GCP, AWS, or Azure
5+ years experience designing and developing cloud-native solutions
5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes
5+ years experience deploying and releasing software using Jenkins CI/CD pipelines; understanding of infrastructure-as-code concepts, Helm Charts, and Terraform constructs

What could set you apart
Self-starter who identifies/responds to priority shifts with minimal supervision
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others
UI development (e.g. HTML, JavaScript, Angular, and Bootstrap)
Experience with backend technologies such as Java/J2EE, SpringBoot, SOA, and microservices
Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven & Gradle
Agile environments (e.g. Scrum, XP)
Relational databases (e.g. SQL Server, MySQL)
Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
Developing with a modern JDK (v1.7+)
Automated testing: JUnit, Selenium, LoadRunner, SoapUI

Posted 1 month ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

What you'll do
Design, develop, and operate high-scale applications across the full engineering stack
Design, develop, test, deploy, maintain, and improve software
Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.)
Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset
Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality
Participate in a tight-knit, globally distributed engineering team
Triage product or system issues and debug/track/resolve by analyzing the sources of issues and their impact on network or service operations and quality
Research, create, and develop software applications to extend and improve on Equifax solutions
Manage sole project priorities, deadlines, and deliverables
Collaborate on scalability issues involving access to data and information
Actively participate in Sprint planning, Sprint Retrospectives, and other team activities

What Experience You Need
Bachelor's degree or equivalent experience
5+ years of software engineering experience
5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS
5+ years experience with cloud technology: GCP, AWS, or Azure
5+ years experience designing and developing cloud-native solutions
5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes
5+ years experience deploying and releasing software using Jenkins CI/CD pipelines; understanding of infrastructure-as-code concepts, Helm Charts, and Terraform constructs

What could set you apart
Self-starter who identifies/responds to priority shifts with minimal supervision
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others
UI development (e.g. HTML, JavaScript, Angular, and Bootstrap)
Experience with backend technologies such as Java/J2EE, SpringBoot, SOA, and microservices
Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven & Gradle
Agile environments (e.g. Scrum, XP)
Relational databases (e.g. SQL Server, MySQL)
Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
Developing with a modern JDK (v1.7+)
Automated testing: JUnit, Selenium, LoadRunner, SoapUI
Cloud certification strongly preferred

Posted 1 month ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You'll Do
Design, develop, and operate high-scale applications across the full engineering stack
Design, develop, test, deploy, maintain, and improve software
Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.)
Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset
Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality
Participate in a tight-knit, globally distributed engineering team
Triage product or system issues and debug/track/resolve by analyzing the sources of issues and their impact on network or service operations and quality
Manage sole project priorities, deadlines, and deliverables
Research, create, and develop software applications to extend and improve on Equifax solutions
Collaborate on scalability issues involving access to data and information
Actively participate in Sprint planning, Sprint Retrospectives, and other team activities

What Experience You Need
Bachelor's degree or equivalent experience
5+ years of software engineering experience
5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS
5+ years experience with cloud technology: GCP, AWS, or Azure
5+ years experience designing and developing cloud-native solutions
5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes
5+ years experience deploying and releasing software using Jenkins CI/CD pipelines; understanding of infrastructure-as-code concepts, Helm Charts, and Terraform constructs

What could set you apart
Self-starter who identifies/responds to priority shifts with minimal supervision
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others
UI development (e.g. HTML, JavaScript, Angular, and Bootstrap)
Experience with backend technologies such as Java/J2EE, SpringBoot, SOA, and microservices
Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven & Gradle
Agile environments (e.g. Scrum, XP)
Relational databases (e.g. SQL Server, MySQL)
Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
Developing with a modern JDK (v1.7+)
Automated testing: JUnit, Selenium, LoadRunner, SoapUI

Posted 1 month ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Role: GCP Data Engineer
Experience: 5+ years
Preferred: Data Engineering background
Location: Kolkata, Pune, Hyderabad, Chennai
Required Skills: GCP DE experience, BigQuery, SQL, Cloud Composer/Python, Cloud Functions, Dataproc + PySpark, Python ingestion, Dataflow + Pub/Sub

Job Requirements:
- Have implemented and architected solutions on Google Cloud Platform using GCP components.
- Experience with Apache Beam/Google Dataflow/Apache Spark in creating end-to-end data pipelines.
- Experience in some of the following: Python, Hadoop, Spark, SQL, BigQuery, Bigtable, Cloud Storage, Datastore, Spanner, Cloud SQL, Machine Learning.
- Experience programming in Java, Python, etc.
- Expertise in at least two of these technologies: relational databases, analytical databases, NoSQL databases.
- Certification as a Google Professional Data Engineer or Solution Architect is a major advantage.

Skills Required:
- 5+ years' experience in IT, or professional services experience in IT delivery or large-scale IT analytics projects.
- Expert knowledge of Google Cloud Platform; other cloud platforms are nice to have.
- Expert knowledge in SQL development.
- Expertise in building data integration and preparation tools using cloud technologies (such as SnapLogic, Google Dataflow, Cloud Dataprep, Python, etc.).
- Experience with Apache Beam/Google Dataflow/Apache Spark in creating end-to-end data pipelines.
- Experience in some of the following: Python, Hadoop, Spark, SQL, BigQuery, Bigtable, Cloud Storage, Datastore, Spanner, Cloud SQL, Machine Learning.
- Experience programming in Java, Python, etc.
- Ability to identify downstream implications of data loads/migrations (e.g., data quality, regulatory, etc.).
- Implement data pipelines to automate the ingestion, transformation, and augmentation of data sources, and provide best practices for pipeline operations.
- Capability to work in a rapidly changing business environment and to enable simplified user access to massive data by building scalable data solutions.
- Advanced SQL writing and experience in data mining (SQL, ETL, data warehouse, etc.) and in using databases in a business environment with complex datasets.
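The "ingestion, transformation, and augmentation" pipeline work described above follows the classic extract-transform-load shape. A minimal in-memory sketch, assuming made-up source rows and table names (a real pipeline would read from GCS or Pub/Sub and write to BigQuery, typically via Dataflow):

```python
# Toy extract-transform-load flow mirroring the pipeline stages named above.
# Everything is in-memory and hypothetical; no GCP client libraries involved.

def extract():
    # Stand-in for reading raw rows from a source system.
    return [
        {"city": "Pune", "sales": "1200"},
        {"city": "Chennai", "sales": "800"},
        {"city": "Pune", "sales": "300"},
    ]

def transform(rows):
    # Cast string amounts to integers and aggregate sales per city.
    totals = {}
    for row in rows:
        totals[row["city"]] = totals.get(row["city"], 0) + int(row["sales"])
    return totals

def load(totals, warehouse):
    # Stand-in for writing results to a warehouse table.
    warehouse["sales_by_city"] = totals

warehouse = {}
load(transform(extract()), warehouse)
print(warehouse["sales_by_city"])  # {'Pune': 1500, 'Chennai': 800}
```

Separating the three stages like this is what makes each one independently testable and swappable, which is the point of the "best practices for pipeline operations" requirement.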

Posted 1 month ago

Apply

6.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About Birlasoft: Birlasoft, a global leader at the forefront of Cloud, AI, and Digital technologies, seamlessly blends domain expertise with enterprise solutions. The company’s consultative and design-thinking approach empowers societies worldwide, enhancing the efficiency and productivity of businesses. As part of the multibillion-dollar diversified CKA Birla Group, Birlasoft, with its 12,000+ professionals, is committed to continuing the Group’s 170-year heritage of building sustainable communities.

We are looking for a talented and experienced GCP BigQuery Engineer to join our team. The ideal candidate will have 6 to 8 years of hands-on experience working with Google Cloud Platform (GCP) services, with a strong focus on BigQuery. As a GCP BigQuery Engineer, you will be responsible for designing, implementing, and optimizing data warehousing and analytics solutions using BigQuery to support our organization's business objectives.

**Responsibilities:**
- Design, develop, and implement data warehousing and analytics solutions using Google BigQuery as the primary data storage and processing platform.
- Work closely with business stakeholders, data architects, and data engineers to gather requirements and design scalable and efficient data models and schemas in BigQuery.
- Implement data ingestion pipelines to extract, transform, and load (ETL) data from various source systems into BigQuery using GCP services such as Cloud Dataflow, Cloud Storage, and Data Transfer Service.
- Optimize BigQuery performance and cost-effectiveness by designing partitioned tables, clustering tables, and optimizing SQL queries.
- Develop and maintain data pipelines and workflows using GCP tools and technologies to automate data processing and analytics tasks.
- Implement data security and access controls in BigQuery to ensure compliance with regulatory requirements and protect sensitive data.
- Collaborate with cross-functional teams to integrate BigQuery with other GCP services and third-party tools to support advanced analytics, machine learning, and business intelligence initiatives.
- Provide technical guidance and mentorship to junior members of the team and contribute to knowledge sharing and best practices development.
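The partitioning and clustering responsibility above is usually expressed directly in BigQuery DDL. A hedged sketch (the dataset, table, and column names are invented for illustration): partitioning by a date column limits each query's scan to the dates it filters on, and clustering co-locates rows by a frequently filtered column.

```python
# BigQuery DDL for a date-partitioned, clustered table. All names here are
# hypothetical; the PARTITION BY / CLUSTER BY syntax is standard BigQuery DDL.
ddl = """
CREATE TABLE IF NOT EXISTS analytics.order_events (
  order_id STRING,
  customer_id STRING,
  amount NUMERIC,
  event_ts TIMESTAMP
)
PARTITION BY DATE(event_ts)   -- prune scans (and cost) to the dates queried
CLUSTER BY customer_id        -- co-locate rows on a common filter column
"""

# A query that filters on the partitioning column scans only the matching
# partitions instead of the full table:
query = """
SELECT customer_id, SUM(amount) AS total
FROM analytics.order_events
WHERE DATE(event_ts) BETWEEN '2024-01-01' AND '2024-01-31'
GROUP BY customer_id
"""
print("PARTITION BY" in ddl and "CLUSTER BY" in ddl)  # True
```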

Posted 1 month ago

Apply

5.0 years

5 - 9 Lacs

Thiruvananthapuram

On-site

Trivandrum, India | Technology | Full time | 6/22/2025 | J00169002

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are an ideal candidate for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You’ll Do
- Design, develop, and operate high-scale RPA/AI applications across the full engineering stack.
- Design, develop, test, deploy, maintain, and improve software.
- Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
- Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset.
- Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
- Participate in a tight-knit, globally distributed engineering team.
- Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality.
- Independently manage project priorities, deadlines, and deliverables.
- Research, create, and develop software applications to extend and improve on Equifax solutions.
- Collaborate on scalability issues involving access to data and information.
- Actively participate in Sprint planning, Sprint retrospectives, and other team activities.

What Experience You Need
- Bachelor's degree or equivalent experience
- 5+ years of software engineering experience
- 5+ years of experience writing, debugging, and troubleshooting code in mainstream RPA/AI tools and Python
- 5+ years of experience with cloud technology: GCP, AWS, or Azure
- 5+ years of experience designing and developing cloud-native solutions
- 5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs

What Could Set You Apart
- Self-starter who identifies and responds to priority shifts with minimal supervision
- Experience designing and developing UiPath RPA solutions using Dataflow/Apache Beam, SQL Server, BigQuery, Pub/Sub, GCS, and others
- Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven and Gradle
- Agile environments (e.g. Scrum, XP)
- Relational databases (e.g. SQL Server, MySQL)
- Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax? At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car.
Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Posted 1 month ago

Apply

4.0 years

0 Lacs

Noida

On-site

As a Data Engineer, you will design, develop, and support data pipelines and related data products and platforms. Your primary responsibilities include designing and building data extraction, loading, and transformation pipelines across on-prem and cloud platforms. You will perform application impact assessments, requirements reviews, and develop work estimates. Additionally, you will develop test strategies and site reliability engineering measures for data products and solutions, participate in agile development and solution reviews, mentor junior Data Engineering Specialists, lead the resolution of critical operations issues, and perform technical data stewardship tasks, including metadata management, security, and privacy by design.

Required Skills:
● Design, develop, and support data pipelines and related data products and platforms.
● Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.
● Perform application impact assessments, requirements reviews, and develop work estimates.
● Develop test strategies and site reliability engineering measures for data products and solutions.
● Participate in agile development and solution reviews.
● Mentor junior Data Engineers.
● Lead the resolution of critical operations issues, including post-implementation reviews.
● Perform technical data stewardship tasks, including metadata management, security, and privacy by design.
● Design and build data extraction, loading, and transformation pipelines using Python and other GCP data technologies.
● Demonstrate SQL and database proficiency in various data engineering tasks.
● Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect.
● Develop Unix scripts to support various data operations.
● Model data to support business intelligence and analytics initiatives.
● Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation.
● Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, and Dataproc (good to have).

Qualifications:
● Bachelor’s degree in Software Engineering, Computer Science, Business, Mathematics, or a related field.
● 4+ years of data engineering experience.
● 2 years of data solution architecture and design experience.
● GCP Certified Data Engineer (preferred).

Job Type: Full-time
Schedule: Day shift
Location: Noida, Uttar Pradesh (Required)
Work Location: In person
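The DAG-based orchestration mentioned above (Airflow, Control-M, Prefect) ultimately comes down to running tasks in dependency order. A minimal sketch with made-up task names and no scheduler library, using the standard-library `graphlib` (each key maps to the set of tasks that must finish before it runs):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: one extract feeds two transforms, which both
# feed a final load. Each entry maps a task to its predecessors.
dag = {
    "extract": set(),
    "clean": {"extract"},
    "aggregate": {"extract"},
    "load": {"clean", "aggregate"},
}

# static_order() yields tasks so every task appears after its predecessors;
# an orchestrator like Airflow does the same, plus retries and scheduling.
order = list(TopologicalSorter(dag).static_order())
print(order[0], order[-1])  # extract load
```

Airflow expresses the same structure with operators and `>>` dependencies; the scheduling semantics are identical to this topological ordering.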

Posted 1 month ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are an ideal candidate for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You’ll Do
- Design, develop, and operate high-scale RPA/AI applications across the full engineering stack.
- Design, develop, test, deploy, maintain, and improve software.
- Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
- Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset.
- Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
- Participate in a tight-knit, globally distributed engineering team.
- Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality.
- Independently manage project priorities, deadlines, and deliverables.
- Research, create, and develop software applications to extend and improve on Equifax solutions.
- Collaborate on scalability issues involving access to data and information.
- Actively participate in Sprint planning, Sprint retrospectives, and other team activities.

What Experience You Need
- Bachelor's degree or equivalent experience
- 5+ years of software engineering experience
- 5+ years of experience writing, debugging, and troubleshooting code in mainstream RPA/AI tools and Python
- 5+ years of experience with cloud technology: GCP, AWS, or Azure
- 5+ years of experience designing and developing cloud-native solutions
- 5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs

What Could Set You Apart
- Self-starter who identifies and responds to priority shifts with minimal supervision
- Experience designing and developing UiPath RPA solutions using Dataflow/Apache Beam, SQL Server, BigQuery, Pub/Sub, GCS, and others
- Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven and Gradle
- Agile environments (e.g. Scrum, XP)
- Relational databases (e.g. SQL Server, MySQL)
- Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Role Description

Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and Dataproc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions, including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.

Outcomes:
- Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns, and reusing proven solutions.
- Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications.
- Document and communicate milestones/stages for end-to-end delivery.
- Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality.
- Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency.
- Validate results with user representatives, integrating the overall solution seamlessly.
- Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes.
- Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools.
- Influence and improve customer satisfaction through effective data solutions.

Measures of Outcomes:
- Adherence to engineering processes and standards
- Adherence to schedule/timelines
- Adherence to SLAs where applicable
- Number of defects post delivery
- Number of non-compliance issues
- Reduction of recurrence of known defects
- Quick turnaround of production bugs
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements
- Efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times)
- Average time to detect, respond to, and resolve pipeline failures or data issues
- Number of data security incidents or compliance breaches

Outputs Expected:
- Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers.
- Documentation: Create and review templates, checklists, guidelines, and standards for design processes and development. Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, test cases, and results.
- Configuration: Define and govern the configuration management plan. Ensure compliance within the team.
- Testing: Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed.
- Domain Relevance: Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise.
- Project Management: Manage the delivery of modules effectively.
- Defect Management: Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.
- Estimation: Create and provide input for effort and size estimation for projects.
- Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
- Release Management: Execute and monitor the release process to ensure smooth transitions.
- Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models.
- Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations.
- Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives.
- Certifications: Obtain relevant domain and technology certifications to stay competitive and informed.

Skill Examples:
- Proficiency in SQL, Python, or other programming languages used for data manipulation.
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery).
- Ability to conduct tests on data pipelines and evaluate results against data quality and performance specifications.
- Experience in performance tuning of data processes.
- Expertise in designing and optimizing data warehouses for cost efficiency.
- Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
- Capacity to clearly explain and communicate design and development aspects to customers.
- Ability to estimate time and resource requirements for developing and debugging features or components.

Knowledge Examples:
- Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP Dataproc/Dataflow, and Azure ADF and ADLF.
- Proficiency in SQL for analytics, including windowing functions.
- Understanding of data schemas and models relevant to various business contexts.
- Familiarity with domain-related data and its implications.
- Expertise in data warehousing optimization techniques.
- Knowledge of data security concepts and best practices.
- Familiarity with design patterns and frameworks in data engineering.

Additional Comments:

Tech skills:
- Proficient in Python (including popular Python packages, e.g. Pandas, NumPy) and SQL
- Strong background in distributed data processing and storage (e.g. Apache Spark, Hadoop)
- Large-scale (TBs of data) data engineering skills: model data and create production-ready ETL pipelines
- Development experience with at least one cloud (Azure highly preferred; AWS, GCP)
- Knowledge of data lake and data lakehouse patterns
- Knowledge of ETL performance tuning and cost optimization
- Knowledge of data structures and algorithms and good software engineering practices

Soft skills:
- Strong communication skills to articulate complex situations concisely
- Comfortable with picking up new technologies independently
- Eye for detail, good data intuition, and a passion for data quality
- Comfortable working in a rapidly changing environment with ambiguous requirements

Skills: Python, SQL, AWS, Azure
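The "SQL for analytics including windowing functions" skill above can be illustrated with SQLite, which has supported window functions since version 3.25 (bundled with modern Python). The table and data here are invented; the pattern, ranking rows within each partition, is the same in BigQuery or any other analytical database:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, rep TEXT, amount INTEGER);
INSERT INTO sales VALUES
  ('East', 'asha', 500), ('East', 'ben', 300),
  ('West', 'chen', 700), ('West', 'dee', 900);
""")

# Rank reps within each region by amount -- a typical windowing query.
rows = conn.execute("""
SELECT region, rep, amount,
       RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
FROM sales
ORDER BY region, rnk
""").fetchall()

for r in rows:
    print(r)
# ('East', 'asha', 500, 1)
# ('East', 'ben', 300, 2)
# ('West', 'dee', 900, 1)
# ('West', 'chen', 700, 2)
```

Unlike a `GROUP BY`, the window function keeps every input row and adds the rank alongside it, which is what makes it useful for top-N-per-group analytics.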

Posted 1 month ago

Apply