6.0 years
0 Lacs
Gurugram, Haryana, India
Remote
What would a typical day at your work be like?
• You will lead and manage the delivery of projects and be responsible for the delivery of project and team goals.
• Build and support data ingestion and processing pipelines. This entails extracting, loading, and transforming data from a wide variety of sources using the latest data frameworks and technologies (an illustrative sketch follows this listing).
• Design, build, test, and maintain machine learning infrastructure and frameworks to empower data scientists to rapidly iterate on model development.
• Own and lead client engagement and communication on technical projects. Define project scope and track project progress and delivery.
• Plan and execute project architecture and allocate work to the team.
• Keep up to date with advances in big data technologies and run pilots to design a data architecture that scales with increased data volume.
• Partner with software engineering teams to drive completion of multi-functional projects.

What Do We Expect?
• Minimum 6 years of overall experience in data engineering, including 2+ years leading a team as team lead and handling project management.
• Experience working with a global team and remote clients.
• Hands-on experience in building data pipelines on various infrastructures.
• Knowledge of statistical and machine learning techniques, with hands-on experience integrating machine learning into data pipelines.
• Ability to work hands-on with the data engineers in the team on design and development of solutions using the relevant big data technologies and data warehouse concepts.
• Strong knowledge of advanced SQL, data warehousing concepts, and data mart design.
• Strong experience in modern data platform components such as Spark, Python, etc.
• Experience setting up and maintaining data warehouses (Google BigQuery, Redshift, Snowflake) and data lakes (GCS, AWS S3, etc.) for an organization.
• Experience building data pipelines with AWS Glue, Azure Data Factory, and Google Dataflow.
• Experience with relational SQL and NoSQL databases, including Postgres and Cassandra/MongoDB.
• Strong problem-solving and communication skills.
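For illustration only (not part of the posting): a minimal sketch of the kind of batch ingestion pipeline described above, written with Apache Beam, the SDK behind Google Dataflow, which is one of the tools named. The project, bucket, table, and field names are placeholders.

```python
# Minimal batch ELT sketch with Apache Beam (runs locally on the DirectRunner,
# or on Dataflow with --runner=DataflowRunner). All names are placeholders.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_event(line: str) -> dict:
    """Turn one raw JSON line into a flat record for the warehouse."""
    record = json.loads(line)
    return {
        "event_id": record.get("id"),
        "event_type": record.get("type"),
        "amount": float(record.get("amount", 0)),
    }

def run():
    options = PipelineOptions()  # pass --project/--region/--temp_location for Dataflow
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadRaw" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
            | "Parse" >> beam.Map(parse_event)
            | "FilterValid" >> beam.Filter(lambda r: r["event_id"] is not None)
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="event_id:STRING,event_type:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )

if __name__ == "__main__":
    run()
```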
Posted 2 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Our technology services client is seeking multiple DevSecOps Security Engineers to join their team on a contract basis. These positions offer strong potential for conversion to full-time employment upon completion of the initial contract period. Below are further details about the role:

Role: DevSecOps Security Engineer
Experience: 5-7 Years
Location: Mumbai, Pune, Hyderabad, Bangalore, Chennai, Kolkata
Notice Period: Immediate to 15 Days
Mandatory Skills: DevOps Support, GitHub Actions, CI/CD Pipelines, ArgoCD, Snyk, multi-cloud (AWS/Azure/GCP), Git, MS Tools, Docker, Kubernetes, JFrog, SCA & SAST

Job Description:
A security expert who can write code as needed and knows the difference between object-, class-, and function-based programming.
Strong passion and thorough understanding of what it takes to build and operate secure, reliable systems at scale.
Strong passion and technical expertise to automate security functions via code (an illustrative sketch follows this listing).
Strong technical expertise with Application, Cloud, Data, and Network Security best practices.
Strong technical expertise with multi-cloud environments, including container/serverless and other microservice architectures.
Strong technical expertise with older technology stacks, including mainframes and monolithic architectures.
Strong technical expertise with SDLC, CI/CD tools, and deployment automation.
Strong technical expertise with operating security for Windows Server and Linux Server systems.
Strong technical expertise with configuration management, version control, and DevOps operational support.
Strong experience with implementing security measures for both applications and data, with an understanding of the unique security requirements of data warehouse technologies such as Snowflake.

Role Responsibilities

Development & Enforcement
Develop and enforce engineering security policies and standards.
Develop and enforce data security policies and standards.
Drive security awareness across the organization.

Collaboration & Expertise
Collaborate with Engineering and Business teams to develop secure engineering practices.
Serve as the Subject Matter Expert for Application Security.
Work with cross-functional teams to ensure security is considered throughout the software development lifecycle.

Analysis & Configuration
Analyze, develop, and configure security solutions across multi-cloud, on-premises, and colocation environments, ensuring application security, integrity, confidentiality, and availability of data.
Lead security testing, vulnerability analysis, and documentation.

Operational Support
Participate in operational on-call duties to support infrastructure across multiple regions and environments (cloud, on-premises, colocation).
Develop incident response and recovery strategies.

Qualifications

Basic Qualifications
5+ years of experience in developing and deploying security technologies.
A minimum of a Bachelor's degree in Computer Science, Software Development, Software Engineering, or a related field, or equivalent alternative education, skills, and/or practical experience is required.
Experience with modern Software Development Lifecycles and CI/CD practices.
Experience with the remediation of vulnerabilities sourced from Static Analysis (SAST), Open Source Scanning (SCA), Mobile Scanning (MAST), and API Scanning.
Proficiency in Public Cloud (AWS/Azure/GCP) & Network Security.
Experience with Docker, Kubernetes, Security-as-Code, and Infrastructure-as-Code.
Experience with one or more general-purpose programming/scripting languages, including but not limited to: Java, C/C++, C#, Python, JavaScript, Shell Script, PowerShell.
Strong experience with implementing and managing data protection measures and compliance with data protection regulations (e.g., GDPR, CCPA).

Preferred Qualifications
Strong technical expertise with architecting Public Cloud solutions and processes.
Strong technical expertise with Networking and Software-Defined Networking (SDN) principles.
Strong technical expertise with developing and interpreting Network, Sequence, and Dataflow diagrams.
Familiarity with the OWASP Application Security Verification Standard.
Experience with direct, remote, and virtual teams.
Understanding of at least one compliance framework (HIPAA, HITRUST, PCI, NIST, CSA).
Strong technical expertise with Static Analysis, Open Source Scanning, Mobile Scanning, and API Scanning security solutions for data warehouses and big data platforms, particularly with technologies like GitHub Advanced Security, CodeQL, Checkmarx, and Snyk.
Strong technical expertise in defining and implementing cyber resilience standards, policies, and programs for distributed cloud and network infrastructure, ensuring robust redundancy and system reliability.

Education
A minimum of a Bachelor's degree in Computer Science, Software Development, Software Engineering, or a related field, or equivalent alternative education, skills, and/or practical experience is required.

If you are interested, share your updated resume at hema.g@s3staff.com
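For illustration only (not part of the posting): one small example of "automating security functions via code" of the kind described above — a sketch that flags AWS security groups open to the internet. It assumes boto3 and already-configured AWS credentials; the region and the reporting format are placeholder choices, and a real multi-cloud program would add equivalent checks for Azure and GCP.

```python
# Sketch: flag security groups that allow inbound traffic from 0.0.0.0/0.
# Assumes AWS credentials are configured; the region is a placeholder.
import boto3

def find_open_security_groups(region: str = "us-east-1") -> list:
    ec2 = boto3.client("ec2", region_name=region)
    findings = []
    for sg in ec2.describe_security_groups()["SecurityGroups"]:
        for rule in sg.get("IpPermissions", []):
            for ip_range in rule.get("IpRanges", []):
                if ip_range.get("CidrIp") == "0.0.0.0/0":
                    findings.append({
                        "group_id": sg["GroupId"],
                        "group_name": sg.get("GroupName", ""),
                        "port": rule.get("FromPort", "all"),
                    })
    return findings

if __name__ == "__main__":
    for finding in find_open_security_groups():
        print(f"Open to the internet: {finding}")
```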
Posted 2 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Introducing Thinkproject Platform

Pioneering a new era and offering a cohesive alternative to the fragmented landscape of construction software, Thinkproject seamlessly integrates the most extensive portfolio of mature solutions with an innovative platform, providing unparalleled features, integrations, user experiences, and synergies. By combining information management expertise and in-depth knowledge of the building, infrastructure, and energy industries, Thinkproject empowers customers to efficiently deliver, operate, regenerate, and dispose of their built assets across their entire lifecycle through a Connected Data Ecosystem.

We are seeking a hands-on Applied Machine Learning Engineer to join our team and lead the development of ML-driven insights from historical data in our contracts management, assets management and common data platform. This individual will work closely with our data engineering and product teams to design, develop, and deploy scalable machine learning models that can parse, learn from, and generate value from both structured and unstructured contract data. You will use BigQuery and its ML capabilities (including SQL and Python integrations) to prototype and productionize models across a variety of NLP and predictive analytics use cases. Your work will be critical in enhancing our platform’s intelligence layer, including search, classification, recommendations, and risk detection.

What your day will look like

Key Responsibilities
Model Development: Design and implement machine learning models using structured and unstructured historical contract data to support intelligent document search, clause classification, metadata extraction, and contract risk scoring.
BigQuery ML Integration: Build, train, and deploy ML models directly within BigQuery using SQL and/or Python, leveraging native GCP tools (e.g., Vertex AI, Dataflow, Pub/Sub) (an illustrative sketch follows this posting).
Data Preprocessing & Feature Engineering: Clean, enrich, and transform raw data (e.g., legal clauses, metadata, audit trails) into model-ready features using scalable and efficient pipelines.
Model Evaluation & Experimentation: Conduct experiments, model validation, and A/B testing, and iterate based on precision, recall, F1-score, RMSE, etc.
Deployment & Monitoring: Operationalize models in production environments with monitoring, retraining pipelines, and CI/CD best practices for ML (MLOps).
Collaboration: Work cross-functionally with data engineers, product managers, legal domain experts, and frontend teams to align ML solutions with product needs.

What you need to fulfill the role

Skills And Experience
Education: Bachelor’s or Master’s degree in Computer Science, Machine Learning, Data Science, or a related field.
ML Expertise: Strong applied knowledge of supervised and unsupervised learning, classification, regression, clustering, feature engineering, and model evaluation.
NLP Experience: Hands-on experience working with textual data, especially in NLP use cases like entity extraction, classification, and summarization.
GCP & BigQuery: Proficiency with Google Cloud Platform, especially BigQuery and BigQuery ML; comfort querying large-scale datasets and integrating with external ML tooling.
Programming: Proficient in Python and SQL; familiarity with libraries such as Scikit-learn, TensorFlow, PyTorch, Keras.
MLOps Knowledge: Experience with model deployment, monitoring, versioning, and ML CI/CD best practices.
Data Engineering Alignment: Comfortable working with data pipelines and tools like Apache Beam, Dataflow, Cloud Composer, and pub/sub systems.
Version Control: Strong Git skills and experience collaborating in Agile teams.

Preferred Qualifications
Experience working with contractual or legal text datasets.
Familiarity with document management systems, annotation tools, or enterprise collaboration platforms.
Exposure to Vertex AI, LangChain, RAG-based retrieval, or embedding models for Gen AI use cases.
Comfortable working in a fast-paced, iterative environment with changing priorities.

What we offer
Lunch 'n' Learn Sessions | Women's Network | LGBTQIA+ Network | Coffee Chat Roulette | Free English Lessons | Thinkproject Academy | Social Events | Volunteering Activities | Open Forum with Leadership Team (Tp Café) | Hybrid working | Unlimited learning

We are a passionate bunch here. To join Thinkproject is to shape what our company becomes. We take feedback from our staff very seriously and give them the tools they need to help us create our fantastic culture of mutual respect. We believe that investing in our staff is crucial to the success of our business.

Your contact: Mehal Mehta

Please submit your application, including salary expectations and potential date of entry, by submitting the form on the next page.

Working at thinkproject.com - think career. think ahead.
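For illustration only (not part of the posting): a minimal sketch of training a model "directly within BigQuery" as the responsibilities describe, using BigQuery ML through the Python client. The dataset, table, feature, and label names are hypothetical.

```python
# Sketch: train and evaluate a clause-classification model with BigQuery ML.
# Dataset, table, and column names are placeholders; assumes google-cloud-bigquery.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

create_model_sql = """
CREATE OR REPLACE MODEL `example_dataset.clause_classifier`
OPTIONS (model_type = 'logistic_reg', input_label_cols = ['clause_label']) AS
SELECT
  clause_length,
  has_penalty_terms,
  counterparty_type,
  clause_label
FROM `example_dataset.contract_clauses`
WHERE clause_label IS NOT NULL
"""

client.query(create_model_sql).result()  # blocks until training finishes

eval_sql = "SELECT * FROM ML.EVALUATE(MODEL `example_dataset.clause_classifier`)"
for row in client.query(eval_sql).result():
    print(dict(row))  # precision, recall, f1_score, etc.
```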
Posted 2 weeks ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Location: Hyderabad and Chennai (Immediate Joiners)
Experience: 3 to 5 years
Mandatory skills: MLOps, model lifecycle + Python + PySpark + GCP (BigQuery, Dataproc & Airflow), and CI/CD (an illustrative orchestration sketch follows this listing)

Required Skills and Experience:
Strong programming skills: Proficiency in languages like Python, with experience in libraries like TensorFlow, PyTorch, or scikit-learn.
Cloud Computing: Deep understanding of GCP services relevant to ML, such as Vertex AI, BigQuery, Cloud Storage, Dataflow, Dataproc, and others.
Data Science Fundamentals: Solid foundation in machine learning concepts, statistical analysis, and data modeling.
Software Engineering Principles: Experience with software development best practices, version control (e.g., Git), and testing methodologies.
MLOps: Familiarity with MLOps principles and practices.
Data Engineering: Experience in building and managing data pipelines.
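For illustration only (not part of the posting): a minimal sketch of the Airflow-on-GCP orchestration named in the mandatory skills — a Composer/Airflow DAG that submits a PySpark job to Dataproc. The project, region, cluster, and GCS paths are placeholders.

```python
# Sketch: Cloud Composer / Airflow DAG that runs a PySpark step on Dataproc daily.
# Project, region, cluster, and GCS paths are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

PYSPARK_JOB = {
    "reference": {"project_id": "example-project"},
    "placement": {"cluster_name": "example-cluster"},
    "pyspark_job": {"main_python_file_uri": "gs://example-bucket/jobs/feature_prep.py"},
}

with DAG(
    dag_id="ml_feature_prep",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    prepare_features = DataprocSubmitJobOperator(
        task_id="prepare_features",
        job=PYSPARK_JOB,
        region="us-central1",
        project_id="example-project",
    )
```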
Posted 2 weeks ago
5.0 years
0 Lacs
India
Remote
Job Title - GCP Administrator
Location - Remote (Hybrid for Chennai & Mumbai)
Experience - 8+ years

We are looking for an experienced GCP Administrator to join our team. The ideal candidate will have strong hands-on experience with IAM administration, multi-account management, BigQuery administration, performance optimization, monitoring, and cost management within Google Cloud Platform (GCP).

Responsibilities:
● Manage and configure roles/permissions in GCP IAM, following the principle of least-privileged access
● Manage the BigQuery service by optimizing slot assignments and SQL queries, adopting FinOps practices for cost control (an illustrative cost-estimation sketch follows this listing), and troubleshooting and resolving critical data queries
● Collaborate with teams like Data Engineering, Data Warehousing, Cloud Platform Engineering, SRE, etc. for efficient data management and operational practices in GCP
● Create automations and monitoring mechanisms for GCP data-related services, processes, and tasks
● Work with development teams to design the GCP-specific cloud architecture
● Provision and de-provision GCP accounts and resources for internal projects
● Manage and operate multiple GCP subscriptions
● Keep technical documentation up to date
● Proactively stay up to date on GCP announcements, services, and developments

Requirements:
● Must have 5+ years of work experience provisioning, operating, and maintaining systems in GCP
● Must have a valid certification of either GCP Associate Cloud Engineer or GCP Professional Cloud Architect
● Must have hands-on experience with GCP services such as Identity and Access Management (IAM), BigQuery, Google Kubernetes Engine (GKE), etc.
● Must be capable of providing support and guidance on GCP operations and services depending upon enterprise needs
● Must have a working knowledge of Docker containers and Kubernetes
● Must have strong communication skills and the ability to work both independently and in a collaborative environment
● Fast learner, achiever, sets high personal goals
● Must be able to work on multiple projects and consistently meet project deadlines
● Must be willing to work on a shift basis based on project requirements

Good to Have:
● Experience in Terraform automation for GCP infrastructure provisioning
● Experience in Cloud Composer, Dataproc, Dataflow, Storage, and Monitoring services
● Experience in building and supporting any form of data pipeline
● Multi-cloud experience with AWS
● New Relic monitoring

Perks:
● Day off on the 3rd Friday of every month (one long weekend each month)
● Monthly Wellness Reimbursement Program to promote health and well-being
● Paid paternity and maternity leaves

Notice period: Immediate to 30 days
Email to: poniswarya.m@aptita.com
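For illustration only (not part of the posting): one small FinOps-style control of the kind mentioned above — a dry-run cost estimate for a BigQuery query using the Python client. The table name is a placeholder, and the on-demand price per TiB is an assumption that should be checked against current regional pricing.

```python
# Sketch: estimate bytes scanned (and a rough cost) before running a BigQuery query.
# The $6.25/TiB on-demand figure is an assumption; check current pricing for your region.
from google.cloud import bigquery

client = bigquery.Client()

sql = "SELECT user_id, SUM(amount) FROM `example_dataset.transactions` GROUP BY user_id"

job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job = client.query(sql, job_config=job_config)  # dry run: nothing is billed

tib = job.total_bytes_processed / (1024 ** 4)
print(f"Query would scan {job.total_bytes_processed:,} bytes (~${tib * 6.25:.4f} on-demand)")
```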
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As a Cloud Engineering Team Leader at GlobalLogic, you will be responsible for providing technical guidance and career development support to a team of cloud engineers. You will define cloud architecture standards and best practices across the organization, collaborating with senior leadership to develop a cloud strategy aligned with business objectives. Your role will involve driving technical decision-making for complex cloud infrastructure projects and establishing and maintaining cloud governance frameworks and operational procedures.

With a background in technical leadership roles managing engineering teams, you will have a proven track record of successfully delivering large-scale cloud transformation projects. Experience in budget management and resource planning, along with strong presentation and communication skills for executive-level reporting, is essential. Preferred certifications include Google Cloud Professional Cloud Architect, Google Cloud Professional Data Engineer, and additional relevant cloud or security certifications.

You will leverage your 10+ years of experience in designing and implementing enterprise-scale cloud solutions using GCP services to architect sophisticated cloud solutions using Python and advanced GCP services. Leading the design and deployment of solutions utilizing Cloud Functions, Docker containers, Dataflow, and other GCP services will be part of your responsibilities. Ensuring optimal performance and scalability of complex integrations with multiple data sources and systems, implementing security best practices and compliance frameworks, and troubleshooting and resolving technical issues will be key aspects of your role.

Your technical skills will include expert-level proficiency in Python with experience in additional languages; deep expertise with GCP services such as Dataflow, Compute Engine, BigQuery, Cloud Functions, and others; advanced knowledge of Docker, Kubernetes, and container orchestration patterns; extensive experience in cloud security; proficiency in Infrastructure as Code tools like Terraform and Cloud Deployment Manager; and CI/CD experience with advanced deployment pipelines and GitOps practices.

As part of the GlobalLogic team, you will benefit from a culture of caring, continuous learning and development opportunities, interesting and meaningful work, balance and flexibility in work arrangements, and being part of a high-trust organization. You will have the chance to work on impactful projects, engage with collaborative teammates and supportive leaders, and contribute to shaping cutting-edge solutions in the digital engineering domain.
Posted 2 weeks ago
6.0 years
0 Lacs
India
Remote
Role: GCP Data Engineer
Experience: 6+ years
Type: Contract
Duration: 6 months
Location: Remote
Time zone: IST shift

Job Description:
We are looking for a skilled GCP Data Engineer with strong expertise in SQL and Python coding. The ideal candidate will have hands-on experience with Google Cloud Platform (GCP) services, especially BigQuery, and will be responsible for designing, building, and optimizing data pipelines and analytics solutions.

Key Skills:
Strong proficiency in SQL and Python (an illustrative sketch follows this listing)
Experience with GCP services, especially BigQuery
Data pipeline development and ETL processes
Good understanding of data warehousing and data modeling

Nice to Have:
Experience with Cloud Functions, Dataflow, or Pub/Sub
Exposure to CI/CD and DevOps practices on GCP
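For illustration only (not part of the posting): a small sketch of the SQL-plus-Python combination the listing asks for — a parameterized BigQuery query run from Python. The table and column names are hypothetical.

```python
# Sketch: run a parameterized BigQuery query from Python and iterate the results.
# Table and column names are placeholders.
from datetime import date

from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT order_date, COUNT(*) AS orders, SUM(order_total) AS revenue
FROM `example_dataset.orders`
WHERE order_date BETWEEN @start_date AND @end_date
GROUP BY order_date
ORDER BY order_date
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", date(2024, 1, 1)),
        bigquery.ScalarQueryParameter("end_date", "DATE", date(2024, 1, 31)),
    ]
)

for row in client.query(sql, job_config=job_config).result():
    print(row.order_date, row.orders, row.revenue)
```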
Posted 2 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description

Product Designer position responsible for implementing and maintaining the application in an AWS SaaS environment. You will work closely with business analysts and stakeholders to ensure a robust and scalable solution to support the Finished Vehicle Logistics operation. We are seeking a skilled and motivated Product Designer with strong experience in Google Cloud Platform (GCP) and Java programming in the Spring Boot framework. In this role, you will be responsible for designing, developing, and maintaining scalable and reliable cloud-based solutions, data pipelines, or applications on GCP, leveraging Java for scripting, automation, data processing, and service integration.

Responsibilities
Work closely with the product manager and business stakeholders to understand the business needs and associated systems requirements to meet the customization required in the SaaS solution.
Run and protect the SaaS solution in the AWS environment and troubleshoot production issues.
Be an active participant in all team agile ceremonies; manage the daily deliverables in Jira with proper user stories and acceptance criteria.
Design, build, test, implement, and manage scalable, secure, and reliable infrastructure on Google Cloud Platform (GCP) using Infrastructure as Code (IaC) principles, primarily with Terraform.
Develop and manage APIs or backend services in Java deployed on GCP services like Cloud Run functions, App Engine, or GKE.
Build and maintain robust CI/CD pipelines (e.g., using Cloud Build, Jenkins, GitHub) to enable frequent and reliable application deployments.
Build and maintain data products; design, develop, and maintain ETL/data pipelines for handling business and transformation rules.
Implement and manage monitoring, logging, and alerting solutions (e.g., Cloud Monitoring, Prometheus, Grafana, Cloud Logging) to ensure system health and performance.
Implement and enforce security best practices within the GCP environment (e.g., IAM policies, network security groups, security scanning).
Troubleshoot and resolve production issues across various services (applications) and infrastructure components (GCP).

Qualifications
Bachelor’s degree in engineering (computer science or other streams)
8+ years of software development and support experience, including analysis, design, and testing
4+ years of strong proficiency in software development using Java and Spring Boot
3+ years of experience working with microservices, data ingestion tools, and APIs
2+ years of experience working with GCP cloud-based services and solutions
Experience working with GCP’s data storage and services such as BigQuery, Dataflow, Pub/Sub
Hands-on experience designing, deploying, and managing resources and services on Google Cloud Platform (GCP)
Familiarity with database querying (SQL) and understanding of database concepts
Understanding of cloud architecture principles, including scalability, reliability, and security
Proven experience working effectively within an Agile development or operations team (e.g., Scrum, Kanban)
Experience using incident tracking and project management tools (e.g., Jira, ServiceNow, Azure DevOps)
Excellent problem-solving, communication, and organizational skills
Proven ability to work independently and with a team

Nice-to-Have Skills:
GCP certifications (e.g., Associate Cloud Engineer, Professional Cloud DevOps Engineer, Professional Cloud Architect)
Experience with other cloud providers (AWS, Azure)
Experience with containerization (Docker) and orchestration (Kubernetes)
Experience with database administration (e.g., PostgreSQL, MySQL)
Familiarity with security best practices and tools in a cloud environment (DevSecOps)
Experience with serverless technologies beyond Cloud Functions/Run
Contribution to open-source projects
Posted 2 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Job Title: GCP Data Engineer
Experience: 5 to 7+ Years
Location: Remote
Shift Timing: 3:00 PM – 12:00 AM IST

Job Summary:
We are looking for a highly skilled and motivated GCP Data Engineer with a minimum of 5 to 7+ years of experience in Data Engineering, including at least 3 years of hands-on expertise with Google Cloud Platform (GCP). The ideal candidate will be responsible for designing and implementing robust, scalable, and secure data pipelines and architectures to support data analytics and business intelligence initiatives. This is a remote role with a dedicated shift timing of 3 PM – 12 AM IST.

Key Responsibilities:
Design, develop, and optimize data pipelines using GCP services, especially BigQuery, Cloud Storage, and Dataflow (an illustrative load sketch follows this listing).
Develop and maintain ETL/ELT processes using Python and SQL.
Implement data models and schemas in BigQuery for efficient querying and storage.
Collaborate with data scientists, analysts, and stakeholders to define data requirements and deliver robust data solutions.
Monitor and troubleshoot data pipeline performance, quality, and reliability issues.
Ensure best practices in data security, governance, and compliance on GCP.
Automate data workflows and contribute to CI/CD pipeline integration for data solutions.

Required Skills & Qualifications:
Minimum 6+ years of experience in Data Engineering.
Strong hands-on experience with Google Cloud Platform (GCP) services.
Proficiency in BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Functions.
Expertise in Python, advanced SQL, and shell scripting.
Familiarity with Cloud Composer (Airflow) for workflow orchestration.
Solid understanding of data warehousing concepts, dimensional modeling, and schema design.
Experience with version control tools like Git and CI/CD tools.
Knowledge of data security best practices, IAM roles, and encryption methods on GCP.
Strong problem-solving and debugging skills.
Good communication and team collaboration abilities.
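For illustration only (not part of the posting): a minimal sketch of one ELT step of the kind described above — loading files from Cloud Storage into a BigQuery staging table with the Python client, with transformations left to SQL inside BigQuery afterwards. The bucket, project, dataset, and table names are placeholders.

```python
# Sketch: load CSV files from Cloud Storage into a BigQuery staging table
# (the "EL" of an ELT flow). Bucket, dataset, and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

table_id = "example-project.staging.daily_sales"
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/exports/daily_sales_*.csv",
    table_id,
    job_config=job_config,
)
load_job.result()  # wait for the load job to complete

table = client.get_table(table_id)
print(f"Loaded {table.num_rows} rows into {table_id}")
```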
Posted 2 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
● 5+ years of experience in core Java and the Spring Framework (required)
● 2 years of cloud experience (GCP, AWS, or Azure; GCP preferred) (required)
● Experience in big data processing on a distributed system (required)
● Experience with RDBMS, NoSQL, and cloud-native databases (required)
● Experience in handling various data formats like flat files, JSON, Avro, XML, etc., including defining the schemas and the contracts (required)
● Experience in implementing data pipelines (ETL) using Dataflow (Apache Beam)
● Experience in microservices and integration patterns of APIs with data processing
● Experience in data structures, and in defining and designing data models
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
Karnataka
On-site
At PwC, the focus in data and analytics is on leveraging data to drive insights and make informed business decisions. Utilizing advanced analytics techniques to help clients optimize their operations and achieve strategic goals is key. In data analysis at PwC, the emphasis is on utilizing advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. Skills in data manipulation, visualization, and statistical modeling play a crucial role in supporting clients in solving complex business problems.

Candidates with 4+ years of hands-on experience are sought for the position of Senior Associate in supply chain analytics. Successful candidates should possess proven expertise in supply chain analytics across domains such as demand forecasting, inventory optimization, logistics, segmentation, and network design. Additionally, hands-on experience with optimization methods such as linear programming, mixed integer programming, and scheduling optimization is required. Proficiency in forecasting and machine learning techniques, along with a strong command of statistical modeling, testing, and inference, is essential. Familiarity with GCP tools like BigQuery, Vertex AI, Dataflow, and Looker is also necessary.

Required skills include building data pipelines and models for forecasting, optimization, and scenario planning; strong SQL and Python programming skills; experience deploying models in a GCP environment; and knowledge of orchestration tools like Cloud Composer (Airflow). Nice-to-have skills consist of familiarity with MLOps, containerization (Docker, Kubernetes), and orchestration tools, as well as strong communication and stakeholder engagement skills at the executive level.

The roles and responsibilities of the Senior Associate involve assisting analytics projects within the supply chain domain and driving design, development, and delivery of data science solutions. They are expected to interact with and advise consultants/clients as subject matter experts, conduct analysis using advanced analytics tools, and implement quality control measures to ensure deliverable integrity. Validating analysis outcomes, making presentations, and contributing to knowledge and firm-building activities are also part of the role. The ideal candidate should hold a degree in BE / B.Tech / MCA / M.Sc / M.E / M.Tech / Master's Degree / MBA from a reputed institute.
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
Pune, Maharashtra
On-site
You will be joining a global technology consulting and digital solutions company that specializes in helping enterprises innovate and accelerate business through digital technologies. With a team of over 84,000 professionals in more than 30 countries, the company serves 700+ clients by leveraging its domain and technology expertise to drive competitive differentiation, enhance customer experiences, and achieve superior business outcomes.

As a JAVA-GCP professional, you will be based in Pune (Shivajinagar) and should have at least 6 years of experience in Java with a focus on cloud technologies, specifically Google Cloud Platform (GCP). You will be responsible for data processing, including big data processing on distributed systems, working with databases such as RDBMS and NoSQL databases, and handling various data formats like flat files, JSON, Avro, and XML.

To excel in this role, you must have a Bachelor's degree in computer science, engineering, or equivalent experience, with a minimum of 7 years of experience in core Java and the Spring Framework. Additionally, you should have at least 2 years of experience with cloud platforms (GCP, AWS, or Azure; GCP preferred) and expertise in implementing data pipelines using Dataflow (Apache Beam). Experience in microservices, integration patterns of APIs with data processing, and defining data models will be advantageous.

If you are passionate about technology, data processing, and driving innovation through digital solutions, and if you are looking for a challenging opportunity in a hybrid work environment, we encourage you to apply. Please note that this is a Contract-to-Hire position, and we are looking for immediate joiners who can contribute effectively to our projects.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Data Engineer, your primary responsibility will be to design and develop robust ETL pipelines using Python, PySpark, and various Google Cloud Platform (GCP) services. You will be tasked with building and optimizing data models and queries in BigQuery to support analytics and reporting needs. Additionally, you will play a crucial role in ingesting, transforming, and loading structured and semi-structured data from diverse sources.

Collaboration with data analysts, scientists, and business teams is essential to grasp and address data requirements effectively. Ensuring data quality, integrity, and security across cloud-based data platforms will be a key part of your role, and you will also be responsible for monitoring and troubleshooting data workflows and performance issues. Automation of data validation and transformation processes using scripting and orchestration tools will be a significant part of your day-to-day tasks.

Hands-on experience with Google Cloud Platform (GCP), particularly BigQuery, is crucial, along with proficiency in Python and/or PySpark programming and experience designing and implementing ETL workflows and data pipelines. A strong command of SQL and data modeling for analytics is essential. Familiarity with GCP services like Cloud Storage, Dataflow, Pub/Sub, and Composer will be beneficial, as will an understanding of data governance, security, and compliance in cloud environments. Experience with version control using Git and agile development practices will be advantageous for this role.
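For illustration only (not part of the posting): a minimal PySpark sketch of the ingest-transform-load flow described above, writing to BigQuery via the spark-bigquery connector (available by default on Dataproc). The paths, table names, and temporary bucket are placeholders.

```python
# Sketch: PySpark job that reads semi-structured JSON from GCS, applies a simple
# aggregation, and writes the result to BigQuery through the spark-bigquery connector.
# Paths, project, dataset, and bucket names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

orders = spark.read.json("gs://example-bucket/raw/orders/*.json")

daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
)

(
    daily_revenue.write.format("bigquery")
    .option("table", "example-project.analytics.daily_revenue")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("overwrite")
    .save()
)
```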
Posted 2 weeks ago
15.0 years
0 Lacs
Thane, Maharashtra, India
On-site
Key Responsibilities:

Platform Stabilization & Operational Excellence
Accountable for stable, reliable, and secure operations across all data warehouse applications, ensuring adherence to defined SLAs and KPIs.
Assess the current data platform architecture, identify bottlenecks, and implement solutions to ensure high availability, reliability, performance, and scalability.
Establish robust monitoring, alerting, and incident management processes for all data pipelines and infrastructure.
Drive initiatives to improve data quality, consistency, and trustworthiness across the platform.
Oversee the operational health and day-to-day management of existing data systems during the transition period.
Manage relationships with strategic vendors across the enterprise applications landscape, ensuring strong performance, innovation contributions, and commercial value.

Platform Modernization & Architecture
Define and execute a strategic roadmap for modernizing PerkinElmer's data platform, leveraging cloud-native technologies (AWS, Azure, or GCP) and modern data stack components (e.g., data lakes/lakehouses, Data Fabric/Mesh architectures, streaming platforms like Kafka/Kinesis, orchestration tools like Airflow, ELT/ETL tools, containerization).
Lead the design and implementation of a scalable, resilient, and cost-effective data architecture that meets current and future business needs (DaaS).
Champion and implement DataOps principles, including CI/CD, automated testing, and infrastructure-as-code, to improve development velocity and reliability.
Stay abreast of emerging technologies and industry trends, evaluating and recommending new tools and techniques to enhance the platform.

Leadership & Strategy
Build, mentor, and lead a world-class data engineering team, fostering a culture of innovation, collaboration, and continuous improvement.
Develop and manage the data engineering budget, resources, and vendor relationships.
Define the overall data engineering vision, strategy, and multi-year roadmap in alignment with PerkinElmer's business objectives.
Effectively communicate strategy, progress, and challenges to executive leadership and key stakeholders across the organization.
Drive cross-functional collaboration with IT, Security, Enterprise Apps, R&D, and Business Units.

Data Monetization Enablement
Partner closely with business leaders, enterprise app teams, and other business teams to understand data needs and identify opportunities for data monetization.
Architect data solutions, APIs, and data products that enable the creation of new revenue streams or significant internal efficiencies derived from data assets.
Ensure robust data governance, security, and privacy controls are embedded within the platform design and data products, adhering to relevant regulations (e.g., GDPR, HIPAA where applicable).
Build the foundational data infrastructure required to support advanced analytics, machine learning, and AI initiatives.

Basic Qualifications

Required Qualifications & Experience
Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related quantitative field.
15+ years of experience in data engineering, data architecture, and/or data warehousing.
5+ years of experience in a leadership role, managing data engineering teams and driving large-scale data initiatives.
Proven track record of successfully leading the stabilization, modernization, and scaling of complex data platforms.
Deep expertise in modern data architecture patterns (Data Lakes, Data Warehouses, Lakehouses, Lambda/Kappa architectures).
Extensive hands-on experience with cloud data platforms (AWS, Azure, or GCP) and their associated data services (e.g., S3/ADLS/GCS, Redshift/Synapse/BigQuery, EMR/Dataproc/Databricks, Kinesis/Kafka/Event Hubs, Glue/Data Factory/Dataflow).
Strong experience with big data technologies (e.g., Spark, Hadoop ecosystem) and data processing frameworks.
Proficiency with data pipeline orchestration tools (e.g., Airflow, Prefect, Dagster).
Solid understanding of SQL and NoSQL databases, data modeling techniques, and ETL/ELT development.
Experience with programming languages commonly used in data engineering (e.g., Python, Scala, Java).
Excellent understanding of data governance, data security, and data privacy principles and best practices.
Exceptional leadership, communication, stakeholder management, and strategic thinking skills.
Demonstrated ability to translate business requirements into technical solutions.
Posted 2 weeks ago
0 years
0 Lacs
Gujarat, India
Remote
Job Purpose

The IT/OT Integration Specialist plays a critical role in bridging the gap between Information Technology (IT) and Operational Technology (OT) within the organization. This position is responsible for developing and implementing strategies that ensure seamless integration of IT systems with OT environments to enhance operational efficiency, data analytics, and decision-making processes. The specialist will collaborate with cross-functional teams to design and maintain integrated systems that optimize production, improve safety, and support digital transformation initiatives. By leveraging technical expertise and industry best practices, this role aims to drive innovation and facilitate the smooth exchange of information across the organization, ultimately contributing to improved business outcomes and competitive advantage.

Dimensions:
No. of Users (Non-ERP Apps): 800
Locations Supported: Units – 4
No. of Applications: 10
No. of Functions: 15
No. of External Stakeholders: 3

Other Quantitative and Important Parameters for the Job (Budgets/Volumes/No. of Products/Geography/Markets/Customers or any other parameter):
Platforms – IT Systems (Non-ERP), ERP – Central Application, 3rd Party Systems, OT – DCS-side data integration, Analytics Platforms

Job Context & Major Challenges:

Job Context: Responsible for automation within the domain of IT using IT automation and technology tools, supporting and leading the automation framework. He will be the SPOC from Copper for all IT automation projects, covering functional, technical, and infrastructure requirements, including coordination with the Corporate team and vendors for execution and overall adherence & governance.

Job Challenges:
A single resource with multi-level understanding is a challenge
Remote location of the unit
Responsible for managing all automation and standardization in the areas of Finance, Legal, HR, and Contracts using IT tools, supporting and leading a digital and paperless environment using the latest technology and tools
Partner in continuous improvement initiatives through Information Technology support to incorporate changes and improve productivity to match current and future Copper business needs
Working with the central Application Factory team to have synergy and harmonization of processes before rolling out any changes
Responsible for supporting digital projects along with the Digital Ambassador and working as solution architect for managing IT/OT infrastructure, security, and application integration; also responsible for driving other IT solutions like RFID, Barcode, RPA, and Analytics
Deep understanding of business requirement gathering, BRD documentation, testing, UAT strategy, JIRA workflow, SDLC life cycle, etc.
Fostering business process automation through new-age technologies like RPA, AI, ML, Cloud, APIs, SQL
Identify gaps in projects post go-live, gather feedback from user groups for suggesting improvements as needed, and conduct appropriate user training to achieve high system usage
Work closely with stakeholders to ensure that applications support continuous improvements around quality, cycle time, and operating efficiency
Provide technical expertise to the organization as it relates to process automation and productivity
Participate in and deliver special projects/assignments such as process improvement initiatives
A hunger for the latest knowledge of automation and new technologies
Establish metrics, apply industry best practices, and develop new tools and processes to ensure automation goals are met
An ability to manage performance, development, and deployment issues across the client(s)/portfolio(s)
Create reusable processes and/or extensions for the automation tools
An ability to create automation architecture and solution proposals
An ability to develop prototypes and proofs of concept
Plan, estimate, and implement automation on repeatable processes using automation tools
Driving the change and new ways of working with utmost accuracy and adherence to timelines

Key Result Areas/Accountabilities:

Key Result Area: Manage IT-OT integration layer
Supporting Actions: Develop and manage interfaces between IT systems (e.g., ERP, databases) and OT systems (e.g., SCADA, PLCs).

Key Result Area: IT-OT Dataflow & Integration
Supporting Actions: Smooth implementation of digital and analytics projects (e.g., integration of LIMS, ERP, and other applications with digital technologies), adhering to the security policies.

Key Result Area: IT-OT Security
Supporting Actions: Monitor and review OT security threats and collaborate with plant teams for necessary actions, such as: A) Ensure monitoring/management and hardening of network security devices (switches, routers) with the help of the vendor on a regular basis. B) Monitor the firewall and manage data movement across perimeter firewalls through proper configurations and virtual patching. C) Hardening of cybersecurity solutions by checking and applying the latest patch/firmware releases for the anomaly detection tool, SRA, Secured Remote Access, network monitoring tool, syslog software, backup software, servers, firewall (IT/OT segregation), and network switches – DMZ, core, and ring/distribution switches.

Key Result Area: Operational Excellence
Supporting Actions:
Support digital projects from infrastructure, network architecture, and security aspects.
Do a needs analysis in the Copper business and cross-check with Business / DA / Non-ERP and ERP teams to improve automation; work as an integrator for Data Analytics, BI, and RPA for specific developments for Copper.
Implement data migration and the flow of data from legacy systems to ERP and vice versa wherever needed.
Liaise with IT vendors such as Oracle for critical SRs or bugs for quick resolution.
Act as the SPOC for IT automation operations projects to make sure project plans are made and aligned to the overall automation objective in the Copper business.
Seek to make continuous improvements to execution and automation; provide automation/tooling architecture thought process and application design and development guidance that ensures enterprise-wide scalability; align with existing design/development/usage of automation and technologies; work with the development team and/or vendors to successfully integrate automation.
Serve as an active and consistent participant in the automation governance process.
Customer-centricity by bridging gaps with better synergy.
Coordination with team leaders of the Metals business / business units to collect desired data.
Coordination with CIT and vendors for compliance governance of security in line with the timelines.
Regular interaction with the IT team for security requirements.
Report progress and/or hindrances, if any, to project lead(s).
Build a close working relationship with peers of the team and business process teams; cordial communication with peers in other groups within the business / various business locations.
Work with various stakeholders to identify automation and systems as part of an automation implementation.

Job Purpose of Direct Reports:
Position Title / Job Purpose / Position No.

Relationships:
Internal: Business Heads, Functional Heads, Unit Heads, Cluster Heads, Unit HR Heads, Finance Heads, DH-IT, Unit IT SPOCs, etc., on a regular basis for fulfilment of IT infrastructural requirements. Interaction with function heads at HO and/or at unit locations on a regular basis to seek input on current practices, procedures, and data/information required for execution of project activities. Interaction with IT SPOCs and user departments. Regular interaction with CIT.
External: Vendors & Consultants – on a regular and ongoing basis.
Posted 2 weeks ago
0 years
2 - 9 Lacs
Hyderābād
On-site
Job description

We are seeking a Consultant Specialist.

In this role, you will:
Be a senior full-stack automation test engineer with experience and knowledge of software automation testing using Tosca, Selenium, and other tools; knowledge of ETL tools like DataStage and Dataflow, SQL, shell scripting, Control-M, API development, design patterns, SDLC, IaC tools, testing, and site reliability engineering.
Have a deep understanding of desktop, web, and data warehouse application test automation, API testing, and related ways to design and develop automation frameworks.
Have proven experience in writing automation test scripts, conducting reviews, and building test packs for regression, smoke, and integration testing scenarios.
Identify ways to increase test coverage, provide metric-based test status reporting, and manage the defect lifecycle.
Define and implement best practices for software automation testing (framework, patterns, standards, reviews, coverage, requirement traceability), including testing methodologies.
Be a generalist with breadth and depth of experience in CI/CD best practices and core experience in testing (i.e., TDD/BDD/automated testing/contract testing/API testing/desktop and web apps/DW test automation).
Be able to see a problem or an opportunity and engineer a solution, be respected for what you deliver and not just what you say, think about the business impact of your work, and take a holistic view of problem-solving.
Apply thinking to many problems across multiple technical domains and suggest ways to solve them.
Contribute to architectural discussions by asking the right questions to ensure a solution matches the business needs.
Have excellent verbal and written communication skills to articulate technical concepts to both technical and non-technical stakeholders.
Represent the team at Scrum meetings and all other key project meetings, and provide a single point of accountability and escalation for automation testing within the scrum teams.
Work with cross-functional teams, with the opportunity to work with software product, development, and support teams, and be capable of handling tasks to accelerate testing delivery and improve quality for applications at HSBC.
Be willing to adapt, learn innovative technologies/tools, and be flexible to work on projects as demanded by business.

Requirements

To be successful in this role, you must meet the following requirements:
Experience in software testing approaches to automation testing using Tosca, Selenium, and the Cucumber BDD framework (an illustrative sketch follows this listing).
Experience writing test plans, test strategy, and test data management, including test artifact management, for both automation and manual testing.
Experience setting up CI/CD pipelines and working with GitHub and Jenkins, along with integration with Cucumber and Jira.
Experience in agile methodology and proven experience working on agile projects.
Experience in analysis of bug tracking, prioritizing, and bug reporting with bug-tracking tools.
API automation using REST Assured.
Ability to communicate effectively with stakeholders across the Bank.
Experience in SQL, Unix, Control-M, ETL, data testing, and API testing.
Expert-level experience with Jira and Zephyr.

Good to have: knowledge of the latest technologies and tools such as GitHub Copilot, Python scripting, Tricentis Tosca, Dataflow, Hive, DevOps, REST APIs, Hadoop, the Kafka framework, GCP, and AWS will be an added advantage.

You’ll achieve more when you join HSBC.
www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSBC Software Development India
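For illustration only (not part of the posting): a minimal sketch of the kind of web UI automation check described above, using Selenium with pytest in Python. The posting's own stack also includes Tosca and Cucumber BDD, which are not shown; the URL, credentials, and element locators are hypothetical.

```python
# Sketch: a pytest + Selenium smoke test for a login page.
# URL, credentials, and element locators are placeholders.
import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

@pytest.fixture
def driver():
    options = webdriver.ChromeOptions()
    options.add_argument("--headless=new")
    drv = webdriver.Chrome(options=options)
    yield drv
    drv.quit()

def test_login_smoke(driver):
    driver.get("https://example.internal/login")
    driver.find_element(By.ID, "username").send_keys("test_user")
    driver.find_element(By.ID, "password").send_keys("not-a-real-password")
    driver.find_element(By.ID, "submit").click()

    # Wait for the post-login dashboard header instead of sleeping.
    header = WebDriverWait(driver, 10).until(
        EC.visibility_of_element_located((By.CSS_SELECTOR, "h1.dashboard-title"))
    )
    assert "Dashboard" in header.text
```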
Posted 2 weeks ago
5.0 years
4 - 7 Lacs
Thiruvananthapuram
On-site
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What you’ll do
Design, develop, and operate high scale applications across the full engineering stack.
Design, develop, test, deploy, maintain, and improve software.
Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset.
Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
Participate in a tight-knit, globally distributed engineering team.
Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality.
Manage sole project priorities, deadlines, and deliverables.
Research, create, and develop software applications to extend and improve on Equifax Solutions.
Collaborate on scalability issues involving access to data and information.
Actively participate in Sprint planning, Sprint Retrospectives, and other team activities.

What experience you need
Bachelor's degree or equivalent experience
5+ years of software engineering experience
5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot
5+ years experience with Cloud technology: GCP, AWS, or Azure
5+ years experience designing and developing cloud-native solutions
5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes
5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understanding infrastructure-as-code concepts, Helm Charts, and Terraform constructs
Big Data experience preferred

What could set you apart
Self-starter that identifies/responds to priority shifts with minimal supervision.
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others.
Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices.
Source code control management systems (e.g. SVN/Git, Github) and build tools like Maven & Gradle.
Agile environments (e.g. Scrum, XP).
Relational databases (e.g. SQL Server, MySQL).
Atlassian tooling (e.g. JIRA, Confluence, and Github).
Developing with modern JDK (v1.7+).
Automated Testing: JUnit, Selenium, LoadRunner, SoapUI.
Posted 2 weeks ago
6.0 years
6 - 10 Lacs
Noida
On-site
Country/Region: IN
Requisition ID: 27468
Work Model:
Position Type:
Salary Range:
Location: INDIA - NOIDA - BIRLASOFT OFFICE
Title: Technical Specialist - Data Engg

Description: Area(s) of responsibility

Required Skills & Experience
6 years of experience with hands-on development experience in DBT, Aptitude, and Snowflake on the Azure platform – dataflow, data ingestion, data storage & security
Expertise in the ETL tool DBT
Design data integration (ETL) projects using DBT
Strong hands-on experience building custom data models/semantic reporting layers in Snowflake to support customer reporting and current platform requirements
Good to have experience in any other ETL tool
Participate in the entire project lifecycle, including design and development of ETL solutions
Design data integration and conversion strategy, exception handling mechanism, data retention and archival strategy
Ability to communicate platform features/development effectively to customer SMEs & technical teams
Drive UAT activities with business partners to ensure solutions meet business requirements
Experience working with Power BI reporting, Dataflow, Data Lake, Databricks, ADF Pipeline, and Security knowledge is an added advantage
Posted 2 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You’ll Do
Design, develop, and operate high scale applications across the full engineering stack.
Design, develop, test, deploy, maintain, and improve software.
Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset.
Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
Participate in a tight-knit, globally distributed engineering team.
Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality.
Manage sole project priorities, deadlines, and deliverables.
Research, create, and develop software applications to extend and improve on Equifax Solutions.
Collaborate on scalability issues involving access to data and information.
Actively participate in Sprint planning, Sprint Retrospectives, and other team activities.
Ensure high code quality through comprehensive unit, integration, and end-to-end testing, alongside participation in code reviews.

What Experience You Need
Bachelor's degree or equivalent experience
5+ years of software engineering experience
5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, Angular UI, TypeScript/JavaScript, HTML, CSS
5+ years experience with Cloud technology: GCP, AWS, or Azure
5+ years experience designing and developing cloud-native solutions
5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes
5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understanding infrastructure-as-code concepts, Helm Charts, and Terraform constructs
Design and build robust, user-friendly, and highly responsive web applications using Angular (versions 12+).
Implement and manage micro-frontend architectures to foster independent deployments and enhance team autonomy.
Collaborate closely with DevOps teams, contributing to CI/CD pipeline automation for seamless integration and deployment processes.
Utilize Git and Gitflow workflows for efficient source code management, branching, and merging strategies.

What could set you apart
Self-starter that identifies/responds to priority shifts with minimal supervision.
Deep expertise in designing and developing complex, high-performance web applications using Angular (v12+), including advanced state management, performance optimization, and modular design.
Proven experience in implementing and managing micro-frontend solutions, enabling independent team development, scalable deployments, and enhanced application resilience.
Strong command of core UI technologies including HTML, JavaScript, and CSS frameworks like Bootstrap, ensuring pixel-perfect and responsive user experiences.
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others.
UI development (e.g. HTML, JavaScript, Angular and Bootstrap).
Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices.
Source code control management systems (e.g. SVN/Git, Github) and build tools like Maven & Gradle.
Agile environments (e.g. Scrum, XP).
Relational databases (e.g. SQL Server, MySQL).
Atlassian tooling (e.g. JIRA, Confluence, and Github).
Developing with modern JDK (v1.7+).
Automated Testing: JUnit, Selenium, LoadRunner, SoapUI.
Posted 2 weeks ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About Company: Our client is a global technology consulting and digital solutions company that enables enterprises to reimagine business models and accelerate innovation through digital technologies. Powered by more than 84,000 entrepreneurial professionals across more than 30 countries, it caters to over 700 clients with its extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes.
Job Title: JAVA-GCP
Location: Pune (Shivajinagar)
Experience: 6+ Years
Employment Type: Contract to Hire
Work Mode: Hybrid
Notice Period: Immediate Joiners Only
Job Description: Java with cloud (GCP preferred), Dataflow, and data processing.
• Bachelor's in Computer Science, Engineering, or equivalent experience.
• 7+ years of experience in core Java and the Spring Framework. (Required)
• 2 years of cloud experience (GCP, AWS, or Azure; GCP preferred). (Required)
• Experience in big data processing on a distributed system. (Required)
• Experience with RDBMS, NoSQL, and cloud-native databases. (Required)
• Experience handling various data formats such as flat files, JSON, Avro, and XML, including defining the schemas and contracts. (Required)
• Experience implementing data pipelines (ETL) using Dataflow / Apache Beam (see the pipeline sketch after this listing).
• Experience with microservices and API integration patterns for data processing.
• Experience with data structures and with defining and designing data models.
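The Dataflow (Apache Beam) ETL work this role describes can be pictured with a minimal sketch. The role calls for Java, but for brevity this uses the Beam Python SDK; the bucket, project, table, and field names are hypothetical placeholders, not part of the posting.

```python
# Minimal batch ETL sketch with the Apache Beam Python SDK (hypothetical names).
# Run locally with the DirectRunner, or on Dataflow by passing
# --runner=DataflowRunner --project=... --region=... --temp_location=gs://...
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run(argv=None):
    options = PipelineOptions(argv)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadJsonFiles" >> beam.io.ReadFromText("gs://example-bucket/orders/*.json")
            | "ParseJson" >> beam.Map(json.loads)
            | "DropInvalid" >> beam.Filter(lambda rec: rec.get("order_id") is not None)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:sales.orders",
                schema="order_id:STRING,amount:FLOAT,order_ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

The same shape translates directly to the Beam Java SDK (TextIO, MapElements, BigQueryIO); only the schema and contract definitions for JSON/Avro/XML inputs change per source format.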
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Data Engineer, you will be responsible for designing and developing robust ETL pipelines using Python, PySpark, and Google Cloud Platform (GCP) services (a minimal sketch follows this listing). Your role will involve building and optimizing data models and queries in BigQuery for analytics and reporting purposes. You will also be responsible for ingesting, transforming, and loading structured and semi-structured data from various sources. Collaboration with data analysts, scientists, and business teams to understand data requirements will be a key aspect of your job. Ensuring data quality, integrity, and security across cloud-based data platforms is crucial. Monitoring and troubleshooting data workflows and performance issues will also be part of your responsibilities, as will automating data validation and transformation processes using scripting and orchestration tools.
You are required to have hands-on experience with Google Cloud Platform (GCP), especially BigQuery. Strong programming skills in Python and/or PySpark are necessary for this position, and experience in designing and implementing ETL workflows and data pipelines will be valuable. Proficiency in SQL and data modeling for analytics is required. Familiarity with GCP services such as Cloud Storage, Dataflow, Pub/Sub, and Composer is preferred. An understanding of data governance, security, and compliance in cloud environments is essential. Experience with version control tools like Git and agile development practices will be beneficial for this role.
If you are looking for a challenging opportunity to work on cutting-edge data engineering projects, this position is ideal for you.
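To make the PySpark-plus-BigQuery ETL pattern above concrete, here is a minimal sketch. It assumes the spark-bigquery connector is available on the cluster (Dataproc ships it by default); the bucket, dataset, and column names are hypothetical.

```python
# Minimal PySpark ETL sketch: read raw JSON from GCS, transform, load into BigQuery.
# Assumes the spark-bigquery connector is on the classpath. All names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Ingest semi-structured data from Cloud Storage (hypothetical path).
raw = spark.read.json("gs://example-bucket/raw/orders/*.json")

# Basic cleansing and transformation.
orders = (
    raw.filter(F.col("order_id").isNotNull())
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
)

# Load into a BigQuery table; the connector stages data through a temporary GCS bucket.
(
    orders.write.format("bigquery")
          .option("table", "example-project.sales.orders")
          .option("temporaryGcsBucket", "example-temp-bucket")
          .mode("append")
          .save()
)
```

In practice the same job would typically be scheduled from Composer and validated with lightweight checks (row counts, null ratios) before the write step.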
Posted 2 weeks ago
5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.
What You’ll Do
• Design, develop, and operate high-scale applications across the full engineering stack.
• Design, develop, test, deploy, maintain, and improve software.
• Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
• Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset.
• Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
• Participate in a tight-knit, globally distributed engineering team.
• Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality.
• Manage your own project priorities, deadlines, and deliverables.
• Research, create, and develop software applications to extend and improve on Equifax solutions.
• Collaborate on scalability issues involving access to data and information.
• Actively participate in Sprint planning, Sprint retrospectives, and other team activities.
What Experience You Need
• Bachelor's degree or equivalent experience.
• 5+ years of software engineering experience.
• 5+ years of experience writing, debugging, and troubleshooting code in mainstream Java and Spring Boot.
• 5+ years of experience with cloud technology: GCP, AWS, or Azure.
• 5+ years of experience designing and developing cloud-native solutions.
• 5+ years of experience designing and developing microservices using Java, Spring Boot, GCP SDKs, and GKE/Kubernetes.
• 5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines; understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs.
• Big Data experience preferred.
What could set you apart
• Self-starter who identifies and responds to priority shifts with minimal supervision.
• Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others (see the orchestration sketch after this listing).
• Experience with backend technologies such as Java/J2EE, Spring Boot, SOA, and microservices.
• Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven and Gradle.
• Agile environments (e.g. Scrum, XP).
• Relational databases (e.g. SQL Server, MySQL).
• Atlassian tooling (e.g. JIRA, Confluence, and GitHub).
• Developing with a modern JDK (v1.7+).
• Automated testing: JUnit, Selenium, LoadRunner, SoapUI.
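As an illustration of the Composer/Airflow orchestration mentioned under "What could set you apart", here is a minimal DAG sketch that loads a daily GCS file into BigQuery and then builds a fact table. It assumes the Google provider package is installed (Cloud Composer includes it); the project, dataset, and bucket names are hypothetical placeholders.

```python
# Minimal Cloud Composer / Airflow DAG sketch: load a daily file from GCS into BigQuery,
# then run a transformation query. All resource names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:

    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_orders",
        bucket="example-bucket",
        source_objects=["raw/orders/{{ ds }}/*.json"],
        source_format="NEWLINE_DELIMITED_JSON",
        destination_project_dataset_table="example-project.staging.orders_{{ ds_nodash }}",
        write_disposition="WRITE_TRUNCATE",
        autodetect=True,
    )

    build_fact = BigQueryInsertJobOperator(
        task_id="build_fact_orders",
        configuration={
            "query": {
                "query": (
                    "INSERT INTO `example-project.dwh.fact_orders` "
                    "SELECT order_id, CAST(amount AS FLOAT64) AS amount, "
                    "DATE(order_ts) AS order_date "
                    "FROM `example-project.staging.orders_{{ ds_nodash }}`"
                ),
                "useLegacySql": False,
            }
        },
    )

    load_raw >> build_fact
```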
Posted 2 weeks ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Summary
Synechron is seeking an experienced Data Processing Engineer to lead the development of large-scale data processing solutions using Java, Apache Flink/Storm/Beam, and Google Cloud Platform (GCP). In this role, you will collaborate across teams to design, develop, and optimize data-intensive applications that support strategic business objectives. Your expertise will help evolve our data architecture, improve processing efficiency, and ensure the delivery of reliable, scalable solutions in an Agile environment.
Software Requirements
Required:
• Java (version 8 or higher)
• Apache Flink, Storm, or Beam for streaming data processing (see the streaming sketch after this listing)
• Google Cloud Platform (GCP) services, especially BigQuery and related data tools
• Experience with databases such as BigQuery, Oracle, or equivalent
• Familiarity with version control tools such as Git
Preferred:
• Cloud deployment experience, with GCP in particular
• Additional familiarity with containerization (Docker/Kubernetes)
• Knowledge of CI/CD pipelines and DevOps practices
Overall Responsibilities
• Collaborate closely with cross-functional teams to understand data and system requirements, then design scalable solutions aligned with business needs.
• Develop detailed technical specifications, implementation plans, and documentation for new features and enhancements.
• Implement, test, and deploy data processing applications using Java and Apache Flink/Storm/Beam within GCP environments.
• Conduct code reviews to ensure quality, security, and maintainability, supporting team members' growth and best practices.
• Troubleshoot technical issues, resolve bottlenecks, and optimize application performance and resource utilization.
• Stay current with advancements in data processing, cloud technology, and Java development to continuously improve solutions.
• Support testing teams to verify data workflows and validation processes, ensuring reliability and accuracy.
• Participate in Agile ceremonies, including sprint planning, stand-ups, and retrospectives, to ensure continuous delivery and process improvement.
Technical Skills (By Category)
• Programming Languages: Required: Java (8+); Preferred: Python, Scala, or Node.js for scripting or auxiliary processing
• Databases/Data Management: Experience with BigQuery, Oracle, or similar relational data stores
• Cloud Technologies: GCP (BigQuery, Cloud Storage, Dataflow, etc.), with hands-on experience in cloud data solutions
• Frameworks and Libraries: Apache Flink, Storm, or Beam for stream processing; Java SDKs, APIs, and data integration libraries
• Development Tools and Methodologies: Git, Jenkins, JIRA, and Agile/Scrum practices; familiarity with containerization (Docker, Kubernetes) is a plus
• Security and Compliance: Understanding of data security principles in cloud environments
Experience Requirements
• 4+ years of experience in software development, with a focus on data processing and Java-based backend development
• Proven experience working with Apache Flink, Storm, or Beam in production environments
• Strong background in managing large data workflows and pipeline optimization
• Experience with GCP data services and cloud-native development
• Demonstrated success in Agile projects, including collaboration with cross-functional teams
• Previous leadership or mentorship experience is a plus
Day-to-Day Activities
• Design, develop, and deploy scalable data processing applications in Java using Flink/Storm/Beam on GCP
• Collaborate with data engineers, analysts, and architects to translate business needs into technical solutions
• Conduct code reviews, optimize data pipelines, and troubleshoot system issues swiftly
• Document technical specifications, data schemas, and process workflows
• Participate actively in Agile ceremonies, provide updates on task progress, and suggest process improvements
• Support continuous integration and deployment of data applications
• Mentor junior team members, sharing best practices and technical insights
Qualifications
• Bachelor's or Master's degree in Computer Science, Information Technology, or equivalent
• Relevant certifications in cloud technologies or data processing (preferred)
• Evidence of continuous professional development and staying current with industry trends
Professional Competencies
• Strong analytical and problem-solving skills focused on data processing challenges
• Leadership abilities to guide, mentor, and develop team members
• Excellent communication skills for technical documentation and stakeholder engagement
• Adaptability to rapidly changing technologies and project priorities
• Capacity to prioritize tasks and manage time efficiently under tight deadlines
• Innovative mindset to leverage new tools and techniques for performance improvements
SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and is an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative ‘Same Difference’ is committed to fostering an inclusive culture – promoting equality, diversity and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, race, ethnicities, religion, age, marital status, gender, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more. All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant’s gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
Candidate Application Notice
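Since the role names Apache Beam as one of the acceptable stream-processing frameworks, here is a minimal streaming sketch of the pattern it describes. The posting's primary language is Java; this uses the Beam Python SDK for brevity, and the subscription, table, and field names are hypothetical.

```python
# Minimal streaming sketch with the Apache Beam Python SDK: read events from Pub/Sub,
# window them, aggregate per key, and write results to BigQuery. Names are hypothetical.
import json

import apache_beam as beam
from apache_beam.transforms import window
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


def run(argv=None):
    options = PipelineOptions(argv)
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/example-project/subscriptions/orders-sub")
            | "Decode" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "KeyByRegion" >> beam.Map(lambda e: (e["region"], float(e["amount"])))
            | "FixedWindows" >> beam.WindowInto(window.FixedWindows(60))  # 1-minute windows
            | "SumPerRegion" >> beam.CombinePerKey(sum)
            | "ToRow" >> beam.Map(lambda kv: {"region": kv[0], "total_amount": kv[1]})
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.region_totals",
                schema="region:STRING,total_amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

The equivalent Flink job would express the same read-window-aggregate-sink shape with the DataStream API; the choice between Flink and Beam/Dataflow is mostly an operational one (self-managed cluster versus a managed runner).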
Posted 2 weeks ago
12.0 years
25 - 35 Lacs
Madurai
On-site
Dear Candidate,
Greetings of the day!! I am Kantha, and I'm reaching out to you regarding an exciting opportunity with TechMango. You can connect with me on LinkedIn: https://www.linkedin.com/in/kantha-m-ashwin-186ba3244/ or email: kanthasanmugam.m@techmango.net
Techmango Technology Services is a full-scale software development services company founded in 2014 with a strong focus on emerging technologies. Its primary objective is to deliver strategic technology solutions aligned with the goals of its business partners. We are a leading software and mobile app development company, driven by the mantra “Clients Vision is our Mission”, and we stand by that statement. Our aim is to be a technologically advanced and well-regarded organization providing high-quality, cost-efficient services built on long-term client relationships. We are operational in the USA (Chicago, Atlanta), Dubai (UAE), and India (Bangalore, Chennai, Madurai, Trichy). Techmango: https://www.techmango.net/
Job Title: GCP Data Architect
Location: Madurai
Experience: 12+ Years
Notice Period: Immediate
About TechMango
TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project.
Role Summary
As a GCP Data Architect, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals.
Key Responsibilities:
• Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP).
• Define data strategy, standards, and best practices for cloud data engineering and analytics.
• Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery.
• Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery).
• Architect data lakes, warehouses, and real-time data platforms (see the BigQuery sketch after this listing).
• Ensure data governance, security, lineage, and compliance (using tools such as Data Catalog, IAM, and DLP).
• Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers.
• Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards.
• Provide technical leadership in architectural decisions and future-proofing of the data ecosystem.
Required Skills & Qualifications:
• 10+ years of experience in data architecture, data engineering, or enterprise data platforms.
• Minimum 3–5 years of hands-on experience with GCP data services.
• Proficient in BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, and Cloud SQL/Spanner.
• Python / Java / SQL.
• Data modeling (OLTP, OLAP, star/snowflake schema).
• Experience with real-time data processing, streaming architectures, and batch ETL pipelines.
• Good understanding of IAM, networking, security models, and cost optimization on GCP.
• Prior experience leading cloud data transformation projects.
• Excellent communication and stakeholder management skills.
Preferred Qualifications:
• GCP Professional Data Engineer / Architect Certification.
• Experience with Terraform, CI/CD, GitOps, and Looker / Data Studio / Tableau for analytics.
• Exposure to AI/ML use cases and MLOps on GCP.
• Experience working in agile environments and client-facing roles.
What We Offer:
• Opportunity to work on large-scale data modernization projects with global clients.
• A fast-growing company with a strong tech and people culture.
• Competitive salary, benefits, and flexibility.
• A collaborative environment that values innovation and leadership.
Job Type: Full-time
Pay: ₹2,500,000.00 - ₹3,500,000.00 per year
Application Question(s):
• Current CTC?
• Expected CTC?
• Notice Period? (If you are serving your notice period, please mention your last working day.)
Experience:
• GCP Data Architecture: 3 years (Required)
• BigQuery: 3 years (Required)
• Cloud Composer (Airflow): 3 years (Required)
Location: Madurai, Tamil Nadu (Required)
Work Location: In person
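To illustrate the warehouse modeling and cost-optimization work described above, here is a minimal sketch that defines a date-partitioned, clustered BigQuery fact table with the google-cloud-bigquery Python client; the project, dataset, and column names are hypothetical placeholders.

```python
# Minimal sketch: define a date-partitioned, clustered fact table in BigQuery using the
# google-cloud-bigquery client library. All identifiers are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

schema = [
    bigquery.SchemaField("order_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("region", "STRING"),
    bigquery.SchemaField("amount", "NUMERIC"),
    bigquery.SchemaField("order_date", "DATE", mode="REQUIRED"),
]

table = bigquery.Table("example-project.dwh.fact_orders", schema=schema)

# Partition by date and cluster by region so queries that filter on these columns
# scan less data, which is the main lever for BigQuery cost optimization.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="order_date",
)
table.clustering_fields = ["region"]

table = client.create_table(table, exists_ok=True)
print(f"Created or found table {table.full_table_id}")
```

In a real migration this definition would typically live in Terraform or a schema-as-code repository rather than an ad hoc script, so the architecture stays reviewable and reproducible.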
Posted 2 weeks ago