4.0 - 8.0 years
0 Lacs
Mulshi, Maharashtra, India
On-site
Area(s) of responsibility
About Birlasoft: Birlasoft, a global leader at the forefront of Cloud, AI, and Digital technologies, seamlessly blends domain expertise with enterprise solutions. The company’s consultative and design-thinking approach empowers societies worldwide, enhancing the efficiency and productivity of businesses. As part of the multibillion-dollar diversified CKA Birla Group, Birlasoft, with its 12,000+ professionals, is committed to continuing the Group’s 170-year heritage of building sustainable communities.
About the Job – Ability to relate the product functionality to business processes, and thus offer implementation advice to customers on how to meet their various business scenarios.
Job Title – GCP BigQuery Engineer
Location: Pune/Bangalore/Mumbai/Hyderabad/Noida
Educational Background – BE/BTech
Key Responsibilities – Must-Have Skills: Should have 4-8 years of experience. Design, develop, and implement data warehousing and analytics solutions using Google BigQuery as the primary data storage and processing platform. Work closely with business stakeholders, data architects, and data engineers to gather requirements and design scalable and efficient data models and schemas in BigQuery. Implement data ingestion pipelines to extract, transform, and load (ETL) data from various source systems into BigQuery using GCP services such as Cloud Dataflow, Cloud Storage, and Data Transfer Service. Optimize BigQuery performance and cost-effectiveness by designing partitioned tables, clustering tables, and optimizing SQL queries. Develop and maintain data pipelines and workflows using GCP tools and technologies to automate data processing and analytics tasks. Implement data security and access controls in BigQuery to ensure compliance with regulatory requirements and protect sensitive data. Collaborate with cross-functional teams to integrate BigQuery with other GCP services and third-party tools to support advanced analytics, machine learning, and business intelligence initiatives. Provide technical guidance and mentorship to junior members of the team and contribute to knowledge sharing and best practices development.
Qualifications: Bachelor's degree in Computer Science, Information Technology, or related field. Strong proficiency in designing and implementing data warehousing and analytics solutions using BigQuery. Experience with data modeling, schema design, and optimization techniques in BigQuery. Hands-on experience with GCP services such as Cloud Dataflow, Cloud Storage, Data Transfer Service, and Data Studio.
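As a rough, non-authoritative illustration of the partitioning and clustering practice this posting calls for, the sketch below creates a date-partitioned, clustered BigQuery table with the google-cloud-bigquery Python client; the project, dataset, table, and column names are hypothetical placeholders.

```python
from google.cloud import bigquery

# Assumed project, dataset, and schema; for illustration only.
client = bigquery.Client(project="my-project")

table = bigquery.Table(
    "my-project.analytics.events",
    schema=[
        bigquery.SchemaField("event_date", "DATE"),
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("amount", "NUMERIC"),
    ],
)
# Partition by date and cluster by customer_id so filtered queries scan fewer bytes.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_date"
)
table.clustering_fields = ["customer_id"]
table = client.create_table(table, exists_ok=True)
print(f"Created {table.full_table_id}")
```

Queries that filter on the partition column (and, secondarily, the clustering column) then prune most of the table, which is the main lever for both performance and cost in BigQuery.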
Posted 2 weeks ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Role Description
Roles and Responsibilities: Lead and manage end-to-end delivery of data-centric projects including data warehousing, data integration, and business intelligence initiatives. Drive project planning, execution, monitoring, and closure using industry-standard project management methodologies (Agile/Scrum/Waterfall). Collaborate with cross-functional teams, data architects, developers, and business stakeholders to define project scope and requirements. Ensure timely delivery of project milestones within the approved scope, budget, and timelines. Proactively manage project risks, dependencies, and issues with clear mitigation strategies. Establish effective communication plans and engage stakeholders at all levels to ensure project alignment and transparency. Maintain and track detailed project documentation including timelines, resource plans, status reports, and governance logs. Lead one or more full lifecycle ETL/Data integration implementations from initiation to go-live and support transition. Ensure alignment of data architecture and modeling practices with organizational standards and best practices.
Must-Have Skills: Minimum 5+ years of experience in Project Management, with at least 3 years managing data-centric projects (e.g., Data Warehousing, Business Intelligence, Data Integration). Strong understanding of data architecture principles, data modeling, and database design. Proven experience managing full-lifecycle ETL/Data integration projects. Hands-on exposure to project planning, budgeting, resource management, stakeholder communication, and risk management. Ability to drive cross-functional teams and communicate effectively with both technical and non-technical stakeholders.
Good-to-Have Skills: Working knowledge or hands-on experience with ETL tools such as Informatica, Talend, IBM DataStage, SSIS, AWS Glue, Azure Data Factory, or GCP Dataflow. Familiarity with Agile/Scrum methodologies and tools like JIRA, MS Project, or Confluence. PMP, PMI-ACP, or Scrum Master certification. Prior experience working with cloud-based data solutions.
Skills: Healthcare, ETL, Data Warehousing, Project Management
Posted 2 weeks ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description Product Strategy & Vision (AI/ML & Scale): Define and evangelize the product vision, strategy, and roadmap for our AI/ML platform, data pipelines, and scalable application infrastructure, aligning with overall company objectives. Identify market opportunities, customer needs, and technical trends to drive innovation and competitive advantage in the AI/ML and large-scale data domain. Translate complex technical challenges and opportunities into clear, actionable product requirements and specifications. Product Development & Execution Leadership: Oversee the entire product lifecycle from ideation and discovery through development, launch, and post-launch iteration for critical AI/ML and data products. Work intimately with engineering, data science, and operations teams to ensure seamless execution and delivery of high-quality, performant, and scalable solutions. Champion best practices for productionizing applications at scale and ensuring our systems can handle huge volumes of data efficiently and reliably. Define KPIs and metrics for product success, monitoring performance and iterating based on data-driven insights. People Management & Team Leadership: Lead, mentor, coach, and grow a team of Technical Product Managers, fostering a culture of innovation, accountability, and continuous improvement. Provide clear direction, set performance goals, conduct regular reviews, and support the professional development and career growth of your team members. Act as a leader and role model, promoting collaboration, open communication, and a positive team environment. Technical Expertise & Hands-On Contribution: Possess a deep understanding of the end-to-end ML lifecycle (MLOps), from data ingestion and model training to deployment, monitoring, and continuous improvement. Demonstrate strong proficiency in Google Cloud Platform (GCP) services, including but not limited to, compute, storage, networking, data processing (e.g., BigQuery, Dataflow, Dataproc), and AI/ML services (e.g., Vertex AI, Cloud AI Platform). Maintain a strong hands-on expertise level in Python programming, capable of contributing to prototypes, proof-of-concepts, data analysis, or technical investigations as needed. Extensive practical experience with leading AI frameworks and libraries, including Hugging Face for natural language processing and transformer models. Proven experience with LangGraph (or similar sophisticated agentic frameworks like LangChain, LlamaIndex), understanding their architecture and application in building intelligent, multi-step AI systems. Solid understanding of agentic frameworks, their design patterns, and how to productionize complex AI agents. Excellent exposure to GitHub and modern coding practices, including version control, pull requests, code reviews, CI/CD pipelines, and writing clean, maintainable code. Cross-functional Collaboration & Stakeholder Management: Collaborate effectively with diverse stakeholders across engineering, data science, design, sales, marketing, and executive leadership to gather requirements, communicate progress, and align strategies. Act as a bridge between technical teams and business stakeholders, translating complex technical concepts into understandable business implications and vice-versa. Responsibilities Technical Skills: Deep expertise in Google Cloud Platform (GCP) services for data, AI/ML, and scalable infrastructure. 
Expert-level hands-on Python programming skills (e.g., for data manipulation, scripting, API interaction, ML prototyping, productionizing) Strong working knowledge of Hugging Face libraries and ecosystem. Direct experience with LangGraph and/or other advanced agentic frameworks (e.g., LangChain, LlamaIndex) for building intelligent systems. Solid understanding of software development lifecycle, GitHub, Git workflows, and modern coding practices (CI/CD, testing, code quality). Qualifications Education: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related technical field. Experience: 8+ years of progressive experience in technical product management roles, with a significant portion focused on AI/ML, data platforms, or highly scalable systems. 3+ years of direct people management experience, leading and mentoring a team of product managers or technical leads. Demonstrable track record of successfully bringing complex technical products from concept to production at scale. Proven ability to manage products that handle massive volumes of data and require high throughput. Extensive practical experience with AI/ML model deployment and MLOps best practices in a production environment. Leadership & Soft Skills: Exceptional leadership, communication, and interpersonal skills, with the ability to inspire and motivate a team. Strong analytical and problem-solving abilities, with a data-driven approach to decision-making. Ability to thrive in a fast-paced, ambiguous, and rapidly evolving technical environment. Excellent ability to articulate complex technical concepts to both technical and non-technical audiences.
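As a small, hedged example of the hands-on Hugging Face prototyping this role mentions, the snippet below runs a stock sentiment-analysis pipeline from the transformers library; the review texts are made up, and the model used is simply whatever default the library resolves for that task.

```python
from transformers import pipeline

# Load a ready-made sentiment-analysis pipeline (default model downloaded on first use).
classifier = pipeline("sentiment-analysis")

reviews = [
    "The new agent workflow resolved my issue in minutes.",
    "Deployment keeps failing and support has not responded.",
]
# Classify each review and print label, confidence, and the original text.
for review, result in zip(reviews, classifier(reviews)):
    print(f"{result['label']:>8}  {result['score']:.2f}  {review}")
```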
Posted 2 weeks ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Impetus is hiring for good GCP Data Engineers. If you are good in Big Data, Spark, PySpark & GCP (Pub/Sub, Dataproc, BigQuery, etc.), are an immediate joiner, and can join us in 0-30 days, please share your resume at rashmeet.g.tuteja@impetus.com. Responsibilities: Able to effectively use GCP managed services, e.g. Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS - at least 4 of these services. Should have strong experience in Big Data, Spark, PySpark & Python. Strong experience in Big Data technologies – Hadoop, Sqoop, Hive, and Spark. Good hands-on expertise in either Python or Java programming. Good understanding of GCP core services like Google Cloud Storage, Google Compute Engine, Cloud SQL, Cloud IAM. Good to have knowledge of GCP services like App Engine, GKE, Cloud Run, Cloud Build.
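For illustration only, here is a minimal PySpark job of the shape this posting describes, reading raw files from Cloud Storage and writing an aggregate to BigQuery via the spark-bigquery connector; the bucket, dataset, and column names are placeholders, and the connector jar is assumed to be available on the Dataproc cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("gcs-to-bigquery-demo").getOrCreate()

# Read raw CSV files landed in Cloud Storage (hypothetical bucket and layout).
orders = spark.read.option("header", True).csv("gs://example-raw-bucket/orders/*.csv")

# A simple transformation: daily order totals.
daily = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("total_amount"))
)

# Write to BigQuery; the connector needs a temporary GCS bucket for indirect writes.
(
    daily.write
    .format("bigquery")
    .option("table", "example_dataset.daily_order_totals")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("overwrite")
    .save()
)
```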
Posted 2 weeks ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Impetus is hiring for good GCP Data Engineers. If you are good in Big Data, Spark, PySpark & GCP (Pub/Sub, Dataproc, BigQuery, etc.), are an immediate joiner, and can join us in 0-30 days, please share your resume at vaishali.tyagi@impetus.com. Responsibilities: Able to effectively use GCP managed services, e.g. Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS - at least 4 of these services. Should have strong experience in Big Data, Spark, PySpark & Python. Strong experience in Big Data technologies – Hadoop, Sqoop, Hive, and Spark. Good hands-on expertise in either Python or Java programming. Good understanding of GCP core services like Google Cloud Storage, Google Compute Engine, Cloud SQL, Cloud IAM. Good to have knowledge of GCP services like App Engine, GKE, Cloud Run, Cloud Build.
Posted 2 weeks ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Exp - 8+ Years
Location - Pune
Notice Period: 15 Days or Immediate joiner
Job Summary: We are seeking a highly skilled and experienced SQL and BigQuery Expert with 8–9 years of hands-on experience in data engineering, data analytics, and cloud-based data warehousing solutions. The ideal candidate will be responsible for designing, optimizing, and maintaining large-scale datasets and advanced SQL queries using Google BigQuery, with a strong focus on performance, scalability, and reliability.
Key Responsibilities: Design and optimize complex SQL queries for data transformation and reporting. Develop and manage BigQuery data pipelines using native SQL, PL/SQL, and GCP services (e.g., Cloud Storage, Dataflow, Pub/Sub). Work closely with data engineers, analysts, and business stakeholders to understand data requirements. Automate data ingestion and transformation workflows using scheduled queries or orchestration tools (e.g., Cloud Composer / Airflow). Perform query performance tuning and troubleshoot latency issues in BigQuery. Implement best practices for cost optimization, data partitioning, clustering, and access control. Develop data marts and maintain semantic layers for BI tools (e.g., Looker, Tableau, Power BI). Ensure data quality, governance, and security standards are followed.
Required Skills and Qualifications: 8–9 years of experience in advanced SQL development and performance tuning. Minimum 2–3 years of strong hands-on experience with Google BigQuery in a production environment. Solid understanding of data warehousing concepts and best practices. Experience with cloud platforms (GCP preferred; AWS/Azure is a plus). Familiarity with scripting languages such as Python or Shell. Experience with version control (Git) and CI/CD workflows for data projects. Knowledge of LookML or other BI integration with BigQuery is a plus. Experience in working in Agile/Scrum environments.
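As a hedged sketch of the query performance and cost tuning this role emphasises, the snippet below uses a BigQuery dry run to estimate scanned bytes for a query that filters on a hypothetical partition column before the query is ever scheduled; the project, dataset, and column names are assumptions.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Filtering on the partition column lets BigQuery prune partitions
# instead of scanning the full table.
sql = """
SELECT customer_id, SUM(amount) AS total_amount
FROM `my-project.analytics.events`
WHERE event_date BETWEEN '2024-01-01' AND '2024-01-31'
GROUP BY customer_id
"""

# A dry run reports the bytes the query would scan without running it,
# a cheap sanity check on cost before scheduling.
dry_cfg = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job = client.query(sql, job_config=dry_cfg)
print(f"Estimated bytes processed: {job.total_bytes_processed:,}")
```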
Posted 2 weeks ago
5.0 years
0 Lacs
India
Remote
Position Title: MLOps Engineer Experience: 5+ Years Location: Remote Employment Type: Full-Time About the Role: We are looking for an experienced MLOps Engineer to lead the design, deployment, and maintenance of scalable and production-grade machine learning infrastructure. The ideal candidate will have a strong foundation in MLOps principles, expertise in GCP (Google Cloud Platform), and a proven track record in operationalizing ML models in cloud environments. Key Responsibilities: Design, build, and maintain scalable ML infrastructure on GCP using tools such as Vertex AI, GKE, Dataflow, BigQuery, and Cloud Functions. Develop and automate ML pipelines for training, validation, deployment, and monitoring using Kubeflow Pipelines, TFX, or Vertex AI Pipelines. Collaborate closely with Data Scientists to transition models from experimentation to production. Implement robust monitoring systems for model drift, performance degradation, and data quality issues. Manage containerized ML workloads using Docker and Kubernetes (GKE). Set up and manage CI/CD workflows for ML systems using Cloud Build, Jenkins, Bitbucket, or similar tools. Ensure model security, versioning, governance, and compliance throughout the ML lifecycle. Create and maintain documentation, reusable templates, and artifacts for reproducibility and audit readiness. Required Skills & Experience: Minimum 5 years of experience in MLOps, ML Engineering, or related roles. Strong programming skills in Python with experience in ML frameworks and libraries. Hands-on experience with GCP services including Vertex AI, BigQuery, GKE, and Dataflow. Solid understanding of machine learning concepts and algorithms such as XGBoost and classification models. Experience with container orchestration using Docker and Kubernetes. Proficiency in implementing CI/CD practices for ML workflows. Strong analytical, problem-solving, and communication skills.
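A minimal, assumption-laden sketch of a Vertex AI pipeline of the kind this role would own: two toy KFP components stand in for real training and evaluation steps, and the project, region, and bucket values are placeholders, not a reference implementation.

```python
from kfp import dsl, compiler
from google.cloud import aiplatform

@dsl.component(base_image="python:3.10")
def train_model(learning_rate: float) -> str:
    # Real training code would go here; return a model artifact URI.
    return f"gs://example-bucket/models/model-lr-{learning_rate}"

@dsl.component(base_image="python:3.10")
def evaluate_model(model_uri: str) -> float:
    # Placeholder evaluation returning a dummy metric.
    return 0.9

@dsl.pipeline(name="toy-training-pipeline")
def pipeline(learning_rate: float = 0.01):
    train_task = train_model(learning_rate=learning_rate)
    evaluate_model(model_uri=train_task.output)

# Compile the pipeline spec and submit it to Vertex AI Pipelines.
compiler.Compiler().compile(pipeline, "pipeline.json")

aiplatform.init(project="my-project", location="us-central1")
job = aiplatform.PipelineJob(
    display_name="toy-training-pipeline",
    template_path="pipeline.json",
    pipeline_root="gs://example-bucket/pipeline-root",
)
job.submit()
```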
Posted 2 weeks ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
What Your Impact Will Be: Lead the development of scalable, secure, and high-performing data integration pipelines for structured and semi-structured data using Google BigQuery. Design and develop scalable data integration pipelines to ingest structured and semi-structured data from enterprise systems (e.g., ERP, CRM, E-commerce, Order Management) into a centralized cloud data warehouse using Google BigQuery. Build analytics-ready pipelines that transform raw data into trusted, curated datasets for reporting, dashboards, and advanced analytics. Implement transformation logic using DBT to create modular, maintainable, and reusable data models that evolve with business needs. Apply BigQuery best practices, including partitioning, clustering, and query optimization, to ensure high performance and scalability. Automate and monitor complex data workflows using Airflow/Cloud Composer, ensuring dependable pipeline orchestration and job execution. Develop efficient, reusable Python and SQL code for data ingestion, transformation, validation, and performance tuning across the pipeline lifecycle. Establish robust data quality checks and testing strategies to validate both technical accuracy and alignment with business logic. Partner with architects and technical leads to establish best practices, scalable frameworks, and reference implementations across projects. Collaborate with cross-functional teams, including data analysts, BI developers, and product owners, to understand integration needs and deliver impactful, business-aligned data solutions. Leverage modern ETL platforms such as Ascend.io, Databricks, Dataflow, or Fivetran to accelerate development and improve observability and orchestration. Contribute to technical documentation, CI/CD workflows, and monitoring processes to drive transparency, reliability, and continuous improvement across the data engineering ecosystem. Mentor junior engineers, conduct peer code reviews, and lead technical initiatives.
What We're Looking For: Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or related technical field. Minimum 4+ years of hands-on experience in data engineering with strong expertise in data warehousing, pipeline development, and analytics on cloud platforms. Hands-on experience in: Google BigQuery for large-scale data warehousing and analytics. Python for data processing, orchestration, and scripting. SQL for data wrangling, transformation, and query optimization. DBT for developing modular and maintainable data transformation layers. Airflow / Cloud Composer for workflow orchestration and scheduling. Proven experience building enterprise-grade ETL/ELT pipelines and scalable data architectures. Strong understanding of data quality frameworks, validation techniques, and governance processes. Proficiency in Agile methodologies (Scrum/Kanban) and managing IT backlogs in a collaborative, iterative environment. Preferred experience with: Tools like Ascend.io, Databricks, Fivetran, or Dataflow. Data cataloging/governance tools (e.g., Collibra). CI/CD tools, Git workflows, and infrastructure automation. Real-time/event-driven data processing using Pub/Sub, Kafka, or similar platforms. Strategic problem-solving skills and ability to architect innovative solutions. Ability to adapt quickly to new technologies and lead adoption across teams. Excellent communication skills and ability to influence cross-functional teams. Good experience on Agile methodologies like Scrum, Kanban, and managing IT backlog. Be a go-to expert for data technologies and solutions. (ref:hirist.tech)
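As one possible shape (not the employer's actual design) for the Airflow/Cloud Composer orchestration described above, the DAG below loads a raw GCS extract into a staging table and then builds a curated table in BigQuery; all bucket, project, dataset, and table names are illustrative.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="erp_orders_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load the raw extract landed in GCS into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_orders",
        bucket="example-landing-bucket",
        source_objects=["erp/orders/{{ ds }}/*.csv"],
        destination_project_dataset_table="my-project.staging.orders",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform staging data into a curated, analytics-ready table.
    build_curated = BigQueryInsertJobOperator(
        task_id="build_curated_orders",
        configuration={
            "query": {
                "query": (
                    "SELECT order_id, customer_id, DATE(order_ts) AS order_date, amount "
                    "FROM `my-project.staging.orders`"
                ),
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "curated",
                    "tableId": "orders",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )

    load_raw >> build_curated
```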
Posted 2 weeks ago
4.0 years
0 Lacs
Greater Hyderabad Area
Remote
Job Description: Data Engineer (Bangalore, India - Hybrid/Remote Considered)
Experience: 4+ Years
Notice Period: Immediate Joiner, up to a maximum of 15 days
Role Overview: We are looking for two Data Engineers with strong experience in SAP, GCP, and the finance/revenue domain to join our team in Bangalore, India. These roles require hands-on technical expertise in building scalable data solutions, integrating SAP data with cloud platforms, and working on finance-related data projects.
Key Responsibilities: Develop and maintain data pipelines for processing finance and revenue-related data from SAP to GCP. Work with Google Cloud services (BigQuery, Dataflow, Pub/Sub, Cloud Composer, Dataproc) to manage and optimize data processing. Extract and transform SAP data, ensuring seamless integration into cloud-based analytics platforms. Write optimized SQL, Python, and Spark scripts to support ETL/ELT workflows. Ensure data security, governance, and compliance best practices for financial datasets. Collaborate with cross-functional teams to support data analytics and reporting needs.
Required Qualifications: 4-6 years of hands-on experience in data engineering, ETL, and cloud data platforms. Strong expertise in SQL, Python, and Spark. Experience working with SAP Finance or Revenue modules and integrating SAP data with cloud platforms. Proficiency in Google Cloud Platform (GCP) services like BigQuery, Dataflow, and Pub/Sub. Good understanding of data modeling, ETL/ELT processes, and large-scale data processing. Strong problem-solving skills and ability to work in a fast-paced environment.
Preferred Qualifications: Experience with SAP BODS, SAP HANA, or SAP Data Services for data extraction. Knowledge of data governance, compliance, and cloud cost optimization. Exposure to Terraform, Git, and DevOps practices. Familiarity with AI/ML applications in financial data analytics.
Why Join Us? Be part of a fast-growing AI-first data solutions company. Work on global projects with top-tier financial and technology enterprises. Competitive salary and career growth opportunities in AI and cloud data solutions. Hybrid work environment (Bangalore-based, remote flexibility for exceptional candidates).
Important Note: If performance is an issue, the candidate must be ready to relocate to Bengaluru, KA (Work from Office) / Hybrid Mode; this is not a fully remote opportunity, but a hybrid mix of remote work and office. Notice Period: Immediate to early joiners, 15-20 days maximum. (ref:hirist.tech)
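Purely as an illustration of the SAP-to-GCP pipeline work described here, the Apache Beam sketch below reads a newline-delimited JSON export from Cloud Storage and appends it to a BigQuery table via the Dataflow runner; the bucket, table, and SAP field names are assumptions, not a reference implementation.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Placeholder project, bucket, and table names; the SAP extract is assumed
# to be newline-delimited JSON already exported to Cloud Storage.
options = PipelineOptions(
    runner="DataflowRunner",
    project="my-project",
    region="us-central1",
    temp_location="gs://example-temp-bucket/tmp",
)

def to_revenue_row(line: str) -> dict:
    """Map one raw SAP record to the BigQuery row shape."""
    rec = json.loads(line)
    return {
        "document_id": rec["belnr"],
        "company_code": rec["bukrs"],
        "amount": float(rec["wrbtr"]),
        "posting_date": rec["budat"],
    }

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadExtract" >> beam.io.ReadFromText("gs://example-landing/sap/revenue/*.json")
        | "ParseAndMap" >> beam.Map(to_revenue_row)
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "my-project:finance.revenue_documents",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```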
Posted 2 weeks ago
5.0 years
0 Lacs
Greater Kolkata Area
On-site
Job Description: We are seeking a highly skilled and experienced GCP Cloud Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering with a focus on Google Cloud Platform (GCP) services. You will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure on GCP, ensuring data is accessible, reliable, and available for business use.
Key Responsibilities: Data Pipeline Development: Design, develop, and maintain data pipelines using GCP services such as Dataflow, Dataproc, BigQuery, and Cloud Storage. Data Integration: Work on integrating data from various sources (structured, semi-structured, and unstructured) into GCP environments. Data Modeling: Develop and maintain efficient data models in BigQuery to support analytics and reporting needs. Data Warehousing: Implement data warehousing solutions on GCP, optimizing performance and scalability. ETL/ELT Processes: Build and manage ETL/ELT processes using tools like Apache Airflow, Data Fusion, and Python. Data Quality & Governance: Implement data quality checks, data lineage, and data governance best practices to ensure high data integrity. Automation: Automate data pipelines and workflows to reduce manual effort and improve efficiency. Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver data solutions that meet business needs. Optimization: Continuously monitor and optimize the performance of data pipelines and queries for cost and efficiency. Security: Ensure data security and compliance with industry standards and best practices.
Required Skills & Qualifications: Education: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field. Experience: 5+ years of experience in data engineering, with at least 2+ years working with GCP. Technical Skills: Proficiency in GCP services: BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, and Cloud Functions. Strong programming skills in Python, SQL, PySpark, and familiarity with Java/Scala. Experience with orchestration tools like Apache Airflow. Knowledge of ETL/ELT processes and tools. Experience with data modeling and designing data warehouses in BigQuery. Familiarity with CI/CD pipelines and version control systems like Git. Understanding of data governance, security, and compliance. Soft Skills: Excellent problem-solving and analytical skills. Strong communication and collaboration abilities. Ability to work in a fast-paced environment and manage multiple priorities.
Preferred Qualifications (good to have): Certifications: GCP Professional Data Engineer or GCP Professional Cloud Architect certification. Domain Knowledge: Experience in the finance, e-commerce, or healthcare domain is a plus. (ref:hirist.tech)
Posted 2 weeks ago
2.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.
What You’ll Do: As a Senior Developer at Equifax, you will oversee and steer the delivery of innovative batch and data products, primarily leveraging Java and the Google Cloud Platform (GCP). Your expertise will ensure the efficient and timely deployment of high-quality data solutions that support our business objectives and client needs. Project Development: Development and deployment of batch and data products, ensuring alignment with business goals and technical requirements. Technical Oversight: Provide technical direction and oversee the implementation of solutions using Java and GCP, ensuring best practices in coding, performance, and security. Team Management: Mentor and guide junior developers and engineers, fostering a collaborative and high-performance environment. Stakeholder Collaboration: Work closely with cross-functional teams including product managers, business analysts, and other stakeholders to gather requirements and translate them into technical solutions. Documentation and Compliance: Maintain comprehensive documentation for all technical processes, ensuring compliance with internal and external standards. Continuous Improvement: Advocate for and implement continuous improvement practices within the team, staying abreast of emerging technologies and methodologies.
What Experience You Need: Education: Bachelor's degree in Computer Science, Information Technology, or a related field. Experience: Minimum of 2-5 years of relevant experience in software development, with a focus on batch processing and data solutions. Technical Skills: Proficiency in Java, with a strong understanding of its ecosystems. 2+ years of relevant Java, Spring, Spring Boot, REST, Microservices, Hibernate, JPA, RDBMS. Minimum 2 years with Git, CI/CD pipelines, Jenkins. Experience with Google Cloud Platform, including BigQuery, Cloud Storage, Dataflow, Bigtable and other GCP tools. Familiarity with ETL processes, data modeling, and SQL. Soft Skills: Strong problem-solving abilities and a proactive approach to project management. Effective communication and interpersonal skills, with the ability to convey technical concepts to non-technical stakeholders.
What Could Set You Apart: Certification in Google Cloud (e.g., Associate Cloud Engineer). Experience with other cloud platforms (AWS, Azure) is a plus. Understanding of data privacy regulations and compliance frameworks.
We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference! Who is Equifax? At Equifax, we believe knowledge drives progress.
As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Posted 2 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.
What You’ll Do: Perform general application development activities, including unit testing, code deployment to the development environment, and technical documentation. Work on one or more projects, making contributions to unfamiliar code written by team members. Diagnose and resolve performance issues. Participate in the estimation process, use case specifications, reviews of test plans and test cases, requirements, and project planning. Document code/processes so that any other developer is able to dive in with minimal effort. Develop and operate high-scale applications from the backend to the UI layer, focusing on operational excellence, security and scalability. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.). Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit engineering team employing agile software development practices. Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network or service operations and quality. Write, debug, and troubleshoot code in mainstream open source technologies. Lead the effort for Sprint deliverables, and solve problems of medium complexity. Research, create, and develop software applications to extend and improve on Equifax solutions. Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activities.
What Experience You Need: Bachelor's degree or equivalent experience. 5+ years of working experience in software development using the most recent version of Python. 3+ years of experience with software build management tools like Maven or Gradle. 3+ years of experience with CI/CD Jenkins pipeline development and backend coding. 3+ years of experience with software testing, performance, and quality engineering techniques and strategies. 3+ years of experience with Cloud technology: GCP, AWS, or Azure is preferable. Experience and familiarity with the various Python frameworks currently in use to leverage software development processes.
What Could Set You Apart: Self-starter that identifies/responds to priority shifts with minimal supervision. Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others. Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven & Gradle. Agile environments (e.g. Scrum, XP). Relational databases (e.g. SQL Server, MySQL). Atlassian tooling (e.g.
JIRA, Confluence, and GitHub). Developing with modern Python versions. We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference! Who is Equifax? At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Posted 2 weeks ago
8.0 - 12.0 years
12 - 18 Lacs
Noida, Pune, Bengaluru
Work from Office
Lead the technical discovery process, assess customer requirements, and design scalable solutions leveraging a comprehensive suite of Data & AI services, including BigQuery, Dataflow, Vertex AI, Generative AI solutions, and advanced AI/ML services like Vertex AI, Gemini, and Agent Builder. Architect and demonstrate solutions leveraging generative AI, large language models (LLMs), AI agents, and agentic AI patterns to automate workflows, enhance decision-making, and create intelligent applications. Develop and deliver compelling product demonstrations, proofs-of-concept (POCs), and technical workshops that showcase the value and capabilities of Google Cloud. Strong understanding of data warehousing, data lakes, streaming analytics, and machine learning pipelines. Collaborate with sales to build strong client relationships, articulate the business value of Google Cloud solutions, and drive adoption. Lead and contribute technical content and architectural designs for RFI/RFP responses and technical proposals leveraging Google Cloud Services. Stay informed of industry trends, competitive offerings, and new Google Cloud product releases, particularly in the infrastructure and data/AI domains. Extensive experience in architecting & designing solutions on Google Cloud Platform, with a strong focus on: Data & AI services such as BigQuery, Dataflow, Dataproc, Pub/Sub, Vertex AI (ML Ops, custom models, pre-trained APIs), Generative AI (e.g., Gemini). Strong understanding of cloud architecture patterns, DevOps practices, and modern software development methodologies. Ability to work effectively in a cross-functional team environment with sales, product, and engineering teams. 5+ years of experience in pre-sales or solutions architecture, focused on cloud Data & AI platforms. Skilled in client engagements, technical presentations, and proposal development. Excellent written and verbal communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences. Location - Noida, Pune, Bengaluru, Hyderabad, Chennai
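For a pre-sales demo of the Generative AI capability this role references, a call to a Gemini model through the Vertex AI SDK might look like the hedged sketch below; the project, region, and exact model id are placeholders that depend on what is available in the environment.

```python
import vertexai
from vertexai.generative_models import GenerativeModel

# Illustrative project, region, and model id only.
vertexai.init(project="my-project", location="us-central1")

model = GenerativeModel("gemini-1.5-pro")

# A typical demo prompt: summarise an architecture trade-off for a stakeholder.
response = model.generate_content(
    "Summarise the trade-offs between BigQuery and Dataproc for a daily "
    "10 TB batch analytics workload in three bullet points."
)
print(response.text)
```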
Posted 2 weeks ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Join us in bringing joy to customer experience. Five9 is a leading provider of cloud contact center software, bringing the power of cloud innovation to customers worldwide. Living our values every day results in our team-first culture and enables us to innovate, grow, and thrive while enjoying the journey together. We celebrate diversity and foster an inclusive environment, empowering our employees to be their authentic selves. Team Five9 is a leading provider of cloud software for the enterprise contact center market, powering more than three billion customer interactions annually. Since 2001, Five9 has pioneered the cloud revolution in contact centers, helping businesses transition from legacy systems to the cloud. Our cloud-based solutions are reliable, secure, scalable, and designed to enhance customer experiences, improve agent productivity, and deliver measurable business results. This position is based out of one of the offices of our affiliate Acqueon Technologies in India, and will adopt the hybrid work arrangements of that location. You will be a member of the Acqueon team with responsibilities supporting Five9 products, collaborating with global teammates based primarily in the United States. About The Role We are seeking an experienced Site Leader to drive strategy, execution, and leadership for Five9 Voice Services. This role will oversee high-performing engineering teams, ensuring the development of scalable, reliable microservices while fostering a culture of innovation and operational excellence. Key Responsibilities Lead and grow high-performing teams, promoting a culture of collaboration, accountability, and innovation. Oversee the execution of product engineering, ensuring successful delivery of the Five9 Voice Services roadmap. Design, implement, and maintain REST APIs to support workflow automation and seamless integration with internal tools and external systems. Develop and automate dashboards that provide key performance insights to both internal teams and customers, ensuring actionable metrics for performance tracking. Integrate robust QA practices into the development lifecycle, including automated testing frameworks and continuous integration for high code quality and reliability. Define, develop, and maintain end-to-end journey tests to ensure consistent and seamless user experiences across voice workflows and services. Analyze large-scale service data to uncover trends, detect data deviations for alerts, and generate insights to improve service reliability. Define and drive technical requirements, ensuring alignment with business goals and customer needs. Lead technical and business discussions, ensuring the right decisions are made to solve complex challenges. Required Skills 8+ years of demonstrated experience in site leadership, driving local engineering culture, team development, and cross-functional collaboration. 5+ years of experience developing event streaming applications that correlate, decorate, and calculate streaming events in real time. 5+ years of experience with modern enterprise frameworks such as SpringBoot microservices, DevOps environments, Docker, and Kubernetes. 5+ years of experience designing, implementing, and consuming REST APIs, with a focus on enabling automation and integration. Hands-on experience integrating QA into the development lifecycle, including automated testing, CI/CD pipelines, and regression testing.
Experience designing and maintaining journey tests that validate end-to-end user workflows across complex systems and services. Strong knowledge of cloud technologies such as Google Cloud Platform (GCP), Pub/Sub, Dataflow, BigQuery, and Cloud Functions. Diversity & Inclusion: Five9 embraces diversity and is committed to building a team that represents a variety of backgrounds, perspectives, and skills. The more inclusive we are, the better we are. Five9 is an equal opportunity employer. Location: Our headquarters are located in the beautiful Bishop Ranch Business Park in San Ramon, CA. View our privacy policy, including our privacy notice to California residents here: https://www.five9.com/pt-pt/legal. Note: Five9 will never request that an applicant send money as a prerequisite for commencing employment with Five9.
Posted 2 weeks ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are seeking a highly skilled and experienced Google Cloud Platform (GCP) Engineer with AI/ML expertise to join our technology team. The ideal candidate will have a strong background in cloud engineering, machine learning deployment, and automation, with at least 4 years of experience working on end-to-end cloud-based solutions. This role involves building scalable infrastructure, deploying machine learning models into production, and working closely with data scientists, DevOps, and software engineers to drive intelligent, data-driven solutions.
Key Responsibilities: Cloud Infrastructure (GCP): Design, implement, and manage scalable, secure, and high-performance infrastructure on Google Cloud Platform. Build and optimize CI/CD pipelines for ML model training and deployment. Develop and maintain GCP services such as GKE, BigQuery, Cloud Functions, Vertex AI, Dataflow, Pub/Sub, Cloud Storage, IAM, and Cloud Composer. Automate infrastructure provisioning using Terraform, Deployment Manager, or similar IaC tools. Monitor system performance, cost optimization, and ensure high availability and disaster recovery.
AI/ML Engineering: Collaborate with Data Science teams to operationalize ML models and deploy them into production using Vertex AI or Kubeflow Pipelines. Implement and manage ML workflows, including data ingestion, training, tuning, and serving. Support MLOps practices including versioning, testing, and monitoring of ML models. Ensure compliance with model governance, security, and ethical AI practices.
Collaboration & Support: Provide technical mentorship to junior engineers and MLOps professionals. Work with cross-functional teams including product, engineering, and data to ensure timely delivery of projects. Create documentation, knowledge bases, and SOPs for deployment and operations processes.
Required Qualifications: Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field. 4+ years of professional experience in cloud engineering or AI/ML infrastructure, with at least 2 years on GCP. Hands-on experience with GCP tools such as Vertex AI, BigQuery, GKE, Cloud Functions, and Dataflow. Proficient in Python, SQL, and scripting for automation and data processing. Strong knowledge of MLOps principles, model versioning, monitoring, and rollback strategies. Experience with containerization (Docker) and orchestration (Kubernetes/GKE). Familiarity with CI/CD tools (e.g., Jenkins, Cloud Build, GitLab CI/CD). Experience with infrastructure as code (IaC) tools like Terraform or Ansible.
Preferred Qualifications: GCP certifications such as Professional Cloud Architect, Professional Data Engineer, or Machine Learning Engineer. Experience with real-time data processing and streaming analytics (e.g., using Pub/Sub + Dataflow). Familiarity with data governance, security best practices, and GDPR/PII compliance in ML applications. Exposure to multi-cloud or hybrid cloud environments. Knowledge of ML frameworks like TensorFlow, PyTorch, Scikit-learn, or XGBoost.
Key Competencies: Strong problem-solving skills and analytical thinking. Excellent communication and collaboration abilities. Proactive mindset with a focus on innovation and continuous improvement. Adaptable in a fast-paced, evolving technological environment.
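As a non-authoritative sketch of the model operationalisation work described above, the snippet below registers a trained artifact in the Vertex AI Model Registry and deploys it to an online endpoint; every name (project, bucket, display names, machine type, serving image) is a placeholder.

```python
from google.cloud import aiplatform

# Illustrative project and region; the serving image shown is one of the
# prebuilt scikit-learn prediction containers.
aiplatform.init(project="my-project", location="us-central1")

# Register a trained model artifact from GCS in the Vertex AI Model Registry.
model = aiplatform.Model.upload(
    display_name="churn-classifier",
    artifact_uri="gs://example-bucket/models/churn/",
    serving_container_image_uri=(
        "us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-0:latest"
    ),
)

# Deploy it to a managed endpoint for online predictions.
endpoint = model.deploy(
    deployed_model_display_name="churn-classifier-v1",
    machine_type="n1-standard-4",
    min_replica_count=1,
    max_replica_count=2,
)

# Example online prediction with a made-up feature vector.
prediction = endpoint.predict(instances=[[0.1, 3, 42.0, 1]])
print(prediction.predictions)
```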
Posted 2 weeks ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Req ID: 327059 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a GCP Python Pyspark Engineer to join our team in Hyderabad, Telangana (IN-TG), India (IN). Strong hands-on experience in designing and building data pipelines using Google Cloud Platform (GCP) services like BigQuery, Dataflow, and Cloud Composer. Proficient in Python for data processing, scripting, and automation in cloud and distributed environments. Solid working knowledge of Apache Spark / PySpark, with experience in large-scale data transformation and performance tuning. Familiar with CI/CD processes, version control (Git), and workflow orchestration tools such as Airflow or Composer. Ability to work independently in fast-paced Agile environments with strong problem-solving and communication skills. Exposure to modern data architectures and real-time/streaming data solutions is an added advantage. About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here . If you'd like more information on your EEO rights under the law, please click here . For Pay Transparency information, please click here .
Posted 2 weeks ago
2.0 years
3 - 6 Lacs
Hyderābād
On-site
About the job: Our Hubs are a crucial part of how we innovate, improving performance across Sanofi departments and providing a springboard for the amazing work we do. Build a career and you can be part of transforming our business while helping to change millions of lives. Ready? As a Senior Biomarker Biostatistician within our Statistics Team at Hyderabad, you will play a crucial role in developing and implementing different statistical and machine learning algorithms to solve complex problems and support our clinical biomarker data insights generation. You will work closely with cross-functional teams to ensure data is accessible, reliable, and optimized for analysis. Your expertise in omics data, machine learning and deep learning will be essential in driving our data initiatives forward. We are an innovative global healthcare company with one purpose: to chase the miracles of science to improve people’s lives. We’re also a company where you can flourish and grow your career, with countless opportunities to explore, make connections with people, and stretch the limits of what you thought was possible. Ready to get started?
Main responsibilities: Provide high-quality input regarding TM/Biomarker aspects into the design of the clinical study (including protocol development) and the setup of the study, to make sure biomarker data are adequately captured and collected to answer the study objectives and to support the planned statistical analyses, under the guidance of the project biomarker statistical lead. Coordinate the activities of external partners and CROs for biomarker data generation, dataflow or biomarker statistical activities. Perform pre-processing and normalization of biomarker data (e.g., RNAseq, scRNAseq, Olink, flow cytometry data, etc.). Perform and/or coordinate with the programming team the production of the definitions, documentation and review of derived variables, as well as the quality control plan. Perform and/or coordinate with the study programmer the production of biomarker statistical analyses. Review and examine statistical data distributions/properties. Oversee execution of the statistical analyses according to the SAP, prepare statistical methods & provide statistical insight into interpretation and discussion of results sections for the clinical study report (CSR) and/or publications to ensure the statistical integrity of the content according to internal standards and regulatory guidelines and in compliance with SOPs. Propose, prepare and perform exploratory biomarker data analyses and ad-hoc analyses as relevant for the study, project objectives or publication.
About you
Experience: 2+ years (Master) or 1+ years (PhD) of pharmaceutical or related industry experience.
Soft and technical skills: Basic knowledge of pharmaceutical clinical development. Good knowledge and understanding of key statistical concepts and techniques, in particular high-dimensional statistics. Good knowledge in handling complex biomarker data (e.g., RNAseq, scRNAseq, Olink, flow cytometry data, etc.); knowledge in pathway-level and network analyses is a plus. Able to work in the departmental computing environment and do advanced statistical analyses using R & R-Shiny and possibly other languages (Python, C++, …). Demonstrated interpersonal and communication skills and able to work in cross-functional and global team settings.
Education: Master’s or Ph.D. in Data Science, Bioinformatics, Statistics, or a related field.
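As a toy, hedged example of the biomarker pre-processing mentioned above (only one of many valid approaches, and not this team's actual workflow), the snippet below applies counts-per-million normalisation and a log2 transform to a small made-up RNA-seq count matrix in Python; real analyses would typically use dedicated omics tooling.

```python
import numpy as np
import pandas as pd

# Toy bulk RNA-seq count matrix (genes x samples); values are invented.
counts = pd.DataFrame(
    {"sample_1": [120, 0, 530], "sample_2": [98, 4, 610]},
    index=["GENE_A", "GENE_B", "GENE_C"],
)

# Counts-per-million normalisation followed by a log2(x + 1) transform,
# a common first pre-processing step before downstream statistics.
library_sizes = counts.sum(axis=0)
cpm = counts.div(library_sizes, axis=1) * 1_000_000
log_cpm = np.log2(cpm + 1)
print(log_cpm.round(2))
```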
Posted 2 weeks ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description: The Global Data Insight & Analytics organization is looking for a top-notch Software Engineer who also has Machine Learning knowledge and experience to add to our team to drive the next generation of Cloud platform full-stack developers. In this role you will work in a small, cross-functional team. The position will collaborate directly and continuously with other engineers, business partners, product managers and designers from distributed locations, and will release early and often. The team you will be working on is focused on building a Cloud platform to democratize Machine Learning. We strongly believe that data has the power to help create great products and experiences which delight our customers. We believe that actionable and persistent insights, based on a high-quality data platform, help business and engineering make more impactful decisions. Our ambitions reach well beyond existing solutions, and we are in search of innovative individuals to join this Agile team. This is an exciting, fast-paced role which requires outstanding technical and organizational skills combined with critical thinking, problem-solving and agile management tools to support team success.
Responsibilities: 5+ years of experience in data engineering or software engineering, with at least 2 years focused on cloud data platforms (GCP preferred). Technical Skills: Proficient in Java, SpringBoot & Angular/React, with experience in designing and deploying cloud-based data pipelines and microservices using GCP tools like BigQuery, Dataflow, and Dataproc. Service-Oriented Architecture and Microservices: Strong understanding of SOA, microservices, and their application within a cloud data platform context. Full-Stack Development: Knowledge of front-end and back-end technologies, enabling collaboration on data access and visualization layers (e.g., Angular, React, Node.js). Database Management: Experience with relational (e.g., PostgreSQL, MySQL) and NoSQL databases, as well as columnar databases like BigQuery. Data Governance and Security: Understanding of data governance frameworks and implementing RBAC, encryption, and data masking in cloud environments. CI/CD and Automation: Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools like Terraform, and automation frameworks. Problem-Solving: Strong analytical skills with the ability to troubleshoot complex data platform and microservices issues.
Qualifications: Design and Build Data Pipelines: Architect, develop, and maintain scalable data pipelines and microservices that support real-time and batch processing on GCP. Service-Oriented Architecture (SOA) and Microservices: Design and implement SOA and microservices-based architectures to ensure modular, flexible, and maintainable data solutions. Full-Stack Integration: Leverage your full-stack expertise to contribute to the seamless integration of front-end and back-end components, ensuring robust data access and UI-driven data exploration. Data Ingestion and Integration: Lead the ingestion and integration of data from various sources into the data platform, ensuring data is standardized and optimized for analytics. GCP Data Solutions: Utilize GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.) to build and manage data platforms that meet business needs. Data Governance and Security: Implement and manage data governance, access controls, and security best practices while leveraging GCP’s native row- and column-level security features.
Performance Optimization: Continuously monitor and improve the performance, scalability, and efficiency of data pipelines and storage solutions. Collaboration and Best Practices: Work closely with data architects, software engineers, and cross-functional teams to define best practices, design patterns, and frameworks for cloud data engineering. Automation and Reliability: Automate data platform processes to enhance reliability, reduce manual intervention, and improve operational efficiency.
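As a rough sketch (not the team's actual design) of the real-time ingestion this posting describes, the snippet below pulls messages from a Pub/Sub subscription and streams them into BigQuery; the project, subscription, and table ids are placeholders, and each message body is assumed to be a JSON event.

```python
import json
from concurrent.futures import TimeoutError

from google.cloud import bigquery, pubsub_v1

# Placeholder identifiers for illustration only.
PROJECT = "my-project"
SUBSCRIPTION = "orders-sub"
TABLE = "my-project.analytics.orders_raw"

bq_client = bigquery.Client(project=PROJECT)
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(PROJECT, SUBSCRIPTION)

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # Decode the JSON event and stream it into BigQuery.
    row = json.loads(message.data.decode("utf-8"))
    errors = bq_client.insert_rows_json(TABLE, [row])
    if not errors:
        message.ack()
    else:
        message.nack()  # let Pub/Sub redeliver on failure

streaming_pull = subscriber.subscribe(sub_path, callback=callback)
print(f"Listening on {sub_path}...")
try:
    streaming_pull.result(timeout=60)  # run briefly in this sketch
except TimeoutError:
    streaming_pull.cancel()
```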
Posted 2 weeks ago
12.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We are hiring for a Power BI Developer. Should have basic knowledge of data warehouse concepts. Develop and enhance Power BI reports and dashboards. Experienced in data modelling in Power BI, including M Query and DAX. Experienced in Power BI features like RLS, incremental data load, dataflow, etc. Should have good exposure working with a diverse set of visuals and various data sources like SQL Server, BigQuery, etc. Proficient in T-SQL.
Job description: Design, analyze, develop, test and debug Power BI reports and dashboards to satisfy business requirements. Ability to translate business requirements into technical solutions. Collaborate with other teams within the organization and be able to devise the technical solution as it relates to the business and technical requirements. Experienced in client communication. Excellent communication skills. Should be a team player. Maintain documentation for all processes implemented. Adhere to and suggest improvements to coding standards and best practices, and contribute to the improvement of these best practices.
Experience: 5-12 years
Location: Hyderabad / Bangalore
Primary Skill – Power BI, SQL, DAX
Working Days - Hybrid
Joining time - Immediate to 30 days
If the above criteria match your profile, please share your profile to Ritusmita.MatagajSingh@ltimindtree.com and swathi.gangu@ltimindtree.com with the below details: Relevant experience in Power BI: Relevant experience in SQL: Current CTC: Expected CTC: Current Location: Preferred Location: Offer in hand, if any: PAN Card No: Notice period / how soon you can join:
Regards, Swathi, LTIM
Posted 2 weeks ago
30.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About Client
Our client is a market-leading company with over 30 years of experience in the industry. One of the world's leading professional services firms, with $19.7B in revenue and 333,640 associates worldwide, it helps clients modernize technology, reimagine processes, and transform experiences, enabling them to remain competitive in our fast-paced world. Their specialties include Intelligent Process Automation, Digital Engineering, Industry & Platform Solutions, Internet of Things, Artificial Intelligence, Cloud, Data, Healthcare, Banking, Finance, Fintech, Manufacturing, Retail, Technology, and Salesforce.
Job Title: GCP Data Engineer
Location: Gurgaon
Experience: 7-9 Years
Job Type: Contract to Hire
Notice Period: Immediate Joiners
Mandatory Skills: GCP; Big Data; ETL - Big Data / Data Warehousing; BigQuery, Dataproc, Dataflow, Composer
Job description: Looking for a GCP Developer with the mandatory skills and requirements below.
Mandatory Skills: BigQuery, Cloud Storage, Cloud Pub/Sub, Dataflow, Dataproc, Composer (see the Composer/Airflow sketch after this posting)
6+ years in cloud infrastructure and designing data pipelines, specifically in GCP
Proficiency in programming languages: Python, SQL
Proven experience in designing and implementing cloud-native applications and microservices on GCP
Hands-on experience with CI/CD tools like Jenkins and GitHub Actions
In-depth understanding of GCP networking, IAM policies, and security best practices.
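For the Composer requirement above, orchestration is usually written as an Airflow DAG in Python. The sketch below is a minimal illustration under assumed resource names (bucket, project, dataset, and table are hypothetical), not a definitive implementation.

```python
# Illustrative Cloud Composer (Airflow) DAG: load a daily CSV drop from GCS into a
# BigQuery staging table, then run a SQL transform into a curated table.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_load",              # hypothetical pipeline name
    schedule_interval="0 2 * * *",          # run at 02:00 every day
    start_date=datetime(2024, 1, 1),
    catchup=False,
    tags=["gcp", "bigquery"],
) as dag:

    # Stage the raw file from Cloud Storage into a BigQuery landing table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_csv",
        bucket="example-landing-bucket",                         # hypothetical bucket
        source_objects=["sales/{{ ds }}/sales.csv"],
        destination_project_dataset_table="example_project.staging.sales_raw",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Rebuild the curated table from the landing table.
    transform = BigQueryInsertJobOperator(
        task_id="transform_to_curated",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE `example_project.curated.sales_daily` AS "
                    "SELECT order_id, customer_id, CAST(amount AS NUMERIC) AS amount, "
                    "DATE('{{ ds }}') AS load_date "
                    "FROM `example_project.staging.sales_raw`"
                ),
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```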
Posted 2 weeks ago
5.0 - 7.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
Your Role And Responsibilities
Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer, etc. (a small BigQuery client sketch follows this posting)
Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations
Ability to analyse data for functional business requirements and interface directly with the customer
Preferred Education
Master's Degree
Required Technical And Professional Expertise
5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer
You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies
Ambitious individual who can work under their own direction towards agreed targets/goals and with a creative approach to work
Preferred Technical And Professional Experience
Intuitive individual with an ability to manage change and proven time management
Proven interpersonal skills while contributing to team effort by accomplishing related results as needed
Up-to-date technical knowledge, maintained by attending educational workshops and reviewing publications with a focus on required skills
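Since the role combines Python and SQL work against BigQuery, a minimal sketch of the google-cloud-bigquery client is shown below. The project, dataset, table, and column names are hypothetical placeholders, not a definitive implementation.

```python
# Illustrative use of the google-cloud-bigquery client: run a parameterized query
# and surface failures so they show up in Error Reporting / Log Explorer.
from datetime import date

from google.cloud import bigquery
from google.api_core.exceptions import GoogleAPIError


def daily_order_counts(run_date: date, project: str = "example-project") -> list[dict]:
    client = bigquery.Client(project=project)

    sql = """
        SELECT status, COUNT(*) AS order_count
        FROM `example-project.sales.orders`      -- hypothetical table
        WHERE DATE(created_at) = @run_date
        GROUP BY status
        ORDER BY order_count DESC
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("run_date", "DATE", run_date),
        ]
    )

    try:
        rows = client.query(sql, job_config=job_config).result()
    except GoogleAPIError as exc:
        # In Composer or Cloud Run this would typically be logged and re-raised.
        raise RuntimeError(f"BigQuery query failed: {exc}") from exc

    return [{"status": row["status"], "order_count": row["order_count"]} for row in rows]


if __name__ == "__main__":
    for record in daily_order_counts(date(2024, 1, 15)):
        print(record)
```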
Posted 3 weeks ago
5.0 - 9.0 years
20 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
- Design, develop & maintain data pipelines using GCP services: Dataflow, BigQuery, and Pub/Sub
- Provision infrastructure on GCP using IaC with Terraform
- Implement & manage data warehouse solutions
- Monitor and resolve issues in data workflows
Required Candidate profile
- Expertise in GCP, Apache Beam, Dataflow, & BigQuery
- Proficient in Python, SQL, PySpark (see the PySpark sketch after this posting)
- Worked with Cloud Composer for orchestration
- Solid understanding of DWH, ETL pipelines, and real-time data streaming
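The PySpark requirement above typically shows up as batch jobs submitted to Dataproc. The following sketch is illustrative only — the bucket, dataset, and column names are hypothetical, and it assumes the spark-bigquery connector is available on the cluster.

```python
# Illustrative PySpark batch job: clean a raw CSV extract in GCS and write it to
# BigQuery via the spark-bigquery connector. All resource names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def main() -> None:
    spark = SparkSession.builder.appName("orders-daily-batch").getOrCreate()

    # Read the raw drop from Cloud Storage.
    raw = (
        spark.read.option("header", True)
        .csv("gs://example-landing-bucket/orders/2024-01-15/*.csv")
    )

    # Basic cleanup: typed columns, de-duplication, and a load date for auditing.
    cleaned = (
        raw.withColumn("amount", F.col("amount").cast("decimal(18,2)"))
        .withColumn("created_at", F.to_timestamp("created_at"))
        .dropDuplicates(["order_id"])
        .withColumn("load_date", F.current_date())
    )

    # Write to BigQuery; the connector stages data through a temporary GCS bucket.
    (
        cleaned.write.format("bigquery")
        .option("table", "example_project.curated.orders")
        .option("temporaryGcsBucket", "example-temp-bucket")
        .mode("overwrite")
        .save()
    )

    spark.stop()


if __name__ == "__main__":
    main()
```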
Posted 3 weeks ago
6.0 years
7 - 9 Lacs
Hyderābād
On-site
Job description
Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.
HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.
We are currently seeking an experienced professional to join our team in the role of Software Engineer.
In this role, you will:
Enhance and drive the overall product strategy, providing the vision and roadmap for the Data Analytics platform’s Cloud journey, and help drive future requirements with reduced operational costs.
Implement the IT strategy to support core business objectives and gain business value.
Become the ‘voice’ of the business within technology to ensure strategies are cohesive across all business streams.
Identify interdependencies between various integrated teams and release plans.
Be accountable for identifying and resolving any alignment issues within the component teams delivered through the Global IT organization.
Be part of a global team consisting of 20+ resources across development and support.
Create and execute plans to support training and adequate levels of resourcing to meet global demand.
Be accountable for ensuring the products and services are delivered adhering to the approved architecture and solutions to meet customer needs.
Drive and support technical design and change for new and existing data sources, and manage support for delivering state-of-the-art intelligence infrastructure.
Evolve the DevOps model, ensuring continued improvement of the technology lifecycle and alignment with stakeholder plans.
Adhere to compliance with external regulatory requirements, internal control standards and group compliance policy.
Maintain HSBC internal control standards, including timely implementation of internal and external audit points.
Take accountability for working closely with the business and building a trusted relationship to ensure delivery of the benefits outlined by the respective strategy.
Requirements
To be successful in this role, you should meet the following requirements:
Retail banking environment, with a good understanding of the customer lifecycle across core products
6+ years of industry experience, with solid exposure to managing/supporting product-based teams providing global services.
Developing and maintaining ReactJS-based web applications: This includes creating new features, enhancing existing ones, and ensuring the overall functionality and user experience of the application.
Writing clean, efficient, and reusable React components: This involves using best practices for component design, ensuring code readability, and creating components that can be used across multiple applications.
Implementing state management: This involves using tools like Redux or Context API to manage the application's data and state efficiently.
Ensuring cross-browser compatibility and mobile responsiveness: This means ensuring that the application looks and functions correctly across different browsers and devices.
Optimizing application performance: This includes identifying and fixing performance bottlenecks, improving loading times, and ensuring a smooth user experience.
Working closely with backend developers to integrate APIs: This involves collaborating with backend developers to define API endpoints, consume them in the frontend, and ensure seamless data flow.
Following best practices in UI/UX design and front-end architecture: This involves understanding UI/UX principles, designing user-friendly interfaces, and structuring the codebase in a maintainable and scalable way.
Staying updated with the latest ReactJS trends and features: This means continuously learning about new features and best practices in the ReactJS ecosystem.
Performing unit testing and debugging for high-quality applications: This involves writing unit tests to ensure the quality of the code, debugging issues, and fixing bugs.
Maintaining code quality, organization, and documentation: This involves writing clear and concise code, organizing the codebase, and documenting the code for future reference.
Skills:
In-depth knowledge of JavaScript and ReactJS: This includes understanding core JavaScript concepts, React components, JSX, and state management.
Familiarity with other front-end technologies: This can include HTML, CSS, Bootstrap, and potentially other frameworks like Angular or VueJS.
Experience with state management libraries: This can include Redux, Context API, or MobX.
Understanding of front-end performance optimization techniques: This can include lazy loading, code splitting, and image optimization.
Experience with version control systems (e.g., Git): This is essential for collaborating with other developers and managing the codebase.
Good communication and collaboration skills: This is crucial for working with other developers, designers, and stakeholders.
Problem-solving skills and ability to debug: This is essential for identifying and fixing issues in the codebase.
Understanding of UI/UX design principles: This helps in creating user-friendly and intuitive interfaces.
Ability to write clean, well-documented code: This makes the code easier to maintain and understand.
Experience with front-end build tools (e.g., Webpack, Babel): These tools are used to automate tasks like bundling, transpiling, and minifying code.
Strong proven experience in data migration projects over Cloud technologies like GCP/AWS, with hands-on experience of Docker/Kubernetes.
Strong proven skills in Dataflow, Airflow, BigQuery, and the Big Data ecosystem, including Hadoop and Cloud technologies.
Strong knowledge of Data Warehousing, ETL, Analytics and Business Intelligence Reporting.
Experience working in DevOps and Agile environments, with strong knowledge of and experience with support tools like Jenkins, Git, Nexus, Splunk, AppDynamics, etc.
You’ll achieve more when you join HSBC.
www.hsbc.com/careers
HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.
Issued by – HSBC Software Development India
Posted 3 weeks ago
1.0 years
0 Lacs
Cochin
On-site
We are seeking a proactive and results-driven Healthcare Recruiter to join our HR team in Kochi. The recruiter will be responsible for sourcing, screening, and hiring qualified healthcare professionals.
Source healthcare professionals through job portals, social media, and campus drives
Conduct telephonic and face-to-face interviews
Coordinate with licensing teams for DataFlow, Prometric, and embassy processing
Maintain candidate databases and follow up regularly
Coordinate offer letters, onboarding, and deployment
Collaborate with manpower agencies and vendors for bulk hiring needs
Requirements
Bachelor's degree in HR / Healthcare Management
Minimum 1 year experience in recruitment (preferably healthcare)
Familiarity with GCC licensing systems (DataFlow, Prometric) is a plus
Strong communication and interpersonal skills
Ability to multitask and meet recruitment deadlines
Excellent communication skills in English are mandatory
Job Type: Full-time
Schedule: Day shift
Ability to commute/relocate: Kochi, Kerala: Reliably commute or willing to relocate with an employer-provided relocation package (Preferred)
Education: Bachelor's (Preferred)
Experience: HR: 1 year (Required)
Work Location: In person
Posted 3 weeks ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Senior Software Developer
Location: Chennai, India.
About The Job:
Developers at Vendasta work in teams, working with Product Managers and Designers in the creation of new features and products. Our Research and Development department works hard to help developers learn, grow, and experiment while at work. With a group of over 100 developers, we have fostered an environment that provides developers with the opportunity to continuously learn from each other.
The ideal candidate will demonstrate that they are bright and can tackle tough problems while being able to communicate their solution to others. They are creative and can mix technology with the customer’s problems to find the right solution. Lastly, they are driven and will motivate themselves and others to get things done. As an experienced Software Developer, we expect that you will grow into a thought leader at Vendasta, driving better results across our development organization.
Your Impact:
Develop software in teams of 3-5 developers, with the ability to take on tasks for the team and independently work on them to completion.
Follow best practices to write clean, maintainable, scalable, and tested software.
Contribute to the best engineering practices, including the use of design patterns, CI/CD, maintainable and scalable code, code review, and automated tests.
Provide inputs for a technical roadmap for the Product Area. Ensure that the NFRs and technical debt get their due focus.
Work collaboratively with Product Managers to design solutions (including the technical roadmap) that help our Partners connect digital solutions to small and medium-sized businesses.
Analyze and improve current system integrations and migration strategies.
Interact and collaborate with our high-quality technical team across India and Canada.
What You Bring To The Table:
8+ years of experience in a related field, with at least 3+ years as a full stack developer in an architect or senior development role
Experience with, or a strong understanding of, high-scalability, data-intensive, distributed Internet applications
Software development experience including building distributed, microservice-style and cloud-based application architectures
Proficiency in a modern software language, and willingness to quickly learn our technology stack
Preference will be given to candidates with strong Go (programming language) experience who can demonstrate the ability to build and adapt web applications using Angular.
Experience in designing, building and implementing cloud-native architectures (GCP preferred).
Experience working with the Scrum framework
Technologies We Use:
Cloud Native Computing using Google Cloud Platform: BigQuery, Cloud Dataflow, Cloud Pub/Sub, Google Data Studio, Cloud IAM, Cloud Storage, Cloud SQL, Cloud Spanner, Cloud Datastore, Google Maps Platform, Stackdriver, etc. We have been invited to join the Early Access Program on quite a few GCP technologies.
GoLang, Typescript, Python, JavaScript, HTML, Angular, GRPC, Kubernetes
Elasticsearch, MySQL, PostgreSQL
About Vendasta:
So what do we do? We create an entire platform full of digital products & solutions that help small to medium-sized businesses (SMBs) have a stronger presence online through digital advertising, online listings, reputation management, website creation, social media marketing … and much more! Our platform is used exclusively by channel partners, who sell products and services to SMBs, allowing them to leverage us to scale and grow their business.
We are trusted by 65,000+ channel partners, serving over 6 million SMBs worldwide!
Perks:
Stock options (as per policy)
Benefits - Health insurance, paid time off, public transport reimbursement, flex days
Training & Career Development - Professional development plans, leadership workshops, mentorship programs, and more!
Free snacks, hot beverages, and catered lunches on Fridays
Culture - comprised of our core values: Drive, Innovation, Respect, and Agility
Provident Fund
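The technology stack listed above pairs Cloud Pub/Sub with Python for microservice-style communication. As a rough illustration of that pattern (not Vendasta's actual code), here is a minimal Pub/Sub publish/consume sketch; the project, topic, and subscription names are hypothetical.

```python
# Illustrative Pub/Sub usage: one service publishes a domain event, another consumes it.
import concurrent.futures
import json

from google.cloud import pubsub_v1

PROJECT_ID = "example-project"  # hypothetical project


def publish_order_created(order_id: str, amount: float) -> None:
    """Publish a small JSON event to a topic another microservice listens on."""
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(PROJECT_ID, "order-created")  # hypothetical topic
    payload = json.dumps({"order_id": order_id, "amount": amount}).encode("utf-8")
    # publish() returns a future; result() blocks until the message is accepted.
    publisher.publish(topic_path, payload, source="orders-service").result()


def consume_order_created() -> None:
    """Pull events from the matching subscription and acknowledge them."""
    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path(PROJECT_ID, "order-created-sub")

    def callback(message: pubsub_v1.subscriber.message.Message) -> None:
        event = json.loads(message.data.decode("utf-8"))
        print(f"received order {event['order_id']} for {event['amount']}")
        message.ack()

    future = subscriber.subscribe(subscription_path, callback=callback)
    try:
        future.result(timeout=30)  # listen for 30 seconds in this sketch
    except concurrent.futures.TimeoutError:
        future.cancel()


if __name__ == "__main__":
    publish_order_created("ord-123", 49.99)
    consume_order_created()
```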
Posted 3 weeks ago