11 Job openings at Wavicle Data Solutions
About Wavicle Data Solutions

At Wavicle, we provide advanced internal communications solutions through cloud-based, mobile-first technology. Our customers depend on Wavicle to align their teams around a shared vision. Whether you are addressing a select few or a large, dispersed team, Wavicle delivers your most important core messages with velocity and impact to inspire teams and drive measurable performance. We address the realities our enterprise customers face, including cloud, mobility, BYOD, and gamification, and we strive to express technological advancement as ease of use.

The Wavicle Platform is a cloud-based, turnkey solution for authoring, delivering, and analyzing game-based learning courses. Our easy-to-use platform renders your content into multimedia game-based learning courses, delivers them to smartphones, tablets, and browsers, and returns a robust analytics dashboard to the administrator in real time. We build on a substantial body of research showing that game-based, mobile, and blended learning drive higher knowledge retention, creating happier learners and more productive organizations.

Data Architect

Tamil Nadu, India

11 years

Not disclosed

Remote

Full Time

Job Title: Data Architect – AWS & GCP
Experience: 11+ Years
Location: Remote / Hybrid
Job Type: Full-time

About the Role:
We are seeking an experienced Data Architect with deep expertise in AWS and Google Cloud Platform (GCP). The ideal candidate will have a proven track record of designing and implementing scalable, secure, and high-performance data architectures across cloud platforms. You will play a key role in shaping our cloud data strategy, ensuring data quality, and enabling analytics at scale.

Key Responsibilities:
- Design and implement end-to-end data architecture and pipelines on AWS and GCP.
- Define data models, data flows, and data storage strategies aligned with business needs.
- Architect and implement data lakes, data warehouses, and real-time streaming platforms.
- Develop and maintain technical documentation, data dictionaries, and architecture blueprints.
- Work closely with business stakeholders, data scientists, and engineering teams to define data strategies and solutions.
- Ensure data quality, security, governance, and compliance across platforms.
- Lead cloud data migration and modernization projects.
- Evaluate new tools and technologies to improve data architecture performance and reliability.
- Collaborate with DevOps and Security teams to enforce best practices in deployment and access control.
- Mentor junior team members and provide architectural oversight.

Required Skills & Experience:
- 11+ years of IT experience, with at least 5 years in data architecture.
- Strong hands-on experience with AWS (Redshift, S3, Glue, RDS, Lambda, etc.) and GCP (BigQuery, Dataflow, Cloud Storage, Cloud Composer, etc.).
- Experience in data modeling, ETL/ELT pipelines, data integration, and data warehouse design.
- Expertise with SQL, Python, Spark, and other data processing frameworks.
- Strong experience with data governance, security, and compliance on cloud platforms.
- Proficiency in Infrastructure as Code (IaC) using tools like Terraform or CloudFormation.
- Familiarity with streaming data tools (Kafka, Pub/Sub, etc.).
- Experience in real-time analytics, batch processing, and big data ecosystems.
- Understanding of CI/CD practices, DevOps tools, and agile methodologies.
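For illustration, here is a minimal PySpark sketch of the kind of pipeline this role designs: landing raw data from an S3 data lake, applying basic quality rules, and writing a partitioned, analytics-ready table. The bucket paths and column names are hypothetical placeholders, and the snippet assumes a Spark environment already configured with S3 access.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal ETL sketch: read raw events from a (hypothetical) S3 raw zone,
# standardize them, and write a partitioned, analytics-ready Parquet table.
spark = SparkSession.builder.appName("orders-etl").getOrCreate()

raw = spark.read.json("s3a://example-raw-zone/orders/")  # hypothetical path

clean = (
    raw
    .withColumn("order_ts", F.to_timestamp("order_ts"))   # normalize types
    .withColumn("order_date", F.to_date("order_ts"))      # partition key
    .dropDuplicates(["order_id"])                         # idempotent reloads
    .filter(F.col("amount") > 0)                          # basic quality rule
)

(clean.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3a://example-curated-zone/orders/"))       # hypothetical path
```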

Senior Data Engineer

Chennai, Coimbatore, Bengaluru

6 - 10 years

INR 15.0 - 25.0 Lacs P.A.

Work from Office

Full Time

Hi Professionals,

We are looking for a Senior Data Engineer for a permanent role.

Experience: 6 to 10 years
Work Location: Hybrid (Chennai, Coimbatore, Bangalore)
Notice Period: 0 to 15 days or immediate joiner

Skills:
1. Python
2. PySpark
3. AWS or GCP

Interested candidates can send their resume to gowtham.veerasamy@wavicledata.com.

Senior Data Engineer

Tamil Nadu, India

6 years

Not disclosed

On-site

Full Time

We are seeking a highly skilled Senior Azure Databricks Data Engineer to design, develop, and optimize data solutions on Azure. The ideal candidate will have expertise in Azure Data Factory (ADF), Databricks, SQL, and Python, and experience working with SAP IS-Auto as a data source. This role involves data modeling, systematic layer modeling, and ETL/ELT pipeline development to enable efficient data processing and analytics.

Experience: 6+ years

Key Responsibilities:
- Develop & Optimize ETL Pipelines: Build robust and scalable data pipelines using ADF, Databricks, and Python for data ingestion, transformation, and loading.
- Data Modeling & Systematic Layer Modeling: Design logical, physical, and systematic data models for structured and unstructured data.
- Integrate SAP IS-Auto: Extract, transform, and load data from SAP IS-Auto into Azure-based data platforms.
- Database Management: Develop and optimize SQL queries, stored procedures, and indexing strategies to enhance performance.
- Big Data Processing: Work with Azure Databricks for distributed computing, Spark for large-scale processing, and Delta Lake for optimized storage.
- Data Quality & Governance: Implement data validation, lineage tracking, and security measures for high-quality, compliant data.
- Collaboration: Work closely with business analysts, data scientists, and DevOps teams to ensure data availability and usability.

Required Skills:
- Azure Cloud Expertise: Strong experience in Azure Data Factory (ADF), Databricks, and Azure Synapse.
- Programming: Proficiency in Python for data processing, automation, and scripting.
- SQL & Database Skills: Advanced knowledge of SQL, T-SQL, or PL/SQL for data manipulation.
- SAP IS-Auto Data Handling: Experience integrating SAP IS-Auto as a data source into data pipelines.
- Data Modeling: Hands-on experience in dimensional modeling, systematic layer modeling, and entity-relationship modeling.
- Big Data Frameworks: Strong understanding of Apache Spark, Delta Lake, and distributed computing.
- Performance Optimization: Expertise in query optimization, indexing, and performance tuning.
- Data Governance & Security: Knowledge of RBAC, encryption, and data privacy standards.

Preferred Qualifications:
- Experience with CI/CD for data pipelines using Azure DevOps.
- Knowledge of Kafka/Event Hub for real-time data processing.
- Experience with Power BI/Tableau for data visualization (not mandatory, but a plus).
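For illustration, here is a minimal Databricks-style sketch of the Delta Lake upsert pattern this role works with: merging a daily landed batch into a curated Delta table. The mount path, table name, and key column are hypothetical placeholders.

```python
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

# On Databricks a SparkSession already exists; getOrCreate() reuses it.
spark = SparkSession.builder.getOrCreate()

# Hypothetical daily batch landed by ADF (e.g., extracted from an upstream
# source such as SAP IS-Auto), with an audit timestamp added on load.
incoming = (
    spark.read.parquet("/mnt/landing/vehicles/")       # hypothetical mount path
    .withColumn("load_ts", F.current_timestamp())
)

# Upsert into a curated Delta table keyed on a hypothetical vehicle_id.
target = DeltaTable.forName(spark, "silver.vehicles")  # hypothetical table
(
    target.alias("t")
    .merge(incoming.alias("s"), "t.vehicle_id = s.vehicle_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```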

Senior Data Engineer (AWS & GCP)

Tamil Nadu, India

0 years

Not disclosed

On-site

Full Time

Job Title: Senior Data Engineer (AWS & GCP)
Experience: 6+ Years

We are seeking an experienced Senior Data Engineer with expertise in AWS, and preferably GCP, to join our data engineering team. The ideal candidate will be skilled in building, optimizing, and managing data pipelines and infrastructure in cloud environments. You'll work closely with cross-functional teams, including data scientists, analysts, and architects, to ensure efficient and secure data operations.

Key Responsibilities:
- Design, develop, and maintain robust ETL/ELT pipelines using AWS services like Glue, Lambda, and EMR, and/or GCP equivalents such as Dataflow, Cloud Functions, and BigQuery.
- Build scalable and efficient data storage and warehousing solutions using AWS S3, Redshift, and RDS, or GCP Cloud Storage, BigQuery, and Cloud SQL.
- Optimize data architecture for performance and cost across cloud platforms.
- Implement and manage data governance, security policies, and access controls using IAM and cloud-native tools.
- Collaborate with analytics and business intelligence teams to ensure data availability and reliability.
- Monitor and manage cloud costs, resource utilization, and performance.
- Troubleshoot and resolve issues related to data ingestion, transformation, and performance bottlenecks.

Qualifications:
- 6+ years of experience in data engineering, with at least 4 years on AWS and familiarity or hands-on experience with GCP (preferred).
- Proficiency in Python, SQL, and data modeling best practices.
- Strong experience with ETL tools, data pipelines, and cloud-native services.
- Working knowledge of data warehousing, distributed computing, and data lakes.
- Experience with Infrastructure-as-Code tools like Terraform or CloudFormation (a plus).
- AWS certification required; GCP certification is a plus.
- Strong problem-solving skills and ability to work in a fast-paced environment.
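For illustration, here is a minimal boto3 sketch of orchestrating one of the AWS services named above: starting an AWS Glue job run and polling it to completion. The job name is a hypothetical placeholder, and AWS credentials and region are assumed to be configured in the environment.

```python
import time
import boto3

# Kick off a (hypothetical) Glue ETL job and poll until it reaches a
# terminal state. start_job_run/get_job_run are standard Glue APIs.
glue = boto3.client("glue")

JOB_NAME = "orders-nightly-etl"  # hypothetical job name
run_id = glue.start_job_run(JobName=JOB_NAME)["JobRunId"]

while True:
    state = glue.get_job_run(JobName=JOB_NAME, RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)  # avoid hammering the API while the job runs

print(f"Glue job finished with state: {state}")
```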

Data Engineer

Chennai, Coimbatore, Bengaluru

6 - 11 years

INR 15.0 - 30.0 Lacs P.A.

Work from Office

Full Time

Hi Professionals,

We are looking for a Data Engineer for a permanent role.

Work Location: Hybrid (Chennai, Coimbatore, or Bangalore)
Experience: 6 to 12 years
Notice Period: 0 to 15 days or immediate joiner

Skills:
1. Python
2. PySpark
3. SQL
4. Azure Databricks
5. AWS

Interested candidates can send their resume to gowtham.veerasamy@wavicledata.com.

Data Engineer

Chennai, Coimbatore, Bengaluru

6 - 11 years

INR 15.0 - 25.0 Lacs P.A.

Work from Office

Full Time

Hi Professionals,

We are looking for a Data Engineer for a permanent role.

Work Location: Hybrid (Chennai, Coimbatore, or Bangalore)
Experience: 6 to 11 years
Notice Period: 0 to 15 days or immediate joiner

Skills:
1. Python
2. PySpark
3. SQL
4. Azure Databricks
5. AWS

Interested candidates can send their resume to gowtham.veerasamy@wavicledata.com.

AWS Quicksight Developer

Chennai, Coimbatore, Bengaluru

6 - 11 years

INR 12.0 - 20.0 Lacs P.A.

Work from Office

Full Time

Job Summary:
We are seeking an experienced Amazon QuickSight Developer to design and develop interactive dashboards, business intelligence (BI) reports, and data visualizations. The ideal candidate will have hands-on experience with Amazon QuickSight, a strong background in data analytics, and the ability to work closely with stakeholders to transform business requirements into actionable insights.

Key Responsibilities:
- Design, develop, and maintain BI dashboards, reports, and visualizations using Amazon QuickSight.
- Integrate QuickSight with AWS data services like Amazon Redshift, Athena, S3, and RDS.
- Optimize dashboards and visualizations for performance, usability, and scalability.
- Gather, analyze, and translate business requirements into technical specifications for BI solutions.
- Implement security settings, row-level security, and user access controls in QuickSight.
- Collaborate with cross-functional teams, including data engineers, data scientists, and business analysts.

Interested candidates can send their resume to gowtham.veerasamy@wavicledata.com.
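For illustration, here is a minimal boto3 sketch of the programmatic side of QuickSight work: generating an embed URL for a registered user so a dashboard can be surfaced in an internal portal. The account ID, user ARN, and dashboard ID are hypothetical placeholders.

```python
import boto3

# Generate a QuickSight embed URL for a registered user via the standard
# generate_embed_url_for_registered_user API. All identifiers are placeholders.
qs = boto3.client("quicksight", region_name="us-east-1")

response = qs.generate_embed_url_for_registered_user(
    AwsAccountId="111122223333",  # hypothetical account
    UserArn="arn:aws:quicksight:us-east-1:111122223333:user/default/analyst",
    ExperienceConfiguration={
        "Dashboard": {"InitialDashboardId": "example-dashboard-id"}
    },
    SessionLifetimeInMinutes=60,
)

print(response["EmbedUrl"])  # URL to embed in an iframe or portal page
```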

Senior Data Engineer

Chennai, Coimbatore, Bengaluru

6 - 11 years

INR 15.0 - 25.0 Lacs P.A.

Work from Office

Full Time

Hi Professionals,

We are looking for a Senior Data Engineer for a permanent role.

Work Location: Hybrid (Chennai, Coimbatore, or Bangalore)
Experience: 6 to 12 years
Notice Period: 0 to 15 days or immediate joiner

Skills:
1. Python
2. PySpark
3. SQL
4. AWS
5. GCP
6. MLOps

Interested candidates can send their resume to gowtham.veerasamy@wavicledata.com.

Business Intelligence Architect

Chennai, Coimbatore, Bengaluru

13 - 17 years

INR 20.0 - 30.0 Lacs P.A.

Work from Office

Full Time

Hi Professionals,

We are looking for a BI Architect for a permanent role.

Work Location: Hybrid (Chennai, Coimbatore, or Bangalore)
Experience: 13 to 17 years
Notice Period: 0 to 15 days or immediate joiner

Skills:
1. Develop and lead the BI strategy, acting as a key decision-maker for BI initiatives.
2. Expertise in BI tools such as Power BI or Tableau.
3. Evaluate, recommend, and implement BI tools based on project requirements.
4. Proficiency in cloud platforms, leveraging cloud services for BI implementations.
5. AWS or Azure certifications are a plus.
6. Excellent communication skills to effectively interact with customers and stakeholders.
7. Ability to convey complex BI concepts in a clear and understandable manner.
8. Proficiency in programming languages (e.g., Python, Java) and scripting languages for BI customization and automation.
9. Knowledge of ETL processes, data warehousing, and data modeling.
10. Collaborate with Data Engineers and Architects to ensure data integrity and optimal performance.
11. Interact with customers, understanding their BI needs and providing tailored solutions.
12. Prior experience working with customers in the US or UK is preferred.

Interested candidates can send their resume to gowtham.veerasamy@wavicledata.com.

Java Developer

Bengaluru

8 - 12 years

INR 25.0 - 30.0 Lacs P.A.

Work from Office

Full Time

Hi Professionals,

We are looking for a Java Developer in Bangalore for a permanent role.

Work Location: Bangalore
Experience: 8 to 13 years
Notice Period: 0 to 15 days or immediate joiner

Skills:
1. Java
2. Multi-threading
3. Message queues
4. PostgreSQL
5. Azure

Note: Candidates should be based in or around the Bangalore area and must be willing to visit the client site once or twice a week.

Interested candidates can send their resume to gowtham.veerasamy@wavicledata.com.

Senior GCP DevOps Engineer

Tamil Nadu, India

8 years

Not disclosed

On-site

Full Time

Job Summary:
We are seeking a skilled DevOps Engineer with over 8 years of experience, a strong foundation in Google Cloud Platform (GCP), and hands-on experience in AWS and Azure. The ideal candidate will be responsible for building, automating, and maintaining cloud-native and hybrid infrastructure, CI/CD pipelines, and cloud operations with a focus on scalability, security, and reliability.

Key Responsibilities:
- Build and manage scalable and secure infrastructure using Compute Engine, Cloud Run, Cloud Functions, and Cloud Load Balancing.
- Lead infrastructure automation using Terraform and Cloud Deployment Manager to ensure consistent and version-controlled infrastructure.
- Design and improve CI/CD pipelines using Cloud Build, GitLab CI/CD, Jenkins, or similar tools, and integrate them with Artifact Registry.
- Set up monitoring, logging, and alerts using the Cloud Operations Suite (Cloud Monitoring, Cloud Logging, Cloud Trace, Cloud Profiler), and integrate with Prometheus, Grafana, or Datadog.
- Configure and secure network architecture using VPC, subnets, Cloud NAT, Cloud VPN, Interconnect, firewall rules, and Private Service Connect to support hybrid and multi-cloud environments.
- Apply security best practices across GCP using IAM policies, VPC Service Controls, Cloud Armor, Binary Authorization, and Secret Manager.
- Manage and optimize cloud databases such as Cloud SQL, Cloud Spanner, Firestore, and Bigtable for performance, availability, and scalability.
- Lead production support efforts, including incident management, root cause analysis, and post-mortems, to improve reliability and reduce downtime.
- Provide strategic guidance on cost management, performance tuning, and cloud governance for large-scale environments on Google Cloud Platform.

Required Skills & Qualifications:
- Strong hands-on experience with core GCP services such as Compute Engine, VPC, IAM, Cloud Storage, and Cloud SQL.
- Experience with serverless and observability tools, including Cloud Run, Cloud Functions, Cloud Monitoring, and Cloud Logging.
- Proficient in managing GCP database services such as Cloud SQL, Cloud Spanner, Firestore, and Bigtable.
- Advanced CI/CD pipeline development using Cloud Build, Cloud Deploy, GitLab CI/CD, or Jenkins.
- Solid experience with Kubernetes, Docker, Helm, and container orchestration using GKE.
- Proficient in Infrastructure as Code (IaC) using Terraform or Cloud Deployment Manager.
- Scripting skills in Python, Bash, or PowerShell.
- Strong understanding of Git, branching strategies, and version control workflows.
- Experience deploying microservices architectures in Agile and DevSecOps environments.

Nice to Have:
- GCP certification.
- Experience with multi-cloud environments.
- Knowledge of cost optimization and budgeting in the cloud.
- Security and compliance best practices for cloud infrastructure.
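For illustration, here is a minimal sketch of an HTTP-triggered Cloud Function in Python, one of the GCP serverless services named above; the function name and behavior are hypothetical. It uses the standard functions-framework library and could be run locally with `functions-framework --target healthz` or deployed with `gcloud functions deploy`.

```python
import functions_framework

# A minimal HTTP-triggered Cloud Function of the kind this role builds and
# deploys: a lightweight health endpoint for uptime checks and probes.
@functions_framework.http
def healthz(request):
    # `request` is a Flask request object; return a body and status code.
    return ("ok", 200)
```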

Wavicle Data Solutions

E-Learning Providers | New York, New York | 11-50 Employees | 11 Jobs
