
505 S3 Jobs - Page 8

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

You should have a good knowledge of the JavaScript language and expertise in the Cypress automation framework through BDD. It is essential to be well-versed with Git repositories and to have expertise in RESTful API automation and UI automation. Experience in performance testing using Gatling is required, and knowledge of AWS services like S3, SES, EC2, IAM, and Lambda is a plus. Hands-on experience with DevOps tools such as Docker, Kubernetes, GitLab/Jenkins, and CI/CD implementation is necessary, along with basic networking knowledge. You should have good experience in testing and test management, and experience working in the Unified Communication domain.

Your key responsibilities will include participating in Pre-PI, PI planning, and all SAFe Agile ceremonies. You will be responsible for creating an automation test framework from scratch and working on automated scripts, back-end scripting, and the user interface. Implementing new ideas to enhance the automation framework and conducting performance tests using Gatling are also part of the role. You will be involved in peer/junior code reviews, supporting juniors in creating reusable keywords/utils, and training new joiners through knowledge sharing. Monitoring and managing changes in regression tests, understanding and implementing the CI/CD pipeline flow, and pushing code daily into the code repository are crucial tasks. You will also conduct exploratory manual testing to understand the product; participate in estimation, Agile Scrum ceremonies, and backlog grooming; collate and monitor the defect management process; upload test cases in JIRA; provide daily status updates of stories in JIRA; generate reports; and provide input to the Test Manager.
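The posting's automation stack is Cypress/JavaScript; purely as an illustration of the RESTful API automation pattern it asks for, here is a minimal sketch in Python using pytest and requests. The base URL, endpoint, and response fields are hypothetical placeholders, not details from the posting.

```python
# Minimal REST API automation sketch with pytest + requests.
# The posting centers on Cypress/JS; this shows the same
# request/assert pattern in Python. BASE_URL and the /users
# endpoint are hypothetical placeholders.
import requests

BASE_URL = "https://api.example.com"

def test_create_and_fetch_user():
    # Create a resource via POST and assert on the response.
    created = requests.post(f"{BASE_URL}/users", json={"name": "Asha"}, timeout=10)
    assert created.status_code == 201
    user_id = created.json()["id"]

    # Fetch it back and verify the round trip.
    fetched = requests.get(f"{BASE_URL}/users/{user_id}", timeout=10)
    assert fetched.status_code == 200
    assert fetched.json()["name"] == "Asha"
```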

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Maharashtra

On-site

Data Scientist (5+ Years of Experience)

We are seeking a highly motivated Data Scientist with over 5 years of hands-on experience in data mining, statistical analysis, and developing high-quality machine learning models. The ideal candidate will have a passion for solving real-world problems using data-driven approaches and possess strong technical expertise across various data science domains.

Key Responsibilities:
- Apply advanced data mining techniques and statistical analysis to extract actionable insights.
- Design, develop, and deploy robust machine learning models to address complex business challenges.
- Conduct A/B and multivariate experiments to evaluate model performance and optimize outcomes.
- Monitor, analyze, and enhance the performance of machine learning models post-deployment.
- Collaborate cross-functionally to build customer cohorts for CRM campaigns and conduct market basket analysis.
- Stay updated with state-of-the-art techniques in NLP, particularly within the e-commerce domain.

Required Skills & Qualifications:
- Programming & Tools: Proficient in Python, PySpark, and SQL for data manipulation and analysis.
- Machine Learning & AI: Strong experience with ML libraries (e.g., Scikit-learn, TensorFlow, PyTorch) and expertise in NLP, computer vision, recommender systems, and optimization techniques.
- Cloud & Big Data: Hands-on experience with AWS services, including Glue, EKS, S3, SageMaker, and Redshift.
- Model Deployment: Experience deploying pre-trained models from platforms like Hugging Face and AWS Bedrock (see the sketch below).
- DevOps & MLOps: Understanding of Git, Docker, CI/CD pipelines, and deploying models with frameworks such as FastAPI.
- Advanced NLP: Experience in building, retraining, and optimizing NLP models for diverse use cases.

Preferred Qualifications:
- Strong research mindset with a keen interest in exploring new data science methodologies.
- Background in e-commerce analytics is a plus.

If you're passionate about leveraging data to drive impactful business decisions and thrive in a dynamic environment, we'd love to hear from you!
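Since the posting calls out deploying pre-trained Hugging Face models behind FastAPI, here is a minimal hedged sketch of that pattern. The default sentiment model and the /score route are illustrative assumptions, not details from the posting.

```python
# Illustrative sketch: serving a pre-trained Hugging Face model
# behind FastAPI. Model choice and route name are assumptions.
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

app = FastAPI()
# Downloads a small default sentiment model on first run.
classifier = pipeline("sentiment-analysis")

class Review(BaseModel):
    text: str

@app.post("/score")
def score(review: Review):
    # Returns e.g. {"label": "POSITIVE", "score": 0.99}.
    return classifier(review.text)[0]
```

Run locally with `uvicorn module_name:app`; keeping the model in a module-level variable avoids reloading it on every request.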

Posted 2 weeks ago

Apply

7.0 - 12.0 years

17 - 27 Lacs

Hyderabad

Work from Office

Job Title: Data Quality Engineer

Mandatory Skills: Data Engineering, Python, AWS, SQL, Glue, Lambda, S3, SNS, ML, SQS

Job Summary: We are seeking a highly skilled Data Engineer (SDET) to join our team, responsible for ensuring the quality and reliability of complex data workflows, data migrations, and analytics solutions across both cloud and on-premises environments. The ideal candidate will have extensive experience in SQL, Python, AWS, and ETL testing, along with a strong background in data quality assurance, data science platforms, DevOps pipelines, and automation frameworks. This role involves close collaboration with business analysts, developers, and data architects to support end-to-end testing, data validation, and continuous integration for data products. Expertise in tools like Redshift, EMR, Athena, Jenkins, and various ETL platforms is essential, as is experience with NoSQL databases, big data technologies, and cloud-native testing strategies.

Role and Responsibilities:
- Work with business stakeholders, Business Systems Analysts, and Developers to ensure quality delivery of software.
- Interact with key business functions to confirm data quality policies and governed attributes.
- Follow quality management best practices and processes to bring consistency and completeness to integration service testing.
- Design and manage AWS test environments for data workflows during development and deployment of data products.
- Assist the team with test estimation and test planning, and with the design and development of reports and dashboards.
- Analyze and evaluate data sources, data volume, and business rules.
- Interpret and analyze data from various source systems to support data integration and data reporting needs.
- Test database applications to validate source-to-destination data movement and transformation.
- Develop complex SQL scripts (primarily advanced SQL) for cloud and on-prem ETL.
- Develop and summarize data quality analysis and dashboards.
- Execute testing of data analytics and data integration on time and within budget.
- Work with team leads to prioritize business and information needs.
- Troubleshoot and determine the best resolution for data issues and anomalies.

Required Skills and Qualifications:
- Extensive experience in data migration is a must (Teradata to Redshift preferred); extensive experience with both data migration and data transformation testing (a minimal source-to-target validation sketch follows below).
- Extensive testing experience with SQL and Unix/Linux scripting is a must; expert technical skills with hands-on testing experience using SQL queries.
- Extensive experience testing cloud and on-prem ETL platforms (e.g., Ab Initio, Informatica, SSIS, DataStage, Alteryx, Glue).
- Extensive experience with DBMSs such as Oracle, Teradata, SQL Server, DB2, Redshift, Postgres, and Sybase.
- Extensive experience with Python scripting and AWS and cloud technologies, including Athena, EMR, and Redshift.
- Experience with NoSQL databases and unstructured data.
- Extensive experience with MapReduce-based programming tools such as HiveQL.
- Experience with data science platforms like SageMaker, Machine Learning Studio, or H2O.
- Well versed in data flow and test strategy for cloud and on-prem ETL testing.
- Knowledge of data modeling and data warehousing concepts, with emphasis on cloud and on-prem ETL.
- Experience in functional, regression, system, integration, and end-to-end testing.
- Deep understanding of data architecture and data modeling best practices and guidelines for different data and analytics platforms.
- Experienced in large-scale application development testing across cloud and on-prem data warehouses, data lakes, and data science platforms, with multi-year, large-scale projects.
- API/REST Assured automation, building reusable frameworks, and good technical expertise/acumen.
- Java/JavaScript: core Java, integration, and API implementation.
- Functional/UI automation: Selenium with BDD (Cucumber, SpecFlow), data validation with Kafka and big data, and automation experience using Cypress.
- AWS/cloud: Jenkins, GitLab, EC2, S3; building Jenkins CI/CD pipelines; SauceLabs.

Preferred Skills:
- REST APIs and microservices using JSON; SoapUI.
- Extensive experience in the DevOps/DataOps space, with strong experience working with DevOps and build pipelines.
- Strong experience with AWS data services including Redshift, Glue, Kinesis, Kafka (MSK), EMR/Spark, SageMaker, etc.
- Experience with technologies like Kubeflow, EKS, and Docker.
- Extensive experience with NoSQL and unstructured data stores such as MongoDB, Cassandra, Redis, and ZooKeeper.
- Extensive MapReduce experience using tools like Hadoop, Hive, Pig, Kafka, S4, and MapR.
- Experience using Jenkins and GitLab, and experience with both Waterfall and Agile methodologies.
- Experience testing storage tools like S3 and HDFS.
- Experience with one or more industry-standard defect or test case management tools.
- Great communication skills (regularly interacts with cross-functional team members).
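As a minimal, self-contained illustration of the source-to-destination validation this role centers on, the sketch below uses an in-memory sqlite3 database to stand in for the real source and target systems (the posting's actual stack is Teradata to Redshift); table names, columns, and checks are assumptions.

```python
# Minimal source-to-target validation sketch. sqlite3 stands in
# for the real source/target systems (e.g. Teradata and Redshift),
# an assumption made so the example is runnable as-is.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, amount REAL);
    CREATE TABLE tgt (id INTEGER, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.5);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.5);
""")

def check(name, sql):
    # Run the same probe against source and target and compare.
    src, tgt = (conn.execute(sql.format(t=t)).fetchone()[0] for t in ("src", "tgt"))
    status = "PASS" if src == tgt else "FAIL"
    print(f"{status} {name}: source={src} target={tgt}")

# Row counts must match after the migration.
check("row count", "SELECT COUNT(*) FROM {t}")
# A simple aggregate checksum catches value-level drift.
check("amount sum", "SELECT SUM(amount) FROM {t}")
```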

Posted 2 weeks ago

Apply

8.0 - 11.0 years

30 - 35 Lacs

Hyderabad

Work from Office

Notice period: Immediate to 15 days.

Required Skills & Qualifications:
- Strong experience in backend development using Java (Java 8 or later).
- Hands-on experience with front-end technologies such as Vue.js or Angular (with a strong preference for Vue.js).
- Solid understanding of PostgreSQL and the ability to write optimized SQL queries and stored procedures.
- AWS cloud experience with knowledge of services like EC2, RDS, S3, Lambda, and API Gateway.
- Experience building and consuming RESTful APIs.
- Proficiency with version control systems (e.g., Git).
- Familiarity with Agile/Scrum methodologies.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Kochi, Bengaluru

Work from Office

Job Summary: We are seeking a highly skilled and motivated Machine Learning Engineer with a strong foundation in programming and machine learning, hands-on experience with AWS Machine Learning services (especially SageMaker), and a solid understanding of data engineering and MLOps practices. You will be responsible for designing, developing, deploying, and maintaining scalable ML solutions in a cloud-native environment.

Key Responsibilities:
- Design and implement machine learning models and pipelines using AWS SageMaker and related services.
- Develop and maintain robust data pipelines for training and inference workflows.
- Collaborate with data scientists, engineers, and product teams to translate business requirements into ML solutions.
- Implement MLOps best practices including CI/CD for ML, model versioning, monitoring, and retraining strategies.
- Optimize model performance and ensure scalability and reliability in production environments.
- Monitor deployed models for drift, performance degradation, and anomalies.
- Document processes, architectures, and workflows for reproducibility and compliance.

Required Skills & Qualifications:
- Strong programming skills in Python and familiarity with ML libraries (e.g., scikit-learn, TensorFlow, PyTorch).
- Solid understanding of machine learning algorithms, model evaluation, and tuning.
- Hands-on experience with AWS ML services, especially SageMaker, S3, Lambda, Step Functions, and CloudWatch.
- Experience with data engineering tools (e.g., Apache Airflow, Spark, Glue) and workflow orchestration.
- Proficiency in MLOps tools and practices (e.g., MLflow, Kubeflow, CI/CD pipelines, Docker, Kubernetes); see the sketch below.
- Familiarity with monitoring tools and logging frameworks for ML systems.
- Excellent problem-solving and communication skills.

Preferred Qualifications:
- AWS certification (e.g., AWS Certified Machine Learning – Specialty).
- Experience with real-time inference and streaming data.
- Knowledge of data governance, security, and compliance in ML systems.
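As a small illustration of the model-versioning practice listed above, here is a hedged sketch using MLflow, one of the MLOps tools the posting names. The experiment name, parameters, and model are illustrative assumptions.

```python
# Hedged sketch of experiment tracking and model versioning with
# MLflow. Experiment name, params, and model are assumptions.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, random_state=0)

mlflow.set_experiment("churn-model")
with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
    mlflow.log_param("n_estimators", 50)
    mlflow.log_metric("train_accuracy", model.score(X, y))
    # Logs a versioned model artifact that can later be served or promoted.
    mlflow.sklearn.log_model(model, "model")
```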

Posted 2 weeks ago

Apply

6.0 - 7.0 years

27 - 35 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role & Responsibilities:
- Provide technical leadership and mentorship to data engineering teams.
- Architect, design, and deploy scalable, secure, and high-performance data pipelines.
- Collaborate with stakeholders, clients, and cross-functional teams to deliver end-to-end data solutions.
- Drive technical strategy and implementation plans in alignment with business needs.
- Oversee project execution using tools like JIRA, ensuring timely delivery and adherence to best practices.
- Implement and maintain CI/CD pipelines and automation tools to streamline development workflows.
- Promote best practices in data engineering and AWS implementations across the team.

Preferred Candidate Profile:
- Strong hands-on expertise in Python, PySpark, and Spark architecture, including performance tuning and optimization (see the sketch below).
- Advanced proficiency in SQL and experience in writing optimized stored procedures.
- In-depth knowledge of the AWS data engineering stack, including AWS Glue, Lambda, API Gateway, EMR, S3, Redshift, and Athena.
- Experience with Infrastructure as Code (IaC) using CloudFormation and Terraform.
- Familiarity with Unix/Linux scripting and system administration is a plus.
- Proven ability to design and deploy robust, production-grade data solutions.
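A minimal PySpark sketch of the kind of pipeline this role describes: read raw data from S3, aggregate, and write partitioned output. Bucket paths and column names are hypothetical.

```python
# Minimal PySpark batch pipeline sketch; paths/columns are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-sales-rollup").getOrCreate()

orders = spark.read.parquet("s3://example-raw/orders/")

daily = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
)

# Partitioning by date keeps downstream Athena/Redshift Spectrum scans cheap.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated/daily_sales/"
)
```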

Posted 2 weeks ago

Apply

3.0 - 5.0 years

10 - 15 Lacs

Mumbai, Aurangabad

Work from Office

Joining: Immediate joiners preferred.

Job Summary: We are seeking a skilled and motivated Data Developer with 3 to 5 years of hands-on experience in designing, developing, and maintaining scalable data solutions. The ideal candidate will work closely with data architects, data analysts, and application developers to build efficient data pipelines, transform data, and support data integration across various platforms.

Key Responsibilities:
- Design, develop, and maintain ETL/ELT pipelines to ingest, transform, and load data from various sources, structured and unstructured (see the sketch below).
- Develop and optimize SQL queries, stored procedures, views, and functions for data analysis and reporting.
- Work with data warehousing technologies (e.g., Snowflake, Redshift, BigQuery, Azure Synapse) to support business intelligence solutions.
- Collaborate with data engineers and analysts to implement robust data models and schemas for analytics.
- Ensure data quality, consistency, and accuracy through data validation, testing, and monitoring.
- Implement data security, compliance, and governance protocols in alignment with organizational policies.
- Maintain documentation related to data sources, data flows, and business rules.
- Participate in code reviews, sprint planning, and agile development practices.

Technical Skills Required:
- Languages & Tools: SQL (advanced proficiency required); Python or Scala for data processing; shell scripting (Bash, PowerShell).
- ETL Tools / Data Integration: Apache NiFi, Talend, Informatica, Azure Data Factory, SSIS, or equivalent.
- Data Warehousing & Databases: Snowflake, Amazon Redshift, Google BigQuery, Azure Synapse; SQL Server, PostgreSQL, Oracle, or MySQL.
- Cloud Platforms (at least one): AWS (Glue, S3, Redshift, Lambda); Azure (ADF, Blob Storage, Synapse); GCP (Dataflow, BigQuery, Cloud Storage).
- Big Data & Streaming (nice to have): Apache Spark, Databricks, Kafka, Hadoop ecosystem.
- Version Control & DevOps: Git, Bitbucket; CI/CD pipelines (Jenkins, GitHub Actions).

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 3-5 years of professional experience as a Data Developer or Data Engineer.
- Strong problem-solving skills and the ability to work both independently and in a team environment.
- Experience working in Agile/Scrum teams is a plus.
- Excellent communication and documentation skills.

Preferred Certifications (optional):
- Microsoft Certified: Azure Data Engineer Associate
- AWS Certified Data Analytics – Specialty
- Google Cloud Professional Data Engineer
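As a toy illustration of the ingest-transform-load loop described above, here is a hedged Python sketch using pandas, with sqlite standing in for the warehouse; the file path, columns, and table name are assumptions.

```python
# Tiny ETL sketch: extract a raw file, cleanse, load to a table.
# The CSV path, columns, and table name are placeholders.
import sqlite3
import pandas as pd

# Extract: read a raw source extract.
raw = pd.read_csv("orders.csv", parse_dates=["order_ts"])

# Transform: basic cleansing and a derived column.
raw = raw.dropna(subset=["order_id"])
raw["order_date"] = raw["order_ts"].dt.date

# Load: write to a staging table (sqlite stands in for the warehouse).
with sqlite3.connect("warehouse.db") as conn:
    raw.to_sql("stg_orders", conn, if_exists="replace", index=False)
```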

Posted 2 weeks ago

Apply

0.0 - 3.0 years

3 - 8 Lacs

Chennai

Hybrid

Key Responsibilities

AWS Infrastructure Management:
- Design, deploy, and manage AWS infrastructure using services such as EC2, ECS, EKS, Lambda, RDS, S3, VPC, and CloudFront.
- Implement and maintain Infrastructure as Code using AWS CloudFormation, AWS CDK, or Terraform.
- Optimize AWS resource utilization and costs through rightsizing, reserved instances, and automated scaling.
- Manage multi-account AWS environments using AWS Organizations and Control Tower.
- Implement disaster recovery and backup strategies using AWS services.

CI/CD Pipeline Development:
- Build and maintain CI/CD pipelines using AWS CodePipeline, CodeBuild, CodeDeploy, and CodeCommit.
- Integrate with third-party tools like Jenkins, GitLab CI, or GitHub Actions when needed.
- Implement automated testing and security scanning within deployment pipelines.
- Manage deployment strategies, including blue-green deployments, using AWS services.
- Automate application deployments to ECS, EKS, Lambda, and EC2 environments.

Container and Serverless Management:
- Deploy and manage containerized applications using Amazon ECS and Amazon EKS.
- Implement serverless architectures using AWS Lambda, API Gateway, and Step Functions.
- Manage container registries using Amazon ECR.
- Optimize container and serverless application performance and costs.
- Implement service mesh architectures using AWS App Mesh when applicable.

Monitoring and Observability:
- Implement comprehensive monitoring using Amazon CloudWatch, AWS X-Ray, and AWS Systems Manager.
- Set up alerting and dashboards for proactive incident management.
- Configure log aggregation and analysis using CloudWatch Logs and AWS OpenSearch.
- Implement distributed tracing for microservices architectures.
- Create and maintain operational runbooks and documentation.

Security and Compliance:
- Implement AWS security best practices using IAM, Security Groups, NACLs, and AWS Config.
- Manage secrets and credentials using AWS Secrets Manager and Systems Manager Parameter Store.
- Implement compliance frameworks and automated security scanning.
- Configure Amazon GuardDuty, Amazon Inspector, and AWS Security Hub for threat detection.
- Manage SSL/TLS certificates using AWS Certificate Manager.

Automation and Scripting:
- Develop automation scripts using Python, Bash, and the AWS CLI/SDK.
- Create AWS Lambda functions for operational automation (see the sketch after this section).
- Implement event-driven automation using CloudWatch Events and EventBridge.
- Automate backup, patching, and maintenance tasks using AWS Systems Manager.
- Build custom tools and utilities to improve operational efficiency.

Required Qualifications

AWS Expertise:
- Strong experience with core AWS services: EC2, S3, RDS, VPC, IAM, CloudFormation.
- Experience with container services (ECS, EKS) and serverless technologies (Lambda, API Gateway).
- Proficiency with AWS networking concepts and security best practices.
- Experience with AWS monitoring and logging services (CloudWatch, X-Ray).

Technical Skills:
- Expertise in Infrastructure as Code using CloudFormation, CDK, or Terraform.
- Strong scripting skills in Python, Bash, or PowerShell.
- Experience with CI/CD tools, preferably AWS native services and Bitbucket Pipelines.
- Knowledge of containerization with Docker and orchestration with Kubernetes.
- Understanding of microservices architecture and distributed systems.
- Experience with configuration management and automation tools.

DevOps Practices:
- Strong understanding of CI/CD best practices and GitOps workflows.
- Experience with automated testing and deployment strategies.
- Knowledge of monitoring, alerting, and incident response procedures.
- Understanding of security scanning and compliance automation.

AWS Services Experience:
- Compute & Containers: Amazon EC2, ECS, EKS, Fargate, Lambda, Batch.
- Storage & Database: Amazon S3, EBS, EFS, RDS, DynamoDB, ElastiCache, Redshift.
- Networking & Security: VPC, Route 53, CloudFront, ALB/NLB, IAM, Secrets Manager, Certificate Manager.
- Developer Tools: CodePipeline, CodeBuild, CodeDeploy, CodeCommit, CodeArtifact.
- Monitoring & Management: CloudWatch, X-Ray, Systems Manager, Config, CloudTrail, AWS OpenSearch.
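A hedged sketch of the kind of Lambda-based operational automation described above: react to S3 upload events and tag new objects for lifecycle handling. The tag key and the use case are assumptions; the S3 event shape and boto3 call are standard.

```python
# Lambda handler sketch for S3-event-driven automation.
# The "ingested" tag and the tagging use case are assumptions.
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # S3 event notifications deliver one or more Records entries.
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        s3.put_object_tagging(
            Bucket=bucket,
            Key=key,
            Tagging={"TagSet": [{"Key": "ingested", "Value": "true"}]},
        )
        print(f"Tagged s3://{bucket}/{key}")
```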

Posted 2 weeks ago

Apply

4.0 - 6.0 years

10 - 20 Lacs

Chennai

Work from Office

Role: DevOps Engineer

Job Description: DevOps Engineer (4+ Years of Experience). We are looking for a DevOps Engineer with 4+ years of experience to join our dynamic team. The ideal candidate will have hands-on experience with AWS services, Docker, Kubernetes, and Jenkins, along with a strong understanding of CI/CD pipelines and infrastructure automation. Relevant course completion is mandatory, and certifications in related fields are a plus.

Key Responsibilities:
- Design, implement, and manage scalable and reliable cloud infrastructure using AWS services.
- Develop and maintain CI/CD pipelines using Jenkins to support continuous integration and deployment.
- Containerize applications using Docker and orchestrate them with Kubernetes.
- Monitor, troubleshoot, and optimize system performance to ensure high availability and scalability.
- Collaborate with development and operations teams to improve deployment workflows and infrastructure automation.
- Implement security best practices for cloud and container environments.
- Maintain and update documentation for infrastructure, processes, and configurations.

Requirements:
- Experience: 2+ years in DevOps or related roles.
- Hands-on experience with AWS services (e.g., EC2, S3, RDS, CloudFormation, Lambda).
- Strong understanding and practical knowledge of Docker and containerization.
- Experience with Kubernetes for container orchestration.
- Proficiency in using Jenkins for CI/CD pipeline creation and management.
- Familiarity with Infrastructure as Code (IaC) tools like Terraform or CloudFormation.
- Basic scripting knowledge (e.g., Bash, Python, or PowerShell).
- Familiarity with version control systems like Git.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration abilities.

Preferred Qualifications:
- Relevant certifications in AWS or Kubernetes.
- Understanding of monitoring tools like Prometheus, Grafana, or CloudWatch.
- Experience in setting up logging systems (e.g., ELK Stack).

Interested candidates can share their CV at madhumithak@sightspectrum.in.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Bengaluru

Remote

Skillset: PostgreSQL, Amazon Redshift, MongoDB, Apache Cassandra, AWS, ETL, Shell Scripting, Automation, Microsoft Azure.

We are looking for futuristic, motivated go-getters with the following skills for an exciting role.

Job Description:
- Monitor and maintain the performance, reliability, and availability of multiple database systems.
- Optimize complex SQL queries, stored procedures, and ETL scripts for better performance and scalability.
- Troubleshoot and resolve issues related to database performance, integrity, backups, and replication.
- Design, implement, and manage scalable data pipelines across structured and unstructured sources.
- Develop automation scripts for routine maintenance tasks using Python, Bash, or similar tools.
- Perform regular database health checks, set up alerting mechanisms, and respond to incidents proactively (see the sketch below).
- Analyze performance bottlenecks and resolve slow-query issues and deadlocks.
- Work in DevOps/Agile environments, integrating with CI/CD pipelines for database operations.
- Collaborate with engineering, analytics, and infrastructure teams to integrate database solutions with applications and BI tools.
- Research and implement emerging technologies and best practices in database administration.
- Participate in capacity planning, security audits, and software upgrades for data infrastructure.
- Maintain comprehensive documentation related to database schemas, metadata, standards, and procedures.
- Ensure compliance with data privacy regulations and implement robust disaster recovery and backup strategies.

Desired Skills:
- Database Systems: Hands-on experience with SQL-based databases (PostgreSQL, MySQL), Amazon Redshift, MongoDB, and Apache Cassandra.
- Scripting & Automation: Proficiency in scripting using Python, Shell, or similar tools to automate database operations.
- Cloud Platforms: Working knowledge of AWS (RDS, Redshift, EC2, S3, IAM, Lambda) and Azure SQL/Azure Cosmos DB.
- Big Data & Distributed Systems: Familiarity with Apache Spark for distributed data processing.
- Performance Tuning: Deep experience in performance analysis, indexing strategies, and query optimization.
- Security & Compliance: Experience with database encryption, auditing, access control, and GDPR/PII policies.
- Familiarity with Linux and Windows server administration is a plus.

Education & Experience: BE, B.Tech, MCA, or M.Tech from Tier 2/3 colleges, and science graduates; 5-8 years of work experience.
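As a minimal illustration of the proactive health checks described above, here is a hedged Python sketch that flags long-running active queries in PostgreSQL via pg_stat_activity. The connection DSN and the 5-minute threshold are assumptions.

```python
# Health-check sketch: flag long-running active queries in Postgres.
# DSN and the 5-minute threshold are assumptions.
import psycopg2

conn = psycopg2.connect("dbname=appdb user=monitor host=db.example.internal")
with conn, conn.cursor() as cur:
    cur.execute(
        """
        SELECT pid, now() - query_start AS duration, left(query, 80)
        FROM pg_stat_activity
        WHERE state = 'active' AND now() - query_start > interval '5 minutes'
        ORDER BY duration DESC
        """
    )
    for pid, duration, query in cur.fetchall():
        # In production this would page or post to an alert channel.
        print(f"SLOW pid={pid} running {duration}: {query}")
conn.close()
```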

Posted 2 weeks ago

Apply

4.0 - 6.0 years

12 - 18 Lacs

Chennai, Bengaluru

Work from Office

Key Skills: Python, SQL, PySpark, Databricks, AWS, Data Pipelines, Data Integration, Airflow, Delta Lake, Redshift, S3, Data Security, Cloud Platforms, Life Sciences.

Roles & Responsibilities:
- Develop and maintain robust, scalable data pipelines for ingesting, transforming, and optimizing large datasets from diverse sources.
- Integrate multi-source data into performant, query-optimized formats such as Delta Lake, Redshift, and S3 (see the sketch below).
- Tune data processing jobs and storage layers to ensure cost efficiency and high throughput.
- Automate data workflows using orchestration tools like Airflow and Databricks APIs for ingestion, transformation, and reporting.
- Implement data validation and quality checks to ensure reliable and accurate data.
- Manage and optimize AWS and Databricks infrastructure to support scalable data operations.
- Lead cloud platform migrations and upgrades, transitioning legacy systems to modern, cloud-native solutions.
- Enforce security best practices, ensuring compliance with regulatory standards through IAM policies and data encryption.
- Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to deliver data solutions.

Experience Requirement:
- 4-6 years of hands-on experience in data engineering with expertise in Python, SQL, PySpark, Databricks, and AWS.
- Strong background in designing and building data pipelines, and optimizing data storage and processing.
- Proficiency in using cloud services such as AWS (S3, Redshift, Lambda) for building scalable data solutions.
- Hands-on experience with containerized environments and orchestration tools like Airflow for automating data workflows.
- Expertise in data migration strategies and transitioning legacy data systems to modern cloud platforms.
- Experience with performance tuning, cost optimization, and lifecycle management of cloud data solutions.
- Familiarity with regulatory compliance (GDPR, HIPAA) and security practices (IAM, encryption).
- Experience in the Life Sciences or Pharma domain is highly preferred, with an understanding of industry-specific data requirements.
- Strong problem-solving abilities with a focus on delivering high-quality data solutions that meet business needs.

Education: Any graduation.
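A minimal sketch of the Delta Lake ingestion pattern referenced above, assuming a Databricks runtime where the Delta format is available by default; paths and schema are hypothetical.

```python
# Delta Lake ingestion sketch (Databricks runtime assumed).
# Paths and the event schema are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest-events").getOrCreate()

events = (
    spark.read.json("s3://example-landing/events/")
    .withColumn("event_date", F.to_date("event_ts"))
)

# Delta provides ACID semantics and efficient incremental reads,
# which is why it suits the query-optimized layer described above.
(events.write
    .format("delta")
    .mode("append")
    .partitionBy("event_date")
    .save("s3://example-lake/bronze/events"))
```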

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

You will be responsible for developing and modifying programs using Python, AWS Glue/Redshift, and PySpark technologies. Your role will involve writing effective and scalable code, as well as identifying areas for program modifications. Additionally, you must have a strong understanding of AWS cloud technologies such as CloudWatch, Lambda, Dynamo, API Gateway, and S3. Experience in creating APIs from scratch and integrating with 3rd party APIs is also required. This is a full-time position based in Hyderabad/Chennai/Bangalore, and the ideal candidate should have a maximum notice period of 15 days.
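As a hedged illustration of building an API on API Gateway plus Lambda, which the posting mentions, here is a minimal Python handler using the standard proxy-integration response shape; the route and payload fields are assumptions.

```python
# Minimal Lambda handler for an API Gateway proxy integration.
# The query parameter and greeting payload are assumptions; the
# statusCode/headers/body response shape is the standard one.
import json

def handler(event, context):
    # API Gateway (proxy integration) passes query params here.
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```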

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

The ideal candidate for this position should have advanced proficiency in Python, with a solid understanding of inheritance and classes. Additionally, the candidate should be well-versed in EMR, Athena, Redshift, AWS Glue, IAM roles, CloudFormation (CFT is optional), Apache Airflow, Git, SQL, PySpark, OpenMetadata, and Data Lakehouse concepts. Experience with metadata management is highly desirable, particularly with AWS services such as S3.

The candidate should possess the following key skills (an Airflow sketch follows below):
- Creation of ETL pipelines
- Deploying code in EMR
- Querying in Athena
- Creating Airflow DAGs for scheduling ETL pipelines
- Knowledge of AWS Lambda and the ability to create Lambda functions

This role is for an individual contributor, and as such, the candidate is expected to autonomously manage client communication and proactively resolve technical issues without external assistance.
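A minimal Airflow DAG sketch of the scheduling skill listed above, written for Airflow 2.4+ (where the `schedule` parameter exists); the task bodies, DAG id, and cadence are placeholders.

```python
# Minimal Airflow 2.4+ DAG sketch; task bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw files from S3")

def transform():
    print("run the PySpark/EMR transformation step")

with DAG(
    dag_id="nightly_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    # Downstream dependency: transform runs only after extract succeeds.
    extract_task >> transform_task
```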

Posted 2 weeks ago

Apply

3.0 - 8.0 years

0 Lacs

Karnataka

On-site

We are looking for a skilled R Shiny programmer to create interactive reports that transform clinical trial data into actionable clinical insights. Your role will involve designing, developing, deploying, and optimizing user-friendly web applications for analyzing and visualizing clinical data.

Your responsibilities will include designing, developing, testing, and deploying interactive R Shiny web applications. You will collaborate with data scientists, bioinformatics programmers, analysts, and stakeholders to understand application requirements and translate them into intuitive R Shiny applications. Additionally, you will be responsible for translating complex data analysis and visualization tasks into clear and user-friendly interfaces, writing clean and efficient R code, conducting code reviews, and validating R programs. Moreover, you will integrate R Shiny applications with AWS services like AWS Redshift, implement unit tests to ensure quality and performance, benchmark and optimize application performance, and address any inconsistencies in data or any analytical or reporting problems that may arise. Other duties may be assigned as needed.

The ideal candidate should possess a Bachelor's degree in Computer Science, Data Science, or a related field, along with 3 to 8 years of relevant experience. Proven expertise in building R Shiny applications; strong proficiency in R programming, including data manipulation, statistical analysis, and data visualization; experience using SQL; and an understanding of user interface (UI) and user experience (UX) principles are essential. Experience with gathering requirements, using RStudio and version control software, managing programming code, and working with Posit Workbench, Connect, and/or Package Manager is preferred.

Candidates should have the ability to manage multiple tasks, work independently and in a team environment, and effectively communicate technical concepts in written and oral formats, along with experience with R Markdown, continuous integration/continuous delivery (CI/CD) pipelines, and AWS cloud computing services such as Redshift, EC2, S3, and CloudWatch. The required education for this position is a BE/MTech/MCA degree in a computer-related field. A satisfactory background check is mandatory for this role.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

Job Title: Azure Presales Engineer

About the Role: As a Cloud Presales Engineer specializing in Azure, you will play a critical role in our sales process by working closely with sales and technical teams to provide expert guidance and solutions for our clients. Leveraging your in-depth knowledge of Azure services, you will understand customer needs, design tailored cloud solutions, and drive the adoption of our cloud offerings. This position requires strong technical acumen, excellent communication skills, and a passion for cloud technologies.

Key Responsibilities

Solution Design and Architecture:
- Understand customer requirements and design effective cloud solutions using Azure services.
- Create architecture diagrams and detailed proposals tailored to customer needs.
- Collaborate with sales teams to define the scope of technical solutions and present them to customers.

Technical Expertise and Consultation:
- Act as a subject matter expert on AWS and Azure services, including EC2, S3, Lambda, RDS, VPC, IAM, CloudFormation, Azure Virtual Machines, Blob Storage, Functions, SQL Database, Virtual Network, Azure Active Directory, and ARM templates.
- Provide technical support during the sales process, including product demonstrations, proofs of concept (POCs), and answering customer queries.
- Advise customers on best practices for cloud adoption, migration, and optimization.

Customer Engagement:
- Build and maintain strong relationships with customers, understanding their business challenges and technical needs.
- Conduct customer workshops, webinars, and training sessions to educate customers on Azure solutions and services.
- Gather customer feedback and insights to help shape product and service offerings.

Sales Support:
- Partner with sales teams to develop sales strategies and drive cloud adoption.
- Prepare and deliver compelling presentations, demonstrations, and product pitches to customers.
- Assist in the preparation of RFPs, RFQs, and other customer documentation.

Continuous Learning and Development:
- Stay up to date with the latest AWS and Azure services, technologies, and industry trends.
- Achieve and maintain relevant AWS and Azure certifications to demonstrate expertise.
- Share knowledge and best practices with internal teams to enhance overall capabilities.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience in a presales or technical consulting role, with a focus on cloud solutions.
- In-depth knowledge of AWS and Azure services, with hands-on experience in designing and implementing cloud-based architectures.
- Azure certifications (e.g., Microsoft Certified: Azure Solutions Architect Expert) are highly preferred.
- Strong understanding of cloud computing concepts, including IaaS, PaaS, SaaS, and hybrid cloud models.
- Excellent presentation, communication, and interpersonal skills.
- Ability to work independently and collaboratively in a fast-paced, dynamic environment.

Preferred Qualifications:
- Experience with other cloud platforms (e.g., Google Cloud) is a plus.
- Familiarity with DevOps practices, CI/CD pipelines, and infrastructure as code (IaC) using Terraform, CloudFormation, and ARM templates.
- Experience with cloud security, compliance, and governance best practices.
- Background in software development, scripting, or system administration.

Join us to be part of an innovative team, shaping cloud solutions and driving digital transformation for our clients!

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Haryana

On-site

As an Engineering Leader at Crop.photo, you will play a crucial role in shaping the future of brand consistency through AI technology. With a focus on building a high-performing engineering team, you will not only lead by example through hands-on coding but also provide guidance to ensure the success of our projects. Your responsibilities will encompass a wide range of tasks, from architecting and developing our AWS-based microservices infrastructure to collaborating with product management on technical decision-making. You will be at the forefront of backend development using Java, Node.js, and Python within the AWS ecosystem, while also contributing to frontend development using React and TypeScript when necessary. Your expertise will be essential in designing and implementing scalable AI/ML pipeline architectures, establishing engineering best practices, and mentoring junior engineers to foster a culture of engineering excellence. Additionally, you will be responsible for system reliability, performance optimization, and cost management, ensuring that our platform delivers high-quality solutions for our marketing professionals.

To excel in this role, you must have a minimum of 8+ years of software engineering experience, including at least 3 years of experience leading engineering teams. Your technical skills should cover a wide range of AWS services, backend development, frontend development, system design & architecture, as well as leadership & communication. Your ability to drive architectural decisions, identify technical debt, and lead initiatives to address it will be key to the success of our projects.

Working at Crop.photo will provide you with the opportunity to take true technical ownership of a rapidly growing AI platform, shape architecture from an early stage, work with cutting-edge AI/ML technologies, and have a direct impact on product direction and engineering culture. Your success in this role will be measured by the implementation of scalable, maintainable architecture, reduction in system latency and processing costs, successful delivery of key technical initiatives, team growth and engineering velocity improvements, as well as system reliability and uptime metrics.

If you are passionate about building scalable systems, have a proven track record of technical leadership, and thrive in an early-stage environment where you can make a significant impact on both technology and team culture, we encourage you to apply for this exciting opportunity at Crop.photo.

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

You will be responsible for analyzing and debugging problems at the network, storage, and virtualization layers of scale-out distributed storage solutions in various cloud environments. Your role will involve developing a knowledge base to expedite the troubleshooting of customer issues and providing feedback on existing tools while identifying and creating new tools required for customer problem triaging. Additionally, you will research, diagnose, troubleshoot, and resolve customer issues, collaborating with engineering teams as necessary. Working with technical writing resources to document issue resolutions accurately and assisting in defining the support process from issue identification to resolution will also be part of your responsibilities.

Ideally, you should hold a BS/MS in Computer Science or equivalent and possess a minimum of 7+ years of experience in storage solutions. Your background should include testing and debugging storage systems, particularly distributed systems, along with a solid understanding of NFS protocols (v3, v4.1, pNFS). Familiarity with other storage protocols like SMB and S3, as well as virtualization technologies such as VMware, Hyper-V, and containers, will be beneficial. Knowledge of network solutions for clouds, including network virtualization technologies, is also desirable. Leadership skills, the ability to take ownership of customer issues, commitment, focus, and exceptional customer service and communication skills are essential. You should be proficient in researching, diagnosing, troubleshooting, and providing solutions to customer problems. Experience in developing scripts using Python or other scripting languages would be advantageous.

In terms of personal characteristics, you should have a keen eye for distinguishing between perfection and adequacy, be prepared to tackle challenging tasks independently, be a team player, demonstrate good judgment, and be willing to question assumptions. You should be comfortable working in a fast-paced environment alongside a global team.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Kochi, Kerala

On-site

You will be responsible for planning, implementing, and growing the AWS cloud infrastructure. Your role will involve building, releasing, and managing the configuration of all production systems. It will be essential to manage a continuous integration and deployment methodology for server-based technologies. Collaboration with architecture and engineering teams to design and implement scalable software services will also be part of your responsibilities. Ensuring system security through the use of best-in-class cloud security solutions will be crucial. Staying up to date with new technology options and vendor products is important, and you will be expected to evaluate which ones would be suitable for the company. Implementing continuous integration/continuous delivery (CI/CD) pipelines when needed will also fall under your purview.

You will have the opportunity to recommend process and architecture improvements, troubleshoot the system, and resolve problems across all platform and application domains. Overseeing pre-production acceptance testing to maintain the high quality of the company's services and products will be part of your duties. Experience with Terraform, Ansible, Git, and CloudFormation will be beneficial for this role. Additionally, a solid background in Linux/Unix and Windows server system administration is required, along with experience configuring AWS CloudWatch monitoring, creating and modifying scripts, and hands-on experience with MySQL. You should have experience in designing and building web environments on AWS, including working with services like EC2, ELB, RDS, and S3.

This is a full-time position with benefits such as Provident Fund and a yearly bonus. The work schedule is during the day shift, and the preferred experience level for AWS is 3 years. The work location is in person.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

We are looking for individuals who are risk-takers, collaborators, inspired, and inspirational. We seek those who are courageous enough to work on the cutting edge and develop solutions that will enhance and enrich the lives of people globally. If you aspire to make a difference that wows the world, we are eager to have a conversation with you. If you believe this role aligns with your ambitions and skill set, we invite you to begin the application process. Explore our other available positions as well, as our numerous opportunities can pave the way for endless possibilities.

With 4 to 8 years of experience, the ideal candidate should possess the following primary skills:
- Proficiency in server-side (Java) development and the AWS serverless framework; hands-on experience with the serverless framework is a must.
- Design knowledge and experience in cloud-based web applications, with familiarity with software design representation tools like Astah and Visio.
- Strong experience in AWS, including but not limited to EC2 volumes, EC2 security groups, EC2 AMIs, Lambda, S3, AWS Backup, CloudWatch, CloudFormation, CloudTrail, IAM, Secrets Manager, Step Functions, Cost Explorer, KMS, and VPC/subnets.
- Ability to understand business requirements concerning UI/UX.
- Work experience on development, staging, and production servers.
- Proficiency in testing and verification, plus knowledge of SSL certificates and encryption.
- Familiarity with Docker containerization.

In addition to technical skills, soft skills are also crucial, including:
- Excellent interpersonal, oral, and written communication skills.
- Strong analytical and problem-solving abilities.
- Capability to comprehend and analyze customer requirements and expectations.
- Experience in interacting with customers.
- Previous work with international cross-culture teams is a plus.

Secondary Skills:
- Scripting using Python.
- Knowledge of identity management is advantageous.
- Understanding of UI/UX and ReactJS/TypeScript/Bootstrap.
- Proficiency in business use cases concerning UI/UX.
- Troubleshooting issues related to integration on the cloud (front-end/back-end/system/services APIs).

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

This is a full-time on-site role for a PHP Laravel Developer based in Chennai. In this position, you will play a key role in developing and maintaining web applications utilizing the Laravel framework. Your responsibilities will include coding, debugging, testing, and deploying new features. Additionally, you will collaborate with cross-functional teams to create efficient and scalable solutions.

To excel in this role, you must possess strong proficiency in PHP and have hands-on experience with the Laravel framework. Familiarity with frontend technologies like HTML, CSS, and JavaScript is essential. Moreover, knowledge of database management systems, particularly MySQL, is required. Understanding RESTful APIs, integrating third-party services, and using version control systems like Git are also important aspects of this position. Candidates should have practical experience in schema design, query optimization, REST APIs, and AWS services such as EC2, S3, RDS, Lambda, and Redis. Proficiency in designing scalable and secure web applications, expertise in automated testing frameworks, and a solid grasp of web security practices are crucial for success in this role. The ideal candidate will be able to prioritize tasks effectively and work both independently and collaboratively as part of a team. Strong problem-solving and troubleshooting skills are essential, as are clear communication and the ability to work with others. A Bachelor's degree in computer science or a related field, or equivalent experience, is required.

Requirements:
- Strong proficiency in PHP with the Laravel framework
- Experience in HTML, CSS, and JavaScript
- Knowledge of MySQL and RESTful APIs
- Familiarity with Git and version control systems
- Hands-on experience with schema design, query optimization, and REST APIs
- Profound knowledge of AWS services
- Demonstrated experience in designing scalable and secure web applications
- Expertise in automated testing frameworks
- Strong understanding of web security practices
- Ability to prioritize tasks and work independently or as part of a team
- Excellent problem-solving and troubleshooting skills
- Good communication and collaboration skills
- Bachelor's degree or equivalent experience in computer science or a related field

Experience: 4+ years
Location: Chennai/Madurai

Interested candidates can share their CV at anushya.a@extendotech.com / 6374472538.

Job Type: Full-time. Benefits: Health insurance, Provident Fund. Location Type: In-person. Schedule: Morning shift. Work Location: In person.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

Tezo is a new-generation Digital & AI solutions provider with a history of creating remarkable outcomes for our customers. We bring exceptional experiences using cutting-edge analytics, data proficiency, technology, and digital excellence.

Job Overview: The AWS Architect with Data Engineering skills will be responsible for designing, implementing, and managing scalable, robust, and secure cloud infrastructure and data solutions on AWS. This role requires a deep understanding of AWS services, data engineering best practices, and the ability to translate business requirements into effective technical solutions.

Key Responsibilities

Architecture Design:
- Design and architect scalable, reliable, and secure AWS cloud infrastructure.
- Develop and maintain architecture diagrams, documentation, and standards.

Data Engineering:
- Design and implement ETL pipelines using AWS services such as Glue, Lambda, and Step Functions.
- Build and manage data lakes and data warehouses using AWS services like S3, Redshift, and Athena.
- Ensure data quality, data governance, and data security across all data platforms.

AWS Services Management:
- Utilize a wide range of AWS services (EC2, S3, RDS, Lambda, DynamoDB, etc.) to support various workloads and applications.
- Implement and manage CI/CD pipelines using AWS CodePipeline, CodeBuild, and CodeDeploy.
- Monitor and optimize the performance, cost, and security of AWS resources.

Collaboration and Communication:
- Work closely with cross-functional teams including software developers, data scientists, and business stakeholders.
- Provide technical guidance and mentorship to team members on best practices in AWS and data engineering.

Security and Compliance:
- Ensure that all cloud solutions follow security best practices and comply with industry standards and regulations.
- Implement and manage IAM policies, roles, and access controls.

Innovation and Improvement:
- Stay up to date with the latest AWS services, features, and best practices.
- Continuously evaluate and improve existing systems, processes, and architectures.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

This is a remote position.

Job Details
Job Title: Sr. QA Analyst (Automation Engineer)
Office Location: Office No: 403-405, Time Square, CG Road, Ellisbridge, Ahmedabad, Gujarat-380006
Duration & Type of Employment: Full-time, permanent; 4+ years of experience required
Work Style: Hybrid (3 in-office days a week)

Requirements

Tech Stack:
- Selenium WebDriver (browser automation)
- Playwright (modern web automation)
- TestNG / JUnit (WebDriver)
- Appium testing
- Mocha / Jest
- API testing: Postman, Playwright
- Mocking: Playwright network mocks, WireMock
- Programming languages: JS, TS, Python

Senior Responsibilities:
- Mentoring: Guide junior QAs on automation best practices.
- Automation Framework Design: Design and maintain WebDriver/Playwright automation frameworks (a Playwright sketch follows below).
- CI/CD Integration: Ensure automated tests are part of the CI/CD pipeline (GitHub Actions).
- Test Reporting & Continuous Improvement: Oversee test reporting, monitor results, and improve automation efficiency.

Testing to be done on the following tech stack:
- AWS serverless Lambda with Node.js
- API Gateway (REST/JSON)
- DynamoDB
- S3
- API integration
- React.js (website) / React Native (app)

Requirements:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field, or equivalent experience.
- Proven experience as a QA Analyst, Software Tester, Developer, or in a similar role.
- Strong understanding of software testing concepts, methodologies, and best practices.
- Proficiency in test case design, test execution, and defect tracking.
- Experience with manual testing of web and mobile applications across different platforms and devices.
- Experience with JavaScript testing frameworks like Jest and Vitest.
- Knowledge of automated testing tools and setting up frameworks.
- Solid knowledge of defect tracking systems and experience working with bug tracking tools.
- Strong analytical and problem-solving skills, with the ability to think critically and troubleshoot issues.
- Excellent attention to detail and the ability to meticulously follow test plans and procedures.
- Effective communication and collaboration skills to work with cross-functional teams and stakeholders.
- Knowledge of Agile methodologies and experience working in Agile development environments.
- Familiarity with continuous integration/continuous deployment (CI/CD) pipelines and tools.
- Ability to adapt to changing priorities and work under tight deadlines.
- Knowledge of the software development lifecycle (SDLC) and software engineering principles.

Responsibilities:
- Collaborate with cross-functional teams to understand project requirements and define test strategies and plans.
- Develop, document, and maintain detailed test cases and test scripts based on project requirements and functional specifications.
- Execute manual and automated tests to verify software functionality, performance, usability, and security.
- Identify, document, and track software defects using a bug tracking system and work closely with the development team to ensure timely resolution.
- Participate in the review of product requirements, design documents, and specifications to provide input on testability and quality aspects.
- Perform exploratory testing and provide feedback on user experience and potential usability issues.
- Conduct regression testing to ensure that software changes and updates do not introduce new defects.
- Collaborate with software developers to reproduce and debug reported issues, and provide clear and concise steps to reproduce.
- Continuously improve the QA process by identifying inefficiencies, proposing solutions, and implementing best practices.
- Stay up to date with industry trends and advancements in software testing methodologies and tools.
- Communicate test progress, test results, and other relevant information to project stakeholders.

Bonus Skills:
- Developer experience.
- ISTQB or similar certification in software testing.
- Experience with performance testing and load testing tools (e.g., JMeter, LoadRunner).
- Knowledge of test automation frameworks and scripting languages (e.g., Java, Python, JavaScript).
- Familiarity with API testing and tools like Postman or SoapUI.
- Experience with database testing and the SQL query language.
- Understanding of security testing concepts and tools (e.g., OWASP ZAP, Burp Suite).
- Experience with test management tools (e.g., TestRail, Zephyr).
- Knowledge of usability testing and user experience (UX) principles.
- Start-up experience.

Benefits:
- Hybrid working culture
- Amazing perks and medical benefits
- 5-day work week
- Mentorship programs and certification courses
- Flexible work arrangements
- Free drinks fridge and snacks
- Competitive salary and recognition
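As a small illustration of the Playwright-with-network-mocking combination listed in the tech stack, here is a hedged sketch using Playwright's Python bindings (Python is among the listed languages); the URL, route pattern, and selector are assumptions.

```python
# Playwright sketch: deterministic UI check with a mocked backend API.
# URL, route pattern, and the #plan selector are assumptions.
import json
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()

    # Mock the backend API so the UI test does not depend on live data.
    page.route(
        "**/api/profile",
        lambda route: route.fulfill(
            status=200,
            content_type="application/json",
            body=json.dumps({"name": "Asha", "plan": "pro"}),
        ),
    )

    page.goto("https://app.example.com/profile")
    assert page.locator("#plan").inner_text() == "pro"
    browser.close()
```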

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Senior / Lead Full Stack or Frontend Engineer, you will be responsible for delivering performant, scalable, and high-quality cloud-based software, encompassing both frontend and backend components. Your key duties will include conducting code reviews, mentoring team members to align with product requirements, and collaborating with the Senior Architect to make design and technology decisions for the product development roadmap.

We are seeking candidates with a strong educational background from prestigious institutions like IIT, NIT, BITS, or other Tier 1 colleges, or individuals from non-premium institutes with experience in product companies. The ideal candidate should possess a comprehensive understanding of developing cloud-based software, including backend APIs and frontend frameworks like React and Angular. Proficiency in scalable design patterns and message-based systems such as Kafka, RabbitMQ, Redis, MongoDB, ORMs, and SQL, along with experience in AWS services like S3, IAM, and Lambda, is essential. You should have expert-level coding skills in Node.js, TypeScript, Scala, React, and Angular, with a focus on user-centric, mobile-first designs on the frontend. Experience with hybrid frontends such as React Native and Electron will be considered a plus.

Join us in Bangalore for a full-time, permanent position (currently remote; relocation required post-pandemic) and contribute to building cutting-edge cloud software solutions that drive our product forward.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Haryana

On-site

Genpact is a global professional services and solutions firm focused on delivering outcomes that shape the future. With over 125,000 employees in more than 30 countries, we are driven by curiosity, agility, and the desire to create lasting value for our clients. Our purpose is the relentless pursuit of a world that works better for people, serving and transforming leading enterprises, including Fortune Global 500 companies, through deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

We are currently seeking applications for the position of Lead Consultant - Databricks Developer (AWS). As a Databricks Developer in this role, you will be responsible for solving cutting-edge real-world problems to meet both functional and non-functional requirements.

Responsibilities:
- Stay updated on new and emerging technologies and explore their potential applications for service offerings and products.
- Collaborate with architects and lead engineers to design solutions that meet functional and non-functional requirements.
- Demonstrate knowledge of relevant industry trends and standards.
- Showcase strong analytical and technical problem-solving skills.
- Possess excellent coding skills, particularly in Python or Scala, with a preference for Python.

Minimum Qualifications:
- Bachelor's degree in CS, CE, CIS, IS, MIS, or an engineering discipline, or equivalent work experience.
- Experience in the Data Engineering domain.
- Completed at least 2 end-to-end projects in Databricks.

Additional Qualifications:
- Familiarity with Delta Lake, Databricks Connect, DB API 2.0, and Databricks workflow orchestration.
- Understanding of the Databricks Lakehouse concept and its implementation in enterprise environments.
- Ability to create complex data pipelines, with strong knowledge of data structures and algorithms.
- Proficiency in SQL and Spark SQL.
- Experience in performance optimization to enhance efficiency and reduce costs.
- Experience with both batch and streaming data pipelines, and extensive knowledge of the Spark and Hive data processing frameworks.
- Experience with cloud platforms (Azure, AWS, GCP) and common services like ADLS/S3, ADF/Lambda, CosmosDB/DynamoDB, ASB/SQS, and cloud databases.
- Skilled in writing unit and integration test cases.
- Excellent communication skills and experience working in teams of 5 or more.
- A positive attitude towards learning new skills and upskilling.
- Knowledge of Unity Catalog and basic governance, and an understanding of Databricks SQL endpoints.
- Experience with CI/CD for building Databricks job pipelines, and exposure to migration projects for building unified data platforms.
- Familiarity with DBT, Docker, and Kubernetes.

This is a full-time position based in Gurugram, India. The job was posted on August 5, 2024, and the posting closes on October 4, 2024.

Posted 2 weeks ago

Apply

10.0 - 17.0 years

0 Lacs

Hyderabad, Telangana

On-site

We have an exciting opportunity for an ETL Data Architect position with an AI/ML-driven SaaS solution product company in Hyderabad. As an ETL Data Architect, you will play a crucial role in designing and implementing a robust data access layer to provide consistent data access over the underlying heterogeneous storage layer. You will also be responsible for developing and enforcing data governance policies to ensure data security, quality, and compliance across all systems.

In this role, you will lead the architecture and design of data solutions that leverage the latest tech stack and AWS cloud services. Collaboration with product managers, tech leads, and cross-functional teams will be essential to align data strategy with business objectives. Additionally, you will oversee the performance optimization, scalability, and reliability of data systems while guiding and mentoring team members on data architecture, design, and problem-solving.

The ideal candidate should have at least 10 years of experience in data-related roles, with a minimum of 5 years in a senior leadership position overseeing data architecture and infrastructure. A deep background in designing and implementing enterprise-level data infrastructure, preferably in a SaaS environment, is required. Extensive knowledge of data architecture principles, data governance frameworks, security protocols, and performance optimization techniques is essential. Hands-on experience with AWS services such as RDS, Redshift, S3, Glue, and DocumentDB, as well as other services like MongoDB and Snowflake, is highly desirable. Familiarity with big data technologies (e.g., Hadoop, Spark) and modern data warehousing solutions is a plus. Proficiency in at least one programming language (e.g., Node.js, Java, Golang, Python) is a must.

Excellent communication skills are crucial in this role, with the ability to translate complex technical concepts to non-technical stakeholders. Proven leadership experience, including team management and cross-functional collaboration, is also required. A Bachelor's degree in Computer Science, Information Systems, or a related field is necessary, with a Master's degree preferred.

Preferred qualifications include experience with Generative AI and Large Language Models (LLMs) and their applications in data solutions, as well as familiarity with financial back-office operations and the FinTech domain. You should stay updated on emerging trends in data technology, particularly in AI/ML applications for finance.

Industry: IT Services and IT Consulting

Posted 2 weeks ago

Apply