6.0 - 8.0 years
0 Lacs
Gurugram
Work from Office
Design and manage Azure-based data pipelines, models, and ETL processes. Ensure governance, security, performance, and real-time analytics while leveraging DevOps, CI/CD, and big data tools for reliable, scalable solutions.

Benefits: Health insurance, maternity policy, annual bonus, provident fund, gratuity.
Posted 1 day ago
5.0 - 10.0 years
20 - 35 Lacs
Bengaluru
Work from Office
Seikor is hiring for Tricon Infotech Pvt. Ltd. (https://www.triconinfotech.com/). We are seeking full-stack Python developers. We are offering INR 1500 if you clear the round 1 interview and are selected for the round 2 interview. Apply, earn during the process, and find your next awesome job at Tricon, powered by Seikor.

Job Title: Python Full-stack Developer
Location: Bengaluru, India
Experience: 4 - 10 Years
Team Size: 500-1,000 employees globally
Function: Software Development

Job Summary: We are looking for a skilled and experienced Python Full Stack Developer with hands-on experience in AWS. The ideal candidate should have a strong foundation in backend development using Python and frameworks like Django or Flask. This role offers an exciting opportunity to work on dynamic, scalable applications in a collaborative and fast-paced environment.

Key Responsibilities:
- Lead and mentor a team of engineers, especially data engineers
- Architect scalable, secure backend systems using Python, FastAPI, and AWS (see the sketch below)
- Drive data infrastructure decisions with PostgreSQL, Redshift, and advanced data pipelines
- Collaborate cross-functionally to integrate AI-first features and stay ahead of emerging AI trends
- Ensure delivery of high-quality, maintainable code and manage technical debt

Required Skills & Qualifications:
- Strong leadership and communication skills
- Deep understanding of AWS services (EC2, Lambda, S3, IAM, Redshift)
- Advanced proficiency in Python and FastAPI
- Expertise in relational databases (PostgreSQL) and data warehousing (Redshift)
- Proven experience in ETL pipelines, data modeling, and optimization
- Ability to thrive in fast-paced, iterative environments

Nice to Have:
- Experience with AI/ML pipelines or data science platforms
- Familiarity with Airflow or similar orchestration tools
- Exposure to DevOps practices and CI/CD pipelines

Soft Skills:
- Engineer-first mindset
- Team-oriented culture
- Growth mindset
- Strong problem-solving skills

Educational Qualification: Bachelor's or Master's degree in Computer Science, Engineering, or a related field
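As an illustration of the Python-plus-FastAPI stack this posting names, here is a minimal sketch of a FastAPI endpoint backed by PostgreSQL. The table, column, and connection details are hypothetical placeholders, not part of the posting; in practice credentials would come from configuration or AWS Secrets Manager.

```python
# Minimal sketch (assumed names): a FastAPI endpoint reading from PostgreSQL.
from fastapi import FastAPI, HTTPException
import psycopg2

app = FastAPI()

def get_conn():
    # Connection parameters are placeholders, not real infrastructure.
    return psycopg2.connect(host="db.example.internal", dbname="appdb",
                            user="app_user", password="change-me")

@app.get("/orders/{order_id}")
def read_order(order_id: int):
    with get_conn() as conn, conn.cursor() as cur:
        cur.execute("SELECT id, status, total FROM orders WHERE id = %s", (order_id,))
        row = cur.fetchone()
    if row is None:
        raise HTTPException(status_code=404, detail="order not found")
    return {"id": row[0], "status": row[1], "total": float(row[2])}
```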
Posted 1 day ago
7.0 - 9.0 years
10 - 15 Lacs
Hyderabad, Gurugram
Work from Office
Job Role: Data Engineer
Experience: 7+ years
Location: Hyderabad and Gurugram
Shift: till 12 AM - 1 AM

Experience: 7+ years in database architecture, data engineering, or a similar role. Extensive experience migrating legacy SQL databases (e.g., MS SQL Server, Oracle) to Azure PaaS solutions.

Technical Skills:
- Strong expertise in Azure SQL Database and Azure SQL Managed Instance.
- Proficiency in configuring and managing Cosmos DB for NoSQL applications.
- Hands-on experience with Azure Data Lake Storage for scalable data storage and analytics.
- Knowledge of modern data warehousing tools such as Azure Synapse Analytics, Snowflake, or Redshift.
- Expertise in data pipeline design using tools like Azure Data Factory and Databricks.
- Familiarity with database monitoring and performance tuning tools in Azure.
- Strong scripting and query optimization skills using T-SQL, Python, and PowerShell.
- Knowledge of EDI/ECP protocols is desired.

Cloud Expertise: Deep understanding of Azure cloud services, including networking, storage, and security.
Posted 1 day ago
0.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Description:
Business Title: Data Management DevOps Analyst
Years of Experience: 0-1

Job Description: Our Analyst is curious and self-driven to maintain, optimize, and build additional components to multi-terabyte operational marketing databases and integrate them with cloud technologies. Once hired, the qualified candidate will be immersed in the monitoring and maintenance of Microsoft SQL Server database solutions to meet client business objectives. You will have an opportunity to work with a cross-disciplinary team.

Must-Have Skills:
- Database: SQL Server
- Scripting: Python, SQL
- General: Microsoft Office
- Other: Good verbal & written communication skills

Good-to-Have Skills:
- Python
- Google Cloud
- Snowflake, Cisco Tidal (automation tool)
- Redshift DB
- JIRA
- Client-facing skills

Key Responsibilities:
- Monitoring and debugging automation/scheduled jobs
- Maintaining/writing basic SQL scripts in Microsoft SQL Server for data analysis and data extracts (illustrated below)
- Understanding and debugging ETL flow
- Maintaining data mart extraction processes
- Performing quality assurance and testing at the unit level
- Writing user and technical documentation
- Communicating efficiently with various stakeholders such as the client, IT team, dev teams, etc.

Education Qualification: Bachelor's degree in Computer Science or equivalent
Relevant Experience: 0 - 1 yr
Shift Timing: Shift & time zones - working in client-specific time zones and rotational shifts, as required.
Location: DGS India - Mumbai - Goregaon Prism Tower
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
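By way of illustration of the basic SQL Server extract work this role maintains, here is a minimal Python sketch using pyodbc to run an aggregate query and write the result to CSV. The server, database, table, and column names are hypothetical placeholders.

```python
# Minimal sketch (assumed server, table, and column names): a basic SQL Server
# data extract written to CSV with pyodbc.
import csv
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlserver.example.internal;DATABASE=marketing;"
    "UID=report_user;PWD=change-me"
)

query = """
SELECT campaign_id, COUNT(*) AS responses, SUM(revenue) AS total_revenue
FROM dbo.campaign_responses
WHERE response_date >= ?
GROUP BY campaign_id
ORDER BY total_revenue DESC
"""

with conn.cursor() as cur, open("campaign_extract.csv", "w", newline="") as fh:
    cur.execute(query, "2024-01-01")
    writer = csv.writer(fh)
    writer.writerow([col[0] for col in cur.description])  # header row
    writer.writerows(cur.fetchall())
```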
Posted 2 days ago
8.0 - 12.0 years
17 - 18 Lacs
bengaluru
Work from Office
Must-Haves:
- MUST-HAVE: Overall technology experience of 8+ years
- MUST-HAVE: Minimum experience of 5 years in data modelling and database design
- MUST-HAVE: Minimum experience of 7 years in designing, implementing, and supporting medium to large scale database systems
- MUST-HAVE: Minimum experience of 5 years in designing, developing, and supporting solutions using S3, Redshift, DynamoDB, and any of the Managed RDS offerings
- MUST-HAVE: Minimum experience of 4 years designing, developing, and tuning solutions using AWS database and storage technologies

Preferred:
- Prior experience with designing, developing, and supporting solutions using database technologies like MySQL, PostgreSQL, and Cassandra is a plus
- Experience with designing, developing, and supporting solutions using MapReduce, Kafka, and streaming technologies is a plus
- Advanced Python programming skills are a plus

Roles & Responsibilities:
- Understand the business domain, core data objects, and data entities; model the relationships between the various entities
- Design the data warehouse, data mart, and transactional databases, including all facets of load parameters
- Induct aspects of high performance, security, usability, operability, maintainability, traceability, observability, and evolvability into the systems design
- Assess performance-influencing parameters like normalization, de-normalization, most-executed transactions, record count, data size, and I/O parameters at the database and OS level in the database and table designs
- Maintain a catalog of meta, master, transactional, and reference data
- Tune the transactions and queries and determine the use of appropriate client libraries and fetch mechanisms (like query vs. stored procedures)
- Design the system for resilience, fail-over, and self-healing, and institute rollback plans
- Develop and test database code and other core and helper utilities in Python
- Develop and profile queries, triggers, indices, and stored procedures
- Monitor the health of queries and identify patterns leading to bottlenecks in the system before the customer finds them
- Own the DevOps and release management practices pertaining to the database solutions
- Estimate the cost of AWS services usage and look to continuously optimize the cost
- Design and develop a data REST API layer in Python
Posted 2 days ago
8.0 - 13.0 years
15 - 27 Lacs
Bengaluru
Hybrid
Job Description: We are seeking a visionary and experienced Senior Data Architect to lead the design and implementation of our enterprise-wide data architecture. The role requires a solid foundation in Java, Spring, and SQL, and strong knowledge of modern data platforms and cloud technologies like Azure Databricks, Snowflake, BigQuery, etc. You will be responsible for modernizing our data infrastructure, ensuring security and accessibility of data assets, and providing strategic direction to the data team.

Key Responsibilities:
- Define and implement enterprise data architecture aligned with organizational goals.
- Design and lead scalable, secure, and resilient data platforms for structured & unstructured data.
- Architect cloud-native data solutions using tools like Databricks, Snowflake, Redshift, BigQuery.
- Lead design and integration of data lakes, warehouses, and ETL pipelines.
- Collaborate with cross-functional teams and leadership to define data needs and deliver solutions.
- Guide data engineers and analysts in best practices, modeling, and governance.
- Drive initiatives around data quality, metadata, lineage, and master data management (MDM).
- Ensure compliance with data privacy regulations (GDPR, HIPAA, CCPA).
- Lead modernization/migration of legacy systems to modern cloud platforms.

Must-Have Skills:
- Strong expertise in Java, Spring Framework, and SQL.
- Experience with Azure Databricks or similar cloud data platforms.
- Hands-on with Snowflake, BigQuery, Redshift, or Azure Synapse.
- Deep understanding of data modeling tools like Erwin or ER/Studio.
- Proven experience designing data platforms in hybrid/multi-cloud setups.
- Strong background in ETL/ELT pipelines, data APIs, and integration.
- Proficient in Python or similar languages used for data engineering.
- Knowledge of DevOps and CI/CD processes in data pipelines.

Preferred Qualifications:
- 10+ years of experience in Data Architecture.
- At least 3 years in a senior or lead role.
- Familiarity with data governance, security policies, identity management, and RBAC.
- Excellent leadership, communication, and stakeholder management skills.
Posted 1 month ago
2.0 - 7.0 years
3 - 6 Lacs
Bengaluru
Work from Office
Looking for an AWS & DevOps trainer to take 1-hour daily virtual classes (Mon-Fri). Should cover AWS services and DevOps tools (Jenkins, Docker, K8s, etc.), give hands-on tasks, guide on interviews & certifications, and support doubt-clearing sessions.
Posted 1 month ago
7.0 - 12.0 years
7 - 17 Lacs
Hyderabad, Bengaluru
Hybrid
Hexaware Technologies is hiring AWS Redshift developers.

Primary Skill Set: AWS Redshift, Glue, Lambda, PySpark
Total Experience Required: 6+ to 12 years
Location: Bangalore & Hyderabad only
Work Mode: Hybrid

Job Description:
Mandatory Skills (8 to 10 years):
- Experience with most cloud products, such as Amazon AWS.
- Experience as a developer in multiple cloud technologies including AWS EC2, S3, Amazon API Gateway, AWS Lambda, AWS Glue, AWS RDS, and AWS Step Functions.
- Good knowledge of the AWS environment and services, with an understanding of S3 storage.
- Must have good knowledge of AWS Glue and serverless architecture (see the PySpark sketch below).
- Must have good knowledge of PySpark.
- Must have good knowledge of SQL.

Nice-to-Have Skills:
- Collaborate with data analysts and stakeholders to meet data requirements.
- AWS Postgres experience for DB design.
- Must have worked with DynamoDB.

Interested candidates, kindly share your updated resume to ramyar2@hexaware.com with the details below:
Full Name:
Contact No:
Total Exp:
Rel Exp in AWS:
Current & Joining Location:
Notice Period (if serving, mention LWD):
Current CTC:
Expected CTC:
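To illustrate the PySpark-on-AWS work this posting names, here is a minimal PySpark batch job that reads Parquet from S3, aggregates it, and writes the result back. The bucket and column names are hypothetical, and a real AWS Glue script would typically wrap this in Glue's job and context boilerplate.

```python
# Minimal sketch (assumed bucket and column names): a PySpark batch job of the
# kind an AWS Glue script might run, reading Parquet from S3 and writing back.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-aggregate").getOrCreate()

orders = spark.read.parquet("s3://example-raw-bucket/orders/")

daily_totals = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(F.count("*").alias("order_count"),
         F.sum("amount").alias("total_amount"))
)

daily_totals.write.mode("overwrite").parquet(
    "s3://example-curated-bucket/daily_order_totals/"
)
```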
Posted 1 month ago
5.0 - 10.0 years
11 - 16 Lacs
Nagpur, Pune
Work from Office
JOB DESCRIPTION
Off-Shore Contract Data Engineering role that MUST work out of an approved Clean Room facility. The role is part of an Agile team in support of Financial Crimes data platforms and strategies, including but not limited to their use of SAS Grid and Snowflake.

JOB SUMMARY
Handle the design and construction of scalable data management systems, ensure that all data systems meet company requirements, and research new uses for data acquisition. Required to know and understand the ins and outs of the industry, such as data mining practices, algorithms, and how data can be used.

Primary Responsibilities:
- Design, construct, install, test, and maintain data management systems.
- Build high-performance algorithms, predictive models, and prototypes.
- Ensure that all systems meet the business/company requirements as well as industry practices.
- Integrate up-and-coming data management and software engineering technologies into existing data structures.
- Develop set processes for data mining, data modeling, and data production.
- Create custom software components and analytics applications.
- Research new uses for existing data.
- Employ an array of technological languages and tools to connect systems together.
- Collaborate with members of your team (e.g., data architects, the IT team, data scientists) on the project's goals.
- Install/update disaster recovery procedures.
- Recommend different ways to constantly improve data reliability and quality.
- Maintain up-to-date knowledge, support, and training documentation.

QUALIFICATIONS
- Technical degree or related work experience
- Proficiency and technical skills relating to SQL, MySQL, DBT, Snowflake, and SAS
- Exposure and experience with ETL (DataStage), scripting (Python, JavaScript, etc.), version control (Git), and highly regulated environments (banking, health care, etc.)
Posted 1 month ago
7.0 - 12.0 years
16 - 30 Lacs
Chennai
Hybrid
Responsibilities:
- Working with other members of the Database Services and external development teams to deliver projects specified in our company roadmap
- Owning, tracking, and resolving database-related incidents and requests across the following database platforms: PgSQL, SQL Server, OpenSearch, Redshift
- Fulfilling requests and resolving incidents within SLAs
- Reviewing service-related reports (e.g. database backups, maintenance, monitoring) on a daily basis to ensure operational issues are identified and resolved promptly
- Responding to database-related alerts and escalations and working with database engineering to come up with strategic solutions to recurring problems
- Identifying opportunities for process improvement, including enhancing automation for database platform provisioning
- Enhancing database monitoring & alerting platforms to ensure proactive alerting is always achieved
- Promotion of database engineering best practices within our cross-functional development teams
- Coaching/mentoring junior engineers

Where you'll be working: This hybrid role will have a defined work location that includes work from home and assigned office days as set by the manager.

You'll need to have:
- Experience with management & operations of database technologies & services, with the majority of your recent experience on the AWS platform
- Strong analytical & problem-solving skills
- In-depth experience in one or more of the following technologies operating in a high-volume, high-throughput environment: PgSQL, OpenSearch/ElasticSearch, Redshift, SQL Server
- Experience with the above technologies to include, but not limited to, the following: general database administration tasks; database troubleshooting & performance reviews; database & index design & maintenance; design & maintenance of partitioning; database upgrades; high availability & disaster recovery options; performance tuning & optimisation; security hardening & access provisioning; monitoring & alerting
- Experience in managing database platforms which operate regulatory controls such as GDPR/CCPA/CPRA, HIPAA, SOC II, etc.
- Experience in designing, developing & maintaining CI/CD pipelines for AWS infrastructure & database code deployments (using services such as Git, Bamboo, PowerShell, RedGate Flyway, CloudFormation templates, etc.)
- Ability to troubleshoot software, hardware & service-related issues
- Excellent organizational skills & attention to detail
- Excellent written and oral communication skills
- Ability to coach, mentor, and influence team members through the necessary database disciplines
Posted 1 month ago
4.0 - 9.0 years
0 - 2 Lacs
Nagpur, Pune, Bengaluru
Work from Office
Hi, we have one urgent open position for a Redshift database admin. Please find the details for the Redshift Admin role below:

Experience range: 7 to 10 yrs
Location: All Infocepts / Remote (only in exceptional cases)

Must-Haves:
- Overall experience of 7 to 10 yrs
- Redshift administration (key role; 5 years' experience with Redshift)
- Data Migration / Sync - specifically data UNLOAD and COPY (see the sketch below)
- Deployments
- Tuning / Optimization

Key Result Areas and Activities:
- Design and Development: Design, implement, and manage Redshift clusters for high availability, performance, and security.
- Performance Optimization: Monitor and optimize database performance, including query tuning and resource management.
- Backup and Recovery: Develop and maintain database backup and recovery strategies.
- Security Enforcement: Implement and enforce database security policies and procedures.
- Cost-Performance Balance: Ensure an optimal balance between cost and performance.
- Collaboration with Development Teams: Work with development teams to design and optimize database schemas and queries. Perform database migrations, upgrades, and patching.
- Issue Resolution: Troubleshoot and resolve database-related issues, providing support to development and operations teams. Automate routine database tasks using scripting languages and tools.

Must-Have:
- Strong understanding of database design, performance tuning, and optimization techniques
- Proficiency in SQL and experience with database scripting languages (e.g., Python, Shell)
- Experience with database backup and recovery, security, and high availability solutions
- Familiarity with AWS services and tools, including S3, EC2, IAM, and CloudWatch
- Operating System: any flavor of Linux, Windows
- Core Redshift administration skills: cluster management, performance optimization, workload management (WLM), vacuuming/analyzing tables for optimal performance, IAM policies, role-based access control, backup & recovery, automated backups, and restoration strategies
- SQL query optimization: distribution keys, sort keys, and compression encoding
- Knowledge of COPY and UNLOAD commands, S3 integration, and best practices for bulk data loading
- Scripting & automation for automating routine DBA tasks
- Expertise in debugging slow queries and troubleshooting system tables

Thanks & Regards,
Tharani S, Recruitment Lead
Sight Spectrum Technology Solutions Pvt. Ltd.
9500066211 | www.sightspectrum.com | tharani@sightspectrum.com | Chennai
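To illustrate the COPY/UNLOAD and S3 integration this role calls out, below is a minimal Python sketch that issues a COPY and an UNLOAD against Redshift through psycopg2. The cluster endpoint, bucket, table, and IAM role ARN are hypothetical placeholders, not part of the posting.

```python
# Minimal sketch (assumed endpoint, bucket, table, and IAM role): bulk-loading
# and unloading a Redshift table with the COPY and UNLOAD commands.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="dba_user", password="change-me",
)
conn.autocommit = True

with conn.cursor() as cur:
    # Bulk load CSV files from S3 into a staging table.
    cur.execute("""
        COPY staging.orders
        FROM 's3://example-bucket/orders/2024-06-01/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-role'
        FORMAT AS CSV IGNOREHEADER 1;
    """)
    # Unload a query result back to S3 as Parquet for downstream consumers.
    cur.execute("""
        UNLOAD ('SELECT order_date, SUM(amount) FROM staging.orders GROUP BY 1')
        TO 's3://example-bucket/exports/daily_totals_'
        IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-role'
        FORMAT AS PARQUET;
    """)
```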
Posted 1 month ago
5.0 - 10.0 years
10 - 18 Lacs
Bengaluru, Mumbai (All Areas)
Hybrid
About the Role: We are seeking a passionate and experienced Subject Matter Expert and Trainer to deliver our comprehensive Data Engineering with AWS program. This role combines deep technical expertise with the ability to coach, mentor, and empower learners to build strong capabilities in data engineering, cloud services, and modern analytics tools. If you have a strong background in data engineering and love to teach, this is your opportunity to create impact by shaping the next generation of cloud data professionals.

Key Responsibilities:
- Deliver end-to-end training on the Data Engineering with AWS curriculum, including:
  - Oracle SQL and ANSI SQL
  - Data Warehousing Concepts, ETL & ELT
  - Data Modeling and Data Vault
  - Python programming for data engineering
  - AWS Fundamentals (EC2, S3, Glue, Redshift, Athena, Kinesis, etc.)
  - Apache Spark and Databricks
  - Data Ingestion, Processing, and Migration Utilities
  - Real-time Analytics and Compute Services (Airflow, Step Functions)
- Facilitate engaging sessions, both virtual and in-person, and adapt instructional methods to suit diverse learning styles.
- Guide learners through hands-on labs, coding exercises, and real-world projects.
- Assess learner progress through evaluations, assignments, and practical assessments.
- Provide mentorship, resolve doubts, and inspire confidence in learners.
- Collaborate with the program management team to continuously improve course delivery and learner experience.
- Maintain up-to-date knowledge of AWS and data engineering best practices.

Ideal Candidate Profile:
- Experience: Minimum 5-8 years in Data Engineering, Big Data, or Cloud Data Solutions. Prior experience delivering technical training or conducting workshops is strongly preferred.
- Technical Expertise: Proficiency in SQL, Python, and Spark. Hands-on experience with AWS services: Glue, Redshift, Athena, S3, EC2, Kinesis, and related tools. Familiarity with Databricks, Airflow, Step Functions, and modern data pipelines.
- Certifications: AWS certifications (e.g., AWS Certified Data Analytics - Specialty) are a plus.
- Soft Skills: Excellent communication, facilitation, and interpersonal skills. Ability to break down complex concepts into simple, relatable examples. Strong commitment to learner success and outcomes.

Email your application to: careers@edubridgeindia.in
Posted 1 month ago
5.0 - 6.0 years
6 - 15 Lacs
Navi Mumbai
Work from Office
Job Title: SQL & Redshift Developer
Location: Airoli, Navi Mumbai
Experience: 6+ Years
Notice Period: 10 to 15 Days

Required Skills:
- Strong proficiency in SQL and relational database concepts.
- Redshift developer with expertise in data warehousing, ETL, and SQL optimization for scalable analytics solutions.
- In-depth knowledge of stored procedures, functions, and triggers.
- Experience with SQL tuning and performance optimization.
- Excellent understanding of joins (inner, outer, cross, etc.).
- Hands-on experience in DB design, including normalization and indexing strategies.

Thanks & Regards,
Chetna Gidde | HR Associate - Talent Acquisition | chetna.gidde@rigvedit.com
Posted 1 month ago
6.0 - 8.0 years
0 - 2 Lacs
Jaipur
Work from Office
Role & Responsibilities:
As a Senior/Lead Database Administrator, you'll be a hands-on leader overseeing our critical data infrastructure across Amazon Redshift, PostgreSQL, MySQL, and MSSQL environments. Your core responsibilities include:
- Strategic Design & Implementation: Lead the architectural design, development, and implementation of robust database solutions, ensuring alignment with organizational needs and best practices.
- System Management: Architect, install, configure, and maintain all database management systems, optimizing for performance and reliability.
- Advanced Security: Implement and enforce comprehensive security measures, access controls, and audit mechanisms to protect data integrity and confidentiality.
- Disaster Recovery & HA: Develop, test, and maintain advanced backup, recovery, and high-availability strategies to ensure business continuity.
- Performance Tuning: Proactively monitor and optimize database performance through query analysis, indexing, and configuration tuning.
- Capacity Planning: Assess current and future capacity needs, planning and implementing scalable solutions to support growth.
- Operational Excellence: Oversee routine and complex database maintenance, ensuring continuous and reliable operations.
- Expert Troubleshooting: Diagnose and resolve complex database issues, collaborating with IT teams for systemic problem-solving.
- Data Migration & Upgrades: Lead data migration projects and manage database software upgrades, ensuring data integrity.
- Technical Leadership & Collaboration: Mentor team members, advise developers on database best practices, and ensure seamless integration of database systems with the broader IT infrastructure.

Preferred Candidate Profile:
Is education overrated? Yes, we believe so. However, there is no way to locate you otherwise, so unfortunately we might have to look for a Bachelor's or Master's degree in engineering from a reputed institute, or you should have been programming since age 12, and the latter is better. We will find you faster if you mention the latter in some manner. Not just a degree; we are not too thrilled by tech certifications either :) To reiterate: a passion for tech, an insatiable desire to learn the latest new-age cloud tech, a highly analytical aptitude, and a strong desire to deliver outlive those fancy degrees!
- 6-9 years of experience in cloud database administration
- Proficient in Amazon Redshift/PostgreSQL/MySQL/MSSQL database management, including installation, configuration, and performance tuning
- Experience with Amazon RDS or Cloud SQL and managing databases in cloud environments
- Proven experience as a Database Administrator with a focus on Amazon Redshift/PostgreSQL/MySQL/MSSQL
- In-depth knowledge of database design, implementation, and management for Amazon Redshift/PostgreSQL/MySQL/MSSQL environments
- Familiarity with database security, backup procedures, and performance tuning for these systems
- Strong analytical and problem-solving skills
- Excellent communication and collaboration abilities
- Relevant certifications are a plus
Posted 1 month ago
3.0 - 8.0 years
7 - 17 Lacs
Hyderabad
Work from Office
Job Title: Database Engineer Analytics – L

Responsibilities:
As a Database Engineer supporting the bank's Analytics platforms, you will be part of a centralized team of database engineers who are responsible for the maintenance and support of Citizens' most critical databases. A Database Engineer will be responsible for:
• Requires conceptual knowledge of database practices and procedures such as DDL, DML, and DCL.
• Requires knowing how to use basic SQL skills, including SELECT, FROM, WHERE, and ORDER BY.
• Ability to code SQL joins, subqueries, aggregate functions (AVG, SUM, COUNT), and use data manipulation techniques (UPDATE, DELETE) - see the sketch below.
• Understanding basic data relationships and schemas.
• Develop basic entity-relationship diagrams.
• Conceptual understanding of cloud computing.
• Can solve routine problems using existing procedures and standard practices.
• Can look up error codes and open tickets with vendors.
• Ability to execute explain plans and identify poorly written queries.
• Review data structures to ensure they adhere to database design best practices.
• Develop a comprehensive backup plan.
• Understanding of the different cloud models (IaaS, PaaS, SaaS), service models, and deployment options (public, private, hybrid).
• Solves standard problems by analyzing possible solutions using experience, judgment, and precedents.
• Troubleshoot database issues, such as integrity issues, blocking/deadlocking issues, log shipping issues, connectivity issues, security issues, memory issues, disk space, etc.
• Understanding of cloud security concepts, including data protection, access control, and compliance.
• Manages risks that are associated with the use of information technology.
• Identifies, assesses, and treats risks that might affect the confidentiality, integrity, and availability of the organization's assets.
• Ability to design and implement highly performing databases using partitioning and indexing that meet or exceed the business requirements.
• Documents a complex software system design as an easily understood diagram, using text and symbols to represent the way data needs to flow.
• Ability to code complex SQL.
• Performs effective backup management and periodic database restoration testing.
• General DB cloud networking skills: VPCs, SGs, KMS keys, private links.
• Ability to develop stored procedures and at least one scripting language for reusable code and improved performance. Know how to import and export data into and out of databases using ETL tools, code, or migration tools like DMS or scripts.
• Knowledge of DevOps principles and tools, such as CI/CD.
• Attention to detail and a customer-centric approach.
• Solves complex problems by taking a new perspective on existing solutions; exercises judgment based on the analysis of multiple sources of information.
• Ability to optimize queries for performance and resource efficiency.
• Review database metrics to identify performance issues.

Required Qualifications:
• 2-10+ years of experience with database management/administration: Redshift, Snowflake, or Neo4j
• 2-10+ years of experience working with incident, change, and problem management processes and procedures
• Experience maintaining and supporting large-scale critical database systems in the cloud
• 2+ years of experience working with AWS cloud-hosted databases
• An understanding of one programming language, including at least one front-end framework (Angular/React/Vue), such as Python 3, Java, JavaScript, Ruby, Golang, C, C++, etc.
• Experience with cloud computing, ETL, and streaming technologies: OpenShift, DataStage, Kafka
• Experience with agile development methodology
• Strong SQL performance & tuning skills
• Excellent communication and client-interfacing skills
• Strong team collaboration skills and capacity to prioritize tasks efficiently

Desired Qualifications:
• Experience working in an agile development environment
• Experience working in the banking industry
• Experience working in cloud environments such as AWS, Azure, or Google
• Experience with CI/CD pipelines (Jenkins, Liquibase, or equivalent)

Education and Certifications:
• Bachelor's degree in computer science or a related discipline
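As a small illustration of the SQL skills listed above (joins, aggregates, filtering, ORDER BY), here is a sketch of such a query run from Python. The table and column names are hypothetical, and the driver shown (psycopg2 against a PostgreSQL-compatible endpoint) is only one of several a team like this might use.

```python
# Minimal sketch (assumed schema): a join with aggregates, filtered and
# ordered, executed from Python against a PostgreSQL-compatible database.
import psycopg2

QUERY = """
SELECT c.region,
       COUNT(*)      AS txn_count,
       AVG(t.amount) AS avg_amount,
       SUM(t.amount) AS total_amount
FROM transactions t
JOIN customers c ON c.customer_id = t.customer_id
WHERE t.txn_date >= %s
GROUP BY c.region
ORDER BY total_amount DESC;
"""

with psycopg2.connect(host="analytics-db.example.internal",
                      dbname="analytics", user="engineer",
                      password="change-me") as conn:
    with conn.cursor() as cur:
        cur.execute(QUERY, ("2024-01-01",))
        for region, txn_count, avg_amount, total_amount in cur.fetchall():
            print(region, txn_count, round(avg_amount, 2), total_amount)
```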
Posted 2 months ago
5.0 - 10.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Here's a comprehensive description and job description (JD) for a Data Analyst Lead role, suitable for a growth-oriented organization, especially in tech, product, or edtech environments:

Role: Data Analyst Lead

Role Overview
We are seeking a Data Analyst Lead who will take ownership of our analytics strategy, leading a team of analysts and working closely with stakeholders across product, marketing, operations, and engineering. The ideal candidate is not only strong in SQL, data modeling, and visualization but also highly business-savvy, capable of turning data into actionable insights that drive growth and efficiency. You will be responsible for building scalable dashboards, running deep-dive analyses, mentoring junior analysts, and enabling data-driven decisions across the company.

Key Responsibilities
- Lead & Mentor: Manage a small team of data analysts. Set direction, review work, provide mentorship, and grow team capabilities.
- Business Partnering: Collaborate with stakeholders across product, marketing, sales, finance, and ops to understand their data needs and translate them into impactful analytics projects.
- Data Strategy: Define and drive data quality, consistency, and accessibility across departments. Build scalable data pipelines in collaboration with engineering teams if needed.
- Analysis & Reporting: Design, implement, and maintain dashboards (using tools like QuickSight, Metabase). Perform exploratory data analysis (EDA), cohort analysis, funnel drop-off analysis, retention modeling, etc. (see the sketch below). Provide insights on KPIs like customer acquisition cost (CAC), LTV, churn, NPS, and usage metrics.
- Experimentation: Support A/B testing strategies by defining metrics, evaluating results, and communicating outcomes clearly.
- Drive Insights: Proactively identify business trends, anomalies, and opportunities for growth or efficiency using data.
- Tools & Tech: Oversee adoption of BI tools, ensure SQL standards, and guide best practices in dashboard design and analytics workflows.

Requirements
Must-Have
- 5+ years of experience in analytics, with at least 1-2 years leading a team or cross-functional projects.
- Advanced SQL skills and experience working with large datasets (PostgreSQL, Redshift, etc.).
- Strong skills in dashboarding and visualization (QuickSight, Metabase).
- Experience with statistical or data modeling tools (Python, R, or similar).
- Proven experience in designing KPIs and performance dashboards across teams.
- Business acumen and the ability to convert ambiguous problems into clear data-backed narratives.

Nice-to-Have
- Experience in a high-growth startup or tech/edtech environment.
- Exposure to data warehousing (e.g., AWS Glue, Redshift, Snowflake).
- Familiarity with experimentation frameworks or product analytics tools (Mixpanel, Amplitude, GA4).
- Understanding of data privacy and governance best practices.
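As an illustration of the cohort and retention analysis this role mentions, below is one possible Redshift-flavored query (DATEDIFF is Redshift syntax) computing monthly signup cohorts and how many users remain active by months since signup, held in a Python constant. The tables `users` and `events` and their columns are hypothetical.

```python
# Minimal sketch (assumed tables `users` and `events`): monthly signup cohorts
# with retention by months-since-signup, in Redshift-flavored SQL.
COHORT_RETENTION_SQL = """
WITH cohorts AS (
    SELECT user_id, DATE_TRUNC('month', signup_date) AS cohort_month
    FROM users
),
activity AS (
    SELECT DISTINCT user_id, DATE_TRUNC('month', event_ts) AS active_month
    FROM events
)
SELECT c.cohort_month,
       DATEDIFF(month, c.cohort_month, a.active_month) AS months_since_signup,
       COUNT(DISTINCT a.user_id)                       AS retained_users
FROM cohorts c
JOIN activity a ON a.user_id = c.user_id
GROUP BY 1, 2
ORDER BY 1, 2;
"""
```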
Posted 2 months ago
5.0 - 10.0 years
9 - 19 Lacs
Kolkata, Hyderabad, Pune
Work from Office
• Minimum 5+ years of working experience as a Databricks Developer
• Minimum 3+ years of working experience with Redshift, Python, PySpark, and AWS
• Associate should hold a Databricks certification and be willing to join within 30 days
Posted 2 months ago
11.0 - 19.0 years
20 - 35 Lacs
Faridabad
Remote
We are seeking an experienced and highly skilled Senior Data Engineer to drive data-driven decision-making and innovation. In this role, you will leverage your expertise in advanced analytics, machine learning, and big data technologies to solve complex business challenges. You will be responsible for designing predictive models, building scalable data pipelines, and uncovering actionable insights from structured and unstructured datasets. Collaborating with cross-functional teams, your work will empower strategic decision-making and foster a data-driven culture across the organization.

Key Responsibilities:

1. Data Exploration and Analysis:
- Collect, clean, and preprocess large and complex datasets from diverse sources, including SQL databases, cloud platforms, and APIs.
- Perform exploratory data analysis (EDA) to identify trends, patterns, and relationships in data.
- Develop meaningful KPIs and metrics tailored to business objectives.

2. Advanced Modeling and Machine Learning:
- Design, implement, and optimize predictive and prescriptive models using statistical techniques and machine learning algorithms (see the sketch below).
- Evaluate model performance and ensure scalability and reliability in production.
- Work with both structured and unstructured data for tasks such as text analysis, image processing, and recommendation systems.

3. Data Engineering and Automation:
- Build and optimize scalable ETL pipelines for data processing and feature engineering.
- Collaborate with data engineers to ensure seamless integration of data science solutions into production environments.
- Leverage cloud platforms (e.g., AWS, Azure, GCP) for scalable computation and storage.

4. Data Visualization and Storytelling:
- Communicate complex analytical findings effectively through intuitive visualizations and presentations.
- Create dashboards and visualizations using tools such as Power BI, Tableau, or Python libraries (e.g., Matplotlib, Seaborn, Plotly).
- Translate data insights into actionable recommendations for stakeholders.

5. Cross-functional Collaboration and Innovation:
- Partner with business units, product teams, and data engineers to define project objectives and deliver impactful solutions.
- Stay updated with emerging technologies and best practices in data science, machine learning, and AI.
- Contribute to fostering a data-centric culture within the organization by mentoring junior team members and promoting innovative approaches.

Skills and Qualifications:

Technical Skills:
- Proficiency in Python, R, or other data science programming languages.
- Strong knowledge of machine learning libraries and frameworks (e.g., Scikit-learn, TensorFlow, PyTorch).
- Advanced SQL skills for querying and managing relational databases.
- Experience with big data technologies (e.g., Spark, Hadoop) and cloud platforms (AWS, Azure, GCP), preferably MS Azure.
- Familiarity with data visualization tools such as Power BI, Tableau, or equivalent, preferably MS Power BI.

Analytical and Problem-Solving Skills:
- Expertise in statistical modeling, hypothesis testing, and experiment design.
- Strong problem-solving skills to address business challenges through data-driven solutions.
- Ability to conceptualize and implement metrics/KPIs tailored to business needs.

Soft Skills:
- Excellent communication skills to translate complex technical concepts into business insights.
- Collaborative mindset with the ability to work in cross-functional teams.
- Proactive and detail-oriented approach to project management and execution.

Perks & Benefits:
- Best as per market standard.
- Work from home opportunity.
- 5-day working week.
- Shift timing: 2 PM - 11 PM IST (flexible hours).
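As a small illustration of the predictive-modeling work described above, here is a minimal scikit-learn sketch that trains and evaluates a classifier on a toy dataset. The dataset and model choice are stand-ins for illustration only, not anything specified in this posting.

```python
# Minimal sketch (toy data): train and evaluate a simple predictive model with
# scikit-learn, one of the libraries this role lists.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Evaluate with ROC AUC on the held-out split.
probs = model.predict_proba(X_test)[:, 1]
print("ROC AUC:", round(roc_auc_score(y_test, probs), 3))
```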
Posted Date not available
8.0 - 13.0 years
20 - 30 Lacs
Chennai, Bengaluru, Mumbai (All Areas)
Hybrid
Job Description:
- Experience in designing, building, and maintaining AWS cloud infrastructure.
- Proficient in AWS services including EC2, S3, RDS, VPC, Redshift, ECS, Route 53, Load Balancers, Auto Scaling, Cost Explorer, and others.
- Experience setting up ECS/EKS clusters, configuring deployments, and implementing performance-based scaling.
- Familiar with Docker images, Docker Hub, AMIs, etc.
- Skilled in automating infrastructure setup using Terraform and CloudFormation.
- Experience in setting up new environments based on specific landing page requirements and industry best practices.
- Hands-on experience with AWS IAM, including configuring roles, policies, and user groups.
- Experience in setting up RDS databases (MySQL, Redshift, PostgreSQL, MariaDB) and managing intranet user access. Skilled in troubleshooting connectivity issues.
- Specialized knowledge in network configurations using services such as VPC, Subnets, Security Groups, NAT Gateways, Internet Gateways, Route 53, WAF, etc.
- Experience in DevOps and CI/CD processes. Proficient with Bitbucket/GitHub for pipeline creation and familiar with ticketing systems like ServiceNow and Jira.
- Strong background in cost analysis and resource optimization. Knowledgeable in using Trusted Advisor, Cost Explorer, and tagging for granular cost insights (see the sketch below).
- Experience with AMI upgrades, patch application, and maintenance window management.
- Skilled in monitoring AWS environments using tools such as New Relic, Splunk, Prometheus, Grafana, and others.
- Solid understanding of cloud architecture best practices and security guidelines.
- Experience in documenting SOPs, creating knowledge bases, and maintaining technical documentation.
- Demonstrated ability to work independently with clear communication skills.
- Collaborate effectively as a proactive and supportive team player.
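To illustrate the tag-based cost analysis this posting mentions, below is a minimal boto3 sketch that pulls monthly unblended cost grouped by a cost-allocation tag through the Cost Explorer API. The tag key and date range are hypothetical placeholders; credentials and region are assumed to come from the environment.

```python
# Minimal sketch (assumed tag key and dates): monthly AWS cost grouped by a
# cost-allocation tag, via the Cost Explorer API with boto3.
import boto3

ce = boto3.client("ce")  # credentials/region come from the environment

response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2024-01-01", "End": "2024-04-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "TAG", "Key": "Project"}],
)

for period in response["ResultsByTime"]:
    print(period["TimePeriod"]["Start"])
    for group in period["Groups"]:
        tag_value = group["Keys"][0]  # e.g. "Project$analytics"
        amount = group["Metrics"]["UnblendedCost"]["Amount"]
        print(" ", tag_value, round(float(amount), 2))
```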
Posted Date not available