
2536 Data Engineering Jobs - Page 22


3.0 - 8.0 years

5 - 9 Lacs

Coimbatore

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: PySpark
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: Graduate

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: PySpark

Key Responsibilities:
- Work on client projects to deliver AWS, PySpark, and Databricks-based data engineering and analytics solutions
- Build and operate very large data warehouses and data lakes
- Design, code, and tune big data processes using Apache Spark, including ETL optimization
- Build data pipeline applications to stream and process datasets at low latency
- Handle data efficiently: track data lineage, ensure data quality, and improve discoverability of data

Technical Experience:
- Minimum 1 year of experience in Databricks engineering solutions on AWS Cloud platforms using PySpark
- Minimum 3 years of experience in ETL, Big Data/Hadoop, and data warehouse architecture delivery
- Minimum 2 years of experience in one or more programming languages: Python, Java, Scala
- Experience using Airflow for data pipelines in at least 1 project
- 1 year of experience developing CI/CD pipelines using Git, Jenkins, Docker, Kubernetes, shell scripting, and Terraform

Professional Attributes:
1. Ready to work in B shift (12 PM - 10 PM)
2. Client-facing skills: solid experience working in client-facing environments and building trusted relationships with client stakeholders
3. Good critical thinking and problem-solving abilities
4. Healthcare knowledge and good communication skills

Educational Qualification: Graduate
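The posting's emphasis on "ensuring data quality" and quarantining bad records can be illustrated with a minimal pure-Python sketch; in practice this logic would live inside a PySpark or Databricks job, and the rule names, columns, and thresholds here are hypothetical.

```python
from datetime import datetime

def check_quality(records):
    """Run simple data-quality rules over a batch of row dicts.

    Returns (clean_rows, rejected_rows) so bad rows can be quarantined
    rather than silently dropped -- a common pattern in ETL pipelines.
    """
    clean, rejected = [], []
    for row in records:
        problems = []
        if not row.get("customer_id"):
            problems.append("missing customer_id")
        if row.get("amount") is not None and row["amount"] < 0:
            problems.append("negative amount")
        try:
            datetime.strptime(row.get("event_date", ""), "%Y-%m-%d")
        except ValueError:
            problems.append("bad event_date")
        if problems:
            rejected.append({**row, "_errors": problems})
        else:
            clean.append(row)
    return clean, rejected

batch = [
    {"customer_id": "c1", "amount": 120.0, "event_date": "2024-05-01"},
    {"customer_id": "",   "amount": 50.0,  "event_date": "2024-05-01"},
    {"customer_id": "c2", "amount": -5.0,  "event_date": "05/01/2024"},
]
clean, rejected = check_quality(batch)
```

Keeping the rejects (with reasons attached) rather than discarding them is what makes lineage and discoverability tractable later.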

Posted 1 week ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Mangaluru

Work from Office

Job Overview
Branch launched in India in 2019 and has seen rapid adoption and growth. As a company, we are passionate about our customers, fearless in the face of barriers, and driven by data. We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
- Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
- Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
- Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
- Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit, and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
- Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms that optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
- Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
- Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
- Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related (STEM) field.
- Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
- Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
- Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
- Strong strategic thinking and problem-solving skills.
- Enthusiasm for working across cultures, functions, and time zones.
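The "performance metrics" reporting described above boils down to SQL aggregates over warehouse tables. As a minimal, runnable sketch, an in-memory SQLite database stands in for the warehouse; the table, columns, and numbers are hypothetical.

```python
import sqlite3

# In-memory SQLite stands in for the real warehouse; schema is illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE loans (region TEXT, amount REAL, repaid INTEGER);
    INSERT INTO loans VALUES
        ('South', 1000, 1), ('South', 2000, 0),
        ('North', 1500, 1), ('North', 500, 1);
""")

# A typical dashboard metric: disbursed volume and repayment rate by region.
rows = conn.execute("""
    SELECT region,
           SUM(amount)         AS disbursed,
           AVG(repaid) * 100.0 AS repayment_rate_pct
    FROM loans
    GROUP BY region
    ORDER BY region
""").fetchall()
```

The same query shape ports directly to Redshift, BigQuery, or whatever warehouse backs the Metabase/Tableau dashboards the role owns.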

Posted 1 week ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Madurai

Work from Office

Job Overview
Branch launched in India in 2019 and has seen rapid adoption and growth. As a company, we are passionate about our customers, fearless in the face of barriers, and driven by data. We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
- Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
- Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
- Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
- Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit, and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
- Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms that optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
- Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
- Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
- Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related (STEM) field.
- Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
- Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
- Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
- Strong strategic thinking and problem-solving skills.
- Enthusiasm for working across cultures, functions, and time zones.

Posted 1 week ago

Apply

7.0 - 11.0 years

9 - 13 Lacs

Mumbai

Work from Office

Skill required: Data Management - PySpark
Designation: Data Engineering, Management & Governance Specialist
Qualifications: BE/BTech
Years of Experience: 7 to 11 years

About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com.

What would you do?
Data & AI: Understand the PySpark interface and how it handles the complexities of multiprocessing, such as distributing the data, distributing code, and collecting output from the workers on a cluster of machines.

What are we looking for?
- Data Engineering
- Python (programming language)
- Structured Query Language (SQL)
- Adaptable and flexible
- Ability to work well in a team
- Agility for quick learning
- Strong analytical skills
- Prioritization of workload

Roles and Responsibilities:
In this role you are required to analyze and solve moderately complex problems. You may create new solutions, leveraging and, where needed, adapting existing methods and procedures. The role requires understanding the strategic direction set by senior management as it relates to team goals. Primary upward interaction is with your direct supervisor; you may also interact with peers and/or management levels at a client and/or within Accenture. Guidance is provided when determining methods and procedures on new assignments. Your decisions will often impact the team in which you reside. You may manage small teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.

Qualification: BE/BTech
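The "distribute data, distribute code, collect output" model that PySpark abstracts can be sketched in plain Python: partition the dataset, map each partition independently (as a worker would), then merge the partial results back on the driver. This is an illustrative stand-in, not Spark itself.

```python
from functools import reduce

def parallelize(data, num_partitions):
    """Split data into roughly equal partitions, as Spark does when
    distributing a dataset across workers."""
    return [data[i::num_partitions] for i in range(num_partitions)]

def run_job(data, map_fn, reduce_fn, num_partitions=4):
    """Sketch of the driver/worker split: each partition is mapped and
    partially reduced independently (on a worker), then the partial
    results are combined on the driver."""
    partitions = parallelize(data, num_partitions)
    partials = [reduce(reduce_fn, map(map_fn, part))
                for part in partitions if part]
    return reduce(reduce_fn, partials)

# Sum 1..100 the "cluster" way: four partial sums, then one final merge.
total = run_job(range(1, 101), map_fn=lambda x: x,
                reduce_fn=lambda a, b: a + b)
```

In real PySpark the equivalent is `sc.parallelize(range(1, 101)).reduce(lambda a, b: a + b)`; the value of the sketch is seeing that correctness requires the reduce function to be associative, since partition order is not guaranteed.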

Posted 1 week ago

Apply

2.0 - 7.0 years

17 - 22 Lacs

Gurugram

Work from Office

Entity: Accenture Strategy & Consulting
Team: Global Network Data & AI
Practice: Life Sciences
Title: Ind & Func AI Decision Science Consultant
Job Location: Delhi, Gurgaon, Mumbai, Bangalore

About Strategy & Consulting Global Network: Accenture S&C Global Network - Data & AI practice helps our clients grow their business in entirely new ways. Analytics enables our clients to achieve high performance through insights from data - insights that inform better decisions and strengthen customer relationships. From strategy to execution, Accenture works with organizations to develop analytic capabilities - from accessing and reporting on data to predictive modelling - to outperform the competition.

What's in it for you?
- An opportunity to work on high-visibility projects with top pharma clients around the globe.
- Potential to co-create with leaders in strategy, industry experts, enterprise function practitioners, and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies.
- Ability to embed responsible business into everything, from how you service your clients to how you operate as a responsible professional.
- Personalized training modules to develop your strategy and consulting acumen and grow your skills, industry knowledge, and capabilities.
- Opportunity to thrive in a culture that is committed to accelerating equality for all.
- Engage in boundaryless collaboration across the entire organization.

What you would do in this role:
- Support delivery of small to medium-sized teams on consulting projects for global clients. Responsibilities may include strategy, implementation, process design, and change management for specific modules.
- Work with the team or as an individual contributor on assigned projects, which call on a variety of skills from data engineering to data science.
- Develop assets and methodologies, points of view, research, or white papers for use by the team and the larger community.
- Work on a variety of projects in data modeling, data engineering, data visualization, data science, etc. Acquire new skills that have utility across industry groups.
- Support strategies and operating models focused on specific business units and assess likely competitive responses. Also assess implementation readiness and points of greatest impact.
- Make presentations wherever required to a known audience or client on functional aspects of the domain.

Who are we looking for?
- Bachelor's or Master's degree in Statistics, Data Science, Applied Mathematics, Business Analytics, Computer Science, Information Systems, or another quantitative field.
- Proven experience (2+ years) working on Life Sciences/Pharma/Healthcare projects and delivering successful outcomes.
- Understanding of pharma data sets: commercial, clinical, RWE (Real World Evidence), and EMR (Electronic Medical Records). Hands-on experience across one or more areas such as real-world evidence data, R&D clinical data, and digital marketing data.
- Hands-on experience in building and deploying statistical models/machine learning, including segmentation and predictive modeling, hypothesis testing, multivariate statistical analysis, time series techniques, and optimization.
- Hands-on experience handling datasets such as Komodo, RAVE, IQVIA, Truven, Optum, etc.
- Proficiency in programming languages such as R, Python, SQL, Spark, etc.
- Ability to work with large data sets and present findings/insights to key stakeholders; data management using databases like SQL.
- Experience with any of the cloud platforms (AWS, Azure, or Google Cloud) for deploying and scaling language models.
- Experience with data visualization tools like Tableau, Power BI, QlikView, or Spotfire is good to have.
- Excellent analytical and problem-solving skills, with a data-driven mindset.
- Proficient in Excel, MS Word, PowerPoint, etc.
- Ability to solve complex business problems and deliver client delight.
- Strong writing skills to build points of view on current industry trends.
- Good communication, interpersonal, and presentation skills.

Accenture is an equal opportunities employer and welcomes applications from all sections of society; it does not discriminate on grounds of race, religion or belief, ethnic or national origin, disability, age, citizenship, marital, domestic or civil partnership status, sexual orientation, gender identity, or any other basis protected by applicable law.
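The hypothesis-testing skill the posting asks for can be made concrete with a small stdlib-only sketch: Welch's t-statistic comparing two group means (say, a promoted physician segment versus a control segment). The data and segment names are hypothetical, and a real analysis would use scipy.stats or statsmodels to get the p-value.

```python
import statistics

def two_sample_t(sample_a, sample_b):
    """Welch's t-statistic for comparing two group means.

    Uses sample variances (n-1 denominator) and unpooled standard error,
    so it does not assume equal variances. Returns only the statistic;
    look up the p-value against a t-distribution in practice.
    """
    mean_a, mean_b = statistics.mean(sample_a), statistics.mean(sample_b)
    var_a, var_b = statistics.variance(sample_a), statistics.variance(sample_b)
    se = (var_a / len(sample_a) + var_b / len(sample_b)) ** 0.5
    return (mean_a - mean_b) / se

# Hypothetical weekly prescription counts per physician.
control = [10, 12, 11, 13, 12]
promoted = [14, 15, 13, 16, 15]
t_stat = two_sample_t(promoted, control)
```

A statistic this far from zero (roughly 4 on 8 degrees of freedom) would ordinarily be taken as evidence the segments differ, pending the usual checks on sample design.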

Posted 1 week ago

Apply

1.0 - 4.0 years

2 - 5 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

Experience: 5+ years
Expected Notice Period: 15 Days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote (New Delhi, Bengaluru, Mumbai)
Must-have skills: ActiveCampaign, AI, GPT, Juniper Square, CRM, Google Workspace, Notion, Yardi, Zapier

- Take ownership of our systems architecture and play a foundational role in operational scale
- Build the tools and automations that power a modern, data-driven investment platform
- Work closely with the executive team and gain visibility across business units
- Enjoy autonomy, flexibility, and a high-trust, results-focused team culture
- Competitive compensation based on experience and strategic impact

We are seeking a systems-driven professional to join us as Head of Systems & Workflow Automation. This is a strategic and implementation-focused role responsible for owning our internal technology stack, from process discovery and design to full deployment, integration, and automation. You will lead the effort to understand our real estate, marketing, and investor operations workflows; identify points of friction or inefficiency; and implement technology solutions that simplify execution and ensure data flows cleanly across tools. A key part of your role will be building automated data connections across systems and maintaining a centralized Notion-based company dashboard to ensure real-time visibility and team-wide coordination.

Core Mission
Own the implementation and performance of Apta's technology infrastructure by:
- Designing and deploying efficient, simplified workflows between departments and platforms
- Automating data flow between systems (e.g., CRM, investor portals, Google Workspace, Yardi, Agora) and into centralized dashboards in Notion
- Translating business processes into scalable, tech-enabled solutions that support day-to-day execution and decision-making

Key Responsibilities
- Tech Stack Ownership and Implementation: Lead implementation, integration, and ongoing management of core business platforms, including Notion, Slack, Google Workspace, Juniper Square, Yardi Breeze Premier, Agora, and our CRM. Serve as the point person for all internal platform configuration and system enhancements.
- Process Mapping and Workflow Design: Work with each team function (marketing, investor relations, acquisitions, asset management) to map operational workflows and identify opportunities to streamline processes. Design and implement simplified, standardized workflows across platforms that reduce friction and improve handoffs.
- Cross-System Integration and Automation: Build and maintain automations using Zapier or equivalent tools to eliminate manual entry, increase accuracy, and connect siloed tools. Automate structured data transfer from external platforms into a Notion-based dashboard used across the company.
- Documentation, Training, and Adoption: Document systems architecture, SOPs, and platform usage guidelines for each major process. Deliver live training and onboarding for internal users and serve as a support resource for troubleshooting system issues.
- Reporting, Governance, and Optimization: Ensure system accuracy, data governance, and real-time reporting integrity across all platforms. Regularly assess platform usage, functionality gaps, and data flow, and implement ongoing improvements.
- AI and Innovation Enablement: Explore and implement intelligent tools (e.g., AI assistants, GPTs, internal automations) that accelerate business operations.

What We're Looking For
Required Skills and Experience:
- 5+ years in systems enablement, technical operations, or RevOps/MarketingOps roles
- Experience managing business platforms and integrating cross-functional workflows
- Proven ability to automate data movement between systems and into shared dashboards (especially using Zapier or similar tools)
- Deep familiarity with CRM tools (HubSpot, ActiveCampaign, or equivalent), platform APIs, and structured data
- Exceptional systems thinking and the ability to map, simplify, and scale operational processes
- Strong documentation and communication skills; comfortable leading internal trainings and writing SOPs
- Self-motivated and highly organized, capable of managing multiple initiatives in parallel

Preferred Qualifications:
- Experience with Notion as a central operations dashboard or team knowledge hub
- Exposure to real estate tech platforms such as Yardi Breeze Premier, Juniper Square, Agora
- Background working with high-performance teams in fast-paced or entrepreneurial environments
- Familiarity with AI- or GPT-based automations as applied to business process enablement
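The heart of a Zapier-style integration is the transform step between the trigger (a CRM record changes) and the action (a dashboard row is written). A minimal sketch of that step follows; every field name here is hypothetical, and a real zap would send the result to the Notion API rather than return it.

```python
def crm_to_dashboard_row(record):
    """Normalize a raw CRM record into the flat row shape a dashboard
    sync would consume: trimmed names, lowercased email, numeric
    commitment, canonical stage. (All field names hypothetical.)"""
    full_name = " ".join(
        p for p in (record.get("first_name", "").strip(),
                    record.get("last_name", "").strip()) if p)
    return {
        "investor": full_name or "Unknown",
        "email": record.get("email", "").lower(),
        "commitment_usd": round(float(record.get("commitment") or 0), 2),
        "stage": record.get("pipeline_stage", "new").lower(),
    }

row = crm_to_dashboard_row({
    "first_name": " Ada ", "last_name": "Lovelace",
    "email": "Ada@Example.com", "commitment": "250000",
    "pipeline_stage": "Committed",
})
```

Keeping this normalization in one pure function is what makes "data flows cleanly across tools" testable: every connected system sees the same canonical row.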

Posted 1 week ago

Apply

4.0 - 6.0 years

3 - 6 Lacs

Kolkata, Hyderabad, Telkapalle

Work from Office

Work Location: Hyderabad, Telangana & Kolkata, West Bengal
Experience: 4 to 6 years
Job Type: Full-Time

Job Summary: We are looking for experienced AWS/Python Data Engineers with a strong background in cloud-based data processing and development using Amazon Web Services and Python. The ideal candidate should have hands-on experience building scalable data pipelines and cloud-native solutions in a fast-paced, agile environment.

Key Responsibilities:
- Design, develop, and maintain data engineering solutions using AWS services and Python.
- Build robust, scalable, and high-performance data pipelines.
- Integrate and process data from multiple sources for analytics and business intelligence.
- Collaborate with cross-functional teams including data scientists, analysts, and DevOps engineers.
- Optimize existing pipelines for performance and cost-efficiency.
- Ensure data quality, security, and governance across the data lifecycle.

Essential Skills:
- 4-6 years of experience in data engineering roles.
- Strong proficiency in Python programming for data processing.
- Hands-on experience with AWS cloud services such as S3, Lambda, Glue, Redshift, EMR, etc.
- Good understanding of cloud architecture and security best practices.
- Experience with CI/CD, version control (Git), and infrastructure as code (CloudFormation/Terraform) is a plus.

Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- AWS certification (e.g., AWS Certified Data Analytics - Specialty or Solutions Architect) is a plus.
- Familiarity with data lake architectures and ETL/ELT patterns.
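A common entry point for the S3 + Lambda pipelines this role describes is a handler that parses an S3 put-event and hands the object references to downstream processing. The sketch below only parses the event (bucket and key names are hypothetical); a real handler would go on to trigger a Glue job or transform.

```python
import json
import urllib.parse

def lambda_handler(event, context):
    """Minimal AWS Lambda handler sketch: pull (bucket, key) pairs out
    of an S3 event notification for downstream processing."""
    objects = []
    for rec in event.get("Records", []):
        s3 = rec["s3"]
        objects.append({
            "bucket": s3["bucket"]["name"],
            # S3 keys arrive URL-encoded in the event payload.
            "key": urllib.parse.unquote_plus(s3["object"]["key"]),
        })
    return {"statusCode": 200, "body": json.dumps(objects)}

# Simulated event, shaped like an S3 notification record.
fake_event = {"Records": [
    {"s3": {"bucket": {"name": "raw-zone"},
            "object": {"key": "landing/2024/05/01/orders+v2.csv"}}}
]}
result = lambda_handler(fake_event, context=None)
```

Decoding the key with `unquote_plus` matters in practice: object keys containing spaces or special characters arrive URL-encoded and will 404 if used verbatim.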

Posted 1 week ago

Apply

8.0 - 10.0 years

6 - 10 Lacs

Chennai

Work from Office

We are seeking a skilled Data Engineer with expertise in MuleSoft to join our dynamic team. In this role, you will be responsible for designing, developing, and maintaining robust data integration solutions that leverage MuleSoft's powerful capabilities. You will collaborate closely with cross-functional teams to gather requirements and translate them into scalable data architectures. Our ideal candidate is not only proficient in data engineering but also has a strong understanding of API-led connectivity and microservices architecture. You will work on various projects that involve data extraction, transformation, and loading (ETL) processes, as well as ensuring the integrity and accessibility of data across different systems. Your analytical mindset and problem-solving skills will be crucial in optimizing data flows and enhancing performance. Additionally, you will be involved in the automation of data processes, implementing best practices for data management, and ensuring compliance with data governance policies. By joining our team, you will have the opportunity to work with a variety of technologies, contribute to innovative projects, and grow your skills in a collaborative environment.

Responsibilities:
- Design and implement ETL processes using MuleSoft to integrate data from various sources.
- Collaborate with stakeholders to gather and understand data integration requirements.
- Develop and maintain data pipelines and workflows to ensure efficient data transfer and processing.
- Optimize data models for performance and scalability across different applications and environments.
- Monitor and troubleshoot data integration processes, addressing any issues that arise.
- Ensure data quality and integrity by implementing validation and cleansing procedures.
- Document data flows, processes, and integration designs to maintain comprehensive records.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Data Engineer with a strong focus on MuleSoft technologies.
- Hands-on experience with API development and integration using the MuleSoft Anypoint Platform.
- Strong understanding of data modeling concepts and database management systems.
- Proficiency in programming languages such as Java, Python, or SQL.
- Experience with cloud services such as AWS, Azure, or Google Cloud Platform.
- Excellent problem-solving skills and attention to detail, with the ability to work collaboratively.
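The "validation and cleansing procedures" responsibility above typically means a pass that normalizes values and de-duplicates on a business key before load. Here is a minimal pure-Python sketch of such a pass (in a MuleSoft flow the equivalent would be a DataWeave transform); the column names are hypothetical.

```python
def cleanse(rows, key="id"):
    """Cleansing pass for an ETL load: trim strings, normalize empty
    strings to None, and de-duplicate on a key, keeping the last
    occurrence so re-delivered records overwrite stale ones."""
    seen = {}
    for row in rows:
        fixed = {}
        for col, val in row.items():
            if isinstance(val, str):
                val = val.strip() or None   # "" and "  " become None
            fixed[col] = val
        if fixed.get(key) is not None:
            seen[fixed[key]] = fixed        # later rows win
    return list(seen.values())

raw = [
    {"id": "1", "name": "  Asha "},
    {"id": "2", "name": ""},
    {"id": "1", "name": "Asha K"},   # re-delivered, updated record
]
clean = cleanse(raw)
```

Making the pass last-write-wins on the key is what keeps the load idempotent when an upstream source replays messages.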

Posted 1 week ago

Apply

2.0 - 7.0 years

40 - 45 Lacs

Chandigarh

Work from Office

As the Data Engineer, you will play a pivotal role in shaping our data infrastructure and executing against our strategy. You will ideate alongside engineering, data, and our clients to deploy data products with an innovative and meaningful impact for clients. You will design, build, and maintain scalable data pipelines and workflows on AWS. Additionally, your expertise in AI and machine learning will enhance our ability to deliver smarter, more predictive solutions.

Key Responsibilities
- Collaborate with other engineers and customers to brainstorm and develop impactful data products tailored to our clients.
- Leverage AI and machine learning techniques to integrate intelligent features into our offerings.
- Develop and optimize end-to-end data pipelines on AWS.
- Follow best practices in software architecture and development.
- Implement effective cost management and performance optimization strategies.
- Develop and maintain systems using Python, SQL, PySpark, and Django for front-end development.
- Work directly with clients and end-users and address their data needs.
- Utilize databases and tools including, but not limited to, Postgres, Redshift, Airflow, and MongoDB to support our data ecosystem.
- Leverage AI frameworks and libraries to integrate advanced analytics into our solutions.

Qualifications
Experience:
- Minimum of 3 years of experience in data engineering, software development, or related roles.
- Proven track record in designing and deploying AWS cloud infrastructure solutions.
- At least 2 years in data analysis and mining techniques to aid descriptive and diagnostic insights.
- Extensive hands-on experience with Postgres, Redshift, Airflow, MongoDB, and real-time data workflows.
Technical Skills:
- Expertise in Python, SQL, and PySpark.
- Strong background in software architecture and scalable development practices.
- Experience with Tableau, Metabase, or similar visualization tools.
- Working knowledge of AI frameworks and libraries is a plus.
Leadership & Communication: Demonstrates ownership and accountability for delivery, with a strong commitment to quality. Excellent communication skills with a history of effective client and end-user engagement.
Startup & Fintech Mindset: Adaptability and agility to thrive in a fast-paced, early-stage startup environment. Passion for fintech innovation and a strong desire to make a meaningful impact on the future of finance.

Posted 1 week ago

Apply

4.0 - 9.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Role: Python/Data Engineer
Level Expected: 4-9 years

Must haves:
1. Good analytical and problem-solving skills.
2. Good hands-on experience developing Python programs with Python 3.10.
3. Familiarity with Python frameworks such as Django and Flask.
4. Good knowledge of database technologies such as RDBMS, MongoDB, and Hibernate.
5. Good knowledge of REST APIs, both creation and consumption.
6. Basic knowledge of AWS services: EC2, S3, Lambda functions, ALB.
7. Familiarity with Git, JIRA, and other dev tools.

Good to have:
1. Hands-on experience with AWS services, mainly Lambda function creation.
2. Basic knowledge of Databricks.

Posted 1 week ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

Expected Notice Period: 15 Days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
(Note: This is a requirement for one of Uplers' clients - a fast-growing, VC-backed B2B SaaS platform revolutionizing financial planning and analysis for modern finance teams.)

What do you need for this opportunity?
Must-have skills: async workflows, MLOps, Ray Tune, Data Engineering, MLflow, supervised learning, time-series forecasting, Docker, machine learning, NLP, Python, SQL

We are a fast-moving startup building AI-driven solutions for the financial planning workflow. We're looking for a versatile Machine Learning Engineer to join our team and take ownership of building, deploying, and scaling intelligent systems that power our core product.

Job Description
Full-time | Team: Data & ML Engineering
We're looking for 5+ years of experience as a Machine Learning or Data Engineer (startup experience is a plus).

What you will do:
- Build and optimize machine learning models, from regression to time-series forecasting
- Work with data pipelines and orchestrate training/inference jobs using Ray, Airflow, and Docker
- Train, tune, and evaluate models using tools like Ray Tune, MLflow, and scikit-learn
- Design and deploy LLM-powered features and workflows
- Collaborate closely with product managers to turn ideas into experiments and production-ready solutions
- Partner with software and DevOps engineers to build robust ML pipelines and integrate them with the broader platform

Basic skills:
- Proven ability to work creatively and analytically in a problem-solving environment
- Excellent communication (written and oral) and interpersonal skills
- Strong understanding of supervised learning and time-series modeling
- Experience deploying ML models and building automated training/inference pipelines
- Ability to work cross-functionally in a collaborative and fast-paced environment
- Comfortable wearing many hats and owning projects end-to-end
- Write clean, tested, and scalable Python and SQL code
- Leverage async workflows and cloud-native infrastructure (S3, Docker, etc.) for high-throughput data processing

Advanced skills:
- Familiarity with MLOps best practices
- Prior experience with LLM-based features or production-level NLP
- Experience with LLMs, vector stores, or prompt engineering
- Contributions to open-source ML or data tools

Tech stack:
- Languages: Python, SQL
- Frameworks & tools: scikit-learn, Prophet, pyts, MLflow, Ray, Ray Tune, Jupyter
- Infra: Docker, Airflow, S3, asyncio, Pydantic
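The time-series forecasting work described above is usually benchmarked against a naive baseline before reaching for Prophet or a tuned model. A minimal such baseline, with hypothetical numbers:

```python
def moving_average_forecast(series, window=3, horizon=2):
    """Baseline time-series forecast: each future point is the mean of
    the last `window` observations, with earlier forecasts fed back in
    for multi-step horizons. Real systems would compare Prophet or a
    Ray Tune-optimized model against exactly this kind of baseline."""
    history = list(series)
    forecasts = []
    for _ in range(horizon):
        pred = sum(history[-window:]) / window
        forecasts.append(pred)
        history.append(pred)   # recursive multi-step forecasting
    return forecasts

preds = moving_average_forecast([10, 12, 14], window=3, horizon=2)
```

If a tuned model cannot beat this on a held-out window, the extra complexity is not paying for itself, which is why baselines like this belong in the evaluation pipeline alongside MLflow-tracked runs.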

Posted 1 week ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

Expected Notice Period: 15 Days
Shift: (GMT+05:30) Asia/Kolkata (IST)

What do you need for this opportunity?
Must-have skills: Data Governance, Lakehouse architecture, Medallion architecture, Azure Databricks, Azure Synapse, Data Lake Storage, Azure Data Factory

Intelebee LLC is looking for a Data Engineer: We are seeking a skilled and hands-on Cloud Data Engineer with 5-8 years of experience to drive end-to-end data engineering solutions. The ideal candidate will have a deep understanding of dimensional modeling, data warehousing (DW), Lakehouse architecture, and the Medallion architecture. This role will focus on leveraging the Azure/AWS ecosystem to build scalable, efficient, and secure data solutions. You will work closely with customers to understand requirements, create technical specifications, and deliver solutions that scale across both on-premise and cloud environments.

Key Responsibilities (end-to-end data engineering):
- Lead the design and development of data pipelines for large-scale data processing, utilizing Azure tools such as Azure Data Factory, Azure Synapse, Azure Functions, Logic Apps, Azure Databricks, and Data Lake Storage, or AWS tools such as AWS Lambda and AWS Glue.
- Develop and implement dimensional modeling techniques and data warehousing solutions for effective data analysis and reporting.
- Build and maintain Lakehouse and Medallion architecture solutions for streamlined, high-performance data processing.
- Implement and manage data lakes on Azure/AWS, ensuring that data storage and processing are both scalable and secure.
- Handle large-scale databases (both on-prem and cloud), ensuring high availability, security, and performance.
- Design and enforce data governance policies for data security, privacy, and compliance within the Azure ecosystem.
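The Medallion architecture the posting names is a layering convention: raw data lands in bronze, typed and validated data in silver, and business-ready aggregates in gold. A toy sketch of the two promotion steps follows (in practice these would be Databricks/Spark jobs over lake tables; the columns here are hypothetical).

```python
def to_silver(bronze_rows):
    """Bronze -> silver: parse types and drop malformed rows.
    Raw rows stay untouched in bronze, so bad records can be
    replayed after the parser is fixed."""
    silver = []
    for row in bronze_rows:
        try:
            silver.append({"country": row["country"].strip().upper(),
                           "revenue": float(row["revenue"])})
        except (KeyError, ValueError, AttributeError):
            continue   # malformed row: excluded from silver, kept in bronze
    return silver

def to_gold(silver_rows):
    """Silver -> gold: business-level aggregate ready for reporting."""
    gold = {}
    for row in silver_rows:
        gold[row["country"]] = gold.get(row["country"], 0.0) + row["revenue"]
    return gold

bronze = [{"country": " in ", "revenue": "100.5"},
          {"country": "in", "revenue": "oops"},   # malformed value
          {"country": "US", "revenue": "200"}]
gold = to_gold(to_silver(bronze))
```

The design point is that each layer is reproducible from the one below it, which is what makes schema fixes and backfills safe.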

Posted 1 week ago

Apply

7.0 - 8.0 years

9 - 10 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

Expected Notice Period : 15 Days Shift : (GMT+05:30) Asia/Kolkata (IST) What do you need for this opportunity? Must have skills required: Gen AI, AWS data stack, Kinesis, open table format, PySpark, stream processing, Kafka, MySQL, Python MatchMove is Looking for: Technical Lead - Data Platform - Data, you will architect, implement, and scale our end-to-end data platform built on AWS S3, Glue, Lake Formation, and DMS. You will lead a small team of engineers while working cross-functionally with stakeholders from fraud, finance, product, and engineering to enable reliable, timely, and secure data access across the business. You will champion best practices in data design, governance, and observability, while leveraging GenAI tools to improve engineering productivity and accelerate time to insight. You will contribute to Owning the design and scalability of the data lake architecture for both streaming and batch workloads, leveraging AWS-native services. Leading the development of ingestion, transformation, and storage pipelines using AWS Glue, DMS, Kinesis/Kafka, and PySpark. Structuring and evolving data into OTF formats (Apache Iceberg, Delta Lake) to support real-time and time-travel queries for downstream services. Driving data productization, enabling API-first and self-service access to curated datasets for fraud detection, reconciliation, and reporting use cases. Defining and tracking SLAs and SLOs for critical data pipelines, ensuring high availability and data accuracy in a regulated fintech environment. Collaborating with InfoSec, SRE, and Data Governance teams to enforce data security, lineage tracking, access control, and compliance (GDPR, MAS TRM). Using Generative AI tools to enhance developer productivity including auto-generating test harnesses, schema documentation, transformation scaffolds, and performance insights. Mentoring data engineers, setting technical direction, and ensuring delivery of high-quality, observable data pipelines. 
Responsibilities:
- Architect scalable, cost-optimized pipelines across real-time and batch paradigms, using tools such as AWS Glue, Step Functions, Airflow, or EMR.
- Manage ingestion from transactional sources using AWS DMS, with a focus on schema drift handling and low-latency replication.
- Design efficient partitioning, compression, and metadata strategies for Iceberg or Hudi tables stored in S3 and cataloged with Glue and Lake Formation.
- Build data marts, audit views, and analytics layers that support both machine-driven processes (e.g. fraud engines) and human-readable interfaces (e.g. dashboards).
- Ensure robust data observability with metrics, alerting, and lineage tracking via OpenLineage or Great Expectations.
- Lead quarterly reviews of data cost, performance, schema evolution, and architecture design with stakeholders and senior leadership.
- Enforce version control, CI/CD, and infrastructure-as-code practices using GitOps and tools like Terraform.

Requirements:
- At least 7 years of experience in data engineering.
- Deep hands-on experience with the AWS data stack: Glue (Jobs & Crawlers), S3, Athena, Lake Formation, DMS, and Redshift Spectrum.
- Expertise in designing data pipelines for real-time, streaming, and batch systems, including schema design, format optimization, and SLAs.
- Strong programming skills in Python (PySpark) and advanced SQL for analytical processing and transformation.
- Proven experience managing data architectures using open table formats (Iceberg, Delta Lake, Hudi) at scale.
- Understanding of stream processing with Kinesis/Kafka and orchestration via Airflow or Step Functions.
- Experience implementing data access controls, encryption policies, and compliance workflows in regulated environments.
- Ability to integrate GenAI tools into data engineering processes to drive measurable productivity and quality gains with strong engineering hygiene.
- Demonstrated ability to lead teams, drive architectural decisions, and collaborate with cross-functional stakeholders.

Brownie points:
- Experience working in a PCI DSS or other central-bank-regulated environment with audit logging and data retention requirements.
- Experience in the payments or banking domain, with use cases around reconciliation, chargeback analysis, or fraud detection.
- Familiarity with data contracts, data mesh patterns, and data-as-a-product principles.
- Experience using GenAI to automate data documentation, generate data tests, or support reconciliation use cases.
- Exposure to performance tuning and cost optimization strategies in AWS Glue, Athena, and S3.
- Experience building data platforms for ML/AI teams or integrating with model feature stores.

Engagement Model: Direct placement with client. This is a remote role.
Shift timings: 10 AM to 7 PM
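The Technical Lead listing above repeatedly calls out schema drift handling during DMS-based ingestion. As a rough illustration of the concept only (a plain-Python sketch with made-up names, not AWS DMS or Glue behaviour), additive schema evolution can look like this: previously unseen fields widen the target schema, and each record is projected onto that schema with missing fields nulled.

```python
# Hypothetical sketch of additive schema-drift handling during ingestion.
# Function names (evolve_schema, conform_record) are illustrative only.

def evolve_schema(schema: dict, record: dict) -> dict:
    """Add any previously unseen fields to the target schema (additive only)."""
    evolved = dict(schema)
    for field, value in record.items():
        if field not in evolved:
            evolved[field] = type(value).__name__  # record the inferred type
    return evolved

def conform_record(schema: dict, record: dict) -> dict:
    """Project a record onto the schema, filling missing fields with None."""
    return {field: record.get(field) for field in schema}

schema = {"txn_id": "str", "amount": "float"}
incoming = {"txn_id": "t-1", "amount": 9.5, "currency": "SGD"}  # drifted: new column

schema = evolve_schema(schema, incoming)
row = conform_record(schema, incoming)
print(schema)  # {'txn_id': 'str', 'amount': 'float', 'currency': 'str'}
print(row)     # {'txn_id': 't-1', 'amount': 9.5, 'currency': 'SGD'}
```

Real replication tools handle type widening, drops, and renames as well; the additive case shown here is simply the least destructive default.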

Posted 1 week ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Bengaluru

Work from Office

About the Team
When 5% of Indian households shop with us, it's important to build resilient systems to manage millions of orders every day. We've done this with zero downtime! Sounds impossible? Well, that's the kind of engineering muscle that has helped Meesho become the e-commerce giant it is today. We value speed over perfection and see failures as opportunities to become better. We've taken steps to inculcate a strong Founder's Mindset across our engineering teams, helping us grow and move fast. We place special emphasis on the continuous growth of each team member, and we do this with regular 1-1s and open communication. As Engineering Manager, you will be part of a team of self-starters who thrive on teamwork and constructive feedback. We know how to party as hard as we work! If we aren't building unparalleled tech solutions, you can find us debating the plot points of our favourite books and games, or even gossiping over chai. So, if a day filled with building impactful solutions with a fun team sounds appealing to you, join us.

About the Role
We are looking for a seasoned Engineering Manager well-versed in emerging technologies to join our team. As an Engineering Manager, you will ensure consistency and quality by shaping the right strategies. You will keep an eye on all engineering projects and ensure all duties are fulfilled. You will analyse other employees' tasks and carry out collaborations effectively. You will also transform newbies into experts and build reports on the progress of all projects.
What you will do
- Design tasks for other engineers, keeping Meesho's guidelines and standards in mind
- Keep a close eye on various projects and monitor their progress
- Drive excellence in quality across the organisation and in the solutioning of product problems
- Collaborate with the sales and design teams to create new products
- Manage engineers and take ownership of projects while ensuring product scalability
- Conduct regular meetings to plan and develop reports on the progress of projects

What you will need
- Bachelor's/Master's in Computer Science
- At least 8+ years of professional experience
- At least 4+ years of experience managing software development teams
- Experience building large-scale distributed systems
- Experience with scalable platforms
- Expertise in Java/Python/Go and multithreading
- Good understanding of Spark and its internals
- Deep understanding of transactional and NoSQL databases
- Deep understanding of messaging systems such as Kafka
- Good experience with cloud infrastructure, preferably AWS
- Ability to drive sprints and OKRs, with good stakeholder management experience
- Exceptional team management skills
- Experience managing a team of 4-5 junior engineers
- Good understanding of streaming and real-time pipelines
- Good understanding of data modelling concepts and data quality tools
- Good knowledge of Business Intelligence tools: Metabase, Superset, Tableau, etc.
- Good to have: knowledge of Trino, Flink, Presto, Druid, Pinot, etc.
- Good to have: experience building data pipelines

Posted 1 week ago

Apply

1.0 - 3.0 years

15 - 30 Lacs

Bengaluru

Work from Office

About the Role
Does digging deep for data and turning it into useful, impactful insights get you excited? Then you could be our next SDE II, Data - Real-Time Streaming. In this role, you'll oversee your entire team's work, ensuring that each individual is working towards achieving their personal goals and Meesho's organisational goals. Moreover, you'll keep an eye on all engineering projects and ensure the team is not straying from the right track. You'll also be tasked with directing programming activities, evaluating system performance, and designing new programs and features for smooth functioning.

What you will do
- Build a platform for ingesting and processing multiple terabytes of data daily
- Curate, build and transform raw data into scalable information
- Create prototypes and proofs of concept for iterative development
- Reduce technical debt with quality coding
- Keep a close eye on various projects and monitor their progress
- Collaborate smoothly with the sales and engineering teams
- Provide management mentorship that sets the tone for holistic growth
- Ensure everyone is on the same page and taking ownership of the project

What you will need
- Bachelor's/Master's degree in Computer Science
- At least 1 to 3 years of professional experience
- Exceptional coding skills in Java, Scala or Python
- Working knowledge of Redis, MySQL and messaging systems like Kafka
- Knowledge of RxJava, Java Spring Boot and microservices architecture
- Hands-on experience with distributed systems architectures handling high throughput
- Experience building streaming and real-time solutions using Apache Flink, Spark Streaming or Samza
- Familiarity with software engineering best practices across all stages of software development
- Expertise in data system internals
- Strong problem-solving and analytical skills
- Familiarity with Big Data systems (Spark/EMR, Hive/Impala, Delta Lake, Presto, Airflow, data lineage) is an advantage
- Familiarity with data modeling, end-to-end data pipelining, OLAP data cubes and BI tools is a plus
- Experience as a contributor/committer to the Big Data stack is a plus
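The streaming experience this role asks for (Flink, Spark Streaming, Samza) centres on windowed aggregation over event streams. A minimal plain-Python sketch of a tumbling event-time window, with illustrative names and an assumed 60-second window size, shows the core primitive those engines implement at scale, with state management and fault tolerance layered on top:

```python
# Illustrative tumbling-window count over timestamped events: the basic
# primitive behind Flink/Spark Streaming jobs. A plain-Python stand-in,
# not any engine's actual API; names and window size are assumptions.
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Group (timestamp, key) events into fixed windows and count per key."""
    counts = defaultdict(int)
    for ts, key in events:
        # Align each event to the start of its fixed-size window.
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(5, "order"), (30, "click"), (61, "order"), (119, "order"), (120, "click")]
print(tumbling_window_counts(events))
# {(0, 'order'): 1, (0, 'click'): 1, (60, 'order'): 2, (120, 'click'): 1}
```

A real engine computes the same grouping incrementally and handles late or out-of-order events via watermarks, which this batch sketch sidesteps.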

Posted 1 week ago

Apply

3.0 - 6.0 years

1 - 4 Lacs

Pune

Work from Office

Work Location: Viman Nagar, Pune
Work Type: WFO (5 days a week)
Interview Mode: Face to Face only

Role & responsibilities (ETL)
- Python proficiency: strong command of the Python programming language.
- PySpark expertise: in-depth knowledge of the PySpark API and its functionality for data processing and manipulation.
- Data structures and algorithms: solid understanding of data structures and algorithms.
- Distributed computing concepts: familiarity with distributed computing principles.
- Big data technologies: experience with big data technologies like Hadoop, and cloud platforms such as Azure or AWS.
- SQL: proficiency in SQL for data querying and analysis.
- Problem-solving: strong analytical and problem-solving skills.
- Communication: effective communication and collaboration skills.
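The ETL pattern this role describes (extract, cast and clean, filter, load) can be sketched in a few lines. This is a plain-Python stand-in with hypothetical function names and sample data; in practice the same shape would be expressed with PySpark DataFrames over distributed storage:

```python
# Minimal extract-transform-load sketch. All names (extract, transform,
# load) and the sample records are illustrative assumptions.

def extract():
    # Stand-in for reading from a source system (file, database, queue).
    return [
        {"id": 1, "amount": "120.50", "country": "IN"},
        {"id": 2, "amount": "bad", "country": "IN"},
        {"id": 3, "amount": "80.00", "country": "US"},
    ]

def transform(rows):
    # Cast amounts, drop unparseable rows, keep one market: typical cleanup.
    out = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # skip (or quarantine) malformed records
        if row["country"] == "IN":
            out.append({"id": row["id"], "amount": amount})
    return out

def load(rows):
    # Stand-in for writing to a warehouse table; returns a run summary.
    return {"rows_written": len(rows), "total": sum(r["amount"] for r in rows)}

print(load(transform(extract())))  # {'rows_written': 1, 'total': 120.5}
```

The interview-relevant point is less the code than the shape: each stage is a pure function over rows, which is exactly what makes the logic portable to a distributed engine.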

Posted 1 week ago

Apply

5.0 - 7.0 years

6 - 10 Lacs

Mumbai, Bengaluru, Delhi

Work from Office

Must-have skills: Java, Groovy, SQL, AWS, Data Engineering, Agile, Database
Good-to-have skills: Machine Learning, Python, CI/CD, Microservices, Problem Solving

Intro and job overview: As a Senior Software Engineer II, you will join a team working with next-gen technologies on geospatial solutions, in order to identify areas for future growth, new customers and new markets in the geocoding data integrity space. You will work on the distributed computing platform to migrate the existing geospatial dataset creation process, bringing more value to Precisely's customers and growing market share.

Responsibilities and duties:
- Work on the distributed computing platform to migrate the existing geospatial data processes, including SQL scripts and Groovy scripts.
- Apply strong concepts in object-oriented programming and development languages: Java, including SQL and Groovy/Gradle/Maven.
- Work closely with domain and technical experts and drive the overall modernization of the existing processes.
- Drive and maintain the AWS infrastructure and other DevOps processes.
- Participate in design and code reviews within a team environment to eliminate errors early in the development process.
- Participate in problem determination and debugging of software product issues, using technical skills and tools to isolate the cause of the problem efficiently and in a timely manner.
- Provide documentation needed to thoroughly communicate software functionality.
- Present technical features of the product to customers and stakeholders as required.
- Ensure timelines and deliverables are met.
- Participate in the Agile development process.

Requirements and qualifications:
- UG: B.Tech/B.E., or PG: M.S./M.Tech in Computer Science, Engineering or a related discipline
- At least 5-7 years of experience implementing and managing geospatial solutions
- Expert-level programming skills in Java and Python; Groovy experience is preferred
- Expert level in writing optimized SQL queries, procedures, or database objects to support data extraction and manipulation in a data environment
- Expert in script automation with Gradle and Maven
- Problem solving and troubleshooting: proven ability to analyze and solve complex data problems and troubleshoot data pipeline issues effectively
- Experience with SQL, data warehousing and data engineering concepts
- Experience with AWS-platform Big Data technologies (IAM, EC2, S3, EMR, Redshift, Lambda, Aurora, SNS, etc.)
- Strong analytical, problem-solving, data analysis and research skills
- Good knowledge of continuous build integration (Jenkins and GitLab pipelines)
- Experience with agile development and working with agile engineering teams
- Excellent interpersonal skills
- Knowledge of microservices and cloud-native frameworks
- Knowledge of Machine Learning / AI
- Knowledge of the Python programming language
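Several of the requirements above come down to writing optimized SQL for data extraction. A self-contained sqlite3 sketch (the table, columns, and index are hypothetical examples, not this employer's schema) shows the basic habits involved: index the filter column so the planner can avoid a full scan, and parameterize the query rather than interpolating values:

```python
# SQL extraction sketch using Python's built-in sqlite3 module.
# Table and index names are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE places (id INTEGER PRIMARY KEY, name TEXT, country TEXT)")
conn.executemany(
    "INSERT INTO places (name, country) VALUES (?, ?)",
    [("Pune", "IN"), ("Austin", "US"), ("Mumbai", "IN")],
)
# An index on the filter column lets lookups avoid a full table scan.
conn.execute("CREATE INDEX idx_places_country ON places (country)")

# Parameterized query: safe against injection and reusable by the planner.
rows = conn.execute(
    "SELECT name FROM places WHERE country = ? ORDER BY name", ("IN",)
).fetchall()
print(rows)  # [('Mumbai',), ('Pune',)]
```

The same two habits carry over directly to warehouse engines, where the "index" becomes partitioning or clustering keys.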

Posted 1 week ago

Apply

8.0 - 11.0 years

15 - 25 Lacs

Pune

Hybrid

We are seeking a Senior Data Engineer with 8-11 years of experience to join our Pune-based team. The ideal candidate will bring deep expertise in Python, GCP (BigQuery), PySpark, Snowflake, and GitHub, and be responsible for building and maintaining scalable, cloud-native data pipelines. You'll work closely with cross-functional teams to ensure high-quality data solutions that support analytics and business intelligence needs. This is a great opportunity for professionals passionate about working on modern data platforms and contributing to enterprise-level data initiatives. If interested, share your updated resume at "shashwat.pa@peoplefy.com" or feel free to reach out at "+918660547469".

Posted 1 week ago

Apply

4.0 - 8.0 years

7 - 17 Lacs

Chennai

Work from Office

Dear Candidate,
We have a walk-in drive happening for a Big Data Developer position this Saturday.
Skill: Big Data Developer
Primary skills: Python with PySpark, OR Python with Scala
Experience: 4-8 yrs
Location: Chennai
Notice period: Immediate to 15 days only
Mode of discussion: F2F
Date of interview: 5-Jul-25 (Saturday)
Timing: 9:30 AM
Venue: Aspire Systems office, Siruseri
If interested, kindly share your resume to saranya.raghu@aspiresys.com
Regards,
Saranya Raghu

Posted 1 week ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Erode

Work from Office

We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
- Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
- Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
- Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
- Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
- Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
- Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
- Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
- Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related field (STEM).
- Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
- Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
- Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
- Strong strategic thinking and problem-solving skills
- Enthusiasm for working across cultures, functions, and time zones

Posted 1 week ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Kozhikode

Work from Office

We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
- Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
- Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
- Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
- Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
- Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
- Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
- Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
- Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related field (STEM).
- Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
- Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
- Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
- Strong strategic thinking and problem-solving skills
- Enthusiasm for working across cultures, functions, and time zones

Posted 1 week ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Jharkhand

Work from Office

We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
- Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
- Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
- Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
- Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
- Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
- Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
- Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
- Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related field (STEM).
- Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
- Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
- Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
- Strong strategic thinking and problem-solving skills
- Enthusiasm for working across cultures, functions, and time zones

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Raipur

Work from Office

Job Overview
As a company, we are passionate about our customers, fearless in the face of barriers, and driven by data. We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
- Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
- Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
- Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
- Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
- Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
- Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
- Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
- Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related field (STEM).
- Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
- Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
- Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
- Strong strategic thinking and problem-solving skills
- Enthusiasm for working across cultures, functions, and time zones

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Kolhapur

Work from Office

Job Overview
As a company, we are passionate about our customers, fearless in the face of barriers, and driven by data. We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
- Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
- Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
- Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
- Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
- Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
- Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
- Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
- Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related field (STEM).
- Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
- Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
- Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
- Strong strategic thinking and problem-solving skills
- Enthusiasm for working across cultures, functions, and time zones

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Kochi

Work from Office

Job Overview
As a company, we are passionate about our customers, fearless in the face of barriers, and driven by data. We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
- Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
- Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
- Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
- Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
- Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
- Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
- Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
- Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related field (STEM).
- Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
- Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
- Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
- Strong strategic thinking and problem-solving skills
- Enthusiasm for working across cultures, functions, and time zones

Posted 1 week ago

Apply