UserReady is a digital product design and development agency focused on building intuitive user experiences and effective digital solutions.
Mohali, Gurugram, Bengaluru
INR 12.0 - 16.5 Lacs P.A.
Work from Office
Full Time
Job Title: Senior BI Platform Administrator (Business Objects, Tableau, Metric Insights)

Job Summary: We are seeking a seasoned BI Platform Administrator to oversee and manage our Business Intelligence (BI) infrastructure, focusing on SAP Business Objects, Tableau, and Metric Insights. The ideal candidate will ensure the stability, security, and efficiency of our BI platforms, support internal applications, and facilitate seamless data-driven decision-making across the organization.

Key Responsibilities:

Platform Administration:
- Administer and maintain SAP Business Objects, Tableau, and Metric Insights environments, including installation, configuration, and upgrades.
- Monitor system performance, troubleshoot issues, and implement best practices for optimization.
- Manage user access, permissions, and security settings across BI platforms.

System Maintenance & Upgrades:
- Plan and implement platform upgrades, patches, and migrations from development (DEV) and quality assurance (QA) to production (PROD) environments.
- Ensure compliance with server vulnerability standards and organizational IT policies.
- Coordinate disaster recovery planning and execution for BI systems.

Incident & Request Management:
- Handle incidents and service requests related to BI platforms, ensuring timely resolution.
- Automate workflows to improve efficiency and reduce manual intervention.
- Manage password changes and access provisioning for BI tools.

Collaboration & Communication:
- Work closely with third-party vendors to address bugs, issues, and system enhancements.
- Collaborate with developers to resolve report/dashboard errors, schedule creation, and bursting requirements.
- Participate in Change Advisory Board (CAB) meetings to obtain necessary approvals for changes.

Reporting & Dashboard Development:
- Create and maintain reports and dashboards for internal stakeholders using BI tools.
- Support internal applications such as NTRACs, Trauma One, and Harmony.
- Manage Epic upgrades concerning Business Objects integration.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5–8 years of experience in BI platform administration, specifically with SAP Business Objects, Tableau, and Metric Insights.
- Strong understanding of server maintenance, security standards, and disaster recovery protocols.
- Proficiency in automating workflows and managing content migrations across environments.
- Experience collaborating with vendors and cross-functional teams.
- Familiarity with internal applications such as NTRACs, Trauma One, Harmony, and Epic systems.
- Excellent problem-solving skills and attention to detail.

Preferred Certifications:
- SAP Certified Application Associate – SAP BusinessObjects Business Intelligence Platform
- Tableau Desktop Certified Professional
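The access-provisioning and workflow-automation duties above are typically scripted. A minimal, hypothetical sketch of the idea, mapping directory groups to BI permission sets before pushing them to a platform's admin API (the group and role names are invented for illustration, not taken from this posting):

```python
# Hypothetical mapping of directory groups to BI platform permission sets.
# In practice these would be pushed to Business Objects or Tableau via
# their administrative APIs; here we only model the resolution logic.
GROUP_ROLE_MAP = {
    "bi-admins": {"administer", "publish", "view"},
    "report-authors": {"publish", "view"},
    "report-viewers": {"view"},
}

def resolve_permissions(groups):
    """Union the permission sets granted by each directory group."""
    permissions = set()
    for group in groups:
        # Unknown groups grant nothing rather than raising.
        permissions |= GROUP_ROLE_MAP.get(group, set())
    return permissions

print(sorted(resolve_permissions(["report-authors", "report-viewers"])))
# ['publish', 'view']
```

Resolving permissions in one place like this keeps provisioning auditable: a single function decides what any combination of groups grants.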
Mohali
INR 10.0 - 15.0 Lacs P.A.
Work from Office
Full Time
Job Title: Senior BI Platform Administrator (Business Objects, Tableau, Metric Insights)

Job Summary: We are seeking a seasoned BI Platform Administrator to oversee and manage our Business Intelligence (BI) infrastructure, focusing on SAP Business Objects, Tableau, and Metric Insights. The ideal candidate will ensure the stability, security, and efficiency of our BI platforms, support internal applications, and facilitate seamless data-driven decision-making across the organization.

Key Responsibilities:

Platform Administration:
- Administer and maintain SAP Business Objects, Tableau, and Metric Insights environments, including installation, configuration, and upgrades.
- Monitor system performance, troubleshoot issues, and implement best practices for optimization.
- Manage user access, permissions, and security settings across BI platforms.

System Maintenance & Upgrades:
- Plan and implement platform upgrades, patches, and migrations from development (DEV) and quality assurance (QA) to production (PROD) environments.
- Ensure compliance with server vulnerability standards and organizational IT policies.
- Coordinate disaster recovery planning and execution for BI systems.

Incident & Request Management:
- Handle incidents and service requests related to BI platforms, ensuring timely resolution.
- Automate workflows to improve efficiency and reduce manual intervention.
- Manage password changes and access provisioning for BI tools.

Collaboration & Communication:
- Work closely with third-party vendors to address bugs, issues, and system enhancements.
- Collaborate with developers to resolve report/dashboard errors, schedule creation, and bursting requirements.
- Participate in Change Advisory Board (CAB) meetings to obtain necessary approvals for changes.

Reporting & Dashboard Development:
- Create and maintain reports and dashboards for internal stakeholders using BI tools.
- Support internal applications such as NTRACs, Trauma One, and Harmony.
- Manage Epic upgrades concerning Business Objects integration.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5–8 years of experience in BI platform administration, specifically with SAP Business Objects, Tableau, and Metric Insights.
- Strong understanding of server maintenance, security standards, and disaster recovery protocols.
- Proficiency in automating workflows and managing content migrations across environments.
- Experience collaborating with vendors and cross-functional teams.
- Familiarity with internal applications such as NTRACs, Trauma One, Harmony, and Epic systems.
- Excellent problem-solving skills and attention to detail.

Preferred Certifications:
- SAP Certified Application Associate – SAP BusinessObjects Business Intelligence Platform
- Tableau Desktop Certified Professional
Pune, Gurugram, Bengaluru
INR 8.0 - 11.0 Lacs P.A.
Work from Office
Full Time
Job Title: Data Engineer – Snowflake & Python

About the Role: We are seeking a skilled and proactive Data Developer with 3–5 years of hands-on experience in Snowflake, Python, Streamlit, and SQL, along with expertise in consuming REST APIs and working with modern ETL tools such as Matillion and Fivetran. The ideal candidate will have a strong foundation in data modeling, data warehousing, and data profiling, and will play a key role in designing and implementing robust data solutions that drive business insights and innovation.

Key Responsibilities:
- Design, develop, and maintain data pipelines and workflows using Snowflake and an ETL tool (e.g., Matillion, dbt, Fivetran, or similar).
- Develop data applications and dashboards using Python and Streamlit.
- Create and optimize complex SQL queries for data extraction, transformation, and loading.
- Integrate REST APIs for data access and process automation.
- Perform data profiling, quality checks, and troubleshooting to ensure data accuracy and integrity.
- Design and implement scalable and efficient data models aligned with business requirements.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data needs and deliver actionable solutions.
- Implement best practices in data governance, security, and compliance.

Required Skills and Qualifications:
- 3–5 years of professional experience in a data engineering or development role.
- Strong expertise in Snowflake, including performance tuning and warehouse optimization.
- Proficiency in Python, including data manipulation with libraries such as Pandas.
- Experience building web-based data tools using Streamlit.
- Solid understanding of and experience with RESTful APIs and JSON data structures.
- Strong SQL skills and experience with advanced data transformation logic.
- Experience with an ETL tool commonly used with Snowflake (e.g., dbt, Matillion, Fivetran, Airflow).
- Hands-on experience with data modeling (dimensional and normalized), data warehousing concepts, and data profiling techniques.
- Familiarity with version control (e.g., Git) and CI/CD processes is a plus.

Preferred Qualifications:
- Experience working in cloud environments (AWS, Azure, or GCP).
- Knowledge of data governance and cataloging tools.
- Experience with agile methodologies and working in cross-functional teams.
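The data-profiling and quality-check responsibility above can be sketched in a few lines. This is a minimal, dependency-free illustration (the column names and rows are invented), not a production profiler:

```python
# Per-column profiling over a list of dict rows, as an ETL step might
# hand them off: count nulls and distinct non-null values per column.
def profile(rows):
    """Return {column: (null_count, distinct_count)} for a list of dict rows."""
    stats = {}
    for row in rows:
        for col, value in row.items():
            nulls, distinct = stats.setdefault(col, (0, set()))
            if value is None:
                stats[col] = (nulls + 1, distinct)
            else:
                distinct.add(value)
    # Collapse the working sets into counts for reporting.
    return {col: (nulls, len(distinct)) for col, (nulls, distinct) in stats.items()}

rows = [
    {"id": 1, "region": "EMEA"},
    {"id": 2, "region": None},
    {"id": 3, "region": "EMEA"},
]
print(profile(rows))  # {'id': (0, 3), 'region': (1, 1)}
```

In a real pipeline the same counts would come from SQL aggregates pushed down to Snowflake rather than Python loops; the point is the shape of the check.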
Mohali, Gurugram, Bengaluru
INR 9.0 - 13.0 Lacs P.A.
Work from Office
Full Time
Job Title: Data Engineer – Snowflake & Python

About the Role: We are seeking a skilled and proactive Data Developer with 3–5 years of hands-on experience in Snowflake, Python, Streamlit, and SQL, along with expertise in consuming REST APIs and working with modern ETL tools such as Matillion and Fivetran. The ideal candidate will have a strong foundation in data modeling, data warehousing, and data profiling, and will play a key role in designing and implementing robust data solutions that drive business insights and innovation.

Key Responsibilities:
- Design, develop, and maintain data pipelines and workflows using Snowflake and an ETL tool (e.g., Matillion, dbt, Fivetran, or similar).
- Develop data applications and dashboards using Python and Streamlit.
- Create and optimize complex SQL queries for data extraction, transformation, and loading.
- Integrate REST APIs for data access and process automation.
- Perform data profiling, quality checks, and troubleshooting to ensure data accuracy and integrity.
- Design and implement scalable and efficient data models aligned with business requirements.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data needs and deliver actionable solutions.
- Implement best practices in data governance, security, and compliance.

Required Skills and Qualifications:
- Experience in HR data and databases is a must.
- 3–5 years of professional experience in a data engineering or development role.
- Strong expertise in Snowflake, including performance tuning and warehouse optimization.
- Proficiency in Python, including data manipulation with libraries such as Pandas.
- Experience building web-based data tools using Streamlit.
- Solid understanding of and experience with RESTful APIs and JSON data structures.
- Strong SQL skills and experience with advanced data transformation logic.
- Experience with an ETL tool commonly used with Snowflake (e.g., dbt, Matillion, Fivetran, Airflow).
- Hands-on experience with data modeling (dimensional and normalized), data warehousing concepts, and data profiling techniques.
- Familiarity with version control (e.g., Git) and CI/CD processes is a plus.

Preferred Qualifications:
- Experience working in cloud environments (AWS, Azure, or GCP).
- Knowledge of data governance and cataloging tools.
- Experience with agile methodologies and working in cross-functional teams.
- Experience with Azure Data Factory.
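REST API payloads are usually nested JSON, while a Snowflake load or a Streamlit table wants flat columns. A small, self-contained sketch of that flattening step (the payload shape is invented for illustration):

```python
# Flatten nested JSON records into dotted column names, the common
# pre-load transformation when landing API responses into a warehouse.
import json

def flatten(record, prefix=""):
    """Recursively flatten nested dicts into a single-level dict."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, name + "."))
        else:
            flat[name] = value
    return flat

payload = json.loads('{"id": 7, "who": {"name": "Ada", "dept": "HR"}}')
print(flatten(payload))  # {'id': 7, 'who.name': 'Ada', 'who.dept': 'HR'}
```

Snowflake can also query semi-structured JSON directly via its VARIANT type, so in practice this flattening is a design choice rather than a requirement.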
Mohali, Pune
INR 7.0 - 11.0 Lacs P.A.
Work from Office
Full Time
Job Title: Data Engineer – Snowflake & Python

About the Role: We are seeking a skilled and proactive Data Developer with 3–5 years of hands-on experience in Snowflake, Python, Streamlit, and SQL, along with expertise in consuming REST APIs and working with modern ETL tools such as Matillion and Fivetran. The ideal candidate will have a strong foundation in data modeling, data warehousing, and data profiling, and will play a key role in designing and implementing robust data solutions that drive business insights and innovation.

Key Responsibilities:
- Design, develop, and maintain data pipelines and workflows using Snowflake and an ETL tool (e.g., Matillion, dbt, Fivetran, or similar).
- Develop data applications and dashboards using Python and Streamlit.
- Create and optimize complex SQL queries for data extraction, transformation, and loading.
- Integrate REST APIs for data access and process automation.
- Perform data profiling, quality checks, and troubleshooting to ensure data accuracy and integrity.
- Design and implement scalable and efficient data models aligned with business requirements.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data needs and deliver actionable solutions.
- Implement best practices in data governance, security, and compliance.

Required Skills and Qualifications:
- Experience in HR data and databases is a must.
- 3–5 years of professional experience in a data engineering or development role.
- Strong expertise in Snowflake, including performance tuning and warehouse optimization.
- Proficiency in Python, including data manipulation with libraries such as Pandas.
- Experience building web-based data tools using Streamlit.
- Solid understanding of and experience with RESTful APIs and JSON data structures.
- Strong SQL skills and experience with advanced data transformation logic.
- Experience with an ETL tool commonly used with Snowflake (e.g., dbt, Matillion, Fivetran, Airflow).
- Hands-on experience with data modeling (dimensional and normalized), data warehousing concepts, and data profiling techniques.
- Familiarity with version control (e.g., Git) and CI/CD processes is a plus.

Preferred Qualifications:
- Experience working in cloud environments (AWS, Azure, or GCP).
- Knowledge of data governance and cataloging tools.
- Experience with agile methodologies and working in cross-functional teams.
- Experience with Azure Data Factory.
Mohali
INR 20.0 - 25.0 Lacs P.A.
Work from Office
Full Time
Title: Senior Tableau Developer with Strong SQL Skills and Critical Thinking

We are seeking a Senior Tableau Developer to join our team. The ideal candidate should have a strong background in Tableau development, strong SQL skills, and critical thinking.

Responsibilities:
- Develop and maintain Tableau dashboards, reports, and visualizations that meet business requirements
- Collaborate with business users to gather and document requirements for Tableau projects
- Analyze data from various sources and implement data models in SQL for efficient reporting
- Optimize Tableau dashboards for performance and usability
- Conduct testing and quality assurance to ensure accuracy of data and functionality of dashboards
- Stay up to date with the latest Tableau and SQL techniques and technologies

Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- At least 5–8 years of experience in Tableau development and data visualization
- Strong SQL skills, including data modeling and optimization
- Ability to analyze complex data and implement data models
- Strong critical thinking and problem-solving skills
- Experience working in an Agile development environment
- Excellent communication and collaboration skills

If you are a self-motivated, detail-oriented individual who is passionate about Tableau development and data visualization, we encourage you to apply for this exciting opportunity.
Mohali, Pune
INR 7.0 - 11.0 Lacs P.A.
Work from Office
Full Time
Job Title: Data Engineer – Snowflake & Python

About the Role: We are seeking a skilled and proactive Data Developer with 3–5 years of hands-on experience in Snowflake, Python, Streamlit, and SQL, along with expertise in consuming REST APIs and working with modern ETL tools such as Matillion and Fivetran. The ideal candidate will have a strong foundation in data modeling, data warehousing, and data profiling, and will play a key role in designing and implementing robust data solutions that drive business insights and innovation.

Key Responsibilities:
- Design, develop, and maintain data pipelines and workflows using Snowflake and an ETL tool (e.g., Matillion, dbt, Fivetran, or similar).
- Develop data applications and dashboards using Python and Streamlit.
- Create and optimize complex SQL queries for data extraction, transformation, and loading.
- Integrate REST APIs for data access and process automation.
- Perform data profiling, quality checks, and troubleshooting to ensure data accuracy and integrity.
- Design and implement scalable and efficient data models aligned with business requirements.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data needs and deliver actionable solutions.
- Implement best practices in data governance, security, and compliance.

Required Skills and Qualifications:
- 3–5 years of professional experience in a data engineering or development role.
- Strong expertise in Snowflake, including performance tuning and warehouse optimization.
- Proficiency in Python, including data manipulation with libraries such as Pandas.
- Experience building web-based data tools using Streamlit.
- Solid understanding of and experience with RESTful APIs and JSON data structures.
- Strong SQL skills and experience with advanced data transformation logic.
- Experience with an ETL tool commonly used with Snowflake (e.g., dbt, Matillion, Fivetran, Airflow).
- Hands-on experience with data modeling (dimensional and normalized), data warehousing concepts, and data profiling techniques.
- Familiarity with version control (e.g., Git) and CI/CD processes is a plus.

Preferred Qualifications:
- Experience working in cloud environments (AWS, Azure, or GCP).
- Knowledge of data governance and cataloging tools.
- Experience with agile methodologies and working in cross-functional teams.
Mohali, Pune
INR 7.0 - 11.0 Lacs P.A.
Work from Office
Full Time
Job Title: Data Engineer – Snowflake, SQL & Python

About the Role: We are seeking a skilled and proactive Data Developer with 3–5 years of hands-on experience in Snowflake, Alteryx, and SQL, along with expertise in consuming REST APIs and working with modern ETL tools such as Matillion and Fivetran. The ideal candidate will have a strong foundation in data modeling, data warehousing, and data profiling, and will play a key role in designing and implementing robust data solutions that drive business insights and innovation.

Required Skills and Qualifications:
- 3–5 years of professional experience in a data engineering or development role.
- Hands-on experience migrating SQL Server, Oracle, or other legacy databases to Snowflake.
- Strong working knowledge of Snowflake, including data warehousing concepts, performance tuning, and data transformation.
- Expertise in Alteryx Designer for building and orchestrating ETL workflows targeting Snowflake.
- Advanced SQL development skills for writing complex queries, data extraction, transformation, and loading.
- Experience in data mapping, data validation, and data quality checks during migration.
- Knowledge of Snowflake stored procedures, tasks, and user-defined functions (UDFs).
- Familiarity with cloud-based environments (AWS, Azure, or GCP), particularly for Snowflake hosting and integration.
- Experience with data profiling tools to assess source data and ensure accurate migration.
- Understanding of API integration for automated data transfers or orchestration (if applicable).
- Familiarity with version control systems (e.g., Git) and CI/CD pipelines for data workflows.

Preferred Qualifications:
- Experience working in cloud environments (AWS, Azure, or GCP).
- Knowledge of data governance and cataloging tools.
- Experience with agile methodologies and working in cross-functional teams.
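The data-validation step called out for migrations above usually means reconciling source and target by primary key. A self-contained sketch of that check (the table contents and the `id`/`amt` fields are invented; real reconciliation would run SQL counts and checksums against both databases):

```python
# Reconcile two tables, represented as lists of dict rows keyed by a
# primary key: report keys missing from the target and keys whose row
# contents differ after migration.
def reconcile(source, target, key="id"):
    """Compare source and target rows; return missing and mismatched keys."""
    src = {row[key]: row for row in source}
    tgt = {row[key]: row for row in target}
    missing = sorted(set(src) - set(tgt))
    mismatched = sorted(k for k in set(src) & set(tgt) if src[k] != tgt[k])
    return {"missing": missing, "mismatched": mismatched}

source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}, {"id": 3, "amt": 30}]
target = [{"id": 1, "amt": 10}, {"id": 2, "amt": 25}]
print(reconcile(source, target))  # {'missing': [3], 'mismatched': [2]}
```

For large tables this comparison is done in the warehouse (row counts, per-column aggregates, hash totals) rather than in memory, but the pass/fail logic is the same.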
Kolkata, Mumbai, New Delhi, Hyderabad, Pune, Chennai, Bengaluru
INR 14.0 - 16.0 Lacs P.A.
Work from Office
Full Time
We are seeking a Senior Tableau Developer to join our team. The ideal candidate should have a strong background in Tableau development, strong SQL skills, and critical thinking.

Responsibilities:
- Develop and maintain Tableau dashboards, reports, and visualizations that meet business requirements
- Collaborate with business users to gather and document requirements for Tableau projects
- Analyze data from various sources and implement data models in SQL for efficient reporting
- Optimize Tableau dashboards for performance and usability
- Conduct testing and quality assurance to ensure accuracy of data and functionality of dashboards
- Stay up to date with the latest Tableau and SQL techniques and technologies

Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- At least 5–8 years of experience in Tableau development and data visualization
- Strong SQL skills, including data modeling and optimization
- Ability to analyze complex data and implement data models
- Strong critical thinking and problem-solving skills
- Experience working in an Agile development environment
- Excellent communication and collaboration skills
Mumbai
INR Not disclosed
Work from Office
Internship
Key Responsibilities:
- Support day-to-day operations of marketing tools including HubSpot, Salesforce, and LinkedIn Campaign Manager
- Assist with campaign setup, tracking, and performance measurement
- Help maintain data hygiene across CRM and marketing automation platforms
- Monitor and report on campaign performance metrics and marketing KPIs
- Support lead management and ensure proper attribution and routing
- Help document and streamline marketing processes and workflows
- Collaborate cross-functionally with design, sales, and content teams

Qualifications:
- Pursuing or recently completed a degree in Marketing, Business, Data Analytics, or a related field
- Strong interest in marketing operations, analytics, or martech
- Basic understanding of marketing automation tools and CRM platforms (HubSpot, Salesforce preferred)
- Analytical mindset with proficiency in Excel/Google Sheets; knowledge of reporting dashboards is a plus
- Excellent communication and organizational skills
- Proactive, eager to learn, and comfortable working in a fast-paced environment

What We Offer:
- A collaborative and supportive team culture
- Real-world experience with industry-leading marketing tools
- Exposure to high-impact campaigns and data-driven strategies
- Possibility of full-time employment upon successful internship completion
Kolkata, Mumbai, New Delhi, Hyderabad, Pune, Chennai, Bengaluru
INR 12.0 - 17.0 Lacs P.A.
Work from Office
Full Time
Job Requirement:

Scope:
- Tableau Asset Coverage: all Tableau workbooks, dashboards, published data sources, and extracts across environments (Dev, QA, Prod).
- Environment Inclusion: applicable to both Tableau Server and Tableau Cloud (if present).
- Metadata Extraction: automated extraction of metadata from all Tableau artifacts.
- Usage & Ownership Analysis: identification of user access, asset activity, and ownership across the organization.
- Classification & Tagging: categorization by business function, department, usage frequency, and criticality.
- Stakeholder Mapping: associating assets with owners, publishers, and consumers for validation and decision-making.

Key Activities:
- Initiate Discovery Process: connect to Tableau environments using APIs or admin tools; validate access and authentication for metadata extraction.
- Extract and Store Metadata: pull data on workbooks, dashboards, data sources, data connections, calculations, refresh schedules, etc.; store metadata in a centralized repository (e.g., a SQL database, Excel, or a BI tool).
- Perform Asset Classification: tag assets by type, department, business domain, and sensitivity; identify certified, shared, and personal content.
- Analyze Usage Patterns: evaluate server logs or usage tracking to determine access frequency and stale assets; identify dashboards with low or no usage for potential deprecation.
- Map Ownership and Access: link assets to creators, publishers, and viewers; identify orphaned assets and assign potential new owners if needed.
- Generate Reports & Insights: create dashboards/reports on inventory summary, usage heatmaps, top users/assets, and stale content; highlight rationalization opportunities (e.g., consolidation, archival).
- Engage Stakeholders: review findings with key business users and IT owners; validate asset importance, current usage, and future disposition.
Responsibilities:
- Analyze data from various sources and implement data models in SQL for efficient reporting
- Optimize Tableau dashboards for performance and usability
- Conduct testing and quality assurance to ensure accuracy of data and functionality of dashboards
- Stay up to date with the latest Tableau and SQL techniques and technologies

Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- At least 5–8 years of experience in Tableau development and data visualization
- Strong SQL skills, including data modeling and optimization
- Ability to analyze complex data and implement data models
- Strong critical thinking and problem-solving skills
- Experience working in an Agile development environment
- Excellent communication and collaboration skills

If you are a self-motivated, detail-oriented individual who is passionate about Tableau development and data visualization, we encourage you to apply for this exciting opportunity.
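The "analyze usage patterns" activity in the listing above reduces to a date comparison once asset metadata has been extracted. A hedged sketch, assuming records with `name` and `last_accessed` fields (invented here; in practice they would come from Tableau's REST or Metadata APIs):

```python
# Flag assets whose last access predates a staleness cutoff, the core of
# a low/no-usage deprecation report.
from datetime import date, timedelta

def stale_assets(assets, today, max_age_days=90):
    """Return names of assets not accessed within `max_age_days` of `today`."""
    cutoff = today - timedelta(days=max_age_days)
    return [a["name"] for a in assets if a["last_accessed"] < cutoff]

assets = [
    {"name": "Sales Overview", "last_accessed": date(2024, 5, 1)},
    {"name": "Legacy KPI Board", "last_accessed": date(2023, 11, 2)},
]
print(stale_assets(assets, today=date(2024, 6, 1)))  # ['Legacy KPI Board']
```

The 90-day window is an arbitrary starting point; the listing leaves the staleness threshold to the stakeholder-review step.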
Mumbai
INR Not disclosed
Work from Office
Internship
Responsibilities:
- Write compelling, clear, and grammatically accurate content for blogs, social media, emails, landing pages, and case studies
- Assist in researching industry trends and creating content calendars
- Post to social media on a daily basis
- Conduct research on industry trends, emerging hashtags, and competitive activities to inform content strategies
- Support SEO content writing and keyword optimization
- Edit and proofread content to ensure brand tone, consistency, and clarity
- Coordinate with the design team to create visually engaging content
- Help repurpose existing content into short-form or visual formats (e.g., infographics, carousels)
- Track content performance metrics and suggest improvements

Preferred Skills and Qualifications:
- Excellent command of written English
- Strong attention to detail and creativity
- Basic understanding of content marketing and SEO principles
- Pursuing or recently completed a degree in Communications, English, Marketing, Journalism, or a related field
- Bonus: interest in the data, analytics, or technology sector

What We Offer:
- Mentorship from experienced marketers and content strategists
- Exposure to B2B content strategy in a fast-growing tech company
- Opportunity to work on real campaigns with visible business impact
- Monthly stipend and access to learning resources
- Possibility of a full-time role based on performance and business needs
Mohali, Bengaluru
INR 30.0 - 40.0 Lacs P.A.
Work from Office
Full Time
Job Summary: We are seeking an experienced and detail-oriented Technical Project Manager with strong interpersonal skills to lead and manage Data, Business Intelligence (BI), and Analytics initiatives across single and multiple client engagements. The ideal candidate will have a solid background in data project delivery, knowledge of modern cloud platforms, and familiarity with tools such as Snowflake, Tableau, and Power BI. An understanding of AI and machine learning projects is a strong plus. This role requires strong communication and leadership skills, with the ability to translate complex technical requirements into actionable plans and ensure successful, timely, and high-quality delivery with attention to detail.

Key Responsibilities:

Project & Program Delivery:
- Manage the full lifecycle of data engineering and analytics projects end to end, including data platform migrations, dashboard/report development, and advanced analytics initiatives.
- Define project scope, timelines, milestones, resource needs, and deliverables in alignment with stakeholder objectives.
- Manage budgets, resource allocation, and risk mitigation strategies to ensure successful program delivery.
- Use Agile, Scrum, or hybrid methodologies to ensure iterative delivery and continuous improvement.
- Monitor performance, track KPIs, and adjust plans to maintain scope, schedule, and quality.
- Drive excellence in execution and ensure client satisfaction.

Client & Stakeholder Engagement:
- Serve as the primary point of contact for clients and internal teams across all data initiatives.
- Translate business needs into actionable technical requirements and facilitate alignment across teams.
- Conduct regular status meetings, monthly and quarterly reviews, executive updates, and retrospectives.

Manage Large Teams:
- Manage 50+ resources working on different projects for different clients.
- Work with practice and talent acquisition teams on resourcing needs.

Manage P&L:
- Manage allocation, gross margin, utilization, etc. effectively.

Team Coordination:
- Lead and coordinate cross-functional teams including data engineers, BI developers, analysts, and QA testers.
- Ensure appropriate allocation of resources across concurrent projects and clients.
- Foster collaboration, accountability, and a results-oriented team culture.

Data, AI, and BI Technology Oversight:
- Manage project delivery using modern cloud data platforms.
- Oversee BI development using Tableau and/or Power BI, ensuring dashboards meet user needs and follow visualization best practices.
- Conduct UATs.
- Manage initiatives involving ETL/ELT processes, data modeling, and real-time analytics pipelines.
- Ensure compatibility with data governance, security, and privacy requirements.
- Manage AI/ML projects.

Data & Cloud Understanding:
- Oversee delivery of solutions involving cloud data platforms (e.g., Azure, AWS, GCP), data lakes, and modern data stacks.
- Support planning for data migrations, ETL processes, data modeling, and analytics pipelines.
- Be conversant in tools such as Power BI, Tableau, Snowflake, Databricks, Azure Synapse, or BigQuery.

Risk, Quality & Governance:
- Identify and mitigate risks related to data quality, project timelines, and resource availability.
- Ensure adherence to governance, compliance, and data privacy standards (e.g., GDPR, HIPAA).
- Maintain thorough project documentation including charters, RACI matrices, RAID logs, and retrospectives.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Business, or a related field.

Certifications (Preferred):
- PMP, PRINCE2, or Certified ScrumMaster (CSM)
- Cloud certifications (e.g., AWS Cloud Practitioner, Azure Fundamentals, Google Cloud Certified)
- BI/analytics certifications (e.g., Tableau Desktop Specialist, Power BI Data Analyst Associate, DA-100)

Must-Have Skills:
- Strong communication and interpersonal skills
- Ability to work collaboratively
- Excellent organizing skills
- Stakeholder, customer, and people management
- Contract management
- Risk and compliance management
- C-suite reporting
- Team management and resourcing
- Experience using tools like JIRA, MS Plan, etc.

Desirable Skills:
- 15 years of IT experience, with 8+ years of proven project management experience delivering data, AI/ML, and BI/analytics-focused projects.
- Experience delivering projects on cloud platforms (e.g., Azure, AWS, GCP) and data platforms such as Snowflake.
- Proficiency in managing BI projects, preferably with Tableau and/or Power BI.
- Knowledge of or hands-on experience with legacy tools is a plus.
- Solid understanding of the data lifecycle including ingestion, transformation, visualization, and reporting.
- Comfortable using PM tools such as Jira, Azure DevOps, Monday.com, or Smartsheet.
- Experience managing projects involving data governance, metadata management, or master data management (MDM).
Mohali, Gurugram, Bengaluru
INR 10.0 - 15.0 Lacs P.A.
Work from Office
Full Time
Key Responsibilities:
- Lead Tableau Cloud implementation and architecture design, including site structure, user provisioning, data connections, and security models.
- Develop and enforce Tableau governance policies, including naming conventions, folder structure, usage monitoring, and content lifecycle management.
- Guide and mentor Tableau developers and analysts in best practices for visualization design, performance tuning, and metadata management.
- Collaborate with Data Engineering and BI teams to design scalable, secure, and high-performing data models.
- Monitor Tableau Cloud usage, site health, and performance, proactively resolving issues and managing upgrades or configuration changes.
- Work closely with stakeholders to understand business requirements and translate them into high-impact Tableau dashboards and reports.
- Develop technical documentation, training materials, and user guides for Tableau Cloud users and developers.
- Lead or support migration efforts from Tableau Server or on-prem environments to Tableau Cloud.
Qualifications
Required:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 5+ years of experience with Tableau, with at least 1-2 years in Tableau Cloud (SaaS) specifically.
- Strong experience with dashboard design, Tableau calculations (LOD, table calcs), parameters, and actions.
- Solid understanding of data visualization best practices and UX principles.
- Hands-on experience managing Tableau Cloud environments, including content governance, access control, and site administration.
Preferred:
- Tableau Certified Associate or Tableau Certified Professional certification.
- Experience with Tableau Prep, REST API, or other automation tools for Tableau administration.
- Familiarity with DevOps, CI/CD, or version control for BI assets (e.g., Git).
- Experience migrating from Tableau Server to Tableau Cloud.
- Knowledge of data security, privacy, and compliance standards in cloud environments.
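Governance policies like the naming conventions mentioned above are often enforced with small automation scripts run against the site's content inventory. A minimal sketch of such a check, assuming a hypothetical `<DEPT>-<Subject>-v<major>.<minor>` naming convention and an illustrative department list (neither is a stated USEReady standard):

```python
import re

# Hypothetical convention: "FIN-Revenue Overview-v1.2".
# Pattern and department codes are illustrative assumptions only.
NAME_PATTERN = re.compile(
    r"^(?P<dept>[A-Z]{2,5})-(?P<subject>[\w ]+)-v(?P<major>\d+)\.(?P<minor>\d+)$"
)
ALLOWED_DEPTS = {"FIN", "HR", "SALES", "OPS"}

def check_workbook_name(name: str) -> list[str]:
    """Return a list of governance violations for a workbook name (empty = compliant)."""
    violations = []
    m = NAME_PATTERN.match(name)
    if not m:
        violations.append("name does not match <DEPT>-<Subject>-v<major>.<minor>")
        return violations
    if m.group("dept") not in ALLOWED_DEPTS:
        violations.append(f"unknown department code: {m.group('dept')}")
    return violations
```

A scheduled job could run this over every workbook name returned by the site's content listing and flag non-compliant items to their owners.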
Mohali
INR 32.5 - 37.5 Lacs P.A.
Work from Office
Full Time
Job Summary: We are seeking an experienced and detail-oriented Technical Project Manager with strong interpersonal skills to lead and manage Data, Business Intelligence (BI), and Analytics initiatives across single and multiple client engagements. The ideal candidate will have a solid background in data project delivery, knowledge of modern cloud platforms, and familiarity with tools like Snowflake, Tableau, and Power BI. An understanding of AI and machine learning projects is a strong plus. This role requires strong communication and leadership skills, with the ability to translate complex technical requirements into actionable plans and ensure successful, timely, and high-quality delivery with attention to detail.
Key Responsibilities:
Project & Program Delivery
- Manage the full lifecycle of data engineering and analytics projects end-to-end, including data platform migrations, dashboard/report development, and advanced analytics initiatives.
- Define project scope, timelines, milestones, resource needs, and deliverables in alignment with stakeholder objectives.
- Manage budgets, resource allocation, and risk mitigation strategies to ensure successful program delivery.
- Use Agile, Scrum, or hybrid methodologies to ensure iterative delivery and continuous improvement.
- Monitor performance, track KPIs, and adjust plans to maintain scope, schedule, and quality.
- Drive excellence in execution and ensure client satisfaction.
Client & Stakeholder Engagement
- Serve as the primary point of contact for clients and internal teams across all data initiatives.
- Translate business needs into actionable technical requirements and facilitate alignment across teams.
- Conduct regular status meetings, monthly and quarterly reviews, executive updates, and retrospectives.
Manage Large Teams
- Manage up to 50+ resources working on different projects for different clients.
- Work with practice and talent acquisition teams for resourcing needs.
Manage P&L
- Manage allocation, gross margin, utilization, etc. effectively.
Team Coordination
- Lead and coordinate cross-functional teams including data engineers, BI developers, analysts, and QA testers.
- Ensure appropriate allocation of resources across concurrent projects and clients.
- Foster collaboration, accountability, and a results-oriented team culture.
Data, AI, and BI Technology Oversight
- Manage project delivery using modern cloud data platforms.
- Oversee BI development using Tableau and/or Power BI, ensuring dashboards meet user needs and follow visualization best practices.
- Conduct UATs.
- Manage initiatives involving ETL/ELT processes, data modeling, and real-time analytics pipelines.
- Ensure compatibility with data governance, security, and privacy requirements.
- Manage AI/ML projects.
Data & Cloud Understanding
- Oversee delivery of solutions involving cloud data platforms (e.g., Azure, AWS, GCP), data lakes, and modern data stacks.
- Support planning for data migrations, ETL processes, data modeling, and analytics pipelines.
- Be conversant in tools such as Power BI, Tableau, Snowflake, Databricks, Azure Synapse, or BigQuery.
Risk, Quality & Governance
- Identify and mitigate risks related to data quality, project timelines, and resource availability.
- Ensure adherence to governance, compliance, and data privacy standards (e.g., GDPR, HIPAA).
- Maintain thorough project documentation including charters, RACI matrices, RAID logs, and retrospectives.
Qualifications: Bachelor's degree in Computer Science, Information Systems, Business, or a related field.
Certifications (Preferred):
- PMP, PRINCE2, or Certified ScrumMaster (CSM)
- Cloud certifications (e.g., AWS Cloud Practitioner, Azure Fundamentals, Google Cloud Certified)
- BI/analytics certifications (e.g., Tableau Desktop Specialist, Power BI Data Analyst Associate, DA-100)
Must-Have Skills: Strong communication and interpersonal skills; ability to work collaboratively; excellent organizational skills; stakeholder management; customer management; people management; contract management; risk & compliance management; C-suite reporting; team management; resourcing; experience using tools like JIRA, MS Plan, etc.
Desirable Skills:
- 15 years of IT experience with 8+ years of proven project management experience delivering data, AI/ML, and BI/analytics projects.
- Experience delivering projects with cloud platforms (e.g., Azure, AWS, GCP) and data platforms like Snowflake.
- Proficiency in managing BI projects, preferably with Tableau and/or Power BI. Knowledge of or hands-on experience with legacy tools is a plus.
- Solid understanding of the data lifecycle, including ingestion, transformation, visualization, and reporting.
- Comfortable using PM tools like Jira, Azure DevOps, Monday.com, or Smartsheet.
- Experience managing projects involving data governance, metadata management, or master data management (MDM).
Mohali
INR 9.0 - 13.0 Lacs P.A.
Work from Office
Full Time
Key Responsibilities:
- Lead Tableau Cloud implementation and architecture design, including site structure, user provisioning, data connections, and security models.
- Develop and enforce Tableau governance policies, including naming conventions, folder structure, usage monitoring, and content lifecycle management.
- Guide and mentor Tableau developers and analysts in best practices for visualization design, performance tuning, and metadata management.
- Collaborate with Data Engineering and BI teams to design scalable, secure, and high-performing data models.
- Monitor Tableau Cloud usage, site health, and performance, proactively resolving issues and managing upgrades or configuration changes.
- Work closely with stakeholders to understand business requirements and translate them into high-impact Tableau dashboards and reports.
- Develop technical documentation, training materials, and user guides for Tableau Cloud users and developers.
- Lead or support migration efforts from Tableau Server or on-prem environments to Tableau Cloud.
Qualifications
Required:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 5+ years of experience with Tableau, with at least 1-2 years in Tableau Cloud (SaaS) specifically.
- Strong experience with dashboard design, Tableau calculations (LOD, table calcs), parameters, and actions.
- Solid understanding of data visualization best practices and UX principles.
- Hands-on experience managing Tableau Cloud environments, including content governance, access control, and site administration.
Preferred:
- Tableau Certified Associate or Tableau Certified Professional certification.
- Experience with Tableau Prep, REST API, or other automation tools for Tableau administration.
- Familiarity with DevOps, CI/CD, or version control for BI assets (e.g., Git).
- Experience migrating from Tableau Server to Tableau Cloud.
- Knowledge of data security, privacy, and compliance standards in cloud environments.
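The REST API automation mentioned under "Preferred" starts with authentication. Tableau's REST API sign-in call takes an XML request body; a minimal sketch of building that payload for a personal access token, with placeholder token and site names (the payload shape follows Tableau's REST API documentation, but endpoint version and credentials here are illustrative):

```python
import xml.etree.ElementTree as ET

def build_signin_request(pat_name: str, pat_secret: str, site_content_url: str) -> bytes:
    """Build the XML body for Tableau's REST API 'Sign In' call
    using a personal access token."""
    root = ET.Element("tsRequest")
    creds = ET.SubElement(root, "credentials", {
        "personalAccessTokenName": pat_name,
        "personalAccessTokenSecret": pat_secret,
    })
    # contentUrl identifies the target site; "" means the default site.
    ET.SubElement(creds, "site", {"contentUrl": site_content_url})
    return ET.tostring(root)
```

The resulting body would be POSTed to the site's `auth/signin` endpoint; in practice, the `tableauserverclient` Python library wraps this and the rest of the administration API.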
Chandigarh
INR 13.0 - 17.0 Lacs P.A.
Work from Office
Full Time
We are looking for an Intern to support senior leadership in managing daily operations. The ideal candidate will be skilled in information collation, research, data analysis, and stakeholder communication to ensure seamless executive support.
Key Responsibilities:
- Collate information and summarize it in Word documents.
- Perform research based on specific goals (e.g., account satisfaction research).
- Review data from executive dashboards and summarize findings.
- Follow up with stakeholders on meeting preparation and readiness.
Requirements:
- MBA degree required.
- Proficiency in Microsoft Office (Word, Excel, PowerPoint); knowledge of data visualization tools is a plus.
- Excellent communication and stakeholder management skills.
- Ability to prioritize tasks and manage multiple deadlines efficiently.
- Strong analytical skills and ability to interpret data effectively.
Pune, Gurugram, Bengaluru
INR 25.0 - 30.0 Lacs P.A.
Work from Office
Full Time
Title: Senior Data Developer with Strong MS/Oracle SQL and Python Skills and Critical Thinking
Description: The EDA team seeks a dedicated and detail-oriented Senior Developer I to join our dynamic team. The successful candidate will handle recurring technical tasks such as Healthy Planet MS SQL file loads into a data warehouse, monitor Airflow DAGs, manage alerts, and rerun failed processes. The role also involves monitoring various daily and weekly jobs, which may include generating revenue cycle reports and delivering data to external vendors. The ideal candidate will have robust experience with MS/Oracle SQL, Python, Epic Health Systems, and other relevant technologies.
Overview: As a Senior Developer I on the NYU EDA team, you will play a vital role in improving the operation of our data load and management processes. Your primary responsibilities will be to ensure the accuracy and timeliness of data loads, maintain the health of data pipelines, and verify that all scheduled jobs complete successfully. You will collaborate with cross-functional teams to identify and resolve issues, improve processes, and maintain a high standard of data integrity.
Responsibilities:
- Manage and perform Healthy Planet file loads into a data warehouse.
- Monitor Airflow DAGs for successful completion, manage alerts, and rerun failed tasks as necessary.
- Monitor and oversee other daily and weekly jobs, including FGP cash reports and external reports.
- Collaborate with the data engineering team to streamline data processing workflows.
- Develop automation scripts in SQL and Python to reduce manual intervention in repetitive tasks.
- Ensure all data-related tasks are performed accurately and on time.
- Investigate and resolve data discrepancies and processing issues.
- Prepare and maintain documentation for processes and workflows.
- Conduct periodic data audits to ensure data integrity and compliance with defined standards.
Skillset Requirements:
- MS/Oracle SQL
- Python
- Data warehousing and ETL processes
- Monitoring tools such as Apache Airflow
- Data quality and integrity assurance
- Strong analytical and problem-solving abilities
- Excellent written and verbal communication
Additional Skillset:
- Familiarity with monitoring and managing Apache Airflow DAGs.
Experience: Minimum of 5 years' experience in a similar role, with a focus on data management and process automation, and a proven track record of successfully managing complex data processes and meeting deadlines.
Education: Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
Certifications: Certifications in Epic Cogito, MS/Oracle SQL, Python, or data management are a plus.
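The periodic data audits described above typically boil down to automated row-count and null checks against warehouse tables. A minimal sketch in Python, shown here against SQLite for self-containment (the warehouse would be MS/Oracle SQL in practice; the table and column names are illustrative):

```python
import sqlite3

def audit_table(conn: sqlite3.Connection, table: str, required_columns: list[str]) -> dict:
    """Basic integrity audit: total row count plus NULL counts per required column.
    'passed' means the table is non-empty and has no NULLs in required columns."""
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    row_count = cur.fetchone()[0]
    null_counts = {}
    for col in required_columns:
        cur.execute(f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL")
        null_counts[col] = cur.fetchone()[0]
    return {
        "table": table,
        "row_count": row_count,
        "null_counts": null_counts,
        "passed": row_count > 0 and all(v == 0 for v in null_counts.values()),
    }
```

A scheduler such as Airflow could run this audit daily after each load and raise an alert when `passed` is false.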
Bengaluru
INR 18.0 - 33.0 Lacs P.A.
Remote
Full Time
USEReady is a data and analytics firm that provides the strategies, tools, capability, and capacity that businesses need to turn their data into a competitive advantage. USEReady partners with cloud and data ecosystem leaders like Tableau, Salesforce, Snowflake, and Amazon Web Services, and has been named Tableau Partner of the Year multiple times. More details below:
Location: Remote
Technical Lead - Analytics, with senior-level expertise and experience in Tableau. The candidate needs to:
1) Be at the forefront of creating visually impactful Tableau dashboards.
2) Have expertise in translating business requirements into meaningful insights.
3) Have experience telling a story using a Tableau dashboard.
4) Be able to lead others by example and through coaching.
5) Have end-to-end expertise in the Tableau platform, including how to use its different features, best practices around administration, and familiarity with how Tableau can integrate with other platforms.
Note: Not required, but it would be helpful if the candidate understands Tableau architecture principles.
The candidate must have excellent communication skills. Tableau certification(s) are expected. The candidate will need to showcase their Tableau dashboards as part of the consideration process.