
4127 Data Visualization Jobs - Page 13

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

7.0 - 10.0 years

10 - 14 Lacs

Noida

Work from Office

About the Job: We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. In this pivotal role, you will be instrumental in driving our data engineering initiatives, with a strong emphasis on leveraging Dataiku's capabilities to enhance data processing and analytics. You will be responsible for designing, developing, and optimizing robust data pipelines, ensuring seamless integration of diverse data sources, and maintaining high data quality and accessibility to support our business intelligence and advanced analytics projects. This role requires a unique blend of expertise in traditional data engineering principles, advanced data modeling, and a forward-thinking approach to integrating cutting-edge AI technologies, particularly LLM Mesh for Generative AI applications. If you are passionate about building scalable data solutions and are eager to explore the cutting edge of AI, we encourage you to apply.
Key Responsibilities:
- Dataiku Leadership: Drive data engineering initiatives with a strong emphasis on leveraging Dataiku capabilities for data preparation, analysis, visualization, and the deployment of data solutions.
- Data Pipeline Development: Design, develop, and optimize robust and scalable data pipelines to support various business intelligence and advanced analytics projects, including developing and maintaining ETL/ELT processes to automate data extraction, transformation, and loading from diverse sources.
- Data Modeling & Architecture: Apply expertise in data modeling techniques to design efficient and scalable database structures, ensuring data integrity and optimal performance.
- ETL/ELT Expertise: Implement and manage ETL processes and tools to ensure efficient and reliable data flow, maintaining high data quality and accessibility.
- Gen AI Integration: Explore and implement solutions leveraging LLM Mesh for Generative AI applications, contributing to the development of innovative AI-powered features.
- Programming & Scripting: Utilize programming languages such as Python and SQL for data manipulation, analysis, automation, and the development of custom data solutions.
- Cloud Platform Deployment: Deploy and manage scalable data solutions on cloud platforms such as AWS or Azure, leveraging their respective services for optimal performance and cost-efficiency.
- Data Quality & Governance: Ensure seamless integration of data sources, maintaining high data quality, consistency, and accessibility across all data assets; implement data governance best practices.
- Collaboration & Mentorship: Collaborate closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver impactful solutions; potentially mentor junior team members.
- Performance Optimization: Continuously monitor and optimize the performance of data pipelines and data systems.
Required Skills & Experience:
- Proficiency in Dataiku: Demonstrable expertise in Dataiku for data preparation, analysis, visualization, and building end-to-end data pipelines and applications.
- Expertise in Data Modeling: Strong understanding and practical experience in various data modeling techniques (e.g., dimensional modeling, Kimball, Inmon) to design efficient and scalable database structures.
- ETL/ELT Processes & Tools: Extensive experience with ETL/ELT processes and a proven track record of using various ETL tools (e.g., Dataiku's built-in capabilities, Apache Airflow, Talend, SSIS).
- Familiarity with LLM Mesh: Familiarity with LLM Mesh or similar frameworks for Gen AI applications, understanding its concepts and potential for integration.
- Programming Languages: Strong proficiency in Python for data manipulation, scripting, and developing data solutions; solid command of SQL for complex querying, data analysis, and database interactions.
- Cloud Platforms: Knowledge and hands-on experience with at least one major cloud platform (AWS or Azure) for deploying and managing scalable data solutions (e.g., S3, EC2, Azure Data Lake, Azure Synapse).
- Gen AI Concepts: Basic understanding of Generative AI concepts and their potential applications in data engineering.
- Problem-Solving: Excellent analytical and problem-solving skills with a keen eye for detail.
- Communication: Strong communication and interpersonal skills to collaborate effectively with cross-functional teams.
Bonus Points (Nice to Have):
- Experience with other big data technologies (e.g., Spark, Hadoop, Snowflake).
- Familiarity with data governance and data security best practices.
- Experience with MLOps principles and tools.
- Contributions to open-source projects related to data engineering or AI.
Education: Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related quantitative field.

Posted 2 days ago

Apply

5.0 - 7.0 years

17 - 20 Lacs

Noida

Work from Office

Job Summary: We are seeking a skilled Data Scientist with expertise in AI orchestration and embedded systems to support a sprint-based Agile implementation focused on integrating generative AI capabilities into enterprise platforms such as Slack, Looker, and Confluence. The ideal candidate will have hands-on experience with Gemini and a strong understanding of prompt engineering, vector databases, and orchestration infrastructure.
Key Responsibilities:
- Develop and deploy Slack-based AI assistants leveraging Gemini models.
- Design and implement prompt templates tailored to enterprise data use cases (Looker and Confluence).
- Establish and manage an embedding pipeline for Confluence documentation.
- Build and maintain orchestration logic for prompt execution and data retrieval.
- Set up API authentication and role-based access controls for integrated systems.
- Connect and validate vector store operations (e.g., Pinecone, Weaviate, or the Snowflake vector extension).
- Contribute to documentation, internal walkthroughs, and user acceptance testing planning.
- Participate in Agile ceremonies including daily standups and sprint demos.
Required Qualifications:
- Proven experience with Gemini and large language model deployment in production environments.
- Proficiency in Python, orchestration tools, and prompt engineering techniques.
- Familiarity with vector database technologies and embedding workflows.
- Experience integrating APIs for data platforms such as Looker and Confluence.
- Strong understanding of access control frameworks and enterprise-grade authentication.
- Demonstrated success in Agile, sprint-based project environments.
Preferred Qualifications:
- Experience working with Slack app development and deployment.
- Background in MLOps, LLMOps, or AI system orchestration at scale.
- Excellent communication skills and ability to work in cross-functional teams.

Posted 2 days ago

Apply

10.0 - 12.0 years

10 - 14 Lacs

Gurugram

Work from Office

Role Responsibilities:
- Collaborate with stakeholders to understand reporting requirements and translate them into interactive visualizations.
- Design and develop Power BI reports and dashboards that provide actionable insights to the business.
- Create detailed wireframes and prototypes using Figma to effectively communicate design ideas.
- Implement best practices for data visualization and ensure reports are intuitive and user-friendly.
- Develop and maintain data models for Power BI to support analytical processes.
- Conduct data analysis to identify trends and patterns that drive business decisions.
- Provide training and support to end-users regarding dashboard functionalities.
- Work with cross-functional teams to gather requirements and feedback for continuous improvement.
- Test and validate data accuracy and integrity across all reports and dashboards.
- Implement data governance best practices to ensure compliance and security.
- Stay updated with the latest Power BI features and UI/UX design trends.
- Assist in project management activities to ensure timely delivery of projects.
- Create documentation for report development processes and user guides.
- Support ad-hoc reporting requests as needed by stakeholders.
- Contribute to a positive team environment by mentoring junior staff and sharing knowledge.
Qualifications:
- Bachelor's degree in Computer Science, Data Science, or a related field.
- Minimum of 10 years of experience in Power BI consulting and data visualization.
- Proficiency in Figma for UI/UX design.
- Strong understanding of wireframing principles and design thinking.
- Hands-on experience with data analysis and data modeling.
- Excellent problem-solving abilities with a keen eye for detail.
- Strong communication skills and the ability to engage with stakeholders.
- Experience working within an Agile project environment.
- Ability to manage multiple projects and deadlines.
- Strong knowledge of SQL and data querying languages.
- Familiarity with DAX and Power Query.
- Experience with data governance practices.
- Ability to provide effective user training and support.
- Solid understanding of business intelligence tools and methodologies.
- Self-motivated and able to work independently in a remote environment.

Posted 2 days ago

Apply

2.0 - 3.0 years

3 - 4 Lacs

Mumbai

Work from Office

We are looking for a Python Developer with expertise in Python, SQL (PostgreSQL), Pandas, NumPy, Django, and AWS services (Lambda, S3, RDS, EC2). You will be part of developing and optimizing scalable solutions, enhancing performance, and driving innovation. Benefits: Provident fund.

Posted 2 days ago

Apply

2.0 - 3.0 years

3 - 4 Lacs

Nagpur

Work from Office

Responsibilities: Data Modeling & Integration; Report & Dashboard Development; Data Transformation; Collaboration; Performance Optimization; Security & Access Control; Training & Support.
Qualification: Graduation in an IT or CS field.
Requirements:
- Proven experience as a Power BI Engineer or BI Developer, with a solid understanding of data modeling, visualization, and reporting.
- Proficiency in Power BI Desktop, Power BI Service, Power Query, DAX, and Power BI Gateway.
- Strong experience with SQL and data integration from different sources (e.g., databases, APIs, cloud storage).
- Strong analytical and problem-solving skills with attention to detail.
- Excellent communication skills and the ability to work in a collaborative team environment.

Posted 2 days ago

Apply

1.0 - 7.0 years

3 - 9 Lacs

Bengaluru

Work from Office

Design, develop, and implement machine learning models and statistical algorithms. Analyze large datasets to extract meaningful insights and trends. Collaborate with stakeholders to define business problems and deliver data-driven solutions. Optimize and scale machine learning models for production environments. Present analytical findings and recommendations in a clear, actionable manner.
Key Skills:
- Proficiency in Python, R, and SQL.
- Experience with ML libraries like TensorFlow, PyTorch, or Scikit-learn.
- Strong knowledge of statistical methods and data visualization tools.
- Excellent problem-solving and storytelling skills.

Posted 2 days ago

Apply

7.0 - 8.0 years

9 - 10 Lacs

Pune

Work from Office

Develop, implement, and maintain governance frameworks tailored to divisional needs, ensuring consistency with corporate policies and standards.
Data Analysis and Reporting: Use Excel and Power BI to analyze project data, generate insights, and produce comprehensive reports for stakeholders, enabling informed decision-making. Facilitate communication and collaboration between divisional teams and corporate governance bodies, ensuring alignment and transparency. Develop and deliver reports on governance performance metrics to divisional and corporate leadership.
Candidates should have 7-8 years of total experience, with exposure to governance roles. Requirements include good communication skills and attention to detail; excellent analytical and problem-solving skills; strong proficiency in MS Excel and Power BI, with the ability to create complex data models and visualizations; and effective communication and interpersonal skills, with the ability to collaborate across teams.

Posted 2 days ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Mumbai

Work from Office

Job Description - Roles and Responsibilities:
- Track multiple data sources for monthly and fortnightly portfolio, factsheet, and fund performance within the given time schedule, and update compositions for all mutual fund, insurance, and pension fund schemes in the in-house analytics tool.
- Search multiple data sources for other data set collections and update them in the data extraction tool.
- Maintain operational efficiency.
- Conduct quality checks and validate all figures and disclosures prior to final publishing; after final publication, maintain quality checks with observations.
- Manage timelines to ensure timely release of portfolios, factsheets, and fund performance.
- Create and maintain internal MIS reports and maintain an audit trail of published data.
- Coordinate regularly with senior team members during operational work.
- Perform web data mining and gather additional information to substantiate more content.
Skills, Knowledge & Training:
- Basic knowledge of funds, preferably mutual funds, insurance, and pension funds.
- Service-oriented attitude.
- Strong arithmetic base.
- Excellent attention to detail; ability to prioritize and multitask.
- Good communication skills, accuracy, time management, and coordination.
- Good working knowledge of MS Word, MS Excel, etc.

Posted 2 days ago

Apply

5.0 - 10.0 years

18 Lacs

Pune, Chennai, Bengaluru

Hybrid

Hello Folks, we have an excellent job opportunity with one of the MNCs based in Bangalore/Pune/Chennai/Gurugram/Hyderabad. Please find the job description below:
- 5+ years of hands-on Tableau experience: dashboard creation, data blending, parameters, calculated fields.
- Good understanding of data modeling, ETL, and data warehousing concepts.
- SQL (PL/SQL/T-SQL).
- Ability to work independently and collaboratively in a fast-paced environment.
- Strong analytical and problem-solving skills.
Interested candidates, please reply with your updated resume to nirbhi.dixit@reynasolutions.com

Posted 2 days ago

Apply

1.0 - 3.0 years

2 - 3 Lacs

Bengaluru

Work from Office

WALK-IN ALERT | Data Analyst Role (1-2 Years Experience)
WyzMindz Solutions Private Limited is hiring talented Data Analysts with 1-2 years of experience to join our growing team in Bengaluru!
Walk-In Date: Friday, 18th July 2025. Reporting Time: between 10:00 AM and 11:00 AM. Note: candidates must carry their laptop for the MS Excel live test.
Job Description:
- Understand business requirements and solve problems using data analytics and visualization.
- Present insights to stakeholders for data-driven decision-making.
- Analyze complex data sets and prepare insightful reports.
- Meet all Insights & Reporting SLAs.
- Build strong relationships with team members.
Technical Skills Required:
- Strong experience in BI tools (Power BI, QlikView): dashboards, reports, automation.
- Knowledge of SQL.
- Advanced Excel skills: Power Query, DAX, macros, etc.
- Solid understanding of databases, schemas, and dimensional modelling.
Selection Procedure (at WyzMindz office only):
- Stage 1: Extempore for 2 minutes.
- Stage 2: Written test (mental ability) and MS Excel live test.
- Stage 3: Personal interview.
Walk-In Venue: WyzMindz Solutions Private Limited, AROHANA, 19/3, 3rd Floor, Srinivasa Industrial Estate, behind RMS International School & PU College, Kanakapura Rd, Konanakunte, Bengaluru, Karnataka 560062. Landmark: near Yelachenahalli Metro Station, Kanakapura Road, Metro Pillar No. 127. Google Maps: https://goo.gl/maps/mNN9R37hG4UsP4rN8
CTC: 25,000 per month.
If you have 1-2 years of experience in data analytics and want to grow your career, walk in and meet us on 18th July!

Posted 2 days ago

Apply

6.0 - 10.0 years

8 - 13 Lacs

Chennai, Bengaluru

Work from Office

Career Area: Technology, Digital and Data
Your Work Shapes the World at Caterpillar Inc. When you join Caterpillar, you're joining a global team who cares not just about the work we do but also about each other. We are the makers, problem solvers, and future world builders who are creating stronger, more sustainable communities. We don't just talk about progress and innovation here, we make it happen, with our customers, where we work and live. Together, we are building a better world, so we can all enjoy living in it.
JOB PURPOSE: The Lead Management Product team is hiring! Will you embrace that challenge? The Lead Management Product Team's mission is to deliver qualitative sales and marketing leads through our digital application to our internal customers, Industries and Dealers, in order to drive services revenue through our digital product.
JOB DUTIES: At the interface of business and analytics, the Lead Management analyst is a key player in connecting these two worlds. The analyst's main mission is to leverage various datasets and develop and sustain services leads in partnership with sales and marketing teams. The individual will also participate in specific projects aimed at further strengthening the lead management digital product. Responsibilities will include:
- Gather and implement business requirements for leads development.
- Analyze feedback information to improve lead quality.
- Provide business recommendations to internal partners based on quantitative and qualitative data.
- Improve, enhance, and develop specific datasets based on opportunities.
- Drive process improvement and automation of the lead generation application.
- Coordinate with the Data & BI team.
- Provide user data support and training.
The Lead Management analyst demonstrates excellent communication skills, being able to explain lead management analytics to customers with limited data knowledge and experience while also interacting with Data Engineers on technical details. He or she is a fast learner and is able to find creative ways to transform data into sales and marketing recommendations. Additionally, he or she should exhibit strong planning and organization skills.
If you like working in a challenging yet rewarding analytical international environment, truly believe that data can be translated into valuable and monetized information, like helping internal customers beyond initial requirements, and want to grow and acquire new skills, do not hesitate to apply!
Basic Requirements:
- Master's or Bachelor's degree in Finance, Accounting, Data Science, Engineering, Business Administration, Economics, Marketing, Law, or another field.
- Advanced SQL knowledge.
- Business knowledge in a specific area or dealer exposure.
- Excellent analytical skills: ability to interpret data to solve business issues and simplify results for a large and diverse audience.
- Ability to manipulate data and create meaningful visualizations.
- Aftermarket services understanding.
- Basic programming skills in any language, preferably Python.
Top Candidates Will Also Have:
- Alteryx, Tableau, or other data visualization software.
- Advanced programming skills in any language.
- Product knowledge, applications knowledge, dealer knowledge.
- Project management experience and a track record of getting things done.
- Fast learner; ability to work independently and reach out in a timely and relevant manner.
- Good communication and presentation skills; ability to influence other groups and teams.
Caterpillar is an Equal Opportunity Employer. Not ready to apply? Join our Talent Community.

Posted 2 days ago

Apply

4.0 - 9.0 years

15 - 30 Lacs

Noida, Pune, Bengaluru

Hybrid

Location: All EXL Locations (Hybrid Mode). Salary: 15 to 35 LPA. Experience: 5 to 10 years. Notice period: 15 days or less.
Competencies: Analytical Thinking and Problem Solving, Communication, Teamwork, Flexibility.
Skills: Agile, Tableau, BigQuery, SQL, T-SQL, data modelling, Snowflake, Redshift, CDP Platform, Power BI, Data Visualization.
Job Description:
- Design, develop, and maintain dashboards and reports using Sigma Computing.
- Collaborate with business stakeholders to understand data requirements and deliver actionable insights.
- Write and optimize SQL queries that run directly on cloud data warehouses.
- Enable self-service analytics for business users via Sigma's spreadsheet interface and templates.
- Apply row-level security and user-level filters to ensure proper data access controls.
- Partner with data engineering teams to validate data accuracy and ensure model alignment.
- Troubleshoot performance or data issues in reports and dashboards.
- Train and support users on Sigma best practices, tools, and data literacy.
Required Skills & Qualifications:
- 5+ years of experience in Business Intelligence, Analytics, or Data Visualization roles.
- Hands-on experience with Sigma Computing is highly preferred.
- Strong SQL skills and experience working with cloud data platforms (Snowflake, BigQuery, Redshift, etc.).
- Experience with data modeling concepts and modern data stacks.
- Ability to translate business requirements into technical solutions.
- Familiarity with data governance, security, and role-based access controls.
- Excellent communication and stakeholder management skills.
- Experience with Looker, Tableau, Power BI, or similar tools (for comparative insight).
- Familiarity with dbt, Fivetran, or other ELT/ETL tools.
- Exposure to Agile or Scrum methodologies.
Follow to keep yourself updated about future job openings: linkedin.com/in/sonali-nayakawad-088b19199

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

As a member of the Security Solutions, Platform and Analytics (SPA) team at Snowflake, your primary responsibility will be to develop custom solutions that enhance the security of Snowflake's Data Cloud. Leveraging your expertise in SQL, Python, and security domain knowledge, you will analyze security logs and event data to translate security requirements into effective technical solutions. Your role will involve developing advanced analytics techniques and scalable solutions to identify patterns, anomalies, and trends in security data.
In this role at Snowflake, you will have the opportunity to:
- Develop and optimize data pipelines, data models, and visualization dashboards for security analytics.
- Design and implement scalable automated solutions in collaboration with various security teams.
- Take ownership of database management tasks, including data modeling and performance optimization.
- Utilize tools like DBT to streamline data transformations and ensure high data quality.
- Conduct research and propose innovative approaches to enhance security posture.
- Translate security requirements into technical solutions that align with organizational goals.
To be successful in this role, we are looking for candidates who possess:
- A Bachelor's degree in Computer Science, Information Security, or a related field.
- 5-8 years of experience in Data Analytics with strong SQL and Python skills.
- Experience in data visualization, DBT, and data pipeline development.
- Hands-on experience with Snowflake; familiarity with Cortex functions is a plus.
- A strong understanding of databases, data modeling, and data warehousing.
- Security domain knowledge, including experience with SIEM systems and threat intelligence platforms.
- A proven ability to analyze complex security events and effectively communicate findings.
Joining our team at Snowflake offers you the opportunity to work with cutting-edge technology and contribute to the security of a rapidly growing data platform. We value innovation, continuous learning, and the chance to make a significant impact on enterprise-scale security solutions. Snowflake is committed to growth and is seeking individuals who share our values, challenge conventional thinking, and drive innovation while building a future for themselves and Snowflake. If you are interested in making an impact and contributing to our team, we encourage you to explore the job posting on the Snowflake Careers Site for salary and benefits information (careers.snowflake.com).

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

We are looking for a highly skilled and motivated SQL Developer with expertise in PostgreSQL or Oracle databases, along with proficiency in Java and/or Python. Your responsibilities will include designing, developing, and maintaining robust and scalable database solutions. You will also be integrating these solutions with Java and/or Python applications, optimizing database performance, ensuring data integrity, and collaborating with cross-functional teams to deliver high-quality software.
Your main responsibilities will involve designing, developing, and implementing complex SQL queries, stored procedures, functions, and triggers for PostgreSQL or Oracle databases. Additionally, you will optimize database performance through indexing, query tuning, and schema improvements. You will work closely with application developers to integrate database solutions with Java and/or Python applications and develop and maintain ETL processes for data migration and integration. Collaboration with business analysts to understand data requirements, ensuring data integrity, security, and availability, and performing database performance monitoring, troubleshooting, and tuning will also be part of your role. You will participate in database design reviews, provide recommendations for best practices, and develop documentation for database designs, processes, and procedures. Supporting existing database systems and applications, including on-call rotation as needed, and staying up-to-date with the latest database technologies and best practices in PostgreSQL, Oracle, Java, and Python are also essential parts of your responsibilities.
Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent practical experience).
- 3+ years of experience as an SQL Developer.
- Strong proficiency in PostgreSQL or Oracle databases, including expertise in writing complex SQL queries, stored procedures, functions, and triggers.
- Experience with database design, normalization, and optimization techniques.
- Proficiency in Java or Python programming languages.
- Experience with version control systems such as Git.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team in a fast-paced environment.
Preferred Skills (Bonus Points):
- Experience with both PostgreSQL and Oracle databases.
- Familiarity with cloud platforms (AWS, Azure, GCP) and their database services.
- Experience with CI/CD pipelines and DevOps practices.
- Knowledge of data warehousing concepts and tools.
- Experience with data visualization tools such as Tableau or Power BI.
- Familiarity with NoSQL databases like MongoDB or Cassandra.

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

The Spec Analytics Intmd Analyst role is a developing professional position where you will independently handle most problems and have the freedom to solve complex issues. You will utilize your in-depth specialty knowledge along with industry standards to integrate with the team and achieve sub-function/job family objectives. Your role will involve applying analytical thinking and data analysis tools to make judgments and recommendations based on factual information with attention to detail. You will deal with variable issues that could have broader business impacts and will have a direct influence on the core activities of the business through close contact. Your communication and diplomacy skills will be crucial to exchange potentially complex information effectively.
Responsibilities:
- Work with large and complex data sets to evaluate, recommend, and support business strategies.
- Identify and compile data sets using tools like SQL and Access to predict, improve, and measure business outcomes.
- Document data requirements, data collection, processing, cleaning, and exploratory data analysis.
- Specialize in the marketing, risk, digital, and AML fields.
- Assess risk in business decisions with consideration for the firm's reputation and compliance with laws and regulations.
Skills and Experience - as a successful candidate, you should ideally have:
- 5+ years of experience in data science, machine learning, or related fields.
- Advanced process management skills; detail-oriented and organized.
- Curiosity and willingness to learn new skill sets, particularly in artificial intelligence.
- Strong programming skills in Python and proficiency in data science libraries such as scikit-learn, TensorFlow, PyTorch, and Transformers.
- Experience with statistical modeling techniques, data visualization tools, and GenAI solutions.
- Familiarity with agentic AI frameworks and MLOps.
Education: Bachelor's/University degree or equivalent experience.
This job description provides an overview of the work performed; additional duties may be assigned as needed.

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

As a Senior Specialist, HR Data and Digital at NTT DATA, your primary focus will be on innovating HR platforms and data constructs. You will collaborate closely with HR, IT, and finance teams to ensure alignment and collaboration within the organization. Your responsibilities will include regular reviews to maintain data integrity, testing system changes, report writing, and analyzing data flows. You will extract and compile data, write reports using appropriate tools, and provide support for HR platforms like Workday, SuccessFactors, and Phenom People. Additionally, you will participate in major release reviews and integration testing, maintain HRIS procedures and documentation, and manage HR data and digital projects.
To excel in this role, you should have a strong understanding of HR data management principles, data analytics concepts, and data governance. You should be familiar with HR technology systems, data privacy regulations, and emerging digital trends in HR. Proficiency in data analysis tools, attention to detail, problem-solving skills, and effective communication are essential for success in this role.
Academically, a Bachelor's degree in Information Technology or a related field is required, along with certifications such as Workday, SuccessFactors, Lean Six Sigma Black Belt, and Certified Maintenance & Reliability Professional. Previous experience with HRIS platforms, talent analytics, and digital HR projects is crucial for this role.
This is a hybrid working role with an equal opportunity employer. If you are looking to drive innovation in HR, optimize processes, and enhance employee experiences, this role at NTT DATA could be the perfect fit for you.

Posted 2 days ago

Apply

2.0 - 6.0 years

0 Lacs

kollam, kerala

On-site

As a Marketing Analyst/Marketing Manager at our company based in Kollam, Kerala, you will be responsible for leveraging your Bachelor's/Master's degree in Marketing, Business Analytics, or a similar field to drive insightful marketing strategies. Ideally, you hold a BTech with an MBA in Marketing, which is a preferred qualification for this role. With 2 to 5 years of experience in roles such as Marketing Analyst or Digital Marketing Analyst, particularly within the EdTech or online education sector, you are well-equipped to excel in this position.
Your key responsibilities will include data collection and analysis from various sources such as Google Analytics, CRM, ad platforms, and social media. By conducting thorough market research, competitor analysis, and consumer behavior studies, you will identify trends and opportunities to enhance our marketing strategies. Developing and maintaining comprehensive dashboards and reports to monitor key marketing KPIs across all channels will be crucial for tracking performance metrics like website traffic, conversion rates, cost per acquisition, and organic search performance.
Furthermore, you will play a pivotal role in defining and evolving our brand positioning at Amrita Online, identifying differentiation opportunities, and leading integrated brand campaigns across digital channels. Monitoring organic search performance, analyzing website crawlability, and providing recommendations for SEO improvements will be essential to boost organic visibility and traffic. By leveraging your insights from data, you will optimize digital marketing campaigns and contribute to forecasting and budget allocation based on robust analysis.
To excel in this role, you should possess proficiency in web analytics tools such as Google Analytics 4 and Google Search Console, as well as expertise in Microsoft Excel, including advanced functions and pivot tables. Experience with data visualization tools like Tableau and Power BI, familiarity with marketing automation and CRM platforms, and a good understanding of SEO tools and principles are required. Additionally, basic knowledge of SQL is considered a plus.
If you are passionate about driving data-focused marketing strategies, optimizing digital campaigns, and contributing to brand development in a dynamic environment, we encourage you to apply for this non-teaching position before the deadline on July 26, 2025.

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

You should have a degree in Computer Science, Artificial Intelligence, Machine Learning, or a related field. You must possess solid experience in developing and implementing generative AI models, with a deep understanding of techniques like GPT, VAE, and GANs. Proficiency in Python and experience with machine learning libraries like TensorFlow, PyTorch, or Keras is required. Strong knowledge of data structures, algorithms, and software engineering principles is essential. Familiarity with cloud-based platforms such as AWS, GCP, or Azure is expected.
Experience with natural language processing (NLP) techniques and tools like SpaCy, NLTK, or Hugging Face is necessary. Knowledge of data visualization tools and libraries such as Matplotlib, Seaborn, or Plotly is preferred. Understanding of software development methodologies like Agile or Scrum is important.
Excellent problem-solving skills are a must, with the ability to think critically and creatively to develop innovative AI solutions. Strong communication skills are required to effectively convey complex technical concepts to a diverse audience. A proactive mindset is essential, with the ability to work independently and collaboratively in a fast-paced, dynamic environment.

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

At PwC, our focus in data and analytics is on leveraging data to drive insights and make informed business decisions. We utilize advanced analytics techniques to assist clients in optimizing their operations and achieving strategic goals. In the realm of data analysis at PwC, you will primarily be tasked with utilizing advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. Your role will involve leveraging skills in data manipulation, visualization, and statistical modeling to support clients in resolving complex business problems.
In this position, you will be expected to build meaningful client connections and learn how to effectively manage and inspire others. By navigating increasingly complex situations, you will have the opportunity to grow your personal brand, deepen technical expertise, and enhance awareness of your strengths. Anticipating the needs of both your teams and clients and delivering quality results will be a key aspect of your responsibilities. Embracing ambiguity will be essential, as you will need to be comfortable when the path forward is unclear, ask pertinent questions, and view these moments as opportunities for personal growth.
Examples of the skills, knowledge, and experiences required to lead and deliver value at this level include but are not limited to:
- Responding effectively to diverse perspectives, needs, and feelings of others.
- Using a broad range of tools, methodologies, and techniques to generate new ideas and solve problems.
- Employing critical thinking to break down complex concepts.
- Understanding the broader objectives of your project or role and how your work aligns with the overall strategy.
- Developing a deeper understanding of the business context and its evolving nature.
- Using reflection to enhance self-awareness, strengthen strengths, and address development areas.
- Interpreting data to derive insights and make recommendations.
- Upholding and reinforcing professional and technical standards, such as specific PwC tax and audit guidance, the Firm's code of conduct, and independence requirements.
As a Senior Associate, you will work as part of a team of problem solvers, contributing to the resolution of complex business issues from strategy to execution. Responsibilities at this management level include but are not limited to:
- Developing self-awareness and personal strengths through feedback and reflection.
- Delegating tasks to others to provide growth opportunities and coaching for delivering results.
- Demonstrating critical thinking and the ability to organize unstructured problems.
- Utilizing a wide range of tools and techniques to extract insights from current industry or sector trends.
- Reviewing your work and that of others for quality, accuracy, and relevance.
- Knowing how and when to use appropriate tools for a given situation and being able to explain the rationale behind the choice.
- Seeking and embracing opportunities for exposure to different situations, environments, and perspectives.
- Communicating straightforwardly and in a structured manner when influencing and connecting with others.
- Adapting behavior to build quality relationships by reading situations and modifying approach accordingly.
- Adhering to the firm's code of ethics and business conduct.
In addition to the core responsibilities, a career in Treasury Risk Analytics within Risk & Regulatory Advisory will involve advising financial institutions on strategies to optimize financial performance and risk management across various financial risks. This includes designing and implementing strategies to address market challenges like low interest rates, LIBOR transition programs, competition from non-banks, and cost pressures.

Posted 2 days ago

Apply

2.0 - 6.0 years

0 Lacs

karnataka

On-site

As a skilled Web Scraping Data Analyst, you will be responsible for collecting, cleaning, and analyzing data from various online sources. Your expertise in Python-based scraping frameworks, data transformation, and proxy/VPN rotation and IP management will be crucial in building data pipelines that support our analytics and business intelligence initiatives.
Your key responsibilities will include designing, developing, and maintaining robust web scraping scripts using tools like Python, BeautifulSoup, Scrapy, and Selenium. You will also implement IP rotation, proxy management, and anti-bot evasion techniques, deploy scraping tools on cloud-based or edge servers, and monitor scraping jobs for uptime and efficiency. Additionally, you will parse and structure unstructured or semi-structured web data into clean, usable datasets, collaborate with data analysts and data engineers to integrate web-sourced data into internal databases and reporting systems, conduct exploratory data analysis (EDA), and ensure compliance with website scraping policies, robots.txt, and relevant data privacy regulations.
To excel in this role, you should have proficiency in Python and experience with libraries like Requests, BeautifulSoup, Scrapy, and Pandas. Knowledge of proxy/VPN usage, IP rotation, and web traffic routing tools (e.g., Smartproxy, BrightData, Tor), familiarity with cloud platforms (AWS, Azure, or GCP) and Linux-based environments, experience deploying scraping scripts on edge servers or containerized environments (e.g., Docker), a solid understanding of HTML, CSS, JSON, and browser dev tools for DOM inspection, a strong analytical mindset with experience in data cleansing, transformation, and visualization, good knowledge of SQL and basic data querying, and the ability to handle large volumes of data and build efficient data pipelines are all required.
Preferred qualifications for this role include experience with headless browsers like Puppeteer or Playwright, familiarity with scheduling tools like Airflow or cron, a background in data analytics or reporting using tools like Tableau, Power BI, or Jupyter Notebooks, and knowledge of anti-captcha solutions and browser automation challenges.
This is a full-time position, and the work location is in person.

Posted 2 days ago

Apply

2.0 - 6.0 years

0 Lacs

chennai, tamil nadu

On-site

You are a highly skilled Technical Data Analyst joining our growing team. Your role will draw on your strong technical foundation in Oracle PL/SQL and Python, along with expertise in data analysis tools and techniques. As a strategic thinker, you will lead and mentor a team of data analysts, driving data-driven insights and contributing to key business decisions. Additionally, you will research and evaluate emerging AI tools and techniques for potential application in data analysis projects.
Your responsibilities will include designing, developing, and maintaining complex Oracle PL/SQL queries and procedures for data extraction, transformation, and loading (ETL) processes. You will also utilize Python scripting for data analysis, automation, and reporting. Performing in-depth data analysis to identify trends, patterns, and anomalies will be crucial in providing actionable insights to improve business performance. Collaboration with cross-functional teams to understand business requirements and translate them into technical specifications is essential. Furthermore, you will develop and maintain data quality standards to ensure data integrity across various systems. Leveraging data analysis and visualization tools like Tableau, Power BI, and Qlik Sense to create interactive dashboards and reports for business stakeholders will also be part of your role, as will staying up to date with the latest data analysis tools, techniques, and industry best practices, including AI/ML advancements.
Preferred qualifications include hands-on work experience as a Technical Data Analyst (not a business analyst) with Oracle PL/SQL and Python programming to interpret analytical tasks and analyze large datasets. Proficiency in Python scripting for data analysis and automation, expertise in data visualization tools such as Tableau, Power BI, or Qlik Sense, and an awareness and understanding of AI/ML tools and techniques in data analytics are highly desirable; practical experience applying AI/ML techniques in data analysis projects is a plus. Strong analytical, problem-solving, communication, and interpersonal skills are required, along with experience in the financial services industry. Your education should include a Bachelor's/University degree or equivalent experience.
This role falls under Data Analytics and requires a minimum of 2 years of experience as a Technical Data Analyst (not a business analyst) with hands-on PL/SQL experience to interpret analytical tasks and analyze large datasets, hands-on Oracle/SQL experience to develop performance-optimized SQL scripts that fetch and analyze data from an Oracle database, and an excellent understanding of business operations and analytics tools for effective data analysis. You will coordinate and contribute to the objectives of data science initiatives and the overall business by leveraging an in-depth understanding of how areas collectively integrate within the sub-function. You will conduct strategic data analysis, identify insights and implications, make strategic recommendations, and develop data displays that clearly communicate complex analysis. Delivering analytics initiatives to address business problems, identifying required data, assessing the time and effort required, and establishing a project plan are essential tasks. Experience in the financial sector is considered an advantage.
This is a full-time position in the Technology job family group, specifically in Applications Development. This job description provides a high-level review of the types of work performed, and other job-related duties may be assigned as required. Your most relevant skills should match the requirements listed above; for complementary skills, refer to the job description or contact the recruiter for further information.

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

You are a strategic thinker with an innovation mindset, passionate about driving solutions, and you have found the right team. As a Data Engineer in our STO team, you will promote solutions using data: mining, interpreting, and cleaning our data, asking questions, connecting the dots, and uncovering hidden opportunities for realizing the data's full potential. As part of a team of specialists, you will slice and dice data using various methods and create new visions for the future. Our STO team is focused on collaborating and partnering with the business to deliver efficiency and enhance controls via technology adoption and infrastructure support for Global Finance & Business Management India.
Job Responsibilities:
- Write efficient Python and SQL code to extract, transform, and load (ETL) data from various sources into Databricks.
- Perform data analysis and computation to derive actionable insights from the data.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Ensure data quality, integrity, and security across all data processes.
- Develop solutions optimized for performance and scalability.
- Monitor and troubleshoot data workflows to ensure reliability and efficiency.
- Document data engineering processes, methodologies, and workflows.
- Communicate analytical findings to senior leaders through data visualization and storytelling.
Required qualifications, capabilities, and skills:
- Minimum 3+ years of hands-on experience in developing, implementing, and maintaining Python automation solutions, including the use of LLMs.
- Develop, implement, and maintain new and existing solutions.
- Write efficient Python and SQL code to extract, transform, and load (ETL) data from various sources.
- Ability to use LLMs to build AI solutions.
- Perform data analysis and computation to derive actionable insights from the data.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Ensure data quality, integrity, and security across all data processes.
- Monitor and troubleshoot data workflows to ensure reliability and efficiency.
- Document data engineering processes, methodologies, and workflows.
Preferred qualifications, capabilities, and skills:
- Hands-on experience in Python desktop solution development.
- Knowledge of machine learning and data science concepts is a plus.
- Experience with the data visualization tool Tableau is a plus.

Posted 2 days ago

Apply

2.0 - 6.0 years

0 Lacs

karnataka

On-site

You should have hands-on experience in machine learning model and deep learning development using Python. You will be responsible for data quality analysis and data preparation, exploratory data analysis, and visualization of data. Additionally, you will define validation strategies, perform preprocessing or feature engineering on a given dataset, and build data augmentation pipelines. Text processing using machine learning and NLP for processing documents will also be part of your role. Your tasks will include training models, tuning their hyperparameters, analyzing model errors, and designing strategies to overcome them.
You should have experience with Python packages such as NumPy, SciPy, scikit-learn, Theano, TensorFlow, Keras, PyTorch, Pandas, and Matplotlib. Experience working with Azure OpenAI Studio or OpenAI using Python, LLAMA, or LangChain is required. Experience with Azure Functions and Python Flask/API development/Streamlit, prompt engineering, conversational AI, and embedding models like word2vec, GloVe, spaCy, and BERT is preferred.
You are expected to possess distinctive problem-solving, strategic, and analytical capabilities, as well as excellent time-management and organization skills. Strong knowledge of programming languages such as Python, ReactJS, and SQL, and of big data, is essential. Excellent verbal and written communication skills are necessary for effective interaction between business and technical architects and developers.
You should have 2-4 years of relevant experience and a Bachelor's degree in Computer Science or Computer Engineering, a Master's in Computer Applications, MIS, or a related field. End-to-end experience in deployment of machine learning models using Python and Azure ML Studio is required. Exposure to developing client-based or web-based software solutions and certification in Machine Learning and Artificial Intelligence are beneficial. Experience with Power Platform, Power Pages, or Azure OpenAI Studio is good to have.
Grant Thornton INDUS comprises GT U.S. Shared Services Center India Pvt Ltd and Grant Thornton U.S. Knowledge and Capability Center India Pvt Ltd. Grant Thornton INDUS is the shared services center supporting the operations of Grant Thornton LLP, the U.S. member firm of Grant Thornton International Ltd. Established in 2012, Grant Thornton INDUS employs professionals across a wide range of disciplines including Tax, Audit, Advisory, and other operational functions. The culture at Grant Thornton INDUS promotes empowered people, bold leadership, and distinctive client service. Working at Grant Thornton INDUS offers an opportunity to be part of something significant and to serve communities in India through inspirational and generous services. Grant Thornton INDUS has offices in two locations in India: Bengaluru and Kolkata.

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

As a Software Engineer in the Direct Platform Quality team at Morningstar's Enterprise Data Platform (EDP), you will play a crucial role in developing and maintaining data quality solutions to enhance Morningstar's client experience. You will collaborate with a quality engineering team to automate the creation of client scorecards and conduct data-specific audit and benchmarking activities. By partnering with key stakeholders like Product Managers and Senior Engineers, you will contribute to the development and execution of data quality control suites.
Your responsibilities will include developing and deploying quality solutions using software engineering best practices, building applications and services for Data Quality Benchmarking and Data Consistency Solutions, and adding new features as per the Direct Platform Quality initiatives' product roadmap. You will also be required to participate in periodic calls during US or European hours and adhere to coding standards and guidelines.
To excel in this role, you should have a minimum of 3 years of hands-on experience in software engineering with a focus on building and deploying applications for data analytics. Proficiency in Python, object-oriented programming, SQL, and AWS Cloud is essential, with AWS certification being a plus. Additionally, expertise in big data open-source technologies, analytics and ML/AI, public cloud services, and cloud-native architectures is required. Experience working on data analytics and data quality projects for AMCs, banks, and hedge funds, and designing complex data pipelines in a cloud environment, will be advantageous. An advanced degree in engineering, computer science, or a related field is preferred, along with experience in the financial domain. Familiarity with Agile software engineering practices and with mutual fund, fixed income, and equity data is beneficial.
At Morningstar, we believe in continuous learning and expect you to stay abreast of software engineering, cloud and data science, and financial research trends. Your contributions to the technology strategy will lead to the development of superior products, streamlined processes, effective communication, and faster delivery times. As our products have a global reach, a global mindset is essential for success in this role.
Morningstar is committed to providing an equal opportunity work environment. Our hybrid work model allows for remote work with regular in-person collaboration, fostering a culture of flexibility and connectivity among global colleagues. Join us at Morningstar to be part of a dynamic team that values innovation, collaboration, and personal growth.

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

noida, uttar pradesh

On-site

As a dynamic and strategic Customer Lifecycle Management (CLM) Specialist/Manager, you will play a crucial role in optimizing the customer journey, enhancing customer retention, and driving customer satisfaction and loyalty. Your responsibilities will include mapping the end-to-end customer journey, analyzing customer data and feedback, developing and implementing strategies to optimize each stage of the customer lifecycle, and collaborating with various departments to deliver a seamless customer experience.
You will be tasked with designing and executing customer engagement programs, monitoring customer retention metrics, collecting and analyzing customer feedback, and working with product development teams to implement changes that enhance the overall customer experience. Additionally, you will be responsible for developing and maintaining dashboards and reports to track key customer lifecycle metrics, presenting findings and recommendations to senior management, and continuously refining lifecycle strategies based on performance data and evolving customer needs.
To excel in this role, you should possess a Bachelor's degree or higher in Marketing, Business Administration, or a related field, along with proven experience in customer lifecycle management, customer success, or a related role. Strong analytical skills, excellent communication and interpersonal skills, experience with CRM systems and customer lifecycle management tools, and proficiency in data analysis and visualization tools are also essential. Additionally, you should have a background in media and gaming products, strong project management skills, and the ability to work collaboratively across departments and with external partners. Knowledge of customer experience best practices and methodologies is an added advantage.
If you are looking to make a meaningful impact in optimizing customer experiences and driving customer satisfaction and loyalty, this role as a Customer Lifecycle Management Specialist/Manager is the perfect opportunity for you.

Posted 2 days ago

Apply