
1943 Querying Jobs

JobPe aggregates job listings for easy access; you apply directly on the original job portal.

10.0 years

8 - 10 Lacs

Hyderābād

On-site


About the Role: Grade Level (for internal use): 11

The Role: Lead Quality Engineer (ML and automation on AI workflows)

The Team: This team is recognized for its expertise, innovation, and passion. Together, you'll focus on agile product development with cutting-edge technology, offering insights into global capital markets and the financial services industry. Your role will involve close collaboration, initiative, and achieving ambitious goals alongside your talented colleagues and stakeholders. This is a unique opportunity to be a pivotal part of our fast-growing global organization during this exciting phase in our company's evolution.

The Impact: This role is essential for the Market Intelligence group of S&P, as it ensures the development of new software solutions and the continuous improvement and stability of our existing applications. The role will give our clients a seamless user experience and access to up-to-date data, ultimately bolstering their confidence in our services and reinforcing our competitive position in the market.

What's in it for you:
- Drive quality practices and processes.
- Exposure to cutting-edge technology and tools in the financial domain.
- Opportunity to work within a multi-cloud environment (AWS), promoting skill development and innovation.
- Collaboration with global teams, offering diverse perspectives and enhancing your professional growth potential.
- Access to a dynamic and forward-thinking work environment, where you can contribute to the development of innovative solutions and stay at the forefront of industry trends.
- Work in real time in an actual CI/CD environment.
- Work with multiple MI products and learn the Public Markets domain.

Responsibilities:
- Lead and design test automation architecture that works across all product technologies, covering areas such as (but not limited to) build verification, functional verification, stability, and data integrity.
- Build test frameworks and architectures specifically for applications predicated on Large Language Models (LLMs) and agentic workflows (a sketch of this kind of check follows below).
- Design, develop, and maintain frameworks and scripts, and execute automation scripts.
- Expertise in automation testing with WebdriverIO (TypeScript) and Cypress (JavaScript).
- Spearhead the enhancement of software development processes across all teams in accordance with Total Quality Assurance best practices (including, but not limited to, project management, development, business operations, reporting, and quality management).
- Oversee and participate in the development and review of test strategies and test plans to ensure appropriate test coverage of all features.
- Oversee and participate in the performance of tests across various applications.
- Offer support and mentorship to other engineers on the automation team.
- Provide technical guidance to software testers to facilitate their adaptation to automation tools.
- Become a subject matter expert in the domain and applications built and supported by our program.
- Review requirements, user stories, specifications, and technical design documents, and create detailed, comprehensive, and well-structured test plans and test cases using available test methods.
- Estimate, prioritize, plan, and coordinate testing activities in an Agile environment.
- Solid understanding of database concepts, methodologies, and best practices; proficiency in SQL and database querying.
- Liaise with internal teams (e.g., developers and product managers) to identify system requirements, and evaluate system interfaces, operational requirements, and performance requirements of the overall system.
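To make the LLM/agentic-workflow testing responsibility concrete, here is a minimal pytest-style sketch. `run_agent` is a hypothetical stand-in for the system under test, and the assertions show the kind of structural and grounding checks such a framework automates; the posting's WebdriverIO/Cypress stack covers the UI layer and is not shown here.

```python
# Minimal sketch (run with pytest). `run_agent` is a placeholder for whatever
# agentic workflow is under test; in a real suite it would call the LLM-backed service.

def run_agent(question: str) -> dict:
    """Stub for the system under test: returns an answer plus the sources it was grounded on."""
    return {"answer": "Revenue grew 12% year over year.", "sources": ["10-K filing, p. 42"]}

def test_answer_is_nonempty_and_grounded():
    result = run_agent("How did revenue change last year?")
    assert result["answer"].strip()                      # the model produced something
    assert result["sources"], "every answer must cite at least one source"

def test_answer_stays_on_topic():
    result = run_agent("How did revenue change last year?")
    # Cheap keyword heuristic; production checks often use an LLM-as-judge instead.
    assert "revenue" in result["answer"].lower()
```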
What We're Looking For:
- Bachelor's or master's degree in Computer Science, Engineering, or a related field.
- A minimum of 10 years of experience as a Quality Engineer, with leadership exposure.
- Experience in testing agentic workflows and the capability to construct AI agents.
- A thorough understanding of Machine Learning concepts and the ability to efficiently analyze extensive datasets.
- Strong skills in Python, JavaScript, or TypeScript.
- Experience with API and mobile testing.
- Expertise in GitHub pipelines for continuous delivery.
- Excellent communication and facilitation skills.
- Ability to translate software requirements/stories into accurate and complete test scenarios, including identifying detailed test data needs.
- Proficiency in SQL and database querying, with a solid understanding of database concepts, methodologies, and best practices.
- Experience managing teams and mentoring team members across multiple projects and products.
- Ownership of delivery, with the ability to identify potential risks and mitigate them to achieve desired goals.
- Excellent written and spoken English skills.
- Experience working in a distributed environment with colleagues across different geographies.

Must Haves:
- Experience building test frameworks/architectures specifically for applications predicated on Large Language Models (LLMs) and agentic workflows.
- A thorough understanding of Machine Learning concepts and the ability to efficiently analyze extensive datasets.
- Experience in the Python programming language.

Additional Skills (Preferred):
- Good understanding of performance testing and metrics.
- Experience with AWS/Azure.
- Understanding of UX principles.
- C# language skills.

About S&P Global Market Intelligence: At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep, and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all, from finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind.
Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com . S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here . ----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. 
Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- IFTECH202.2 - Middle Professional Tier II (EEO Job Group) Job ID: 316626 Posted On: 2025-06-23 Location: Hyderabad, Telangana, India

Posted 8 hours ago

3.0 - 5.0 years

7 - 9 Lacs

Pune

On-site


Data Management and Quantitative Analysis - IC3

Under moderate guidance, works with internal and external datasets and client reference data and provides analysis in the development of statistical, financial, and/or econometric models for analyzing asset performance, securities data, derivative pricing, risk exposure, or other sophisticated concepts. Provides analytical support and prepares drafts of standard and ad hoc reports for the assigned area. With moderate guidance, supports the assigned area with more advanced statistical and quantitative analyses. Serves as a resource to less experienced colleagues. Runs models, looks for exceptions, and takes corrective action. Uses technology tools to conduct analyses; applies techniques such as SQL querying and macro development to extract data for populating models. Has a good understanding of the relevant processes and products in the assigned area and of which analyses, methodologies, and approaches best support assessment of performance, risk, or valuation. Interprets findings and prepares initial drafts of standard reports. Prepares ad hoc reports at the request of managers and/or other leaders. Translates complex technical concepts and analyses for non-technical audiences. Reviews the accuracy of reports and calculations performed by less experienced colleagues. No direct reports; provides guidance to more junior analysts. Primarily responsible for the accuracy and quality of own work. Work contributes to the achievement of team goals.

Bachelor's degree or the equivalent combination of education and experience; an advanced degree in quantitative analysis is preferred. 3-5 years of experience preferred; experience in quantitative finance and technology preferred.

BNY Mellon is an Equal Employment Opportunity/Affirmative Action Employer. Minorities/Females/Individuals with Disabilities/Protected Veterans. Our ambition is to build the best global team – one that is representative and inclusive of the diverse talent, clients and communities we work with and serve – and to empower our team to do their best work. We support wellbeing and a balanced life, and offer a range of family-friendly, inclusive employment policies and employee forums.
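A hedged sketch of the "SQL extraction feeding a model" workflow described above, using an in-memory SQLite table as a stand-in for the real securities database and a plain least-squares fit as the "model"; table and column names are illustrative only.

```python
# Minimal sketch: extract with SQL, then fit a simple market model on the result.
import sqlite3
import numpy as np
import pandas as pd

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE returns (asset TEXT, market_ret REAL, asset_ret REAL)")
conn.executemany(
    "INSERT INTO returns VALUES (?, ?, ?)",
    [("ABC", 0.010, 0.012), ("ABC", -0.004, -0.006), ("ABC", 0.007, 0.009)],
)

# SQL extraction step: pull only the rows the model needs.
df = pd.read_sql_query(
    "SELECT market_ret, asset_ret FROM returns WHERE asset = 'ABC'", conn
)

# Simple market-model fit: asset_ret = alpha + beta * market_ret.
beta, alpha = np.polyfit(df["market_ret"], df["asset_ret"], deg=1)
print(f"alpha={alpha:.4f}, beta={beta:.2f}")
```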

Posted 8 hours ago

2.0 - 4.0 years

4 - 12 Lacs

Mumbai

On-site


Position: ODI Developer
Experience Required: Minimum 2 to 4 years
Location: Mumbai only
Job Type: Permanent/Full Time
Notice Period: Immediate only
Working Days: Monday to Friday
Interviews: First round virtual, second round face to face

Key Responsibilities:
- 1 to 3 years of proven experience in the design and development of Data Warehouse solutions, ETL, software development, and system integration projects.
- Strong experience in Oracle Data Integrator (ODI) 12c with knowledge of data modeling and ETL design.
- In-depth knowledge of Oracle Data Integrator (ODI) 12c concepts.
- Hands-on development experience with topology setup, Load Plans, Packages, Mappings/Interfaces, Procedures, job scheduling, and ODI tools.
- Experience working with multiple technologies, including Oracle, SAP HANA, and file-based systems (CSV/XML).
- Good querying skills on Oracle databases, with the ability to understand and debug scripts (Groovy/Python).
- Hands-on experience with managing ODI Agents and job deployments across environments.
- Experience in performance tuning and optimization of ETL processes.
- Ability to design ETL unit test cases and debug ETL mappings.
- Managing files, Oracle DB, SAP HANA, and XMLs through ODI.
- Must be well versed and hands-on in using and customizing Knowledge Modules (KM): IKM, LKM, CKM.
- Strong understanding and implementation experience of Slowly Changing Dimensions (SCD): Type 1 and Type 2 mappings (an illustrative sketch of the pattern follows below).

Functional Skills/Competencies:
- Proficient in writing and optimizing SQL queries.
- Good to have: knowledge of SAP HANA data extraction methods.
- Good to have: knowledge of DBT and Fivetran (ETL tools).
- Strong follow-up skills: ability to organize applicable department timelines and follow up on internal and external customer needs.
- Ability to debug runtime issues in PROD efficiently.

Job Types: Full-time, Permanent
Pay: ₹400,000.00 - ₹1,200,000.00 per year
Benefits: Health insurance, Provident Fund
Schedule: Day shift
Work Location: In person
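To make the SCD Type 2 requirement concrete: ODI's Knowledge Modules handle this declaratively, but purely as an illustration of the pattern itself (pandas rather than ODI, with assumed column names), expiring the current row and inserting a new one looks roughly like this.

```python
# Illustrative sketch of the SCD Type 2 pattern (not ODI-specific).
# Assumed columns: natural key, tracked attribute, validity window, current-row flag.
import pandas as pd

dim = pd.DataFrame([
    {"cust_id": 1, "city": "Mumbai", "valid_from": "2024-01-01", "valid_to": None, "is_current": True},
])
incoming = {"cust_id": 1, "city": "Pune", "load_date": "2025-06-23"}

current = dim[(dim["cust_id"] == incoming["cust_id"]) & dim["is_current"]]
if not current.empty and current.iloc[0]["city"] != incoming["city"]:
    # Type 2: expire the old row, then insert a new current row.
    dim.loc[current.index, ["valid_to", "is_current"]] = [incoming["load_date"], False]
    dim = pd.concat([dim, pd.DataFrame([{
        "cust_id": incoming["cust_id"], "city": incoming["city"],
        "valid_from": incoming["load_date"], "valid_to": None, "is_current": True,
    }])], ignore_index=True)

print(dim)
```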

Posted 8 hours ago

3.0 years

3 - 7 Lacs

Pune

Remote


Your work days are brighter here. At Workday, it all began with a conversation over breakfast. When our founders met at a sunny California diner, they came up with an idea to revolutionize the enterprise software market. And when we began to rise, one thing that really set us apart was our culture. A culture which was driven by our value of putting our people first. And ever since, the happiness, development, and contribution of every Workmate is central to who we are. Our Workmates believe a healthy employee-centric, collaborative culture is the essential mix of ingredients for success in business. That’s why we look after our people, communities and the planet while still being profitable. Feel encouraged to shine, however that manifests: you don’t need to hide who you are. You can feel the energy and the passion, it's what makes us unique. Inspired to make a brighter work day for all and transform with us to the next stage of our growth journey? Bring your brightest version of you and have a brighter work day here. At Workday, we value our candidates’ privacy and data security. Workday will never ask candidates to apply to jobs through websites that are not Workday Careers. Please be aware of sites that may ask for you to input your data in connection with a job posting that appears to be from Workday but is not. In addition, Workday will never ask candidates to pay a recruiting fee, or pay for consulting or coaching services, in order to apply for a job at Workday. About the Team Join our team and experience Workday! / About the team It's fun to work in a company where people truly believe in what they're doing. At Workday, we're committed to bringing passion and customer focus to the business of enterprise applications. We work hard, and we're serious about what we do. But we like to have a good time, too. In fact, we run our company with that principle in mind every day: One of our core values is fun. About the Role Job Description / About the Role Workday is looking for a Support Engineer specializing in Analytics with expertise in troubleshooting, performance optimization, and data analysis across Workday’s analytics services, including Prism Analytics, People Analytics, Discovery Boards, and Accounting Center. The ideal candidate has a solid foundation in big-data processing, data transformation, and reporting frameworks, with the ability to diagnose and resolve complex issues by analyzing logs, performance metrics, and system integrations. This role requires hands-on experience with query performance tuning, data pipeline debugging, and structured troubleshooting methodologies to support Workday’s analytics solutions. Strong data modeling, log analysis, and problem-solving skills combined with clear, effective communication are essential for success in this role. About You Key Areas of Responsibility: Provide sophisticated technical support for Workday’s reporting and analytics tools, including Prism Analytics, People Analytics, Discovery Boards, and Accounting Center, focusing on performance optimization, index debugging, memory management, and system health debugging. Develop expertise in Workday analytics services to drive high-performance reporting and data analytics solutions, using Prism Analytics, People Analytics, and SQL best practices. Collaborate with clients to define business requirements and translate them into optimized reports and configurations, improving query performance, data accuracy, and system health using Prism Analytics and Discovery Boards. 
Troubleshoot and resolve issues related to report configurations, system performance, integrations, and memory management, including detailed analysis of logs, query performance, and data pipelines. Guide customers in building, modifying, and optimizing reports, ensuring scalability, data integrity, and alignment with business needs, especially in Prism Analytics and Accounting Center. Educate users on standard methodologies for Workday reporting, security, and data governance, emphasizing People Analytics and Discovery Boards. Collaborate cross-functionally with engineering teams to address data quality issues, security concerns, and performance optimizations across Prism Analytics and Accounting Center, with a focus on memory management and system health. Contribute to documentation, QA efforts, and the optimization of analytics tools, with a focus on SQL querying, indexing, and debugging system health issues. Participate in 24x7 global support coverage, providing timely and efficient support across time zones. Key Technical Skills & Knowledge: Bachelor’s degree in Computer Science, Information Management, Statistics, Data Science, or a related field. 3+ years of experience in customer support, system performance optimization, data analysis, or similar roles, with a solid background in big data technologies and AI-driven analytics. Demonstrable experience with data platforms (e.g., Spark, Hadoop) and working with large-scale datasets, including data pipeline design and distributed processing. Hands-on experience with advanced reporting tools and analytics solutions, including AI-powered reporting platforms and big data tools like Spark for data transformation and analysis. Strong proficiency in SQL and data querying with the ability to analyze complex data sets, optimize queries, and perform data-driven insights to enhance system performance and business processes. Demonstrated ability to gather and map business requirements to advanced analytics and application capabilities, ensuring alignment with AI-driven insights and reporting solutions. Solid understanding of data architecture, including data lakes, ETL processes, and real-time data streaming. Strong analytical skills to collect, organize, and interpret complex datasets, using AI and big data tools to drive product improvements and optimize reporting performance. Ability to deliver data-driven insights to technical and non-technical partners, presenting complex findings to end-users and executive teams in an actionable manner. Proven collaboration skills, working across teams to drive issue resolution and using AI or machine learning models to enhance system functionality and customer experience. Strong written and verbal communication skills, with experience in technical consulting, customer support, or AI/ML-driven technical roles. Self-motivated with the ability to work independently in a fast-paced environment, while using AI and big data technologies to identify and resolve issues. Our Approach to Flexible Work With Flex Work, we’re combining the best of both worlds: in-person time and remote. Our approach enables our teams to deepen connections, maintain a strong community, and do their best work. We know that flexibility can take shape in many ways, so rather than a number of required days in-office each week, we simply spend at least half (50%) of our time each quarter in the office or in the field with our customers, prospects, and partners (depending on role). 
This means you'll have the freedom to create a flexible schedule that caters to your business, team, and personal needs, while being intentional to make the most of time spent together. Those in our remote "home office" roles also have the opportunity to come together in our offices for important moments that matter. Are you being referred to one of our roles? If so, ask your connection at Workday about our Employee Referral process!
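The troubleshooting duties above center on query performance tuning and log analysis. As a rough, hedged illustration only (a generic log format, not Workday's actual logs), a first-pass triage of slow reports might look like this.

```python
# Minimal sketch: pull per-request timings out of text logs and surface the
# slowest operations, as a first pass before deeper query tuning.
import re
import pandas as pd

log_lines = [
    "2025-06-23T10:01:02 report=headcount_trend duration_ms=420",
    "2025-06-23T10:01:09 report=gl_detail duration_ms=9830",
    "2025-06-23T10:02:15 report=headcount_trend duration_ms=415",
]

pattern = re.compile(r"report=(?P<report>\S+) duration_ms=(?P<duration_ms>\d+)")
rows = [m.groupdict() for line in log_lines if (m := pattern.search(line))]
df = pd.DataFrame(rows).astype({"duration_ms": int})

# Rank reports by worst-case latency to decide where to start tuning.
summary = df.groupby("report")["duration_ms"].agg(["count", "max", "mean"])
print(summary.sort_values("max", ascending=False))
```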

Posted 8 hours ago

2.0 - 3.0 years

7 Lacs

Mumbai

On-site


Job Title: Tableau Developer Experience: 2-3 Years Location: Mumbai, India About the Role: We are seeking a highly motivated and skilled Tableau Developer with years of proven experience to join our dynamic team in Mumbai. In this role, you will be instrumental in transforming complex data into insightful and interactive dashboards and reports using Tableau. You will work closely with business stakeholders, data analysts, and other technical teams to understand reporting requirements, develop effective data visualizations, and contribute to data-driven decision-making within the organization. Roles and Responsibilities: Dashboard Development: Design, develop, and maintain compelling and interactive Tableau dashboards and reports that meet business requirements and enhance user experience. Create various types of visualizations, including charts, graphs, maps, and tables, to effectively communicate data insights. Implement advanced Tableau features such as calculated fields, parameters, sets, groups, and Level of Detail (LOD) expressions to create sophisticated analytics. Optimize Tableau dashboards for performance and scalability, ensuring quick loading times and efficient data retrieval. Data Sourcing and Preparation: Connect to various data sources (e.g., SQL Server, Oracle, Excel, cloud-based data platforms like AWS Redshift, Google BigQuery, etc.) and extract, transform, and load (ETL) data for reporting purposes. Perform data analysis, validation, and cleansing to ensure the accuracy, completeness, and consistency of data used in reports. Collaborate with data engineers and data analysts to understand data structures, identify data gaps, and ensure data quality. Requirements Gathering & Collaboration: Work closely with business users, stakeholders, and cross-functional teams to gather and understand reporting and analytical requirements. Translate business needs into technical specifications and develop effective visualization solutions. Participate in discussions and workshops to refine requirements and propose innovative reporting approaches. Troubleshooting and Support: Diagnose and resolve issues related to data accuracy, dashboard performance, and report functionality. Provide ongoing support and maintenance for existing Tableau dashboards and reports. Assist end-users with Tableau-related queries and provide training as needed. Documentation and Best Practices: Create and maintain comprehensive documentation for Tableau dashboards, data sources, and development processes. Adhere to data visualization best practices and design principles to ensure consistency and usability across all reports. Contribute to code reviews and knowledge sharing within the team. Continuous Improvement: Stay up-to-date with the latest Tableau features, updates, and industry trends in data visualization and business intelligence. Proactively identify opportunities for improvement in existing reports and propose enhancements. Participate in an Agile development environment, adapting to changing priorities and contributing to sprint goals. Required Skills and Qualifications: Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field. 2 years of hands-on experience as a Tableau Developer , with a strong portfolio of developed dashboards and reports. Proficiency in Tableau Desktop and Tableau Server (including publishing, managing permissions, and performance monitoring). Strong SQL skills for data extraction, manipulation, and querying from various databases. 
Solid understanding of data warehousing concepts, relational databases, and ETL processes. Familiarity with data visualization best practices and design principles. Excellent analytical and problem-solving skills with a keen eye for detail. Strong communication skills (verbal and written) with the ability to explain complex data insights to non-technical stakeholders. Ability to work independently and collaboratively in a team-oriented environment. Adaptability to changing business requirements and a fast-paced environment. Additional Qualifications: Experience with other BI tools (e.g., Power BI, Qlik Sense) is a plus. Familiarity with scripting languages like Python or R for advanced data manipulation and analytics. Knowledge of cloud data platforms (e.g., AWS, Azure, GCP). Experience with Tableau Prep for data preparation. Job Types: Full-time, Permanent Pay: Up to ₹750,000.00 per year Benefits: Health insurance Provident Fund Schedule: Day shift Monday to Friday Work Location: In person
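The Level of Detail (LOD) expressions mentioned above are written in Tableau's own calculation language. Purely as a hedged analogue, the pandas sketch below mirrors what a `{FIXED [Region] : SUM([Sales])}` calculation produces, using a toy sales table with assumed column names.

```python
# Rough analogue of a Tableau FIXED LOD expression, done with pandas on toy data.
import pandas as pd

sales = pd.DataFrame({
    "region": ["West", "West", "East"],
    "order_id": [1, 2, 3],
    "amount": [100.0, 250.0, 80.0],
})

# Each row keeps its own grain but carries the region-level total alongside it,
# which is what a FIXED LOD gives you inside a row-level view.
sales["region_total"] = sales.groupby("region")["amount"].transform("sum")
sales["pct_of_region"] = sales["amount"] / sales["region_total"]
print(sales)
```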

Posted 8 hours ago

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Location: Noida & Gurgaon Total Experience: 4 to 12 years(Senior Associate & Manager) Role Overview The Agentic Automation Engineer will develop and implement intelligent automation solutions using agentic frameworks like LangGraph, AutoGen, and CrewAI, integrating generative AI and Retrieval-Augmented Generation (RAG) techniques. This role requires deep expertise in Python, generative AI, and both open-source and closed-source LLMs, along with proficiency in databases and modern automation tools. The ideal candidate will collaborate with cross-functional teams to deliver scalable, high-impact automation workflows that enhance business processes. Key Responsibilities · Design and develop agentic automation workflows using frameworks such as LangGraph, AutoGen, CrewAI, and other multi-agent systems (e.g., MCP, A2A) to automate complex business processes. · Build and optimize Retrieval-Augmented Generation (RAG) pipelines for enhanced contextual understanding and accurate response generation in automation tasks. · Integrate open-source LLMs (e.g. LLaMA) and closed-source LLMs (e.g., OpenAI, Gemini, Vertex AI) to power agentic systems and generative AI applications. · Develop robust Python-based solutions using libraries like LangChain, Transformers, Pandas, and PyTorch for automation and AI model development. · Implement and manage CI/CD pipelines, Git workflows, and software development best practices to ensure seamless deployment of automation solutions. · Work with structured and unstructured data, applying prompt engineering and fine-tuning techniques to enhance LLM performance for specific use cases. · Query and manage databases (e.g., SQL, NoSQL) for data extraction, transformation, and integration into automation workflows. · Collaborate with stakeholders to translate technical solutions into business value, delivering clear presentations and documentation. · Stay updated on advancements in agentic automation, generative AI, and LLM technologies to drive innovation and maintain competitive edge. · Ensure scalability, security, and performance of deployed automation solutions in production environments. Required Qualifications and Skills · Education: Bachelor’s or Master’s degree in Computer Science, Data Science, Artificial Intelligence, or a related field. · Experience: o 5+ years of hands-on experience in AI/ML, generative AI, or automation development. o Proven expertise in agentic frameworks like LangGraph, AutoGen, CrewAI, and multi-agent systems. o Experience building and deploying RAG-based solutions for automation or knowledge-intensive applications. o Hands-on experience with open-source LLMs (Hugging Face) and closed-source LLMs (OpenAI, Gemini, Vertex AI). · Technical Skills: o Advanced proficiency in Python and relevant libraries (LangChain, Transformers, Pandas, PyTorch, Scikit-learn). o Strong SQL skills for querying and managing databases (e.g., PostgreSQL, MongoDB). o Familiarity with CI/CD tools (e.g., Jenkins, GitHub Actions), Git workflows, and containerization (e.g., Docker, Kubernetes). o Experience with Linux (Ubuntu) and cloud platforms (AWS, Azure, Google Cloud) for deploying automation solutions. o Knowledge of automation tools (e.g., UiPath, Automation Anywhere) and workflow orchestration platforms. · Soft Skills: o Exceptional communication skills to articulate technical concepts to non-technical stakeholders. o Strong problem-solving and analytical skills to address complex automation challenges. 
o Ability to work collaboratively in a fast-paced, client-facing environment. o Proactive mindset with a passion for adopting emerging technologies. Preferred Qualifications · Experience with multi-agent coordination protocols (MCP) and agent-to-agent (A2A) communication systems. · Familiarity with advanced generative AI techniques, such as prompt chaining, tool-augmented LLMs, and model distillation. · Exposure to enterprise-grade automation platforms or intelligent process automation (IPA) solutions. · Contributions to open-source AI/automation projects or publications in relevant domains. · Certification in AI, cloud platforms, or automation technologies (e.g., AWS Certified AI Practitioner, RPA Developer).
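As a rough, framework-free sketch of the RAG pattern this role centers on: the embedder below is a placeholder (a real pipeline would call an embedding model through LangChain, Hugging Face, or a hosted API), and only the retrieve-then-prompt flow is the point.

```python
# Framework-free RAG sketch: embed documents, retrieve by cosine similarity,
# then assemble the prompt that would be sent to an LLM.
from math import sqrt

def embed(text):
    # Placeholder embedding: character-frequency vector. Not semantically meaningful.
    return [text.lower().count(c) / max(len(text), 1) for c in "abcdefghijklmnopqrstuvwxyz"]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na, nb = sqrt(sum(x * x for x in a)), sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

docs = [
    "Invoices over 50,000 INR require manager approval.",
    "Leave requests are auto-approved up to two days.",
]
index = [(d, embed(d)) for d in docs]

def build_prompt(question, k=1):
    q = embed(question)
    top = sorted(index, key=lambda item: cosine(q, item[1]), reverse=True)[:k]
    context = "\n".join(d for d, _ in top)
    # In production this prompt goes to an LLM; here we just show the assembled prompt.
    return f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"

print(build_prompt("Who has to approve a large invoice?"))
```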

Posted 8 hours ago

3.0 years

0 Lacs

Pune, Maharashtra, India

Remote


Your work days are brighter here. At Workday, it all began with a conversation over breakfast. When our founders met at a sunny California diner, they came up with an idea to revolutionize the enterprise software market. And when we began to rise, one thing that really set us apart was our culture. A culture which was driven by our value of putting our people first. And ever since, the happiness, development, and contribution of every Workmate is central to who we are. Our Workmates believe a healthy employee-centric, collaborative culture is the essential mix of ingredients for success in business. That’s why we look after our people, communities and the planet while still being profitable. Feel encouraged to shine, however that manifests: you don’t need to hide who you are. You can feel the energy and the passion, it's what makes us unique. Inspired to make a brighter work day for all and transform with us to the next stage of our growth journey? Bring your brightest version of you and have a brighter work day here. At Workday, we value our candidates’ privacy and data security. Workday will never ask candidates to apply to jobs through websites that are not Workday Careers. Please be aware of sites that may ask for you to input your data in connection with a job posting that appears to be from Workday but is not. In addition, Workday will never ask candidates to pay a recruiting fee, or pay for consulting or coaching services, in order to apply for a job at Workday. About The Team Join our team and experience Workday! / About the team It's fun to work in a company where people truly believe in what they're doing. At Workday, we're committed to bringing passion and customer focus to the business of enterprise applications. We work hard, and we're serious about what we do. But we like to have a good time, too. In fact, we run our company with that principle in mind every day: One of our core values is fun. About The Role Job Description / About the Role Workday is looking for a Support Engineer specializing in Analytics with expertise in troubleshooting, performance optimization, and data analysis across Workday’s analytics services, including Prism Analytics, People Analytics, Discovery Boards, and Accounting Center. The ideal candidate has a solid foundation in big-data processing, data transformation, and reporting frameworks, with the ability to diagnose and resolve complex issues by analyzing logs, performance metrics, and system integrations. This role requires hands-on experience with query performance tuning, data pipeline debugging, and structured troubleshooting methodologies to support Workday’s analytics solutions. Strong data modeling, log analysis, and problem-solving skills combined with clear, effective communication are essential for success in this role. About You Key Areas of Responsibility: Provide sophisticated technical support for Workday’s reporting and analytics tools, including Prism Analytics, People Analytics, Discovery Boards, and Accounting Center, focusing on performance optimization, index debugging, memory management, and system health debugging. Develop expertise in Workday analytics services to drive high-performance reporting and data analytics solutions, using Prism Analytics, People Analytics, and SQL best practices. Collaborate with clients to define business requirements and translate them into optimized reports and configurations, improving query performance, data accuracy, and system health using Prism Analytics and Discovery Boards. 
Troubleshoot and resolve issues related to report configurations, system performance, integrations, and memory management, including detailed analysis of logs, query performance, and data pipelines. Guide customers in building, modifying, and optimizing reports, ensuring scalability, data integrity, and alignment with business needs, especially in Prism Analytics and Accounting Center. Educate users on standard methodologies for Workday reporting, security, and data governance, emphasizing People Analytics and Discovery Boards. Collaborate cross-functionally with engineering teams to address data quality issues, security concerns, and performance optimizations across Prism Analytics and Accounting Center, with a focus on memory management and system health. Contribute to documentation, QA efforts, and the optimization of analytics tools, with a focus on SQL querying, indexing, and debugging system health issues. Participate in 24x7 global support coverage, providing timely and efficient support across time zones. Key Technical Skills & Knowledge: Bachelor’s degree in Computer Science, Information Management, Statistics, Data Science, or a related field. 3+ years of experience in customer support, system performance optimization, data analysis, or similar roles, with a solid background in big data technologies and AI-driven analytics. Demonstrable experience with data platforms (e.g., Spark, Hadoop) and working with large-scale datasets, including data pipeline design and distributed processing. Hands-on experience with advanced reporting tools and analytics solutions, including AI-powered reporting platforms and big data tools like Spark for data transformation and analysis. Strong proficiency in SQL and data querying with the ability to analyze complex data sets, optimize queries, and perform data-driven insights to enhance system performance and business processes. Demonstrated ability to gather and map business requirements to advanced analytics and application capabilities, ensuring alignment with AI-driven insights and reporting solutions. Solid understanding of data architecture, including data lakes, ETL processes, and real-time data streaming. Strong analytical skills to collect, organize, and interpret complex datasets, using AI and big data tools to drive product improvements and optimize reporting performance. Ability to deliver data-driven insights to technical and non-technical partners, presenting complex findings to end-users and executive teams in an actionable manner. Proven collaboration skills, working across teams to drive issue resolution and using AI or machine learning models to enhance system functionality and customer experience. Strong written and verbal communication skills, with experience in technical consulting, customer support, or AI/ML-driven technical roles. Self-motivated with the ability to work independently in a fast-paced environment, while using AI and big data technologies to identify and resolve issues. Our Approach to Flexible Work With Flex Work, we’re combining the best of both worlds: in-person time and remote. Our approach enables our teams to deepen connections, maintain a strong community, and do their best work. We know that flexibility can take shape in many ways, so rather than a number of required days in-office each week, we simply spend at least half (50%) of our time each quarter in the office or in the field with our customers, prospects, and partners (depending on role). 
This means you'll have the freedom to create a flexible schedule that caters to your business, team, and personal needs, while being intentional to make the most of time spent together. Those in our remote "home office" roles also have the opportunity to come together in our offices for important moments that matter. Are you being referred to one of our roles? If so, ask your connection at Workday about our Employee Referral process!

Posted 8 hours ago

3.0 years

0 Lacs

Bengaluru

On-site


JOB DESCRIPTION
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III at JPMorgan Chase within Corporate Technology, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities
- Executes software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems
- Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
- Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems
- Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
- Contributes to software engineering communities of practice and events that explore new and emerging technologies
- Adds to team culture of diversity, equity, inclusion, and respect

Required qualifications, capabilities, and skills
- Formal training or certification in software engineering concepts and 3+ years of applied experience
- Development experience in Python and Databricks AI/ML Ops
- Hands-on practical experience in system design, application development, testing, and operational stability
- Proficient in coding in one or more languages
- Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
- Overall knowledge of the Software Development Life Cycle
- Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security

Preferred qualifications, capabilities, and skills
- Familiarity with modern front-end technologies
- AWS exposure, financial background, credit risk knowledge
- Knowledge of data & controls is a plus
- Exposure to cloud technologies

Posted 8 hours ago

3.0 - 6.0 years

6 - 8 Lacs

Vadodara

On-site


Company Description Digiflux Technologies Private Limited, based in Vadodara, is a leading provider of comprehensive digital solutions. We specialize in developing Web and Mobile Apps, as well as delivering innovative IoT-based solutions tailored to various organizational needs. Role Description We are seeking a Backend Engineer with 3 to 6 years of experience to join our team at Digiflux Technologies Private Limited in Vadodara. This is a full-time, on-site position. The successful candidate will be responsible for designing, developing, and maintaining high-performance backend APIs using Node.js. You will work closely with the frontend and DevOps teams to ensure seamless integration and optimal application performance, with a focus on handling large-scale traffic and ensuring fast response times. Key Responsibilities Develop and maintain scalable backend APIs using Node.js with frameworks like Express.js or NestJS. Ensure efficient API performance, focusing on low-latency and high-throughput handling. Optimize backend systems to handle large-scale traffic and data load. Work closely with frontend and DevOps teams to ensure seamless integration and application performance. Implement secure and maintainable code following industry best practices for backend development. Perform thorough Unit testing of APIs, ensuring functionality, reliability, and high performance. Monitor backend performance and troubleshoot issues to improve response times and system stability. Qualifications 3 to 6 years of experience in backend development, with hands-on experience in Node.js and frameworks like Express.js or NestJS. Good understanding of backend development tools and techniques for creating high-quality APIs with fast response times. Familiarity with database design, querying, and optimization (SQL or NoSQL databases). Experience in developing scalable backend systems capable of handling high traffic loads. Knowledge of API security best practices, authentication mechanisms, and data protection. Strong problem-solving skills and attention to detail. Bachelor’s degree in Computer Science or a related field, or equivalent work experience. Good to Have Knowledge of containerization technologies such as Docker. Basic understanding of microservices architecture and how it applies to scalable systems. Familiarity with cloud architecture planning and AWS services (e.g., EC2, RDS, Lambda, Batch functions) for better infrastructure planning. Experience working with distributed systems and improving application scalability. Additional Information The role is based in Vadodara and requires a full-time on-site presence. Competitive salary and benefits package commensurate with experience. Opportunities for professional growth and development within a dynamic and innovative company. Job Type: Full-time Pay: ₹600,000.00 - ₹800,000.00 per year Location Type: In-person Schedule: Day shift Ability to commute/relocate: Vadodara, Gujarat: Reliably commute or planning to relocate before starting work (Required) Application Question(s): Current CTC Expected CTC Notice Period Experience: total relevant: 2 years (Preferred) Work Location: In person

Posted 8 hours ago

2.0 - 4.0 years

10 - 12 Lacs

Noida

On-site


Job Title: Assistant Product Manager
Experience: 2-4 years
Type: Full-Time
Package: 10-12 LPA

About the Role: We're looking for an enthusiastic and detail-oriented Assistant Product Manager (APM) to support the development and enhancement of features across our OTT platform. While this role will have a strong focus on analytics and reporting, it also offers exposure to the end-to-end OTT ecosystem, including user experience, content management, playback performance, and backend workflows.

Key Responsibilities:
· Assist in defining and enhancing features across the OTT platform, with a strong focus on analytics, user experience, content delivery, and performance tracking.
· Take ownership of product modules, driving them from ideation to release in collaboration with technical and business teams.
· Analyze user and platform data to define, track, and improve KPIs related to user engagement, content consumption, and service quality.
· Work closely with BI and data engineering teams to ensure accurate data pipelines, validated reports, and actionable insights.
· Coordinate with engineering and QA teams to test features, track issues, and support smooth product releases.
· Gather feedback post-deployment to assess feature performance and identify improvement opportunities.
· Maintain clear, up-to-date product documentation, user stories, and requirement specs.
· Track tasks, bugs, and product enhancements using Agile tools like JIRA or ClickUp.
· Continuously learn about OTT technologies, including CDN, video transcoding, media storage, and playback infrastructure, to support well-informed product decisions.

What You'll Bring:
· Bachelor's degree in Computer Science, Engineering, or Information Technology.
· 2-4 years of experience in product operations, analytics, or product support, ideally within OTT, streaming, or SaaS platforms.
· Proficiency in using analytics and reporting tools such as Google Analytics, Mixpanel, or Tableau.
· Hands-on experience with SQL for querying databases and validating product or performance data (a sketch follows below).
· Exposure to stakeholder management: working with internal teams (engineering, QA, BI, content ops) and external partners to gather requirements and ensure delivery.
· Familiarity with Agile tools like JIRA and Confluence for task and documentation management.
· A data-driven mindset with the ability to interpret usage data and derive product insights.
· Strong organizational, communication, and problem-solving skills.

Bonus Points:
· Experience working with or understanding data pipelines, ETL processes, or product instrumentation for analytics.
· Understanding of OTT technology components such as Content Delivery Networks (CDNs), video encoding/transcoding, cloud storage, and media asset management systems.
· Basic understanding of API interactions, client-server architecture, and performance monitoring tools.

Job Types: Full-time, Permanent
Pay: ₹1,000,000.00 - ₹1,200,000.00 per year
Benefits: Paid sick time, Paid time off, Provident Fund
Schedule: Day shift, Monday to Friday, Morning shift
Application Question(s): What is your current CTC?
Experience: Product management: 2 years (Required)
Location: Noida, Uttar Pradesh (Required)
Work Location: In person
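A hedged example of the SQL-validation work mentioned above, using a toy schema; the table and column names are illustrative, not from any real OTT platform.

```python
# Minimal sketch: validate an engagement KPI (average watch minutes per active
# user per day) directly against the raw playback table before trusting a dashboard.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE playback (user_id INT, play_date TEXT, watch_minutes REAL)")
conn.executemany("INSERT INTO playback VALUES (?, ?, ?)", [
    (1, "2025-06-22", 35.0), (2, "2025-06-22", 12.5), (1, "2025-06-23", 48.0),
])

query = """
SELECT play_date,
       COUNT(DISTINCT user_id)                         AS active_users,
       ROUND(SUM(watch_minutes) * 1.0
             / COUNT(DISTINCT user_id), 2)             AS avg_minutes_per_user
FROM playback
GROUP BY play_date
ORDER BY play_date;
"""
for row in conn.execute(query):
    print(row)
```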

Posted 8 hours ago

0 years

0 Lacs

India

On-site


Who are we and what do we do?
ShareChat (https://sharechat.com/about) is India's largest homegrown social media company, with 400+ million monthly active users across all its platforms, including Moj, a leading short video app which was launched in a record 30 hours. Founded in October 2015 with a vision of building an inclusive community that encourages and empowers each individual to share their unique journey and valuable experiences with confidence, ShareChat is valued at $5 bn. We are spearheading India's internet revolution by building products through world-class AI & tech to evangelize the content ecosystem for India in regional languages. We believe in complete ownership of problem-solving while committing to speed and integrity in everything we do. We place the utmost importance on user empathy and strive to work towards creating a world-class experience for them every day. Join us to drive how the next billion users will interact on the internet!

What You'll Do?
- Querying the database: using SQL to run queries on ShareChat's analytical engine built on Redshift.
- Scripting: writing scalable scripts to fetch or modify data from API endpoints (a sketch follows below).
- Analytical skills: you will work with large amounts of data related to user behaviour on ShareChat. You will need to see through the data and analyze it to find conclusions.
- Critical thinking: you may need to look for trends and patterns in data to come up with new conclusions based on the findings.
- Attention to detail: data is precise. You have to make sure you are vigilant in your analysis to come to correct conclusions.

Who are you?
- BS in Mathematics, Economics, Computer Science, Information Management, or Statistics is preferred.
- Proven working experience as a data analyst or business data analyst.
- Technical expertise regarding data models, database design and development, data mining, and segmentation techniques.
- Strong knowledge of and experience with reporting packages (Business Objects etc.), databases (SQL etc.), and programming (XML, JavaScript, or ETL frameworks).
- Knowledge of statistics and experience using statistical packages for analyzing datasets (Excel, SPSS, SAS etc.).
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Adept at queries, report writing, and presenting findings.

What's in it for you?
At ShareChat, our values - Ownership, Speed, User Empathy, Integrity, and First Principles - are at the core of our ways of working. We believe in hiring top talent and grooming future leaders by providing a flexible environment to aid growth and development.
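A minimal sketch of the scripting duty described above: `fetch_page` is a stub standing in for a real HTTP call (for example, `requests.get` against an internal endpoint), and only the cursor-pagination loop is the point.

```python
# Sketch: page through a cursor-paginated API and land the rows in a DataFrame.
from typing import Optional
import pandas as pd

def fetch_page(cursor: Optional[str]) -> dict:
    # Placeholder response shaped like a typical cursor-paginated API payload.
    pages = {
        None: {"items": [{"post_id": 1, "likes": 40}], "next_cursor": "p2"},
        "p2": {"items": [{"post_id": 2, "likes": 7}], "next_cursor": None},
    }
    return pages[cursor]

rows, cursor = [], None
while True:
    page = fetch_page(cursor)
    rows.extend(page["items"])
    cursor = page["next_cursor"]
    if cursor is None:
        break

df = pd.DataFrame(rows)
print(df.describe())
```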

Posted 9 hours ago

100.0 years

0 Lacs

Kochi, Kerala, India

On-site


About Consilium Safety Group Consilium Safety Group is a global leader in fire and gas safety technology, with over 100 years of experience in safeguarding lives, assets, and the environment. Headquartered in Gothenburg, Sweden, and with over 55 offices worldwide, we serve vital sectors including marine, energy, real estate, and the fast-growing new energy industry. Our mission is to drive safety innovation—and we’re looking for skilled professionals to help ensure that innovation remains secure, scalable, and impactful. Position Overview We’re seeking a passionate and results-driven Power Platform Engineer (contract) to design and build business-critical applications and automation solutions using Microsoft’s Power Platform. This role is ideal for someone who thrives in a dynamic environment, is enthusiastic about low-code/no-code solutions, and wants to make a meaningful impact across global operations. You will work closely with stakeholders to transform business requirements into scalable, efficient, and user-friendly solutions using tools like Power Apps, Power Automate, and more. High-performing contractors may be considered for long-term or permanent opportunities within the organization. Key Responsibilities Develop and maintain business applications using Power Apps (Canvas and Model-driven). Automate workflows using Power Automate (Flow). Integrate solutions with Microsoft 365, SharePoint, Dynamics 365, Azure, and third-party services. Collaborate with stakeholders to gather requirements and translate them into technical designs. Ensure high standards in architecture, performance, security, and deployment practices. Troubleshoot, debug, and support deployed applications. Provide end-user documentation, training, and support. Required Qualifications Bachelor’s degree in Computer Science, IT, or a related field. 1–2 years of hands-on experience with Microsoft Power Platform (especially Power Apps and Power Automate). Solid understanding of Dataverse, Microsoft 365, and connector integrations. Basic knowledge of SQL for querying and reporting. Familiarity with JavaScript, Power FX, JSON, REST APIs, or Azure Logic Apps is a plus. Understanding of CI/CD pipelines and DevOps in the context of Power Platform. Strong analytical, troubleshooting, and collaboration skills. Self-motivated with the ability to manage multiple tasks independently. Preferred Qualifications Microsoft Power Platform certifications (e.g., PL-100, PL-400). Experience working in Agile/Scrum environments. Exposure to business analysis or solution architecture roles. Familiarity with Microsoft Dynamics 365 Finance & Operations (F&O) workflows. Why Join Us? Join a global company working on innovative, high-impact safety technology projects. Work in a collaborative, inclusive environment that values continuous learning. Hands-on experience with the latest Microsoft technologies and tools. Flexible work culture that supports work-life balance and career growth. Fresh graduates with relevant certifications, academic projects, or strong foundational knowledge are encouraged to apply . Location: Kochi, India Contract period: Approximately from July 1, 2025 to December 31, 2025 . Start and end dates may be subject to adjustment based on agreement. Ready to make an impact? Apply now and help us drive safety innovation forward.

Posted 9 hours ago

12.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Greetings from Ushta Te!
Hiring for the role of Data Scientist for a top MNC in Bangalore.

JOB DESCRIPTION
Mandatory skills: Data Scientist experience, Python, SQL. 12+ years of technical experience in data analytics and data science.

Responsible for developing and translating computer algorithms into prototype code and for maintaining, organizing, and identifying trends in large data sets. Proficiency in SQL database design, proficiency in creating process documentation, strong written and verbal communication skills, and the ability to work independently and on teams. Familiarity with coding languages and frameworks such as Python, Java, Kafka, Hive, or Storm may be required in order to oversee real-time business metric aggregation, data warehousing and querying, schema and data management, and related duties. Should have knowledge of algorithms, data structures, and performance optimization, and experience with processing and interpreting data sets. Develop technical solutions to improve access to data and data usage. Understand data needs and advise the company on technological resources. Aggregate and analyze various data sets to provide actionable insight. Develop reports, dashboards, and tools for business users.

Key Responsibilities:
- Lead, mentor, and inspire a high-performing team of Data Scientists, Analysts, and AI/ML Engineers, fostering a collaborative, innovative, and impact-driven environment.
- Design, develop, and maintain robust data models, analytics frameworks, dashboards, and machine learning solutions that drive business strategy and decision-making.
- Develop and execute experimentation frameworks, including defining clear success metrics, designing A/B tests, and analyzing results to guide strategic product decisions (a sketch of a basic A/B analysis follows below).
- Partner closely with product, engineering, and leadership to translate complex clickstream behavioral data into strategic insights that inform product roadmaps and investments.
- Clearly communicate findings, insights, and recommendations to both technical and non-technical stakeholders, building credibility and driving alignment with senior leadership.
- Influence product direction by developing hypotheses, employing analytical rigor, and creating clear, impactful presentations and narratives around data-driven insights.
- Continuously identify new opportunities for leveraging data and analytics to enhance product capabilities, operational efficiency, and customer satisfaction.
- Drive measurement strategies, set ambitious goals, forecast outcomes, and closely monitor key product and business metrics.

If interested, kindly share your resume at kausar.rangari@ushtate.co.in / WhatsApp - 7304429460 for further details.
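To make the A/B-testing responsibility concrete, here is a minimal, standard-library-only sketch of a two-proportion z-test on made-up conversion counts; the numbers are illustrative, not real experiment data.

```python
# Hedged sketch: two-sided two-proportion z-test comparing control vs. variant.
from math import sqrt
from statistics import NormalDist

conv_a, n_a = 480, 10_000   # control: conversions, users (illustrative)
conv_b, n_b = 545, 10_000   # variant: conversions, users (illustrative)

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pool = (conv_a + conv_b) / (n_a + n_b)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_b - p_a) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))   # two-sided p-value

print(f"lift={p_b - p_a:.4f}, z={z:.2f}, p={p_value:.4f}")
```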

Posted 9 hours ago

Apply

0.0 - 2.0 years

6 - 8 Lacs

Vadodara, Gujarat

On-site

Company Description Digiflux Technologies Private Limited, based in Vadodara, is a leading provider of comprehensive digital solutions. We specialize in developing Web and Mobile Apps, as well as delivering innovative IoT-based solutions tailored to various organizational needs. Role Description We are seeking a Backend Engineer with 3 to 6 years of experience to join our team at Digiflux Technologies Private Limited in Vadodara. This is a full-time, on-site position. The successful candidate will be responsible for designing, developing, and maintaining high-performance backend APIs using Node.js. You will work closely with the frontend and DevOps teams to ensure seamless integration and optimal application performance, with a focus on handling large-scale traffic and ensuring fast response times. Key Responsibilities Develop and maintain scalable backend APIs using Node.js with frameworks like Express.js or NestJS. Ensure efficient API performance, focusing on low-latency and high-throughput handling. Optimize backend systems to handle large-scale traffic and data load. Work closely with frontend and DevOps teams to ensure seamless integration and application performance. Implement secure and maintainable code following industry best practices for backend development. Perform thorough Unit testing of APIs, ensuring functionality, reliability, and high performance. Monitor backend performance and troubleshoot issues to improve response times and system stability. Qualifications 3 to 6 years of experience in backend development, with hands-on experience in Node.js and frameworks like Express.js or NestJS. Good understanding of backend development tools and techniques for creating high-quality APIs with fast response times. Familiarity with database design, querying, and optimization (SQL or NoSQL databases). Experience in developing scalable backend systems capable of handling high traffic loads. Knowledge of API security best practices, authentication mechanisms, and data protection. Strong problem-solving skills and attention to detail. Bachelor’s degree in Computer Science or a related field, or equivalent work experience. Good to Have Knowledge of containerization technologies such as Docker. Basic understanding of microservices architecture and how it applies to scalable systems. Familiarity with cloud architecture planning and AWS services (e.g., EC2, RDS, Lambda, Batch functions) for better infrastructure planning. Experience working with distributed systems and improving application scalability. Additional Information The role is based in Vadodara and requires a full-time on-site presence. Competitive salary and benefits package commensurate with experience. Opportunities for professional growth and development within a dynamic and innovative company. Job Type: Full-time Pay: ₹600,000.00 - ₹800,000.00 per year Location Type: In-person Schedule: Day shift Ability to commute/relocate: Vadodara, Gujarat: Reliably commute or planning to relocate before starting work (Required) Application Question(s): Current CTC Expected CTC Notice Period Experience: total relevant: 2 years (Preferred) Work Location: In person

Posted 10 hours ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Hiring!! #Data Analysis #ETL #Python #R #SAS #Data Science #Visualization #Automation #Python #Pandas #Scikit-learn #Seaborn #NumPy #Plotly #SQL #SAS We have an urgent opening for a Data Scientist at Pune. Position: Data Scientist Experience: 3 to 4+ years Job Location: Pune Key Responsibilities: Translate Business Problems: Apply analytical/technical expertise to convert business challenges into actionable solutions. Project Execution & Automation: Lead and automate analytical projects, ensuring outputs meet client needs and expectations. Data Analysis & ETL: Perform ETL tasks and conduct exploratory, confirmatory, and qualitative data analyses, utilizing statistical models/algorithms in Python, R, SAS, etc. Model Development: Leverage data mining and machine learning techniques to build statistical models across various topics. Database Management: Collaborate with technology and other partners to build and maintain the People Analytics and Insights database. Integrate multiple systems and data sources, transform and clean data, and handle incomplete data sources. Stay Current: Keep up-to-date with current trends and research in data science, research, and technology. Knowledge/Experience: Essential- Machine learning, Statistics, Data engineering. Desirable- Tableau. Skills (technical skills): Essential- Data Science, Visualization, and Automation with Python. Python packages like Pandas, Scikit-learn, Seaborn, NumPy, Plotly, etc. Advanced SQL (or similar language for querying relational databases). Desirable- SAS, R, and other scripting languages. Tableau and other data visualization tools.

Posted 10 hours ago

Apply

1.0 - 2.5 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About Job Brightmoney is seeking a highly skilled, detail-oriented Software Development Engineer I (Backend) to join our dynamic team. As a key member of our engineering team, you will be responsible for designing, developing, and deploying scalable, efficient, and secure backend systems. The ideal candidate will have a strong foundation in computer science, exceptional problem-solving skills, and a passion for delivering high-quality, production-ready software solutions. The successful candidate will have a deep understanding of software engineering principles, algorithm design, and data structures. They will be able to communicate effectively with cross-functional teams, including product management, design, and engineering. If you are a motivated, collaborative, and innovative individual who thrives in a fast-paced environment, we encourage you to apply. Responsibilities Design, develop, and deploy complex, scalable, and efficient backend systems using Django and Django Rest Framework, ensuring seamless integration with front-end applications. Collaborate with cross-functional teams to define and prioritize project requirements, ensuring alignment with business objectives and driving the delivery of high-quality software solutions. Develop and maintain high-quality software solutions, with a strong focus on reliability, scalability, performance, and security. Participate in code reviews, providing constructive feedback to ensure high-quality code, adherence to best practices, and consistency in coding standards. Contribute to the development of technical documentation, including architecture diagrams, API documentation, and technical guides, to ensure knowledge sharing and onboarding of new team members. Stay current with industry trends and emerging technologies to enhance our products and engineering practices. Lead and contribute to technical discussions, distilling complex technical concepts into clear and concise communication, and collaborating with other teams to ensure alignment and effective communication. Collaborate with other teams, including product management and design, to ensure alignment and effective communication, and drive the delivery of high-quality software solutions that meet business objectives. Drive the advancement of process enhancements, mitigate technical debt, and automate repetitive tasks to optimize resource utilization and promote the ongoing refinement of engineering practices. Engage with cross-functional teams to formulate and sustain technical roadmaps, ensuring alignment with organizational goals and facilitating the delivery of superior software solutions. Spearhead and participate in technical planning sessions, establishing project scope, timelines, and resource distribution, while guaranteeing the delivery of high-quality software solutions that fulfill business objectives. Skills & Qualifications Proficient in Django and Django Rest Framework to design and develop scalable, efficient, and secure backend systems. Develop, train, and deploy machine learning models for real-world applications. 
Optimize and fine-tune machine learning algorithms for accuracy and efficiency. Strong understanding of Python programming language, including data structures, file input/output, object-oriented programming, and latest software development best practices. Familiarity with database systems, including data modeling, querying, and optimization, in addition to designing and implementing data pipelines and data warehousing solutions. Excellent problem-solving skills, with the ability to analyze complex technical problems and develop innovative solutions that meet business objectives. Strong collaboration and communication skills, with experience working with cross-functional teams, including product management, design, and engineering. Ability to work in a fast-paced environment, with a strong focus on delivering high-quality software solutions that meet business objectives and driving the growth and development of the team. Bachelor's degree in Computer Science, Information Technology, or a related field, with a strong foundation in computer science and software engineering principles. Preferred Minimum 1-2.5 years of experience in software development, with a strong focus on backend systems, including experience with Python, Django, MLE and Database Systems.

Posted 10 hours ago

Apply

0 years

0 Lacs

India

Remote

Position: SQL Developer Trainee Company: Lead India Location: Remote Job Type: Internship (Full-Time) Duration: 1–3 Months Stipend: ₹25,000/month Start Date: Immediate About Lead India Lead India is dedicated to empowering the next generation of professionals through hands-on learning and impactful projects. We focus on innovation, technology, and leadership development to prepare individuals for real-world challenges. Role Overview We are looking for an enthusiastic SQL Developer Trainee to join our team. This internship is an excellent opportunity to gain practical experience in database development, data querying, and backend data operations in a real-world business environment. Key Responsibilities Write and optimize SQL queries for data extraction, transformation, and reporting. Design and manage relational databases, including creating tables, views, indexes, and stored procedures. Support data integration and migration tasks. Troubleshoot and resolve database-related issues. Collaborate with developers and analysts to understand data requirements. Maintain documentation of database structures and processes. Requirements Pursuing or recently completed a degree in Computer Science, Information Technology, or a related field. Basic understanding of SQL and relational database concepts (e.g., MySQL, PostgreSQL, SQL Server). Familiarity with writing simple queries, joins, and data manipulation statements. Strong analytical and problem-solving skills. Willingness to learn and work independently in a remote environment. Good communication and documentation skills. Benefits Hands-on experience with real-world data and database systems. Opportunity to work under the mentorship of experienced developers. Certificate of Internship upon successful completion. Possibility of a full-time offer based on performance.

Posted 12 hours ago

Apply

2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About BiteSpeed Hey there! We are a Sequoia-backed SaaS startup building an AI-native Marketing, Support & Sales suite for e-commerce brands. We’re currently working with 3000+ e-commerce brands globally across 50+ countries and are fortunate to have raised $5.5M+ along the journey with marquee investors like Sequoia Capital India, Kunal Shah, Gaurav Munjal & more backing us. 💡 Read more about our mission and the story of commerce here- https://www.notion.so/bitespeed/BiteSpeed-s-Mission-the-Future-of-Commerce-b3cf14a080d94654ba46693c8cacd24f Check out more about us here - https://www.bitespeed.co/ and do read through our 200+ odd 5-star reviews to get a sense of what our customers say about us here - https://apps.shopify.com/bitespeed-fb-messenger-chatbot (we openly brag about this 😉) We’ve had some solid investors back us (making it easier for you to stalk us since you’d do this anyway):- BiteSpeed Raises USD 3.5M Funding, Led by Peak XV’s Surge- https://www.businesswireindia.com/e-commerce-ai-startup-bitespeed-raises-usd-3-5m-funding-led-by-peak-xvs-surge-92455.html E-commerce AI start-up BiteSpeed raises $3.5 million funding led by Peak XV’s Surge- https://www.thehindubusinessline.com/info-tech/e-commerce-ai-start-up-bitespeed-raises-35-million-funding-led-by-peak-xvs-surge/article68863058.ece BiteSpeed Raises $1.9 Million Seed Funding From Sequoia India's Surge- https://www.entrepreneur.com/en-in/business-news/bitespeed-raises-19-million-seed-funding-from-sequoia/418414 About the role Our product function so far has been founder-led, largely built on internal tribal knowledge. This will be an early product hire for us, which means high product ownership. Given our multi-product DNA, there is now increasing breadth and depth in our products, with each product taking on a life of its own, and we see value in bringing some order to the chaos with this role :). This role is as early-stage product as it gets. Morning stand-ups, scrambling through customer conversations during the day, data digging in the evenings and finally getting to those PRDs at night! The role is also at an interesting intersection of B2B SaaS customer conversations and B2C experimentation given our space. What you’ll do Own and drive the product roadmap, including product decisions, prioritisation and execution. Lots of user interviews & speaking to sales and CS teams to inform product decisions. Create detailed product specification documents for the design & development team. Work closely with design teams to go from ideas to user flows. Work closely with engineering on product delivery. Querying databases and monitoring dashboards to track product metrics. Research the market and competitive landscape to have a pulse of market direction. What makes you a good fit 2+ years of product management experience (B2B SaaS not necessary). Comfortable with early stage startup imperfections and able to operate in disorder. Rigour, thoroughness and detail-orientation in thought and action. Extremely high agency and adaptability. Willingness to get hands dirty with data, customer interviews and so on. Some recent experience dabbling in LLMs (nice to have, but not a deal-breaker). Salary and Location Location: Bangalore Expected CTC: We pay top of market for the right folks and also offer generous equity (ESOP) to everyone in the team. Our Way Of Life - https://www.notion.so/bitespeed/Way-Of-Life-At-BiteSpeed-44d9b9614d9641419da910189b1e9f8e. Our Purpose At BiteSpeed, work is personal. 
You could blame this on us being existential, but most of us are spending the best years of our lives doing this and we want to be purposeful about the kind of workplace we’re trying to create. Our purpose is about why we’re here and what we care about:- Personal Transformation Wealth Creation Winning Together Our Values Our values are about how we do what we do. Values define the right thing to do. We hire, reward and sometimes have to let go based on our values. We have 5 core values:- Go Above And Beyond Making Things Happen Say It Like It Is Progress Over Perfection Don’t Take Yourself Seriously, Take Your Work Seriously Perks & Benefits Small things we’ve done to ensure we take care of our wellness, learning & keep things fun:- Health Insurance - Health insurance cover and accident coverage for extra cushion and mental peace when rainy days hit us. Quarterly Off-sites - Quarterly off-sites are a core part of the BiteSpeed culture. Our off-sites range from intense quarter planning sessions to crazy mafia nights and competitive cricket matches (with a lot of trash talking). Cult Fitness Membership - All work and no play makes Jack a dull boy. Cult Fit and Cult Play passes to make sure we hit the gym more often. Personal Development - We sponsor courses, conference tickets, and books on a case-to-case basis to ensure we’re constantly growing. Salary In Advance - Trust first, by default. We pay out salaries in the first week of the month. Know someone who might be a great fit? Refer them to us; if they end up joining, we'll send you an Apple AirPods Pro as a gesture of thanks! For any queries feel free to write to talent@bitespeed.co.

Posted 12 hours ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Who we are: LUMIQ is the leading Data and Analytics company in the Financial Services and Insurance (FSI) industry. We are trusted by the world's largest FSIs, including insurers, banks, AMCs, and NBFCs, to address their data challenges. Our clients include 40+ enterprises with over $10B in deposits/AUM, collectively representing about 1B customers globally. Our expertise lies in creating next-gen data technology products to help FSI enterprises organize, manage, and effectively use data. We have consistently challenged the status quo, introducing many industry-firsts like the first enterprise data platform in Asia on cloud for a regulated entity. Founded in 2013, LUMIQ has now completed a decade of innovation, backed by Info Edge Ventures (a JV between Temasek Holdings of Singapore and Naukri) and US-based Season 2 Ventures. Our Culture: At LUMIQ, we strive to create a community of passionate data professionals who aim to transcend the usual corporate dynamics. We offer you the freedom to ideate, commit, and navigate your career trajectory at your own pace. Culture of ownership – empowerment to drive outcomes. Our culture encourages 'Tech Poetry' – combining creativity and technology to create solutions that revolutionize the industry. We trust our people to manage their responsibilities with minimal policy constraints. Our team is composed of the industry's brightest minds, from PhDs and engineers to industry specialists from Banking, Insurance, NBFCs, AMCs, who will challenge and inspire you to reach new heights. Job Description: We are seeking a highly skilled Data Engineer to join our dynamic team. As a Data Engineer at LUMIQ, you will play a crucial role in designing, developing, and maintaining our cloud-based data infrastructure to support our BFSI customers. You will work at the intersection of cloud technologies, data engineering, and the BFSI domain to deliver robust and scalable data solutions. Key Responsibilities: Design, develop, and implement data pipelines, ETL processes, and data integration solutions. Collaborate with cross-functional teams to understand data requirements and design scalable data models and architectures that align with BFSI industry needs. Optimize data storage, processing, and retrieval for maximum performance and cost-efficiency in cloud environments. Implement data security and compliance measures to ensure the protection and integrity of sensitive BFSI data. Work closely with data scientists and analysts to enable seamless access to high-quality data for analytical purposes. Troubleshoot and resolve data-related issues, ensuring data availability and reliability for BFSI customers. Stay updated on industry best practices, emerging cloud technologies, and trends in the BFSI sector to drive continuous improvement and innovation. Qualifications: Minimum 5 years’ experience as a Data Engineer with strong skills in SQL and Data Marts. Strong SQL querying skills. Strong experience and skills in Data Marts and complex KPI transformations and data pipelines. Skills and experience working on any of these - dbt, Redshift, BigQuery, Snowflake, Airflow. Experience and skills in Spark. Strong skills and experience in Ingestion and Transformation. Bachelor’s degree in Computer Science, Engineering, or a related field. Master's degree preferred. Proven experience in cloud data engineering, with expertise in platforms such as AWS, Azure, or Google Cloud. Proficiency in programming languages like Python, Java, or Spark for data processing and ETL operations. 
Strong understanding of data warehousing concepts, data modeling, and database management systems. Experience with big data technologies (e.g., Hadoop, Spark) and data processing frameworks. Familiarity with BFSI industry data standards, compliance requirements, and security protocols. Excellent problem-solving skills and the ability to work in a collaborative, cross-functional team environment. What do you get: Opportunity to contribute to an entrepreneurial culture and exposure to the startup hustler culture. Competitive Salary Packages. Group Medical Policies. Equal Employment Opportunity. Maternity Leave. Opportunities for upskilling and exposure to the latest technologies. 100% Sponsorship for certification.

Posted 12 hours ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Role: Argus Safety Data Analyst
Required Technical Skill Set: Safety Reporting, Cognos Report Development, PL/SQL, Argus Safety data model
Experience: 5+ years
Work Location: TCS - Pune

Must-Have:
· Business Intelligence and Analytics tools – Cognos
· Experience in Argus Safety and Argus Insight databases
· Well versed with PV concepts in the context of the Argus Safety product suite
· PL/SQL and database querying capabilities
· Experience in creating data models for periodic reporting, aggregate reporting and signal detection requirements, sourcing data from custom data marts and/or Argus Insight
· Experience in Data Mart with locked versions for supporting Periodic Reporting and Signal Detection
· Experience in Data Mart with all versions for supporting Operational and PV analytics
· Ability to write complex Cognos reports

Must have experience with the below report types in Argus:
· Periodic Listing: Blinded Line Listing, SUSAR Line Listing, DSUR, EPPV, PADER, PBRER, PBRER and US Section, PBRER Supplemental, NSUL & UNDAR, Non-Standard/New Periodic Report, SUSAR, MedDRA Topic Search, J-PSR, S. Korea Listing
· Reconciliation: All AEs
· Standard Report: New Reports, Report Schedules, Report Updates

Good-to-Have:
· Knowledge of ALM and test script writing
· Knowledge of HP ALM is a value add

Responsibility of / Expectations from the Role:
1. Supporting business teams with their day-to-day data queries, ad-hoc report requirements, and report issue troubleshooting
2. Handling support issues, tickets, and change management activities
3. Daily interaction with the customer and coordination with different teams

Posted 12 hours ago

Apply

0.0 - 2.0 years

10 - 12 Lacs

Noida, Uttar Pradesh

On-site

Job Title: Assistant Product Manager Experience: 2–4 years Type: Full-Time Package: 10-12 LPA About the Role: We’re looking for an enthusiastic and detail-oriented Assistant Product Manager (APM) to support the development and enhancement of features across our OTT platform. While this role will have a strong focus on analytics and reporting, it also offers exposure to the end-to-end OTT ecosystem—including user experience, content management, playback performance, and backend workflows. Key Responsibilities: · Assist in defining and enhancing features across the OTT platform, with a strong focus on analytics, user experience, content delivery, and performance tracking. · Take ownership of product modules, driving them from ideation to release in collaboration with technical and business teams. · Analyze user and platform data to define, track, and improve KPIs related to user engagement, content consumption, and service quality. · Work closely with BI and data engineering teams to ensure accurate data pipelines, validated reports, and actionable insights. · Coordinate with engineering and QA teams to test features, track issues, and support smooth product releases. · Gather feedback post-deployment to assess feature performance and identify improvement opportunities. · Maintain clear, up-to-date product documentation, user stories, and requirement specs. · Track tasks, bugs, and product enhancements using Agile tools like JIRA or ClickUp. · Continuously learn about OTT technologies including CDN, video transcoding, media storage, and playback infrastructure to support well-informed product decisions. What You’ll Bring: · Bachelor’s degree in Computer Science, Engineering, or Information Technology. · 2–4 years of experience in product operations, analytics, or product support—ideally within OTT, streaming, or SaaS platforms. · Proficiency in using analytics and reporting tools such as Google Analytics, Mixpanel, or Tableau. · Hands-on experience with SQL for querying databases and validating product or performance data. · Exposure to stakeholder management—working with internal teams (engineering, QA, BI, content ops) and external partners to gather requirements and ensure delivery. · Familiarity with Agile tools like JIRA and Confluence for task and documentation management. · A data-driven mindset with the ability to interpret usage data and derive product insights. · Strong organizational, communication, and problem-solving skills. Bonus Points: · Experience working with or understanding data pipelines, ETL processes, or product instrumentation for analytics. · Understanding of OTT technology components such as Content Delivery Networks (CDNs), video encoding/transcoding, cloud storage, and media asset management systems. · Basic understanding of API interactions, client-server architecture, and performance monitoring tools. Job Types: Full-time, Permanent Pay: ₹1,000,000.00 - ₹1,200,000.00 per year Benefits: Paid sick time Paid time off Provident Fund Schedule: Day shift Monday to Friday Morning shift Application Question(s): What is your current CTC? Experience: Product management: 2 years (Required) Location: Noida, Uttar Pradesh (Required) Work Location: In person

Posted 12 hours ago

Apply

2.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Experience Required: more than 2 years Job Description About The Company Axis My India is India’s foremost Consumer Data Intelligence Company, which in partnership with Google is building a single-stop People Empowerment Platform, the ‘a’ app, that aims to change people’s awareness, accessibility, and utilization of a slew of services. At Axis, we are dedicated to making a tangible impact on the lives of millions. If you're passionate about creating meaningful changes and aren't afraid to get your hands dirty, we want you on our team! For more insights into the company, kindly visit our website https://www.axismyindia.org Job Description Implement and manage GA4 tracking for the "a" App, ensuring accurate and comprehensive data collection. Integrate GA4 with Firebase for robust mobile app analytics, including event and parameter configuration. Design and develop custom dashboards and reports (using Looker Studio or similar tools) to visualize key performance indicators (KPIs) and user behavior trends. Analyze user engagement, retention, funnel performance, and other critical app metrics to identify growth opportunities and areas for optimization. Collaborate with product, UX, marketing, and engineering teams to translate data insights into actionable strategies for improving app features and user journeys. Define and track key performance indicators (KPIs) such as installs, active users, retention rate, and session duration. Stay current with the latest GA4 features, updates, and industry best practices, proactively recommending enhancements to analytics processes. Provide regular reporting and presentations to stakeholders, clearly communicating findings and recommendations. Identify trends in user behavior, such as most viewed screens, popular features, and drop-off points within the app journey. Track the effectiveness of marketing campaigns by analyzing acquisition sources, installs, and in-app engagement. Collaborate with developers to implement and test analytics updates in new app releases. Requirements Proven hands-on experience with Google Analytics 4 (GA4) implementation and analysis, particularly for mobile apps. Strong understanding of mobile app analytics concepts, event-based tracking, and cross-platform user journeys. Proficiency in dashboard and data visualization tools (e.g., Looker Studio, Google Data Studio). Experience with Firebase SDK integration and custom event setup for mobile apps. Familiarity with Google Tag Manager and tag management for app environments. Ability to translate complex data into clear, actionable insights for non-technical stakeholders. Strong analytical, problem-solving, and communication skills. Knowledge of data privacy regulations and best practices for secure data handling. Bachelor’s degree in Computer Science, Data Science, Analytics, or a related field; advanced degree preferred. Experience Required Experience with the GA4 data querying tool for more than 2 years. Exposure to digital marketing analytics (SEO, PPC, campaign tracking) and attribution modelling. Familiarity with data engineering concepts. Experience working in fast-paced, cross-functional teams, preferably in a consumer app environment. Benefits Competitive salary and benefits package Opportunity to make significant contributions to a dynamic company Evening snacks are provided by the company to keep you refreshed towards the end of the day Walking distance from Chakala metro station, making commuting easy and convenient. At Axis My India, we value discipline and focus. 
Our team members wear uniforms, adhere to a no-mobile policy during work hours, and work from our office with alternate Saturdays off. If you thrive in a structured environment and are committed to excellence, we encourage you to apply.

Posted 14 hours ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Job Title: Data Science Specialist 📍 Location: Remote / Hybrid Languages: English, Tamil (mandatory) About the Role We are looking for a passionate and hands-on Data Science Specialist to join our dynamic tech team in the ed-tech domain. You will be actively involved in developing foundational ML/DL projects, mentoring learners, supporting their career journey, and shaping the next generation of data scientists. This role is perfect for someone who thrives in a tech-first environment and is excited about working at the intersection of education, AI, and innovation. Key Responsibilities ● Design and deliver foundational and advanced ML/DL projects ● Handle instant query resolution and technical doubts via sessions or support channels ● Conduct live sessions, weekly assessments, and provide detailed project feedback ● Guide learners with assignments, career tips, and tech support ● Perform periodic learner evaluations and ensure quality learning outcomes ● Build strong rapport with learners and ensure a smooth, engaging learning experience ● Collaborate with the internal tech team to improve curriculum and learning tools ● Support learners with interview preparation strategies and real-world project guidance. Qualifications & Skills ● Bachelor’s or Master’s in Computer Science / Artificial Intelligence / Data Science/ Statistics (Master’s preferred) ● Strong coding skills in Python with exposure to real-world applications ● Solid foundation in Applied Statistics and core Data Science concepts ● Proficiency with SQL and data querying techniques ● Experience building and deploying Machine Learning and Deep Learning models ● Hands-on experience with Generative AI projects (LLMs, Diffusion, etc.) ● Excellent communication and mentoring skills in English (mandatory) ● Bonus skills (good to have): ○ Knowledge of Cloud services (AWS, Azure, GCP) ○ Familiarity with MLOps/LLMOps tools and workflows

Posted 14 hours ago

Apply

14.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Backdrop AVIZVA is a Healthcare Technology Organization that harnesses technology to simplify, accelerate, & optimize the way healthcare enterprises deliver care. Established in 2011, we have served as strategic enablers for healthcare enterprises, helping them enhance their overall care delivery. With over 14 years of expertise, we have engineered more than 150 tailored products for leading Medical Health Plans, Dental and Vision Plan Providers, PBMs, Medicare Plan Providers, TPAs, and more. Overview Of The Role As a Senior Business Analyst, you are expected to not only manage complex projects and stakeholder relationships but also to lead business analysis efforts across multiple teams or departments. You must have a deep understanding of business processes, technology systems, and industry trends, and drive business value through process optimization, and technology solutions. You will also work closely with the Product Manager, Product Owner(s) and key stakeholders to gather business requirements, and help translate them into clear requirement documents, write compelling business cases and proposals to implement and deliver cost-effective solutions.  Job Responsibilities Conduct requirements elicitation & analysis activities with key focus on business, subject matter, industry trends & standards, data, usability & user experience, through collaboration with Product Owners (PO), Product Managers (PM), Stakeholders & SMEs. Create detailed Product Requirements Documents (PRDs) alongside the PO in accordance with organizational standards. Ensure key elements such as user-journeys, BPMN process flows, wireframes, feature-sets, data requirements, potential integration requirements, and UX nuances. Spearhead end-to-end product UI/UX design activities, collaborating with the PO, and UI/UX designers Leverage your knowledge and understanding of system flows, data flows, API integrations, & databases to define the functional design of your product through collaboration with the POs and BAs/SAs from integrating products and your lead developers. Create detailed system & functional specifications (SFS) in accordance with organizational standards, for your product modules to share functional designs with the product engineering (developers) team highlighting key aspects of system behavior, use-cases, data & integrations. Support the testing/QA team in creating and reviewing test cases, and ensure appropriate clarifications to the development team during the implementation phase. Ensure your documentation (both requirements & specifications) is always up-to-date and is aligned with the developed features in event of changes, enhancements identified during the course of feature development. Play a step-in Product Owner role in the absence of the PO, performing activities such as backlog grooming, team support, Scrum Master (SM) collaboration. Mentor & coach Associate & Specialist BAs in your product team ensuring their success & continued comfort around the nuances of the product, and technicals of the BA role. Drive innovation in BA processes and methodologies along with leadership, introducing improvements based on industry trends, known cases, and popular advancements. Skills & Qualifications Bachelor’s or Master’s degree in any related field or equivalent qualification. 5-8 years of relevant experience in business and/or system analysis. Possess excellent communication, analytical, problem-solving, and critical thinking skills. 
Expertise around various kinds of requirement documentation formats such as BRD, FRD, SRS, Use-Cases, User-Stories, and creating other documents such as Data Flow Diagrams (DFDs), System Flows, Context diagrams, etc. Possess hands-on experience with BPMN, UML diagrams, and tools like MS Visio, along with basic knowledge and practical exposure to system integrations and APIs. Strong analytical mindset with a proven ability to understand various business problems. Experience in driving UI/UX design activities with designers via enabling tools such as sketches & wireframes. Familiarity with Atlassian tools (JIRA & Confluence). Hands-on SQL experience, with comfort around data querying, conceptual & logical data models. Hands-on experience with system integrations and APIs is required. Familiarity with wrapper APIs, ElasticSearch indexes, and AWS S3 will be an added advantage. Experience working on Healthcare Insurance domain-focused IT products and/or industry knowledge would be a huge plus.

Posted 14 hours ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Description: Job Summary: ESSENTIAL SKILLS AND QUALIFICATIONS: Bachelor’s degree in Computer Science, Data Science, or a related field (Master’s preferred). Certifications (Preferred): Microsoft Certified: Azure Data Engineer Associate Databricks Certified Data Engineer Professional Microsoft Certified: Power BI Data Analyst Associate 8+ years of experience in analytics, data integration, and reporting. 4+ years of hands-on experience with Databricks, including: Proficiency in Databricks Notebooks for development and testing. Advanced skills in Databricks SQL, Python, and/or Scala for data engineering. Expertise in cluster management, auto-scaling, and cost optimization. 4+ years of expertise with Power BI, including: Advanced DAX for building measures and calculated fields. Proficiency in Power Query for data transformation. Deep understanding of Power BI architecture, workspaces, and row-level security. Strong knowledge of SQL for querying, aggregations, and optimization. Experience with modern ETL/ELT tools such as Azure Data Factory, Informatica, or Talend. Proficiency in Azure cloud platforms and their application to analytics solutions. Strong analytical thinking with the ability to translate data into actionable insights. Excellent communication skills to effectively collaborate with technical and non-technical stakeholders. Ability to manage multiple priorities in a fast-paced environment with high customer expectations.

Posted 14 hours ago

Apply

Exploring Querying Jobs in India

The querying job market in India is thriving with opportunities for professionals skilled in database querying. With the increasing demand for data-driven decision-making, companies across various industries are actively seeking candidates who can effectively retrieve and analyze data through querying. If you are considering a career in querying in India, here is some essential information to help you navigate the job market.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Delhi

Average Salary Range

The average salary range for querying professionals in India varies based on experience and skill level. Entry-level positions can expect to earn between INR 3-6 lakhs per annum, while experienced professionals can command salaries ranging from INR 8-15 lakhs per annum.

Career Path

In the querying domain, a typical career progression may look like:

  1. Junior Querying Analyst
  2. Querying Specialist
  3. Senior Querying Consultant
  4. Querying Team Lead
  5. Querying Manager

Related Skills

Apart from strong querying skills, professionals in this field are often expected to have expertise in:

  • Database management
  • Data visualization tools
  • SQL optimization techniques (see the sketch after this list)
  • Data warehousing concepts
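
To make the "SQL optimization techniques" item concrete, here is a minimal sketch using Python's built-in sqlite3 module and a hypothetical orders table (the table and column names are invented purely for illustration). It shows one of the most common optimizations interviewers ask about: adding an index so that a selective filter uses an index search instead of a full table scan.

```python
import sqlite3

# A minimal sketch of one common SQL optimization technique: adding an index
# so a selective filter no longer scans the whole table. The "orders" table
# and its columns are hypothetical, created here only for illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
cur.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(10_000)],
)

query = "SELECT COUNT(*), SUM(amount) FROM orders WHERE customer_id = ?"

# Without an index on customer_id, SQLite reports a full table scan.
print(cur.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())

# After adding the index, the plan switches to an index search, which is the
# kind of improvement interviewers expect when asked how to speed up a query.
cur.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
print(cur.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())

print(cur.execute(query, (42,)).fetchone())  # 100 matching rows for customer 42
conn.close()
```

The same check carries over to production databases such as PostgreSQL or MySQL through their EXPLAIN statements, which are usually the first tool to reach for when a query is slow.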

Interview Questions

  • What is the difference between SQL and NoSQL databases? (basic)
  • Explain the purpose of the GROUP BY clause in SQL. (basic)
  • How do you optimize a slow-performing SQL query? (medium)
  • What are the different types of joins in SQL? (medium)
  • Can you explain the concept of ACID properties in database management? (medium)
  • Write a query to find the second-highest salary in a table. (advanced)
  • What is a subquery in SQL? Provide an example. (advanced)
  • Explain the difference between HAVING and WHERE clauses in SQL. (advanced)
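
When preparing, it helps to work a few of these answers end to end rather than memorizing definitions. The sketch below is a minimal, self-contained illustration in Python using the standard sqlite3 module and an invented employees table (not tied to any particular employer's schema); it answers the second-highest-salary question and shows the difference between WHERE and HAVING around a GROUP BY.

```python
import sqlite3

# Worked answers to two of the questions above, using an invented "employees"
# table in an in-memory SQLite database so the snippet is fully self-contained.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE employees (name TEXT, department TEXT, salary INTEGER)")
cur.executemany(
    "INSERT INTO employees VALUES (?, ?, ?)",
    [
        ("Asha", "Engineering", 90000),
        ("Ravi", "Engineering", 120000),
        ("Meera", "Sales", 70000),
        ("Karan", "Sales", 85000),
        ("Divya", "Engineering", 110000),
    ],
)

# Second-highest salary: a subquery finds the overall maximum, and the outer
# query takes the maximum of everything strictly below it.
second_highest = cur.execute(
    "SELECT MAX(salary) FROM employees "
    "WHERE salary < (SELECT MAX(salary) FROM employees)"
).fetchone()[0]
print("Second-highest salary:", second_highest)  # 110000

# WHERE vs HAVING with GROUP BY: WHERE filters individual rows before they are
# grouped, HAVING filters the aggregated groups afterwards.
rows = cur.execute(
    "SELECT department, AVG(salary) AS avg_salary "
    "FROM employees "
    "WHERE salary > 60000 "        # row-level filter (before grouping)
    "GROUP BY department "
    "HAVING AVG(salary) > 80000"   # group-level filter (after aggregation)
).fetchall()
print(rows)  # [('Engineering', 106666.66...)] -- Sales is filtered out by HAVING
conn.close()
```

Running it prints 110000 for the second-highest salary, and only the Engineering group survives the HAVING filter because its average salary exceeds the group-level threshold.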

Closing Remark

As you venture into the querying job market in India, remember to hone your skills, stay updated with industry trends, and prepare thoroughly for interviews. By showcasing your expertise and confidence, you can position yourself as a valuable asset to potential employers. Best of luck on your querying job search journey!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
