10.0 - 15.0 years
0 Lacs
chennai, tamil nadu
On-site
Are you a skilled Data Architect with a passion for tackling intricate data challenges from various structured and unstructured sources? Do you excel in crafting micro data lakes and spearheading data strategies at an enterprise level? If this sounds like you, we are eager to learn more about your expertise. In this role, you will be responsible for designing and constructing tailored micro data lakes specifically catered to the lending domain. Your tasks will include defining and executing enterprise data strategies encompassing modeling, lineage, and governance. You will play a crucial role in architecting robust data pipelines for both batch and real-time data ingestion, as well as devising strategies for extracting, transforming, and storing data from diverse sources such as APIs, PDFs, logs, and databases. Furthermore, you will be instrumental in establishing best practices for data quality, metadata management, and data lifecycle control. Your hands-on involvement in implementing processes, strategies, and tools will be pivotal in creating innovative products. Collaboration with engineering and product teams to align data architecture with overarching business objectives will be a key aspect of your role. To excel in this position, you should bring over 10 years of experience in data architecture and engineering. A deep understanding of both structured and unstructured data ecosystems is essential, along with practical experience in ETL, ELT, stream processing, querying, and data modeling. Proficiency in tools and languages such as Spark, Kafka, Airflow, SQL, Amundsen, Glue Catalog, and Python is a must. Additionally, expertise in cloud-native data platforms like AWS, Azure, or GCP is highly desirable, along with a solid foundation in data governance, privacy, and compliance standards.
While exposure to the lending domain, ML pipelines, or AI integrations is considered advantageous, a background in fintech, lending, or regulatory data environments is also beneficial. This role offers you the chance to lead data-first transformation and develop products that drive AI adoption, with the autonomy to design, build, and scale modern data architecture. You will be part of a forward-thinking, collaborative, and tech-driven culture with access to cutting-edge tools and technologies in the data ecosystem. If you are ready to shape the future of data with us, we encourage you to apply for this exciting opportunity based in Chennai. Join us in redefining data architecture and driving innovation in the realm of structured and unstructured data sources.
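As a hedged illustration of the batch ingestion work this posting describes, here is a minimal extract-transform-load sketch in plain Python. The JSON-lines source format and all field names ("name", "amount", "borrower") are invented for the example, not taken from the role:

```python
import io
import json

# Illustrative batch ETL step: extract JSON-lines records, normalize them,
# and load them into an in-memory "warehouse" list. All names are hypothetical.

def extract(fp):
    # extract: one JSON document per line
    return [json.loads(line) for line in fp]

def transform(records):
    # transform: drop rows with no amount, normalize name and type
    return [
        {"borrower": r["name"].strip().title(), "amount": int(r["amount"])}
        for r in records
        if r.get("amount") is not None
    ]

def load(rows, sink):
    # load: append to the target store, return rows loaded
    sink.extend(rows)
    return len(rows)

raw = io.StringIO('{"name": " alice ", "amount": "1200"}\n'
                  '{"name": "bob", "amount": null}\n')
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
```

A real pipeline would swap the in-memory source and sink for the posting's actual systems (Kafka topics, object storage, a warehouse table), but the extract/transform/load separation is the same.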
Posted 11 hours ago
6.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Quality Engineer. Location: Chennai (Work from Office). Experience Level: 6-8 years. Tier: T3. We are looking for a highly motivated and detail-oriented Test Engineer to join our data platform engineering team focused on delivering high-quality, scalable, and secure healthcare data solutions. The role will involve end-to-end testing of data pipelines, data quality checks, data lineage, and transformation logic, ensuring the integrity and reliability of healthcare datasets. The ideal candidate will have a strong foundation in Python scripting for test automation, cloud platforms, and DevOps CI/CD tools, with a deep understanding of testing in data-centric environments. Key Responsibilities: Design and execute end-to-end testing strategies for the healthcare data platform, covering data ingestion and flow, data lineage validation, data quality checks, data transformation logic, and data reconciliation across systems. Develop and maintain automated test scripts in Python to validate large-scale healthcare data pipelines and transformations. Build robust data validation frameworks that can perform row-level, column-level, and aggregate-level comparisons between source and target systems. Implement and maintain automated testing suites using Selenium for UI where applicable, and integrate with CI/CD pipelines using DevOps tools (e.g., Jenkins, Azure DevOps, GitHub Actions). Collaborate with data engineers, analysts, and business users to understand data flows and ensure business rules and transformation logic are correctly implemented and tested. Develop and maintain test cases, test plans, and test reports for data warehouse and data lake components. Support performance, regression, and smoke testing of data platform components. Perform root cause analysis and troubleshoot test failures by analyzing logs, data profiles, and pipeline behavior. Ensure all test artifacts comply with internal compliance, audit, and HIPAA standards for healthcare data. Required Qualifications: 3-6 years of experience as a QA/Test Engineer, with at least 2 years focused on data platform or data warehouse testing. Strong experience with Python scripting for building custom test automation tools and frameworks. Hands-on experience with data testing techniques, including data reconciliation between systems, data profiling, schema validation, and transformation rule validation. Proficiency in SQL for data validation and querying across structured and semi-structured sources. Experience with Selenium for UI automation testing where applicable. Familiarity with DevOps CI/CD tools (e.g., Jenkins, GitLab CI, Azure DevOps). Working knowledge of cloud platforms (e.g., AWS, Azure, or GCP) for testing cloud-native data services and environments.
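The row-level and aggregate-level source-versus-target comparisons this role calls for can be sketched in plain Python. This is a minimal illustration, not the team's framework; the key and amount field names are hypothetical:

```python
# Illustrative source-vs-target reconciliation: row-count, row-level,
# and aggregate-level checks over lists of dict rows.

def reconcile(source, target, key, amount_field):
    """Compare two row sets keyed by `key`; return a list of issue strings."""
    issues = []
    # Row-count check
    if len(source) != len(target):
        issues.append(f"row count mismatch: {len(source)} vs {len(target)}")
    # Row-level check: every source row must appear unchanged in the target
    target_by_key = {row[key]: row for row in target}
    for row in source:
        t = target_by_key.get(row[key])
        if t is None:
            issues.append(f"missing key {row[key]} in target")
        elif t != row:
            issues.append(f"row mismatch for key {row[key]}")
    # Aggregate-level check: column totals must match
    src_total = sum(r[amount_field] for r in source)
    tgt_total = sum(r[amount_field] for r in target)
    if src_total != tgt_total:
        issues.append(f"aggregate mismatch: {src_total} vs {tgt_total}")
    return issues

src = [{"id": 1, "amt": 100}, {"id": 2, "amt": 250}]
tgt = [{"id": 1, "amt": 100}, {"id": 2, "amt": 200}]
problems = reconcile(src, tgt, "id", "amt")
```

In practice the same three layers of checks would run as queries against the source and target databases rather than in-memory lists, but the structure of the framework is the same.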
Posted 23 hours ago
5.0 years
3 - 6 Lacs
Thiruvananthapuram
On-site
5 - 7 Years, 1 Opening, Trivandrum. Role: Power BI Developer. Experience Level: 6+ years (minimum 4+ years in Power BI). Role Overview: UST is seeking a skilled Power BI Developer to lead the development and support of business intelligence solutions using Power BI. You will be responsible for the full lifecycle of dashboard creation—from gathering requirements to deployment—while ensuring high performance, usability, and compliance. This role requires deep technical knowledge of Power BI and SQL, and strong communication skills for engaging business stakeholders. Key Responsibilities: Design, develop, and maintain Power BI dashboards and reports across various business units. Manage end-to-end BI development workflows, from requirements gathering to deployment and post-production support. Administer Power BI services, including workspace management, gateway configuration, and capacity planning. Work directly with business users to understand needs and deliver clear, insightful visualizations. Optimize DAX expressions and data models for performance and scalability. Ensure adherence to data governance, security, and compliance standards within BI environments. Provide ongoing technical support and training to business users on Power BI features and best practices. Maintain documentation and implement version control using Git or similar tools. Required Skills & Experience: 6+ years of experience in BI development or data analytics, including 4+ years of hands-on experience with Power BI. Proficiency in developing Power BI reports, dashboards, and DAX formulas, and working with Power BI services. Advanced SQL skills for data querying, transformation, and optimization. Experience in Power BI administration, including performance tuning and capacity planning. Excellent communication and collaboration skills, with the ability to engage senior leadership and business teams. Strong troubleshooting and dashboard optimization capabilities.
Familiarity with version control systems (e.g., Git) and CI/CD pipelines is a plus. Skills: Data Analytics, Power BI, SQL, CI/CD. About UST: UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients’ organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.
Posted 23 hours ago
0 years
6 - 8 Lacs
Hyderābād
On-site
Business Unit: Cubic Transportation Systems Company Details: When you join Cubic, you become part of a company that creates and delivers technology solutions in transportation to make people’s lives easier by simplifying their daily journeys, and defense capabilities to help promote mission success and safety for those who serve their nation. Led by our talented teams around the world, Cubic is committed to solving global issues through innovation and service to our customers and partners. We have a top-tier portfolio of businesses, including Cubic Transportation Systems (CTS) and Cubic Defense (CD). Explore more on Cubic.com. Job Details: JOB SUMMARY: The Program Systems Engineer (PSE), Principal Systems Engineering is the leading practitioner, functioning as the technical manager on assigned programs. The Principal Systems Engineer leverages broad expertise across specialized fields or several related disciplines to lead complex systems engineering efforts and guide program systems ensuring alignment with CTS architecture standards. This role will work on advanced electronic and software-intensive systems, providing technical direction and thought leadership. The position requires significant conceptual thinking, creativity, and independent judgment to address unique, complex issues and drive results that impact broader company functions. The Principal Systems Engineer will collaborate with internal and external stakeholders, coordinate cross-functional projects, and serve as a recognized subject matter expert. Essential Duties and Responsibilities: Define project concepts, develop objectives, and manage project resource needs. Manage the engineering Integrated Master Schedule (IMS) tasks, budgets (burn rates, Earned Value, Estimate-at-Completion), technical performance, and subcontractors while ensuring compliance with contractual requirements.
Execute systems engineering work products through all levels of the project and workflow through the product development teams. Direct the application of existing principles while contributing to the development of new policies and ideas. Leverage cross-disciplinary expertise to advance the organization's technical capabilities. Direct the application of advanced engineering principles, procedures, and techniques to perform systems engineering tasks related to the development of electronic and software-intensive systems and subsystems across programs. Serve as the primary technical consultant, guiding program teams on the most suitable technical approaches and solutions for complex challenges. Exercise independent judgment in evaluating advanced technical methods, techniques, and data to solve significant and unique issues across systems engineering disciplines. Direct the development and application of innovative solutions and contribute to the advancement of company objectives and strategic goals. Mentor and guide junior engineers, providing technical leadership and knowledge transfer to ensure individual growth and organizational development. Oversee architectural design, system performance evaluations, and risk mitigation strategies, identifying both technical risks and creative ways to mitigate them. Maintain a strong awareness of technological advancements in systems engineering and actively contribute to the improvement of processes, policies, and methods across the organization. Interface with customers, vendors, subcontractors, and interdisciplinary teams to drive project success, ensuring that systems meet all performance, cost, and schedule requirements.
Lead system design studies, evaluate technical risks, and communicate findings and advancements through briefings, presentations, and technical papers. Actively lead proposal efforts, offering technical advice, developing system architectures, and preparing cost estimates. Develop proposals for new business, including technical planning and cost estimation, ensuring alignment with business and technical goals. Participate in broader organization projects, requiring effective persuasion of diverse stakeholders and the ability to articulate advanced technical information to non-technical audiences. Background and Experience: College Degree or equivalent in Computer Science, Computer Engineering, Electrical Engineering, or related technical discipline. Ten (10)+ years of related experience, or a master’s degree with eight (8)+ years of experience. Strong experience in Business Analysis, including requirement gathering, technical documentation, and translating business needs into functional specifications. Proficient in Scrum/Agile methodologies with hands-on experience in iterative development processes. Working knowledge of SQL for data querying and analysis. Experience creating and interpreting UML diagrams for system design and analysis. Demonstrated ability to work in customer-facing roles, handling stakeholder expectations and delivering technical solutions. Familiarity with APIs, including understanding of integration principles and API documentation. Solid grasp of Object-Oriented Programming (OOP) concepts. Hands-on experience with tools like EA (Enterprise Architect), Microsoft Visio, and Azure DevOps. Exposure to or involvement in testing activities, including test planning, defect management, and verification processes. Proficient in tools such as DOORS, EPDM, and AllChange. Ability to effectively interface with customers, subcontractors, and vendors, and engage with employees and managers at all levels within and outside of the engineering organization. 
In-depth knowledge of customer needs and awareness of competing products. Strong analytical and problem-solving abilities with the mental agility to address engineering challenges. Capable of working effectively under project deadline pressures and within cost and schedule constraints. Proficient in standard desktop applications, including spreadsheets and word processing software. The description provided above is not intended to be an exhaustive list of all job duties, responsibilities and requirements. Duties, responsibilities and requirements may change over time and according to business need. Worker Type: Employee
Posted 23 hours ago
4.0 years
8 Lacs
Hyderābād
On-site
Job Title: Data Engineer Experience: 4–6 Years Location: Hyderabad (Onsite) Employment Type: Contract (6 Months – 1 Year) Industry: Information Technology / Data Engineering / Cloud Working Model: Onsite About the Role: We are seeking a highly skilled and motivated Data Engineer with strong experience in building scalable data pipelines and working with modern cloud-based data ecosystems. The ideal candidate will have hands-on experience with Databricks, Apache Spark, and Google Cloud Platform (GCP), especially BigQuery, and a passion for driving data initiatives that power intelligent decision-making across the organization. Key Responsibilities: Design, build, and optimize large-scale, reliable data pipelines using Databricks, GCP (BigQuery), and other modern tools. Perform advanced SQL querying, data wrangling, and complex data transformations to support analytics and machine learning initiatives. Handle structured and semi-structured data, and apply Exploratory Data Analysis (EDA) techniques to derive insights. Work closely with data scientists to implement and deploy data models and pipelines into production environments. Ensure data quality, reliability, lineage, and security across the entire data pipeline lifecycle. Participate in data architecture discussions and influence decisions around data design and storage strategy. Contribute to data democratization by ensuring business users have access to clean and usable data. Create detailed documentation and reusable frameworks for data ingestion, transformation, and operational workflows. Required Skills & Qualifications: 3–6+ years of experience in a Data Engineering role or similar. Strong expertise in Databricks and Apache Spark. Deep hands-on experience with GCP BigQuery, including performance tuning, partitioning, and optimization. Proficiency in advanced SQL, including complex joins, CTEs, window functions, and query optimization.
Solid experience with Python for data manipulation and developing robust pipelines. Familiarity with data science concepts such as feature engineering, basic model implementation, and evaluation metrics. Knowledge of data profiling, EDA, and statistical analysis. Sound understanding of data structures, normalization/denormalization, and metadata management. Demonstrated understanding of how data impacts business decisions and product development. Strong problem-solving, communication, and collaboration skills. Education: Bachelor’s degree in Computer Science, Information Systems, Engineering, Computer Applications, or a related technical discipline. Preferred Qualifications (Nice to Have): Exposure to modern data orchestration tools (e.g., Airflow, dbt). Experience working in Agile environments and cross-functional teams. For more information or to apply, contact us at: career@munificentresource.in; Call/WhatsApp: +91 9064363461; Subject Line: Application for Data Engineer – Hyderabad. Job Types: Full-time, Contractual / Temporary. Contract length: 6 months. Pay: Up to ₹70,000.00 per month. Work Location: In person
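As a small illustration of the "advanced SQL" this posting asks for (a CTE combined with a window function), here is a self-contained sketch using Python's built-in sqlite3 module. The table, data, and month filter are invented for the example; window functions require SQLite 3.25 or newer:

```python
import sqlite3

# Sketch: rank daily sales per region using a CTE and a RANK() window function.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, day TEXT, amount INT);
INSERT INTO sales VALUES
  ('south', '2024-01-01', 100),
  ('south', '2024-01-02', 300),
  ('north', '2024-01-01', 200);
""")
rows = conn.execute("""
WITH daily AS (                      -- CTE: restrict to the month of interest
  SELECT region, day, amount FROM sales WHERE day LIKE '2024-01%'
)
SELECT region, day, amount,
       RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
FROM daily
ORDER BY region, rnk
""").fetchall()
```

On BigQuery the same shape applies, with partitioned tables and query-plan inspection layered on top for the tuning work the posting mentions.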
Posted 23 hours ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description: The Product Designer is responsible for implementing and maintaining the application in an AWS SaaS environment. You will work closely with business analysts and stakeholders to ensure a robust and scalable solution to support the Finished Vehicle Logistics operation. We are seeking a skilled and motivated Product Designer with strong experience in Google Cloud Platform (GCP) and Java programming in the Spring Boot framework. In this role, you will be responsible for designing, developing, and maintaining scalable and reliable cloud-based solutions, data pipelines, or applications on GCP, leveraging Java for scripting, automation, data processing, and service integration. Responsibilities Work closely with the product manager and business stakeholders to understand the business needs and associated systems requirements to meet customization required in the SaaS solution. Run and protect the SaaS solution in the AWS environment and troubleshoot production issues. Active participant in all team agile ceremonies; manage the daily deliverables in Jira with proper user stories and acceptance criteria. Design, build, test, implement, and manage scalable, secure, and reliable infrastructure on Google Cloud Platform (GCP) using Infrastructure as Code (IaC) principles, primarily with Terraform. Develop and manage APIs or backend services in Java deployed on GCP services like Cloud Run functions, App Engine, or GKE. Build and maintain robust CI/CD pipelines (e.g., using Cloud Build, Jenkins, GitHub) to enable frequent and reliable application deployments. Build and maintain data products; design, develop, and maintain ETL/data pipelines for handling business and transformation rules. Implement and manage monitoring, logging, and alerting solutions (e.g., Cloud Monitoring, Prometheus, Grafana, Cloud Logging) to ensure system health and performance.
Implement and enforce security best practices within the GCP environment (e.g., IAM policies, network security groups, security scanning). Troubleshoot and resolve production issues across various services (applications) and infrastructure components (GCP). Qualifications Bachelor’s degree in Computer Science or another engineering stream. 8+ years of software development and support experience, including analysis, design, and testing. 4+ years of strong proficiency in software development using Java and Spring Boot. 3+ years of experience working with microservices, data ingestion tools, and APIs. 2+ years of experience working with GCP cloud-based services and solutions. Experience working with GCP’s data storage and services such as BigQuery, Dataflow, and Pub/Sub. Hands-on experience designing, deploying, and managing resources and services on Google Cloud Platform (GCP). Familiarity with database querying (SQL) and understanding of database concepts. Understanding of cloud architecture principles, including scalability, reliability, and security. Proven experience working effectively within an Agile development or operations team (e.g., Scrum, Kanban). Experience using incident tracking and project management tools (e.g., Jira, ServiceNow, Azure DevOps). Excellent problem-solving, communication, and organizational skills. Proven ability to work independently and with a team. Nice-to-Have Skills: GCP certifications (e.g., Associate Cloud Engineer, Professional Cloud DevOps Engineer, Professional Cloud Architect). Experience with other cloud providers (AWS, Azure). Experience with containerization (Docker) and orchestration (Kubernetes). Experience with database administration (e.g., PostgreSQL, MySQL). Familiarity with security best practices and tools in a cloud environment (DevSecOps).
Experience with serverless technologies beyond Cloud Functions/Run. Contribution to open-source projects.
Posted 23 hours ago
6.0 years
1 Lacs
Hyderābād
On-site
About us: Where elite tech talent meets world-class opportunities! At Xenon7, we work with leading enterprises and innovative startups on exciting, cutting-edge projects that leverage the latest technologies across various domains of IT including Data, Web, Infrastructure, AI, and many others. Our expertise in IT solutions development and on-demand resources allows us to partner with clients on transformative initiatives, driving innovation and business growth. Whether it's empowering global organizations or collaborating with trailblazing startups, we are committed to delivering advanced, impactful solutions that meet today’s most complex challenges. We are building a community of top-tier experts and we’re opening the doors to an exclusive group of exceptional AI & ML Professionals ready to solve real-world problems and shape the future of intelligent systems. Structured Onboarding Process We ensure every member is aligned and empowered: Screening – We review your application and experience in Data & AI, ML engineering, and solution delivery. Technical Assessment – A 2-step technical assessment process that includes an interactive problem-solving test and a verbal interview about your skills and experience. Matching you to Opportunity – We explore how your skills align with ongoing projects and innovation tracks. Who We're Looking For: As a Data Analyst, you will work closely with business stakeholders, data engineers, and data scientists to analyze large datasets, build scalable queries and dashboards, and provide deep insights that guide strategic decisions. You’ll use Databricks for querying, transformation, and reporting across Delta Lake and other data sources, and act on data with confidence.
Requirements 6+ years of experience in data analysis, BI, or analytics roles. Strong experience with Databricks Notebooks, SQL, and Delta Lake. Proficiency in writing complex SQL queries (joins, CTEs, window functions). Experience with data profiling, data validation, and root-cause analysis. Comfortable working with large-scale datasets and performance tuning. Solid understanding of data modeling concepts and ETL workflows. Experience with business intelligence tools (e.g., Power BI, Tableau). Familiarity with Unity Catalog and data access governance (a plus). Exposure to Python or PySpark for data wrangling (a plus). Benefits At Xenon7, we're not just building AI systems—we're building a community of talent with the mindset to lead, collaborate, and innovate together. Ecosystem of Opportunity: You'll be part of a growing network where client engagements, thought leadership, research collaborations, and mentorship paths are interconnected. Whether you're building solutions or nurturing the next generation of talent, this is a place to scale your influence. Collaborative Environment: Our culture thrives on openness, continuous learning, and engineering excellence. You'll work alongside seasoned practitioners who value smart execution and shared growth. Flexible & Impact-Driven Work: Whether you're contributing from a client project, innovation sprint, or open-source initiative, we focus on outcomes—not hours. Autonomy, ownership, and curiosity are encouraged here. Talent-Led Innovation: We believe communities are strongest when built around real practitioners. Our Innovation Community isn’t just a knowledge-sharing forum—it’s a launchpad for members to lead new projects, co-develop tools, and shape the direction of AI itself.
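The data-profiling skill listed above (null counts, distinct counts, ranges per column) can be illustrated with a minimal, stdlib-only Python sketch; the column names and records are hypothetical:

```python
# Illustrative data-profiling pass over a list of dict records:
# per-column null count, distinct count, and min/max.

def profile(rows):
    report = {}
    for col in rows[0].keys():
        values = [r[col] for r in rows]
        non_null = [v for v in values if v is not None]
        report[col] = {
            "nulls": len(values) - len(non_null),       # missing values
            "distinct": len(set(non_null)),             # cardinality
            "min": min(non_null) if non_null else None,
            "max": max(non_null) if non_null else None,
        }
    return report

data = [
    {"age": 34, "city": "Chennai"},
    {"age": None, "city": "Chennai"},
    {"age": 41, "city": "Gurgaon"},
]
stats = profile(data)
```

At Databricks scale the same statistics would come from SQL aggregates or PySpark's `describe`, but the per-column shape of the output is the same idea.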
Posted 23 hours ago
3.0 - 8.0 years
5 - 9 Lacs
Gurgaon
On-site
Lead Assistant Manager (EXL/LAM/1415937), Insurance Management, Gurgaon. Posted On: 15 Jul 2025. End Date: 29 Aug 2025. Required Experience: 3 - 8 Years. Number Of Positions: 1. Band: B2 (Lead Assistant Manager). Cost Code: D010854. Campus/Non Campus: Non Campus. Employment Type: Permanent. Requisition Type: New. Max CTC: 1200000 - 2500000. Complexity Level: Not Applicable. Work Type: Hybrid (working partly from home and partly from office). Organisational Group: Insurance. Sub Group: Insurance. Organization: Insurance Management. LOB: Insurance Management. SBU: Insurance Management. Country: India. City: Gurgaon. Center: Gurgaon-SEZ BPO Solutions. Skills: Python, Data Science, Data Science - ML, SQL, Machine Learning. Minimum Qualification: B.Tech/BE, MCA, BSc. Certification: No data available. Job Description. Role Summary: We are looking for a highly motivated and skilled Data Analyst with strong expertise in Python, SQL, and applied Data Science. The candidate should have experience working on machine learning models, forecasting techniques, and data-driven prediction projects. This is a hands-on role involving both development and collaboration with business stakeholders to deliver analytical solutions. Key Responsibilities: Analyze large datasets using SQL and Python to derive actionable insights. Develop and deploy machine learning models for forecasting and predictive analytics. Work on structured and unstructured data to identify patterns and trends. Collaborate with business stakeholders to understand requirements and translate them into analytical solutions. Create dashboards, reports, and visualizations to communicate findings. Ensure data accuracy, consistency, and integrity throughout the analysis process. Leverage cloud platforms such as Azure (preferred) for data handling and model deployment. Required Skills: Strong proficiency in Python for data manipulation and model development. Advanced SQL skills for querying complex data structures.
Solid understanding of machine learning algorithms, time series forecasting, and predictive modeling. Good problem-solving ability and a strong analytical mindset. Good to Have: Exposure to Azure cloud services such as Azure Data Factory, Azure ML, or Azure Blob Storage. Familiarity with Power BI, Tableau, or similar BI tools. Workflow Type: L&S-DA-Consulting
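One of the simplest time-series forecasting techniques in the family this posting mentions is simple exponential smoothing, where the smoothed level is updated as level = alpha * y + (1 - alpha) * level. A hedged, stdlib-only sketch with made-up demand numbers:

```python
# Illustrative simple exponential smoothing (not the team's actual model):
# the one-step-ahead forecast is the final smoothed level.

def ses_forecast(series, alpha=0.5):
    """Smooth `series` with factor `alpha` in (0, 1]; return the forecast."""
    level = series[0]                      # initialize level at first value
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

demand = [100, 120, 110, 130]
forecast = ses_forecast(demand, alpha=0.5)
```

Production work would typically use a library implementation (e.g., statsmodels) with fitted rather than hand-picked alpha, but the update rule is exactly this one line.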
Posted 23 hours ago
5.0 - 8.0 years
5 - 9 Lacs
Gurgaon
On-site
Lead Assistant Manager (EXL/LAM/1422675), Services, Gurgaon. Posted On: 15 Jul 2025. End Date: 29 Aug 2025. Required Experience: 5 - 8 Years. Number Of Positions: 1. Band: B2 (Lead Assistant Manager). Cost Code: D012516. Campus/Non Campus: Non Campus. Employment Type: Permanent. Requisition Type: New. Max CTC: 1600000 - 2600000. Complexity Level: Not Applicable. Work Type: Hybrid (working partly from home and partly from office). Organisational Group: Analytics. Sub Group: Retail Media & Hi-Tech. Organization: Services. LOB: Services. SBU: Analytics. Country: India. City: Gurgaon. Center: EXL - Gurgaon Center 38. Skills: Marketing Analytics, SQL, Tableau. Minimum Qualification: Any Graduate. Certification: No data available. Job Description: Analyst – Marketing & Customer Analytics (Offshore Team). Job Overview: We are looking for a data-savvy and insight-driven Analyst to join our Retail Marketing & Customer Analytics offshore team. This role supports a variety of high-impact analytics needs—from channel efficiency analysis and incrementality measurement to building scalable dashboards and supporting ad hoc marketing and customer deep dives. Ideal for candidates with 3–4 years of experience, this role requires strong SQL and visualization skills, a sharp analytical mindset, and a passion for improving marketing effectiveness through data. Key Responsibilities: Marketing Performance & Channel Analytics: Analyze the efficiency and ROI of marketing channels (e.g., paid search, social, email, affiliates), identifying performance trends and optimization opportunities. Support channel incrementality testing, including design, measurement, and interpretation of test results to determine true lift and return from campaigns. Translate findings from Marketing Mix Models (MMM), attribution models (GA4, MTA, last-click), and driver tree frameworks into actionable recommendations for budget and channel strategy.
Conduct post-campaign analysis to identify what worked, what didn’t, and where opportunities exist for improvement. Customer & Ad Hoc Analytics Partner with onshore stakeholders to solve ad hoc business questions related to customer engagement, retention, segmentation, and lifecycle behavior. Identify and explain key business drivers influencing metrics like revenue, traffic, conversions, or churn using structured frameworks and analytical methods. Visualization & Reporting Build and maintain automated, insightful dashboards using Tableau and Looker that track marketing performance, customer behavior, and attribution outcomes. Design reporting views that make complex modeling output (e.g., MMM or GA4 attribution data) intuitive and digestible for marketing stakeholders. Ensure data pipelines and visualizations are accurate, scalable, and aligned with business definitions. Data Extraction & Quality Write efficient, scalable SQL queries to extract and manipulate large datasets from cloud platforms (e.g., GCP BigQuery). Support ongoing data validation, anomaly detection, and root cause investigation to ensure confidence in insights delivered. Team Collaboration & Agile Delivery Work closely with onshore marketing analytics leads, campaign managers, and CRM teams to align on goals and interpret insights in business context. Participate in Agile ceremonies (e.g., standups, sprint planning) and support documentation and stakeholder communication. Requirements 3–4 years of analytics experience, with a strong foundation in marketing or customer analytics. Proficient in SQL for querying and analyzing large datasets. Hands-on experience with Tableau, Looker, or similar BI tools for building dashboards and visualizations. Familiarity with marketing KPIs such as ROI, conversion rate, CAC, LTV, and incrementality. Strong analytical thinking, with the ability to turn model outputs into real-world business narratives. 
Clear verbal and written communication skills to share insights with technical and non-technical stakeholders. Comfortable working in Agile or cross-functional teams using tools like Jira, Confluence, or Slack. Preferred Qualifications: Exposure to MMM, GA4, MTA, or other attribution models and frameworks. Experience supporting or interpreting incrementality experiments and A/B testing for paid media or CRM programs. Familiarity with retail or e-commerce environments and omnichannel marketing strategies. Understanding of driver tree modeling, campaign forecasting, or uplift modeling concepts. Workflow Type: L&S-DA-Consulting
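The incrementality measurement described in this posting ultimately reduces to comparing conversion rates between an exposed (test) group and a holdout (control) group. A minimal sketch with hypothetical numbers:

```python
# Sketch of incremental lift from a holdout test:
# lift = (test conversion rate - control conversion rate) / control rate.

def incremental_lift(test_conv, test_n, ctrl_conv, ctrl_n):
    """Relative lift of the exposed group over the holdout group."""
    test_rate = test_conv / test_n
    ctrl_rate = ctrl_conv / ctrl_n
    return (test_rate - ctrl_rate) / ctrl_rate

# Hypothetical campaign: 300/10,000 exposed users convert vs 250/10,000 held out.
lift = incremental_lift(test_conv=300, test_n=10_000, ctrl_conv=250, ctrl_n=10_000)
```

A real analysis would add a significance test on the rate difference before attributing the lift to the channel; this sketch only shows the point estimate.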
Posted 23 hours ago
2.0 years
10 Lacs
Gurgaon
On-site
Gurgaon, India

We are seeking an Associate Consultant to join our India team based in Gurgaon. This role at Viscadia offers a unique opportunity to gain hands-on experience in the healthcare industry, with comprehensive training in core consulting skills such as critical thinking, market analysis, and executive communication. Through project work and direct mentorship, you will develop a deep understanding of healthcare business dynamics and build a strong foundation for a successful consulting career.

ROLES AND RESPONSIBILITIES

Technical Responsibilities
- Design and build full-stack forecasting and simulation platforms using modern web technologies (e.g., React, Node.js, Python) hosted on AWS infrastructure (e.g., Lambda, EC2, S3, RDS, API Gateway).
- Automate data pipelines and model workflows using Python for data preprocessing, time-series modeling (e.g., ARIMA, Exponential Smoothing), and backend services.
- Develop and enhance product positioning, messaging, and resources that support the differentiation of Viscadia from its competitors.
- Conduct research and focus groups to elicit key insights that augment positioning and messaging.
- Replace legacy Excel/VBA tools with scalable, cloud-native applications, integrating dynamic reporting features and user controls via web UI.
- Use SQL and cloud databases (e.g., AWS RDS, Redshift) to query and transform large datasets as inputs to models and dashboards.
- Develop interactive web dashboards using frameworks like React + D3.js, or embed tools like Power BI/Tableau into web portals to communicate insights effectively.
- Implement secure, modular APIs and microservices to support modularity, scalability, and seamless data exchange across platforms.
- Ensure cost-effective and reliable deployment of solutions via AWS services, CI/CD pipelines, and infrastructure-as-code (e.g., CloudFormation, Terraform).

Business Responsibilities
- Support the development and enhancement of forecasting and analytics platforms tailored to the needs of pharmaceutical clients across various therapeutic areas.
- Build an in-depth understanding of pharma forecasting concepts, disease areas, treatment landscapes, and market dynamics to contextualize forecasting models and inform platform features.
- Partner with cross-functional teams to ensure forecast deliverables align with client objectives, timelines, and decision-making needs.
- Contribute to a culture of knowledge sharing and continuous improvement by mentoring junior team members and helping codify best practices in forecasting and business analytics.
- Grow into a client-facing role, combining an understanding of commercial strategy with forecasting expertise to lead engagements and drive value for clients.

QUALIFICATIONS
- Bachelor’s degree (B.Tech/B.E.) from a premier engineering institute, preferably in Computer Science, Information Technology, Electrical Engineering, or related disciplines.
- 2+ years of experience in full-stack development, with a strong focus on designing, developing, and maintaining AWS-based applications and services.

SKILLS & TECHNICAL PROFICIENCIES

Technical Skills
- Proficient in Python, with practical experience using libraries such as pandas, NumPy, matplotlib/seaborn, and statsmodels for data analysis and statistical modeling.
- Strong command of SQL for data querying, transformation, and seamless integration with backend systems.
- Hands-on experience in designing and maintaining ETL/ELT data pipelines, ensuring efficient and scalable data workflows.
- Solid understanding and applied experience with cloud platforms, particularly AWS; working familiarity with Azure and Google Cloud Platform (GCP).
- Full-stack web development expertise, including building and deploying modern web applications, web hosting, and API integration.
- Proficient in Microsoft Excel and PowerPoint, with advanced skills in data visualization and delivering professional presentations.

Soft Skills
- Excellent verbal and written communication skills, with the ability to effectively engage both technical and non-technical stakeholders.
- Strong analytical thinking and problem-solving abilities, with a structured and solution-oriented mindset.
- Demonstrated ability to work independently as well as collaboratively within cross-functional teams.
- Adaptable and proactive, with a willingness to thrive in a dynamic, fast-growing environment.
- Genuine passion for consulting, with a focus on delivering tangible business value for clients.

Domain Expertise (Good to have)
- Strong understanding of pharmaceutical commercial models, including treatment journeys, market dynamics, and key therapeutic areas.
- Experience working with and interpreting industry-standard datasets such as IQVIA, Symphony Health, or similar secondary data sources.
- Familiarity with product lifecycle management, market access considerations, and sales performance tracking metrics used across the pharmaceutical value chain.
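The time-series methods named in this posting (e.g., Exponential Smoothing) would normally be run through statsmodels; as a dependency-free illustration of the underlying idea, simple exponential smoothing on an invented monthly demand series:

```python
def simple_exp_smoothing(series, alpha=0.3):
    """Simple exponential smoothing: each smoothed value blends the newest
    observation with the previous smoothed level, weighted by alpha."""
    level = series[0]
    smoothed = [level]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
        smoothed.append(level)
    return smoothed

# Invented monthly demand figures, for illustration only.
demand = [100, 120, 110, 130, 125]
print(simple_exp_smoothing(demand, alpha=0.5))
# [100, 110.0, 110.0, 120.0, 122.5]
```

A production forecast would use a fitted alpha (and trend/seasonality terms) rather than a hand-picked one.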
Posted 23 hours ago
0 years
5 - 6 Lacs
Gurgaon
On-site
Work Location: Gurgaon/Bangalore/Noida
Shift Time: 12PM to 9PM
People Manager role: Yes

GENERAL DESCRIPTION OF ROLE:
Lead an operations team of 12+ colleagues, part of a larger 50+ project team, that conducts surveys to collect compensation data from corporate and business firms and uses the data to create reports that help clients benchmark against the external market, or against prevailing compensation trends for jobs or skill sets comparable to those within the client organization. Clients use this data to evaluate their standing vis-à-vis other organizations. The process involves analysis of data, including rigorous auditing of compensation data, querying clients for doubts/clarifications, removal of compensation outliers keeping in mind the market benchmarks, if any, and finally report generation.

JOB RESPONSIBILITIES
- Leading, motivating, and mentoring direct reports and team members of pension administration, fostering collaboration, and resolving conflicts.
- Setting and monitoring key performance indicators (KPIs), providing feedback, and ensuring team members meet performance standards.
- Holding regular monthly connects with colleagues to review performance, discuss issues, manage expectations, and provide constructive feedback.
- Analysing team outputs and identifying issues or trends connected to errors and time-consuming tasks.
- Accountable for overall team quality control, management of risks and escalations, audit, and governance.
- Using talent management tools, including succession planning and talent reviews, to ensure that the right people are in the right roles for future growth and that there are no single points of failure in the team.
- Identifying development needs and solutions in line with business needs, based on the Aon Development Framework.
- Owning and driving recognition practices in the team; responsible for recognizing colleagues and promoting a culture of recognizing others.
- Providing support and coaching, delivering constructive feedback to team members, and encouraging all to take an active role in their own career development plan (CDP).
- Conducting risk analysis and mitigation: understanding high-impact risks, developing mitigation plans and governance.
- Providing process improvement ideas to simplify the process and reduce manual procedures.
- Ensuring updates and changes to processes are consulted with relevant stakeholders and that agreed change management processes are followed by colleagues.
- Responsible for governance of the project plan and milestones, and for risk assessment.
- Undertaking direct end-client communication with team members to resolve any data-cleaning issues.

SKILLS/COMPETENCIES REQUIRED
- Strong people/team management skills
- Prior experience with a quality framework is a must
- Highly proficient with MS Office tools
- An effective communicator, confident in expressing your own views, with excellent interpersonal skills
- Problem-solving and time management skills
- Flexible, keen on taking initiative, takes accountability and ownership of all project-related aspects, and has a collaborative approach with peers
- Self-motivated, displaying leadership qualities

2564130
Posted 23 hours ago
3.0 - 5.0 years
7 - 10 Lacs
Mohali
On-site
Key Responsibilities:
- Application Development: Design and develop enterprise applications using the Joget platform, ensuring robust, scalable, and user-friendly solutions.
- Customization: Customize Joget forms, workflows, plugins, and UI components to meet business requirements.
- Process Automation: Analyze and implement business process automation workflows, enhancing operational efficiency and reducing manual effort.
- Integration: Integrate Joget applications with third-party systems, APIs, and enterprise tools to enable seamless data exchange.
- Performance Optimization: Optimize Joget applications for performance, scalability, and security.
- Collaboration: Work closely with business analysts, project managers, and other stakeholders to gather and refine requirements.
- Testing & Debugging: Conduct thorough testing, troubleshooting, and debugging to ensure application stability and quality.
- Documentation: Maintain comprehensive technical documentation for all development activities.
- Mentorship: Provide guidance and mentorship to junior developers as needed.

Requirements

Experience: 3-5 years of experience in Joget development (internship experience excluded).

Core Technical Skills

Joget Platform Expertise
- Proficiency in the Joget Workflow platform for designing and developing forms, workflows, data lists, and user views.
- Experience in creating and managing custom Joget plugins.
- Expertise in workflow automation and process configuration.
- Knowledge of Joget’s built-in components, templates, and modular features.

Programming and Development
- Strong knowledge of Java for back-end customizations and plugin development.
- Proficiency in JavaScript, HTML, and CSS for front-end customizations.
- Experience in SQL for database querying and management.
- Familiarity with XML and JSON for data handling.

Integration and APIs
- Hands-on experience integrating Joget applications with third-party systems using REST and SOAP APIs.
- Knowledge of OAuth, JWT, and other authentication mechanisms for secure integrations.
- Experience in handling data exchange between Joget and external systems.

Database Management
- Proficiency in relational databases such as MySQL, PostgreSQL, or Oracle.
- Experience in writing and optimizing complex SQL queries.
- Knowledge of database performance tuning and troubleshooting.

Deployment and Infrastructure
- Familiarity with cloud platforms like AWS, Azure, or Google Cloud for Joget deployment.
- Experience with Docker or other containerization tools for application hosting.
- Joget deployment on multiple operating systems and databases.
- Knowledge of CI/CD pipelines and deployment automation using tools like Jenkins or GitHub Actions.

Debugging and Performance Optimization
- Strong skills in troubleshooting Joget applications to identify and resolve issues.
- Experience in performance optimization of Joget workflows and UI components.
- Familiarity with Joget’s logging and monitoring tools for system analysis.

Security
- Understanding of application security best practices, including data encryption, role-based access control, and user authentication.
- Familiarity with secure coding practices and compliance standards.

Job Type: Full-time
Pay: ₹700,000.00 - ₹1,000,000.00 per year
Benefits:
- Health insurance
- Provident Fund
Schedule: Day shift
Ability to commute/relocate: Mohali, Punjab: Reliably commute or planning to relocate before starting work (Required)
Experience: Joget: 2 years (Required)
Work Location: In person
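The XML/JSON data handling this role calls for often amounts to reshaping a record between the two formats when exchanging data with external systems; a minimal stdlib-only sketch (the field names are invented for illustration, not Joget's actual form schema):

```python
import json
import xml.etree.ElementTree as ET

def form_record_to_xml(record):
    """Serialize a dict-shaped form record into a flat XML payload."""
    root = ET.Element("record")
    for key, value in record.items():
        ET.SubElement(root, key).text = str(value)
    return ET.tostring(root, encoding="unicode")

# Hypothetical form submission arriving as JSON from an external system.
payload = json.loads('{"applicant": "A. Kumar", "status": "approved"}')
xml_out = form_record_to_xml(payload)
print(xml_out)
# <record><applicant>A. Kumar</applicant><status>approved</status></record>
```

In a Joget plugin the same transformation would typically be written in Java, but the mapping logic is identical.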
Posted 23 hours ago
5.0 years
6 - 9 Lacs
Noida
On-site
Calling all innovators – find your future at Fiserv.

We’re Fiserv, a global leader in fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we’re involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title: Specialist, Technical Business Analysis

What does a great Business Analyst do at Fiserv?
Consults with project teams and functional units on the design of important projects or services. Helps support existing business systems applications and demonstrates proficiency and leadership. Balances and prioritizes project work efficiently. Uses data to provide insights and actionable recommendations that inform decision-making and strategy development.

What you will do
You will leverage your expertise in data analysis, machine learning, and statistical modeling to uncover insights and develop predictive models that enhance our fraud rules. You will collaborate with cross-functional teams to analyze vast amounts of transactional data, identify patterns of fraudulent behavior, and continuously improve our monitoring systems. Your contributions will play a crucial role in enhancing the effectiveness of our solutions and protecting our clients from evolving threats. Excellent communication skills are essential, as the role periodically interfaces with clients, vendors, and business partners.

What you will need to have
- Bachelor's degree in Computer Science or Engineering, or relevant work experience.
- IT experience with a background in development.
- Minimum of 5 years' experience in fraud analysis, investigations, or a related field, preferably in the financial services or e-commerce sector.
- Proficiency in programming languages such as Python or R for data analysis and model development.
- Familiarity with SQL for querying and managing relational databases.
- Solid understanding of statistical methods and techniques, including regression analysis, clustering, time series analysis, and classification.
- Excellent analytical skills with the ability to derive insights from complex datasets.
- Strong communication skills to effectively convey technical findings to non-technical stakeholders.
- Ability to work collaboratively in a fast-paced, team-oriented environment.
- Strong written and verbal communication skills, with the ability to present findings and recommendations clearly to stakeholders.
- Self-motivated, proactive, and able to work independently as well as collaboratively in a team environment.

What would be great to have
- Knowledge of financial technologies, standards, and industry regulations (e.g., Payment Card Industry Data Security Standards).
- Knowledge of and experience with JIRA, Service Point, and Confluence products.
- Experience with machine learning frameworks (e.g., scikit-learn, TensorFlow, Keras) and data manipulation libraries (e.g., Pandas, NumPy).

Conclusion
The role demands analyzing large datasets to identify trends, patterns, and anomalies that may indicate payment card compromises across a client's portfolio, and developing, implementing, and maintaining predictive models and machine learning algorithms to enhance fraud detection capabilities.

Thank you for considering employment with Fiserv. Please:
- Apply using your legal name.
- Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).

Our commitment to Diversity and Inclusion:
Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
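Anomaly detection of the kind the Fiserv role describes often begins with simple statistical outlier flags before any machine-learning model is fitted; a minimal z-score sketch over invented transaction amounts (both the data and the threshold are purely illustrative):

```python
from statistics import mean, stdev

def flag_outliers(amounts, z_threshold=3.0):
    """Flag transaction amounts whose z-score exceeds the threshold."""
    mu = mean(amounts)
    sigma = stdev(amounts)
    return [a for a in amounts if abs(a - mu) / sigma > z_threshold]

# Invented card transactions: mostly small purchases, one extreme spike.
txns = [25, 40, 31, 28, 35, 22, 30, 27, 33, 5000]
print(flag_outliers(txns, z_threshold=2.0))  # [5000]
```

Real fraud rules would combine many such features (velocity, merchant category, geography) rather than a single amount threshold.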
Posted 1 day ago
0.0 years
0 Lacs
India
Remote
Job Description: Back End API Developer

Company Overview
We specialize in creating advanced software solutions for stock market brokers, offering an Accounting & Compliance ERP solution that enables brokers with enhanced decision-making and simplifies client management. With cutting-edge technology and seamless integration, we empower brokers to navigate dynamic markets efficiently and deliver exceptional service to their clients.

Position Overview
As a Back End API Developer, you will design, develop, and maintain robust APIs to support our web and mobile applications, with a focus on microservices and real-time data processing. You will collaborate with UI developers, database engineers, and project managers to deliver secure, efficient, and scalable backend services. This role requires expertise in API development using Python, Node.js, TypeScript, and C#, with a strong emphasis on integration with SQL Server, Kafka, and Docker.

Key Responsibilities
- Design, develop, and maintain RESTful APIs using Python, Node.js, TypeScript, or C# to support business applications and front-end frameworks like Angular and Next.js.
- Integrate APIs with SQL Server databases, ensuring efficient schema design and optimized queries.
- Implement real-time data processing using Apache Kafka for event-driven architectures.
- Containerize applications using Docker to ensure consistent deployment across environments.
- Collaborate with UI developers to integrate APIs with front-end applications built with Angular, Next.js, or other UI frameworks.
- Write clean, maintainable, and well-documented code, adhering to best practices and using npm for package management.
- Optimize APIs for performance, scalability, and security, implementing authentication mechanisms (e.g., OAuth, JWT).
- Monitor and troubleshoot API performance, leveraging tools like Postman or Swagger for testing and documentation.
- Participate in agile development processes, including code reviews, testing, and CI/CD pipeline integration.
- Stay updated on industry trends and propose improvements to enhance system performance and reliability.

Qualifications

Education: Bachelor’s degree in Computer Science, Software Engineering, or a related field (or equivalent experience).

Experience:
- 0 to 2 years of back-end development experience with a focus on API development.
- Proven experience building RESTful APIs using Python, Node.js, TypeScript, or C#.
- Hands-on experience with npm for package management and dependency handling.

Technical Skills:
- Proficiency in Python, Node.js, TypeScript, and C# for API development.
- Experience with front-end integration using Angular or Next.js.
- Strong knowledge of SQL Server for database design, querying, and optimization.
- Basic understanding of Apache Kafka for event-driven systems and message queues.
- Familiarity with Docker for containerization and deployment.
- Expertise in RESTful API design, including authentication (e.g., OAuth, JWT) and documentation (e.g., Swagger, OpenAPI).
- Experience with version control (e.g., Git) and CI/CD pipelines.
- Familiarity with microservices architecture and cloud platforms (e.g., AWS, Azure).

Soft Skills:
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration abilities.
- Ability to work in a fast-paced, agile environment.
- Proactive in learning and adopting new technologies.

Preferred Qualifications
- Experience with Next.js for server-side rendering and API integration.
- Familiarity with Angular for front-end integration with APIs.
- Knowledge of GraphQL or gRPC for advanced API development.
- Experience with monitoring tools (e.g., New Relic, Datadog) for API performance.
- Understanding of DevOps practices, including CI/CD pipelines and automation.
- Contributions to open-source projects or a portfolio showcasing API development.

Benefits
- Competitive salary and performance-based incentives.
- Flexible work hours with the option to work from home when specific business requirements arise.
- The office operates six days a week, excluding the second and fourth Saturdays.
- Opportunities for professional growth and skill development.
- Collaborative and inclusive team culture.

How to Apply
Please submit your resume, a cover letter highlighting your experience with Python, Node.js, TypeScript, C#, SQL Server, Kafka, Docker, and UI integration (e.g., Angular, Next.js), and any relevant portfolio or GitHub links to [application email/link]. We look forward to exploring how you can contribute to our innovative projects!

Indiminds Technologies LLP is an equal opportunity employer. We value diversity and are committed to fostering an inclusive environment for all employees.

Job Type: Full-time
Location Type: In-person
Schedule: Day shift, fixed shift
Work Location: In person
Speak with the employer: +91 6200588341
Expected Start Date: 17/07/2025
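The JWT authentication this posting mentions would normally be handled by a library such as PyJWT; purely to illustrate the token structure (header.payload.signature, each base64url-encoded, signed with HMAC-SHA256), a stdlib-only HS256 sketch with made-up claims and secret:

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWTs require."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt_hs256(payload: dict, secret: bytes) -> str:
    """Build a JWT: base64url(header).base64url(payload).base64url(signature)."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}, separators=(",", ":")).encode())
    body = b64url(json.dumps(payload, separators=(",", ":")).encode())
    signing_input = f"{header}.{body}".encode()
    sig = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

# Hypothetical claims and secret, for illustration only.
token = sign_jwt_hs256({"sub": "broker-42", "role": "api"}, b"demo-secret")
print(token.count(".") == 2)  # a JWT has exactly three dot-separated segments
```

In production the secret comes from a secrets manager and verification is done by the framework middleware, not by hand.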
Posted 1 day ago
3.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
As a Data Engineer, you are required to:
- Design, build, and maintain data pipelines that efficiently process and transport data from various sources to storage systems or processing environments, while ensuring data integrity, consistency, and accuracy across the entire pipeline.
- Integrate data from different systems, often involving data cleaning, transformation (ETL), and validation.
- Design the structure of databases and data storage systems, including schemas, tables, and relationships between datasets, to enable efficient querying.
- Work closely with data scientists, analysts, and other stakeholders to understand their data needs and ensure that the data is structured in a way that makes it accessible and usable.
- Stay up to date with the latest trends and technologies in the data engineering space, such as new data storage solutions, processing frameworks, and cloud technologies. Evaluate and implement new tools to improve data engineering processes.

Qualification: Bachelor's or Master's in Computer Science & Engineering, or equivalent. A professional degree in Data Science or Engineering is desirable.

Experience level: At least 3-5 years of hands-on experience in Data Engineering.

Desired Knowledge & Experience:
- Spark: Spark 3.x, RDD/DataFrames/SQL, Batch/Structured Streaming; knowledge of Spark internals: Catalyst/Tungsten/Photon
- Databricks: Workflows, SQL Warehouses/Endpoints, DLT, Pipelines, Unity, Autoloader
- IDE: IntelliJ/PyCharm, Git, Azure DevOps, GitHub Copilot
- Test: pytest, Great Expectations
- CI/CD: YAML Azure Pipelines, Continuous Delivery, Acceptance Testing
- Big Data Design: Lakehouse/Medallion Architecture, Parquet/Delta, Partitioning, Distribution, Data Skew, Compaction
- Languages: Python/Functional Programming (FP)
- SQL: T-SQL/Spark SQL/HiveQL
- Storage: Data Lake and Big Data Storage Design

Additionally, it is helpful to know the basics of:
- Data Pipelines: ADF/Synapse Pipelines/Oozie/Airflow
- Languages: Scala, Java
- NoSQL: Cosmos, Mongo, Cassandra
- Cubes: SSAS (ROLAP, HOLAP, MOLAP), AAS, Tabular Model
- SQL Server: T-SQL, Stored Procedures
- Hadoop: HDInsight/MapReduce/HDFS/YARN/Oozie/Hive/HBase/Ambari/Ranger/Atlas/Kafka
- Data Catalog: Azure Purview, Apache Atlas, Informatica

Required Soft Skills & Other Capabilities:
- Great attention to detail and good analytical abilities
- Good planning and organizational skills
- Collaborative approach to sharing ideas and finding solutions
- Ability to work independently and also in a global team environment
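In the medallion architecture this posting lists, a bronze-to-silver step typically validates, type-coerces, and deduplicates raw records; a dependency-free sketch of that idea (in Spark this would be a DataFrame transform, and the record shape here is invented):

```python
def bronze_to_silver(raw_records):
    """Promote raw (bronze) records to a cleaned (silver) layer:
    drop rows missing the key, coerce types, deduplicate on id."""
    seen = set()
    silver = []
    for rec in raw_records:
        if rec.get("id") is None:   # validation: reject keyless rows
            continue
        if rec["id"] in seen:       # deduplication on the primary key
            continue
        seen.add(rec["id"])
        silver.append({"id": int(rec["id"]), "amount": float(rec.get("amount", 0))})
    return silver

raw = [{"id": 1, "amount": "10.5"}, {"id": None}, {"id": 1, "amount": "99"}]
print(bronze_to_silver(raw))  # [{'id': 1, 'amount': 10.5}]
```

At scale the same logic would be expressed with `dropDuplicates` and column casts over a partitioned Delta table rather than a Python loop.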
Posted 1 day ago
6.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role Description

Role: Power BI Developer
Experience Level: 6+ years (minimum 4+ years in Power BI)

Role Overview
UST is seeking a skilled Power BI Developer to lead the development and support of business intelligence solutions using Power BI. You will be responsible for the full lifecycle of dashboard creation, from gathering requirements to deployment, while ensuring high performance, usability, and compliance. This role requires deep technical knowledge of Power BI and SQL, and strong communication skills for engaging business stakeholders.

Key Responsibilities
- Design, develop, and maintain Power BI dashboards and reports across various business units.
- Manage end-to-end BI development workflows, from requirements gathering to deployment and post-production support.
- Administer Power BI services, including workspace management, gateway configuration, and capacity planning.
- Work directly with business users to understand needs and deliver clear, insightful visualizations.
- Optimize DAX expressions and data models for performance and scalability.
- Ensure adherence to data governance, security, and compliance standards within BI environments.
- Provide ongoing technical support and training to business users on Power BI features and best practices.
- Maintain documentation and implement version control using Git or similar tools.

Required Skills & Experience
- 6+ years of experience in BI development or data analytics, including 4+ years of hands-on experience with Power BI.
- Proficiency in developing Power BI reports, dashboards, and DAX formulas, and in working with Power BI services.
- Advanced SQL skills for data querying, transformation, and optimization.
- Experience in Power BI administration, including performance tuning and capacity planning.
- Excellent communication and collaboration skills, with the ability to engage senior leadership and business teams.
- Strong troubleshooting and dashboard optimization capabilities.
- Familiarity with version control systems (e.g., Git) and CI/CD pipelines is a plus.

Skills: Data Analytics, Power BI, SQL, CI/CD
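A DAX measure such as a year-filtered sales total is, conceptually, a filtered aggregation over a fact table; a pure-Python sketch of that idea on invented rows (the real artifact would be a DAX expression inside the Power BI model, not Python):

```python
def measure_total_sales(rows, year):
    """Equivalent of a year-filtered DAX SUM measure over a fact table."""
    return sum(r["sales"] for r in rows if r["year"] == year)

# Invented fact-table rows for illustration.
fact_sales = [
    {"year": 2023, "sales": 120.0},
    {"year": 2024, "sales": 150.0},
    {"year": 2024, "sales": 80.0},
]
print(measure_total_sales(fact_sales, 2024))  # 230.0
```

Optimizing such measures in Power BI is mostly about keeping the filter on a dimension column so the storage engine, not the formula engine, does the work.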
Posted 1 day ago
5.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Vodafone Idea Limited is an Aditya Birla Group and Vodafone Group partnership. It is India’s leading telecom service provider. The Company provides pan-India voice and data services across 2G, 3G, and 4G platforms. With a large spectrum portfolio to support the growing demand for data and voice, the company is committed to delivering delightful customer experiences and contributing towards creating a truly ‘Digital India’ by enabling millions of citizens to connect and build a better tomorrow. The Company is developing infrastructure to introduce newer and smarter technologies, making both retail and enterprise customers future-ready with innovative offerings, conveniently accessible through an ecosystem of digital channels as well as extensive on-ground presence. The Company is listed on the National Stock Exchange (NSE) and the Bombay Stock Exchange (BSE) in India.

We're proud to be an equal opportunity employer. At VIL, we know that diversity makes us stronger. We are committed to a collaborative, inclusive environment that encourages authenticity and fosters a sense of belonging. We strive for everyone to feel valued, connected, and empowered to reach their potential and contribute their best. VIL's goal is to build and maintain a workforce that is diverse in experience and background but uniform in reflecting our values of Passion, Boldness, Trust, Speed, and Digital. Consequently, our recruiting efforts are directed towards attracting and retaining the best and brightest talent. Our endeavour is to be the first choice for prospective employees. VIL ensures equal employment opportunity without discrimination or harassment based on race, colour, religion, creed, age, sex, sex stereotype, gender, gender identity or expression, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy, veteran or military service status, genetic information, or any other characteristic protected by law.
VIL is an equal opportunity employer committed to diversifying its workforce.

Role: Sales Analyst
Job Level/Designation: M1/Manager
Function/Department: Sales and Distribution
Reports to: Head BI

Job Purpose
To analyze sales data to uncover trends, gaps, and opportunities for improvement; appoint distributors to service the market as per norms; build relationships with the trade; and monitor DSE performance for achievement of targets.

Key Result Areas/Accountabilities
- Analyze sales data to uncover trends, gaps, and opportunities for improvement.
- Develop and maintain dashboards, trackers, and performance reports for sales teams and management.
- Provide actionable insights to drive sales growth, improve territory performance, and support strategic decisions.
- Track and forecast sales performance across regions, products, and channels.
- Work closely with cross-functional teams (Sales, Marketing, Finance, and IT) to align analytics with business goals.
- Support periodic reviews and presentations with visual data insights and strategic recommendations.
- Maintain and optimize sales data in systems like MS Access and Excel, ensuring data integrity and accuracy.
- Run complex queries and automate reports using SQL and Excel VBA/macros.

Core Competencies, Knowledge, Experience
- 3–5 years of experience in sales analysis, business intelligence, or a related field.
- Proficient in SQL for data querying and extraction.
- Strong skills in MS Access for managing and analyzing relational datasets.
- Experience with BI tools like Power BI or Tableau (optional but preferred).
- Strong analytical thinking with attention to detail and problem-solving skills.
- Ability to work under pressure and meet tight deadlines.

Years of Experience: 2-3 years of experience, preferably in a distribution set-up of FMCG/Retail/Telecom.
Direct reports: Nil

Vodafone Idea Limited (formerly Idea Cellular Limited)
An Aditya Birla Group & Vodafone partnership
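Sales trend tracking of the kind this role describes often starts with month-over-month growth; a minimal sketch on invented monthly figures (in practice this would be a SQL query or an Excel VBA report, not Python):

```python
def mom_growth(monthly_sales):
    """Month-over-month growth rate for each consecutive pair of months."""
    return [round((cur - prev) / prev, 4)
            for prev, cur in zip(monthly_sales, monthly_sales[1:])]

sales = [200.0, 220.0, 209.0]  # invented monthly revenue figures
print(mom_growth(sales))  # [0.1, -0.05]
```

The same ratio, computed per region or channel, is what feeds the trend dashboards mentioned above.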
Posted 1 day ago
3.0 years
0 Lacs
New Delhi, Delhi, India
On-site
Job Title: Data Analyst
Location: New Delhi (On-site)
Company: The Kaptive
Experience Required: 3-5 years

About the Role
The Kaptive is looking for a highly skilled Data Analyst to join our team and support our Australian client at our New Delhi office. This role is ideal for someone who thrives on working closely with cross-functional teams, translating raw data into meaningful insights and actionable strategies.

Key Responsibilities
- Develop interactive dashboards and reports using Power BI or Tableau.
- Connect to various data sources, import and clean data, and transform it for analysis.
- Work with SQL for querying and managing large datasets, and create KPIs using BI software.
- Analyze complex datasets to identify trends, patterns, and actionable insights.
- Create regular reports and ad-hoc analyses for business teams.
- Collaborate with both technical and non-technical teams to understand requirements and deliver meaningful data solutions.

Requirements
- Graduate in any discipline (preferably via regular/full-time education) from a recognized institute with a good academic record.
- 2+ years of strong hands-on experience with SQL, Python, and data visualization.
- Proficient in BI report building.
- Strong analytical and mathematical skills.
- Good communication skills, with the ability to explain data insights clearly to different stakeholders.
- Comfortable working in a collaborative and fast-paced environment.
- Willingness to learn and adapt to new tools, technologies, and business domains.

Nice to Have
- Exposure to scripting languages like R.
- Experience working on end-to-end BI or analytics projects.

Who You Are
- A critical thinker with exceptional problem-solving abilities.
- A self-starter who takes initiative, works independently, and delivers results without constant supervision.
- Passionate about data, innovation, and continuous learning.
Posted 1 day ago
0 years
0 Lacs
Budaun Sadar, Uttar Pradesh, India
On-site
Req ID: 330307

NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Business Consulting - Power BI Consultant to join our team in Gurgaon, Uttar Pradesh (IN-UP), India (IN).

Core Responsibilities:
- Developing and Maintaining Data Models: Power BI developers create and maintain data models that effectively represent business requirements. This includes understanding data sources, designing relationships between tables, and ensuring data accuracy and integrity.
- Creating Reports and Dashboards: They design and build interactive dashboards and reports using Power BI to visualize key business metrics and trends. This involves choosing appropriate charts, formats, and layouts to effectively communicate data insights.
- Data Analysis and Visualization: Power BI developers analyze data to identify trends, patterns, and insights, and then visualize them in a way that is understandable and actionable for stakeholders.
- Collaborating with Stakeholders: They work closely with business users to understand their needs and requirements, ensuring that the reports and dashboards meet their specific needs.
- Ensuring Data Governance and Compliance: Power BI developers play a role in ensuring that data is accurate, secure, and compliant with relevant regulations and policies.
- Troubleshooting and Optimization: They troubleshoot and resolve issues related to Power BI solutions, including data integration, performance tuning, and report accessibility.
- Staying Updated with Industry Trends: They keep abreast of the latest Power BI features, best practices, and industry trends to continuously improve reporting capabilities.

Additional Responsibilities:
- Data Integration: Integrating data from various sources, such as SQL databases, Excel, and cloud-based systems, into Power BI.
- Data Transformation: Transforming data to make it suitable for analysis and visualization in Power BI.
- Technical Documentation: Creating technical documentation to support the use and maintenance of Power BI solutions.
- DAX Calculations: Using DAX (Data Analysis Expressions) to create calculated columns and measures for data analysis and reporting.
- SQL Querying: Using SQL to query and retrieve data from databases.
- Custom Visual Development: Developing custom visuals in Power BI to meet specific reporting needs.

Skills Required:
- Proficiency with Power BI tools: Strong understanding of Power BI Desktop, Power BI Service, and other related tools.
- Strong Analytical Skills: Ability to analyze data, identify trends, and derive insights.
- Expertise in DAX and SQL: Knowledge of DAX for calculations and SQL for database querying.
- Excellent Communication Skills: Ability to communicate effectively with stakeholders and users.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize, and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users.
If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
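The SQL querying and data transformation duties in the posting above can be illustrated with a minimal sketch. The table and column names here are invented for the example, and an in-memory SQLite database stands in for whatever relational source would actually feed a Power BI dataset:

```python
import sqlite3

# In-memory stand-in for the relational source a Power BI report would query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("North", 120.0), ("South", 80.0), ("North", 50.0)],
)

# An aggregate query of the kind used to shape data before visualization.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('North', 170.0), ('South', 80.0)]
```

In a real deployment the same aggregation would more likely be pushed into Power Query or a DAX measure, but the shaping logic is the same.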
Posted 1 day ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Hi all,

We have an immediate opening for a "Frontend Full Stack Developer" at our Hyderabad location.

Experience: 2–3 years
Location: Hyderabad
Work Mode: WFO
Notice Period: Immediate joiners

Required Skills and Qualifications:
- 2–3 years of hands-on experience as a frontend or full stack developer.
- Deep expertise in React.js, JavaScript (ES6+), HTML5, and CSS3.
- Proficiency with Material UI (MUI) and component-driven UI development.
- Experience working with micro-frontend architecture (e.g., Module Federation, Single-SPA).
- Good understanding of backend development using Node.js/Express.js.
- Familiarity with MySQL or PostgreSQL for querying and data interaction.
- Cloud deployment experience with AWS or Azure services.
- Strong foundation in version control (Git) and CI/CD tools (e.g., GitHub, GitLab).
- Awareness of security best practices and frontend performance optimization techniques.

Education: Bachelor's degree in Computer Science, IT, Engineering, or a related technical field.

If you're interested, please share your updated resume at recruitment.india@maxisit.com
Posted 1 day ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About Us
At Particleblack, we drive innovation through intelligent experimentation with Artificial Intelligence. Our multidisciplinary team of solution architects, data scientists, engineers, product managers, and designers collaborates with domain experts to deliver cutting-edge R&D solutions tailored to your business.

Responsibilities
- Analyze raw data: assess quality, cleanse, and structure it for downstream processing.
- Design accurate and scalable prediction algorithms.
- Collaborate with the engineering team to bring analytical prototypes to production.
- Generate actionable insights for business improvements.
- Statistical Modeling: Develop and implement core statistical models, including linear and logistic regression, decision trees, and various classification algorithms. Analyze and interpret model outputs to inform business decisions.
- Advanced NLP: Work on complex NLP tasks, including data cleansing, text preprocessing, and feature engineering. Develop models for text classification, sentiment analysis, and entity recognition.
- LLM Integration: Design and optimize pipelines for integrating Large Language Models (LLMs) into applications, with a focus on Retrieval-Augmented Generation (RAG) systems. Fine-tune LLMs to enhance their performance on domain-specific tasks.
- ETL Processes: Design ETL (Extract, Transform, Load) processes to ensure that data is accurately extracted from various sources, transformed into usable formats, and loaded into data warehouses or databases for analysis.
- BI Reporting and SQL: Collaborate with BI teams to ensure that data pipelines support efficient reporting. Write complex SQL queries to extract, analyze, and visualize data for business intelligence reports. Ensure that data models are optimized for reporting and analytics.
- Data Storage and Management: Collaborate with data engineers to design and implement efficient storage solutions for structured and semi-structured text datasets. Ensure that data is accessible, well organized, and optimized for retrieval.
- Model Evaluation and Optimization: Regularly evaluate models using appropriate metrics and improve them through hyperparameter tuning, feature selection, and other optimization techniques. Deploy models in production environments and monitor their performance.
- Collaboration: Work closely with cross-functional teams, including software engineers, data engineers, and product managers, to integrate models into applications and ensure they meet business requirements.
- Innovation: Stay updated with the latest advancements in machine learning, NLP, and data engineering. Experiment with new algorithms, tools, and frameworks to continuously enhance the capabilities of our models and data processes.

Qualifications
- Overall Experience: 5+ years working in a modern software engineering environment, with exposure to best practices in code management, DevOps, and cloud data/ML engineering. Proven track record of developing and deploying machine learning models in production.
- ML Experience: 3+ years in machine learning engineering or data science with a focus on fundamental statistical modeling. Experience in feature engineering, basic model tuning, and understanding model drift over time. Strong foundations in statistics for applied ML.
- Data Experience: 1+ year(s) building data engineering ETL processes and BI reporting.
- NLP Experience: 1+ year(s) working on NLP use cases, including large-scale text data processing and storage, fundamental NLP models for text classification and topic modeling, and, more recently, LLMs and their applications.
- Core Technical Skills: Proficiency in Python and relevant ML and/or NLP libraries. Strong SQL skills for data querying, analysis, and BI reporting. Experience with ETL tools and data pipeline management.
- BI Reporting: Experience designing and optimizing data models for BI reporting, using tools like Tableau, Power BI, or similar.
- Education: Bachelor's or Master's degree in Computer Science / Data Science,
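The retrieval step of the RAG systems mentioned above can be sketched in miniature: score a small document store against a query with TF-IDF-weighted cosine similarity and return the best match to feed into the LLM's context. This is a toy, pure-Python sketch with invented documents; production pipelines would typically use learned embeddings and a vector database:

```python
import math
from collections import Counter

# Invented mini document store standing in for a retrieval corpus.
DOCS = [
    "loan interest rates and repayment schedules",
    "text preprocessing and feature engineering for nlp",
    "quarterly revenue report for the sales team",
]

def tfidf_vector(tokens, corpus):
    """Term frequency weighted by smoothed inverse document frequency."""
    n = len(corpus)
    vec = {}
    for term, count in Counter(tokens).items():
        df = sum(1 for doc in corpus if term in doc.split())
        if df:  # ignore terms absent from the corpus
            vec[term] = count * math.log((1 + n) / (1 + df))
    return vec

def cosine(a, b):
    dot = sum(a[t] * b.get(t, 0.0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, corpus=DOCS):
    """Return the document most similar to the query."""
    qv = tfidf_vector(query.split(), corpus)
    scored = [(cosine(qv, tfidf_vector(d.split(), corpus)), d) for d in corpus]
    return max(scored)[1]

print(retrieve("nlp feature engineering"))
```

The retrieved document would then be concatenated into the LLM prompt as grounding context, which is the "augmented" part of RAG.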
Posted 1 day ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job description:
The ESG Specialist team is a group of skilled individuals with domain knowledge who build and maintain the content currently covered on existing databases and test future databases, tools, and systems. The Specialist team drives automation and simplification to bring efficient processes and quality updates to ESG Operations. The successful applicant will work in an agile environment on projects that deliver automation and process-simplification solutions.

Roles, Responsibilities & Key Accountabilities:
- Develop and deploy machine learning and deep learning models to solve complex problems.
- Conduct statistical analysis to identify trends, patterns, and insights.
- Apply data science skills to develop intelligent systematic checks on textual and Boolean data to identify incorrect and missing data.
- Manage data extraction, reporting, and data analysis.
- Use SQL to query and manage large datasets.
- Implement web scraping techniques to gather relevant data from various sources.
- Work with large language models (LLMs) for natural language processing tasks.

Required Skills:
- Proficiency in ML, deep learning, and statistical analysis techniques.
- Strong Python and SQL skills for data querying and manipulation.
- Proficiency in Microsoft Power Platform.
- Experience with web scraping tools and frameworks.
- Working knowledge of NLP and LLM concepts; familiarity with BERT, GPT, etc.
- Strong problem-solving and analytical skills.
- Reporting using Python visualization packages (e.g., Matplotlib, Seaborn) and Microsoft Power BI.

Desired Skills:
- Knowledge of LSEG proprietary systems for automation, including LEAD & DEMING.
- Understanding of ESG content and the marketplace.
- Understanding of Six Sigma principles.

Education: Data science certification, a software engineering or computer science course of study, or any subject area of study along with expertise in programming languages like Python, VBA, SQL, etc.
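The web-scraping responsibility above can be sketched with the standard library alone. The HTML fragment and field names here are invented for illustration; real scraping would usually pair an HTTP client with a parser such as BeautifulSoup or Scrapy, and must respect each site's terms and robots.txt:

```python
from html.parser import HTMLParser

# Invented HTML fragment standing in for a fetched ESG disclosure page.
PAGE = """
<table>
  <tr><td class="metric">CO2 emissions</td><td class="value">1200</td></tr>
  <tr><td class="metric">Water usage</td><td class="value">340</td></tr>
</table>
"""

class MetricParser(HTMLParser):
    """Collect (metric, value) pairs from td.metric / td.value cells."""
    def __init__(self):
        super().__init__()
        self.cell = None       # class of the <td> we are inside, if any
        self.pairs = []
        self._pending = None   # metric name waiting for its value

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self.cell = dict(attrs).get("class")

    def handle_data(self, data):
        text = data.strip()
        if not text or self.cell is None:
            return
        if self.cell == "metric":
            self._pending = text
        elif self.cell == "value" and self._pending is not None:
            self.pairs.append((self._pending, int(text)))
            self._pending = None

    def handle_endtag(self, tag):
        if tag == "td":
            self.cell = None

parser = MetricParser()
parser.feed(PAGE)
print(parser.pairs)  # [('CO2 emissions', 1200), ('Water usage', 340)]
```

The extracted pairs would then flow into the systematic data-quality checks the posting describes.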
People are at the heart of what we do and drive the success of our business. Our colleagues thrive personally and professionally through our shared values of Integrity, Partnership, Change and Excellence, which are at the core of our culture. We embrace diversity and actively seek to attract people with unique backgrounds and perspectives. We are always looking at ways to become more agile, so we meet the needs of our teams and customers. We believe that an inclusive collaborative workplace is pivotal to our success and supports the potential and growth of all colleagues at LSEG. LSEG is a leading global financial markets infrastructure and data provider. Our purpose is driving financial stability, empowering economies and enabling customers to create sustainable growth. Our purpose is the foundation on which our culture is built. Our values of Integrity, Partnership, Excellence and Change underpin our purpose and set the standard for everything we do, every day. They go to the heart of who we are and guide our decision making and everyday actions. Working with us means that you will be part of a dynamic organisation of 25,000 people across 65 countries. However, we will value your individuality and enable you to bring your true self to work so you can help enrich our diverse workforce. You will be part of a collaborative and creative culture where we encourage new ideas and are committed to sustainability across our global business. You will experience the critical role we have in helping to re-engineer the financial ecosystem to support and drive sustainable economic growth. Together, we are aiming to achieve this growth by accelerating the just transition to net zero, enabling growth of the green economy and creating inclusive economic opportunity. LSEG offers a range of tailored benefits and support, including healthcare, retirement planning, paid volunteering days and wellbeing initiatives. We are proud to be an equal opportunities employer.
This means that we do not discriminate on the basis of anyone’s race, religion, colour, national origin, gender, sexual orientation, gender identity, gender expression, age, marital status, veteran status, pregnancy or disability, or any other basis protected under applicable law. Conforming with applicable law, we can reasonably accommodate applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Please take a moment to read this privacy notice carefully, as it describes what personal information London Stock Exchange Group (LSEG) (we) may hold about you, what it’s used for, and how it’s obtained, your rights and how to contact us as a data subject. If you are submitting as a Recruitment Agency Partner, it is essential and your responsibility to ensure that candidates applying to LSEG are aware of this privacy notice.
Posted 1 day ago
5.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Job Title: GCP Data Engineer
Experience: 5 to 7+ years
Location: Remote
Shift Timing: 3:00 PM – 12:00 AM IST

Job Summary:
We are looking for a highly skilled and motivated GCP Data Engineer with a minimum of 5 to 7+ years of experience in Data Engineering, including at least 3 years of hands-on expertise with Google Cloud Platform (GCP). The ideal candidate will be responsible for designing and implementing robust, scalable, and secure data pipelines and architectures to support data analytics and business intelligence initiatives. This is a remote role with a dedicated shift timing of 3:00 PM – 12:00 AM IST.

Key Responsibilities:
- Design, develop, and optimize data pipelines using GCP services, especially BigQuery, Cloud Storage, and Dataflow.
- Develop and maintain ETL/ELT processes using Python and SQL.
- Implement data models and schemas in BigQuery for efficient querying and storage.
- Collaborate with data scientists, analysts, and stakeholders to define data requirements and deliver robust data solutions.
- Monitor and troubleshoot data pipeline performance, quality, and reliability issues.
- Ensure best practices in data security, governance, and compliance on GCP.
- Automate data workflows and contribute to CI/CD pipeline integration for data solutions.

Required Skills & Qualifications:
- Minimum 6+ years of experience in Data Engineering.
- Strong hands-on experience with Google Cloud Platform (GCP) services.
- Proficiency in BigQuery, Cloud Storage, Dataflow, Pub/Sub, and Cloud Functions.
- Expertise in Python, advanced SQL, and shell scripting.
- Familiarity with Cloud Composer (Airflow) for workflow orchestration.
- Solid understanding of data warehousing concepts, dimensional modeling, and schema design.
- Experience with version control tools like Git and CI/CD tools.
- Knowledge of data security best practices, IAM roles, and encryption methods on GCP.
- Strong problem-solving and debugging skills.
- Good communication and team collaboration abilities.
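One core step in the ETL/ELT work described above, normalizing raw records and keeping only the latest row per key so that reloads stay idempotent, can be sketched in plain Python. The field names are invented; on GCP this logic would typically live in a Dataflow/Beam transform or a BigQuery MERGE statement rather than a loop:

```python
def transform(raw_rows):
    """Normalize raw events and keep only the latest row per id (idempotent reload)."""
    latest = {}
    for row in raw_rows:
        rec = {
            "id": str(row["id"]).strip(),
            "email": row["email"].strip().lower(),
            "updated_at": row["updated_at"],  # ISO-8601 strings sort chronologically
        }
        prev = latest.get(rec["id"])
        if prev is None or rec["updated_at"] > prev["updated_at"]:
            latest[rec["id"]] = rec
    return sorted(latest.values(), key=lambda r: r["id"])

# Invented raw extract with a duplicate id and inconsistent email casing.
raw = [
    {"id": 1, "email": " A@Example.com ", "updated_at": "2024-01-02"},
    {"id": 1, "email": "a@example.com",   "updated_at": "2024-01-05"},
    {"id": 2, "email": "B@example.com",   "updated_at": "2024-01-03"},
]
print(transform(raw))
```

Because the function is deterministic and keyed on id, re-running the load over overlapping extracts produces the same table, which is the property that makes pipeline retries safe.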
Posted 1 day ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About the Organization:
IndusInd Bank is a universal bank with a widespread banking footprint: over 2.5 crore customers, over 5,000 distribution points, and nearly 2,000 branches across the country. The Bank is currently in its fifth planning cycle and is on a mission to attain scale with sustainability at its core, with a specific focus on leapfrogging digital banking capabilities. Accordingly, the digital team at IndusInd Bank is on a journey to create a differentiated, transformative customer experience in financial services. The team's charter is to redefine the banking experience for clients, making it much simpler, more intelligent, and personalized. The data science team (digital portfolio, platform growth & engagement) will be part of the digital team at IndusInd Bank and will help uncover insights that drive the digital strategy and optimize customer experience.

About the Role:
You would be part of the asset analytics and data science team and work on cutting-edge problems for the bank. The individual will work closely with stakeholders across risk, business, partnerships, digital, and strategy to create and refine strategies that augment profitability and growth for the bank. The incumbent will mainly be responsible for producing data-driven, actionable insights and presenting them to relevant stakeholders. The candidate will work in close collaboration with digital product, growth, and marketing teams.

Overall Job Description:
- Experience querying databases and using statistical computer languages: R, Python, SQL, etc.
- Use predictive modelling to increase and optimize customer experience, revenue generation, ad targeting, and other business outcomes.
- Experienced in working with large and multiple datasets and data warehouses, with the ability to pull data using relevant programs and coding.
- Well versed in the necessary data preprocessing and feature engineering skills.
- Strong background in statistical analysis.
- Constantly research ML algorithms and data sources for better prediction.
- Work and coordinate with multiple stakeholders to identify opportunities for leveraging company data to drive business solutions, implement models, and monitor outcomes.
- Assess the effectiveness and accuracy of new data sources and data-gathering techniques, and develop processes and tools to monitor and analyze model performance and data accuracy.
- Experience in establishing/scaling up data science functions.
- Proven ability to discover solutions hidden in large datasets and to drive business results with data-based insights.
- Leverage analytics to increase customer lifetime value for digitally acquired clients by pitching the right product to the right client at the right time.
- Help define pricing models for digital value propositions for various segments of users/clients to ensure profitability of the portfolio and achievement of business outcomes.
- Work with product, growth, and marketing teams across the product/campaign lifecycle.
- Empower product and marketing teams by creating automated dashboards and reports using Power BI.

Skills/Capabilities:
- Model development experience in R, Python, SAS.
- Strong and in-depth understanding of statistics.
- Strong strategic thought leadership and problem-solving skills, with the ability to tackle unstructured and complex business problems.
- Ability to build and use relationships and influence broadly across the organization.
- Results driven, with strong project management skills and the ability to work on multiple priorities.
- Handling big data, segmentation, analytics, machine learning, artificial intelligence, statistics, and hypothesis testing.
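Evaluating the predictive models described above (for example, a propensity model deciding which product to pitch to which client) commonly starts with ROC AUC. A minimal pure-Python version using the pairwise-comparison formulation is sketched below; the scores and labels are made up for the example:

```python
def roc_auc(labels, scores):
    """Probability that a random positive outranks a random negative (ties count half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Invented propensity scores for six customers, with observed conversions as labels.
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.2]
print(roc_auc(labels, scores))  # 0.8888888888888888
```

The O(n^2) double loop is fine for a sanity check; at portfolio scale one would use the rank-sum form or a library implementation instead.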
Posted 1 day ago
8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
The MakeMyTrip-GoIbibo data science group is looking for a seasoned data science professional who can build next-gen travel fintech systems powered by personalized product recommendations/persuasions, real-time pricing, and causal algorithms.

Responsibilities:
- Train and deploy best-in-class deep learning models for ranking, pricing, recommendation, representation learning, and FinTech products.
- Work with stakeholders at various stages of a project; lead project development from inception to completion.
- Be a SQL/Python ninja; understand data thoroughly and build smart AI/ML systems.
- Work with large clickstream e-commerce datasets at MakeMyTrip and GoIbibo, i.e., in the rapidly growing travel space.
- Build and own robust data science APIs meeting a 99% SLA at very high RPS, and ensure training/inference MLOps rigour.
- Show business impact while having the opportunity to build best-in-class AI/ML models.

Requirements:
- Bachelor's degree in Mathematics, Statistics, a related technical field, or equivalent practical experience.
- A minimum of 8 years of experience (or a minimum of 5 years with a Ph.D.) in one or more of the following: ML modeling, ranking, recommendations, or personalization systems.
- Knowledge of advanced ML techniques such as classification, prediction, recommender systems, dynamic pricing, choice modelling, and anomaly detection.
- Experience with statistical data analysis such as linear models, multivariate analysis, stochastic models, and sampling methods.
- Experience deploying models into production is preferred.
- Experience applying machine learning techniques to big data systems (e.g., Spark) with TB- to PB-scale datasets.
- Experience with data querying languages (e.g., SQL), scripting languages (e.g., Python), and/or statistical/mathematical software.
- Strong research record demonstrated through publications.
- Experience with design and analysis of experiments.
- Experience with large-scale A/B testing systems, especially in the domain of online advertising or online commerce.
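The A/B-testing requirement above usually reduces to comparing conversion rates between two experiment arms. A pooled two-proportion z-test can be sketched with the standard library alone; the traffic and conversion counts below are invented, and real systems add sequential-testing or multiple-comparison corrections on top:

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Pooled two-proportion z statistic and two-sided p-value (normal approximation)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, expressed via erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Invented experiment: control converts 200/10,000; variant converts 260/10,000.
z, p = two_proportion_ztest(conv_a=200, n_a=10_000, conv_b=260, n_b=10_000)
print(round(z, 3), round(p, 4))
```

Here the lift from 2.0% to 2.6% is significant at the usual 5% level; with smaller samples the same lift would not be, which is why sample-size planning precedes the launch of any test.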
Posted 1 day ago
The querying job market in India is thriving with opportunities for professionals skilled in database querying. With the increasing demand for data-driven decision-making, companies across various industries are actively seeking candidates who can effectively retrieve and analyze data through querying. If you are considering a career in querying in India, here is some essential information to help you navigate the job market.
The average salary range for querying professionals in India varies based on experience and skill level. Entry-level positions can expect to earn between INR 3-6 lakhs per annum, while experienced professionals can command salaries ranging from INR 8-15 lakhs per annum.
In the querying domain, a typical career progression may look like:
- Junior Querying Analyst
- Querying Specialist
- Senior Querying Consultant
- Querying Team Lead
- Querying Manager

Apart from strong querying skills, professionals in this field are often expected to have expertise in:
- Database management
- Data visualization tools
- SQL optimization techniques
- Data warehousing concepts
As you venture into the querying job market in India, remember to hone your skills, stay updated with industry trends, and prepare thoroughly for interviews. By showcasing your expertise and confidence, you can position yourself as a valuable asset to potential employers. Best of luck on your querying job search journey!