
859 OLAP Jobs

JobPe aggregates listings so you can browse them in one place, but applications are submitted directly on the original job portal.

5.0 years

3 - 7 Lacs

Hyderābād

On-site

Job Title: Databricks Developer / Data Engineer
Duration: 12 months with possible extension
Location: Hyderabad, Telangana (Hybrid; 1-2 days onsite at client location)

Job Summary: We are seeking a highly skilled Databricks Developer / Data Engineer with 5+ years of experience in building scalable data pipelines, managing large datasets, and optimizing data workflows in cloud environments. The ideal candidate will have hands-on expertise in Azure Databricks, Azure Data Factory, and other Azure-native services, playing a key role in enabling data-driven decision-making across the organization.

Key Responsibilities:
- Design, develop, and maintain scalable ETL/ELT pipelines for data ingestion, transformation, and integration
- Work with both structured and unstructured data from a variety of internal and external sources
- Collaborate with data analysts, scientists, and engineers to ensure data quality, integrity, and availability
- Build and manage data lakes, data warehouses, and data models (Azure Databricks, Azure Data Factory, Snowflake, etc.)
- Optimize performance of large-scale batch and real-time processing systems
- Implement data governance, metadata management, and data lineage practices
- Monitor and troubleshoot pipeline issues; perform root cause analysis and proactive resolution
- Automate data validation and quality checks
- Ensure compliance with data privacy, security, and regulatory requirements
- Maintain thorough documentation of architecture, data workflows, and processes

Mandatory Qualifications:
- 5+ years of hands-on experience with: Azure Blob Storage, Azure Data Lake Storage, Azure SQL Database; Azure Logic Apps, Azure Data Factory, Azure Databricks, Azure ML; Azure DevOps Services, Azure API Management, Webhooks
- Intermediate-level proficiency in Python scripting and PySpark
- Basic understanding of Power BI and visualization functionalities

Technical Skills & Experience Required:
- Proficient in SQL and working with both relational and non-relational databases (e.g., SQL, PostgreSQL, MongoDB, Cassandra)
- Hands-on experience with Apache Spark, Hadoop, Hive for big data processing
- Proficiency in building scalable data pipelines using Azure Data Factory and Azure Databricks
- Solid knowledge of cloud-native tools: Delta Lake, Azure ML, Azure DevOps
- Understanding of data modeling, OLAP/OLTP systems, and data warehousing best practices
- Experience with CI/CD pipelines, version control with Git, and working with Azure Repos
- Knowledge of data security, privacy policies, and compliance frameworks
- Excellent problem-solving, troubleshooting, and analytical skills
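For context, a minimal PySpark sketch of the kind of ETL pipeline this posting describes. All storage paths, column names, and the table layout are illustrative assumptions for the example, not details from the job; it also assumes a Delta-enabled Spark environment such as Databricks.

```python
# Minimal PySpark ETL sketch: ingest raw JSON, clean it, write a Delta table.
# Paths and columns are made up for illustration.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Ingest raw JSON from a landing zone (e.g., ADLS) into a DataFrame.
raw = spark.read.json("abfss://landing@example.dfs.core.windows.net/orders/")

# Basic transformation: drop bad records, derive a date column.
clean = (
    raw.filter(F.col("order_id").isNotNull())
       .withColumn("order_date", F.to_date("order_ts"))
)

# Write to a Delta table partitioned by date for downstream consumption.
(clean.write.format("delta")
      .mode("append")
      .partitionBy("order_date")
      .save("abfss://curated@example.dfs.core.windows.net/orders_clean/"))
```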

Posted 6 hours ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra

Remote

R022093 | Pune, Maharashtra, India | IT Operations | Regular

Location: India, Remote. This is a remote position, so you'll be working remotely from your home. You may occasionally visit a GoDaddy office to meet with your team for events or meetings.

Join Our Team...
Demonstrate your passion for helping small businesses achieve their dreams online. By helping to move strategy into action, you will be improving GoDaddy's outreach to those small business owners whose dreams are the backbone of our company. Take part in a multichannel environment, turning strategic plans into digital marketing campaigns and ultimately influencing our customers' success!

The Marketing Data Analyst will bring to bear their experience and knowledge of marketing data to deliver timely and relevant omnichannel marketing experiences to our customers worldwide. You will apply your experience with marketing data to a robust marketing technology platform to drive campaign automation and optimization. This will ensure continuous improvement in scaling operations for our customer marketing programs, including Email, SMS, WhatsApp, and new & emerging channels.

What you'll get to do...
- Serve as Marketing Data subject matter expert for the Customer Marketing team, with extensive knowledge of data including standard methodologies and anti-patterns
- Play an active role in driving requirements for the implementation and integration of an evolving, exceptional marketing automation platform
- Craft and develop customer segments to be applied over Email, Web, CRM, SMS, WhatsApp, and many other customer touch points
- Collaborate with cross-functional teams in the creation of segmentation and personalisation-based strategies
- Perform ongoing analysis of marketing programs and broader business performance to surface key insights and recommendations that help inform our marketing strategy
- Ensure the accuracy of our outbound marketing campaigns by driving QA and ongoing monitoring at all levels, all the way up to source data

Your experience should include...
- 4+ years of experience in marketing data management, specialising in data set development for marketing automation and email marketing
- Minimum 4 years of experience working with SQL syntax, relational and non-relational database models, OLAP, and data-driven marketing platforms, with proven experience writing and understanding complex queries
- Expertise in testing/optimization methodologies and performance tuning, for your own work and in reviews, with strong analytical and data presentation abilities
- Experience collaborating with the MarTech Platform Team, Data Platform, and Marketing Managers to present findings and quickly diagnose and troubleshoot emergent issues
- Experience in segmentation tools like MessageGears, SQL Server, and AWS database systems such as Redshift and Athena is highly preferred
- Experience with data visualisation tools like Tableau and/or QuickSight is preferred

You might also have...
- Four-year bachelor's degree required; master's degree is preferred
- Hands-on skills in Python and experience with an enterprise-level marketing automation platform such as Salesforce Marketing Cloud is preferred
- Experience working with B2B and B2C data, including lead and prospect management, is nice to have
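Given the emphasis above on writing complex segmentation queries, here is a small, self-contained sketch of the pattern. The schema and values are invented for the example (SQLite stands in for the warehouse); the query selects customers who opened an email but never purchased afterwards.

```python
# Illustrative audience-segmentation query: "opened but did not purchase".
# Tables and rows are synthetic; only the join/anti-join pattern matters.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE email_events (customer_id INT, event TEXT, event_date TEXT);
CREATE TABLE orders (customer_id INT, order_date TEXT);
INSERT INTO email_events VALUES (1, 'open', '2024-05-01'), (2, 'open', '2024-05-02');
INSERT INTO orders VALUES (2, '2024-05-03');
""")

segment = con.execute("""
SELECT e.customer_id
FROM email_events e
LEFT JOIN orders o
  ON o.customer_id = e.customer_id
 AND o.order_date >= e.event_date     -- purchase after the open
WHERE e.event = 'open'
  AND o.customer_id IS NULL           -- anti-join: no such purchase
""").fetchall()

print(segment)  # -> [(1,)] : customer 1 opened but never purchased
```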
We've got your back...
We offer a range of total rewards that may include paid time off, retirement savings (e.g., 401k, pension schemes), bonus/incentive eligibility, equity grants, participation in our employee stock purchase plan, competitive health benefits, and other family-friendly benefits including parental leave. GoDaddy's benefits vary based on individual role and location and can be reviewed in more detail during the interview process. We also embrace our diverse culture and offer a range of Employee Resource Groups (Culture). Have a side hustle? No problem. We love entrepreneurs! Most importantly, come as you are and make your own way.

About us...
GoDaddy is empowering everyday entrepreneurs around the world by providing the help and tools to succeed online, making opportunity more inclusive for all. GoDaddy is the place people come to name their idea, build a professional website, attract customers, sell their products and services, and manage their work. Our mission is to give our customers the tools, insights, and people to transform their ideas and personal initiative into success. To learn more about the company, visit About Us.

At GoDaddy, we know diverse teams build better products—period. Our people and culture reflect and celebrate that sense of diversity and inclusion in ideas, experiences and perspectives. But we also know that's not enough to build true equity and belonging in our communities. That's why we prioritize integrating diversity, equity, inclusion and belonging principles into the core of how we work every day—focusing not only on our employee experience, but also our customer experience and operations. It's the best way to serve our mission of empowering entrepreneurs everywhere, and making opportunity more inclusive for all. To read more about these commitments, as well as our representation and pay equity data, check out our Diversity and Pay Parity annual report, which can be found on our Diversity Careers page.

GoDaddy is proud to be an equal opportunity employer. GoDaddy will consider for employment qualified applicants with criminal histories in a manner consistent with local and federal requirements. Refer to our full EEO policy. Our recruiting team is available to assist you in completing your application. If they could be helpful, please reach out to myrecruiter@godaddy.com. GoDaddy doesn't accept unsolicited resumes from recruiters or employment agencies.

Posted 14 hours ago

Apply

2.0 years

0 Lacs

Patna, Bihar, India

On-site

Job Description
This is an exciting opportunity for an experienced industry professional with strong analytical and technical skills to join and add value to a dedicated and friendly team. We are looking for a Data Analyst who is driven by data-driven decision-making and insights. As a core member of the Analytics Team, the candidate will take ownership of data analysis projects, working independently with little supervision.

The ideal candidate is a highly resourceful and innovative professional with extensive experience in data analysis, statistical modeling, and data visualization. The candidate must have a strong command of data analysis tools like SAS/SPSS, Power BI/Tableau, or R, along with expertise in MS Excel and MS PowerPoint. The role requires optimizing data collection procedures, generating reports, and applying statistical techniques for hypothesis testing and data interpretation.

Key Responsibilities
- Perform data analysis using tools like SAS, SPSS, Power BI, Tableau, or R.
- Optimize data collection procedures and generate reports on a weekly, monthly, and quarterly basis.
- Utilize statistical techniques for hypothesis testing to validate data and interpretations.
- Apply data mining techniques and OLAP methodologies for in-depth insights.
- Develop dashboards and data visualizations to present findings effectively.
- Collaborate with cross-functional teams to define, design, and execute data-driven strategies.
- Ensure the accuracy and integrity of data used for analysis and reporting.
- Utilize advanced Excel skills to manipulate and analyze large datasets.
- Prepare technical documentation and presentations for stakeholders.

Candidate Profile

Qualifications
- Graduate or Post Graduate in Statistics, MCA, or BE/B.Tech in Computer Science & Engineering, Information Technology, or Electronics.
- A minimum of 2 years' experience in data analysis using SAS/SPSS, Power BI/Tableau, or R.
- Proficiency in MS Office, with expertise in MS Excel & MS PowerPoint.
- Strong analytical skills with attention to detail.
- Experience in data mining and OLAP methodologies.
- Ability to generate insights and reports based on data trends.
- Excellent communication and presentation skills.

Desired Qualifications
- Experience in predictive analytics and machine learning techniques.
- Knowledge of SQL and database management.
- Familiarity with Python for data analysis.
- Experience in automating reporting processes.

(ref:hirist.tech)
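Since the posting calls out statistical techniques for hypothesis testing, here is a minimal sketch of that workflow in Python with SciPy. The data is synthetic; the two groups could be any before/after or A/B metric.

```python
# Minimal hypothesis-testing sketch: Welch's two-sample t-test on
# synthetic data standing in for a business metric.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
group_a = rng.normal(loc=100, scale=10, size=200)  # e.g., baseline metric
group_b = rng.normal(loc=103, scale=10, size=200)  # e.g., after a change

t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# Reject the null hypothesis of equal means at the 5% level if p < 0.05.
```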

Posted 16 hours ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

You have a minimum of 5 years of experience in the MSBI product suite, particularly in Power BI and DAX. Your role involves data preparation for BI projects, understanding business requirements in a BI context, transforming raw data into meaningful insights using Power BI, and working with SSIS. You are skilled in requirement analysis, design, prototyping, and building enterprise models using Power BI Desktop.

Your responsibilities also include developing data models, OLAP cubes, and reports while applying best practices to the development lifecycle. You document source-to-target mappings, data dictionaries, and database designs, and identify areas for optimization in data flows. You have a good understanding of DAX queries in Power BI Desktop and can create Power BI dashboards, reports, and KPI scorecards, and transform manual reports. Additionally, you have experience in visualization, transformation, data analysis, and formatting.

Your expertise extends to connecting to data sources, importing and transforming data for business intelligence, and publishing and scheduling Power BI reports. You are also involved in the installation and administration of Microsoft SQL Server. Knowledge of EBS modules like Finance, HCM, and Procurement is considered an advantage.

You excel in a fast-paced, dynamic, client-facing role, delivering high-quality work products that exceed expectations. Your leadership, interpersonal, prioritization, multi-tasking, problem-solving, and communication skills are exceptional. Your ability to thrive in a team-oriented environment, manage ambiguity, adapt to new technologies, and solve undefined problems makes you a valuable asset.

Posted 18 hours ago

Apply

2.0 years

27 - 42 Lacs

Pune

Work from Office

The Role
We are currently seeking an experienced Backend Software Engineer with a strong Java background to join Addepar on our Partner Platform team! We are building out a new platform from scratch which will enable third parties to simply and safely engage with Addepar at scale. This team is passionate about handling large volumes of data and the engineering challenges in building the distributed systems responsible for automated data ingestion and transformation. We want people who are hard-working and care deeply about solving hard problems at high scale, delighting customers, and participating in the success of the whole company. We look for dedicated engineers with real technical depth and a desire to understand the end business. If you've designed sophisticated scalable systems, have extensive experience with Java and related technologies, or are just interested in tackling complicated, critically important technical problems, join us!

What You'll Do
- Work in partnership with engineering partners and other platform users to identify requirements and priorities, and map out solutions for challenging technology and workflow problems.
- Design, develop, and deploy high-quality Java applications that integrate with various data sources and services.
- Build technical skills in a high-performing team of engineers in India who can design, develop, and deploy Java-based solutions with a focus on backend services and APIs, and help other teams at Addepar build on top of the Addepar platform.
- Lay a solid foundation of the software architecture for the team in system design and code development, with a strong focus on Java and related technologies.

Who You Are
- B.S. or M.S. in Computer Science or a similar technical field of study (or equivalent practical experience).
- 4+ years of software engineering experience.
- Expert-level proficiency in backend development, with a focus on Java.
- Good experience with AWS or any other cloud platform.
- Experience with databases, SQL, NoSQL, OLAP, and/or data lake architectures.
- A strong ownership mentality and drive to solve the most important problems.
- Passion for implementing standard processes with a bias toward smart automation.
- A rapid learner with robust analytical and problem-solving abilities.
- Comfortable working in a cloud context, with automated infrastructure and service-oriented architecture.
- Experience with Java, Spring Boot, RESTful APIs, and related technologies is preferred.
- Practical knowledge of agile practices, with an outlook that prioritizes experimentation and iteration, combined with an ability to guide teams toward activities and processes that facilitate optimal outcomes.

Posted 18 hours ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Who We Are
Addepar is a global technology and data company that helps investment professionals provide the most informed, precise guidance for their clients. Hundreds of thousands of users have entrusted Addepar to empower smarter investment decisions and better advice over the last decade. With client presence in more than 50 countries, Addepar's platform aggregates portfolio, market and client data for over $7 trillion in assets. Addepar's open platform integrates with more than 100 software, data and services partners to deliver a complete solution for a wide range of firms and use cases. Addepar embraces a global flexible workforce model with offices in New York City, Salt Lake City, Chicago, London, Edinburgh, Pune, and Dubai.

The Role
We are currently seeking an experienced Backend Software Engineer with a strong Java background to join Addepar on our Partner Platform team! We are building out a new platform from scratch which will enable third parties to simply and safely engage with Addepar at scale. This team is passionate about handling large volumes of data and the engineering challenges in building the distributed systems responsible for automated data ingestion and transformation. We want people who are hard-working and care deeply about solving hard problems at high scale, delighting customers, and participating in the success of the whole company. We look for dedicated engineers with real technical depth and a desire to understand the end business. If you've designed sophisticated scalable systems, have extensive experience with Java and related technologies, or are just interested in tackling complicated, critically important technical problems, join us!

What You'll Do
- Work in partnership with engineering partners and other platform users to identify requirements and priorities, and map out solutions for challenging technology and workflow problems.
- Design, develop, and deploy high-quality Java applications that integrate with various data sources and services.
- Build technical skills in a high-performing team of engineers in India who can design, develop, and deploy Java-based solutions with a focus on backend services and APIs, and help other teams at Addepar build on top of the Addepar platform.
- Lay a solid foundation of the software architecture for the team in system design and code development, with a strong focus on Java and related technologies.

Who You Are
- B.S. or M.S. in Computer Science or a similar technical field of study (or equivalent practical experience).
- 4+ years of software engineering experience.
- Expert-level proficiency in backend development, with a focus on Java.
- Good experience with AWS or any other cloud platform.
- Experience with databases, SQL, NoSQL, OLAP, and/or data lake architectures.
- A strong ownership mentality and drive to solve the most important problems.
- Passion for implementing standard processes with a bias toward smart automation.
- A rapid learner with robust analytical and problem-solving abilities.
- Comfortable working in a cloud context, with automated infrastructure and service-oriented architecture.
- Experience with Java, Spring Boot, RESTful APIs, and related technologies is preferred.
- Practical knowledge of agile practices, with an outlook that prioritizes experimentation and iteration, combined with an ability to guide teams toward activities and processes that facilitate optimal outcomes.

Our Values
- Act Like an Owner - Think and operate with intention, purpose and care. Own outcomes.
- Build Together - Collaborate to unlock the best solutions. Deliver lasting value.
- Champion Our Clients - Exceed client expectations. Our clients' success is our success.
- Drive Innovation - Be bold and unconstrained in problem solving. Transform the industry.
- Embrace Learning - Engage our community to broaden our perspective. Bring a growth mindset.

In addition to our core values, Addepar is proud to be an equal opportunity employer. We seek to bring together diverse ideas, experiences, skill sets, perspectives, backgrounds and identities to drive innovative solutions. We commit to promoting a welcoming environment where inclusion and belonging are held as a shared responsibility.

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

PHISHING SCAM WARNING: Addepar is among several companies recently made aware of a phishing scam involving con artists posing as hiring managers recruiting via email, text and social media. The imposters are creating misleading email accounts, conducting remote "interviews," and making fake job offers in order to collect personal and financial information from unsuspecting individuals. Please be aware that no job offers will be made from Addepar without a formal interview process. Additionally, Addepar will not ask you to purchase equipment or supplies as part of your onboarding process. If you have any questions, please reach out to TAinfo@addepar.com.

Posted 23 hours ago

Apply

3.0 - 7.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Develop, test, and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance and information management, and associated technologies. Communicate risks and ensure understanding of these risks.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Graduate with a minimum of 5+ years of related experience.
- Experience in modelling and business system designs.
- Good hands-on experience with DataStage and cloud-based ETL services.
- Great expertise in writing T-SQL code.
- Well-versed in data warehouse schemas and OLAP techniques.

Preferred technical and professional experience:
- Ability to manage and make decisions about competing priorities and resources.
- Ability to delegate where appropriate.
- Must be a strong team player/leader.
- Ability to lead data transformation projects with multiple junior data engineers.
- Strong oral, written, and interpersonal skills for interacting at all levels of the organization.
- Ability to communicate complex business problems and technical solutions.

Posted 1 day ago

Apply

2.0 - 5.0 years

9 - 13 Lacs

Gurugram

Work from Office

Develop, test, and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance and information management, and associated technologies. Communicate risks and ensure understanding of these risks.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Graduate with a minimum of 5+ years of related experience.
- Experience in modelling and business system designs.
- Good hands-on experience with DataStage and cloud-based ETL services.
- Great expertise in writing T-SQL code.
- Well-versed in data warehouse schemas and OLAP techniques.

Preferred technical and professional experience:
- Ability to manage and make decisions about competing priorities and resources.
- Ability to delegate where appropriate.
- Must be a strong team player/leader.
- Ability to lead data transformation projects with multiple junior data engineers.
- Strong oral, written, and interpersonal skills for interacting at all levels of the organization.
- Ability to communicate complex business problems and technical solutions.

Posted 1 day ago

Apply

2.0 - 5.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Develop, test, and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance and information management, and associated technologies. Communicate risks and ensure understanding of these risks.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Graduate with a minimum of 5+ years of related experience.
- Experience in modelling and business system designs.
- Good hands-on experience with DataStage and cloud-based ETL services.
- Great expertise in writing T-SQL code.
- Well-versed in data warehouse schemas and OLAP techniques.

Preferred technical and professional experience:
- Ability to manage and make decisions about competing priorities and resources.
- Ability to delegate where appropriate.
- Must be a strong team player/leader.
- Ability to lead data transformation projects with multiple junior data engineers.
- Strong oral, written, and interpersonal skills for interacting at all levels of the organization.
- Ability to communicate complex business problems and technical solutions.

Posted 1 day ago

Apply

5.0 years

15 Lacs

Indore

On-site

Job Description
Out-of-the-box thinker to build innovative AI/ML models:
1. Understand and analyze requirements for Machine Learning models from Product Owners, Customers, and other stakeholders
2. Analyze and verify data quality and features
3. Design solutions by choosing the right algorithms, features, and hyperparameters
4. Manage the full lifecycle of ML models: data acquisition, feature engineering, model development, training, verification, optimization, deployment, versioning
5. Augment enterprise data with publicly available datasets to enrich model features
6. Create strategies for integrating the Whiz.AI platform with external enterprise data sources like databases, data warehouses, analytical stores, external ML systems/algorithms, Hadoop, and ERP/CRM systems

Qualifications

Technical:
- 5+ years of experience in implementing Machine Learning and Deep Learning models applied to traditional as well as NLP problems
- Machine Learning-based models: ANN, SVM, Logistic Regression, Gradient Boosting
- Time series anomaly detection methods; hierarchical or grouped time series forecasting
- Knowledge of BERT, LSTMs, RNNs, and HMMs applied to text classification and text generation problems
- Understanding of ML data processing frameworks like TensorFlow or PyTorch, XGBoost, SciPy, Scikit-Learn, and Apache Spark SQL, and handling big data and databases
- Excellent knowledge of Python programming, NumPy, Pandas, and processing JSON, XML, and CSV files

Non-Technical:
- Good communication and analytical skills
- Self-driven with a strong sense of ownership and urgency

Preferred Qualifications:
- Preference will be given to hands-on Deep Learning and NLP application experience
- Knowledge of analytical/OLAP/columnar databases, the Hadoop ecosystem, and NoSQL databases
- Deep Learning, GANs, Reinforcement Learning
- R programming, MATLAB
- Knowledge of life sciences or pharmaceutical industry datasets

Interested candidates can share their resume at shwetachouhan@valere.io

Job Type: Full-time
Pay: From ₹1,500,000.00 per year
Benefits: Health insurance, paid sick time, paid time off, Provident Fund
Education: Bachelor's (Preferred)
Experience: software development: 1 year (Preferred); HTML5: 1 year (Preferred); total work: 5 years (Preferred)
Work Location: In person
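To ground one of the model families the posting names, here is a minimal scikit-learn sketch of training and evaluating a gradient boosting classifier. The dataset is synthetic; it only illustrates the fit/predict workflow.

```python
# Minimal gradient boosting sketch with scikit-learn on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data standing in for real features.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Two of the hyperparameters the role would tune.
model = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1)
model.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```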

Posted 1 day ago

Apply

0.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka

On-site

About Us
Observe.AI is transforming customer service with AI agents that speak, think, and act like your best human agents—helping enterprises automate routine customer calls and workflows, support agents in real time, and uncover powerful insights from every interaction. With Observe.AI, businesses boost automation, deliver faster, more consistent 24/7 service, and build stronger customer loyalty. Trusted by brands like Accolade, Prudential, Concentrix, Cox Automotive, and Included Health, Observe.AI is redefining how businesses connect with customers—driving better experiences and lasting relationships at every touchpoint.

The Opportunity
We are looking for a Senior Data Engineer with strong hands-on experience in building scalable data pipelines and real-time processing systems. You will be part of a high-impact team focused on modernizing our data architecture, enabling self-serve analytics, and delivering high-quality data products. This role is ideal for engineers who love solving complex data challenges, have a growth mindset, and are excited to work on both batch and streaming systems.

What you'll be doing:
- Build and maintain real-time and batch data pipelines using tools like Kafka, Spark, and Airflow.
- Contribute to the development of a scalable LakeHouse architecture using modern data formats such as Delta Lake, Hudi, or Iceberg.
- Optimize data ingestion and transformation workflows across cloud platforms (AWS, GCP, or Azure).
- Collaborate with Analytics and Product teams to deliver data models, marts, and dashboards that drive business insights.
- Support data quality, lineage, and observability using modern practices and tools.
- Participate in Agile processes (Sprint Planning, Reviews) and contribute to team knowledge sharing and documentation.
- Contribute to building data products for inbound (ingestion) and outbound (consumption) use cases across the organization.

Who you are:
- 5-8 years of experience in data engineering or backend systems with a focus on large-scale data pipelines.
- Hands-on experience with streaming platforms (e.g., Kafka) and distributed processing tools (e.g., Spark or Flink).
- Working knowledge of LakeHouse formats (Delta/Hudi/Iceberg) and columnar storage like Parquet.
- Proficient in building pipelines on AWS, GCP, or Azure using managed services and cloud-native tools.
- Experience in Airflow or similar orchestration platforms.
- Strong in data modeling and optimizing data warehouses like Redshift, BigQuery, or Snowflake.
- Exposure to real-time OLAP tools like ClickHouse, Druid, or Pinot.
- Familiarity with observability tools such as Grafana, Prometheus, or Loki.
- Some experience integrating data with MLOps tools like MLflow, SageMaker, or Kubeflow.
- Ability to work with Agile practices using JIRA and Confluence, and to participate in engineering ceremonies.

Compensation, Benefits and Perks
- Excellent medical insurance options and free online doctor consultations
- Yearly privilege and sick leaves as per the Karnataka S&E Act
- Generous holidays (national and festive), recognition, and parental leave policies
- Learning & Development fund to support your continuous learning journey and professional development
- Fun events to build culture across the organization
- Flexible benefit plans for tax exemptions (i.e. meal card, PF, etc.)

Our Commitment to Inclusion and Belonging
Observe.AI is an Equal Employment Opportunity employer that proudly pursues and hires a diverse workforce.
Observe.AI does not make hiring or employment decisions on the basis of race, color, religion or religious belief, ethnic or national origin, nationality, sex, gender, gender identity, sexual orientation, disability, age, military or veteran status, or any other basis protected by applicable local, state, or federal laws or prohibited by Company policy. Observe.AI also strives for a healthy and safe workplace and strictly prohibits harassment of any kind.

We welcome all people. We celebrate diversity of all kinds and are committed to creating an inclusive culture built on a foundation of respect for all individuals. We seek to hire, develop, and retain talented people from all backgrounds. Individuals from non-traditional backgrounds, historically marginalized, or underrepresented groups are strongly encouraged to apply. If you are ambitious, make an impact wherever you go, and you're ready to shape the future of Observe.AI, we encourage you to apply.
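For the streaming/LakeHouse pattern this posting describes (Kafka into Spark into Delta), here is a minimal Spark Structured Streaming sketch. The broker address, topic, and paths are illustrative assumptions, and the job assumes the Kafka and Delta connectors are on the Spark classpath.

```python
# Minimal Kafka -> Spark Structured Streaming -> Delta sketch.
# Broker, topic, and paths are placeholders for illustration.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-stream").getOrCreate()

# Read a stream of raw events from Kafka; value arrives as bytes.
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "interaction-events")
          .load()
          .select(F.col("value").cast("string").alias("payload"),
                  F.col("timestamp")))

# Append to a bronze Delta table; the checkpoint makes the job restartable.
query = (events.writeStream
         .format("delta")
         .option("checkpointLocation", "/chk/interaction-events")
         .outputMode("append")
         .start("/lake/bronze/interaction_events"))
query.awaitTermination()
```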

Posted 1 day ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana

On-site

Job Title: Databricks Developer / Data Engineer
Duration: 12 months with possible extension
Location: Hyderabad, Telangana (Hybrid; 1-2 days onsite at client location)

Job Summary: We are seeking a highly skilled Databricks Developer / Data Engineer with 5+ years of experience in building scalable data pipelines, managing large datasets, and optimizing data workflows in cloud environments. The ideal candidate will have hands-on expertise in Azure Databricks, Azure Data Factory, and other Azure-native services, playing a key role in enabling data-driven decision-making across the organization.

Key Responsibilities:
- Design, develop, and maintain scalable ETL/ELT pipelines for data ingestion, transformation, and integration
- Work with both structured and unstructured data from a variety of internal and external sources
- Collaborate with data analysts, scientists, and engineers to ensure data quality, integrity, and availability
- Build and manage data lakes, data warehouses, and data models (Azure Databricks, Azure Data Factory, Snowflake, etc.)
- Optimize performance of large-scale batch and real-time processing systems
- Implement data governance, metadata management, and data lineage practices
- Monitor and troubleshoot pipeline issues; perform root cause analysis and proactive resolution
- Automate data validation and quality checks
- Ensure compliance with data privacy, security, and regulatory requirements
- Maintain thorough documentation of architecture, data workflows, and processes

Mandatory Qualifications:
- 5+ years of hands-on experience with: Azure Blob Storage, Azure Data Lake Storage, Azure SQL Database; Azure Logic Apps, Azure Data Factory, Azure Databricks, Azure ML; Azure DevOps Services, Azure API Management, Webhooks
- Intermediate-level proficiency in Python scripting and PySpark
- Basic understanding of Power BI and visualization functionalities

Technical Skills & Experience Required:
- Proficient in SQL and working with both relational and non-relational databases (e.g., SQL, PostgreSQL, MongoDB, Cassandra)
- Hands-on experience with Apache Spark, Hadoop, Hive for big data processing
- Proficiency in building scalable data pipelines using Azure Data Factory and Azure Databricks
- Solid knowledge of cloud-native tools: Delta Lake, Azure ML, Azure DevOps
- Understanding of data modeling, OLAP/OLTP systems, and data warehousing best practices
- Experience with CI/CD pipelines, version control with Git, and working with Azure Repos
- Knowledge of data security, privacy policies, and compliance frameworks
- Excellent problem-solving, troubleshooting, and analytical skills

Posted 1 day ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Company Description
Acronotics Limited specializes in cutting-edge robotic automation and artificial intelligence solutions. By applying human intelligence to build advanced AI-fueled systems, Acronotics transforms businesses with technologies like AI and Robotic Process Automation (RPA). As a consulting and services firm, we are dedicated to creating automated solutions that will redefine how products are made, sold, and consumed. Our mission is to help clients implement and run game-changing robotic automation and artificial intelligence-based solutions. Explore our product, Radium AI, which automates bot monitoring and support activities, on our website: Radium AI.

Role Description
This is a full-time, on-site role for a Data Engineer (Power BI) based in Bengaluru. You will design and manage data pipelines that connect Power BI, OLAP cubes, documents (PDFs, presentations), and external data sources to Azure AI. Your role ensures structured and unstructured financial data is indexed and accessible for semantic search and LLM use.

Key Responsibilities:
- Extract data from Power BI datasets, semantic models, and OLAP cubes.
- Connect and transform data via Azure Synapse, Data Factory, and Lakehouse architecture.
- Preprocess PDFs, PPTs, and Excel files using Azure Form Recognizer or Python-based tools.
- Design data ingestion pipelines for external web sources (e.g., commodity prices).
- Coordinate with AI engineers to feed cleaned and contextual data into vector indexes.

Requirements:
- Strong experience with Power BI REST/XMLA APIs.
- Expertise in OLAP systems (SSAS, SAP BW), data modelling, and ETL design.
- Hands-on experience with Azure Data Factory, Synapse, or Data Lake.
- Familiarity with JSON, DAX, M queries.
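As a concrete example of the Power BI REST extraction this role calls for, here is a minimal sketch using the executeQueries endpoint of the Power BI REST API. The dataset ID, DAX query, and access token are placeholders you would supply; the table name 'Sales' is an assumption for illustration.

```python
# Minimal sketch: run a DAX query against a Power BI dataset via REST.
# Requires an Azure AD access token with dataset read permissions.
import requests

DATASET_ID = "00000000-0000-0000-0000-000000000000"  # placeholder
TOKEN = "<AAD access token>"                         # placeholder

url = f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/executeQueries"
body = {"queries": [{"query": "EVALUATE TOPN(10, 'Sales')"}]}  # sample DAX

resp = requests.post(url, json=body,
                     headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()

# The response nests rows under results -> tables -> rows.
rows = resp.json()["results"][0]["tables"][0]["rows"]
print(rows[:3])
```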

Posted 1 day ago

Apply

6.0 - 10.0 years

0 Lacs

Pune, Maharashtra

On-site

The Applications Development Intermediate Programmer Analyst position at our company involves actively participating in the establishment and implementation of new or updated application systems and programs in collaboration with the Technology team. Your main goal in this role will be to contribute to applications systems analysis and programming activities.

With 6+ years of experience in MicroStrategy SDK development, you will be proficient in Java, JavaScript, jQuery, HTML, CSS, REST APIs, and MicroStrategy. Your responsibilities will include developing schema objects like Attributes, Facts, and Transformations, as well as creating public objects such as filters, prompts, and reports. You will also work on document and dashboard development, intelligent cubes, cube reports, and performance optimization in MSTR and UI technologies. Familiarity with MOLAP, ROLAP, and OLAP concepts, the CI/CD process, Agile development, MSTR iCube automation, the MSTR REST API, Library, and DW concepts will be beneficial. Experience in MSTR version upgrades, collaborating with geographically dispersed teams, and leading small to medium teams of BI developers will be a valuable asset. As a UI developer, your skills in JavaScript, jQuery, HTML, CSS, REST APIs, and the basics of MicroStrategy will be put to good use.

Your qualifications should include a minimum of 6 years of relevant experience in MicroStrategy SDK, SQL, and Web SDK. It is essential that you possess clear and concise written and verbal communication skills, problem-solving abilities, decision-making skills, and the capacity to work under pressure while managing deadlines or unexpected changes in expectations or requirements. A Bachelor's degree, University degree, or equivalent experience is required for this position, which is a full-time role within the Technology job family group, specifically in Applications Development.

If you require a reasonable accommodation due to a disability to use our search tools or apply for a career opportunity, please review Accessibility at Citi. You can also refer to Citi's EEO Policy Statement and the Know Your Rights poster for further information.

Posted 1 day ago

Apply

8.0 - 12.0 years

0 Lacs

Hyderabad, Telangana

On-site

The role of Sr Specialist Visualization & Automation in Hyderabad, India involves defining and leading the platform engineering of business intelligence solutions, specifically focusing on Power BI technology. Your key responsibilities include overseeing the creation and management of BI and analytics solutions using strong Power BI skills. You will drive the success of technology usage for solution delivery, best practices, compliance, and enablement of the business.

Collaboration with the solution architect and platform architect is crucial in defining visualization architecture patterns based on functional and non-functional requirements. You will also be responsible for driving the DevOps roadmap to enable Agile ways of working, the CI/CD pipeline, and automation for self-serve governance of the Power BI platform, in alignment with the platform lead. Ensuring adherence to security and compliance policies and procedures is paramount in this role. You will play a vital role in defining architecture standards, patterns, and platform solutions while upholding Information Security & Compliance (ISC), legal, ethics, and other compliance policies.

The ideal candidate should have 8-10 years of IT experience in data and analytics and visualization, with strong exposure to Power BI solution delivery and platform automation. Proficiency in database management systems, ETL, OLAP, and data lake technologies is required. Experience in Power BI and knowledge of other visualization technologies are advantageous. A specialization in the pharma domain and an understanding of data usage across the enterprise value chain are desirable. Good interpersonal, written, and verbal communication skills are essential, along with the ability to manage vendor and customer expectations. Your technical expertise, understanding of business processes and systems, and commitment to Novartis Values & Behaviors will be critical in this role.

Novartis is committed to creating an inclusive work environment and diverse teams that represent the patients and communities served. By joining Novartis, you will contribute to reimagining medicine to improve and extend people's lives, striving to become the most valued and trusted medicines company globally. If you are passionate about making a difference and want to be part of a community that drives breakthroughs to change patients' lives, consider joining the Novartis Network to explore career opportunities and stay connected with Novartis. Learn more about our commitment to diversity and inclusion and the benefits and rewards we offer to help you thrive both personally and professionally.

Posted 1 day ago

Apply

10.0 years

0 Lacs

Dehradun, Uttarakhand, India

On-site

We are looking for a skilled Data Modeller with strong experience in the Big Data ecosystem, particularly in the Azure Data Platform and Databricks environment. The ideal candidate should have a deep understanding of data modelling principles and hands-on expertise in building models in modern data architectures such as Unity Catalog and Delta Lake.

Key Responsibilities:
- Design and develop conceptual, logical, and physical data models to support enterprise analytics and reporting needs
- Build and manage data models in Unity Catalog within the Databricks environment
- Work across teams to model and structure data in Delta Lake and optimize for performance and reusability
- Collaborate with data engineers, architects, and analysts to ensure models align with data ingestion, transformation, and business reporting workflows
- Translate business requirements into scalable and efficient data designs using best practices in data warehousing and Lakehouse architecture
- Maintain comprehensive documentation, including data dictionaries, data lineage, and metadata
- Implement and support data governance, data quality, and security controls across datasets and platforms

Qualifications and Skills:
- 10+ years of hands-on data modelling experience in the Big Data ecosystem, with a strong understanding of OLTP, OLAP, and dimensional modelling
- Hands-on experience with data modelling techniques like Kimball, Inmon, Data Vault, and dimensional
- Strong proficiency in data modeling tools (e.g., ER/Studio, ERwin, PowerDesigner, dbt, SQLDBM, or Lucidchart)
- Experience building and maintaining data models using Unity Catalog in Databricks
- Proven experience working with the Azure Data Platform, including services like Azure Data Factory, Azure Data Lake, Azure Synapse Analytics, and Azure SQL Database
- Strong proficiency in SQL and Apache Spark for data transformation and querying
- Familiarity with Delta Lake, Parquet, and modern data storage formats
- Knowledge of data cataloging tools such as Azure Purview is a plus
- Excellent problem-solving skills and ability to work in agile and fast-paced environments
- Strong communication skills to articulate data concepts to technical and non-technical stakeholders

Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
- Relevant certifications such as DP-203 (Azure Data Engineer Associate)

About Us:
We're an international team who specialize in building technology products and then helping brands grow with multi-channel demand generation marketing. We have in-house experience working for Fortune companies, e-commerce brands, and technology SaaS companies. We have assisted over a dozen billion-dollar companies with consulting, technology, operations, and digital agency capabilities in managing their unique brand online.

We have a fun and friendly work culture that also encourages employees personally and professionally. EbizON has many values that are important to our success as a company: integrity, creativity, innovation, mindfulness, and teamwork. We thrive on the idea of making life better for people by providing them with peace of mind. The people here love what they do because everyone, from management on down, understands how much it means to live up to someone's ideals, which makes each day feel less stressful knowing each person has somebody cheering them on.
Equal Opportunity Employer: EbizON is committed to providing equal opportunity for all employees, and we will consider any qualified applicant without regard to race or other prohibited characteristics.

Flexible Timings: Flexible working hours are the new normal. We at EbizON believe in giving employees the freedom to choose when to work and how to work. It helps them thrive and also balance their life better.

Global Clients Exposure: Our goal is to provide excellent customer service, and we want our employees to work closely with clients from around the world. That's why you'll find us working closely with clients from around the world through Microsoft Teams, Zoom, and other video conferencing tools.

Retreats & Celebrations: With annual retreats, quarterly town halls, and festive celebrations, we have a lot of opportunities to get together.

Posted 2 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

We are seeking a skilled Senior Data Modeller to design, implement, and maintain conceptual, logical, and physical data models that support enterprise information management and business intelligence efforts. The ideal candidate will collaborate with business analysts, data architects, and developers to ensure high-quality data models that meet both business and technical requirements.

• Skills: GCP, data modelling (OLTP, OLAP), indexing, DBSchema, CloudSQL, BigQuery
• Hands-on data modelling for OLTP and OLAP systems.
• In-depth knowledge of conceptual, logical, and physical data modelling.
• Strong understanding of indexing, partitioning, and data sharding, with practical experience of having done the same.
• Strong understanding of the variables impacting database performance for near-real-time reporting and application interaction.
• Should have working experience on at least one data modelling tool, preferably DBSchema.
• People with functional knowledge of the mutual fund industry will be a plus.
• Good understanding of GCP databases like AlloyDB, CloudSQL, and BigQuery.
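Since the posting stresses partitioning and BigQuery, here is a minimal sketch of how those physical-model choices might be expressed as BigQuery DDL. The project, dataset, and table names are illustrative assumptions; running it requires the google-cloud-bigquery package and GCP credentials.

```python
# Sketch: create a partitioned, clustered BigQuery table via DDL.
# Names are placeholders; uses application-default credentials.
from google.cloud import bigquery

client = bigquery.Client()

ddl = """
CREATE TABLE IF NOT EXISTS `my_project.analytics.fund_transactions` (
  txn_id   STRING,
  fund_id  STRING,
  txn_ts   TIMESTAMP,
  amount   NUMERIC
)
PARTITION BY DATE(txn_ts)   -- prunes scans for near-real-time reporting
CLUSTER BY fund_id          -- co-locates rows on a common filter key
"""
client.query(ddl).result()  # blocks until the DDL job completes
```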

Posted 2 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Required Skills
- Strong working knowledge of modern programming languages, ETL/data integration tools (preferably SnapLogic), and an understanding of cloud concepts.
- SSL/TLS, SQL, REST, JDBC, JavaScript, JSON.
- Strong hands-on experience in SnapLogic design/development.
- Good working experience using various snaps for JDBC, SAP, Files, REST, SOAP, etc.
- Good to have the ability to build complex mappings with JSON path expressions, flat files, and Python scripting.
- Good to have experience in Groundplex and Cloudplex integrations.
- Should be able to deliver the project by leading a team of 6-8 members.
- Should have experience in integration projects with heterogeneous landscapes.
- Experience in one or more RDBMS (Oracle, DB2, SQL Server, PostgreSQL, and Redshift).
- Real-time experience working with OLAP & OLTP database models (dimensional models).

Posted 2 days ago

Apply

15.0 years

0 Lacs

Hyderābād

On-site

Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must have skills: Microsoft Azure Analytics Services
Good to have skills: NA
Minimum experience: 12 years
Educational Qualification: 15 years full-time education

Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the architecture aligns with business needs and technical specifications. You will collaborate with various teams to ensure that data flows seamlessly and efficiently throughout the organization, while also addressing any challenges that arise in the data management process. Your role will be pivotal in shaping the data landscape of the organization, enabling informed decision-making and strategic planning.

Roles & Responsibilities:
A. Function as the Lead Data Architect for a small, simple project/proposal, or as a team lead for a medium/large-sized project or proposal
B. Discuss specific Big Data architecture and related issues with the client architect/team (in area of expertise)
C. Analyze and assess the impact of the requirements on the data and its lifecycle
D. Lead Big Data architecture and design of medium-to-large cloud-based, Big Data, and analytical solutions using Lambda architecture
E. Breadth of experience in various client scenarios and situations
F. Experienced in Big Data architecture-based sales and delivery
G. Thought leadership and innovation
H. Lead creation of new data assets & offerings
I. Experience in handling OLTP and OLAP data workloads

Professional & Technical Skills:
A. Strong experience in Azure is preferred, with hands-on experience in two or more of these skills: Azure Synapse Analytics, Azure HDInsight, Azure Databricks with PySpark / Scala / SparkSQL, Azure Analysis Services
B. Experience in one or more real-time/streaming technologies, including Azure Stream Analytics, Azure Data Explorer, Azure Time Series Insights, etc.
C. Experience in handling medium to large Big Data implementations
D. Candidate must have 15 years of IT experience, including around 5 years of extensive Big Data experience (design + build)

Additional Information:
A. Should be able to drive technology design meetings and propose technology design and architecture
B. Should have excellent client communication skills
C. Should have good analytical and problem-solving skills

Posted 2 days ago

Apply

1.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Years of Experience: 1 - 3 years
Location: Noida, Indore

Requisition Description:
- Experience in writing and troubleshooting SQL queries
- Proficient in database and data warehousing concepts
- Proven hands-on experience in designing, developing, and supporting database projects for analysis
- Good written and verbal communication skills
- Knowledge and/or experience of the following will be an added advantage: MDX/DAX; database design techniques; data modeling; SSAS; Spark processing; the Hadoop ecosystem or AWS, Azure, or GCP clusters and processing; Hive; Redshift or Snowflake; Linux systems; Tableau, MicroStrategy, Power BI, or any BI tools; programming in Python, Java, or shell script

Roles and Responsibilities:
- Interact with the senior-most technical and business people of large enterprises to understand their analytics strategy and their problem statements in that area
- Understand the customer domain and database schema
- Design OLAP semantic models and dashboards
- Be the go-to person for customers regarding technical issues during the project
- Report task status efficiently to stakeholders and customers
- Be willing to work off-hours to meet timelines
- Be willing to travel or relocate as per project requirements
- Be willing to work on different technologies

Posted 2 days ago

Apply

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Description: QA Automation Engineer

As a QA Automation Engineer specializing in Data Warehousing, you will play a critical role in ensuring that our data solutions are of the highest quality. You will work closely with data engineers and analysts to develop, implement, and maintain automated testing frameworks for data validation, ETL processes, data quality, and integration. Your work will ensure that data is accurate, consistent, and performs optimally across our data warehouse systems.

Responsibilities
- Develop and Implement Automation Frameworks: Design, build, and maintain scalable test automation frameworks tailored for data warehousing environments.
- Test Strategy and Execution: Define and execute automated test strategies for ETL processes, data pipelines, and database integration across a variety of data sources.
- Data Validation: Implement automated tests to validate data consistency, accuracy, completeness, and transformation logic.
- Performance Testing: Ensure that the data warehouse systems meet performance benchmarks through automation tools and load testing strategies.
- Collaborate with Teams: Work closely with data engineers, software developers, and data analysts to understand business requirements and design tests accordingly.
- Continuous Integration: Integrate automated tests into the CI/CD pipelines, ensuring that testing is part of the deployment process.
- Defect Tracking and Reporting: Use defect-tracking tools (e.g., JIRA) to log and track issues found during automated testing, ensuring that defects are resolved in a timely manner.
- Test Data Management: Develop strategies for handling large volumes of test data while maintaining data security and privacy.
- Tool and Technology Evaluation: Stay current with emerging trends in automation testing for data warehousing and recommend tools, frameworks, and best practices.

Requirements and Skills
· At least 4+ years of experience, with a solid understanding of data warehousing concepts (ETL, OLAP, data marts, data vault, star/snowflake schemas, etc.)
· Proven experience in building and maintaining automation frameworks using tools like Python, Java, or similar, with a focus on database and ETL testing
· Strong knowledge of SQL for writing complex queries to validate data, test data pipelines, and check transformations
· Experience with ETL tools (e.g., Matillion, Qlik Replicate) and their testing processes
· Performance testing
· Experience with version control systems like Git
· Strong analytical and problem-solving skills, with the ability to troubleshoot complex data issues
· Strong communication and collaboration skills
· Attention to detail and a passion for delivering high-quality solutions
· Ability to work in a fast-paced environment and manage multiple priorities
· Enthusiasm for learning new technologies and frameworks

Experience with the following tools and technologies is desired: Qlik Replicate, Matillion ETL, Snowflake, Data Vault warehouse design, Power BI, Azure Cloud (including Logic Apps, Azure Functions, ADF).
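To make the data-validation work concrete, here is a minimal pytest-style sketch of two common ETL checks: row-count reconciliation and a transformation rule. An in-memory SQLite database stands in for the source and target systems, and all table and column names are invented for the example.

```python
# Minimal pytest sketch of automated ETL validation checks.
# SQLite stands in for real source/target warehouses.
import sqlite3

import pytest


@pytest.fixture
def db():
    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE src_orders (id INT, amount_cents INT);
    CREATE TABLE tgt_orders (id INT, amount_dollars REAL);
    INSERT INTO src_orders VALUES (1, 1050), (2, 200);
    INSERT INTO tgt_orders VALUES (1, 10.50), (2, 2.00);
    """)
    return con


def test_row_counts_match(db):
    # Completeness: every source row should have landed in the target.
    src = db.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
    tgt = db.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
    assert src == tgt


def test_amount_transformation(db):
    # Transformation logic: cents in the source become dollars downstream.
    rows = db.execute("""
        SELECT s.amount_cents, t.amount_dollars
        FROM src_orders s JOIN tgt_orders t ON s.id = t.id
    """).fetchall()
    for cents, dollars in rows:
        assert abs(cents / 100.0 - dollars) < 1e-9
```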

Posted 2 days ago

Apply

10.0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site

Requirements (AWS Data Engineer)
- 3 - 10 years of strong Python or Java data engineering experience
- Experience in developing data pipelines that process large volumes of data using Python, PySpark, Pandas, etc., on AWS
- Experience in developing ETL, OLAP-based, and analytical applications
- Experience in ingesting batch and streaming data from various data sources
- Strong experience in writing complex SQL using any RDBMS (Oracle, PostgreSQL, SQL Server, etc.)
- Exposure to the AWS platform's data services (AWS Lambda, Glue, Athena, Redshift, Kinesis, etc.)
- Experience with Airflow DAGs, AWS EMR, S3, IAM, and other services
- Experience writing test cases using pytest, unittest, or any other framework
- Snowflake or Redshift data warehouses
- Experience with DevOps and CI/CD tools
- Familiarity with REST APIs
- Experience with CI/CD pipelines, branching strategies, and Git for code management
- Bachelor's degree in computer science, information technology, or a similar field
- You will need to be well spoken and have an easy time establishing productive, long-lasting working relationships with a large variety of stakeholders
- Take the lead on data pipeline design with strong analytical skills and a keen eye for detail to really understand and tackle the challenges businesses are facing
- You will be confronted with a large variety of data engineering tools and other new technologies, as well as a wide variety of IT, compliance, and security related issues
- Design and develop world-class technology solutions to solve business problems across multiple client engagements
- Collaborate with other teams to understand business requirements, client infrastructure, platforms, and overall strategy to ensure seamless transitions
- Work closely with the AI and Analytics team to build world-class solutions and to define AI strategy
- You will possess strong logical structuring and problem-solving skills, with an expert-level understanding of databases and an inherent desire to turn data into actions
- Strong verbal, written, and presentation skills
- Comfortable working in Agile projects
- Clear and precise communication skills
- Ability to quickly learn and develop expertise in existing highly complex applications and architectures
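For the orchestration experience listed above (Airflow DAGs), here is a minimal Airflow sketch of a daily batch pipeline. The DAG ID and task bodies are illustrative placeholders; the `schedule` argument assumes Airflow 2.4 or later.

```python
# Minimal Airflow DAG sketch: a daily extract -> transform -> load chain.
# Task bodies are stubs; in practice they would call real pipeline code.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull the day's batch from the source system")


def transform():
    print("clean and aggregate the extracted data")


def load():
    print("load the results into the warehouse")


with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)

    t1 >> t2 >> t3  # linear dependency chain
```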

Posted 2 days ago

Apply

2.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

We are looking for a Data Engineer with 2+ years of experience to set up, build, deploy and maintain scalable and reliable data assets and reporting/dashboarding solutions, ensuring a compelling user experience and insights that drive real business value.

Job Description

In your new role you will:

Set up, build, deploy and maintain scalable and reliable data assets and reporting/dashboarding solutions, ensuring a compelling user experience and insights that drive real business value.
Improve the speed and quality of data provisioning and data usage, e.g. by effectively combining data from different sources, across platforms and different formats (see the short sketch at the end of this listing).
Ensure scalability, robustness and sustainable usage of your developed solutions.
Team up with our domain, IT and methods experts to capture the full value of our data, e.g. by ensuring optimal usage of existing tools, resources and business needs and by working in tandem on complex analytics cases.

Your Profile

You are best equipped for this task if you have:

A degree in Information Technology, Business Informatics, Computer Science or a related field of studies.
At least 1-3 years of relevant work experience related to Data Engineering, Data Visualization and/or Data Analytics.
Solid expertise in database concepts (e.g. DWH, Hadoop/Big Data, OLAP, MS Access) and related query languages (e.g. SQL, MDX).
Solid expertise in visualization of complex/big data, using Tableau and Tableau Prep.
Ability to translate complex business needs into concrete actions.
Ability to work both independently and within a team.
Fluent in English.

Contact: Shavin.shashidhar@infineon.com

#WeAreIn for driving decarbonization and digitalization. As a global leader in semiconductor solutions in power systems and IoT, Infineon enables game-changing solutions for green and efficient energy, clean and safe mobility, as well as smart and secure IoT. Together, we drive innovation and customer success, while caring for our people and empowering them to reach ambitious goals. Be a part of making life easier, safer and greener. Are you in?

We are on a journey to create the best Infineon for everyone. This means we embrace diversity and inclusion and welcome everyone for who they are. At Infineon, we offer a working environment characterized by trust, openness, respect and tolerance and are committed to giving all applicants and employees equal opportunities. We base our recruiting decisions on the applicant's experience and skills. Please let your recruiter know if they need to pay special attention to something in order to enable your participation in the interview process. Click here for more information about Diversity & Inclusion at Infineon.
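
As a small illustration of the "combining data from different sources, across platforms and different formats" responsibility, here is a hedged pandas sketch joining a CSV-style extract with a relational table. The datasets, column names, and the SQLite stand-in are invented purely to keep the example self-contained.

    import io
    import sqlite3
    import pandas as pd

    # Source 1: a CSV-style extract (inlined so the sketch is self-contained)
    orders = pd.read_csv(io.StringIO("order_id,region,amount\n1,EMEA,100\n2,APAC,80\n"))

    # Source 2: a relational table, as it might come from a DWH (SQLite stands in)
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE regions (region TEXT, manager TEXT)")
    conn.executemany("INSERT INTO regions VALUES (?, ?)",
                     [("EMEA", "A. Schmidt"), ("APAC", "L. Tan")])
    regions = pd.read_sql_query("SELECT * FROM regions", conn)

    # Combine across formats and aggregate into a dashboard-ready view
    combined = orders.merge(regions, on="region")
    print(combined.groupby("manager")["amount"].sum())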

Posted 2 days ago

Apply

12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

About Team: Enterprise Business Services is invested in building a compact, robust organization that includes service operations and technology solutions for Finance, People, and Associate Digital Experience. Our team is responsible for the design and development of solutions that know our consumers' needs better than ever by predicting what they want based on unconstrained demand, and efficiently unlock strategic growth, economic profit, and wallet share by orchestrating intelligent, connected planning and decisioning across all functions. We interact with multiple teams across the company to provide scalable, robust technical solutions. This role will play a crucial role in overseeing the planning, execution and delivery of complex projects within the team.

Walmart's Enterprise Business Services (EBS) is a powerhouse of several exceptional teams delivering world-class technology solutions and services making a profound impact at every level of Walmart. As a key part of Walmart Global Tech, our teams set the bar for operational excellence and leverage emerging technology to support millions of customers, associates, and stakeholders worldwide. Each time an associate turns on their laptop, a customer makes a purchase, a new supplier is onboarded, the company closes the books, physical and legal risk is avoided, and when we pay our associates consistently and accurately, that is EBS. Joining EBS means embarking on a journey of limitless growth, relentless innovation, and the chance to set new industry standards that shape the future of Walmart.

What you'll do:

Manage a high-performing team of 8-10 engineers who work across multiple technology stacks, including Java and Mainframe.
Drive design, development, implementation and documentation.
Establish best engineering and operational excellence practices based on product, engineering and scrum metrics.
Interact with Walmart engineering teams across geographies to leverage expertise and contribute to the tech community.
Engage with Product and Business stakeholders to drive the agenda, set the priorities and deliver scalable and resilient products.
Work closely with the Architects and cross-functional teams and follow established practices for the delivery of solutions meeting QCD (Quality, Cost & Delivery) within the established architectural guidelines.
Work with senior leadership to chart out the future roadmap of the products.
Participate in hiring, mentoring and building high-performing agile teams.
Participate in organizational events like hackathons, demo days, etc., and be the catalyst for the success of those events.
Interact closely on requirements with business owners and technical teams, both within India and across the globe.

What you'll bring:

Bachelor's/Master's degree in Computer Science, engineering, or a related field, with a minimum of 12 years of experience in software development and at least 5 years of experience managing engineering teams.
Prior experience in managing high-performing agile technology teams.
Hands-on experience building Java-based backend systems is a must, and experience working on cloud-based solutions is desirable.
Proficiency in JavaScript, NodeJS, ReactJS and NextJS.
A good understanding of CS fundamentals, microservices, data structures, algorithms and problem solving.
Exposure to CI/CD development environments/tools including, but not limited to, Git, Maven and Jenkins.
Strength in writing modular and testable code and test cases (unit, functional and integration) using frameworks like JUnit, Mockito, and MockMvc.
Experience with microservices architecture and a good understanding of distributed concepts, common design principles, design patterns and cloud-native development concepts.
Hands-on experience with Spring Boot, concurrency, garbage collection, RESTful services, data caching services and ORM tools.
Experience working with relational databases and writing complex OLAP, OLTP and SQL queries.
Experience working with NoSQL databases like Cosmos DB.
Experience working with caching technologies like Redis, Memcached or other related systems.
Good knowledge of pub/sub systems like Kafka.
Experience using monitoring and alerting tools like Prometheus, Splunk and other related systems, and excellent debugging and troubleshooting skills.
Exposure to containerization tools like Docker, Helm and Kubernetes.
Knowledge of public cloud platforms like Azure, GCP, etc. will be an added advantage.

About Walmart Global Tech

Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That's what we do at Walmart Global Tech. We're a team of software engineers, data scientists, cybersecurity experts and service professionals within the world's leading retailer who make an epic impact and are at the forefront of the next retail disruption. People are why we innovate, and people power our innovations. We are people-led and tech-empowered. We train our team in the skillsets of the future and bring in experts like you to help us grow. We have roles for those chasing their first opportunity as well as those looking for the opportunity that will define their career. Here, you can kickstart a great career in tech, gain new skills and experience for virtually every industry, or leverage your expertise to innovate at scale, impact millions and reimagine the future of retail.

Flexible, hybrid work

We use a hybrid way of working, with primary in-office presence coupled with an optimal mix of virtual presence. We use our campuses to collaborate and be together in person, as business needs require and for development and networking opportunities. This approach helps us make quicker decisions, remove location barriers across our global team and be more flexible in our personal lives.

Benefits

Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include a host of best-in-class benefits: maternity and parental leave, PTO, health benefits, and much more.

Belonging

We aim to create a culture where every associate feels valued for who they are, rooted in respect for the individual. Our goal is to foster a sense of belonging, to create opportunities for all our associates, customers and suppliers, and to be a Walmart for everyone. At Walmart, our vision is "everyone included." By fostering a workplace culture where everyone is—and feels—included, everyone wins. Our associates and customers reflect the makeup of all 19 countries where we operate. By making Walmart a welcoming place where all people feel like they belong, we're able to engage associates, strengthen our business, improve our ability to serve customers, and support the communities where we operate.

Equal Opportunity Employer

Walmart, Inc., is an Equal Opportunities Employer – By Choice. We believe we are best equipped to help our associates, customers and the communities we serve live better when we really know them. That means understanding, respecting and valuing unique styles, experiences, identities, ideas and opinions – while being inclusive of all people.

Posted 2 days ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Company Description

Acronotics Limited specializes in building cutting-edge robotic automation and artificial intelligence solutions. We focus on modern-age automation technologies such as AI and RPA, aiming to transform how businesses operate. Our mission is to help clients design, develop, implement, and run transformative robotic automation and AI-based solutions. Our product, Radium AI, automates bot monitoring and support activities, delivering frictionless digital worker support driven by AI.

Role Description

This is a full-time, on-site role located in Bengaluru for a Business Analyst - Corporate Finance. We are looking for someone with a strong background in delivering IT projects for finance functions in large enterprises. You will act as a critical liaison between our client's finance teams and our AI/technology teams to ensure that solutions are aligned with financial reporting requirements, KPIs, and business goals. This is a client-facing role requiring both domain expertise in corporate finance and strong business analysis skills.

Responsibilities

Work closely with client finance teams to gather, understand, and document financial processes, KPIs, and reporting needs.
Translate business requirements into clear, structured documentation for AI engineers, data scientists, and solution architects.
Facilitate workshops, interviews, and process walkthroughs with finance stakeholders (e.g., FP&A, Controlling, Tax, Treasury).
Define and prioritise business requirements and user stories for AI and data-driven solutions.
Analyse existing financial data sources (ERP systems like SAP, BI tools like Power BI, OLAP cubes) and propose integration/data flow approaches.
Collaborate with AI/ML and software engineering teams to ensure accurate understanding of financial concepts and requirements.
Support the validation, testing, and rollout of AI-based tools by ensuring they meet user expectations and financial accuracy standards.

Qualifications

5+ years of experience as a Business Analyst, with at least 2+ years in finance-related IT projects.
Strong understanding of corporate finance functions and KPIs (e.g., P&L, working capital, cost allocation, budgeting, forecasting); a small worked KPI example follows below.
Experience working in an IT services, consulting, or digital transformation environment.
Familiarity with ERP systems (e.g., SAP, Oracle), BI platforms (e.g., Power BI, Tableau), and financial data structures.
Ability to create process maps, functional specifications, and business requirements documents.
Strong stakeholder management and communication skills, especially with finance professionals and technical teams.
Experience with Agile/Scrum methodologies is preferred.
Exposure to AI/ML, analytics, or automation projects is a plus.
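
To make the KPI side of this role concrete, here is a small sketch computing one of the KPIs named above, working capital (current assets minus current liabilities), from a hypothetical balance-sheet extract; the account names and figures are invented.

    import pandas as pd

    # Hypothetical balance-sheet extract; in practice this would come from the ERP
    balance = pd.DataFrame({
        "account":  ["cash", "receivables", "inventory", "payables", "short_term_debt"],
        "category": ["current_asset", "current_asset", "current_asset",
                     "current_liability", "current_liability"],
        "amount":   [120_000, 80_000, 50_000, 90_000, 30_000],
    })

    totals = balance.groupby("category")["amount"].sum()
    working_capital = totals["current_asset"] - totals["current_liability"]
    print(f"Working capital: {working_capital:,}")  # 130,000

The business analyst's job is to pin down exactly this kind of definition (which accounts count as current, at which snapshot date) before engineers automate it.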

Posted 2 days ago

Apply

Exploring OLAP Jobs in India

With the increasing demand for data analysis and business intelligence, OLAP (Online Analytical Processing) jobs have become popular in India. OLAP professionals are responsible for designing, building, and maintaining OLAP databases to support data analysis and reporting activities for organizations. If you are looking to pursue a career in OLAP in India, here is a comprehensive guide to help you navigate the job market.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Chennai

These cities are known for having a high concentration of IT companies and organizations that require OLAP professionals.

Average Salary Range

The average salary range for OLAP professionals in India varies based on experience levels. Entry-level professionals can expect to earn around INR 4-6 lakhs per annum, while experienced professionals with 5+ years of experience can earn upwards of INR 12 lakhs per annum.

Career Path

Career progression in OLAP typically follows a trajectory from Junior Developer to Senior Developer, and then to a Tech Lead role. As professionals gain experience and expertise in OLAP technologies, they may also explore roles such as Data Analyst, Business Intelligence Developer, or Database Administrator.

Related Skills

In addition to OLAP expertise, professionals in this field are often expected to have knowledge of SQL, data modeling, ETL (Extract, Transform, Load) processes, data warehousing concepts, and data visualization tools such as Tableau or Power BI.
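
Since ETL appears alongside OLAP in nearly every listing above, a minimal extract-transform-load sketch may help you prepare; the inline CSV and in-memory SQLite target are stand-ins chosen purely to keep the example self-contained.

    import csv, io, sqlite3

    # Extract: parse a small inline CSV (stand-in for a real source file or API)
    raw = "product,qty,price\nwidget,3,9.5\ngadget,0,4.5\n"
    rows = list(csv.DictReader(io.StringIO(raw)))

    # Transform: drop zero-quantity rows and derive a revenue measure
    cleaned = [
        {"product": r["product"], "revenue": int(r["qty"]) * float(r["price"])}
        for r in rows if int(r["qty"]) > 0
    ]

    # Load: write into a warehouse-style table (in-memory SQLite as a stand-in)
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (product TEXT, revenue REAL)")
    conn.executemany("INSERT INTO sales VALUES (:product, :revenue)", cleaned)
    print(conn.execute("SELECT * FROM sales").fetchall())  # [('widget', 28.5)]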

Interview Questions

  • What is OLAP and how does it differ from OLTP? (basic)
  • Explain the difference between a star schema and a snowflake schema. (medium)
  • How do you optimize OLAP queries for performance? (advanced)
  • What is the role of aggregation functions in OLAP databases? (basic)
  • Can you explain the concept of drill-down in OLAP? (medium)
  • How do you handle slowly changing dimensions in OLAP databases? (advanced)
  • What are the advantages of using a multidimensional database over a relational database for OLAP purposes? (medium)
  • Describe your experience with OLAP tools such as Microsoft Analysis Services or Oracle OLAP. (basic)
  • How do you ensure data consistency in an OLAP environment? (medium)
  • What are some common challenges faced when working with OLAP databases? (advanced)
  • Explain the concept of data cubes in OLAP. (basic)
  • How do you approach designing a data warehouse for OLAP purposes? (medium)
  • Can you discuss the importance of indexing in OLAP databases? (advanced)
  • How do you handle missing or incomplete data in OLAP analysis? (medium)
  • What are the key components of an OLAP system architecture? (basic)
  • How do you troubleshoot performance issues in OLAP queries? (advanced)
  • Have you worked with real-time OLAP systems? If so, can you explain the challenges involved? (medium)
  • What are the limitations of OLAP compared to other data analysis techniques? (advanced)
  • How do you ensure data security in an OLAP environment? (medium)
  • Have you implemented any data mining algorithms in OLAP systems? If so, can you provide an example? (advanced)
  • How do you approach designing dimensions and measures in an OLAP cube? (medium)
  • What are some best practices for OLAP database design? (advanced)
  • How do you handle concurrent user access in an OLAP environment? (medium)
  • Can you explain the concept of data slicing and dicing in OLAP analysis? (basic; see the worked sketch after this list)
  • What are your thoughts on the future of OLAP technologies in the era of big data and AI? (advanced)
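
Several of the questions above (data cubes, drill-down, and slicing and dicing) can be demonstrated concretely with a minimal pandas sketch over an invented sales fact table: the pivot table plays the role of a small cube, filtering one dimension is a slice, filtering several dimensions is a dice, and moving from yearly to quarterly totals is a drill-down.

    import pandas as pd

    # Invented fact table: dimensions (year, quarter, region, product) plus a measure
    sales = pd.DataFrame({
        "year":    [2023, 2023, 2023, 2024, 2024, 2024],
        "quarter": ["Q1", "Q2", "Q1", "Q1", "Q2", "Q2"],
        "region":  ["EMEA", "EMEA", "APAC", "APAC", "EMEA", "APAC"],
        "product": ["widget", "gadget", "widget", "gadget", "widget", "widget"],
        "amount":  [100, 150, 80, 120, 90, 60],
    })

    # A small "cube": the measure aggregated over two dimensions
    cube = sales.pivot_table(index="region", columns="year",
                             values="amount", aggfunc="sum")

    # Slice: fix one dimension (region == "EMEA")
    emea = sales[sales["region"] == "EMEA"]

    # Dice: restrict several dimensions at once to a sub-cube
    diced = sales[(sales["region"] == "EMEA") & (sales["year"] == 2023)]

    # Drill-down: move from yearly totals to quarterly detail
    by_year = sales.groupby("year")["amount"].sum()
    by_quarter = sales.groupby(["year", "quarter"])["amount"].sum()
    print(cube, by_year, by_quarter, sep="\n\n")

Mapping each of these operations back to the equivalent SQL GROUP BY (or MDX expression) is a useful way to show depth in an interview.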

Closing Remark

As you prepare for OLAP job interviews in India, make sure to hone your technical skills, brush up on industry trends, and showcase your problem-solving abilities. With the right preparation and confidence, you can successfully land a rewarding career in OLAP in India. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
