8.0 - 13.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Join our Team

About this opportunity: Ericsson's Automation Chapter is seeking a highly motivated and self-driven Data Engineer and Senior Data Engineer with strong expertise in SAP HANA and SAP BODS. The ideal candidates will focus on SAP-centric development and integration, ensuring that enterprise data flows are robust, scalable, and optimized for analytics consumption. You will collaborate with a high-performing team that builds and supports end-to-end data solutions aligned with our SAP ecosystem. You are an adaptable and flexible problem-solver with deep hands-on experience in HANA modeling and ETL workflows, capable of switching contexts across projects of varying scale and complexity.

What you will do:
- Design, develop, and optimize SAP HANA objects such as Calculation Views, SQL procedures, and custom functions.
- Develop robust and reusable ETL pipelines using SAP BODS for both SAP and third-party system integration.
- Enable seamless data flow between SAP ECC and external platforms, ensuring accuracy and performance.
- Collaborate with business analysts, architects, and integration specialists to translate requirements into technical deliverables.
- Tune and troubleshoot HANA and BODS jobs for performance, scalability, and maintainability.
- Ensure compliance with enterprise data governance, lineage, and documentation standards.
- Support ongoing enhancements, production issues, and business-critical data deliveries.

What you bring:
- 8+ years of experience in SAP data engineering roles.
- Strong hands-on experience in SAP HANA (native development, modeling, SQL scripting).
- Proficiency in SAP BODS, including job development, data flows, and integration techniques.
- Experience working with SAP ECC data structures, IDocs, and remote function calls.
- Knowledge of data warehouse concepts, data modeling, and performance optimization techniques.
- Strong debugging and analytical skills, with the ability to independently drive technical solutions.
- Familiarity with version control tools and SDLC processes.
- Excellent communication skills and the ability to work collaboratively in cross-functional teams.

Education: Bachelor's degree in Computer Science, Information Systems, Electronics and Communication, or a related field.

Primary country and city: India (IN) || Bangalore. Req ID: 770551
Posted 2 weeks ago
10.0 - 15.0 years
40 - 45 Lacs
Coimbatore
Work from Office
Skills: Data Modeling, Data Vault

Digital to and from the core is the DNA of Colruyt Group. We have been pioneering in adapting IT technologies since the advent of the IT revolution. Digital transformation is the core element in realizing the Colruyt Group Strategic Plan. As part of the Data and Analytics team at Colruyt, you will have an opportunity to work with the latest methodologies, such as Data Vault, and tools such as VaultSpeed, and to design data products. We are looking for an individual with passion, innovation, and out-of-the-box thinking to join our team.

Responsibilities:
- Collaborate with business partners to understand data products.
- Analyse the requirements and translate them into data products and info marts.
- Analyse and create dimensional data models using the Kimball methodology.
- Coach and train the existing team in the BI business analysis space.
- Play a key role in aligning with and contributing to the realization of Colruyt Group's strategic data ambition.

Profile:
- 10+ years of overall experience in Business Intelligence.
- At least 6+ years of working experience in business requirements analysis in Business Intelligence.
- Expertise in working with Information Stewards, Information Modelers, Information Coaches, and business users to capture and describe business requirements for creating Raw and Business Vaults.
- Expertise in the Data Vault 2.0 methodology to create Raw and Business Vaults.
- Expertise in the VaultSpeed tool to create vaults.
- Expertise in creating data models using the Kimball methodology.
- Proficient in understanding DWH databases, preferably Sybase IQ.
- Good knowledge of analysing and writing complex SQL efficiently.
- Expertise in solving performance issues and tuning queries for SQL generated by the VaultSpeed tool.
- Working with the Cloudera platform is an advantage.
- Prior retail domain knowledge is an advantage.
- Excellent communication, an analytical mindset, and good teamwork with exceptional stakeholder management.
Posted 2 weeks ago
2.0 - 7.0 years
4 - 8 Lacs
Hyderabad
Work from Office
The Data Management Analyst will play an instrumental role in supporting ongoing projects related to data integrity, consumer advocacy, related analytics, and accuracy. The analyst will work with stakeholders to identify opportunities for data accuracy and business process re-engineering, and provide insights to improve data management. You will report to a Senior Manager. You are required to work from Hyderabad, as this is a hybrid role (2 days work from office).

Key Responsibilities:
- Identify, analyze, and interpret trends and patterns in core Consumer, Clarity and Rent bureau, and Ops processing data to help make business decisions.
- Design new analytical workflows and processes, and/or optimize existing workflows, with the goal of streamlining processes and enabling other analysts to self-service analytics.
- Convert high-level business requirements into clear technical specifications, process flow diagrams, and queries.
- Effectively summarize and present actionable insights and recommendations to the management team. Be a great storyteller!
- Consult with internal clients on data quality issues and partner with them to set up remediation and monitoring programs.
- Engage with internal teams such as data operations, data governance, compliance, audit, product development, and the consumer assistance center, and gather requirements for business process re-engineering and improving data accuracy.

Experience and Skills:
- Bachelor's degree in Data Science, Engineering, Computer Science, Information Management, Statistics, a related field, or equivalent experience is required.
- 2+ years of experience in data analytics roles.
- Expertise in SQL and one of the databases such as SQL Server, MySQL, or Aurora is required.
- Experience analyzing large datasets and familiarity with an analytical tool such as Alteryx, Python, SAS, R, or equivalent is required.
- Experience working with BI tools such as Tableau and Qlik, and with MS Office tools.
- Experience with Metro2 data quality, public records, credit inquiries, and consumer disputes.
- Experience with data modeling, GenAI, machine learning, and tools such as Python, Spark, Athena, and Hive is desirable.
- Ability to navigate a rather complex business environment, and willingness to learn new business processes, tools, and techniques.

Benefits: Experian cares for employees' work-life balance, health, safety, and wellbeing. In support of this endeavor, we offer best-in-class family well-being benefits, enhanced medical benefits, and paid time off.

#LI-Hybrid This is a hybrid/in-office role. Find out what it's like to work for Experian by clicking here.
Posted 2 weeks ago
2.0 - 6.0 years
3 - 6 Lacs
Hyderabad, Pune
Work from Office
Core Responsibilities:
- Design and develop data pipelines and workflows within Palantir Foundry.
- Build and manage Ontology objects to enable semantic reasoning over data.
- Configure and maintain data integrations from various sources (JDBC, SFTP, APIs).
- Write clean, efficient code using Python, Java, Scala, and SQL.
- Collaborate with data scientists, analysts, and business stakeholders to translate requirements into technical solutions.
- Perform data quality checks and ensure reliability across Foundry applications.
- Participate in code reviews and agile ceremonies, and contribute to best practices.
- Troubleshoot and resolve issues in Foundry applications and pipelines.
- Stay updated on Foundry features, AIP capabilities, and platform enhancements.

Requirements:
- 3-6 years of experience in software development or data engineering.
- 2+ years of hands-on experience with Palantir Foundry.
- Strong understanding of data modeling, ETL processes, and data warehousing.
- Proficiency in Python, Java, or Scala.
- Experience with SQL and relational databases.
- Familiarity with cloud platforms such as AWS, Azure, or GCP.
- Excellent problem-solving, communication, and collaboration skills.

Preferred Qualifications:
- Experience with Palantir Foundry/AIP and LLMs.
- Knowledge of DevOps practices, ontology design, and access control.
- Exposure to data visualization tools (e.g., Tableau, Power BI).
- Familiarity with big data technologies such as Spark or Hadoop.
- Palantir certifications (e.g., Foundry Developer, Data Engineer).

Job Opening: Palantir Foundry/AIP Developer. Work Experience: 3-6 years. Job Type: Full time. State: Maharashtra, India. Zip Code: 411045. Date Opened: 2025-07-23.
Posted 2 weeks ago
7.0 - 10.0 years
25 - 30 Lacs
Noida
Work from Office
Job Responsibilities:

Technical Leadership:
- Provide technical leadership and mentorship to a team of data engineers.
- Design, architect, and implement highly scalable, resilient, and performant data pipelines; experience using GCP technologies (e.g., Dataproc, Cloud Composer, Pub/Sub, BigQuery) is a plus.
- Guide the team in adopting best practices for data engineering, including CI/CD, infrastructure-as-code, and automated testing.
- Conduct code reviews and design reviews, and provide constructive feedback to team members.
- Stay up to date with the latest technologies and trends in data engineering.

Data Pipeline Development:
- Develop and maintain robust and efficient data pipelines to ingest, process, and transform large volumes of structured and unstructured data from various sources.
- Implement data quality checks and monitoring systems to ensure data accuracy and integrity.
- Collaborate with cross-functional teams and business stakeholders to understand data requirements and deliver data solutions that meet their needs.

Platform Building & Maintenance:
- Design and implement secure and scalable data storage solutions.
- Manage and optimize cloud infrastructure costs related to data engineering workloads.
- Contribute to the development and maintenance of data engineering tooling and infrastructure to improve team productivity and efficiency.

Collaboration & Communication:
- Effectively communicate technical designs and concepts to both technical and non-technical audiences.
- Collaborate effectively with other engineering teams, product managers, and business stakeholders.
- Contribute to knowledge sharing within the team and across the organization.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 7+ years of experience in data engineering and software development.
- 7+ years of experience coding in SQL and Python/Java.
- 3+ years of hands-on experience building and managing data pipelines in a cloud environment such as GCP.
- Strong programming skills in Python or Java, with experience in developing data-intensive applications.
- Expertise in SQL and data modeling techniques for both transactional and analytical workloads.
- Experience with CI/CD pipelines and automated testing frameworks.
- Excellent communication, interpersonal, and problem-solving skills.
- Experience leading or mentoring a team of engineers.
Posted 2 weeks ago
3.0 - 5.0 years
11 - 15 Lacs
Mumbai
Work from Office
Jul 3, 2025. Location: Mumbai. Designation: Analyst. Entity: Deloitte Touche Tohmatsu India LLP

Your potential, unleashed. India's impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realize your potential amongst cutting-edge leaders and organizations shaping the future of the region, and indeed, the world beyond. At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose, and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

The team: Deloitte helps organizations prevent cyberattacks and protect valuable assets. We believe in being secure, vigilant, and resilient, not only by looking at how to prevent and respond to attacks, but at how to manage cyber risk in a way that allows you to unleash new opportunities. Embed cyber risk at the start of strategy development for more effective management of information and technology risks.

Your work profile:
- Develop the strategic narrative to drive the direction, development, and standards for a persona-based cyber-risk reporting and visualization capability for a large organization, with the objective of making leadership (C-level) aware of the cyber security posture so they can take informed decisions. Likewise, cater to personas such as audit, IT, and cyber operations teams.
- Interlock with various stakeholders to understand the requirements, and translate them into storytelling-based visual representations using reports and dashboards.
- Guide and direct delivery of a cohesive, end-to-end visual experience on dashboards, including simplifying quantifiable cyber-security-related information for various technical and non-technical personas.
- Understand the various perspectives of users and embed principles of visual design in the dashboard development process, with a focus on persona-specific insight requirements, data, and user actions based on the insights.
- Collaborate with users, the delivery team, and key stakeholders to influence the design and adoption of the dashboard in the organization.

The ideal candidate has 3-5 years of proven experience in visual design, design strategy, experience strategy, design thinking, and human-centered design for enterprise-wide reporting solutions.

Context & Main Purpose of Role: Build a UI/UX strategy based on data-storytelling principles, and support the cyber risk reporting program through the development of interactive and contextualized Power BI dashboards that convey the cyber risk posture to C-level executives. Maintain a rigorous focus on adherence to the design principles defined by the client's Power BI guidelines, while bringing creative, simple, but intuitive visuals to dashboards to communicate cyber risk to a non-technical audience. Be part of a dynamic cyber risk reporting team and collaborate with data engineers, Power BI developers, and cyber risk SMEs to create an impact through user-centric dashboard design.

Required Qualifications, Capabilities, and Skills:
- Bachelor's degree with a curriculum including business, mathematics, or UX/UI design with storytelling, or equivalent working experience.
- Portfolio of work demonstrating effective visual communication of quantitative information related to risk, specifically visually appealing dashboards.
- Experience working with data analytics teams, and a strong understanding of common challenges in measuring and communicating cyber risk to non-technical leadership.
- Experience working on fast-paced, cross-functional teams in demanding business environments.
- Practical experience with tools for business intelligence, quantitative graphical analysis, and UX/UI design (e.g., Power BI, Figma).
- Excellent communication skills, with the ability to share ideas concisely, clearly, and accurately.
- Experience in building dashboards, including dashboards for C-level stakeholders.
- Experience developing visual reports, dashboards, and KPI scorecards.
- Knowledge of connecting multiple data sources, and of importing and transforming data for business intelligence.
- Excellent analytical thinking for translating data into informative visuals and reports.
- An inclination to understand cyber-security-related concepts, which will help the dashboarding project improve user satisfaction.
- Understanding of the fundamentals of data preparation and data modeling necessary for visualization.
- Ability to capture reporting requirements from various partners, architect the solution/report, understand and analyze the source data, and deliver the reports in a timely manner.
- Strong expertise in crafting intuitive and interactive reports and dashboards for data-driven decisions.
- Proficiency in Microsoft Power BI, including Power BI Desktop, Power BI Service, and Power Query.
- Strong understanding of DAX (Data Analysis Expressions) and its application in data modeling.
- Familiarity with other Microsoft tools such as Excel, Azure, and SharePoint is a plus.
- Experience with Agile/Scrum methodologies is advantageous.

Job Responsibilities:
- Collaborate across stakeholders, including developers, data engineers, and SMEs, to understand and align on business objectives and data requirements.
- Connect to various data sources and ensure data integrity, accuracy, and consistency.
- Optimize Power BI solutions for performance and usability.
- Create and maintain Power BI data models, including measures, calculated columns, and DAX expressions.
- Develop compelling data-driven narratives that effectively communicate insights and recommendations to various audiences, such as senior executives, departmental leaders, and managers.
- Provide guidance and support to other Power BI developers in creating visually appealing and accessible data visualizations.
- Apply visual design principles to ensure a positive user experience in the presentation of quantitative information.
- Conduct business and data analysis to uncover actionable insights.
- Ensure compliance with all applicable design principles.

How you'll grow:

Connect for impact: Our exceptional team of professionals across the globe is solving some of the world's most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.

Empower to lead: You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.

Inclusion for all: At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams, and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude, and potential each and every one of us brings to the table to make an impact that matters.

Drive your career: At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one-size-fits-all career path, and global, cross-business mobility and up-/re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.

Everyone's welcome, entrust your happiness to us: Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here's a glimpse of things that are in store for you.

Interview tips: We want job seekers exploring opportunities at Deloitte to feel prepared, confident, and comfortable. To help you with your interview, we suggest that you do your research and know some background about the organisation and the business area you're applying to.
Check out recruiting tips from Deloitte professionals.

*Caution against fraudulent job offers*: We would like to advise career aspirants to exercise caution against fraudulent job offers or unscrupulous practices. At Deloitte, ethics and integrity are fundamental and not negotiable. We do not charge any fee or seek any deposits, advance, or money from any career aspirant in relation to our recruitment process. We have not authorized any party or person to collect any money from career aspirants in any form whatsoever for promises of getting jobs in Deloitte or for being considered against roles in Deloitte. We follow a professional recruitment process, provide a fair opportunity to eligible applicants, and consider candidates only on merit. No one other than an authorized official of Deloitte is permitted to offer or confirm any job offer from Deloitte. We advise career aspirants to exercise caution. In this regard, you may refer to a more detailed advisory given on our website at: https://www2.deloitte.com/in/en/careers/
Posted 2 weeks ago
10.0 - 12.0 years
30 - 37 Lacs
Chennai
Work from Office
- Design and execute a global compensation strategy aligned with the company's objectives, ensuring competitiveness in the global talent market.
- Oversee job architecture, salary structures, and pay equity programs across all levels and regions.
- Lead annual compensation processes, including merit increases, promotions, and bonus cycles.
- Partner with finance and business leaders to manage the company's short- and long-term incentive programs (STIP/MIP).
- Develop a comprehensive global benefits strategy that aligns with the company culture and meets diverse employee needs and local market trends across regions.
- Handle reporting and disclosures, while ensuring compliance with US and international regulatory guidelines.
- Oversee health, wellness, retirement, and ancillary benefits programs, ensuring compliance with local regulations and industry best practices.
- Manage relationships with external vendors and consultants to deliver cost-effective, high-quality benefits.
- Lead and develop a Total Rewards team, fostering innovation and digitalization.
- Leverage data and analytics to drive decision-making, monitor program effectiveness, and deliver insights to leadership.
- Stay current with market trends, emerging practices, and regulatory changes in global total rewards.
- Develop and lead global wellness programs that promote the physical, mental, and emotional well-being of employees, in line with QH Culture and Values.
- Design and implement initiatives to support a healthy work-life balance, including mental health resources, wellness challenges, and benefits integration.
- Collaborate with internal and external stakeholders to promote a culture of wellness, including offering resources for stress management, fitness, financial wellness, and work-life balance.
- Measure and track the effectiveness of wellness programs through employee surveys, participation rates, and health metrics, to continually improve offerings.
Educational Experience Minimum Requirements. This position requires the following knowledge and skills:
- Bachelor's degree in Human Resources, Business Administration, Finance, or a related field.
- 10-12+ years of progressive experience in total rewards, including leadership roles in compensation and benefits.
- Proven track record in leading global compensation and benefits programs.
- Deep knowledge of equity programs and global compliance requirements.

Competency Requirements:
- Strong analytical, strategic-thinking, and problem-solving skills.
- Ability to work across, and influence, leadership teams.
- Proficiency in HR technology and tools, including HRIS and compensation/benefits platforms.
- Knowledge of compensation and benefits survey tools (e.g., Mercer, Willis Towers Watson, Hay Group) and a demonstrated ability to consult and guide rewards decisions using these tools is required.
- Experience working with a globally diverse population.
- Strong knowledge of global compensation and benefits practices, including familiarity with laws and regulations in key regions (North America, EMEA, APAC, etc.).
- Highly proficient in Microsoft Office applications (PowerPoint, Outlook, etc.), with advanced skills in Microsoft Excel.
- Proficiency in UKG (UltiPro) is a plus.
- Working proficiency in other HRIS systems and compensation software.
- High level of data modelling and analysis knowledge, with the ability to present findings concisely.
- Track record of process improvement implementation and organizational impact.
Posted 2 weeks ago
3.0 - 8.0 years
30 - 45 Lacs
Gurugram
Work from Office
Requirement: Data Architect & Business Intelligence. Experience: 5-12 years. Work Type: Full-time.

Job Summary: We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience in designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.
Posted 2 weeks ago
3.0 - 8.0 years
30 - 45 Lacs
Noida
Work from Office
Requirement: Data Architect & Business Intelligence. Experience: 5-12 years. Work Type: Full-time.

Job Summary: We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience in designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.
Posted 2 weeks ago
3.0 - 8.0 years
30 - 45 Lacs
Pune
Work from Office
Requirement: Data Architect & Business Intelligence. Experience: 5-12 years. Work Type: Full-time.

Job Summary: We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience in designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.
Posted 2 weeks ago
3.0 - 8.0 years
30 - 45 Lacs
Bengaluru
Work from Office
Requirement: Data Architect & Business Intelligence. Experience: 5-12 years. Work Type: Full-time.

Job Summary: We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience in designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.
Posted 2 weeks ago
4.0 - 9.0 years
15 - 25 Lacs
Kolkata
Work from Office
Inviting applications for the role of Senior Principal Consultant, Power BI Developer!

Responsibilities:
- Work within a team to identify, design, and implement a reporting/dashboarding user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices.
- Gather query data from tables of the industry cognitive model/data lake and build data models with BI tools.
- Apply the requisite business logic using data transformation and DAX.
- Understanding of Power BI data modelling and various built-in functions.
- Knowledge of report sharing through Workspace/App, access management, dataset scheduling, and Enterprise Gateway.
- Understanding of static and dynamic row-level security.
- Ability to create wireframes based on user stories and business requirements.
- Basic understanding of ETL and data warehousing concepts.
- Conceptualize and develop industry-specific insights in the form of dashboards, reports, and analytical web applications to deliver pilots/solutions following best practices.

Qualifications we seek in you!

Minimum Qualifications:
- Graduate.
- Proficient in Power BI report development and data modeling.
- Strong analytical skills and the ability to work independently.
- Experience in developing and implementing solutions in Power BI.
- Expertise in creating data models for report development in Power BI.
- Strong SQL skills and the ability to interpret data.
- Proficient in overall testing of code and functionality.
- Optional: knowledge of Snowflake.
- Preferred: experience in finance projects / financial system knowledge.
Posted 2 weeks ago
6.0 - 10.0 years
15 - 30 Lacs
Bengaluru
Hybrid
Role & responsibilities:
- SAP Datasphere or good SQL knowledge.
- Good to have (any one of these skills): Hybrid Planning (BPC), BW on HANA and/or BW/4HANA, Native HANA.
- Designs dashboards and reports that are intuitive and easy to understand while following the Enterprise Template.
- Must have good knowledge of the architecture of SAC/SAP BO/BW/BW-IP and the various solution options for developing strategic, operational, and tactical reports.
- Data connections (live and import) from various data sources, including but not limited to SAP Universe, SAP BW, Native HANA, S/4HANA, HANA Cloud, Concur, SuccessFactors, Ariba, IBP, and other SAP/non-SAP applications.
- Build complex analytic and planning applications.
- Coding capabilities: HTML, CSS, and JavaScript.
- Data integration: experience connecting to and ingesting data from SAP/non-SAP systems, flat files, etc., with both batch and real-time data integrations.
Preferred candidate profile:
- Education: Bachelor's or Master's degree.
- Overall 3-4 years of SAP Analytics Cloud experience, including SAC Planning.
- 7-8 years of SAP BO/SAP BI/SAP BW/SAP HANA experience.
- Set up planning dimensions and models (standard and new) and help with security mapping to the existing BW security setup.
- Awareness of SAC/BW/BW-IP functions such as data actions, multi actions, data analysis and visualization, and data connections for planning.
- SAP Datasphere.
What we offer:
- Competitive salary package.
- Leave policies: 10 days of public holiday (includes 2 optional days), 22 days of earned leave (EL), and 11 days for sick or caregiving leave.
- Office requirement: 3 days WFO.
Posted 2 weeks ago
4.0 - 8.0 years
8 - 12 Lacs
Pune
Work from Office
Join a dynamic engineering team to contribute to the development and implementation of scalable digital commerce solutions. This role involves working across the full stack, with a focus on Shopify and other commerce platforms, ensuring performant, secure, and maintainable solutions for global clients.
Job Description:
Key Responsibilities
Full Stack Development:
- Develop and maintain eCommerce platforms using Shopify and other technologies such as Adobe Commerce or custom stacks.
- Implement and support headless commerce architectures using Shopify Hydrogen, Storefront API, and GraphQL.
- Build responsive frontend interfaces using React, Next.js, or Angular.
- Design backend services and APIs with Node.js, Express, or similar frameworks.
Integration & Cloud:
- Integrate third-party systems including payment gateways, ERP, CMS, and analytics tools.
- Collaborate on deployment strategies using cloud platforms like AWS, GCP, or Azure.
- Support CI/CD pipelines and DevOps best practices.
Code Quality & Collaboration:
- Follow best practices in coding, testing, and documentation.
- Work closely with senior engineers, architects, and designers to deliver high-quality features.
- Participate in code reviews and knowledge-sharing sessions.
Client & Team Interaction:
- Communicate technical solutions clearly to stakeholders.
- Collaborate with cross-functional teams in agile environments.
- Take ownership of deliverables and contribute to sprint planning and estimation.
Qualifications & Skills
Experience:
- 3+ years of professional experience in full stack development.
- Hands-on experience with eCommerce platforms, especially Shopify (Shopify Plus, Hydrogen, Storefront API).
- Exposure to Adobe Commerce, SAP Commerce, or custom commerce platforms is a plus.
Technical Skills:
- Proficient in modern frontend frameworks: React.js, Next.js, or Angular.
- Skilled in backend development with Node.js, Express.js; bonus for Java or .NET exposure.
- Good understanding of REST/GraphQL APIs, authentication, and data modeling.
- Familiarity with Git, CI/CD tools, and DevOps workflows.
- Basic experience with cloud services (AWS/GCP/Azure) for deployments and hosting.
Mindset & Soft Skills:
- Strong problem-solving and debugging skills.
- Detail-oriented, quality-conscious, and eager to learn.
- Team player with good communication and collaboration abilities.
- Passionate about eCommerce technology and user experience.
Location: Pune
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
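Since the listing above centers on the Shopify Storefront API and GraphQL, a minimal sketch of how such a request is assembled may help; the shop domain, API version, query fields, and token below are placeholders for illustration, not real values:

```python
import json

# Hypothetical shop domain and API version -- placeholders only.
SHOP_DOMAIN = "example-shop.myshopify.com"
API_VERSION = "2024-01"

# A small GraphQL document asking for the first N products.
PRODUCTS_QUERY = """
query FirstProducts($n: Int!) {
  products(first: $n) {
    edges { node { id title } }
  }
}
"""

def build_storefront_request(n: int) -> dict:
    """Assemble the URL, headers, and JSON body for a Storefront API call.

    The token value is a placeholder; a real client would POST this body
    to the URL with an HTTP library.
    """
    return {
        "url": f"https://{SHOP_DOMAIN}/api/{API_VERSION}/graphql.json",
        "headers": {
            "Content-Type": "application/json",
            "X-Shopify-Storefront-Access-Token": "<token placeholder>",
        },
        "body": json.dumps({"query": PRODUCTS_QUERY, "variables": {"n": n}}),
    }

req = build_storefront_request(5)
print(req["url"])
```

The same payload shape (a `query` string plus a `variables` object) applies to any GraphQL endpoint, which is why headless commerce stacks can swap frontends without changing the query layer.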
Posted 2 weeks ago
2.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
The IQVIA Digital Data team is growing and looking for super curious, passionate, and driven individuals to join the team. Our people are our greatest asset, and we're committed to creating an environment where we all thrive doing what we love. IQVIA Digital is the world's most comprehensive platform for healthcare marketing and analytics. It is changing the way healthcare marketing is done by leveraging the latest cloud-native solutions, efficient big data pipelines, processes, and technologies. We work with the largest pharmaceutical brands and media agencies in the US. We empower media planning, buying, and analytics teams with the tools they need to do their job, and do it well. By simplifying workflows that used to take days into seconds, integrating functionality that used to require multiple vendors into one, and providing faster and deeper insights than anyone in the industry, we are helping healthcare marketers cut their costs, move faster, and drive measurable results. We are looking for a driven and dynamic Senior Data Engineer who will be responsible for expanding and maintaining our data warehouse, developing scalable data products, and helping orchestrate terabytes of data flowing through our platform. This individual will work directly with a group of cross-functional engineers and product owners on our reporting and statistical aggregations, leveraging best-practice engineering standards to ensure secure and successful data solutions.
About the Job
- Construct data pipelines using Airflow and Cloud Functions to meet business requirements set by the Reporting Product and Engineering teams
- Maintain and optimize table schemas, views, and queries in our data warehouse and databases
- Perform ad-hoc analysis to troubleshoot stakeholder issues surrounding data and provide insights into feature usage
- Document data architecture and integration efforts to provide a clear understanding of the data platform to other team members
- Provide guidance on data best practices when building out new product lines
- Mentor a team of engineers
Must have
- Experience with data task orchestration (Airflow, cron, Prefect, etc.) with dependency mapping
- Data analysis and data modeling
- Strong experience in Python, SQL, and shell scripting
- Experience interacting with APIs, SFTP, and cloud storage locations (e.g. GCS, S3)
- Analytical problem-solving skills, with the ability to analyze application logs to troubleshoot issues
- Familiarity with cloud computing (GCP a plus)
- Experience developing and implementing statistical models
Nice to have
- Experience with JavaScript
- Hands-on work with Airflow
- Exposure to producer/consumer messaging systems
- Has led a small team of developers
IQVIA is a leading global provider of clinical research services, commercial insights, and healthcare intelligence to the life sciences and healthcare industries. We create intelligent connections to accelerate the development and commercialization of innovative medical treatments to help improve patient outcomes and population health worldwide. Learn more at https://jobs.iqvia.com
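The "data task orchestration with dependency mapping" requirement above boils down to modeling work as a DAG and running tasks in dependency order, which is what Airflow and Prefect do at scale. A stdlib-only Python sketch of the idea (the task names are invented):

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Illustrative task graph: two extracts feed a transform, which feeds the load.
# Each key maps a task to the set of tasks it depends on -- the same shape an
# orchestrator like Airflow expresses with operator dependencies.
tasks = {
    "extract_gcs": set(),
    "extract_sftp": set(),
    "transform": {"extract_gcs", "extract_sftp"},
    "load_warehouse": {"transform"},
}

def run_order(graph: dict[str, set[str]]) -> list[str]:
    """Return one valid execution order respecting every dependency edge."""
    return list(TopologicalSorter(graph).static_order())

order = run_order(tasks)
print(order)
```

`TopologicalSorter` also raises `CycleError` on circular dependencies, the same failure mode an orchestrator reports when a DAG is miswired.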
Posted 2 weeks ago
3.0 - 7.0 years
7 - 11 Lacs
Pune
Work from Office
Join a dynamic engineering team to contribute to the development and implementation of scalable digital commerce solutions. This role involves working across the full stack, with a focus on Shopify and other commerce platforms, ensuring performant, secure, and maintainable solutions for global clients.
Job Description:
Key Responsibilities
Full Stack Development:
- Develop and maintain eCommerce platforms using Shopify and other technologies such as Adobe Commerce or custom stacks.
- Implement and support headless commerce architectures using Shopify Hydrogen, Storefront API, and GraphQL.
- Build responsive frontend interfaces using React, Next.js, or Angular.
- Design backend services and APIs with Node.js, Express, or similar frameworks.
Integration & Cloud:
- Integrate third-party systems including payment gateways, ERP, CMS, and analytics tools.
- Collaborate on deployment strategies using cloud platforms like AWS, GCP, or Azure.
- Support CI/CD pipelines and DevOps best practices.
Code Quality & Collaboration:
- Follow best practices in coding, testing, and documentation.
- Work closely with senior engineers, architects, and designers to deliver high-quality features.
- Participate in code reviews and knowledge-sharing sessions.
Client & Team Interaction:
- Communicate technical solutions clearly to stakeholders.
- Collaborate with cross-functional teams in agile environments.
- Take ownership of deliverables and contribute to sprint planning and estimation.
Qualifications & Skills
Experience:
- 3+ years of professional experience in full stack development.
- Hands-on experience with eCommerce platforms, especially Shopify (Shopify Plus, Hydrogen, Storefront API).
- Exposure to Adobe Commerce, SAP Commerce, or custom commerce platforms is a plus.
Technical Skills:
- Proficient in modern frontend frameworks: React.js, Next.js, or Angular.
- Skilled in backend development with Node.js, Express.js; bonus for Java or .NET exposure.
- Good understanding of REST/GraphQL APIs, authentication, and data modeling.
- Familiarity with Git, CI/CD tools, and DevOps workflows.
- Basic experience with cloud services (AWS/GCP/Azure) for deployments and hosting.
Mindset & Soft Skills:
- Strong problem-solving and debugging skills.
- Detail-oriented, quality-conscious, and eager to learn.
- Team player with good communication and collaboration abilities.
- Passionate about eCommerce technology and user experience.
Location: Pune
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
Posted 2 weeks ago
2.0 - 5.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Country: India
Number of Openings: 1
Approved ECMS RQ#: 533568
Duration of contract: 6 Months
Total Yrs. of Experience: 8+ years
Relevant Yrs. of experience: 8+ years
Detailed JD (Roles and Responsibilities):
We are seeking a highly skilled and experienced Database Developer to join our team. The ideal candidate will have a strong background in SQL, SQL Server, BigQuery, data modelling, SSIS, and ETL processes. You will be responsible for designing, developing, and maintaining robust database solutions that support business operations and analytics.
Key Responsibilities:
> Design and implement efficient database solutions and models to store and retrieve company data.
> Develop and optimize SQL queries, stored procedures, and functions.
> Work with SQL Server and BigQuery to manage large datasets and ensure data integrity.
> Build and maintain ETL pipelines using SSIS and other tools.
> Collaborate with data analysts, software developers, and business stakeholders to understand data requirements.
> Perform data profiling, cleansing, and transformation to support analytics and reporting.
> Monitor database performance and implement improvements.
> Ensure security and compliance standards are met in all database solutions.
Required Skills & Qualifications:
> 8-12 years of hands-on experience in database development.
> Strong proficiency in SQL and SQL Server.
> Experience with Google BigQuery and cloud-based data solutions.
> Expertise in data modelling and relational database design.
> Proficient in SSIS and ETL development.
> Solid understanding of performance tuning and optimization techniques.
> Excellent problem-solving and analytical skills.
> Strong communication and collaboration abilities.
Mandatory skills: SQL, SQL Server, BigQuery, SSIS
Desired skills: Data modelling, ETL
Domain: Payments
Client name (for internal purpose only): NatWest
Approx. vendor billing rate (INR/Day): 10000 INR/Day
Work Location: Chennai or Bangalore or Gurgaon
Background check process to be followed: Yes
Before onboarding / After onboarding: Before Onboarding
BGV Agency: Any NASSCOM approved
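The profiling-cleansing-loading flow this listing describes can be sketched end to end with SQLite standing in for the warehouse (table names, cleansing rules, and sample rows are invented; a real flow would use SSIS against SQL Server or BigQuery):

```python
import sqlite3

# Minimal staged ETL: raw rows land in a staging table, are cleansed
# (trimmed, type-checked, de-duplicated), then loaded to a target table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE staging(payment_id TEXT, amount TEXT)")
con.executemany(
    "INSERT INTO staging VALUES (?, ?)",
    [("p1", " 100.50 "), ("p2", "abc"), ("p1", " 100.50 "), ("p3", "42")],
)

# Cleanse: trim whitespace, drop non-numeric amounts, de-duplicate.
con.execute("""
    CREATE TABLE payments AS
    SELECT DISTINCT payment_id, CAST(TRIM(amount) AS REAL) AS amount
    FROM staging
    WHERE TRIM(amount) GLOB '[0-9]*'
""")
rows = con.execute(
    "SELECT payment_id, amount FROM payments ORDER BY payment_id"
).fetchall()
print(rows)
```

The bad row ("abc") is rejected and the duplicate "p1" collapses, which is exactly the data-profiling outcome an SSIS data flow with a conditional split and an aggregate would produce.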
Posted 2 weeks ago
4.0 - 7.0 years
20 - 32 Lacs
Bengaluru
Work from Office
ECMS ID/Title:
Number of Openings: 3
Duration of contract: 6 months
No. of years experience: Relevant 5+ years and total 8+ years.
Detailed job description - Skill Set: Attached
Mandatory Skills: Power BI / UI developer
Good to Have Skills: Power BI
Vendor billing range: 6000-9000/Day
Remote option available: Hybrid mode
Work location: Chennai
Start date: Immediate
Client Interview / F2F applicable: Yes
Background check process to be followed: Before onboarding; BGV Agency: Pre (1 month BGV)
- Master's degree or equivalent experience.
- Minimum of 5 years of experience in data visualization, UI development, or related roles.
- Relevant certifications in Power BI, data analysis, or related technologies are a plus.
- Proven track record of developing Power BI dashboards and reports to meet business requirements.
- Strong attention to detail and commitment to delivering high-quality user interfaces and design solutions.
- Experience in user testing and feedback incorporation to improve design and functionality.
- Ability to analyze complex data sets and derive meaningful insights to support business decision-making.
- Experience collaborating with data scientists, analysts, and business stakeholders, as well as development, operations, and security teams, to deliver data-driven solutions.
- Ability to work effectively in a team-oriented environment.
- Strong communication skills to articulate technical concepts to non-technical stakeholders.
- Demonstrated ability to identify and resolve technical issues efficiently.
- Innovative mindset with a focus on continuous improvement and automation.
- Ability to adapt to new technologies and methodologies in a fast-paced environment.
The person must have the ability to work in a multicultural environment and have excellent process, functional, communication, teamwork, and interpersonal skills, with a willingness to work in a team environment to support other technical staff as needed. The person should have a high tolerance for ambiguity. The Power BI / UI Developer needs to be well versed in:
- Creating complex Power BI reports and dashboards with advanced data visualization techniques.
- DAX (Data Analysis Expressions) for creating custom calculations in Power BI.
- Power Query for data transformation and manipulation.
- Data modeling concepts and best practices in Power BI.
- Integrating Power BI with various data sources, including Azure SQL Database and Azure Data Lake.
- Designing and implementing user-friendly UI/UX for Power BI dashboards.
- PowerApps for building custom business applications.
- Azure Synapse Analytics for handling big data workloads.
- Azure Data Factory for orchestrating data workflows.
- Azure Blob Storage for storing and managing large datasets.
- Implementing row-level security in Power BI for data protection.
- Version control systems like Git for managing Power BI projects.
- REST APIs for integrating Power BI with other applications.
- Power Automate for automating workflows and processes.
- SQL for querying and managing data in Azure databases.
- Azure Active Directory for managing user access and authentication.
- Troubleshooting and optimizing Power BI performance issues.
- Designing visually appealing and functional user interfaces and reports.
Posted 2 weeks ago
3.0 - 6.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Country: India
Number of Openings: 1
Approved ECMS RQ#: 533573
Duration of contract: 6 Months
Total Yrs. of Experience: 8+ years
Relevant Yrs. of experience: 8+ years
Detailed JD (Roles and Responsibilities):
We are looking for a seasoned GCP Engineer with 8-10 years of experience in cloud infrastructure and automation. The ideal candidate will hold a GCP Architecture Certification and possess deep expertise in Terraform, GitLab, shell scripting, and a wide range of GCP services including Compute Engine, Cloud Storage, Dataflow, BigQuery, and IAM. You will be responsible for designing, implementing, and maintaining scalable cloud solutions that meet business and technical requirements.
Key Responsibilities:
> Design and implement secure, scalable, and highly available cloud infrastructure on Google Cloud Platform.
> Automate infrastructure provisioning and configuration using Terraform.
> Manage CI/CD pipelines using GitLab for efficient deployment and integration.
> Develop and maintain shell scripts for automation and system management tasks.
> Utilize GCP services such as Compute Engine, Cloud Storage, Dataflow, BigQuery, and IAM to support data and application workflows.
> Ensure compliance with security policies and manage access controls using IAM.
> Monitor system performance and troubleshoot issues across cloud environments.
> Collaborate with cross-functional teams to understand requirements and deliver cloud-based solutions.
Required Skills & Qualifications:
> 8-12 years of experience in cloud engineering or infrastructure roles.
> GCP Architecture Certification is mandatory.
> Strong hands-on experience with Terraform and infrastructure-as-code practices.
> Proficiency in GitLab for version control and CI/CD.
> Solid experience in shell scripting for automation.
> Deep understanding of GCP services: Compute Engine, Cloud Storage, Dataflow, BigQuery, and IAM.
> Strong problem-solving skills and ability to work independently.
> Excellent communication and collaboration skills.
Mandatory skills: SQL, SQL Server, BigQuery, SSIS
Desired skills: Data modelling, ETL
Domain: Payments
Client name (for internal purpose only): NatWest
Approx. vendor billing rate (INR/Day): 10000 INR/Day
Work Location: Chennai or Bangalore or Gurgaon
Background check process to be followed: Yes
Before onboarding / After onboarding: Before Onboarding
BGV Agency: Any NASSCOM approved
Mode of Interview (Telephonic/Face to Face/Skype): Teams virtual followed by F2F
WFO / WFH / Hybrid: Hybrid
Any Certification (Mandatory): As virtual followed by A2A
Shift Time: Chennai or Bangalore or Gurgaon
Business travel required (Yes/No): No
Client BTP / SHTP: UK
Posted 2 weeks ago
5.0 - 8.0 years
7 - 8 Lacs
Bengaluru
Work from Office
We're proud to be recognized as a Great Place to Work, a testament to our inclusive culture, strong leadership, and commitment to employee well-being and growth. At Tarento, you'll be part of a collaborative environment where ideas are valued, learning is continuous, and careers are built on passion and purpose.
Role Overview
An Azure Data Engineer specializing in Databricks is responsible for designing, building, and maintaining scalable data solutions on the Azure cloud platform, with a focus on leveraging Databricks and related big data technologies. The role involves close collaboration with data scientists, analysts, and software engineers to ensure efficient data processing, integration, and delivery for analytics and business intelligence needs.
Key Responsibilities
- Design, develop, and maintain robust and scalable data pipelines using Azure Databricks, Azure Data Factory, and other Azure services.
- Build and optimize data architectures to support large-scale data processing and analytics.
- Collaborate with cross-functional teams to gather requirements and deliver data solutions tailored to business needs.
- Ensure data quality, integrity, and security across various data sources and pipelines.
- Implement data governance, compliance, and best practices for data security (e.g., encryption, RBAC).
- Monitor, troubleshoot, and optimize data pipeline performance, ensuring reliability and scalability.
- Document technical specifications, data pipeline processes, and architectural decisions.
- Support and troubleshoot data workflows, ensuring consistent data delivery and availability for analytics and reporting.
- Automate data tasks and deploy production-ready code using CI/CD practices.
- Stay updated with the latest Azure and Databricks features, recommending improvements and adopting new tools as appropriate.
Required Skills and Qualifications
- Bachelor's degree in computer science, engineering, or a related field.
- 5+ years of experience in data engineering, with hands-on expertise in Azure and Databricks environments.
- Proficiency in Databricks, Apache Spark, and Spark SQL.
- Strong programming skills in Python and/or Scala.
- Advanced SQL skills and experience with relational and NoSQL databases.
- Experience with ETL processes, data warehousing concepts, and big data technologies (e.g., Hadoop, Kafka).
- Familiarity with Azure services: Azure Data Lake Storage (ADLS), Azure Data Factory, Azure SQL Data Warehouse, Cosmos DB, Azure Stream Analytics, Azure Functions.
- Understanding of data modeling, schema design, and data integration best practices.
- Strong analytical, problem-solving, and troubleshooting abilities.
- Experience with source code control systems (e.g., Git) and technical documentation tools.
- Excellent communication and collaboration skills; ability to work both independently and as part of a team.
Preferred Skills
- Experience with automation, unit testing, and CI/CD pipelines.
- Certifications in Azure Data Engineering or Databricks are advantageous.
Soft Skills
- Flexible, self-starter, and proactive in learning and adopting new technologies.
- Ability to manage multiple priorities and work to tight deadlines.
- Strong stakeholder management and teamwork capabilities.
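The "ensure data quality and integrity" responsibility above usually takes the form of a validation gate a pipeline runs before publishing a table. A minimal sketch in plain Python (the check names, rules, and sample batch are invented; in Databricks the same checks would run over a Spark DataFrame):

```python
from dataclasses import dataclass

@dataclass
class Check:
    """One named data-quality rule and whether the batch passed it."""
    name: str
    passed: bool

def run_checks(rows: list[dict]) -> list[Check]:
    """Run three illustrative quality rules over a batch of records."""
    non_empty = len(rows) > 0
    no_null_keys = all(r.get("id") is not None for r in rows)
    unique_keys = len({r.get("id") for r in rows}) == len(rows)
    return [
        Check("non_empty", non_empty),
        Check("no_null_keys", no_null_keys),
        Check("unique_keys", unique_keys),
    ]

# Sample batch with a duplicated key, so the uniqueness rule fails.
batch = [{"id": 1, "v": 10}, {"id": 2, "v": 20}, {"id": 2, "v": 21}]
results = run_checks(batch)
print([(c.name, c.passed) for c in results])
```

A pipeline would typically fail or quarantine the batch when any check is false rather than letting bad rows reach downstream consumers.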
Posted 2 weeks ago
0.0 - 2.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Database design, SQL query optimization, data modeling, and troubleshooting database issues.
Responsibilities
- Database Development and Design: Create, maintain, and optimize SQL databases, including tables, views, stored procedures, and functions.
- SQL Query Optimization: Write efficient and performant SQL queries, and analyze existing queries for performance improvements.
- Data Modeling: Design and implement data models to effectively represent and manage data within the database.
- Troubleshooting and Debugging: Identify and resolve issues with database queries and applications.
- Collaboration: Work with other developers, business analysts, and stakeholders to understand requirements and integrate database solutions.
- Database Administration: Perform tasks related to database backup, recovery, and security.
- Data Analysis and Reporting: Generate and analyze reports from SQL databases to support decision-making.
- Staying Up-to-Date: Keep abreast of emerging database technologies and best practices.
Skills
- SQL Programming: Proficient in SQL syntax and database management systems (DBMS) like MySQL, Oracle, or Microsoft SQL Server.
- Data Modeling: Understanding of database design principles and data modeling techniques.
- SQL Query Optimization: Ability to write and optimize SQL queries for performance.
- Problem-Solving: Strong analytical and problem-solving skills to troubleshoot database issues.
- Communication: Ability to communicate effectively with other developers and stakeholders.
Experience
- Proven experience in SQL development or related roles.
- Bachelor's degree in Computer Science, Information Technology, or a related field is often preferred.
- Relevant work experience in SQL development, database design, or database administration.
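To make the query-optimization bullet concrete: the usual first step is to inspect the query plan before and after adding an index. A small SQLite demonstration (the schema is invented; the same idea applies in MySQL, Oracle, or SQL Server via their own EXPLAIN variants):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE orders(id INTEGER PRIMARY KEY, customer_id INT, total REAL)"
)

def plan(sql: str) -> str:
    """Return SQLite's query plan description for a statement."""
    return " ".join(row[-1] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"

before = plan(query)  # no index: the engine must scan the whole table
con.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(query)   # with the index: a direct search on customer_id

print(before)
print(after)
```

The plan flips from a full-table SCAN to a SEARCH using the new index, which is the difference between O(n) and O(log n) lookups on a large orders table.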
Posted 2 weeks ago
0.0 - 2.0 years
8 - 12 Lacs
Bengaluru
Work from Office
We are looking for an experienced data engineer to join our team. You will use various methods to transform raw data into useful data systems. For example, you'll create algorithms and conduct statistical analysis. Overall, you'll strive for efficiency by aligning data systems with business goals. To succeed in this data engineering position, you should have strong analytical skills and the ability to combine data from different sources. Data engineer skills also include familiarity with several programming languages and knowledge of machine learning methods.
Responsibilities
- Analyze and organize raw data
- Build data systems and pipelines
- Interpret trends and patterns
- Conduct complex data analysis and report on results
- Prepare data for prescriptive and predictive modeling
- Build algorithms and prototypes
- Combine raw information from different sources
- Explore ways to enhance data quality and reliability
- Identify opportunities for data acquisition
- Develop analytical tools and programs
Requirements
- Previous experience as a data engineer or in a similar role
- Technical expertise with data models, data mining, and segmentation techniques
- Knowledge of programming languages (e.g. Java and Python)
- Hands-on experience with SQL database design
- Degree in Computer Science, IT, or a similar field; a Master's is a plus
- Focus will be on building out our Python ETL processes and writing superb SQL
- Use agile software development processes to make iterative improvements to our back-end systems
- Model front-end and back-end data sources to help draw a more comprehensive picture of user flows throughout the system and to enable powerful data analysis
- Build data pipelines that clean, transform, and aggregate data from disparate sources
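The last responsibility above, combining and aggregating data from disparate sources, can be sketched with the standard library alone. Here two "sources" with different formats (a CSV extract and a JSON feed, both invented) are merged into per-user totals:

```python
import csv
import io
import json
from collections import defaultdict

# Two disparate sources with different formats but overlapping users.
csv_feed = "user,amount\nalice,10\nbob,5\nalice,7\n"
json_feed = '[{"user": "bob", "amount": 3}, {"user": "carol", "amount": 2}]'

# Aggregate both feeds into one per-user total.
totals: dict[str, float] = defaultdict(float)
for row in csv.DictReader(io.StringIO(csv_feed)):
    totals[row["user"]] += float(row["amount"])
for rec in json.loads(json_feed):
    totals[rec["user"]] += rec["amount"]

print(dict(sorted(totals.items())))
```

A production ETL job follows the same clean-transform-aggregate shape, with the in-memory strings replaced by files, APIs, or database extracts and the dict replaced by a warehouse table.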
Posted 2 weeks ago
5.0 - 8.0 years
10 - 20 Lacs
Hyderabad, Pune, Chennai
Hybrid
Role & responsibilities
We are looking for a skilled Data Modeller with 5 to 8 years of hands-on experience in designing and maintaining robust data models for enterprise data solutions. The ideal candidate has a strong foundation in dimensional, relational, and semantic data modelling and is ready to expand into data engineering technologies and practices. This is a unique opportunity to influence enterprise-wide data architecture while growing your career in modern data engineering.
Required Skills & Experience:
- 5 to 8 years of experience in data modelling with tools such as Erwin, ER/Studio, dbt, PowerDesigner, or equivalent.
- Strong understanding of relational databases, star/snowflake schemas, normalization, and denormalization.
- Experience working with SQL, stored procedures, and performance tuning of data queries.
- Exposure to data warehousing concepts and BI tools (e.g., Tableau, Power BI, Looker).
- Familiarity with data governance, metadata management, and data cataloging tools.
- Excellent communication and documentation skills.
- Experience with data task orchestration (Airflow, cron, Prefect, etc.) with dependency mapping.
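The star schema mentioned above can be sketched in a few lines of SQLite: a fact table of measures joined to dimension tables that carry the descriptive attributes (table names and values here are illustrative):

```python
import sqlite3

# A tiny star schema: one fact table and two dimensions.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_date(date_key INT PRIMARY KEY, year INT);
    CREATE TABLE dim_product(product_key INT PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales(
        date_key INT REFERENCES dim_date,
        product_key INT REFERENCES dim_product,
        amount REAL);
    INSERT INTO dim_date VALUES (1, 2024), (2, 2025);
    INSERT INTO dim_product VALUES (10, 'books'), (11, 'games');
    INSERT INTO fact_sales VALUES (1, 10, 5.0), (1, 11, 7.0), (2, 10, 3.0);
""")

# The canonical star-schema query: join the fact to its dimensions
# and aggregate the measure by dimension attributes.
rows = con.execute("""
    SELECT d.year, p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d USING(date_key)
    JOIN dim_product p USING(product_key)
    GROUP BY d.year, p.category
    ORDER BY d.year, p.category
""").fetchall()
print(rows)
```

A snowflake schema differs only in that the dimensions themselves are normalized into further lookup tables; the fact table and the join-then-aggregate query shape stay the same.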
Posted 2 weeks ago
5.0 - 9.0 years
10 - 17 Lacs
Chennai
Hybrid
Position Description:
Job Title: Salesforce Data Cloud
Experience: 5 to 7 Years
Job Summary: We are looking for a skilled Salesforce Data Cloud Developer with strong experience in both development and administration. The ideal candidate will be responsible for designing and implementing scalable solutions on Salesforce Data Cloud, managing platform configurations, and working closely with business stakeholders to gather and understand requirements.
Key Responsibilities:
- Design, develop, and deploy custom solutions using Salesforce Data Cloud
- Perform administrative tasks such as user management, security settings, and data configuration
- Collaborate with business teams to gather, analyze, and translate requirements into technical solutions
- Build and maintain data models, integrations, and automation workflows
- Ensure data integrity, security, and compliance with governance standards
- Troubleshoot and resolve issues related to performance, data quality, and system behavior
- Stay updated with Salesforce releases and recommend best practices
Required Skills:
- Strong hands-on experience with Salesforce Data Cloud and the core Salesforce platform
- Solid understanding of data modelling and integration patterns
- Solid understanding of Data Streams, Data Lake, Data Models, Data Transforms, and Data Analysis
- Experience working with segments, activations, and calculated insights
- Experience with Salesforce administration tasks and declarative tools
- Excellent communication skills to interact with business users and translate needs into solutions
- Salesforce certifications
Skills Required: Salesforce
Experience Required: 3 to 6 years
Education Required: Bachelor's Degree
Posted 2 weeks ago
8.0 - 13.0 years
27 - 42 Lacs
Kolkata, Pune, Chennai
Hybrid
Job Description
The role requires the candidate to design and implement data modeling solutions using relational, dimensional, and NoSQL databases, working closely with data architects to design bespoke databases using a mixture of conceptual, physical, and logical data models.
Job title: Data Modeler
Hybrid role from Location: Bangalore, Chennai, Gurgaon, Pune, Kolkata
Interviews: 3 rounds of 30-45 minute video-based Teams interviews
Employment Type: Permanent Full Time with Tredence
Total Experience: 9-13 years
Required Skills: Data Modeling, Dimensional Modeling, Erwin, Data Management, RDBMS, SQL/NoSQL, ETL
What we look for:
- BE/B.Tech or equivalent.
- The data modeler designs, implements, and documents data architecture and data modeling solutions, which include the use of relational, dimensional, and NoSQL databases. These solutions support enterprise information management, business intelligence, machine learning, data science, and other business interests.
- 9-13 years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL data platform technologies, and ETL and data ingestion protocols).
- Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required.
- Good knowledge of metadata management, data modeling, and related tools (Erwin, ER/Studio, or others) required.
- Experience in team management, communication, and presentation.
- Be responsible for the development of conceptual, logical, and physical data models, and the implementation of RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms (SQL/NoSQL).
- Oversee and govern the expansion of existing data architecture and the optimization of data query performance via best practices.
- The candidate must be able to work independently and collaboratively.
Responsibilities
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, and machine learning).
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POC.
- Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks.
Posted 2 weeks ago