10.0 - 15.0 years
0 Lacs
noida, uttar pradesh
On-site
As an experienced Power BI Architect with knowledge of Microsoft Fabric Solutions, you will be responsible for leading the design, development, and implementation of innovative Business Intelligence (BI) solutions. Your expertise in enterprise data architecture, analytics platforms, and data integration strategies will be crucial in optimizing data pipelines and enhancing performance through the effective use of Power BI and Microsoft Fabric.

Your key responsibilities will include developing comprehensive Power BI solutions such as dashboards, reports, and data models to meet business requirements. You will lead the end-to-end development lifecycle of BI projects, from requirement gathering to deployment, ensuring optimal performance. Utilizing Microsoft Fabric, you will streamline data pipelines, integrate data engineering, storage, and processing capabilities, and enhance performance and scalability by integrating Power BI with Microsoft Fabric. Your role will also involve working with Azure Data Services like Azure Data Lake, Azure Synapse, and Azure Data Factory to support BI architecture. Implementing best practices in Power BI development, providing leadership and mentorship to a team of developers, overseeing project management tasks, and collaborating with data engineers and stakeholders to translate business requirements into scalable BI solutions will also be part of your responsibilities.

To qualify for this role, you should hold a Bachelor's degree in Computer Science, Information Systems, Data Analytics, or a related field. You must have 10-15 years of experience in BI development, including at least 3 years in a leadership position. Proven experience with Power BI, Microsoft Fabric, and Azure Data Services is also required.
Posted 6 days ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
As a Senior SQL Developer at our company, you will play a crucial role in our BI & analytics team by expanding and optimizing our data and data queries. Your responsibilities will include optimizing data flow and collection for consumption by our BI & Analytics platform. You should be an experienced data query builder and data wrangler with a passion for optimizing data systems from the ground up. Collaborating with software developers, database architects, data analysts, and data scientists, you will support data and product initiatives and ensure consistent, optimal data delivery architecture across ongoing projects. Your self-directed approach will be essential in supporting the data needs of multiple systems and products. If you are excited about enhancing our company's data architecture to support our upcoming products and data initiatives, this role is perfect for you.

Your essential functions will involve creating and maintaining optimal SQL queries, Views, Tables, and Stored Procedures. By working closely with various business units such as BI, Product, and Reporting, you will contribute to developing the data warehouse platform vision, strategy, and roadmap. Understanding physical and logical data models and ensuring high-performance access to diverse data sources will be key aspects of your role. Encouraging the adoption of organizational frameworks through documentation, sample code, and developer support will also be part of your responsibilities, and effective communication of the progress and effectiveness of developed frameworks to department heads and managers will be essential.

To be successful in this role, you should possess a Bachelor's or Master's degree, or an equivalent combination of education and experience in a relevant field. Proficiency in T-SQL, Data Warehouses, Star Schema, Data Modeling, OLAP, SQL, and ETL is required, and experience in creating Tables, Views, and Stored Procedures is crucial. Familiarity with BI and Reporting Platforms, industry trends, and knowledge of multiple database platforms like SQL Server and MySQL are necessary. Proficiency in Source Control and Project Management tools such as Azure DevOps, Git, and JIRA is expected. Experience with SonarQube for clean T-SQL coding practices and DevOps best practices will be advantageous. Applicants must have exceptional written and spoken communication skills and strong team-building abilities to contribute to strategic decisions and advise senior management on technical matters. With at least 5 years of experience in a data warehousing position, including work as a SQL Developer, and experience in the system development lifecycle, you should also have a proven track record in data integration, consolidation, enrichment, and aggregation. Strong analytical skills, attention to detail, organizational skills, and the ability to mentor junior colleagues will be crucial for success in this role.

This full-time position requires flexibility to support different time zones between 12 PM IST and 9 PM IST, Monday through Friday. You will work in a hybrid mode, spending at least 2 days per week at the office in Hyderabad. Occasional evening and weekend work may be expected based on client needs or job-related emergencies. This job description may not cover all responsibilities and duties, which may change with or without notice.
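The Star Schema and Views responsibilities named above can be sketched concretely. The snippet below is only an illustration of the idea, using SQLite in place of SQL Server; the table, view, and column names are invented for the demo.

```python
import sqlite3

# Minimal star-schema sketch: one fact table keyed to a dimension table,
# with a view encapsulating the join (the "Views" responsibility).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY,
                             product_id INTEGER REFERENCES dim_product,
                             amount REAL);
    INSERT INTO dim_product VALUES (1, 'Hardware'), (2, 'Software');
    INSERT INTO fact_sales VALUES (10, 1, 250.0), (11, 1, 100.0), (12, 2, 80.0);
    CREATE VIEW v_sales_by_category AS
        SELECT d.category, SUM(f.amount) AS total
        FROM fact_sales f JOIN dim_product d USING (product_id)
        GROUP BY d.category;
""")
rows = conn.execute(
    "SELECT category, total FROM v_sales_by_category ORDER BY category"
).fetchall()
print(rows)  # [('Hardware', 350.0), ('Software', 80.0)]
```

In a real warehouse the fact table would hold millions of rows and the dimensions would carry many descriptive attributes, but the join-through-a-view pattern is the same.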
Posted 6 days ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
It is exciting to be part of a company where individuals wholeheartedly BELIEVE in the mission! The company is dedicated to bringing enthusiasm and a customer-centric approach to its operations.

As a Data Scientist, you will work on the end-to-end data science pipeline, demonstrating the ability to meticulously design efficient data models, training data samples, training pipelines, and inference pipelines to ensure reproducibility. You should possess a strong background in evaluating various AI/ML outputs and models such as recommender systems, similarity scoring, classification, and regression. Your role will involve breaking down complex business challenges into manageable sub-tasks that can be addressed with AI/ML solutions to facilitate decision-making at scale. A solid understanding of statistical concepts and empirical methods that influence post-modeling decisions like sorting, ranking, and selection is essential. You must have substantial experience in transitioning Proof of Concepts (PoCs) and experimental models into production-grade AI/ML systems. Collaborating effectively with ML engineering and MLOps teams is crucial to deploying AI/ML models, constructing and serving models through APIs, and integrating AI/ML models into web applications.

The role requires the ability to lead mid-sized teams, strategize project tracks centered around AI/ML and data science, and strike a balance between delegation and hands-on development. Applying a first-principles approach to data science problems, communicating data requirements to data engineering teams, and explaining model outputs to business users are key responsibilities. Engaging with business users to understand requirements and translating them into technical data science challenges is imperative, as is effectively communicating and translating model algorithms and outputs to non-technical users and stakeholders. Proficiency in agile methodologies and project management practices is essential, and creating efficient user stories and other agile project artifacts should come naturally.

If you thrive in a dynamic environment and enjoy collaborating with enthusiastic high-achievers, a rewarding career opportunity awaits you here! If this opportunity does not align with your current aspirations, please express your interest in future openings by clicking on Introduce Yourself in the top-right corner of the page or by setting up email alerts for relevant job postings.
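The similarity-scoring work this posting mentions can be illustrated with a small, self-contained sketch. This is not the employer's code, just a plain-Python cosine-similarity ranking over made-up feature vectors:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length numeric vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Rank candidate items against a user profile vector (toy numbers).
user = [1.0, 0.5, 0.0]
items = {"item_a": [1.0, 0.4, 0.1], "item_b": [0.0, 0.2, 0.9]}
ranked = sorted(items, key=lambda k: cosine_similarity(user, items[k]), reverse=True)
print(ranked)  # item_a ranks above item_b: its vector points the same way as the user's
```

Production recommenders replace the hand-rolled loop with vectorized math and learned embeddings, but the ranking-by-similarity idea is the same.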
Posted 6 days ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
As a Senior PL/SQL Developer with 5-8 years of experience, you will play a crucial role in leading the design, development, and implementation of complex PL/SQL applications. Your expertise will be instrumental in overseeing all phases of the software development lifecycle, ensuring that database structures effectively support business logic and workflows. In addition to your technical responsibilities, you will mentor and guide junior developers, promote best practices and coding standards, and foster a culture of continuous improvement within the team.

Your role will involve collaborating with stakeholders to define project scope, deliverables, and timelines, ensuring that projects are completed to specification and on schedule. You will be expected to perform advanced performance tuning and optimization on large datasets, applying your expertise to improve efficiency and reduce processing time. Furthermore, you will evaluate and recommend tools and technologies that enhance development processes and improve overall project outcomes, ensuring that solutions are scalable and sustainable.

To excel in this position, you should hold a Bachelor's degree in Computer Science or a related field, with a Master's degree preferred for senior candidates. You must have 5-8 years of experience in PL/SQL development, with a focus on business analysis, solutioning, architecting, and client interaction. Strong expertise in Oracle databases, including advanced PL/SQL features and functions, is essential, and your proven experience in performance tuning, optimization, and troubleshooting in complex database environments will be highly valued. Familiarity with data modeling and database design principles is also required.

In return, we offer a competitive salary and a comprehensive benefits package. You will have opportunities for professional growth and development, including training and certifications. Our work environment is collaborative and inclusive, values diversity, and promotes innovation. Join our dynamic team and contribute to innovative projects that directly impact our success.
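The advanced performance tuning described above often comes down to making the optimizer use an index instead of a full scan. As a rough illustration (SQLite standing in for Oracle, with an invented table), the planner's choice can be inspected before and after adding an index:

```python
import sqlite3

# Index-driven tuning sketch: the query planner switches from a full table
# scan to an index search once a suitable index exists.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(i, i % 100, float(i)) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN rows end with a human-readable "detail" column.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)  # full scan of orders
conn.execute("CREATE INDEX ix_orders_customer ON orders (customer_id)")
after = plan(query)   # search via ix_orders_customer
print(before)
print(after)
```

In Oracle the analogous workflow uses `EXPLAIN PLAN FOR` and `DBMS_XPLAN`, and the tuning decisions (selectivity, covering indexes, statistics) are far richer, but the scan-versus-seek check is the same first step.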
Posted 6 days ago
10.0 - 14.0 years
0 Lacs
pune, maharashtra
On-site
You should have over 10 years of experience in data architecture, data engineering, or related roles. Your expertise should include designing and implementing enterprise-level data solutions with a hands-on technical approach, and you should have a proven track record of managing client relationships and leading technical teams.

In terms of technical skills, you must be well-versed in data modeling, data warehousing, and database design, including both relational and NoSQL databases. You should have strong proficiency in data engineering, including experience with ETL tools, data integration frameworks, and big data technologies. Hands-on experience with the Google Cloud data platform and modern data processing frameworks is crucial. Moreover, familiarity with scripting and programming languages like Python and SQL for hands-on development and troubleshooting is essential. Experience with data governance frameworks and solutions such as Informatica, Collibra, or Purview will be a plus.

Soft skills required for this role include exceptional client management and communication skills to confidently interact with both technical and non-technical stakeholders. You should possess proven team management and leadership abilities, including mentoring, coaching, and project management, along with strong analytical and problem-solving skills and a proactive, detail-oriented approach. The ability to work collaboratively in a fast-paced, dynamic environment while successfully driving multiple projects to completion is important.

Preferred certifications for this position include Professional Cloud Architect (GCP), Data Architect, Certified Data Management Professional (CDMP), or similar credentials.
Posted 6 days ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
Join us as a Senior Software Java Developer at Barclays, where you will play a crucial role in supporting the successful delivery of Location Strategy projects. Your responsibilities include ensuring projects are delivered according to plan, budget, agreed quality, and governance standards. You will lead the evolution of our digital landscape, driving innovation and excellence to provide unparalleled customer experiences.

To excel in this role, you should have expertise in various areas, including:
- End-to-end designing, solutioning, and delivering change and new initiatives.
- Strong logical reasoning, problem-solving, performance tuning, and decision-making skills.
- Developing APIs using REST and UI development using Angular/React JS.
- Extensive experience (5+ years) in Core Java and J2EE design and development for large-scale banking applications, along with Cloud expertise.
- Proficiency in OO design and programming techniques, data modeling, and design patterns.
- In-depth knowledge and experience with Spring Boot, Spring, JavaScript, JDBC, and relational SQL (preferably SQL Server).
- Working knowledge of application and web servers, and Linux or other UNIX-based systems.
- Hands-on experience with workflow applications and business rules engines like jBPM.
- Mentoring skills.

Additional skills that would be beneficial for this role include knowledge of SQL databases, experience in implementing CI/CD pipelines and automated deployment processes, and a good understanding of the Banking Domain. As a Senior Software Java Developer, you will be evaluated on key critical skills such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology expertise. This role is based in Pune.

Purpose of the Role: To design, develop, and enhance software using various engineering methodologies to deliver business, platform, and technology capabilities for customers and colleagues.

Accountabilities:
- Develop and deliver high-quality software solutions using industry-aligned programming languages, frameworks, and tools.
- Collaborate with product managers, designers, and engineers to define software requirements, devise solution strategies, and ensure alignment with business objectives.
- Engage in code reviews, promote a culture of code quality, and participate in knowledge sharing.
- Stay updated on industry technology trends, contribute to technology communities, and promote technical excellence.
- Implement secure coding practices and effective unit testing to ensure secure and reliable software solutions.

Analyst Expectations:
- Perform activities in a timely manner and to a high standard to drive continuous improvement.
- Lead and supervise a team, guide professional development, allocate work requirements, and coordinate team resources.
- Demonstrate a clear set of leadership behaviors, or develop technical expertise as an advisor.
- Partner with other functions and business areas, take responsibility for operational activities, and escalate breaches appropriately.
- Influence decision-making, manage risk, strengthen controls, and adhere to relevant rules and regulations.

All colleagues are expected to embody the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, and to demonstrate the Barclays Mindset to Empower, Challenge, and Drive.
Posted 6 days ago
4.0 - 8.0 years
0 Lacs
delhi
On-site
The ideal candidate should possess extensive expertise in SQL, data modeling, ETL/ELT pipeline development, and cloud-based data platforms like Databricks or Snowflake. You will be responsible for designing scalable data models, managing reliable data workflows, and ensuring the integrity and performance of critical financial datasets. Collaboration with engineering, analytics, product, and compliance teams is a key aspect of this role.

Responsibilities:
- Design, implement, and maintain logical and physical data models for transactional, analytical, and reporting systems.
- Develop and oversee scalable ETL/ELT pipelines to process large volumes of financial transaction data.
- Optimize SQL queries, stored procedures, and data transformations for enhanced performance.
- Create and manage data orchestration workflows using tools like Airflow, Dagster, or Luigi.
- Architect data lakes and warehouses utilizing platforms such as Databricks, Snowflake, BigQuery, or Redshift.
- Ensure adherence to data governance, security, and compliance standards (e.g., PCI-DSS, GDPR).
- Work closely with data engineers, analysts, and business stakeholders to understand data requirements and deliver solutions.
- Conduct data profiling, validation, and quality assurance to maintain clean and consistent data.
- Maintain comprehensive documentation for data models, pipelines, and architecture.

Required Skills & Qualifications:
- Proficiency in advanced SQL, including query tuning, indexing, and performance optimization.
- Experience in developing ETL/ELT workflows with tools like Spark, dbt, Talend, or Informatica.
- Familiarity with data orchestration frameworks such as Airflow, Dagster, or Luigi.
- Hands-on experience with cloud-based data platforms like Databricks, Snowflake, or similar technologies.
- Deep understanding of data warehousing principles such as star/snowflake schemas and slowly changing dimensions.
- Knowledge of cloud services (AWS, GCP, or Azure) and data security best practices.
- Strong analytical and problem-solving skills in high-scale environments.

Preferred Qualifications:
- Exposure to real-time data pipelines such as Kafka and Spark Streaming.
- Knowledge of data mesh or data fabric architecture paradigms.
- Certifications in Snowflake, Databricks, or relevant cloud platforms.
- Familiarity with Python or Scala for data engineering tasks.
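One warehousing concept the qualifications list names, slowly changing dimensions, can be sketched in a few lines. This is a simplified Type-2 illustration with hypothetical field names, not a production pattern:

```python
from datetime import date

def scd2_apply(dimension, key, new_attrs, today):
    """Type-2 slowly changing dimension: expire the current row for `key`
    and append a new versioned row when its attributes change."""
    for row in dimension:
        if row["key"] == key and row["end_date"] is None:
            if all(row.get(k) == v for k, v in new_attrs.items()):
                return dimension  # attributes unchanged, keep current version
            row["end_date"] = today  # close out the old version
            break
    dimension.append({"key": key, **new_attrs, "start_date": today, "end_date": None})
    return dimension

# A customer moves city: the old row is end-dated, a new current row appears.
dim = [{"key": "CUST1", "city": "Pune", "start_date": date(2023, 1, 1), "end_date": None}]
scd2_apply(dim, "CUST1", {"city": "Delhi"}, date(2024, 6, 1))
current = [r for r in dim if r["end_date"] is None]
print(len(dim), current[0]["city"])  # 2 Delhi
```

Real implementations usually run as MERGE statements or dbt snapshots and add surrogate keys and current-row flags, but the expire-and-insert mechanics are the same.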
Posted 6 days ago
3.0 - 10.0 years
0 Lacs
karnataka
On-site
You are an experienced Software Developer with a strong background in SQL databases, responsible for leading the development of SQL databases for various applications and business needs. Your expertise in data architecture and management will be crucial in designing and scaling SQL databases to meet the organization's requirements. You will also play a key role in writing SQL queries to store, sort, and retrieve a wide range of data. Your ability to think quickly, stay organized, and troubleshoot issues efficiently will be essential for day-to-day operations.

Your responsibilities will include designing, developing, and maintaining robust SQL databases and database solutions in both on-premises and cloud environments (AWS & Azure). You will provide technical expertise in migrating databases from on-premises to the cloud; knowledge of C++ is an added advantage. Additionally, you will lead a team of SQL developers, offer technical guidance, analyze and resolve issues in real time, automate processes, track issues, and document changes. You will evaluate business data, recommend analytic strategies, perform statistical analysis, and work closely with development and architecture teams to optimize database schemas.

To excel in this role, you should have a Bachelor's degree in Computer Science or a related field, along with 10+ years of experience in SQL development and database management (MS SQL, PostgreSQL). You should also possess 3+ years of experience in data analysis in an enterprise setting, a strong understanding of database design principles and data modeling, knowledge of ETL concepts, and excellent communication and presentation skills. Strong quantitative skills, attention to detail, problem-solving abilities, and the capacity to collaborate effectively with various teams are also essential.

Working at LSEG, a leading global financial markets infrastructure and data provider, will offer you the opportunity to be part of a diverse workforce across 65 countries. You will contribute to a culture that values individuality, encourages new ideas, and is committed to sustainability. By helping to re-engineer the financial ecosystem for sustainable economic growth and supporting the transition to net zero, you will play a critical role in driving inclusive economic opportunity. LSEG provides a range of benefits and support, including healthcare, retirement planning, paid volunteering days, and wellbeing initiatives. If you join us, you will be part of an organization that upholds values of Integrity, Partnership, Excellence, and Change, guiding decision-making and actions on a daily basis.
Posted 6 days ago
3.0 - 7.0 years
0 Lacs
ahmedabad, gujarat
On-site
Relay Human Cloud is a young and dynamic company dedicated to assisting top US-based companies in expanding their teams globally. With operations in the US, India, Honduras, and Mexico (and more countries to be added soon), we specialize in connecting companies with the best international talent. At Relay, our core focus includes areas such as Accounting & Finance, Administration, Operations, Space Planning, Leasing, Data Science, Data Search, Machine Learning, and Artificial Intelligence. Our India operations are based in Ahmedabad and Vadodara, where we prioritize delivering high-quality services to cutting-edge companies.

We are currently seeking a talented and dedicated Yardi Report Developer with expertise in YSR reporting to collaborate directly with our US-based clients. The Yardi Report Developer will play a crucial role in designing, developing, and maintaining custom reports and data visualization solutions within the Yardi property management software. This position is integral to providing accurate insights that support decision-making and enhance property management operations for our clients.

Key Responsibilities:
- Develop and maintain custom YSR reports in the Yardi Voyager property management software.
- Collaborate with business stakeholders to understand their reporting and data visualization requirements.
- Design dynamic and interactive reports and dashboards that provide valuable insights.
- Troubleshoot and resolve issues related to report performance or data accuracy.
- Create and update documentation for YSR reports and processes.
- Stay updated with Yardi software features and updates, implementing them as necessary.
- Assist in data extraction, transformation, and loading processes to support reporting needs.
- Undertake ad-hoc data analysis and reporting tasks as requested.
- Provide training and support to end-users on YSR reporting capabilities.

Qualifications:
- Proficiency in English to communicate effectively with US-based clients.
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
- 3+ years of experience in Yardi property management software, specializing in YSR reporting.
- Strong knowledge of SQL, data modeling, and data warehousing concepts.
- Familiarity with report development tools like Yardi Voyager, YSR, SSRS, Power BI, or similar.
- Excellent problem-solving and analytical abilities.
- Detail-oriented with a focus on data accuracy and report quality.
- Self-motivated and capable of working independently or as part of a team.

Preferred Qualifications:
- Experience in the real estate or property management industry.
- Knowledge of ETL tools and processes.
- Familiarity with data visualization best practices.
Posted 6 days ago
4.0 - 9.0 years
0 Lacs
noida, uttar pradesh
On-site
As a SAP Native HANA Developer with 4 to 9 years of experience, you will be responsible for developing and optimizing native HANA models using Calculation Views (Graphical/SQL Script). You will work with SQL scripting, procedures, and table functions in HANA, while also performing data modeling, performance tuning, and HANA security setup. Collaboration with cross-functional teams to gather requirements and deliver robust solutions is a key aspect of this role. Integrating SAP HANA with various data sources and front-end tools (such as SAP Analytics Cloud and Fiori) is also part of your responsibilities. Additionally, you will troubleshoot performance issues and implement enhancements as needed.

The ideal candidate will have strong hands-on experience with SAP Native HANA 2.0 and proficiency in SQLScript, Calculation Views, and Table Functions. Experience with data provisioning using SDI, SDA, or SLT is required, along with a good understanding of data modeling principles and performance tuning. Excellent communication and teamwork skills are also necessary to succeed in this role.

This is a full-time/permanent position with a hybrid work mode, located in Bangalore and Noida. A notice period of immediate to 20 days is preferred. If you are looking to showcase your expertise in SAP Native HANA development and work in a collaborative environment to deliver innovative solutions, this role is perfect for you.
Posted 6 days ago
4.0 - 10.0 years
0 Lacs
karnataka
On-site
You are a developer of digital futures at Tietoevry, a leading technology company with a strong Nordic heritage and global capabilities. With core values of openness, trust, and diversity, the company collaborates with customers to create digital futures where businesses, societies, and humanity thrive. Its 24,000 experts specialize in cloud, data, and software, serving enterprise and public-sector customers in around 90 countries. Tietoevry's annual turnover is approximately EUR 3 billion, and its shares are listed on the Nasdaq exchange in Helsinki and Stockholm, as well as on Oslo Børs. In the USA, EVRY USA delivers IT services through global delivery centers and offices in India (EVRY India). The company offers a comprehensive IT services portfolio, driving digital transformation across sectors like Banking & Financial Services, Insurance, Healthcare, Retail & Logistics, and Energy, Utilities & Manufacturing. EVRY India's process and project maturity are high, with its offshore development centers in India appraised at CMMI DEV Maturity Level 5 and CMMI SVC Maturity Level 5 and certified under ISO 9001:2015 and ISO/IEC 27001:2013.

As a Senior Data Modeler, you will lead the design and development of enterprise-grade data models for a modern cloud data platform built on Snowflake and Azure. With a strong foundation in data modeling best practices and hands-on experience with the Medallion Architecture, you will ensure data structures are scalable, reusable, and aligned with business and regulatory requirements. You will work on data models that meet processing, analytics, and reporting needs, focusing on Snowflake data warehousing and the Medallion Architecture's Bronze, Silver, and Gold layers. Collaborating with various stakeholders, you will translate business needs into scalable data models, drive data model governance, and ensure compliance with data governance, quality, and security requirements.

**Pre-requisites:**
- 10 years of experience in data modeling, data architecture, or data engineering roles.
- 4 years of experience modeling data in Snowflake or other cloud data warehouses.
- Strong understanding and hands-on experience with Medallion Architecture and modern data platform design.
- Experience using data modeling tools (Erwin, etc.).
- Proficiency in data modeling techniques: 3NF, dimensional modeling, data vault, and star/snowflake schemas.
- Expert-level SQL and experience working with semi-structured data (JSON, XML).
- Familiarity with Azure data services (ADF, ADLS, Synapse, Purview).

**Key Responsibilities:**
- Design, develop, and maintain data models for Snowflake data warehousing.
- Lead the design and implementation of logical, physical, and canonical data models.
- Architect data models for the Bronze, Silver, and Gold layers following the Medallion Architecture.
- Collaborate with stakeholders to translate business needs into scalable data models.
- Drive data model governance and compliance with data requirements.
- Conduct data profiling, gap analysis, and data integration efforts.
- Support time-travel style reporting and build models for operational and analytical reports.

Recruiter Information:
- Recruiter Name: Harish Gotur
- Recruiter Email Id: harish.gotur@tietoevry.com
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
We are developers of digital futures! Tietoevry creates purposeful technology that reinvents the world for good. We are a leading technology company with a strong Nordic heritage and global capabilities. Based on our core values of openness, trust, and diversity, we work with our customers to develop digital futures where businesses, societies, and humanity thrive. Our 24,000 experts globally specialize in cloud, data, and software, serving thousands of enterprise and public-sector customers in approximately 90 countries. Tietoevry's annual turnover is approximately EUR 3 billion, and the company's shares are listed on the Nasdaq exchange in Helsinki and Stockholm, as well as on Oslo Børs.

EVRY USA delivers IT services to a wide range of customers in the USA through its global delivery centers and India offices (EVRY India) in Bangalore and Chandigarh, India. We offer a comprehensive IT services portfolio and drive digital transformation across various sectors including Banking & Financial Services, Insurance, Healthcare, Retail & Logistics, and Energy, Utilities & Manufacturing. EVRY India's process and project maturity is very high, with the two offshore development centers in India appraised at CMMI DEV Maturity Level 5 and CMMI SVC Maturity Level 5 and certified under ISO 9001:2015 and ISO/IEC 27001:2013.

We are seeking a highly experienced Snowflake Architect with deep expertise in building scalable data platforms on Azure, applying Medallion Architecture principles. The ideal candidate should have strong experience working in the Banking domain and will play a key role in architecting secure, performant, and compliant data solutions to support business intelligence, risk, compliance, and analytics initiatives.

**Pre-requisites:**
- 5 years of hands-on experience in Snowflake, including schema design, security setup, and performance tuning.
- Implementation experience using Snowpark.
- Must have a data architecture background.
- Has deployed a fully operational data solution into production on Snowflake and Azure.
- Snowflake certification preferred.
- Familiarity with data modeling practices like dimensional modeling and data vault.
- Understanding of the dbt tool.

**Key Responsibilities:**
- Design and implement scalable and performant data platforms using Snowflake on Azure, tailored for banking industry use cases.
- Architect ingestion, transformation, and consumption layers using the Medallion Architecture for a performant and scalable data platform.
- Work with data engineers to build modular and reusable Bronze, Silver, and Gold layer models that support diverse workloads.
- Provide architectural oversight and best practices to ensure scalability, performance, and maintainability.
- Collaborate with stakeholders from risk, compliance, and analytics teams to translate requirements into data-driven solutions.
- Build architecture to support time-travel style reporting.
- Support CI/CD automation and environment management using tools like Azure DevOps and Git.
- Build architecture to support operational and analytical reports.

Recruiter Information:
- Recruiter Name: Harish Gotur
- Recruiter Email Id: harish.gotur@tietoevry.com
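The Bronze/Silver/Gold layering this posting describes can be illustrated with a toy pipeline. The sketch below uses plain Python and invented records; in a real platform each layer would be Snowflake tables populated by Snowpark or ELT jobs:

```python
# Medallion-layer sketch: raw (Bronze) records are cleaned into Silver,
# then aggregated into a Gold, business-facing table.
bronze = [
    {"txn_id": "1", "amount": "100.50", "branch": " Mumbai "},
    {"txn_id": "2", "amount": "bad",    "branch": "Pune"},     # malformed row
    {"txn_id": "3", "amount": "49.50",  "branch": "Mumbai"},
]

def to_silver(rows):
    """Silver: typed and trimmed; rows that fail parsing are dropped."""
    clean = []
    for r in rows:
        try:
            clean.append({"txn_id": int(r["txn_id"]),
                          "amount": float(r["amount"]),
                          "branch": r["branch"].strip()})
        except ValueError:
            pass  # a real pipeline would route this row to a quarantine table
    return clean

def to_gold(rows):
    """Gold: business-level aggregate, here transaction totals per branch."""
    totals = {}
    for r in rows:
        totals[r["branch"]] = totals.get(r["branch"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'Mumbai': 150.0}
```

The design point the architecture makes is separation of concerns: Bronze preserves raw input for replay, Silver enforces types and quality, and Gold serves reporting without re-reading raw data.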
Posted 1 week ago
12.0 - 16.0 years
0 Lacs
hyderabad, telangana
On-site
You are an experienced Data Architect with over 12 years of expertise in data architecture, data engineering, and enterprise-scale data solutions. Your strong background in Microsoft Fabric Data Engineering, Azure Synapse, Power BI, and Data Lake will be instrumental in driving strategic data initiatives for our organization in Hyderabad, India.

In this role, you will design and implement scalable, secure, and high-performance data architecture solutions utilizing Microsoft Fabric and related Azure services. Your responsibilities will include defining data strategies aligned with business goals, architecting data pipelines and warehouses, collaborating with stakeholders to define data requirements, and providing technical leadership in data engineering best practices.

Your qualifications include 12+ years of experience in data engineering or related roles, proven expertise in Microsoft Fabric and Azure Data Services, hands-on experience in modern data platform design, and proficiency in SQL, Python, Spark, and Power BI, as well as strong problem-solving and communication skills. Preferred qualifications include Microsoft certifications, experience with DevOps and CI/CD for data projects, exposure to real-time streaming and IoT data, and prior Agile/Scrum environment experience.

If you are passionate about driving innovation in data architecture, optimizing data performance, and leading data initiatives that align with business objectives, we encourage you to apply for this full-time Data Architect position in Hyderabad, India.
Posted 1 week ago
6.0 - 12.0 years
0 Lacs
karnataka
On-site
As a Syniti ADM Jr Developer at NTT DATA in Bangalore, Karnataka (IN-KA), India, you will play a crucial role in ensuring the successful migration of SAP objects from SAP ECC to SAP S4/HANA or from legacy source systems into SAP S4/HANA. Your responsibilities will include working closely with project team members, providing technical guidance, taking ownership of the project's technical aspects, collaborating with clients and other stakeholders, and ensuring the smooth execution of the migration process. To excel in this role, you should have at least 6 years of experience in SAP Data Cleansing, Profiling, Harmonization, and Migration, as well as expertise in SAP Cutover Planning, Data Analysis, and Business Intelligence. Your diverse experience should encompass successful projects involving Data Migration using BackOffice Associates Tools Methodology (Syniti ADM) in areas such as SAP MM and SAP Plant Maintenance. You will need to demonstrate proven hands-on experience in leading large-scale SAP data migration projects, supervising project team members, and ensuring deliverables align with client requirements. Your technical skills should include proficiency in Syniti ADM tool and MS-SQL Server, along with extensive experience in Data Cleansing, Profiling, and Harmonization using Backoffice Tools like ADM and qSuite for SAP Plant Maintenance Master Data and Materials Master Data. Additionally, you should have knowledge of Pharmaceutical and Healthcare verticals, possess excellent communication and organizational skills, and exhibit adaptability in new environments. Your ability to work effectively in time-sensitive environments, lead offshore and onsite teams, and collaborate within onsite-offshore models will be essential for success in this role. 
Furthermore, your specific expertise should include a Bachelor's Degree in Computer Science Engineering or equivalent, along with 12 years of IT experience in SAP Data Migration, Data Analysis, Process Management, Business Analysis, and more. Certification in BackOffice Associates Data Migration and ETL Tools, such as Syniti ADM and qSuite, is required. Your experience in Healthcare, Pharmaceutical, and various SAP modules like MM, PM, Finance, and SD will be valuable for this position. In summary, as a Syniti ADM Jr Developer at NTT DATA, you will be part of a global innovator in business and technology services, dedicated to helping clients innovate, optimize, and transform for long-term success. With your expertise and skills, you will contribute to the seamless migration of SAP objects and play a vital role in the overall success of the projects undertaken by the organization.,
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
hyderabad, telangana
On-site
The Data Analyst II plays a critical role in engaging with stakeholders and technical team members to execute requirement gathering, documentation of data flows, mapping, extraction, transformation, visualizations, and analytical data analysis. You will work closely with cross-functional teams, including IT and business stakeholders, to ensure seamless and efficient data flow, report generation, visualizations, and data analysis. Collaboration with various departments to ensure data accuracy, integrity, and compliance with established data standards is essential. This role reports to the BEST Data Services Senior Manager in the Business Enterprise Systems Technology department.

A successful Data Analyst must take a hands-on approach, ensuring the highest quality solutions are provided to business stakeholders, with accurate development, documentation, and adherence to deadlines. This role will also work with key stakeholders across the organization to drive enhancements to a successful implementation and ensure all reporting and analytics meet requirements and are deployed and implemented properly.

Responsibilities include:
- Engaging with multiple teams to understand reporting and analytical requirements
- Collaborating with business stakeholders to understand their requirements and translate them into reporting specifications
- Retrieving and manipulating data from various sources using SQL and ETL tools
- Cleansing, transforming, and enriching data for accuracy and consistency
- Developing and maintaining data models to support reporting needs
- Creating standardized reports and dashboards using tools like Power BI and Tableau
- Designing and building data visualizations and conducting ad-hoc analysis
- Ensuring data security and compliance, aligned with business objectives
- Working on data migration and providing post-migration support
- Communicating project status and maintaining detailed documentation
- Collaborating with technical teams on solutions

Required Knowledge/Skills/Abilities:
- Minimum of 1 year of hands-on Data Analyst experience
- Proficiency in SQL for data extraction, manipulation, and analysis
- Minimum of 1 year of experience with data visualization tools like Power BI
- Minimum of 2 years' experience with Python libraries for data visualization and analysis
- Strong understanding of data structures, databases, and data models
- Excellent communication skills
- Proficiency in designing and implementing process workflows and data diagrams
- Proven Agile development experience
- Excellent problem-solving skills and innovative thinking
- Exceptional communication, analytical, and management skills
- Ability to present technical concepts to both business executives and technical teams
- Able to manage daily stand-ups, escalations, issues, and risks
- Self-directed, adaptable, empathetic, flexible, and forward-thinking
- Strong organizational, interpersonal, and relationship-building skills
- Passionate about technology, digital transformation, and business process reengineering
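The extract-and-transform loop described in this posting (SQL retrieval followed by light Python manipulation before reporting) can be sketched minimally. Python's built-in sqlite3 stands in here for a production warehouse, and the table and column names are invented for illustration:

```python
import sqlite3

# Hypothetical in-memory database standing in for a production source system
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (region TEXT, amount REAL);
    INSERT INTO orders VALUES ('North', 120.0), ('North', 80.0), ('South', 50.0);
""")

# Extraction plus aggregation in SQL, the shape of a typical reporting query
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM orders GROUP BY region ORDER BY region"
).fetchall()

# Light transformation in Python before handing off to a visualization tool
report = {region: total for region, total in rows}
print(report)  # {'North': 200.0, 'South': 50.0}
```

In practice the connection would point at a warehouse such as SQL Server or Snowflake, and `report` would feed a Power BI or Tableau dataset rather than a dict.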
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
kochi, kerala
On-site
As an ideal candidate for this position, you should have a minimum of 1 year hands-on experience with Oracle Fusion Applications. You must possess knowledge and practical usage of OTBI (Oracle Transactional Business Intelligence), Oracle BI Publisher (OBIEE/BI Reports), and FDI (Fusion Data Integration). A basic understanding of SQL and data modeling is also required for this role. Experience in report customization, dashboard development, and data extraction is essential, along with the ability to troubleshoot and optimize the performance of BI reports. In addition to the required skills, exposure to Oracle Cloud Infrastructure (OCI) and familiarity with REST/SOAP APIs for Oracle Fusion would be considered advantageous for this position. This is a Full-time job opportunity with benefits including health insurance, paid sick time, paid time off, and the flexibility to work from home. The work location for this role is in person, and the expected start date is 29/07/2025.,
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
As an Associate Director Oracle Technical Architect, you will be a certified Oracle Technical Architect proficient in architecting, solutioning, implementing, and developing various Oracle SaaS, PaaS, and IaaS solutions. You will be responsible for mentoring, guiding, and monitoring a skilled technical team proficient in technologies like Oracle Integration Cloud, Oracle BIP, Oracle FAW, Oracle APEX, Oracle Business Process Management, Oracle Java Cloud Services, Oracle VBCS, node.js, etc.

Your key responsibilities will include architecting solutions for different customers' technical challenges and functional needs by identifying best-in-class technology solutions leveraging the Oracle footprint. You will provide detailed technical architecture and a roadmap to the development and functional teams for seamless conversion, integration, reports, and workflow improvements. Additionally, you will design, develop, and maintain robust integrations using Oracle Integration Cloud, as well as metrics-based analytics and reporting solutions using technologies like FAW, OTBI, or BIP.

You will also be responsible for providing insights and implementing best coding practices and standards, assisting with administration and provisioning of various servers and related services in OCI, fine-tuning code for quality delivery to customers, and collaborating with stakeholders to gather and translate functional and technical requirements into effective solutions. Your ability to work across various functional areas like ERP, HCM, EPM, SCM, and CX on both cloud/SaaS and on-premises Oracle solutions will be crucial. As a thought leader, you will mentor, guide, and monitor technical/development team members and ensure technical delivery for the entire organization across multiple customers.

Your primary technical skills should include proficiency in Oracle techno-functional applications; strong expertise in SQL, PL/SQL, and Oracle database technologies; designing and developing integrations using OIC; Oracle APEX; ATP solutioning; data modeling, ETL, and data warehousing; and experience in fine-tuning database queries. Strong analytical and solution-oriented skills are essential.

Certifications required for this role include OCI Cloud Architect, and preferably Oracle Cloud Fusion Analytics Warehouse Certified Implementation Professional, Oracle APEX Developer, and Oracle SaaS-related certifications.

Qualifications should include a Master's degree in Computer Science, Information Technology, or a related field; a minimum of 6 years in the Oracle Cloud applications technology field; added experience in Oracle on-premises solutions; excellent problem-solving abilities; strong communication skills; a collaborative approach; and strong leadership skills aligned with organizational goals and success.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
pune, maharashtra
On-site
As a SQL Developer at Fusion Practices, you will be responsible for designing, developing, and maintaining high-performance MS SQL databases, stored procedures, and T-SQL queries tailored to banking systems and financial reporting. You will utilize your expertise in Microsoft SQL Server and Power BI to develop and implement business intelligence solutions, including crafting dashboards, reports, and data visualizations to support financial and regulatory analysis. Collaboration with stakeholders and business users to gather requirements related to financial products, reporting standards, and risk/regulatory needs will be a key aspect of your role. Your responsibilities will also include working on data modeling, SSIS packages, SQL Agent jobs, SSRS reports, and Power BI dashboards with a focus on banking data and compliance needs. You will translate complex financial data requirements into scalable and performant technical solutions and implement RESTful APIs to support integrations with front-end frameworks (Angular/React) and back-end systems (.NET/C#). Additionally, you will be involved in performance tuning, debugging, unit testing, and documentation for all solutions developed, ensuring adherence to compliance, audit, and financial reporting standards and guidelines. To be successful in this role, you should have 10+ years of hands-on experience with MS SQL Server, T-SQL, SSIS, SSRS, and SQL Agent. Strong expertise in Microsoft Power BI for data visualization and reporting is essential, along with a programming background in C#, JavaScript, and .NET framework. Proven experience in the banking domain, particularly with financial products like loans and deposits, is required. Familiarity with financial reporting frameworks, regulatory compliance, and data privacy standards is also expected. Knowledge of HTML, CSS, and front-end technologies (Angular/React) is a plus, as well as a strong grasp of version control tools (Git) and Agile methodologies.,
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
The role of Finance Supply Chain DDA & System automation Lead at GSK is crucial in driving automation and advancing analytics capabilities across the Finance Supply Chains. The scope of the role has expanded to cover both the Vaccines and Pharma sectors, with an increased number of systems to support and users to assist. The Lead will be responsible for leading strategic global projects within GSCF, such as implementing new forecasting tools and S/4HANA (One SAP), in addition to bringing innovative solutions to enhance productivity.

The main purpose of the role includes building a GSCF data lake in Azure/Datasphere, developing data flows for reporting, creating statistical forecasting models, integrating advanced analytics tools, and implementing AI solutions within GSCF. The Lead will collaborate closely with TECH and DD&A teams to implement the data strategy and innovation roadmap, partner with GSCF leadership to understand challenges related to AI and new technologies, and drive and implement projects around data modeling and AI. Key responsibilities also include translating finance needs into data model requirements, acting as a conduit between the Tech organization and Finance for new ideas and innovations, defining and monitoring KPIs for digital finance solutions, and identifying and resolving issues in data reconciliation.

The ideal candidate for this role should have a minimum of 10 years of experience in SAP, data modeling, and BI, with strong technical knowledge and an interest in technology and innovation. They should possess effective communication skills, high levels of resilience and energy, analytical skills, and learning agility. Additionally, the candidate should demonstrate accountability, self-motivation, ambition, attention to detail, and the ability to work effectively in a multicultural environment.
GSK is a global biopharma company focused on uniting science, technology, and talent to prevent and treat diseases through vaccines, specialty, and general medicines. The organization aims to positively impact the health of billions of people while delivering sustainable shareholder returns. GSK values its employees and strives to create a welcoming and inclusive environment where individuals can thrive, grow, and contribute to the company's success. If you are passionate about making a difference in healthcare and are eager to be part of a team that values innovation and collaboration, then join GSK in their mission to get ahead of disease together.,
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
As a Lead Data Engineer specializing in Snowflake Migration at Anblicks, you will be a key player in our Data Modernization Center of Excellence (COE). You will be at the forefront of transforming traditional data platforms by utilizing Snowflake, cloud-native tools, and intelligent automation to help enterprises unlock the power of the cloud.

Your primary responsibility will be to lead the migration of legacy data warehouses such as Teradata, Netezza, Oracle, or SQL Server to Snowflake. You will re-engineer and modernize ETL pipelines using cloud-native tools and frameworks like dbt, Snowflake Tasks, Streams, and Snowpark. Additionally, you will design robust ELT pipelines on Snowflake that ensure high performance, scalability, and cost optimization, while integrating Snowflake with AWS, Azure, or GCP. In this role, you will also focus on implementing secure and compliant architectures with RBAC, masking policies, Unity Catalog, and SSO. Automating repeatable tasks, ensuring data quality and parity between source and target systems, and mentoring junior engineers will be essential aspects of your responsibilities. Collaboration with client stakeholders, architects, and delivery teams to define migration strategies, as well as presenting solutions and roadmaps to technical and business leaders, will also be part of your role.

To qualify for this position, you should have at least 6 years of experience in Data Engineering or Data Warehousing, with a minimum of 3 years of hands-on experience in Snowflake design and development. Strong expertise in migrating ETL pipelines from Talend and/or Informatica to cloud-native alternatives, and proficiency in SQL, data modeling, ELT design, and pipeline performance tuning are prerequisites. Familiarity with tools like DBT Cloud, Airflow, Snowflake Tasks, or similar orchestrators, as well as a solid understanding of cloud data architecture, security frameworks, and data governance, are also required.
Preferred qualifications include Snowflake certifications (SnowPro Core and/or SnowPro Advanced Architect), experience with custom migration tools, metadata-driven pipelines, or LLM-based code conversion, familiarity with domain-specific architectures in Retail, Healthcare, or Manufacturing, and prior experience in a COE or modernization-focused consulting environment. By joining Anblicks as a Lead Data Engineer, you will have the opportunity to lead enterprise-wide data modernization programs, tackle complex real-world challenges, and work alongside certified Snowflake architects, cloud engineers, and innovation teams. You will also have the chance to build reusable IP that scales across clients and industries, while experiencing accelerated career growth in the dynamic Data & AI landscape.,
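Ensuring data parity between source and target systems, as this posting requires, is commonly done with row-count and checksum comparisons after each migrated table lands. A minimal sketch follows, with sqlite3 standing in for both the legacy warehouse and Snowflake; the table, column names, and sample rows are invented:

```python
import sqlite3

def parity_check(source, target, table, numeric_col):
    """Compare row count and a simple numeric checksum across two connections."""
    def profile(conn):
        count = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
        # TOTAL() returns 0.0 instead of NULL on empty tables, unlike SUM()
        checksum = conn.execute(f"SELECT TOTAL({numeric_col}) FROM {table}").fetchone()[0]
        return count, checksum
    return profile(source) == profile(target)

# Hypothetical legacy source and migrated target loaded with the same rows
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for conn in (source, target):
    conn.executescript(
        "CREATE TABLE accounts (id INTEGER, balance REAL);"
        "INSERT INTO accounts VALUES (1, 10.0), (2, 20.5), (3, 30.0);"
    )

print(parity_check(source, target, "accounts", "balance"))  # True
```

Production parity suites usually go further (per-column hashes, sampled row diffs, null-rate comparisons), but count-plus-checksum is the usual first gate before signing off a migrated table.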
Posted 1 week ago
2.0 - 6.0 years
0 - 0 Lacs
bangalore, pune
On-site
Key Responsibilities:
- Design, develop, and maintain SAP BI reports and dashboards
- Work on data modeling, data extraction, and ETL processes using SAP BW
- Collaborate with business users to gather reporting requirements
- Create and manage InfoCubes, DSOs, MultiProviders, and BEx Queries
- Ensure data accuracy and optimize report performance
- Integrate SAP BI with front-end tools like SAP BO, Lumira, or Analytics Cloud
- Support testing, documentation, and end-user training

Skills Required:
- 2–3 years of hands-on experience in SAP BI/BW development and support
- Strong knowledge of SAP BW data modeling, BEx Queries, and ETL
- Experience with data extraction from SAP and non-SAP sources
- Good understanding of BEx Analyzer, BO tools, and data flow architecture
- Familiarity with SAP HANA, S/4HANA, or SAP BW on HANA is an advantage
- Excellent analytical and problem-solving skills
- Strong communication and stakeholder management abilities

To Apply:
Walk-in / Contact us at:
White Horse Manpower
#12, Office 156, 3rd Floor, Jumma Masjid Golden Complex, Jumma Masjid Road, Bangalore 560051.
Contact Number: 9739002621
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You will be responsible for leading the delivery of complex solutions by coding larger features from start to finish. Actively participating in planning and performing code and architecture reviews of your team's product will be a crucial aspect of your role. You will help ensure the quality and integrity of the Software Development Life Cycle (SDLC) for your team by identifying opportunities to improve how the team works through the use of recommended tools and practices. Additionally, you will lead the triage of complex production issues across systems and demonstrate creativity and initiative in solving complex problems. As a high performer, you will consistently deliver a high volume of story points relative to your team.

Being aware of the technology landscape, you will plan the delivery of coarse-grained business needs spanning multiple applications. You will also influence technical peers outside your team and set a consistent example of agile development practices. Coaching other engineers to work as a team with Product and UX will be part of your responsibilities.

Furthermore, you will create and enhance internal libraries and tools, provide technical leadership on the product, and determine the technical approach. Proactively communicating status and issues to your manager, collaborating with other teams to find creative solutions to customer issues, and showing a commitment to delivery deadlines, especially seasonal and vendor partner deadlines that are critical to Best Buy's continued success, will be essential.

Basic Qualifications:
- 5+ years of relevant technical professional experience with a bachelor's degree OR equivalent professional experience
- 2+ years of experience with Google Cloud services, including Dataflow, BigQuery, and Looker
- 1+ years of experience with Adobe Analytics, Content Square, or similar technologies
- Hands-on experience with data engineering and visualization tools like SQL, Airflow, dbt, Power BI, Tableau, and Looker
- Strong understanding of real-time data processing and issue detection
- Expertise in data architecture, database design, data quality standards/implementation, and data modeling

Preferred Qualifications:
- Experience working in an omni-channel retail environment
- Experience connecting technical issues with business performance metrics
- Experience with Forsta or similar customer feedback systems
- Certification in Google Cloud Platform services
- Good understanding of data governance, data privacy laws & regulations, and best practices

About Best Buy:
BBY India is a service provider to Best Buy, and as part of the team working on Best Buy projects and initiatives, you will help fulfill Best Buy's purpose to enrich lives through technology. Every day, you will humanize and personalize tech solutions for every stage of life in Best Buy stores, online, and in Best Buy customers' homes.

Best Buy is a place where techies can make technology more meaningful in the lives of millions of people, enabling the purpose of enriching lives through technology. The unique culture at Best Buy unleashes the power of its people and provides fast-moving, collaborative, and inclusive experiences that empower employees of all backgrounds to make a difference, learn, and grow every day. Best Buy's culture is built on deeply supporting and valuing its amazing employees and other team members. Best Buy is committed to being a great place to work, where you can unlock unique career possibilities. Above all, Best Buy aims to provide a place where people can bring their full, authentic selves to work now and into the future. Tomorrow works here.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
As a Data Architect / Data Modeling Expert, you will be an essential part of our offshore team based in India, collaborating closely with Business Analysts and Technical Analysts. Your primary responsibilities will revolve around designing and implementing efficient data models in Snowflake, along with creating source-to-target mapping documents. Your expertise in data modeling principles, coupled with exposure to ETL tools, will play a crucial role in architecting databases and driving data modeling initiatives leading to AI solutions.

Your key responsibilities will include:
- Designing and implementing normalized and denormalized data models in Snowflake based on business and technical requirements.
- Collaborating with Business Analysts/Technical Analysts to gather data needs and document requirements effectively.
- Developing source-to-target mapping documents to ensure accurate data transformations.
- Working on data ingestion, transformation, and integration pipelines using SQL and cloud-based tools.
- Optimizing Snowflake queries, schema designs, and indexing for enhanced performance.
- Maintaining clear documentation of data models, mappings, and data flow processes.
- Ensuring data accuracy, consistency, and compliance with best practices in data governance and quality.

You should possess:
- 10+ years of experience in Data Modeling, Data Engineering, or related roles.
- A strong understanding of data modeling concepts such as OLTP, OLAP, Star Schema, and Snowflake Schema.
- Hands-on experience in Snowflake, including schema design and query optimization.
- The ability to create detailed source-to-target mapping documents.
- Proficiency in SQL-based data transformations and queries.
- Exposure to ETL tools, with familiarity with Matillion considered advantageous.
- Strong problem-solving and analytical skills.
- Excellent communication skills for effective collaboration with cross-functional teams.

Preferred qualifications include experience in cloud-based data environments (AWS, Azure, or GCP), hands-on exposure to Matillion or other ETL tools, an understanding of data governance and security best practices, and familiarity with Agile methodologies.
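The star-schema concept this role calls for can be illustrated in miniature: a narrow fact table of measures carrying foreign keys into descriptive dimensions. A hypothetical sketch using Python's built-in sqlite3, with invented table, column, and sample names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension: descriptive attributes keyed by a surrogate key
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    INSERT INTO dim_product VALUES (1, 'Books'), (2, 'Games');

    -- Fact: narrow measures plus foreign keys into the dimensions
    CREATE TABLE fact_sales (product_key INTEGER, qty INTEGER);
    INSERT INTO fact_sales VALUES (1, 3), (1, 2), (2, 7);
""")

# The typical star-schema query shape: join facts to dimensions, group by attributes
rows = conn.execute("""
    SELECT d.category, SUM(f.qty)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(rows)  # [('Books', 5), ('Games', 7)]
```

A snowflake schema differs only in that the dimensions themselves are further normalized (e.g. `dim_product` referencing a separate `dim_category`), trading some query simplicity for reduced redundancy.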
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
kochi, kerala
On-site
The ideal candidate for the role of Data Architect should have at least 8+ years of experience in Modern Data Architecture, RDBMS, ETL, NoSQL, Data warehousing, Data Governance, Data Modeling, and Performance Optimization, along with proficiency in Azure/AWS/GCP. Primary skills include defining architecture & end-to-end development of Database/ETL/Data Governance processes. It is essential for the candidate to possess technical leadership skills and provide mentorship to junior team members. The candidate must have hands-on experience in 3 to 4 end-to-end projects involving Modern Data Architecture and Data Governance. Responsibilities include defining the architecture for Data engineering projects and Data Governance systems, designing, developing, and supporting Data Integration applications using Azure/AWS/GCP Cloud platforms, and implementing performance optimization techniques. Proficiency in advanced SQL and experience in modeling/designing transactional and DWH databases is required. Adherence to ISMS policies and procedures is mandatory. Good to have skills include Python, Pyspark, and Power BI. The candidate is expected to onboard by 15/01/2025 and possess a Bachelor's Degree qualification. The role entails ensuring the performance of all duties in accordance with the company's policies and procedures.,
Posted 1 week ago
2.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
As a Principal Data Engineer at Brillio, you will play a key role in leveraging your expertise in Data Modeling, particularly with tools like ER Studio and ER Win. Brillio, known for its digital technology services and partnership with Fortune 1000 companies, is committed to transforming disruptions into competitive advantages through innovative digital solutions. With over 10 years of IT experience and at least 2 years of hands-on experience in Snowflake, you will be responsible for building and maintaining data models that support the organization's Data Lake/ODS, ETL processes, and data warehousing needs. Your ability to collaborate closely with clients to deliver physical and logical model solutions will be critical to the success of various projects. In this role, you will demonstrate your expertise in Data Modeling concepts at an advanced level, with a focus on modeling in large volume-based environments. Your experience with tools like ER Studio and your overall understanding of database technologies, data warehouses, and analytics will be essential in designing and implementing effective data models. Additionally, your strong skills in Entity Relationship Modeling, knowledge of database design and administration, and proficiency in SQL query development will enable you to contribute to the design and optimization of data structures, including Star Schema design. Your leadership abilities and excellent communication skills will be instrumental in leading teams and ensuring the successful implementation of data modeling solutions. While experience with AWS ecosystems is a plus, your dedication to staying at the forefront of technological advancements and your passion for delivering exceptional client experiences will make you a valuable addition to Brillio's team of "Brillians." Join us in our mission to create innovative digital solutions and make a difference in the world of technology.,
Posted 1 week ago