5.0 - 10.0 years
6 - 10 Lacs
Mumbai
Remote
Travel Requirement: willingness to travel to the UK as needed is a plus.
Job Description: We are seeking a highly experienced Senior Data Engineer with a strong background in Microsoft Fabric and hands-on project experience with it. This is a remote position based in India, ideal for professionals who are open to occasional travel to the UK; a valid passport is required.
Key Responsibilities:
- Design and implement scalable data solutions using Microsoft Fabric
- Lead complex data integration, transformation, and migration projects
- Collaborate with global teams to deliver end-to-end data pipelines and architecture
- Optimize the performance of data systems and troubleshoot issues proactively
- Ensure data governance, security, and compliance with industry best practices
Required Skills and Experience:
- 5+ years of experience in data engineering, including architecture and development
- Expertise in Microsoft Fabric, Data Lake, Azure Data Services, and related technologies
- Experience in SQL, data modeling, and data pipeline development
- Knowledge of modern data platforms and big data technologies
- Excellent communication and leadership skills
Preferred Qualifications:
- Good communication skills
- Understanding of data governance and security best practices
Perks & Benefits:
- Work-from-home flexibility
- Competitive salary and perks
- Opportunities for international exposure
- Collaborative and inclusive work culture
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
Karnataka
On-site
As a part of BSH Home Appliances Group, you will be responsible for the further development and management of the "Master Data Management (MDM) Data Quality" area. Your role will involve collecting, assessing, and prioritizing requirements in close collaboration with business units and IT teams. You will lead the implementation of data quality initiatives to ensure high data quality across the organization. Additionally, you will be responsible for reporting, analyzing, and visualizing data quality metrics using tools such as Power BI, as well as handling data integration and creating dashboards utilizing Microsoft Power BI, backend development, DENODO, SAP R3, S4 HANA, and data integration tooling. To excel in this role, you should possess excellent stakeholder management and moderation skills. You are expected to have a structured, solution-oriented, and independent working style. Experience working in an Agile environment is crucial, along with the ability to adapt to changing priorities and collaborate effectively with cross-functional teams. The ideal candidate will have 6 or more years of experience in designing, developing, and maintaining interactive Power BI dashboards, with at least 2 years of experience as a Product Owner. At BSH Home Appliances Group, we offer competitive benefits including GTLI and GMC. If you are ready to take on this exciting opportunity and grow your career in a dynamic environment, we invite you to visit bsh-group.com/career and join our team.
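The data quality metrics this role reports on (completeness, uniqueness, and similar) reduce to simple aggregate checks over master-data records. The sketch below shows the idea in plain Python; the record layout and field names are hypothetical, not BSH's actual schema.

```python
# Minimal sketch of master-data quality metrics (completeness and
# uniqueness). Record layout and field names are hypothetical.
records = [
    {"material_id": "M-001", "description": "Oven door seal", "plant": "P10"},
    {"material_id": "M-002", "description": None,             "plant": "P10"},
    {"material_id": "M-003", "description": "Drum bearing",   "plant": None},
    {"material_id": "M-001", "description": "Oven door seal", "plant": "P10"},
]

def completeness(rows, field):
    """Share of rows where the given field is populated."""
    return sum(1 for r in rows if r[field] is not None) / len(rows)

def uniqueness(rows, key):
    """Share of distinct key values among all rows (1.0 = no duplicates)."""
    return len({r[key] for r in rows}) / len(rows)

metrics = {
    "description_completeness": completeness(records, "description"),
    "plant_completeness": completeness(records, "plant"),
    "material_id_uniqueness": uniqueness(records, "material_id"),
}
```

In practice such metrics would be computed in the warehouse or virtualization layer and surfaced in a Power BI dashboard; the Python version just makes the arithmetic explicit.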
Posted 2 weeks ago
3.0 - 7.0 years
11 - 15 Lacs
Gurugram
Work from Office
Overview
We are seeking an experienced Data Modeller with expertise in designing and implementing data models for modern data platforms. This role requires deep knowledge of data modeling techniques, healthcare data structures, and experience with Databricks Lakehouse architecture. The ideal candidate will have a proven track record of translating complex business requirements into efficient, scalable data models that support analytics and reporting needs.
About the Role
As a Data Modeller, you will be responsible for designing and implementing data models for our Databricks-based Modern Data Platform. You will work closely with business stakeholders, data architects, and data engineers to create logical and physical data models that support the migration from legacy systems to the Databricks Lakehouse architecture, ensuring data integrity, performance, and compliance with healthcare industry standards.
Key Responsibilities
- Design and implement logical and physical data models for Databricks Lakehouse implementations
- Translate business requirements into efficient, scalable data models
- Create and maintain data dictionaries, entity relationship diagrams, and model documentation
- Develop dimensional models, data vault models, and other modeling approaches as appropriate
- Support the migration of data models from legacy systems to the Databricks platform
- Collaborate with data architects to ensure alignment with the overall data architecture
- Work with data engineers to implement and optimize data models
- Ensure data models comply with healthcare industry regulations and standards
- Implement data modeling best practices and standards
- Provide guidance on data modeling approaches and techniques
- Participate in data governance initiatives and data quality assessments
- Stay current with evolving data modeling techniques and industry trends
Qualifications
- Extensive experience in data modeling for analytics and reporting systems
- Strong knowledge of dimensional modeling, data vault, and other modeling methodologies
- Experience with the Databricks platform and Delta Lake architecture
- Expertise in healthcare data modeling and industry standards
- Experience migrating data models from legacy systems to modern platforms
- Strong SQL skills and experience with data definition languages
- Understanding of data governance principles and practices
- Experience with data modeling tools and technologies
- Knowledge of performance optimization techniques for data models
- Bachelor's degree in Computer Science, Information Systems, or a related field; advanced degree preferred
- Professional certifications in data modeling or related areas
Technical Skills
- Data modeling methodologies (dimensional, data vault, etc.)
- Databricks platform and Delta Lake
- SQL and data definition languages
- Data modeling tools (erwin, ER/Studio, etc.)
- Data warehousing concepts and principles
- ETL/ELT processes and data integration
- Performance tuning for data models
- Metadata management and data cataloging
- Cloud platforms (AWS, Azure, GCP)
- Big data technologies and distributed computing
Healthcare Industry Knowledge
- Healthcare data structures and relationships
- Healthcare terminology and coding systems (ICD, CPT, SNOMED, etc.)
- Healthcare data standards (HL7, FHIR, etc.)
- Healthcare analytics use cases and requirements
- Optionally: healthcare regulatory requirements (HIPAA, HITECH, etc.)
- Clinical and operational data modeling challenges
- Population health and value-based care data needs
Personal Attributes
- Strong analytical and problem-solving skills
- Excellent attention to detail and a focus on data quality
- Ability to translate complex business requirements into technical solutions
- Effective communication skills with both technical and non-technical stakeholders
- Collaborative approach to working with cross-functional teams
- Self-motivated, with the ability to work independently
- Continuous learner who stays current with industry trends
What We Offer
- Opportunity to design data models for cutting-edge healthcare analytics
- Collaborative and innovative work environment
- Competitive compensation package
- Professional development opportunities
- Work with leading technologies in the data space
This position requires a unique combination of data modeling expertise, technical knowledge, and healthcare industry understanding. The ideal candidate will have demonstrated success in designing efficient, scalable data models and a passion for creating data structures that enable powerful analytics and insights.
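The dimensional modeling this listing centers on means organizing data as fact tables joined to descriptive dimensions (a star schema). The sketch below shows a toy healthcare claims star schema using Python's built-in sqlite3 for portability; the table and column names are illustrative, not the employer's actual model.

```python
import sqlite3

# Illustrative star schema for a healthcare claims mart: one fact table
# keyed to two dimensions. All names are hypothetical.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_patient (
        patient_key INTEGER PRIMARY KEY,
        patient_id  TEXT,
        birth_year  INTEGER
    );
    CREATE TABLE dim_provider (
        provider_key INTEGER PRIMARY KEY,
        specialty    TEXT
    );
    CREATE TABLE fact_claim (
        claim_key     INTEGER PRIMARY KEY,
        patient_key   INTEGER REFERENCES dim_patient(patient_key),
        provider_key  INTEGER REFERENCES dim_provider(provider_key),
        billed_amount REAL
    );
""")
con.executemany("INSERT INTO dim_patient VALUES (?,?,?)",
                [(1, "P100", 1980), (2, "P200", 1990)])
con.executemany("INSERT INTO dim_provider VALUES (?,?)",
                [(1, "Cardiology"), (2, "Oncology")])
con.executemany("INSERT INTO fact_claim VALUES (?,?,?,?)",
                [(1, 1, 1, 250.0), (2, 2, 1, 100.0), (3, 1, 2, 400.0)])

# Typical analytic query the schema is shaped for: billed amount by
# provider specialty, a single join from fact to dimension.
totals = dict(con.execute("""
    SELECT p.specialty, SUM(f.billed_amount)
    FROM fact_claim f JOIN dim_provider p USING (provider_key)
    GROUP BY p.specialty
"""))
```

The same shape carries over to Delta Lake tables on Databricks; only the storage layer and DDL dialect change.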
Posted 2 weeks ago
10.0 - 17.0 years
20 - 35 Lacs
Gurugram
Hybrid
Who We Are: As the world's leading sustainability consulting firm, ERM is uniquely positioned to contribute to the environment and society through the expertise and energy of our employees worldwide. Sustainability is what we do, and it is at the heart of both our service offerings and how we operate our business. For our people, our vision means attracting, inspiring, developing and rewarding our people to work with the best clients and on the biggest challenges, thus creating valuable careers. We achieve our vision in a sustainable manner by maintaining and living our ERM values, which include Accountability, Caring for our People, Client Focus, Collaboration, Empowerment, and Transparency. ERM does not accept recruiting agency resumes. Please do not forward resumes to our jobs alias, ERM employees or any other company location. ERM is not responsible for any fees related to unsolicited resumes. ERM is proud to be an Equal Employment Opportunity employer. We do not discriminate based upon race, religion, color, national origin, gender, sexual orientation, gender identity, age, marital status or disability status.
Job Description
An exciting opportunity has emerged for a seasoned Data Architect to become a vital member of our ERM Technology team. You will report to the Lead Enterprise Architect and join a dynamic team focused on delivering corporate and technology strategic initiatives. The role demands high-level analytical, problem-solving, and communication skills, along with a strong commitment to customer service. As the Data Architect for ERM, you will work closely with both business and technology stakeholders, utilizing your expertise in business intelligence, analytics, data engineering, data management, and data integration to significantly advance our data strategy and ecosystem.
Key responsibilities include:
- Defining the data and information management architecture for ERM.
- Collaborating with product owners, engineers, data scientists, and business stakeholders to understand data needs across the full product lifecycle.
- Ensuring a shared understanding of our data, including its quality, ownership, and lineage throughout its lifecycle, from initial capture via client interaction to final consumption by internal and external processes and stakeholders.
- Ensuring that our data landscape effectively meets corporate and regulatory reporting requirements.
- Establishing clear ownership and governance for comprehensive data domain models, encompassing both data in motion and data at rest.
- Providing expert guidance on solution architecture, engineering principles, and the implementation of data applications utilizing both existing and cutting-edge technology platforms.
- Building a robust data community by collaborating with architects and engineers, and leveraging this community to implement solutions that enhance client and business outcomes through data.
The successful candidate will have:
- Proven experience as an enterprise data architect.
- Experience in end-to-end implementation of data-intensive, analytics-based projects encompassing data acquisition, ingestion, integration, transformation and consumption.
- Proven experience in the design, development, and implementation of data engineering technologies.
- Strong knowledge of data management and governance principles.
- A strong understanding of the Azure and AWS service landscapes, particularly data services.
- Proven experience with various data modelling techniques.
- Understanding of big data architectures and emerging trends in technology.
- A solid familiarity with Agile methodologies, test-driven development, source control management, and automated testing.
Thank you for your interest in ERM.
Posted 2 weeks ago
5.0 - 9.0 years
8 - 13 Lacs
Hyderabad
Work from Office
About The Role
Role Overview: Develop efficient SQL queries and maintain views, models, and data structures across federated and transactional databases to support analytics and reporting. Core tools: SQL (advanced), Python for data exploration and scripting, shell scripting for lightweight automation.
Key Responsibilities:
- Write complex SQL queries for data extraction and transformation
- Build and maintain views, materialized views, and data models
- Enable efficient federated queries and optimize joins across databases
- Support performance tuning, indexing, and query optimization efforts
Primary Skills:
- Expertise in MS SQL Server / Oracle DB / PostgreSQL, columnar DBs like DuckDB, and federated data access
- Good understanding of the Apache Arrow columnar data format, Flight SQL, and Apache Calcite
Secondary Skills:
- Experience with data modelling, ER diagrams, and schema design
- Familiarity with reporting-layer backends (e.g., Power BI datasets)
- Familiarity with utility operations and power distribution is preferred
- Experience with cloud-hosted databases is preferred
- Exposure to data lakes in cloud ecosystems is a plus
Optional:
- Familiarity with Grid CIM (Common Information Model; IEC 61970, IEC 61968)
- Familiarity with GE ADMS DNOM (Distribution Network Object Model)
- GE GridOS Data Fabric
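The "build and maintain views" responsibility above is the pattern of giving the reporting layer a stable, pre-aggregated shape without copying data. A minimal sketch using Python's built-in sqlite3 (the role's actual engines are SQL Server, Oracle, PostgreSQL, or DuckDB, but the SQL idea is the same); the meter-reading table is hypothetical, loosely themed on the utility domain the posting mentions.

```python
import sqlite3

# Sketch of the views-for-reporting pattern, using SQLite for
# portability. Table and column names are hypothetical.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE meter_reading (meter_id TEXT, day TEXT, kwh REAL);
    INSERT INTO meter_reading VALUES
        ('MTR-1', '2024-01-01', 12.5),
        ('MTR-1', '2024-01-02', 11.0),
        ('MTR-2', '2024-01-01', 30.0);

    -- The view exposes a stable aggregated shape to dashboards, so
    -- reports never query the raw transactional table directly.
    CREATE VIEW v_daily_usage AS
        SELECT meter_id, SUM(kwh) AS total_kwh, COUNT(*) AS n_days
        FROM meter_reading
        GROUP BY meter_id;
""")

# The reporting layer then reads the view like any table.
usage = {m: (t, n) for m, t, n in con.execute("SELECT * FROM v_daily_usage")}
```

On engines with materialized views (Oracle, PostgreSQL), the same definition can be persisted and refreshed on a schedule, trading freshness for query speed.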
Posted 2 weeks ago
8.0 - 13.0 years
13 - 18 Lacs
Hyderabad
Work from Office
About the Role: Grade Level (for internal use): 11
The Role: Lead Software Engineer
The Team: The Market Intelligence Industry Data Solutions business line provides data technology and services supporting acquisition, ingestion, content management, mastering, and distribution to power our Financial Institution Group business and customer needs. We focus on platform scalability to support business operations by following a common data lifecycle that accelerates business value. Our team provides essential intelligence for the Financial Services, Real Estate, and Insurance industries.
The Impact: The FIG Data Engineering team will be responsible for implementing and maintaining services and tools to support existing feed systems. This enables users to consume FIG datasets and makes FIG data available for broader consumption and processing within the company.
What's in it for you: Opportunity to work with global stakeholders and engage with the latest tools and technologies.
Responsibilities:
- Build new data acquisition and transformation pipelines using advanced data processing and cloud technologies.
- Collaborate with the broader technology team, including information architecture and data integration teams, to align pipelines with strategic initiatives.
What We're Looking For:
- Bachelor's degree in computer science or a related field, with at least 8 years of professional software development experience.
- Must have: programming languages commonly used for data processing; data orchestration and workflow management systems; distributed data processing frameworks; relational database management systems; big data processing frameworks.
- Experience with large-scale data processing platforms.
- Deep understanding of RESTful services, good API design, and object-oriented programming principles.
- Proficiency in object-oriented or functional scripting languages.
- Good working knowledge of relational and NoSQL databases.
- Experience in maintaining and developing software in production environments utilizing cloud-based tools.
- Strong collaboration and teamwork skills, along with excellent written and verbal communication abilities.
- Self-starter and motivated individual with the ability to thrive in a fast-paced software development environment.
- Agile experience is highly desirable.
- Experience with data warehousing and analytics platforms will be a significant advantage.
About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.
What's In It For You
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People:
Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you, and your career, need to thrive at S&P Global.
Health & Wellness: Health care coverage designed for the mind and body.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit https://spgbenefits.com/benefit-summaries
Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.
Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf
Posted 2 weeks ago
2.0 - 3.0 years
3 - 4 Lacs
Nagpur
Work from Office
Responsibilities:
- Data Modeling & Integration
- Report & Dashboard Development
- Data Transformation
- Collaboration
- Performance Optimization
- Security & Access Control
- Training & Support
Qualification: Graduation in an IT or CS field
Requirements:
- Proven experience as a Power BI Engineer or BI Developer, with a solid understanding of data modeling, visualization, and reporting.
- Proficiency in Power BI Desktop, Power BI Service, Power Query, DAX, and Power BI Gateway.
- Strong experience with SQL and data integration from different sources (e.g., databases, APIs, cloud storage).
- Strong analytical and problem-solving skills with attention to detail.
- Excellent communication skills and the ability to work in a collaborative team environment.
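The data transformation work this role describes is typically done in Power Query ("Unpivot Columns", "Group By") and DAX measures. The sketch below mirrors those two steps in plain Python so the shaping logic is visible; the region/month dataset is invented for illustration.

```python
# Power Query-style shaping sketched in plain Python. The wide table
# and its column names are hypothetical sample data.
wide_rows = [
    {"region": "North", "Jan": 100, "Feb": 120},
    {"region": "South", "Jan": 80,  "Feb": 90},
]

# Step 1 - unpivot: one row per (region, month, sales), the equivalent
# of Power Query's "Unpivot Columns" transform.
long_rows = [
    {"region": r["region"], "month": m, "sales": r[m]}
    for r in wide_rows for m in ("Jan", "Feb")
]

# Step 2 - aggregate sales per month, the equivalent of a simple
# DAX SUM measure grouped by month.
by_month = {}
for row in long_rows:
    by_month[row["month"]] = by_month.get(row["month"], 0) + row["sales"]
```

Modeling data in the long (unpivoted) shape is what lets a single measure slice by any attribute, which is why this transform comes up so often in BI work.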
Posted 2 weeks ago
12.0 - 20.0 years
35 - 60 Lacs
Bengaluru
Work from Office
Who We Are At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities. The Role Join the innovative team at Kyndryl as a Client Technical Solutioner and unlock your potential to shape the future of technology solutions. As a key player in our organization, you will embark on an exciting journey where you get to work closely with customers, understand their unique challenges, and provide them with cutting-edge technical solutions and services. Picture yourself as a trusted advisor – collaborating directly with customers to unravel their business needs, pain points, and technical requirements. Your expertise and deep understanding of our solutions will empower you to craft tailored solutions that address their specific challenges and drive their success. Your role as a Client Technical Solutioner is pivotal in developing domain-specific solutions for our cutting-edge services and offerings. You will be at the forefront of crafting tailored domain solutions and cost cases for both simple and complex, long-term opportunities, demonstrating we meet our customers' requirements while helping them overcome their business challenges. At Kyndryl, we believe in the power of collaboration and your expertise will be essential in supporting our Technical Solutioning and Solutioning Managers during customer technology and business discussions, even at the highest levels of Business/IT Director/LOB. You will have the chance to demonstrate the value of our solutions and products, effectively communicating their business and technical benefits to decision makers and customers. 
In this role, you will thrive as you create innovative technical solutions that align with industry trends and exceed customer expectations. Your ability to collaborate seamlessly with internal stakeholders will enable you to gather the necessary documents and technical insights to deliver compelling bid submissions. Not only will you define winning cost models for deals, but you will also lead these deals to profitability, ensuring the ultimate success of both our customers and Kyndryl. You will play an essential role in contract negotiations, up to the point of signature, and facilitate a smooth engagement hand-over process. As the primary source of engagement management and solution design within your technical domain, you will compile, refine, and take ownership of final solution documents. Your technical expertise will shine through as you present these documents in a professional and concise manner, showcasing your mastery of the subject matter. You’ll have the opportunity to contribute to the growth and success of Kyndryl by standardizing our go-to-market pitches across various industries. By creating differentiated propositions that align with market requirements, you will position Kyndryl as a leader in the industry, opening new avenues of success for our customers and our organization. Join us as a Client Technical Solutioner at Kyndryl and unleash your potential to shape the future of technical solutions while enjoying a stimulating and rewarding career journey filled with innovation, collaboration, and growth. Your Future at Kyndryl Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won’t find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here. 
Who You Are
You're good at what you do and possess the required experience to prove it. However, equally as important, you have a growth mindset: keen to drive your own personal and professional development. You are customer-focused, someone who prioritizes customer success in their work. And finally, you're open and borderless, naturally inclusive in how you work with others.
Required Skills and Experience
- 10-15 years of experience (Specialist Seller / Consultant) is a must, with 3-4 years of relevant experience in Data.
- Hands-on experience with Data Platforms (DWH / Data Lake) such as Cloudera / Databricks / MS Data Fabric / Teradata / Apache Hadoop / BigQuery / AWS Big Data Solutions (EMR, Redshift, Kinesis) / Qlik, etc.
- Proven past experience in modernizing legacy data and applications and transforming them to cloud architectures.
- Strong understanding of data modelling and database design.
- Expertise in data integration and ETL processes.
- Knowledge of data warehousing and business intelligence concepts.
- Experience with data governance and data quality management.
- Good domain experience in the BFSI or Manufacturing area.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Strong understanding of data integration techniques, including ETL (Extract, Transform, Load) processes, data pipelines, and data streaming using Python, Kafka for streams, PySpark, DBT, and ETL services.
- Understanding of and experience with data security principles: data masking, encryption, etc.
- Knowledge of data governance principles and practices, including data quality, data lineage, data privacy, and compliance.
- Knowledge of systems development, including the system development life cycle, project management approaches and requirements, design, and testing techniques.
- Excellent communication skills to engage with clients and influence decisions.
- High level of competence in preparing architectural documentation and presentations.
- Must be organized and self-sufficient, and able to manage multiple initiatives simultaneously.
- Must have the ability to coordinate with other teams and vendors independently.
- Deep knowledge of services offerings and technical solutions in a practice.
- Demonstrated experience translating distinctive technical knowledge into actionable customer insights and solutions.
- Prior consultative selling experience.
- Externally recognized as an expert in the technology and/or solutioning areas, including technical certifications supporting subdomain focus area(s).
- Responsible for prospecting and qualifying leads, doing the relevant product / market research independently in response to a customer's requirement or pain point.
- Advising and shaping client requirements to produce high-level designs and technical solutions in response to opportunities and requirements from customers and partners.
- Work with both internal and external stakeholders to identify business requirements and develop solutions to meet those requirements / build the opportunity.
- Understand and analyze the application requirements in client RFPs.
- Design software applications based on the requirements, within specified architectural guidelines and constraints.
- Lead, design, and implement proofs of concept and pilots to demonstrate the solution to clients and prospects.
Being You
Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you, and everyone next to you, the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.
What You Can Expect With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you, we want you to succeed so that together, we will all succeed. Get Referred! If you know someone that works at Kyndryl, when asked ‘How Did You Hear About Us’ during the application process, select ‘Employee Referral’ and enter your contact's Kyndryl email address.
Posted 2 weeks ago
2.0 - 6.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Design, develop, and maintain high-performance SQL and PL/SQL procedures, packages, and functions in Snowflake or other cloud database technologies.
- Apply advanced performance tuning techniques to optimize database objects, queries, indexing strategies, and resource usage.
- Develop code based on reading and understanding business and functional requirements, following the Agile process.
- Produce high-quality code to meet all project deadlines, ensuring the functionality matches the requirements.
- Analyze and resolve issues found during the testing or pre-production phases of the software delivery lifecycle, coordinating changes with project team leaders and cross-work-team members.
- Provide technical support to project team members and respond to inquiries regarding errors or questions about programs.
- Interact with architects, technical leads, team members, and project managers as required to address technical and schedule issues.
- Suggest and implement process improvements for estimating, development, and testing processes.
- Support the development of automated and repeatable processes for ETL/ELT, data integration, and data transformation using industry best practices.
- Support cloud migration and modernization initiatives, including re-platforming or refactoring legacy database objects for cloud-native platforms.
- BS degree in Computer Science, Information Technology, Electrical/Electronic Engineering, or another related field, or equivalent.
- A minimum of 7 years of prior work experience with an application and database development organization, with deep expertise in Oracle PL/SQL or SQL Server T-SQL; must demonstrate experience delivering systems and projects from inception through implementation.
- Proven experience writing and optimizing complex stored procedures, functions, and packages in relational databases such as Oracle, MySQL, and SQL Server.
- Strong knowledge of performance tuning, including query optimization, indexing, statistics, execution plans, and partitioning.
- Understanding of data integration pipelines, ETL tools, and batch processing techniques.
- Solid software development and programming skills, with an understanding of design patterns and software development best practices.
- Experience with Snowflake, Python scripting, and data transformation frameworks like dbt is a plus.
- Work experience developing web applications with Java, JavaScript, HTML, and JSPs; experience with the MVC frameworks Spring and Angular.
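The query optimization and indexing knowledge this listing asks for comes down to reading execution plans before and after adding an index. A small demonstration using Python's built-in sqlite3 and its EXPLAIN QUERY PLAN output; the orders table is hypothetical, and the same idea applies to Oracle or SQL Server plans, though the tooling differs.

```python
import sqlite3

# Sketch of index-driven query tuning: compare the query plan before
# and after creating an index. The orders table is hypothetical.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, total REAL)"
)
con.executemany("INSERT INTO orders VALUES (?,?,?)",
                [(i, i % 100, float(i)) for i in range(1000)])

def plan(sql):
    # Each EXPLAIN QUERY PLAN row's 4th column is the human-readable detail.
    return " ".join(row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)  # without an index: a full scan of orders

con.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(query)   # with the index: a keyed search using it
```

The before plan reports a scan of the table, while the after plan names `idx_orders_customer`; on large tables that difference is the bulk of the tuning work the role describes.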
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
The Salesforce Team Lead will be responsible for leading a team of Salesforce developers and administrators, overseeing the design, development, and deployment of Salesforce solutions. You will need to demonstrate strong leadership skills, technical expertise in Salesforce, and the ability to collaborate with cross-functional teams to deliver high-quality CRM solutions. Your responsibilities will include leading and managing the Salesforce development team, providing guidance, mentorship, and support. You will oversee the design, development, testing, and deployment of Salesforce solutions to ensure they meet business requirements and are delivered on time. Collaborating with stakeholders to gather requirements, design solutions, and develop project plans will be a key part of your role. Additionally, you will need to ensure the quality of Salesforce solutions through code reviews, testing, and adherence to best practices, as well as manage the integration of Salesforce with other systems and applications. Monitoring and maintaining the health of the Salesforce platform, including performance optimization and troubleshooting, will also be part of your responsibilities. It will be essential to stay up-to-date with Salesforce updates, releases, and best practices to ensure the team is leveraging the latest features and capabilities. Providing technical leadership and expertise in Salesforce development, including Apex, Visualforce, Lightning Components, and Salesforce APIs, will be crucial. Driving continuous improvement initiatives to enhance team productivity and the quality of deliverables will also be expected of you. To qualify for this role, you should have a Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience as a Salesforce Team Lead, Salesforce Developer, or similar role is required. 
Strong proficiency in Salesforce development, including Apex, Visualforce, Lightning Components, and Salesforce APIs, is essential. You should also have experience with Salesforce administration, including configuration, customization, and user management, as well as familiarity with Salesforce integration tools and techniques. Excellent leadership, communication, and interpersonal skills are necessary, along with strong problem-solving skills and the ability to work independently and as part of a team. Salesforce certifications (e.g., Salesforce Certified Administrator, Salesforce Certified Platform Developer) are highly desirable. In terms of skills, experience with Agile/Scrum methodologies, knowledge of Salesforce Sales Cloud, Service Cloud, and other Salesforce products, an understanding of data migration, data integration, and ETL processes, familiarity with DevOps practices and tools for Salesforce development, and experience with third-party applications and AppExchange products will all be beneficial for this role.
Posted 3 weeks ago
5.0 - 10.0 years
0 Lacs
coimbatore, tamil nadu
On-site
You are an experienced Qlik Sense & Qlik Cloud Developer who will be responsible for designing, developing, and implementing business intelligence solutions using Qlik Sense and Qlik Cloud. Your expertise in data visualization, dashboard development, and cloud-based analytics will be crucial in supporting data-driven decision-making. Your key responsibilities will include developing, maintaining, and enhancing Qlik Sense dashboards and Qlik Cloud applications to meet business analytics needs. You will design and implement data models, ETL processes, and data integration solutions from various sources. Optimizing Qlik applications for performance, scalability, and efficiency will also be a significant part of your role. Collaboration with business stakeholders to gather requirements and deliver insightful analytics solutions is essential. Ensuring data accuracy, integrity, and security across Qlik Sense and Qlik Cloud environments is a critical aspect of your job. Troubleshooting and resolving issues related to data connectivity, scripting, and performance tuning will also be part of your responsibilities. Staying updated with the latest Qlik technologies, best practices, and industry trends is required. Providing technical guidance and training to business users on Qlik Sense & Qlik Cloud functionalities is expected. Collaborating with IT and Data Engineering teams to ensure seamless integration with enterprise data systems is also part of your role. To qualify for this position, you should have 5 to 10 years of hands-on experience in Qlik Sense and Qlik Cloud development. Strong expertise in Qlik scripting, expressions, and set analysis is necessary. Experience with data modeling, ETL processes, and data transformation is required. Knowledge of SQL, relational databases, and data warehousing concepts is essential. Experience integrating Qlik Sense/Qlik Cloud with different data sources like SAP, REST APIs, Cloud Storage, etc., is preferred. 
A strong understanding of Qlik Management Console (QMC) and security configurations is important. Proficiency in performance optimization, data governance, and dashboard usability is expected. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud is a plus. You should be able to work independently and collaboratively in a fast-paced environment. Excellent communication and problem-solving skills are necessary for this role. This is a full-time position with the option to work from either Coimbatore or remotely. Interested candidates can send their resumes to fazilahamed.r@forartech.in or contact +91-7305020181. We are excited to meet you and explore the potential of having you as a valuable member of our team. Benefits include commuter assistance, flexible schedule, health insurance, leave encashment, provident fund, and the opportunity to work from home. The work schedule is during the day shift from Monday to Friday, and a performance bonus is offered. If you are interested in applying for this position, please provide the following information:
- Number of years of experience in Qlik Sense
- Current CTC
- Minimum expected CTC
- Notice period or availability to join
- Present location
Work Location: Coimbatore / Remote (Work from Home)
Posted 3 weeks ago
1.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
As an Associate Manager - Data IntegrationOps, you will play a crucial role in supporting and managing data integration and operations programs within our data organization. Your responsibilities will involve maintaining and optimizing data integration workflows, ensuring data reliability, and supporting operational excellence. To succeed in this position, you will need a solid understanding of enterprise data integration, ETL/ELT automation, cloud-based platforms, and operational support. Your primary duties will include assisting in the management of Data IntegrationOps programs, aligning them with business objectives, data governance standards, and enterprise data strategies. You will also be involved in monitoring and enhancing data integration platforms through real-time monitoring, automated alerting, and self-healing capabilities to improve uptime and system performance. Additionally, you will help develop and enforce data integration governance models, operational frameworks, and execution roadmaps to ensure smooth data delivery across the organization. Collaboration with cross-functional teams will be essential to optimize data movement across cloud and on-premises platforms, ensuring data availability, accuracy, and security. You will also contribute to promoting a data-first culture by aligning with PepsiCo's Data & Analytics program and supporting global data engineering efforts across sectors. Continuous improvement initiatives will be part of your responsibilities to enhance the reliability, scalability, and efficiency of data integration processes. Furthermore, you will be involved in supporting data pipelines using ETL/ELT tools such as Informatica IICS, PowerCenter, DDH, SAP BW, and Azure Data Factory under the guidance of senior team members. 
Developing API-driven data integration solutions using REST APIs and Kafka, deploying and managing cloud-based data platforms such as Azure Data Services, AWS Redshift, and Snowflake, and participating in implementing DevOps practices using tools like Terraform, GitOps, Kubernetes, and Jenkins will also be part of your role. Your qualifications should include at least 9 years of technology work experience in a large-scale, global organization, preferably in the CPG (Consumer Packaged Goods) industry. You should also have 4+ years of experience in Data Integration, Data Operations, and Analytics, as well as experience working in cross-functional IT organizations. Leadership/management experience supporting technical teams and hands-on experience monitoring and supporting SAP BW processes are also required qualifications for this role. In summary, as an Associate Manager - Data IntegrationOps, you will be responsible for supporting and managing data integration and operations programs, collaborating with cross-functional teams, and ensuring the efficiency and reliability of data integration processes. Your expertise in enterprise data integration, ETL/ELT automation, cloud-based platforms, and operational support will be key to your success in this role.
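The "self-healing" behavior mentioned above usually comes down to retrying transient failures with backoff before raising an alert. A hedged, platform-agnostic sketch of that pattern follows; the `flaky_extract` task and the retry counts are invented, and a real pipeline would wrap an IICS/ADF task or a REST call instead:

```python
import time

def run_with_retries(task, attempts=3, base_delay=0.01):
    """Run task(), retrying with exponential backoff on failure."""
    for attempt in range(1, attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == attempts:
                raise  # retries exhausted: surface the error for alerting
            time.sleep(base_delay * 2 ** (attempt - 1))

calls = {"n": 0}

def flaky_extract():
    # Fails twice, then succeeds -- simulating a transient source outage.
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("source unavailable")
    return ["row1", "row2"]

result = run_with_retries(flaky_extract)
print(result)  # ['row1', 'row2'] after two transient failures
```

The key design choice is that only the final failure escapes the wrapper, so monitoring alerts fire on genuine outages rather than on every transient blip.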
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As a People Analytics Specialist, you will collaborate closely with the Regional HR Business Partner to integrate data from multiple systems for comprehensive analysis. You will partner with business leaders to align HR strategies with operational goals, providing strategic HR guidance on workforce planning, talent development, and organizational design. Your role will involve presenting findings and data-driven recommendations to senior management and other key stakeholders while staying informed about the latest trends, tools, and best practices in people analytics and HR technology. In this position, you will be responsible for continuously improving data collection processes, reporting standards, and analytical techniques. You will serve as the single point of contact (SPOC) for all HR operational activities for the region, ensuring smooth coordination and communication across teams. Furthermore, you will focus on measuring and tracking key HR metrics to provide insights on workforce trends and business outcomes. Your duties will include collecting, analyzing, and interpreting HR data related to employee performance, turnover, recruitment, engagement, training and development, attrition, and retention. Collaborating with HR teams, you will play a crucial role in ensuring data-driven decisions in areas such as talent acquisition, employee engagement, and performance management.
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You are a motivated Business Analyst who will be responsible for the Veeva CRM to Vault CRM migration and integration project utilizing Informatica Intelligent Cloud Services (IICS). Your primary role will involve collaborating with business stakeholders to gather, document, and validate requirements for the migration and integration process. You must have a strong background in CRM systems and data integration to ensure a successful transition. Your key responsibilities will include analyzing existing data structures in Veeva CRM, defining mapping strategies for data migration to Vault CRM, and working closely with technical teams to design integration processes, workflows, and ETL requirements. Effective communication with stakeholders is crucial to understand their needs, expectations, and potential impacts of the migration. You will also be involved in developing test plans, supporting user acceptance testing (UAT), and ensuring data integrity and compliance throughout the process. In addition, you will be responsible for creating training materials, conducting training sessions, and educating users on new processes and the Vault CRM system. Your role will play a vital part in facilitating seamless data transfer and ensuring a successful Veeva CRM to Vault CRM migration and integration project.
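The mapping strategies this role defines typically take the form of a source-to-target field map applied record by record. A minimal sketch follows; the field names (e.g. `account_name`, `name__v`) are invented for illustration, and real Veeva/Vault CRM object and field names would come out of the requirements and mapping work:

```python
# Hypothetical source-to-target field map for a CRM migration.
FIELD_MAP = {
    "account_name": "name__v",
    "specialty": "specialty__v",
    "email": "primary_email__v",
}

def map_record(source_record, field_map=FIELD_MAP):
    """Rename source fields to target names, dropping unmapped fields."""
    return {
        target: source_record[source]
        for source, target in field_map.items()
        if source in source_record
    }

src = {"account_name": "Dr. Rao", "specialty": "Cardiology", "fax": "n/a"}
print(map_record(src))
# {'name__v': 'Dr. Rao', 'specialty__v': 'Cardiology'}
```

Keeping the map as data rather than code is what lets the analyst own it: unmapped source fields (like `fax` above) are dropped explicitly, which makes gaps in the mapping easy to review with stakeholders.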
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a HubSpot CRM Administrator at Smith + Howard, you will be responsible for managing and optimizing our HubSpot CRM system. Your role will involve integrating and consolidating three separate CRM platforms into a unified system, ensuring data integrity, and working closely with cross-functional teams to achieve a seamless transition. Your proactive and analytical approach will be crucial in supporting business teams with actionable insights and process improvements.
Key Responsibilities:
CRM Administration & Management:
- Serve as the primary administrator for HubSpot, ensuring optimal performance and user adoption.
- Customize HubSpot modules to align with organizational needs.
- Manage user roles, permissions, and access controls to maintain security and workflows.
- Implement governance policies to maintain data quality.
Automation & Workflow Optimization:
- Design and implement automated workflows to streamline operations.
- Create custom properties, pipelines, workflows, reports, and dashboards.
- Develop email sequences, templates, and automation rules for marketing campaigns.
Reporting & Analytics:
- Build dashboards and reports to provide insights on sales performance and customer engagement.
- Monitor key performance indicators and recommend improvements.
- Conduct audits of CRM data and processes for optimization.
User Support & Training:
- Provide technical support and training for HubSpot users.
- Stay updated on best practices and emerging CRM trends.
Integration & Migration:
- Support the consolidation of CRM systems into HubSpot with minimal disruption.
- Work with stakeholders to define integration requirements and migration strategies.
- Develop testing plans for migrated data to ensure a smooth transition.
Qualifications & Experience:
- 3-6 years of experience in HubSpot CRM or similar CRM administration.
- Proficiency in CRM data management and segmentation.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
Preferred Skills:
- HubSpot certifications.
- Bachelor's degree in Business, Marketing, or Information Technology.
- Familiarity with customer journey mapping and sales process optimization.
Location & Work Mode:
- Location: Bengaluru (In-office).
- Working Hours: Flexible to collaborate with global teams.
Join us at Smith + Howard for the opportunity to work in a dynamic company with a strong CRM strategy, shape sales and marketing processes, and work on cutting-edge automation projects with growth opportunities and learning support.
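One recurring step in consolidating several CRMs into one system is deduplicating contacts across sources. A hedged sketch of that step follows, keying on a normalized email address and keeping the most recently updated record; the field names and the "latest wins" survivorship rule are assumptions for illustration:

```python
def consolidate(contacts):
    """Deduplicate contacts by normalized email, keeping the newest record."""
    best = {}
    for c in contacts:
        key = c["email"].strip().lower()  # normalize the match key
        # ISO-format dates compare correctly as strings.
        if key not in best or c["updated"] > best[key]["updated"]:
            best[key] = c
    return list(best.values())

# Two source CRMs holding the same person with slightly different data.
crm_a = [{"email": "Ana@x.com", "name": "Ana", "updated": "2024-01-05"}]
crm_b = [{"email": "ana@x.com ", "name": "Ana R.", "updated": "2024-03-01"}]

merged = consolidate(crm_a + crm_b)
print(merged)  # one record survives; the 2024-03-01 version wins
```

In practice the survivorship rule (latest wins, source-priority wins, field-by-field merge) is a business decision to agree on with stakeholders before migration, not a technical default.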
Posted 3 weeks ago
3.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
As a Profisee MDM Consultant with 3-8 years of experience, you will be responsible for designing, developing, implementing, and maintaining MDM solutions using Profisee. Your expertise in data governance, data quality, and data integration will be crucial in ensuring the accuracy, consistency, and completeness of master data. This role requires strong technical skills, excellent communication abilities, and effective collaboration with cross-functional teams.
Your responsibilities will include:
Solution Design and Development:
- Leading the design and development of MDM solutions using Profisee, including data models, workflows, business rules, and user interfaces.
- Translating business requirements into technical specifications and MDM solutions.
- Configuring and customizing the Profisee platform to meet specific business needs.
- Developing and implementing data quality rules and processes within Profisee to ensure data accuracy and consistency.
- Designing and implementing data integration processes between Profisee and other enterprise systems using various integration techniques.
Implementation and Deployment:
- Participating in the full MDM implementation lifecycle, including requirements gathering, design, development, testing, deployment, and support.
- Developing and executing test plans and scripts to validate the functionality and performance of the MDM solution.
- Troubleshooting and resolving issues related to MDM data, processes, and infrastructure.
- Deploying and configuring Profisee environments (development, test, production).
Data Governance and Stewardship:
- Contributing to the development and enforcement of data governance policies and procedures.
- Working with data stewards to define data ownership and accountability.
- Assisting in the creation and maintenance of data dictionaries and metadata repositories.
- Ensuring compliance with data privacy regulations and security policies.
Maintenance and Support:
- Monitoring the performance and stability of the MDM environment.
- Providing ongoing support and maintenance for the MDM solution, including bug fixes, enhancements, and upgrades.
- Developing and maintaining documentation for MDM processes, configurations, and procedures.
- Proactively identifying and addressing potential issues related to data quality and MDM performance.
Collaboration and Communication:
- Collaborating with business users, IT staff, and other stakeholders to understand data requirements and implement effective MDM solutions.
- Communicating effectively with technical and non-technical audiences.
- Participating in project meetings and providing regular status updates.
- Mentoring and training junior team members on MDM best practices and the Profisee platform.
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
The Application Analyst (Finance Systems) role at The Atlas Corp. subsidiary, Seaspan, involves supporting the company's core financial systems, such as NetSuite, Oracle EPM FCCS/PBCS, GTreasury, and Certify. Your responsibilities will include ensuring effective management, configuration, and maintenance of these applications to support key financial and accounting processes. Collaboration with IT and finance teams is crucial to ensure that the systems align with business needs. You will also be involved in system analysis, project management, and continuous improvement efforts. Your main responsibilities will include providing application support and systems analysis for financial applications, ensuring alignment with business processes related to accounting, financial consolidation, budgeting, treasury management, and expense tracking. Applying your understanding of systems development life cycle theory, you will manage the setup, configurations, and updates of financial systems, focusing on continuous quality improvement. Collaborating with cross-functional teams, you will identify, resolve, and implement system enhancements to enhance efficiency and effectiveness. Utilizing your project management skills, you will assist with system upgrades, implementations, and enhancements to ensure timely completion meeting business requirements. Analyzing and resolving system issues will be part of your role while maintaining high-quality standards and compliance with financial reporting standards. Managing application-related issues through the IT ticketing system will be necessary to minimize disruptions and ensure data integrity through effective data integration processes between financial applications. 
In addition to the above, you will provide training and support to finance users, assist in prioritizing tasks efficiently to ensure optimal operation of financial systems, and employ critical thinking and problem-solving skills to make sound decisions and recommendations for system improvements or issue resolution. To qualify for this role, you should have a Bachelor's degree in IT, Computer Science, Finance, or a related field, or equivalent experience. A minimum of 3 years of experience supporting financial applications, with expertise in at least one of the required systems (NetSuite, GTreasury, or Oracle EPM), is necessary. Strong knowledge of systems analysis, systems development life cycle theory, project management techniques, and financial reporting standards is essential. The ability to work independently and collaborate effectively with teams, as well as excellent communication skills, is also required. Additionally, the role may require occasional after-hours support, international travel, and sufficient mobility to perform basic IT setup tasks.
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
You are a talented and collaborative SFCC Developer with a strong understanding of the SFCC SFRA platform and a passion for building high-performance, scalable e-commerce solutions. Working in a team environment, you adhere to established development best practices and contribute to a CORE architecture with a strong focus on performance optimization. Your responsibilities include developing and maintaining e-commerce sites on the SFCC platform using the SFRA framework, contributing to the development and maintenance of a CORE architecture, writing clean, well-documented, and testable code, understanding requirements and translating them into technical solutions, participating in code reviews to ensure code quality and knowledge sharing, staying up-to-date with the latest SFCC technologies and trends, focusing on performance optimization techniques, and utilizing OCAPI and SCAPI to develop and maintain robust and scalable e-commerce functionalities. You hold a minimum Bachelor's Degree in Computer Science, System Engineering, Information Systems, or a related field, along with 5+ years of relevant work experience in SFCC and 3+ years of Commerce Cloud development experience. Certification as a Commerce Cloud Developer is strongly preferred. Your experience includes implementing core SFCC programming concepts, handling Business Manager for multilingual stores, integrating 3rd-party apps into existing stores, working with the Debugger (Script & Pipeline), writing custom JavaScript and JS OOP, working with JavaScript technologies such as ReactJS, and using various build frameworks such as SFCC-CI, Git workflow, Bitbucket Pipelines, etc. You have hands-on experience using OCAPI and web services (REST, SOAP), experience with Atlassian's JIRA, Confluence, and Git source code management tools, as well as experience in multiple web technologies including XML, HTML, CSS, AJAX/JavaScript, and web services/SOAP.
Posted 3 weeks ago
6.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
The WRB Data Technology team at Standard Chartered Bank supports Data and Business Intelligence, Finance and Risk projects globally by delivering data through data warehouse solutions. The team is composed of data specialists, technology experts, and project managers who work closely with business stakeholders to implement end-to-end solutions. Standard Chartered Bank is looking to hire skilled data professionals with relevant experience to contribute to the team's objectives. The successful candidates will be expected to work in a global environment, drawing from both internal and external talent pools. Your responsibilities as a member of the WRB Data Technology team will include participating in data warehousing migration programs involving cross-geography and multi-functional delivery. You will need to align project timelines to ensure successful project delivery, provide support for data analysis, mapping, and profiling, and perform data requirement gathering, analysis, and documentation. Additionally, you will be responsible for mapping data attributes from different source systems to target data models, interpreting use case requirements, designing target data models/data marts, and profiling data attributes to assess data quality and provide remediation recommendations. It is crucial to ensure that data use complies with data architecture principles, including golden sources and standard reference data. Furthermore, you will be involved in data modeling for better data integration within the data warehouse platform and project delivery, engaging consultants, business analysts, and escalating issues in a timely manner. You will work closely with Chapter Leads and Squad Leads to lead projects and manage various stakeholders, including business, technology teams, and internal development teams. 
Your role will involve transforming business requirements into data requirements, designing data models for use cases and data warehousing, creating data mapping templates, and profiling data to assess quality, suitability, and cardinality. You will also support data stores inbound and/or outbound development, perform data acceptance testing, provide direction on solutions from a standard product/architecture perspective, and participate in key decision-making discussions with business stakeholders. Additionally, you will be responsible for supporting System Integration Testing (SIT) and User Acceptance Testing (UAT), managing change requests effectively, ensuring alignment with bank processes and standards, and delivering functional specifications to the development team. To excel in this role, you should possess domain knowledge and technical skills, along with 6-8 years of experience in banking domain/product knowledge with IT working experience. A graduate degree in computer science or a relevant field is required, and familiarity with tools such as Clarity, ADO, Axess, and SQL is beneficial. Strong communication and stakeholder management skills are essential, as well as the ability to write complex SQL scripts. Knowledge of Base SAS is an advantage, and familiarity with Retail Banking and Wealth Lending data is ideal. You should be able to work effectively in a multi-cultural, cross-border, and matrix reporting environment, demonstrating knowledge management for MIS applications, business rules, mapping documents, data definitions, system functions, and processes. With a background in business or data analysis roles, you should have a good understanding of data analytics, deep dive capabilities, and excellent attention to detail and time management. This role offers the opportunity to become a go-to person for data across the bank globally, providing extensive exposure to all parts of the bank's business model. 
It serves as a solid foundation for a future career in the broader data space, preparing individuals for roles in analytics, business intelligence, and big data. Your work will contribute to driving commerce and prosperity through unique diversity, aligning with Standard Chartered Bank's purpose and brand promise to be here for good. If you are passionate about making a positive difference and are eager to work in a collaborative and inclusive environment, we encourage you to join our team at Standard Chartered Bank.
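The data-profiling step this role describes — assessing quality, suitability, and cardinality of attributes — can be sketched in a few lines of Python. The column names below are invented, and a list of dicts stands in for a sampled source extract:

```python
def profile(rows):
    """Per-column null rate and cardinality over a list-of-dicts sample."""
    cols = {k for r in rows for k in r}
    report = {}
    for col in sorted(cols):
        values = [r.get(col) for r in rows]
        non_null = [v for v in values if v is not None]
        report[col] = {
            "null_rate": 1 - len(non_null) / len(rows),
            "cardinality": len(set(non_null)),  # distinct non-null values
        }
    return report

rows = [
    {"cust_id": 1, "segment": "retail"},
    {"cust_id": 2, "segment": "retail"},
    {"cust_id": 3, "segment": None},
]
print(profile(rows))
```

A report like this is what backs the remediation recommendations: a high null rate flags incomplete attributes, while cardinality distinguishes candidate keys (distinct per row) from low-cardinality reference-data columns.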
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
As a PySpark Data Engineer, you will play a crucial role in developing robust data processing and transformation solutions within our data platform. Your responsibilities will include designing, implementing, and maintaining PySpark-based applications to handle complex data processing tasks, ensuring data quality, and integrating with diverse data sources. To excel in this role, you should possess strong PySpark development skills, experience with big data technologies, and the ability to thrive in a fast-paced, data-driven environment. Your primary responsibilities will involve designing, developing, and testing PySpark-based applications to process, transform, and analyze large-scale datasets from various sources such as relational databases, NoSQL databases, batch files, and real-time data streams. You will need to implement efficient data transformation and aggregation techniques using PySpark and relevant big data frameworks, as well as develop robust error handling and exception management mechanisms to maintain data integrity and system resilience within Spark jobs. Additionally, optimizing PySpark jobs for performance through techniques like partitioning, caching, and tuning of Spark configurations will be essential. Collaboration will be key in this role, as you will work closely with data analysts, data scientists, and data architects to understand data processing requirements and deliver high-quality data solutions. By analyzing and interpreting data structures, formats, and relationships, you will implement effective data transformations using PySpark and work with distributed datasets in Spark to ensure optimal performance for large-scale data processing and analytics. In terms of data integration and ETL processes, you will design and implement ETL (Extract, Transform, Load) processes to ingest and integrate data from various sources, ensuring consistency, accuracy, and performance. 
Integration of PySpark applications with data sources such as SQL databases, NoSQL databases, data lakes, and streaming platforms will also be a part of your responsibilities. To excel in this role, you should possess a Bachelor's degree in Computer Science, Information Technology, or a related field, along with 5+ years of hands-on experience in big data development, preferably with exposure to data-intensive applications. A strong understanding of data processing principles, techniques, and best practices in a big data environment is essential, as is proficiency in PySpark, Apache Spark, and related big data technologies for data processing, analysis, and integration. Experience with ETL development and data pipeline orchestration tools such as Apache Airflow and Luigi will be advantageous. Strong analytical and problem-solving skills, along with excellent communication and collaboration abilities, will also be critical for success in this role.
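The error-handling requirement above — keeping a batch alive when individual records fail — is commonly implemented as a quarantine (or "dead letter") pattern. A platform-agnostic sketch in plain Python follows; in an actual PySpark job the same try/except would typically live inside a `mapPartitions` function, and the record schema here is invented:

```python
def transform(record):
    """Hypothetical per-record transformation: parse the amount field."""
    return {"id": record["id"], "amount_usd": float(record["amount"])}

def run_batch(records):
    """Transform records, routing failures to a quarantine list
    instead of aborting the whole batch."""
    good, quarantined = [], []
    for rec in records:
        try:
            good.append(transform(rec))
        except (KeyError, ValueError) as exc:
            quarantined.append({"record": rec, "error": str(exc)})
    return good, quarantined

batch = [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "oops"}]
good, bad = run_batch(batch)
print(len(good), len(bad))  # 1 1
```

Catching only the expected data-quality exceptions (not a bare `except`) is the point: genuinely unexpected failures still abort the job, while malformed records land in the quarantine output for later analysis, preserving data integrity without sacrificing resilience.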
Posted 3 weeks ago
2.0 - 6.0 years
0 Lacs
noida, uttar pradesh
On-site
The PEX Report Developer position entails collaborating with fund accounting professionals and technology teams to develop, maintain, and enhance customized reporting statements. As a PEX Report Developer, your primary responsibility will involve utilizing QlikView version 11 or higher to create and manage reporting solutions. You should possess a minimum of 2 years of experience with a focus on QlikView Dashboard Design & Development. A strong understanding of SQL, relational databases, and Dimensional Modeling is essential for this role. Proficiency in working with large datasets and experience in handling complex data models involving more than 10 tables is required. You will be tasked with integrating data from various sources into a QlikView Data Model, including Social Media content and API extensions. The ideal candidate will have a Bachelor's degree in Computer Science and extensive expertise in all aspects of the QlikView lifecycle. You should be well-versed in complex QlikView functions, such as set analysis, alternate states, and advanced scripting. Experience with section access and implementing data level security is crucial for this role. Additionally, familiarity with QlikView distributed architecture, SDLC, and Agile software development concepts is preferred. Responsibilities of the role include creating new reporting and dashboard applications using technologies like QlikView and NPrinting to facilitate better decision-making within the business areas. You will collaborate with stakeholders to identify use cases, gather requirements, and translate them into system and functional specifications. Additionally, you will be responsible for installing, configuring, and maintaining the QlikView environment, developing complex QlikView applications, and defining data extraction processes from multiple sources. As part of the team, you will have the opportunity to mentor and train other team members on best practices related to QlikView. 
Furthermore, you will contribute to designing support procedures, training IT support staff, and providing end-user support for QlikView-related issues. Following the SDLC methodology is an integral part of this role. At GlobalLogic, we offer a culture that prioritizes caring, continuous learning and development opportunities, meaningful work on impactful projects, balance, flexibility, and a high-trust environment. As a trusted digital engineering partner, we collaborate with leading companies worldwide, driving digital transformation and creating intelligent products and services. Join us at GlobalLogic, a Hitachi Group Company, and be part of a team that is shaping the digital revolution and redefining industries through innovation and collaboration.
Posted 3 weeks ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Management Technical Lead/Product Owner at Air Products, you will lead the technical support and implementation of Data Management tools such as Alation Enterprise Data Catalog, SAP Information Steward, Precisely Management Studio, the Qlik suite, SAC (SAP Analytics Cloud), SAP Datasphere, and HANA. The role requires technical knowledge of these applications, including upgrades and maintenance, and effective collaboration with global teams to build relationships with key stakeholders and drive business value through the use of data tools. This is a hybrid role based in Pune, India.

Key Responsibilities :
- Serve as the main point of contact for technical support
- Define and prioritize the technical product backlog in alignment with business objectives
- Collaborate with cross-functional teams and lead the planning, execution, and delivery of technical environments
- Provide technical guidance, training, and support to end-users, ensuring successful deployment and utilization of the data management platforms

Required Skills and Experience :
- 8+ years of experience in applications development and/or business intelligence/database work, with a focus on requirements analysis
- Bachelor's degree in Computer Science, Information Systems, or a related field; Master's degree preferred
- Experience with Terraform and Azure DevOps for provisioning infrastructure
- Deep understanding of data catalog concepts and data integration
- Ability to troubleshoot technical issues, translate business requirements into technical solutions, and communicate effectively with stakeholders
- Experience with agile/scrum methodologies, strong analytical and problem-solving skills, and knowledge of data privacy considerations

By joining the Air Products team, you will contribute to building a cleaner future through safe, end-to-end solutions and drive innovation in the industrial gas industry. If you are a self-motivated and detail-oriented professional with a passion for data management and analytics solutions, we invite you to consider this opportunity to grow with us at Air Products and be part of our mission to reimagine what's possible in energy and environmental sustainability.
Posted 3 weeks ago
9.0 - 13.0 years
0 Lacs
hyderabad, telangana
On-site
At Pega, you will work alongside dedicated engineers driven by motivation and a strong sense of ownership. We focus on building high-quality software to ensure success for our customers, follow agile methodologies, and rely on collaboration and support as we innovate and make our products stand out using the latest technologies.

Key Responsibilities :
- Lead engineering teams that develop core features of the Pega Launchpad platform, a cloud-native, scalable, and fault-tolerant platform focused on integrating GenAI capabilities and enabling the Vibe coding paradigm
- Set clear goals for your team, provide continuous feedback, and ensure best engineering practices are followed to build top-notch software
- Apply your expertise in database and data integration technologies to translate requirements into application features
- Work closely with product management to define technical and architectural solutions, ensuring the quality and timely delivery of features

Required Skills and Experience :
- 9+ years of experience in software development, with a proven track record of delivering high-quality solutions that meet client expectations
- Background in enterprise-level, cloud-native software, coupled with experience managing high-performing teams
- Hands-on experience with cloud-native software and GenAI technologies to drive innovation and productivity within the team

At Pega, you will work in a dynamic and inclusive environment that fosters continuous learning and development. Join us in our mission to deliver cutting-edge solutions and make a meaningful impact in the world of software development. Job ID: 22267
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
As State Technical Head, you will lead and manage the technical aspects of the Credit Department's operations, ensuring the department runs efficiently and effectively.

Key Responsibilities :
- Oversee the development and maintenance of credit-related software, systems, and technological infrastructure
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Implement and enhance credit scoring models, algorithms, and decision-making tools
- Introduce new technologies to streamline credit processes and enhance data analytics capabilities
- Lead and manage a team of technical professionals, providing guidance and support for their professional development
- Collaborate with IT and security teams to ensure the security and compliance of credit-related systems
- Evaluate and recommend emerging technologies to enhance the Credit Department's capabilities
- Troubleshoot and resolve technical issues related to credit systems and applications
- Provide technical expertise and support for credit-related projects and initiatives

Required Skills and Experience :
- Bachelor's or Master's degree in engineering or a related field
- 5+ years of proven experience in a technical leadership role within the credit or financial services industry
- In-depth knowledge of credit scoring models, risk analytics, and credit decisioning processes
- Proficiency in programming languages and database management
- Strong understanding of data management, data integration, and data governance
- Leadership and team management skills, with the ability to inspire and guide technical professionals
- Excellent communication skills to convey technical concepts to non-technical stakeholders
- Familiarity with cloud computing platforms and technologies (an advantage)
- Detail orientation with a focus on system security and compliance
- Adaptability to evolving technologies and industry best practices

If you are interested in this position and meet the requirements, please share your CV at aamer.khan@herohfl.com. This position is based in Mumbai; only candidates from the Mumbai region with good knowledge of Mumbai-region properties should apply.
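As a loose illustration of the credit scoring models referred to above, many credit decisioning systems use an additive scorecard that maps applicant attributes to weighted points and compares the total against a cutoff. The sketch below is a minimal Python version; the attributes, weights, and cutoff are invented and do not represent any real credit policy:

```python
# Simplified additive scorecard; all numbers are illustrative assumptions,
# not a real lender's policy.
SCORECARD = {
    "on_time_payment_ratio": 300,   # points per unit of the ratio (0.0 - 1.0)
    "utilization":           -150,  # high revolving utilization lowers the score
    "years_of_history":      10,    # points per year of credit history, capped
}
BASE_SCORE = 400
CUTOFF = 600

def credit_score(applicant: dict) -> int:
    """Sum the base score and the weighted applicant attributes."""
    score = BASE_SCORE
    score += SCORECARD["on_time_payment_ratio"] * applicant["on_time_payment_ratio"]
    score += SCORECARD["utilization"] * applicant["utilization"]
    score += SCORECARD["years_of_history"] * min(applicant["years_of_history"], 10)
    return round(score)

def decision(applicant: dict) -> str:
    """Approve above the cutoff, otherwise refer for manual review."""
    return "approve" if credit_score(applicant) >= CUTOFF else "refer"

applicant = {"on_time_payment_ratio": 0.95, "utilization": 0.30, "years_of_history": 6}
# 400 + 300*0.95 - 150*0.30 + 10*6 = 700 -> "approve"
```

Production scoring models are typically statistical (for example, logistic regression calibrated to a points scale) rather than hand-set weights, but the scorecard-plus-cutoff structure is the same.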
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
As a PL/SQL Developer with 3+ years of experience in Oracle / Postgres, you will be responsible for designing, developing, and maintaining database applications using the PL/SQL programming language. Your key roles and responsibilities will include: - Designing and developing database schemas, stored procedures, functions, and triggers using PL/SQL to ensure efficient data storage and retrieval. - Optimizing database performance by tuning SQL queries and PL/SQL code to enhance overall system efficiency. - Developing and executing test plans to validate the quality and accuracy of PL/SQL code, ensuring the reliability of database applications. - Troubleshooting and resolving issues related to PL/SQL code to maintain the integrity and functionality of database systems. - Implementing database security policies and procedures to safeguard the confidentiality and integrity of data, ensuring data protection. - Collaborating with cross-functional teams to support their data needs and provide access to data for reporting and analytics purposes. - Deploying and supporting object shipment during any database deployment and integrated system upgrades. - Creating and maintaining database schemas, tables, indexes, and relationships based on project requirements and best practices. - Writing and optimizing SQL queries to extract, manipulate, and transform data for various business needs, ensuring query performance. - Integrating data from different sources into the SQL database, including APIs, flat files, and other databases, for comprehensive data storage. - Developing and maintaining data models, ER diagrams, and documentation to effectively represent database structures and relationships. - Monitoring and fine-tuning database performance to identify and resolve bottlenecks and inefficiencies for optimized system functionality. - Ensuring data accuracy and consistency through validation and cleansing processes, identifying and rectifying data quality issues. 
- Analyzing and optimizing complex SQL queries and procedures for enhanced performance and efficiency.
- Maintaining comprehensive documentation of database structures, schemas, and processes for future reference and team collaboration.

You should possess strong problem-solving and analytical skills with attention to detail, project management abilities to oversee multiple projects and meet deadlines, and strong collaboration skills to work both independently and in a team. Fluency in English, with excellent written and verbal communication skills, is essential for effective interaction with stakeholders.
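The trigger and query-tuning responsibilities above can be sketched with Python's standard sqlite3 module standing in for Oracle/Postgres (the schema is hypothetical): a trigger maintains an audit trail automatically, and adding an index changes the query plan from a table scan to an index search.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL);
CREATE TABLE audit_log (account_id INTEGER, old_balance REAL, new_balance REAL);

-- Trigger: record every balance change, analogous to a PL/SQL AFTER UPDATE trigger.
CREATE TRIGGER trg_balance_audit AFTER UPDATE OF balance ON accounts
BEGIN
    INSERT INTO audit_log VALUES (OLD.id, OLD.balance, NEW.balance);
END;
""")

cur.execute("INSERT INTO accounts VALUES (1, 100.0)")
cur.execute("UPDATE accounts SET balance = 250.0 WHERE id = 1")
# The trigger has logged the change without any application code.
audit_rows = cur.execute("SELECT * FROM audit_log").fetchall()

# Query tuning: without an index, filtering on balance scans the whole table...
plan_before = cur.execute(
    "EXPLAIN QUERY PLAN SELECT id FROM accounts WHERE balance > 50").fetchall()

# ...after creating an index, SQLite searches the index instead.
cur.execute("CREATE INDEX idx_accounts_balance ON accounts(balance)")
plan_after = cur.execute(
    "EXPLAIN QUERY PLAN SELECT id FROM accounts WHERE balance > 50").fetchall()
```

Oracle and Postgres express the same ideas with `CREATE OR REPLACE TRIGGER` plus PL/SQL (or PL/pgSQL) blocks and with `EXPLAIN PLAN` / `EXPLAIN ANALYZE`, but the workflow of inspecting the plan before and after adding an index is the same.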
Posted 3 weeks ago