5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
hackajob is collaborating with Comcast to connect them with exceptional tech professionals for this role. Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast.

Job Summary
Responsible for contributing towards the build and maintenance of the organization's cyber security systems and infrastructure. Exercises solid knowledge of engineering skills and methodology with a working knowledge of applicable cyber security compliance standards. Conducts security assessments and audits to identify cybersecurity risks within the company's networks, applications and operating systems. Helps secure and protect the Network Infrastructure: Routers, Switches, Optical Devices, L2 Datacenter and cabling, Strand Mounted devices, Secure Routing protocols, DOCSIS plant (CMTS/vCMTS/PON), SDN, best practice device configuration, network automation, monitoring and troubleshooting. Tests the company's internal systems to validate security and detect any computer and information security weaknesses. Performs a technical analysis of vulnerabilities and determines the impacts to the organization. Reports, tracks and records findings in a comprehensive vulnerability assessment report. Identifies and recommends appropriate action to mitigate vulnerabilities and reduce potential impacts on cybersecurity resources. Applies long-term objectives and plans related to the company's technical vision to daily activity. Applies innovative solutions for cyber engineering developmental problems that are competitive with industry and company standards. Has in-depth experience, knowledge and skills in own discipline. Usually determines own work priorities. Acts as a resource for colleagues with less experience.

Employees at all levels are expected to:
- Understand our Operating Principles; make them the guidelines for how you do your job
- Own the customer experience - think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services
- Know your stuff - be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences
- Win as a team - make big things happen by working together and being open to new ideas
- Be an active part of the Net Promoter System - a way of working that brings more employee and customer feedback into the company - by joining huddles, making call backs and helping us elevate opportunities to do better for our customers
- Drive results and growth
- Respect and promote inclusion and diversity
- Do what's right for each other, our customers, investors and our communities

Job Description
Roles and Responsibilities:
- Server Management: Build, configure, and maintain Linux and Windows servers.
- System Monitoring: Monitor system performance and ensure the reliability and availability of infrastructure.
- Patching and Updates: Apply patches, updates, and upgrades to operating systems and software.
- Configuration Management: Manage system configurations and ensure compliance with organizational policies.
- User Management: Create, manage, and maintain user accounts, roles, and permissions.
- Security Management: Implement and maintain security measures, including firewalls, intrusion detection systems, and access controls.
- Backup and Recovery: Develop and maintain backup and recovery procedures to ensure data integrity and availability.
- Scripting and Automation: Write and maintain scripts (BASH, Python) to automate routine tasks and improve efficiency.
- Network Management: Configure and manage network services, including DNS, DHCP, and VPN.
- Troubleshooting: Diagnose and resolve hardware, software, and network issues.
- Documentation: Maintain detailed documentation of system configurations, procedures, and changes.
- Performance Tuning: Optimize system performance and resource utilization.
- Compliance: Ensure systems comply with industry standards and regulatory requirements.
- Collaboration: Work closely with other IT and cybersecurity teams to support and implement new technologies and solutions.
- Incident Response: Participate in incident response and problem resolution activities.

Qualifications
- 5+ years of experience in building, running, and maintaining Linux and Windows servers in an enterprise environment.
- Proficiency in scripting (BASH, Python, etc.)
- Strong understanding of Linux infrastructure, daemons, services, PIDs, folder structures, and permissions.
- Excellent troubleshooting and problem-solving skills.
- Strong verbal and written communication skills.
- Relevant certifications (e.g., RHCE, CompTIA Linux+, Microsoft Certified: Windows Server) are a plus.

Some Other Qualifications That Are Good To Have
- Experience with log collection, log parsing/normalization and/or ETL pipelines is a plus.
- Previous experience within Cybersecurity teams is a plus.
- Previous experience with Logstash, Splunk, Elastic, Databricks, Tableau is a plus.

Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law. Base pay is one part of the Total Rewards that Comcast provides to compensate and recognize employees for their work. Most sales positions are eligible for a Commission under the terms of an applicable plan, while most non-sales positions are eligible for a Bonus. Additionally, Comcast provides best-in-class Benefits to eligible employees. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance and always-on tools that are personalized to meet the needs of your reality - to help support you physically, financially and emotionally through the big milestones and in your everyday life.

Education
Bachelor's Degree. While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience.

Relevant Work Experience
5-7 Years
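The scripting and automation bullet above (BASH/Python for routine tasks) is the most code-oriented part of this role. As a minimal, illustrative sketch only (the threshold, mount point, and service names below are hypothetical, not taken from the posting), a routine health check on a Linux host might look like this in Python:

```python
#!/usr/bin/env python3
"""Minimal sketch of a routine server health check (illustrative only)."""
import shutil
import subprocess

DISK_THRESHOLD = 0.85          # hypothetical alert threshold (85% used)
SERVICES = ["sshd", "crond"]   # hypothetical services to verify

def check_disk(path="/"):
    usage = shutil.disk_usage(path)
    used_fraction = usage.used / usage.total
    status = "ALERT" if used_fraction > DISK_THRESHOLD else "OK"
    print(f"[{status}] {path} is {used_fraction:.0%} full")

def check_service(name):
    # systemctl returns a non-zero exit code when the unit is not active
    result = subprocess.run(
        ["systemctl", "is-active", "--quiet", name], check=False
    )
    status = "OK" if result.returncode == 0 else "ALERT"
    print(f"[{status}] service {name}")

if __name__ == "__main__":
    check_disk("/")
    for svc in SERVICES:
        check_service(svc)
```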
Posted 1 week ago
10.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Make an impact with NTT DATA Join a company that is pushing the boundaries of what is possible. We are renowned for our technical excellence and leading innovations, and for making a difference to our clients and society. Our workplace embraces diversity and inclusion – it’s a place where you can grow, belong and thrive. Your day at NTT DATA We are seeking an experienced Solution Architect/Business Development Manager with expertise in AI/ML to drive business growth and deliver innovative solutions. The successful candidate will be responsible for assessing client business requirements, designing technical solutions, recommending AI/ML approaches, and collaborating with delivery organizations to implement end-to-end solutions. What You'll Be Doing Key Responsibilities: Business Requirement Analysis: Assess client's business requirements and convert them into technical specifications that meet business outcomes. AI/ML Solution Design: Recommend the right AI/ML approaches to meet business requirements and design solutions that drive business value. Opportunity Sizing: Size the opportunity and develop business cases to secure new projects and grow existing relationships. Solution Delivery: Collaborate with delivery organizations to design end-to-end AI/ML solutions, ensuring timely and within-budget delivery. Costing and Pricing: Develop costing and pricing strategies for AI/ML solutions, ensuring competitiveness and profitability. Client Relationship Management: Build and maintain strong relationships with clients, understanding their business needs and identifying new opportunities. Technical Leadership: Provide technical leadership and guidance to delivery teams, ensuring solutions meet technical and business requirements. Knowledge Sharing: Share knowledge and expertise with the team, contributing to the development of best practices and staying up-to-date with industry trends. Collaboration: Work closely with cross-functional teams, including data science, engineering, and product management, to ensure successful project delivery. Requirements: Education: Master's degree in Computer Science, Engineering, or related field Experience: 10+ years of experience in AI/ML solution architecture, business development, or a related field Technical Skills: Strong technical expertise in AI/ML, including machine learning algorithms, deep learning, and natural language processing. 
Technical Skills: Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity Technical Skills: Hands-on implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms Hyperscaler: Experience with cloud-based AI/ML platforms and tools (e.g., AWS SageMaker, Azure Machine Learning, Google Cloud AI Platform) Softskill: Excellent business acumen and understanding of business requirements and outcomes Softskill: Strong communication and interpersonal skills, with ability to work with clients and delivery teams Business Acumen: Experience with solution costing and pricing strategies with Strong analytical and problem-solving skills, with ability to think creatively and drive innovation Nice to Have: Experience with Agile development methodologies Knowledge of industry-specific AI/ML applications (e.g., healthcare, finance, retail) Certification in AI/ML or related field (e.g., AWS Certified Machine Learning – Specialty) Location: Delhi or Bangalore Workplace type: Hybrid Working About NTT DATA NTT DATA is a $30+ billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. We invest over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure, and connectivity. We are also one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group and headquartered in Tokyo. Equal Opportunity Employer NTT DATA is proud to be an Equal Opportunity Employer with a global culture that embraces diversity. We are committed to providing an environment free of unfair discrimination and harassment. We do not discriminate based on age, race, colour, gender, sexual orientation, religion, nationality, disability, pregnancy, marital status, veteran status, or any other protected category. Join our growing global team and accelerate your career with us. Apply today.
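Where the requirements above call for hands-on implementation of algorithms such as logistic regression and clustering, the following is a minimal, illustrative scikit-learn sketch on synthetic data (the dataset sizes and parameters are invented for the example, not specified by the posting):

```python
"""Illustrative sketch: fitting a classifier and a clustering model on synthetic data."""
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans
from sklearn.metrics import accuracy_score

# Synthetic data stands in for real client data
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Normalization before modelling, as the data-munging requirement suggests
scaler = StandardScaler().fit(X_train)
X_train_s, X_test_s = scaler.transform(X_train), scaler.transform(X_test)

clf = LogisticRegression(max_iter=1000).fit(X_train_s, y_train)
print("holdout accuracy:", accuracy_score(y_test, clf.predict(X_test_s)))

# Unsupervised view of the same feature space
clusters = KMeans(n_clusters=3, n_init=10, random_state=42).fit_predict(X_train_s)
print("cluster sizes:", [int((clusters == k).sum()) for k in range(3)])
```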
Posted 1 week ago
10.0 years
0 Lacs
Delhi Cantonment, Delhi, India
On-site
Make an impact with NTT DATA Join a company that is pushing the boundaries of what is possible. We are renowned for our technical excellence and leading innovations, and for making a difference to our clients and society. Our workplace embraces diversity and inclusion – it’s a place where you can grow, belong and thrive. Your day at NTT DATA We are seeking an experienced Solution Architect/Business Development Manager with expertise in AI/ML to drive business growth and deliver innovative solutions. The successful candidate will be responsible for assessing client business requirements, designing technical solutions, recommending AI/ML approaches, and collaborating with delivery organizations to implement end-to-end solutions. What You'll Be Doing Key Responsibilities: Business Requirement Analysis: Assess client's business requirements and convert them into technical specifications that meet business outcomes. AI/ML Solution Design: Recommend the right AI/ML approaches to meet business requirements and design solutions that drive business value. Opportunity Sizing: Size the opportunity and develop business cases to secure new projects and grow existing relationships. Solution Delivery: Collaborate with delivery organizations to design end-to-end AI/ML solutions, ensuring timely and within-budget delivery. Costing and Pricing: Develop costing and pricing strategies for AI/ML solutions, ensuring competitiveness and profitability. Client Relationship Management: Build and maintain strong relationships with clients, understanding their business needs and identifying new opportunities. Technical Leadership: Provide technical leadership and guidance to delivery teams, ensuring solutions meet technical and business requirements. Knowledge Sharing: Share knowledge and expertise with the team, contributing to the development of best practices and staying up-to-date with industry trends. Collaboration: Work closely with cross-functional teams, including data science, engineering, and product management, to ensure successful project delivery. Requirements: Education: Master's degree in Computer Science, Engineering, or related field Experience: 10+ years of experience in AI/ML solution architecture, business development, or a related field Technical Skills: Strong technical expertise in AI/ML, including machine learning algorithms, deep learning, and natural language processing. 
Technical Skills: Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity Technical Skills: Hands-on implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms Hyperscaler: Experience with cloud-based AI/ML platforms and tools (e.g., AWS SageMaker, Azure Machine Learning, Google Cloud AI Platform) Softskill: Excellent business acumen and understanding of business requirements and outcomes Softskill: Strong communication and interpersonal skills, with ability to work with clients and delivery teams Business Acumen: Experience with solution costing and pricing strategies with Strong analytical and problem-solving skills, with ability to think creatively and drive innovation Nice to Have: Experience with Agile development methodologies Knowledge of industry-specific AI/ML applications (e.g., healthcare, finance, retail) Certification in AI/ML or related field (e.g., AWS Certified Machine Learning – Specialty) Location: Delhi or Bangalore Workplace type: Hybrid Working About NTT DATA NTT DATA is a $30+ billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. We invest over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure, and connectivity. We are also one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group and headquartered in Tokyo. Equal Opportunity Employer NTT DATA is proud to be an Equal Opportunity Employer with a global culture that embraces diversity. We are committed to providing an environment free of unfair discrimination and harassment. We do not discriminate based on age, race, colour, gender, sexual orientation, religion, nationality, disability, pregnancy, marital status, veteran status, or any other protected category. Join our growing global team and accelerate your career with us. Apply today.
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Roorkee, Uttarakhand, India
Remote
Company Description
Miratech helps visionaries change the world. We are a global IT services and consulting company that brings together enterprise and start-up innovation. Today, we support digital transformation for some of the world's largest enterprises. By partnering with both large and small players, we stay at the leading edge of technology, remain nimble even as a global leader, and create technology that helps our clients further enhance their business. We are a values-driven organization and our culture of Relentless Performance has enabled over 99% of Miratech's engagements to succeed by meeting or exceeding our scope, schedule, and/or budget objectives since our inception in 1989. Miratech has coverage across 5 continents and operates in 25 countries around the world. Miratech retains nearly 1000 full-time professionals, and our annual growth rate exceeds 25%.

Job Description
We seek a skilled Senior Python and SQL Engineer to work in the data team of a large financial asset management company, supporting their applications from a data perspective.

Responsibilities:
- Write, test, and maintain Python code and SQL queries to support project requirements.
- Assist in system integration and debugging, addressing issues as they arise.
- Collaborate with senior engineers to ensure solutions are aligned with project goals.
- Conduct development testing to verify components function as intended.
- Perform data analysis, identify inconsistencies, and propose solutions to improve quality.
- Participate in task estimation and contribute to project timelines.
- Maintain technical documentation for solutions and processes.
- Support ongoing system improvements under the guidance of senior team members.

Qualifications
- 3-5 years of experience as a software developer using Python.
- 1-2 years of experience working with relational databases, preferably Sybase, and SQL experience with Database Modeling/Normalization techniques.
- Experience with Linux operating systems.
- Experience in the finance industry and knowledge of financial products/markets.
- Experience working in a globally distributed team.
- Written and spoken fluency in English.
- Excellent communication skills, both written and verbal.
- A track record of taking the initiative to solve problems and working independently with minimal direction.

Nice to have:
- Experience with Python frameworks utilizing Asyncio.
- Familiarity with cloud technologies like Kubernetes, Docker.
- Experience with DevOps tools like Git, Maven, Jenkins, GitLab CI.
- Experience in designing multi-tier application architectures and distributed caching solutions.
- ETL background in any language or tools.
- Experience working with large volumes of time series data and building services, APIs, and applications based on it.
- Ability to troubleshoot and fix performance issues across the codebase and database queries.
- BA/BS in Computer Science or equivalent practical experience.

We offer: Culture of Relentless Performance: join an unstoppable technology development team with a 99% project success rate and more than 30% year-over-year revenue growth. Competitive Pay and Benefits: enjoy a comprehensive compensation and benefits package, including health insurance, and a relocation program. Work From Anywhere Culture: make the most of the flexibility that comes with remote work. Growth Mindset: reap the benefits of a range of professional development opportunities, including certification programs, mentorship and talent investment programs, internal mobility and internship opportunities.
Global Impact: collaborate on impactful projects for top global clients and shape the future of industries. Welcoming Multicultural Environment: be a part of a dynamic, global team and thrive in an inclusive and supportive work environment with open communication and regular team-building company social events. Social Sustainability Values: join our sustainable business practices focused on five pillars, including IT education, community empowerment, fair operating practices, environmental sustainability, and gender equality. Miratech is an equal opportunity employer and does not discriminate against any employee or applicant for employment on the basis of race, color, religion, sex, national origin, age, disability, veteran status, sexual orientation, gender identity, or any other protected status under applicable law.
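As a small illustration of the Python, SQL, and normalization skills this posting asks for, here is a hedged, self-contained sketch; it uses SQLite purely as a stand-in (the client's stack is Sybase) and the table and column names are hypothetical:

```python
"""Illustrative sketch: a small normalized schema and a parameterized join query (SQLite stand-in)."""
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Two normalized tables instead of one wide table: instruments and their prices
cur.executescript("""
CREATE TABLE instrument (
    instrument_id INTEGER PRIMARY KEY,
    ticker TEXT NOT NULL UNIQUE,
    asset_class TEXT NOT NULL
);
CREATE TABLE price (
    instrument_id INTEGER NOT NULL REFERENCES instrument(instrument_id),
    price_date TEXT NOT NULL,
    close_price REAL NOT NULL,
    PRIMARY KEY (instrument_id, price_date)
);
""")

cur.executemany("INSERT INTO instrument VALUES (?, ?, ?)",
                [(1, "ABC", "Equity"), (2, "XYZ", "Bond")])
cur.executemany("INSERT INTO price VALUES (?, ?, ?)",
                [(1, "2024-01-02", 101.5), (2, "2024-01-02", 99.8)])

# Parameterized join query, the shape of day-to-day support queries
cur.execute("""
    SELECT i.ticker, p.price_date, p.close_price
    FROM price p
    JOIN instrument i ON i.instrument_id = p.instrument_id
    WHERE p.price_date = ?
    ORDER BY i.ticker
""", ("2024-01-02",))
for row in cur.fetchall():
    print(row)
conn.close()
```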
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Summary
- Proficiency with major search engines and platforms such as Coveo, Elasticsearch, Solr, MongoDB Atlas, or similar technologies.
- Experience with Natural Language Processing (NLP) and machine learning techniques for search relevance and personalization.
- Ability to design and implement ranking algorithms and relevance tuning.
- Experience with A/B testing and other methods for optimizing search results.
- Experience with analyzing search logs and metrics to understand user behavior and improve search performance.
- Deep understanding of indexing, data storage, and retrieval mechanisms (RAG).
- Experience with data integration, ETL processes, and data normalization.
- Knowledge of scaling search solutions to handle large volumes of data and high query loads.
- Strong knowledge of programming languages like C#.NET, Python, or JavaScript for developing and customizing search functionalities.
- Experience in integrating search solutions with various APIs and third-party systems.
- Understanding of how search interfaces impact user experience and ways to improve search usability and efficiency.
- Experience with enterprise-level systems and an understanding of how search integrates with broader IT infrastructure and business processes.
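The ranking and relevance-tuning requirement above can be illustrated with a minimal, library-free BM25-style scorer over a toy corpus; the documents and parameters below are invented, and production work would rely on the engines named in the summary (e.g., Elasticsearch or Solr):

```python
"""Illustrative sketch: BM25-style ranking over a toy document set (pure Python)."""
import math
from collections import Counter

docs = {
    "doc1": "enterprise search with elasticsearch and relevance tuning",
    "doc2": "machine learning for personalization and ranking",
    "doc3": "etl pipelines and data normalization for search indexing",
}
tokenized = {d: text.split() for d, text in docs.items()}
avgdl = sum(len(toks) for toks in tokenized.values()) / len(tokenized)
N = len(tokenized)

def idf(term):
    n_t = sum(1 for toks in tokenized.values() if term in toks)
    return math.log((N - n_t + 0.5) / (n_t + 0.5) + 1.0)

def bm25(query, doc_id, k1=1.5, b=0.75):
    toks = tokenized[doc_id]
    freqs = Counter(toks)
    score = 0.0
    for term in query.split():
        f = freqs.get(term, 0)
        if f == 0:
            continue
        denom = f + k1 * (1 - b + b * len(toks) / avgdl)
        score += idf(term) * f * (k1 + 1) / denom
    return score

query = "search ranking"
for d in sorted(docs, key=lambda d: bm25(query, d), reverse=True):
    print(d, round(bm25(query, d), 3))
```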
Posted 1 week ago
6.0 - 9.0 years
0 Lacs
Hyderābād
Remote
Description Brief Job Overview The position is a non-supervisory position in our Digital Marketing team. This position will be the in-house expert on the Marketing Automation Platform (MAP) and the go-to person for implementing the platform, executing campaigns using the MAP and on marketing channels integrated with the MAP. How will YOU create impact here at USP? In this role at USP, you contribute to USP's public health mission of increasing equitable access to high-quality, safe medicine and improving global health through public standards and related programs. In addition, as part of our commitment to our employees, Global, People, and Culture, in partnership with the Equity Office, regularly invests in the professional development of all people managers. This includes training in inclusive management styles and other competencies necessary to ensure engaged and productive work environments. The position is critical to USP establishing a Center of excellence in Digital Marketing. The position will be responsible for: Implementation: In collaboration with external consultants, the data strategy team, and the Salesforce CRM team, upgrade from Pardot to Marketo and manage it Campaign Management: In collaboration with the campaign manager, develop, execute, and optimize marketing campaigns within the MAP, including email marketing, lead nurturing, scoring, and segmentation Marketing Automation Strategy: Collaborate with the marketing & sales teams to define and implement marketing automation strategies Lead Management: Implement lead scoring & grading models to improve lead qualification. Ensure smooth integration with Salesforce CRM for lead handoff to sales Email Marketing: Design and manage email workflows, A/B testing, personalization, and targeted campaigns to increase engagement and conversion Landing Pages & Forms: Create and maintain MAP landing pages, forms, and automated follow-up processes to capture leads and manage workflows Segmentation & Targeting: Develop dynamic and static lists, segment audiences based on demographic, firmographic, and behavioral data Marketing Automation Optimization: Continuously monitor and optimize marketing automation processes to improve lead generation, conversion rates, and overall campaign performance Integration: Maintain the Marketing Automation integration with CRM and assist in the integration of additional platforms into the marketing automation software Reporting: Work with management to define KPIs, create detailed campaign performance reporting, analyze campaign performance Training & Support: Provide training and support to internal teams on MAP best practices, helping to ensure broad adoption and efficient use of the platform Who is USP Looking For? The successful candidate will have a demonstrated understanding of our mission, commitment to excellence through inclusive and equitable behaviors and practices, ability to quickly build credibility with stakeholders, along with the following competencies and experience: Bachelor’s degree in marketing, business, Information Technology, or a related field. 6-9 years of relevant hands-on experience with Marketing Automation, preferably Marketo in operational set-up, Lead Management Framework (Scoring, Lead Lifecycle Modeler, etc.), Data Normalization setup, Preference Center setup, landing page, email settings and Launch Point integrations etc. 
Knowledge of HTML, CSS, and JavaScript to make email and landing page updates, CRM integrations and other native integrations; well versed in best practices for administering the Marketo ecosystem. Strong data analysis capabilities to interpret campaign results and optimize performance. Ability to report on key metrics like email performance, lead engagement, CTR and ROI. Ability to manage multiple campaigns and projects, adhering to timelines and delivering results. Excellent written and oral communication skills. Ability to excel in a cross-functional environment.

Additional Desired Preferences
Desired preferences showcase any additional preferred levels of expertise to perform the role. Reminder: any items listed in this section are neither requirements nor disqualifiers for candidate consideration.
10-12 years of relevant hands-on experience with Marketing Automation, preferably Marketo in operational set-up, Lead Management Framework (Scoring, Lead Lifecycle Modeler, etc.), Data Normalization setup, Preference Center setup, landing page, email settings and Launch Point integrations etc. Experience with customer journeys to create and optimize innovative lead generation strategies to move prospects through the marketing funnel. Understanding of how to build a variety of marketing models, including attribution models. Ability to quickly learn new tools and technologies, troubleshoot issues, and optimize marketing workflows. Marketo Certified Expert certification.

Supervisory Responsibilities
None

Benefits
USP provides the benefits to protect yourself and your family today and tomorrow. From company-paid time off and comprehensive healthcare options to retirement savings, you can have peace of mind that your personal and financial well-being is protected.
Note: USP does not accept unsolicited resumes from 3rd party recruitment agencies and is not responsible for fees from recruiters or other agencies except under specific written agreement with USP.

Who is USP?
The U.S. Pharmacopeial Convention (USP) is an independent scientific organization that collaborates with the world's top authorities in health and science to develop quality standards for medicines, dietary supplements, and food ingredients. USP's fundamental belief that Equity = Excellence manifests in our core value of Passion for Quality through our more than 1,300 hard-working professionals across twenty global locations to deliver the mission to strengthen the supply of safe, quality medicines and supplements worldwide. At USP, we value inclusivity for all. We recognize the importance of building an organizational culture with meaningful opportunities for mentorship and professional growth. From the standards we create, the partnerships we build, and the conversations we foster, we affirm the value of Diversity, Equity, Inclusion, and Belonging in building a world where everyone can be confident of quality in health and healthcare. USP is proud to be an equal employment opportunity employer (EEOE) and affirmative action employer. We are committed to creating an inclusive environment in all aspects of our work—an environment where every employee feels fully empowered and valued irrespective of, but not limited to, race, ethnicity, physical and mental abilities, education, religion, gender identity, and expression, life experience, sexual orientation, country of origin, regional differences, work experience, and family status. We are committed to working with and providing reasonable accommodation to individuals with disabilities.
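Lead scoring and grading, mentioned in both the responsibilities and the experience requirements above, is essentially a points model over demographic and behavioral signals. The sketch below is illustrative only; the attributes, point values, and threshold are invented and are not USP's or Marketo's:

```python
"""Illustrative sketch: a simple demographic + behavioral lead-scoring model."""

# Hypothetical point values; real models are tuned with sales and marketing input
DEMOGRAPHIC_POINTS = {"job_title_match": 15, "target_industry": 10, "region_served": 5}
BEHAVIORAL_POINTS = {"email_click": 5, "form_submit": 20, "webinar_attended": 25}
MQL_THRESHOLD = 50  # score at which a lead is handed to sales

def score_lead(lead):
    demo = sum(pts for attr, pts in DEMOGRAPHIC_POINTS.items() if lead.get(attr))
    behav = sum(BEHAVIORAL_POINTS.get(evt, 0) for evt in lead.get("events", []))
    total = demo + behav
    grade = "A" if demo >= 25 else "B" if demo >= 15 else "C"
    return {"score": total, "grade": grade, "mql": total >= MQL_THRESHOLD}

lead = {
    "job_title_match": True,
    "target_industry": True,
    "events": ["email_click", "form_submit", "webinar_attended"],
}
print(score_lead(lead))  # {'score': 75, 'grade': 'A', 'mql': True}
```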
Posted 1 week ago
15.0 years
0 Lacs
Gurgaon
On-site
Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : SAP Plant Maintenance (PM) Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams. Roles & Responsibilities: - Expected to be an SME - Collaborate and manage the team to perform - Responsible for team decisions - Engage with multiple teams and contribute to key decisions - Provide solutions to problems for their immediate team and across multiple teams - Lead the effort to design, build, and configure applications - Act as the primary point of contact for the project - Manage the team and ensure successful project delivery - Collaborate with multiple teams to make key decisions - Provide solutions to problems for the immediate team and across multiple teams Professional & Technical Skills: - Must To Have Skills: Proficiency in SAP Plant Maintenance (PM) - Strong understanding of statistical analysis and machine learning algorithms - Experience with data visualization tools such as Tableau or Power BI - Hands-on implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms - Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity Additional Information: - The candidate should have a minimum of 5 years of experience in SAP Plant Maintenance (PM) - This position is based at our Gurugram office - A 15 years full-time education is required 15 years full time education
Posted 1 week ago
15.0 years
0 Lacs
Gurgaon
On-site
Project Role : Application Support Engineer Project Role Description : Act as software detectives, provide a dynamic service identifying and solving issues within multiple components of critical business systems. Must have skills : Software License Management Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary: Competent on any 2 tier 1 publishers (Microsoft, Oracle, IBM, VMware, SAP) & any 2 Tier 2 publishers (Salesforce, Adobe, Quest, Autodesk, Microfocus, Citrix, Veritas, Informatica). Hands on experience on ServiceNow SAM Pro / Flexera / SNOW SLM. Good understanding of publisher contracts, license metrics and product use rights. Experience in creation of entitlements, license overview report and contracts. Experience in handling software license requests and performing technical validation. Key Responsibilities: • Maintain software publisher licensing information for the assigned publishers (i.e., both entitlements and deployments) • Analyze software licensing agreements, create entitlements summary, and summarize use right information from software agreements. • Importing licenses and agreements into the SAM tool (SNOW SLM/ SAM Pro, Flexera/Others). • Update software entitlement and agreement information into the SAM tool. • Maintain accurate records of software licenses and related assets, ensuring compliance with licensing agreements and regulations. • Develop and implement software license management policies and procedures, ensuring adherence to industry best practices and standards. • Maintain software installation records in SAM tool and perform product normalization. • Perform license reconciliation in SAM tool. • Work with internal stakeholders to ensure deployment of software applications are compliant and if not, work with the stakeholders to remediate non-compliance. • Respond to customer queries on software licensing. • Create customized reports and recommendations to report on SAM function activities. • Identify cost savings and license re-harvesting opportunities. • Drive periodic or ad-hoc stakeholder and project meetings. Technical Experience: • Excellent command over software licensing and use rights information of tier 1 software publishers (i.e., Microsoft, Oracle, IBM, VMware, Adobe, Citrix, and SAP) • Proficient in creating and delivering IBM Sub-Capacity Mainframe ELP reports • Proficient in creating Oracle DB server and Options ELP reports. Performing manual reconciliation and deployment validation as required Experience working on at least one or more SAM Tools (i.e., ServiceNow SAMPro, Flexera, SNOW License Manager) Professional Attributes: Excellent communication skills Expert knowledge in MS Office applications (Excel & PowerPoint) Ability to work in a team environment. Must have Skills: Software licensing & Software Asset Management Tools Good to Have Skills: Analytical and Communication Skills Candidate should be flexible on doing shifts and coming to office. Educational Qualification: 15 years of full-time education Desired Certifications: CSAM CITAM FlexNet Manager Implementation & Administration Flexera Certified IT Asset Management Administrator 15 years full time education
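License reconciliation, a core responsibility above, amounts to comparing entitlements against discovered deployments to produce an effective license position. A minimal illustrative Python sketch follows; the publishers, products, and counts are made up, and in practice this work happens inside the SAM tools named in the posting:

```python
"""Illustrative sketch: reconciling license entitlements against deployments."""
from collections import Counter

# Hypothetical entitlement records: (publisher, product, licensed quantity)
entitlements = [
    ("Microsoft", "Visio Professional", 120),
    ("Adobe", "Acrobat Pro", 80),
]
# Hypothetical normalized installation records from discovery data
deployments = [("Microsoft", "Visio Professional")] * 135 + [("Adobe", "Acrobat Pro")] * 60

installed = Counter(deployments)
for publisher, product, owned in entitlements:
    used = installed.get((publisher, product), 0)
    position = owned - used
    status = "compliant" if position >= 0 else "NON-COMPLIANT"
    print(f"{publisher} {product}: owned={owned} installed={used} "
          f"position={position:+d} -> {status}")
```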
Posted 1 week ago
15.0 years
0 Lacs
Calcutta
On-site
Project Role : Application Tech Support Practitioner Project Role Description : Act as the ongoing interface between the client and the system or application. Dedicated to quality, using exceptional communication skills to keep our world class systems running. Can accurately define a client issue and can interpret and design a resolution based on deep product knowledge. Must have skills : Software License Management Good to have skills : NA Minimum 2 year(s) of experience is required Educational Qualification : 15 years full time education Summary: Service request management for IMAC tickets Discovery and software inventory reconciliation Gap analysis and Scheduled Reporting. Maintain currency on policy and process documents Track license renewals and warranty data in support of refresh as well as end of life processes. Minimum 2 plus years of experience ServiceNow and IT Software License Management Key Responsibilities: Maintain software publisher licensing information for the assigned publishers (i.e., both entitlements and deployments) Analyze software licensing agreements, create entitlements summary, and summarize use right information from software agreements. Importing licenses and agreements into the SAM tool (SNOW SLM/ SAM Pro, Flexera/Others). Update software entitlement and agreement information into the SAM tool. Maintain accurate records of software licenses and related assets, ensuring compliance with licensing agreements and regulations. Develop and implement software license management policies and procedures, ensuring adherence to industry best practices and standards. Maintain software installation records in SAM tool and perform product normalization. Perform license reconciliation in SAM tool. Work with internal stakeholders to ensure deployment of software applications are compliant and if not, work with the stakeholders to remediate non-compliance. Respond to customer queries on software licensing. Create customized reports and recommendations to report on SAM function activities. Identify cost savings and license re-harvesting opportunities. Drive periodic or ad-hoc stakeholder and project meetings. Technical Experience: Excellent command over software licensing and use rights information of tier 1 software publishers (i.e., Microsoft, Oracle, IBM, VMware, Adobe, Citrix, and SAP) Proficient in creating and delivering IBM Sub-Capacity Mainframe ELP reports Proficient in creating Oracle DB server and Options ELP reports. Performing manual reconciliation and deployment validation as required Experience working on at least one or more SAM Tools (i.e., ServiceNow SAMPro, Flexera, SNOW License Manager) Professional Attributes: Excellent communication skills Expert knowledge in MS Office applications (Excel & PowerPoint) Ability to work in a team environment. Must have Skills: Software licensing & Software Asset Management Tools Good to Have Skills: Analytical and Communication Skills Educational Qualification: 15 years of full-time education Desired Certifications: CSAM CITAM FlexNet Manager Implementation & Administration Flexera Certified IT Asset Management Administrator 15 years full time education
Posted 1 week ago
18.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Role Overview:
We are seeking a strategic and results-driven professional to lead our Global Pay for Performance (GPP) program. This individual will oversee the global performance management framework, align variable pay with enterprise-wide KPIs, and drive a culture of meritocracy and performance-based rewards. The ideal candidate will possess a deep understanding of compensation structures, business-linked KPIs, performance evaluation models, and global reward governance.

Key Responsibilities:
1. Strategy and Design
Lead the design, implementation, and evolution of the Global Pay for Performance (GPP) framework. Align performance metrics with corporate, regional, and functional goals to ensure cross-border consistency and impact. Collaborate with Finance and Business Leaders to define global KPI scorecards tied to variable pay outcomes.
2. Performance Evaluation Framework
Oversee the definition and calibration of KPIs for different business units and employee levels globally. Create and run governance models for performance assessments across geographies. Standardize performance rating normalization processes and ensure fairness and transparency.
3. Compensation & Rewards Execution
Anchor the end-to-end cycle for variable pay – including budgeting, performance rating analysis, payout calculation, communication, and distribution. Monitor market trends and benchmarks to ensure competitiveness of performance-linked pay structures. Design and execute annual payout simulations and impact analyses in partnership with Finance and HRBPs.
4. Governance, Analytics & Compliance
Develop GPP policies, guardrails, and audit-ready documentation for payouts and eligibility. Provide analytics-driven insights to leadership on payout trends, performance distribution, and ROI. Ensure compliance with country-specific employment and tax regulations related to variable compensation.
5. Technology & Systems
Partner with HRIS / Rewards systems teams to automate and digitize GPP processes. Ensure data integrity, dashboarding, and accurate reporting across all GPP modules.

Key Requirements:
Experience: 12–18 years in HR, Total Rewards, or Compensation & Benefits; at least 5 years in a leadership role managing performance-based variable pay programs globally.
Education: MBA in HR / Finance or equivalent. Certifications in Compensation or Performance Management preferred.
Technical Skills: Strong knowledge of KPI-based compensation design, analytics, Excel modeling, or enterprise tools like SAP SuccessFactors, Workday, Oracle HCM.
Business Acumen: Deep understanding of business-linked performance metrics and the ability to interface with CXOs, finance heads, and functional leaders.
Soft Skills: Strategic thinking, governance mindset, influencing skills, and the ability to manage ambiguity across global geographies.
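Rating normalization and payout calculation (sections 2 and 3 above) reduce to straightforward arithmetic once policy is fixed. The sketch below is purely illustrative; the ratings, bonus target, and multiplier curve are invented and do not reflect any actual plan:

```python
"""Illustrative sketch: normalizing performance ratings and computing variable pay."""
from statistics import mean, pstdev

# Hypothetical raw manager ratings on a 1-5 scale
raw_ratings = {"emp_a": 4.6, "emp_b": 3.8, "emp_c": 4.1, "emp_d": 2.9}

# Z-score normalization so ratings are comparable across units and geographies
mu, sigma = mean(raw_ratings.values()), pstdev(raw_ratings.values())
normalized = {e: (r - mu) / sigma for e, r in raw_ratings.items()}

# Hypothetical payout policy: target bonus scaled by a capped performance multiplier
TARGET_BONUS_PCT = 0.15  # 15% of base pay at target

def payout(base_pay, z):
    multiplier = max(0.0, min(1.5, 1.0 + 0.25 * z))  # capped at 150% of target
    return round(base_pay * TARGET_BONUS_PCT * multiplier, 2)

for emp, z in normalized.items():
    print(emp, "z =", round(z, 2), "payout =", payout(1_000_000, z))
```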
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description Summary: You will be creating, maintaining, and supporting data pipelines with information coming from our vessels, projects, campaigns, third-party data services, and so on. You will play a key role in organizing data, developing and maintaining data models, and designing modern data solutions and products on our cloud data platform. You will work closely with the business to define and fine-tune requirements. You will support our data scientists and report developers across the organization and enable them to find the required data and information.

Your responsibilities
You have a result-driven and hands-on mindset and prefer to work in an agile environment. You are a team player and good communicator. You have experience with SQL or other data-oriented development languages (Python, Scala, Spark, etc.). You have proven experience in developing data models and database structures. You have proven experience with UML modelling and ER modelling for documenting and designing data structures. You have proven experience in the development of data pipelines and orchestrations. You have a master's or bachelor's degree in the field of engineering or computer science. You like to iterate quickly and try out new things. Ideally, you have experience with a wide variety of data tools and data types like geospatial, time series, structured and unstructured, etc.

Your profile
Experience with the Microsoft Azure data stack (Synapse/Data Factory, Power BI, Databricks, Data Lake, Microsoft SQL, Microsoft AAS) is mandatory. Experience with machine learning and AI is a plus. Knowledge of fundamental data modeling concepts such as entities, relationships, normalization, and denormalization. Knowledge of different data modeling techniques (e.g., ER diagrams, star schema, snowflake schema). Experience with reporting tools is a plus (Grafana, Power BI, Tableau). Having a healthy appetite and open mind for new technologies is a plus. Holds a bachelor's or master's degree in computer science, information technology, or a related field. Relevant experience level of 6-10 years is mandatory. Job location is Chennai.

Our offer
An extensive mobility program for a healthy work-life balance. A permanent training track which allows you to develop yourself personally and professionally. A stimulating, innovative workplace with numerous growth opportunities. A people-oriented environment with an interactive health program and a focus on employee wellbeing.
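As a small illustration of the star-schema modelling mentioned in the profile above, the pandas sketch below splits a flat reading log into a dimension and a fact table; the vessel and sensor names are invented, and production pipelines would run on the Azure services listed:

```python
"""Illustrative sketch: splitting a flat reading log into a dimension and a fact table."""
import pandas as pd

# Hypothetical flat extract: one row per sensor reading
flat = pd.DataFrame({
    "vessel_name": ["Orion", "Orion", "Vega"],
    "sensor": ["fuel_flow", "fuel_flow", "speed"],
    "reading_time": pd.to_datetime(["2024-05-01 10:00", "2024-05-01 11:00", "2024-05-01 10:00"]),
    "value": [412.0, 398.5, 12.3],
})

# Dimension table: one row per vessel, with a surrogate key
dim_vessel = flat[["vessel_name"]].drop_duplicates().reset_index(drop=True)
dim_vessel["vessel_key"] = dim_vessel.index + 1

# Fact table: readings keyed by the surrogate vessel key
fact_reading = flat.merge(dim_vessel, on="vessel_name")[
    ["vessel_key", "sensor", "reading_time", "value"]
]

print(dim_vessel)
print(fact_reading)
```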
Posted 1 week ago
0 years
0 Lacs
India
On-site
Healthcare Data Architect

About Norstella
At Norstella, our mission is simple: to help our clients bring life-saving therapies to the market quicker—and help patients in need. Founded in 2022, but with history going back to 1939, Norstella unites best-in-class brands to help clients navigate the complexities at each step of the drug development life cycle—and get the right treatments to the right patients at the right time. Each organization (Citeline, Evaluate, MMIT, Panalgo, The Dedham Group) delivers must-have answers for critical strategic and commercial decision-making. Together, via our market-leading brands, we help our clients: Citeline – accelerate the drug development cycle. Evaluate – bring the right drugs to market. MMIT – identify barriers to patient access. Panalgo – turn data into insight faster. The Dedham Group – think strategically for specialty therapeutics. By combining the efforts of each organization under Norstella, we can offer an even wider breadth of expertise, cutting-edge data solutions and expert advisory services alongside advanced technologies such as real-world data, machine learning and predictive analytics. As one of the largest global pharma intelligence solution providers, Norstella has a footprint across the globe with teams of experts delivering world class solutions in the USA, UK, The Netherlands, Japan, China and India.

Job Summary
We are seeking a Healthcare Data Architect to lead the design and implementation of scalable real-world data (RWD) solutions architecture. This role sits within the Product team but maintains strong collaboration with Engineering to ensure technical feasibility and execution. The ideal candidate has expertise in healthcare data, claims, EHR, lab and other types of RWD and is skilled in translating business needs into scalable, high-impact data products. This role will be instrumental in shaping data-driven products, optimizing data architectures, and ensuring the integration of real-world data assets into enterprise solutions that support life sciences, healthcare, and payer analytics.

Key Responsibilities
Product & Solution Design
Define and drive the requirements for RWD data products. Collaborate with leadership, product managers, customers, and data scientists to identify high-value use cases. Translate business and regulatory requirements into scalable and performant data models and solutions. Develop architectures to support payer claims, labs, EHR-sourced insight generation and analytics. Partner with healthcare providers, payers, and life sciences companies to enhance data interoperability.
Technical Collaboration & Solution Architecture
Work closely with Engineering to design and implement responsive analytics layers and data architecture. Provide technical guidance on ETL pipelines, data normalization, and integration with third-party RWD sources. Architect solutions to aggregate, standardize, and analyze EHR and molecular data, ensuring compliance with healthcare regulations (HIPAA, GDPR). Define best practices for claims data ingestion, quality control, and data transformations. Develop frameworks for processing structured and unstructured EHR data, leveraging NLP and data harmonization techniques. Ensure compliance with HIPAA, GDPR, and regulatory frameworks for healthcare data products. Define and implement data governance strategies to maintain high data integrity and lineage tracking.
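The ETL and data-normalization responsibilities above typically begin with mapping heterogeneous source fields onto one standard layout. The pandas sketch below is illustrative only; the column names and codes are invented, and real pipelines would target standards such as the OMOP and FHIR models mentioned later in the posting:

```python
"""Illustrative sketch: normalizing two hypothetical claims feeds into one standard layout."""
import pandas as pd

# Each source uses its own column names and date formats
feed_a = pd.DataFrame({"mbr_id": ["A1"], "svc_dt": ["2024-03-05"], "dx": ["E11.9"]})
feed_b = pd.DataFrame({"member": ["B7"], "date_of_service": ["03/09/2024"], "icd10": ["I10"]})

# Source-specific mappings onto a shared schema
COLUMN_MAPS = {
    "feed_a": {"mbr_id": "member_id", "svc_dt": "service_date", "dx": "diagnosis_code"},
    "feed_b": {"member": "member_id", "date_of_service": "service_date", "icd10": "diagnosis_code"},
}

def normalize(df, source):
    out = df.rename(columns=COLUMN_MAPS[source])
    out["service_date"] = pd.to_datetime(out["service_date"])
    out["source"] = source
    return out[["source", "member_id", "service_date", "diagnosis_code"]]

claims = pd.concat([normalize(feed_a, "feed_a"), normalize(feed_b, "feed_b")], ignore_index=True)
print(claims)
```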
Required Skills & Qualifications Product & Business Acumen: Deep understanding of payer data claims lifecycle, EHR, labs and real-world data applications. Ability to translate business needs into technical solutions and drive execution. Strong understanding of data product lifecycle and product management principles. Experience working with cross-functional teams, including Product, Engineering, Clinical, Business and Customer Success. Excellent communication skills to engage with both technical and non-technical stakeholders. Technical & Data Architecture Expertise: Expertise in RWD and payer data structures (claims, EMR/EHR, registry data, prescription data, etc.). Proficiency in SQL and NoSQL databases (PostgreSQL, Snowflake, MongoDB, etc.). Strong knowledge of ETL processes and data pipeline orchestration. Experience with big data processing (Spark, Databricks, Hadoop). Understanding of payer and provider data models used in healthcare analytics. Strong presentation and documentation skills to articulate solutions effectively. Experience working with payer organizations, PBMs, life sciences, and health plans. Desired Skills & Qualifications Experience with OMOP, FHIR, HL7, and other healthcare data standards. Knowledge of data governance, metadata management, and lineage tracking tools. Experience in pharmaceutical RWE studies and market access analytics. Familiarity with BI tools (Tableau, Power BI, Looker). Understanding of data mesh and federated data architectures. Benefits: Health Insurance Provident Fund Reimbursement of Certification Expenses Gratuity 24x7 Health Desk The guiding principles for success at Norstella: 01: Bold, Passionate, Mission-First We have a lofty mission to Smooth Access to Life Saving Therapies and we will get there by being bold and passionate about the mission and our clients. Our clients and the mission in what we are trying to accomplish must be in the forefront of our minds in everything we do. 02: Integrity, Truth, Reality We make promises that we can keep, and goals that push us to new heights. Our integrity offers us the opportunity to learn and improve by being honest about what works and what doesn’t. By being true to the data and producing realistic metrics, we are able to create plans and resources to achieve our goals. 03: Kindness, Empathy, Grace We will empathize with everyone's situation, provide positive and constructive feedback with kindness, and accept opportunities for improvement with grace and gratitude. We use this principle across the organization to collaborate and build lines of open communication. 04: Resilience, Mettle, Perseverance We will persevere – even in difficult and challenging situations. Our ability to recover from missteps and failures in a positive way will help us to be successful in our mission. 05: Humility, Gratitude, Learning We will be true learners by showing humility and gratitude in our work. We recognize that the smartest person in the room is the one who is always listening, learning, and willing to shift their thinking. Norstella is an equal opportunities employer and does not discriminate on the grounds of gender, sexual orientation, marital or civil partner status, pregnancy or maternity, gender reassignment, race, color, nationality, ethnic or national origin, religion or belief, disability or age. Our ethos is to respect and value people’s differences, to help everyone achieve more at work as well as in their personal lives so that they feel proud of the part they play in our success. 
We believe that all decisions about people at work should be based on the individual’s abilities, skills, performance and behavior and our business requirements. Norstella operates a zero-tolerance policy to any form of discrimination, abuse or harassment. Sometimes the best opportunities are hidden by self-doubt. We disqualify ourselves before we have the opportunity to be considered. Regardless of where you came from, how you identify, or the path that led you here, you are welcome. If you read this job description and feel passion and excitement, we’re just as excited about you.
Posted 1 week ago
3.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Control Risks is seeking a highly technical, detail-oriented Data Analyst to join our Data & Technology Consulting (DTC) team. This role is deeply embedded in data analytics, scripting, ETL workflows, and reporting. The candidate is required to have strong skills in Python, SQL, Power BI, Microsoft Fabric, and PowerApps. The successful candidate will play a critical role in developing and implementing data solutions that power our consulting engagements. This is not a generic data analyst role; we're looking for a problem-solver who thrives in complex, fast-paced environments, is confident writing production-level code, and can develop intuitive, scalable reporting solutions.

Tasks & responsibilities:
• Interrogate, clean, and assess structured and unstructured data for integrity, completeness, and business relevance.
• Build and optimize robust ETL pipelines to normalize disparate datasets and enable downstream analysis.
• Write efficient SQL and Python scripts to support custom data transformations, enrichment, and automations.
• Design, build, and maintain interactive Power BI dashboards and PowerApps solutions aligned to client and internal requirements.
• Interpret and analyse complex financial, operational, and transactional datasets to surface insights and support investigative work.
• Document methodologies, code logic, data assumptions, and business context throughout the project lifecycle.
• Collaborate across multi-disciplinary teams to ensure timely delivery of work products and reporting solutions.

Requirements
• Minimum 3 years of hands-on experience with:
• Writing production-level SQL and Python for data transformation and automation.
• Building and maintaining ETL pipelines for large, messy, and complex datasets.
• Designing and deploying workflows and reports using Power BI, PowerApps, and Microsoft Fabric.
• Advanced proficiency in Excel (pivoting, modelling, formulas, data wrangling).
• Demonstrated experience working with relational databases and open-source tools.
• Strong understanding of data structures, normalization, and query optimization.
• Proven ability to manage multiple priorities in a deadline-driven environment.
• Self-motivated, methodical, and committed to high-quality outcomes.
• Excellent written and verbal communication in English.

Preferred Skills
• Experience in consulting, compliance, or risk advisory environments.
• Comfort navigating ambiguity and changing priorities.
• Exposure to version control systems (e.g., Git), cloud data tools, or APIs is a plus.

Education
• Bachelor's Degree in Computer Science, Data Science, Information Systems, or a relevant quantitative field.
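A minimal sketch of the data-integrity assessment described in the first responsibility above; the columns and rules are illustrative only and not drawn from any client engagement:

```python
"""Illustrative sketch: quick integrity and completeness checks on a transactional extract."""
import pandas as pd

# Hypothetical transaction extract
tx = pd.DataFrame({
    "txn_id": [101, 102, 102, 104],
    "amount": [2500.0, None, 310.0, -50.0],
    "currency": ["INR", "INR", "usd", "INR"],
})

report = {
    "rows": len(tx),
    "duplicate_txn_id": int(tx["txn_id"].duplicated().sum()),
    "missing_amount": int(tx["amount"].isna().sum()),
    "negative_amount": int((tx["amount"] < 0).sum()),
    "non_uppercase_currency": int((~tx["currency"].str.isupper()).sum()),
}
for check, count in report.items():
    print(f"{check}: {count}")
```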
Posted 1 week ago
0 years
0 Lacs
Delhi, India
On-site
Job Summary: We are looking for a detail-oriented and data-savvy Database Analyst / SQL Developer with hands-on experience in Oracle and SQL Server. The ideal candidate should have a strong understanding of database structures and be proficient in writing and optimizing SQL queries, procedures, and scheduled jobs. A good grasp of data relationships and query logic is essential.

Key Responsibilities:
Develop, test, and maintain SQL queries, stored procedures, functions, and views in Oracle and SQL Server. Read from and write to database tables for business logic processing and reporting. Design and optimize joins, subqueries, and complex data retrieval logic. Analyze existing database structures and recommend optimizations. Support data quality and integrity across systems. Create and maintain scheduled jobs, ETL processes, or data pipelines. Work with application developers to support backend data needs. Troubleshoot database issues and performance bottlenecks.

Required Skills:
Proficiency in Oracle PL/SQL and T-SQL (SQL Server). Strong knowledge of joins, subqueries, and data manipulation. Ability to understand and work with stored procedures, functions, triggers, and scheduled jobs. Experience in reading and interpreting relational database models. Understanding of indexes, constraints, and basic normalization. Familiarity with data profiling and basic data modeling concepts.

Preferred:
Knowledge of data migration, ETL tools, or SSIS / Oracle Data Integrator. Familiarity with cloud databases (e.g., Azure SQL, Oracle Cloud). Experience with reporting tools or writing queries for dashboards.

Soft Skills:
Strong analytical and problem-solving mindset. Ability to communicate effectively with business and technical teams. Self-motivated and able to work independently or in a team.
Posted 1 week ago
2.0 years
0 Lacs
Gurugram, Haryana
On-site
- 1+ years of data querying languages (e.g. SQL), scripting languages (e.g. Python) or statistical/mathematical software (e.g. R, SAS, Matlab, etc.) experience - 2+ years of data/research scientist, statistician or quantitative analyst in an internet-based company with complex and big data sources experience Job Description Are you interested in applying your strong quantitative analysis and big data skills to world-changing problems? Are you interested in driving the development of methods, models and systems for capacity planning, transportation and fulfillment network? If so, then this is the job for you. Our team is responsible for creating core analytics tech capabilities, platforms development and data engineering. We develop scalable analytics applications and research modeling to optimize operation processes. We standardize and optimize data sources and visualization efforts across geographies, builds up and maintains the online BI services and data mart. You will work with professional software development managers, data engineers, scientists, business intelligence engineers and product managers using rigorous quantitative approaches to ensure high quality data tech products for our customers around the world, including India, Australia, Brazil, Mexico, Singapore and Middle East. Amazon is growing rapidly and because we are driven by faster delivery to customers, a more efficient supply chain network, and lower cost of operations, our main focus is in the development of strategic models and automation tools fed by our massive amounts of available data. You will be responsible for building these models/tools that improve the economics of Amazon’s worldwide fulfillment networks in emerging countries as Amazon increases the speed and decreases the cost to deliver products to customers. You will identify and evaluate opportunities to reduce variable costs by improving fulfillment center processes, transportation operations and scheduling, and the execution to operational plans. You will also improve the efficiency of capital investment by helping the fulfillment centers to improve storage utilization and the effective use of automation. Finally, you will help create the metrics to quantify improvements to the fulfillment costs (e.g., transportation and labor costs) resulting from the application of these optimization models and tools. Major responsibilities include: · Translating business questions and concerns into specific analytical questions that can be answered with available data using BI tools; produce the required data when it is not available. · Apply Statistical and Machine Learning methods to specific business problems and data. · Create global standard metrics across regions and perform benchmark analysis. · Ensure data quality throughout all stages of acquisition and processing, including such areas as data sourcing/collection, ground truth generation, normalization, transformation, cross-lingual alignment/mapping, etc. · Communicate proposals and results in a clear manner backed by data and coupled with actionable conclusions to drive business decisions. · Collaborate with colleagues from multidisciplinary science, engineering and business backgrounds. · Develop efficient data querying and modeling infrastructure. · Manage your own process. Prioritize and execute on high impact projects, triage external requests, and ensure to deliver projects in time. · Utilizing code (Python, R, Scala, etc.) for analyzing data and building statistical models. 
Knowledge of statistical packages and business intelligence tools such as SPSS, SAS, S-PLUS, or R. Experience with clustered data processing (e.g., Hadoop, Spark, MapReduce, and Hive).
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
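As a minimal illustration of the kind of modeling work described above (applying statistical methods to fulfillment cost data), here is a hedged Python sketch; the file name, column names and features are hypothetical assumptions, not anything specified by the role.

```python
# Illustrative sketch only: fit a simple linear model relating per-unit
# fulfillment cost to hypothetical operational drivers. The CSV file and
# column names are assumptions for demonstration.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

df = pd.read_csv("fulfillment_metrics.csv")  # hypothetical BI extract

features = ["units_shipped", "avg_distance_km", "storage_utilization"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["cost_per_unit"], test_size=0.2, random_state=42
)

model = LinearRegression().fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
print(dict(zip(features, model.coef_)))  # which drivers move cost the most
```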
Posted 1 week ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Our Client is a Y Combinator and VC-backed API-tech company. They provide a unified API that allows proptech companies to quickly and easily integrate with multiple property management systems, helping them save time, ship faster, and unlock revenue. They are making it easy for others to create unique products in real estate, a four trillion-dollar market.
Responsibilities: Implement customer requests related to our unified property management API. Build and maintain API integrations with various property management systems. Work with complex data models and normalize inconsistent data formats. Debug and resolve integration issues, including handling legacy SOAP APIs. Troubleshoot and fix bugs in a timely manner. Write tests for your own code and your teammates' code when needed. Perform code reviews and contribute to technical documentation. Communicate changes, ideas, thoughts, failures, and improvements to the team.
Qualifications: 5+ years of professional experience in software development. Strong proficiency with TypeScript, Node.js, and React. Experience with RESTful API development and consumption. Proven ability to navigate and integrate with complex external APIs. Experience with data transformation and normalization techniques. Solid understanding of API authentication methods (OAuth, Basic Auth, etc.). Experience building resilient systems that can handle unreliable upstream services.
Technologies: Strong Node.js. Strong TypeScript. Strong PostgreSQL or similar. Experience with AWS services. Familiarity with both REST and SOAP APIs.
Bonus Points: Experience integrating with any of the following systems: Yardi, RealPage, Entrata, MRI, Appfolio, Buildium, RentManager, ResMan. Familiarity with property management systems or the proptech industry. Experience with webhook implementation and management. Knowledge of error handling and retry mechanisms for unreliable APIs.
If you're passionate about building reliable, scalable API integrations and want to help transform an industry in need of innovation, we'd love to hear from you!
Job Location: Gurugram / Hyderabad. Work Model: Hybrid. Payroll Company: Cloudhire. Number of positions: 2 (Lead Developer & Developer). If interested, please take up an AI interview on our portal using the link below and send an email confirmation to whitasha@cloudhire.ai. Portal Link: https://jobs.cloudhire.ai/login
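The stack named above is TypeScript/Node.js; purely as a language-agnostic illustration of the retry-with-backoff pattern for unreliable upstream services, here is a minimal Python sketch. The endpoint URL, retryable status codes and retry policy are assumed values.

```python
# Illustrative retry-with-exponential-backoff wrapper around an upstream API
# call. The URL and retry policy are hypothetical.
import time
import requests

RETRYABLE = {429, 500, 502, 503, 504}

def fetch_with_retry(url: str, max_attempts: int = 5, base_delay: float = 1.0) -> dict:
    for attempt in range(1, max_attempts + 1):
        try:
            resp = requests.get(url, timeout=10)
        except (requests.ConnectionError, requests.Timeout):
            resp = None  # network failure: treat like a retryable response
        if resp is not None and resp.status_code not in RETRYABLE:
            resp.raise_for_status()  # non-retryable errors (e.g. 404) surface immediately
            return resp.json()
        if attempt == max_attempts:
            raise RuntimeError(f"giving up on {url} after {max_attempts} attempts")
        time.sleep(base_delay * 2 ** (attempt - 1))  # back off: 1s, 2s, 4s, ...

# Hypothetical usage against an upstream property management system:
# units = fetch_with_retry("https://api.example-pms.com/v1/units")
```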
Posted 1 week ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Project Role: Application Support Engineer. Project Role Description: Act as software detectives, providing a dynamic service that identifies and solves issues within multiple components of critical business systems. Must have skills: Software License Management. Good to have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years of full-time education.
Summary: Competent on any 2 Tier 1 publishers (Microsoft, Oracle, IBM, VMware, SAP) and any 2 Tier 2 publishers (Salesforce, Adobe, Quest, Autodesk, Microfocus, Citrix, Veritas, Informatica). Hands-on experience with ServiceNow SAM Pro / Flexera / SNOW SLM. Good understanding of publisher contracts, license metrics and product use rights. Experience in creating entitlements, license overview reports and contracts. Experience in handling software license requests and performing technical validation.
Key Responsibilities: Maintain software publisher licensing information for the assigned publishers (i.e., both entitlements and deployments). Analyze software licensing agreements, create entitlement summaries, and summarize use-right information from software agreements. Import licenses and agreements into the SAM tool (SNOW SLM / SAM Pro, Flexera, or others). Update software entitlement and agreement information in the SAM tool. Maintain accurate records of software licenses and related assets, ensuring compliance with licensing agreements and regulations. Develop and implement software license management policies and procedures, ensuring adherence to industry best practices and standards. Maintain software installation records in the SAM tool and perform product normalization. Perform license reconciliation in the SAM tool. Work with internal stakeholders to ensure deployments of software applications are compliant and, if not, work with the stakeholders to remediate non-compliance. Respond to customer queries on software licensing. Create customized reports and recommendations to report on SAM function activities. Identify cost savings and license re-harvesting opportunities. Drive periodic or ad-hoc stakeholder and project meetings.
Technical Experience: Excellent command of software licensing and use-right information for Tier 1 software publishers (i.e., Microsoft, Oracle, IBM, VMware, Adobe, Citrix, and SAP). Proficient in creating and delivering IBM Sub-Capacity Mainframe ELP reports. Proficient in creating Oracle DB server and Options ELP reports. Performing manual reconciliation and deployment validation as required. Experience working with one or more SAM tools (i.e., ServiceNow SAM Pro, Flexera, SNOW License Manager).
Professional Attributes: Excellent communication skills. Expert knowledge of MS Office applications (Excel and PowerPoint). Ability to work in a team environment. Must have skills: software licensing and Software Asset Management tools. Good to have skills: analytical and communication skills. The candidate should be flexible about working in shifts and coming to the office.
Educational Qualification: 15 years of full-time education. Desired Certifications: CSAM, CITAM, FlexNet Manager Implementation & Administration, Flexera Certified IT Asset Management Administrator.
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Position Overview. Job Title: SQL Developer. Location: Pune, India. Corporate Title: AS.
Role Description: This role sits within the Legal CIO sub-domain, on the DExTR programme. DExTR is an enterprise contract management system that supports the legal department in negotiating, drafting, managing and storing its body of contracts. The project objective is to streamline the creation and amendment of legal documents, automate contract generation, enrich metadata and enforce data quality through simple document assembly technology, and integrate negotiation workflow, document assembly and document automation with standard desktop applications (MS Word, MS Outlook) to enhance the user experience and streamline user adoption of the application.
Deutsche Bank is a client-centric global universal bank. One that is leading change and innovation in the industry – championing integrity, sustainable performance and innovation with our clients, and redefining our culture and relationships with each other. The CIO Chief Administrative Office (CAO) function brings together the IT services for the Group CAO functions, Human Resources, Legal, and Corporate Communications & CSR. Legal is the principal manager of legal risk of the Deutsche Bank Group and guardian of Deutsche Bank Group's culture, integrity and reputation. DB's Legal Department is fully independent from the Business Divisions and has a direct reporting line into the Management Board and not into any Business Division. The Legal CIO department has a broad change portfolio that is in some cases regulatory driven and therefore visible at Board level. The Legal department has been undergoing significant business and technology transformation in recent years, covering critical aspects of the department: Risk Advisory, Litigation, and COO. A range of technology change initiatives are now running that cover critical topics: Legal Document Management, Reference Data Distribution, Enterprise Legal Management, Spend Analytics, Global Governance, and Contract Management.
What We'll Offer You: As part of our flexible scheme, here are just some of the benefits that you'll enjoy: best-in-class leave policy; gender-neutral parental leave; 100% reimbursement under the childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; an Employee Assistance Program for you and your family members; comprehensive hospitalization insurance for you and your dependents; accident and term life insurance; complimentary health screening for employees aged 35 years and above.
Your Key Responsibilities: Design, develop and optimize complex SQL queries, procedures and views for data extraction and transformation. Develop and maintain dashboards and reports to track key performance metrics using advanced analytical tools like Tableau and SAP BO, and generate actionable insights. Collaborate with business analysts and stakeholders to understand reporting requirements and translate them into technical solutions. Build and manage query pipelines and ETL processes as needed for Tableau and SAP BO data sources. Perform data validation and quality assurance on SQL outputs and visualizations. Troubleshoot and resolve issues related to data inconsistencies, data quality or performance. Support ad-hoc reporting requests and provide documentation to end users.
Your Skills And Experience: Strong analytical, problem-solving and communication skills. Extensive experience in SQL development, relational databases (Oracle) and data distribution techniques. Strong understanding of data modelling, normalization and query performance tuning. Hands-on experience creating data visualizations and reports in Tableau and SAP BO. Knowledge of cloud technologies, preferably GCP. Good knowledge of Java programming is a plus. Experience working with Agile methodologies such as Scrum and Kanban. Working experience with version control tools such as Git and cloud code repositories such as GitHub/Bitbucket.
How We'll Support You: Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.
About Us And Our Teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
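The responsibilities above center on designing reporting views and tuning query performance. As a minimal, self-contained sketch of that pattern (using Python's built-in SQLite driver as a stand-in for the Oracle stack named in the posting), the example below creates a reporting view, an index to support its filter, and inspects the query plan. The table and column names are invented.

```python
# Minimal illustration (SQLite stand-in; the role itself uses Oracle):
# a reporting view over contract data plus an index supporting the common
# filter, followed by a look at the query plan. Schema names are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE contracts (id INTEGER PRIMARY KEY, status TEXT, region TEXT,
                        negotiated_on DATE, value_usd REAL);
CREATE INDEX idx_contracts_status_region ON contracts (status, region);
CREATE VIEW v_open_contracts AS
    SELECT region, COUNT(*) AS open_count, SUM(value_usd) AS open_value
    FROM contracts WHERE status = 'OPEN' GROUP BY region;
""")

for row in conn.execute("EXPLAIN QUERY PLAN SELECT * FROM v_open_contracts"):
    print(row)  # shows whether the index supports the status filter
```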
Posted 1 week ago
5.0 years
0 Lacs
Greater Chennai Area
On-site
Area(s) of Responsibility. Key Responsibilities: Daily Security Review: Monitor the Virtus Splunk environment (8x5) to detect anomalies, filter false positives, investigate threats, and escalate valid security incidents as per the Escalation Plan. Security Rule Tuning: Adjust security rules based on analysis and client feedback to enhance threat detection and reduce false positives. Notable Event Investigation: Perform initial analysis of notable security events and escalate cases requiring client attention. Security Use Case Development: Identify security incidents, refine detection processes, and update notification procedures per the agreed rules of engagement. Splunk Administration: Maintain the health of the Splunk infrastructure, including search heads, indexers, deployment servers, and other critical components. Splunk Upgrades: Provide upgrade roadmaps, determine upgrade sequences, and assist with implementation to ensure an up-to-date Splunk environment. Splunk Dashboards & Searches: Develop customized dashboards, reports, and saved searches tailored to client requirements, integrating the necessary data sources. Data Source Onboarding: Add new data sources to Splunk Enterprise Security, including installing technology add-ons, field extraction, and Common Information Model (CIM) normalization. Service Desk Integration: Manage ticket escalations through the Virtus Service Desk and leverage KACE for efficient incident response and tracking.
Required Qualifications: Experience: 5+ years in Splunk administration, including security monitoring and incident response. Technical Skills: Strong expertise in Splunk Enterprise Security and its components. Proficiency in security use case development and event correlation. Experience with the Splunk Search Processing Language (SPL), dashboards, and reporting. Hands-on experience with data source onboarding and CIM normalization. Familiarity with ticketing systems like KACE or similar ITSM platforms. Certifications: Splunk Certified Admin, Splunk Enterprise Security Certified Admin (preferred).
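CIM normalization, mentioned above, means mapping vendor-specific field names onto Splunk's common field names so correlation searches work across data sources. In Splunk itself this is configured through field aliases, props and transforms in a technology add-on; the Python sketch below only illustrates the idea conceptually, and the field names are illustrative assumptions.

```python
# Conceptual sketch of CIM-style field normalization: rename vendor-specific
# fields in a raw event to common field names. In Splunk this would be done
# via field aliases/props and transforms, not application code.
CIM_FIELD_MAP = {
    "src_ip": "src",
    "dest_ip": "dest",
    "username": "user",
    "event_outcome": "action",
}

def normalize_event(raw_event: dict) -> dict:
    normalized = dict(raw_event)
    for vendor_field, cim_field in CIM_FIELD_MAP.items():
        if vendor_field in normalized and cim_field not in normalized:
            normalized[cim_field] = normalized.pop(vendor_field)
    return normalized

print(normalize_event({"src_ip": "10.0.0.5", "username": "jdoe", "event_outcome": "success"}))
```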
Posted 1 week ago
5.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Company Description Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. The Data Research Engineering Team is a brand new team with the purpose of managing data from acquisition to presentation, collaborating with other teams while also operating independently. Their responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. They play a crucial role in enabling data-driven decision-making and meeting the organization's data needs. A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role. Responsibilities Job Description Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely. Work with databases of varying scales, including small-scale databases, and databases involving big data processing. Work on data security and compliance, by implementing access controls, encryption, and compliance standards. Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture. Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery. Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency. Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms. Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity. Monitor database health and identify and resolve issues. Collaborate with the full-stack web developer in the team to support the implementation of efficient data access and retrieval mechanisms. Implement data security measures to protect sensitive information and comply with relevant regulations. Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows. Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices. Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines. 
Familiarize yourself with tools and technologies used in the team's workflow, such as Knime for data integration and analysis. Use Python for tasks such as data manipulation, automation, and scripting. Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines. Assume accountability for achieving development milestones. Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities. Collaborate with and assist fellow members of the Data Research Engineering Team as required. Perform tasks with precision and build reliable systems. Leverage online resources such as StackOverflow, ChatGPT, Bard, etc. effectively, while considering their capabilities and limitations. Skills And Experience Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential. Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team’s data presentation goals. Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes. Proficiency with tools like Prometheus, Grafana, or ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting. Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions). Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration. Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL. Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark. Knowledge of SQL and understanding of database design principles, normalization, and indexing. Knowledge of data migration, ETL (Extract, Transform, Load) processes, or integrating data from various sources. Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery. Eagerness to develop import workflows and scripts to automate data import processes. Knowledge of data security best practices, including access controls, encryption, and compliance standards. Strong problem-solving and analytical skills with attention to detail. Creative and critical thinking. Strong willingness to learn and expand knowledge in data engineering. Familiarity with Agile development methodologies is a plus. Experience with version control systems, such as Git, for collaborative development. Ability to thrive in a fast-paced environment with rapidly changing priorities. Ability to work collaboratively in a team environment. Good and effective communication skills. Comfortable with autonomy and ability to work independently. Qualifications 5+ years of experience in database engineering. Additional Information Perks: Day off on the 3rd Friday of every month (one long weekend each month) Monthly Wellness Reimbursement Program to promote health and well-being Monthly Office Commutation Reimbursement Program Paid paternity and maternity leaves
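A recurring theme above is developing import workflows that move data from spreadsheets into a relational database. Here is a hedged Python sketch of that kind of workflow using pandas and SQLAlchemy; the file name, connection string, table and column names are hypothetical.

```python
# Illustrative import workflow: load a spreadsheet export into a PostgreSQL
# table with basic header normalization and quality rules. All names and the
# connection string are hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:password@localhost:5432/research")

df = pd.read_csv("vendor_ratings.csv")
df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]  # normalize headers
df = df.dropna(subset=["vendor_id"]).drop_duplicates(subset=["vendor_id"])  # basic quality rules

df.to_sql("vendor_ratings", engine, if_exists="append", index=False)
print(f"Loaded {len(df)} rows into vendor_ratings")
```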
Posted 1 week ago
2.0 years
0 Lacs
Vadodara, Gujarat, India
On-site
Job Description: We are looking for a skilled Python Developer with experience in PostgreSQL to join our team. The ideal candidate should have 6 months to 2 years of hands-on experience in developing, optimizing, and maintaining Python-based applications with PostgreSQL database integration. You will work closely with our development team to build scalable backend solutions, write efficient queries, and ensure data integrity. Key Responsibilities: · Develop and maintain Python-based applications with PostgreSQL integration. · Design and optimize database schemas, queries, and stored procedures. · Write clean, efficient, and well-documented code following best practices. · Troubleshoot and debug database performance issues. · Work with Django/Flask/FastAPI (if applicable) for backend development. · Collaborate with frontend developers and other team members to integrate APIs. · Ensure data security and implement proper backup/recovery strategies. · Participate in code reviews and contribute to continuous improvement. Required Skills & Qualifications: · 6 months to 2 years of experience in Python development. · Knowledge of PostgreSQL (queries, indexing, optimization). · Familiarity with ORMs such as the Django ORM, or with database drivers such as Psycopg2. · Experience with RESTful APIs and backend frameworks (Django, Flask, FastAPI). · Basic understanding of database design, normalization, and transactions. · Knowledge of Git version control and Agile methodologies. · Good problem-solving skills and attention to detail. Preferred Skills (Bonus): · Experience with AWS/GCP/Azure (especially RDS or cloud databases). · Familiarity with Docker and CI/CD pipelines. What We Offer: · Opportunity to work on challenging projects. · Flexible work environment. · Career growth and learning opportunities.
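As a minimal sketch of the Python-plus-PostgreSQL work described above, the example below runs a parameterized query through psycopg2 so values are never interpolated into SQL strings. The connection details and the orders table are hypothetical.

```python
# Hedged sketch: parameterized PostgreSQL query via psycopg2. Connection
# details, table and columns are placeholders, not part of the posting.
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="appdb", user="app", password="secret")
try:
    with conn, conn.cursor() as cur:  # commits on success, rolls back on error
        cur.execute(
            "SELECT id, total_amount FROM orders WHERE customer_id = %s AND status = %s",
            (42, "PAID"),
        )
        for order_id, total in cur.fetchall():
            print(order_id, total)
finally:
    conn.close()
```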
Posted 1 week ago
7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Position: Senior SQL Database Developer / Architect. Location: Hinjewadi Phase-1, Pune (WFO). Experience: 7+ years. Shift: 10:30 AM to 7:30 PM. Working Days: Monday to Friday. Notice Period: Immediate to 15 days.
Job Description: Futurism Tech is seeking an experienced SQL Database Developer / Architect with a minimum of 8 years of experience in designing, developing, and architecting complex database systems. The ideal candidate will have a strong foundation in SQL development, data modeling, performance tuning, and database architecture. This role involves both hands-on coding and high-level architectural planning to support scalable, secure, and efficient data solutions.
Key Responsibilities: Design, develop, and maintain scalable and high-performance SQL databases. Define and implement database architecture standards, best practices, and design patterns. Build robust data models (logical and physical) for transactional and analytical systems. Optimize existing SQL queries, indexing strategies, and schema design to ensure performance and scalability. Collaborate with software developers, business analysts, and DevOps teams to implement end-to-end data solutions. Ensure data integrity, consistency, and availability across environments. Lead database design reviews and provide technical guidance on data storage and access strategies. Manage the database lifecycle, including schema changes, upgrades, backups, and recovery strategies. Evaluate new technologies and tools for improving database performance and architecture.
Required Qualifications: 8+ years of experience in SQL database development and architecture. Deep expertise in SQL Server (or other RDBMS such as Oracle, MySQL, PostgreSQL). Strong knowledge of database design principles, normalization, and performance tuning. Proven experience in designing scalable and secure database architectures. Proficient in writing complex stored procedures, views, triggers, and functions. Experience with ETL tools (e.g., SSIS, Informatica, Talend) and data integration strategies. Understanding of high availability, disaster recovery, and replication strategies. Familiarity with DevOps tools and CI/CD practices for database deployments. Excellent problem-solving and system design skills.
Qualifications: Bachelor's degree in Computer Science or Bachelor of Engineering/Technology (BE/BTech), or equivalent experience.
If you are interested, please share your updated resume at sanyogitas@futurismtechnologies.com or connect on +91 (20) 67120700 Extn 201 / 9226554403.
Posted 1 week ago
0 years
0 Lacs
India
On-site
We are looking for a SQL Developer responsible for designing, developing, and maintaining database systems. You will work closely with developers and system admins to ensure database performance, security, and reliability. Key Responsibilities: Design and implement database structures using SQL Ensure data integrity, normalization, and optimization Write and optimize complex SQL queries Manage data migration and system integration Implement database security and user access controls Troubleshoot performance issues and perform routine maintenance Collaborate with cross-functional teams Maintain database documentation Qualifications: Bachelor’s in CS, IT, or related field Strong SQL and database knowledge Understanding of normalization, indexing, and security Good problem-solving and communication skills Preferred: Experience with MySQL, SQL Server, PostgreSQL, or Oracle Knowledge of NoSQL and data warehousing concepts Relevant certifications are a plus Job Type: Full-time Pay: ₹15,000.00 - ₹20,000.00 per year Schedule: Day shift Work Location: On the road Application Deadline: 20/06/2025 Expected Start Date: 18/06/2025
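Normalization, called out above, means moving repeated attribute groups into their own tables referenced by keys. Below is a hedged, self-contained sketch of that idea; it uses SQLite only so the example runs anywhere, and the schema is invented for illustration.

```python
# Self-contained normalization sketch: split repeated customer details out of
# a denormalized orders table into a separate table referenced by key.
# SQLite is used only so the example runs without a server; schema is invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Denormalized: customer details repeated on every order row.
CREATE TABLE orders_flat (order_id INTEGER, customer_name TEXT,
                          customer_email TEXT, amount REAL);
INSERT INTO orders_flat VALUES
  (1, 'Asha', 'asha@example.com', 120.0),
  (2, 'Asha', 'asha@example.com', 80.0),
  (3, 'Ravi', 'ravi@example.com', 55.0);

-- Normalized: one row per customer, orders reference it by key.
CREATE TABLE customers (customer_id INTEGER PRIMARY KEY AUTOINCREMENT,
                        name TEXT, email TEXT UNIQUE);
CREATE TABLE orders (order_id INTEGER PRIMARY KEY,
                     customer_id INTEGER REFERENCES customers,
                     amount REAL);

INSERT INTO customers (name, email)
  SELECT DISTINCT customer_name, customer_email FROM orders_flat;
INSERT INTO orders (order_id, customer_id, amount)
  SELECT f.order_id, c.customer_id, f.amount
  FROM orders_flat f JOIN customers c ON c.email = f.customer_email;
""")

print(conn.execute("SELECT COUNT(*) FROM customers").fetchone())  # (2,)
```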
Posted 1 week ago
8.0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction: Software Developers at IBM are the backbone of our strategic initiatives to design, code, test, and provide industry-leading solutions that make the world run today - planes and trains take off on time, bank transactions complete in the blink of an eye and the world remains safe because of the work our software developers do. Whether you are working on projects internally or for a client, software development is critical to the success of IBM and our clients worldwide. At IBM, you will use the latest software development tools, techniques and approaches and work with leading minds in the industry to build solutions you can be proud of.
Your Role And Responsibilities: The candidate is responsible for DB2 installation and configuration in the following environments: on-prem, multi-cloud, Red Hat OpenShift cluster, HADR, non-DPF and DPF. Migration of other databases to Db2 (e.g., Teradata, Snowflake, SAP, or Cloudera to Db2). Create high-level and detailed designs, and maintain product roadmaps that include both modernization and leveraging cloud solutions. Design scalable, performant, and cost-effective data architectures within the Lakehouse to support diverse workloads, including reporting, analytics, data science, and AI/ML. Perform health checks of the databases, make recommendations, and deliver tuning at the database and system level. Deploy DB2 databases as containers within Red Hat OpenShift clusters. Configure containerized database instances, persistent storage, and network settings to optimize performance and reliability. Lead the architectural design and implementation of solutions on IBM watsonx.data, ensuring alignment with the overall enterprise data strategy and business objectives. Define and optimize the watsonx.data ecosystem, including integration with other IBM watsonx components (watsonx.ai, watsonx.governance) and existing data infrastructure (DB2, Netezza, cloud data sources). Establish best practices for data modeling, schema evolution, and data organization within the watsonx.data lakehouse. Act as a subject matter expert on Lakehouse architecture, providing technical leadership and guidance to data engineering, analytics, and development teams. Mentor junior architects and engineers, fostering their growth and knowledge in modern data platforms. Participate in the development of architecture governance processes and promote best practices across the organization. Communicate complex technical concepts to both technical and non-technical stakeholders.
Required Technical And Professional Expertise: 8+ years of experience in data architecture, data engineering, or a similar role, with significant hands-on experience in cloud data platforms. Strong proficiency in DB2, SQL and Python. Strong understanding of: database design and modelling (dimensional, normalized, NoSQL schemas); normalization and indexing; data warehousing and ETL processes; cloud platforms (AWS, Azure, GCP); big data technologies (e.g., Hadoop, Spark). Experience with database migration projects from one database to another (target database: Db2). Experience in deploying DB2 databases as containers within Red Hat OpenShift clusters and configuring containerized database instances, persistent storage, and network settings to optimize performance and reliability. Excellent communication, collaboration, problem-solving, and leadership skills.
Preferred Technical And Professional Experience: Experience with machine learning environments and LLMs. Certification in IBM watsonx.data or related IBM data and AI technologies. Hands-on experience with a Lakehouse platform (e.g., Databricks, Snowflake). Exposure to implementing, or an understanding of, DB replication processes. Experience with integrating watsonx.data with GenAI or LLM initiatives (e.g., RAG architectures). Experience with NoSQL databases (e.g., MongoDB, Cassandra).
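One small, scriptable piece of the database health checks mentioned above is a connectivity ping. The sketch below is a hedged example using IBM's ibm_db Python driver; the DSN values are placeholders, and the single query against SYSIBM.SYSDUMMY1 is only an "is the database answering?" probe, not the full tuning and health assessment the role describes.

```python
# Hedged sketch of a scripted DB2 connectivity check with the ibm_db driver.
# DSN values are placeholders; this is a minimal ping, not a full health check.
import ibm_db

dsn = (
    "DATABASE=SAMPLE;HOSTNAME=db2-host.example.com;PORT=50000;"
    "PROTOCOL=TCPIP;UID=db2inst1;PWD=changeme;"
)

conn = ibm_db.connect(dsn, "", "")
try:
    stmt = ibm_db.exec_immediate(conn, "SELECT 1 FROM SYSIBM.SYSDUMMY1")
    row = ibm_db.fetch_tuple(stmt)
    print("DB2 reachable:", bool(row) and row[0] == 1)
finally:
    ibm_db.close(conn)
```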
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Roorkee, Uttarakhand, India
Remote
Company Description: Miratech helps visionaries change the world. We are a global IT services and consulting company that brings together enterprise and start-up innovation. Today, we support digital transformation for some of the world's largest enterprises. By partnering with both large and small players, we stay at the leading edge of technology, remain nimble even as a global leader, and create technology that helps our clients further enhance their business. We are a values-driven organization and our culture of Relentless Performance has enabled over 99% of Miratech's engagements to succeed by meeting or exceeding our scope, schedule, and/or budget objectives since our inception in 1989. Miratech has coverage across 5 continents and operates in 25 countries around the world. Miratech retains nearly 1000 full-time professionals, and our annual growth rate exceeds 25%.
Job Description: We are seeking skilled Python and SQL engineers to work in the data team of a large financial asset management company, supporting their applications from a data perspective.
Responsibilities: Write, test, and maintain Python code and SQL queries to support project requirements. Assist in system integration and debugging, addressing issues as they arise. Collaborate with senior engineers to ensure solutions are aligned with project goals. Conduct development testing to verify components function as intended. Perform data analysis, identify inconsistencies, and propose solutions to improve quality. Participate in task estimation and contribute to project timelines. Maintain technical documentation for solutions and processes. Support ongoing system improvements under the guidance of senior team members.
Qualifications: 3-5 years of experience as a software developer using Python. 1-2 years of experience working with relational databases (preferably Sybase), including SQL and database modeling/normalization techniques. Experience with Linux operating systems. Experience in the finance industry and knowledge of financial products/markets. Experience working in a globally distributed team. Written and spoken fluency in English. Excellent communication skills, both written and verbal. A track record of taking the initiative to solve problems and working independently with minimal direction.
Nice to have: Experience with Python frameworks utilizing asyncio. Familiarity with cloud technologies such as Kubernetes and Docker. Experience with DevOps tools such as Git, Maven, Jenkins, and GitLab CI. Experience in designing multi-tier application architectures and distributed caching solutions. An ETL background in any language or tool. Experience working with large volumes of time series data and building services, APIs, and applications based on it. Ability to troubleshoot and fix performance issues across the codebase and database queries. BA/BS in Computer Science or equivalent practical experience.
We offer: Culture of Relentless Performance: join an unstoppable technology development team with a 99% project success rate and more than 30% year-over-year revenue growth. Competitive Pay and Benefits: enjoy a comprehensive compensation and benefits package, including health insurance and a relocation program. Work From Anywhere Culture: make the most of the flexibility that comes with remote work. Growth Mindset: reap the benefits of a range of professional development opportunities, including certification programs, mentorship and talent investment programs, internal mobility and internship opportunities.
Global Impact: collaborate on impactful projects for top global clients and shape the future of industries. Welcoming Multicultural Environment: be a part of a dynamic, global team and thrive in an inclusive and supportive work environment with open communication and regular team-building company social events. Social Sustainability Values: join our sustainable business practices focused on five pillars, including IT education, community empowerment, fair operating practices, environmental sustainability, and gender equality. Miratech is an equal opportunity employer and does not discriminate against any employee or applicant for employment on the basis of race, color, religion, sex, national origin, age, disability, veteran status, sexual orientation, gender identity, or any other protected status under applicable law.
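The responsibilities above include analyzing data and identifying inconsistencies, often in large volumes of time series data. Here is a hedged Python sketch of that kind of data-quality pass; the file, column names and business-day frequency are assumptions for illustration only.

```python
# Illustrative time-series data-quality pass: flag duplicate timestamps,
# missing business days, and crude outliers in a price series. File, column
# names and the business-day assumption are hypothetical.
import pandas as pd

prices = pd.read_csv("asset_prices.csv", parse_dates=["as_of_date"])

dupes = prices[prices.duplicated(subset=["asset_id", "as_of_date"], keep=False)]

expected = pd.date_range(prices["as_of_date"].min(), prices["as_of_date"].max(), freq="B")
missing_days = expected.difference(prices["as_of_date"].unique())

z = (prices["close"] - prices["close"].mean()) / prices["close"].std()
outliers = prices[z.abs() > 4]

print(f"{len(dupes)} duplicate rows, {len(missing_days)} missing business days, "
      f"{len(outliers)} outlier prices")
```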
Posted 1 week ago