
1190 Normalization Jobs - Page 38

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

0 years

0 Lacs

Vijayawada, Andhra Pradesh, India

On-site

About Us
SBI Card is a leading pure-play credit card issuer in India, offering a wide range of credit cards to cater to diverse customer needs. We are constantly innovating to meet the evolving financial needs of our customers, empowering them with digital currency for a seamless payment experience and rewarding benefits. At SBI Card, the motto 'Make Life Simple' inspires every initiative, ensuring that customer convenience is at the forefront of all that we do. We are committed to building an environment where people can thrive and create a better future for everyone. SBI Card is proud to be an equal opportunity and inclusive employer and welcomes employees without any discrimination on the grounds of race, color, gender, religion, creed, disability, sexual orientation, gender identity, marital status, caste, etc. SBI Card is committed to fostering an inclusive and diverse workplace where all employees are treated equally with dignity and respect, which makes it a promising place to work. Join us to shape the future of digital payments in India and unlock your full potential.

What's In It For You
- SBI Card truly lives by the work-life balance philosophy. We offer a robust wellness and wellbeing program to support the mental and physical health of our employees.
- Admirable work deserves to be rewarded. We have a well-curated bouquet of rewards and recognition programs for employees.
- Dynamic, inclusive and diverse team culture
- Gender Neutral Policy
- Inclusive health benefits for all - Medical Insurance, Personal Accidental, Group Term Life Insurance, Annual Health Checkup, Dental and OPD benefits
- Commitment to the overall development of an employee through a comprehensive learning & development framework

Role Purpose
Responsible for the management of all collections processes for the allocated portfolio in the assigned CD/Area, based on targets set for resolution, normalization, rollback/absolute recovery and ROR.
Role Accountability
- Conduct timely allocation of the portfolio to aligned vendors/NFTEs and conduct ongoing reviews to drive performance on business targets through an extended team of field executives and callers
- Formulate tactical short-term incentive plans for NFTEs to increase productivity and drive DRR
- Ensure various critical segments as defined by business are reviewed and performance is driven on them
- Ensure judicious use of hardship tools and adherence to settlement waivers, both on rate and value
- Conduct ongoing field visits on critical accounts and ensure proper documentation in the Collect24 system of all field visits and telephone calls to customers
- Raise red flags in a timely manner based on deterioration in portfolio health indicators/frauds and raise timely alarms on critical incidents as per the compliance guidelines
- Ensure all guidelines mentioned in the SVCL are adhered to and that process hygiene is maintained at aligned agencies
- Ensure 100% data security using secured data transfer modes and data purging as per policy
- Ensure all customer complaints received are closed within the time frame
- Conduct thorough due diligence while onboarding/offboarding/renewing a vendor and ensure all necessary formalities are completed prior to allocating
- Ensure agencies raise invoices on time
- Monitor NFTE ACR CAPE as per the collection strategy

Measures of Success
Portfolio Coverage, Resolution Rate, Normalization/Rollback Rate, Settlement Waiver Rate, Absolute Recovery, Rupee Collected, NFTE CAPE, DRA Certification of NFTEs, Absolute Customer Complaints, Absolute Audit Observations, Process Adherence as per MOU

Technical Skills / Experience / Certifications
Credit card knowledge along with a good understanding of collection processes

Competencies Critical to the Role
Analytical Ability, Stakeholder Management, Problem Solving, Result Orientation, Process Orientation

Qualification
Post-Graduate / Graduate in any discipline

Preferred Industry
FSI

Posted 1 month ago

Apply

1.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Description
This role is for a 1-year term in Amazon.

Are you interested in applying your strong quantitative analysis and big data skills to world-changing problems? Are you interested in driving the development of methods, models and systems for strategy planning, transportation and the fulfillment network? If so, then this is the job for you. Our team is responsible for creating core analytics tech capabilities, platform development and data engineering. We develop scalable analytics applications across APAC, MENA and LATAM. We standardize and optimize data sources and visualization efforts across geographies, and build up and maintain the online BI services and data mart. You will work with professional software development managers, data engineers, business intelligence engineers and product managers using rigorous quantitative approaches to ensure high quality data tech products for our customers around the world, including India, Australia, Brazil, Mexico, Singapore and the Middle East.

Amazon is growing rapidly, and because we are driven by faster delivery to customers, a more efficient supply chain network, and lower cost of operations, our main focus is on the development of strategic models and automation tools fed by our massive amounts of available data. You will be responsible for building these models/tools that improve the economics of Amazon's worldwide fulfillment networks in emerging countries as Amazon increases the speed and decreases the cost to deliver products to customers. You will identify and evaluate opportunities to reduce variable costs by improving fulfillment center processes, transportation operations and scheduling, and the execution of operational plans.

Major Responsibilities Include
- Translating business questions and concerns into specific analytical questions that can be answered with available data using BI tools; producing the required data when it is not available
- Writing SQL queries and automation scripts
- Ensuring data quality throughout all stages of acquisition and processing, including such areas as data sourcing/collection, ground truth generation, normalization, transformation, cross-lingual alignment/mapping, etc.
- Communicating proposals and results in a clear manner backed by data and coupled with actionable conclusions to drive business decisions
- Collaborating with colleagues from multidisciplinary science, engineering and business backgrounds
- Developing efficient data querying and modeling infrastructure
- Managing your own process: prioritizing and executing high-impact projects, triaging external requests, and ensuring projects are delivered on time
- Utilizing code (SQL, Python, R, Scala, etc.) for analyzing data and building data marts

Basic Qualifications
- 3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
- Experience with data visualization using Tableau, Quicksight, or similar tools
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with statistical analysis packages such as R, SAS and Matlab
- Experience using SQL to pull data from a database or data warehouse, and scripting experience (Python) to process data for modeling

Preferred Qualifications
- Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
- Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ASSPL - Karnataka
Job ID: A2997231
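To make the data-quality and normalization responsibilities above concrete, here is a minimal pandas sketch of the kind of cleaning step such a role involves. It is illustrative only: the dataset, column names, and validation rules are invented, not Amazon's actual pipeline.

```python
# Illustrative data-cleaning/normalization step (hypothetical shipment data).
import pandas as pd

def normalize_shipments(df: pd.DataFrame) -> pd.DataFrame:
    """Standardize raw shipment records before loading them into a data mart."""
    out = df.copy()
    # Normalize column names to snake_case so downstream SQL stays consistent.
    out.columns = [c.strip().lower().replace(" ", "_") for c in out.columns]
    # Trim stray whitespace and unify casing in categorical fields.
    out["destination_country"] = out["destination_country"].str.strip().str.upper()
    # Coerce timestamps and numeric costs, flagging bad rows instead of failing silently.
    out["ship_date"] = pd.to_datetime(out["ship_date"], errors="coerce")
    out["shipping_cost"] = pd.to_numeric(out["shipping_cost"], errors="coerce")
    bad = out["ship_date"].isna() | out["shipping_cost"].isna()
    if bad.any():
        print(f"Dropping {bad.sum()} rows that failed validation")
    # Deduplicate on the business key.
    return out.loc[~bad].drop_duplicates(subset=["shipment_id"])

if __name__ == "__main__":
    raw = pd.DataFrame({
        "Shipment ID": [1, 1, 2],
        "Destination Country": [" in ", " in ", "AU"],
        "Ship Date": ["2024-01-05", "2024-01-05", "not-a-date"],
        "Shipping Cost": ["120.5", "120.5", "90"],
    })
    print(normalize_shipments(raw))
```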

Posted 1 month ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities
- Support the full data engineering lifecycle including research, proof of concepts, design, development, testing, deployment, and maintenance of data management solutions
- Utilize knowledge of various data management technologies to drive data engineering projects
- Work with Operations and Product Development staff to support applications/processes that facilitate the effective and efficient implementation/migration of new clients' healthcare data through the Optum Impact Product Suite
- Lead data acquisition efforts to gather data from various structured or semi-structured source systems of record to hydrate the client data warehouse and power analytics across numerous healthcare domains
- Leverage a combination of ETL/ELT methodologies to pull complex relational and dimensional data to support loading data marts and reporting aggregates
- Eliminate unwarranted complexity and unneeded interdependencies
- Detect data quality issues, identify root causes, implement fixes, and manage data audits to mitigate data challenges
- Implement, modify, and maintain data integration efforts that improve data efficiency, reliability, and value
- Leverage and facilitate the evolution of best practices for data acquisition, transformation, storage, and aggregation that solve current challenges and reduce the risk of future challenges
- Effectively create data transformations that address business requirements and other constraints
- Partner with the broader analytics organization to make recommendations for changes to data systems and the architecture of data platforms
- Prepare high-level design documents and detailed technical design documents with best practices to enable efficient data ingestion, transformation and data movement
- Leverage DevOps tools to enable code versioning and code deployment
- Leverage data pipeline monitoring tools to detect data integrity issues before they result in user-visible outages or data quality issues
- Leverage processes and diagnostics tools to troubleshoot, maintain and optimize solutions and respond to customer and production issues
- Continuously support technical debt reduction, process transformation, and overall optimization
- Leverage and contribute to the evolution of standards for high-quality documentation of data definitions, transformations, and processes to ensure data transparency, governance, and security
- Ensure that all solutions meet the business needs and requirements for security, scalability, and reliability

Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
- Bachelor's degree (preferably in information technology, engineering, math, computer science, analytics, or another related field)
- 5+ years of combined experience in data engineering, ingestion, normalization, transformation, aggregation, structuring, and storage
- 5+ years of combined experience working with industry-standard relational, dimensional or non-relational data storage systems
- 5+ years of experience in designing ETL/ELT solutions using tools like Informatica, DataStage, SSIS, PL/SQL, T-SQL, etc.
- 5+ years of experience in managing data assets using SQL, Python, Scala, VB.NET or another similar querying/coding language
- 3+ years of experience working with healthcare data or data to support healthcare organizations

Preferred Qualifications
- 5+ years of experience in creating source-to-target mappings and ETL design for integration of new/modified data streams into the data warehouse/data marts
- Experience in Unix, PowerShell or other batch scripting languages
- Experience supporting data pipelines that power analytical content within common reporting and business intelligence platforms (e.g. Power BI, Qlik, Tableau, MicroStrategy, etc.)
- Experience supporting analytical capabilities inclusive of reporting, dashboards, extracts, BI tools, analytical web applications and other similar products
- Experience contributing to cross-functional efforts with proven success in creating healthcare insights
- Experience and credibility interacting with analytics and technology leadership teams
- Depth of experience and proven track record creating and maintaining sophisticated data frameworks for healthcare organizations
- Exposure to Azure, AWS, or Google Cloud ecosystems
- Exposure to Amazon Redshift, Amazon S3, Hadoop HDFS, Azure Blob, or similar big data storage and management components
- Demonstrated desire to continuously learn and seek new options and approaches to business challenges
- Willingness to leverage best practices, share knowledge, and improve the collective work of the team
- Demonstrated ability to effectively communicate concepts verbally and in writing
- Demonstrated awareness of when to appropriately escalate issues/risks
- Demonstrated excellent communication skills, both written and verbal

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
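The source-to-target mapping work mentioned above can be sketched in a few lines of Python. This is a hedged illustration only: the column names, codes, and transforms are placeholders, not Optum's actual schema or tooling.

```python
# Minimal sketch of a declarative source-to-target mapping (hypothetical columns).
import pandas as pd

# Target column -> (source column, transform rule)
MAPPING = {
    "member_id":    ("MBR_ID",  str.strip),
    "claim_amount": ("CLM_AMT", float),
    "service_date": ("SVC_DT",  lambda v: pd.to_datetime(v, format="%Y%m%d")),
    "gender":       ("GNDR_CD", lambda v: {"M": "Male", "F": "Female"}.get(v, "Unknown")),
}

def apply_mapping(source: pd.DataFrame) -> pd.DataFrame:
    """Build the target extract column by column, so each rule stays auditable."""
    target = pd.DataFrame()
    for tgt_col, (src_col, fn) in MAPPING.items():
        target[tgt_col] = source[src_col].map(fn)
    return target

if __name__ == "__main__":
    src = pd.DataFrame({
        "MBR_ID": [" A100 ", "A101"],
        "CLM_AMT": ["250.75", "80.00"],
        "SVC_DT": ["20240110", "20240215"],
        "GNDR_CD": ["F", "X"],
    })
    print(apply_mapping(src))
```

Keeping the mapping as data rather than code makes it easy to review against a mapping document and to extend for new data streams.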

Posted 1 month ago

Apply

9.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Company Description ShuffleLabs is a trusted partner for integration solutions, delivering excellent products and exceptional customer service. The flagship platform ShuffleExchange connects applications using APIs, simplifies integration development with visual tools, and makes implementation easy. ShuffleExchange provides simplicity and ease of use for complex integration needs. Role Description This is a full-time on-site role in Chennai for a Technical Lead at ShuffleLabs. The Technical Lead will be responsible for leading technical teams, designing and implementing software solutions, providing technical guidance, and ensuring project success through effective leadership and problem-solving. Roles & Responsibilities Architect and deliver iPaaS solutions that enable seamless integration between cloud, on-premise, and hybrid systems using modern technologies and enterprise best practices. Lead end-to-end solution design, ensuring robust data integration patterns (event-driven, pub-sub, messaging queues, ETL, API orchestration). Collaborate with cross-functional teams, product managers, and integration engineers to define technical roadmaps and align architecture with business goals. Incorporate AI/ML components into integration workflows for intelligent automation, data transformation, anomaly detection, and predictive routing. Define and implement scalable cloud integration strategies leveraging Microsoft Azure services (Logic Apps, Azure Functions, Event Grid, API Management). Guide integration architecture design, performance optimization, security protocols (OAuth2, OpenID Connect, API security), and SLA compliance. Evaluate and select iPaaS tools, integration frameworks, and middleware platforms aligned with customer needs and future scalability. Mentor development teams, enforce engineering standards, lead code and design reviews, and promote DevOps and CI/CD best practices. Produce architecture artifacts, technical specifications, white papers, and reusable design patterns for internal and client use. Lead cloud migration initiatives and legacy modernization efforts, especially for enterprise integration systems. 
- Experience with hybrid and multi-cloud integration scenarios

Core Technical Skills

iPaaS / Integration Expertise:
- Integration patterns: RESTful/SOAP APIs, file-based transfers, message queues, ETL pipelines, webhooks
- Experience with iPaaS platforms: Azure Logic Apps, MuleSoft, Dell Boomi, Informatica, SnapLogic, or Workato (experience with Azure preferred)
- Strong knowledge of Azure Integration Services: Logic Apps, API Management, Service Bus, Event Grid, Azure Functions, Data Factory

Microsoft Stack:
- .NET Core / .NET 6+, ASP.NET MVC, C#, Entity Framework, Web API
- Azure PaaS services (App Services, Azure SQL, Azure Cosmos DB, Azure Blob Storage)
- Experience with database design, normalization, and data modeling in SQL Server environments
- Power Platform (Power Automate, Power BI, Power Apps) knowledge is a plus

DevOps & Cloud Infrastructure:
- Azure DevOps, CI/CD pipelines, ARM/Bicep templates, GitHub Actions, Jenkins
- Kubernetes (AKS), Docker, Terraform, Ansible
- Azure Monitoring, App Insights, and Log Analytics for integration health checks and diagnostics

AI & Intelligent Integration Skills
- Integrate AI into iPaaS pipelines using Azure Cognitive Services, OpenAI APIs, or ML.NET for: data extraction/classification from unstructured sources (OCR, NLP), smart routing and workflow recommendations, predictive analytics and anomaly detection, and intelligent chatbots and virtual assistants for API support
- Experience with AI-enriched integration use cases: RAG pipelines, auto-mapping schemas, adaptive decision-making, intelligent data validation
- Familiarity with vector databases (e.g., Pinecone, FAISS) and semantic search to enhance knowledge workflows
- Understanding of MLOps practices and integration of the AI lifecycle into CI/CD pipelines
- Contribution to open-source integration or AI tools

Soft Skills & Leadership
- Strategic thinker with deep technical expertise and strong business acumen in the iPaaS space
- Excellent communication skills with the ability to translate complex technical concepts into business terms
- Collaborative leader experienced in mentoring cross-functional engineering teams
- Strong problem-solving, prioritization, and project planning abilities in fast-paced, evolving environments

Qualifications & Experience
- M.C.A. / B.E. / B.Tech in Computer Science or a related field from a reputed institute
- Minimum 9+ years of experience in IT, with 3+ years in architectural or lead roles focused on integration technologies
- Proven experience designing and deploying enterprise-scale integration solutions using Microsoft Azure
- Experience implementing AI-driven features in cloud or integration platforms is highly desirable
- Certifications in Microsoft Azure (Architect, DevOps Engineer, AI Engineer) or iPaaS platforms are a strong plus
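As a rough picture of the API-orchestration pattern this role describes, here is a small Python sketch: pull records from a source API, normalize the payload, and push to a target system with retries. The endpoints and field names are hypothetical placeholders; a real iPaaS build would typically use Logic Apps, Functions, or another managed service rather than a script.

```python
# Hedged sketch of an extract -> normalize -> deliver integration with retry/backoff.
import time
import requests

SOURCE_URL = "https://source.example.com/api/orders"   # placeholder
TARGET_URL = "https://target.example.com/api/ingest"   # placeholder

def normalize(order: dict) -> dict:
    # Map the source schema onto the canonical schema the target expects.
    return {
        "orderId": order["id"],
        "customerEmail": order.get("email", "").lower(),
        "totalAmount": float(order["total"]),
    }

def push_with_retry(record: dict, attempts: int = 3) -> None:
    for i in range(attempts):
        resp = requests.post(TARGET_URL, json=record, timeout=10)
        if resp.status_code < 500:
            resp.raise_for_status()   # 2xx passes; 4xx is treated as a permanent error
            return
        time.sleep(2 ** i)            # exponential backoff on transient 5xx errors
    raise RuntimeError(f"Failed to deliver order {record['orderId']}")

def run_sync() -> None:
    for order in requests.get(SOURCE_URL, timeout=10).json():
        push_with_retry(normalize(order))
```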

Posted 2 months ago

Apply

7.0 - 10.0 years

2 - 3 Lacs

Gurgaon

On-site

Experience: 7 - 10 Years
Location: Gurgaon / Hybrid mode
CTC to be offered: Mention your current & expected CTC
Notice Period: Immediate to 30 days
KeySkills: SPLUNK, SIEM DOMAIN, BACKEND OPERATIONS, UF, HF, SH, INDEXER CLUSTER, LOG MANAGEMENT, LOG COLLECTION, PARSING, NORMALIZATION, RETENTION PRACTICES, LOGS/LICENSE OPTIMIZATION, DESIGNING, DEPLOYMENT & IMPLEMENTATION, DATA PARSIMONY, GERMAN DATA SECURITY STANDARDS, SPLUNK LOGGING INFRASTRUCTURE, OBSERVABILITY TOOLS, ELK, DATADOG, NETWORK ARCHITECTURE, LINUX ADMINISTRATION, SYSLOG, PYTHON, POWERSHELL, OR BASH, OEM SIEM, HLD, LLD, IMPLEMENTATION GUIDE, OPERATION MANUALS

Job Description: As Lead Splunk, your role and responsibilities would include:
- Hands-on experience in the SIEM domain
- Expert knowledge of Splunk backend operations (UF, HF, SH and Indexer Cluster) and architecture
- Expert knowledge of log management and Splunk SIEM; understanding of log collection, parsing, normalization, and retention practices
- Expert in logs/license optimization techniques and strategy
- Good understanding of designing, deployment & implementation of a scalable SIEM architecture
- Understanding of data parsimony as a concept, especially in terms of German data security standards
- Working knowledge of integrating the Splunk logging infrastructure with 3rd-party observability tools (e.g. ELK, Datadog, etc.)
- Experience in identifying security and non-security logs and applying adequate filters/re-routing the logs accordingly
- Expert in understanding the network architecture and identifying the components of impact
- Expert in Linux administration
- Proficient in working with syslog
- Proficiency in scripting languages like Python, PowerShell, or Bash to automate tasks
- Expertise with OEM SIEM tools, preferably Splunk
- Experience with open-source SIEM/log storage solutions like ELK or Datadog, etc.
- Very good with documentation of HLD, LLD, implementation guides and operation manuals

Note:
(i) Our client is looking for immediate & early joiners.
(ii) Having a LinkedIn profile is a must.
(iii) Being an immediate & high-priority requirement, interested candidates can share their resumes with a photograph in Word doc. format.
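The parse/normalize/route idea behind several of these points can be illustrated with a short, Splunk-agnostic Python sketch: classify raw syslog lines so security events go to the SIEM index and noisy operational logs are filtered out, which is also how license consumption gets reduced. In a real deployment this filtering is done in the Splunk pipeline itself; the regex and app list here are invented for illustration.

```python
# Illustrative (not Splunk-specific) syslog parsing, normalization, and routing.
import re

SYSLOG_RE = re.compile(
    r"^(?P<ts>\w{3}\s+\d+\s[\d:]+)\s(?P<host>\S+)\s(?P<app>[\w\-/]+)(\[\d+\])?:\s(?P<msg>.*)$"
)
SECURITY_APPS = {"sshd", "sudo", "auditd"}   # hypothetical classification list

def normalize(line):
    m = SYSLOG_RE.match(line)
    if not m:
        return None                 # unparsable lines would go to a quarantine index
    event = m.groupdict()
    event["category"] = "security" if event["app"] in SECURITY_APPS else "ops"
    return event

def route(events):
    events = [e for e in events if e is not None]
    security = [e for e in events if e["category"] == "security"]
    ops = [e for e in events if e["category"] == "ops"]
    return security, ops            # e.g. send 'security' to the SIEM, drop or archive 'ops'

sample = [
    "Jan 12 10:01:02 web01 sshd[311]: Failed password for root from 10.0.0.5",
    "Jan 12 10:01:03 web01 cron[99]: job started",
]
security, ops = route(normalize(l) for l in sample)
print(len(security), "security events,", len(ops), "ops events")
```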

Posted 2 months ago

Apply

7.0 - 9.0 years

6 - 10 Lacs

Noida

On-site

Are you our “TYPE”? Monotype (Global) Named "One of the Most Innovative Companies in Design" by Fast Company, Monotype brings brands to life through type and technology that consumers engage with every day. The company's rich legacy includes a library that can be traced back hundreds of years, featuring famed typefaces like Helvetica, Futura, Times New Roman and more. Monotype also provides a first-of-its-kind service that makes fonts more accessible for creative professionals to discover, license, and use in our increasingly digital world. We work with the biggest global brands, and with individual creatives, offering a wide set of solutions that make it easier for them to do what they do best: design beautiful brand experiences. Monotype Solutions India Monotype Solutions India is a strategic center of excellence for Monotype and is a certified Great Place to Work® three years in a row. The focus of this fast-growing center spans Product Development, Product Management, Experience Design, User Research, Market Intelligence, Research in areas of Artificial Intelligence and Machine learning, Innovation, Customer Success, Enterprise Business Solutions, and Sales. Headquartered in the Boston area of the United States and with offices across 4 continents, Monotype is the world’s leading company in fonts. It’s a trusted partner to the world’s top brands and was named “One of the Most Innovative Companies in Design” by Fast Company. Monotype brings brands to life through the type and technology that consumers engage with every day. The company's rich legacy includes a library that can be traced back hundreds of years, featuring famed typefaces like Helvetica, Futura, Times New Roman, and more. Monotype also provides a first-of-its-kind service that makes fonts more accessible for creative professionals to discover, license, and use in our increasingly digital world. We work with the biggest global brands, and with individual creatives, offering a wide set of solutions that make it easier for them to do what they do best: design beautiful brand experiences. We are looking for problem solvers to help us build next-generation features, products, and services. You will work closely with a cross-functional team of engineers on microservices and event-driven architectures. You are expected to contribute to the architecture, design, and development of new features, identify technical risks and find alternate solutions to various problems. In addition, the role also demands to lead, motivate & mentor other team members with respect to technical challenges. What we’re looking for: Bachelor’s or Master’s degree in Computer Science, Information Systems, or related field. Minimum 7-9 years of professional experience, with at least 5 years specifically in data architecture. Proven experience designing and implementing data models, including ER diagrams, dimensional modeling, and normalization techniques. Strong expertise in relational databases (SQL Server, Oracle, PostgreSQL) and NoSQL databases (MongoDB, Cassandra). Proficiency with data modeling tools such as ERwin, PowerDesigner, or similar tools. Knowledge of cloud data platforms and services (AWS, Azure, GCP). Strong analytical and problem-solving skills, with the ability to provide creative and innovative solutions. Excellent communication and stakeholder management abilities. You will have an opportunity to: ✔ COLLABORATE with global teams to build scalable web-based applications. 
✔ PARTNER closely with the engineering team to follow best practices and standards.
✔ PROVIDE reliable solutions to a variety of problems using sound problem-solving techniques.
✔ WORK with the broader team to build and maintain high-performance, flexible, and highly scalable web-based applications.
✔ ACHIEVE engineering excellence by implementing standard practices and standards.
✔ PERFORM technical root cause analysis and outline corrective action for given problems.

What's in it for you
- Hybrid work arrangements and competitive paid time off programs.
- Comprehensive medical insurance coverage to meet all your healthcare needs.
- Competitive compensation with corporate bonus program & uncapped commission for quota-carrying Sales
- A creative, innovative, and global working environment in the creative and software technology industry
- Highly engaged Events Committee to keep work enjoyable.
- Reward & Recognition Programs (including President's Club for all functions)
- Professional onboarding program, including robust targeted training for the Sales function
- Development and advancement opportunities (high internal mobility across the organization)
- Retirement planning options to save for your future, and so much more!

Monotype is an Equal Opportunities Employer. Qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, national origin, sexual orientation, gender identity, disability or protected veteran status.
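For the dimensional-modeling and normalization skills this posting calls for, here is a generic star-schema sketch using SQLite so it runs anywhere. It is illustrative only; the table and column names are invented, not Monotype's data model.

```python
# Generic star schema: two dimensions, one fact table, and a typical rollup query.
import sqlite3

DDL = """
CREATE TABLE dim_date (
    date_key   INTEGER PRIMARY KEY,      -- e.g. 20240131
    full_date  TEXT NOT NULL,
    month      INTEGER NOT NULL,
    year       INTEGER NOT NULL
);
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    customer_id  TEXT NOT NULL UNIQUE,   -- natural key from the source system
    segment      TEXT
);
CREATE TABLE fact_license_sales (
    date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
    customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    licenses     INTEGER NOT NULL,
    revenue      REAL NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)
# A typical analytical query: revenue by month and customer segment.
print(conn.execute("""
    SELECT d.year, d.month, c.segment, SUM(f.revenue)
    FROM fact_license_sales f
    JOIN dim_date d     ON d.date_key = f.date_key
    JOIN dim_customer c ON c.customer_key = f.customer_key
    GROUP BY d.year, d.month, c.segment
""").fetchall())
```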

Posted 2 months ago

Apply

0 years

2 - 4 Lacs

Vadodara

On-site

Are you passionate about data, performance tuning, and writing efficient SQL? Join our growing team where you'll work on exciting projects and contribute to maintaining high-performance, scalable database systems.

What we're looking for:
- Strong SQL skills - experience with SQL Server / PostgreSQL / MySQL
- Understanding of normalization, indexing, and query optimization
- Advanced query-writing skills
- Knowledge of database backup, recovery & security
- Basic Linux/Unix scripting (a plus)
- Exposure to cloud platforms like AWS RDS or Google Cloud SQL (bonus!)

Location: Vadodara
Apply here: khushirai@blueboxinfosoft.com
Let's build smarter systems together!

Job Type: Full-time
Pay: ₹200,000.00 - ₹400,000.00 per year
Benefits: Paid sick time
Schedule: Day shift, Monday to Friday
Work Location: In person
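The indexing and query-optimization skill asked for above boils down to knowing when the planner scans versus seeks. A small, self-contained illustration using SQLite (production would be SQL Server/PostgreSQL/MySQL, but the idea is the same):

```python
# Show how adding an index changes the query plan from a full scan to an index seek.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 1000, i * 0.5) for i in range(50_000)],
)

query = "SELECT SUM(amount) FROM orders WHERE customer_id = ?"

# Without an index the planner must scan the whole table.
print(conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())

# Adding an index on the filtered column lets the planner seek instead of scan.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
print(conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())
```

The first plan reports a table scan, the second an index search; the same reasoning applies to EXPLAIN output in the server databases listed above.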

Posted 2 months ago

Apply

1.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

JD for DM resource
Job Title: SQL Developer (1 Year Experience)
Location: Noida
Job Type: [Full-time]
Experience Required: 1 Year
Department: Insurance Product & Platforms

Job Summary
We are seeking a motivated and detail-oriented SQL Developer with 1 year of professional experience to join our data team. The ideal candidate will be responsible for writing queries, optimizing performance, managing databases, and supporting application development with efficient SQL solutions.

Key Responsibilities
- Develop, test, and maintain SQL queries, stored procedures, and scripts to support applications and reporting needs.
- Work closely with developers, data analysts, and business users to gather requirements and deliver solutions.
- Optimize SQL queries for performance and scalability.
- Assist in maintaining data integrity and security across multiple databases.
- Monitor database performance and troubleshoot issues as they arise.
- Generate reports and data extracts as required by business units.
- Perform data validation and cleansing as part of data migration or integration projects.
- Collaborate in database design and normalization.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
- 1 year of hands-on experience working with SQL databases (such as MS SQL Server, MySQL, PostgreSQL, or Oracle).
- Strong understanding of relational database concepts.
- Basic experience with ETL tools or scripting (nice to have).
- Good problem-solving and communication skills.
- Experience with reporting tools (e.g., Power BI, SSRS) is a plus.

Technical Skills
- Proficient in writing and debugging SQL queries, joins, subqueries, views, triggers, and stored procedures.
- Familiar with database performance tuning techniques.
- Understanding of database security and backup procedures.
- Exposure to version control systems like Git is an advantage.

Soft Skills
- Attention to detail and a strong desire to learn.
- Ability to work independently and in a team environment.
- Strong communication and documentation skills.
- Analytical thinking and a structured approach to problem-solving.

Preferred Certifications (optional)
- Cloud Certification (if any)
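The data validation and cleansing responsibility mentioned above typically means checking keys, duplicates, and references before a migration. A hedged sketch of that kind of pre-migration pass; the table and column names are placeholders, not the actual insurance schema:

```python
# Hypothetical pre-migration data-quality checks on policy data.
import pandas as pd

def validate_policies(policies: pd.DataFrame, customers: pd.DataFrame) -> list:
    """Return a list of human-readable data-quality findings."""
    findings = []
    if policies["policy_id"].isna().any():
        findings.append("policy_id contains NULLs")
    dupes = policies["policy_id"].duplicated().sum()
    if dupes:
        findings.append(f"{dupes} duplicate policy_id values")
    # Referential check: every policy must reference a known customer.
    orphans = ~policies["customer_id"].isin(customers["customer_id"])
    if orphans.any():
        findings.append(f"{orphans.sum()} policies reference missing customers")
    return findings

if __name__ == "__main__":
    customers = pd.DataFrame({"customer_id": [1, 2]})
    policies = pd.DataFrame({"policy_id": [10, 10, None], "customer_id": [1, 2, 3]})
    for issue in validate_policies(policies, customers):
        print("FAIL:", issue)
```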

Posted 2 months ago

Apply

7.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Are you our “TYPE”? Monotype (Global) Named "One of the Most Innovative Companies in Design" by Fast Company, Monotype brings brands to life through type and technology that consumers engage with every day. The company's rich legacy includes a library that can be traced back hundreds of years, featuring famed typefaces like Helvetica, Futura, Times New Roman and more. Monotype also provides a first-of-its-kind service that makes fonts more accessible for creative professionals to discover, license, and use in our increasingly digital world. We work with the biggest global brands, and with individual creatives, offering a wide set of solutions that make it easier for them to do what they do best: design beautiful brand experiences. Monotype Solutions India Monotype Solutions India is a strategic center of excellence for Monotype and is a certified Great Place to Work® three years in a row. The focus of this fast-growing center spans Product Development, Product Management, Experience Design, User Research, Market Intelligence, Research in areas of Artificial Intelligence and Machine learning, Innovation, Customer Success, Enterprise Business Solutions, and Sales. Headquartered in the Boston area of the United States and with offices across 4 continents, Monotype is the world’s leading company in fonts. It’s a trusted partner to the world’s top brands and was named “One of the Most Innovative Companies in Design” by Fast Company. Monotype brings brands to life through the type and technology that consumers engage with every day. The company's rich legacy includes a library that can be traced back hundreds of years, featuring famed typefaces like Helvetica, Futura, Times New Roman, and more. Monotype also provides a first-of-its-kind service that makes fonts more accessible for creative professionals to discover, license, and use in our increasingly digital world. We work with the biggest global brands, and with individual creatives, offering a wide set of solutions that make it easier for them to do what they do best: design beautiful brand experiences. We are looking for problem solvers to help us build next-generation features, products, and services. You will work closely with a cross-functional team of engineers on microservices and event-driven architectures. You are expected to contribute to the architecture, design, and development of new features, identify technical risks and find alternate solutions to various problems. In addition, the role also demands to lead, motivate & mentor other team members with respect to technical challenges. What We’re Looking For Bachelor’s or Master’s degree in Computer Science, Information Systems, or related field. Minimum 7-9 years of professional experience, with at least 5 years specifically in data architecture. Proven experience designing and implementing data models, including ER diagrams, dimensional modeling, and normalization techniques. Strong expertise in relational databases (SQL Server, Oracle, PostgreSQL) and NoSQL databases (MongoDB, Cassandra). Proficiency with data modeling tools such as ERwin, PowerDesigner, or similar tools. Knowledge of cloud data platforms and services (AWS, Azure, GCP). Strong analytical and problem-solving skills, with the ability to provide creative and innovative solutions. Excellent communication and stakeholder management abilities. You Will Have An Opportunity To ✔ COLLABORATE with global teams to build scalable web-based applications. 
✔ PARTNER closely with the engineering team to follow best practices and standards.
✔ PROVIDE reliable solutions to a variety of problems using sound problem-solving techniques.
✔ WORK with the broader team to build and maintain high-performance, flexible, and highly scalable web-based applications.
✔ ACHIEVE engineering excellence by implementing standard practices and standards.
✔ PERFORM technical root cause analysis and outline corrective action for given problems.

What's in it for you
- Hybrid work arrangements and competitive paid time off programs.
- Comprehensive medical insurance coverage to meet all your healthcare needs.
- Competitive compensation with corporate bonus program & uncapped commission for quota-carrying Sales
- A creative, innovative, and global working environment in the creative and software technology industry
- Highly engaged Events Committee to keep work enjoyable.
- Reward & Recognition Programs (including President's Club for all functions)
- Professional onboarding program, including robust targeted training for the Sales function
- Development and advancement opportunities (high internal mobility across the organization)
- Retirement planning options to save for your future, and so much more!

Monotype is an Equal Opportunities Employer. Qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, national origin, sexual orientation, gender identity, disability or protected veteran status.

Posted 2 months ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Make an impact with NTT DATA
Join a company that is pushing the boundaries of what is possible. We are renowned for our technical excellence and leading innovations, and for making a difference to our clients and society. Our workplace embraces diversity and inclusion – it's a place where you can grow, belong and thrive.

Your day at NTT DATA
The Security Managed Services Engineer (L1) is an entry-level engineering role, responsible for providing a managed service to clients to ensure that their firewall infrastructure remains operational through proactively identifying, investigating, and routing incidents to the correct resolver group. The primary objective of this role is to ensure zero missed service level agreement (SLA) conditions, and it focuses on first-line support for standard and low-complexity incidents and service requests. The Security Managed Services Engineer (L1) may also contribute to / support project work as and when required.

What you'll be doing (SOC Analyst responsibilities):
- Configure and maintain the SIEM system, ensuring that it's properly set up to collect and analyze security event data.
- Develop, customize, and manage security rules within the SIEM to detect and respond to security threats.
- Monitor SIEM alerts, investigate them, and take appropriate actions based on the severity and nature of the alerts.
- Oversee the collection, normalization, and storage of log data from various sources.
- Develop and document incident response procedures, and lead or assist in incident response efforts when security incidents occur.
- Analyze and investigate security events from various sources.
- Manage security incidents through all incident response phases to closure.
- Utilize SIEM, SOAR, UEBA, EDR, NBAD, Splunk PCAP, vulnerability scanning, and malware analysis technologies for event detection and analysis.
- Update tickets, write incident reports, and document actions to reduce false positives.
- Develop knowledge of attack types and fine-tune detection capabilities.
- Identify log sources and examine system logs to reconstruct event histories using forensic techniques.
- Align SIEM rules and alerts with the LIC's security policies and compliance requirements.
- Conduct computer forensic investigations, including examining running processes, identifying network connections, and disk imaging.
- Maintain and support the operational integrity of SOC toolsets.
- Collaborate with SIEM solution vendors for updates, patches, and support to ensure the system's reliability and effectiveness.
- Maintain thorough documentation of the SIEM system's configuration, procedures, and incident response plans.
- Proactively identify and report system security loopholes, infringements, and vulnerabilities to the Security Operations Centre Manager in a timely manner.
- Work closely with other IT and security teams during incident response, coordinating efforts and sharing information to mitigate security incidents effectively.
- Ensure that the SIEM system helps the LIC meet regulatory compliance requirements and is ready for security audits.
- Continuously optimize the SIEM system for efficient performance, ensuring it can handle the volume of data and remain responsive.
- Develop automation scripts and workflows to streamline common security response tasks and enhance efficiency.

Certification: Valid CEH certificate required
Workplace type: On-site working

About NTT DATA
NTT DATA is a $30+ billion trusted global innovator of business and technology services.
We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. We invest over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure, and connectivity. We are also one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group and headquartered in Tokyo.

Equal Opportunity Employer
NTT DATA is proud to be an Equal Opportunity Employer with a global culture that embraces diversity. We are committed to providing an environment free of unfair discrimination and harassment. We do not discriminate based on age, race, colour, gender, sexual orientation, religion, nationality, disability, pregnancy, marital status, veteran status, or any other protected category. Join our growing global team and accelerate your career with us. Apply today.

Posted 2 months ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Make an impact with NTT DATA
Join a company that is pushing the boundaries of what is possible. We are renowned for our technical excellence and leading innovations, and for making a difference to our clients and society. Our workplace embraces diversity and inclusion – it's a place where you can grow, belong and thrive.

Your day at NTT DATA
The Security Managed Services Engineer (L1) is an entry-level engineering role, responsible for providing a managed service to clients to ensure that their firewall infrastructure remains operational through proactively identifying, investigating, and routing incidents to the correct resolver group. The primary objective of this role is to ensure zero missed service level agreement (SLA) conditions, and it focuses on first-line support for standard and low-complexity incidents and service requests. The Security Managed Services Engineer (L1) may also contribute to / support project work as and when required.

What You'll Be Doing
Responsibilities:
- Configure and maintain the SIEM system, ensuring that it's properly set up to collect and analyze security event data.
- Develop, customize, and manage security rules within the SIEM to detect and respond to security threats.
- Monitor SIEM alerts, investigate them, and take appropriate actions based on the severity and nature of the alerts.
- Oversee the collection, normalization, and storage of log data from various sources.
- Develop and document incident response procedures, and lead or assist in incident response efforts when security incidents occur.
- Analyze and investigate security events from various sources.
- Manage security incidents through all incident response phases to closure.
- Utilize SIEM, SOAR, UEBA, EDR, NBAD, PCAP, vulnerability scanning, and malware analysis technologies for event detection and analysis.
- Update tickets, write incident reports, and document actions to reduce false positives.
- Develop knowledge of attack types and fine-tune detection capabilities.
- Identify log sources and examine system logs to reconstruct event histories using forensic techniques.
- Align SIEM rules and alerts with the LIC's security policies and compliance requirements.
- Conduct computer forensic investigations, including examining running processes, identifying network connections, and disk imaging.
- Maintain and support the operational integrity of SOC toolsets.
- Collaborate with SIEM solution vendors for updates, patches, and support to ensure the system's reliability and effectiveness.
- Maintain thorough documentation of the SIEM system's configuration, procedures, and incident response plans.
- Proactively identify and report system security loopholes, infringements, and vulnerabilities to the Security Operations Centre Manager in a timely manner.
- Work closely with other IT and security teams during incident response, coordinating efforts and sharing information to mitigate security incidents effectively.
- Ensure that the SIEM system helps the LIC meet regulatory compliance requirements and is ready for security audits.
- Continuously optimize the SIEM system for efficient performance, ensuring it can handle the volume of data and remain responsive.
- Develop automation scripts and workflows to streamline common security response tasks and enhance efficiency.

Workplace type: On-site working

About NTT DATA
NTT DATA is a $30+ billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success.
We invest over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure, and connectivity. We are also one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group and headquartered in Tokyo.

Equal Opportunity Employer
NTT DATA is proud to be an Equal Opportunity Employer with a global culture that embraces diversity. We are committed to providing an environment free of unfair discrimination and harassment. We do not discriminate based on age, race, colour, gender, sexual orientation, religion, nationality, disability, pregnancy, marital status, veteran status, or any other protected category. Join our growing global team and accelerate your career with us. Apply today.

Posted 2 months ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Role: Developer
Required Technical Skill Set: BMC Remedy
Desired Experience Range: 5+
Location of Requirement: Hyderabad

Desired Competencies (Technical/Behavioral Competency)
Must-Have:
· Exposure to the latest version of BMC ITSM (preferably 20.x) and a good understanding of the Helix platform
· BMC Analytics and migration experience
· Good working experience of Smart IT, DWP(A), Smart Reporting
· Experience in BMC Remedy AR System workflow development as per the provided design, and debugging of workflows
· Experience in Active Links, Filters, Escalations, Web Services, and creation and debugging of all form objects
· Experience with advanced AR configurations such as Server Group, DSO, Load Balancer, and thread settings
· Installation and configuration experience of AR/Atrium/ITSM highly desirable; a strong understanding of the permission model (User/Group/Role concepts) is a must
· Experience in BMC ITSM (Incident, Problem, Change & Release, Asset, Knowledge Management)
· Experience with BMC Atrium Core products (CMDB, Product Catalogue, Atrium Integrator)
· Good understanding of CMDB class structure (Common Data Model, CDM) and reconciliation concepts; experience with other features such as Normalization Engine, Atrium Impact Simulator, Service Catalogue, etc. desirable
· Experience with CMDB and AR; Java or C API knowledge is desirable
· Knowledge of UNIX, basic Oracle and SQL scripts

Posted 2 months ago

Apply

6.0 - 8.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

We are looking for a results-driven Senior Data Engineer to join our engineering team. The ideal candidate will have hands-on expertise in data pipeline development, cloud infrastructure, and BI support, with a strong command of modern data stacks. You'll be responsible for building scalable ETL/ELT workflows, managing data lakes and marts, and enabling seamless data delivery to analytics and business intelligence teams. This role requires deep technical know-how in PostgreSQL, Python scripting, Apache Airflow, AWS or other cloud environments, and a working knowledge of modern data and BI tools.

Responsibilities
- Design and optimize complex SQL queries, stored procedures, and indexes.
- Perform performance tuning and query plan analysis.
- Contribute to schema design and data normalization.
- Migrate data from multiple sources to cloud or ODS platforms.
- Design schema mapping and implement transformation logic.
- Ensure consistency, integrity, and accuracy in migrated data.
- Build automation scripts for data ingestion, cleansing, and transformation.
- Handle file formats (JSON, CSV, XML), REST APIs, and cloud SDKs (e.g., Boto3).
- Maintain reusable script modules for operational pipelines.
- Develop and manage DAGs for batch/stream workflows.
- Implement retries, task dependencies, notifications, and failure handling.
- Integrate Airflow with cloud services, data lakes, and data warehouses.
- Manage data storage (S3, GCS, Blob), compute services, and data pipelines.
- Set up permissions, IAM roles, encryption, and logging for security.
- Monitor and optimize the cost and performance of cloud-based data operations.
- Design and manage data marts using dimensional models.
- Build star/snowflake schemas to support BI and self-serve analytics.
- Enable incremental load strategies and partitioning.
- Work with tools like DBT, Fivetran, Redshift, Snowflake, BigQuery, or Kafka.
- Support modular pipeline design and metadata-driven frameworks.
- Ensure high availability and scalability of the stack.
- Collaborate with BI teams to design datasets and optimize queries.
- Support the development of dashboards and reporting layers.
- Manage access, data refreshes, and performance for BI tools.

Requirements
- 6-8 years of hands-on experience in data engineering roles.
- Strong SQL skills in PostgreSQL (tuning, complex joins, procedures).
- Advanced Python scripting skills for automation and ETL.
- Proven experience with Apache Airflow (custom DAGs, error handling).
- Solid understanding of cloud architecture (especially AWS).
- Experience with data marts and dimensional data modeling.
- Exposure to modern data stack tools (DBT, Kafka, Snowflake, etc.).
- Familiarity with BI tools like Power BI, Apache Superset, or Supertech BI.
- Version control (Git) and CI/CD pipeline knowledge are a plus.
- Excellent problem-solving and communication skills.

This job was posted by Suryansh Singh Karchuli from ShepHertz Technologies. Interested candidates can apply directly at Talent.acquisition@shephertz.com
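The Airflow responsibilities above (DAGs, retries, dependencies, failure handling) look roughly like the following minimal sketch. It assumes Apache Airflow 2.4+ and uses placeholder task bodies; a real pipeline would call extraction/load code instead of print().

```python
# Minimal Airflow DAG showing retries, retry delay, failure notification, and task dependencies.
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(): print("pull data from source API / S3")
def transform(): print("clean, normalize, and model the data")
def load(): print("load into the warehouse / data mart")

default_args = {
    "owner": "data-eng",
    "retries": 2,                          # automatic retries on transient failures
    "retry_delay": timedelta(minutes=5),
    "email_on_failure": True,              # failure notification hook (needs SMTP config)
}

with DAG(
    dag_id="daily_sales_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load     # task dependencies
```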

Posted 2 months ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Title: Sr. Splunk Architect
Experience: 7+ years
Location: Gurgaon (Hybrid)
Notice Period: Immediate joiner / serving

Responsibilities
As Lead Splunk, your role and responsibilities would include:
- Hands-on experience in the SIEM domain
- Expert knowledge of Splunk backend operations (UF, HF, SH and Indexer Cluster) and architecture
- Expert knowledge of log management and Splunk SIEM; understanding of log collection, parsing, normalization, and retention practices
- Expert in logs/license optimization techniques and strategy
- Good understanding of designing, deployment & implementation of a scalable SIEM architecture
- Understanding of data parsimony as a concept, especially in terms of German data security standards
- Working knowledge of integrating the Splunk logging infrastructure with 3rd-party observability tools (e.g. ELK, Datadog, etc.)
- Experience in identifying security and non-security logs and applying adequate filters/re-routing the logs accordingly
- Expert in understanding the network architecture and identifying the components of impact
- Expert in Linux administration
- Proficient in working with syslog
- Proficiency in scripting languages like Python, PowerShell, or Bash to automate tasks
- Expertise with OEM SIEM tools, preferably Splunk
- Experience with open-source SIEM/log storage solutions like ELK or Datadog, etc.
- Very good with documentation of HLD, LLD, implementation guides and operation manuals

Posted 2 months ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

Remote

Work Level: Individual
Core: Responsible
Leadership: Team Alignment
Industry Type: Information Technology
Function: Database Administrator
Key Skills: mSQL, SQL Writing, PLSQL
Education: Graduate
Note: This is a requirement for one of the Workassist Hiring Partners.

Requirements
- Currently pursuing or recently completed a degree in Computer Science, Information Technology, or a related field.
- Strong understanding of SQL and relational database concepts.
- Experience with databases such as MySQL, PostgreSQL, SQL Server, or Oracle.
- Ability to write efficient and optimized SQL queries.
- Basic knowledge of indexing, stored procedures, and triggers.
- Understanding of database normalization and design principles.
- Good analytical and problem-solving skills.
- Ability to work independently and in a team in a remote setting.

Company Description
Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000+ recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job seeking experience by leveraging technology and matching job seekers with the right employers.

For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: There are many more opportunities apart from this on the portal. Depending on your skills, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
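The "database normalization and design principles" requirement above is the textbook idea of splitting a flat table with repeated attributes into related tables. A small, self-contained illustration using SQLite (the table names are invented for the example):

```python
# Normalizing a denormalized orders table into customers + orders (roughly 3NF).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Denormalized: customer name/city repeated on every order (update anomalies).
CREATE TABLE orders_flat (
    order_id INTEGER, customer_name TEXT, customer_city TEXT, amount REAL
);

-- Normalized: customer facts stored once, referenced by key.
CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY, name TEXT NOT NULL, city TEXT
);
CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
    amount REAL NOT NULL
);
""")
conn.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                 [(1, "Asha", "Lucknow"), (2, "Ravi", "Pune")])
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(100, 1, 250.0), (101, 1, 80.0), (102, 2, 40.0)])
# A join reconstructs the original flat view without duplicating customer data.
print(conn.execute("""
    SELECT o.order_id, c.name, c.city, o.amount
    FROM orders o JOIN customers c ON c.customer_id = o.customer_id
""").fetchall())
```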

Posted 2 months ago

Apply

7.5 years

0 Lacs

Bhubaneswar, Odisha, India

On-site

Project Role: AI / ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Must be able to apply GenAI models as part of the solution. Could also include, but is not limited to, deep learning, neural networks, chatbots, and image processing.
Must-have skills: Machine Learning
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an AI/ML Engineer, you will develop applications and systems utilizing AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. You will apply GenAI models as part of the solution, including deep learning, neural networks, chatbots, and image processing.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Lead research and development efforts in AI/ML technologies.
- Implement and optimize machine learning models.
- Conduct data analysis and interpretation for business insights.

Professional & Technical Skills:
- Must-have skills: Proficiency in Machine Learning.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Machine Learning.
- This position is based at our Bhubaneswar office.
- 15 years of full-time education is required.
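For the data-munging and normalization skills listed above, the core step is rescaling numeric features before model training. A small pandas illustration with invented numbers:

```python
# Z-score and min-max normalization of numeric features (illustrative data).
import pandas as pd

df = pd.DataFrame({"age": [22, 35, 58, 41], "income": [18_000, 52_000, 95_000, 61_000]})

# Z-score normalization: each column gets mean 0 and standard deviation 1.
zscore = (df - df.mean()) / df.std()

# Min-max normalization: rescale each column into the [0, 1] range.
minmax = (df - df.min()) / (df.max() - df.min())

print(zscore.round(2))
print(minmax.round(2))
```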

Posted 2 months ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Project Role: AI / ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Must be able to apply GenAI models as part of the solution. Could also include, but is not limited to, deep learning, neural networks, chatbots, and image processing.
Must-have skills: Machine Learning
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an AI/ML Engineer, you will be developing applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. You will apply GenAI models as part of the solution, including deep learning, neural networks, chatbots, and image processing.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop applications and systems utilizing AI tools and Cloud AI services.
- Implement proper cloud or on-prem application pipelines with production-ready quality.
- Apply GenAI models as part of the solution.
- Utilize deep learning and neural networks in projects.
- Create chatbots and work on image processing tasks.

Professional & Technical Skills:
- Must-have skills: Proficiency in Machine Learning.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Machine Learning.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 2 months ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

When you join Verizon

You want more out of a career. A place to share your ideas freely, even if they're daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love, driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together, lifting our communities and building trust in how we show up, everywhere and always. Want in? Join the #VTeamLife.

What You'll Be Doing...

Flexera Implementation Consultant

As a consultant in the Transformation Advisory Governance team, you will be working on Flexera implementation and spearheading the deployment and configuration of Flexera's leading SAM solution within our dynamic organization. In this pivotal role, you will leverage your deep technical mastery of Flexera tools and a comprehensive understanding of software licensing, compliance, and optimization strategies to drive significant value and efficiency.

What We Are Looking For

Leading End-to-End Flexera Implementation: Architect and execute the complete implementation lifecycle of Flexera One / Flexera IT Asset Management, encompassing discovery, seamless system integration, robust data normalization, and insightful reporting functionalities.
Collaborating to Define Solution Scope: Partner closely with internal IT, procurement, and compliance teams to meticulously gather business and technical requirements, translating them into a clearly defined and effective Flexera solution scope.
Configuring for Optimal SAM: Expertly configure Flexera to accurately capture and report on critical software usage metrics, licensing entitlements, potential compliance gaps, and opportunities for cost optimization.
Driving Data Integration and Automation: Implement efficient data loading processes and establish robust inbound and outbound API integrations, along with configuring catalog items and automated workflows within the Flexera platform.
Integrating with the Enterprise Ecosystem: Seamlessly integrate Flexera with key enterprise systems, including SCCM, Active Directory, ServiceNow, and leading cloud platforms such as AWS and Azure, ensuring comprehensive asset visibility.
Mastering Device Inventory and Discovery: Implement and manage agent deployment strategies, optimize device inventory processes, and refine discovery mechanisms to build a comprehensive and accurate software catalog within Flexera.
Ensuring License Compliance and Optimization: Proactively manage license reconciliation processes and ensure ongoing compliance with major software vendors, including Microsoft, Oracle, Adobe, and IBM, identifying and implementing cost-saving optimization strategies.
Developing Actionable Insights through Reporting: Design and develop custom dashboards and insightful reports to effectively monitor software usage patterns, track costs, and ensure continuous compliance adherence.
Empowering Internal Teams: Conduct comprehensive training sessions for internal stakeholders on effective Flexera utilization and champion Software Asset Management best practices across the organization.
Providing Ongoing Expertise and Optimization: Serve as the subject matter expert for post-implementation support, manage system upgrades, and continuously identify and implement optimizations to maximize the value of the Flexera solution.
You'll need to have:

Proven Flexera Implementation Expertise: Demonstrated hands-on experience leading and executing Flexera (Flexera One / Flexera ITAM) implementations within medium to large-scale enterprise environments.
Extensive ITAM/SAM Experience: A minimum of 7 years of progressive experience in IT Asset Management and Software Asset Management, with significant hands-on involvement in Flexera tool implementation projects.
Deep Understanding of SAM Principles: Comprehensive knowledge of IT Asset Management (ITAM) and Software Asset Management (SAM) methodologies, best practices, and industry standards.
Strong Software Licensing Acumen: In-depth understanding of complex enterprise software licensing models and agreements for major vendors such as Microsoft, Oracle, IBM, and Adobe.
Proficiency in Discovery and Inventory: Hands-on experience with software discovery tools, inventory agents, and data connector technologies.
Familiarity with the ITSM Ecosystem: Working knowledge of IT Service Management (ITSM) tools, including ServiceNow, SCCM, and JAMF.
Data Analysis and Reporting Skills: Proven ability to create custom reports and visualizations using SQL, Power BI, or native Flexera reporting tools.
Exceptional Communication and Collaboration: Excellent interpersonal, written, and verbal communication skills with a proven ability to effectively manage stakeholders at all levels.
Educational Foundation: Bachelor's degree in Computer Science, Information Technology, or a related field.
Flexera Certification Advantage: Flexera Certified Implementation Professional or equivalent certification is highly desirable.

Even better if you have one or more of the following:

Experience implementing and leveraging ITIL processes and frameworks within an ITAM/SAM context.
Hands-on experience in cloud asset management and integrating Flexera with major cloud platforms (AWS, Azure, etc.).
Possession of a Flexera Certified Implementation Professional or an equivalent advanced Flexera certification.

Where you'll be working

In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours: 40

Equal Employment Opportunity

Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.

Locations - Hyderabad, India
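The license reconciliation work described above can be pictured as comparing installed counts against purchased entitlements. The sketch below is illustrative only and does not use Flexera's API; the vendors and counts are made up.

```python
# Illustrative reconciliation step: flag vendors where installs exceed entitlements.
import pandas as pd

installs = pd.DataFrame({"vendor": ["Microsoft", "Oracle", "Adobe"],
                         "installed": [1200, 310, 95]})
entitlements = pd.DataFrame({"vendor": ["Microsoft", "Oracle", "Adobe"],
                             "licensed": [1000, 400, 100]})

position = installs.merge(entitlements, on="vendor")
position["shortfall"] = (position["installed"] - position["licensed"]).clip(lower=0)
print(position)  # rows with shortfall > 0 indicate a potential compliance gap
```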

Posted 2 months ago

Apply

4.0 years

5 - 8 Lacs

Bengaluru

On-site

The Purpose of the Role

Chubb is seeking a highly skilled and experienced Deep Learning Engineer with Generative AI experience to develop and scale our Generative AI capabilities. The ideal candidate will be responsible for designing, fine-tuning, and training large language models and for developing Generative AI systems that can create and improve the conversational abilities and decision-making skills of our machines.

Location: Bangalore, India

Responsibilities
Develop and improve Generative AI systems to enable high-quality decision making, refine answers for queries, and enhance automated communication capabilities.
Own the entire process of data collection, training, and deploying machine learning models.
Continuously research and implement cutting-edge techniques in deep learning, NLP, and Generative AI to build state-of-the-art models.
Work closely with Data Scientists and other Machine Learning Engineers to design and implement end-to-end solutions.
Optimize and streamline deep learning training pipelines.
Develop performance metrics to track the efficiency and accuracy of deep learning models.

Required knowledge, skills and qualifications
Minimum of 4 years of industry experience in developing deep learning models with a focus on NLP and Generative AI.
Expertise in deep learning frameworks such as TensorFlow, PyTorch, and Keras.
Experience working with cloud-based services such as Azure for training and deployment of deep learning models.
Experience with Hugging Face's Transformers libraries.
Expertise in developing and scaling Generative AI systems.
Experience in large dataset processing, including pre-processing, cleaning, and normalization.
Proficient in programming languages such as Python and C++.
Experience with natural language processing (NLP) techniques and libraries.
Excellent analytical and problem-solving skills.
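The listing references Hugging Face's Transformers library. Below is a minimal sketch of loading a small generative model through its pipeline API; the model name (gpt2) and the prompt are placeholders, not what the team actually fine-tunes or deploys.

```python
# Minimal sketch: text generation with the Transformers pipeline API.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")  # placeholder model
out = generator("Summarize the claim status for the customer:", max_new_tokens=30)
print(out[0]["generated_text"])
```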

Posted 2 months ago

Apply

2.0 years

0 - 0 Lacs

Noida

On-site

We are looking for a highly skilled Sr. Developer with 2+ years of experience in web-based project development. The successful candidate will be responsible for designing, developing, and implementing web applications using PHP and various open-source frameworks.

Key Responsibilities:
Collaborate with cross-functional teams to identify and prioritize project requirements
Develop and maintain high-quality, efficient, and well-documented code
Troubleshoot and resolve technical issues
Implement social network integration, payment gateway integration, and Web 2.0 features in web-based projects
Work with RDBMS design, normalization, data modelling, transactions, and distributed databases
Develop and maintain database PL/SQL, stored procedures, and triggers

Requirements:
2+ years of experience in web-based project development using PHP
Experience with various open-source frameworks such as Laravel, WordPress, Drupal, Joomla, osCommerce, OpenCart, TomatoCart, VirtueMart, Magento, Yii 2, CakePHP 2.6, Zend 1.10, and Kohana
Strong knowledge of object-oriented PHP, cURL, Ajax, Prototype.js, jQuery, web services, design patterns, MVC architecture, and object-oriented methodologies
Experience with RDBMS design, normalization, data modelling, transactions, and distributed databases
Well-versed with MySQL (can work with other SQL flavors too)
Experience with social network integration, payment gateway integration, and Web 2.0 features in web-based projects

Job Type: Full-time
Pay: ₹25,000.00 - ₹40,000.00 per month
Benefits: Health insurance, Provident Fund
Location Type: In-person
Schedule: Day shift / Morning shift
Education: Bachelor's (Required)
Experience: PHP: 1 year (Required); Laravel: 1 year (Required); Total: 2 years (Required)
Location: Noida, Uttar Pradesh (Required)
Work Location: In person
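As a quick illustration of the RDBMS normalization this posting asks for, the sketch below stores customer details once and has orders reference them by key instead of repeating customer columns. It uses Python's built-in sqlite3 only so the example is self-contained; the table and column names are hypothetical.

```python
# Minimal sketch of a normalized two-table schema with a foreign key.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (
        id    INTEGER PRIMARY KEY,
        name  TEXT NOT NULL,
        email TEXT UNIQUE
    );
    CREATE TABLE orders (
        id          INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(id),
        total       NUMERIC NOT NULL
    );
""")
conn.execute("INSERT INTO customers (name, email) VALUES (?, ?)", ("Asha", "asha@example.com"))
conn.execute("INSERT INTO orders (customer_id, total) VALUES (1, 999.00)")
row = conn.execute(
    "SELECT c.name, o.total FROM orders o JOIN customers c ON c.id = o.customer_id"
).fetchone()
print(row)  # customer data lives in one place and is joined on demand
```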

Posted 2 months ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

PostgreSQL DBA

About the Role

We are looking for an experienced PostgreSQL administrator with expertise in database schema design, query optimization, performance optimization, and cloud service management of AWS Aurora PostgreSQL. The role is essential for supporting our product development teams in building efficient and scalable data-driven applications for the container shipping industry.

Key Responsibilities:

Database Design & Management: Collaborate with the product development team to design, implement, and maintain scalable database schemas that meet business and application requirements. Develop and maintain data models, ensuring consistency and optimal performance. Design tables, indexes, and constraints for high data integrity and performance.

Performance Tuning & Optimization: Analyse slow-running or poorly performing queries and optimize performance through proper indexing, query restructuring, or caching mechanisms. Conduct performance tuning, including tuning PostgreSQL parameters for optimal database performance. Work on improving database performance, scaling database operations, and addressing bottlenecks.

Cloud Database Management (AWS Aurora PostgreSQL): Manage and administer AWS Aurora PostgreSQL clusters, ensuring high availability, backup, recovery, and disaster recovery planning. Optimize the use of cloud-based resources in AWS Aurora to ensure cost-effective and efficient use. Monitor and maintain database systems in cloud environments, ensuring data security and availability.

Security & Compliance: Ensure that the database architecture complies with organizational security policies and best practices. Implement database encryption, user management, and access controls. Monitor database security and address any vulnerabilities or compliance concerns.

Automation & Maintenance: Automate routine database tasks such as backups, failovers, and maintenance windows. Develop and maintain database monitoring and alerting mechanisms to ensure system stability.

Documentation & Training: Create and maintain detailed documentation for database designs, performance optimizations, and cloud database configurations. Provide technical guidance and training to developers on best practices for schema design, query development, and database management.

What we are looking for

Experience: 7 to 11 years of technology experience working in a multi-national company. 5+ years of experience in PostgreSQL database administration, with a strong focus on query optimization, schema design, and performance tuning. Proven experience managing PostgreSQL on AWS Aurora.

Technical Skills: Strong expertise in PostgreSQL database design, including normalization, indexing, partitioning, and data modeling. In-depth knowledge of SQL, PL/pgSQL, and advanced PostgreSQL features such as triggers, stored procedures, and replication. Familiarity with AWS services (Aurora, RDS, EC2, S3, etc.) and cloud database management practices. Experience with query tuning tools such as pg_stat_statements and EXPLAIN for query analysis. Experience with database backup, recovery, replication, and failover strategies.

Performance Tuning: Expertise in tuning PostgreSQL databases for high performance, including memory usage optimization, connection pooling, and query optimization. Proficiency in analyzing and resolving database performance issues, especially in high-traffic and high-volume production environments.
Soft Skills: Excellent problem-solving skills and the ability to work closely with developers, DevOps, and architects. Strong communication skills to convey technical solutions to both technical and non-technical stakeholders.

Education: Engineering degree in Computer Science, Information Technology, or a related field.

Nice to Have: Experience with containerized databases using Docker or Kubernetes. Familiarity with event-driven architectures using Kafka. Experience with CI/CD pipelines and Flyway scripts.
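The query-analysis workflow mentioned above (EXPLAIN, pg_stat_statements) can be driven from any client. Below is a minimal sketch using psycopg2; the connection string, table name, and filter value are placeholders, not real infrastructure details.

```python
# Minimal sketch: run EXPLAIN ANALYZE on a query and print the plan lines.
import psycopg2

conn = psycopg2.connect("dbname=appdb user=dba host=localhost")  # hypothetical DSN
with conn, conn.cursor() as cur:
    cur.execute(
        "EXPLAIN ANALYZE SELECT * FROM shipments WHERE container_id = %s",
        ("MSCU1234567",),
    )
    for (line,) in cur.fetchall():
        print(line)  # look for sequential scans that an index could remove
conn.close()
```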

Posted 2 months ago

Apply

0 years

0 Lacs

India

Remote

About PurpleMerit

PurpleMerit is an AI-focused technology startup dedicated to building innovative, scalable, and intelligent software solutions. We leverage the latest advancements in artificial intelligence and cloud technology to deliver impactful products for a global audience. As a fully remote team, we value skill, curiosity, and a passion for solving real-world problems over formal degrees or prior experience.

Job Description

We are seeking motivated and talented Full Stack Developers to join our dynamic team. This position is structured as a mandatory internship-to-full-time pathway, designed to nurture and evaluate your technical and collaborative skills in a real-world environment. You will work on a variety of projects, including web applications, Chrome extensions, PWA apps, and full-stack software solutions, with a strong emphasis on system design and AI integration.

Roles & Responsibilities
Design, develop, and maintain robust, scalable web applications and software solutions.
Build end-to-end applications, including websites, Chrome extensions, and PWAs.
Architect systems with a strong focus on system design, scalability, and maintainability.
Develop RESTful and GraphQL APIs for seamless frontend-backend integration.
Implement secure authentication and authorization (OAuth, JWT, session management, role-based access).
Integrate AI tools and APIs; demonstrate a basic understanding of AI agents and prompt engineering.
Manage cloud infrastructure (AWS or Azure) and CI/CD pipelines for efficient deployment.
Perform basic server management (Linux/Unix, Nginx, Apache).
Design and optimize databases (schema design, normalization, indexing, query optimization).
Ensure code quality through testing and adherence to best practices.
Collaborate effectively in a remote, agile startup environment.

Required Skills
Strong understanding of system design and software architecture.
Experience with CI/CD pipelines and cloud platforms (AWS or Azure).
Proficiency with version control systems (Git).
Knowledge of API development (RESTful and GraphQL).
Familiarity with authentication and authorization protocols (OAuth, JWT, sessions, RBAC).
Basic server management skills (Linux/Unix, Nginx, Apache).
Database design and optimization skills.
Experience integrating AI tools or APIs; basic understanding of AI agents.
Basic knowledge of prompt engineering.
Commitment to testing and quality assurance.
Ability to build complete, production-ready applications.
No formal degree or prior experience required; a strong willingness to learn is essential.

Salary Structure

1. Pre-Qualification Internship (Mandatory)
Duration: 2 months
Stipend: ₹5,000/month
Purpose: Evaluate foundational skills, work ethic, and cultural fit.

2. Internship (Mandatory)
Duration: 3 months
Stipend: ₹7,000–₹15,000/month (based on pre-qualification performance)
Purpose: Deepen technical involvement and demonstrate capability.

3. Full-Time Employment
Salary: ₹3 LPA – ₹9 LPA (performance-based, determined during internships)
Note: Full-time offers are extended only upon successful completion of both internship stages.

Why Our Salary Structure is Unique

At PurpleMerit, we recognize the challenges of remote hiring in the AI era, where traditional interviews can be unreliable due to the widespread use of AI tools. To ensure genuine skills, cultural fit, and work ethic, we have implemented a structured pathway to full-time employment.
This process allows both you and PurpleMerit to evaluate fit through real-world collaboration before making a long-term commitment. We believe in "try and then decide", not just interviews, because we want to build a team based on real performance and trust.

Why Join PurpleMerit?
100% remote work with a flexible schedule.
Direct involvement in building AI-driven products from the ground up.
Mentorship from experienced engineers and founders.
Transparent growth path from internship to full-time employment.
Merit-based culture: your skills and contributions are what matter.
Opportunity to work on diverse projects and cutting-edge technologies.

Your Impact

At PurpleMerit, you will:
Directly influence the architecture and development of innovative AI products.
Solve complex challenges and see your solutions implemented in real products.
Help shape our engineering culture and set high standards for quality.
Accelerate your growth as a developer in a supportive, fast-paced environment.

If you are passionate about building impactful software and eager to work in an AI-driven startup, we encourage you to apply. Join us at PurpleMerit and be a part of our journey to innovate and excel. Apply now to start your career with PurpleMerit!
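For the JWT-based authorization mentioned in this listing, here is a minimal sketch using the PyJWT package; the signing key and claims are placeholders, not part of PurpleMerit's stack.

```python
# Minimal sketch: issue and verify a signed, expiring JWT with PyJWT.
import datetime
import jwt  # PyJWT

SECRET = "change-me"  # hypothetical signing key, normally loaded from config

token = jwt.encode(
    {
        "sub": "user-42",
        "role": "admin",
        "exp": datetime.datetime.now(datetime.timezone.utc) + datetime.timedelta(hours=1),
    },
    SECRET,
    algorithm="HS256",
)

# Decoding verifies the signature and rejects expired tokens.
claims = jwt.decode(token, SECRET, algorithms=["HS256"])
print(claims["sub"], claims["role"])
```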

Posted 2 months ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana

On-site

- 2+ years of data scientist experience
- 3+ years of experience with data querying languages (e.g. SQL), scripting languages (e.g. Python) or statistical/mathematical software (e.g. R, SAS, Matlab, etc.)
- 3+ years of experience with machine learning/statistical modeling data analysis tools and techniques, and the parameters that affect their performance
- Experience applying theoretical models in an applied environment

Job Description

Are you interested in applying your strong quantitative analysis and big data skills to world-changing problems? Are you interested in driving the development of methods, models and systems for capacity planning, transportation and the fulfillment network? If so, then this is the job for you.

Our team is responsible for creating core analytics tech capabilities, platform development and data engineering. We develop scalable analytics applications and research modeling to optimize operation processes. We standardize and optimize data sources and visualization efforts across geographies, and build up and maintain the online BI services and data mart. You will work with professional software development managers, data engineers, scientists, business intelligence engineers and product managers using rigorous quantitative approaches to ensure high quality data tech products for our customers around the world, including India, Australia, Brazil, Mexico, Singapore and the Middle East.

Amazon is growing rapidly, and because we are driven by faster delivery to customers, a more efficient supply chain network, and lower cost of operations, our main focus is the development of strategic models and automation tools fed by our massive amounts of available data. You will be responsible for building these models/tools that improve the economics of Amazon's worldwide fulfillment networks in emerging countries as Amazon increases the speed and decreases the cost to deliver products to customers. You will identify and evaluate opportunities to reduce variable costs by improving fulfillment center processes, transportation operations and scheduling, and the execution of operational plans. You will also improve the efficiency of capital investment by helping the fulfillment centers to improve storage utilization and the effective use of automation. Finally, you will help create the metrics to quantify improvements to the fulfillment costs (e.g., transportation and labor costs) resulting from the application of these optimization models and tools.

Major responsibilities include:
- Translating business questions and concerns into specific analytical questions that can be answered with available data using BI tools; producing the required data when it is not available.
- Applying statistical and machine learning methods to specific business problems and data.
- Creating global standard metrics across regions and performing benchmark analysis.
- Ensuring data quality throughout all stages of acquisition and processing, including data sourcing/collection, ground truth generation, normalization, transformation, cross-lingual alignment/mapping, etc.
- Communicating proposals and results in a clear manner backed by data and coupled with actionable conclusions to drive business decisions.
- Collaborating with colleagues from multidisciplinary science, engineering and business backgrounds.
- Developing efficient data querying and modeling infrastructure.
- Managing your own process: prioritizing and executing high-impact projects, triaging external requests, and ensuring projects are delivered on time.
- Utilizing code (Python, R, Scala, etc.) for analyzing data and building statistical models.

Preferred Qualifications
- Experience in Python, Perl, or another scripting language
- Experience in a ML or data scientist role with a large technology company

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
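One way to picture the "global standard metrics across regions" and normalization work described above is a per-region z-score, sketched below with pandas; the region codes, metric, and values are made up for illustration.

```python
# Illustrative sketch: standardize a cost metric within each region so values
# are comparable across geographies.
import pandas as pd

df = pd.DataFrame({
    "region": ["IN", "IN", "BR", "BR", "MX", "MX"],
    "cost_per_unit": [10.0, 14.0, 22.0, 18.0, 9.0, 12.0],
})

# z-score within each region: (x - regional mean) / regional standard deviation
df["cost_z"] = df.groupby("region")["cost_per_unit"].transform(
    lambda s: (s - s.mean()) / s.std()
)
print(df)
```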

Posted 2 months ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

This job is with Kyndryl, an inclusive employer and a member of myGwork – the largest global platform for the LGBTQ+ business community. Please do not contact the recruiter directly.

Who We Are

At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward - always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.

The Role

Within our Database Administration team at Kyndryl, you'll be a master of managing and administering the backbone of our technological infrastructure. You'll be the architect of the system, shaping the base definition, structure, and documentation to ensure the long-term success of our business operations.

Your expertise will be crucial in configuring, installing and maintaining database management systems, ensuring that our systems are always running at peak performance. You'll also be responsible for managing user access, implementing the highest standards of security to protect our valuable data from unauthorized access. In addition, you'll be a disaster recovery guru, developing strong backup and recovery plans to ensure that our system is always protected in the event of a failure. Your technical acumen will be put to use as you support end users and application developers in solving complex problems related to our database systems.

As a key player on the team, you'll implement policies and procedures to safeguard our data from external threats. You will also conduct capacity planning and growth projections based on usage, ensuring that our system is always scalable to meet our business needs. You'll be a strategic partner, working closely with various teams to coordinate systematic database project plans that align with our organizational goals. Your contributions will not go unnoticed - you'll have the opportunity to propose and implement enhancements that will improve the performance and reliability of the system, enabling us to deliver world-class services to our customers.

Your Future at Kyndryl

Every position at Kyndryl offers a way forward to grow your career, from Junior Administrator to Architect. We have training and upskilling programs that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. One of the benefits of Kyndryl is that we work with customers in a variety of industries, from banking to retail. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.

Who You Are

You're good at what you do and possess the required experience to prove it. However, equally as important - you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused - someone who prioritizes customer success in their work. And finally, you're open and borderless - naturally inclusive in how you work with others.

Required Technical and Professional Expertise

Bachelor's degree in Computer Science, Information Technology, or a related field.
5-8 years of proven hands-on experience in SQL database design, development, administration, and performance tuning.
Expertise in a specific SQL server platform (e.g., Microsoft SQL Server, PostgreSQL, MySQL). Experience with multiple platforms is a plus.
Strong proficiency in writing complex SQL queries, stored procedures, functions, and triggers.
Solid understanding of database concepts, including relational database theory, normalization, indexing, and transaction management.
Experience with database performance monitoring and tuning tools.
Experience with database backup and recovery strategies.
Knowledge of database security principles and best practices.
Experience with data migration and integration tools and techniques (e.g., ETL processes).
Excellent analytical, problem-solving, and troubleshooting skills.
Strong communication and collaboration skills.
Ability to work independently and as part of a team.

Preferred Technical and Professional Experience

Relevant certifications (e.g., Microsoft Certified: Database Administrator, Oracle Database Administrator).
Experience with cloud-based database services (e.g., Azure SQL Database, AWS RDS, Google Cloud SQL).
Experience with NoSQL databases.
Knowledge of scripting languages (e.g., Python, PowerShell).
Experience with data warehousing concepts and technologies.
Familiarity with Agile development methodologies.

Being You

Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you - and everyone next to you - the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect

With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter - wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!

If you know someone that works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
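A small sketch of the transaction-management concept listed above: a unit of work either commits fully or rolls back as a whole. It uses Python's sqlite3 so it runs anywhere; the table, rows, and failure condition are hypothetical.

```python
# Minimal sketch: atomic transfer - either both updates commit or neither does.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER NOT NULL)")
conn.executemany("INSERT INTO accounts (id, balance) VALUES (?, ?)", [(1, 100), (2, 50)])
conn.commit()

try:
    with conn:  # commits on success, rolls back if an exception escapes
        conn.execute("UPDATE accounts SET balance = balance - 80 WHERE id = 1")
        cur = conn.execute("UPDATE accounts SET balance = balance + 80 WHERE id = 99")
        if cur.rowcount == 0:
            raise LookupError("credit side of the transfer not found")
except LookupError:
    pass

# The debit was rolled back along with the failed credit.
print(conn.execute("SELECT balance FROM accounts WHERE id = 1").fetchone())  # (100,)
```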

Posted 2 months ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Area(s) of responsibility

About Us - Empowered by Innovation

Birlasoft, a global leader at the forefront of Cloud, AI, and Digital technologies, seamlessly blends domain expertise with enterprise solutions. The company's consultative and design-thinking approach empowers societies worldwide, enhancing the efficiency and productivity of businesses. As part of the multibillion-dollar diversified CKA Birla Group, Birlasoft, with its 12,000+ professionals, is committed to continuing the Group's 170-year heritage of building sustainable communities.

Job Description: DB Developer
Position: DB Developer
Location: Mumbai
Experience: 4-6 years

Position Overview: We are seeking a skilled Database Developer to design, develop, and maintain efficient database systems. The ideal candidate will have strong expertise in database programming, optimization, and troubleshooting to ensure high availability and performance of database solutions that support our applications.

Responsibilities
Design, develop, and maintain scalable database systems based on business needs.
Write complex SQL queries, stored procedures, triggers, and functions.
Optimize database performance, including indexing, query tuning, and normalization.
Implement and maintain database security, backup, and recovery strategies.
Collaborate with developers to integrate databases with application solutions.
Troubleshoot database issues and ensure high availability and reliability.
Design and maintain data models and database schemas.
Create ETL (Extract, Transform, Load) processes for data migration and transformation.
Monitor database performance and provide recommendations for improvements.
Document database architecture, procedures, and best practices.

Qualifications
Bachelor's degree in Computer Science, Information Technology, or a related field.
Proven experience as a Database Developer or in a similar role.
Proficiency in database technologies such as SQL Server, Oracle, MySQL, or PostgreSQL.
Expertise in writing complex SQL scripts and query optimization.
Experience with database tools like SSIS, SSRS, or Power BI.
Familiarity with NoSQL databases like MongoDB or Cassandra (optional).
Strong knowledge of database security, data modeling, and performance tuning.
Hands-on experience with ETL processes and tools.
Knowledge of cloud-based database solutions (AWS RDS, Azure SQL, etc.).
Excellent problem-solving skills and attention to detail.

Preferred Skills
Experience in Agile/Scrum methodologies.
Knowledge of scripting languages like Python, PowerShell, or Shell scripting.
Familiarity with DevOps practices for database deployment and CI/CD pipelines.
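A compact, illustrative Extract-Transform-Load step of the kind this role describes, using pandas and sqlite3; the source data, target table, and column names are made up for the example.

```python
# Minimal ETL sketch: extract raw rows, clean and cast them, load into a table.
import sqlite3
import pandas as pd

# Extract: in a real pipeline this would read from a source system or file.
raw = pd.DataFrame({"customer": [" Asha ", "Ravi", None],
                    "amount_inr": ["1200", "950", "400"]})

# Transform: drop incomplete rows, trim text, cast the amount to an integer.
clean = raw.dropna().copy()
clean["customer"] = clean["customer"].str.strip()
clean["amount_inr"] = clean["amount_inr"].astype(int)

# Load: write the cleaned rows into the target database table.
conn = sqlite3.connect(":memory:")
clean.to_sql("sales", conn, index=False, if_exists="replace")
print(conn.execute("SELECT * FROM sales").fetchall())
```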

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
