0 years
0 - 0 Lacs
India
Remote
Job Title: Backend Developer – Microservices (Node.js, TypeScript)
Location: Mumbai, India (Hybrid/Remote options available)
Employment Type: Full-Time

Role Overview
We are seeking a skilled Backend Developer to join our dynamic team. The ideal candidate will have a strong background in microservices architecture and be proficient in Node.js, TypeScript, and various database technologies. You will be responsible for designing, developing, and maintaining scalable backend services that power our applications.

Key Responsibilities
- Microservices Development: Design and implement scalable microservices using Node.js and TypeScript.
- Database Management: Develop and maintain both SQL (PostgreSQL) and NoSQL (MongoDB) databases, ensuring optimal performance and data integrity.
- Real-Time Communication: Implement real-time features using WebSockets and integrate with communication platforms like Agora for audio/video functionality.
- API Development: Create and maintain RESTful APIs, ensuring they are secure, efficient, and well documented.
- Database Design: Design and optimize database schemas, indexes, and queries to support application requirements.
- Code Quality: Write clean, maintainable, and testable code, adhering to best practices and coding standards.
- Collaboration: Work closely with frontend developers, product managers, and other stakeholders to deliver high-quality products.

Required Skills and Qualifications
- Microservices Architecture: Proven experience in designing and developing microservices-based applications.
- Node.js & TypeScript: Strong proficiency in Node.js and TypeScript, with a solid understanding of asynchronous programming and event-driven architectures.
- SQL & PostgreSQL: Experience in designing and managing relational databases, particularly PostgreSQL.
- NoSQL & MongoDB: Hands-on experience with NoSQL databases, especially MongoDB, including schema design and performance tuning.
- WebSockets & Agora: Familiarity with real-time communication protocols and platforms such as WebSockets and Agora.
- Database Design: Ability to design efficient and scalable database schemas, with a focus on data normalization and indexing strategies.

Preferred Qualifications
- Socket Programming: Experience with socket programming for real-time data exchange.
- Additional Tools: Familiarity with monorepo structures and tools like TurboRepo for managing codebases.
- Testing & CI/CD: Knowledge of testing frameworks and continuous integration/continuous deployment pipelines.
- Cloud Platforms: Experience with cloud services (e.g., AWS, Azure) for deploying and managing applications.

Educational Background
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.

Job Type: Full-time
Pay: ₹30,000.00 - ₹70,000.00 per month
Schedule: Day shift
Work Location: In person
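For illustration, the asynchronous, non-blocking style this role emphasizes can be sketched as a concurrent fan-out over two data stores. The sketch below uses Python's asyncio for brevity (the same pattern applies with Promise.all in Node.js); the stores and field names are invented stand-ins, not the employer's actual schema.

```python
import asyncio

# Hypothetical in-memory stores standing in for PostgreSQL and MongoDB lookups.
USERS = {1: {"name": "Asha"}}
PROFILES = {1: {"bio": "backend dev"}}

async def fetch_user(user_id):
    await asyncio.sleep(0)  # stands in for non-blocking database I/O
    return USERS.get(user_id)

async def fetch_profile(user_id):
    await asyncio.sleep(0)
    return PROFILES.get(user_id)

async def get_user_view(user_id):
    # Fan both lookups out concurrently instead of awaiting them in sequence.
    user, profile = await asyncio.gather(fetch_user(user_id), fetch_profile(user_id))
    if user is None:
        return None
    return {**user, **(profile or {})}

print(asyncio.run(get_user_view(1)))  # {'name': 'Asha', 'bio': 'backend dev'}
```

Fanning out independent queries this way keeps request latency close to the slowest single lookup rather than the sum of all of them.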
Posted 1 month ago
5.0 years
1 - 10 Lacs
Noida
On-site
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:
- Support the full data engineering lifecycle, including research, proofs of concept, design, development, testing, deployment, and maintenance of data management solutions
- Utilize knowledge of various data management technologies to drive data engineering projects
- Work with Operations and Product Development staff to support applications/processes that facilitate the effective and efficient implementation/migration of new clients' healthcare data through the Optum Impact Product Suite
- Lead data acquisition efforts to gather data from various structured or semi-structured source systems of record to hydrate the client data warehouse and power analytics across numerous healthcare domains
- Leverage a combination of ETL/ELT methodologies to pull complex relational and dimensional data to support loading data marts and reporting aggregates
- Eliminate unwarranted complexity and unneeded interdependencies
- Detect data quality issues, identify root causes, implement fixes, and manage data audits to mitigate data challenges
- Implement, modify, and maintain data integration efforts that improve data efficiency, reliability, and value
- Leverage and facilitate the evolution of best practices for data acquisition, transformation, storage, and aggregation that solve current challenges and reduce the risk of future challenges
- Effectively create data transformations that address business requirements and other constraints
- Partner with the broader analytics organization to make recommendations for changes to data systems and the architecture of data platforms
- Prepare high-level design documents and detailed technical design documents with best practices to enable efficient data ingestion, transformation and data movement
- Leverage DevOps tools to enable code versioning and code deployment
- Leverage data pipeline monitoring tools to detect data integrity issues before they result in user-visible outages or data quality issues
- Leverage processes and diagnostic tools to troubleshoot, maintain and optimize solutions and respond to customer and production issues
- Continuously support technical debt reduction, process transformation, and overall optimization
- Leverage and contribute to the evolution of standards for high-quality documentation of data definitions, transformations, and processes to ensure data transparency, governance, and security
- Ensure that all solutions meet the business needs and requirements for security, scalability, and reliability
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications:
- Bachelor's degree (preferably in information technology, engineering, math, computer science, analytics, or another related field)
- 5+ years of combined experience in data engineering, ingestion, normalization, transformation, aggregation, structuring, and storage
- 5+ years of combined experience working with industry-standard relational, dimensional or non-relational data storage systems
- 5+ years of experience in designing ETL/ELT solutions using tools like Informatica, DataStage, SSIS, PL/SQL, T-SQL, etc.
- 5+ years of experience in managing data assets using SQL, Python, Scala, VB.NET or other similar querying/coding languages
- 3+ years of experience working with healthcare data or data to support healthcare organizations

Preferred Qualifications:
- 5+ years of experience in creating source-to-target mappings and ETL designs for the integration of new/modified data streams into the data warehouse/data marts
- Experience in Unix, PowerShell, or other batch scripting languages
- Experience supporting data pipelines that power analytical content within common reporting and business intelligence platforms (e.g. Power BI, Qlik, Tableau, MicroStrategy, etc.)
- Experience supporting analytical capabilities inclusive of reporting, dashboards, extracts, BI tools, analytical web applications and other similar products
- Experience contributing to cross-functional efforts with proven success in creating healthcare insights
- Experience and credibility interacting with analytics and technology leadership teams
- Depth of experience and a proven track record creating and maintaining sophisticated data frameworks for healthcare organizations
- Exposure to Azure, AWS, or Google Cloud ecosystems
- Exposure to Amazon Redshift, Amazon S3, Hadoop HDFS, Azure Blob, or similar big data storage and management components
- Demonstrated desire to continuously learn and seek new options and approaches to business challenges
- Willingness to leverage best practices, share knowledge, and improve the collective work of the team
- Demonstrated excellent communication skills, both written and verbal, including the ability to effectively communicate concepts in writing and conversation
- Demonstrated awareness of when to appropriately escalate issues/risks

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone–of every race, gender, sexuality, age, location and income–deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
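The "detect data quality issues, identify root causes, implement fixes" responsibility above usually starts with a normalization pass that separates clean rows from flagged ones. A minimal sketch, assuming invented field names (member_id, amount) rather than any actual Optum schema:

```python
def normalize_claims(rows):
    """Toy ETL step: normalize raw claim rows and collect data-quality issues.

    Field names are illustrative only. Returns (clean_rows, issues), where
    issues pairs the offending row index with a reason string.
    """
    clean, issues = [], []
    for i, row in enumerate(rows):
        member = (row.get("member_id") or "").strip()
        if not member:
            issues.append((i, "missing member_id"))
            continue
        try:
            amount = round(float(row.get("amount", 0)), 2)
        except (TypeError, ValueError):
            issues.append((i, "non-numeric amount"))
            continue
        # Normalize identifier casing so downstream joins are consistent.
        clean.append({"member_id": member.upper(), "amount": amount})
    return clean, issues

raw = [
    {"member_id": " m001 ", "amount": "125.50"},
    {"member_id": "", "amount": "10"},
    {"member_id": "m002", "amount": "n/a"},
]
clean, issues = normalize_claims(raw)
print(clean)   # [{'member_id': 'M001', 'amount': 125.5}]
print(issues)  # [(1, 'missing member_id'), (2, 'non-numeric amount')]
```

Keeping rejected rows with a reason, instead of silently dropping them, is what makes the later data-audit step possible.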
Posted 1 month ago
6.0 years
0 Lacs
Ahmedabad, Gujarat, India
Remote
Backend Developer

The Backend Developer is responsible for building and maintaining the server-side infrastructure. This includes managing data, building APIs, integrating with external services, and ensuring the backend handles scaling, security, and performance optimally.

Job Overview:
As a Backend Developer, you will be responsible for developing, maintaining, and optimizing the server-side application, database systems, and integrations that power core functionalities. You will ensure the backend is efficient, secure, scalable, and maintainable, collaborating with the frontend team and other stakeholders.

Key Responsibilities:
1. API Development & Integration:
• Design, develop, and maintain RESTful APIs and GraphQL services to serve data to the frontend.
• Build and manage microservices for specific ERP functionalities (e.g., Inventory, Orders, User Management, etc.).
• Integrate third-party APIs and services (payment gateways, authentication systems, etc.).
• Work with API Gateway (AWS) to manage, monitor, and throttle API requests.
2. Database Design & Management:
• Design and maintain PostgreSQL databases, ensuring data integrity, normalization, and efficient query performance.
• Implement ORM (Object-Relational Mapping) solutions like Prisma, Sequelize, or Django ORM for easier database management.
• Manage database migrations, backups, and high-availability configurations.
• Design and implement caching mechanisms to improve database query performance (e.g., Redis).
3. Authentication & Authorization:
• Implement secure user authentication and authorization systems (OAuth 2.0, JWT, Amazon Cognito).
• Handle user sessions and roles to ensure that only authorized users can access specific data or perform actions.
4. Performance Optimization:
• Optimize server-side performance to ensure the ERP system can handle high traffic and large data sets.
• Perform database indexing and query optimization to reduce load times.
• Set up and monitor auto-scaling infrastructure (e.g., AWS EC2 Auto Scaling, AWS Lambda for serverless functions).
5. Security & Compliance:
• Implement best practices for securing the backend, including data encryption, rate limiting, and API security.
• Ensure that sensitive data is stored and transmitted securely, using services like AWS KMS (Key Management Service).
• Comply with industry standards for data protection and privacy (e.g., GDPR).
6. Testing & Debugging:
• Write unit, integration, and API tests using testing frameworks like Jest, Mocha, or PyTest (depending on language).
• Debug backend issues and optimize performance for a seamless user experience.
• Conduct thorough testing for edge cases, system loads, and failure scenarios.
7. Collaboration & Agile Development:
• Work closely with the frontend team to ensure smooth integration of APIs with the user interface.
• Participate in agile development cycles, attending daily standups, sprint planning, and code reviews.
• Contribute to architecture decisions and system design for scaling and maintaining the ERP platform.
8. Infrastructure & DevOps:
• Manage cloud infrastructure using AWS EC2, S3, Lambda, and other services.
• Implement CI/CD pipelines for seamless deployment and updates using GitHub Actions, Jenkins, or AWS CodePipeline.
• Use tools like Terraform or CloudFormation for infrastructure-as-code (IaC).

Required Skills & Qualifications:
• Proficiency in backend programming languages: Node.js (JavaScript/TypeScript) or Python (Django/Flask).
• Experience with relational databases (PostgreSQL, MySQL, or similar).
• Experience with ORMs like Prisma, Sequelize, or Django ORM.
• Knowledge of GraphQL and RESTful APIs.
• Experience with authentication systems (OAuth 2.0, JWT, Amazon Cognito).
• Familiarity with AWS services (EC2, Lambda, RDS, S3, CloudWatch).
• Strong understanding of version control with Git.
• Experience with Docker and containerized applications.
• Ability to design and implement scalable microservices architecture.
• Familiarity with caching mechanisms (e.g., Redis, CloudFront).
• Knowledge of CI/CD pipelines (GitHub Actions, Jenkins, CodePipeline).
• Familiarity with API Gateway (AWS).
• Understanding of security best practices in backend systems.

Preferred Skills & Qualifications:
• Familiarity with serverless architecture (AWS Lambda).
• Experience with container orchestration tools like Kubernetes or Docker Swarm.
• Experience with GraphQL and tools like Apollo Server for building GraphQL APIs.
• Knowledge of monitoring and logging tools like AWS CloudWatch, Prometheus, or the ELK Stack.
• Familiarity with server-side rendering frameworks like Next.js (for full-stack development).
• Advanced database management: sharding, replication, high availability, and failover mechanisms.

Education & Experience:
• Degree or equivalent experience in Computer Science, Software Engineering, or a related field.
• At least 6 years of experience as a backend developer.
• Proven experience in developing APIs, integrating with third-party services, and handling large-scale databases.

Soft Skills:
• Strong problem-solving and analytical skills.
• Excellent written and verbal communication skills.
• Ability to work collaboratively in a team-oriented, agile environment.
• Comfortable with remote work and self-management.
• Adaptability to new technologies and learning on the go.
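The caching responsibility described above (Redis-style read caching with expiry) reduces, at its core, to a map of key to (value, deadline). A minimal in-process sketch of that pattern, purely illustrative; a production ERP would use Redis or a similar shared store:

```python
import time

class TTLCache:
    """Minimal in-process cache with per-entry expiry.

    Illustrates the TTL-caching pattern only; in the stack described
    above the same idea would typically live in Redis.
    """

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expires_at)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # lazily evict stale entries on read
            return None
        return value

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

cache = TTLCache(ttl_seconds=60)
cache.set("order:42", {"status": "shipped"})
print(cache.get("order:42"))  # {'status': 'shipped'}
print(cache.get("order:99"))  # None
```

The expiry check on read (lazy eviction) keeps the implementation small; a background sweep is only needed if stale keys must be reclaimed promptly.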
Posted 1 month ago
10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Location(s): Quay Building 8th Floor, Bagmane Tech Park, Bengaluru, IN
Line of Business: RRS
Job Category: Credit Analysis & Research
Experience Level: Experienced Hire

Skills and competencies:
Credit:
- Perform holistic analysis to support ratings and research, with autonomy for the most advanced/bespoke tasks
- Extensive working knowledge of applicable methodologies
- Proficient in all technical and operational aspects of assigned deliverables
Stakeholder Management:
- Comfortably and professionally interact with various senior stakeholders on complex topics
- Ability to effectively engage with team members of all levels and across departments
Project Management:
- Manage deliverables related to multiple projects independently

Qualifications
- Bachelor's/Master's degree in relevant fields
- 10+ years in credit/financial data analysis
- Strong organizational and multitasking skills, along with experience in managing and mentoring teams
- Proficient in financial statement analysis and advanced Excel
- Fluent in English with strong communication skills

Role and Responsibilities
Lead and manage managers, people and projects supporting data, ratings, research and analytical outreach, and apply technical knowledge to guide junior team members and lead process improvements.
- Act as a people manager and a manager of managers, with responsibility for all aspects of data intake, data normalization and data processing
- Act as the primary point of oversight across all teams: leading, motivating and supporting staff
- Take full responsibility for the quality of deliverables of the staff they manage, as well as end-to-end delivery of several business-as-usual (BAU) tasks, such as complex data and research tasks or analytic support tasks such as complex methodology application processes, complex credit estimates and complicated research projects
- Take full responsibility for initial review/quality assessments of work for complex tasks and address improvement areas through feedback or training
- Coordinate allocation of work across the team and manage the workload pipeline to provide sufficient time for completion
- Provide feedback to management on the quality and accuracy of work produced by team members, while giving constructive and proactive feedback
- Identify and implement process re-engineering and process improvements; utilize expertise to identify inefficiencies, suggest improvements, build consensus and implement the change
- Take complete ownership of the transition of simpler activities to the service hubs
- Manage relationships with key stakeholders and support outsourcing activities where relevant
- Lead projects or participate in working groups; for example, analyze the impact of methodology updates on the team's processes and lead the implementation, or partner with other departments to move work into RRS Global Capability Centers, building well-defined processes and output targets
- Attract, hire, develop, motivate and retain talented junior analytical staff

About the team:
Being part of the RRS Global Capability Centers provides a unique opportunity to foster skills that are valuable to any future career in the financial services industry. The RRS GCC teams perform a range of data, analytical and research services that contribute to the overall credit analysis function performed by the rating groups. By joining our team, you will be a part of exciting work in the global capability centers.
Posted 1 month ago
5.0 years
0 Lacs
Thiruvananthapuram, Kerala, India
On-site
Job description
We are looking for an experienced Database (DB) Developer with a minimum of 5 years of experience to join our team. The ideal candidate will have strong expertise in SQL Server, database schema design, SQL/DB programming, and database performance optimization. This role involves working on high-performance databases, optimizing queries, and handling data migrations for our IT products.

Key Responsibilities
- Design, develop, and maintain database solutions using SQL Server.
- Create, modify, and manage database schema structures to meet project requirements.
- Write complex SQL queries, stored procedures, triggers, and functions.
- Optimize database performance through query tuning, indexing, and optimization techniques.
- Conduct regular database performance analysis and suggest improvements for efficiency.
- Handle data migration and transformation projects, ensuring data integrity and minimal downtime.
- Monitor and maintain database systems to ensure high availability, security, and reliability.
- Develop and maintain ETL (Extract, Transform, Load) processes for data movement between systems.
- Collaborate with development teams to integrate database solutions with applications.
- Ensure adherence to best practices for database design, development, and documentation.

Required Skills and Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 5 years of hands-on experience as a Database Developer.
- Strong expertise in SQL Server (SQL Server 2016/2019/2022 preferred).
- In-depth knowledge of database schema design and data modelling.
- Proficient in SQL/DB programming, including writing complex queries, stored procedures, views, triggers, and functions.
- Proven experience in database performance tuning and query optimization.
- Experience with data migration, transformation, and data integration processes.
- Familiarity with SQL Server tools such as SSMS, Profiler, and Performance Monitor.
- Strong understanding of indexing, partitioning, and database normalization techniques.
- Knowledge of backup, recovery, and disaster recovery planning.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

Preferred Skills
- Knowledge of data warehousing concepts and BI tools (e.g., SSIS, SSRS, Power BI).
- Experience with cloud-based databases (e.g., Azure SQL Database, AWS RDS).
- Familiarity with version control systems (e.g., Git) and CI/CD pipelines.
- Understanding of database security best practices and encryption methods.

Work Mode: On-site (In Office)
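The "data migration with minimal downtime" responsibility above typically means copying rows in small batches so each transaction commits quickly and locks are held briefly. A minimal sketch of that batching logic; the copy_batch callback is a hypothetical stand-in for whatever bulk insert the target database exposes:

```python
def batches(rows, size):
    """Yield fixed-size chunks of rows."""
    for start in range(0, len(rows), size):
        yield rows[start:start + size]

def migrate(rows, copy_batch, size=1000):
    """Copy rows across in small transactions, returning the row count.

    Batching keeps each transaction short, so locks are held briefly
    and progress survives a mid-run failure. `copy_batch` is a
    hypothetical callback, e.g. one bulk INSERT per chunk.
    """
    copied = 0
    for chunk in batches(rows, size):
        copy_batch(chunk)
        copied += len(chunk)
    return copied

moved = []
total = migrate(list(range(10)), moved.extend, size=4)
print(total, len(moved))  # 10 10
```

In a real migration the loop would also checkpoint the last copied key, so a restart can resume instead of recopying from the beginning.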
Posted 1 month ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Make an impact with NTT DATA
Join a company that is pushing the boundaries of what is possible. We are renowned for our technical excellence and leading innovations, and for making a difference to our clients and society. Our workplace embraces diversity and inclusion – it’s a place where you can grow, belong and thrive.

Your day at NTT DATA
The Security Managed Services Engineer (L1) is an entry-level engineering role, responsible for providing a managed service to clients to ensure that their firewall infrastructure remains operational by proactively identifying, investigating, and routing incidents to the correct resolver group. The primary objective of this role is to ensure zero missed service level agreement (SLA) conditions, focusing on first-line support for standard and low-complexity incidents and service requests. The Security Managed Services Engineer (L1) may also contribute to or support project work as and when required.

What You'll Be Doing
Key Responsibilities:
- Configure and maintain the SIEM system, ensuring that it is properly set up to collect and analyze security event data.
- Develop, customize, and manage security rules within the SIEM to detect and respond to security threats.
- Monitor SIEM alerts, investigate them, and take appropriate actions based on the severity and nature of the alerts.
- Oversee the collection, normalization, and storage of log data from various sources.
- Develop and document incident response procedures, and lead or assist in incident response efforts when security incidents occur.
- Analyze and investigate security events from various sources.
- Manage security incidents through all incident response phases to closure.
- Utilize SIEM, SOAR, UEBA, EDR, NBAD, PCAP, vulnerability scanning, and malware analysis technologies for event detection and analysis.
- Update tickets, write incident reports, and document actions to reduce false positives.
- Develop knowledge of attack types and fine-tune detection capabilities.
- Identify log sources and examine system logs to reconstruct event histories using forensic techniques.
- Align SIEM rules and alerts with the LIC’s security policies and compliance requirements.
- Conduct computer forensic investigations, including examining running processes, identifying network connections, and disk imaging.
- Maintain and support the operational integrity of SOC toolsets.
- Collaborate with SIEM solution vendors on updates, patches, and support to ensure the system's reliability and effectiveness.
- Maintain thorough documentation of the SIEM system's configuration, procedures, and incident response plans.
- Proactively identify and report system security loopholes, infringements, and vulnerabilities to the Security Operations Centre Manager in a timely manner.
- Work closely with other IT and security teams during incident response, coordinating efforts and sharing information to mitigate security incidents effectively.
- Ensure that the SIEM system helps the LIC meet regulatory compliance requirements and is ready for security audits.
- Continuously optimize the SIEM system for efficient performance, ensuring it can handle the volume of data and remain responsive.
- Develop automation scripts and workflows to streamline common security response tasks and enhance efficiency.

Workplace type: On-site Working

About NTT DATA
NTT DATA is a $30+ billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. We invest over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure, and connectivity. We are also one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group and headquartered in Tokyo.

Equal Opportunity Employer
NTT DATA is proud to be an Equal Opportunity Employer with a global culture that embraces diversity. We are committed to providing an environment free of unfair discrimination and harassment. We do not discriminate based on age, race, colour, gender, sexual orientation, religion, nationality, disability, pregnancy, marital status, veteran status, or any other protected category. Join our growing global team and accelerate your career with us. Apply today.
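The SIEM correlation rules this role develops commonly reduce to windowed counting over normalized events, e.g. "N failed logins from one source within T seconds". A toy sketch of that logic; the event tuple shape and thresholds are invented for illustration, not any vendor's rule syntax:

```python
from collections import defaultdict, deque

def flag_bruteforce(events, threshold=5, window=60):
    """Toy SIEM-style correlation rule: flag source IPs that produce
    `threshold` or more failed logins within `window` seconds.

    Events are (timestamp_seconds, source_ip, outcome) tuples after
    normalization; shape and thresholds are illustrative only.
    """
    recent = defaultdict(deque)
    flagged = set()
    for ts, ip, outcome in sorted(events):
        if outcome != "fail":
            continue
        q = recent[ip]
        q.append(ts)
        while q and ts - q[0] > window:  # drop attempts outside the window
            q.popleft()
        if len(q) >= threshold:
            flagged.add(ip)
    return flagged

events = [(t, "10.0.0.5", "fail") for t in range(5)] + \
         [(0, "10.0.0.9", "fail"), (300, "10.0.0.9", "fail")]
print(flag_bruteforce(events))  # {'10.0.0.5'}
```

The sliding window is what separates a genuine burst from the same number of failures spread over hours, which keeps the false-positive rate down.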
Posted 1 month ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Role Description Role Proficiency: Provide expertise on data analysis techniques using software tools. Under supervision streamline business processes. Outcomes Design and manage the reporting environment; which include data sources security and metadata. Provide technical expertise on data storage structures data mining and data cleansing. Support the data warehouse in identifying and revising reporting requirements. Support initiatives for data integrity and normalization. Assess tests and implement new or upgraded software. Assist with strategic decisions on new systems. Generate reports from single or multiple systems. Troubleshoot the reporting database environment and associated reports. Identify and recommend new ways to streamline business processes Illustrate data graphically and translate complex findings into written text. Locate results to help clients make better decisions. Solicit feedback from clients and build solutions based on feedback. Train end users on new reports and dashboards. Set FAST goals and provide feedback on FAST goals of repartees Measures Of Outcomes Quality - number of review comments on codes written Data consistency and data quality. Number of medium to large custom application data models designed and implemented Illustrates data graphically; translates complex findings into written text. Number of results located to help clients make informed decisions. Number of business processes changed due to vital analysis. Number of Business Intelligent Dashboards developed Number of productivity standards defined for project Number of mandatory trainings completed Outputs Expected Determine Specific Data needs: Work with departmental managers to outline the specific data needs for each business method analysis project Critical Business Insights Mines the business’s database in search of critical business insights; communicates findings to relevant departments. 
Code: Create efficient and reusable SQL code for the improvement, manipulation, and analysis of data; follow coding best practices. Create/validate data models: build statistical models; diagnose, validate, and improve the performance of these models over time. Predictive analytics: seek to determine likely outcomes by detecting tendencies in descriptive and diagnostic analysis. Prescriptive analytics: attempt to identify what business action to take. Code versioning: organize and manage changes and revisions to code using a version control tool, for example Git or Bitbucket. Create reports depicting the trends and behaviours found in analyzed data. Documentation: create documentation for work performed, and perform peer reviews of others' documentation. Manage knowledge: consume and contribute to project-related documents, SharePoint libraries, and client universities. Status reporting: report the status of assigned tasks; comply with project-related reporting standards and processes.

Skill Examples: Analytical skills: the ability to work with large amounts of data (facts, figures, and number crunching). Communication skills: communicate effectively, with the right level of detail, with a diverse population at various organizational levels. Critical thinking: review numbers, trends, and data to reach original conclusions based on the findings. Presentation skills: facilitate reports and oral presentations to senior colleagues; strong meeting facilitation and presentation skills. Attention to detail: vigilant in analysis to reach accurate conclusions. Mathematical skills to estimate numerical data. Work in a team environment; proactively ask for and offer help.

Knowledge Examples: Database languages such as SQL. Programming languages such as R or Python. Analytical tools and languages such as SAS and Mahout. Proficiency in MATLAB.
Data visualization software such as Tableau or Qlik. Proficiency in mathematics and calculations. Efficiency with spreadsheet tools such as Microsoft Excel or Google Sheets. DBMS, operating systems, and software platforms. Knowledge of the customer domain and sub-domain where the problem is solved. Additional Comments: Skill focused on end-to-end machine learning projects using AWS (Glue, SageMaker, CloudWatch) and Python/PySpark. Skills: Python, ML, SageMaker
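For candidates curious what the predictive-analytics side of this role can look like in practice, here is a minimal, dependency-free sketch of fitting a linear trend by ordinary least squares; the data and variable names are invented for illustration:

```python
# Minimal linear-trend fit: ordinary least squares for y = a + b*x,
# computed with the closed-form formulas (no external libraries).

def fit_linear_trend(xs, ys):
    """Return (intercept, slope) minimizing squared error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    slope = cov_xy / var_x
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Example: monthly figures with a clear upward trend (invented data).
months = [1, 2, 3, 4, 5, 6]
sales = [100, 112, 119, 131, 140, 152]
a, b = fit_linear_trend(months, sales)
forecast_month_7 = a + b * 7
```

In a real engagement this would be done with R, Python (statsmodels/scikit-learn), or SAS as the posting lists, but the underlying calculation is the same.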
Posted 1 month ago
7.0 - 10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: Lead Splunk Engineer Location: Gurgaon (Hybrid) Experience: 7-10 Years Employment Type: Full-time Notice Period: Immediate Joiners Preferred Job Summary: We are seeking an experienced Lead Splunk Engineer to design, deploy, and optimize SIEM solutions, with expertise in Splunk architecture, log management, and security event monitoring. The ideal candidate will have hands-on experience in Linux administration, scripting, and integrating Splunk with tools like ELK and DataDog. Key Responsibilities: ✔ Design and deploy scalable Splunk SIEM solutions (UF, HF, SH, Indexer Clusters). ✔ Optimize log collection, parsing, normalization, and retention. ✔ Ensure license and log optimization for cost efficiency. ✔ Integrate Splunk with 3rd-party tools (ELK, DataDog, etc.). ✔ Develop automation scripts (Python/Bash/PowerShell). ✔ Create technical documentation (HLD, LLD, Runbooks). Skills Required: 🔹 Expert in Splunk (Architecture, Deployment, Troubleshooting) 🔹 Strong SIEM & Log Management Knowledge 🔹 Linux/Unix Administration 🔹 Scripting (Python, Bash, PowerShell) 🔹 Experience with ELK/DataDog 🔹 Understanding of German Data Security Standards (GDPR/Data Parsimony) Why Join Us? Opportunity to work with cutting-edge security tools. Hybrid work model (Gurgaon-based). Collaborative and growth-oriented environment.
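As a rough illustration of the log parsing and normalization work this role centres on, the sketch below turns a syslog-style line into a normalized event and flags likely security events; the regex, field names, and keywords are illustrative only, not Splunk's actual CIM or configuration:

```python
import re

# Parse a syslog-style line into a normalized event dict.
# The pattern and field names here are invented for illustration.
LOG_PATTERN = re.compile(
    r"(?P<timestamp>\w{3} +\d+ [\d:]+) "
    r"(?P<host>\S+) "
    r"(?P<process>[\w\-/]+)(?:\[(?P<pid>\d+)\])?: "
    r"(?P<message>.*)"
)

def normalize(line):
    m = LOG_PATTERN.match(line)
    if not m:
        return None  # route unparsable lines to a dead-letter index
    event = m.groupdict()
    # Crude security/non-security split, as the posting describes.
    event["is_security"] = any(
        kw in event["message"].lower()
        for kw in ("failed password", "denied", "sudo")
    )
    return event

sample = "Jun  1 10:15:02 web01 sshd[4211]: Failed password for root from 10.0.0.5"
event = normalize(sample)
```

In Splunk itself this split would be done with props/transforms and index routing rather than ad-hoc code, but the conceptual step (parse, normalize, classify, route) is the same.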
Posted 1 month ago
0 years
0 Lacs
Vijayawada, Andhra Pradesh, India
On-site
About Us JOB DESCRIPTION SBI Card is a leading pure-play credit card issuer in India, offering a wide range of credit cards to cater to diverse customer needs. We are constantly innovating to meet the evolving financial needs of our customers, empowering them with digital currency for a seamless payment experience and rewarding benefits. At SBI Card, the motto 'Make Life Simple' inspires every initiative, ensuring that customer convenience is at the forefront of all that we do. We are committed to building an environment where people can thrive and create a better future for everyone. SBI Card is proud to be an equal opportunity & inclusive employer and welcomes employees without any discrimination on the grounds of race, color, gender, religion, creed, disability, sexual orientation, gender identity, marital status, caste, etc. SBI Card is committed to fostering an inclusive and diverse workplace where all employees are treated equally with dignity and respect, which makes it a promising place to work. Join us to shape the future of digital payment in India and unlock your full potential. What’s In It For YOU SBI Card truly lives by the work-life balance philosophy. We offer a robust wellness and wellbeing program to support the mental and physical health of our employees Admirable work deserves to be rewarded. We have a well-curated bouquet of rewards and recognition programs for our employees Dynamic, Inclusive and Diverse team culture Gender Neutral Policy Inclusive Health Benefits for all - Medical Insurance, Personal Accidental, Group Term Life Insurance and Annual Health Checkup, Dental and OPD benefits Commitment to the overall development of an employee through a comprehensive learning & development framework Role Purpose Responsible for the management of all collections processes for the allocated portfolio in the assigned CD/Area, based on targets set for resolution, normalization, rollback/absolute recovery and ROR.
Role Accountability Conduct timely allocation of portfolio to aligned vendors/NFTEs and conduct ongoing reviews to drive performance on the business targets through an extended team of field executives and callers Formulate tactical short term incentive plans for NFTEs to increase productivity and drive DRR Ensure various critical segments as defined by business are reviewed and performance is driven on them Ensure judicious use of hardship tools and adherence to the settlement waivers both on rate and value Conduct ongoing field visits on critical accounts and ensure proper documentation in Collect24 system of all field visits and telephone calls to customers Raise red flags in a timely manner basis deterioration in portfolio health indicators/frauds and raise timely alarms on critical incidents as per the compliance guidelines Ensure all guidelines mentioned in the SVCL are adhered to and that process hygiene is maintained at aligned agencies Ensure 100% data security using secured data transfer modes and data purging as per policy Ensure all customer complaints received are closed within time frame Conduct thorough due diligence while onboarding/offboarding/renewing a vendor and all necessary formalities are completed prior to allocating Ensure agencies raise invoices timely Monitor NFTE ACR CAPE as per the collection strategy Measures of Success Portfolio Coverage Resolution Rate Normalization/Roll back Rate Settlement waiver rate Absolute Recovery Rupee collected NFTE CAPE DRA certification of NFTEs Absolute Customer Complaints Absolute audit observations Process adherence as per MOU Technical Skills / Experience / Certifications Credit Card knowledge along with good understanding of Collection Processes Competencies critical to the role Analytical Ability Stakeholder Management Problem Solving Result Orientation Process Orientation Qualification Post-Graduate / Graduate in any discipline Preferred Industry FSI
Posted 1 month ago
1.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Description The role is for a 1-year term at Amazon. Job Description Are you interested in applying your strong quantitative analysis and big data skills to world-changing problems? Are you interested in driving the development of methods, models and systems for strategy planning, transportation and the fulfillment network? If so, then this is the job for you. Our team is responsible for creating core analytics tech capabilities, platform development and data engineering. We develop scalable analytics applications across APAC, MENA and LATAM. We standardize and optimize data sources and visualization efforts across geographies, and build up and maintain the online BI services and data mart. You will work with professional software development managers, data engineers, business intelligence engineers and product managers using rigorous quantitative approaches to ensure high quality data tech products for our customers around the world, including India, Australia, Brazil, Mexico, Singapore and the Middle East. Amazon is growing rapidly and because we are driven by faster delivery to customers, a more efficient supply chain network, and lower cost of operations, our main focus is in the development of strategic models and automation tools fed by our massive amounts of available data. You will be responsible for building these models/tools that improve the economics of Amazon’s worldwide fulfillment networks in emerging countries as Amazon increases the speed and decreases the cost to deliver products to customers. You will identify and evaluate opportunities to reduce variable costs by improving fulfillment center processes, transportation operations and scheduling, and the execution of operational plans. Major Responsibilities Include Translating business questions and concerns into specific analytical questions that can be answered with available data using BI tools; produce the required data when it is not available.
Writing SQL queries and automation scripts Ensure data quality throughout all stages of acquisition and processing, including such areas as data sourcing/collection, ground truth generation, normalization, transformation, cross-lingual alignment/mapping, etc. Communicate proposals and results in a clear manner backed by data and coupled with actionable conclusions to drive business decisions. Collaborate with colleagues from multidisciplinary science, engineering and business backgrounds. Develop efficient data querying and modeling infrastructure. Manage your own process. Prioritize and execute on high impact projects, triage external requests, and deliver projects on time. Utilizing code (SQL, Python, R, Scala, etc.) for analyzing data and building data marts Basic Qualifications 3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc. Experience with data visualization using Tableau, Quicksight, or similar tools Experience with data modeling, warehousing and building ETL pipelines Experience in statistical analysis packages such as R, SAS and MATLAB Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling Preferred Qualifications Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ASSPL - Karnataka Job ID: A2997231
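A minimal picture of the SQL-based analysis this role describes, using an in-memory SQLite database as a stand-in for Redshift or a data mart; the table, columns, and figures are invented:

```python
import sqlite3

# In-memory SQLite stand-in for the kind of aggregation query the role
# describes; table and column names are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE shipments (region TEXT, cost REAL)")
conn.executemany(
    "INSERT INTO shipments VALUES (?, ?)",
    [("APAC", 12.0), ("APAC", 8.0), ("MENA", 5.0), ("LATAM", 9.5)],
)

# Aggregate variable cost per region, highest first.
rows = conn.execute(
    """
    SELECT region, SUM(cost) AS total_cost
    FROM shipments
    GROUP BY region
    ORDER BY total_cost DESC
    """
).fetchall()
```

The same GROUP BY/ORDER BY pattern carries over directly to Redshift or any other SQL warehouse; only the connection layer changes.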
Posted 1 month ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. Primary Responsibilities Support the full data engineering lifecycle including research, proof of concepts, design, development, testing, deployment, and maintenance of data management solutions Utilize knowledge of various data management technologies to drive data engineering projects Working with Operations and Product Development staff to support applications/processes to facilitate the effective and efficient implementation/migration of new clients' healthcare data through the Optum Impact Product Suite Lead data acquisition efforts to gather data from various structured or semi-structured source systems of record to hydrate client data warehouse and power analytics across numerous health care domains Leverage combination of ETL/ELT methodologies to pull complex relational and dimensional data to support loading DataMart’s and reporting aggregates Eliminate unwarranted complexity and unneeded interdependencies Detect data quality issues, identify root causes, implement fixes, and manage data audits to mitigate data challenges Implement, modify, and maintain data integration efforts that improve data efficiency, reliability, and value Leverage and facilitate the evolution of best practices for data acquisition, transformation, storage, and aggregation that solve current challenges and reduce the risk of future challenges Effectively create data transformations that address business 
requirements and other constraints Partner with the broader analytics organization to make recommendations for changes to data systems and the architecture of data platforms Prepare high level design documents and detailed technical design documents with best practices to enable efficient data ingestion, transformation and data movement Leverage DevOps tools to enable code versioning and code deployment Leverage data pipeline monitoring tools to detect data integrity issues before they result in user-visible outages or data quality issues Leverage processes and diagnostic tools to troubleshoot, maintain and optimize solutions and respond to customer and production issues Continuously support technical debt reduction, process transformation, and overall optimization Leverage and contribute to the evolution of standards for high quality documentation of data definitions, transformations, and processes to ensure data transparency, governance, and security Ensure that all solutions meet the business needs and requirements for security, scalability, and reliability Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment).
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications Bachelor’s Degree (preferably in information technology, engineering, math, computer science, analytics, engineering or other related field) 5+ years of combined experience in data engineering, ingestion, normalization, transformation, aggregation, structuring, and storage 5+ years of combined experience working with industry standard relational, dimensional or non-relational data storage systems 5+ years of experience in designing ETL/ELT solutions using tools like Informatica, DataStage, SSIS , PL/SQL, T-SQL, etc. 5+ years of experience in managing data assets using SQL, Python, Scala, VB.NET or other similar querying/coding language 3+ years of experience working with healthcare data or data to support healthcare organizations Preferred Qualifications 5+ years of experience in creating Source to Target Mappings and ETL design for integration of new/modified data streams into the data warehouse/data marts Experience in Unix or Powershell or other batch scripting languages Experience supporting data pipelines that power analytical content within common reporting and business intelligence platforms (e.g. Power BI, Qlik, Tableau, MicroStrategy, etc.) 
Experience supporting analytical capabilities inclusive of reporting, dashboards, extracts, BI tools, analytical web applications and other similar products Experience contributing to cross-functional efforts with proven success in creating healthcare insights Experience and credibility interacting with analytics and technology leadership teams Depth of experience and proven track record creating and maintaining sophisticated data frameworks for healthcare organizations Exposure to Azure, AWS, or Google Cloud ecosystems Exposure to Amazon Redshift, Amazon S3, Hadoop HDFS, Azure Blob, or similar big data storage and management components Demonstrated desire to continuously learn and seek new options and approaches to business challenges Willingness to leverage best practices, share knowledge, and improve the collective work of the team Demonstrated ability to effectively communicate concepts verbally and in writing Demonstrated awareness of when to appropriately escalate issues/risks Demonstrated excellent communication skills, both written and verbal At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
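To make the ETL/ELT lifecycle described above concrete, here is a toy extract-transform-load sketch with a simple data-quality gate; the source data, column names, and quality rule are all invented examples, not Optum's actual pipeline:

```python
import csv
import io
import sqlite3

# Tiny extract-transform-load sketch in the spirit of the lifecycle above.
# Source data, column names, and the quality rule are invented examples.
raw = """member_id,visit_date,charge
1001,2024-01-05,250.00
1002,2024-01-06,
1003,2024-01-07,75.50
"""

# Extract: read the source records.
records = list(csv.DictReader(io.StringIO(raw)))

# Transform: apply a data-quality rule (reject rows with a missing charge).
clean, rejected = [], []
for r in records:
    (clean if r["charge"].strip() else rejected).append(r)

# Load: write the clean rows to the target store.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE claims (member_id INTEGER, visit_date TEXT, charge REAL)")
db.executemany(
    "INSERT INTO claims VALUES (:member_id, :visit_date, :charge)", clean
)
loaded = db.execute("SELECT COUNT(*), SUM(charge) FROM claims").fetchone()
```

Production tooling (Informatica, DataStage, SSIS) wraps the same three phases in managed, monitored workflows; the rejected rows here model the "detect data quality issues, identify root causes" responsibility.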
Posted 1 month ago
9.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Company Description ShuffleLabs is a trusted partner for integration solutions, delivering excellent products and exceptional customer service. The flagship platform ShuffleExchange connects applications using APIs, simplifies integration development with visual tools, and makes implementation easy. ShuffleExchange provides simplicity and ease of use for complex integration needs. Role Description This is a full-time on-site role in Chennai for a Technical Lead at ShuffleLabs. The Technical Lead will be responsible for leading technical teams, designing and implementing software solutions, providing technical guidance, and ensuring project success through effective leadership and problem-solving. Roles & Responsibilities Architect and deliver iPaaS solutions that enable seamless integration between cloud, on-premise, and hybrid systems using modern technologies and enterprise best practices. Lead end-to-end solution design, ensuring robust data integration patterns (event-driven, pub-sub, messaging queues, ETL, API orchestration). Collaborate with cross-functional teams, product managers, and integration engineers to define technical roadmaps and align architecture with business goals. Incorporate AI/ML components into integration workflows for intelligent automation, data transformation, anomaly detection, and predictive routing. Define and implement scalable cloud integration strategies leveraging Microsoft Azure services (Logic Apps, Azure Functions, Event Grid, API Management). Guide integration architecture design, performance optimization, security protocols (OAuth2, OpenID Connect, API security), and SLA compliance. Evaluate and select iPaaS tools, integration frameworks, and middleware platforms aligned with customer needs and future scalability. Mentor development teams, enforce engineering standards, lead code and design reviews, and promote DevOps and CI/CD best practices. 
Produce architecture artifacts, technical specifications, white papers, and reusable design patterns for internal and client use. Lead cloud migration initiatives and legacy modernization efforts, especially for enterprise integration systems. Experience with hybrid and multi-cloud integration scenarios Core Technical Skills iPaaS / Integration Expertise: Integration patterns: RESTful/SOAP APIs, file-based transfers, message queues, ETL pipelines, webhooks Experience with iPaaS platforms: Azure Logic Apps , MuleSoft , Dell Boomi , Informatica , SnapLogic , or Workato (experience with Azure preferred) Strong knowledge of Azure Integration Services : Logic Apps, API Management, Service Bus, Event Grid, Azure Functions, Data Factory Microsoft Stack: .NET Core / .NET 6+, ASP.NET MVC, C#, Entity Framework, Web API Azure PaaS services (App Services, Azure SQL, Azure Cosmos DB, Azure Blob Storage) Experience with database design, normalization, and data modeling in SQL Server environments Power Platform (Power Automate, Power BI, Power Apps) knowledge is a plus DevOps & Cloud Infrastructure: Azure DevOps, CI/CD pipelines, ARM/Bicep templates, GitHub Actions, Jenkins Kubernetes (AKS), Docker, Terraform, Ansible Azure Monitoring, App Insights, and Log Analytics for integration health checks and diagnostics AI & Intelligent Integration Skills Integrate AI into iPaaS pipelines using Azure Cognitive Services , OpenAI APIs , or ML.NET for: Data extraction/classification from unstructured sources (OCR, NLP) Smart routing and workflow recommendations Predictive analytics and anomaly detection Intelligent chatbots and virtual assistants for API support Experience with AI-enriched integration use cases: RAG pipelines, auto-mapping schemas, adaptive decision-making, intelligent data validation Familiarity with vector databases (e.g., Pinecone, FAISS) and semantic search to enhance knowledge workflows Understanding of MLOps practices and integration of AI lifecycle into CI/CD 
pipelines Contribution to open-source integration or AI tools Soft Skills & Leadership Strategic thinker with deep technical expertise and strong business acumen in the iPaaS space Excellent communication skills with the ability to translate complex technical concepts into business terms Collaborative leader experienced in mentoring cross-functional engineering teams Strong problem-solving, prioritization, and project planning abilities in fast-paced, evolving environments Qualifications & Experience M.C.A. / B.E. / B. Tech in Computer Science or related field from a reputed institute Minimum 9+ years of experience in IT, with 3+ years in architectural or lead roles focused on integration technologies Proven experience designing and deploying enterprise-scale integration solutions using Microsoft Azure Experience implementing AI-driven features in cloud or integration platforms is highly desirable Certifications in Microsoft Azure (Architect, DevOps Engineer, AI Engineer) or iPaaS platforms are a strong plus
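The event-driven and pub-sub integration patterns listed in this posting can be reduced to a very small in-process sketch; the broker below is illustrative only, not a stand-in for Azure Service Bus or any iPaaS product, and the topic names are invented:

```python
from collections import defaultdict

# Minimal in-process publish/subscribe broker illustrating the event-driven
# integration pattern; topic and event names are invented for illustration.
class Broker:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Fan out each event to every handler registered on the topic.
        for handler in self._subscribers[topic]:
            handler(event)

broker = Broker()
received = []
broker.subscribe("order.created", received.append)
broker.subscribe("order.created", lambda e: received.append({"audit": e["id"]}))

broker.publish("order.created", {"id": 42, "total": 99.0})
```

Real integration platforms add durability, retries, and dead-lettering on top of this fan-out core, which is why managed services like Service Bus or Event Grid are used in practice.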
Posted 1 month ago
7.0 - 10.0 years
2 - 3 Lacs
Gurgaon
On-site
Experience: 7 - 10 Years Location: GURGAON/ HYBRID MODE CTC TO BE OFFERED: Mention Your Current & Expected CTC Notice Period: IMMEDIATE TO 30 DAYS KeySkills: SPLUNK, SIEM DOMAIN, BACKEND OPERATIONS, UF, HF, SH, INDEXER CLUSTER, LOG MANAGEMENT, LOG COLLECTION, PARSING, NORMALIZATION, RETENTION PRACTICES, LOGS/LICENSE OPTIMIZATION, DESIGNING, DEPLOYMENT & IMPLEMENTATION, DATA PARSIMONY, GERMAN DATA SECURITY STANDARDS, SPLUNK LOGGING INFRASTRUCTURE, OBSERVABILITY TOOLS, ELK, DATADOG, NETWORK ARCHITECTURE, LINUX ADMINISTRATION, SYSLOG, PYTHON, POWERSHELL, OR BASH, OEM SIEM, HLD, LLD, IMPLEMENTATION GUIDE, OPERATION MANUALS Job Description: As Lead Splunk, your role and responsibilities would include: Hands-on experience in the SIEM domain Expert knowledge of Splunk backend operations (UF, HF, SH and Indexer Cluster) and architecture Expert knowledge of Log Management and Splunk SIEM Understanding of log collection, parsing, normalization, and retention practices Expert in logs/license optimization techniques and strategy Good understanding of designing, deployment & implementation of a scalable SIEM architecture Understanding of data parsimony as a concept, especially in terms of German data security standards Working knowledge of integration of Splunk logging infrastructure with 3rd-party observability tools (e.g. ELK, DataDog, etc.) Experience in identifying security and non-security logs and applying adequate filters/re-routing the logs accordingly Expert in understanding the network architecture and identifying the components of impact Expert in Linux administration Proficient in working with Syslog Proficiency in scripting languages like Python, PowerShell, or Bash to automate tasks Expertise with OEM SIEM tools, preferably Splunk Experience with open-source SIEM/log storage solutions like ELK or Datadog Very good with documentation of HLD, LLD, Implementation Guide and Operation Manuals Note: (i) Our client is looking for immediate & early joiners.
(ii) A LinkedIn profile is a must. (iii) As this is an immediate, high-priority requirement, interested candidates may share their resumes, with photograph, in Word (.doc) format.
Posted 1 month ago
7.0 - 9.0 years
6 - 10 Lacs
Noida
On-site
Are you our “TYPE”? Monotype (Global) Named "One of the Most Innovative Companies in Design" by Fast Company, Monotype brings brands to life through type and technology that consumers engage with every day. The company's rich legacy includes a library that can be traced back hundreds of years, featuring famed typefaces like Helvetica, Futura, Times New Roman and more. Monotype also provides a first-of-its-kind service that makes fonts more accessible for creative professionals to discover, license, and use in our increasingly digital world. We work with the biggest global brands, and with individual creatives, offering a wide set of solutions that make it easier for them to do what they do best: design beautiful brand experiences. Monotype Solutions India Monotype Solutions India is a strategic center of excellence for Monotype and is a certified Great Place to Work® three years in a row. The focus of this fast-growing center spans Product Development, Product Management, Experience Design, User Research, Market Intelligence, Research in areas of Artificial Intelligence and Machine learning, Innovation, Customer Success, Enterprise Business Solutions, and Sales. Headquartered in the Boston area of the United States and with offices across 4 continents, Monotype is the world’s leading company in fonts. It’s a trusted partner to the world’s top brands.
We are looking for problem solvers to help us build next-generation features, products, and services. You will work closely with a cross-functional team of engineers on microservices and event-driven architectures. You are expected to contribute to the architecture, design, and development of new features, identify technical risks and find alternate solutions to various problems. In addition, the role also demands leading, motivating and mentoring other team members with respect to technical challenges. What we’re looking for: Bachelor’s or Master’s degree in Computer Science, Information Systems, or related field. Minimum 7-9 years of professional experience, with at least 5 years specifically in data architecture. Proven experience designing and implementing data models, including ER diagrams, dimensional modeling, and normalization techniques. Strong expertise in relational databases (SQL Server, Oracle, PostgreSQL) and NoSQL databases (MongoDB, Cassandra). Proficiency with data modeling tools such as ERwin, PowerDesigner, or similar tools. Knowledge of cloud data platforms and services (AWS, Azure, GCP). Strong analytical and problem-solving skills, with the ability to provide creative and innovative solutions. Excellent communication and stakeholder management abilities. You will have an opportunity to: ✔ COLLABORATE with global teams to build scalable web-based applications. ✔ PARTNER closely with the engineering team to follow best practices and standards. ✔ PROVIDE reliable solutions to a variety of problems using sound problem-solving techniques. ✔ WORK with the broader team to build and maintain high performance, flexible, and highly scalable web-based applications. ✔ ACHIEVE engineering excellence by implementing standard practices and standards.
✔ PERFORM technical root cause analysis and outline corrective actions for given problems. What’s in it for you Hybrid work arrangements and competitive paid time off programs. Comprehensive medical insurance coverage to meet all your healthcare needs. Competitive compensation with corporate bonus program & uncapped commission for quota-carrying Sales A creative, innovative, and global working environment in the creative and software technology industry Highly engaged Events Committee to keep work enjoyable. Reward & Recognition Programs (including President's Club for all functions) Professional onboarding program, including robust targeted training for Sales function Development and advancement opportunities (high internal mobility across organization) Retirement planning options to save for your future, and so much more! Monotype is an Equal Opportunities Employer. Qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, national origin, sexual orientation, gender identity, disability or protected veteran status.
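For readers unfamiliar with the dimensional modeling this posting asks for, here is a tiny star-schema sketch: one fact table joined to a dimension table by a surrogate key. The table names and figures are invented for illustration, using SQLite in place of an enterprise warehouse:

```python
import sqlite3

# Sketch of a dimensional (star-schema) model: a fact table joined to a
# dimension table by a surrogate key. Names and numbers are invented.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE dim_font (font_key INTEGER PRIMARY KEY, font_name TEXT);
CREATE TABLE fact_license (font_key INTEGER, licenses INTEGER);
INSERT INTO dim_font VALUES (1, 'Helvetica'), (2, 'Futura');
INSERT INTO fact_license VALUES (1, 120), (2, 45), (1, 30);
""")

# Roll up the fact rows by the dimension attribute.
rows = db.execute("""
    SELECT d.font_name, SUM(f.licenses) AS total
    FROM fact_license f
    JOIN dim_font d ON d.font_key = f.font_key
    GROUP BY d.font_name
    ORDER BY total DESC
""").fetchall()
```

The same shape (narrow fact table, descriptive dimensions, surrogate-key joins) is what tools like ERwin or PowerDesigner are used to model at scale.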
Posted 1 month ago
0 years
2 - 4 Lacs
Vadodara
On-site
Are you passionate about data, performance tuning, and writing efficient SQL? Join our growing team where you’ll work on exciting projects and contribute to maintaining high-performance, scalable database systems. -What we’re looking for: -Strong SQL skills - Experience with SQL Server / PostgreSQL / MySQL. -Understanding of normalization, indexing, and query optimization. -Advanced query-writing skills. - Knowledge of database backup, recovery & security - Basic Linux/Unix scripting (a plus) -Exposure to cloud platforms like AWS RDS or Google Cloud SQL (bonus!) -Location: Vadodara -Apply here: khushirai@blueboxinfosoft.com. Let’s build smarter systems together! Job Type: Full-time Pay: ₹200,000.00 - ₹400,000.00 per year Benefits: Paid sick time Schedule: Day shift Monday to Friday Work Location: In person
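A quick sketch of the indexing and query-optimization skills this posting asks for, using SQLite's EXPLAIN QUERY PLAN to show the same lookup before and after adding an index; the table, column, and index names are illustrative:

```python
import sqlite3

# Indexing sketch: the same lookup with and without an index, inspected
# via SQLite's EXPLAIN QUERY PLAN. Names and data are illustrative.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER)")
db.executemany("INSERT INTO orders (customer_id) VALUES (?)",
               [(i % 50,) for i in range(1000)])

def plan(sql):
    # The fourth column of each plan row is the human-readable detail.
    return " ".join(row[3] for row in db.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM orders WHERE customer_id = 7"
before = plan(query)  # full table scan
db.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(query)   # index lookup
```

The same habit, reading the optimizer's plan before and after a change, carries over to SQL Server, PostgreSQL, and MySQL, each of which has its own EXPLAIN variant.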
Posted 1 month ago
1.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
JD for DM resource

Job Title: SQL Developer (1 Year Experience)
Location: Noida
Job Type: Full-time
Experience Required: 1 Year
Department: Insurance Product & Platforms

Job Summary
We are seeking a motivated and detail-oriented SQL Developer with 1 year of professional experience to join our data team. The ideal candidate will be responsible for writing queries, optimizing performance, managing databases, and supporting application development with efficient SQL solutions.

Key Responsibilities
Develop, test, and maintain SQL queries, stored procedures, and scripts to support applications and reporting needs.
Work closely with developers, data analysts, and business users to gather requirements and deliver solutions.
Optimize SQL queries for performance and scalability.
Assist in maintaining data integrity and security across multiple databases.
Monitor database performance and troubleshoot issues as they arise.
Generate reports and data extracts as required by business units.
Perform data validation and cleansing as part of data migration or integration projects.
Collaborate in database design and normalization.

Qualifications
Bachelor’s degree in Computer Science, Information Technology, or a related field (or equivalent experience).
1 year of hands-on experience working with SQL databases (such as MS SQL Server, MySQL, PostgreSQL, or Oracle).
Strong understanding of relational database concepts.
Basic experience with ETL tools or scripting (nice to have).
Good problem-solving and communication skills.
Experience with reporting tools (e.g., Power BI, SSRS) is a plus.

Technical Skills
Proficient in writing and debugging SQL queries, joins, subqueries, views, triggers, and stored procedures.
Familiar with database performance tuning techniques.
Understanding of database security and backup procedures.
Exposure to version control systems like Git is an advantage.

Soft Skills
Attention to detail and a strong desire to learn.
Ability to work independently and in a team environment.
Strong communication and documentation skills.
Analytical thinking and a structured approach to problem-solving.

Preferred Certifications (optional)
Cloud Certification (if any)
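The core skills this listing names (joins, views, aggregation for reporting) can be sketched in a minimal, self-contained example. This uses Python's built-in sqlite3 module for illustration only; the schema, table names, and data are all invented, and a production role like this would target MS SQL Server, MySQL, PostgreSQL, or Oracle.

```python
import sqlite3

# Hypothetical insurance-flavored schema to illustrate joins, aggregation,
# and views -- the kind of reporting objects the JD describes.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE policies (policy_id INTEGER PRIMARY KEY, holder TEXT, premium REAL);
CREATE TABLE claims (claim_id INTEGER PRIMARY KEY, policy_id INTEGER REFERENCES policies, amount REAL);
INSERT INTO policies VALUES (1, 'Asha', 1200.0), (2, 'Ravi', 800.0);
INSERT INTO claims VALUES (10, 1, 300.0), (11, 1, 450.0), (12, 2, 100.0);
-- A view that joins and aggregates, suitable for a recurring report.
CREATE VIEW claim_summary AS
SELECT p.holder, COUNT(c.claim_id) AS n_claims, SUM(c.amount) AS total
FROM policies p JOIN claims c ON c.policy_id = p.policy_id
GROUP BY p.holder;
""")
rows = cur.execute(
    "SELECT holder, n_claims, total FROM claim_summary ORDER BY holder"
).fetchall()
print(rows)  # [('Asha', 2, 750.0), ('Ravi', 1, 100.0)]
```

The same view pattern keeps report queries short and centralizes the join logic, which is useful when several business units consume the same extract.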
Posted 1 month ago
7.0 - 9.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Are you our “TYPE”?

Monotype (Global)
Named "One of the Most Innovative Companies in Design" by Fast Company, Monotype brings brands to life through type and technology that consumers engage with every day. The company's rich legacy includes a library that can be traced back hundreds of years, featuring famed typefaces like Helvetica, Futura, Times New Roman, and more. Monotype also provides a first-of-its-kind service that makes fonts more accessible for creative professionals to discover, license, and use in our increasingly digital world. We work with the biggest global brands, and with individual creatives, offering a wide set of solutions that make it easier for them to do what they do best: design beautiful brand experiences.

Monotype Solutions India
Monotype Solutions India is a strategic center of excellence for Monotype and has been certified a Great Place to Work® three years in a row. The focus of this fast-growing center spans Product Development, Product Management, Experience Design, User Research, Market Intelligence, Research in areas of Artificial Intelligence and Machine Learning, Innovation, Customer Success, Enterprise Business Solutions, and Sales. Headquartered in the Boston area of the United States and with offices across four continents, Monotype is the world’s leading company in fonts and a trusted partner to the world’s top brands.
We are looking for problem solvers to help us build next-generation features, products, and services. You will work closely with a cross-functional team of engineers on microservices and event-driven architectures. You are expected to contribute to the architecture, design, and development of new features, identify technical risks, and find alternate solutions to various problems. The role also demands leading, motivating, and mentoring other team members on technical challenges.

What We’re Looking For
Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
Minimum 7-9 years of professional experience, with at least 5 years specifically in data architecture.
Proven experience designing and implementing data models, including ER diagrams, dimensional modeling, and normalization techniques.
Strong expertise in relational databases (SQL Server, Oracle, PostgreSQL) and NoSQL databases (MongoDB, Cassandra).
Proficiency with data modeling tools such as ERwin, PowerDesigner, or similar.
Knowledge of cloud data platforms and services (AWS, Azure, GCP).
Strong analytical and problem-solving skills, with the ability to provide creative and innovative solutions.
Excellent communication and stakeholder management abilities.

You Will Have An Opportunity To
✔ COLLABORATE with global teams to build scalable web-based applications.
✔ PARTNER closely with the engineering team to follow best practices and standards.
✔ PROVIDE reliable solutions to a variety of problems using sound problem-solving techniques.
✔ WORK with the broader team to build and maintain high-performance, flexible, and highly scalable web-based applications.
✔ ACHIEVE engineering excellence by implementing standard practices and standards.
✔ PERFORM technical root cause analysis and outline corrective actions for given problems.

What’s in it for you
Hybrid work arrangements and competitive paid time off programs.
Comprehensive medical insurance coverage to meet all your healthcare needs.
Competitive compensation with a corporate bonus program and uncapped commission for quota-carrying Sales roles.
A creative, innovative, and global working environment in the creative and software technology industry.
Highly engaged Events Committee to keep work enjoyable.
Reward & Recognition Programs (including President's Club for all functions).
Professional onboarding program, including robust targeted training for the Sales function.
Development and advancement opportunities (high internal mobility across the organization).
Retirement planning options to save for your future, and so much more!

Monotype is an Equal Opportunities Employer. Qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, national origin, sexual orientation, gender identity, disability, or protected veteran status.
Posted 1 month ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Make an impact with NTT DATA
Join a company that is pushing the boundaries of what is possible. We are renowned for our technical excellence and leading innovations, and for making a difference to our clients and society. Our workplace embraces diversity and inclusion – it’s a place where you can grow, belong and thrive.

Your day at NTT DATA
The Security Managed Services Engineer (L1) is an entry-level engineering role, responsible for providing a managed service to clients to ensure that their firewall infrastructure remains operational by proactively identifying, investigating, and routing incidents to the correct resolver group. The primary objective of this role is to ensure zero missed service level agreement (SLA) conditions, with a focus on first-line support for standard and low-complexity incidents and service requests. The Security Managed Services Engineer (L1) may also contribute to or support project work as and when required.

What you'll be doing
SOC Analyst responsibilities:
Configure and maintain the SIEM system, ensuring that it is properly set up to collect and analyze security event data.
Develop, customize, and manage security rules within the SIEM to detect and respond to security threats.
Monitor SIEM alerts, investigate them, and take appropriate actions based on the severity and nature of the alerts.
Oversee the collection, normalization, and storage of log data from various sources.
Develop and document incident response procedures, and lead or assist in incident response efforts when security incidents occur.
Analyze and investigate security events from various sources.
Manage security incidents through all incident response phases to closure.
Utilize SIEM, SOAR, UEBA, EDR, NBAD, Splunk, PCAP, vulnerability scanning, and malware analysis technologies for event detection and analysis.
Update tickets, write incident reports, and document actions to reduce false positives.
Develop knowledge of attack types and fine-tune detection capabilities.
Identify log sources and examine system logs to reconstruct event histories using forensic techniques.
Align SIEM rules and alerts with the LIC’s security policies and compliance requirements.
Conduct computer forensic investigations, including examining running processes, identifying network connections, and disk imaging.
Maintain and support the operational integrity of SOC toolsets.
Collaborate with SIEM solution vendors for updates, patches, and support to ensure the system's reliability and effectiveness.
Maintain thorough documentation of the SIEM system's configuration, procedures, and incident response plans.
Proactively identify and report system security loopholes, infringements, and vulnerabilities to the Security Operations Centre Manager in a timely manner.
Work closely with other IT and security teams during incident response, coordinating efforts and sharing information to mitigate security incidents effectively.
Ensure that the SIEM system helps the LIC meet regulatory compliance requirements and is ready for security audits.
Continuously optimize the SIEM system for efficient performance, ensuring it can handle the volume of data and remain responsive.
Develop automation scripts and workflows to streamline common security response tasks and enhance efficiency.

Certification: Valid CEH certificate required
Workplace type: On-site Working

About NTT DATA
NTT DATA is a $30+ billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. We invest over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies.
Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure, and connectivity. We are also one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group and headquartered in Tokyo.

Equal Opportunity Employer
NTT DATA is proud to be an Equal Opportunity Employer with a global culture that embraces diversity. We are committed to providing an environment free of unfair discrimination and harassment. We do not discriminate based on age, race, colour, gender, sexual orientation, religion, nationality, disability, pregnancy, marital status, veteran status, or any other protected category. Join our growing global team and accelerate your career with us. Apply today.
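The alert-triage duty described above (monitor events, investigate, escalate by severity) can be sketched in a few lines of plain Python. The log format, field names, and threshold below are invented for illustration; in practice this logic lives in the SIEM's correlation rules, not in ad-hoc scripts.

```python
from collections import Counter

# Toy auth events in an invented format: timestamp, outcome, user, source IP.
RAW_EVENTS = [
    "2024-05-01T10:00:01 FAIL user=admin src=10.0.0.5",
    "2024-05-01T10:00:02 FAIL user=admin src=10.0.0.5",
    "2024-05-01T10:00:03 OK   user=asha  src=10.0.0.7",
    "2024-05-01T10:00:04 FAIL user=root  src=10.0.0.5",
]

def triage(events, threshold=3):
    """Return source IPs whose failed-login count meets the threshold."""
    failures = Counter(
        line.split("src=")[1] for line in events if " FAIL " in line
    )
    return sorted(ip for ip, n in failures.items() if n >= threshold)

print(triage(RAW_EVENTS))  # ['10.0.0.5']
```

Counting failures per source and escalating only above a threshold is the same shape as a SIEM correlation rule, and it is one way an L1 analyst reduces false positives before routing an incident.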
Posted 1 month ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role: Developer
Required Technical Skill Set: BMC Remedy
Desired Experience Range: 5+ years
Location of Requirement: Hyderabad

Desired Competencies (Technical/Behavioral Competency)
Must-Have
· Exposure to the latest version of BMC ITSM (preferably 20.x); good understanding of the Helix platform.
· BMC Analytics and migration experience.
· Good working experience with Smart IT, DWP(A), and Smart Reporting.
· Experience in BMC Remedy AR System workflow development as per the provided design, and in debugging workflows.
· Experience with Active Links, Filters, Escalations, Web Services, and the creation and debugging of all form objects.
· Experience with advanced AR configurations such as Server Group, DSO, Load Balancer, and thread settings.
· Installation and configuration experience with AR/Atrium/ITSM is highly desirable.
· Strong understanding of the permission model is a must (User/Group/Role concepts).
· Experience in BMC ITSM (Incident, Problem, Change & Release, Asset, Knowledge Management).
· Experience with BMC Atrium Core products (CMDB, Product Catalogue, Atrium Integrator).
· Good understanding of the CMDB class structure (Common Data Model, CDM) and reconciliation concepts; experience with other features such as the Normalization Engine, Atrium Impact Simulator, and Service Catalogue is desirable.
· CMDB and AR Java or C API knowledge is desirable.
· Knowledge of UNIX and basic Oracle and SQL scripts.
Posted 1 month ago
6.0 - 8.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
We are looking for a results-driven Senior Data Engineer to join our engineering team. The ideal candidate will have hands-on expertise in data pipeline development, cloud infrastructure, and BI support, with a strong command of modern data stacks. You'll be responsible for building scalable ETL/ELT workflows, managing data lakes and marts, and enabling seamless data delivery to analytics and business intelligence teams. This role requires deep technical know-how in PostgreSQL, Python scripting, Apache Airflow, AWS or other cloud environments, and a working knowledge of modern data and BI tools.

Responsibilities
Design and optimize complex SQL queries, stored procedures, and indexes.
Perform performance tuning and query plan analysis.
Contribute to schema design and data normalization.
Migrate data from multiple sources to cloud or ODS platforms.
Design schema mapping and implement transformation logic.
Ensure consistency, integrity, and accuracy in migrated data.
Build automation scripts for data ingestion, cleansing, and transformation.
Handle file formats (JSON, CSV, XML), REST APIs, and cloud SDKs (e.g., Boto3).
Maintain reusable script modules for operational pipelines.
Develop and manage DAGs for batch/stream workflows.
Implement retries, task dependencies, notifications, and failure handling.
Integrate Airflow with cloud services, data lakes, and data warehouses.
Manage data storage (S3, GCS, Blob), compute services, and data pipelines.
Set up permissions, IAM roles, encryption, and logging for security.
Monitor and optimize the cost and performance of cloud-based data operations.
Design and manage data marts using dimensional models.
Build star/snowflake schemas to support BI and self-serve analytics.
Enable incremental load strategies and partitioning.
Work with tools like DBT, Fivetran, Redshift, Snowflake, BigQuery, or Kafka.
Support modular pipeline design and metadata-driven frameworks.
Ensure high availability and scalability of the stack.
Collaborate with BI teams to design datasets and optimize queries.
Support the development of dashboards and reporting layers.
Manage access, data refreshes, and performance for BI tools.

Requirements
6-8 years of hands-on experience in data engineering roles.
Strong SQL skills in PostgreSQL (tuning, complex joins, procedures).
Advanced Python scripting skills for automation and ETL.
Proven experience with Apache Airflow (custom DAGs, error handling).
Solid understanding of cloud architecture (especially AWS).
Experience with data marts and dimensional data modeling.
Exposure to modern data stack tools (DBT, Kafka, Snowflake, etc.).
Familiarity with BI tools like Power BI, Apache Superset, or Supertech BI.
Version control (Git) and CI/CD pipeline knowledge are a plus.
Excellent problem-solving and communication skills.

This job was posted by Suryansh Singh Karchuli from ShepHertz Technologies. Interested candidates can apply directly at Talent.acquisition@shephertz.com
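The Airflow responsibilities above (DAGs, task dependencies, retries) boil down to running tasks in dependency order with failure handling. The toy runner below sketches that idea in dependency-free Python; it is not the Airflow API, and the task names and pipeline shape are invented.

```python
def run_pipeline(tasks, deps, max_retries=2):
    """Run callables in dependency order, retrying each failure up to max_retries."""
    done, order = set(), []

    def run(name):
        if name in done:
            return
        for parent in deps.get(name, []):   # run upstream tasks first
            run(parent)
        for attempt in range(max_retries + 1):
            try:
                tasks[name]()
                break
            except Exception:
                if attempt == max_retries:
                    raise                    # exhausted retries: fail the run
        done.add(name)
        order.append(name)

    for name in tasks:
        run(name)
    return order

log = []
tasks = {
    "load":      lambda: log.append("load"),
    "transform": lambda: log.append("transform"),
    "extract":   lambda: log.append("extract"),
}
deps = {"transform": ["extract"], "load": ["transform"]}
print(run_pipeline(tasks, deps))  # ['extract', 'transform', 'load']
```

Airflow adds scheduling, per-task state, and operator integrations on top of exactly this dependency-and-retry core, which is why "custom DAGs, error handling" appears as a single requirement.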
Posted 1 month ago
7.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: Sr. Splunk Architect
Experience: 7+ years
Location: Gurgaon (Hybrid)
Notice Period: Immediate joiner / serving notice

Responsibilities
As Lead Splunk, your role and responsibilities would include:
Hands-on experience in the SIEM domain.
o Expert knowledge of Splunk backend operations (UF, HF, SH, and Indexer Cluster) and architecture.
o Expert knowledge of log management and Splunk SIEM; understanding of log collection, parsing, normalization, and retention practices.
o Expert in logs/license optimization techniques and strategy.
o Good understanding of the design, deployment, and implementation of a scalable SIEM architecture.
o Understanding of data parsimony as a concept, especially in terms of German data security standards.
o Working knowledge of integrating Splunk logging infrastructure with third-party observability tools (e.g., ELK, Datadog).
o Experience in identifying security and non-security logs and applying adequate filters / re-routing the logs accordingly.
o Expert in understanding the network architecture and identifying the components of impact.
o Expert in Linux administration.
o Proficient in working with Syslog.
o Proficiency in scripting languages like Python, PowerShell, or Bash to automate tasks.
o Expertise with OEM SIEM tools, preferably Splunk; experience with open-source SIEM/log storage solutions like ELK or Datadog.
o Very good with documentation of HLD, LLD, implementation guides, and operation manuals.
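The "identify security and non-security logs and re-route accordingly" point above is the heart of license optimization: only index what matters. The sketch below illustrates the idea in plain Python with invented keyword markers; in a real Splunk deployment this filtering would be expressed in props.conf/transforms.conf on a heavy forwarder rather than in application code.

```python
# Invented markers for what counts as a "security" log line in this sketch.
SECURITY_MARKERS = ("sshd", "sudo", "auth", "firewall")

def route(lines):
    """Split raw syslog lines into security (index) vs. non-security (drop/re-route)."""
    security, other = [], []
    for line in lines:
        bucket = security if any(m in line for m in SECURITY_MARKERS) else other
        bucket.append(line)
    return security, other

lines = [
    "May  1 10:00:00 host sshd[42]: Failed password for root",
    "May  1 10:00:01 host cron[7]: job started",
    "May  1 10:00:02 host sudo: asha : TTY=pts/0",
]
sec, rest = route(lines)
print(len(sec), len(rest))  # 2 1
```

Dropping or re-routing the non-security bucket before indexing is what keeps daily license volume down without losing the events the SIEM rules actually need.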
Posted 1 month ago
0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Work Level: Individual
Core: Responsible
Leadership: Team Alignment
Industry Type: Information Technology
Function: Database Administrator
Key Skills: MySQL, SQL Writing, PL/SQL
Education: Graduate

Note: This is a requirement for one of the Workassist hiring partners.

Requirements
Currently pursuing or recently completed a degree in Computer Science, Information Technology, or a related field.
Strong understanding of SQL and relational database concepts.
Experience with databases such as MySQL, PostgreSQL, SQL Server, or Oracle.
Ability to write efficient and optimized SQL queries.
Basic knowledge of indexing, stored procedures, and triggers.
Understanding of database normalization and design principles.
Good analytical and problem-solving skills.
Ability to work independently and in a team in a remote setting.

Company Description
Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000 recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job-seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: There are many more opportunities apart from this on the portal. Depending on your skills, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
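Two items from the requirements above, indexing and triggers, can be shown concretely. This uses Python's built-in sqlite3 for portability; the schema and column names are invented, and the syntax differs slightly from MySQL or Oracle PL/SQL.

```python
import sqlite3

# Invented schema: an index to speed customer lookups, and a trigger that
# writes an audit row on every insert -- the basics the listing asks for.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL);
CREATE INDEX idx_orders_customer ON orders(customer);
CREATE TABLE audit (order_id INTEGER, note TEXT);
-- Trigger: record an audit entry whenever an order is inserted.
CREATE TRIGGER trg_order_audit AFTER INSERT ON orders
BEGIN
    INSERT INTO audit VALUES (NEW.id, 'created');
END;
INSERT INTO orders VALUES (1, 'Asha', 99.0);
""")
audit_rows = cur.execute("SELECT * FROM audit").fetchall()
print(audit_rows)  # [(1, 'created')]
```

The index makes `WHERE customer = ?` lookups cheap as the table grows, while the trigger keeps the audit trail consistent without relying on application code to remember it.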
Posted 1 month ago
7.5 years
0 Lacs
Bhubaneswar, Odisha, India
On-site
Project Role: AI/ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Able to apply GenAI models as part of the solution; may also include, but is not limited to, deep learning, neural networks, chatbots, and image processing.
Must-have skills: Machine Learning
Good-to-have skills: NA
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary:
As an AI/ML Engineer, you will develop applications and systems utilizing AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. You will apply GenAI models as part of the solution, including deep learning, neural networks, chatbots, and image processing.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Lead research and development efforts in AI/ML technologies.
- Implement and optimize machine learning models.
- Conduct data analysis and interpretation for business insights.

Professional & Technical Skills:
- Must-have skills: Proficiency in Machine Learning.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Machine Learning.
- This position is based at our Bhubaneswar office.
- 15 years of full-time education is required.
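Normalization, one of the data-munging steps named above, has a simple closed form: min-max scaling maps each value to (v − min) / (max − min). A dependency-free sketch (a real pipeline would typically reach for scikit-learn or pandas):

```python
def min_max_normalize(values):
    """Scale a list of numbers into [0, 1]; constant input maps to all zeros."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]  # avoid division by zero
    return [(v - lo) / (hi - lo) for v in values]

print(min_max_normalize([10, 20, 30]))  # [0.0, 0.5, 1.0]
```

Scaling features into a common range like this keeps distance-based algorithms (e.g., the clustering methods the listing mentions) from being dominated by whichever raw feature happens to have the largest units.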
Posted 1 month ago