Home
Jobs

367 Masking Jobs - Page 4

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 - 13.0 years

7 - 9 Lacs

Chennai

On-site

Job ID: 27675
Location: Chennai, IN
Area of interest: Technology
Job type: Regular Employee
Work style: Office Working
Opening date: 25 May 2025

Job Summary
We are looking for a skilled Automation Test Manager to join our team in ensuring the quality, reliability, and security of our payments processing application. The role involves creating and maintaining automated test scripts using Selenium, Java, SQL, and Python for data masking. The ideal candidate has a strong background in automation testing within payment systems or similar high-availability applications.
Experienced in MT SWIFT messages such as MT103, MT202, and MT202 COV
Experienced in MX messages such as PACS.008, PACS.009, PACS.004, PACS.002, and PAIN.001
Experienced in real-time (faster payments) processing such as IMPS, G3, and IBFT
End-to-end payment processing knowledge
Ensure the quality and timeliness of delivery of testing assignments.
Perform functional and technical test execution activities as per the testing team's engagement level in the project.
Perform testing in Agile delivery.
Plan, analyse, and design; prepare the test strategy, test plan, and traceability matrix.

Key Responsibilities
Perform testing in Agile delivery.
Functional / automation testing for the SCPay payments application.
Test Automation: Design, develop, and maintain automated test scripts using Selenium and Java to support regression, functional, and integration testing. Write and execute SQL queries to validate data integrity and ensure data consistency across transactions. Familiarity with Kibana and an understanding of KQL is a plus.
Data Masking & Test Data Management: Utilize Python scripts for data masking to protect sensitive data used in test cases. Manage test data and set up testing environments to support end-to-end testing scenarios.
Quality Assurance & Test Strategy: Develop comprehensive test plans and test cases to cover all aspects of the application, including UI, API, and database layers. Collaborate with development and product teams to understand requirements, create testing strategies, and identify automation opportunities.
Defect Tracking & Reporting: Log, track, and manage defects using tracking tools, ensuring clear documentation and communication of issues. Generate and share test execution reports with stakeholders, highlighting critical issues and providing insights for improvement.
Continuous Improvement: Enhance existing automation frameworks and scripts to improve coverage, maintainability, and reliability. Stay updated on industry trends and best practices in test automation, implementing relevant improvements.

Skills and Experience
Minimum 8-13 years of experience
Experience leading a team of more than 5 members
Automation testing using REST APIs
MT and MX message processing
Agile methodology
Payment processing testing is a must (ISO 20022, MT/MX payment formats)
Automation Tools: Proficiency in Selenium with Java for test automation
SQL: Strong SQL skills for data validation and back-end testing
Python: Experience with Python scripting for data masking and test data management.
Testing Frameworks: Knowledge of testing frameworks such as TestNG or JUnit
CI/CD: Familiarity with CI/CD tools like Jenkins, Git, or similar for automated test execution
Excellent problem-solving and analytical skills
Strong communication skills to convey technical details effectively
Ability to work in a collaborative Agile environment with cross-functional teams

Qualifications
Bachelor's Degree in Computer Science, Software Engineering, or an equivalent degree

About Standard Chartered
We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion.

Together we:
Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do
Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well
Are better together: we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term

What we offer
In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing.
Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations.
Time-off including annual leave, parental/maternity leave (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holidays, combining to a minimum of 30 days.
Flexible working options based around home and office locations, with flexible working patterns.
Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and a range of self-help toolkits.
A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning.
Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity, across our teams, business functions and geographies - everyone feels respected and can realise their full potential.
www.sc.com/careers
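The listing above asks for Python scripting for data masking of test data. As a rough illustration of that kind of task, here is a minimal sketch (not the bank's actual tooling; the column names and salting scheme are assumptions) that deterministically masks PII columns in a CSV extract so masked values stay consistent across test runs:

```python
import csv
import hashlib

# Columns assumed to contain PII in the test extract (hypothetical names).
PII_COLUMNS = {"customer_name", "account_number", "email"}
SALT = "test-env-salt"  # fixed salt keeps masking deterministic across runs

def mask_value(value: str) -> str:
    """Replace a sensitive value with a stable, irreversible token."""
    digest = hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()
    return "MASKED_" + digest[:12]

def mask_csv(src_path: str, dst_path: str) -> None:
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            for col in PII_COLUMNS & set(row):
                if row[col]:
                    row[col] = mask_value(row[col])
            writer.writerow(row)

if __name__ == "__main__":
    mask_csv("payments_extract.csv", "payments_extract_masked.csv")
```

Deterministic hashing preserves referential integrity: the same account number masks to the same token in every file, so end-to-end payment flows can still be traced through the masked test data.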

Posted 1 week ago

Apply

0 years

1 Lacs

India

On-site

About Us: At Web5G Technology, we help businesses grow through smart, performance-driven marketing. Our team specializes in digital advertising, lead generation, and campaign strategy, focused on delivering real results through platforms like Google, Meta, and YouTube. We're passionate about helping brands scale with clear goals, transparent processes, and data-backed decisions.

Role: We are looking for a sharp and detail-oriented Editing & VFX Trainee who is eager to work on impactful video content. This role is ideal for someone who understands the fundamentals of post-production and wants to build advanced skills in visual effects and digital storytelling.

Eligibility:
Minimum 3 months of experience in video editing and/or basic VFX
Completion of a certified course in video editing, motion graphics, VFX, or a related multimedia field is mandatory
Familiarity with tools like Adobe Premiere Pro, After Effects, DaVinci Resolve, or Fusion/Nuke
Strong foundation in visual timing, layering, keyframing, masking, and compositing

Key Responsibilities:
Assist in editing videos for digital ads, brand campaigns, and social media content
Create and apply basic VFX elements such as motion tracking, object removal, screen replacements, and text animations
Support senior editors and VFX artists in building post-production pipelines
Maintain organized project files, assets, and exports for different platforms
Optimize video output for quality, speed, and performance

Preferred Skills:
Green screen keying and rotoscoping basics
Working knowledge of LUTs, transitions, and audio syncing
Ability to follow brand guidelines and meet project deadlines

What You'll Gain:
Training and hands-on experience in both editing and visual effects
Real-time exposure to creative advertising projects
Internship Certificate + Letter of Recommendation
High potential for transition into a full-time creative/VFX role

Job Types: Full-time, Permanent
Pay: From ₹10,000.00 per month
Benefits: Leave encashment, Paid sick time, Paid time off
Schedule: Day shift
Supplemental Pay: Overtime pay, Performance bonus, Yearly bonus
Work Location: In person

Posted 1 week ago

Apply

2.0 years

2 - 3 Lacs

Gurugram

Work from Office

We are expanding our Gurgaon-based team and are seeking Junior Retouchers who are eager to grow into high-end beauty, luxury, and still-life work for European and US markets.

Required Candidate Profile: 2 years' experience in high-end retouching. Expert knowledge of Adobe Photoshop CS6-CC 2024. An eye for colour, the ability to work fast, an organised mindset, and a portfolio.

Posted 1 week ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Job Description: We are looking for a creative and talented Graphic Designer who can create amazing user experiences and visual assets. The ideal candidate should have an eye for clean and artful design and should be able to translate high-level requirements into beautiful, intuitive, and functional designs.

Key Responsibilities:
Receive and manage customer-submitted exterior or interior images for processing.
Remove unwanted objects or visual obstacles from the images.
Clean and enhance visual elements to ensure image clarity and quality.
Apply appropriate digital layers and accurately fill in paint colors as per customer preferences or the brand palette.
Ensure all design outputs align with the customer's brief and expectations.
Create realistic, high-resolution previews that simulate the post-painting outcome of buildings or rooms.
Collaborate with the customer support or sales team to clarify requirements when needed.
Stay updated on the latest digital design tools, color theory, and architectural rendering techniques.

Desired Candidate Profile:
Proven experience as a graphic designer or photo editor.
Proficiency in Adobe Photoshop (required); knowledge of Illustrator, InDesign, or other design tools is a plus.
Strong skills in image manipulation, digital painting, masking, layering, and color correction.
Ability to interpret customer preferences and creatively bring them to life.
Keen attention to detail and a commitment to producing realistic, professional visuals.
Strong time management and the ability to work independently or with minimal supervision.
A basic understanding of architecture or interior design is a plus but not mandatory.

Key Skills: Adobe Photoshop, Adobe Premiere, Illustrator, InDesign, CorelDRAW
Work Experience: 0 to 6 yrs.

Posted 1 week ago

Apply

1.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Job Type: Full Time
Experience: 1 Year to 2 Years
Type: Virtual Hiring
Last Date: 30-June-2025
Posted on: 18-June-2025
Education: BE/B.Tech, MCA, ME/M.Tech
ADVERTISEMENT No. 02

Data Scientist / AI Engineer - 2 Posts
Age: 25 to 35 years
Qualification (Mandatory): Full-time B.E./B.Tech - first class (minimum of 60% marks) or equivalent, or M.E./M.Tech/MCA in Computer Science/IT/Data Science/Machine Learning and AI.
Professional / Preferred Qualification: Certification in Data Science/AI/ML/Natural Language Processing/Web Crawling and Neural Networks.

Experience Essential:
1. Minimum 03 years of (post basic educational qualification) experience in a related field, out of which: 2+ years of experience with programming languages frequently used in data science (R/Python); 2+ years of experience in model development, model validation, or a related field; 2+ years of experience in data analytics; 2+ years of experience in relational databases or any NoSQL database, including graph databases.
2. Experience in cloud-based application/service development.
3. Experience in natural language processing, web crawling, and neural networks.
4. Experience in projects with Machine Learning/Artificial Intelligence technologies.
5. Excellent communication skills and the ability to work as part of a multicultural product development team.
6. End-to-end experience from data extraction to modelling and its validation.
7. Experience working in a project environment as a developer.
8. Preference will be given to candidates with experience in the financial sector/banks/NBFCs/insurance/investment firms.

Mandatory Skill Set:
1. Technical expertise regarding data models/database design and development, data mining, and segmentation techniques.
2. Expertise in Machine Learning technologies.
3. Expertise in testing & validation of the quality and accuracy of AI models.
4. Expertise in developing models using structured, semi-structured, and unstructured data.
5. Expertise in analytical databases like Vertica DB or similar platforms.
6. Data modelling and data intelligence/data cataloguing skills with tools like Alation.
7. SQL (DDL/DML/DQL).

Desirable Qualities:
1. Good understanding of data models and types of dimension modelling.
2. Experience in Conversational AI and dialogue systems.
3. Strong understanding of explainable and Responsible/Ethical AI frameworks.
4. Understanding of data protection techniques like encryption, data masking, and tokenization to safeguard sensitive data in transit and at rest.
5. Experience in designing secure solution architectures for cloud platforms (private/public/hybrid).
6. Experience with tools like NiFi, HBase, Spark, Pig, Storm, Flume, etc.
7. Experience in BI tools.
8. Expertise in MS Excel data analytics.
9. Expertise in the usage and deployment of LLMs.

Key Responsibilities:
1. Be self-motivated, proactive, and demonstrate an exceptional drive towards service delivery.
2. Identify valuable data sources and automate collection/collation processes.
3. Undertake preprocessing of structured and unstructured data.
4. Analyze information to discover trends and patterns.
5. Use AI/ML techniques to improve the quality of data or product offerings.
6. Find patterns and trends in datasets to uncover insights.
7. Create algorithms and data models to forecast outcomes.
8. Combine models through ensemble modelling.

Data Scientist-cum-BI Developer - 1 Post
Age: 23 to 30 years
Qualification (Mandatory): Full-time B.E./B.Tech - first class (minimum of 60% marks) or equivalent, or M.E./M.Tech/MCA in Computer Science/IT/Data Science/Machine Learning and AI.
Professional/Preferred Qualification: Certification/assignments/projects in Data Science/AI/ML/Natural Language Processing/Web Crawling and Neural Networks.

Experience Essential:
1. Minimum 01 year of (post basic educational qualification) working experience on assignments/projects/jobs related to ML/AI.
2. Experience in projects with Machine Learning/Artificial Intelligence technologies.
3. Excellent communication skills and the ability to work as part of a multicultural product development team.
4. End-to-end experience from data extraction to modelling and its validation.
5. Experience working in a project environment as a developer.
6. Preference will be given to candidates with experience in the financial sector/banks/NBFCs/insurance/investment firms.

Mandatory Skill Set:
1. Technical expertise regarding data models/database design and development, data mining, and segmentation techniques.
2. Expertise in Machine Learning technologies.
3. Expertise in testing & validation of the quality and accuracy of AI models.
4. Expertise in developing models using structured, semi-structured, and unstructured data.
5. Expertise in analytical databases like Vertica DB or similar platforms.
6. Data modelling and data intelligence/data cataloguing skills with tools like Alation.
7. SQL (DDL/DML/DQL).

Desired Skill Set:
1. Good understanding of data models and types of dimension modelling.
2. Experience in Conversational AI and dialogue systems.
3. Strong understanding of explainable and Responsible/Ethical AI frameworks.
4. Understanding of data protection techniques like encryption, data masking, and tokenization to safeguard sensitive data in transit and at rest.
5. Experience in designing secure solution architectures for cloud platforms (private/public/hybrid).
6. Experience with tools like NiFi, HBase, Spark, Pig, Storm, Flume, etc.
7. Experience in BI tools.
8. Expertise in MS Excel data analytics.
9. Expertise in the usage and deployment of LLMs.

Key Responsibilities:
1. Be self-motivated, proactive, and demonstrate an exceptional drive towards service delivery.
2. Identify valuable data sources and automate collection/collation processes.
3. Undertake preprocessing of structured and unstructured data.
4. Analyze information to discover trends and patterns.
5. Use AI/ML techniques to improve the quality of data or product offerings.
6. Find patterns and trends in datasets to uncover insights.
7. Create algorithms and data models to forecast outcomes.
8. Combine models through ensemble modelling.

Candidates can apply ONLINE only, from 16 June 2025 to 30 June 2025.
Note: This is an aggregated job, shared to bring relevant opportunities to job seekers. Hireclap is not responsible for or authorized in this recruitment process. Click Here For Job Details & Apply Online
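Item 4 of the desirable skills pairs masking with tokenization; the practical difference is reversibility. A minimal sketch of the distinction (a toy in-memory token vault, purely illustrative, not a production design):

```python
import hashlib
import secrets

# Masking: irreversible - the original value cannot be recovered.
def mask(value: str) -> str:
    return hashlib.sha256(value.encode()).hexdigest()[:16]

# Tokenization: reversible via a secure lookup table (the "vault").
# A real vault would be an encrypted, access-controlled store.
_vault: dict[str, str] = {}

def tokenize(value: str) -> str:
    token = "TOK_" + secrets.token_hex(8)
    _vault[token] = value
    return token

def detokenize(token: str) -> str:
    return _vault[token]

if __name__ == "__main__":
    pan = "ABCDE1234F"
    print(mask(pan))         # stable digest; no way back to the PAN
    t = tokenize(pan)
    print(t, detokenize(t))  # token round-trips through the vault
```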

Posted 1 week ago

Apply

0.0 - 4.0 years

1 - 3 Lacs

Chennai

Work from Office

Proficient knowledge of and extensive experience with Adobe Photoshop for image editing. Good knowledge of clipping, pathing, channel masking, color correction, image cleanup, product retouching, hair masking, and all other necessary retouching requirements.

Posted 1 week ago

Apply

1.0 - 2.0 years

0 Lacs

Jaipur, Rajasthan, India

On-site

Location: Jaipur, RJ, IN
Areas of Work: Sales & Marketing
Job Id: 12778
Executive N - SERVICES JAIPUR

Objective
Lead the team of Customer Associates & Sales Associates in the region/allocated territory to ensure their performance in terms of delivering value, adherence to processes and guidelines at the sites, and driving the usage of key/focus products at the sites; interact with and manage dealers as and when required. Report data as required to the SSE, UH, and central function.

Main Responsibilities
Monitoring daily updates of all activities on the Paint Assist app.
Daily monitoring of on-time visits and follow-ups to all new sites by Customer Associates (CA) or Sales Associates (SA).
Daily monitoring of business collections by CAs and ensuring delivery of month-on-month business objectives.
Prompt updating of records of new joiners/exits at the CA/SA level to the SSE.
In-store and on-site training of new CAs in processes, the business app, product pitching, and site monitoring.
Random site visits to ensure adherence to systems and processes, such as usage of mechanized tools, covering and masking, correct application process, and on-time site handover after proper cleaning.
Approving business entries into the application after appropriate checks and validation.

Other Responsibilities
Undertake regular trainings for faster adoption of updated features of the business app.
Coordinate with CAs and contractors and ensure their attendance in all contractor training programs.

Scope of Work
a) People Management Scope (range of no. of direct/indirect reports): Performance of Customer Associates/Sales Associates; coordination with the trainer, TA, TSE & SSE for contractor training needs; coordination with DA, CC & SD for focused product requirements and leads.
c) Geography Coverage (country-wide/state-wide/area-wide)
d) Corporate Coverage (company-wide/business unit or function-wide/sub-function-wide/other): NA

Key Interactions
Internal: Customer Associates, Sales Associates, Colour Consultants, Designer Associate, Senior Sales Executive, Unit Head, Technical Associate, Technical Sales Executive.
External: Customers, store owners, contractors, other influencers.

Role Requirements
Qualifications
Essential: Graduate degree in any stream (BA/B.Sc./B.Com/BBA/BBM/BMS); minimum of 50% marks throughout education without any backlogs; graduation must be through a full-time course. Applicants with an engineering background (B.Tech/B.E./Diploma/B.Pharma) will not be considered.
Desired: Candidates with an MBA/PGDM in Sales and Marketing; 1-2 years of experience in a sales function in any organization.

Functional Competencies
Fluency in English, Hindi & the local language
Excellent communication and people skills
Working knowledge of MS Excel, MS Word, MS PowerPoint

Behavioral Competencies
Willingness to work in a retail environment and engage with clients across age and income groups for 8.5 hours a day, 6 days a week. Extensive travelling across the region. Be diligent and ensure timely attendance at and completion of all programs and modules designed for the training and development of Customer Associates.

Additional Requirements
Should have a two-wheeler with a valid driving licence.
Should have an Android phone with the latest operating system.
Age to be between 26 and 30 years.

Posted 1 week ago

Apply

6.0 years

20 - 22 Lacs

Udaipur

Remote

Senior Software Engineer - Data Governance

Kadel Labs is a leading IT services company delivering top-quality technology solutions since 2017, focused on enhancing business operations and productivity through tailored, scalable, and future-ready solutions. With deep domain expertise and a commitment to innovation, we help businesses stay ahead of technological trends. As a CMMI Level 3 and ISO 27001:2022 certified company, we ensure best-in-class process maturity and information security, enabling organizations to achieve their digital transformation goals with confidence and efficiency.

Role: Senior Software Engineer - Data Governance
Experience: 6-8 Yrs
Location: Udaipur, Jaipur, Bangalore
Domain: Telecom

Job Description:
We are seeking an experienced Telecom Data Governance Lead to join our team. In this role, you will be responsible for defining and implementing the data governance strategy. The role involves establishing metadata standards, defining attribute ownership models, ensuring regulatory compliance, and improving data quality and trust across the enterprise.

Key Responsibilities:
Define and implement an enterprise-wide data governance framework
Own the metadata catalog and ensure consistency across business and technical assets
Develop and manage KPI registries, data dictionaries, and lineage documentation
Collaborate with data stewards and domain owners to establish attribute ownership
Lead efforts around data standardization, quality rules, and classification of sensitive data
Ensure privacy and compliance (e.g., GDPR, PII, PHI) by enforcing tagging, masking, and access rules
Define access control rules (purpose-based views, user roles, sensitivity levels)
Oversee governance for data products and federated data domains
Support internal audits and external regulatory reviews
Coordinate with platform, analytics, security, and compliance teams

Required Skills:
6+ years of experience in data governance roles, with at least 3-4 years in the telecommunications industry
Experience integrating governance with modern data stacks (e.g., Databricks, Snowflake)
Strong experience with data governance tools (e.g., Alation, Unity Catalog, Azure Purview)
Proven understanding of metadata management, data lineage, and data quality frameworks
Experience in implementing federated governance models and data stewardship programs
Knowledge of compliance requirements (GDPR, PII, TMForum, etc.)
Familiarity with data mesh principles and data contract approaches
Excellent communication and stakeholder management skills
Background in telecom, networking, or other data-rich industries
Certification in data governance or management frameworks

Educational Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field.

Visit us:
https://kadellabs.com/
https://in.linkedin.com/company/kadel-labs
https://www.glassdoor.co.in/Overview/Working-at-Kadel-Labs-EI_IE4991279.11,21.htm

Job Types: Full-time, Permanent
Pay: ₹2,087,062.21 - ₹2,209,304.16 per year
Benefits: Flexible schedule, Health insurance, Leave encashment, Paid time off, Provident Fund, Work from home
Schedule: Day shift, Monday to Friday
Supplemental Pay: Overtime pay, Performance bonus, Quarterly bonus, Yearly bonus
Ability to commute/relocate: Udaipur City, Rajasthan: Reliably commute or planning to relocate before starting work (Required)
Application Question(s): How many years of experience in Telecom Data Engineering?
Experience: Data Engineer: 9 years (Required) Data governance: 6 years (Required) Location: Udaipur City, Rajasthan (Required) Work Location: In person
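The responsibilities above pair sensitivity tagging with masking and purpose-based views. One common way to wire those together is to drive view definitions from a column-level sensitivity catalog; a minimal sketch of the idea (the table, column, and tag names are invented for illustration, and the generated SQL targets a generic ANSI-style dialect):

```python
# Column-level sensitivity catalog: tag -> masking expression template.
MASK_RULES = {
    "PII_EMAIL": "CONCAT(LEFT({col}, 2), '***@***')",
    "PII_ID":    "'***MASKED***'",
    "PUBLIC":    "{col}",  # no masking needed
}

# Hypothetical catalog entries for a telecom subscriber table.
CATALOG = {
    "subscribers": [
        ("msisdn",    "PII_ID"),
        ("email",     "PII_EMAIL"),
        ("plan_name", "PUBLIC"),
        ("usage_gb",  "PUBLIC"),
    ]
}

def masked_view_sql(table: str) -> str:
    """Generate DDL for a purpose-based view that masks tagged columns."""
    cols = [
        MASK_RULES[tag].format(col=col) + f" AS {col}"
        for col, tag in CATALOG[table]
    ]
    return (
        f"CREATE VIEW {table}_analyst_v AS\n"
        "SELECT\n  " + ",\n  ".join(cols) + f"\nFROM {table};"
    )

if __name__ == "__main__":
    print(masked_view_sql("subscribers"))
```

Analysts then query subscribers_analyst_v instead of the base table, so the access rules live in one generated artifact that the governance catalog can track and audit.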

Posted 1 week ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Hello, Greetings from ZettaMine!!

📢 Job Opening: Storage Administrator
Location: Hyderabad / Navi Mumbai
Experience: 2-4 Years
Job Type: Full-Time
Joining: Immediate joiners preferred
Good communication skills are mandatory.

Job Description - Roles & Responsibilities:
Manage enterprise-level SAN/NAS environments (e.g., NetApp, EMC, Hitachi, Dell EMC)
Perform LUN provisioning, zoning, masking, and volume management
Monitor and troubleshoot storage-related issues and performance bottlenecks
Administer backup & recovery solutions (e.g., Veritas NetBackup, Commvault)
Implement replication, snapshot, and Disaster Recovery (DR) strategies
Support SAN switch infrastructure (Brocade/Cisco)
Conduct capacity planning, utilization tracking, and forecasting
Develop and maintain technical documentation, SOPs, and runbooks
Collaborate with application and infrastructure teams to fulfill storage requirements.

Interested candidates can share an updated CV at: 📧 md.afreen@zettamine.com

Thanks & Regards,
Afreen
ZettaMine

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Gurugram, Haryana, India

On-site

POSITION - Software Engineer - Data Engineering
LOCATION - Bangalore/Mumbai/Kolkata/Gurugram/Hyderabad/Pune/Chennai
EXPERIENCE - 5-9 Years

ABOUT HASHEDIN
We are software engineers who solve business problems with a Product Mindset for leading global organizations. By combining engineering talent with business insight, we build software and products that can create new enterprise value. The secret to our success is a fast-paced learning environment, an extreme ownership spirit, and a fun culture.

JOB TITLE: Software Engineer - Data Engineering

OVERVIEW OF THE ROLE:
As a Data Engineer or Senior Data Engineer, you will be hands-on in architecting, building, and optimizing robust, efficient, and secure data pipelines and platforms that power business-critical analytics and applications. You will play a central role in the implementation and automation of scalable batch and streaming data workflows using modern big data and cloud technologies. Working within cross-functional teams, you will deliver well-engineered, high-quality code and data models, and drive best practices for data reliability, lineage, quality, and security.

Mandatory Skills:
• Hands-on software coding or scripting for a minimum of 4 years
• Experience in product management for at least 4 years
• Stakeholder management experience for at least 4 years
• Experience in one of the GCP, AWS, or Azure cloud platforms

Key Responsibilities:
• Design, build, and optimize scalable data pipelines and ETL/ELT workflows using Spark (Scala/Python), SQL, and orchestration tools (e.g., Apache Airflow, Prefect, Luigi).
• Implement efficient solutions for high-volume, batch, real-time streaming, and event-driven data processing, leveraging best-in-class patterns and frameworks.
• Build and maintain data warehouse and lakehouse architectures (e.g., Snowflake, Databricks, Delta Lake, BigQuery, Redshift) to support analytics, data science, and BI workloads.
• Develop, automate, and monitor Airflow DAGs/jobs on cloud or Kubernetes, following robust deployment and operational practices (CI/CD, containerization, infra-as-code).
• Write performant, production-grade SQL for complex data aggregation, transformation, and analytics tasks.
• Ensure data quality, consistency, and governance across the stack, implementing processes for validation, cleansing, anomaly detection, and reconciliation.

General Skills & Experience:
• Proficiency with Spark (Python or Scala), SQL, and data pipeline orchestration (Airflow, Prefect, Luigi, or similar).
• Experience with cloud data ecosystems (AWS, GCP, Azure) and cloud-native services for data processing (Glue, Dataflow, Dataproc, EMR, HDInsight, Synapse, etc.).
• Hands-on development skills in at least one programming language (Python, Scala, or Java preferred); solid knowledge of software engineering best practices (version control, testing, modularity).
• Deep understanding of batch and streaming architectures (Kafka, Kinesis, Pub/Sub, Flink, Structured Streaming, Spark Streaming).
• Expertise in data warehouse/lakehouse solutions (Snowflake, Databricks, Delta Lake, BigQuery, Redshift, Synapse) and storage formats (Parquet, ORC, Delta, Iceberg, Avro).
• Strong SQL development skills for ETL, analytics, and performance optimization.
• Familiarity with Kubernetes (K8s), containerization (Docker), and deploying data pipelines in distributed/cloud-native environments.
• Experience with data quality frameworks (Great Expectations, Deequ, or custom validation), monitoring/observability tools, and automated testing.
• Working knowledge of data modeling (star/snowflake, normalized, denormalized) and metadata/catalog management.
• Understanding of data security, privacy, and regulatory compliance (access management, PII masking, auditing, GDPR/CCPA/HIPAA).
• Familiarity with BI or visualization tools (Power BI, Tableau, Looker, etc.) is an advantage but not core.
• Previous experience with data migrations, modernization, or refactoring legacy ETL processes to modern cloud architectures is a strong plus.
• Bonus: Exposure to open-source data tools (dbt, Delta Lake, Apache Iceberg, Amundsen, Great Expectations, etc.) and knowledge of DevOps/MLOps processes.

EDUCATIONAL QUALIFICATIONS:
• Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field (or equivalent experience).
• Certifications in cloud platforms (AWS, GCP, Azure) and/or data engineering (AWS Data Analytics, GCP Data Engineer, Databricks).
• Experience working in an Agile environment with exposure to CI/CD, Git, Jira, Confluence, and code review processes.
• Prior work in highly regulated or large-scale enterprise data environments (finance, healthcare, or similar) is a plus.
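The role above combines Spark pipeline work with PII masking for compliance. A minimal PySpark sketch of masking inside a transformation step (the DataFrame schema and salt are assumptions for illustration):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pii-masking-demo").getOrCreate()

# Toy input standing in for a real source table.
df = spark.createDataFrame(
    [("alice@example.com", "9876543210", 120.50),
     ("bob@example.com",   "9123456789",  75.00)],
    ["email", "phone", "amount"],
)

SALT = F.lit("pipeline-salt")  # assumed; a real job would pull this from a secret store

masked = (
    df
    # Irreversible hash of the email; equal inputs hash equally, so joins still work.
    .withColumn("email", F.sha2(F.concat(SALT, F.col("email")), 256))
    # Partial redaction of the phone: keep only the last 4 digits.
    .withColumn("phone", F.concat(F.lit("******"), F.substring("phone", -4, 4)))
)

masked.show(truncate=False)
```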

Posted 1 week ago

Apply

1.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Key Responsibility Areas (KRA)
Associate with Senior Designers at the XP on preparing mood board curation, 3D renders, and 3D and 2D detailed drawings. Ensure an error-free QC and masking package by making necessary corrections before sending the project into production.

1. Skill Set Required:
Freshers up to 1 year of experience.
Basic proficiency in SketchUp, Revit, and CAD.
Strong willingness to learn and follow instructions.

2. Education:
Diploma in Architecture or Civil, B.Tech Civil, or B.Arch.

Posted 1 week ago

Apply

0.0 - 6.0 years

0 Lacs

Udaipur, Rajasthan

Remote

Senior Software Engineer - Data Governance

Kadel Labs is a leading IT services company delivering top-quality technology solutions since 2017, focused on enhancing business operations and productivity through tailored, scalable, and future-ready solutions. With deep domain expertise and a commitment to innovation, we help businesses stay ahead of technological trends. As a CMMI Level 3 and ISO 27001:2022 certified company, we ensure best-in-class process maturity and information security, enabling organizations to achieve their digital transformation goals with confidence and efficiency.

Role: Senior Software Engineer - Data Governance
Experience: 6-8 Yrs
Location: Udaipur, Jaipur, Bangalore
Domain: Telecom

Job Description:
We are seeking an experienced Telecom Data Governance Lead to join our team. In this role, you will be responsible for defining and implementing the data governance strategy. The role involves establishing metadata standards, defining attribute ownership models, ensuring regulatory compliance, and improving data quality and trust across the enterprise.

Key Responsibilities:
Define and implement an enterprise-wide data governance framework
Own the metadata catalog and ensure consistency across business and technical assets
Develop and manage KPI registries, data dictionaries, and lineage documentation
Collaborate with data stewards and domain owners to establish attribute ownership
Lead efforts around data standardization, quality rules, and classification of sensitive data
Ensure privacy and compliance (e.g., GDPR, PII, PHI) by enforcing tagging, masking, and access rules
Define access control rules (purpose-based views, user roles, sensitivity levels)
Oversee governance for data products and federated data domains
Support internal audits and external regulatory reviews
Coordinate with platform, analytics, security, and compliance teams

Required Skills:
6+ years of experience in data governance roles, with at least 3-4 years in the telecommunications industry
Experience integrating governance with modern data stacks (e.g., Databricks, Snowflake)
Strong experience with data governance tools (e.g., Alation, Unity Catalog, Azure Purview)
Proven understanding of metadata management, data lineage, and data quality frameworks
Experience in implementing federated governance models and data stewardship programs
Knowledge of compliance requirements (GDPR, PII, TMForum, etc.)
Familiarity with data mesh principles and data contract approaches
Excellent communication and stakeholder management skills
Background in telecom, networking, or other data-rich industries
Certification in data governance or management frameworks

Educational Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field.

Visit us:
https://kadellabs.com/
https://in.linkedin.com/company/kadel-labs
https://www.glassdoor.co.in/Overview/Working-at-Kadel-Labs-EI_IE4991279.11,21.htm

Job Types: Full-time, Permanent
Pay: ₹2,087,062.21 - ₹2,209,304.16 per year
Benefits: Flexible schedule, Health insurance, Leave encashment, Paid time off, Provident Fund, Work from home
Schedule: Day shift, Monday to Friday
Supplemental Pay: Overtime pay, Performance bonus, Quarterly bonus, Yearly bonus
Ability to commute/relocate: Udaipur City, Rajasthan: Reliably commute or planning to relocate before starting work (Required)
Application Question(s): How many years of experience in Telecom Data Engineering?
Experience: Data Engineer: 9 years (Required) Data governance: 6 years (Required) Location: Udaipur City, Rajasthan (Required) Work Location: In person

Posted 1 week ago

Apply

3.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Artic Consulting is a dynamic IT and consulting services firm, delivering digital transformation through innovative solutions in data, cloud, and business analytics. We are seeking a skilled Data Engineer with a strong focus on the Microsoft Fabric ecosystem who can design and implement scalable data solutions for our clients.

Key Responsibilities:
Design, develop, and maintain Power BI reports, dashboards, DAX expressions, KPIs, and scorecards using both Import and DirectQuery modes
Build and orchestrate scalable ETL/ELT workflows using Fabric Data Pipelines, Dataflows Gen2, and Azure Data Factory
Write and tune complex T-SQL and KQL queries, stored procedures, and views for performance in Synapse SQL and SQL Server environments
Implement data models based on star/snowflake schemas and support modern data warehousing and Lakehouse architectures using Microsoft Fabric
Integrate structured and unstructured data sources (e.g., SQL, Excel, APIs, Blob Storage) and transform them efficiently using Fabric Notebooks (Spark/PySpark) or Dataflows
Diagnose and resolve pipeline failures, logic errors, and performance bottlenecks across the data engineering lifecycle
Automate repetitive data processes using Azure Functions, Logic Apps, PowerShell, or Python scripting within the Fabric ecosystem
Collaborate with stakeholders to gather business requirements and translate them into scalable data solutions
Ensure data governance, privacy, and compliance standards (e.g., GDPR, HIPAA, ISO) are adhered to, including sensitive data handling policies
Apply best practices for item-level security, workspace-based access models, and data lineage using Microsoft OneLake and Fabric tools

Required Qualifications:
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
Minimum 3 years of experience in Power BI development and data engineering
Strong expertise in T-SQL and KQL, with demonstrated query optimization skills
Proficiency in Microsoft Fabric tools: Data Pipelines, Dataflows Gen2, OneLake, Notebooks
Hands-on experience with Spark/PySpark and data integration from varied sources
Microsoft Power BI certification (PL-300) or equivalent Microsoft certifications

Preferred Skills:
Experience debugging and optimizing advanced SQL queries, database objects, and legacy components
Ability to implement database security models and data protection policies
Expertise in implementing row-level security, dynamic data masking, and role-based access control within Microsoft Fabric and Power BI environments
Familiarity with Microsoft OneLake architecture, including data cataloging, lineage tracking, item-level security, and workspace-based access management
Proven ability to operate effectively in dynamic, client-facing environments, delivering scalable and compliant data solutions with a focus on performance and quality

Why Join Artic Consulting?
Work with cutting-edge Microsoft technologies in a dynamic and collaborative environment
Flexible work culture with opportunities for growth and certifications
Be part of a mission to deliver impactful digital transformation for clients globally
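Of the preferred skills above, dynamic data masking has the most compact concrete form: in SQL Server (and T-SQL surfaces that support the same feature), masks are declared on columns and applied automatically for readers without the UNMASK permission. A hedged sketch, issued from Python via pyodbc; the connection string, table, and role names are placeholders, not the firm's actual setup:

```python
import pyodbc

# Placeholder connection string; fill in a real server/database to run this.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=myserver;DATABASE=demo;"
    "UID=admin_user;PWD=...",
    autocommit=True,
)
cur = conn.cursor()

# Declare masks on sensitive columns using built-in DDM functions
# such as default(), email(), and partial().
cur.execute("""
    ALTER TABLE dbo.Customers
    ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');
""")
cur.execute("""
    ALTER TABLE dbo.Customers
    ALTER COLUMN Phone ADD MASKED WITH (FUNCTION = 'partial(0, "******", 4)');
""")

# Readers without UNMASK now see masked values automatically;
# their existing queries need no changes.
cur.execute("GRANT SELECT ON dbo.Customers TO report_reader;")
```

The appeal of this pattern is that the masking policy lives in the schema rather than in every report query, which is what makes it workable alongside row-level security in BI environments.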

Posted 1 week ago

Apply

6.0 - 11.0 years

16 - 31 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Greetings from Cognizant!!!

We are hiring for a permanent position with Cognizant.
Experience: 3 - 12 Yrs.
Mandatory to have experience in TDM, GenRocket, Delphix, Informatica.
Work Location: Pan India
Interview Mode: Virtual
Interview Date: Weekday & Weekend

JD - Job Summary:
Strong Test Data Manager with hands-on test data experience, preferably in TDM consultancy and implementation of a TDM tool. Thorough understanding of Test Data Management with hands-on experience in data generation, masking, and profiling. Experience in enterprise-level TDM solutioning and implementation. Experience in client-facing roles and good stakeholder management. Experience in Google Cloud or cloud test data handling preferred.

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Overview
This role is responsible for coordinating resources, solving technical requirements, evaluating risks and the scope of SAP improvements, upgrades, and implementations for Global PGT and individual PGT, and deploying technological solutions according to PepsiCo's SAP/IT best practices and compliance. This role is also responsible for assessing functional requirements, guiding the group as per Application Security guidelines and Compliance standard methodologies, and ensuring transparent security design. It provides subject matter expertise (SME) in solutioning and implementing SAP access management requirements. This role serves as the leader for cyber security governance, engineering, and reporting for PGT, and is the liaison with Information Security. Additionally, this role's objective is to successfully deliver security upfront across all PGT deployments while ensuring consistency in approach and providing visibility through communication and alignment with key stakeholders.

Responsibilities
Point person for PGT SAP implementations with the leaders, functional team, and business unit
Provide project progress information to functional and business Directors and Managers
Minimize SoD critical risk during implementations and guide each phase to achieve SAP security governance and controls
Work closely with the controls teams (IT, configurable, and internal control) and continue supporting best practices
Communicate with the governance team in order to implement local and global best practices
Coordinate SAP Security implementations during the lifecycle of the projects
Consolidate and support PGT implementations regarding SAP Security best practices
Introduce delivery automation processes; actively participate in Continuous Process Improvement initiatives by striving to look for possible efficiencies, scalability, and/or cost reduction opportunities
Work with limited supervision and exhibit a solid sense of urgency
Facilitate internal and external audits as requested
Always ensure data protection by leveraging data masking and data scrambling techniques
Responsible for leadership reporting on various Information Security metrics across Tech Strategy and Enterprise Solutions teams
Collaborate with the Information Security organization to ensure remediation of security vulnerabilities so that the security health index is maintained as intended
Manage and provide status updates on security assessments, vulnerability remediation, and exceptions
Provide security engineering expertise for the PGT program
Provide regular updates to Information Security leadership on PGT status, risks, and issues

Qualifications
Bachelor's degree in Computer Science (or equivalent) is required

Posted 1 week ago

Apply

6.0 years

18 - 24 Lacs

Hyderābād

On-site

Delphix Senior Engineer
Open Positions: 4
Experience: 6+ Years
Location: Bangalore, Hyderabad, Chennai, Pune
Employment Type: Full-Time

About the Role:
We are seeking highly skilled Delphix Senior Engineers to join our dynamic team. You will play a critical role in designing, deploying, and optimizing Delphix Data Virtualization and Data Masking solutions across enterprise environments. This role also involves mentoring junior engineers and ensuring best practices in data delivery, privacy, and DevOps enablement.

Key Responsibilities:
Design and implement scalable Delphix data virtualization and masking architectures.
Lead end-to-end solution deployment, configuration, and integration with client systems.
Collaborate with development, QA, and operations teams to ensure seamless data delivery for testing and analytics.
Automate data provisioning workflows using Delphix APIs and integration tools.
Monitor performance, troubleshoot issues, and optimize Delphix environments.
Mentor and guide junior engineers; provide training and technical leadership.
Document system designs, processes, and operational procedures.
Work closely with stakeholders to understand data needs and ensure secure and efficient access.

Required Skills & Experience:
6+ years of experience in enterprise data management or DevOps roles.
3+ years of hands-on experience with Delphix Data Virtualization and Data Masking solutions.
Strong understanding of RDBMS technologies (Oracle, SQL Server, PostgreSQL, etc.).
Experience with scripting (Shell, Python) and Delphix API integrations.
Familiarity with CI/CD pipelines and DevOps practices.
Excellent problem-solving, communication, and stakeholder management skills.
Ability to work independently and lead technical discussions and initiatives.

Preferred Qualifications:
Experience in cloud-based deployments (AWS, Azure, GCP).
Prior experience with Agile/Scrum methodologies.
Delphix certifications or formal training.

Job Type: Full-time
Pay: ₹1,800,000.00 - ₹2,400,000.00 per year
Schedule: Day shift
Work Location: In person
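The responsibilities mention automating provisioning through Delphix APIs. Delphix engines expose a session-based REST API, but endpoint paths, versions, and payloads vary by release, so the host, credentials, object reference, and paths below are placeholders rather than a verified recipe; the API documentation for your engine version is the authority:

```python
import requests

BASE = "https://delphix.example.com"  # placeholder engine hostname
session = requests.Session()
session.verify = False  # lab-only shortcut; use proper CA certs in production

# 1. Establish an API session, then authenticate (typical Delphix flow;
#    the version numbers and endpoints are illustrative placeholders).
session.post(f"{BASE}/resources/json/delphix/session", json={
    "type": "APISession",
    "version": {"type": "APIVersion", "major": 1, "minor": 11, "micro": 0},
})
session.post(f"{BASE}/resources/json/delphix/login", json={
    "type": "LoginRequest", "username": "admin", "password": "********",
})

# 2. Refresh a virtual database from its source (object reference assumed;
#    real refresh payloads usually also specify a timeflow point).
resp = session.post(
    f"{BASE}/resources/json/delphix/database/VDB_QA_1/refresh",
    json={"type": "RefreshParameters"},
)
resp.raise_for_status()
print("Refresh job submitted:", resp.json())
```

Wrapping calls like these in a script is what turns a manual VDB refresh into the kind of self-service provisioning workflow the listing describes.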

Posted 1 week ago

Apply

8.0 years

25 - 30 Lacs

Hyderābād

On-site

Delphix Tech Lead

Lead end-to-end Delphix solution design, implementation, and team guidance across enterprise environments. We are looking for an experienced Delphix Tech Lead to take ownership of the end-to-end design, implementation, and deployment of Delphix solutions across complex enterprise environments. The ideal candidate will provide technical leadership, guide team members, and ensure seamless integration of Delphix into various data architectures.

Key Responsibilities:
Lead the design and implementation of Delphix data virtualization and masking solutions.
Collaborate with stakeholders to understand data management needs and translate them into technical solutions.
Oversee installation, configuration, and maintenance of the Delphix platform.
Drive performance optimization, automation, and integration with CI/CD pipelines.
Provide technical guidance and mentorship to team members.
Create technical documentation and ensure best practices are followed.
Troubleshoot and resolve issues related to data provisioning and masking.

Required Skills & Qualifications:
8+ years of IT experience, with at least 3+ years working on the Delphix platform.
Strong knowledge of data virtualization, masking, and DevOps processes.
Experience with database technologies such as Oracle, SQL Server, or PostgreSQL.
Solid understanding of data management and security best practices.
Ability to lead projects and coordinate with cross-functional teams.
Excellent communication and problem-solving skills.

Job Type: Full-time
Pay: ₹2,500,000.00 - ₹3,000,000.00 per year
Schedule: Day shift
Work Location: In person

Posted 1 week ago

Apply

5.0 years

14 - 18 Lacs

Hyderābād

On-site

Job Title: Delphix Engineer
Experience: 5+ Years
Positions Open: 2
Location: Bangalore, Pune, Hyderabad, Chennai
Employment Type: Full-time

About the Role:
We are seeking skilled and motivated Delphix Engineers to join our dynamic team. In this role, you will be responsible for implementing, managing, and optimizing Delphix environments to support high-performing, secure, and efficient data virtualization and delivery. This is an excellent opportunity to contribute to enterprise-level data initiatives and drive automation in modern data infrastructure.

Key Responsibilities:
Design, implement, and maintain Delphix data virtualization and masking environments.
Collaborate with development, testing, and infrastructure teams to deliver virtualized data environments.
Automate data provisioning, refresh, masking, and archival processes using Delphix APIs and scripting tools.
Monitor system health, troubleshoot issues, and ensure optimal performance and reliability.
Manage integration of Delphix with databases like Oracle, SQL Server, PostgreSQL, and others.
Ensure compliance with data security and masking requirements across environments.
Contribute to documentation, best practices, and knowledge-sharing within the team.

Required Skills & Experience:
Minimum 5 years of overall experience, with strong expertise in the Delphix Data Platform.
Solid hands-on experience with Delphix Virtualization and Masking solutions.
Strong scripting skills using Shell, PowerShell, or Python for automation.
Experience integrating Delphix with Oracle, SQL Server, or other major databases.
Good understanding of data lifecycle management, data masking, and data delivery pipelines.
Familiarity with DevOps tools and CI/CD processes is a plus.
Strong analytical, troubleshooting, and communication skills.

Preferred Qualifications:
Delphix certifications (if available).
Experience working in Agile/Scrum environments.
Exposure to cloud platforms (AWS, Azure, GCP) and cloud-based Delphix setups.

Job Type: Full-time
Pay: ₹1,400,000.00 - ₹1,800,000.00 per year
Schedule: Day shift
Work Location: In person

Posted 1 week ago

Apply

3.0 - 6.0 years

8 - 10 Lacs

Hyderābād

On-site

Job Title: Delphix Support Engineer
Open Positions: 2
Experience: 3-6 Years
Location: Bangalore / Pune / Hyderabad / Chennai
Employment Type: Full-time

About the Role:
We are looking for experienced Delphix Support Engineers to join our growing team. In this role, you will provide day-to-day operational support for Delphix platforms, resolve technical issues, and ensure the high availability and performance of data virtualization environments. You will collaborate closely with internal teams and stakeholders to maintain service excellence.

Key Responsibilities:
Provide Level 1 and Level 2 support for Delphix data virtualization and masking platforms.
Monitor system health, performance, and availability of Delphix environments.
Diagnose and troubleshoot incidents, escalate critical issues as needed, and drive timely resolution.
Coordinate with engineering and infrastructure teams for patching, upgrades, and configuration changes.
Maintain and update documentation related to Delphix support procedures and issue resolutions.
Perform routine maintenance tasks including backups, restores, and environment refreshes.
Ensure compliance with security and operational policies across environments.

Required Skills & Experience:
3-6 years of experience supporting enterprise data platforms or infrastructure environments.
Hands-on experience with the Delphix Dynamic Data Platform (data virtualization and masking).
Strong troubleshooting and analytical skills for resolving performance and availability issues.
Familiarity with database platforms (Oracle, SQL Server, etc.) and an understanding of data cloning/virtualization.
Experience with incident management tools (e.g., ServiceNow, JIRA) and monitoring systems.
Excellent communication skills and the ability to work in a fast-paced, collaborative environment.

Good to Have:
Experience with scripting (Shell, Python) for automation.
Knowledge of DevOps and CI/CD practices.
Exposure to cloud platforms (AWS, Azure, GCP) and integration with Delphix.

Job Type: Full-time
Pay: ₹800,000.00 - ₹1,000,000.00 per year
Shift: Day shift
Work Days: Monday to Friday
Work Location: In person

Posted 1 week ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

At TELUS Digital, we enable customer experience innovation through spirited teamwork, agile thinking, and a caring culture that puts customers first. TELUS Digital is the global arm of TELUS Corporation, one of the largest telecommunications service providers in Canada. We deliver contact center and business process outsourcing (BPO) solutions to some of the world's largest corporations in the consumer electronics, finance, telecommunications and utilities sectors. With global call center delivery capabilities, our multi-shore, multi-language programs offer safe, secure infrastructure, value-based pricing, skills-based resources and exceptional customer service - all backed by TELUS, our multi-billion dollar telecommunications parent.

Required Skills:
7+ years of experience in quality assurance, with at least 3+ years in a Test Data Management (TDM) lead or senior role.
Proven experience in designing and implementing test data management strategies, data masking, and test data provisioning for large-scale software projects.
Lead the development and implementation of comprehensive test data management strategies to support functional, regression, performance, security, and other types of testing.
Establish governance processes and best practices for handling, managing, and securing test data across multiple projects and environments.
Ensure that test data complies with legal, regulatory, and organizational security policies (e.g., GDPR, HIPAA).
Design and oversee the creation of high-quality, realistic, and representative test data to meet the needs of different types of testing.
Use data generation tools and techniques to produce test data that mirrors real-world data while maintaining privacy and security.
Develop automated processes for generating and refreshing test data in line with project and release timelines.
Implement and manage data masking, anonymization, and sanitization techniques to ensure sensitive information is protected while retaining data integrity for testing purposes.
Develop and enforce data security practices related to the use and storage of test data.
Work closely with QA, development, and DevOps teams to understand the specific test data requirements for different testing phases (e.g., unit, integration, performance, UAT).
Collaborate with business and IT teams to ensure that required test data is available when needed and meets quality expectations.
Support the creation of data models and mappings to align test data with application requirements.
Implement strategies for efficient storage and retrieval of test data to ensure high performance and reduce resource consumption during testing.
Continuously assess and optimize test data strategies to improve test execution time, resource allocation, and overall testing efficiency.
Manage large-scale data sets and ensure their availability across multiple environments (development, testing, staging, production).
Lead the evaluation, implementation, and continuous improvement of test data management tools and automation platforms (e.g., Informatica TDM, Delphix, IBM InfoSphere Optim).
Leverage automation to streamline test data creation, management, and refresh cycles, ensuring quick access to the latest data for testing.
Drive the adoption of self-service tools to enable teams to generate, refresh, and manage their own test data securely.
Monitor and manage test data usage to ensure compliance with internal standards and external regulations.
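The role above calls for generating realistic, representative test data rather than copying production records. A minimal sketch using the open-source Faker library (install with `pip install faker`; the customer schema here is invented for illustration):

```python
import csv
import random

from faker import Faker

fake = Faker("en_IN")  # Indian locale for realistic names and phone numbers
Faker.seed(42)         # seeding makes the data set reproducible
random.seed(42)

# Hypothetical schema for a customer table used in functional testing.
ROWS = 1000

with open("synthetic_customers.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["customer_id", "name", "email", "phone", "balance"])
    for i in range(1, ROWS + 1):
        writer.writerow([
            i,
            fake.name(),
            fake.email(),
            fake.phone_number(),
            round(random.uniform(0, 500_000), 2),
        ])
```

Because nothing here derives from production data, the file carries no real PII and can be shared freely across test environments, which sidesteps the masking compliance concerns entirely.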

Posted 1 week ago

Apply

1.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Key Responsibility Areas (KRA)
Associate with Senior Designers at the XP on preparing mood board curation, 3D renders, and 3D and 2D detailed drawings. Ensure an error-free QC and masking package by making necessary corrections before sending the project into production.

1. Skill Set Required:
Freshers up to 1 year of experience.
Basic proficiency in SketchUp, Revit, and CAD.
Strong willingness to learn and follow instructions.

2. Education:
Diploma in Architecture or Civil, B.Tech Civil, or B.Arch.

Posted 1 week ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: Azure Data Engineer with Databricks
Experience: 5 – 10 years
Job Level: Senior Engineer / Lead / Architect
Notice Period: Immediate joiner

Role Overview
Join our dynamic team at Team Geek Solutions, where we specialize in innovative data solutions and cutting-edge technology implementations that empower businesses across sectors. We are looking for a skilled Azure Data Engineer with Databricks expertise to join our high-performing data and AI team for a critical client engagement. The ideal candidate has strong hands-on experience building scalable data pipelines, data transformations, and real-time data processing with Azure Data Services and Databricks.

Key Responsibilities
Design, develop, and deploy end-to-end data pipelines using Azure Databricks, Azure Data Factory, and Azure Synapse Analytics.
Perform data ingestion, data wrangling, and ETL/ELT processing from structured and unstructured sources (e.g., APIs, on-prem databases, flat files).
Optimize and tune Spark-based jobs and Databricks notebooks for performance and scalability.
Implement best practices for CI/CD, code versioning, and testing in a Databricks environment using DevOps pipelines.
Design data lake and data warehouse solutions using Delta Lake and Synapse Analytics.
Ensure data security, governance, and compliance using Azure-native tools (e.g., Azure Purview, Key Vault, RBAC); a column-masking sketch follows this listing.
Collaborate with data scientists to enable feature engineering and model training within Databricks.
Write efficient SQL and PySpark code for data transformation and analytics.
Monitor and maintain existing data pipelines and troubleshoot issues in production.
Document technical solutions, architecture diagrams, and data lineage as part of delivery.

Mandatory Skills & Technologies
Azure Cloud Services: Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure Data Lake Storage (Gen2), Azure Key Vault, Azure Functions, Azure Monitor
Databricks Platform: Delta Lake, Databricks Notebooks, Job Clusters, MLflow (optional), Unity Catalog
Programming Languages: PySpark, SQL, Python
Data Pipelines: ETL/ELT pipeline design and orchestration
Version Control & DevOps: Git, Azure DevOps, CI/CD pipelines
Data Modeling: Star/snowflake schema, dimensional modeling
Performance Tuning: Spark job optimization, data partitioning strategies
Data Governance & Security: Azure Purview, RBAC, data masking

Nice To Have
Experience with Kafka, Event Hubs, or other real-time streaming platforms
Exposure to Power BI or other visualization tools
Knowledge of Terraform or ARM templates for infrastructure as code
Experience in MLOps and integration with MLflow for model lifecycle management

Certifications (Good To Have)
Microsoft Certified: Azure Data Engineer Associate
Databricks Certified Data Engineer Associate / Professional
DP-203: Data Engineering on Microsoft Azure

Soft Skills
Strong communication and client interaction skills
Analytical thinking and problem-solving
Agile mindset with familiarity with Scrum/Kanban
Team player with the ability to mentor junior engineers

Skills: data partitioning strategies, azure functions, data analytics, unity catalog, rbac, databricks, elt, devops, azure data factory, delta lake, data factory, spark job optimization, job clusters, azure devops, etl/elt pipeline design and orchestration, data masking, azure key vault, azure databricks, azure data engineer, azure synapse, star/snowflake schema, azure data lake storage (gen2), git, sql, etl, snowflake, azure, python, azure cloud services, azure purview, pyspark, mlflow, ci/cd pipelines, dimensional modeling, sql server, big data technologies, azure monitor, azure synapse analytics, databricks notebooks
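The responsibilities above pair PySpark transformation work with data masking under Azure governance tooling. The following is a minimal, hypothetical PySpark sketch of column-level masking applied during an ETL step before a Delta write; the paths, column names, and masking rules are illustrative assumptions, not details from the posting.

```python
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.appName("masked-etl").getOrCreate()

# Hypothetical source: a customer table with sensitive columns.
df = spark.read.format("delta").load("/mnt/raw/customers")

masked = (
    df
    # Irreversibly hash the customer identifier; it stays joinable downstream.
    .withColumn("customer_id", F.sha2(F.col("customer_id").cast("string"), 256))
    # Keep only the last 4 digits of the card number.
    .withColumn("card_number", F.concat(F.lit("************"), F.substring("card_number", -4, 4)))
    # Redact email local parts, keeping the domain for analytics.
    .withColumn("email", F.regexp_replace("email", r"^[^@]+", "***"))
)

# Write the masked output to the curated zone as a Delta table.
masked.write.format("delta").mode("overwrite").save("/mnt/curated/customers_masked")
```

The design choice here is deterministic hashing over random tokens: the same raw identifier always maps to the same masked value, so joins and aggregations still work on the curated data without exposing the original.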

Posted 1 week ago

Apply

6.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Experience: 6-8 years

Required Technical Skill Set
HPE Storage platforms (HPE Nimble, HPE Primera, HPE 3PAR): deployment, configuration, replication, performance tuning, enterprise setup, firmware management, scalability, provisioning, and snapshot/replication configuration
SAN switch technologies (Cisco MDS, Brocade switches): zoning, fabric configuration, troubleshooting, fabric management, diagnostics, and upgrades

Desired Competencies (Technical/Behavioral Competency)
Must-Have
Experience with LUN provisioning, masking, and zoning across multi-host environments (a toy model of LUN masking follows this listing).
Proficiency with Fibre Channel, iSCSI, and FCoE protocols for block-level storage connectivity.
Knowledge of storage replication, snapshot technologies, and remote data protection solutions.
Proficiency in backup integration and disaster recovery strategies in storage environments.
Experience performing firmware upgrades and hardware lifecycle management on storage devices.
Ability to conduct and analyze storage performance assessments, capacity planning, and security audits.
Familiarity with storage monitoring, alerting, and reporting tools for proactive system health checks.
Troubleshooting of hardware-level and storage network issues affecting performance and availability.
Adequate knowledge of Ethernet/iSCSI and Fibre Channel-based SAN topologies.

Good-to-Have
Hands-on experience with HPE Storage platforms (HPE Nimble, HPE Primera, HPE 3PAR): deployment, configuration, replication, performance tuning, enterprise setup, firmware management, scalability, provisioning, and snapshot/replication configuration.

Responsibility of / Expectations from the Role
1. Ability to work independently in a fast-paced, dynamic environment.
2. Proven experience in designing, implementing, and managing enterprise storage solutions.
3. Deep knowledge of SAN (Storage Area Network), NAS (Network Attached Storage), and DAS (Direct Attached Storage) technologies.
4. Expertise in RAID levels, disk provisioning, and storage performance optimization.
5. Strong understanding of SAN switch technologies (Cisco MDS, Brocade switches): zoning, fabric configuration, troubleshooting, fabric management, diagnostics, and upgrades.
6. Experience performing firmware upgrades and hardware lifecycle management on storage devices.
7. Ability to conduct and analyze storage performance assessments, capacity planning, and security audits.
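For context on the must-have above: LUN masking is the array-side control over which host initiators (identified by WWPN) may see which LUNs, complementing fabric-side zoning on the SAN switches. The snippet below is a deliberately simplified Python model of a masking table; it is purely conceptual, uses made-up WWPNs, and does not call any HPE, Cisco, or Brocade API.

```python
# Toy model of array-side LUN masking: map initiator WWPNs to visible LUN IDs.
# Real arrays manage this via host groups / initiator groups; this only
# illustrates the access-control idea.

masking_table: dict[str, set[int]] = {
    "10:00:00:90:fa:12:34:56": {0, 1, 2},  # hypothetical prod DB host sees LUNs 0-2
    "10:00:00:90:fa:ab:cd:ef": {3},        # hypothetical backup host sees only LUN 3
}

def visible_luns(initiator_wwpn: str) -> set[int]:
    """Return the LUNs exposed to an initiator; unknown initiators see nothing."""
    return masking_table.get(initiator_wwpn, set())

def can_access(initiator_wwpn: str, lun_id: int) -> bool:
    """A host I/O to a LUN succeeds only if the masking table allows it."""
    return lun_id in visible_luns(initiator_wwpn)

if __name__ == "__main__":
    assert can_access("10:00:00:90:fa:12:34:56", 1)
    assert not can_access("10:00:00:90:fa:ab:cd:ef", 1)  # masked out
    print(visible_luns("10:00:00:90:fa:ab:cd:ef"))       # {3}
```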

Posted 1 week ago

Apply

3.0 - 4.0 years

0 Lacs

India

On-site

Note: Kindly note that only the qualifications below will be considered for this position.
Mandatory educational qualification: Bachelor's in Engineering (specifically Computer Science) from a premier institute in India (IITs, NITs, or IIITs)
Bonus educational qualification: Master's (MTech) in Computer Science from IISc or IITs

Role Description
We are currently looking for bright deep learning talent with 3-4 years of industry experience (1.5-2 years on practical deep learning industry projects) working on problems in the Video AI space. A Deep Learning engineer builds solutions for human activity analysis in videos, and:
1. Reads research papers to stay current with developments in activity recognition and multi-camera object tracking.
2. Develops production-quality code to turn research into usable features on https://deeplabel.app.
3. Works on video-language model architectures.
4. Combines strong research skills with good coding skills and knowledge of data structures and algorithms.

Qualifications
Strong coding skills in Python
Thorough hands-on knowledge of deep learning frameworks - PyTorch, ONNX
Excellent understanding and experience of deploying data pipelines for computer vision projects
Good working knowledge of troubleshooting, fixing, and patching Linux system-level problems (we expect engineers to set up their own workstations and troubleshoot CUDA and OpenCV problems)
Good understanding of deep learning concepts (model parameters, model tuning, optimizers, learning rates, attention mechanisms, masking, etc.); a short attention-masking sketch follows this listing
Thorough understanding of deep learning implementations of activity detection and object detection
Ability to read research papers and implement new approaches for activity detection and object detection
Knowledge of deployment tools such as Triton Inference Server or Ray is a plus

In addition to the above, we need a few key personality attributes:
Willingness to learn and try until you succeed
Curiosity to learn and experiment
Ability to work with the full-stack engineers of the AI platform team to deploy your innovations
Good communication skills
Ability to take ownership of your modules through the whole lifecycle, up to deployment

Company Description
Drillo.AI specializes in delivering tailored AI solutions that empower small and medium-sized businesses (SMBs) to innovate and grow. Our mission is to make advanced technology accessible, helping SMBs streamline operations, enhance decision-making, and drive sustainable success. With a deep understanding of the unique challenges faced by smaller enterprises, we provide scalable, cost-effective AI strategies to unlock new opportunities. By working closely with our clients, we help elevate their businesses to the next level.
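Since the qualifications call out attention mechanisms and masking, here is a short, self-contained PyTorch sketch of scaled dot-product attention with a key-side padding mask; the shapes and names are my own for illustration, not from the posting.

```python
import torch

def masked_attention(q, k, v, pad_mask):
    """Scaled dot-product attention that ignores padded key positions.

    q, k, v: (batch, seq, dim); pad_mask: (batch, seq), True = real token.
    """
    scores = q @ k.transpose(-2, -1) / (k.size(-1) ** 0.5)  # (batch, seq, seq)
    # Broadcast the key-side mask and push padded positions to -inf,
    # so softmax assigns them zero attention weight.
    scores = scores.masked_fill(~pad_mask[:, None, :], float("-inf"))
    return torch.softmax(scores, dim=-1) @ v

batch, seq, dim = 2, 5, 8
q = k = v = torch.randn(batch, seq, dim)
pad_mask = torch.tensor([[1, 1, 1, 0, 0],   # sequence 1: last 2 tokens are padding
                         [1, 1, 1, 1, 1]]).bool()
out = masked_attention(q, k, v, pad_mask)
print(out.shape)  # torch.Size([2, 5, 8])
```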

Posted 1 week ago

Apply

6.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

JOB_POSTING-3-71493-1

Job Description
Role Title: AVP, Enterprise Logging & Observability (L11)

Company Overview
Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry's most complete digitally enabled product suites. Our experience, expertise, and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet, and more.
We have recently been ranked #2 among India's Best Companies to Work For by Great Place to Work. We were among the Top 50 of India's Best Workplaces in Building a Culture of Innovation for All by GPTW, and in the Top 25 Best Workplaces in BFSI by GPTW. We have also been recognized in the AmbitionBox Employee Choice Awards among the Top 20 Mid-Sized Companies, ranked #3 among Top-Rated Companies for Women, and listed among Top-Rated Financial Services Companies.
Synchrony celebrates ~51% women diversity, 105+ people with disabilities, and ~50 veterans and veteran family members. We offer Flexibility and Choice for all employees and provide best-in-class employee benefits and programs that cater to work-life integration and overall well-being. We provide career advancement and upskilling opportunities, focusing on Advancing Diverse Talent to take up leadership roles.

Organizational Overview
Splunk is Synchrony's enterprise logging solution. Splunk searches and indexes log files and helps derive insights from the data. The primary goal is to ingest massive datasets from disparate sources and employ advanced analytics to automate operations and improve data analysis. It also offers predictive analytics and unified monitoring for applications, services, and infrastructure. Many applications forward data to the Splunk logging solution. The Splunk team - including Engineering, Development, Operations, Onboarding, and Monitoring - maintains Splunk and provides solutions to teams across Synchrony.

Role Summary/Purpose
The AVP, Enterprise Logging & Observability is a key leadership role responsible for driving the strategic vision, roadmap, and development of the organization's centralized logging and observability platform. The role supports multiple enterprise initiatives including applications, security monitoring, compliance reporting, operational insights, and platform health tracking. It leads platform development using Agile methodology, manages stakeholder priorities, ensures logging standards across applications and infrastructure, and supports security initiatives. The position bridges technology teams, applications, platforms, cloud, cybersecurity, infrastructure, DevOps, governance/audit and risk teams, and business partners, owning and evolving the logging ecosystem to support real-time insights, compliance monitoring, and operational excellence.

Key Responsibilities
Splunk Development & Platform Management
Lead and coordinate development activities, ingestion pipeline enhancements, onboarding frameworks, and alerting solutions.
Collaborate with engineering, operations, and Splunk admins to ensure scalability, performance, and reliability of the platform.
Establish governance controls for source naming, indexing strategies, retention, access controls, and audit readiness.

Splunk ITSI Implementation & Management
Develop and configure ITSI services, entities, and correlation searches.
Implement notable-event aggregation policies and automate response actions.
Fine-tune ITSI performance by optimizing data models, summary indexing, and saved searches.
Help identify patterns and anomalies in logs and metrics.
Develop ML models for anomaly detection, capacity planning, and predictive analytics.
Utilize the Splunk MLTK to build and train models for IT operations monitoring.

Security & Compliance Enablement
Partner with InfoSec, Risk, and Compliance to align logging practices with regulations (e.g., PCI-DSS, GDPR, RBI).
Enable visibility into encryption events, access anomalies, secrets management, and audit trails.
Support security control mapping and automation through observability.

Stakeholder Engagement
Act as a strategic advisor and point of contact for business units and the application, infrastructure, and security stakeholders leveraging Splunk.
Conduct stakeholder workshops, backlog grooming, and sprint reviews to ensure alignment.
Maintain clear and timely communications across all levels of the organization.

Process & Governance
Drive logging and observability governance standards, including naming conventions, access controls, and data retention policies.
Lead initiatives for process improvement in log ingestion, normalization, and compliance readiness.
Ensure alignment with enterprise architecture and data classification models.
Lead improvements in logging onboarding lifecycle time, automation pipelines, and self-service ingestion tools.
Mentor junior team members and guide engineering teams on secure, standardized logging practices.

Required Skills/Knowledge
Bachelor's degree with a minimum of 6 years of experience in technology, or, in lieu of a degree, 8+ years of experience in technology
Minimum of 3 years of experience leading a development team, or an equivalent role in observability, logging, or security platforms
Splunk Subject Matter Expert (SME)
Strong hands-on understanding of Splunk architecture, pipelines, dashboards, alerting, data ingestion, search optimization, and enterprise-scale operations
Experience supporting security use cases, encryption visibility, secrets management, and compliance logging
Splunk Development & Platform Management, Security & Compliance Enablement, Stakeholder Engagement, and Process & Governance
Experience with Splunk Premium Apps - at minimum ITSI and Enterprise Security (ES)
Experience with data streaming platforms and tools such as Cribl and Splunk Edge Processor
Proven ability to work in Agile environments using tools such as JIRA or JIRA Align
Strong communication, leadership, and stakeholder management skills
Familiarity with security, risk, and compliance standards relevant to BFSI
Proven experience leading product development teams and managing cross-functional initiatives using Agile methods
Strong knowledge of and hands-on experience with Splunk Enterprise/Splunk Cloud
Design and implement Splunk ITSI solutions for proactive monitoring and service health tracking
Develop KPIs, services, glass tables, entities, deep dives, and notable events to improve service reliability for users across the firm
Develop scripts (Python, JavaScript, etc.) as needed in support of data collection or integration (a minimal data-collection sketch follows this list)
Develop new applications leveraging Splunk's analytics and machine learning tools to maximize performance, availability, and security, improving business insight and operations
Support senior engineers in analyzing system issues and performing root cause analysis (RCA)
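As a concrete (and entirely hypothetical) illustration of the scripting bullet above, this sketch runs a one-off SPL search against Splunk's documented REST export endpoint using Python's requests library; the hostname, service account, and SPL query are placeholders, not anything from this posting.

```python
import requests

# Hypothetical connection details for a Splunk search head's REST API (port 8089).
BASE = "https://splunk.example.com:8089"
AUTH = ("svc_observability", "********")  # in practice, use a token from a secrets store

# The export endpoint streams results for a one-off SPL search.
spl = "search index=_internal sourcetype=splunkd log_level=ERROR | head 5"
resp = requests.post(
    f"{BASE}/services/search/jobs/export",
    auth=AUTH,
    data={"search": spl, "output_mode": "json"},
    verify=False,  # lab setting only; verify TLS certificates in production
    stream=True,
)
resp.raise_for_status()

# Results stream back as newline-delimited JSON events.
for line in resp.iter_lines():
    if line:
        print(line.decode("utf-8"))
```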
Desired Skills/Knowledge
Deep knowledge of Splunk development, data ingestion, search optimization, alerting, dashboarding, and enterprise-scale operations
Exposure to SIEM integration, security orchestration, or SOAR platforms
Knowledge of cloud-native observability (e.g., AWS/GCP/Azure logging)
Experience in BFSI or regulated industries with high-volume data handling
Familiarity with CI/CD pipelines, DevSecOps integration, and cloud-native logging
Working knowledge of scripting or automation (e.g., Python, Terraform, Ansible) for observability tooling
Splunk certifications (Power User, Admin, Architect, or equivalent) are an advantage
Awareness of data classification, retention, and masking/anonymization strategies (a minimal masking sketch follows this listing)
Awareness of integration between Splunk and ITSM or incident management tools (e.g., ServiceNow, PagerDuty)
Experience with version control tools - Git, Bitbucket

Eligibility Criteria
Bachelor's degree with a minimum of 6 years of experience in technology, or, in lieu of a degree, 8+ years of experience in technology
Minimum of 3 years of experience leading a development team, or an equivalent role in observability, logging, or security platforms
Demonstrated success in managing large-scale logging platforms in regulated environments
Excellent communication, leadership, and cross-functional collaboration skills
Experience with scripting languages such as Python, Bash, or PowerShell for automation and integration purposes
Prior experience in large-scale, security-driven logging or observability platform development
Excellent problem-solving skills and the ability to work independently or as part of a team
Strong communication and interpersonal skills to interact effectively with team members and stakeholders
Knowledge of IT Service Management (ITSM) and monitoring tools
Knowledge of other data analytics tools or platforms is a plus

Work Timings: 01:00 PM to 10:00 PM IST
This role qualifies for the Enhanced Flexibility and Choice offered in Synchrony India and requires the incumbent to be available between 06:00 AM and 11:30 AM Eastern Time (timings are anchored to US Eastern hours and adjust twice a year locally). This window is for meetings with the India and US teams; the remaining hours are flexible for the employee to choose. Exceptions may apply periodically due to business needs; please discuss the details with the hiring manager.

For Internal Applicants
Understand the criteria and mandatory skills required for the role before applying
Inform your manager and HRM before applying for any role on Workday
Ensure that your professional profile is updated (fields such as education, prior experience, other skills), and upload your updated resume (Word or PDF format); this is mandatory
You must not be on any corrective action plan (First Formal/Final Formal, PIP)
L9+ employees who have completed 18 months in the organization and 12 months in their current role and level are eligible; L09+ employees can apply

Level/Grade: 11
Job Family Group: Information Technology
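On the masking/anonymization point above, here is a small illustrative Python sketch that pseudonymizes and redacts sensitive fields in a log event before it is forwarded for indexing; the field names and the keyed-hash approach are assumptions of mine, not Synchrony's practice.

```python
import hashlib
import hmac
import re

SECRET = b"rotate-me-via-a-secrets-store"  # hypothetical salt; never hard-code in production

def pseudonymize(value: str) -> str:
    """Deterministic keyed hash: the same input maps to the same token,
    so events remain correlatable without exposing the raw value."""
    return hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()[:16]

# Crude pattern for card-number-like digit runs in free-text messages.
CARD_RE = re.compile(r"\b\d{13,16}\b")

def mask_event(event: dict) -> dict:
    masked = dict(event)
    if "user_id" in masked:
        masked["user_id"] = pseudonymize(masked["user_id"])
    masked["message"] = CARD_RE.sub("[REDACTED-PAN]", masked.get("message", ""))
    return masked

event = {"user_id": "jdoe42", "message": "payment 4111111111111111 declined"}
print(mask_event(event))
# user_id becomes a 16-hex-char token; the PAN in the message is redacted.
```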

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies