Jobs
Interviews

460 Masking Jobs - Page 5

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0 years

6 - 9 Lacs

Noida, Uttar Pradesh, India

On-site

About MyOperator
MyOperator is India's top cloud communications provider, offering a comprehensive SaaS platform to 10,000+ businesses, including IRCTC, Razorpay, and Amazon. Our services include Cloud Call Center, IVR, Toll-free Numbers, and Enterprise Mobility. We've recently ventured into selling WhatsApp Business Solutions, alongside launching Heyo Phone, an SMB-focused conversation app, backed by super-angels Amit Chaudhary and Aakash Chaudhry. Awarded for ease of use and exceptional customer service, MyOperator leads India's cloud communications segment. Explore our solutions like call masking, call confirmation, and multi-store at myoperator.com.

Responsibilities
- Conduct in-depth research to identify and prospect for qualified leads.
- Utilize various channels such as email, phone, and social media to connect with potential customers.
- Effectively qualify leads through needs-discovery conversations to understand their challenges and pain points.
- Develop strong communication skills to present product features and benefits compellingly.
- Maintain accurate records of all lead interactions within the CRM system.
- Contribute to the continuous improvement of the sales development process.
- Product Expertise: Maintain in-depth knowledge of MyOperator's product offerings, staying up to date with new features and capabilities.
- Consultative Selling: Engage in consultative selling, proactively identifying opportunities to enhance the customer's communication infrastructure.
- CXO Interaction: Conduct effective discussions with CXOs and key decision-makers to influence their adoption of MyOperator solutions.
- Market Insights: Stay informed about industry trends, competitor activities, and market dynamics to make informed sales decisions.
- Sales Collateral: Develop and deliver compelling sales presentations, proposals, and other collateral to effectively communicate the value proposition of MyOperator.
Skills: B2B, SaaS, B2B software, Salesforce.com administration, software sales

Posted 2 weeks ago

Apply

4.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Responsibilities
- From Research to Reality: You'll be the bridge between cutting-edge academic research (activity recognition, multi-camera tracking, Video Language Models) and deployable, production-grade features on https://deeplabel.app.
- Building Solutions: Design, develop, and optimize Deep Learning models for human activity analysis in videos, from initial concept to final deployment.
- System Ownership: Take charge of your modules, ensuring they're robust, efficient, and seamlessly integrated with our AI platform, collaborating closely with our full-stack team.

Requirements
- 4-5 years of industry experience, with 2-3 years specifically in practical, project-driven Deep Learning in the Video AI domain.
- Rock-solid Python coding skills.
- Deep practical knowledge of PyTorch and ONNX.
- Proven track record of deploying data pipelines for Computer Vision projects.
- The ability to independently set up, troubleshoot, and optimize Linux workstations (CUDA, OpenCV).
- A strong grasp of Deep Learning concepts (optimizers, attention, masking, model tuning).
- Demonstrated experience with activity detection or object detection implementations.
- A keen ability to read and implement new approaches from research papers.

Your Personality Is Just As Important
- You're incredibly curious and love to experiment.
- You have an unwavering commitment to learning and overcoming challenges.
- You're a great communicator and thrive in a collaborative environment.
- You take full ownership of your work, seeing it through to successful deployment.

This job was posted by Vinay Ts from Streamingo.
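The requirements mention attention and masking among core Deep Learning concepts. As a quick, framework-free illustration (not part of the posting, and independent of any Streamingo code), here is a minimal sketch of a masked softmax, the mechanism that lets an attention layer assign zero weight to padded or invalid positions:

```python
import math

def masked_softmax(scores, mask):
    """Softmax over attention scores, forcing masked-out positions to zero weight.

    scores: list of floats (one raw attention score per position)
    mask:   list of bools, True where the position is valid
    """
    neg_inf = float("-inf")
    # Invalid positions get -inf so exp() drives their weight to 0.
    masked = [s if m else neg_inf for s, m in zip(scores, mask)]
    peak = max(masked)  # subtract the max for numerical stability
    exps = [math.exp(s - peak) if s != neg_inf else 0.0 for s in masked]
    total = sum(exps)
    return [e / total for e in exps]

# Three valid positions and one padding position at the end.
weights = masked_softmax([2.0, 1.0, 3.0, 0.5], [True, True, True, False])
```

In PyTorch the same effect is typically achieved with `masked_fill` on the score tensor before `softmax`; the list-based version above just makes the arithmetic visible.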

Posted 2 weeks ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description

Your Responsibilities
We are seeking an experienced and highly motivated Sr Data Engineer - Data Ingestion to join our dynamic team. The ideal candidate will have strong hands-on experience with Azure Data Factory (ADF), a deep understanding of relational and non-relational data ingestion techniques, and proficiency in Python programming. You will be responsible for designing and implementing scalable data ingestion solutions that interface with Azure Data Lake Storage Gen 2 (ADLS Gen 2), Databricks, and various other Azure ecosystem services. The Data Ingestion Engineer will work closely with stakeholders to gather data ingestion requirements, create modularized ingestion solutions, and define best practices to ensure efficient, robust, and scalable data pipelines. This role requires effective communication skills, ownership, and accountability for the delivery of high-quality data solutions.

Data Ingestion Strategy & Development:
- Design, develop, and deploy scalable and efficient data pipelines in Azure Data Factory (ADF) to move data from multiple sources (relational, non-relational, files, APIs, etc.) into Azure Data Lake Storage Gen 2 (ADLS Gen 2), Azure SQL Database, and other target systems.
- Implement ADF activities (copy, lookup, execute pipeline, etc.) to integrate data from on-premises and cloud-based systems.
- Build parameterized and reusable pipeline templates in ADF to standardize the data ingestion process, ensuring maintainability and scalability of ingestion workflows.
- Integrate custom data transformation activities within ADF pipelines, utilizing Python, Databricks, or Azure Functions when required.

ADF Data Flows Design & Development:
- Leverage Azure Data Factory Data Flows to visually design and orchestrate data transformation tasks, enabling complex ETL (Extract, Transform, Load) logic to process large datasets at scale.
- Design data flow transformations such as filtering, aggregation, joins, lookups, and sorting to process and transform data before loading it into target systems like ADLS Gen 2 or Azure SQL Database.
- Implement incremental loading strategies in Data Flows to ensure efficient, optimized ingestion of large data volumes while minimizing resource consumption.
- Develop reusable data flow components to streamline transformation processes, ensuring consistency and reducing development time for new data ingestion pipelines.
- Utilize debugging tools in Data Flows to troubleshoot, test, and optimize data transformations, ensuring accurate results and good performance.

ADF Orchestration & Automation:
- Use ADF triggers and scheduling to automate pipeline execution based on time or events, ensuring timely and efficient data ingestion.
- Configure ADF monitoring and alerting capabilities to proactively track pipeline performance, handle failures, and address issues in a timely manner.
- Implement ADF version control practices using Git to manage code changes, collaborate effectively with other team members, and ensure code integrity.

Data Integration with Various Sources:
- Ingest data from diverse sources such as on-premises SQL Servers, REST APIs, cloud databases (e.g., Azure SQL Database, Cosmos DB), file-based systems (CSV, Parquet, JSON), and third-party services using ADF.
- Design and implement ADF linked services to securely connect to external data sources (databases, file systems, APIs, etc.).
- Develop and configure ADF datasets and data flows to efficiently transform, clean, and load data into Azure Data Lake or other destinations.

Pipeline Monitoring and Optimization:
- Continuously monitor and optimize ADF pipelines to ensure they run with high performance and minimal cost. Apply techniques like data partitioning, parallel processing, and incremental loading where appropriate.
- Implement data quality checks within the pipelines to ensure data integrity and handle data anomalies or errors in a systematic manner.
- Review pipeline execution logs and performance metrics regularly, and apply tuning recommendations to improve execution times and reduce operational costs.

Collaboration and Communication:
- Work closely with business and technical stakeholders to capture and translate data ingestion requirements into ADF pipeline designs.
- Provide ADF-specific technical expertise to both internal and external teams, guiding them in the use of ADF for efficient and cost-effective data pipelines.
- Document ADF pipeline designs, error-handling strategies, and best practices to ensure the team can maintain and scale the solutions.
- Conduct training sessions or knowledge transfer with junior engineers or other team members on ADF best practices and architecture.

Security and Compliance:
- Ensure all data ingestion solutions built in ADF follow security and compliance guidelines, including encryption at rest and in transit, data masking, and identity and access management.
- Implement role-based access control (RBAC) and managed identities within ADF to manage access securely and reduce the risk of unauthorized access to sensitive data.

Integration with Azure Ecosystem:
- Leverage other Azure services, such as Azure Logic Apps, Azure Function Apps, and Azure Databricks, to augment the capabilities of ADF pipelines, enabling more advanced data processing, event-driven workflows, and custom transformations.
- Incorporate Azure Key Vault to securely store and manage sensitive data (e.g., connection strings, credentials) used in ADF pipelines.
- Integrate ADF with Azure Data Lake Analytics, Synapse Analytics, or other data warehousing solutions for advanced querying and analytics after ingestion.

Best Practices & Continuous Improvement:
- Develop and enforce best practices for building and maintaining ADF pipelines and data flows, ensuring the solutions are modular, reusable, and follow coding standards.
- Identify opportunities for pipeline automation to reduce manual intervention and improve operational efficiency.
- Regularly review and suggest new tools or services within the Azure ecosystem to enhance ADF pipeline performance and increase the overall efficiency of data ingestion workflows.

Incident and Issue Management:
- Actively monitor the health of the data pipelines, swiftly addressing any failures, data quality issues, or performance bottlenecks.
- Troubleshoot ADF pipeline errors, including issues within Data Flows, and work with other teams to root-cause issues related to data availability, quality, or connectivity.
- Participate in post-mortem analysis for any major incidents, documenting lessons learned and implementing preventative measures for the future.

Your Profile
- Experience with Azure Data Services: Strong experience with Azure Data Factory (ADF) for orchestrating data pipelines. Hands-on experience with ADLS Gen 2, Databricks, and various data formats (e.g., Parquet, JSON, CSV). Solid understanding of Azure SQL Database, Azure Logic Apps, Azure Function Apps, and Azure Container Apps.
- Programming and Scripting: Proficient in Python for data ingestion, automation, and transformation tasks. Ability to write clean, reusable, and maintainable code.
- Data Ingestion Techniques: Solid understanding of relational and non-relational data models and their ingestion techniques. Experience with file-based data ingestion, API-based data ingestion, and integrating data from various third-party systems.
- Problem Solving & Analytical Skills
- Communication Skills

#IncludingYou
Diversity, equity, inclusion and belonging are cornerstones of ADM's efforts to continue innovating, driving growth, and delivering outstanding performance. We are committed to attracting and retaining a diverse workforce and to creating welcoming, truly inclusive work environments: environments that enable every ADM colleague to feel comfortable on the job, make meaningful contributions to our success, and grow their career. We respect and value the unique backgrounds and experiences that each person brings to ADM, because we know that diversity of perspectives makes us better, together. For more information regarding our efforts to advance Diversity, Equity, Inclusion & Belonging, please visit our website: Diversity, Equity and Inclusion | ADM.

About ADM
At ADM, we unlock the power of nature to provide access to nutrition worldwide. With industry-advancing innovations, a complete portfolio of ingredients and solutions to meet any taste, and a commitment to sustainability, we give customers an edge in solving the nutritional challenges of today and tomorrow. We're a global leader in human and animal nutrition and the world's premier agricultural origination and processing company. Our breadth, depth, insights, facilities and logistical expertise give us unparalleled capabilities to meet needs for food, beverages, health and wellness, and more. From the seed of the idea to the outcome of the solution, we enrich the quality of life the world over. Learn more at www.adm.com.

Req/Job ID 97477BR Ref ID
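The posting repeatedly calls for incremental loading strategies. In ADF this is usually built with a lookup activity that reads a stored high watermark plus a filtered copy activity; the minimal Python sketch below (illustrative only, with invented column names) shows the core logic that pattern implements:

```python
from datetime import datetime

def incremental_load(source_rows, watermark):
    """High-watermark incremental ingest: pick only rows modified after the
    stored watermark, then advance the watermark for the next run.

    source_rows: list of dicts with an invented 'modified' timestamp column
    watermark:   datetime of the last successful load
    """
    new_rows = [r for r in source_rows if r["modified"] > watermark]
    # If nothing is new, the watermark stays where it was.
    new_watermark = max((r["modified"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

rows = [
    {"id": 1, "modified": datetime(2024, 1, 1)},
    {"id": 2, "modified": datetime(2024, 1, 5)},
    {"id": 3, "modified": datetime(2024, 1, 9)},
]
loaded, wm = incremental_load(rows, watermark=datetime(2024, 1, 3))
```

Only rows 2 and 3 are re-ingested, and the persisted watermark advances to the newest modification time, so each run touches only the delta rather than the full table.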

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

The Cloud Storage Administrator will manage and support cloud-based storage platforms in AWS and/or Azure. This role involves configuring, monitoring, and optimizing object, block, and file storage solutions to ensure high availability, performance, and data protection across our cloud infrastructure.

Required Skills
- Administer and support cloud storage services such as Amazon S3, EBS, EFS, and Glacier, and Azure Blob, File, and Archive Storage.
- Disaster mitigation design and implementation experience, with a focus on architecture for cross-region replication, backup management, RTO and RPO planning, and chaos-engineering recovery. Demonstrated use of AWS Elastic Disaster Recovery or Azure Site Recovery.
- Certification and privacy standards associated with PII, data protection, and compliance gap expectations. Ability to identify and tag PII, applying encryption and masking techniques, with knowledge and experience in compliance certification (SOC 2, ISO 27001, GDPR, etc.) and demonstrated use of Amazon Macie or Azure Purview.
- Monitoring and cost optimization practices to proactively alert on performance, usage, and anomalies. Demonstrated use of AWS CloudWatch or Azure Monitor, and AWS Cost Explorer or Azure Cost Management.
- Embrace IaC and automation practices for backups, lifecycles, and archival policies. Demonstrated expertise with AWS CloudFormation or Azure DevOps and a history of using Terraform modules for cloud storage.
- Manage backup and recovery processes using native cloud tools and third-party solutions.
- Implement storage policies including lifecycle rules, replication, and access controls.
- Perform capacity planning and forecasting for storage growth and utilization.
- Collaborate with infrastructure and application teams to meet storage and data access requirements.
- Ensure storage systems comply with data protection, retention, and security standards.
- Document configurations, procedures, and best practices for storage management.
- Respond to incidents and service requests related to storage systems.
- Participate in change and incident management processes aligned with ITSM standards.

Required Experience
- 3+ years of experience in storage administration with cloud platforms (AWS, Azure, or both).
- Hands-on experience with cloud-native storage services and understanding of storage protocols.
- Experience with AWS CloudWatch and Azure Monitor, and the ability to set up proactive alerting on storage performance, usage, and anomalies.
- Strong troubleshooting and performance-tuning skills related to storage.
- Familiarity with backup and disaster recovery solutions in cloud environments.
- Understanding of identity and access management as it pertains to storage services.
- Knowledge of ITSM processes such as incident, change, and problem management.
- Experience with storage cost monitoring tools such as AWS Cost Explorer or Azure Cost Management.
- Knowledge of IaC tools (Terraform, CloudFormation) for provisioning storage resources and automating backup, lifecycle, and archival policies.
- Producing technical documentation.
- Exposure to enterprise backup solutions.
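Lifecycle rules are one of the storage policies listed above. As a hedged sketch, the dictionary below follows the shape of an S3 lifecycle configuration document (the kind passed to the put-bucket-lifecycle-configuration API), with a tiny sanity check that transitions precede expiration; the rule name and prefix are invented for illustration:

```python
# Illustrative S3 lifecycle configuration: move objects under an invented
# "logs/" prefix to Glacier after 30 days, delete them after a year.
lifecycle = {
    "Rules": [
        {
            "ID": "archive-then-expire-logs",  # hypothetical rule name
            "Filter": {"Prefix": "logs/"},
            "Status": "Enabled",
            "Transitions": [
                {"Days": 30, "StorageClass": "GLACIER"},
            ],
            "Expiration": {"Days": 365},
        }
    ]
}

def check_rule(rule):
    """Sanity check: every storage-class transition must happen
    strictly before the object expires."""
    expire = rule.get("Expiration", {}).get("Days", float("inf"))
    return all(t["Days"] < expire for t in rule.get("Transitions", []))

ok = all(check_rule(r) for r in lifecycle["Rules"])
```

In Terraform or CloudFormation the same policy would be declared rather than built in code, but validating rule ordering like this before applying is a cheap way to catch misconfigured archival policies.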

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: Photo Retoucher
Location: Pune
Reports To: Lead Photographer

Role Overview
We're seeking a detail-oriented Photo Retoucher to support a well-known amateur photographer in transforming raw images into polished, publication-ready assets. You'll apply advanced Photoshop techniques, maintain a consistent visual style, and manage digital assets to ensure smooth post-production workflows.

Key Responsibilities
- Image Enhancement & Cleanup (40%): Perform color correction, exposure/contrast adjustments, and tonal grading. Remove blemishes, stray hairs, dust spots, and other imperfections. Apply frequency-separation, dodge & burn, and healing techniques for skin and product retouching.
- Composite & Creative Edits (20%): Execute background replacements, compositing, and masking as required. Integrate retouching presets and creative LUTs to achieve the photographer's signature look.
- Batch Processing & Workflow Automation (15%): Develop and maintain Photoshop actions and scripts for efficiency. Process large sets of images consistently while adhering to quality standards.
- Asset Management & Delivery (15%): Organize, rename, and back up image files following studio naming conventions. Prepare final deliverables in required formats (JPEG, TIFF, PSD) and resolutions for web, print, and social media.
- Client & Stakeholder Collaboration (10%): Communicate edit status, turnaround times, and any technical challenges. Incorporate feedback swiftly and accurately, ensuring client satisfaction.

Key Performance Indicators (KPIs)
- Edit Turnaround Time: ≥ 90% of batches delivered within agreed SLA
- Quality Consistency: < 2% revision requests per batch
- Error Rate: < 1% final-file technical errors (artifacts, wrong color space, missing layers)
- Workflow Efficiency: automated actions cover ≥ 50% of routine tasks
- Asset Organization Compliance: 100% of files named and archived per naming conventions

Qualifications & Skills
- Technical Proficiency: Expert in Adobe Photoshop (CC), including advanced retouching tools and techniques. Familiarity with Lightroom and Camera Raw for preliminary adjustments.
- Experience: 2+ years as a Photo Retoucher or in a similar post-production role. Portfolio demonstrating both high-end beauty/product retouching and creative composites.
- Attention to Detail: Keen eye for color, lighting, composition, and consistency across large image sets.
- Time Management: Ability to manage multiple projects and meet tight deadlines without sacrificing quality.
- Communication: Clear, proactive updates on project status; receptive to feedback.

Education & Certifications
- Diploma or degree in Photography, Graphic Design, Visual Arts, or a related field (preferred).
- Adobe Certified Expert (ACE) in Photoshop (a plus).

Tools & Environment
- Adobe Creative Cloud (Photoshop, Lightroom, Illustrator as needed)
- Digital asset management tools (e.g., Adobe Bridge, Capture One, or studio-specific DAM)
- PC or Mac workstation with a calibrated monitor
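The asset-management duties above (consistent renaming, 100% naming-convention compliance) can be automated outside Photoshop. The sketch below uses only the Python standard library; the client_date_sequence convention shown is an invented example, not the studio's actual scheme:

```python
from pathlib import Path

def convention_name(client, shoot_date, index, ext):
    """Build a studio-style file name like 'acme_2024-05-01_0001.tif'.
    The client/date/sequence convention here is an invented example."""
    return f"{client}_{shoot_date}_{index:04d}{ext}"

def rename_batch(files, client, shoot_date):
    """Map original file names to convention-compliant names.

    Files are processed in sorted order so sequence numbers are
    stable if the batch is re-run. Extensions are lowercased for
    consistency across cameras that emit .TIF vs .tif."""
    mapping = {}
    for i, name in enumerate(sorted(files), start=1):
        ext = Path(name).suffix.lower()
        mapping[name] = convention_name(client, shoot_date, i, ext)
    return mapping

renamed = rename_batch(["IMG_9.TIF", "IMG_2.tif"], "acme", "2024-05-01")
```

Producing the mapping first, then renaming (and backing up) from it, also gives an audit trail for the compliance KPI.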

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Delhi

On-site

The Role

Context: This is an exciting opportunity to join a dynamic and growing organization, working at the forefront of technology trends and developments in the social impact sector. The Wadhwani Center for Government Digital Transformation (WGDT) works with government ministries and state departments in India with a mission of "Enabling digital transformation to enhance the impact of government policy, initiatives and programs". We are seeking a highly motivated and detail-oriented individual to join our team as a Data Engineer with experience in designing, constructing, and maintaining the architecture and infrastructure necessary for data generation, storage, and processing, and in contributing to the successful implementation of digital government policies and programs. You will play a key role in developing robust, scalable, and efficient systems to manage large volumes of data, making it accessible for analysis and decision-making, and driving innovation and optimized operations across various government ministries and state departments in India.

Key Responsibilities:
a. Data Architecture Design: Design, develop, and maintain scalable data pipelines and infrastructure for ingesting, processing, storing, and analyzing large volumes of data efficiently. This involves understanding business requirements and translating them into technical solutions.
b. Data Integration: Integrate data from various sources such as databases, APIs, streaming platforms, and third-party systems. Ensure the data is collected reliably and efficiently, maintaining data quality and integrity throughout the process as per the ministries'/government data standards.
c. Data Modeling: Design and implement data models to organize and structure data for efficient storage and retrieval, using techniques such as dimensional modeling, normalization, and denormalization depending on the specific requirements of the project.
d. Data Pipeline Development / ETL (Extract, Transform, Load): Develop data pipeline/ETL processes to extract data from source systems, transform it into the desired format, and load it into the target data systems. This involves writing scripts, using ETL tools, or building data pipelines to automate the process and ensure data accuracy and consistency.
e. Data Quality and Governance: Implement data quality checks and data governance policies to ensure data accuracy, consistency, and compliance with regulations. Design and track data lineage, data stewardship, metadata management, business glossaries, etc.
f. Data Lakes and Warehousing: Design and maintain data lakes and data warehouses to store and manage structured data from relational databases, semi-structured data like JSON or XML, and unstructured data such as text documents, images, and videos at any scale. Integrate with big data processing frameworks such as Apache Hadoop, Apache Spark, and Apache Flink, as well as with machine learning and data visualization tools.
g. Data Security: Implement security practices, technologies, and policies designed to protect data from unauthorized access, alteration, or destruction throughout its lifecycle. This includes data access controls, encryption, data masking and anonymization, data loss prevention, and compliance with regulatory requirements such as DPDP, GDPR, etc.
h. Database Management: Administer and optimize databases, both relational and NoSQL, to manage large volumes of data effectively.
i. Data Migration: Plan and execute data migration projects to transfer data between systems while ensuring data consistency and minimal downtime.
j. Performance Optimization: Optimize data pipelines and queries for performance and scalability. Identify and resolve bottlenecks, tune database configurations, and implement caching and indexing strategies to improve data processing speed and efficiency.
k. Collaboration: Collaborate with data scientists, analysts, and other stakeholders to understand their data requirements and provide them with access to the necessary data resources. Work closely with IT operations teams to deploy and maintain data infrastructure in production environments.
l. Documentation and Reporting: Document work including data models, data pipelines/ETL processes, and system configurations. Create documentation and provide training to other team members to ensure the sustainability and maintainability of data systems.
m. Continuous Learning: Stay updated with the latest technologies and trends in data engineering and related fields. Participate in training programs, attend conferences, and engage with the data engineering community to enhance skills and knowledge.

Desired Skills / Competencies
- Education: A Bachelor's or Master's degree in Computer Science, Software Engineering, Data Science, or equivalent, with at least 5 years of experience.
- Database Management: Strong expertise in working with databases, such as SQL databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- Big Data Technologies: Familiarity with big data technologies, such as Apache Hadoop, Spark, and related ecosystem components, for processing and analyzing large-scale datasets.
- ETL Tools: Experience with ETL tools (e.g., Apache NiFi, Talend, Apache Airflow, Talend Open Studio, Pentaho, InfoSphere) for designing and orchestrating data workflows.
- Data Modeling and Warehousing: Knowledge of data modeling techniques and experience with data warehousing solutions (e.g., Amazon Redshift, Google BigQuery, Snowflake).
- Data Governance and Security: Understanding of data governance principles and best practices for ensuring data quality and security.
- Cloud Computing: Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services for scalable and cost-effective data storage and processing.
- Streaming Data Processing: Familiarity with real-time data processing frameworks (e.g., Apache Kafka, Apache Flink) for handling streaming data.

KPIs:
- Data Pipeline Efficiency: Measure the efficiency of data pipelines in terms of data processing time, throughput, and resource utilization. KPIs could include average time to process data, data ingestion rates, and pipeline latency.
- Data Quality Metrics: Track data quality metrics such as completeness, accuracy, consistency, and timeliness of data. KPIs could include data error rates, missing values, data duplication rates, and data validation failures.
- System Uptime and Availability: Monitor the uptime and availability of data infrastructure, including databases, data warehouses, and data processing systems. KPIs could include system uptime percentage, mean time between failures (MTBF), and mean time to repair (MTTR).
- Data Storage Efficiency: Measure the efficiency of data storage systems in terms of storage utilization, data compression rates, and data retention policies. KPIs could include storage utilization rates, data compression ratios, and data storage costs per unit.
- Data Security and Compliance: Track adherence to data security policies and regulatory compliance requirements such as DPDP, GDPR, HIPAA, or PCI DSS. KPIs could include security incident rates, data access permissions, and compliance audit findings.
- Data Processing Performance: Monitor the performance of data processing tasks such as ETL processes, data transformations, and data aggregations. KPIs could include data processing time, CPU usage, and memory consumption.
- Scalability and Performance Tuning: Measure the scalability and performance of data systems under varying workloads and data volumes. KPIs could include scalability benchmarks, system response times under load, and performance improvements achieved through tuning.
- Resource Utilization and Cost Optimization: Track resource utilization and costs associated with data infrastructure, including compute resources, storage, and network bandwidth. KPIs could include cost per data unit processed, cost per query, and cost savings achieved through optimization.
- Incident Response and Resolution: Monitor the response time and resolution time for data-related incidents and issues. KPIs could include incident response time, time to diagnose and resolve issues, and customer satisfaction ratings for support services.
- Documentation and Knowledge Sharing: Measure the quality and completeness of documentation for data infrastructure, data pipelines, and data processes. KPIs could include documentation coverage, documentation update frequency, and knowledge-sharing activities such as internal training sessions or knowledge-base contributions.

Years of experience of the current role holder: New position
Ideal years of experience: 3 - 5 years
Career progression for this role: CTO, WGDT (Head of Incubation Centre)

Wadhwani Corporate Profile

Our Culture: WF is a global not-for-profit that works like a start-up, at a fast-moving, dynamic pace where change is the only constant and flexibility is the key to success. Three mantras that we practice across job roles, levels, functions, programs, and initiatives are Quality, Speed, and Scale, in that order. We are an ambitious and inclusive organization, where everyone is encouraged to contribute and ideate. We are intensely and insanely focused on driving excellence in everything we do. We want individuals with the drive for excellence and the passion to do whatever it takes to deliver world-class outcomes to our beneficiaries. We set our own standards, often more rigorous than what our beneficiaries demand, and we want individuals who love it this way. We have a creative and highly energetic environment, one in which we look to each other to innovate new solutions not only for our beneficiaries but for ourselves too. Individuals open to collaborating with a borderless mentality, often going beyond the hierarchy and siloed definitions of functional KRAs, will thrive in our environment. This is a workplace where expertise is shared with colleagues around the globe. Individuals uncomfortable with change, constant innovation, and short learning cycles, and those looking for stability and orderly working days, may not find WF to be the right place for them. Finally, we want individuals who want to do greater good for society, leveraging their area of expertise, skills, and experience. The foundation is an equal opportunity firm with no bias towards gender, race, colour, ethnicity, country, language, age, or any other dimension that comes in the way of progress. Join us and be a part of us!

Education: Bachelors in Technology / Masters in Technology
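Among the KPIs in this posting are data completeness and duplication rates. A minimal, library-free sketch of how such checks might be computed over a batch of ingested records (the `id`/`name` field names are invented for illustration):

```python
def completeness(records, field):
    """Fraction of records where `field` is present and non-empty."""
    if not records:
        return 1.0
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def duplication_rate(records, key):
    """Fraction of records that repeat an already-seen key value."""
    if not records:
        return 0.0
    seen, dupes = set(), 0
    for r in records:
        k = r.get(key)
        if k in seen:
            dupes += 1
        seen.add(k)
    return dupes / len(records)

batch = [
    {"id": 1, "name": "a"},
    {"id": 2, "name": ""},      # missing name -> completeness hit
    {"id": 2, "name": "c"},     # repeated id  -> duplication hit
    {"id": 3, "name": "d"},
]
name_completeness = completeness(batch, "name")
dup_rate = duplication_rate(batch, "id")
```

In practice these checks would run inside the pipeline (e.g. as a validation step before load) and feed the KPI dashboards the posting describes, rather than as standalone functions.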

Posted 2 weeks ago

Apply

8.0 - 10.0 years

0 Lacs

Hyderābād

On-site

Overview: Facilitate and enable all procurement needs for plants, co-packers, and distribution centers. The team is a critical link between Global Procurement and the Manufacturing Network; production needs cannot be met unless contracts, vendors, and materials are linked to facilities.
- Creation and management of all contracts for package materials and ingredients across multiple business units. Ensure sourcing options can meet aggressive timelines and deliver on-budget product launches.
- Ensuring all contracts are sourced. This data feeds supplier requirements based on Demand Planning.
- Complete all new material requests for packaging and ingredients and populate all financial attributes, including standard price, freight, and masking component.
- Process any requests to set up vendors as needed for contracts.
- Process price-blocked invoices and research root causes (pricing discrepancies, freight issues, and price changes) to avoid credit holds and keep plants running smoothly by meeting materials demand.
- Create miscellaneous purchase orders for scrap, plate and make-ready charges, and any other Global Procurement-related costs for direct materials.
- Ad hoc reporting as needed (Global Procurement buyers, Supply Chain Finance Purchasing, senior leaders).
- Professional Development: Supporting team members' growth and development through coaching and mentorship.
- Resource Management: Ensuring the team has the necessary resources and tools to perform their duties.
- Problem-Solving and Conflict Resolution: Addressing issues within the team constructively and promptly.

Responsibilities:
- Strategic Supply Management Team
- Purchasing, Supply Chain Finance, category analysts, COE team
- Data Maintenance teams
- Manufacturing plants, co-packers, distribution centers, storage facilities
- PFFS (Payables and Supplier Maintenance)
- Supply Chain Project Managers, MRP Managers, and Integration Managers
- Cost Accounting teams, all divisions
- Global Procurement Buyers

Qualifications:
- Obtaining a high degree of cooperation from Supply Chain BU Managers to consistently create the correct information for all production material master data input
- Timely communication of price changes for all direct material contracts from GP and the GP Control Team
- Managing manufacturing plants' needs while ensuring compliance and following protocol
- Experience in contract management/payables/procurement roles
- 8-10 years of experience in payables/vendor management and resource management
- Hands-on SAP experience
- Able to work independently or as part of a team; takes initiative
- Capable of managing multiple time-sensitive priorities simultaneously
- Detail-oriented, methodical, organized in approach, and diligent with document maintenance
- Consistent performance; curious to learn and explore
- Exceptional communication skills; proficiency in English
- Ability to spot errors and connect the dots

Posted 2 weeks ago

Apply

8.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Overview: Facilitate and enable all procurement needs for Plants, Co-packers, and Distribution Centers. The team is a critical link between Global Procurement and the Manufacturing Network: production needs cannot be met unless contracts, vendors, and materials are linked to facilities. Create and manage all contracts for packaging materials and ingredients across multiple business units. Ensure sourcing options can meet aggressive timelines and deliver product launches on budget. Ensure all contracts are sourced; this data feeds supplier requirements based on Demand Planning. Complete all new material requests for packaging and ingredients and populate all financial attributes, including standard price, freight, and masking component. Process any requests to set up vendors as needed for contracts. Process price-blocked invoices and research root causes (pricing discrepancies, freight issues, and price changes) to avoid credit holds and keep plants running smoothly by meeting materials demand. Create miscellaneous purchase orders for scrap, plate and make-ready charges, and any other Global Procurement-related costs for Direct Materials. Provide ad hoc reporting as needed (Global Procurement buyers, Supply Chain Finance Purchasing, senior leaders). Professional Development: Support team members' growth and development through coaching and mentorship. Resource Management: Ensure the team has the necessary resources and tools to perform their duties. Problem-Solving and Conflict Resolution: Address issues within the team constructively and promptly. Responsibilities: Strategic Supply Management Team; Purchasing; Supply Chain Finance, category analysts, COE team; Data Maintenance teams; manufacturing plants, co-packers, Distribution Centers, and storage facilities. 
PFFS - Payables and Supplier Maintenance; Supply Chain Project Managers, MRP Managers, and Integration Managers; Cost Accounting teams across all Divisions; Global Procurement Buyers. Qualifications: Obtain a high degree of cooperation from Supply Chain BU Managers to consistently create correct information for all Production Material Master Data input. Communicate price changes for all Direct Material contracts from GP and the GP Control Team in a timely manner. Manage manufacturing plants' needs while ensuring compliance and following protocol. Experience in Contract Management/Payables/Procurement roles. 8-10 years of experience in Payables/Vendor Management and Resource Management. Hands-on SAP experience. Able to work independently or as part of a team; takes initiative. Capable of managing multiple time-sensitive priorities simultaneously. Detail-oriented, methodical, and organized in approach and document maintenance. Consistent performance; curious to learn and explore. Exceptional communication skills and proficiency in English. Ability to spot errors and connect the dots.

Posted 2 weeks ago

Apply

2.0 - 7.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead. Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills: SAP Governance Risk and Compliance (SAP GRC). Good to have skills: NA. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are aligned with business objectives and user needs, while maintaining a focus on quality and efficiency throughout the project lifecycle. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for the immediate team and across multiple teams. - Facilitate knowledge-sharing sessions to enhance team capabilities. - Monitor project progress and ensure timely delivery of milestones. - Design, build, implement, and support SAP security roles, profiles, and authorizations for SAP ECC, S/4HANA, Fiori, and GRC environments. - Manage SAP Security settings; update profiles, roles, permission sets, and object- and field-level access as necessary. - Perform security hardening activities and ensure all security settings are maintained per baseline. - Collaborate with cross-functional teams and provide security guidance. - Understanding of SAP UCON, Onapsis, and UI Masking tools. - Experience creating Fiori Spaces and Pages. - Experience handling non-SAP IDM tools such as SailPoint IIQ. - Worked on one or two SAP security implementations or upgrades. 
Qualifications: Bachelor's degree in computer science, systems analysis, or a related field, or equivalent experience. Proven track record of delivering SAP upgrades, migrations, and other Basis-related projects. 10-12+ years of hands-on experience in SAP BASIS & HANA installation, upgrades, patching, and migration. Exposure to handling the upgrade, installation, and migration of SAP technologies such as the S/4HANA suite, SAP Business Suite (ERP, BW/BI, APO, GRC), SAP integration suites (PO/PI, BODS, BOBJ), monitoring and administration using Solution Manager, SAP SaaS (Ariba, IBP, etc.), and BTP & Cloud ALM. Experience with technical evaluations of applications, databases, and integrations. Experience in backup/restore/recovery of SAP/Oracle installations, server monitoring, and optimization techniques. Experience with SAP HANA, Sybase, MaxDB, MS SQL, and Oracle databases. Experience resolving technical issues and deep problem-solving skills, including issues involving third parties. Excellent analytical, technical, and problem-solving skills; a root-cause-eradication mindset; a proactive approach; and customer centricity. Familiar with ITIL concepts of Service Management, Change Management, and Root Cause Analysis, and with ITIL tools such as ServiceNow and BMC Remedy. Cloud knowledge and understanding of hyperscaler platform setup for SAP applications (e.g., experience working in public cloud domains such as Microsoft Azure, AWS, and GCP). Experience working in a global environment and with virtual teams. Good communication skills to interact with cross-functional and global teams. Ability to work under pressure and manage multiple priorities effectively. Ability to work independently and as part of a team. 
Professional & Technical Skills: - Must-Have Skills: Proficiency in SAP Governance Risk and Compliance (SAP GRC). - Strong understanding of risk management frameworks and compliance regulations. - Experience with application design and configuration best practices. - Ability to analyze complex business requirements and translate them into technical specifications. - Familiarity with project management methodologies and tools. Additional Information: - The candidate should have a minimum of 7.5 years of experience in SAP Governance Risk and Compliance (SAP GRC). - This position is based at our Bengaluru office. - 15 years of full-time education is required. Qualification: 15 years full time education

Posted 2 weeks ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Chandigarh, Dadra & Nagar Haveli, Daman

Work from Office

Extensive experience with Mainframe systems, especially z/OS, CICS, DB2, and VSAM. Develop and implement system automation and scripting to streamline processes and improve efficiency. Support environment refreshes at the enterprise level; troubleshoot and resolve system and application issues, identifying root causes and implementing long-term fixes. Strong in SQL concepts. Knowledge of other mainframe utilities such as IMS, RACF, or HSM, and tools such as File-AID, Endevor, and DFSORT. Familiarity with system monitoring tools and performance tuning techniques. Familiarity with Test Data Management techniques and processes; strong understanding of data masking/obfuscation processes. Excellent communication and collaboration skills to work with cross-functional teams. Location - Chandigarh, Dadra & Nagar Haveli, Daman, Diu, Goa, Hyderabad, Jammu, Lakshadweep, New Delhi, Puducherry, Sikkim

Posted 2 weeks ago

Apply

2.0 - 4.0 years

3 Lacs

Gurgaon

On-site

Job Summary: We are looking for a highly skilled Motion Graphic Designer with strong expertise in Adobe After Effects and Premiere Pro to join our creative team. In this role, you will be responsible for conceptualizing, designing, and producing high-quality motion content for branding, marketing campaigns, and social platforms. If you’re passionate about visual storytelling and have a flair for design and animation, we’d love to meet you. Key Responsibilities: Create visually compelling motion graphics for social media, ads, websites, presentations, and internal communications. Design custom animations using After Effects, including kinetic typography, icon animation, explainer elements, and transitions. Edit and assemble raw footage using Premiere Pro, applying color correction, sound design, and visual enhancements. Collaborate closely with designers, video editors, marketers, and copywriters to bring stories to life through motion. Develop animation assets from scratch or enhance static designs with animation and transitions. Stay updated on design trends, motion techniques, and new tools/plugins for more efficient and modern workflows. Manage multiple projects, meet tight deadlines, and maintain a high standard of quality and creativity. Required Skills & Qualifications: Bachelor’s degree in Motion Design, Animation, Graphic Design, or related field. 2–4 years of proven experience in motion design, preferably in an agency or digital content environment. Expert-level proficiency in Adobe After Effects and Adobe Premiere Pro. Strong understanding of animation principles, video editing, typography, and visual hierarchy. Experience in compositing, masking, motion tracking, rotoscoping, and applying visual effects. Familiarity with sound design, color grading, and working with audio in video projects. Ability to integrate After Effects with Premiere Pro for efficient dynamic workflows. 
Portfolio showcasing a strong body of work in both motion graphics and video editing. Preferred (Bonus) Skills: Experience with 3D tools like Cinema 4D or Blender. Basic scripting or expressions in After Effects. Job Type: Full-time Pay: From ₹30,000.00 per month Benefits: Health insurance Schedule: Day shift Monday to Friday Application Question(s): How many years of experience do you have with Illustrator? How many years of experience do you have with Photoshop? How many years of experience do you have with a Digital Agency? How many years of experience do you have with Adobe Premiere Pro? Education: Bachelor's (Preferred) Experience: Video Editing: 2 years (Preferred) total work: 3 years (Preferred) Adobe After Effects: 2 years (Preferred) Work Location: In person

Posted 2 weeks ago

Apply

0 years

1 - 4 Lacs

India

On-site

Key Responsibilities: Create and integrate visual effects (VFX), motion graphics, and animations into videos. Collaborate with video editors, content creators, and directors to match the creative vision. Apply transitions, tracking, chroma keying (green screen), masking, and compositing effects. Ensure high-quality visual output optimized for YouTube and other digital platforms. Stay updated on the latest trends and tools in video/VFX editing. Skills Required: Basic knowledge of software like Adobe After Effects, Premiere Pro, DaVinci Resolve, Blender, or similar. Familiar with concepts like keyframing, masking, rotoscoping, and motion tracking. Creativity, attention to detail, and a strong visual aesthetic. Willingness to learn and adapt to the style and tone of the channel. Eligibility: Freshers and recent graduates are welcome. Background in multimedia, animation, design, or a related field is a plus. A portfolio or showreel (even academic/personal projects) will be highly appreciated. Job Type: Full-time Pay: ₹8,539.73 - ₹37,076.21 per month Schedule: Morning shift Work Location: In person

Posted 2 weeks ago

Apply

1.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Key Responsibility Areas (KRA): Associate with Senior Designers at the XP on mood board curation and preparing 3D renders and 3D and 2D detailed drawings. Ensure an error-free QC and masking package by making necessary corrections before sending the project into production. 1. Skill Set Required: Freshers up to 1 year of experience. Basic proficiency in SketchUp, Revit, and CAD. Strong willingness to learn and follow instructions. 2. Education: Diploma in Architecture or Civil, B.Tech Civil, or B.Arch.

Posted 2 weeks ago

Apply

0.0 - 4.0 years

0 - 0 Lacs

Gurugram, Haryana

On-site

Location: Gurugram Salary: Up to ₹50,000 per month Working Days: 5 days a week Job Type: Full-time Job Summary: We are looking for a highly skilled Motion Graphic Designer with strong expertise in Adobe After Effects and Premiere Pro to join our creative team. In this role, you will be responsible for conceptualizing, designing, and producing high-quality motion content for branding, marketing campaigns, and social platforms. If you’re passionate about visual storytelling and have a flair for design and animation, we’d love to meet you. Key Responsibilities: Create visually compelling motion graphics for social media, ads, websites, presentations, and internal communications. Design custom animations using After Effects, including kinetic typography, icon animation, explainer elements, and transitions. Edit and assemble raw footage using Premiere Pro, applying color correction, sound design, and visual enhancements. Collaborate closely with designers, video editors, marketers, and copywriters to bring stories to life through motion. Develop animation assets from scratch or enhance static designs with animation and transitions. Stay updated on design trends, motion techniques, and new tools/plugins for more efficient and modern workflows. Manage multiple projects, meet tight deadlines, and maintain a high standard of quality and creativity. Required Skills & Qualifications: Bachelor’s degree in Motion Design, Animation, Graphic Design, or related field. 2–4 years of proven experience in motion design, preferably in an agency or digital content environment. Expert-level proficiency in Adobe After Effects and Adobe Premiere Pro. Strong understanding of animation principles, video editing, typography, and visual hierarchy. Experience in compositing, masking, motion tracking, rotoscoping, and applying visual effects. Familiarity with sound design, color grading, and working with audio in video projects. 
Ability to integrate After Effects with Premiere Pro for efficient dynamic workflows. Portfolio showcasing a strong body of work in both motion graphics and video editing. Preferred (Bonus) Skills: Experience with 3D tools like Cinema 4D or Blender. Basic scripting or expressions in After Effects. Job Types: Full-time, Permanent Pay: ₹13,205.71 - ₹52,155.35 per month Benefits: Leave encashment Paid sick time Paid time off Schedule: Day shift Fixed shift Monday to Friday Weekend availability Supplemental Pay: Overtime pay Ability to commute/relocate: Gurgaon, Haryana: Reliably commute or planning to relocate before starting work (Preferred) Application Question(s): Current CTC Expected CTC Location: Gurgaon, Haryana (Preferred) Work Location: In person

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Hello All, Greetings from tesslogs. We are seeking a TDM Engineer to join our team. Role: TDM Engineer with K2View. Client: Infosys. Locations: Pune | Mysore | Bangalore. Work Mode: Hybrid. Relevant Experience: 1–2 Years. Notice Period: Immediate Joiners Preferred. We are looking for a TDM Engineer with hands-on experience in K2View and Test Data Management (TDM) to support test data provisioning and automation in a dynamic project environment. Key Skills Required: 1–2 years of relevant experience in K2View and TDM. Knowledge of data masking, subsetting, and synthetic data generation. Strong understanding of the test data lifecycle and quality assurance processes. Good communication and collaboration skills for working with cross-functional teams.
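The core TDM techniques this listing asks for (masking, subsetting, synthetic generation) are tool-specific in K2View, but the underlying ideas are simple. Below is a minimal Python sketch of deterministic masking and synthetic data generation; the field names and salt are invented for illustration and none of this reflects K2View's actual API:

```python
import hashlib
import random
import string

def mask_email(email: str, salt: str = "demo-salt") -> str:
    """Deterministic masking: the same input always yields the same masked
    value, which preserves referential integrity across subset tables."""
    local, _, domain = email.partition("@")
    digest = hashlib.sha256((salt + local).encode()).hexdigest()[:8]
    return f"user_{digest}@{domain}"

def synthetic_phone(rng: random.Random) -> str:
    """Synthetic generation: fabricate a plausible 10-digit phone number."""
    return "9" + "".join(rng.choice(string.digits) for _ in range(9))

rng = random.Random(42)  # fixed seed so refreshed test data is repeatable
row = {"name": "Asha", "email": "asha@example.com", "phone": "9812345678"}
masked_row = {
    "name": row["name"],                # non-sensitive: passed through
    "email": mask_email(row["email"]),  # masked, one-way
    "phone": synthetic_phone(rng),      # replaced with a synthetic value
}
print(masked_row)
```

Determinism is the key design choice here: if the same email masks to the same token in every table, joins in the masked test database still work.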

Posted 2 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Shadow design discussions that the Senior Designer holds with clients; prepare Minutes of Meetings and keep track of project milestones to ensure timely, high-quality delivery. Assist the Senior Designer with 3D designs using SpaceCraft (HomeLane software) and SketchUp; recommend enhancements and be a sounding board for the Senior Designer. Be available for site visits and masking along with the Senior Designer; take on responsibility for file management across HomeLane tech systems. Assist the Senior Designer in creating commercial proposals using SpaceCraft and other quoting tools; validate quotes to ensure customers get a transparent and fair estimate. Coordinate with various stakeholders to ensure a great design outcome; build relationships with teams like sales, drawing QC, project management, and planning. Mandatory Qualifications: Design education background - B.Arch, B.Des, M.Des, or Diploma in Design. 0-1 year of experience in Interior Design / Architecture. Good communication & presentation skills. Basic knowledge of modular furniture. Practical knowledge of SketchUp. A great attitude.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Experience: 6-8 years. Required Technical Skill Set: HPE Storage platforms (HPE Nimble, HPE Primera, HPE 3PAR): deployment, configuration, replication, and performance tuning; enterprise setup, firmware management, scalability, provisioning, snapshot/replication configuration. SAN switch technologies (Cisco MDS, Brocade switches): zoning, fabric configuration, troubleshooting, fabric management, diagnostics, and upgrades. Desired Competencies (Technical/Behavioral Competency) Must-Have (ideally no more than 3-5): · Experience with LUN provisioning, masking, and zoning across multi-host environments. · Proficiency with Fibre Channel, iSCSI, and FCoE protocols for block-level storage connectivity. · Knowledge of storage replication, snapshot technologies, and remote data protection solutions. · Proficient in backup integration and disaster recovery strategies in storage environments. · Experience performing firmware upgrades and hardware lifecycle management on storage devices. · Ability to conduct and analyze storage performance assessments, capacity planning, and security audits. · Familiarity with storage monitoring, alerting, and reporting tools for proactive system health checks. · Troubleshooting of hardware-level and storage network issues affecting performance and availability. · Adequate knowledge of Ethernet/iSCSI and Fibre Channel-based SAN topology. Good-to-Have: · Hands-on experience with HPE Storage platforms (HPE Nimble, HPE Primera, HPE 3PAR): deployment, configuration, replication, and performance tuning; enterprise setup, firmware management, scalability, provisioning, snapshot/replication configuration. Responsibility of / Expectations from the Role: 1. Ability to work independently in a fast-paced, dynamic environment. 2. Proven experience in designing, implementing, and managing enterprise storage solutions. 3. Deep knowledge of SAN (Storage Area Network), NAS (Network Attached Storage), and DAS (Direct Attached Storage) technologies. 4. Expertise in RAID levels, disk provisioning, and storage performance optimization. 5. Strong understanding of SAN switch technologies (Cisco MDS, Brocade switches): zoning, fabric configuration, troubleshooting, fabric management, diagnostics, and upgrades. 6. Experience performing firmware upgrades and hardware lifecycle management on storage devices. 7. Ability to conduct and analyze storage performance assessments, capacity planning, and security audits.
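LUN masking, called out in the must-have list above, is conceptually just a visibility mapping maintained on the array between host initiators and LUNs. A minimal Python sketch of the concept follows; the WWNs and LUN names are invented, and real arrays enforce this in firmware rather than application code:

```python
# A masking view maps each host initiator (by WWN) to the LUNs it may see.
# WWNs and LUN names below are invented for illustration.
masking_views = {
    "10:00:00:00:c9:aa:bb:01": {"LUN_0", "LUN_1"},  # application server
    "10:00:00:00:c9:aa:bb:02": {"LUN_2"},           # backup server
}

def visible_luns(initiator_wwn: str) -> set:
    """Return the LUNs presented to an initiator; unmapped hosts see none."""
    return masking_views.get(initiator_wwn, set())

print(visible_luns("10:00:00:00:c9:aa:bb:01"))
```

Zoning on the SAN switch fabric plays a complementary role: it controls which initiators and targets can communicate at all, while masking controls which LUNs a permitted initiator is shown.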

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Kochi, Kerala, India

On-site

Role - Data Engineer. Company - ART Technology. Key Responsibilities: Develop and maintain ETL pipelines for multiple source systems, for example SAP, Korber WMS, OpSuite ePOS, and internal BI systems. Design and implement SSIS-based data workflows including staging, data quality rules, and CDC logic. Collaborate with the DBA on schema design, indexing, and performance tuning for SQL Server 2022. Build reusable components and scripts for data loading, transformation, and validation. Support development of CDC (Change Data Capture) solutions for near real-time updates. Perform unit testing, documentation, and version control of data solutions using GitLab/Jenkins CI/CD. Ensure data security, masking, and encryption in accordance with project policies (TLS 1.3). Work closely with backend developers and analysts to align data models with reporting needs. Troubleshoot and resolve data-related issues during development and post-deployment. Required Skills & Experience: 4-6 years of experience in data engineering or ETL development. Strong hands-on expertise in SSIS (SQL Server Integration Services) and SQL Server 2019/2022 (T-SQL, stored procedures, indexing, CDC). ETL development for ERP/warehouse systems (SAP preferred). Experience working with source systems like SAP, WMS, POS, or retail systems is highly desirable. Proficiency in data quality frameworks, staging strategies, and workflow orchestration. Familiarity with CI/CD for data workflows (GitLab, Jenkins, etc.). Good understanding of data warehousing concepts and performance optimization. Strong communication and documentation skills. Skills: workflow orchestration, encryption, SSIS (SQL Server Integration Services), CDC, performance optimization, WMS, SQL, data security, stored procedures, communication, ETL development, data quality frameworks, POS, documentation, SAP, data warehousing concepts, SQL Server 2019/2022, T-SQL, SSIS, GitLab, staging strategies, CI/CD, Jenkins
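The CDC logic mentioned in the SSIS bullets above boils down to turning differences between keyed table states into insert/update/delete events (SQL Server's built-in CDC reads the transaction log instead of diffing snapshots, but yields the same kind of event stream). A minimal Python sketch of the snapshot-diff approach, with hypothetical rows keyed by an order id:

```python
def diff_snapshots(before: dict, after: dict) -> list:
    """Diff two keyed table snapshots into CDC-style events.
    Dict keys play the role of the primary key; values are the row payload."""
    events = []
    for key, row in after.items():
        if key not in before:
            events.append(("insert", key, row))
        elif before[key] != row:
            events.append(("update", key, row))
    for key, row in before.items():
        if key not in after:
            events.append(("delete", key, row))
    return events

# Hypothetical warehouse rows keyed by order id.
before = {101: ("Asha", "Pune"), 102: ("Ravi", "Mysore")}
after = {101: ("Asha", "Pune"), 102: ("Ravi", "Bangalore"), 103: ("Meera", "Kochi")}
for event in diff_snapshots(before, after):
    print(event)
```

Downstream, each event maps naturally onto a staged MERGE or per-operation load step in an ETL workflow.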

Posted 2 weeks ago

Apply

1.0 years

0 Lacs

Surat, Gujarat, India

On-site

Key Responsibility Areas (KRA): Associate with Senior Designers at the XP on mood board curation and preparing 3D renders and 3D and 2D detailed drawings. Ensure an error-free QC and masking package by making necessary corrections before sending the project into production. 1. Skill Set Required: Freshers up to 1 year of experience. Basic proficiency in SketchUp, Revit, and CAD. Strong willingness to learn and follow instructions. 2. Education: Diploma in Architecture or Civil, B.Tech Civil, or B.Arch.

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Andhra Pradesh, India

On-site

JD Key Responsibilities: Lead the end-to-end migration of legacy data warehouses (e.g., Teradata, Oracle, SQL Server, Netezza, Redshift) to Snowflake. Assess current data architecture and define migration strategy, roadmap, and timelines. Develop ELT/ETL pipelines using tools such as dbt, Apache Airflow, Matillion, Talend, Informatica, etc. Optimize Snowflake configurations, including clustering, caching, and resource management for performance and cost efficiency. Implement security best practices, including role-based access, masking, and data encryption. Collaborate with data engineering, analytics, and business teams to ensure accurate and efficient data transfer. Create and maintain technical documentation, including migration plans, test scripts, and rollback procedures. Support validation, testing, and go-live activities. Required Skills & Experience: 5+ years in data engineering or data platform roles, with at least 2+ years in Snowflake migration projects. Hands-on experience in migrating large datasets from legacy data warehouses to Snowflake. Proficient in SQL, Python, and Snowflake scripting (SnowSQL, stored procedures, UDFs). Experience with data migration tools and frameworks (e.g., AWS SCT, Azure Data Factory, Fivetran, etc.). Strong knowledge of cloud platforms (AWS, Azure, or GCP). Familiarity with DevOps practices, CI/CD for data pipelines, and version control (Git). Excellent problem-solving and communication skills. Preferred Qualifications: Snowflake certification(s): SnowPro Core or Advanced Architect. Experience with real-time data ingestion (e.g., Kafka, Kinesis, Pub/Sub). Background in data governance, data quality, and compliance (GDPR, HIPAA). Prior experience in Agile/Scrum delivery environments.
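The security bullet above pairs role-based access with masking; in Snowflake this is typically done with declarative masking policies attached to columns, which evaluate the querying role at read time. The Python sketch below only mimics the decision such a policy makes; the role names and the last-four-visible rule are illustrative, and this is not Snowflake syntax:

```python
def masked_value(value: str, role: str, authorized=("FRAUD_ANALYST",)) -> str:
    """Mimic a column masking policy: authorized roles see the clear value,
    every other role sees all but the last four characters masked.
    Role names and the last-four rule are invented for this sketch."""
    if role in authorized:
        return value
    return "*" * max(len(value) - 4, 0) + value[-4:]

print(masked_value("4111111111111111", "MARKETING"))      # masked
print(masked_value("4111111111111111", "FRAUD_ANALYST"))  # clear
```

The appeal of the policy-based approach is that the masking rule lives with the column, so every query path (BI tools, ad hoc SQL, pipelines) is governed consistently.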

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Requirements Description and Requirements 7+ years of experience in quality assurance, with at least 3+ years in a Test Data Management (TDM) lead or senior role. Proven experience in designing and implementing test data management strategies, data masking, and test data provisioning for large-scale software projects. Lead the development and implementation of comprehensive test data management strategies to support functional, regression, performance, security, and other types of testing. Establish governance processes and best practices for handling, managing, and securing test data across multiple projects and environments. Ensure that test data complies with legal, regulatory, and organizational security policies (e.g., GDPR, HIPAA). Design and oversee the creation of high-quality, realistic, and representative test data to meet the needs of different types of testing. Use data generation tools and techniques to produce test data that mirrors real-world data while maintaining privacy and security. Develop automated processes for generating and refreshing test data in line with project and release timelines. Implement and manage data masking, anonymization, and sanitization techniques to ensure sensitive information is protected while retaining data integrity for testing purposes. Develop and enforce data security practices related to the use and storage of test data. Work closely with QA, development, and DevOps teams to understand the specific test data requirements for different testing phases (e.g., unit, integration, performance, UAT). Collaborate with business and IT teams to ensure that required test data is available when needed and meets quality expectations. Support the creation of data models and mapping to align test data with application requirements. Implement strategies for efficient storage and retrieval of test data to ensure high performance and reduce resource consumption during testing. 
Continuously assess and optimize test data strategies to improve test execution time, resource allocation, and overall testing efficiency. Manage large-scale data sets and ensure their availability across multiple environments (development, testing, staging, production). Lead the evaluation, implementation, and continuous improvement of test data management tools and automation platforms (e.g., Informatica TDM, Delphix, IBM InfoSphere Optim). Leverage automation to streamline test data creation, management, and refresh cycles, ensuring quick access to the latest data for testing. Drive the adoption of self-service tools to enable teams to generate, refresh, and manage their own test data securely. Monitor and manage test data usage to ensure compliance with internal standards and external regulations. Provide regular reporting on test data quality, availability, and utilization to key stakeholders, highlighting any risks or issues. Track and resolve test data issues (e.g., missing data, incorrect data) and provide solutions to improve data availability and accuracy. Lead and mentor a team of test data management professionals, providing guidance, training, and support to enhance team capabilities. Establish clear goals, KPIs, and performance metrics for the team and ensure that projects are completed on time and to a high standard. Foster a culture of continuous improvement, encouraging the team to innovate and apply new test data management techniques. Stay up-to-date with emerging trends, technologies, and best practices in test data management and data privacy. Evaluate and recommend new tools, technologies, and methods to improve the test data management process, increase efficiency, and reduce manual effort. Experience with AI and automation tools for test data generation and data management. Additional Job Description Expertise in test data management tools and platforms (e.g., Delphix, Informatica TDM, IBM InfoSphere Optim, CA TDM). 
Strong knowledge of data security, privacy, and compliance standards (e.g., GDPR, HIPAA) as they relate to test data. Proficient in database management and query languages (e.g., SQL, PL/SQL) for data manipulation, extraction, and analysis. Experience with test automation frameworks and integration of TDM tools into CI/CD pipelines. Familiarity with cloud-based test data management solutions (e.g., AWS, Azure, Google Cloud). EEO Statement At TELUS Digital, we enable customer experience innovation through spirited teamwork, agile thinking, and a caring culture that puts customers first. TELUS Digital is the global arm of TELUS Corporation, one of the largest telecommunications service providers in Canada. We deliver contact center and business process outsourcing (BPO) solutions to some of the world's largest corporations in the consumer electronics, finance, telecommunications and utilities sectors. With global call center delivery capabilities, our multi-shore, multi-language programs offer safe, secure infrastructure, value-based pricing, skills-based resources and exceptional customer service - all backed by TELUS, our multi-billion dollar telecommunications parent. Equal Opportunity Employer At TELUS Digital, we are proud to be an equal opportunity employer and are committed to creating a diverse and inclusive workplace. All aspects of employment, including the decision to hire and promote, are based on applicants’ qualifications, merits, competence and performance without regard to any characteristic related to diversity.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office

Minimum 3 years of experience working in Test Data Management (TDM). Hands-on experience with tools like CA Fast Data Masker, Informatica, and IBM Optim. Exposure to data masking/obfuscation. Hands-on experience in SQL, along with multiple databases like Oracle, SQL Server, Greenplum, etc. Hands-on experience in data profiling, data masking, and reporting. Experience in training and mentoring juniors. Experience handling a small team of 3-4 associates. Experience working in an offshore-onshore model, with good communication skills. Hands-on experience in Java and Python will be a plus.

Posted 2 weeks ago

Apply

5.0 - 6.0 years

0 Lacs

Mumbai, Maharashtra, India

Remote

About Godrej Agrovet Godrej Agrovet Limited (GAVL) is a diversified, Research & Development focused agri-business Company dedicated to improving the productivity of Indian farmers by innovating products and services that sustainably increase crop and livestock yields. GAVL holds leading market positions in the different businesses it operates in - Animal Feed, Crop Protection, Oil Palm, Dairy, Poultry and Processed Foods. GAVL has a pan-India presence with sales of over a million tons annually of high-quality animal feed and cutting-edge nutrition products for cattle, poultry, aqua feed and specialty feed. Our teams have worked closely with Indian farmers to develop large Oil Palm Plantations, which are helping to bridge the demand and supply gap of edible oil in India. In the crop protection segment, the company meets the niche requirements of farmers through innovative agrochemical offerings. GAVL, through its subsidiary Astec Life Sciences Limited, is also a business-to-business (B2B) focused bulk manufacturer of fungicides & herbicides. In Dairy, Poultry, and Processed Foods, the company operates through its subsidiaries Creamline Dairy Products Limited and Godrej Tyson Foods Limited. Apart from this, GAVL also has a joint venture with the ACI group of Bangladesh for the animal feed business in Bangladesh. For more information on the Company, please log on to www.godrejagrovet.com. Designation Location: Mumbai Job Purpose: We are seeking a highly skilled and experienced IT & OT Infrastructure, Data, and Applications Security Manager to lead the security strategy and implementation for IT & OT (Operational Technology) environments. This role is responsible for ensuring that critical infrastructure, network systems, and applications are secure from cyber threats while ensuring operational continuity in both the IT and OT domains. 
The position requires a deep understanding of both IT and OT security frameworks, as well as an ability to collaborate with cross-functional teams to safeguard digital assets and operations. Roles & Responsibilities IT & OT Infrastructure Security: Develop, implement, and maintain security policies, procedures, and controls to protect IT & OT infrastructure components, including servers, networks, industrial control systems (ICS), SCADA, and cloud environments. Collaborate with IT teams to ensure secure integration between IT and OT systems, addressing the unique security requirements of each domain. Conduct regular risk assessments, vulnerability scans, and penetration tests to identify and mitigate threats in IT & OT infrastructures. Manage the security of industrial networks, SCADA systems, and IIoT (Industrial Internet of Things) devices to prevent cyber threats and ensure safe operations. Implement and maintain security for cloud services, on-premises data centers, and critical OT assets, ensuring compliance with industry standards. Data Security: Implement data encryption, tokenization, and masking techniques to protect sensitive and proprietary data across systems, databases, and storage devices. Oversee data classification processes and ensure data protection in compliance with legal and regulatory requirements (GDPR, CCPA, HIPAA, etc.). Ensure proper data backup, disaster recovery, and business continuity planning related to data security. Conduct data loss prevention (DLP) assessments and implement preventative controls. Manage access control policies for databases and ensure segregation of duties for sensitive information. Network Security: Develop and maintain robust network security architecture for IT & OT networks, ensuring protection against unauthorized access, data breaches, and cyber-attacks. Monitor and analyze network traffic and logs to detect potential threats, vulnerabilities, and anomalous activities across IT & OT networks. 
Implement network segmentation to isolate IT and OT environments while ensuring controlled data exchange between systems. Configure and manage firewalls, intrusion detection/prevention systems (IDS/IPS), and secure VPNs to protect networks from external and internal threats. Manage secure communication channels for IT/OT devices and ensure the proper functioning of secure remote access protocols for IT/OT systems. Applications Security: Lead the implementation of secure application development practices for OT applications. Work with development and OT engineering teams to incorporate secure coding practices into OT software systems. Conduct regular security assessments and code reviews for applications, ensuring that vulnerabilities are identified and mitigated. Oversee security testing of OT applications, including SCADA systems, human-machine interfaces (HMIs), and industrial control software, to ensure that security controls are in place. Implement security controls around application access, user authentication, and data integrity for OT applications. Incident Response & Threat Management: Lead and coordinate response efforts to security incidents involving OT systems, ensuring that containment, investigation, and remediation processes are followed efficiently. Develop and maintain incident response plans that address OT-specific risks, ensuring minimal disruption to critical operations. Conduct post-incident analysis to identify root causes, recommend improvements, and apply corrective actions to prevent future occurrences. Collaborate with internal and external teams (e.g., law enforcement, vendors) during security incidents that may impact OT systems. Security Governance and Compliance: Ensure compliance with relevant industry regulations, standards, and frameworks (e.g., NIST, ISO 27001, IEC 62443, NERC CIP) in OT environments. Implement and enforce security governance, risk management, and compliance strategies across OT assets. 
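The network-segmentation responsibility above can be sketched as a policy check. This is a hypothetical illustration, not the employer's actual architecture: zone names, address ranges, and the rule that IT traffic must pass through a DMZ broker before reaching OT are all assumptions for the example.

```python
from ipaddress import ip_address, ip_network

# Hypothetical zone map; ranges are illustrative only.
ZONES = {
    "it_corporate": ip_network("10.10.0.0/16"),
    "dmz_historian": ip_network("172.16.5.0/24"),
    "ot_scada": ip_network("192.168.50.0/24"),
}

# IT may reach the DMZ, and only the DMZ may reach OT - a common
# broker pattern for isolating OT while allowing controlled exchange.
ALLOWED = {("it_corporate", "dmz_historian"), ("dmz_historian", "ot_scada")}

def zone_of(ip: str):
    """Return the zone name containing this IP, or None if unzoned."""
    addr = ip_address(ip)
    return next((name for name, net in ZONES.items() if addr in net), None)

def is_allowed(src_ip: str, dst_ip: str) -> bool:
    """A flow is permitted only if both endpoints are zoned and the
    (source zone, destination zone) pair is on the allow list."""
    src, dst = zone_of(src_ip), zone_of(dst_ip)
    return src is not None and dst is not None and (src, dst) in ALLOWED
```

A corporate workstation reaching straight into the SCADA segment (`is_allowed("10.10.3.7", "192.168.50.12")`) is denied, while the same host reaching the DMZ historian is permitted; real firewalls and IDS/IPS rules encode the same zone-pair logic.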
Perform regular audits and assessments of OT security controls to ensure compliance with security policies and regulatory requirements. Maintain comprehensive security documentation, including risk assessments, incident reports, and security project plans. Security Awareness and Training: Develop and conduct security awareness training programs for OT staff, ensuring that they are educated on security best practices, emerging threats, and organizational policies. Provide ongoing education to the OT team about the importance of cybersecurity in the context of industrial operations and critical infrastructure. Stay current with emerging security trends, threats, and vulnerabilities specific to OT environments and incorporate new knowledge into security practices. Educational Qualification: Bachelor's degree in Computer Science, Information Security, Cybersecurity, Engineering, or a related field (Master's preferred). Experience Minimum of 5 to 6 years of experience in IT & OT security, data security, and application security. Extensive experience securing both IT and OT (industrial control systems, SCADA, ICS, IIoT) environments. Proven experience with network segmentation, firewalls, IDS/IPS, VPNs, and application security frameworks. Familiarity with securing operational technology, including an understanding of industrial protocols (Modbus, OPC, DNP3, etc.). Hands-on experience with OT vulnerability management, incident response, and threat intelligence processes. Skills Expertise in securing network and infrastructure devices, systems, and industrial control systems (ICS). Deep knowledge of network protocols and security mechanisms (e.g., IP, TCP/IP, VPNs, firewalls). Proficiency in securing cloud environments (AWS, Azure, Google Cloud) as well as on-premises systems. Experience with tools for vulnerability scanning, penetration testing, and risk assessments (e.g., Nessus, Qualys, Burp Suite). Certifications: CISSP, CISM, CISA, or similar certifications are preferred.
OT-specific certifications such as Certified SCADA Security Architect (CSSA) or IEC 62443 certification a plus. Network security certifications such as CCSP, AWS Certified Security Specialty, or CCNA Security are beneficial. Application security certifications (e.g., CEH, OWASP) are a bonus. An inclusive Godrej Before you go, there is something important we want to highlight. There is no place for discrimination at Godrej. Diversity is the philosophy of who we are as a company. And has been for over a century. It’s not just in our DNA and nice to do. Being more diverse - especially having our team members reflect the diversity of our businesses and communities - helps us innovate better and grow faster. We hope this resonates with you. We take pride in being an equal opportunities employer. We recognize merit and encourage diversity. We do not tolerate any form of discrimination on the basis of nationality, race, color, religion, caste, gender identity or expression, sexual orientation, disability, age, or marital status and ensure equal opportunities for all our team members. If this sounds like a role for you, apply now! We look forward to meeting you.

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description – Delphix TDM Professionals (Healthcare Domain) We are hiring for multiple positions in Test Data Management (TDM) with strong expertise in Delphix. The ideal candidates should have experience in data de-identification, masking, and synthetic data generation, preferably in healthcare environments. 🔹 General Requirements (All Roles) Minimum 5 years of experience in Test Data Management tools Mandatory experience with Delphix (Data Virtualization & Masking) Strong knowledge of Data De-identification & Masking Minimum 2 years of experience in Synthetic Data Generation Experience in aligning TDM with project roadmaps for faster test data delivery Nice to Have: Python, .NET knowledge, and exposure to CI/CD pipelines or cloud-hosted platforms 💼 Open Positions Skills: SQL, cloud, CI/CD pipelines, Delphix, performance tuning, Python, synthetic data generation, data de-identification, cloud-hosted platforms, data masking, test data, design, data virtualization, TDM, .NET, shell scripting, Oracle Delphix Tech Lead (1 Role) Lead end-to-end Delphix solution design & implementation Drive strategy, architecture, and team guidance Collaborate across enterprise environments Delphix Senior Engineer (4 Roles) Design, deploy, and optimize Delphix virtualization & masking solutions Mentor junior team members Support best practices and innovation Delphix Engineer (2 Roles) Implement and manage Delphix environments Support automation and integration with pipelines Ensure performance in test data delivery Delphix Support Engineer (2 Roles) Provide operational support and troubleshooting for Delphix platforms Ensure platform availability and resolve issues quickly
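The synthetic data generation requirement above can be sketched in a few lines. This does not use the Delphix product itself; it is a hedged, stand-alone illustration of the idea: producing schema-shaped but entirely fictitious records so test environments never hold real patient data. Field names and value pools are assumptions, not a real project schema.

```python
import random
import string

random.seed(7)  # fixed seed so synthetic batches are reproducible

# Illustrative value pools - not real people.
FIRST = ["Asha", "Ravi", "Meera", "Karan", "Divya"]
LAST = ["Iyer", "Sharma", "Nair", "Patel", "Reddy"]
BLOOD = ["A+", "A-", "B+", "B-", "O+", "O-", "AB+", "AB-"]

def synthetic_patient() -> dict:
    """Generate one fictitious healthcare-style record matching a
    hypothetical schema (MRN, name, age, blood group)."""
    mrn = "MRN" + "".join(random.choices(string.digits, k=8))
    return {
        "mrn": mrn,
        "name": f"{random.choice(FIRST)} {random.choice(LAST)}",
        "age": random.randint(0, 99),
        "blood_group": random.choice(BLOOD),
    }

# A batch large enough to exercise test queries and pipelines.
batch = [synthetic_patient() for _ in range(1000)]
```

In a TDM platform the same idea is driven by rule sets tied to the production schema, so generated data keeps realistic formats and distributions while containing no identifiable information.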

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description – Delphix TDM Professionals (Healthcare Domain) We are hiring for multiple positions in Test Data Management (TDM) with strong expertise in Delphix. The ideal candidates should have experience in data de-identification, masking, and synthetic data generation, preferably in healthcare environments. 🔹 General Requirements (All Roles) Minimum 5 years of experience in Test Data Management tools Mandatory experience with Delphix (Data Virtualization & Masking) Strong knowledge of Data De-identification & Masking Minimum 2 years of experience in Synthetic Data Generation Experience in aligning TDM with project roadmaps for faster test data delivery Nice to Have: Python, .NET knowledge, and exposure to CI/CD pipelines or cloud-hosted platforms 💼 Open Positions Skills: SQL, cloud, CI/CD pipelines, Delphix, performance tuning, Python, synthetic data generation, data de-identification, cloud-hosted platforms, data masking, test data, design, data virtualization, TDM, .NET, shell scripting, Oracle Delphix Tech Lead (1 Role) Lead end-to-end Delphix solution design & implementation Drive strategy, architecture, and team guidance Collaborate across enterprise environments Delphix Senior Engineer (4 Roles) Design, deploy, and optimize Delphix virtualization & masking solutions Mentor junior team members Support best practices and innovation Delphix Engineer (2 Roles) Implement and manage Delphix environments Support automation and integration with pipelines Ensure performance in test data delivery Delphix Support Engineer (2 Roles) Provide operational support and troubleshooting for Delphix platforms Ensure platform availability and resolve issues quickly

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies