8.0 - 13.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Summary
We are looking for a skilled Automation Test Manager to join our team in ensuring the quality, reliability, and security of our payments processing application. The role involves creating and maintaining automated test scripts using Selenium, Java, and SQL, plus Python scripting for data masking. The ideal candidate has a strong background in automation testing within payment systems or similar high-availability applications.
- Experienced with MT SWIFT messages such as MT103, MT202, and MT202 COV
- Experienced with MX messages such as PACS.008, PACS.009, PACS.004, PACS.002, and PAIN.001
- Experienced with real-time (faster payments) processing such as IMPS, G3, and IBFT
- End-to-end payment processing knowledge
- Ensure the quality and timeliness of delivery of testing assignments
- Perform functional and technical test execution activities according to the testing team's engagement level in the project
- Perform testing in Agile delivery
- Plan, analyse, and design: test strategy, test planning, and traceability matrix preparation

Key Responsibilities
- Perform testing in Agile delivery
- Functional and automation testing for the SCPay payments application
- Test automation: design, develop, and maintain automated test scripts using Selenium and Java to support regression, functional, and integration testing
- Write and execute SQL queries to validate data integrity and ensure data consistency across transactions
- Familiarity with Kibana and KQL is a plus
- Data masking and test data management: use Python scripts to mask sensitive data used in test cases; manage test data and set up testing environments to support end-to-end testing scenarios
- Quality assurance and test strategy: develop comprehensive test plans and test cases covering all layers of the application, including UI, API, and database
- Collaborate with development and product teams to understand requirements, create testing strategies, and identify automation opportunities
Defect Tracking & Reporting
- Log, track, and manage defects in tracking tools, ensuring clear documentation and communication of issues
- Generate and share test execution reports with stakeholders, highlighting critical issues and providing insights for improvement

Continuous Improvement
- Enhance existing automation frameworks and scripts to improve coverage, maintainability, and reliability
- Stay up to date on industry trends and best practices in test automation, implementing relevant improvements

Skills and Experience
- 8-13 years of experience
- Experience leading a team of more than 5 members
- Automation testing of REST APIs
- MT and MX message processing
- Agile methodology
- Payment processing testing is a must (ISO 20022, MT/MX payment formats)
- Automation tools: proficiency in Selenium with Java for test automation
- SQL: strong SQL skills for data validation and back-end testing
- Python: experience with Python scripting for data masking and test data management
- Testing frameworks: knowledge of frameworks such as TestNG or JUnit
- CI/CD: familiarity with tools such as Jenkins and Git for automated test execution
- Excellent problem-solving and analytical skills
- Strong communication skills to convey technical details effectively
- Ability to work in a collaborative Agile environment with cross-functional teams

Qualifications
Bachelor's degree in Computer Science, Software Engineering, or an equivalent field

About Standard Chartered
We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you.
You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion.

Together we:
- Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do
- Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well
- Are better together: we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term

What We Offer
In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing.
- Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations
- Time off including annual leave, parental/maternity leave (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holidays, which combined come to a minimum of 30 days
- Flexible working options based around home and office locations, with flexible working patterns
- Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and a range of self-help toolkits
- A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning
Being part of an inclusive and values driven organisation, one that embraces and celebrates our unique diversity, across our teams, business functions and geographies - everyone feels respected and can realise their full potential.
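The role above calls for Python scripting to mask sensitive data used in test cases. As a minimal sketch (the function names and masking rules are illustrative, not Standard Chartered's actual tooling), such a script might redact all but the tail of an account number and pseudonymize names deterministically, so that masked records stay join-consistent across test tables:

```python
import hashlib
import re

def mask_account_number(value: str, visible: int = 4) -> str:
    """Mask all but the last `visible` digits of an account number."""
    digits = re.sub(r"\D", "", value)
    return "*" * max(len(digits) - visible, 0) + digits[-visible:]

def pseudonymize(value: str, salt: str = "test-env") -> str:
    """Deterministic pseudonym: the same input always maps to the same
    token, so joins across masked tables still line up."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

record = {"beneficiary": "Jane Doe", "account": "GB29NWBK60161331926819"}
masked = {
    "beneficiary": pseudonymize(record["beneficiary"]),
    "account": mask_account_number(record["account"]),
}
```

Deterministic hashing (rather than random tokens) is what keeps end-to-end payment scenarios testable: the same customer masks to the same value in every table and message.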
Posted 1 month ago
3.0 years
0 Lacs
India
Remote
Outdoor Norway, situated in the heart of Voss, recognized as Norway's Adventure Capital, seeks a creative and detail-oriented Photo Editor with deep expertise in Adobe Lightroom and Photoshop to join our remote team. In this role, you'll be responsible for editing and enhancing high-quality images across various projects, ensuring they meet our visual standards and brand aesthetics. Join our dynamic and adventurous team as we craft unforgettable experiences for travelers worldwide. If you can grasp project requirements and demonstrate meticulous attention to detail, apply now!

Location: Remote
Start date: June 2025
Contract timespan: Temporary position with the possibility of becoming full time
Compensation: Based on experience
Link to apply for job: https://outdoornorway.keka.com/careers/applyjob/27988
Website: www.outdoornorway.com

Key Responsibilities
- Understand the company's vision and mission
- Edit and retouch high volumes of images using Adobe Lightroom and Photoshop
- Perform advanced color correction, exposure balancing, and detailed skin retouching
- Ensure all images maintain a consistent look, feel, and tone across the brand or client deliverables
- Collaborate remotely with photographers, designers, and project managers to understand project needs
- Organize and manage digital image files and editing workflows efficiently
- Deliver high-quality images on deadline with minimal supervision
- Apply creative effects while maintaining realism and brand integrity
- Edit raw images into compelling content for social media, websites, advertisements, or internal use
- Manage multiple editing projects simultaneously and meet deadlines
- Ensure consistency in visual style, tone, and branding
- Stay current with video trends, editing techniques, and platform requirements

Qualifications
Proven experience of 3+ years as a photo editor, with a portfolio showcasing past work and a range of editing styles.
- Strong visual sense and understanding of composition, lighting, and photographic detail
- Expert-level proficiency in Lightroom and Photoshop (especially retouching, color grading, and masking); we use Adobe Premiere in the company
- Basic motion graphics or animation skills (After Effects or similar) are a plus
- Strong sense of visual storytelling, pacing, and composition
- Ability to work independently and take ownership of projects
- Excellent time management and communication skills
- Access to a reliable computer setup and high-speed internet

What we offer:
- 100% remote work
- A role designed in collaboration with you to align with your expertise and experience
- Being part of an international team of driven and motivated colleagues
- Competitive salary
- Continuity and growth opportunities within Outdoor Norway
- A challenging learning environment that offers the autonomy to shape your own position
- A lively and inclusive team atmosphere with plenty of laughter and team events

We are enthusiastic about hearing from you and exploring how we can help each other grow. We are interviewing talent on an ongoing basis!
Posted 1 month ago
3.0 - 5.0 years
3 - 5 Lacs
Bengaluru
On-site
Date: Jun 5, 2025 Job Requisition Id: 61507 Location: Bangalore, KA, IN

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive changes in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future.

We are looking to hire Snowflake professionals in the following areas:

Job Description:
Experience required: 3-5 years.
- Snowflake architecture and administration: demonstrate a deep understanding of Snowflake architecture, including compute and storage management. Handle the administrative tasks involved in managing Snowflake accounts, optimizing performance, and ensuring fail-safe operations.
- Snowflake utilities and features: hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, and Time Travel is essential, as is proficiency in writing stored procedures, using Snowflake functions, and managing tasks.
- Data engineering expertise: exhibit expertise in engineering platform components such as data pipelines, data masking, data orchestration, data quality, data governance, and analytics within the Snowflake environment.
- Load operations and performance: review and implement Snowflake best practices for managing load operations and optimizing performance to ensure efficient data processing.
- Data security and governance: describe data governance in Snowflake, including the use of secure views and dynamic data masking for column-level data security. Design and develop secure access to objects using Role-Based Access Control (RBAC).
- Data sharing and replication: use data replication to share data across accounts securely and to manage failover scenarios.
- Large-scale data intelligence: demonstrate hands-on experience implementing large-scale data intelligence solutions around Snowflake data warehousing. Knowledge of Spark, PySpark, Snowpark, or SQL is highly desirable.
- Performance tuning: apply advanced performance tuning methodologies in Snowflake to optimize query performance and reduce data processing time.
- Collaboration and leadership: work collaboratively with cross-functional teams to understand data requirements and contribute to the data engineering strategy. Provide technical leadership and mentorship to junior team members.

Strategic Impact
Impacts the effectiveness of the immediate work team through the quality and timeliness of the work produced. Largely works within standardized procedures and practices to achieve objectives and meet deadlines, with some discretion in problem solving.

Scope of People Responsibility
Manages own workload effectively and efficiently. Expands technical contribution, encourages knowledge management, and promotes the cross-fertilization of ideas and information between teams. Helps Snowflake and other developers grow, and mentors them to ensure overall solutions are delivered on time. Provides training for members of the team and ensures team effectiveness. May provide informal guidance and support to colleagues with less experience.

Cooperation
Communicates difficult concepts and negotiates with others to adopt a different viewpoint. Fosters accountability throughout the team to uphold strong governance in the form of standards, methodology, and requirements. Demonstrates the ability to work as a team member on a project and can work effectively with a larger global team.
Candidate's Profile
Work Experience
- Overall 3-5 years of experience in IT
- Bachelor's or master's degree in Computer Science, Information Technology, or a related field
- Extensive experience working with Snowflake data warehousing technology, including hands-on experience with various Snowflake utilities and features
- Proficiency in SQL and scripting languages such as Spark, PySpark, Python, or Snowpark
- Strong knowledge of data engineering concepts, data integration, and ETL processes
- Familiarity with data governance principles, data security, and RBAC
- Excellent understanding of data replication and data sharing across accounts
- Proven experience in performance tuning and optimization of Snowflake queries
- Exceptional problem-solving skills and the ability to address complex data engineering challenges
- Excellent communication and leadership skills with a collaborative mindset
- Ability to manage multiple accounts across the organization and ensure smooth operations
- Knowledge of transaction and concurrency models, DDL operations, and DML considerations in Snowflake

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided by technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and ethical corporate culture
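The dynamic data masking duties this listing describes are configured in Snowflake with `CREATE MASKING POLICY` DDL, gated by RBAC. As a hedged sketch (the policy, type, and role names are hypothetical), a small Python helper can render such a policy so only a designated role sees unmasked values:

```python
def masking_policy_ddl(policy: str, column_type: str, allowed_role: str) -> str:
    """Render Snowflake CREATE MASKING POLICY DDL for column-level security.

    Unmasked values are visible only to `allowed_role`; every other role
    sees a redacted placeholder. All names here are illustrative.
    """
    return (
        f"CREATE OR REPLACE MASKING POLICY {policy} AS "
        f"(val {column_type}) RETURNS {column_type} -> "
        f"CASE WHEN CURRENT_ROLE() IN ('{allowed_role}') THEN val "
        f"ELSE '***MASKED***' END;"
    )

ddl = masking_policy_ddl("pii_mask", "STRING", "DATA_ADMIN")
```

The rendered policy would then be attached to a column with `ALTER TABLE ... MODIFY COLUMN ... SET MASKING POLICY pii_mask`, so visibility is governed entirely by the role hierarchy rather than by copies of the data.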
Posted 1 month ago
8.0 - 13.0 years
7 - 9 Lacs
Chennai
On-site
Job ID: 27675 Location: Chennai, IN Area of interest: Technology Job type: Regular Employee Work style: Office Working Opening date: 25 May 2025

Job Summary
We are looking for a skilled Automation Test Manager to join our team in ensuring the quality, reliability, and security of our payments processing application. The role involves creating and maintaining automated test scripts using Selenium, Java, and SQL, plus Python scripting for data masking. The ideal candidate has a strong background in automation testing within payment systems or similar high-availability applications.
- Experienced with MT SWIFT messages such as MT103, MT202, and MT202 COV
- Experienced with MX messages such as PACS.008, PACS.009, PACS.004, PACS.002, and PAIN.001
- Experienced with real-time (faster payments) processing such as IMPS, G3, and IBFT
- End-to-end payment processing knowledge
- Ensure the quality and timeliness of delivery of testing assignments
- Perform functional and technical test execution activities according to the testing team's engagement level in the project
- Perform testing in Agile delivery
- Plan, analyse, and design: test strategy, test planning, and traceability matrix preparation

Key Responsibilities
- Perform testing in Agile delivery
- Functional and automation testing for the SCPay payments application
- Test automation: design, develop, and maintain automated test scripts using Selenium and Java to support regression, functional, and integration testing
- Write and execute SQL queries to validate data integrity and ensure data consistency across transactions
- Familiarity with Kibana and KQL is a plus
- Data masking and test data management: use Python scripts to mask sensitive data used in test cases; manage test data and set up testing environments to support end-to-end testing scenarios
- Quality assurance and test strategy: develop comprehensive test plans and test cases covering all layers of the application, including UI, API, and database
- Collaborate with development and product teams to understand requirements, create testing strategies, and identify automation opportunities

Defect Tracking & Reporting
- Log, track, and manage defects in tracking tools, ensuring clear documentation and communication of issues
- Generate and share test execution reports with stakeholders, highlighting critical issues and providing insights for improvement

Continuous Improvement
- Enhance existing automation frameworks and scripts to improve coverage, maintainability, and reliability
- Stay up to date on industry trends and best practices in test automation, implementing relevant improvements

Skills and Experience
- 8-13 years of experience
- Experience leading a team of more than 5 members
- Automation testing of REST APIs
- MT and MX message processing
- Agile methodology
- Payment processing testing is a must (ISO 20022, MT/MX payment formats)
- Automation tools: proficiency in Selenium with Java for test automation
- SQL: strong SQL skills for data validation and back-end testing
- Python: experience with Python scripting for data masking and test data management
- Testing frameworks: knowledge of frameworks such as TestNG or JUnit
- CI/CD: familiarity with tools such as Jenkins and Git for automated test execution
- Excellent problem-solving and analytical skills
- Strong communication skills to convey technical details effectively
- Ability to work in a collaborative Agile environment with cross-functional teams

Qualifications
Bachelor's degree in Computer Science, Software Engineering, or an equivalent field

About Standard Chartered
We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before.
If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion.

Together we:
- Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do
- Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well
- Are better together: we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term

What we offer
In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing.
- Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations
- Time off including annual leave, parental/maternity leave (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holidays, which combined come to a minimum of 30 days
- Flexible working options based around home and office locations, with flexible working patterns
- Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and a range of self-help toolkits
- A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning
Being part of an inclusive and values driven organisation, one that embraces and celebrates our unique diversity, across our teams, business functions and geographies - everyone feels respected and can realise their full potential. www.sc.com/careers
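The responsibilities in this listing include writing SQL queries to validate data integrity and consistency across transactions. A common pattern is a reconciliation query that flags settled payments with no matching ledger entry; the sketch below uses an in-memory SQLite database with an illustrative two-table schema, since the real payment schema is not public:

```python
import sqlite3

# In-memory SQLite stands in for the real payments database; schema is illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE payments (id INTEGER PRIMARY KEY, amount REAL, status TEXT);
    CREATE TABLE ledger   (payment_id INTEGER, amount REAL);
    INSERT INTO payments VALUES (1, 100.0, 'SETTLED'), (2, 250.5, 'SETTLED'), (3, 75.0, 'PENDING');
    INSERT INTO ledger   VALUES (1, 100.0), (2, 250.5);
""")

# Reconciliation check: every settled payment must have a ledger entry
# with a matching amount; anything returned here is a consistency defect.
orphans = conn.execute("""
    SELECT p.id
    FROM payments p
    LEFT JOIN ledger l ON l.payment_id = p.id AND l.amount = p.amount
    WHERE p.status = 'SETTLED' AND l.payment_id IS NULL
""").fetchall()
```

In an automated suite, a query like this would run after each end-to-end scenario, with the test asserting that the orphan list is empty.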
Posted 1 month ago
0 years
1 Lacs
India
On-site
About Us:
At Web5G Technology, we help businesses grow through smart, performance-driven marketing. Our team specializes in digital advertising, lead generation, and campaign strategy, focused on delivering real results through platforms like Google, Meta, and YouTube. We're passionate about helping brands scale with clear goals, transparent processes, and data-backed decisions.

Role:
We are looking for a sharp and detail-oriented Editing & VFX Trainee who is eager to work on impactful video content. This role is ideal for someone who understands the fundamentals of post-production and wants to build advanced skills in visual effects and digital storytelling.

Eligibility:
- Minimum 3 months of experience in video editing and/or basic VFX
- Completion of a certified course in video editing, motion graphics, VFX, or a related multimedia field is mandatory
- Familiarity with tools like Adobe Premiere Pro, After Effects, DaVinci Resolve, or Fusion/Nuke
- Strong foundation in visual timing, layering, keyframing, masking, and compositing

Key Responsibilities:
- Assist in editing videos for digital ads, brand campaigns, and social media content
- Create and apply basic VFX elements such as motion tracking, object removal, screen replacements, and text animations
- Support senior editors and VFX artists in building post-production pipelines
- Maintain organized project files, assets, and exports for different platforms
- Optimize video output for quality, speed, and performance

Preferred Skills:
- Green-screen keying and rotoscoping basics
- Working knowledge of LUTs, transitions, and audio syncing
- Ability to follow brand guidelines and meet project deadlines

What You'll Gain:
- Training and hands-on experience in both editing and visual effects
- Real-time exposure to creative advertising projects
- Internship certificate and letter of recommendation
- High potential for transition into a full-time creative/VFX role

Job Types: Full-time, Permanent
Pay: From ₹10,000.00 per month
Benefits: Leave encashment, paid sick time, paid time off
Schedule: Day shift
Supplemental Pay: Overtime pay, performance bonus, yearly bonus
Work Location: In person
Posted 1 month ago
2.0 years
2 - 3 Lacs
Gurugram
Work from Office
We are expanding our Gurgaon-based team and are seeking Junior Retouchers who are eager to grow into high-end beauty, luxury, and still-life work for European and US markets. Required candidate profile: 2 years of experience in high-end retouching; expert knowledge of Adobe Photoshop CS6-CC 2024; an eye for colour, the ability to work fast, an organised mindset, and a portfolio.
Posted 1 month ago
0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Job Description:
We are looking for a creative and talented Graphic Designer who can create amazing user experiences and visual assets. The ideal candidate should have an eye for clean and artful design and should be able to translate high-level requirements into beautiful, intuitive, and functional designs.

Key Responsibilities:
- Receive and manage customer-submitted exterior or interior images for processing
- Remove unwanted objects or visual obstacles from the images
- Clean and enhance visual elements to ensure image clarity and quality
- Apply appropriate digital layers and accurately fill in paint colors as per customer preferences or the brand palette
- Ensure all design outputs align with the customer's brief and expectations
- Create realistic, high-resolution previews that simulate the post-painting outcome of buildings or rooms
- Collaborate with the customer support or sales team to clarify requirements when needed
- Stay updated on the latest digital design tools, color theory, and architectural rendering techniques

Desired Candidate Profile:
- Proven experience as a graphic designer or photo editor
- Proficiency in Adobe Photoshop (required); knowledge of Illustrator, InDesign, or other design tools is a plus
- Strong skills in image manipulation, digital painting, masking, layering, and color correction
- Ability to interpret customer preferences and creatively bring them to life
- Keen attention to detail and a commitment to producing realistic, professional visuals
- Strong time management and the ability to work independently or with minimal supervision
- A basic understanding of architecture or interior design is a plus but not mandatory

Key Skills: Adobe Photoshop, Adobe Premiere, Illustrator, InDesign, CorelDRAW
Work Experience: 0 to 6 years
Posted 1 month ago
1.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Type: Full Time Experience: 1 to 2 Years Type: Virtual Hiring Last Date: 30-June-2025 Posted on: 18-June-2025 Education: BE/B.Tech, MCA, ME/M.Tech

ADVERTISEMENT No. 02
Data Scientist / AI Engineer - 2 Posts
Age: 25 to 35 years
Qualification (Mandatory): Full-time B.E./B.Tech with first class (minimum of 60% marks) or equivalent, or M.E./M.Tech/MCA in Computer Science/IT/Data Science/Machine Learning and AI.
Professional/Preferred Qualification: Certification in Data Science/AI/ML/Natural Language Processing/Web Crawling and Neural Networks.
Experience (Essential): Minimum 3 years of experience (post basic educational qualification) in a related field, out of which:
- 2+ years of experience with programming languages frequently used in data science (R/Python)
- 2+ years of experience in model development, model validation, or a related field
- 2+ years of experience in data analytics
- 2+ years of experience with relational databases or any NoSQL database, including graph databases
Also required: experience in cloud-based application/service development; experience in natural language processing, web crawling, and neural networks; experience in projects with machine learning/artificial intelligence technologies; excellent communication skills and the ability to work as part of a multicultural product development team; end-to-end experience from data extraction to modelling and validation; experience working in a project environment as a developer. Preference will be given to candidates with experience in the financial sector/banks/NBFCs/insurance/investment firms.
Mandatory Skill Set: Technical expertise in data models/database design and development, data mining, and segmentation techniques. Expertise in machine learning technologies. Expertise in testing and validation of the quality and accuracy of AI models. Expertise in developing models using structured, semi-structured, and unstructured data. Expertise in analytical databases like Vertica DB or similar platforms.
Data modelling and data intelligence/data cataloguing skills with tools like Alation. SQL (DDL/DML/DQL).
Desirable Qualities:
1. Good understanding of data models and types of dimensional modelling.
2. Experience in conversational AI and dialogue systems.
3. Strong understanding of explainable and responsible/ethical AI frameworks.
4. Understanding of data protection techniques like encryption, data masking, and tokenization to safeguard sensitive data in transit and at rest.
5. Experience in designing secure solution architectures for cloud platforms (private/public/hybrid).
6. Experience with tools like NiFi, HBase, Spark, Pig, Storm, Flume, etc.
7. Experience with BI tools.
8. Expertise in MS Excel data analytics.
9. Expertise in the usage and deployment of LLMs.
Key Responsibilities:
1. Be self-motivated and proactive, and demonstrate an exceptional drive towards service delivery.
2. Identify valuable data sources and automate collection/collation processes.
3. Undertake preprocessing of structured and unstructured data.
4. Analyze information to discover trends and patterns.
5. Use AI/ML techniques to improve the quality of data or product offerings.
6. Find patterns and trends in datasets to uncover insights.
7. Create algorithms and data models to forecast outcomes.
8. Combine models through ensemble modelling.

Data Scientist-cum-BI Developer - 1 Post
Age: 23 to 30 years
Qualification (Mandatory): Full-time B.E./B.Tech with first class (minimum of 60% marks) or equivalent, or M.E./M.Tech/MCA in Computer Science/IT/Data Science/Machine Learning and AI.
Professional/Preferred Qualification: Certification/assignments/projects in Data Science/AI/ML/Natural Language Processing/Web Crawling and Neural Networks.
Experience (Essential):
1. Minimum 1 year of working experience (post basic educational qualification) on assignments/projects/jobs related to ML/AI.
2. Experience in projects with machine learning/artificial intelligence technologies.
3. Excellent communication skills and the ability to work as part of a multicultural product development team.
4. End-to-end experience from data extraction to modelling and its validation.
5. Experience working in a project environment as a developer.
6. Preference will be given to candidates with experience in the financial sector/banks/NBFCs/insurance/investment firms.
Mandatory Skill Set: Technical expertise in data models/database design and development, data mining, and segmentation techniques. Expertise in machine learning technologies. Expertise in testing and validation of the quality and accuracy of AI models. Expertise in developing models using structured, semi-structured, and unstructured data. Expertise in analytical databases like Vertica DB or similar platforms. Data modelling and data intelligence/data cataloguing skills with tools like Alation. SQL (DDL/DML/DQL).
Desired Skill Set:
1. Good understanding of data models and types of dimensional modelling.
2. Experience in conversational AI and dialogue systems.
3. Strong understanding of explainable and responsible/ethical AI frameworks.
4. Understanding of data protection techniques like encryption, data masking, and tokenization to safeguard sensitive data in transit and at rest.
5. Experience in designing secure solution architectures for cloud platforms (private/public/hybrid).
6. Experience with tools like NiFi, HBase, Spark, Pig, Storm, Flume, etc.
7. Experience with BI tools.
8. Expertise in MS Excel data analytics.
9. Expertise in the usage and deployment of LLMs.
Key Responsibilities:
1. Be self-motivated and proactive, and demonstrate an exceptional drive towards service delivery.
2. Identify valuable data sources and automate collection/collation processes.
3. Undertake preprocessing of structured and unstructured data.
4. Analyze information to discover trends and patterns.
5. Use AI/ML techniques to improve the quality of data or product offerings.
6. Find patterns and trends in datasets to uncover insights.
7. Create algorithms and data models to forecast outcomes.
8. Combine models through ensemble modelling.
Candidates can apply only online, from 16 June 2025 to 30 June 2025.
Note: This is an aggregated job posting, shared to bring relevant opportunities to job seekers. Hireclap is not responsible for or authorized in this recruitment process. Click Here For Job Details & Apply Online
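Among the desired skills in this advertisement is an understanding of data protection techniques such as masking and tokenization. A minimal sketch of reversible tokenization (the vault class and token format are illustrative; a production system would persist the vault in a secured, encrypted store):

```python
import secrets

class TokenVault:
    """Reversible tokenization: swap a sensitive value for a random token.

    Unlike hashing, the mapping can be reversed, but only by whoever
    holds the vault, so downstream systems never see the raw value.
    """
    def __init__(self):
        self._forward = {}  # sensitive value -> token
        self._reverse = {}  # token -> sensitive value

    def tokenize(self, value: str) -> str:
        if value not in self._forward:
            token = "tok_" + secrets.token_hex(8)
            self._forward[value] = token
            self._reverse[token] = value
        return self._forward[value]

    def detokenize(self, token: str) -> str:
        return self._reverse[token]

vault = TokenVault()
t = vault.tokenize("4111-1111-1111-1111")
```

Because tokens are random rather than derived from the input, they leak nothing about the original value; repeated calls for the same value return the same token, keeping datasets consistent.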
Posted 1 month ago
0.0 - 4.0 years
1 - 3 Lacs
Chennai
Work from Office
Proficient knowledge of and extensive experience with Adobe Photoshop for image editing. Good knowledge of clipping paths, channel masking, color correction, image cleanup, product retouching, hair masking, and all other necessary retouching requirements.
Posted 1 month ago
1.0 - 2.0 years
0 Lacs
Jaipur, Rajasthan, India
On-site
Location: Jaipur, RJ, IN Areas of Work: Sales & Marketing Job Id: 12778 Executive N - SERVICES JAIPUR Objective Lead the team of Customer Associates & Sales Associates in the region / allocated territory to ensure their performance in terms of delivering value, adherence to processes and guidelines at the sites, and driving the usage of key / focus products at the sites; interact with and manage dealers as and when required. Report data as required to the SSE, UH & central function. Main Responsibilities Monitoring daily updates of all activities on the Paint Assist app. Daily monitoring of on-time visits and follow-ups to all new sites by Customer Associates (CA) or Sales Associates (SA). Daily monitoring of business collections by CAs and ensuring delivery of month-on-month business objectives. Prompt updating of records of new joiners/exits at the CA/SA level to the SSE. In-store & on-site training for new CAs on processes, business app usage, product pitching & site monitoring. Random site visits to ensure adherence to systems & processes such as usage of mechanized tools, covering and masking, correct application process, and on-time site handover after proper cleaning. Approving business entries into the application after appropriate checks & validation. Other Responsibilities Undertake regular trainings for faster adoption of updated features of the business app. Coordinate with CAs and contractors and ensure their attendance in all contractor training programs. Scope of Work a) People Management Scope (Range of no. of Direct/ Indirect Reports): Performance of Customer Associates/Sales Associates; co-ordination with trainer, TA, TSE & SSE for contractor training needs.
Co-ordination with DA, CC & SD for focused product requirements and leads c) Geography Coverage (Country-wide/ State-wide/ Area-wide) d) Corporate Coverage (Company-wide/ Business Unit or Function-wide/ Sub-function-wide/ Other): NA Key Interactions Internal: Customer Associates, Sales Associates, Colour Consultants, Designer Associates, Senior Sales Executive, Unit Head, Technical Associate, Technical Sales Executive. External: Customers, store owners, contractors, other influencers. Role Requirements Qualifications Essential: Graduate degree in any stream (BA/B.Sc./B.Com/BBA/BBM/BMS); minimum of 50% marks throughout education without any backlogs; graduation must be through a full-time course. Applicants with an engineering background (B.Tech/B.E./Diploma/B.Pharma) will not be considered. Desired: Candidates with an MBA/PGDM in Sales and Marketing. Desired: 1-2 years of experience in a sales function in any organization. Functional Competencies Fluency in English, Hindi & the local language. Excellent communication and people skills. Should have a working knowledge of MS Excel, MS Word, and MS PowerPoint. Behavioral Competencies Willingness to work in a retail environment and engage with clients across age and income groups for 8.5 hours a day, 6 days a week. Extensive travelling across the region. Be diligent and ensure timely attendance at / completion of all programs and modules designed for the training and development of Customer Associates. Additional Requirements Should have a two-wheeler with a valid driving licence. Should have an Android phone with the latest operating system. Age between 26 and 30 years.
Posted 1 month ago
6.0 years
20 - 22 Lacs
Udaipur
Remote
Senior Software Engineer - Data Governance Kadel Labs is a leading IT services company delivering top-quality technology solutions since 2017, focused on enhancing business operations and productivity through tailored, scalable, and future-ready solutions. With deep domain expertise and a commitment to innovation, we help businesses stay ahead of technological trends. As a CMMI Level 3 and ISO 27001:2022 certified company, we ensure best-in-class process maturity and information security, enabling organizations to achieve their digital transformation goals with confidence and efficiency. Role: Senior Software Engineer - Data Governance Experience: 6-8 Yrs Location: Udaipur, Jaipur, Bangalore Domain: Telecom Job Description: We are seeking an experienced Telecom Data Governance lead to join our team. In this role, you will be responsible for defining and implementing the data governance strategy. The role involves establishing metadata standards, defining attribute ownership models, ensuring regulatory compliance, and improving data quality and trust across the enterprise.
Key Responsibilities: Define and implement an enterprise-wide data governance framework Own the metadata catalog and ensure consistency across business and technical assets Develop and manage KPI registries, data dictionaries, and lineage documentation Collaborate with data stewards and domain owners to establish attribute ownership Lead efforts around data standardization, quality rules, and classification of sensitive data Ensure privacy and compliance (e.g., GDPR, PII, PHI) by enforcing tagging, masking, and access rules Define access control rules (purpose-based views, user roles, sensitivity levels) Oversee governance for data products and federated data domains Support internal audits and external regulatory reviews Coordinate with platform, analytics, security, and compliance teams Required Skills: 6+ years of experience in data governance roles, with at least 3-4 years in the telecommunications industry Experience integrating governance with modern data stacks (e.g., Databricks, Snowflake) Strong experience with data governance tools (e.g., Alation, Unity Catalog, Azure Purview) Proven understanding of metadata management, data lineage, and data quality frameworks Experience in implementing federated governance models and data stewardship programs Knowledge of compliance requirements (GDPR, PII, TMForum, etc.) Familiarity with data mesh principles and data contract approaches Excellent communication and stakeholder management skills Background in telecom, networking, or other data-rich industries Certification in data governance or management frameworks Educational Qualifications: · Bachelor's degree in Computer Science, Information Technology, or a related field.
Visit us: https://kadellabs.com/ https://in.linkedin.com/company/kadel-labs https://www.glassdoor.co.in/Overview/Working-at-Kadel-Labs-EI_IE4991279.11,21.htm Job Types: Full-time, Permanent Pay: ₹2,087,062.21 - ₹2,209,304.16 per year Benefits: Flexible schedule Health insurance Leave encashment Paid time off Provident Fund Work from home Schedule: Day shift Monday to Friday Supplemental Pay: Overtime pay Performance bonus Quarterly bonus Yearly bonus Ability to commute/relocate: Udaipur City, Rajasthan: Reliably commute or planning to relocate before starting work (Required) Application Question(s): How many years of experience in Telecom Data Engineering? Experience: Data Engineer: 9 years (Required) Data governance: 6 years (Required) Location: Udaipur City, Rajasthan (Required) Work Location: In person
Posted 1 month ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Hello, Greetings from ZettaMine!! 📢 Job Opening: Storage Administrator Location: Hyderabad / Navi Mumbai Experience: 2–4 Years Job Type: Full-Time Joining: Immediate joiners preferred Good communication skills are mandatory. Job Description: Roles & Responsibilities: Manage enterprise-level SAN/NAS environments (e.g., NetApp, EMC, Hitachi, Dell EMC) Perform LUN provisioning, zoning, masking, and volume management Monitor and troubleshoot storage-related issues and performance bottlenecks Administer backup & recovery solutions (e.g., Veritas NetBackup, Commvault) Implement replication, snapshot, and Disaster Recovery (DR) strategies Support SAN switch infrastructure (Brocade/Cisco) Conduct capacity planning, utilization tracking, and forecasting Develop and maintain technical documentation, SOPs, and runbooks Collaborate with application and infrastructure teams to fulfill storage requirements. Interested candidates can share their updated CVs at: 📧 md.afreen@zettamine.com Thanks & Regards, Afreen ZettaMine
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Gurugram, Haryana, India
On-site
POSITION - Software Engineer – Data Engineering LOCATION - Bangalore/Mumbai/Kolkata/Gurugram/Hyderabad/Pune/Chennai EXPERIENCE - 5-9 Years ABOUT HASHEDIN We are software engineers who solve business problems with a Product Mindset for leading global organizations. By combining engineering talent with business insight, we build software and products that can create new enterprise value. The secret to our success is a fast-paced learning environment, an extreme ownership spirit, and a fun culture. JOB TITLE: Software Engineer – Data Engineering OVERVIEW OF THE ROLE: As a Data Engineer or Senior Data Engineer, you will be hands-on in architecting, building, and optimizing robust, efficient, and secure data pipelines and platforms that power business-critical analytics and applications. You will play a central role in the implementation and automation of scalable batch and streaming data workflows using modern big data and cloud technologies. Working within cross-functional teams, you will deliver well-engineered, high-quality code and data models, and drive best practices for data reliability, lineage, quality, and security. Mandatory Skills: • Hands-on software coding or scripting for a minimum of 4 years • Experience in product management for at least 4 years • Stakeholder management experience for at least 4 years • Experience with at least one of the GCP, AWS, or Azure cloud platforms Key Responsibilities: • Design, build, and optimize scalable data pipelines and ETL/ELT workflows using Spark (Scala/Python), SQL, and orchestration tools (e.g., Apache Airflow, Prefect, Luigi). • Implement efficient solutions for high-volume, batch, real-time streaming, and event-driven data processing, leveraging best-in-class patterns and frameworks. • Build and maintain data warehouse and lakehouse architectures (e.g., Snowflake, Databricks, Delta Lake, BigQuery, Redshift) to support analytics, data science, and BI workloads.
• Develop, automate, and monitor Airflow DAGs/jobs on cloud or Kubernetes, following robust deployment and operational practices (CI/CD, containerization, infra-as-code). • Write performant, production-grade SQL for complex data aggregation, transformation, and analytics tasks. • Ensure data quality, consistency, and governance across the stack, implementing processes for validation, cleansing, anomaly detection, and reconciliation General Skills & Experience: • Proficiency with Spark (Python or Scala), SQL, and data pipeline orchestration (Airflow, Prefect, Luigi, or similar). • Experience with cloud data ecosystems (AWS, GCP, Azure) and cloud-native services for data processing (Glue, Dataflow, Dataproc, EMR, HDInsight, Synapse, etc.) Hands-on development skills in at least one programming language (Python, Scala, or Java preferred); solid knowledge of software engineering best practices (version control, testing, modularity). • Deep understanding of batch and streaming architectures (Kafka, Kinesis, Pub/Sub, Flink, Structured Streaming, Spark Streaming). • Expertise in data warehouse/lakehouse solutions (Snowflake, Databricks, Delta Lake, BigQuery, Redshift, Synapse) and storage formats (Parquet, ORC, Delta, Iceberg, Avro). • Strong SQL development skills for ETL, analytics, and performance optimization. • Familiarity with Kubernetes (K8s), containerization (Docker), and deploying data pipelines in distributed/cloud-native environments. • Experience with data quality frameworks (Great Expectations, Deequ, or custom validation), monitoring/observability tools, and automated testing. • Working knowledge of data modeling (star/snowflake, normalized, denormalized) and metadata/catalog management. • Understanding of data security, privacy, and regulatory compliance (access management, PII masking, auditing, GDPR/CCPA/HIPAA). • Familiarity with BI or visualization tools (PowerBI, Tableau, Looker, etc.) is an advantage but not core. 
• Previous experience with data migrations, modernization, or refactoring legacy ETL processes to modern cloud architectures is a strong plus. • Bonus: Exposure to open-source data tools (dbt, Delta Lake, Apache Iceberg, Amundsen, Great Expectations, etc.) and knowledge of DevOps/MLOps processes EDUCATIONAL QUALIFICATIONS: • Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field (or equivalent experience). • Certifications in cloud platforms (AWS, GCP, Azure) and/or data engineering (AWS Data Analytics, GCP Data Engineer, Databricks). • Experience working in an Agile environment with exposure to CI/CD, Git, Jira, Confluence, and code review processes. • Prior work in highly regulated or large-scale enterprise data environments (finance, healthcare, or similar) is a plus
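The skills list above touches on PII masking and data security. As a rough, self-contained sketch (illustrative only, not part of the posting; the salt, token format, and field names are invented for the demo), deterministic hashing is one common masking approach: the same input always maps to the same token, so joins between masked tables still line up.

```python
import hashlib

def pseudonymize(value: str, salt: str = "demo-salt") -> str:
    """Deterministically replace a PII value with a stable token.

    Equal inputs yield equal tokens, so referential integrity across
    masked tables is preserved. The salt is a placeholder; a real
    pipeline would load it from a secrets manager.
    """
    digest = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()
    return "user_" + digest[:12]

records = [
    {"email": "alice@example.com", "amount": 120},
    {"email": "bob@example.com", "amount": 75},
]
# Mask only the PII column; non-sensitive columns pass through unchanged.
masked = [{**r, "email": pseudonymize(r["email"])} for r in records]
```

Because the mapping is deterministic, an aggregate computed on masked data (e.g. totals per user token) matches the aggregate on the original data.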
Posted 1 month ago
1.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Key Responsibility Areas (KRA) Work with Senior Designers at the XP on mood board curation and preparing 3D renders and 3D and 2D detailed drawings. Ensure an error-free QC and masking package by making necessary corrections before sending the project into production. 1. Skill Set Required: Freshers up to 1 year of experience. Basic proficiency in SketchUp, Revit, and CAD. Strong willingness to learn and follow instructions. 2. Education: Diploma in Architecture or Civil, B.Tech Civil, or B.Arch
Posted 1 month ago
3.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Artic Consulting is a dynamic IT and consulting services firm, delivering digital transformation through innovative solutions in data, cloud, and business analytics. We are seeking a skilled Data Engineer with a strong focus on the Microsoft Fabric ecosystem, who can design and implement scalable data solutions for our clients. Key Responsibilities: Design, develop, and maintain Power BI reports, dashboards, DAX expressions, KPIs, and scorecards using both Import and DirectQuery modes Build and orchestrate scalable ETL/ELT workflows using Fabric Data Pipelines, Dataflows Gen2, and Azure Data Factory Write and tune complex T-SQL and KQL queries, stored procedures, and views for performance in Synapse SQL and SQL Server environments Implement data models based on star/snowflake schemas and support modern data warehousing and Lakehouse architectures using Microsoft Fabric Integrate structured and unstructured data sources (e.g., SQL, Excel, APIs, Blob Storage), and transform them efficiently using Fabric Notebooks (Spark/PySpark) or Dataflows Diagnose and resolve pipeline failures, logic errors, and performance bottlenecks across the data engineering lifecycle Automate repetitive data processes using Azure Functions, Logic Apps, PowerShell, or Python scripting within the Fabric ecosystem Collaborate with stakeholders to gather business requirements and translate them into scalable data solutions Ensure data governance, privacy, and compliance standards (e.g., GDPR, HIPAA, ISO) are adhered to, including sensitive data handling policies Apply best practices for item-level security, workspace-based access models, and data lineage using Microsoft OneLake and Fabric tools Required Qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field Minimum 3 years of experience in Power BI development and data engineering Strong expertise in T-SQL and KQL, with demonstrated query optimization skills Proficiency in Microsoft Fabric 
tools: Data Pipelines, Dataflows Gen2, OneLake, Notebooks Hands-on experience with Spark/PySpark and data integration from varied sources Microsoft Power BI certification (PL-300) or equivalent Microsoft certifications Preferred Skills: Experience debugging and optimizing advanced SQL queries, database objects, and legacy components Ability to implement database security models and data protection policies Expertise in implementing row-level security, dynamic data masking, and role-based access control within Microsoft Fabric and Power BI environments Familiarity with Microsoft OneLake architecture, including data cataloging, lineage tracking, item-level security, and workspace-based access management Proven ability to operate effectively in dynamic, client-facing environments, delivering scalable and compliant data solutions with a focus on performance and quality Why Join Artic Consulting? Work with cutting-edge Microsoft technologies in a dynamic and collaborative environment Flexible work culture with opportunities for growth and certifications Be part of a mission to deliver impactful digital transformation for clients globally
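The row-level security and dynamic data masking skills listed above can be illustrated with a generic Python sketch (not Fabric or Power BI code; the partial-masking rule, roles, and sample rows are assumptions for the demo). Non-admin roles see only their own region's rows, with card numbers partially hidden in the spirit of a partial() masking function.

```python
def mask_partial(value: str, visible_suffix: int = 4) -> str:
    """Hide all of a string except the last few characters."""
    if len(value) <= visible_suffix:
        return "X" * len(value)  # too short to reveal anything safely
    return "X" * (len(value) - visible_suffix) + value[-visible_suffix:]

def rows_for(rows, role):
    """Row-level security plus masking: filter by region, then mask
    the sensitive column for every role except 'admin'."""
    visible = [r for r in rows if role == "admin" or r["region"] == role]
    if role != "admin":
        visible = [dict(r, card=mask_partial(r["card"])) for r in visible]
    return visible

orders = [
    {"region": "south", "card": "4111111111111111", "total": 250},
    {"region": "north", "card": "5500005555555559", "total": 90},
]
```

In a real warehouse the filter and mask live in the engine (security predicates and masked columns), so every client query is policed consistently; this sketch only shows the shape of the policy.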
Posted 1 month ago
6.0 - 11.0 years
16 - 31 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Greetings from Cognizant!!! We are hiring for a permanent position with Cognizant. Experience: 3 - 12 Yrs. Mandatory to have experience in TDM, GenRocket, Delphix, Informatica. Work Location: Pan India Interview Mode: Virtual Interview Date: Weekday & Weekend JD: Job Summary Strong Test Data Manager with hands-on test data experience, preferably in TDM consultancy and implementation of a TDM tool. Thorough understanding of Test Data Management with hands-on experience in data generation, masking, and profiling. Experience in enterprise-level TDM solutioning and implementation. Experience in client-facing roles and good stakeholder management. Experience with Google Cloud or cloud test data handling preferred. Responsibilities Strong Test Data Manager with hands-on test data experience, preferably in TDM consultancy and implementation of a TDM tool. Thorough understanding of Test Data Management with hands-on experience in data generation, masking, and profiling. Experience in enterprise-level TDM solutioning and implementation. Experience in client-facing roles and good stakeholder management.
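The test data generation mentioned above can be sketched with a minimal synthetic-data example (hypothetical, independent of TDM tools such as GenRocket or Delphix; the field names, ranges, and seed are invented for the demo). Generating records from scratch means no production data is ever exposed to test environments.

```python
import random
import string

def make_test_customer(rng: random.Random) -> dict:
    """One synthetic customer record -- no production data involved."""
    name = "".join(rng.choices(string.ascii_lowercase, k=8))
    return {
        "name": name.title(),
        # .invalid is a reserved TLD, so these addresses can never resolve
        "email": name + "@test.invalid",
        "balance": round(rng.uniform(0, 10_000), 2),
    }

rng = random.Random(7)  # fixed seed makes the data set reproducible
customers = [make_test_customer(rng) for _ in range(100)]
```

Seeding the generator is the important detail: a regression run can recreate the exact same data set, so test failures are reproducible.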
Posted 1 month ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Overview This role is responsible for coordinating resources, solving technical requirements, and evaluating the risks and scope of SAP improvements, upgrades, and implementations for Global PGT and individual PGTs, and for deploying technological solutions according to PepsiCo's SAP/IT best practices and compliance. This role is also responsible for assessing functional requirements, guiding the group as per Application Security guidelines and Compliance standard methodologies, and ensuring transparent security design. Provides subject matter expertise (SME) in solutioning and implementing SAP access management requirements. This role serves as the leader for cyber security governance, engineering, and reporting for PGT, and is the liaison with Information Security. Additionally, this role’s objective is to successfully deliver security upfront across all PGT deployments while ensuring consistency in approach and providing visibility through communication and alignment with key stakeholders. Responsibilities Point person for PGT SAP implementations with the leaders, functional team, and business unit. Provide project progress information to functional and business Directors and Managers. Minimize critical SoD risk during implementations and provide guidance during each phase to achieve SAP security governance and controls. Work closely with the controls teams (IT, configurable, and internal control) and continue supporting best practices. Communicate with the governance team in order to implement local and global best practices. Coordinate SAP Security implementations during the lifecycle of projects. Consolidate and support PGT implementations regarding SAP Security best practices. Introduce delivery automation processes. Actively participate in Continuous Process Improvement initiatives by striving to look for possible efficiencies, scalability, and/or cost reduction opportunities.
Work with limited supervision and exhibit a solid sense of urgency. Facilitate internal and external audits as requested. Always ensure data protection by leveraging data masking and data scrambling techniques. Responsible for leadership reporting on various Information Security metrics across the Tech Strategy and Enterprise Solutions teams. Collaborate with the Information Security organization on remediation of security vulnerabilities so that the security health index is maintained as intended. Manage and provide status updates on security assessments, vulnerability remediation, and exceptions. Provide security engineering expertise for the PGT program. Provide regular updates to Information Security Leadership on PGT status, risks, and issues. Qualifications Bachelor’s degree in Computer Science (or equivalent) is required
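Data scrambling of the kind referenced above can be sketched generically (an illustrative Python example, not PepsiCo's or SAP's actual tooling; the column name, sample values, and seed are invented for the demo). Shuffling a sensitive column across rows keeps every value real, so the column's distribution is preserved, while the link between a value and its original row is broken.

```python
import random

def scramble_column(rows, key, seed=42):
    """Shuffle one column's values across rows: each row keeps a real
    value from the column, but loses its link to the original owner."""
    values = [r[key] for r in rows]
    random.Random(seed).shuffle(values)  # seeded for reproducible runs
    return [dict(r, **{key: v}) for r, v in zip(rows, values)]

employees = [
    {"id": 1, "salary": 50_000},
    {"id": 2, "salary": 72_000},
    {"id": 3, "salary": 64_000},
]
scrambled = scramble_column(employees, "salary")
```

Note that scrambling alone is not anonymization for small or skewed data sets (an outlier salary is still recognizable); in practice it is combined with masking or generalization.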
Posted 1 month ago
6.0 years
18 - 24 Lacs
Hyderābād
On-site
Delphix Senior Engineer Open Positions: 4 Experience: 6+ Years Location: Bangalore, Hyderabad, Chennai, Pune Employment Type: Full-Time About the Role: We are seeking highly skilled Delphix Senior Engineers to join our dynamic team. You will play a critical role in designing, deploying, and optimizing Delphix Data Virtualization and Data Masking solutions across enterprise environments. This role also involves mentoring junior engineers and ensuring best practices in data delivery, privacy, and DevOps enablement. Key Responsibilities: Design and implement scalable Delphix data virtualization and masking architectures. Lead end-to-end solution deployment, configuration, and integration with client systems. Collaborate with development, QA, and operations teams to ensure seamless data delivery for testing and analytics. Automate data provisioning workflows using Delphix APIs and integration tools. Monitor performance, troubleshoot issues, and optimize Delphix environments. Mentor and guide junior engineers; provide training and technical leadership. Document system designs, processes, and operational procedures. Work closely with stakeholders to understand data needs and ensure secure and efficient access. Required Skills & Experience: 6+ years of experience in enterprise data management or DevOps roles. 3+ years of hands-on experience with Delphix Data Virtualization and Data Masking solutions. Strong understanding of RDBMS technologies (Oracle, SQL Server, PostgreSQL, etc.). Experience with scripting (Shell, Python) and Delphix API integrations. Familiarity with CI/CD pipelines and DevOps practices. Excellent problem-solving, communication, and stakeholder management skills. Ability to work independently and lead technical discussions and initiatives. Preferred Qualifications: Experience in cloud-based deployments (AWS, Azure, GCP). Prior experience with Agile/Scrum methodologies. Delphix certifications or formal training. 
Job Type: Full-time Pay: ₹1,800,000.00 - ₹2,400,000.00 per year Schedule: Day shift Work Location: In person
Posted 1 month ago
8.0 years
25 - 30 Lacs
Hyderābād
On-site
Delphix Tech Lead: Lead end-to-end Delphix solution design, implementation, and team guidance across enterprise environments. We are looking for an experienced Delphix Tech Lead to take ownership of the end-to-end design, implementation, and deployment of Delphix solutions across complex enterprise environments. The ideal candidate will provide technical leadership, guide team members, and ensure seamless integration of Delphix into various data architectures. Key Responsibilities: Lead the design and implementation of Delphix data virtualization and masking solutions. Collaborate with stakeholders to understand data management needs and translate them into technical solutions. Oversee installation, configuration, and maintenance of the Delphix platform. Drive performance optimization, automation, and integration with CI/CD pipelines. Provide technical guidance and mentorship to team members. Create technical documentation and ensure best practices are followed. Troubleshoot and resolve issues related to data provisioning and masking. Required Skills & Qualifications: 8+ years of IT experience with at least 3+ years working on the Delphix platform. Strong knowledge of data virtualization, masking, and DevOps processes. Experience with database technologies such as Oracle, SQL Server, or PostgreSQL. Solid understanding of data management and security best practices. Ability to lead projects and coordinate with cross-functional teams. Excellent communication and problem-solving skills. Job Type: Full-time Pay: ₹2,500,000.00 - ₹3,000,000.00 per year Schedule: Day shift Work Location: In person
Posted 1 month ago
5.0 years
14 - 18 Lacs
Hyderābād
On-site
Job Title: Delphix Engineer Experience: 5+ Years Positions Open: 2 Location: Bangalore, Pune, Hyderabad, Chennai Employment Type: Full-time About the Role: We are seeking skilled and motivated Delphix Engineers to join our dynamic team. In this role, you will be responsible for implementing, managing, and optimizing Delphix environments to support high-performing, secure, and efficient data virtualization and delivery. This is an excellent opportunity to contribute to enterprise-level data initiatives and drive automation in modern data infrastructure. Key Responsibilities: Design, implement, and maintain Delphix data virtualization and masking environments. Collaborate with development, testing, and infrastructure teams to deliver virtualized data environments. Automate data provisioning, refresh, masking, and archival processes using Delphix APIs and scripting tools. Monitor system health, troubleshoot issues, and ensure optimal performance and reliability. Manage integration of Delphix with databases like Oracle, SQL Server, PostgreSQL, and others. Ensure compliance with data security and masking requirements across environments. Contribute to documentation, best practices, and knowledge-sharing within the team. Required Skills & Experience: Minimum 5 years of overall experience, with strong expertise in the Delphix Data Platform. Solid hands-on experience with Delphix Virtualization and Masking solutions. Strong scripting skills using Shell, PowerShell, or Python for automation. Experience integrating Delphix with Oracle, SQL Server, or other major databases. Good understanding of data lifecycle management, data masking, and data delivery pipelines. Familiarity with DevOps tools and CI/CD processes is a plus. Strong analytical, troubleshooting, and communication skills. Preferred Qualifications: Delphix certifications (if available). Experience working in Agile/Scrum environments.
Exposure to cloud platforms (AWS, Azure, GCP) and cloud-based Delphix setups Job Type: Full-time Pay: ₹1,400,000.00 - ₹1,800,000.00 per year Schedule: Day shift Work Location: In person
Posted 1 month ago
3.0 - 6.0 years
8 - 10 Lacs
Hyderābād
On-site
Job Title: Delphix Support Engineer Open Positions: 2 Experience: 3–6 Years Location: Bangalore / Pune / Hyderabad / Chennai Employment Type: Full-time About the Role: We are looking for experienced Delphix Support Engineers to join our growing team. In this role, you will provide day-to-day operational support for Delphix platforms, resolve technical issues, and ensure the high availability and performance of data virtualization environments. You will collaborate closely with internal teams and stakeholders to maintain service excellence. Key Responsibilities: Provide Level 1 and Level 2 support for Delphix data virtualization and masking platforms. Monitor system health, performance, and availability of Delphix environments. Diagnose and troubleshoot incidents, escalate critical issues as needed, and drive timely resolution. Coordinate with engineering and infrastructure teams for patching, upgrades, and configuration changes. Maintain and update documentation related to Delphix support procedures and issue resolutions. Perform routine maintenance tasks including backups, restores, and environment refreshes. Ensure compliance with security and operational policies across environments. Required Skills & Experience: 3–6 years of experience supporting enterprise data platforms or infrastructure environments. Hands-on experience with the Delphix Dynamic Data Platform (data virtualization and masking). Strong troubleshooting and analytical skills for resolving performance and availability issues. Familiarity with database platforms (Oracle, SQL Server, etc.) and understanding of data cloning/virtualization. Experience with incident management tools (e.g., ServiceNow, JIRA) and monitoring systems. Excellent communication skills and ability to work in a fast-paced, collaborative environment. Good to Have: Experience with scripting (Shell, Python) for automation. Knowledge of DevOps and CI/CD practices.
Exposure to cloud platforms (AWS, Azure, GCP) and integration with Delphix.

Job Type: Full-time
Pay: ₹800,000.00 - ₹1,000,000.00 per year
Shift: Day shift
Work Days: Monday to Friday
Work Location: In person
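The monitoring and escalation duties above can be automated in part. Below is a minimal, hypothetical sketch of the triage logic an L1/L2 engineer might script around engine metrics; the threshold values and severity labels are invented for illustration and are not taken from Delphix documentation or any real runbook.

```python
from dataclasses import dataclass

# Hypothetical thresholds for illustration only; real values would come
# from the team's operational runbook.
CPU_WARN, CPU_CRIT = 80.0, 95.0
STORAGE_WARN, STORAGE_CRIT = 85.0, 95.0

@dataclass
class EngineStats:
    name: str
    cpu_pct: float
    storage_pct: float

def triage(stats: EngineStats) -> str:
    """Map raw engine metrics to an incident severity for escalation."""
    if stats.cpu_pct >= CPU_CRIT or stats.storage_pct >= STORAGE_CRIT:
        return "P1"  # escalate immediately
    if stats.cpu_pct >= CPU_WARN or stats.storage_pct >= STORAGE_WARN:
        return "P2"  # raise a ticket for follow-up
    return "OK"

print(triage(EngineStats("masking-engine-01", cpu_pct=97.0, storage_pct=60.0)))  # P1
```

In practice a script like this would sit behind a scheduler and feed an incident tool such as ServiceNow rather than printing to stdout.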
Posted 1 month ago
7.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
At TELUS Digital, we enable customer experience innovation through spirited teamwork, agile thinking, and a caring culture that puts customers first. TELUS Digital is the global arm of TELUS Corporation, one of the largest telecommunications service providers in Canada. We deliver contact center and business process outsourcing (BPO) solutions to some of the world's largest corporations in the consumer electronics, finance, telecommunications and utilities sectors. With global call center delivery capabilities, our multi-shore, multi-language programs offer safe, secure infrastructure, value-based pricing, skills-based resources and exceptional customer service - all backed by TELUS, our multi-billion dollar telecommunications parent.

Required Skills:
7+ years of experience in quality assurance, with at least 3+ years in a Test Data Management (TDM) lead or senior role.
Proven experience in designing and implementing test data management strategies, data masking, and test data provisioning for large-scale software projects.
Lead the development and implementation of comprehensive test data management strategies to support functional, regression, performance, security, and other types of testing.
Establish governance processes and best practices for handling, managing, and securing test data across multiple projects and environments.
Ensure that test data complies with legal, regulatory, and organizational security policies (e.g., GDPR, HIPAA).
Design and oversee the creation of high-quality, realistic, and representative test data to meet the needs of different types of testing.
Use data generation tools and techniques to produce test data that mirrors real-world data while maintaining privacy and security.
Develop automated processes for generating and refreshing test data in line with project and release timelines.
Implement and manage data masking, anonymization, and sanitization techniques to ensure sensitive information is protected while retaining data integrity for testing purposes.
Develop and enforce data security practices related to the use and storage of test data.
Work closely with QA, development, and DevOps teams to understand the specific test data requirements for different testing phases (e.g., unit, integration, performance, UAT).
Collaborate with business and IT teams to ensure that required test data is available when needed and meets quality expectations.
Support the creation of data models and mapping to align test data with application requirements.
Implement strategies for efficient storage and retrieval of test data to ensure high performance and reduce resource consumption during testing.
Continuously assess and optimize test data strategies to improve test execution time, resource allocation, and overall testing efficiency.
Manage large-scale data sets and ensure their availability across multiple environments (development, testing, staging, production).
Lead the evaluation, implementation, and continuous improvement of test data management tools and automation platforms (e.g., Informatica TDM, Delphix, IBM InfoSphere Optim).
Leverage automation to streamline test data creation, management, and refresh cycles, ensuring quick access to the latest data for testing.
Drive the adoption of self-service tools to enable teams to generate, refresh, and manage their own test data securely.
Monitor and manage test data usage to ensure compliance with internal standards and external regulations.
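The masking requirement above hinges on one property: protected values must still join consistently across tables. A minimal sketch of deterministic pseudonymization using the standard library — the salt, token format, and column names are made up for the example and are not any particular TDM tool's scheme:

```python
import hashlib

def mask_value(value: str, salt: str = "tdm-salt") -> str:
    """Deterministically pseudonymize a sensitive value.

    The same input always maps to the same token, so foreign-key
    relationships between masked tables are preserved.
    """
    digest = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()
    return "CUST-" + digest[:10]  # hypothetical token format

# Masking the same customer ID in two tables yields the same token,
# so joins still work after masking.
orders = [{"customer_id": "alice@example.com", "amount": 120}]
customers = [{"customer_id": "alice@example.com", "tier": "gold"}]

for row in orders + customers:
    row["customer_id"] = mask_value(row["customer_id"])

assert orders[0]["customer_id"] == customers[0]["customer_id"]
```

Note that a fixed salt keeps masking repeatable across refresh cycles; rotating the salt re-keys every token, which is useful when an environment must be fully re-anonymized.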
Posted 1 month ago
1.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Key Responsibility Areas (KRA)
Associate with Senior Designers at the XP on mood board curation, 3D renders, and 3D and 2D detailed drawings.
Ensure an error-free QC and masking package by making necessary corrections before sending the project into production.

1. Skill Set Required:
Freshers with up to 1 year of experience.
Basic proficiency in SketchUp, Revit and CAD.
Strong willingness to learn and follow instructions.

2. Education:
Diploma in Architecture or Civil, B.Tech Civil, or B.Arch.
Posted 1 month ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Azure Data Engineer with Databricks
Experience: 5 – 10 years
Job Level: Senior Engineer / Lead / Architect
Notice Period: Immediate Joiner

Role Overview
Join our dynamic team at Team Geek Solutions, where we specialize in innovative data solutions and cutting-edge technology implementations to empower businesses across various sectors. We are looking for a skilled Azure Data Engineer with expertise in Databricks to join our high-performing data and AI team for a critical client engagement. The ideal candidate will have strong hands-on experience in building scalable data pipelines, data transformation, and real-time data processing using Azure Data Services and Databricks.

Key Responsibilities
Design, develop, and deploy end-to-end data pipelines using Azure Databricks, Azure Data Factory, and Azure Synapse Analytics.
Perform data ingestion, data wrangling, and ETL/ELT processes from various structured and unstructured data sources (e.g., APIs, on-prem databases, flat files).
Optimize and tune Spark-based jobs and Databricks notebooks for performance and scalability.
Implement best practices for CI/CD, code versioning, and testing in a Databricks environment using DevOps pipelines.
Design data lake and data warehouse solutions using Delta Lake and Synapse Analytics.
Ensure data security, governance, and compliance using Azure-native tools (e.g., Azure Purview, Key Vault, RBAC).
Collaborate with data scientists to enable feature engineering and model training within Databricks.
Write efficient SQL and PySpark code for data transformation and analytics.
Monitor and maintain existing data pipelines and troubleshoot issues in a production environment.
Document technical solutions, architecture diagrams, and data lineage as part of delivery.
Mandatory Skills & Technologies
Azure Cloud Services: Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure Data Lake Storage (Gen2), Azure Key Vault, Azure Functions, Azure Monitor
Databricks Platform: Delta Lake, Databricks Notebooks, Job Clusters, MLflow (optional), Unity Catalog
Programming Languages: PySpark, SQL, Python
Data Pipelines: ETL/ELT pipeline design and orchestration
Version Control & DevOps: Git, Azure DevOps, CI/CD pipelines
Data Modeling: Star/Snowflake schema, dimensional modeling
Performance Tuning: Spark job optimization, data partitioning strategies
Data Governance & Security: Azure Purview, RBAC, data masking

Nice to Have
Experience with Kafka, Event Hub, or other real-time streaming platforms
Exposure to Power BI or other visualization tools
Knowledge of Terraform or ARM templates for infrastructure as code
Experience in MLOps and integration with MLflow for model lifecycle management

Certifications (Good to Have)
Microsoft Certified: Azure Data Engineer Associate
Databricks Certified Data Engineer Associate / Professional
DP-203: Data Engineering on Microsoft Azure

Soft Skills
Strong communication and client interaction skills
Analytical thinking and problem-solving
Agile mindset with familiarity in Scrum/Kanban
Team player with mentoring ability for junior engineers
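The incremental-pipeline work this role describes usually revolves around a high-watermark pattern: apply only source rows newer than the last processed timestamp, upserting by key. A plain-Python sketch of that pattern follows — in a real Databricks pipeline this would be a Delta Lake MERGE driven by PySpark, and the `id`/`updated_at` column names here are invented for the example:

```python
from datetime import datetime

def incremental_upsert(target: dict, source_rows: list, watermark: datetime) -> datetime:
    """Upsert only rows newer than the last watermark, keyed by 'id'.

    A plain-Python illustration of the high-watermark pattern that a
    Delta Lake MERGE in a Databricks notebook typically implements.
    """
    new_watermark = watermark
    for row in source_rows:
        if row["updated_at"] > watermark:
            target[row["id"]] = row  # insert or overwrite by key
            new_watermark = max(new_watermark, row["updated_at"])
    return new_watermark

# Only the row updated after the previous watermark is applied.
target = {}
rows = [
    {"id": 1, "updated_at": datetime(2024, 1, 2)},
    {"id": 2, "updated_at": datetime(2024, 1, 5)},
]
watermark = incremental_upsert(target, rows, datetime(2024, 1, 3))
```

Persisting the returned watermark between runs is what makes each load incremental rather than a full reload.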
Posted 1 month ago