
533 Masking Jobs - Page 11

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Gurugram, Haryana, India

On-site

POSITION: Software Engineer – Data Engineering
LOCATION: Bangalore/Mumbai/Kolkata/Gurugram/Hyderabad/Pune/Chennai
EXPERIENCE: 5-9 Years

ABOUT HASHEDIN
We are software engineers who solve business problems with a Product Mindset for leading global organizations. By combining engineering talent with business insight, we build software and products that create new enterprise value. The secret to our success is a fast-paced learning environment, an extreme-ownership spirit, and a fun culture.

JOB TITLE: Software Engineer – Data Engineering

OVERVIEW OF THE ROLE:
As a Data Engineer or Senior Data Engineer, you will be hands-on in architecting, building, and optimizing robust, efficient, and secure data pipelines and platforms that power business-critical analytics and applications. You will play a central role in implementing and automating scalable batch and streaming data workflows using modern big data and cloud technologies. Working within cross-functional teams, you will deliver well-engineered, high-quality code and data models, and drive best practices for data reliability, lineage, quality, and security.

Mandatory Skills:
• Hands-on software coding or scripting for a minimum of 4 years
• Experience in product management for at least 4 years
• Stakeholder management experience for at least 4 years
• Experience with at least one of the GCP, AWS, or Azure cloud platforms

Key Responsibilities:
• Design, build, and optimize scalable data pipelines and ETL/ELT workflows using Spark (Scala/Python), SQL, and orchestration tools (e.g., Apache Airflow, Prefect, Luigi).
• Implement efficient solutions for high-volume, batch, real-time streaming, and event-driven data processing, leveraging best-in-class patterns and frameworks.
• Build and maintain data warehouse and lakehouse architectures (e.g., Snowflake, Databricks, Delta Lake, BigQuery, Redshift) to support analytics, data science, and BI workloads.
• Develop, automate, and monitor Airflow DAGs/jobs on cloud or Kubernetes, following robust deployment and operational practices (CI/CD, containerization, infra-as-code); a minimal example follows this posting.
• Write performant, production-grade SQL for complex data aggregation, transformation, and analytics tasks.
• Ensure data quality, consistency, and governance across the stack, implementing processes for validation, cleansing, anomaly detection, and reconciliation.

General Skills & Experience:
• Proficiency with Spark (Python or Scala), SQL, and data pipeline orchestration (Airflow, Prefect, Luigi, or similar).
• Experience with cloud data ecosystems (AWS, GCP, Azure) and cloud-native services for data processing (Glue, Dataflow, Dataproc, EMR, HDInsight, Synapse, etc.).
• Hands-on development skills in at least one programming language (Python, Scala, or Java preferred); solid knowledge of software engineering best practices (version control, testing, modularity).
• Deep understanding of batch and streaming architectures (Kafka, Kinesis, Pub/Sub, Flink, Structured Streaming, Spark Streaming).
• Expertise in data warehouse/lakehouse solutions (Snowflake, Databricks, Delta Lake, BigQuery, Redshift, Synapse) and storage formats (Parquet, ORC, Delta, Iceberg, Avro).
• Strong SQL development skills for ETL, analytics, and performance optimization.
• Familiarity with Kubernetes (K8s), containerization (Docker), and deploying data pipelines in distributed/cloud-native environments.
• Experience with data quality frameworks (Great Expectations, Deequ, or custom validation), monitoring/observability tools, and automated testing.
• Working knowledge of data modeling (star/snowflake, normalized, denormalized) and metadata/catalog management.
• Understanding of data security, privacy, and regulatory compliance (access management, PII masking, auditing, GDPR/CCPA/HIPAA).
• Familiarity with BI or visualization tools (PowerBI, Tableau, Looker, etc.) is an advantage but not core.
• Previous experience with data migrations, modernization, or refactoring legacy ETL processes to modern cloud architectures is a strong plus.
• Bonus: Exposure to open-source data tools (dbt, Delta Lake, Apache Iceberg, Amundsen, Great Expectations, etc.) and knowledge of DevOps/MLOps processes.

EDUCATIONAL QUALIFICATIONS:
• Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field (or equivalent experience).
• Certifications in cloud platforms (AWS, GCP, Azure) and/or data engineering (AWS Data Analytics, GCP Data Engineer, Databricks).
• Experience working in an Agile environment with exposure to CI/CD, Git, Jira, Confluence, and code review processes.
• Prior work in highly regulated or large-scale enterprise data environments (finance, healthcare, or similar) is a plus.
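For context, here is a minimal sketch of the kind of Airflow DAG plus PySpark PII-masking step this posting describes. Bucket paths, column names, and the DAG id are hypothetical, and the Airflow 2.x API is assumed:

```python
# Minimal sketch (hypothetical paths/names): a daily Airflow 2.x DAG that runs
# a PySpark step to hash PII columns before publishing data for analytics.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def mask_pii_and_load():
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("pii_masking").getOrCreate()
    df = spark.read.parquet("s3://example-bucket/raw/customers/")  # assumed path
    # Irreversibly hash direct identifiers; drop free-text columns entirely.
    masked = (
        df.withColumn("email", F.sha2(F.col("email"), 256))
          .withColumn("phone", F.sha2(F.col("phone"), 256))
          .drop("notes")
    )
    masked.write.mode("overwrite").parquet("s3://example-bucket/masked/customers/")


with DAG(
    dag_id="mask_customer_pii",       # hypothetical
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    PythonOperator(task_id="mask_pii_and_load", python_callable=mask_pii_and_load)
```

Hashing with SHA-256 keeps identifiers joinable across tables while making the raw values unrecoverable, which is one common pattern for the PII masking the posting mentions.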

Posted 1 month ago

Apply

1.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Key Responsibility Areas (KRA)
Associate with Senior Designers at the XP on mood board curation and on preparing 3D renders and 3D/2D detailed drawings. Ensure an error-free QC and masking package by making the necessary corrections before sending the project into production.
1. Skill Set Required: Freshers with up to 1 year of experience. Basic proficiency in SketchUp, Revit, and CAD. Strong willingness to learn and follow instructions.
2. Education: Diploma in Architecture or Civil, B.Tech Civil, or B.Arch.

Posted 1 month ago

Apply

0.0 - 6.0 years

0 Lacs

Udaipur, Rajasthan

Remote

Senior Software Engineer - Data Governance

Kadel Labs is a leading IT services company delivering top-quality technology solutions since 2017, focused on enhancing business operations and productivity through tailored, scalable, and future-ready solutions. With deep domain expertise and a commitment to innovation, we help businesses stay ahead of technological trends. As a CMMI Level 3 and ISO 27001:2022 certified company, we ensure best-in-class process maturity and information security, enabling organizations to achieve their digital transformation goals with confidence and efficiency.

Role: Senior Software Engineer - Data Governance
Experience: 6-8 Yrs
Location: Udaipur, Jaipur, Bangalore
Domain: Telecom

Job Description:
We are seeking an experienced Telecom Data Governance Lead to join our team. In this role, you will be responsible for defining and implementing the data governance strategy. The role involves establishing metadata standards, defining attribute ownership models, ensuring regulatory compliance, and improving data quality and trust across the enterprise.

Key Responsibilities:
Define and implement an enterprise-wide data governance framework
Own the metadata catalog and ensure consistency across business and technical assets
Develop and manage KPI registries, data dictionaries, and lineage documentation
Collaborate with data stewards and domain owners to establish attribute ownership
Lead efforts around data standardization, quality rules, and classification of sensitive data
Ensure privacy and compliance (e.g., GDPR, PII, PHI) by enforcing tagging, masking, and access rules (a sketch follows this posting)
Define access control rules (purpose-based views, user roles, sensitivity levels)
Oversee governance for data products and federated data domains
Support internal audits and external regulatory reviews
Coordinate with platform, analytics, security, and compliance teams

Required Skills:
6+ years of experience in data governance roles, with at least 3-4 years in the telecommunications industry
Experience integrating governance with modern data stacks (e.g., Databricks, Snowflake)
Strong experience with data governance tools (e.g., Alation, Unity Catalog, Azure Purview)
Proven understanding of metadata management, data lineage, and data quality frameworks
Experience implementing federated governance models and data stewardship programs
Knowledge of compliance requirements (GDPR, PII, TMForum, etc.)
Familiarity with data mesh principles and data contract approaches
Excellent communication and stakeholder management skills
Background in telecom, networking, or other data-rich industries
Certification in data governance or management frameworks

Educational Qualifications:
· Bachelor's degree in Computer Science, Information Technology, or a related field.

Visit us:
https://kadellabs.com/
https://in.linkedin.com/company/kadel-labs
https://www.glassdoor.co.in/Overview/Working-at-Kadel-Labs-EI_IE4991279.11,21.htm

Job Types: Full-time, Permanent
Pay: ₹2,087,062.21 - ₹2,209,304.16 per year
Benefits: Flexible schedule, Health insurance, Leave encashment, Paid time off, Provident Fund, Work from home
Schedule: Day shift, Monday to Friday
Supplemental Pay: Overtime pay, Performance bonus, Quarterly bonus, Yearly bonus
Ability to commute/relocate: Udaipur City, Rajasthan: Reliably commute or planning to relocate before starting work (Required)
Application Question(s): How many years of experience in Telecom Data Engineering?
Experience: Data Engineer: 9 years (Required); Data governance: 6 years (Required)
Location: Udaipur City, Rajasthan (Required)
Work Location: In person
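As an illustration of the tagging-and-masking enforcement described above, here is a minimal sketch. The classification tags, column names, and policy choices are hypothetical, not a specific governance tool's API:

```python
# Minimal sketch (hypothetical schema/tags): apply masking rules to columns
# based on governance classification tags before exposing data to consumers.
import hashlib

# Assumed metadata-catalog output: column -> sensitivity classification.
CLASSIFICATION = {
    "msisdn": "PII",           # telecom subscriber number
    "imsi": "PII",
    "plan_name": "PUBLIC",
    "monthly_bill": "CONFIDENTIAL",
}

def mask_value(value, tag):
    """Return a value transformed per its classification tag."""
    if tag == "PII":
        # Deterministic hash keeps joinability without exposing the identifier.
        return hashlib.sha256(str(value).encode()).hexdigest()[:16]
    if tag == "CONFIDENTIAL":
        return None  # suppress entirely for non-privileged consumers
    return value     # PUBLIC passes through unchanged

def mask_row(row: dict) -> dict:
    # Columns missing from the catalog default to CONFIDENTIAL (fail closed).
    return {col: mask_value(val, CLASSIFICATION.get(col, "CONFIDENTIAL"))
            for col, val in row.items()}

print(mask_row({"msisdn": "9876543210", "plan_name": "Gold", "monthly_bill": 799}))
```

Defaulting unknown columns to the most restrictive class is a common fail-closed choice in governance pipelines.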

Posted 1 month ago

Apply

3.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Artic Consulting is a dynamic IT and consulting services firm, delivering digital transformation through innovative solutions in data, cloud, and business analytics. We are seeking a skilled Data Engineer with a strong focus on the Microsoft Fabric ecosystem who can design and implement scalable data solutions for our clients.

Key Responsibilities:
Design, develop, and maintain Power BI reports, dashboards, DAX expressions, KPIs, and scorecards using both Import and DirectQuery modes
Build and orchestrate scalable ETL/ELT workflows using Fabric Data Pipelines, Dataflows Gen2, and Azure Data Factory
Write and tune complex T-SQL and KQL queries, stored procedures, and views for performance in Synapse SQL and SQL Server environments
Implement data models based on star/snowflake schemas and support modern data warehousing and Lakehouse architectures using Microsoft Fabric
Integrate structured and unstructured data sources (e.g., SQL, Excel, APIs, Blob Storage) and transform them efficiently using Fabric Notebooks (Spark/PySpark) or Dataflows
Diagnose and resolve pipeline failures, logic errors, and performance bottlenecks across the data engineering lifecycle
Automate repetitive data processes using Azure Functions, Logic Apps, PowerShell, or Python scripting within the Fabric ecosystem
Collaborate with stakeholders to gather business requirements and translate them into scalable data solutions
Ensure data governance, privacy, and compliance standards (e.g., GDPR, HIPAA, ISO) are adhered to, including sensitive data handling policies
Apply best practices for item-level security, workspace-based access models, and data lineage using Microsoft OneLake and Fabric tools

Required Qualifications:
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
Minimum 3 years of experience in Power BI development and data engineering
Strong expertise in T-SQL and KQL, with demonstrated query optimization skills
Proficiency in Microsoft Fabric tools: Data Pipelines, Dataflows Gen2, OneLake, Notebooks
Hands-on experience with Spark/PySpark and data integration from varied sources
Microsoft Power BI certification (PL-300) or equivalent Microsoft certifications

Preferred Skills:
Experience debugging and optimizing advanced SQL queries, database objects, and legacy components
Ability to implement database security models and data protection policies
Expertise in implementing row-level security, dynamic data masking, and role-based access control within Microsoft Fabric and Power BI environments (a sketch follows this posting)
Familiarity with Microsoft OneLake architecture, including data cataloging, lineage tracking, item-level security, and workspace-based access management
Proven ability to operate effectively in dynamic, client-facing environments, delivering scalable and compliant data solutions with a focus on performance and quality

Why Join Artic Consulting?
Work with cutting-edge Microsoft technologies in a dynamic and collaborative environment
Flexible work culture with opportunities for growth and certifications
Be part of a mission to deliver impactful digital transformation for clients globally
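For the dynamic data masking mentioned above, here is a minimal sketch using the standard T-SQL masking syntax, driven from Python via pyodbc. The server, database, table, and role names are hypothetical:

```python
# Minimal sketch (assumed connection details and table names): applying
# SQL Server / Azure SQL dynamic data masking from Python via pyodbc.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example.database.windows.net;DATABASE=SalesDW;"  # hypothetical
    "UID=etl_admin;PWD=s3cret;"                              # use a vault in practice
)
cur = conn.cursor()

# Built-in masking functions include default(), email(), and partial().
cur.execute("""
    ALTER TABLE dbo.Customers
    ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');
""")
cur.execute("""
    ALTER TABLE dbo.Customers
    ALTER COLUMN Phone ADD MASKED WITH (FUNCTION = 'partial(0,"XXX-XXX-",4)');
""")
# Unprivileged readers now see masked values; UNMASK exempts a trusted role.
cur.execute("GRANT UNMASK TO ReportingAdmins;")
conn.commit()
```

Note that dynamic data masking is presentation-layer protection only; it does not replace encryption or row-level security for genuinely sensitive data.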

Posted 1 month ago

Apply

6.0 - 11.0 years

16 - 31 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Greetings from Cognizant! We are hiring for a permanent position with Cognizant.

Experience: 3-12 Yrs
Mandatory: experience in TDM, GenRocket, Delphix, Informatica
Work Location: Pan India
Interview Mode: Virtual
Interview Date: Weekday & Weekend

Job Summary and Responsibilities:
Strong Test Data Manager with hands-on test data experience, preferably in TDM consultancy and implementation of a TDM tool
Thorough understanding of Test Data Management, with hands-on experience in data generation, masking, and profiling (a data-generation sketch follows this posting)
Experience in enterprise-level TDM solutioning and implementation
Experience in client-facing roles and good stakeholder management
Experience in Google Cloud or cloud test data handling preferred
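As context for the data-generation side of TDM, here is a minimal sketch using the Faker library; the table shape and locale choice are illustrative, and commercial suites like GenRocket or Informatica TDM would normally drive this at scale:

```python
# Minimal sketch: generating synthetic test data with Faker, one common
# complement to masking when production data cannot be copied at all.
from faker import Faker

fake = Faker("en_IN")  # Indian locale for realistic names/phones
Faker.seed(42)         # seeded, so test datasets are reproducible

def synthetic_customer(row_id: int) -> dict:
    # Shape mirrors a hypothetical production customers table.
    return {
        "customer_id": row_id,
        "name": fake.name(),
        "email": fake.email(),
        "phone": fake.phone_number(),
        "address": fake.address().replace("\n", ", "),
    }

test_rows = [synthetic_customer(i) for i in range(1, 6)]
for row in test_rows:
    print(row)
```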

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Overview
This role is responsible for coordinating resources, solving technical requirements, and evaluating risks and scope of SAP improvements, upgrades, and implementations for Global PGT and individual PGT, and for deploying technological solutions according to PepsiCo's SAP/IT best practices and compliance. This role is also responsible for assessing functional requirements, guiding the group as per Application Security guidelines and compliance standard methodologies, and ensuring transparent security design. It provides subject matter expertise (SME) in solutioning and implementing SAP access management requirements. This role serves as the leader for cybersecurity governance, engineering, and reporting for PGT, and is the liaison with Information Security. Additionally, this role's objective is to successfully deliver security upfront across all PGT deployments while ensuring consistency in approach and providing visibility through communication and alignment with key stakeholders.

Responsibilities
Point person for PGT SAP implementations with the leaders, functional team, and business unit
Provide project progress information to functional and business Directors and Managers
Minimize critical SoD risk during implementations and provide guidance during each phase to achieve SAP security governance and controls (a sketch follows this posting)
Work closely with the controls team (IT, configurable, and internal control) and continue supporting best practices
Communicate with the governance team in order to implement local and global best practices
Coordinate SAP Security implementations during the lifecycle of projects
Consolidate and support PGT implementations regarding SAP Security best practices
Introduce delivery automation processes; actively participate in Continuous Process Improvement initiatives by striving to look for possible efficiencies, scalability, and/or cost reduction opportunities
Work with limited supervision and exhibit a solid sense of urgency
Facilitate internal and external audits as requested
Always ensure data protection by leveraging data masking and data scrambling techniques
Responsible for leadership reporting on various Information Security metrics across Tech Strategy and Enterprise Solutions teams
Collaborate with the Information Security organization on remediation of security vulnerabilities so that the security health index is maintained as intended
Manage and provide status updates on security assessments, vulnerability remediation, and exceptions
Provide security engineering expertise for the PGT program
Provide regular updates to Information Security leadership on PGT status, risks, and issues

Qualifications
Bachelor's degree in Computer Science (or equivalent) is required
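To illustrate the Segregation-of-Duties (SoD) risk checks mentioned above, here is a minimal sketch. The rule pairs, action names, and extract format are hypothetical, not SAP GRC's actual rule set or API:

```python
# Minimal sketch (hypothetical role/rule data): flagging Segregation-of-Duties
# conflicts in access assignments before a go-live.

# Rule pairs that must not be held by the same user.
SOD_RULES = [
    ("CREATE_VENDOR", "APPROVE_PAYMENT"),
    ("POST_JOURNAL", "APPROVE_JOURNAL"),
]

# Assumed extract from the access system: user -> set of granted actions.
USER_ACTIONS = {
    "jdoe": {"CREATE_VENDOR", "APPROVE_PAYMENT"},
    "asmith": {"POST_JOURNAL"},
}

def sod_conflicts(user_actions, rules):
    """Return (user, action_a, action_b) tuples for every violated rule."""
    findings = []
    for user, actions in user_actions.items():
        for a, b in rules:
            if a in actions and b in actions:
                findings.append((user, a, b))
    return findings

for user, a, b in sod_conflicts(USER_ACTIONS, SOD_RULES):
    print(f"SoD conflict: {user} holds both {a} and {b}")
```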

Posted 1 month ago

Apply

6.0 years

18 - 24 Lacs

Hyderābād

On-site

Delphix Senior Engineer
Open Positions: 4
Experience: 6+ Years
Location: Bangalore, Hyderabad, Chennai, Pune
Employment Type: Full-Time

About the Role:
We are seeking highly skilled Delphix Senior Engineers to join our dynamic team. You will play a critical role in designing, deploying, and optimizing Delphix Data Virtualization and Data Masking solutions across enterprise environments. This role also involves mentoring junior engineers and ensuring best practices in data delivery, privacy, and DevOps enablement.

Key Responsibilities:
Design and implement scalable Delphix data virtualization and masking architectures.
Lead end-to-end solution deployment, configuration, and integration with client systems.
Collaborate with development, QA, and operations teams to ensure seamless data delivery for testing and analytics.
Automate data provisioning workflows using Delphix APIs and integration tools (a sketch follows this posting).
Monitor performance, troubleshoot issues, and optimize Delphix environments.
Mentor and guide junior engineers; provide training and technical leadership.
Document system designs, processes, and operational procedures.
Work closely with stakeholders to understand data needs and ensure secure and efficient access.

Required Skills & Experience:
6+ years of experience in enterprise data management or DevOps roles.
3+ years of hands-on experience with Delphix Data Virtualization and Data Masking solutions.
Strong understanding of RDBMS technologies (Oracle, SQL Server, PostgreSQL, etc.).
Experience with scripting (Shell, Python) and Delphix API integrations.
Familiarity with CI/CD pipelines and DevOps practices.
Excellent problem-solving, communication, and stakeholder management skills.
Ability to work independently and lead technical discussions and initiatives.

Preferred Qualifications:
Experience in cloud-based deployments (AWS, Azure, GCP).
Prior experience with Agile/Scrum methodologies.
Delphix certifications or formal training.

Job Type: Full-time
Pay: ₹1,800,000.00 - ₹2,400,000.00 per year
Schedule: Day shift
Work Location: In person
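The provisioning automation described above usually runs against the engine's REST API. The following is a generic sketch only: the host, endpoint paths, and payloads are hypothetical placeholders, not the documented Delphix API:

```python
# Minimal sketch: automating a test-database refresh through a REST API, as
# one might with a Delphix-style engine. Paths and payloads are illustrative.
import requests

ENGINE = "https://delphix.example.internal"   # hypothetical engine host
AUTH = ("automation_user", "s3cret")          # use a vault in practice

def refresh_vdb(vdb_name: str) -> None:
    with requests.Session() as s:
        s.auth = AUTH
        # Look up the virtual database by name (illustrative path).
        r = s.get(f"{ENGINE}/api/vdbs", params={"name": vdb_name}, timeout=30)
        r.raise_for_status()
        vdb_id = r.json()["items"][0]["id"]
        # Trigger a refresh from the latest source snapshot (illustrative path).
        r = s.post(f"{ENGINE}/api/vdbs/{vdb_id}/refresh", timeout=30)
        r.raise_for_status()
        print(f"Refresh started for {vdb_name}: job {r.json().get('jobId')}")

if __name__ == "__main__":
    refresh_vdb("qa_orders_vdb")  # hypothetical VDB name
```

In practice such a script would be wired into a CI/CD pipeline so test environments refresh automatically before each regression run.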

Posted 1 month ago

Apply

8.0 years

25 - 30 Lacs

Hyderābād

On-site

Delphix Tech Lead
Lead end-to-end Delphix solution design, implementation, and team guidance across enterprise environments.

We are looking for an experienced Delphix Tech Lead to take ownership of the end-to-end design, implementation, and deployment of Delphix solutions across complex enterprise environments. The ideal candidate will provide technical leadership, guide team members, and ensure seamless integration of Delphix into various data architectures.

Key Responsibilities:
Lead the design and implementation of Delphix data virtualization and masking solutions.
Collaborate with stakeholders to understand data management needs and translate them into technical solutions.
Oversee installation, configuration, and maintenance of the Delphix platform.
Drive performance optimization, automation, and integration with CI/CD pipelines.
Provide technical guidance and mentorship to team members.
Create technical documentation and ensure best practices are followed.
Troubleshoot and resolve issues related to data provisioning and masking.

Required Skills & Qualifications:
8+ years of IT experience with at least 3+ years working on the Delphix platform.
Strong knowledge of data virtualization, masking, and DevOps processes.
Experience with database technologies such as Oracle, SQL Server, or PostgreSQL.
Solid understanding of data management and security best practices.
Ability to lead projects and coordinate with cross-functional teams.
Excellent communication and problem-solving skills.

Job Type: Full-time
Pay: ₹2,500,000.00 - ₹3,000,000.00 per year
Schedule: Day shift
Work Location: In person

Posted 1 month ago

Apply

5.0 years

14 - 18 Lacs

Hyderābād

On-site

Job Title: Delphix Engineer
Experience: 5+ Years
Positions Open: 2
Location: Bangalore, Pune, Hyderabad, Chennai
Employment Type: Full-time

About the Role:
We are seeking skilled and motivated Delphix Engineers to join our dynamic team. In this role, you will be responsible for implementing, managing, and optimizing Delphix environments to support high-performing, secure, and efficient data virtualization and delivery. This is an excellent opportunity to contribute to enterprise-level data initiatives and drive automation in modern data infrastructure.

Key Responsibilities:
Design, implement, and maintain Delphix data virtualization and masking environments.
Collaborate with development, testing, and infrastructure teams to deliver virtualized data environments.
Automate data provisioning, refresh, masking, and archival processes using Delphix APIs and scripting tools.
Monitor system health, troubleshoot issues, and ensure optimal performance and reliability.
Manage integration of Delphix with databases such as Oracle, SQL Server, and PostgreSQL.
Ensure compliance with data security and masking requirements across environments.
Contribute to documentation, best practices, and knowledge sharing within the team.

Required Skills & Experience:
Minimum 5 years of overall experience, with strong expertise in the Delphix Data Platform.
Solid hands-on experience with Delphix Virtualization and Masking solutions.
Strong scripting skills using Shell, PowerShell, or Python for automation.
Experience integrating Delphix with Oracle, SQL Server, or other major databases.
Good understanding of data lifecycle management, data masking, and data delivery pipelines.
Familiarity with DevOps tools and CI/CD processes is a plus.
Strong analytical, troubleshooting, and communication skills.

Preferred Qualifications:
Delphix certifications (if available).
Experience working in Agile/Scrum environments.
Exposure to cloud platforms (AWS, Azure, GCP) and cloud-based Delphix setups.

Job Type: Full-time
Pay: ₹1,400,000.00 - ₹1,800,000.00 per year
Schedule: Day shift
Work Location: In person

Posted 1 month ago

Apply

3.0 - 6.0 years

8 - 10 Lacs

Hyderābād

On-site

Job Title: Delphix Support Engineer
Open Positions: 2
Experience: 3-6 Years
Location: Bangalore / Pune / Hyderabad / Chennai
Employment Type: Full-time

About the Role:
We are looking for experienced Delphix Support Engineers to join our growing team. In this role, you will provide day-to-day operational support for Delphix platforms, resolve technical issues, and ensure the high availability and performance of data virtualization environments. You will collaborate closely with internal teams and stakeholders to maintain service excellence.

Key Responsibilities:
Provide Level 1 and Level 2 support for Delphix data virtualization and masking platforms.
Monitor system health, performance, and availability of Delphix environments.
Diagnose and troubleshoot incidents, escalate critical issues as needed, and drive timely resolution.
Coordinate with engineering and infrastructure teams for patching, upgrades, and configuration changes.
Maintain and update documentation related to Delphix support procedures and issue resolutions.
Perform routine maintenance tasks including backups, restores, and environment refreshes.
Ensure compliance with security and operational policies across environments.

Required Skills & Experience:
3-6 years of experience supporting enterprise data platforms or infrastructure environments.
Hands-on experience with the Delphix Dynamic Data Platform (data virtualization and masking).
Strong troubleshooting and analytical skills for resolving performance and availability issues.
Familiarity with database platforms (Oracle, SQL Server, etc.) and an understanding of data cloning/virtualization.
Experience with incident management tools (e.g., ServiceNow, JIRA) and monitoring systems.
Excellent communication skills and the ability to work in a fast-paced, collaborative environment.

Good to Have:
Experience with scripting (Shell, Python) for automation.
Knowledge of DevOps and CI/CD practices.
Exposure to cloud platforms (AWS, Azure, GCP) and integration with Delphix.

Job Type: Full-time
Pay: ₹800,000.00 - ₹1,000,000.00 per year
Shift: Day shift
Work Days: Monday to Friday
Work Location: In person

Posted 1 month ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

At TELUS Digital, we enable customer experience innovation through spirited teamwork, agile thinking, and a caring culture that puts customers first. TELUS Digital is the global arm of TELUS Corporation, one of the largest telecommunications service providers in Canada. We deliver contact center and business process outsourcing (BPO) solutions to some of the world's largest corporations in the consumer electronics, finance, telecommunications, and utilities sectors. With global call center delivery capabilities, our multi-shore, multi-language programs offer safe, secure infrastructure, value-based pricing, skills-based resources, and exceptional customer service - all backed by TELUS, our multi-billion-dollar telecommunications parent.

Required Skills:
7+ years of experience in quality assurance, with at least 3+ years in a Test Data Management (TDM) lead or senior role.
Proven experience in designing and implementing test data management strategies, data masking, and test data provisioning for large-scale software projects.
Lead the development and implementation of comprehensive test data management strategies to support functional, regression, performance, security, and other types of testing.
Establish governance processes and best practices for handling, managing, and securing test data across multiple projects and environments.
Ensure that test data complies with legal, regulatory, and organizational security policies (e.g., GDPR, HIPAA).
Design and oversee the creation of high-quality, realistic, and representative test data to meet the needs of different types of testing.
Use data generation tools and techniques to produce test data that mirrors real-world data while maintaining privacy and security.
Develop automated processes for generating and refreshing test data in line with project and release timelines.
Implement and manage data masking, anonymization, and sanitization techniques to ensure sensitive information is protected while retaining data integrity for testing purposes (a masking sketch follows this posting).
Develop and enforce data security practices related to the use and storage of test data.
Work closely with QA, development, and DevOps teams to understand the specific test data requirements for different testing phases (e.g., unit, integration, performance, UAT).
Collaborate with business and IT teams to ensure that required test data is available when needed and meets quality expectations.
Support the creation of data models and mapping to align test data with application requirements.
Implement strategies for efficient storage and retrieval of test data to ensure high performance and reduce resource consumption during testing.
Continuously assess and optimize test data strategies to improve test execution time, resource allocation, and overall testing efficiency.
Manage large-scale data sets and ensure their availability across multiple environments (development, testing, staging, production).
Lead the evaluation, implementation, and continuous improvement of test data management tools and automation platforms (e.g., Informatica TDM, Delphix, IBM InfoSphere Optim).
Leverage automation to streamline test data creation, management, and refresh cycles, ensuring quick access to the latest data for testing.
Drive the adoption of self-service tools to enable teams to generate, refresh, and manage their own test data securely.
Monitor and manage test data usage to ensure compliance with internal standards and external regulations.
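One core property the masking techniques above must preserve is referential integrity: the same identifier must mask to the same token everywhere. A minimal sketch using keyed HMAC (key handling and field names are illustrative):

```python
# Minimal sketch: deterministic masking with HMAC so the same input always
# maps to the same token, preserving joins across masked test databases.
import hashlib
import hmac

MASKING_KEY = b"rotate-me-and-store-in-a-vault"  # assumption: key management exists

def mask_identifier(value: str) -> str:
    """Stable, non-reversible token for a sensitive identifier."""
    digest = hmac.new(MASKING_KEY, value.encode("utf-8"), hashlib.sha256)
    return "tok_" + digest.hexdigest()[:20]

# The same customer ID masks identically in both tables, so joins still work.
orders_row = {"order_id": 1001, "customer_id": mask_identifier("CUST-829")}
customers_row = {"customer_id": mask_identifier("CUST-829"), "segment": "retail"}
assert orders_row["customer_id"] == customers_row["customer_id"]
print(orders_row, customers_row)
```

Using a secret key (rather than a plain hash) prevents dictionary attacks against identifiers drawn from small, guessable value spaces.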

Posted 1 month ago

Apply

1.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Key Responsibility Areas (KRA)
Associate with Senior Designers at the XP on mood board curation and on preparing 3D renders and 3D/2D detailed drawings. Ensure an error-free QC and masking package by making the necessary corrections before sending the project into production.
1. Skill Set Required: Freshers with up to 1 year of experience. Basic proficiency in SketchUp, Revit, and CAD. Strong willingness to learn and follow instructions.
2. Education: Diploma in Architecture or Civil, B.Tech Civil, or B.Arch.

Posted 1 month ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: Azure Data Engineer with Databricks
Experience: 5-10 years
Job Level: Senior Engineer / Lead / Architect
Notice Period: Immediate Joiner

Role Overview
Join our dynamic team at Team Geek Solutions, where we specialize in innovative data solutions and cutting-edge technology implementations to empower businesses across various sectors. We are looking for a skilled Azure Data Engineer with expertise in Databricks to join our high-performing data and AI team for a critical client engagement. The ideal candidate will have strong hands-on experience in building scalable data pipelines, data transformation, and real-time data processing using Azure Data Services and Databricks.

Key Responsibilities
Design, develop, and deploy end-to-end data pipelines using Azure Databricks, Azure Data Factory, and Azure Synapse Analytics.
Perform data ingestion, data wrangling, and ETL/ELT processes from various structured and unstructured data sources (e.g., APIs, on-prem databases, flat files).
Optimize and tune Spark-based jobs and Databricks notebooks for performance and scalability.
Implement best practices for CI/CD, code versioning, and testing in a Databricks environment using DevOps pipelines.
Design data lake and data warehouse solutions using Delta Lake and Synapse Analytics.
Ensure data security, governance, and compliance using Azure-native tools (e.g., Azure Purview, Key Vault, RBAC).
Collaborate with data scientists to enable feature engineering and model training within Databricks.
Write efficient SQL and PySpark code for data transformation and analytics.
Monitor and maintain existing data pipelines and troubleshoot issues in a production environment.
Document technical solutions, architecture diagrams, and data lineage as part of delivery.

Mandatory Skills & Technologies
Azure Cloud Services: Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure Data Lake Storage (Gen2), Azure Key Vault, Azure Functions, Azure Monitor
Databricks Platform: Delta Lake, Databricks Notebooks, Job Clusters, MLflow (optional), Unity Catalog
Programming Languages: PySpark, SQL, Python
Data Pipelines: ETL/ELT pipeline design and orchestration
Version Control & DevOps: Git, Azure DevOps, CI/CD pipelines
Data Modeling: Star/snowflake schema, dimensional modeling
Performance Tuning: Spark job optimization, data partitioning strategies
Data Governance & Security: Azure Purview, RBAC, data masking (a sketch follows this posting)

Nice to Have
Experience with Kafka, Event Hub, or other real-time streaming platforms
Exposure to Power BI or other visualization tools
Knowledge of Terraform or ARM templates for infrastructure as code
Experience in MLOps and integration with MLflow for model lifecycle management

Certifications (Good to Have)
Microsoft Certified: Azure Data Engineer Associate
Databricks Certified Data Engineer Associate / Professional
DP-203: Data Engineering on Microsoft Azure

Soft Skills
Strong communication and client interaction skills
Analytical thinking and problem-solving
Agile mindset with familiarity in Scrum/Kanban
Team player with mentoring ability for junior engineers

Skills: data partitioning strategies, azure functions, data analytics, unity catalog, rbac, databricks, elt, devops, azure data factory, delta lake, data factory, spark job optimization, job clusters, azure devops, etl/elt pipeline design and orchestration, data masking, azure key vault, azure databricks, azure data engineer, azure synapse, star/snowflake schema, azure data lake storage (gen2), git, sql, etl, snowflake, azure, python, azure cloud services, azure purview, pyspark, mlflow, ci/cd pipelines, dimensional modeling, sql server, big data technologies, azure monitor, azure synapse analytics, databricks notebooks
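A minimal sketch of the masking step in a Databricks-style PySpark pipeline. The ADLS paths and column names are assumed, and a cluster with Delta Lake support is presumed:

```python
# Minimal sketch (assumed ADLS paths/columns): a PySpark job that masks PII
# with built-in functions and writes the result as a Delta table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("masked_load").getOrCreate()

raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/customers/")

masked = (
    raw
    # Keep only the domain of the email address.
    .withColumn("email", F.regexp_replace("email", r"^[^@]+", "****"))
    # Tokenize the customer ID deterministically so joins still work.
    .withColumn("customer_id", F.sha2(F.col("customer_id").cast("string"), 256))
)

(masked.write
    .format("delta")
    .mode("overwrite")
    .save("abfss://curated@examplelake.dfs.core.windows.net/customers_masked/"))
```

On Unity Catalog, the same outcome can instead be enforced declaratively with column masks, keeping the raw table intact for privileged readers.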

Posted 1 month ago

Apply

6.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Experience: 6-8 years

Required Technical Skill Set:
HPE Storage platforms (HPE Nimble, HPE Primera, HPE 3PAR): deployment, configuration, replication, and performance tuning; enterprise setup, firmware management, scalability, provisioning, snapshot/replication configuration
SAN switch technologies (Cisco MDS, Brocade switches): zoning, fabric configuration, troubleshooting, fabric management, diagnostics, and upgrades

Desired Competencies (Technical/Behavioral Competency)

Must-Have:
· Experience with LUN provisioning, masking, and zoning across multi-host environments (a zoning sketch follows this posting).
· Proficiency with Fibre Channel, iSCSI, and FCoE protocols for block-level storage connectivity.
· Knowledge of storage replication, snapshot technologies, and remote data protection solutions.
· Proficiency in backup integration and disaster recovery strategies in storage environments.
· Experience performing firmware upgrades and hardware lifecycle management on storage devices.
· Ability to conduct and analyze storage performance assessments, capacity planning, and security audits.
· Familiarity with storage monitoring, alerting, and reporting tools for proactive system health checks.
· Troubleshooting of hardware-level and storage network issues affecting performance and availability.
· Adequate knowledge of Ethernet/iSCSI and Fibre Channel-based SAN topologies.

Good-to-Have:
· Hands-on experience with HPE Storage platforms (HPE Nimble, HPE Primera, HPE 3PAR): deployment, configuration, replication, and performance tuning; enterprise setup, firmware management, scalability, provisioning, snapshot/replication configuration.

Responsibility of / Expectations from the Role:
1. Ability to work independently in a fast-paced dynamic environment.
2. Proven experience in designing, implementing, and managing enterprise storage solutions.
3. Deep knowledge of SAN (Storage Area Network), NAS (Network Attached Storage), and DAS (Direct Attached Storage) technologies.
4. Expertise in RAID levels, disk provisioning, and storage performance optimization.
5. Strong understanding of SAN switch technologies (Cisco MDS, Brocade switches): zoning, fabric configuration, troubleshooting, fabric management, diagnostics, and upgrades.
6. Experience performing firmware upgrades and hardware lifecycle management on storage devices.
7. Ability to conduct and analyze storage performance assessments, capacity planning, and security audits.
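For the zoning work mentioned above, here is a heavily hedged sketch that pushes a Fibre Channel zone to a Cisco MDS switch with Netmiko. The device details and WWPNs are hypothetical, and the exact zoning command syntax should be verified against your NX-OS/MDS release before use:

```python
# Minimal sketch (hypothetical device/WWPNs): configuring a zone for SAN
# zoning/LUN masking on a Cisco MDS switch via Netmiko. Verify commands
# against your platform documentation; this is illustrative only.
from netmiko import ConnectHandler

switch = {
    "device_type": "cisco_nxos",      # MDS switches run NX-OS-family software
    "host": "mds-fabric-a.example",   # hypothetical
    "username": "admin",
    "password": "s3cret",             # use a vault in practice
}

zone_cmds = [
    "zone name APP1_HOST1 vsan 100",
    "member pwwn 10:00:00:00:c9:aa:bb:01",   # host HBA (hypothetical)
    "member pwwn 50:06:01:60:3e:a0:12:34",   # array target port (hypothetical)
    "zoneset name FABRIC_A vsan 100",
    "member APP1_HOST1",
    "zoneset activate name FABRIC_A vsan 100",
]

with ConnectHandler(**switch) as conn:
    output = conn.send_config_set(zone_cmds)
    print(output)
```

Zoning on the fabric is typically paired with LUN masking on the array side so a host only sees the volumes explicitly presented to it.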

Posted 1 month ago

Apply

3.0 - 4.0 years

0 Lacs

India

On-site

Note: Kindly note that only the qualifications below will be considered for this position.
Mandatory educational qualification: Bachelor's in Engineering (specifically Computer Science) from a premier institute in India (IITs, NITs, or IIITs)
Bonus educational qualification: Master's (M.Tech) in Computer Science from IISc or IITs

Role Description
We are currently looking for bright deep learning talent with 3-4 years of industry experience (1.5-2 years in practical deep learning industry projects) working on problems in the Video AI space. A Deep Learning Engineer gets exposed to building solutions related to human activity analysis in videos and:
1. Reads research papers to understand the latest developments in activity recognition and multi-camera object tracking problems.
2. Develops production-quality code to convert the research into usable features on https://deeplabel.app.
3. Gets to work on Video Language model architectures.
4. Along with strong research skills, needs good coding skills and knowledge of data structures and algorithms.

Qualifications
Strong coding skills in Python
Thorough hands-on knowledge of deep learning frameworks - PyTorch, ONNX
Excellent understanding and experience of deploying data pipelines for computer vision projects.
Good working knowledge of troubleshooting, fixing, and patching Linux system-level problems (we expect engineers to set up their own workstations and install and troubleshoot CUDA and OpenCV (cv2) problems).
Good understanding of deep learning concepts (model parameters, tuning of models, optimizers, learning rates, attention mechanisms, masking, etc.); see the sketch after this posting.
Thorough understanding of deep learning implementations of activity detection or object detection.
Ability to read research papers and implement new approaches for activity detection and object detection.
Knowledge of deployment tools like Triton Inference Server or Ray is a plus.

In addition to the above, we need a few key personality attributes:
Willingness to learn and try till you succeed.
Curiosity to learn and experiment.
Ability to work with full-stack engineers of the AI platform team to deploy your new innovations.
Good communication skills.
Ability to take ownership of the respective modules for the whole lifecycle till deployment.

Company Description
Drillo.AI specializes in delivering tailored AI solutions that empower small and medium-sized businesses (SMBs) to innovate and grow. Our mission is to make advanced technology accessible, helping SMBs streamline operations, enhance decision-making, and drive sustainable success. With a deep understanding of the unique challenges faced by smaller enterprises, we provide scalable, cost-effective AI strategies to unlock new opportunities. By working closely with our clients, we help elevate their businesses to the next level.
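The "masking" in the qualifications above refers to attention masking. A minimal PyTorch sketch of a causal (look-ahead) mask, with arbitrary toy dimensions, so position t cannot attend to positions after t:

```python
# Minimal sketch: causal attention masking in PyTorch. Positions attend only
# to themselves and earlier positions, as in autoregressive video/language models.
import torch
import torch.nn.functional as F

B, T, D = 2, 5, 16                     # batch, sequence length, model dim (toy)
q = torch.randn(B, T, D)
k = torch.randn(B, T, D)
v = torch.randn(B, T, D)

scores = q @ k.transpose(-2, -1) / D ** 0.5          # (B, T, T) attention logits
causal = torch.tril(torch.ones(T, T, dtype=torch.bool))
scores = scores.masked_fill(~causal, float("-inf"))  # block future positions
attn = F.softmax(scores, dim=-1)                     # masked rows renormalize
out = attn @ v                                       # (B, T, D)
print(out.shape)
```

The same masked_fill pattern also implements padding masks, where positions corresponding to padded frames or tokens are excluded from attention.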

Posted 1 month ago

Apply

6.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Job Description
Role Title: AVP, Enterprise Logging & Observability (L11)

Company Overview
Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry's most complete digitally enabled product suites. Our experience, expertise and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet and more. We have recently been ranked #2 among India's Best Companies to Work for by Great Place to Work. We were among the Top 50 India's Best Workplaces in Building a Culture of Innovation by All by GPTW and Top 25 among Best Workplaces in BFSI by GPTW. We have also been recognized by AmbitionBox Employee Choice Awards among the Top 20 Mid-Sized Companies, ranked #3 among Top Rated Companies for Women, and Top-Rated Financial Services Companies. Synchrony celebrates ~51% women diversity, 105+ people with disabilities, and ~50 veterans and veteran family members. We offer Flexibility and Choice for all employees and provide best-in-class employee benefits and programs that cater to work-life integration and overall well-being. We provide career advancement and upskilling opportunities, focusing on Advancing Diverse Talent to take up leadership roles.

Organizational Overview
Splunk is Synchrony's enterprise logging solution. Splunk searches and indexes log files and helps derive insights from the data. The primary goal is to ingest massive datasets from disparate sources and employ advanced analytics to automate operations and improve data analysis. It also offers predictive analytics and unified monitoring for applications, services and infrastructure. Many applications forward data to the Splunk logging solution. The Splunk team, including Engineering, Development, Operations, Onboarding and Monitoring, maintains Splunk and provides solutions to teams across Synchrony.

Role Summary/Purpose
The AVP, Enterprise Logging & Observability is a key leadership role responsible for driving the strategic vision, roadmap, and development of the organization's centralized logging and observability platform. The role supports multiple enterprise initiatives including applications, security monitoring, compliance reporting, operational insights, and platform health tracking. It leads platform development using Agile methodology, manages stakeholder priorities, ensures logging standards across applications and infrastructure, and supports security initiatives. The position bridges the gap between technology teams (applications, platforms, cloud, cybersecurity, infrastructure, DevOps), governance, audit and risk teams, and business partners, owning and evolving the logging ecosystem to support real-time insights, compliance monitoring, and operational excellence.

Key Responsibilities

Splunk Development & Platform Management
Lead and coordinate development activities, ingestion pipeline enhancements, onboarding frameworks, and alerting solutions.
Collaborate with engineering, operations, and Splunk admins to ensure scalability, performance, and reliability of the platform.
Establish governance controls for source naming, indexing strategies, retention, access controls, and audit readiness.

Splunk ITSI Implementation & Management
Develop and configure ITSI services, entities, and correlation searches.
Implement notable-event aggregation policies and automate response actions.
Fine-tune ITSI performance by optimizing data models, summary indexing, and saved searches.
Help identify patterns and anomalies in logs and metrics (a search-automation sketch follows this posting).
Develop ML models for anomaly detection, capacity planning, and predictive analytics.
Utilize Splunk MLTK to build and train models for IT operations monitoring.

Security & Compliance Enablement
Partner with InfoSec, Risk, and Compliance to align logging practices with regulations (e.g., PCI-DSS, GDPR, RBI).
Enable visibility for encryption events, access anomalies, secrets management, and audit trails.
Support security control mapping and automation through observability.

Stakeholder Engagement
Act as a strategic advisor and point of contact for business units and for application, infrastructure, and security stakeholders leveraging Splunk.
Conduct stakeholder workshops, backlog grooming, and sprint reviews to ensure alignment.
Maintain clear and timely communications across all levels of the organization.

Process & Governance
Drive logging and observability governance standards, including naming conventions, access controls, and data retention policies.
Lead initiatives for process improvement in log ingestion, normalization, and compliance readiness.
Ensure alignment with enterprise architecture and data classification models.
Lead improvements in logging onboarding lifecycle time, automation pipelines, and self-service ingestion tools.
Mentor junior team members and guide engineering teams on secure, standardized logging practices.

Required Skills/Knowledge
Bachelor's degree with a minimum of 6+ years of experience in Technology, or in lieu of a degree, 8+ years of experience in Technology.
Minimum of 3+ years of experience leading a development team or in an equivalent role in observability, logging, or security platforms.
Splunk Subject Matter Expert (SME).
Strong hands-on understanding of Splunk architecture, pipelines, dashboards, alerting, data ingestion, search optimization, and enterprise-scale operations.
Experience supporting security use cases, encryption visibility, secrets management, and compliance logging.
Splunk development and platform management, security and compliance enablement, stakeholder engagement, and process and governance.
Experience with Splunk Premium Apps, at minimum ITSI and Enterprise Security (ES).
Experience with data streaming platforms and tools such as Cribl and Splunk Edge Processor.
Proven ability to work in Agile environments using tools such as JIRA or JIRA Align.
Strong communication, leadership, and stakeholder management skills.
Familiarity with security, risk, and compliance standards relevant to BFSI.
Proven experience leading product development teams and managing cross-functional initiatives using Agile methods.
Strong knowledge and hands-on experience with Splunk Enterprise/Splunk Cloud.
Design and implement Splunk ITSI solutions for proactive monitoring and service health tracking.
Develop KPIs, Services, Glass Tables, Entities, Deep Dives, and Notable Events to improve service reliability for users across the firm.
Develop scripts (Python, JavaScript, etc.) as needed in support of data collection or integration.
Develop new applications leveraging Splunk's analytic and machine learning tools to maximize performance, availability, and security, improving business insight and operations.
Support senior engineers in analyzing system issues and performing root cause analysis (RCA).

Desired Skills/Knowledge
Deep knowledge of Splunk development, data ingestion, search optimization, alerting, dashboarding, and enterprise-scale operations.
Exposure to SIEM integration, security orchestration, or SOAR platforms.
Knowledge of cloud-native observability (e.g., AWS/GCP/Azure logging).
Experience in BFSI or regulated industries with high-volume data handling.
Familiarity with CI/CD pipelines, DevSecOps integration, and cloud-native logging.
Working knowledge of scripting or automation (e.g., Python, Terraform, Ansible) for observability tooling.
Splunk certifications (Power User, Admin, Architect, or equivalent) will be an advantage.
Awareness of data classification, retention, and masking/anonymization strategies.
Awareness of integration between Splunk and ITSM or incident management tools (e.g., ServiceNow, PagerDuty).
Experience with version control tools: Git, Bitbucket.

Eligibility Criteria
Bachelor's degree with a minimum of 6+ years of experience in Technology, or in lieu of a degree, 8+ years of experience in Technology.
Minimum of 3+ years of experience leading a development team or in an equivalent role in observability, logging, or security platforms.
Demonstrated success in managing large-scale logging platforms in regulated environments.
Excellent communication, leadership, and cross-functional collaboration skills.
Experience with scripting languages such as Python, Bash, or PowerShell for automation and integration purposes.
Prior experience in large-scale, security-driven logging or observability platform development.
Excellent problem-solving skills and the ability to work independently or as part of a team.
Strong communication and interpersonal skills to interact effectively with team members and stakeholders.
Knowledge of IT Service Management (ITSM) and monitoring tools.
Knowledge of other data analytics tools or platforms is a plus.

WORK TIMINGS: 01:00 PM to 10:00 PM IST
This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and will require the incumbent to be available between 06:00 AM Eastern Time and 11:30 AM Eastern Time (timings are anchored to US Eastern hours and will adjust twice a year locally). This window is for meetings with the India and US teams. The remaining hours are flexible for the employee to choose. Exceptions may apply periodically due to business needs. Please discuss this with the hiring manager for more details.

For Internal Applicants
Understand the criteria or mandatory skills required for the role before applying.
Inform your manager and HRM before applying for any role on Workday.
Ensure that your professional profile is updated (fields such as education, prior experience, other skills) and upload your updated resume (Word or PDF format).
There must not be any corrective action plan (First Formal/Final Formal, PIP) in place.
L9+ employees who have completed 18 months in the organization and 12 months in their current role and level are eligible; L09+ employees can apply.

Level / Grade: 11
Job Family Group: Information Technology
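As context for the scripting and anomaly-hunting work above, here is a minimal sketch that runs a one-shot search through the official splunk-sdk for Python. The host, credentials, and index name are hypothetical, and a recent SDK version with JSONResultsReader is assumed:

```python
# Minimal sketch (hypothetical host/index): running a one-shot Splunk search
# from Python with splunk-sdk and printing error counts by source.
import splunklib.client as client
import splunklib.results as results

service = client.connect(
    host="splunk.example.internal", port=8089,       # hypothetical
    username="svc_observability", password="s3cret", # use a vault in practice
)

# One-shot search: error volume by source over the last hour.
rr = service.jobs.oneshot(
    "search index=app_logs log_level=ERROR earliest=-1h "
    "| stats count by source | sort -count",
    output_mode="json",
)
for item in results.JSONResultsReader(rr):
    if isinstance(item, dict):  # skip diagnostic messages
        print(item["source"], item["count"])
```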

Posted 1 month ago

Apply

1.0 - 2.0 years

0 Lacs

Jaipur, Rajasthan, India

On-site

Location: Jaipur, RJ, IN
Areas of Work: Sales & Marketing
Job Id: 12778

Executive N - SERVICES, JAIPUR

Objective
Lead the team of Customer Associates & Sales Associates in the region / allocated territory to ensure their performance in terms of delivering value, adherence to processes and guidelines at the sites, and driving the usage of key/focus products at the sites, and interact with and manage dealers as and when required. Report data as required to SSE, UH, and the central function.

Main Responsibilities
Monitoring daily updates of all activities on the Paint Assist app.
Daily monitoring of on-time visits and follow-ups to all new sites by Customer Associates (CA) or Sales Associates (SA).
Daily monitoring of business collections by CAs and ensuring delivery of month-on-month business objectives.
Prompt updating of records of new joinees/exits at the CA/SA level to the SSE.
In-store and on-site training of new CAs on processes, the business app, product pitching, and site monitoring.
Random site visits to ensure adherence to systems and processes such as usage of mechanized tools, covering and masking, correct application process, and on-time site handover after proper cleaning.
Approving business entries into the application after appropriate checks and validation.

Other Responsibilities
Undertake regular trainings for faster adoption of updated features of the business app.
Coordinate with CAs and contractors and ensure their attendance in all contractor training programs.

Scope of Work
a) People Management Scope (Range of no. of Direct/Indirect Reports): Performance of Customer Associates/Sales Associates; coordination with trainer, TA, TSE & SSE for contractor training needs; coordination with DA, CC & SD for focused product requirements and leads
c) Geography Coverage (Country-wide / State-wide / Area-wide)
d) Corporate Coverage (Company-wide / Business Unit or Function-wide / Sub-function-wide / Other): NA

Key Interactions
Internal: Customer Associates, Sales Associates, Colour Consultants, Designer Associate, Senior Sales Executive, Unit Head, Technical Associate, Technical Sales Executive.
External: Customers, store owners, contractors, other influencers.

Role Requirements

Qualifications
Essential: Graduate degree in any stream (BA/B.Sc./B.Com/BBA/BBM/BMS); minimum of 50% marks throughout education without any backlogs; graduation must be through a full-time course. Applicants with an Engineering background (B.Tech/B.E./Diploma/B.Pharma) will not be considered.
Desired: Candidates with an MBA/PGDM in Sales and Marketing; 1-2 years of experience in a sales function in any organization.

Functional Competencies
Fluency in English, Hindi, and the local language
Excellent communication and people skills
Working knowledge of MS Excel, MS Word, MS PowerPoint

Behavioral Competencies
Willingness to work in a retail environment and engage with clients across age and income groups for 8.5 hours a day, 6 days a week.
Extensive travelling across the region.
Diligence in ensuring timely attendance at and completion of all programs and modules designed for the training and development of Customer Associates.

Additional Requirements
Should have a two-wheeler with a valid driving licence.
Should have an Android phone with the latest operating system.
Age between 26 and 30 years.

Posted 1 month ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Title: Senior Python Developer - Backend Engineering
Company: Darwix AI
Location: Gurgaon (On-site)
Type: Full-Time
Experience Required: 4-8 Years

About Darwix AI
Darwix AI is building India's most advanced GenAI-powered platform for enterprise sales teams. We combine speech recognition, LLMs, vector databases, real-time analytics, and multilingual intelligence to power customer conversations across India, the Middle East, and Southeast Asia. We're solving complex backend problems across speech-to-text pipelines, agent assist systems, AI-based real-time decisioning, and scalable SaaS delivery. Our engineering team sits at the core of our product and works closely with AI research, product, and client delivery to build the future of revenue enablement. Backed by top-tier VCs, AI advisors, and enterprise clients, this is a chance to build something foundational.

Role Overview
We are hiring a Senior Python Developer to architect, implement, and optimize high-performance backend systems that power our AI platform. You will take ownership of key backend services, from core REST APIs and data pipelines to complex integrations with AI/ML modules. This role is for builders. You'll work closely with product, AI, and infra teams, write production-grade Python code, lead critical decisions on architecture, and help shape engineering best practices.

Key Responsibilities
1. Backend API Development
Design and implement scalable, secure RESTful APIs using FastAPI, Flask, or Django REST Framework
Architect modular services and microservices to support AI, transcription, real-time analytics, and reporting
Optimize API performance with proper indexing, pagination, caching, and load management strategies
Integrate with frontend systems, mobile clients, and third-party systems through clean, well-documented endpoints
2. AI Integrations & Inference Orchestration
Work closely with AI engineers to integrate GenAI/LLM APIs (OpenAI, Llama, Gemini), transcription models (Whisper, Deepgram), and retrieval-augmented generation (RAG) workflows
Build services to manage prompt templates, chaining logic, and LangChain flows
Deploy and manage vector database integrations (e.g., FAISS, Pinecone, Weaviate) for real-time search and recommendation pipelines
3. Database Design & Optimization
Model and maintain relational databases using MySQL or PostgreSQL; experience with MongoDB is a plus
Optimize SQL queries, schema design, and indexes to support low-latency data access
Set up background jobs for session archiving, transcript cleanup, and audio-data binding
4. System Architecture & Deployment
Own backend deployments using GitHub Actions, Docker, and AWS EC2
Ensure high availability of services through containerization, horizontal scaling, and health monitoring
Manage staging and production environments, including DB backups, server health checks, and rollback systems
5. Security, Auth & Access Control
Implement robust authentication (JWT, OAuth), rate limiting, and input validation
Build role-based access controls (RBAC) and audit logging into backend workflows (see the sketch after this posting)
Maintain compliance-ready architecture for enterprise clients (data encryption, PII masking)
6. Code Quality, Documentation & Collaboration
Write clean, modular, extensible Python code with meaningful comments and documentation
Build test coverage (unit, integration) using PyTest, unittest, or Postman/Newman
Participate in pull requests, code reviews, sprint planning, and retrospectives with the engineering team

Required Skills & Qualifications

Technical Expertise
3-8 years of experience in backend development with Python and PHP
Strong experience with FastAPI, Flask, or Django (at least one in production-scale systems)
Deep understanding of RESTful APIs, microservice architecture, and asynchronous Python patterns
Strong hands-on experience with MySQL (joins, views, stored procedures); bonus if familiar with MongoDB, Redis, or Elasticsearch
Experience with containerized deployment using Docker and cloud platforms like AWS or GCP
Familiarity with Git, GitHub, CI/CD pipelines, and Linux-based server environments
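As an illustration of the JWT plus RBAC pattern mentioned above (not Darwix AI's actual implementation), here is a minimal FastAPI sketch; the secret, role names, and endpoint are hypothetical:

```python
# Minimal sketch (hypothetical secret/roles/endpoint): JWT-verified,
# role-gated FastAPI route using PyJWT and dependency injection.
import jwt  # PyJWT
from fastapi import Depends, FastAPI, HTTPException
from fastapi.security import OAuth2PasswordBearer

SECRET_KEY = "change-me"  # load from env/secrets manager in practice
app = FastAPI()
oauth2 = OAuth2PasswordBearer(tokenUrl="token")

def current_user(token: str = Depends(oauth2)) -> dict:
    """Decode and validate the bearer token; reject anything malformed."""
    try:
        return jwt.decode(token, SECRET_KEY, algorithms=["HS256"])
    except jwt.PyJWTError:
        raise HTTPException(status_code=401, detail="Invalid token")

def require_role(role: str):
    """Dependency factory: only pass users whose claims include `role`."""
    def checker(user: dict = Depends(current_user)) -> dict:
        if role not in user.get("roles", []):
            raise HTTPException(status_code=403, detail="Forbidden")
        return user
    return checker

@app.get("/admin/metrics")
def admin_metrics(user: dict = Depends(require_role("admin"))):
    # Role-gated endpoint; an audit log entry would normally be written here.
    return {"ok": True, "user": user.get("sub")}
```

The dependency-factory pattern keeps role checks declarative per route, which makes the audit-logging and compliance requirements in the posting easier to enforce uniformly.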

Posted 1 month ago

Apply

10.0 years

0 Lacs

Hyderābād

On-site

Do you want to help one of the most respected companies in the world reinvent its approach to data? At Thomson Reuters, we are recruiting a team of motivated data professionals to transform how we manage and leverage our commercial data assets. It is a unique opportunity to join a diverse and global team with centers of excellence in Toronto, London, and Bangalore. Are you excited about working at the forefront of the data-driven revolution that will change the way a company works? The Thomson Reuters Data and Analytics team is seeking an experienced Lead Engineer, Test Data Management with a passion for engineering quality-assurance solutions for cloud-based data warehouse systems.

About the Role
As Lead Engineer, Test Data Management, you play a crucial role in ensuring the quality and reliability of our enterprise data systems. Your expertise in testing methods, data validation, and automation is essential to bring best-in-class standards to our data products. In this opportunity you will:
Design test data management frameworks, apply data masking and data sub-setting, and generate synthetic data to create robust test data solutions for enterprise-wide teams.
Collaborate with Engineers, Database Architects, and Data Quality Stewards to build logical data models, execute data validation, and design manual and automated testing (see the sketch after this list).
Mentor and lead the testing of key data development projects related to the Data Warehouse and other systems.
Lead engineering team members in the implementation of test data best practices and the delivery of test data solutions.
Be a thought leader investigating leading-edge quality technology for test data management and systems functionality, including performance testing for data pipelines.
Innovate: create ETL mappings, workflows, and functions to move data from multiple sources into target areas.
Partner across the company with analytics teams, engineering managers, architecture teams, and others to design and agree on solutions that meet business requirements.
Effectively communicate and liaise with other engineering groups across the organization, data consumers, and business analytics groups.
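To illustrate the automated data validation referenced above, here is a minimal pytest sketch. SQLite stands in for the warehouse driver, and the table and checks are hypothetical:

```python
# Minimal sketch (hypothetical table/checks): pytest-style data validation
# of the kind a test data management lead would automate in CI.
import sqlite3  # stand-in for a warehouse driver (e.g., pyodbc/psycopg2)

import pytest

@pytest.fixture()
def conn():
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT)")
    db.executemany("INSERT INTO customers VALUES (?, ?)",
                   [(1, "a@example.com"), (2, "b@example.com")])
    yield db
    db.close()

def test_no_null_emails(conn):
    nulls = conn.execute(
        "SELECT COUNT(*) FROM customers WHERE email IS NULL").fetchone()[0]
    assert nulls == 0

def test_primary_key_unique(conn):
    total, distinct = conn.execute(
        "SELECT COUNT(id), COUNT(DISTINCT id) FROM customers").fetchone()
    assert total == distinct
```

The same assertions translate directly to tools like Great Expectations or K2View validations once the checks outgrow hand-written tests.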
Utilize your experience in the following areas:
• SQL for data querying, validation, and analysis
• Knowledge of database management systems (e.g., SQL Server, PostgreSQL, MySQL)
• Test data management tools (e.g., K2View, qTest, ALM, Zephyr)
• Proficiency in Python for test automation and data manipulation
• PySpark for big data testing
• Test case design, execution, and defect management
• AWS cloud data practices and DevOps tooling
• Performance testing for data management solutions, especially for complex data flows
• Data security, privacy, and data governance compliance principles

About You
You're a fit for the role of Lead Engineer if your background includes:
• 10+ years of experience as a tester, developer, or data analyst, with experience establishing end-to-end test strategies and planning for data validation, transformation, and analytics
• Advanced SQL knowledge
• Designing and executing test procedures and documenting best practices
• Experience planning and executing regression testing, data validation, and quality assurance
• Advanced command of data warehouse creation, management, and performance strategies
• Experience engineering and implementing data quality systems in the cloud
• Proficiency in a scripting language such as Python
• Hands-on experience with data test automation applications (preference for K2View)
• Identification and remediation of data quality issues
• Data management tools such as K2View, Immuta, Alation, and Informatica
• Agile development
• Business intelligence and data warehousing concepts
• Familiarity with SAP and Salesforce systems
• Intermediate understanding of big data technologies
• AWS services and management, including serverless, container, queueing, and monitoring services
• Experience creating manual or automated tests on data pipelines
• Programming languages: Python
• Data interchange formats: Parquet, JSON, CSV
• Version control with GitHub
• Cloud security and compliance, privacy, GDPR

#LI-VGA1

What's in it For You?
• Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
• Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
• Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
• Industry Competitive Benefits: We offer comprehensive benefit plans including flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
• Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
• Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
• Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

About Us
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news.

We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.

As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.

Posted 1 month ago

Apply

4.0 - 6.0 years

0 Lacs

Bhubaneshwar

On-site

Position: Data Migration Engineer (NV46FCT RM 3324)

Required Qualifications:
• 4–6 years of experience in data migration, data integration, and ETL development
• Hands-on experience with both relational (PostgreSQL, MySQL, Oracle, SQL Server) and NoSQL (MongoDB, Cassandra, DynamoDB) databases
• Experience in Google BigQuery for data ingestion, transformation, and performance optimization (see the load-job sketch below)
• Proficiency in SQL and scripting languages such as Python or Shell for custom ETL logic
• Familiarity with ETL tools like Talend, Apache NiFi, Informatica, or AWS Glue
• Experience working in cloud environments such as AWS, GCP, or Azure
• Solid understanding of data modeling, schema design, and transformation best practices

Preferred Qualifications:
• Experience in BigQuery optimization, federated queries, and integration with external data sources
• Exposure to data warehouses and lakes such as Redshift, Snowflake, or BigQuery
• Experience with streaming data ingestion tools like Kafka, Debezium, or Google Dataflow
• Familiarity with workflow orchestration tools such as Apache Airflow or dbt
• Knowledge of data security, masking, encryption, and compliance requirements in migration scenarios

Soft Skills:
• Strong problem-solving and analytical mindset with high attention to data quality
• Excellent communication and collaboration skills to work with engineering and client teams
• Ability to handle complex migrations under tight deadlines with minimal supervision

Job Category: Digital_Cloud_Web Technologies
Job Type: Full Time
Job Location: Bhubaneshwar / Noida
Experience: 4-6 years
Notice period: 0-30 days
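As a hedged illustration of the BigQuery ingestion work listed above, the sketch below loads a CSV file into a table with the official google-cloud-bigquery client. The project, dataset, table, and file names are placeholders.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client(project="demo-project")  # placeholder project id

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # skip the header row
    autodetect=True,       # infer the schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

with open("customers.csv", "rb") as source_file:
    load_job = client.load_table_from_file(
        source_file, "demo_dataset.customers", job_config=job_config
    )
load_job.result()  # block until the load job finishes
print(client.get_table("demo_dataset.customers").num_rows, "rows loaded")
```

In a real migration, `autodetect` would usually give way to an explicit schema, and the load would run inside an orchestrated workflow (e.g., Airflow) with validation steps before and after.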

Posted 1 month ago

Apply

4.0 - 5.0 years

0 - 0 Lacs

India

On-site

● Design and create high-quality pastries, desserts, and confections.
● Mastery of piping, masking, garnishing, and chocolate-tempering techniques.
● Maintain high standards of hygiene and food safety.
● Innovate new pastry recipes and dessert presentations.
● Manage inventory and order supplies as needed.
● Collaborate with the kitchen team to ensure timely delivery of products.
● Train and mentor junior pastry staff.
● Develop new recipes.
● Hard-working, able to work shifts of at least 10 hours, and must have 4–5 years of bakery experience.

Job Type: Full-time
Pay: ₹15,000.00 - ₹20,000.00 per month
Schedule: Day shift
Work Location: In person
Expected Start Date: 01/07/2025

Posted 1 month ago

Apply

10.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Data Governance Lead – Telecom Domain

About the Role:
We are seeking an experienced telecom data governance lead to join our team. In this role, you will be responsible for defining and implementing the data governance strategy. The role involves establishing metadata standards, defining attribute ownership models, ensuring regulatory compliance, and improving data quality and trust across the enterprise.

Responsibilities:
• Define and implement an enterprise-wide data governance framework
• Own the metadata catalog and ensure consistency across business and technical assets
• Develop and manage KPI registries, data dictionaries, and lineage documentation
• Collaborate with data stewards and domain owners to establish attribute ownership
• Lead efforts around data standardization, quality rules, and classification of sensitive data
• Ensure privacy and compliance (e.g., GDPR, PII, PHI) by enforcing tagging, masking, and access rules (a tag-driven access sketch follows below)
• Define access control rules (purpose-based views, user roles, sensitivity levels)
• Oversee governance for data products and federated data domains
• Support internal audits and external regulatory reviews
• Coordinate with platform, analytics, security, and compliance teams

Qualifications:
• Bachelor's or Master's degree in Computer Science, Telecommunications Engineering, Data Science, or a related technical field
• 10+ years of experience in data governance roles, with at least 3-4 years in the telecommunications industry
• Experience integrating governance with modern data stacks (e.g., Databricks, Snowflake)
• Strong experience with data governance tools (e.g., Alation, Unity Catalog, Azure Purview)
• Proven understanding of metadata management, data lineage, and data quality frameworks
• Experience implementing federated governance models and data stewardship programs
• Knowledge of compliance requirements (GDPR, HIPAA, PII, etc.)
• Familiarity with data mesh principles and data contract approaches
• Excellent communication and stakeholder management skills
• Background in telecom, networking, or other data-rich industries
• Certification in data governance or management frameworks
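To sketch the tagging-and-access-rules idea from the responsibilities above: the snippet below filters columns by sensitivity tag per role. The catalog, tags, and role names are invented for illustration; in practice a governance tool such as Unity Catalog or Azure Purview holds this metadata and the query layer enforces it.

```python
# Hypothetical column catalog: column name -> sensitivity tag.
CATALOG = {
    "msisdn": "pii",
    "imsi": "pii",
    "plan_name": "public",
    "monthly_usage_gb": "internal",
}

# Hypothetical role clearances: which tags each role may read.
ROLE_CLEARANCE = {
    "marketing_analyst": {"public", "internal"},
    "fraud_investigator": {"public", "internal", "pii"},
}

def allowed_columns(role: str) -> list[str]:
    """Return the columns a role may query, based on sensitivity tags."""
    clearance = ROLE_CLEARANCE[role]
    return [col for col, tag in CATALOG.items() if tag in clearance]

print(allowed_columns("marketing_analyst"))  # ['plan_name', 'monthly_usage_gb']
```

A purpose-based view is then just this filter materialized: the analyst's view exposes only the allowed columns, while PII columns stay masked or hidden.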

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

How will you make an impact in this role?
There are hundreds of opportunities to make your mark on technology and life at American Express. Here's just some of what you'll be doing:
· Contribute to designing, developing, troubleshooting, debugging, evaluating, modifying, deploying, and documenting software and systems that meet the needs of Oracle Cloud applications
· Design, develop, troubleshoot, support, and debug software in Oracle Cloud
· Build file-based and API-based integrations between systems using secure transmission
· Design microservices and integration patterns to communicate securely with backend services and clients
· Function as a member of an Agile team by contributing to software builds through consistent development practices; participate in code reviews
· Quickly debug basic software components and identify code defects for remediation
· Enable the deployment, support, and monitoring of software across test, integration, and production environments
· Ensure timely completion and a quality product, including documentation and other deliverables produced by the engineering team
· Identify opportunities to adopt innovative new technologies to solve existing business needs and predict future challenges
· Collaborate with Product Owners on business process enhancements
· Provide constructive input and perspective to team conversations and effectively facilitate/negotiate through challenging situations

Minimum Qualifications
· Bachelor's Degree in CS or CSE or equivalent
· 6-10 years of technical expertise implementing Oracle Cloud in a global organizational structure; knowledge of Oracle E-Business is preferred
· Hands-on experience in design and development in Oracle Cloud pertaining to Oracle Financials, Procure to Pay (Payables, Fixed Assets, Projects, and Payments); hands-on experience developing BI reports, interfaces, and conversions
· Hands-on experience building integrations/interfaces based on web services (SOAP and REST using JSON, XML), file-based interfaces (batch processing), and databases (SQL and PL/SQL); see the hedged REST sketch after this posting
· Strong technical experience with Fusion Finance and SCM BIP, OTBI, FRS, and Smart View reporting mechanisms; BICC knowledge is an add-on
· Conversions related to invoices, purchase orders, assets, and projects using FBDI, ADFDI, and UCM
· Hands-on experience with security concepts: API security, encryption, vaults, and masking
· Awareness of the customisation process in ERP Cloud: sandboxes, page integrations, Application and Page Composer; VBCS/APEX good to have
· Experience with web services, open API development, and related concepts

Preferred Qualifications
· Technical knowledge of Oracle development tools (PL/SQL, OAF, Reports, Oracle Workflow) and profound knowledge of the Oracle database
· Functional knowledge in the finance / procure-to-pay domain
· Knowledge of collaboration tools (GitHub, Confluence, Rally)
· Experience in continuous integration and deployment (Jenkins)
· Oracle Financials including Procure to Pay, Fixed Assets, Projects, or General Ledger
· Agile/SAFe practices in building software

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
· Competitive base salaries
· Bonus incentives
· Support for financial well-being and retirement
· Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
· Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need
· Generous paid parental leave policies (depending on your location)
· Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
· Free and confidential counseling support through our Healthy Minds program
· Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
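As a hedged sketch of the REST-integration work named in the qualifications above: the snippet below queries a hypothetical Oracle Fusion Cloud REST resource with Python's requests library. The host, resource path, response field names, and credentials are assumptions for illustration only, not a documented Amex or Oracle endpoint.

```python
import requests  # pip install requests

# Hypothetical Fusion Cloud host and resource; adjust to the real tenant.
BASE = "https://fusion.example.com/fscmRestApi/resources/11.13.18.05"

resp = requests.get(
    f"{BASE}/invoices",
    params={"limit": 25},
    auth=("integration_user", "app_password"),  # prefer OAuth plus a vault in practice
    timeout=30,
)
resp.raise_for_status()

# Assumed response shape: {"items": [{"InvoiceNumber": ..., "InvoiceAmount": ...}]}
for invoice in resp.json().get("items", []):
    print(invoice.get("InvoiceNumber"), invoice.get("InvoiceAmount"))
```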

Posted 1 month ago

Apply

5.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Birlasoft, a global leader at the forefront of Cloud, AI, and Digital technologies, seamlessly blends domain expertise with enterprise solutions. The company's consultative and design-thinking approach empowers societies worldwide, enhancing the efficiency and productivity of businesses. As part of the multibillion-dollar diversified CKA Birla Group, Birlasoft, with its 12,000+ professionals, is committed to continuing the Group's 170-year heritage of building sustainable communities.

Education: Bachelor's degree in Computer Science, Information Technology, or a related field
Location: Pune / Noida / Hyderabad
Experience: 5-7 years of experience
Role: Deputy Manager

Job Summary:
We're seeking a technically skilled Deputy Manager - Data Privacy to support our data privacy initiatives. The successful candidate will have a strong technical background and experience in data privacy, with a focus on implementing technical controls to ensure data protection. This role requires a detail-oriented individual who can work closely with cross-functional teams to ensure data privacy compliance.

Role & Responsibilities:
• Responsible for effective management of a privacy framework within the Birlasoft group, acting as the point of contact for any privacy-related matters
• Establish and maintain the Group's policies and data protection framework
• Drive greater consistency of process, practices, and execution across company-wide privacy workstreams
• As SME, champion the overall implementation plan, including a deep understanding of regulatory requirements and the associated technical and operational work required across the company to comply successfully
• Work on day-to-day activities to improve and maintain a robust privacy framework, including ROPA, PIA, TIA, third-party risk management, DSARs, etc.
• Ensure privacy by design in all aspects of the business environment
• Assist with investigations of privacy breaches and other remedial matters as necessary
• Support the Head of Privacy and DPO with privacy-related strategic matters
• Provide professional advice and support regarding various data protection compliance obligations and commitments, both external and internal
• Responsible for training and awareness initiatives across the Group
• Align necessary cross-functional business strategies to ensure success in operational execution
• Manage and prioritize work based on urgency and complexity while building operational cadences across technical and operational teams to coordinate work
• Promote a culture of compliance and integrity across the Group
• Implement technical controls to ensure data protection, such as encryption, access controls, and data masking (a minimal encryption sketch follows below)

Preferred Candidate Profile:
• Working knowledge of international data protection legislation and security requirements
• Certifications in data privacy, such as CIPP or CIPM, ISO 27001 and 27701
• Experience with data discovery and privacy management tools like OneTrust, Securiti.ai, etc.
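As a minimal, hedged sketch of the encryption control named in the responsibilities above, using the widely available cryptography package's Fernet (symmetric, authenticated encryption). The sample value is made up, and in production the key would live in a vault or KMS, never in code.

```python
from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()  # in production, fetch from a vault/KMS instead
cipher = Fernet(key)

token = cipher.encrypt(b"PAN: ABCDE1234F")  # ciphertext is safe to store
print(cipher.decrypt(token))                # b'PAN: ABCDE1234F'
```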

Posted 1 month ago

Apply

0.0 - 1.0 years

0 Lacs

Sanand, Ahmedabad, Gujarat

On-site

Job Title: Paint Shop Supervisor
Positions Open: 3 (Men)
Location: Sanand GIDC
Reports to: Level 1 - Site Incharge; Level 2 - General Manager Operations

Job Summary
We are seeking a proactive and detail-oriented Paint Shop Supervisor to oversee and manage our paint department operations. The ideal candidate will be responsible for leading a team of painters and technicians, ensuring that all projects meet our high standards of quality, are completed efficiently, and adhere to safety protocols. This role is crucial for maintaining workflow, managing resources, and ensuring the final product finish is flawless.

Key Responsibilities
● Team Leadership & Supervision: Lead, train, and motivate a team of paint shop staff. Schedule shifts, assign tasks, and monitor performance to ensure daily production targets are met.
● Quality Control: Conduct regular inspections of work in progress and finished products to ensure they meet stringent quality standards and client specifications. Implement corrective actions to resolve any quality issues.
● Process Management: Oversee the entire painting process, from surface preparation (sanding, masking, cleaning) to paint mixing, application, and finishing.
● Inventory & Equipment Management: Monitor and manage inventory levels of paints, solvents, and other supplies. Ensure all equipment, including spray guns, booths, and safety apparatus, is properly maintained and in good working order.
● Safety & Compliance: Enforce all workplace safety regulations and procedures, including the proper use of Personal Protective Equipment (PPE) and handling of hazardous materials. Maintain a clean and organized work environment.
● Workflow Coordination: Liaise with other departments (e.g., production, assembly) to ensure a smooth and efficient workflow, minimizing downtime and delays.
● Reporting: Prepare and maintain production reports, quality logs, and employee performance records.

Qualifications and Skills
● In-depth knowledge of various painting techniques, materials, and equipment (e.g., spray painting, powder coating).
● Strong understanding of surface preparation and finishing processes.
● Excellent leadership, communication, and interpersonal skills.
● Proven ability to manage a team and coordinate workflow effectively.
● Strong problem-solving skills and a keen eye for detail.
● Proficiency in quality control principles and safety standards.
● Basic computer skills for reporting and inventory management.

Experience
● A minimum of 1 year of hands-on experience in a professional paint shop or a similar industrial finishing environment is required.
● Prior experience in a supervisory or team lead role is highly preferred.

What We Offer
● Salary: ₹23,000 - ₹27,000 per month (CTC).
● Accommodation: Company-provided accommodation is available. This includes a one-time setup of essential items such as a fan, light, gas cylinder, stove, and basic utensils.
○ Please Note: The cost of food, ongoing maintenance, and refilling of supplies (e.g., gas cylinder) will be the responsibility of the employee.
● A competitive benefits package.
● A dynamic and supportive work environment.
● Opportunities for professional growth and development.

How to Apply
Interested candidates who meet the above requirements are invited to submit their resume and a brief cover letter to hr@think-inds.com and admin@think-inds.com with the subject line "Application for Paint Shop Supervisor".
Job Types: Full-time, Permanent
Pay: ₹20,000.00 - ₹33,000.00 per month
Benefits: Health insurance, Provident Fund
Schedule: Day shift, Morning shift, Night shift, Rotational shift
Supplemental Pay: Overtime pay, Yearly bonus
Ability to commute/relocate: Sanand, Ahmedabad, Gujarat: reliably commute or be willing to relocate with an employer-provided relocation package (Preferred)
Experience: Painting: 1 year (Preferred)
Work Location: In person

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies