5.0 - 10.0 years
14 - 24 Lacs
Hyderabad, Chennai
Hybrid
Job Description:
- Background in data analysis
- Experience in training Artificial Intelligence (AI) models, with a special focus on prompt engineering
- Two years of related work experience in Data Loss Prevention (DLP) and/or cybersecurity; for DLP, experience providing recommendations for tuning content detection rules to improve accuracy is a plus
- Able to learn and apply new concepts quickly
- Proven analytical and problem-solving abilities
- Strong communication skills

Responsibilities:
- Train Machine Learning (AI) models in answering Data Loss Prevention topics
- Be an advocate for users of AI: understand and write prompts from the user's perspective
- Generate metrics for AI responses (see the sketch after this listing)
- Evaluate AI responses and generate comprehensive feedback about the prompt and the responses
- Document and articulate the AI feedback into guide books
- Communicate the AI feedback across stakeholders and drive continuous improvements

We want candidates who have domain knowledge of the DLP module of compliance tools such as MS Purview, Proofpoint, Forcepoint, Symantec DLP, and others.
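The "generate metrics for AI responses" responsibility above lends itself to a small evaluation harness. The following is a minimal sketch only: the Evaluation fields, the keyword-overlap score, and the review threshold are illustrative assumptions, not the tooling this role actually uses.

```python
# Minimal sketch of generating metrics for AI responses to DLP prompts.
# The record fields, keyword-overlap score, and 0.5 review threshold are
# illustrative assumptions, not a prescribed evaluation method.

from dataclasses import dataclass
from statistics import mean

@dataclass
class Evaluation:
    prompt: str
    response: str
    expected_keywords: list[str]   # terms a correct DLP answer should mention

def score_response(ev: Evaluation) -> float:
    """Fraction of expected keywords actually present in the response."""
    text = ev.response.lower()
    hits = sum(1 for kw in ev.expected_keywords if kw.lower() in text)
    return hits / len(ev.expected_keywords) if ev.expected_keywords else 0.0

def summarize(evals: list[Evaluation]) -> dict:
    """Aggregate per-response scores into simple metrics for stakeholder reports."""
    scores = [score_response(e) for e in evals]
    return {
        "responses_evaluated": len(scores),
        "mean_keyword_coverage": round(mean(scores), 2) if scores else 0.0,
        "needs_review": sum(1 for s in scores if s < 0.5),
    }

if __name__ == "__main__":
    sample = [
        Evaluation(
            prompt="How should a DLP rule treat credit card numbers in outbound email?",
            response="Block or quarantine messages containing PAN patterns and notify the sender.",
            expected_keywords=["block", "quarantine", "notify"],
        )
    ]
    print(summarize(sample))
```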
Posted 4 days ago
2.0 - 4.0 years
0 - 2 Lacs
Hyderabad, Bangalore Rural, Bengaluru
Work from Office
Hi, we are hiring a Data Discovery and Classification Engineer for our Bangalore location. Please find the details of the opportunity below.

Job Description Summary: We are seeking a skilled Data Discovery & Classification Engineer to join our data management team. The ideal candidate will be responsible for identifying, classifying, and protecting sensitive data across the organization, ensuring compliance with data protection regulations and internal policies.

Key Responsibilities:
1. Onboarding Data Sources: Work with IT and data teams to integrate new data sources (e.g., Teams, OneDrive, file storage, APIs) into the existing data management platform. Monitor data source health and connectivity and resolve any ingestion issues. Automate inventory updates where possible using scripts or tools. Create and maintain detailed documentation for each data source, including metadata, data lineage, and the onboarding process.
2. Maintaining Inventory of Data Sources: Maintain an up-to-date inventory of all data sources, ensuring each source is accurately cataloged and classified. Regularly monitor data sources for changes or updates, ensuring the inventory reflects the current state of data assets. Generate reports on the status and progress of data sources, highlighting any issues or areas for improvement.
3. Data Classification: Create and maintain classification categories. Apply appropriate tags and labels to data to facilitate easy retrieval and management. Adjust classification rules based on changing business needs and policies (see the classification sketch after this listing).
4. Metrics and Reporting: Develop dashboards and reports to track onboarding status, classification accuracy, and data inventory health. Track key performance indicators (KPIs) such as the number of data sources onboarded and the percentage of classified vs. unclassified data.
5. Automation and Process Improvement: Identify opportunities to automate data onboarding and classification processes. Create and maintain scripts to automate data extraction and classification. Propose and implement improvements to data ingestion pipelines and workflows.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Discovery & Classification Engineer or in a similar role.
- Strong knowledge of data discovery and classification tools and technologies (e.g., Congruity 360, Microsoft Purview, BigID, etc.).
- Excellent analytical and problem-solving skills.
- Ability to work independently and as part of a team.
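A rule-based classifier of the kind described in responsibilities 3 and 5 can be prototyped with simple pattern matching. This is a minimal sketch assuming local text files; the categories and regex patterns are illustrative placeholders that would normally come from the organization's classification policy or a dedicated tool such as Microsoft Purview or BigID.

```python
# Minimal sketch of rule-based data classification for text files.
# The categories and regex patterns below are illustrative examples only;
# production rules would come from the organization's classification policy
# or a dedicated tool such as Microsoft Purview or BigID.

import re
from pathlib import Path

CLASSIFICATION_RULES = {
    "PII-Email": re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}"),
    "PCI-CardNumber": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "PII-IndiaPAN": re.compile(r"\b[A-Z]{5}\d{4}[A-Z]\b"),
}

def classify_file(path: Path) -> list[str]:
    """Return the classification labels whose patterns match the file's contents."""
    text = path.read_text(errors="ignore")
    return [label for label, pattern in CLASSIFICATION_RULES.items() if pattern.search(text)]

def build_inventory(root: Path) -> dict[str, list[str]]:
    """Walk a directory and record labels per file, forming a simple data inventory."""
    return {str(p): classify_file(p) for p in root.rglob("*.txt")}

if __name__ == "__main__":
    inventory = build_inventory(Path("./sample_data"))
    for file_name, labels in inventory.items():
        print(file_name, "->", labels or ["Unclassified"])
```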
Posted 1 week ago
0.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Principal Consultant - Senior Data Engineer - Databricks, Azure & Mosaic AI

Role Summary: We are seeking a Senior Data Engineer with extensive expertise in Data & Analytics platform modernization using Databricks, Azure, and Mosaic AI. This role will focus on designing and optimizing cloud-based data architectures, leveraging AI-driven automation to enhance data pipelines, governance, and processing at scale.

Key Responsibilities:
- Architect and modernize Data & Analytics platforms using Databricks on Azure.
- Design and optimize Lakehouse architectures integrating Azure Data Lake, Databricks Delta Lake, and Synapse Analytics.
- Implement Mosaic AI for AI-driven automation, predictive analytics, and intelligent data engineering solutions.
- Lead the migration of legacy data platforms to a modern cloud-native Data & AI ecosystem.
- Develop high-performance ETL pipelines, integrating Databricks with Azure services such as Data Factory, Synapse, and Purview (see the pipeline sketch after this listing).
- Utilize MLflow and Mosaic AI for AI-enhanced data processing and decision-making.
- Establish data governance, security, lineage tracking, and metadata management across modern data platforms.
- Work collaboratively with business leaders, data scientists, and engineers to drive innovation.
- Stay at the forefront of emerging trends in AI-powered data engineering and modernization strategies.

Qualifications we seek in you!
Minimum Qualifications:
- Experience in data engineering, cloud platforms, and AI-driven automation.
- Expertise in Databricks (Apache Spark, Delta Lake, MLflow) and Azure (Data Lake, Synapse, ADF, Purview).
- Strong experience with Mosaic AI for AI-powered data engineering and automation.
- Advanced proficiency in SQL, Python, and Scala for big data processing.
- Experience in modernizing Data & Analytics platforms, migrating from on-premises to cloud.
- Knowledge of data lineage, observability, and AI-driven data governance frameworks.
- Familiarity with vector databases and Retrieval-Augmented Generation (RAG) architectures for AI-powered data analytics.
- Strong leadership, problem-solving, and stakeholder management skills.

Preferred Skills:
- Experience with knowledge graphs (Neo4j, TigerGraph) for data structuring.
- Exposure to Kubernetes, Terraform, and CI/CD for scalable cloud deployments.
- Background in streaming technologies (Kafka, Spark Streaming, Kinesis).

Why join Genpact?
- Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation.
- Make an impact: drive change for global enterprises and solve business challenges that matter.
- Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
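The lakehouse responsibilities above typically reduce to Spark jobs that land curated Delta tables in the data lake. This is a minimal sketch, assuming a Databricks-style SparkSession with Delta Lake support; the storage paths, column names, and cleansing steps are placeholders, not the actual pipeline for this role.

```python
# Minimal sketch of a batch ETL step writing a curated Delta table.
# Assumes it runs where a SparkSession with Delta Lake support is available
# (e.g., a Databricks cluster); paths and column names are placeholders.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_curation").getOrCreate()

# Read raw data landed in the data lake (placeholder path).
raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/orders/")

# Basic cleansing and typing before publishing to the curated zone.
curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
)

# Write as a partitioned Delta table for downstream analytics.
(
    curated.write.format("delta")
           .mode("overwrite")
           .partitionBy("order_date")
           .save("abfss://curated@examplelake.dfs.core.windows.net/orders/")
)
```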
Posted 3 weeks ago
5.0 - 10.0 years
18 - 22 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Hybrid
Shift timings: 2 PM - 11 PM

Primary skills: Azure security (Microsoft Defender, Sentinel, identity, endpoint, etc.)
Secondary skills: Azure infrastructure, Office 365 collaboration workloads

Required Skills & Experience:
Technical Expertise:
- Strong understanding of Azure security offerings, including but not limited to: Microsoft Defender for Cloud / Endpoint / Identity; Microsoft Sentinel (SIEM/SOAR); Microsoft Entra (Identity Governance, Conditional Access).
- Hands-on experience with cloud security assessments, PoC deployments, and client workshops.
- Familiarity with Zero Trust architecture and related best practices.

Professional Experience:
- 5+ years in IT security roles, with 2+ years focused on Azure or cloud security.
- Proven track record of leading technical engagements independently.

Soft Skills:
- Excellent communication and presentation skills.
- Ability to articulate technical concepts to both technical and business audiences.
- Self-starter who thrives in a fast-paced, client-facing environment.

Preferred Qualifications:
- Microsoft certifications (e.g., SC-100, AZ-500, SC-200).
- Experience working with Microsoft partners or within funded engagement programs.
- Exposure to regulatory compliance frameworks (e.g., ISO, NIST, GDPR).

Key Responsibilities:
Client Engagements:
- Conduct security assessments and discovery workshops to understand client environments, security gaps, and cloud readiness.
- Deliver technical Proof of Concepts (PoCs) and hands-on demonstrations of Microsoft Azure security solutions.
- Host and facilitate technical workshops on Zero Trust, Microsoft Defender, Sentinel, Entra, and related technologies (see the sketch after this listing).
- Provide technology walkthroughs, highlight use cases, and share practical experience to illustrate business value.

Solution Design & Implementation:
- Design and recommend secure architectures and configurations using Azure-native tools and services.
- Collaborate on solution development, documentation, and client readiness for security modernization.

Internal & Cross-Functional Collaboration:
- Work closely with Sales, PreSales, and regional delivery teams to align on customer needs, technical strategy, and success metrics.
- Contribute to proposal development and client presentations from a technical security standpoint.

Thought Leadership & Enablement:
- Stay updated on Azure security advancements and share knowledge internally and with clients.
- Support internal enablement sessions and mentor junior team members, where applicable.
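Workshops on Entra Conditional Access often start from an inventory of the tenant's existing policies. This is a minimal sketch, assuming an access token with Policy.Read.All has already been acquired (for example via azure-identity); it calls the standard Microsoft Graph v1.0 Conditional Access endpoint, with paging and error handling omitted.

```python
# Minimal sketch: list a tenant's Conditional Access policies via Microsoft Graph
# as a starting point for a Zero Trust / Entra workshop. Assumes GRAPH_ACCESS_TOKEN
# is an already-acquired bearer token with Policy.Read.All; paging and error
# handling are omitted for brevity.

import os
import requests

GRAPH_URL = "https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies"
ACCESS_TOKEN = os.environ["GRAPH_ACCESS_TOKEN"]  # token acquisition (e.g., azure-identity) not shown

def list_conditional_access_policies() -> list[dict]:
    """Fetch Conditional Access policies and return their name and state."""
    response = requests.get(
        GRAPH_URL,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()
    return [
        {"displayName": p.get("displayName"), "state": p.get("state")}
        for p in response.json().get("value", [])
    ]

if __name__ == "__main__":
    for policy in list_conditional_access_policies():
        print(f"{policy['displayName']}: {policy['state']}")
```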
Posted 4 weeks ago
8.0 - 10.0 years
13 - 15 Lacs
Pune
Work from Office
We are seeking a hands-on Lead Data Engineer to drive the design and delivery of scalable, secure data platforms on Google Cloud Platform (GCP). In this role you will own architectural decisions, guide service selection, and embed best practices across data engineering, security, and performance disciplines. You will partner with data modelers, analysts, security teams, and product owners to ensure our pipelines and datasets serve analytical, operational, and AI/ML workloads with reliability and cost efficiency. Familiarity with Microsoft Azure data services (Data Factory, Databricks, Synapse, Fabric) is valuable, as many existing workloads will transition from Azure to GCP.

Key Responsibilities:
- Lead end-to-end development of high-throughput, low-latency data pipelines and lakehouse solutions on GCP (BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Composer, Dataplex, etc.).
- Define reference architectures and technology standards for data ingestion, transformation, and storage.
- Drive service-selection trade-offs (cost, performance, scalability, and security) across streaming and batch workloads.
- Conduct design reviews and performance tuning sessions; ensure adherence to partitioning, clustering, and query-optimization standards in BigQuery (see the sketch after this listing).
- Contribute to the long-term cloud data strategy, evaluating emerging GCP features and multi-cloud patterns (Azure Synapse, Data Factory, Purview, etc.) for future adoption.
- Lead code reviews and oversee development activities delegated to data engineers.
- Implement best practices recommended by Google Cloud.
- Provide effort estimates for data engineering activities.
- Participate in discussions to migrate existing Azure workloads to GCP and provide migration solutions for selected data pipelines.

Must-Have Skills:
- 8-10 years in data engineering, with 3+ years leading teams or projects on GCP.
- Expert in GCP data services (BigQuery, Dataflow/Apache Beam, Dataproc/Spark, Pub/Sub, Cloud Storage) and orchestration with Cloud Composer or Airflow.
- Proven track record designing and optimizing large-scale ETL/ELT pipelines (streaming and batch).
- Strong fluency in SQL and one major programming language (Python, Java, or Scala).
- Deep understanding of data lake / lakehouse architectures, dimensional and data-vault modeling, and data governance frameworks.
- Excellent communication and stakeholder-management skills; able to translate complex technical topics to non-technical audiences.

Nice-to-Have Skills:
- Hands-on experience with Microsoft Azure data services (Azure Synapse Analytics, Data Factory, Event Hub, Purview).
- Experience integrating ML pipelines (Vertex AI, Dataproc ML) or real-time analytics (BigQuery BI Engine, Looker).
- Familiarity with open-source observability stacks (Prometheus, Grafana) and FinOps tooling for cloud cost optimization.

Preferred Certifications:
- Google Professional Data Engineer (strongly preferred) or Google Professional Cloud Architect.
- Microsoft Certified: Azure Data Engineer Associate (nice to have).

Education: Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related technical field. Equivalent professional experience will be considered.
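The BigQuery partitioning and clustering standards mentioned above are typically enforced at table-creation time. This is a minimal sketch using the google-cloud-bigquery client; the project, dataset, schema, and field choices are placeholders for illustration only.

```python
# Minimal sketch: create a date-partitioned, clustered BigQuery table so that
# downstream queries can prune partitions and co-locate rows by customer_id.
# Project, dataset, and schema names are placeholders.

from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

table_id = "my-project.analytics_demo.orders"
schema = [
    bigquery.SchemaField("order_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("amount", "NUMERIC"),
    bigquery.SchemaField("order_date", "DATE", mode="REQUIRED"),
]

table = bigquery.Table(table_id, schema=schema)
# Partition by day on order_date so date-filtered queries scan less data.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="order_date",
)
# Cluster within each partition to speed up customer-level lookups.
table.clustering_fields = ["customer_id"]

created = client.create_table(table, exists_ok=True)
print(f"Created {created.full_table_id} partitioned on order_date, clustered by customer_id")
```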
Posted 1 month ago