0 years
20 - 25 Lacs
Thane, Maharashtra, India
On-site
We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial-analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands, delivering compliant, high-impact solutions at enterprise scale.

Role & Responsibilities:
- Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, stored procedures) to meet complex commercial-analytics workloads.
- Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
- Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
- Drive governance and compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
- Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
- Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications

Must-Have:
- 7+ years of data-engineering/warehousing experience, including 4+ years of hands-on Snowflake design and development.
- Expert-level SQL plus strong data modeling (Dimensional, Data Vault) and ETL/ELT optimization skills.
- Proficiency in Python (or similar) for automation, API integrations, and orchestration.
- Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
- Bachelor's in Computer Science, Engineering, or Information Systems (Master's preferred).
- Strong client-facing communication and problem-solving ability in fast-paced, agile environments.

Preferred:
- Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
- Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion, dbt, Airflow, and Git.
- Snowflake certifications (SnowPro Core/Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.

Skills: Data, Analytics, Snowflake, Sales, Cloud, AWS, Azure
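The role above stresses end-to-end data quality when integrating feeds such as Veeva or IQVIA. A minimal, hypothetical sketch of the kind of row-level validation step involved (all field names are illustrative, not any vendor's actual schema):

```python
# Hypothetical sketch: row-level quality checks for an integrated sales feed.
# Field names (account_id, rx_count, period_start) are illustrative only.
from datetime import date

REQUIRED_FIELDS = ("account_id", "rx_count", "period_start")

def validate_row(row: dict) -> list[str]:
    """Return a list of quality issues for one feed record (empty = clean)."""
    issues = []
    for field in REQUIRED_FIELDS:
        if row.get(field) in (None, ""):
            issues.append(f"missing {field}")
    rx = row.get("rx_count")
    if isinstance(rx, (int, float)) and rx < 0:
        issues.append("negative rx_count")
    start = row.get("period_start")
    if isinstance(start, date) and start > date.today():
        issues.append("period_start in the future")
    return issues

def split_feed(rows: list[dict]):
    """Partition a feed into clean rows and (row, issues) rejects."""
    clean, rejects = [], []
    for row in rows:
        issues = validate_row(row)
        if issues:
            rejects.append((row, issues))
        else:
            clean.append(row)
    return clean, rejects
```

In a real pipeline a step like this would sit between ingestion and the warehouse load, with rejects routed to a quarantine table for review.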
Posted 2 weeks ago
0 years
20 - 25 Lacs
Nashik, Maharashtra, India
On-site
We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial-analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands, delivering compliant, high-impact solutions at enterprise scale.

Role & Responsibilities:
- Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, stored procedures) to meet complex commercial-analytics workloads.
- Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
- Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
- Drive governance and compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
- Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
- Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications

Must-Have:
- 7+ years of data-engineering/warehousing experience, including 4+ years of hands-on Snowflake design and development.
- Expert-level SQL plus strong data modeling (Dimensional, Data Vault) and ETL/ELT optimization skills.
- Proficiency in Python (or similar) for automation, API integrations, and orchestration.
- Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
- Bachelor's in Computer Science, Engineering, or Information Systems (Master's preferred).
- Strong client-facing communication and problem-solving ability in fast-paced, agile environments.

Preferred:
- Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
- Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion, dbt, Airflow, and Git.
- Snowflake certifications (SnowPro Core/Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.

Skills: Data, Analytics, Snowflake, Sales, Cloud, AWS, Azure
Posted 2 weeks ago
0 years
20 - 25 Lacs
Solapur, Maharashtra, India
On-site
We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial-analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands, delivering compliant, high-impact solutions at enterprise scale.

Role & Responsibilities:
- Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, stored procedures) to meet complex commercial-analytics workloads.
- Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
- Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
- Drive governance and compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
- Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
- Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications

Must-Have:
- 7+ years of data-engineering/warehousing experience, including 4+ years of hands-on Snowflake design and development.
- Expert-level SQL plus strong data modeling (Dimensional, Data Vault) and ETL/ELT optimization skills.
- Proficiency in Python (or similar) for automation, API integrations, and orchestration.
- Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
- Bachelor's in Computer Science, Engineering, or Information Systems (Master's preferred).
- Strong client-facing communication and problem-solving ability in fast-paced, agile environments.

Preferred:
- Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
- Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion, dbt, Airflow, and Git.
- Snowflake certifications (SnowPro Core/Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.

Skills: Data, Analytics, Snowflake, Sales, Cloud, AWS, Azure
Posted 2 weeks ago
0 years
2 - 3 Lacs
India
On-site
Job Title: Painter
Location: Serilingampally
Salary: ₹20,000 - ₹30,000 per month

Job Description: We are looking for Painters to join our team at the Serilingampally location.

Key Responsibilities:
- Prepare surfaces for painting (cleaning, sanding, filling cracks and holes).
- Mix, match, and apply paints and finishes as per specifications.
- Apply primer, paint, varnish, or other finishes using brushes, rollers, or spray guns.
- Protect surrounding areas using drop cloths or masking tape.
- Ensure high-quality finishing and attention to detail.
- Follow safety protocols and use protective equipment.
- Clean up after completing work and maintain tools and equipment.

Requirements:
- Proven experience as a painter (residential, commercial, or industrial).
- Knowledge of various painting techniques and materials.
- Good physical condition and ability to work at heights if required.
- Attention to detail and precision.
- Ability to work independently and as part of a team.

Education: No formal education required; relevant experience is mandatory.

Job Types: Full-time, Permanent
Pay: ₹20,000.00 - ₹30,000.00 per month
Benefits: Health insurance, Provident Fund
Schedule: Day shift
Supplemental Pay: Performance bonus
Work Location: In person
Posted 2 weeks ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Title: Data Warehouse Administrator

Job Summary: We are seeking an experienced Data Warehouse Administrator with strong expertise in Snowflake to manage, monitor, and optimize our enterprise data-warehousing environment. The ideal candidate will be responsible for implementing and maintaining secure, scalable, and high-performance Snowflake solutions while ensuring data availability and reliability.

Key Responsibilities

Snowflake Administration:
- Manage Snowflake accounts, warehouses, databases, roles, and users.
- Monitor performance and resource usage, and optimize warehouse configurations.
- Handle data replication, failover, and disaster-recovery setup.

Data Management & Security:
- Implement security best practices: RBAC, masking, encryption.
- Support data governance and compliance requirements (e.g., GDPR, HIPAA).

ETL/ELT & Data Integration Support:
- Work closely with data engineers to support data pipelines and transformations.
- Manage integrations between Snowflake and tools like dbt, Fivetran, Airflow, etc.

Monitoring & Troubleshooting:
- Proactively identify performance bottlenecks and resolve issues.
- Implement alerts, usage monitoring, and cost tracking in Snowflake.

Upgrades & Maintenance:
- Stay current with Snowflake updates and implement new features.
- Schedule and manage routine maintenance, backups, and data archiving.

Documentation & Support:
- Create and maintain system documentation, runbooks, and best practices.
- Provide L2/L3 support for data-warehouse-related issues.

Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 3-5+ years of experience with data-warehouse administration.
- 2+ years of hands-on experience with Snowflake.
- Proficiency in SQL, scripting (Python or Bash), and version control (Git).
- Experience with cloud platforms (AWS, Azure, or GCP).
- Familiarity with data modeling, ELT frameworks, and CI/CD practices.

Preferred Qualifications:
- Snowflake certifications (e.g., SnowPro Core/Advanced).
- Experience with tools like dbt, Airflow, Fivetran, or Matillion.
- Exposure to data cataloging and data-governance tools (e.g., Collibra, Alation).

Soft Skills:
- Strong problem-solving and analytical skills.
- Effective communication with technical and non-technical teams.
- Ability to work independently and in a team-oriented environment.

(ref:hirist.tech)
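Among the monitoring duties above is usage and cost tracking in Snowflake. A hedged sketch of the aggregation step, assuming metering records have already been exported into Python dicts (the field names are illustrative, not the actual ACCOUNT_USAGE schema):

```python
# Hypothetical sketch: aggregate warehouse credit usage and flag budget overruns.
# Record shape is illustrative; real metering data would come from Snowflake's
# account usage views.
from collections import defaultdict

def credits_by_warehouse(records: list[dict]) -> dict[str, float]:
    """Sum credits consumed per warehouse."""
    totals: dict[str, float] = defaultdict(float)
    for rec in records:
        totals[rec["warehouse"]] += rec["credits_used"]
    return dict(totals)

def over_budget(totals: dict[str, float], budgets: dict[str, float]) -> list[str]:
    """Warehouses whose consumed credits exceed their allotted budget."""
    return sorted(w for w, used in totals.items()
                  if used > budgets.get(w, float("inf")))
```

Output like this would typically feed an alerting channel so oversized or runaway warehouses are resized or suspended promptly.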
Posted 2 weeks ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Key Responsibilities:
- Design, develop, and deploy conversational agents using platforms like Google Dialogflow, Amelia, Amazon Lex, etc.
- Create and optimize NLP/NLU models to support dynamic, multi-turn interactions.
- Develop dialog flows, intents, entities, and fulfillment logic with API integrations.
- Integrate bots with external systems (CRM, EHR, contact center, databases) using RESTful APIs.
- Collaborate with UX designers, business analysts, and stakeholders to define and refine conversation flows.
- Implement voice-interface capabilities (SIP/SBC integration, TTS/STT engines).
- Conduct testing (unit, regression, UAT) and optimize bot performance using analytics and user feedback.
- Ensure compliance with data-privacy standards (e.g., HIPAA, GDPR) and implement masking/redaction as needed.
- Document architecture, workflows, and development best practices.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3+ years of experience in Conversational AI development.
- Proficiency with one or more platforms: Google Dialogflow, Amelia.ai, IBM Watson, Microsoft Bot Framework, Rasa, or similar.
- Strong understanding of NLU/NLP concepts and tools.
- Hands-on experience in REST API integration and backend scripting (Node.js, Python, or Java).
- Familiarity with telephony integrations (SIP, Twilio, Avaya, Genesys) is a plus.
- Experience with TTS/STT engines like Google Cloud, Nuance, or Amazon Polly.
- Strong debugging and problem-solving skills.
- Excellent communication and collaboration abilities.

Preferred Qualifications:
- Certification in any Conversational AI platform (e.g., Dialogflow CX, Amelia Certified Developer).
- Experience with analytics tools for bot-performance monitoring.
- Exposure to agentic AI design patterns or multi-agent systems.
- Understanding of AI governance and bias-mitigation practices.
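The fulfillment logic with API integrations mentioned above can be sketched as a small intent router. The payload shape follows Dialogflow ES webhook conventions (queryResult.intent.displayName in the request, fulfillmentText in the response); the intent and handler names are hypothetical:

```python
# Minimal sketch of webhook fulfillment routing for a Dialogflow-ES-style
# request payload. Intent names and handlers are hypothetical.

def handle_check_order(params: dict) -> str:
    order_id = params.get("order_id", "unknown")
    return f"Order {order_id} is being processed."

def handle_fallback(params: dict) -> str:
    return "Sorry, I didn't get that. Could you rephrase?"

# Map intent display names to handler functions.
HANDLERS = {
    "check.order": handle_check_order,
}

def fulfill(request: dict) -> dict:
    """Route a webhook request to an intent handler and build the response."""
    query = request.get("queryResult", {})
    intent = query.get("intent", {}).get("displayName", "")
    params = query.get("parameters", {})
    handler = HANDLERS.get(intent, handle_fallback)
    return {"fulfillmentText": handler(params)}
```

In production this router would sit behind an HTTPS endpoint registered as the agent's fulfillment webhook, with the handlers calling out to CRM/EHR systems.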
Posted 2 weeks ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: Security Engineer III
Work Mode: Office, 5 days
Location: Gurgaon

About Us: Nykaa is a leading e-commerce platform that combines fashion and technology to deliver a seamless shopping experience. To fortify our commitment to security, we are seeking a dedicated cybersecurity engineer to join our team. If you have a strong background in securing infrastructure and are passionate about protecting e-commerce platforms, we encourage you to apply.

Job Overview: We are looking for a talented and forward-thinking Cybersecurity Engineer to join our team. This role focuses on advancing our security infrastructure through cloud security, perimeter defenses, and cutting-edge security engineering practices powered by Artificial Intelligence (AI) and Machine Learning (ML). The ideal candidate will bring expertise in leveraging new technologies to enhance threat detection, automate responses, and predict vulnerabilities across cloud and network environments.

Key Responsibilities:

Security Engineering:
- Build and integrate security solutions, such as firewalls, encryption tools, and intrusion detection systems, to protect critical infrastructure and data.
- Collaborate with development teams to integrate security measures into the software development lifecycle (SDLC).
- Automate security workflows, vulnerability management, and incident-response processes.
- Lead security initiatives to address evolving threat landscapes, ensuring systems are resilient against emerging cyber risks.

Cloud Security:
- Design, implement, and manage secure cloud architectures for platforms such as AWS, Azure, and Google Cloud.
- Utilize AI/ML-driven security tools to enhance cloud monitoring, incident response, and threat detection in cloud environments.
- Ensure secure cloud infrastructure by automating security configurations and leveraging AI for predictive vulnerability assessments.
- Work with DevOps and infrastructure teams to implement automated security controls.

Data Protection Controls:
- Design and manage data encryption, tokenization, and masking practices to protect sensitive data both at rest and in transit.
- Design and enforce data-classification schemes, access controls, and data-retention policies to mitigate risks to sensitive information.
- Monitor and enforce security controls related to data handling, ensuring data is securely stored, processed, and transmitted in accordance with best practices.

Collaboration & Reporting:
- Work closely with cross-functional teams to embed AI-powered security practices in development pipelines, system architecture, and cloud-based environments.
- Provide detailed insights and reports on AI/ML-driven security improvements, potential risks, and recommended mitigations to management and stakeholders.
- Assist in creating and updating security policies, procedures, and standards to ensure they reflect emerging AI/ML technologies and best practices.
- Conduct training and workshops for other security teams on AI/ML techniques in security operations.

Required Skills & Qualifications:
- 4+ years of experience in cybersecurity with a strong focus on cloud security, perimeter security, and security engineering.
- Practical experience with cloud platforms (AWS, Azure, Google Cloud) and security services (IAM, encryption, security groups, etc.).
- Strong understanding of network-security protocols and controls (e.g., firewalls, VPNs, IDS/IPS) and their integration with AI/ML models for enhanced defense.
- Hands-on experience with AI/ML techniques for cybersecurity applications, including supervised and unsupervised learning, anomaly detection, and threat classification.
- Proficiency in programming and scripting languages (Python, R) and AI/ML frameworks (TensorFlow, Keras, or similar).
- Familiarity with cloud-native security tools that leverage AI/ML for threat detection (e.g., Amazon GuardDuty, Microsoft Sentinel).
- Experience with threat intelligence, vulnerability management, and incident-response frameworks.
- Experience in building and deploying security models in automated, scalable environments.

This role is perfect for a cybersecurity professional who is passionate about leveraging AI and machine learning to revolutionize security operations, proactively defending cloud environments and networks against emerging threats. If you're eager to work with advanced technologies to secure the future of digital infrastructures, we'd love to hear from you!
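The anomaly detection named in the qualifications can be illustrated with a simple statistical baseline, for example flagging unusual login volumes. This is a hedged sketch only; production systems would use learned models and richer features, and the threshold here is purely illustrative:

```python
# Hedged sketch: z-score baseline for anomaly detection (e.g., daily login
# counts). The threshold is illustrative, not a recommended production value.
import statistics

def anomalies(values: list[float], threshold: float = 3.0) -> list[int]:
    """Indices of values more than `threshold` standard deviations from the mean."""
    if len(values) < 2:
        return []
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # constant series: nothing stands out
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > threshold]
```

A real deployment would compute the baseline per entity (user, host, region) over a rolling window rather than over one flat list.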
Posted 2 weeks ago
0 years
2 - 3 Lacs
Thrissur
On-site
Company Description
Redlands Ashlyn Motors Plc, a division of the Redlands Ashlyn group of companies, is dedicated to enhancing productivity in the Indian agriculture sector through advanced mechanization. We manufacture a range of agricultural machinery, including harvester combines, straw balers, rice transplanters, muck trucks, tillers, and weed cutters. Our fully equipped factory and fabrication workshop are located in Malumachampatty, Coimbatore, Tamil Nadu, India. Our mission is to provide innovative, user-friendly, and affordable agricultural equipment to accelerate the mechanization of Indian agriculture.

Role Description
This is a full-time on-site role for an Automotive Painter located in Thrissur. The Automotive Painter will be responsible for preparing vehicles and machinery for painting, applying paint using various techniques, and ensuring high-quality finishes. Day-to-day tasks include sanding and masking surfaces, mixing and applying paint, inspecting painted surfaces for quality, and maintaining painting equipment and work areas.

Qualifications:
- Proficiency in automotive painting techniques, including spray painting.
- Experience in surface preparation, sanding, masking, and paint mixing.
- Knowledge of paint types, finishes, and application methods.
- Attention to detail and a strong focus on quality and workmanship.
- Ability to follow safety protocols and maintain a clean work environment.
- Prior experience in automotive or machinery painting is preferred.
- High school diploma or equivalent.

Job Type: Full-time
Pay: ₹20,000.00 - ₹25,000.00 per month
Benefits: Health insurance
Schedule: Day shift, Morning shift
Work Location: In person
Posted 2 weeks ago
50.0 years
0 Lacs
Ranjangaon
On-site
At Jabil we strive to make ANYTHING POSSIBLE and EVERYTHING BETTER. We are proud to be a trusted partner for the world's top brands, offering comprehensive engineering, manufacturing, and supply chain solutions. With over 50 years of experience across industries and a vast network of over 100 sites worldwide, Jabil combines global reach with local expertise to deliver both scalable and customized solutions. Our commitment extends beyond business success as we strive to build sustainable processes that minimize environmental impact and foster vibrant and diverse communities around the globe.

JOB SUMMARY
Coordinate tasks with other manufacturing staff to fulfill customer requirements covering the paint process, paint specification, and the aesthetic appearance of painted parts, while adhering to the safety requirements of hazardous operations, consistent quality, and customer specifications.

ESSENTIAL DUTIES AND RESPONSIBILITIES

GENERAL DUTIES:
- Works under direct, close supervision, with output monitored frequently.
- Follows mostly routine, standardized procedures to accomplish assigned tasks; may be exposed to more advanced functions as part of training and development.
- Selects from a variety of established procedures to perform assigned duties.
- Resolves routine questions and problems, referring more complex issues to higher levels. Errors can cause minor delay, expense, and disruption.
- Assembles finished units per customer specifications.
- Coordinates with teammates to organize tasks requiring multiple team members to accomplish.
- Utilizes manual and automated lifting devices while adhering to product-safety specifications.
- Provides information and coordinates action plans at cross-functional meetings and communicates issues with team members and/or visitors to drive corrective actions.
- Must be able to work overtime as required and respond to conflicting deadlines, changing priorities, and continuous interruptions.
- Organizes and maintains spare-parts inventory and orders spare parts as needed to fill customer orders; keeps abreast of spare-parts inventory locations for ease of order fulfillment.
- Assists in area organization per 5S attributes.
- Performs preventive maintenance on area tooling according to schedules, following preventive-maintenance procedural requirements to ensure audit compliance.
- May perform other duties and responsibilities as assigned.

Coating may be a responsibility within this job. If so, the following duties apply.

KEY DUTIES SUPPORTING COATING:
- Perform manual conformal coating of product per required specifications.
- Prepare assemblies for automated coating processes and operate equipment as needed.
- Maintain spray equipment (spray guns, booths, stripping area).
- Ensure assemblies and components are properly handled and marked.
- Accurately maintain daily thickness logs and MES record keeping.
- Utilize bar-code scanners and small hand tools.
- Inspect assemblies visually for proper masking application and placement of required materials.
- Work under direct, close supervision of the manufacturing supervisor or, in his/her absence, a Group Leader or other assigned management.
- Follow detailed written or verbal instructions, including visual aids.
- Ensure that the assigned area is clean and organized per 5S standards.
- Adhere to all safety and health rules and regulations associated with this position and as directed by the supervisor.
- Comply with and follow all procedures within the company security policy.

JOB QUALIFICATIONS / KNOWLEDGE REQUIREMENTS:
- Ability to effectively present information and respond to questions from groups of managers, clients, customers, and the general public.
- Ability to define problems, collect data, establish facts, and draw valid conclusions.
- Ability to operate a personal computer, including a Windows-based operating system and related software; advanced PC skills, including training and knowledge of Jabil's software packages.
- Ability to write simple correspondence; read and understand visual aids.
- Ability to apply common-sense understanding to carry out simple one- or two-step instructions and to deal with standardized situations with only occasional or no variables.
- Ability to read and comprehend simple instructions, short correspondence, and memos.
- Ability to add, subtract, multiply, and divide in all units of measure, using whole numbers, common fractions, and decimals; ability to compute rate, ratio, and percent and to draw and interpret graphs.

BE AWARE OF FRAUD: When applying for a job at Jabil you will be contacted via correspondence through our official job portal with a jabil.com e-mail address, a direct phone call from a member of the Jabil team, or a direct e-mail with a jabil.com e-mail address. Jabil does not request payments for interviews or at any other point during the hiring process. Jabil will not ask for your personal identifying information, such as a social security number, birth certificate, financial institution details, driver's license number, or passport information, over the phone or via e-mail. If you believe you are a victim of identity theft, contact your local police department. Any scam job listings should be reported to the website on which they were posted.

Jabil, including its subsidiaries, is an equal opportunity employer and considers qualified applicants for employment without regard to race, color, religion, national origin, sex, sexual orientation, gender identity, age, disability, genetic information, veteran status, or any other characteristic protected by law.

Accessibility Accommodation: If you are a qualified individual with a disability, you have the right to request a reasonable accommodation if you are unable or limited in your ability to use or access the Jabil.com/Careers site as a result of your disability. You can request a reasonable accommodation by sending an e-mail to Always_Accessible@Jabil.com with the nature of your request and contact information. Please do not direct any other general employment-related questions to this e-mail; only inquiries concerning a request for reasonable accommodation will be responded to.

#whereyoubelong
Posted 2 weeks ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About Atos
Atos is a global leader in digital transformation and the European number one in Cloud, Cybersecurity, and High-Performance Computing. The Group provides end-to-end Orchestrated Hybrid Cloud, Big Data, Business Applications, and Digital Workplace solutions. The Group is the Worldwide Information Technology Partner for the Olympic & Paralympic Games and operates under the brands Atos, Atos Syntel, and Unify. Atos is an SE (Societas Europaea), listed on the CAC40 Paris stock index. The purpose of Atos is to help design the future of the information space. Its expertise and services support the development of knowledge, education, and research in a multicultural approach and contribute to the development of scientific and technological excellence. Across the world, the Group enables its customers, employees, and members of society at large to live, work, and develop sustainably in a safe and secure information space.

Role Overview
The Technical Architect - Snowflake designs, implements, and optimizes scalable data-warehousing solutions. The jobholder has extensive experience with Snowflake, data architecture, and cloud integration, ensuring high performance, security, and reliability.

Responsibilities:
- Design and implement Snowflake-based data architectures to meet business requirements.
- Architect and optimize data solutions for performance, scalability, and reliability.
- Develop and optimize data pipelines and ETL/ELT processes.
- Establish best practices for data governance, security, and compliance.
- Collaborate with cross-functional teams to integrate Snowflake solutions with existing systems.
- Monitor and troubleshoot Snowflake environments for performance and reliability.
- Stay updated on Snowflake advancements and industry trends to recommend innovative solutions.

Key Technical Skills & Responsibilities:
- Minimum of 10+ years of experience designing and developing data-warehouse/big-data applications.
- Able to provide thought leadership to customers for their data-modernization initiatives using the latest technology trends.
- Able to lead data-product development using Streamlit and Cortex.
- Deep understanding of relational as well as NoSQL data stores and of data-modeling methods and approaches (star and snowflake schemas, dimensional modeling).
- Good communication skills.
- Experience of solution architecture using Snowflake; able to design Snowflake solutions for all types of data-analytics use cases.
- Experience working with the Snowflake data platform, its utilities (SnowSQL, Snowpipe, etc.), and its features (Time Travel, semi-structured data support, etc.).
- Experience migrating an on-premise data warehouse to the Snowflake cloud data platform.
- Experience working with a cloud platform: AWS, Azure, or GCP.
- Experience developing accelerators (using Python, Java, etc.) to expedite migration to Snowflake.
- Extensive experience developing ANSI SQL queries and Snowflake-compatible stored procedures.
- Snowflake for AI/ML; DevOps with Snowflake; data-security and data-masking features; multi-cloud data exchange using Snowflake.
- Snowflake certification is preferred.
- Effective communication and the required pre-sales experience.

Eligibility Criteria:
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- Proven experience as a Snowflake Architect or in a similar role.
- Snowflake certification (e.g., SnowPro Core Certification).
- Experience with cloud platforms like AWS, Azure, or GCP.
- Proficiency in Snowflake, SQL, and data modeling.
- Strong understanding of ETL/ELT processes and cloud integration.
- Excellent problem-solving and communication skills.
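The dimensional (star-schema) modeling called out above boils down mechanically to assigning surrogate keys during loads. A minimal sketch, with all table and column names hypothetical:

```python
# Hypothetical sketch: assign dimension surrogate keys while loading a fact
# table, the core mechanical step of star-schema loading. Names are illustrative.

def build_dim(natural_keys: list[str]) -> dict[str, int]:
    """Map each natural key to a stable surrogate key (1-based, first-seen order)."""
    dim: dict[str, int] = {}
    for nk in natural_keys:
        if nk not in dim:
            dim[nk] = len(dim) + 1
    return dim

def load_facts(rows: list[dict], product_dim: dict[str, int]) -> list[dict]:
    """Replace the natural product key with its surrogate key; skip unknowns."""
    facts = []
    for row in rows:
        sk = product_dim.get(row["product_code"])
        if sk is None:
            # In practice an unknown member would go to a reject or
            # late-arriving-dimension queue rather than being dropped.
            continue
        facts.append({"product_sk": sk, "amount": row["amount"]})
    return facts
```

In a warehouse like Snowflake the same lookup would typically be expressed as a join against the dimension table inside the ELT step rather than in application code.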
Posted 2 weeks ago
7.0 years
15 - 25 Lacs
Pune, Maharashtra, India
On-site
At Improzo (Improve + Zoe; meaning Life in Greek), we believe in improving life by empowering our customers. Founded by seasoned industry leaders, we are laser-focused on delivering quality-led commercial analytical solutions to our clients. Our dedicated team of experts in commercial data, technology, and operations has been evolving and learning together since our inception. Here, you won't find yourself confined to a cubicle; instead, you'll be navigating open waters, collaborating with brilliant minds to shape the future. You will work with leading Life Sciences clients, seasoned leaders and carefully chosen peers like you! People are at the heart of our success, so we have defined our CARE values framework with a lot of effort, and we use it as our guiding light in everything we do. We CARE!

Customer-Centric: Client success is our success. Prioritize customer needs and outcomes in every action.
Adaptive: Agile and innovative, with a growth mindset. Pursue bold and disruptive avenues that push the boundaries of possibilities.
Respect: Deep respect for our clients & colleagues. Foster a culture of collaboration and act with honesty, transparency, and ethical responsibility.
Execution: Laser-focused on quality-led execution; we deliver! Strive for the highest quality in our services, solutions, and customer experiences.

About The Role

We are seeking an experienced and highly skilled Snowflake Data Lead/Architect to lead strategic projects focused on Pharma Commercial Data Management. This role demands a professional with 7-9 years of experience in data architecture, data management, ETL, data transformation, and governance, with an emphasis on providing scalable and secure data solutions for the pharmaceutical sector. The ideal candidate will bring a deep understanding of data architecture principles, experience with cloud platforms like Snowflake and Databricks, and a solid background in driving commercial data management projects.
If you're passionate about leading impactful data initiatives, optimizing data workflows, and supporting the pharmaceutical industry's data needs, we invite you to apply.

Key Responsibilities

Snowflake Solution Design & Development:
Work closely with client stakeholders, data architects, and business analysts to understand detailed commercial data requirements and translate them into efficient Snowflake technical designs.
Design, develop, and optimize complex ETL/ELT processes within Snowflake using SQL, Stored Procedures, UDFs, Streams, Tasks, and other Snowflake features.
Implement data models (dimensional, star, snowflake schemas) optimized for commercial reporting, analytics, and data science use cases.
Implement data governance, security, and access controls within Snowflake, adhering to strict pharmaceutical compliance regulations (e.g., HIPAA, GDPR, GxP principles).
Develop and manage data sharing and collaboration solutions within Snowflake for internal and external partners.
Optimize Snowflake warehouse sizing, query performance, and overall cost efficiency.

Data Integration:
Integrate data from various commercial sources, including CRM systems (e.g., Veeva, Salesforce), sales data (e.g., IQVIA, Symphony), marketing platforms, patient services data, RWD, and other relevant datasets into Snowflake.
Utilize tools like Fivetran, Azure Data Factory, or custom Python scripts for data ingestion and transformation.

Tech Leadership & Expertise:
Provide technical expertise and support for Snowflake-related issues, troubleshooting data discrepancies and performance bottlenecks.
Participate in code reviews, ensuring adherence to best practices and coding standards.
Mentor junior developers and contribute to the growth of the data engineering team.

Data Quality, Governance & Security:
Implement robust data quality checks, validation rules, and reconciliation processes to ensure accuracy and reliability of commercial data.
Apply and enforce data governance policies, including data lineage, metadata management, and master data management principles.
Implement and maintain strict data security, access controls, and data masking techniques within Snowflake, adhering to pharmaceutical industry compliance standards (e.g., HIPAA, GDPR, GxP principles).

Required Qualifications
Bachelor's degree in Computer Science, Information Systems, Engineering, or a related quantitative field. Master's degree preferred.
7+ years of progressive experience in data warehousing, ETL/ELT development, and data engineering roles.
4+ years of hands-on, in-depth experience as a Snowflake Developer, with a proven track record of designing and implementing complex data solutions on the Snowflake platform.
Expert-level proficiency in SQL for data manipulation, complex query optimization, and advanced stored procedure development within Snowflake.
Strong understanding and practical experience with data modeling techniques (e.g., Dimensional Modeling, Data Vault).
Experience with data integration tools for Snowflake (e.g., Fivetran, Matillion, dbt, Airflow, or custom Python-based ETL frameworks).
Proficiency in at least one scripting language (e.g., Python) for data processing, API integration, and automation.
Demonstrable understanding of data governance, data security, and regulatory compliance within the pharmaceutical or other highly regulated industries (e.g., GxP, HIPAA, GDPR, PII).
Experience working in a client-facing or consulting environment with strong communication and presentation skills.
Excellent problem-solving, analytical, and communication skills.
Ability to work independently and as part of a collaborative team in a fast-paced environment.

Preferred Qualifications
Specific experience with pharmaceutical commercial data sets such as sales data (e.g., IQVIA, Symphony), CRM data (e.g., Veeva, Salesforce), claims data, patient services data, or master data management (MDM) for commercial entities.
Knowledge of commercial analytics concepts and KPIs in the pharma industry (e.g., sales performance, market share, patient adherence).
Experience working with cloud platforms (AWS, Azure, or GCP) and their native services for data storage and processing.
Experience with version control systems (e.g., Git).
Snowflake certifications (e.g., SnowPro Core, SnowPro Advanced).
Experience with data visualization tools (e.g., Tableau, Power BI, Qlik Sense) and their connectivity to Snowflake.
Knowledge of Agile methodologies for managing data projects.

Benefits
Competitive salary and benefits package.
Opportunity to work on cutting-edge tech projects, transforming the life sciences industry.
Collaborative and supportive work environment.
Opportunities for professional development and growth.

Skills: data visualization tools, data vault, azure data factory, data architecture, client-facing, data governance, data quality, data, snowflake, sql, data integration, fivetran, pharma commercial, data security, python, dimensional modeling, etl, data management
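The data masking responsibility described above is usually met in Snowflake with a masking policy that returns the raw value only to authorized roles. The behavior can be mirrored in plain Python as a minimal sketch; the role names and the masking rule below are illustrative assumptions, not a compliance standard.

```python
# Minimal sketch of the role-based masking semantics a Snowflake
# CREATE MASKING POLICY statement typically encodes, reproduced in Python.
# Role names and the mask format are assumptions for illustration.

AUTHORIZED_ROLES = {"COMPLIANCE_ADMIN", "PHARMACOVIGILANCE"}

def mask_patient_id(value: str, current_role: str) -> str:
    """Return the raw value for authorized roles; otherwise mask all but the
    last two characters, as a typical PII masking rule might."""
    if current_role in AUTHORIZED_ROLES:
        return value
    return "*" * max(len(value) - 2, 0) + value[-2:]

print(mask_patient_id("PT-000123", "ANALYST"))           # masked view
print(mask_patient_id("PT-000123", "COMPLIANCE_ADMIN"))  # unmasked view
```

In Snowflake itself the same branch would live inside the policy body (a `CASE` on `CURRENT_ROLE()`), so analysts and compliance users query the same column and see different results.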
Posted 2 weeks ago
5.0 - 11.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Senior (CTM – Threat Detection & Response)

Key Capabilities:
Experience in working with Splunk Enterprise, Splunk Enterprise Security & Splunk UEBA.
Minimum of Splunk Power User Certification.
Good knowledge of programming or scripting languages such as Python (preferred), JavaScript (preferred), Bash, PowerShell, etc.
Perform remote and on-site gap assessments of the SIEM solution.
Define evaluation criteria & approach based on the client requirement & scope, factoring in industry best practices & regulations.
Conduct interviews with stakeholders and review documents (SOPs, architecture diagrams, etc.).
Evaluate the SIEM based on the defined criteria and prepare audit reports.
Good experience in providing consulting to customers during the testing, evaluation, pilot, production and training phases to ensure a successful deployment.
Understand customer requirements and recommend best practices for SIEM solutions.
Offer consultative advice on security principles and best practices related to SIEM operations.
Design and document a SIEM solution to meet customer needs.
Experience in onboarding data into Splunk from various sources, including unsupported (in-house built) ones, by creating custom parsers.
Verification of data from log sources in the SIEM, following the Common Information Model (CIM).
Experience in parsing and masking of data prior to ingestion into the SIEM.
Provide support for the data collection, processing, analysis and operational reporting systems, including planning, installation, configuration, testing, troubleshooting and problem resolution.
Assist clients to fully optimize the SIEM system capabilities as well as the audit and logging features of the event log sources.
Assist clients with technical guidance to configure end log sources (in scope) to be integrated into the SIEM.
Experience in handling big data integration via Splunk.
Expertise in SIEM content development, which includes developing processes for automated security event monitoring and alerting along with corresponding event response plans for systems.
Hands-on experience in development and customization of Splunk Apps & Add-ons.
Build advanced visualizations (interactive drilldowns, glass tables, etc.).
Build and integrate contextual data into notable events.
Experience in creating use cases under the Cyber Kill Chain and MITRE ATT&CK frameworks.
Capability in developing advanced dashboards (with CSS, JavaScript, HTML, XML) and reports that can provide near real-time visibility into the performance of client applications.
Experience in installation, configuration and usage of premium Splunk Apps and Add-ons such as ES, UEBA, ITSI, etc.
Sound knowledge of the configuration of alerts and reports.
Good exposure to automatic lookups, data models and creating complex SPL queries.
Create, modify and tune SIEM rules to adjust the specifications of alerts and incidents to meet client requirements.
Work with the client SPOC for correlation rule tuning (as per the use-case management life cycle) and incident classification and prioritization recommendations.
Experience in creating custom commands, custom alert actions, adaptive response actions, etc.

Qualification & Experience:
Minimum of 5 to 11 years' experience, with a depth of network architecture knowledge that will translate over to deploying and integrating a complicated security intelligence solution into global enterprise environments.
Strong oral, written and listening skills are an essential component of effective consulting.
Strong background in network administration. Ability to work at all layers of the OSI model, including being able to explain communication at any level, is necessary.
Must have knowledge of Vulnerability Management, Windows and Linux basics including installations, Windows Domains, trusts, GPOs, server roles, Windows security policies, user administration, Linux security and troubleshooting.
Good to have: experience with the design and implementation of Splunk with a focus on IT Operations, Application Analytics, User Experience, Application Performance and Security Management.
Experience with multiple cluster deployments & management as per vendor guidelines and industry best practices.
Ability to troubleshoot Splunk platform and application issues, escalate issues and work with Splunk support to resolve them.
Certification in any one SIEM solution such as IBM QRadar, Exabeam or Securonix will be an added advantage.
Certifications in a core security-related discipline will be an added advantage.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.
Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
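The CIM normalization step in the capabilities list above — verifying that log-source fields follow Splunk's Common Information Model before content development — amounts to renaming vendor-specific fields onto CIM names. A minimal sketch follows; the field map is an assumption for illustration, not the full CIM data model.

```python
# Hedged sketch of CIM-style field normalization prior to SIEM ingestion:
# vendor-specific keys are renamed to Common Information Model names so that
# downstream correlation searches and data models match. The mapping below is
# a small illustrative subset, not an official CIM definition.

CIM_FIELD_MAP = {
    "srcip": "src_ip",
    "dstip": "dest_ip",
    "user_name": "user",
    "act": "action",
}

def normalize_event(raw_event: dict) -> dict:
    """Rename known vendor fields to their CIM equivalents; pass the rest through."""
    return {CIM_FIELD_MAP.get(key, key): value for key, value in raw_event.items()}

event = normalize_event({"srcip": "10.0.0.5", "act": "blocked", "port": 443})
print(event)
```

In a real deployment this renaming is usually done with Splunk field aliases or props/transforms in an add-on rather than application code, but the mapping logic is the same.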
Posted 2 weeks ago
3.0 - 6.0 years
8 - 24 Lacs
Chennai, Tamil Nadu, India
On-site
Data Engineer – Snowflake

About The Opportunity
An award-winning global IT services & analytics consultancy operating in the Enterprise Cloud Data & Digital Transformation sector. We partner with Fortune 500 firms to modernise data platforms, unlock real-time insights, and drive AI/ML innovation. To scale client programmes in India, we seek a skilled Data Engineer specialised in Snowflake to design high-throughput, secure, and cost-efficient data products in an on-site environment.

Role & Responsibilities
Design and implement end-to-end Snowflake data warehouses, from staging to curated layers, ensuring ELT best practices.
Build and automate data ingestion pipelines using Snowpipe, Python, and cloud services, delivering near real-time availability.
Optimise schema design, clustering, and partitioning to reduce query latency and storage costs.
Create robust CI/CD workflows for Snowflake objects via Git and orchestration tools like dbt/Airflow.
Monitor workload performance, triage bottlenecks, and enforce data governance, security, and compliance policies.
Collaborate with analytics, BI, and product teams to translate business requirements into scalable data models.

Skills & Qualifications

Must-Have
3-6 years of professional data engineering experience with a primary focus on Snowflake.
Expert SQL skills and proficiency in scripting (Python or JavaScript) for ETL/ELT tasks.
Hands-on with Snowpipe, Tasks, Streams, and performance tuning.
Experience integrating cloud storage (AWS S3/Azure Blob/GCS) and building event-driven pipelines.
Solid understanding of dimensional data modelling and data governance (RBAC, masking, encryption).
Version control, testing, and deployment using Git and CI/CD pipelines.

Preferred
Exposure to dbt, Airflow, or similar orchestration frameworks.
Experience with Kafka or real-time messaging platforms.
Knowledge of BI tools (Tableau, Power BI) and the modern data stack.
Snowflake certifications (SnowPro Core/Advanced) a plus.
Familiarity with Infra-as-Code (Terraform, CloudFormation).

Benefits & Culture Highlights
On-site, high-energy data lab with access to cutting-edge cloud tools and sandbox environments.
Clear technical career ladder, certification sponsorship, and mentorship from Snowflake SMEs.
Comprehensive health cover, performance bonuses, and employee wellness programmes.

Skills: sql, elt, dbt, azure blob, masking, streams, data governance, snowflake, aws s3, dimensional data modelling, rbac, git, tasks, performance tuning, gcs, python, encryption, snowpipe, javascript, snowflake data engineer, aws, ci/cd, etl, data modeling
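The event-driven Snowpipe ingestion this role describes is driven by a pipe object that auto-ingests files as they land in an external stage. A minimal sketch of the DDL such a setup produces is below; the pipe, table, stage, and file-format names are hypothetical placeholders.

```python
# Illustrative sketch of a Snowpipe definition for event-driven ingestion:
# AUTO_INGEST = TRUE lets cloud-storage notifications (e.g. from S3) trigger
# the COPY automatically. All object names are assumptions for illustration.

def snowpipe_ddl(pipe: str, table: str, stage: str,
                 file_format: str = "parquet_ff") -> str:
    """Return CREATE PIPE DDL that copies files from a stage into a table."""
    return (
        f"CREATE OR REPLACE PIPE {pipe} AUTO_INGEST = TRUE AS\n"
        f"  COPY INTO {table}\n"
        f"  FROM @{stage}\n"
        f"  FILE_FORMAT = (FORMAT_NAME = '{file_format}');"
    )

pipe_ddl = snowpipe_ddl("raw.sales_pipe", "raw.sales_stg", "raw.s3_landing")
print(pipe_ddl)
```

Pairing a pipe like this with the stage's bucket notifications is what gives the "near real-time availability" the responsibilities list calls out, without any polling scheduler.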
Posted 2 weeks ago
5.0 - 11.0 years
0 Lacs
Trivandrum, Kerala, India
Remote
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Senior (CTM – Threat Detection & Response) Key Capabilities: Experience in working with Splunk Enterprise, Splunk Enterprise Security & Splunk UEBA Minimum of Splunk Power User Certification Good knowledge of programming or scripting languages such as Python (preferred), JavaScript (preferred), Bash, PowerShell, etc. Perform remote and on-site gap assessments of the SIEM solution. Define evaluation criteria & approach based on the client requirement & scope, factoring in industry best practices & regulations Conduct interviews with stakeholders, review documents (SOPs, architecture diagrams, etc.) Evaluate the SIEM based on the defined criteria and prepare audit reports Good experience in providing consulting to customers during the testing, evaluation, pilot, production and training phases to ensure a successful deployment. Understand customer requirements and recommend best practices for SIEM solutions. 
Offer consultative advice on security principles and best practices related to SIEM operations Design and document a SIEM solution to meet the customer's needs Experience in onboarding data into Splunk from various sources, including unsupported (in-house built) ones, by creating custom parsers Verify log-source data in the SIEM, following the Common Information Model (CIM) Experience in parsing and masking of data prior to ingestion in the SIEM Provide support for the data collection, processing, analysis and operational reporting systems, including planning, installation, configuration, testing, troubleshooting and problem resolution Assist clients to fully optimize the SIEM system capabilities as well as the audit and logging features of the event log sources Assist the client with technical guidance to configure end log sources (in-scope) to be integrated into the SIEM Experience in handling big-data integration via Splunk Expertise in SIEM content development, which includes developing processes for automated security event monitoring and alerting along with corresponding event response plans for systems Hands-on experience in development and customization of Splunk Apps & Add-ons Build advanced visualizations (interactive drilldowns, glass tables, etc.) Build and integrate contextual data into notable events Experience in creating use cases under the Cyber Kill Chain and MITRE ATT&CK frameworks Capability in developing advanced dashboards (with CSS, JavaScript, HTML, XML) and reports that can provide near-real-time visibility into the performance of client applications. Experience in installation, configuration and usage of premium Splunk apps and add-ons such as the ES App, UEBA, ITSI, etc. Sound knowledge of configuring alerts and reports. Good exposure to automatic lookups, data models and creating complex SPL queries. 
Create, modify and tune SIEM rules to adjust the specifications of alerts and incidents to meet client requirements Work with the client SPOC for correlation rule tuning (as per the use-case management life cycle), incident classification and prioritization recommendations Experience in creating custom commands, custom alert actions, adaptive response actions, etc. Qualification & experience: Minimum of 5 to 11 years’ experience, with a depth of network-architecture knowledge that will translate to deploying and integrating a complex security intelligence solution into global enterprise environments. Strong oral, written and listening skills are an essential component of effective consulting. Strong background in network administration. Ability to work at all layers of the OSI model, including being able to explain communication at any level, is necessary. Must have knowledge of vulnerability management, Windows and Linux basics including installations, Windows domains, trusts, GPOs, server roles, Windows security policies, user administration, Linux security and troubleshooting. Good to have: experience with the design and implementation of Splunk with a focus on IT Operations, Application Analytics, User Experience, Application Performance and Security Management Multiple cluster deployments & management experience as per vendor guidelines and industry best practices Troubleshoot Splunk platform and application issues, escalate issues and work with Splunk support to resolve them Certification in any one SIEM solution, such as IBM QRadar, Exabeam or Securonix, will be an added advantage Certifications in a core security-related discipline will be an added advantage. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. 
Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
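Masking data prior to SIEM ingestion, as the role above calls for, can be sketched in a few lines of Python. The field patterns and placeholder tokens here are illustrative assumptions, not part of any specific Splunk deployment:

```python
import re

# Illustrative sketch only: masks common PII patterns in raw log events
# before they are forwarded to a SIEM. The patterns and placeholder
# tokens are assumptions, not a production masking policy.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ipv4": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
    "card": re.compile(r"\b\d{4}(?:[ -]?\d{4}){3}\b"),
}

def mask_event(event: str) -> str:
    """Replace each PII match with a fixed token, keeping log structure intact."""
    for name, pattern in PATTERNS.items():
        event = pattern.sub(f"<{name}-masked>", event)
    return event

masked = mask_event("login failure for jdoe@example.com from 10.1.2.3")
# masked keeps the message shape but no longer contains the email or IP
```

In practice this kind of transform would run in an ingestion pipeline (e.g. a heavy forwarder or a pre-processing script) so that sensitive values never reach the index.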
Posted 2 weeks ago
3.0 - 6.0 years
8 - 24 Lacs
Pune, Maharashtra, India
On-site
Data Engineer – Snowflake About The Opportunity An award-winning global IT services & analytics consultancy operating in the Enterprise Cloud Data & Digital Transformation sector. We partner with Fortune 500 firms to modernise data platforms, unlock real-time insights, and drive AI/ML innovation. To scale client programmes in India, we seek a skilled Data Engineer specialised in Snowflake to design high-throughput, secure, and cost-efficient data products in an on-site environment. Role & Responsibilities Design and implement end-to-end Snowflake data warehouses, from staging to curated layers, ensuring ELT best practices. Build and automate data ingestion pipelines using Snowpipe, Python, and cloud services, delivering near real-time availability. Optimise schema design, clustering, and partitioning to reduce query latency and storage costs. Create robust CI/CD workflows for Snowflake objects via Git and orchestration tools like dbt/Airflow. Monitor workload performance, triage bottlenecks, and enforce data governance, security, and compliance policies. Collaborate with analytics, BI, and product teams to translate business requirements into scalable data models. Skills & Qualifications Must-Have 3-6 years professional Data Engineering experience with primary focus on Snowflake. Expert SQL skills and proficiency in scripting (Python or JavaScript) for ETL/ELT tasks. Hands-on with Snowpipe, Tasks, Streams, and Performance Tuning. Experience integrating cloud storage (AWS S3/Azure Blob/GCS) and building event-driven pipelines. Solid understanding of dimensional data modelling and data governance (RBAC, masking, encryption). Version control, testing, and deployment using Git and CI/CD pipelines. Preferred Exposure to dbt, Airflow, or similar orchestration frameworks. Experience with Kafka or real-time messaging platforms. Knowledge of BI tools (Tableau, Power BI) and modern data stack. Snowflake certifications (SnowPro Core/Advanced) a plus. 
Familiarity with Infra-as-Code (Terraform, CloudFormation). Benefits & Culture Highlights On-site, high-energy data lab with access to cutting-edge cloud tools and sandbox environments. Clear technical career ladder, certification sponsorship, and mentorship from Snowflake SMEs. Comprehensive health cover, performance bonuses, and employee wellness programmes. Skills: sql,elt,dbt,azure blob,masking,streams,data governance,snowflake,aws s3,dimensional data modelling,rbac,git,tasks,performance tuning,gcs,python,encryption,snowpipe,javascript,snowflake data engineer,aws,ci/cd,etl,data modeling
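The Snowpipe/Streams/Tasks pipeline work described above can be sketched as a small DDL generator. The object names, warehouse and schedule are hypothetical, and a real pipeline would typically use a MERGE with change-tracking metadata rather than a plain INSERT:

```python
# Hypothetical sketch: renders Snowflake DDL for a simple stream-plus-task
# ELT step (staging -> curated). Object names, warehouse and schedule are
# illustrative assumptions, not a client configuration.
def elt_step_ddl(src: str, dst: str, warehouse: str, schedule_min: int = 5) -> list:
    stream = f"{src}_stream"
    return [
        # A stream captures row-level changes on the staging table.
        f"CREATE OR REPLACE STREAM {stream} ON TABLE {src};",
        # A task periodically moves captured changes into the curated table.
        f"CREATE OR REPLACE TASK {dst}_task\n"
        f"  WAREHOUSE = {warehouse}\n"
        f"  SCHEDULE = '{schedule_min} MINUTE'\n"
        f"AS\n"
        f"  INSERT INTO {dst} SELECT * FROM {stream};",
    ]

ddl = elt_step_ddl("stg_orders", "cur_orders", "etl_wh")
```

Generating DDL from code like this is one way to keep Snowflake objects under version control and deployable through the Git/CI-CD workflows the role mentions.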
Posted 2 weeks ago
5.0 - 11.0 years
0 Lacs
Pune, Maharashtra, India
Remote
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Senior (CTM – Threat Detection & Response) Key Capabilities: Experience in working with Splunk Enterprise, Splunk Enterprise Security & Splunk UEBA Minimum of Splunk Power User Certification Good knowledge of programming or scripting languages such as Python (preferred), JavaScript (preferred), Bash, PowerShell, etc. Perform remote and on-site gap assessments of the SIEM solution. Define evaluation criteria & approach based on the client requirement & scope, factoring in industry best practices & regulations Conduct interviews with stakeholders, review documents (SOPs, architecture diagrams, etc.) Evaluate the SIEM based on the defined criteria and prepare audit reports Good experience in providing consulting to customers during the testing, evaluation, pilot, production and training phases to ensure a successful deployment. Understand customer requirements and recommend best practices for SIEM solutions. 
Offer consultative advice on security principles and best practices related to SIEM operations Design and document a SIEM solution to meet the customer's needs Experience in onboarding data into Splunk from various sources, including unsupported (in-house built) ones, by creating custom parsers Verify log-source data in the SIEM, following the Common Information Model (CIM) Experience in parsing and masking of data prior to ingestion in the SIEM Provide support for the data collection, processing, analysis and operational reporting systems, including planning, installation, configuration, testing, troubleshooting and problem resolution Assist clients to fully optimize the SIEM system capabilities as well as the audit and logging features of the event log sources Assist the client with technical guidance to configure end log sources (in-scope) to be integrated into the SIEM Experience in handling big-data integration via Splunk Expertise in SIEM content development, which includes developing processes for automated security event monitoring and alerting along with corresponding event response plans for systems Hands-on experience in development and customization of Splunk Apps & Add-ons Build advanced visualizations (interactive drilldowns, glass tables, etc.) Build and integrate contextual data into notable events Experience in creating use cases under the Cyber Kill Chain and MITRE ATT&CK frameworks Capability in developing advanced dashboards (with CSS, JavaScript, HTML, XML) and reports that can provide near-real-time visibility into the performance of client applications. Experience in installation, configuration and usage of premium Splunk apps and add-ons such as the ES App, UEBA, ITSI, etc. Sound knowledge of configuring alerts and reports. Good exposure to automatic lookups, data models and creating complex SPL queries. 
Create, modify and tune SIEM rules to adjust the specifications of alerts and incidents to meet client requirements Work with the client SPOC for correlation rule tuning (as per the use-case management life cycle), incident classification and prioritization recommendations Experience in creating custom commands, custom alert actions, adaptive response actions, etc. Qualification & experience: Minimum of 5 to 11 years’ experience, with a depth of network-architecture knowledge that will translate to deploying and integrating a complex security intelligence solution into global enterprise environments. Strong oral, written and listening skills are an essential component of effective consulting. Strong background in network administration. Ability to work at all layers of the OSI model, including being able to explain communication at any level, is necessary. Must have knowledge of vulnerability management, Windows and Linux basics including installations, Windows domains, trusts, GPOs, server roles, Windows security policies, user administration, Linux security and troubleshooting. Good to have: experience with the design and implementation of Splunk with a focus on IT Operations, Application Analytics, User Experience, Application Performance and Security Management Multiple cluster deployments & management experience as per vendor guidelines and industry best practices Troubleshoot Splunk platform and application issues, escalate issues and work with Splunk support to resolve them Certification in any one SIEM solution, such as IBM QRadar, Exabeam or Securonix, will be an added advantage Certifications in a core security-related discipline will be an added advantage. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. 
Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 2 weeks ago
5.0 - 11.0 years
0 Lacs
Noida, Uttar Pradesh, India
Remote
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Senior (CTM – Threat Detection & Response) Key Capabilities: Experience in working with Splunk Enterprise, Splunk Enterprise Security & Splunk UEBA Minimum of Splunk Power User Certification Good knowledge of programming or scripting languages such as Python (preferred), JavaScript (preferred), Bash, PowerShell, etc. Perform remote and on-site gap assessments of the SIEM solution. Define evaluation criteria & approach based on the client requirement & scope, factoring in industry best practices & regulations Conduct interviews with stakeholders, review documents (SOPs, architecture diagrams, etc.) Evaluate the SIEM based on the defined criteria and prepare audit reports Good experience in providing consulting to customers during the testing, evaluation, pilot, production and training phases to ensure a successful deployment. Understand customer requirements and recommend best practices for SIEM solutions. 
Offer consultative advice on security principles and best practices related to SIEM operations Design and document a SIEM solution to meet the customer's needs Experience in onboarding data into Splunk from various sources, including unsupported (in-house built) ones, by creating custom parsers Verify log-source data in the SIEM, following the Common Information Model (CIM) Experience in parsing and masking of data prior to ingestion in the SIEM Provide support for the data collection, processing, analysis and operational reporting systems, including planning, installation, configuration, testing, troubleshooting and problem resolution Assist clients to fully optimize the SIEM system capabilities as well as the audit and logging features of the event log sources Assist the client with technical guidance to configure end log sources (in-scope) to be integrated into the SIEM Experience in handling big-data integration via Splunk Expertise in SIEM content development, which includes developing processes for automated security event monitoring and alerting along with corresponding event response plans for systems Hands-on experience in development and customization of Splunk Apps & Add-ons Build advanced visualizations (interactive drilldowns, glass tables, etc.) Build and integrate contextual data into notable events Experience in creating use cases under the Cyber Kill Chain and MITRE ATT&CK frameworks Capability in developing advanced dashboards (with CSS, JavaScript, HTML, XML) and reports that can provide near-real-time visibility into the performance of client applications. Experience in installation, configuration and usage of premium Splunk apps and add-ons such as the ES App, UEBA, ITSI, etc. Sound knowledge of configuring alerts and reports. Good exposure to automatic lookups, data models and creating complex SPL queries. 
Create, modify and tune SIEM rules to adjust the specifications of alerts and incidents to meet client requirements Work with the client SPOC for correlation rule tuning (as per the use-case management life cycle), incident classification and prioritization recommendations Experience in creating custom commands, custom alert actions, adaptive response actions, etc. Qualification & experience: Minimum of 5 to 11 years’ experience, with a depth of network-architecture knowledge that will translate to deploying and integrating a complex security intelligence solution into global enterprise environments. Strong oral, written and listening skills are an essential component of effective consulting. Strong background in network administration. Ability to work at all layers of the OSI model, including being able to explain communication at any level, is necessary. Must have knowledge of vulnerability management, Windows and Linux basics including installations, Windows domains, trusts, GPOs, server roles, Windows security policies, user administration, Linux security and troubleshooting. Good to have: experience with the design and implementation of Splunk with a focus on IT Operations, Application Analytics, User Experience, Application Performance and Security Management Multiple cluster deployments & management experience as per vendor guidelines and industry best practices Troubleshoot Splunk platform and application issues, escalate issues and work with Splunk support to resolve them Certification in any one SIEM solution, such as IBM QRadar, Exabeam or Securonix, will be an added advantage Certifications in a core security-related discipline will be an added advantage. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. 
Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 2 weeks ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
This job is with Standard Chartered Bank, an inclusive employer and a member of myGwork – the largest global platform for the LGBTQ+ business community. Please do not contact the recruiter directly. Job Summary We are looking for a skilled Automation Test Manager to join our team in ensuring the quality, reliability, and security of our payments processing application. The role involves creating and maintaining automated test scripts using Selenium, Java, SQL, and Python for data masking. The ideal candidate has a strong background in automation testing within payment systems or similar high-availability applications. Experienced in SWIFT MT messages such as MT103, MT202 and MT202 COV Experienced in MX messages such as PACS.008, PACS.009, PACS.004, PACS.002 and PAIN.001 Experienced in real-time (Faster Payments) processing such as IMPS, G3 and IBFT End-to-end payment processing knowledge Ensure the quality and timeliness of delivery of testing assignments. Perform functional and technical test execution activities as per the testing team's engagement level in the project. Perform testing in Agile methodology delivery. Plan, analyse and design testing; prepare the test strategy, test plan and traceability matrix Key Responsibilities Perform testing in Agile methodology delivery Functional / automation testing for the SCPay payments application Test Automation Design, develop, and maintain automated test scripts using Selenium and Java to support regression, functional, and integration testing. Write and execute SQL queries to validate data integrity and ensure data consistency across transactions. Familiarity with Kibana and an understanding of KQL is a plus Data Masking & Test Data Management Utilize Python scripts for data masking to protect sensitive data used in test cases. 
Manage test data and set up testing environments to support end-to-end testing scenarios Quality Assurance & Test Strategy Develop comprehensive test plans and test cases to cover all aspects of the application, including the UI, API, and database layers. Collaborate with development and product teams to understand requirements, create testing strategies, and identify automation opportunities. Defect Tracking & Reporting Log, track, and manage defects using tracking tools, ensuring clear documentation and communication of issues. Generate and share test execution reports with stakeholders, highlighting critical issues and providing insights for improvement. Continuous Improvement Enhance existing automation frameworks and scripts to improve coverage, maintainability, and reliability. Stay updated on industry trends and best practices in test automation, implementing relevant improvements. Skills And Experience Minimum 8-13 years of experience Experience in leading a team of more than 5 members Automation testing using REST APIs MT and MX message processing Agile methodology Payment processing testing is a must (ISO 20022, MT/MX payment formats) Automation Tools: Proficiency in Selenium with Java for test automation SQL: Strong SQL skills for data validation and back-end testing Python: Experience with Python scripting for data masking and test data management. Testing Frameworks: Knowledge of testing frameworks such as TestNG or JUnit CI/CD: Familiarity with CI/CD tools like Jenkins, Git, or similar for automated test execution Excellent problem-solving and analytical skills Strong communication skills to convey technical details effectively Ability to work in a collaborative Agile environment with cross-functional teams Qualifications Bachelor's degree in Computer Science, Software Engineering or an equivalent degree About Standard Chartered We're an international bank, nimble enough to act, big enough for impact. 
For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion. Together We Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well Are better together, we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term What We Offer In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing. Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations. Time-off including annual leave, parental/maternity (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holiday, which is combined to 30 days minimum. Flexible working options based around home and office locations, with flexible working patterns. 
Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and a range of self-help toolkits. A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning. Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity, across our teams, business functions and geographies - everyone feels respected and can realise their full potential.
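The SQL data-integrity validation described in this role can be sketched with an in-memory SQLite database standing in for the payments store; the table and column names are assumptions for illustration only:

```python
import sqlite3

# Illustrative sketch: an in-memory SQLite database stands in for the
# payments database; table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE payments (id INTEGER PRIMARY KEY, amount REAL, status TEXT);
    CREATE TABLE ledger   (payment_id INTEGER, amount REAL);
    INSERT INTO payments VALUES (1, 100.0, 'SETTLED'), (2, 250.5, 'SETTLED');
    INSERT INTO ledger   VALUES (1, 100.0), (2, 250.5);
""")

def unreconciled_payments(conn):
    """Return ids of settled payments whose amount is missing from, or
    does not match, the corresponding ledger entry."""
    return conn.execute("""
        SELECT p.id FROM payments p
        LEFT JOIN ledger l ON l.payment_id = p.id
        WHERE p.status = 'SETTLED'
          AND (l.amount IS NULL OR l.amount <> p.amount)
    """).fetchall()

mismatches = unreconciled_payments(conn)  # empty list when data is consistent
```

A check like this would typically run as an assertion inside an automated test suite (TestNG/JUnit on the Java side, or pytest for Python utilities) against the real database after each end-to-end transaction scenario.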
Posted 2 weeks ago
0 years
0 Lacs
Kochi, Kerala, India
On-site
Job role - Design Associate Company name - HomeLane Job location - Kochi, Kerala, India (On-site) Job Description Shadow the design discussions the Senior Designer holds with clients; prepare minutes of meetings and keep track of project milestones to ensure timely, high-quality delivery. Assist the Senior Designer in 3D designs using SpaceCraft (HomeLane software) and SketchUp; recommend enhancements and be a sounding board for the Senior Designer. Be available for site visits and masking along with the Senior Designer; take on the responsibility of file management across HomeLane tech systems. Assist the Senior Designer in creating commercial proposals using SpaceCraft and other quoting tools; validate quotes to ensure customers get a transparent and fair estimate. Coordinate with various stakeholders to ensure a great design outcome; build relationships with teams like sales, drawing QC, project management and planning. Mandatory Qualifications: Design education background - B.Arch, B.Des, M.Des, Diploma in Design 0-1 year of experience in Interior Design / Architecture Good communication & presentation skills Basic knowledge of modular furniture Practical knowledge of SketchUp Location: Kochi, Kerala.
Posted 2 weeks ago