9.0 - 12.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Position Summary...

What you'll do...

This position is with the Data Platform Engineering team (People Data Platforms) under the Enterprise Business Services – People Technology organization, focusing on the enterprise area of People Systems, which is amid a massive digital transformation. The objective of the People Tech organization is to build best-in-class engineering, analytics, and data science solutions that power the best experience for our people, adhering to the Walmart philosophy: Everyday Low Cost.

The People Data team is responsible for building and maintaining the People Data Lake platform and other analytical products that aim to democratize access to HR data, enabling tech teams and business users across Walmart with relevant, timely data by streamlining the acquisition, curation, and consumption of data from various HR systems. The team supports multiple use cases focused on providing engaging employee experiences that drive global company success. The team is spread over multiple locations, and we work towards providing the best experience to Walmart associates and business stakeholders.

We are seeking an accomplished Staff Software Engineer to join our Data Platform Engineering team. This critical role is for a passionate technical leader eager to architect, design, and implement robust, scalable, and secure data platforms and pipelines. Leveraging deep expertise in big data technologies, cloud security, and data governance, you will tackle complex, ambiguous challenges across multiple teams and critical business initiatives. Beyond hands-on contributions, you will mentor engineers, champion best practices, and significantly influence the overall technical direction and culture of the organization. Staff Engineers at Walmart lead through unparalleled technical excellence, strategic thinking, and cross-functional influence.

What You'll Do:
- Define, drive, and be accountable for the technical strategy and architectural vision for the People Data Lake platform, data pipelines, and analytical products, ensuring alignment with overall business and engineering objectives.
- Lead the design and implementation of complex, multi-functional data solutions, particularly focusing on data security, access control, encryption, and privacy (e.g., Sensitive Data Protection framework, RBAC/ABAC, dynamic masking; a minimal masking sketch follows after the qualifications list below).
- Identify and evaluate critical technical challenges and opportunities across the engineering landscape, proposing innovative and impactful solutions that can span multiple teams.
- Champion and advocate for best practices in data engineering, software development, CI/CD, data quality, and operational excellence within the team and across the organization.
- Architect, develop, and maintain robust and scalable batch and streaming data pipelines using Google Dataproc, Apache Spark, Structured Streaming, and Airflow for orchestration.
- Design, develop, and maintain data security, access control, and encryption solutions within the data platform, working extensively with Google Cloud Platform (GCP) services including BigQuery, GCS, IAM, and KMS to develop secure data solutions.
- Implement and optimize data transformation, integration, and consumption layers (Medallion Architecture) to deliver high-quality, actionable data for diverse use cases.
- Design and implement secure APIs and microservices for seamless data access and integration.
- Utilize Infrastructure as Code (IaC) tools like Ansible and Terraform for automated cloud deployments and infrastructure management.
- Lead the technical implementation of data governance frameworks, policies, and standards, with a strong focus on data privacy, security (encryption, masking), and regulatory compliance (e.g., GDPR, HIPAA, SOC 2).
- Collaborate closely with data governance teams and utilize tools like Collibra, EDP, and GCP Data Catalog to ensure proper classification, metadata management, and secure handling of sensitive HR data.
- Provide expert guidance and establish best practices for the management of data assets, including data quality, retention, and accessibility.
- Serve as a primary technical mentor, providing coaching and guidance to Senior and Staff Software Engineers, fostering their growth and development in complex technical areas, architectural design, and problem-solving methodologies.
- Cultivate a culture of technical excellence, continuous learning, and innovation within the team.
- Collaborate cross-functionally with product managers, business stakeholders, data scientists, and other engineering teams to translate complex business requirements into technical strategies and deliver impactful solutions.
- Effectively manage multiple initiatives by delivering and delegating as appropriate, ensuring completion of assigned tasks, and representing tasks and technical direction to stakeholders.
- Address ambiguous, high-impact technical problems that span multiple systems or teams, driving them to resolution and enabling team productivity.
- Evaluate new technologies and approaches, recommending their adoption where they can provide significant business value or technical advantage.
- Proactively identify and resolve systemic issues in data architecture, pipelines, and processes.

What You'll Bring:
- Bachelor's degree in Computer Science, Engineering, or a related field, and 9-12 years of experience in software engineering, with a significant focus on data platforms.
- Strong programming skills in Python, Java, or Scala.
- Proven experience as a Staff Software Engineer or in a similar senior technical leadership role.
- Strong problem-solving skills and the ability to work in fast-paced environments.
- Demonstrable experience with Infrastructure as Code (IaC) tools like Ansible and Terraform for cloud deployments.
- Demonstrable experience in designing secure APIs and microservices, including common API architecture styles (REST, GraphQL, gRPC, RPC).
- Expertise in distributed data processing frameworks such as Apache Spark on Google Dataproc, Kafka, Hadoop, Flink, or Apache Beam.
- Expertise in database technologies and distributed datastores (e.g., SQL, NoSQL, MPP databases such as BigQuery).
- Strong understanding of API design principles and NFRs (scalability, maintainability, availability, reliability).
- Hands-on experience with Google Cloud Platform (GCP) and its security-related services (IAM, KMS, Cloud Audit Logs, etc.).
- Exposure to data engineering techniques, including ETL pipeline development, data ingestion, and data wrangling.
- Solid understanding of security frameworks like OAuth, OpenID Connect, and JWT-based authentication.
- Knowledge of data governance, compliance (GDPR, HIPAA, SOC 2, etc.), and regulatory requirements.
- Familiarity with encryption standards and cryptographic protocols.
- Experience with data orchestration tools such as Apache Airflow or Google Cloud Composer.
- Experience with streaming data systems like Kafka and Google Pub/Sub.
- Knowledge of containerization (e.g., Docker, Kubernetes) and how to deploy and scale data engineering workloads in cloud environments.
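As a hedged illustration of the dynamic masking capability this role calls for, here is a minimal sketch of a masked view in BigQuery using the google-cloud-bigquery client. The dataset, table, and allow-list names (hr_data.employees, salary_viewers) are hypothetical, and this is one pattern among several; production setups often use policy tags and Data Catalog instead.

```python
# Minimal sketch: column-level masking via a view in BigQuery.
# All object names below are hypothetical, not Walmart's actual schema.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

MASKED_VIEW_SQL = """
CREATE OR REPLACE VIEW `hr_data.employees_masked` AS
SELECT
  employee_id,
  department,
  -- Reveal salary only to users listed in an allow-list table.
  IF(SESSION_USER() IN (SELECT member FROM `hr_data.salary_viewers`),
     CAST(salary AS STRING), 'REDACTED') AS salary,
  -- Partially mask the national ID for everyone.
  CONCAT('XXX-XX-', SUBSTR(national_id, -4)) AS national_id
FROM `hr_data.employees`
"""

client.query(MASKED_VIEW_SQL).result()  # blocks until the DDL completes
print("Masked view created.")
```

Consumers are then granted access to the view rather than the base table, which is the essence of the RBAC-style access-control model described above.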
About Walmart Global Tech

Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That's what we do at Walmart Global Tech. We're a team of software engineers, data scientists, cybersecurity experts, and service professionals within the world's leading retailer who make an epic impact and are at the forefront of the next retail disruption. People are why we innovate, and people power our innovations. We are people-led and tech-empowered. We train our team in the skillsets of the future and bring in experts like you to help us grow. We have roles for those chasing their first opportunity as well as those looking for the opportunity that will define their career. Here, you can kickstart a great career in tech, gain new skills and experience for virtually every industry, or leverage your expertise to innovate at scale, impact millions and reimagine the future of retail.

Flexible, hybrid work

We use a hybrid way of working, with primary in-office presence coupled with an optimal mix of virtual presence. We use our campuses to collaborate and be together in person, as business needs require and for development and networking opportunities. This approach helps us make quicker decisions, remove location barriers across our global team, and be more flexible in our personal lives.

Benefits

Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include a host of best-in-class benefits: maternity and parental leave, PTO, health benefits, and much more.

Belonging

We aim to create a culture where every associate feels valued for who they are, rooted in respect for the individual. Our goal is to foster a sense of belonging, to create opportunities for all our associates, customers and suppliers, and to be a Walmart for everyone. At Walmart, our vision is "everyone included." By fostering a workplace culture where everyone is, and feels, included, everyone wins. Our associates and customers reflect the makeup of all 19 countries where we operate. By making Walmart a welcoming place where all people feel like they belong, we're able to engage associates, strengthen our business, improve our ability to serve customers, and support the communities where we operate.

Equal Opportunity Employer

Walmart, Inc., is an Equal Opportunity Employer – By Choice. We believe we are best equipped to help our associates, customers and the communities we serve live better when we really know them. That means understanding, respecting and valuing unique styles, experiences, identities, ideas and opinions, while being inclusive of all people.

Minimum Qualifications...

Outlined below are the required minimum qualifications for this position. If none are listed, there are no minimum qualifications.

Minimum Qualifications:
Option 1: Bachelor's degree in computer science, computer engineering, computer information systems, software engineering, or related area and 4 years' experience in software engineering or related area.
Option 2: 6 years' experience in software engineering or related area.

Preferred Qualifications...

Outlined below are the optional preferred qualifications for this position. If none are listed, there are no preferred qualifications.
Master's degree in Computer Science, Computer Engineering, Computer Information Systems, Software Engineering, or related area and 2 years' experience in software engineering or related area.

Primary Location: RMZ Millenia Business Park, No 143, Campus 1B (1st-6th Floor), Dr. MGR Road (North Veeranam Salai), Perungudi, India R-2211097
Posted 4 days ago
10.0 years
3 - 6 Lacs
Hyderābād
On-site
Job Requirements

About Phenom
At Phenom, our purpose is to help a billion people find the right job. We're a global HR tech company delivering AI-powered talent experience solutions for enterprise organizations. Our intelligent platform helps companies attract, engage, and retain top talent.

Role Summary
We are looking for a Principal DBOps Engineer to lead the strategy, performance, automation, and scalability of our database systems. You will be the go-to expert for everything related to database operations, including reliability, observability, automation, and infrastructure-as-code across multiple environments. This is a hands-on leadership role with strong influence on architecture, security, and database lifecycle management.

Key Responsibilities
- Design and manage highly available, scalable, and secure database architectures across production and non-production environments.
- Automate database provisioning, monitoring, backup, and recovery workflows using DevOps tools and Infrastructure-as-Code (IaC); a minimal backup-check sketch follows below.
- Partner with Engineering, DevOps, and Product teams to ensure database performance, reliability, and data integrity.
- Lead incident response and root cause analysis (RCA) for any database-related issues and outages.
- Guide the team on best practices around schema management, indexing strategies, and query performance.
- Mentor and lead a team of DB Engineers and collaborate cross-functionally with SREs, DevOps, and Cloud Architects.
- Establish data governance, auditing, and compliance protocols across multi-cloud environments (AWS, Azure, etc.).
- Evaluate and implement database observability solutions (Prometheus, Grafana, etc.).
- Optimize costs through usage monitoring, capacity planning, and right-sizing of cloud-based DB infrastructure.

Skills & Qualifications
- Bachelor's/Master's degree in Computer Science, Information Systems, or related field.
- 10+ years of experience in database administration and operations in high-scale production environments.
- Deep expertise in PostgreSQL, MySQL, MongoDB, or similar databases (relational and NoSQL).
- Proven experience in cloud-native DBOps (AWS RDS/Aurora, Azure SQL, GCP Cloud SQL, etc.).
- Strong scripting experience (Python, Bash, or Go) and use of automation frameworks (Ansible, Terraform).
- Exposure to containerization and orchestration (Kubernetes, Helm).
- Experience with CI/CD pipelines for DB changes and automated testing (Liquibase/Flyway).
- Solid understanding of database security, data masking, encryption, and access control models.
- Excellent communication, stakeholder management, and technical leadership skills.

Nice to Have
- Certifications in cloud platforms (AWS, Azure, GCP)
- Experience with multi-region replication and disaster recovery design
- Contributions to open-source DB tools or platforms

Why Phenom?
- Work with cutting-edge technologies in a high-impact role
- Be part of a fast-growing product company solving real-world problems at scale
- Culture focused on innovation, continuous learning, and collaboration
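For the backup and recovery automation this role emphasizes, here is a minimal sketch (assuming AWS RDS via boto3) that checks every production instance for a recent automated snapshot. The "prod-" naming convention and the 24-hour freshness window are assumptions for illustration.

```python
# Hedged sketch: verify each production RDS instance has a snapshot
# newer than 24 hours. Naming convention and window are assumptions.
from datetime import datetime, timedelta, timezone
import boto3

rds = boto3.client("rds")
cutoff = datetime.now(timezone.utc) - timedelta(hours=24)

for db in rds.describe_db_instances()["DBInstances"]:
    name = db["DBInstanceIdentifier"]
    if not name.startswith("prod-"):
        continue  # only audit production databases
    snaps = rds.describe_db_snapshots(
        DBInstanceIdentifier=name, SnapshotType="automated"
    )["DBSnapshots"]
    fresh = [s for s in snaps
             if s.get("SnapshotCreateTime") and s["SnapshotCreateTime"] > cutoff]
    print(f"{name}: {'OK' if fresh else 'MISSING RECENT SNAPSHOT'}")
```

A check like this would typically run on a schedule (cron, Lambda, or an orchestrator task) and feed alerts into the observability stack mentioned above.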
Posted 4 days ago
0.0 - 5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
HMX Media Pvt. Ltd. is one of the fastest growing CGI advertising studios, with a team of experienced artists and technologists who create engaging visual experiences for international blue-chip clients. We specialize in crafting immersive experiences that attract and hold audiences across various platforms. We offer a range of services that tell stories through dynamic videos, striking photography, and powerful real-time 3D interaction to stay relevant in this engaging and vibrant industry.

We are looking for a talented and skilled Photoshop Artist. Please see the details below.

Job Title: Photoshop (Retouch) Artist
Experience: 0-5 years (freshers may also apply)
Required Software: Photoshop, Illustrator
Required Skills:
- Strong hands-on Photoshop skills
- Photo retouching
- Illustrator
- Good selection and masking skills
- Strong computer skills
- Thorough knowledge of design and aesthetic principles

Roles & Responsibilities:
- Photographic retouching artists will work primarily with 3D renders.
- Enhance images by correcting resolution and composition, cropping images, and adjusting tone, color, saturation, and brightness.
- Add or remove objects from an image or insert text.
- Photographic retouch artists work under the supervision of the presiding photographer.

Job Type: Full-time
Job Location: Pune (Balewadi, On-site)
Joining: Immediate/30 Days
Posted 4 days ago
3.0 years
48 Lacs
Hyderābād
On-site
The Cloud Storage Administrator will manage and support cloud-based storage platforms in AWS and/or Azure. This role involves configuring, monitoring, and optimizing object, block, and file storage solutions to ensure high availability, performance, and data protection across our cloud infrastructure.

Required Skills
- Administer and support cloud storage services such as Amazon S3, EBS, EFS, and Glacier, and Azure Blob, File, and Archive Storage.
- Disaster mitigation design and implementation experience, with a focus on architecture for cross-region replication, backup management, RTO and RPO planning, and chaos-engineering recovery. Demonstrated use of AWS Elastic Disaster Recovery or Azure Site Recovery.
- Certification and privacy standards associated with PII, data protection, and compliance gap expectations. Ability to identify and tag PII, applying encryption and masking techniques, with knowledge and experience in compliance certification (SOC 2, ISO 27001, GDPR, etc.); demonstrated use of Amazon Macie or Azure Purview.
- Monitoring and cost optimization practices to proactively alert on performance, usage, and anomalies. Demonstrated use of AWS CloudWatch or Azure Monitor, and AWS Cost Explorer or Azure Cost Management.
- Embrace IaC and automation practices for backup, lifecycle, and archival policies (a minimal lifecycle sketch follows below). Demonstrated expertise with AWS CloudFormation or Azure DevOps and a history of using Terraform modules for cloud storage.
- Manage backup and recovery processes using native cloud tools and third-party solutions.
- Implement storage policies including lifecycle rules, replication, and access controls.
- Perform capacity planning and forecasting for storage growth and utilization.
- Collaborate with infrastructure and application teams to meet storage and data access requirements.
- Ensure storage systems comply with data protection, retention, and security standards.
- Document configurations, procedures, and best practices for storage management.
- Respond to incidents and service requests related to storage systems.
- Participate in change and incident management processes aligned with ITSM standards.

Required Experience
- 3+ years of experience in storage administration with cloud platforms (AWS, Azure, or both).
- Hands-on experience with cloud-native storage services and understanding of storage protocols.
- Experience with AWS CloudWatch and Azure Monitor, and the ability to set up proactive alerting on storage performance, usage, and anomalies.
- Strong troubleshooting and performance tuning skills related to storage.
- Familiarity with backup and disaster recovery solutions in cloud environments.
- Understanding of identity and access management as it pertains to storage services.
- Knowledge of ITSM processes such as incident, change, and problem management.
- Experience with storage cost monitoring tools like AWS Cost Explorer or Azure Cost Management.
- Knowledge of IaC tools (Terraform, CloudFormation) for provisioning storage resources and automating backup, lifecycle, and archival policies.
- Producing technical documentation.
- Exposure to enterprise backup solutions.
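As a hedged sketch of the lifecycle automation listed above, the snippet below applies an S3 lifecycle rule (transition to Glacier at 90 days, expire at 365) with boto3. The bucket name, prefix, and retention periods are hypothetical; in practice this would usually be codified in Terraform or CloudFormation.

```python
# Minimal sketch: apply an S3 lifecycle policy with boto3.
# Bucket name, prefix, and retention periods are illustrative assumptions.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-archive-bucket",  # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [{
            "ID": "archive-then-expire",
            "Status": "Enabled",
            "Filter": {"Prefix": "logs/"},
            # Move cold data to Glacier, then delete it after a year.
            "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
            "Expiration": {"Days": 365},
        }]
    },
)
print("Lifecycle policy applied.")
```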
Posted 4 days ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Do you want to help one of the most respected companies in the world reinvent its approach to data? At Thomson Reuters, we are recruiting a team of motivated data professionals to transform how we manage and leverage our commercial data assets. It is a unique opportunity to join a diverse and global team with centers of excellence in Toronto, London, and Bangalore. Are you excited about working at the forefront of the data-driven revolution that will change the way a company works?

Thomson Reuters' Data and Analytics team is seeking an experienced Lead Engineer, Test Data Management with a passion for engineering quality-assurance solutions for cloud-based data warehouse systems.

About The Role
As Lead Engineer, Test Data Management, you play a crucial role in ensuring the quality and reliability of our enterprise data systems. Your expertise in testing methods, data validation, and automation is essential to bring best-in-class standards to our data products. In this opportunity you will:
- Design test data management frameworks, apply data masking and data sub-setting, and generate synthetic data to create robust test data solutions for enterprise-wide teams (a minimal synthetic-data sketch follows below).
- Collaborate with Engineers, Database Architects, and Data Quality Stewards to build logical data models, execute data validation, and design manual and automated testing.
- Mentor and lead the testing of key data development projects related to the Data Warehouse and other systems.
- Lead engineering team members in the implementation of test data best practices and the delivery of test data solutions.
- Be a thought leader investigating leading-edge quality technology for test data management and systems functionality, including performance testing for data pipelines.
- Innovate: create ETL mappings, workflows, and functions to move data from multiple sources into target areas.
- Partner across the company with analytics teams, engineering managers, architecture teams, and others to design and agree on solutions that meet business requirements.
- Effectively communicate and liaise with other engineering groups across the organization, data consumers, and business analytic groups.
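As a hedged illustration of the synthetic-data responsibility above, here is a minimal sketch using the Faker library to generate a PII-free test fixture. The column names, file name, and row count are illustrative assumptions, not an actual Thomson Reuters schema.

```python
# Minimal sketch: generate synthetic, PII-free test data with Faker.
# Schema and volume are illustrative assumptions.
import csv
from faker import Faker

fake = Faker()
Faker.seed(42)  # fixed seed so test fixtures are reproducible

with open("synthetic_customers.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["customer_id", "name", "email", "signup_date"])
    for i in range(1000):
        writer.writerow([
            f"CUST{i:06d}",  # deterministic surrogate key
            fake.name(),     # generated values, so no real PII to mask
            fake.email(),
            fake.date_between(start_date="-3y").isoformat(),
        ])
```

Generated fixtures like this complement masking and sub-setting: instead of scrubbing production data, the test environment never contains real PII in the first place.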
Utilize your experience in the following areas:
- SQL for data querying, validation, and analysis
- Knowledge of database management systems (e.g., SQL Server, PostgreSQL, MySQL)
- Test data management tools (e.g., K2View, qTest, ALM, Zephyr)
- Proficiency in Python for test automation and data manipulation
- PySpark for big data testing
- Test case design, execution, and defect management
- AWS cloud data practices and DevOps tooling
- Performance testing for data management solutions, especially for complex data flows
- Data security, privacy, and data governance compliance principles

About You
You're a fit for the role of Lead Engineer if you have:
- 10+ years of experience as a Tester, Developer, or Data Analyst, with experience establishing end-to-end test strategies and planning for data validation, transformation, and analytics
- Advanced SQL knowledge
- Experience designing and executing test procedures and documenting best practices
- Experience planning and executing regression testing, data validation, and quality assurance
- Advanced command of data warehouse creation, management, and performance strategies
- Experience engineering and implementing data quality systems in the cloud
- Proficiency in a scripting language such as Python
- Hands-on experience with data test automation applications (preference for K2View)
- Identification and remediation of data quality issues
- Data management tools such as K2View, Immuta, Alation, and Informatica
- Agile development
- Business Intelligence and Data Warehousing concepts
- Familiarity with SAP and Salesforce systems
- Intermediate understanding of Big Data technologies
- AWS services and management, including serverless, container, queueing, and monitoring services
- Experience creating manual or automated tests on data pipelines
- Programming languages: Python
- Data interchange formats: Parquet, JSON, CSV
- Version control with GitHub
- Cloud security and compliance, privacy, GDPR

What's in it For You?
- Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry Competitive Benefits: We offer comprehensive benefit plans including flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

About Us
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information about Thomson Reuters can be found on thomsonreuters.com.
Posted 4 days ago
3.0 years
0 Lacs
Kolkata, West Bengal, India
Remote
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Staff (CTM – Threat Detection & Response)

Key Capabilities:
- Experience in working with Splunk Enterprise, Splunk Enterprise Security & Splunk UEBA
- Minimum of Splunk Power User Certification
- Good knowledge of programming or scripting languages such as Python (preferred), JavaScript (preferred), Bash, PowerShell, etc.
- Assist in remote and on-site gap assessments of the SIEM solution
- Work on defined evaluation criteria and approach based on the client requirements and scope, factoring in industry best practices and regulations
- Assist in interviews with stakeholders and review documents (SOPs, architecture diagrams, etc.)
- Assist in evaluating the SIEM based on the defined criteria and prepare audit reports
- Good experience in providing consulting to customers during the testing, evaluation, pilot, production, and training phases to ensure a successful deployment
- Experience in onboarding data into Splunk from various sources, including unsupported (in-house built) ones, by creating custom parsers
- Verification of data of log sources in the SIEM, following the Common Information Model (CIM)
- Experience in parsing and masking of data prior to ingestion in the SIEM (a minimal masking sketch follows below)
- Provide support for the data collection, processing, analysis, and operational reporting systems, including planning, installation, configuration, testing, troubleshooting, and problem resolution
- Assist clients to fully optimize the SIEM system capabilities as well as the audit and logging features of the event log sources
- Assist clients with technical guidance to configure their in-scope log sources to be integrated into the SIEM
- Experience in SIEM content development, which includes:
  - Hands-on experience in development and customization of Splunk Apps & Add-Ons
  - Building advanced visualizations (interactive drilldowns, glass tables, etc.)
  - Building and integrating contextual data into notable events
- Experience in creating use cases under the Cyber Kill Chain and MITRE ATT&CK frameworks
- Capability in developing advanced dashboards (with CSS, JavaScript, HTML, XML) and reports that can provide near real-time visibility into the performance of client applications
- Sound knowledge of configuring alerts and reports
- Good exposure to automatic lookups, data models, and creating complex SPL queries
- Create, modify, and tune SIEM rules to adjust the specifications of alerts and incidents to meet client requirements
- Experience in creating custom commands, custom alert actions, adaptive response actions, etc.

Qualification & Experience:
- Minimum of 3 years' experience in Splunk and 3 to 5 years of overall experience, with knowledge of operating systems and basic network technologies
- Experience in a SOC as an L1/L2 analyst will be an added advantage
- Strong oral, written, and listening skills are an essential component of effective consulting
- Good to have: knowledge of vulnerability management, Windows domains, trusts, GPOs, server roles, Windows security policies, user administration, Linux security, and troubleshooting
- Certification in any other SIEM solution such as IBM QRadar, Exabeam, or Securonix will be an added advantage
- Certifications in a core security-related discipline (CEH, Security+, etc.) will be an added advantage
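As a hedged sketch of the pre-ingestion masking capability above, the snippet below redacts common PII patterns from raw log lines before they reach the SIEM. The regexes are illustrative assumptions; within Splunk itself this is often handled with SEDCMD in props.conf or an ingest pipeline instead.

```python
# Minimal sketch: mask PII in log lines before SIEM ingestion.
# Patterns are illustrative assumptions, not a complete PII taxonomy.
import re

PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "XXX-XX-XXXX"),           # SSN-style IDs
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "<masked-email>"),  # email addresses
    (re.compile(r"\b(?:\d[ -]?){13,16}\b"), "<masked-card>"),        # card-like numbers
]

def mask(line: str) -> str:
    """Apply each masking pattern in order to a raw log line."""
    for pattern, replacement in PATTERNS:
        line = pattern.sub(replacement, line)
    return line

print(mask("user=jdoe@example.com ssn=123-45-6789 action=login"))
# -> user=<masked-email> ssn=XXX-XX-XXXX action=login
```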
EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 4 days ago
3.0 years
0 Lacs
Kanayannur, Kerala, India
Remote
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Staff (CTM – Threat Detection & Response)

Key Capabilities:
- Experience in working with Splunk Enterprise, Splunk Enterprise Security & Splunk UEBA
- Minimum of Splunk Power User Certification
- Good knowledge of programming or scripting languages such as Python (preferred), JavaScript (preferred), Bash, PowerShell, etc.
- Assist in remote and on-site gap assessments of the SIEM solution
- Work on defined evaluation criteria and approach based on the client requirements and scope, factoring in industry best practices and regulations
- Assist in interviews with stakeholders and review documents (SOPs, architecture diagrams, etc.)
- Assist in evaluating the SIEM based on the defined criteria and prepare audit reports
- Good experience in providing consulting to customers during the testing, evaluation, pilot, production, and training phases to ensure a successful deployment
- Experience in onboarding data into Splunk from various sources, including unsupported (in-house built) ones, by creating custom parsers
- Verification of data of log sources in the SIEM, following the Common Information Model (CIM)
- Experience in parsing and masking of data prior to ingestion in the SIEM
- Provide support for the data collection, processing, analysis, and operational reporting systems, including planning, installation, configuration, testing, troubleshooting, and problem resolution
- Assist clients to fully optimize the SIEM system capabilities as well as the audit and logging features of the event log sources
- Assist clients with technical guidance to configure their in-scope log sources to be integrated into the SIEM
- Experience in SIEM content development, which includes:
  - Hands-on experience in development and customization of Splunk Apps & Add-Ons
  - Building advanced visualizations (interactive drilldowns, glass tables, etc.)
  - Building and integrating contextual data into notable events
- Experience in creating use cases under the Cyber Kill Chain and MITRE ATT&CK frameworks
- Capability in developing advanced dashboards (with CSS, JavaScript, HTML, XML) and reports that can provide near real-time visibility into the performance of client applications
- Sound knowledge of configuring alerts and reports
- Good exposure to automatic lookups, data models, and creating complex SPL queries
- Create, modify, and tune SIEM rules to adjust the specifications of alerts and incidents to meet client requirements
- Experience in creating custom commands, custom alert actions, adaptive response actions, etc.

Qualification & Experience:
- Minimum of 3 years' experience in Splunk and 3 to 5 years of overall experience, with knowledge of operating systems and basic network technologies
- Experience in a SOC as an L1/L2 analyst will be an added advantage
- Strong oral, written, and listening skills are an essential component of effective consulting
- Good to have: knowledge of vulnerability management, Windows domains, trusts, GPOs, server roles, Windows security policies, user administration, Linux security, and troubleshooting
- Certification in any other SIEM solution such as IBM QRadar, Exabeam, or Securonix will be an added advantage
- Certifications in a core security-related discipline (CEH, Security+, etc.) will be an added advantage
EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 4 days ago
3.0 years
0 Lacs
Trivandrum, Kerala, India
Remote
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Staff (CTM – Threat Detection & Response)

Key Capabilities:
- Experience in working with Splunk Enterprise, Splunk Enterprise Security & Splunk UEBA
- Minimum of Splunk Power User Certification
- Good knowledge of programming or scripting languages such as Python (preferred), JavaScript (preferred), Bash, PowerShell, etc.
- Assist in remote and on-site gap assessments of the SIEM solution
- Work on defined evaluation criteria and approach based on the client requirements and scope, factoring in industry best practices and regulations
- Assist in interviews with stakeholders and review documents (SOPs, architecture diagrams, etc.)
- Assist in evaluating the SIEM based on the defined criteria and prepare audit reports
- Good experience in providing consulting to customers during the testing, evaluation, pilot, production, and training phases to ensure a successful deployment
- Experience in onboarding data into Splunk from various sources, including unsupported (in-house built) ones, by creating custom parsers
- Verification of data of log sources in the SIEM, following the Common Information Model (CIM)
- Experience in parsing and masking of data prior to ingestion in the SIEM
- Provide support for the data collection, processing, analysis, and operational reporting systems, including planning, installation, configuration, testing, troubleshooting, and problem resolution
- Assist clients to fully optimize the SIEM system capabilities as well as the audit and logging features of the event log sources
- Assist clients with technical guidance to configure their in-scope log sources to be integrated into the SIEM
- Experience in SIEM content development, which includes:
  - Hands-on experience in development and customization of Splunk Apps & Add-Ons
  - Building advanced visualizations (interactive drilldowns, glass tables, etc.)
  - Building and integrating contextual data into notable events
- Experience in creating use cases under the Cyber Kill Chain and MITRE ATT&CK frameworks
- Capability in developing advanced dashboards (with CSS, JavaScript, HTML, XML) and reports that can provide near real-time visibility into the performance of client applications
- Sound knowledge of configuring alerts and reports
- Good exposure to automatic lookups, data models, and creating complex SPL queries
- Create, modify, and tune SIEM rules to adjust the specifications of alerts and incidents to meet client requirements
- Experience in creating custom commands, custom alert actions, adaptive response actions, etc.

Qualification & Experience:
- Minimum of 3 years' experience in Splunk and 3 to 5 years of overall experience, with knowledge of operating systems and basic network technologies
- Experience in a SOC as an L1/L2 analyst will be an added advantage
- Strong oral, written, and listening skills are an essential component of effective consulting
- Good to have: knowledge of vulnerability management, Windows domains, trusts, GPOs, server roles, Windows security policies, user administration, Linux security, and troubleshooting
- Certification in any other SIEM solution such as IBM QRadar, Exabeam, or Securonix will be an added advantage
- Certifications in a core security-related discipline (CEH, Security+, etc.) will be an added advantage
EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 4 days ago
0 years
20 - 25 Lacs
Pune, Maharashtra, India
On-site
We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands, delivering compliant, high-impact solutions at enterprise scale.

Role & Responsibilities
- Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, stored procedures) to meet complex commercial-analytics workloads.
- Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
- Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
- Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management (a minimal masking-policy sketch follows below).
- Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
- Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications
Must-Have:
- 7+ years of data-engineering / warehousing experience, including 4+ years of hands-on Snowflake design & development.
- Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimisation skills.
- Proficiency in Python (or similar) for automation, API integrations, and orchestration.
- Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
- Bachelor's in Computer Science, Engineering, or Information Systems (Master's preferred).
- Strong client-facing communication and problem-solving ability in fast-paced, agile environments.
Preferred:
- Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
- Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion/DBT/Airflow and Git.
- Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.

Skills: Data, Analytics, Snowflake, Sales, Cloud, AWS, Azure
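As a hedged illustration of the fine-grained masking this role calls for, here is a minimal sketch that creates and applies a Snowflake dynamic masking policy through the Python connector. The connection parameters, role name, and patients table are all assumptions for illustration.

```python
# Minimal sketch: define and attach a Snowflake dynamic masking policy.
# Connection details, role, and table names are illustrative assumptions.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account", user="example_user", password="...",
    warehouse="ANALYTICS_WH", database="COMMERCIAL", schema="CURATED",
)
cur = conn.cursor()

# Only a compliance role sees the raw value; everyone else gets asterisks.
cur.execute("""
CREATE OR REPLACE MASKING POLICY mask_patient_id AS (val STRING)
RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('COMPLIANCE_ANALYST') THEN val
    ELSE REGEXP_REPLACE(val, '.', '*')
  END
""")
cur.execute(
    "ALTER TABLE patients MODIFY COLUMN patient_id "
    "SET MASKING POLICY mask_patient_id"
)
conn.close()
```

Because the policy is evaluated at query time against CURRENT_ROLE(), one table serves both compliance analysts and general users without duplicating data.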
Posted 4 days ago
0 years
20 - 25 Lacs
Pune, Maharashtra, India
On-site
We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands, delivering compliant, high-impact solutions at enterprise scale.

Role & Responsibilities
- Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, stored procedures) to meet complex commercial-analytics workloads.
- Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
- Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
- Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
- Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
- Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications
Must-Have:
- 7+ years of data-engineering / warehousing experience, including 4+ years of hands-on Snowflake design & development.
- Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimisation skills.
- Proficiency in Python (or similar) for automation, API integrations, and orchestration.
- Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
- Bachelor's in Computer Science, Engineering, or Information Systems (Master's preferred).
- Strong client-facing communication and problem-solving ability in fast-paced, agile environments.
Preferred:
- Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
- Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion/DBT/Airflow and Git.
- Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.

Skills: AWS, Analytics, Sales, SQL, Data, Snowflake, ETL/ELT Optimization, Python, Data Warehousing, Azure, Data Modeling, Data Governance, Cloud
Posted 4 days ago
0 years
20 - 25 Lacs
Thane, Maharashtra, India
On-site
We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands, delivering compliant, high-impact solutions at enterprise scale.

Role & Responsibilities
- Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, stored procedures) to meet complex commercial-analytics workloads.
- Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
- Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
- Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
- Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
- Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications
Must-Have:
- 7+ years of data-engineering / warehousing experience, including 4+ years of hands-on Snowflake design & development.
- Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimisation skills.
- Proficiency in Python (or similar) for automation, API integrations, and orchestration.
- Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
- Bachelor's in Computer Science, Engineering, or Information Systems (Master's preferred).
- Strong client-facing communication and problem-solving ability in fast-paced, agile environments.
Preferred:
- Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
- Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion/DBT/Airflow and Git.
- Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.

Skills: Data, Analytics, Snowflake, Sales, Cloud, AWS, Azure
Posted 4 days ago
0 years
20 - 25 Lacs
Mumbai Metropolitan Region
On-site
We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands, delivering compliant, high-impact solutions at enterprise scale.

Role & Responsibilities
- Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, stored procedures) to meet complex commercial-analytics workloads.
- Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
- Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
- Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
- Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
- Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications
Must-Have:
- 7+ years of data-engineering / warehousing experience, including 4+ years of hands-on Snowflake design & development.
- Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimisation skills.
- Proficiency in Python (or similar) for automation, API integrations, and orchestration.
- Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
- Bachelor's in Computer Science, Engineering, or Information Systems (Master's preferred).
- Strong client-facing communication and problem-solving ability in fast-paced, agile environments.
Preferred:
- Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
- Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion/DBT/Airflow and Git.
- Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.

Skills: Data, Analytics, Snowflake, Sales, Cloud, AWS, Azure
Posted 4 days ago
0 years
20 - 25 Lacs
Thane, Maharashtra, India
On-site
We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands, delivering compliant, high-impact solutions at enterprise scale.

Role & Responsibilities
- Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, stored procedures) to meet complex commercial-analytics workloads.
- Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
- Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
- Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
- Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
- Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications
Must-Have:
- 7+ years of data-engineering / warehousing experience, including 4+ years of hands-on Snowflake design & development.
- Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimisation skills.
- Proficiency in Python (or similar) for automation, API integrations, and orchestration.
- Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
- Bachelor's in Computer Science, Engineering, or Information Systems (Master's preferred).
- Strong client-facing communication and problem-solving ability in fast-paced, agile environments.
Preferred:
- Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
- Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion/DBT/Airflow and Git.
- Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.

Skills: Data, Analytics, Snowflake, Sales, Cloud, AWS, Azure
Posted 4 days ago
0 years
20 - 25 Lacs
Nashik, Maharashtra, India
On-site
We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands, delivering compliant, high-impact solutions at enterprise scale.

Role & Responsibilities
- Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, stored procedures) to meet complex commercial-analytics workloads.
- Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
- Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
- Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
- Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
- Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications
Must-Have:
- 7+ years of data-engineering / warehousing experience, including 4+ years of hands-on Snowflake design & development.
- Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimisation skills.
- Proficiency in Python (or similar) for automation, API integrations, and orchestration.
- Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
- Bachelor's in Computer Science, Engineering, or Information Systems (Master's preferred).
- Strong client-facing communication and problem-solving ability in fast-paced, agile environments.
Preferred:
- Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
- Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion/DBT/Airflow and Git.
- Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.

Skills: Data, Analytics, Snowflake, Sales, Cloud, AWS, Azure
Posted 4 days ago
0 years
20 - 25 Lacs
Nashik, Maharashtra, India
On-site
We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands, delivering compliant, high-impact solutions at enterprise scale.

Role & Responsibilities
- Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, stored procedures) to meet complex commercial-analytics workloads.
- Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
- Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
- Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
- Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
- Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications
Must-Have:
- 7+ years of data-engineering / warehousing experience, including 4+ years of hands-on Snowflake design & development.
- Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimisation skills.
- Proficiency in Python (or similar) for automation, API integrations, and orchestration.
- Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
- Bachelor's in Computer Science, Engineering, or Information Systems (Master's preferred).
- Strong client-facing communication and problem-solving ability in fast-paced, agile environments.
Preferred:
- Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
- Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion/DBT/Airflow and Git.
- Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.

Skills: Data, Analytics, Snowflake, Sales, Cloud, AWS, Azure
Posted 4 days ago
0 years
20 - 25 Lacs
Solapur, Maharashtra, India
On-site
We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands, delivering compliant, high-impact solutions at enterprise scale.

Role & Responsibilities
- Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, stored procedures) to meet complex commercial-analytics workloads.
- Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
- Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
- Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
- Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
- Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications
Must-Have:
- 7+ years of data-engineering / warehousing experience, including 4+ years of hands-on Snowflake design & development.
- Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimisation skills.
- Proficiency in Python (or similar) for automation, API integrations, and orchestration.
- Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
- Bachelor's in Computer Science, Engineering, or Information Systems (Master's preferred).
- Strong client-facing communication and problem-solving ability in fast-paced, agile environments.
Preferred:
- Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
- Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion/DBT/Airflow and Git.
- Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.

Skills: Data, Analytics, Snowflake, Sales, Cloud, AWS, Azure
Posted 4 days ago
0 years
20 - 25 Lacs
Solapur, Maharashtra, India
On-site
We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands—delivering compliant, high-impact solutions at enterprise scale. Role & Responsibilities Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, Stored Procedures) to meet complex commercial-analytics workloads. Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality. Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use-cases. Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management. Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency. Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights. Skills & Qualifications Must-Have 7+ yrs data-engineering / warehousing experience, incl. 4+ yrs hands-on Snowflake design & development. Expert‐level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimisation skills. Proficiency in Python (or similar) for automation, API integrations, and orchestration. Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII). Bachelor’s in Computer Science, Engineering, Information Systems (Master’s preferred). Strong client-facing communication and problem-solving ability in fast-paced, agile environments. Preferred Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs). Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion/DBT/Airflow, Git. Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how. Skills: Data,Analytics,Snowflake,Sales,Cloud,AWS,Azure
Posted 4 days ago
0 years
2 - 3 Lacs
India
On-site
Job Title: Painter
Location: Serilingampally
Salary: ₹20,000 - ₹30,000 per month
Job Description: We are looking for Painters to join our team at the Serilingampally location.
Key Responsibilities:
Prepare surfaces for painting (cleaning, sanding, filling cracks and holes).
Mix, match, and apply paints and finishes as per specifications.
Apply primer, paints, varnishes, or other finishes using brushes, rollers, or spray guns.
Protect surrounding areas using drop cloths or masking tape.
Ensure high-quality finishing and attention to detail.
Follow safety protocols and use protective equipment.
Clean up after completing work and maintain tools and equipment.
Requirements:
Proven experience as a painter (residential, commercial, or industrial).
Knowledge of various painting techniques and materials.
Good physical condition and ability to work at heights if required.
Attention to detail and precision.
Ability to work independently and as part of a team.
Education: No formal education required. Relevant experience is mandatory.
Job Types: Full-time, Permanent
Pay: ₹20,000.00 - ₹30,000.00 per month
Benefits: Health insurance, Provident Fund
Schedule: Day shift
Supplemental Pay: Performance bonus
Work Location: In person
Posted 5 days ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Title: Data Warehouse Administrator
Job Summary
We are seeking an experienced Data Warehouse Administrator with strong expertise in Snowflake to manage, monitor, and optimize our enterprise data warehousing environment. The ideal candidate will be responsible for implementing and maintaining secure, scalable, and high-performance Snowflake solutions while ensuring data availability and reliability.
Key Responsibilities
Snowflake Administration: Manage Snowflake accounts, warehouses, databases, roles, and users. Monitor performance and resource usage, and optimize warehouse configurations. Handle data replication, failover, and disaster-recovery setup.
Data Management & Security: Implement security best practices: RBAC, masking, encryption. Support data governance and compliance requirements (e.g., GDPR, HIPAA).
ETL/ELT & Data Integration Support: Work closely with data engineers to support data pipelines and transformations. Manage integrations between Snowflake and tools like DBT, Fivetran, Airflow, etc.
Monitoring & Troubleshooting: Proactively identify performance bottlenecks and resolve issues. Implement alerts, usage monitoring, and cost tracking in Snowflake (see the sketch after this listing).
Upgrades & Maintenance: Stay current with Snowflake updates and implement new features. Schedule and manage routine maintenance, backups, and data archiving.
Documentation & Support: Create and maintain system documentation, runbooks, and best practices. Provide L2/L3 support for data warehouse-related issues.
Required Skills & Qualifications
Bachelor's degree in Computer Science, Information Systems, or related field.
3-5+ years of experience with data warehouse administration.
2+ years of hands-on experience with Snowflake.
Proficiency in SQL, scripting (Python or Bash), and version control (Git).
Experience with cloud platforms (AWS, Azure, or GCP).
Familiarity with data modeling, ELT frameworks, and CI/CD practices.
Preferred Qualifications
Snowflake certifications (e.g., SnowPro Core/Advanced).
Experience with tools like DBT, Airflow, Fivetran, or Matillion.
Exposure to data cataloging and data governance tools (e.g., Collibra, Alation).
Soft Skills
Strong problem-solving and analytical skills.
Effective communication with technical and non-technical teams.
Ability to work independently and in a team-oriented environment. (ref:hirist.tech)
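As a hedged illustration of the usage-monitoring and cost-tracking duty above, this sketch queries Snowflake's built-in SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY view from Python to summarize credit consumption per warehouse over the trailing week. The connection details are placeholders, and a privileged role is assumed:

import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account="my_account", user="dwh_admin", password="***",
    role="ACCOUNTADMIN",  # ACCOUNT_USAGE views require a privileged role
)

# Credits consumed per warehouse over the last 7 days, highest first.
CREDIT_QUERY = """
    SELECT warehouse_name,
           ROUND(SUM(credits_used), 2) AS credits_7d
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    ORDER BY credits_7d DESC
"""

with conn.cursor() as cur:
    for name, credits in cur.execute(CREDIT_QUERY):
        print(f"{name}: {credits} credits")  # could feed an alerting pipeline
conn.close()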
Posted 5 days ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Key Responsibilities
Design, develop, and deploy conversational agents using platforms like Google Dialogflow, Amelia, Amazon Lex, etc.
Create and optimize NLP/NLU models to support dynamic, multi-turn interactions.
Develop dialog flows, intents, entities, and fulfillment logic with API integrations (see the webhook sketch after this listing).
Integrate bots with external systems (CRM, EHR, contact center, databases) using RESTful APIs.
Collaborate with UX designers, business analysts, and stakeholders to define and refine conversation flows.
Implement voice interface capabilities (SIP/SBC integration, TTS/STT engines).
Conduct testing (unit, regression, UAT) and optimize bot performance using analytics and user feedback.
Ensure compliance with data privacy standards (e.g., HIPAA, GDPR) and implement masking/redaction as needed.
Document architecture, workflows, and development best practices.
Required Skills and Qualifications:
Bachelor's degree in Computer Science, Engineering, or related field.
3+ years of experience in Conversational AI development.
Proficiency with one or more platforms: Google Dialogflow, Amelia.ai, IBM Watson, Microsoft Bot Framework, Rasa, or similar.
Strong understanding of NLU/NLP concepts and tools.
Hands-on experience in REST API integration and backend scripting (Node.js, Python, or Java).
Familiarity with telephony integrations (SIP, Twilio, Avaya, Genesys) is a plus.
Experience with TTS/STT engines like Google Cloud, Nuance, or Amazon Polly.
Strong debugging and problem-solving skills.
Excellent communication and collaboration abilities.
Preferred Qualifications:
Certification in any Conversational AI platform (e.g., Dialogflow CX, Amelia Certified Developer).
Experience with analytics tools for bot performance monitoring.
Exposure to agentic AI design patterns or multi-agent systems.
Understanding of AI governance and bias mitigation practices.
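For the fulfillment-logic item flagged above, here is a minimal sketch of a Dialogflow ES fulfillment webhook in Python/Flask: it reads the intent name from the standard queryResult payload and returns a fulfillmentText reply. The intent name CheckOrderStatus and the lookup helper are hypothetical, introduced only for illustration:

from flask import Flask, request, jsonify

app = Flask(__name__)

def lookup_order_status(order_id: str) -> str:
    # Hypothetical backend call (CRM/EHR/database), stubbed for the sketch.
    return "shipped"

@app.route("/webhook", methods=["POST"])
def webhook():
    req = request.get_json(silent=True) or {}
    query_result = req.get("queryResult", {})
    intent = query_result.get("intent", {}).get("displayName", "")
    params = query_result.get("parameters", {})

    if intent == "CheckOrderStatus":  # hypothetical intent name
        status = lookup_order_status(params.get("order_id", ""))
        reply = f"Your order is currently {status}."
    else:
        reply = "Sorry, I can't help with that yet."

    # Dialogflow ES expects a JSON body containing fulfillmentText.
    return jsonify({"fulfillmentText": reply})

if __name__ == "__main__":
    app.run(port=8080)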
Posted 5 days ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: Security Engineer III
Work Office: 5 days
Location: Gurgaon
About Us: Nykaa is a leading e-commerce platform that combines fashion and technology to deliver a seamless shopping experience. To fortify our commitment to security, we are seeking a dedicated Cyber Security engineer to join our team. If you have a strong background in securing infrastructure and are passionate about protecting e-commerce platforms, we encourage you to apply.
Job Overview: We are looking for a talented and forward-thinking Cybersecurity Engineer to join our team. This role focuses on advancing our security infrastructure through cloud security, perimeter defenses, and cutting-edge security engineering practices powered by Artificial Intelligence (AI) and Machine Learning (ML). The ideal candidate will bring expertise in leveraging new technologies to enhance threat detection, automate responses, and predict vulnerabilities across cloud and network environments.
Key Responsibilities:
● Security Engineering:
○ Build and integrate security solutions, such as firewalls, encryption tools, and intrusion detection systems, to protect critical infrastructure and data.
○ Collaborate with development teams to integrate security measures into the software development lifecycle (SDLC).
○ Ensure automation of security workflows, vulnerability management, and incident response processes.
○ Lead security initiatives to address evolving threat landscapes, ensuring systems are resilient against emerging cyber risks.
● Cloud Security:
○ Design, implement, and manage secure cloud architectures for platforms such as AWS, Azure, and Google Cloud.
○ Utilize AI/ML-driven security tools to enhance cloud monitoring, incident response, and threat detection in cloud environments.
○ Ensure secure cloud infrastructure by automating security configurations and leveraging AI for predictive vulnerability assessments.
○ Work with DevOps and infrastructure teams to implement automated security controls.
● Data Protection Controls:
○ Design and manage data encryption, tokenization, and masking practices to protect sensitive data both at rest and in transit.
○ Design and enforce data classification schemes, access controls, and data retention policies to mitigate risks to sensitive information.
○ Monitor and enforce security controls related to data handling, ensuring data is securely stored, processed, and transmitted in accordance with best practices.
● Collaboration & Reporting:
○ Work closely with cross-functional teams to embed AI-powered security practices in development pipelines, system architecture, and cloud-based environments.
○ Provide detailed insights and reports on AI/ML-driven security improvements, potential risks, and recommended mitigations to management and stakeholders.
○ Assist in creating and updating security policies, procedures, and standards to ensure they reflect emerging AI/ML technologies and best practices.
○ Conduct training and workshops for other security teams on AI/ML techniques in security operations.
Required Skills & Qualifications:
● 4+ years of experience in cybersecurity with a strong focus on cloud security, perimeter security, and security engineering.
● Practical experience with cloud platforms (AWS, Azure, Google Cloud) and security services (IAM, encryption, security groups, etc.).
● Strong understanding of network security protocols (e.g., firewalls, VPNs, IDS/IPS) and their integration with AI/ML models for enhanced defense.
● Hands-on experience with AI/ML techniques for cybersecurity applications, including supervised and unsupervised learning, anomaly detection, and threat classification (see the sketch after this listing).
● Proficiency in programming and scripting languages (Python, R, TensorFlow, Keras, or similar AI/ML tools).
● Familiarity with cloud-native security tools that leverage AI/ML for threat detection (e.g., AWS GuardDuty, Azure Sentinel).
● Experience with threat intelligence, vulnerability management, and incident response frameworks.
● Experience in building and deploying security models in automated, scalable environments.
This role is perfect for a cybersecurity professional who is passionate about leveraging AI and machine learning to revolutionize security operations, proactively defending cloud environments and networks against emerging threats. If you're eager to work with advanced technologies to secure the future of digital infrastructures, we'd love to hear from you!
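To ground the anomaly-detection requirement flagged above, here is a small, self-contained sketch using scikit-learn's IsolationForest on synthetic login telemetry. The feature set, contamination rate, and data are illustrative assumptions for the technique, not a description of any real stack:

import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Synthetic features per event: [requests_per_min, bytes_out_kb, distinct_ips]
normal = rng.normal(loc=[20, 150, 1], scale=[5, 30, 0.5], size=(1000, 3))
attacks = rng.normal(loc=[300, 5000, 12], scale=[50, 500, 3], size=(10, 3))
events = np.vstack([normal, attacks])

# contamination is the assumed share of anomalous traffic.
model = IsolationForest(contamination=0.01, random_state=0).fit(events)
labels = model.predict(events)  # +1 = normal, -1 = anomaly

flagged = np.where(labels == -1)[0]
print(f"flagged {len(flagged)} of {len(events)} events as anomalous")

In practice the same pattern applies to real features extracted from VPC flow logs or authentication logs, with the flagged indices routed to an incident-response queue.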
Posted 5 days ago
0 years
2 - 3 Lacs
Thrissur
On-site
Company Description
Redlands Ashlyn Motors Plc, a division of the Redlands Ashlyn group of companies, is dedicated to enhancing productivity in the Indian agriculture sector through advanced mechanization. We manufacture a range of agricultural machinery, including harvester combines, straw balers, rice transplanters, muck trucks, tillers, and weed cutters. Our fully equipped factory and fabrication workshop are located in Malumachampatty, Coimbatore, Tamilnadu, India. Our mission is to provide innovative, user-friendly, and affordable agricultural equipment to accelerate the mechanization of Indian agriculture.
Role Description
This is a full-time on-site role for an Automotive Painter located in Thrissur. The Automotive Painter will be responsible for preparing vehicles and machinery for painting, applying paint using various techniques, and ensuring high-quality finishes. Day-to-day tasks include sanding and masking surfaces, mixing and applying paint, inspecting painted surfaces for quality, and maintaining painting equipment and work areas.
Qualifications
Proficiency in automotive painting techniques, including spray painting
Experience in surface preparation, sanding, masking, and paint mixing
Knowledge of paint types, finishes, and application methods
Attention to detail and a strong focus on quality and workmanship
Ability to follow safety protocols and maintain a clean work environment
Prior experience in automotive or machinery painting is preferred
High school diploma or equivalent
Job Type: Full-time
Pay: ₹20,000.00 - ₹25,000.00 per month
Benefits: Health insurance
Schedule: Day shift, Morning shift
Work Location: In person
Posted 6 days ago
50.0 years
0 Lacs
Ranjangaon
On-site
At Jabil we strive to make ANYTHING POSSIBLE and EVERYTHING BETTER. We are proud to be a trusted partner for the world's top brands, offering comprehensive engineering, manufacturing, and supply chain solutions. With over 50 years of experience across industries and a vast network of over 100 sites worldwide, Jabil combines global reach with local expertise to deliver both scalable and customized solutions. Our commitment extends beyond business success as we strive to build sustainable processes that minimize environmental impact and foster vibrant and diverse communities around the globe.
JOB SUMMARY
Coordinate tasks with other Manufacturing staff to fulfill customer requirements such as the paint process, paint specification, and aesthetic appearance of painted parts, while adhering to the safety requirements of hazardous operations and maintaining consistent quality to customer specifications.
ESSENTIAL DUTIES AND RESPONSIBILITIES
GENERAL DUTIES:
Works under direct, close supervision, with output monitored frequently.
Follows mostly routine, standardized procedures to accomplish assigned tasks. May be exposed to more advanced functions as part of training and development.
Selects from a variety of established procedures to perform assigned duties.
Resolves routine questions and problems, referring more complex issues to higher levels. Errors can cause minor delay, expense, and disruption.
Assembles finished units per customer specifications.
Coordinates with teammates to organize tasks requiring multiple team members to accomplish.
Utilizes manual and automated lifting devices while adhering to product safety specifications.
Provides information and coordinates action plans at cross-functional meetings and communicates issues with team members and/or visitors to drive corrective actions.
Must be able to work overtime as required and respond to conflicting deadlines, changing priorities, and continuous interruptions.
Organizes and maintains spare-parts inventory and orders spare parts as needed to fill customer orders.
Assists in area organization per 5S attributes. Keeps abreast of spare-parts inventory locations for ease of order fulfillment.
Performs preventive maintenance on area tooling according to schedules, following preventive maintenance procedural requirements to ensure audit compliance.
May perform other duties and responsibilities as assigned.
Coating MAY be a responsibility within this job. If Coating is a responsibility, the following duties apply:
KEY DUTIES SUPPORTING COATING:
Perform manual conformal coating of product per required specifications.
Prepare assemblies for automated coating processes and operate equipment as needed.
Maintain spray equipment (spray guns, booths, stripping area).
Ensure assemblies and components are properly handled and marked.
Accurately maintain daily thickness logs and MES record keeping.
Utilize bar-code scanners and small hand tools.
Inspect assemblies visually for proper masking application and placement of required materials.
Work under direct, close supervision of the manufacturing supervisor or, in his/her absence, a Group Leader or other assigned management.
Follow detailed written or verbal instructions, including visual aids.
Ensure that the assigned area is clean and organized per 5S standards.
Adhere to all safety and health rules and regulations associated with this position and as directed by the supervisor.
Comply with and follow all procedures within the company security policy.
JOB QUALIFICATIONS
KNOWLEDGE REQUIREMENTS
Ability to effectively present information and respond to questions from groups of managers, clients, customers, and the general public.
Ability to define problems, collect data, establish facts, and draw valid conclusions.
Ability to operate a personal computer, including using a Windows-based operating system and related software. Advanced PC skills, including training and knowledge of Jabil's software packages.
Ability to write simple correspondence. Ability to read and understand visual aids.
Ability to apply common-sense understanding to carry out simple one- or two-step instructions, and to deal with standardized situations with only occasional or no variables.
Ability to read and comprehend simple instructions, short correspondence, and memos.
Ability to add, subtract, multiply, and divide in all units of measure, using whole numbers, common fractions, and decimals. Ability to compute rate, ratio, and percent and to draw and interpret graphs.
BE AWARE OF FRAUD: When applying for a job at Jabil you will be contacted via correspondence through our official job portal with a jabil.com e-mail address; a direct phone call from a member of the Jabil team; or a direct e-mail with a jabil.com e-mail address. Jabil does not request payments for interviews or at any other point during the hiring process. Jabil will not ask for your personal identifying information such as a social security number, birth certificate, financial institution, driver's license number or passport information over the phone or via e-mail. If you believe you are a victim of identity theft, contact your local police department. Any scam job listings should be reported to the website on which they were posted.
Jabil, including its subsidiaries, is an equal opportunity employer and considers qualified applicants for employment without regard to race, color, religion, national origin, sex, sexual orientation, gender identity, age, disability, genetic information, veteran status, or any other characteristic protected by law.
Accessibility Accommodation
If you are a qualified individual with a disability, you have the right to request a reasonable accommodation if you are unable or limited in your ability to use or access the Jabil.com/Careers site as a result of your disability. You can request a reasonable accommodation by sending an e-mail to Always_Accessible@Jabil.com with the nature of your request and contact information. Please do not direct any other general employment-related questions to this e-mail. Please note that only those inquiries concerning a request for reasonable accommodation will be responded to.
#whereyoubelong
Posted 6 days ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About Atos
Atos is a global leader in digital transformation and the European number one in Cloud, Cybersecurity and High-Performance Computing. The Group provides end-to-end Orchestrated Hybrid Cloud, Big Data, Business Applications and Digital Workplace solutions, is the Worldwide Information Technology Partner for the Olympic & Paralympic Games, and operates under the brands Atos, Atos Syntel, and Unify. Atos is an SE (Societas Europaea), listed on the CAC40 Paris stock index.
The purpose of Atos is to help design the future of the information space. Its expertise and services support the development of knowledge, education and research in a multicultural approach and contribute to the development of scientific and technological excellence. Across the world, the Group enables its customers, its employees, and members of society at large to live, work and develop sustainably in a safe and secure information space.
Role Overview
The Technical Architect - Snowflake designs, implements and optimizes scalable data warehousing solutions. The jobholder has extensive experience with Snowflake, data architecture, and cloud integration, ensuring high performance, security, and reliability.
Responsibilities
Design and implement Snowflake-based data architectures to meet business requirements.
Architect and optimize data solutions for performance, scalability, and reliability.
Develop and optimize data pipelines and ETL/ELT processes.
Establish best practices for data governance, security, and compliance.
Collaborate with cross-functional teams to integrate Snowflake solutions with existing systems.
Monitor and troubleshoot Snowflake environments for performance and reliability.
Stay updated on Snowflake advancements and industry trends to recommend innovative solutions.
Key Technical Skills & Responsibilities
Minimum 10+ years of experience designing and developing data warehouse / big data applications.
Must be able to provide thought leadership to customers for their data modernization initiatives using the latest technology trends.
Must be able to lead data product development using Streamlit and Cortex.
Deep understanding of relational as well as NoSQL data stores, and of data modeling methods and approaches (star and snowflake schemas, dimensional modeling).
Good communication skills.
Must have experience of solution architecture using Snowflake, designing Snowflake solutions for all types of data analytics use cases.
Must have experience working with the Snowflake data platform, its utilities (SnowSQL, SnowPipe, etc.) and its features (time travel, support for semi-structured data, etc.); a Time Travel sketch follows this listing.
Must have experience migrating an on-premise data warehouse to the Snowflake cloud data platform.
Must have experience working with a cloud platform: AWS | Azure | GCP.
Experience developing accelerators (using Python, Java, etc.) to expedite migration to Snowflake.
Extensive experience developing ANSI SQL queries and Snowflake-compatible stored procedures.
Snowflake for AI/ML.
DevOps with Snowflake.
Data security and data masking features.
Multi-cloud data exchange using Snowflake.
Snowflake certification is preferred.
Effective communication and the required pre-sales experience.
Eligibility Criteria
Bachelor's degree in Computer Science, Data Engineering, or a related field.
Proven experience as a Snowflake Architect or similar role.
Snowflake certification (e.g., SnowPro Core Certification).
Experience with cloud platforms like AWS, Azure, or GCP.
Proficiency in Snowflake, SQL, and data modeling.
Strong understanding of ETL/ELT processes and cloud integration.
Excellent problem-solving and communication skills.
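As a small illustration of the Time Travel feature named above, this hypothetical Python sketch compares a table's current row count with its state one hour ago using Snowflake's AT(OFFSET => ...) clause. The orders table and connection details are assumptions for the example only:

import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account="my_account", user="architect", password="***",
    warehouse="ANALYTICS_WH", database="SALES", schema="PUBLIC",
)

with conn.cursor() as cur:
    # Current state of the (hypothetical) table.
    now = cur.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
    # The same table as it existed 3600 seconds ago, via Time Travel.
    before = cur.execute(
        "SELECT COUNT(*) FROM orders AT(OFFSET => -3600)"
    ).fetchone()[0]
    print(f"rows now: {now}, rows one hour ago: {before}")
    # A dropped table can likewise be recovered within the retention window:
    # cur.execute("UNDROP TABLE orders")
conn.close()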
Posted 6 days ago
7.0 years
15 - 25 Lacs
Pune, Maharashtra, India
On-site
At Improzo (Improve + Zoe; meaning Life in Greek), we believe in improving life by empowering our customers. Founded by seasoned industry leaders, we are laser-focused on delivering quality-led commercial analytical solutions to our clients. Our dedicated team of experts in commercial data, technology, and operations has been evolving and learning together since our inception. Here, you won't find yourself confined to a cubicle; instead, you'll be navigating open waters, collaborating with brilliant minds to shape the future. You will work with leading Life Sciences clients, seasoned leaders and carefully chosen peers like you!
People are at the heart of our success, so we have defined our CARE values framework with a lot of effort, and we use it as our guiding light in everything we do. We CARE!
Customer-Centric: Client success is our success. Prioritize customer needs and outcomes in every action.
Adaptive: Agile and innovative, with a growth mindset. Pursue bold and disruptive avenues that push the boundaries of possibilities.
Respect: Deep respect for our clients & colleagues. Foster a culture of collaboration and act with honesty, transparency, and ethical responsibility.
Execution: Laser-focused on quality-led execution; we deliver! Strive for the highest quality in our services, solutions, and customer experiences.
About The Role
We are seeking an experienced and highly skilled Snowflake Data Lead/Architect to lead strategic projects focused on Pharma Commercial Data Management. This role demands a professional with 7-9 years of experience in data architecture, data management, ETL, data transformation, and governance, with an emphasis on providing scalable and secure data solutions for the pharmaceutical sector. The ideal candidate will bring a deep understanding of data architecture principles, experience with cloud platforms like Snowflake and Databricks, and a solid background in driving commercial data management projects. If you're passionate about leading impactful data initiatives, optimizing data workflows, and supporting the pharmaceutical industry's data needs, we invite you to apply.
Key Responsibilities
Snowflake Solution Design & Development:
Work closely with client stakeholders, data architects, and business analysts to understand detailed commercial data requirements and translate them into efficient Snowflake technical designs.
Design, develop, and optimize complex ETL/ELT processes within Snowflake using SQL, Stored Procedures, UDFs, Streams, Tasks, and other Snowflake features.
Implement data models (dimensional, star, snowflake schemas) optimized for commercial reporting, analytics, and data science use cases.
Implement data governance, security, and access controls within Snowflake, adhering to strict pharmaceutical compliance regulations (e.g., HIPAA, GDPR, GxP principles).
Develop and manage data sharing and collaboration solutions within Snowflake for internal and external partners.
Optimize Snowflake warehouse sizing, query performance, and overall cost efficiency.
Data Integration:
Integrate data from various commercial sources, including CRM systems (e.g., Veeva, Salesforce), sales data (e.g., IQVIA, Symphony), marketing platforms, patient services data, RWD, and other relevant datasets into Snowflake.
Utilize tools like Fivetran, Azure Data Factory or custom Python scripts for data ingestion and transformation.
Tech Leadership & Expertise:
Provide technical expertise and support for Snowflake-related issues, troubleshooting data discrepancies and performance bottlenecks.
Participate in code reviews, ensuring adherence to best practices and coding standards.
Mentor junior developers and contribute to the growth of the data engineering team.
Data Quality, Governance & Security:
Implement robust data quality checks, validation rules, and reconciliation processes to ensure accuracy and reliability of commercial data.
Apply and enforce data governance policies, including data lineage, metadata management, and master data management principles.
Implement and maintain strict data security, access controls, and data masking techniques within Snowflake, adhering to pharmaceutical industry compliance standards (e.g., HIPAA, GDPR, GxP principles); see the masking sketch after this listing.
Required Qualifications
Bachelor's degree in Computer Science, Information Systems, Engineering, or a related quantitative field. Master's degree preferred.
7+ years of progressive experience in data warehousing, ETL/ELT development, and data engineering roles.
4+ years of hands-on, in-depth experience as a Snowflake Developer, with a proven track record of designing and implementing complex data solutions on the Snowflake platform.
Expert-level proficiency in SQL for data manipulation, complex query optimization, and advanced stored procedure development within Snowflake.
Strong understanding and practical experience with data modeling techniques (e.g., Dimensional Modeling, Data Vault).
Experience with data integration tools for Snowflake (e.g., Fivetran, Matillion, DBT, Airflow, or custom Python-based ETL frameworks).
Proficiency in at least one scripting language (e.g., Python) for data processing, API integration, and automation.
Demonstrable understanding of data governance, data security, and regulatory compliance within the pharmaceutical or other highly regulated industries (e.g., GxP, HIPAA, GDPR, PII).
Experience working in a client-facing or consulting environment with strong communication and presentation skills.
Excellent problem-solving, analytical, and communication skills.
Ability to work independently and as part of a collaborative team in a fast-paced environment.
Preferred Qualifications
Specific experience with pharmaceutical commercial data sets such as sales data (e.g., IQVIA, Symphony), CRM data (e.g., Veeva, Salesforce), claims data, patient services data, or master data management (MDM) for commercial entities.
Knowledge of commercial analytics concepts and KPIs in the pharma industry (e.g., sales performance, market share, patient adherence).
Experience working with cloud platforms (AWS, Azure, or GCP) and their native services for data storage and processing.
Experience with version control systems (e.g., Git).
Snowflake certifications (e.g., SnowPro Core, SnowPro Advanced).
Experience with data visualization tools (e.g., Tableau, Power BI, Qlik Sense) and their connectivity to Snowflake.
Knowledge of Agile methodologies for managing data projects.
Benefits
Competitive salary and benefits package.
Opportunity to work on cutting-edge tech projects, transforming the life sciences industry.
Collaborative and supportive work environment.
Opportunities for professional development and growth.
Skills: data visualization tools, data vault, azure data factory, data architecture, client-facing, data governance, data quality, data, snowflake, sql, data integration, fivetran, pharma commercial, data security, python, dimensional modeling, etl, data management
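For the dynamic masking duty referenced above, here is a hedged sketch that creates a Snowflake masking policy and binds it to a column via the Python connector. The role, table, and column names (PHI_FULL_ACCESS, patient_services, email) are illustrative assumptions, not the client's actual objects:

import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account="my_account", user="data_lead", password="***",
    role="SECURITYADMIN", database="COMMERCIAL", schema="CURATED",
)

statements = [
    # Unmask for a privileged role; redact the local part of the e-mail otherwise.
    """
    CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING)
    RETURNS STRING ->
      CASE
        WHEN CURRENT_ROLE() IN ('PHI_FULL_ACCESS')  -- hypothetical role
          THEN val
        ELSE REGEXP_REPLACE(val, '.+@', '*****@')
      END
    """,
    # Bind the policy to a (hypothetical) patient-services column.
    "ALTER TABLE patient_services MODIFY COLUMN email "
    "SET MASKING POLICY email_mask",
]

with conn.cursor() as cur:
    for stmt in statements:
        cur.execute(stmt)
conn.close()

With this in place, the same SELECT returns cleartext or redacted e-mails depending on the querying role, which is the usual pattern for HIPAA/GDPR-style column-level protection in Snowflake.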
Posted 6 days ago