10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About The Job
Job Title: Cloud DevOps Architect
Location: Pune, India
Experience: 10 - 15 Years
Work Mode: Full-time, Office-based
Company: Smartavya Analytica Private Limited

Company Overview
Smartavya Analytica is a niche Data and AI company based in Mumbai, established in 2017. We specialize in data-driven innovation, transforming enterprise data into strategic insights. With more than 25 Data Modernization projects delivered and datasets as large as 24 PB handled in a single implementation, we have successfully delivered data and AI projects across multiple industries, including retail, finance, telecom, manufacturing, insurance, and capital markets. We are specialists in Cloud, Hadoop, Big Data, AI, and Analytics, with a strong focus on Data Modernization for on-premises, private, and public cloud platforms.

Job Summary
We are looking for an accomplished Cloud DevOps Architect to design and implement robust DevOps and infrastructure automation frameworks across Azure, GCP, or AWS environments. The ideal candidate will have a deep understanding of CI/CD, IaC, VPC networking, security, and automation using Terraform or Ansible.

Key Responsibilities
- Architect and build end-to-end DevOps pipelines using native cloud services (Azure DevOps, AWS CodePipeline, GCP Cloud Build) and third-party tools (Jenkins, GitLab, etc.).
- Define and implement foundation setup architecture on Azure, GCP, and AWS as per recommended best practices.
- Design and deploy secure VPC architectures; manage networking, security groups, load balancers, and VPN gateways.
- Implement Infrastructure as Code (IaC) using Terraform or Ansible for scalable and repeatable deployments.
- Establish CI/CD frameworks integrating with Git, containers, and orchestration tools (e.g., Kubernetes, ECS, AKS, GKE).
- Define and enforce cloud security best practices, including IAM, encryption, secrets management, and compliance standards.
- Collaborate with application, data, and security teams to optimize infrastructure, release cycles, and system performance.
- Drive continuous improvement in automation, observability, and incident response practices.

Must-Have Skills
- 10 to 15 years of experience in DevOps, Infrastructure, or Cloud Architecture roles.
- Deep hands-on expertise in Azure, GCP, or AWS (any one is mandatory; more is a bonus).
- Strong knowledge of VPC architecture, cloud security, IAM, and networking principles.
- Expertise in Terraform or Ansible for Infrastructure as Code.
- Experience building resilient CI/CD pipelines and automating application deployments.
- Strong troubleshooting skills across networking, compute, storage, and containers.

Preferred Certifications
- Azure DevOps Engineer Expert / AWS Certified DevOps Engineer Professional / Google Professional DevOps Engineer
- HashiCorp Certified: Terraform Associate (preferred for Terraform users)

(ref:hirist.tech)
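The posting's IaC focus is Terraform or Ansible; as a rough illustration of the same VPC-and-security-group pattern in plain Python, here is a minimal boto3 sketch. The region, CIDR ranges, and resource names are hypothetical placeholders, not values taken from the posting.

```python
# Illustrative sketch only: carve out a VPC with a locked-down security group via boto3.
# Region, names, and CIDR ranges are hypothetical.
import boto3

ec2 = boto3.client("ec2", region_name="ap-south-1")

# Create the VPC and tag it for traceability.
vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")
vpc_id = vpc["Vpc"]["VpcId"]
ec2.create_tags(Resources=[vpc_id], Tags=[{"Key": "Name", "Value": "demo-vpc"}])

# A private subnet for application workloads.
ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24")

# Security group that only admits HTTPS from a trusted internal range.
sg = ec2.create_security_group(
    GroupName="app-sg", Description="HTTPS only", VpcId=vpc_id
)
ec2.authorize_security_group_ingress(
    GroupId=sg["GroupId"],
    IpPermissions=[{
        "IpProtocol": "tcp", "FromPort": 443, "ToPort": 443,
        "IpRanges": [{"CidrIp": "10.0.0.0/8"}],
    }],
)
```

In practice the same resources would be declared in Terraform so they are versioned and repeatable, which is the point of the IaC requirement above.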
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Responsibilities
- Design, develop, and maintain robust, scalable, high-performance backend applications using Java, Spring Boot, and Spring Batch.
- Implement and optimize RESTful APIs to ensure seamless communication between backend services and front-end applications.
- Build and manage batch processing workflows with Spring Batch to handle large-scale data processing.
- Ensure the scalability, performance, and reliability of applications through efficient design and architecture.
- Collaborate with front-end developers, product managers, and other stakeholders to gather requirements and design technical solutions.
- Work closely with database administrators to design and optimize database schemas, queries, and stored procedures.
- Write clean, maintainable, and efficient code that adheres to best practices and coding standards.
- Participate in code reviews and ensure best practices are followed by the development team.
- Troubleshoot, debug, and optimize applications for performance and reliability.
- Stay up to date with emerging technologies and industry trends, and apply them in projects.
- Contribute to building an Agile, DevOps-driven, CI/CD-based development culture.
- Assist in creating and maintaining technical documentation.

Must-Have Skills
- Proficiency in Java, Spring Boot, and Spring Batch for building backend services and batch processing.
- Strong understanding of RESTful web services and microservices architecture.
- Experience with relational databases (MySQL, PostgreSQL) and NoSQL databases (MongoDB, etc.).
- Familiarity with CI/CD tools such as Jenkins or GitLab, and with version control tools like Git.
- Experience with unit testing frameworks like JUnit and with integration testing.
- Knowledge of performance tuning and troubleshooting in Java applications.
- Familiarity with Agile methodologies and experience with tools like Rally.
- Strong problem-solving skills, the ability to think critically, and the ability to troubleshoot technical challenges.
- Excellent communication skills and the ability to collaborate with cross-functional teams.

Good-to-Have Skills
- Experience with cloud platforms (AWS, Azure, or GCP) for deploying and scaling Java applications.
- Familiarity with containerization technologies like Docker and orchestration with Kubernetes.
- Experience with message queues like Kafka or RabbitMQ.
- Exposure to DevOps practices and tools.
- Knowledge of OAuth2 and other authentication mechanisms.
- Familiarity with front-end technologies (React, Angular) for full-stack development.
- Experience with microservices security best practices.
- Knowledge of Big Data tools and technologies such as Hadoop or Spark is a plus.

Note: This requisition can accept 17 more applications.

(ref:hirist.tech)
Posted 1 week ago
1.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Description
As a Data Engineer, you will build and maintain complex data pipelines and assemble large, complex datasets to generate business insights, enable data-driven decision making, and support the rapidly growing and dynamic business demand for data. You will collaborate with teams of business analysts, managers, software development engineers, and data engineers to determine how best to design, implement, and support solutions. You will be challenged and given tremendous growth opportunity in a customer-facing, fast-paced, agile environment.

Key job responsibilities
- Design, implement, and support analytical data platform solutions for data-driven decisions and insights.
- Design data schemas and operate internal data warehouses and SQL/NoSQL database systems.
- Work on data model design, architecture, implementation, discussion, and optimization.
- Interface with other teams to extract, transform, and load data from a wide variety of data sources using AWS big data technologies such as EMR, Redshift, and Elasticsearch.
- Work with AWS technologies such as S3, Redshift, Lambda, and Glue, and explore and learn the latest AWS technologies to provide new capabilities and increase efficiency.
- Work on the data lake platform and its components, such as Hadoop and Amazon S3.
- Work with SQL-on-Hadoop technologies such as Spark, Hive, and Impala.
- Help continually improve ongoing analysis processes, optimizing or simplifying self-service support for customers.
- Recognize and adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation.
- Work closely with peers in a group of talented engineers and build deep domain knowledge across Amazon's business domains.
- Own the development and maintenance of ongoing metrics, reports, analyses, and dashboards that drive key business decisions.
- Possess strong verbal and written communication skills, be self-driven, and deliver high-quality results in a fast-paced environment.

Basic Qualifications
- 1+ years of data engineering experience
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)

Preferred Qualifications
- Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
- Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, or DataStage

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ADCI HYD 13 SEZ
Job ID: A3013333
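For a sense of the extract-transform-load work described above, here is a minimal PySpark sketch of an S3-based batch job. The bucket paths and column names are invented for illustration; on EMR this pattern feeds downstream tools such as Athena or Redshift Spectrum.

```python
# Minimal PySpark sketch of the ETL pattern the role describes: pull raw events
# from S3, normalise them, and land partitioned Parquet back on S3.
# Bucket and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3-etl-sketch").getOrCreate()

raw = spark.read.json("s3://raw-bucket/events/")            # extract
clean = (
    raw.filter(F.col("event_type").isNotNull())             # transform
       .withColumn("event_date", F.to_date("event_ts"))
)
(clean.write
      .mode("overwrite")
      .partitionBy("event_date")                            # partitioned for query engines
      .parquet("s3://curated-bucket/events/"))              # load
```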
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company Description
Blend is a premier AI services provider, committed to co-creating meaningful impact for its clients through the power of data science, AI, technology, and people. With a mission to fuel bold visions, Blend tackles significant challenges by seamlessly aligning human expertise with artificial intelligence. The company is dedicated to unlocking value and fostering innovation for its clients by harnessing world-class people and data-driven strategy. We believe that the power of people and AI can have a meaningful impact on your world, creating more fulfilling work and projects for our people and clients. For more information, visit www.blend360.com.

Job Description
We are looking for a skilled Test Engineer with experience in automated testing, rollback testing, and continuous integration environments. You will be responsible for ensuring the quality and reliability of our software products through automated testing strategies and robust test frameworks.
- Design and execute end-to-end test strategies for data pipelines, ETL/ELT jobs, and database systems.
- Validate data quality, completeness, transformation logic, and integrity across distributed data systems (e.g., Hadoop, Spark, Hive).
- Develop Python-based automated test scripts to validate data flows, schema validations, and business rules.
- Write complex SQL queries to verify large datasets across staging and production environments.
- Identify data issues and work closely with data engineers to resolve discrepancies.
- Contribute to test data management, environment setup, and regression testing processes.
- Work collaboratively with data engineers, business analysts, and QA leads to ensure accurate and timely data delivery.
- Participate in sprint planning, reviews, and defect triaging as part of the Agile process.

Qualifications
- 4+ years of experience in Data Testing, Big Data Testing, and/or Database Testing.
- Strong programming skills in Python for automation and scripting.
- Expertise in SQL for writing complex queries and validating large datasets.
- Experience with Big Data technologies such as Hadoop, Hive, Spark, HDFS, and Kafka (any combination is acceptable).
- Hands-on experience with ETL/ELT testing and validating data transformations and pipelines.
- Exposure to cloud data platforms such as AWS (Glue, S3, Redshift), Azure (Data Lake, Synapse), or GCP is a plus.
- Familiarity with test management and defect tracking tools like JIRA, TestRail, or Zephyr.
- Experience with CI/CD pipelines and version control (e.g., Git) is an advantage.
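As a flavor of the Python-based data validation this role calls for, below is a small pytest sketch that compares a staging table against production. The connection string, schemas, and table names are hypothetical; pytest and SQLAlchemy are assumed to be available.

```python
# Sketch of automated data-quality tests: compare row counts and check key
# integrity between staging and production. DSN and table names are placeholders.
import pytest
from sqlalchemy import create_engine, text

engine = create_engine("postgresql://user:pass@host/dw")  # hypothetical DSN

def scalar(query: str) -> int:
    # Run a single-value query and return the result.
    with engine.connect() as conn:
        return conn.execute(text(query)).scalar()

def test_row_counts_match():
    staging = scalar("SELECT COUNT(*) FROM staging.orders")
    prod = scalar("SELECT COUNT(*) FROM prod.orders")
    assert staging == prod, f"row count drift: {staging} vs {prod}"

def test_no_null_keys():
    nulls = scalar("SELECT COUNT(*) FROM prod.orders WHERE order_id IS NULL")
    assert nulls == 0, f"{nulls} rows with NULL order_id"
```

Tests like these slot naturally into a CI/CD pipeline so regressions in transformation logic fail the build rather than reach production.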
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Position Overview
Job Title: Infra and DevOps Engineer, AS
Location: Pune, India

We are looking for the best people to help create the next big thing in digital banking.

Role Description
The Infra & DevOps team within DWS India sits horizontally over project delivery and is committed to providing best-in-class shared services across the build, release, and QA automation space. Its main functional areas encompass environment build, integration of the QA automation suite, release and deployment automation management, technology management, and compliance management. This role will be key to our programme delivery and will involve working closely with stakeholders, including client Product Owners, the Digital Design Organisation, Business Analysts, Developers, and QA, to advise and contribute from an Infra and DevOps capability perspective: building and maintaining non-prod and prod environments, setting up end-to-end alerting and monitoring for ease of operation, and overseeing the transition of the project to L2 support teams as part of go-live.

What We'll Offer You
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Programme for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those aged 35 and above

Your Key Responsibilities
- Drive automation (including automated build, test, and deploy).
- Support and manage the Data / Digital systems architecture (underlying platforms, APIs, UI, datasets, etc.) across environments, in line with the architectural vision set by the Digital Design Organisation.
- Drive integration across systems, working to ensure the service layer integrates with the core technology stack while ensuring that services combine to form a service ecosystem.
- Monitor the digital architecture to ensure health and identify required corrective action.
- Serve as a technical authority, working with developers to drive architectural standards on the specific platforms they are developing upon.
- Build security into the overall architecture, ensuring adherence to security principles set within IT and to any required industry standards.
- Liaise with other technical areas, conduct technology research, and evaluate software required for maintaining the development environment.
- Work with the wider QA function within the business to drive continuous testing by integrating QA automation suites with available toolsets.

Your Skills and Experience
- Proven hands-on technical experience in Linux/Unix is a must-have.
- Proven experience in infrastructure architecture: clustering, high availability, performance and tuning, backup and recovery.
- Hands-on experience with DevOps build and deploy tools such as TeamCity, Jenkins, and Git / Bitbucket / Artifactory.
- Knowledge of automation/configuration management using tools such as Ansible or equivalent.
- Hands-on experience with Google Cloud Platform.
- Knowledge of the Hadoop ecosystem.
- A working understanding of code and scripting languages such as Python, Perl, Ruby, or JavaScript.
- In-depth knowledge of and experience in Docker, OpenShift, and Kubernetes containerisation.
- Ability to deploy complex solutions based on IaaS, PaaS, and cloud-based infrastructures.
- Basic understanding of networking and firewalls.
- Knowledge of best practices and IT operations in an agile environment.
- Ability to deliver independently: confidently able to translate requirements into technical solutions with minimal supervision.
- Collaborative by nature: able to work with scrum teams, technical teams, the wider business, and IT&S to provide platform-related knowledge.
- Flexible: finds a way to say yes and to make things happen, only exercising authority as needed to prevent the architecture from breaking.
- Coding and scripting: able to develop in multiple languages in order to mobilise, configure, and maintain digital platforms and architecture.
- Automation and tooling: strong knowledge of the automation landscape, with the ability to rapidly identify and mobilise appropriate tools to support testing, deployment, etc.
- Security: understands security requirements and is able to independently drive compliance.
- Fluent English (written/verbal).

Education / Certification
Bachelor's degree from an accredited college or university with a concentration in Science, Engineering, or an IT-related discipline (or equivalent).

How We'll Support You
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About Us and Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm

We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 1 week ago
6.0 - 11.0 years
0 - 1 Lacs
Pune, Chennai
Work from Office
Hiring alert: Data Engineer role. Immediate joiners or candidates with up to 15 days' notice are preferred.

Role & Responsibilities
- 8+ years of relevant experience in Data Engineering and Apache Spark.
- Strong proficiency in PySpark and Apache Spark.
- Solid experience with SQL and NoSQL databases (e.g., Hive, HBase, Cassandra).
- Hands-on experience with big data frameworks (Hadoop, Kafka, etc.).
- Expertise in cloud platforms (AWS, Azure, or GCP).
- Proficiency in Python programming and data manipulation.
- Knowledge of ETL tools, data lakes, and data warehouses.
- Experience in CI/CD and containerization (Docker, Kubernetes) is a plus.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork skills.
- Experience in systems analysis and programming of software applications.
- Experience in managing and implementing successful projects.
- Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements.

Regards,
Jaya Gowri
9500954668
jaya.gowri@photon.com
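To ground the PySpark requirement, here is a minimal sketch of a routine pipeline task: deduplicating a raw feed by business key, keeping the latest record per key. The table and column names are invented for illustration.

```python
# Hedged sketch of a common PySpark task: keep the most recent row per key
# using a window function. Table and column names are hypothetical.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("dedupe-sketch")
         .enableHiveSupport()
         .getOrCreate())

raw = spark.table("staging.customer_events")

# Rank rows per customer, newest first, then keep only rank 1.
latest_first = Window.partitionBy("customer_id").orderBy(F.col("updated_at").desc())
deduped = (
    raw.withColumn("rn", F.row_number().over(latest_first))
       .filter(F.col("rn") == 1)
       .drop("rn")
)
deduped.write.mode("overwrite").saveAsTable("curated.customer_events")
```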
Posted 1 week ago
4.0 - 7.0 years
10 - 20 Lacs
Hyderabad
Work from Office
We are seeking a skilled Data Engineer with extensive experience in the Cloudera Data Platform (CDP) to join our dynamic team. The ideal candidate will have over four years of experience in designing, developing, and managing data pipelines, and will be proficient in big data technologies. This role requires a deep understanding of data engineering best practices and a passion for optimizing data flow and collection across a diverse range of sources.

Required Skills and Qualifications:
- Experience: 4+ years of experience in data engineering, with a strong focus on big data technologies.
- Cloudera Expertise: Proficient in the Cloudera Data Platform (CDP) and its ecosystem, including Hadoop, Spark, HDFS, Hive, Impala, and other relevant tools.
- Programming Languages: Strong programming skills in Python, Scala, or Java.
- ETL Tools: Experience with ETL tools and processes.
- Data Warehousing: Knowledge of data warehousing concepts and experience with data modeling.
- SQL: Advanced SQL skills for querying and manipulating large datasets.
- Linux/Unix: Proficiency in Linux/Unix shell scripting.
- Version Control: Familiarity with version control systems like Git.
- Problem-Solving: Strong analytical and problem-solving skills.
- Communication: Excellent verbal and written communication skills, with the ability to explain complex technical concepts to non-technical stakeholders.

Preferred Qualifications:
- Cloud Experience: Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Data Streaming: Experience with real-time data streaming technologies like Kafka.
- DevOps: Familiarity with DevOps practices and tools such as Docker, Kubernetes, and CI/CD pipelines.
- Education: Bachelor's degree in Computer Science, Information Technology, or a related field.

Main Skills: Hadoop, Spark, Hive, Impala, Scala, Python, Java, Linux

Roles and Responsibilities
- Develop and maintain scalable data pipelines using Cloudera Data Platform (CDP) components.
- Design and implement ETL processes to extract, transform, and load data from various data sources into the data lake or data warehouse.
- Optimize and troubleshoot data workflows for performance and efficiency.
- Manage and administer Hadoop clusters within the Cloudera environment.
- Monitor and ensure the health and performance of the Cloudera platform.
- Implement data security best practices, including encryption, data masking, and user access control.
- Work closely with data scientists, analysts, and other stakeholders to understand data requirements and provide the necessary support.
- Collaborate with cross-functional teams to design and deploy big data solutions that meet business needs.
- Participate in code reviews, provide feedback, and contribute to team knowledge sharing.
- Create and maintain comprehensive documentation of data engineering processes, data architecture, and system configurations.
- Provide support for production data pipelines, including troubleshooting and resolving issues as they arise.
- Train and mentor junior data engineers, fostering a culture of continuous learning and improvement.
- Stay up to date with the latest industry trends and technologies related to data engineering and big data.
- Propose and implement improvements to existing data pipelines and architectures.
- Explore and integrate new tools and technologies to enhance the capabilities of the data engineering team.
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Make an impact with NTT DATA
Join a company that is pushing the boundaries of what is possible. We are renowned for our technical excellence and leading innovations, and for making a difference to our clients and society. Our workplace embraces diversity and inclusion – it's a place where you can grow, belong and thrive.

Your day at NTT DATA
The Senior Associate Data Engineer is a developing specialist role, tasked with supporting the transformation of data into a structured format that can be easily analyzed in a query or report. This role is responsible for developing structured data sets that can be reused or complement other data sets and reports. This role analyzes the data sources and data structure, and designs and develops data models to support the analytics requirements of the business, which include management, operational, predictive, and data science capabilities.

Key responsibilities:
- Contributes to the creation of data models in a structured data format to enable analysis thereof.
- Proactively supports the design and development of scalable extract, transform, and load (ETL) packages from the business source systems, and the development of ETL routines to populate data from sources.
- Participates in the transformation of object and data models into appropriate database schemas within design constraints.
- Interprets installation standards to meet project needs and produces database components as required.
- Receives instructions from various stakeholders to create test scenarios and participates in thorough testing and validation to support the accuracy of data transformations.
- Proactively supports the running of data migrations across different databases and applications, e.g. MS Dynamics, Oracle, SAP, and other ERP systems.
- Supports the definition and implementation of data table structures and data models based on requirements.
- Contributes to the analysis and development of ETL and migration documentation.
- Receives instructions from various stakeholders to evaluate potential data requirements.
- Supports the definition and management of scoping, requirements definition, and prioritization activities for small-scale changes, and assists with more complex change initiatives.
- Contributes to the recommendation of improvements in automated and non-automated components of the data tables, data queries, and data models.

To thrive in this role, you need to have:
- Knowledge of the definition and management of scoping, requirements definition, and prioritization activities.
- Understanding of database concepts, object and data modelling techniques, and design principles, plus conceptual knowledge of building and maintaining physical and logical data models.
- Knowledge of Microsoft Azure Data Factory, SQL Analysis Server, SAP Data Services, and SAP BTP.
- Understanding of the data architecture landscape between physical and logical data models.
- Analytical mindset with good business acumen.
- Problem-solving aptitude with the ability to communicate effectively, both written and verbal.
- Ability to build effective relationships at all levels within the organization.
- Seasoned expertise in programming languages (Perl, bash, shell scripting, Python, etc.).

Academic qualifications and certifications:
- Bachelor's degree or equivalent in computer science, software engineering, information technology, or a related field.
- Relevant certifications preferred, such as SAP or Microsoft Azure; Certified Data Engineer or similar professional certification preferred.
Required experience:
- Moderate-level experience in data engineering and data mining within a fast-paced environment.
- Familiarity with building modern data analytics solutions that deliver insights from large and complex data sets at multi-terabyte scale.
- Moderate-level experience with the architecture and design of secure, highly available, and scalable systems.
- Familiarity with automation and scripting, with proven examples of successful implementation.
- Familiarity with writing scripts in a scripting language (Perl, bash, shell scripting, Python, etc.).
- Moderate-level experience with big data tools like Hadoop, Cassandra, Storm, etc.
- Moderate-level experience in any applicable language, preferably .NET.
- Familiarity with working with SAP, SQL, and MySQL databases and Microsoft SQL Server.
- Moderate-level experience working with data sets and ordering data through MS Excel functions, e.g. macros and pivots.

Workplace type: Hybrid Working

About NTT DATA
NTT DATA is a $30+ billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. We invest over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure, and connectivity. We are also one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group and headquartered in Tokyo.

Equal Opportunity Employer
NTT DATA is proud to be an Equal Opportunity Employer with a global culture that embraces diversity. We are committed to providing an environment free of unfair discrimination and harassment. We do not discriminate based on age, race, colour, gender, sexual orientation, religion, nationality, disability, pregnancy, marital status, veteran status, or any other protected category. Join our growing global team and accelerate your career with us. Apply today.
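To illustrate the ETL-and-migration work this role supports, here is a hedged pandas sketch: read from a source ERP database, reshape to the target model, and load into a reporting schema. The DSNs, table names, and column mapping are all hypothetical.

```python
# Illustrative extract-transform-load routine. Connection strings, tables,
# and the column mapping are invented placeholders, not real systems.
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("oracle+oracledb://user:pass@erp-host/ORCL")  # placeholder
target = create_engine("mssql+pyodbc://user:pass@dw-host/reporting"
                       "?driver=ODBC+Driver+17+for+SQL+Server")      # placeholder

# Extract: pull the source rows.
df = pd.read_sql("SELECT cust_no, cust_name, created_dt FROM customers", source)

# Transform: align names and types to the target data model.
df = df.rename(columns={"cust_no": "customer_id", "cust_name": "customer_name"})
df["created_dt"] = pd.to_datetime(df["created_dt"])

# Load: replace the reporting table in one shot (fine for small reference sets).
df.to_sql("dim_customer", target, schema="dbo", if_exists="replace", index=False)
```

At larger scale the same shape would be expressed in a tool like Azure Data Factory or SAP Data Services rather than a single script.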
Posted 1 week ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
- Assemble large, complex sets of data that meet non-functional and functional business requirements.
- Identify, design, and implement internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using Azure, Databricks, and SQL technologies.
- Transform conceptual algorithms from R&D into efficient, production-ready code. The data developer must have a strong mathematical background in order to document and maintain the code.
- Integrate finished models into larger data processes using UNIX scripting and data processing languages such as ksh, Python, Spark, and Scala.
- Produce and maintain documentation for released data sets, new programs, shared utilities, and static data, within department standards.
- Ensure quality deliverables to clients by following existing quality processes, manually calculating comparison data, developing statistical pass/fail testing, and visually inspecting data for reasonableness: the requirement is on-time with zero defects.

Qualifications
Education/Training
- B.E./B.Tech. with a major in Computer Science, BIS, CIS, Electrical Engineering, Operations Research, or another technical field. Coursework or experience in Numerical Analysis, Mathematics, or Statistics is a plus.

Hard Skills
- Proven experience working as a data engineer.
- Highly proficient in the Spark framework (Python and/or Scala).
- Extensive knowledge of data warehousing concepts, strategies, and methodologies.
- Programming experience in Python, SQL, and Scala.
- Direct experience building data pipelines using Apache Spark (preferably in Databricks) and Airflow.
- Hands-on experience designing and delivering solutions using Azure, including Azure Storage, Azure SQL Data Warehouse, and Azure Data Lake.
- Experience with big data technologies (Hadoop).
- Databricks and Azure Big Data Architecture certification would be a plus.
- Must be team oriented with strong collaboration, prioritization, and adaptability skills.
- Ability to write highly efficient code in terms of performance and memory utilization.
- Basic knowledge of SQL; capable of handling common functions.

Experience
- A minimum of 3-6 years of experience as a data engineer.
- Experience modeling or manipulating large amounts of data is a plus.
- Experience with demographic or retail business data is a plus.

Additional Information
Our Benefits
- Flexible working environment
- Volunteer time off
- LinkedIn Learning
- Employee Assistance Program (EAP)

About NIQ
NIQ is the world's leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights—delivered with advanced analytics through state-of-the-art platforms—NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world's population. For more information, visit NIQ.com.

Want to keep up with our latest updates?
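The "zero defects" requirement above suggests validating before publishing. Here is a minimal Databricks-style PySpark sketch with an inline quality gate; the ADLS paths, container names, and threshold are hypothetical.

```python
# Sketch of a QA-gated load: fail fast instead of shipping bad rows downstream.
# abfss paths assume ADLS Gen2; names and the rule itself are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("qa-gated-load").getOrCreate()

df = spark.read.parquet("abfss://raw@account.dfs.core.windows.net/sales/")

enriched = df.withColumn("net_amount", F.col("gross_amount") - F.col("discount"))

# Quality gate: a negative net amount indicates broken upstream logic.
bad_rows = enriched.filter(F.col("net_amount") < 0).count()
if bad_rows > 0:
    raise ValueError(f"{bad_rows} rows with negative net_amount; aborting publish")

enriched.write.mode("overwrite").parquet(
    "abfss://curated@account.dfs.core.windows.net/sales/"
)
```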
Follow us on: LinkedIn | Instagram | Twitter | Facebook

Our commitment to Diversity, Equity, and Inclusion
NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion
Posted 1 week ago
0 years
0 Lacs
Delhi Cantonment, Delhi, India
On-site
Make an impact with NTT DATA
Join a company that is pushing the boundaries of what is possible. We are renowned for our technical excellence and leading innovations, and for making a difference to our clients and society. Our workplace embraces diversity and inclusion – it's a place where you can grow, belong and thrive.

Your day at NTT DATA
We are seeking an experienced Data Architect to join our team in designing and delivering innovative data solutions to clients. The successful candidate will be responsible for architecting, developing, and implementing data management solutions and data architectures for various industries. This role requires strong technical expertise, excellent problem-solving skills, and the ability to work effectively with clients and internal teams to design and deploy scalable, secure, and efficient data solutions.

What You'll Be Doing
Experience and Leadership:
- Proven experience in data architecture, with a recent role as a Lead Data Solutions Architect or a similar senior position in the field.
- Proven experience in leading architectural design and strategy for complex data solutions and overseeing their delivery.
- Experience in consulting roles, delivering custom data architecture solutions across various industries.

Architectural Expertise:
- Strong expertise in designing and overseeing the delivery of data streaming and event-driven architectures, with a focus on Kafka and Confluent platforms.
- In-depth knowledge of architecting and implementing data lakes and lakehouse platforms, including experience with Databricks and Unity Catalog.
- Proficiency in conceptualising and applying Data Mesh and Data Fabric architectural patterns.
- Experience in developing data product strategies, with a strong inclination towards a product-led approach in data solution architecture.
- Extensive familiarity with cloud data architecture on platforms such as AWS, Azure, GCP, and Snowflake.
- Understanding of cloud platform infrastructure and its impact on data architecture.

Data Technology Skills:
- A solid understanding of big data technologies such as Apache Spark, and knowledge of Hadoop ecosystems.
- Knowledge of programming languages such as Python or R is beneficial.
- Exposure to ETL/ELT processes, SQL, and NoSQL databases is a nice-to-have, providing a well-rounded background.
- Experience with data visualization tools and DevOps principles/tools is advantageous.
- Familiarity with machine learning and AI concepts, particularly in how they integrate into data architectures.

Design and Lifecycle Management:
- Proven background in designing modern, scalable, and robust data architectures.
- Comprehensive grasp of the data architecture lifecycle, from concept to deployment and consumption.

Data Management and Governance:
- Strong knowledge of data management principles and best practices, including data governance frameworks.
- Experience with data security and compliance regulations (GDPR, CCPA, HIPAA, etc.).
Leadership and Communication:
- Exceptional leadership skills to manage and guide a team of architects and technical experts.
- Excellent communication and interpersonal skills, with a proven ability to influence architectural decisions with clients and guide best practices.

Project and Stakeholder Management:
- Experience with agile methodologies (e.g. SAFe, Scrum, Kanban) in the context of architectural projects.
- Ability to manage project budgets, timelines, and resources, maintaining focus on architectural deliverables.

Location: Delhi or Bangalore
Workplace type: Hybrid Working

About NTT DATA
NTT DATA is a $30+ billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. We invest over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure, and connectivity. We are also one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group and headquartered in Tokyo.

Equal Opportunity Employer
NTT DATA is proud to be an Equal Opportunity Employer with a global culture that embraces diversity. We are committed to providing an environment free of unfair discrimination and harassment. We do not discriminate based on age, race, colour, gender, sexual orientation, religion, nationality, disability, pregnancy, marital status, veteran status, or any other protected category. Join our growing global team and accelerate your career with us. Apply today.
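Given the Kafka/Confluent emphasis above, here is a minimal event-driven sketch with the Confluent Python client. The broker address, topic, and payload are invented for illustration.

```python
# Minimal event-producer sketch: publish a domain event with a delivery
# callback so broker-side failures surface instead of being lost silently.
# Broker, topic, and event shape are hypothetical.
import json
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "broker:9092"})

def on_delivery(err, msg):
    # Called once per message after the broker acknowledges (or rejects) it.
    if err is not None:
        print(f"delivery failed: {err}")

event = {"order_id": 42, "status": "CREATED"}
producer.produce(
    "orders",
    key=str(event["order_id"]),          # keying by entity keeps per-order ordering
    value=json.dumps(event),
    callback=on_delivery,
)
producer.flush()  # block until outstanding events are acknowledged
```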
Posted 1 week ago
0 years
0 Lacs
Delhi Cantonment, Delhi, India
On-site
Make an impact with NTT DATA
Join a company that is pushing the boundaries of what is possible. We are renowned for our technical excellence and leading innovations, and for making a difference to our clients and society. Our workplace embraces diversity and inclusion – it's a place where you can grow, belong and thrive.

Your day at NTT DATA
The Senior Associate Data Engineer is a developing specialist role, tasked with supporting the transformation of data into a structured format that can be easily analyzed in a query or report. This role is responsible for developing structured data sets that can be reused or complement other data sets and reports. This role analyzes the data sources and data structure, and designs and develops data models to support the analytics requirements of the business, which include management, operational, predictive, and data science capabilities.

Key responsibilities:
- Contributes to the creation of data models in a structured data format to enable analysis thereof.
- Proactively supports the design and development of scalable extract, transform, and load (ETL) packages from the business source systems, and the development of ETL routines to populate data from sources.
- Participates in the transformation of object and data models into appropriate database schemas within design constraints.
- Interprets installation standards to meet project needs and produces database components as required.
- Receives instructions from various stakeholders to create test scenarios and participates in thorough testing and validation to support the accuracy of data transformations.
- Proactively supports the running of data migrations across different databases and applications, e.g. MS Dynamics, Oracle, SAP, and other ERP systems.
- Supports the definition and implementation of data table structures and data models based on requirements.
- Contributes to the analysis and development of ETL and migration documentation.
- Receives instructions from various stakeholders to evaluate potential data requirements.
- Supports the definition and management of scoping, requirements definition, and prioritization activities for small-scale changes, and assists with more complex change initiatives.
- Contributes to the recommendation of improvements in automated and non-automated components of the data tables, data queries, and data models.

To thrive in this role, you need to have:
- Knowledge of the definition and management of scoping requirements, definition and prioritization activities.
- Understanding of database concepts, object and data modelling techniques, and design principles, plus conceptual knowledge of building and maintaining physical and logical data models.
- Knowledge of Microsoft Azure Data Factory, SQL Analysis Server, SAP Data Services, and SAP BTP.
- Understanding of the data architecture landscape between physical and logical data models.
- Analytical mindset with good business acumen.
- Problem-solving aptitude with the ability to communicate effectively, both written and verbal.
- Ability to build effective relationships at all levels within the organization.
- Seasoned expertise in programming languages (Perl, bash, shell scripting, Python, etc.).

Academic qualifications and certifications:
- Bachelor's degree or equivalent in computer science, software engineering, information technology, or a related field.
- Relevant certifications preferred, such as SAP or Microsoft Azure; Certified Data Engineer or similar professional certification preferred.
Required experience:
- Moderate-level experience in data engineering and data mining within a fast-paced environment.
- Familiarity with building modern data analytics solutions that deliver insights from large and complex data sets at multi-terabyte scale.
- Moderate-level experience with the architecture and design of secure, highly available, and scalable systems.
- Familiarity with automation and scripting, with proven examples of successful implementation.
- Familiarity with writing scripts in a scripting language (Perl, bash, shell scripting, Python, etc.).
- Moderate-level experience with big data tools like Hadoop, Cassandra, Storm, etc.
- Moderate-level experience in any applicable language, preferably .NET.
- Familiarity with working with SAP, SQL, and MySQL databases and Microsoft SQL Server.
- Moderate-level experience working with data sets and ordering data through MS Excel functions, e.g. macros and pivots.

Workplace type: Hybrid Working

About NTT DATA
NTT DATA is a $30+ billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. We invest over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure, and connectivity. We are also one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group and headquartered in Tokyo.

Equal Opportunity Employer
NTT DATA is proud to be an Equal Opportunity Employer with a global culture that embraces diversity. We are committed to providing an environment free of unfair discrimination and harassment. We do not discriminate based on age, race, colour, gender, sexual orientation, religion, nationality, disability, pregnancy, marital status, veteran status, or any other protected category. Join our growing global team and accelerate your career with us. Apply today.
Posted 1 week ago
4.0 - 6.0 years
10 - 20 Lacs
Noida
Hybrid
Designation: Senior Software Engineer / Software Engineer - Data Engineering
Location: Noida
Experience: 4-6 years

Job Summary / Your Role in a Nutshell:
The ideal candidate is a skilled Data Engineer proficient in Python, Scala, or Java, with a strong background in Hadoop, Spark, SQL, and various data platforms, expertise in optimizing the performance of data applications, and experience contributing to rapid and agile development processes.

What you'll do:
- Review and understand business requirements, ensuring that development tasks are completed within the timeline provided and that issues are fully tested with minimal defects.
- Partner with a software development team to implement best practices and optimize the performance of data applications to ensure that client needs are met at all times.
- Collaborate across the company and interact with our customers to understand, translate, define, and design solutions to their business challenges and concerns.
- Research new Big Data technologies, assessing their maturity and alignment to business and technology strategy.
- Work in a rapid and agile development process to enable increased speed to market while maintaining appropriate controls.

What you need:
BE/B.Tech/MCA with at least 4 years of experience in design and development using the data engineering technology stack and programming languages.

Mandatory experience in the following areas:
- Python/Scala/Java
- Hadoop, HDFS, MapReduce
- Spark SQL, DataFrames, RDDs
- SQL
- Hive / Snowflake / SQL Server / BigQuery
- Elasticsearch

Preferred experience in three or more of the following areas (an orchestration sketch follows this list):
- Spark Streaming, Spark ML
- Kafka/Flume
- Apache NiFi
- Apache Airflow/Oozie
- Cloud-based data platforms
- NoSQL databases: HBase/Cassandra/Neo4j/MongoDB

Also valued:
- Good knowledge of the current technology landscape and the ability to visualize industry trends.
- Working knowledge of Big Data integration with third-party or in-house Metadata Management, Data Quality, and Master Data Management solutions.
- Active community involvement through articles, blogs, or speaking engagements at conferences.
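As a taste of the Airflow item in the preferred list, here is a hedged sketch of a small DAG chaining an ingest step into a transform step. Task bodies, the schedule, and all names are placeholders; the `schedule` argument assumes Airflow 2.4+ (older releases use `schedule_interval`).

```python
# Hypothetical Airflow DAG: ingest, then transform, once a day.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    print("pull raw files to the landing zone")      # placeholder logic

def transform():
    print("submit the Spark job over the landed data")  # placeholder logic

with DAG(
    dag_id="daily_events_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",       # Airflow 2.4+; earlier versions use schedule_interval
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    ingest_task >> transform_task  # transform runs only after ingest succeeds
```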
Posted 1 week ago
5.0 - 10.0 years
15 - 25 Lacs
Pune
Work from Office
- Develop and optimize Big Data solutions using Apache Spark.
- Work extensively with PySpark and data engineering tools.
- Handle real-time data processing using Kafka and Spark Streaming.
- Design and implement ETL pipelines and migrate workflows to Spark.

Required Candidate Profile
- Hands-on experience with Hadoop, HDFS, and YARN.
- Strong programming skills in Scala, Java, and Python.
- Exposure to CI/CD automation for Big Data workflows.
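The Kafka-plus-Spark-Streaming requirement above is typically met today with Structured Streaming. Here is a minimal sketch that consumes a topic, parses JSON, and appends to a Parquet sink; the broker, topic, schema, and paths are hypothetical.

```python
# Structured Streaming sketch: Kafka topic -> parsed JSON -> Parquet sink.
# Broker, topic, schema, and paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

schema = StructType([
    StructField("sensor_id", StringType()),
    StructField("reading", DoubleType()),
])

stream = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "sensor-readings")
         .load()
         # Kafka delivers bytes; cast and parse the JSON payload.
         .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
         .select("e.*")
)

query = (
    stream.writeStream.format("parquet")
          .option("path", "/data/curated/sensors/")
          .option("checkpointLocation", "/data/checkpoints/sensors/")  # required for recovery
          .start()
)
query.awaitTermination()
```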
Posted 1 week ago
1.0 - 5.0 years
1 - 4 Lacs
Nashik, Manmad
Work from Office
We are looking for a highly skilled and experienced Legal Officer to join our team at Equitas Small Finance Bank.

Roles and Responsibilities
- Manage and oversee legal matters related to mortgages and other financial products.
- Draft and review contracts, agreements, and other legal documents.
- Provide legal advice and guidance on various banking-related matters.
- Conduct legal research and analysis to support business decisions.
- Collaborate with cross-functional teams to ensure compliance with regulatory requirements.
- Develop and implement strategies to mitigate legal risks and minimize losses.

Job Requirements
- Strong knowledge of legal principles and practices applicable to the BFSI industry.
- Experience working with SBL or similar institutions is preferred.
- Excellent analytical, communication, and problem-solving skills.
- Ability to work independently and as part of a team.
- Strong attention to detail and organizational skills.
- Familiarity with mortgage laws and regulations is desirable.
Posted 1 week ago
3.0 - 8.0 years
5 - 9 Lacs
Hubli
Work from Office
We are looking for a skilled Regional Receivables Manager to join our team at Equitas Small Finance Bank.

Roles and Responsibilities
- Manage and oversee regional receivables operations to ensure timely recovery of outstanding amounts.
- Develop and implement strategies to improve collection efficiency and reduce delinquencies.
- Collaborate with internal stakeholders to resolve customer complaints and disputes.
- Analyze and report on key performance indicators to identify areas for improvement.
- Ensure compliance with regulatory requirements and company policies.
- Lead and motivate a team of collection professionals to achieve targets.

Job Requirements
- Strong knowledge of Inclusive Banking, SBL, and mortgage concepts.
- Excellent communication and interpersonal skills.
- Ability to work in a fast-paced environment and meet deadlines.
- Strong analytical and problem-solving skills.
- Experience in managing teams and leading by example.
- Familiarity with financial software and systems is desirable.
Posted 1 week ago
2.0 - 6.0 years
7 - 11 Lacs
Coimbatore, Erode, Gandhi
Work from Office
We are looking for a highly skilled and experienced Relationship Manager to join our team at Equitas Small Finance Bank.

Roles and Responsibilities
- Manage and maintain strong relationships with existing clients to increase business growth.
- Identify new business opportunities and develop strategies to expand the client base.
- Provide excellent customer service and support to ensure high levels of client satisfaction.
- Collaborate with internal teams to achieve sales targets and improve overall performance.
- Develop and implement effective relationship management plans to drive business results.
- Analyze market trends and competitor activity to stay ahead in the industry.

Job Requirements
- Strong knowledge of Assets, Inclusive Banking, SBL, Mortgages, Standalone Merchant OD, and relationship management.
- Excellent communication and interpersonal skills for building strong relationships with clients and colleagues.
- Ability to work in a fast-paced environment and meet sales targets.
- Strong analytical and problem-solving skills for analyzing market trends and competitor activity.
- Experience in the BFSI industry with a focus on relationship management and business growth.
- Ability to work collaboratively as part of a team to achieve business objectives.
Posted 1 week ago
2.0 - 6.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Key Responsibilities:
- Big Data Architecture: Design, build, and implement scalable Big Data solutions to process and analyze vast datasets in a timely and efficient manner.
- Data Pipeline Development: Develop ETL (Extract, Transform, Load) pipelines for large-scale data processing. Ensure data pipelines are automated, scalable, and robust enough to handle high volumes of data.
- Distributed Systems: Work with distributed computing frameworks (e.g., Apache Hadoop, Apache Spark, Flink) to process big datasets across multiple systems and clusters.
- Data Integration: Integrate data from multiple sources (structured, semi-structured, and unstructured) into a unified data architecture.
Posted 1 week ago
1.0 - 4.0 years
1 - 4 Lacs
Hubli
Work from Office
We are looking for a highly skilled and experienced Legal Officer to join our team at Equitas Small Finance Bank.

Roles and Responsibilities
- Manage and oversee legal matters related to mortgages and other financial products.
- Provide legal support and guidance to internal stakeholders on various banking operations.
- Conduct legal research and analysis to ensure compliance with regulatory requirements.
- Develop and implement effective legal strategies to mitigate risks and protect the bank's interests.
- Collaborate with cross-functional teams to achieve business objectives.
- Ensure all legal documents and contracts are properly executed and stored.

Job Requirements
- Strong knowledge of legal principles and practices applicable to the BFSI industry.
- Experience working with SBL or similar institutions is preferred.
- Excellent analytical and problem-solving skills with attention to detail.
- Ability to work independently and as part of a team.
- Strong communication and interpersonal skills.
- Familiarity with mortgage laws and regulations is essential.
Posted 1 week ago
5.0 years
0 Lacs
India
Remote
Location: Remote

Role Description
This is a full-time remote role for an AWS Data Engineer. The Data Engineer will be responsible for tasks related to data engineering, data modeling, ETL processes, data warehousing, and data analytics.

Qualifications
- Data engineering, data modeling, and ETL skills.
- Experience with data warehousing and data analytics.
- 5+ years of experience with AWS data services: Glue, Data Pipeline, S3, Redshift, Athena, and RDS.
- 5+ years of experience with AWS QuickSight for developing dashboards and custom visualization reports.
- 10+ years in IT or a technology-related field.
- Proficiency in programming languages such as Python or SQL.
- Strong problem-solving and analytical skills.
- Experience with big data technologies such as Hadoop or Spark.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
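To make the Glue/Athena portion of the stack concrete, here is an illustrative boto3 call pattern: refresh the Data Catalog with a crawler, then query the cataloged table through Athena. The crawler name, database, query, and S3 results location are hypothetical.

```python
# Hedged sketch of the Glue + Athena workflow. All resource names are placeholders.
import time
import boto3

glue = boto3.client("glue", region_name="us-east-1")
athena = boto3.client("athena", region_name="us-east-1")

glue.start_crawler(Name="sales-crawler")  # refresh the Data Catalog schema

run = athena.start_query_execution(
    QueryString="SELECT region, SUM(amount) FROM sales GROUP BY region",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://query-results-bucket/"},
)

# Poll until the query finishes (naive loop; production code would back off).
while True:
    state = athena.get_query_execution(QueryExecutionId=run["QueryExecutionId"])
    status = state["QueryExecution"]["Status"]["State"]
    if status in ("SUCCEEDED", "FAILED", "CANCELLED"):
        print(status)
        break
    time.sleep(2)
```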
Posted 1 week ago
7.0 - 12.0 years
8 - 13 Lacs
Chennai
Work from Office
Overview
We are looking for a highly skilled Lead Engineer to spearhead our data and application migration projects. The ideal candidate will have in-depth knowledge of cloud migration strategies, especially with AWS, and hands-on experience in large-scale migration initiatives. This role requires strong leadership abilities, technical expertise, and a keen understanding of both the source and target platforms.

Responsibilities
- Lead end-to-end migration projects, including planning, design, testing, and implementation.
- Collaborate with stakeholders to define migration requirements and goals.
- Perform assessments of existing environments to identify the scope and complexity of migration tasks.
- Design and architect scalable migration strategies, ensuring minimal downtime and business continuity.
- Oversee the migration of on-premises applications, databases, and data warehouses to cloud infrastructure.
- Ensure the security, performance, and reliability of migrated workloads.
- Provide technical leadership and guidance to the migration team, ensuring adherence to best practices.
- Troubleshoot and resolve any technical challenges related to the migration process.
- Collaborate with cross-functional teams, including infrastructure, development, and security.
- Document migration procedures and lessons learned for future reference.
Posted 1 week ago
4.0 - 5.0 years
6 - 7 Lacs
Mumbai, Pune, Chennai
Work from Office
Job Category: IT
Job Type: Full Time
Job Location: Bangalore, Chennai, Mumbai, Pune
Experience: 5+ years

We need Azure Databricks with QA.

Must Have
- Hands-on SQL, including writing SQL queries from a test-script implementation perspective.
- Test scenarios and QA concepts with respect to the Azure stack.
- Knowledge of Azure Databricks, Azure Data Factory, and Spark SQL.
- 4-5 years of testing experience in Azure Databricks.
- Strong experience in SQL along with performing Azure Databricks quality assurance.
- Ability to understand complex data systems by working closely with engineering and product teams.
- Ability to develop scalable and maintainable applications to extract, transform, and load data in various formats to SQL Server, Hadoop Data Lake, or other data storage locations.

Kind note: please apply or share your resume only if it matches the above criteria.
Posted 1 week ago
6.0 - 11.0 years
8 - 13 Lacs
Hyderabad
Work from Office
Job Description
You will be a part of our Data Engineering team, focused on delivering exceptional results for our clients. A large portion of your time will be spent in the weeds working alongside your team: architecting, designing, implementing, and optimizing data solutions. You'll work with the team to deliver, migrate, and/or scale cloud data solutions, and build pipelines and scalable analytic tools using leading technologies including AWS, Azure, GCP, Spark, and Hadoop.

What you'll be doing
- Develop data pipelines to move and transform data from various sources to data warehouses.
- Ensure the quality, reliability, and scalability of the organization's data infrastructure.
- Optimize data processing and storage for performance and cost-effectiveness.
- Collaborate with data scientists, analysts, and other stakeholders to understand their requirements and develop solutions to meet their needs.
- Continuously monitor and troubleshoot data pipelines to ensure their reliability and availability.
- Stay up to date with the latest trends and technologies in data engineering and apply them to improve our data capabilities.

Qualifications
- Bachelor's degree in Computer Science, Software Engineering, or a related field.
- 6+ years of experience in data engineering or a related field.
- Strong programming skills in Python or Scala.
Posted 1 week ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About The Opportunity
A leading player in the Artificial Intelligence and Machine Learning sector, we specialize in developing cutting-edge solutions that transform data into actionable intelligence. Our team delivers innovative algorithms and models that drive performance and user engagement across various industries. We are on the lookout for a seasoned AI/ML Developer with significant experience to join our dynamic team in India and contribute to exciting projects that impact the future.

Role & Responsibilities
- Design, develop, and deploy state-of-the-art AI/ML models that solve complex business problems.
- Collaborate effectively with cross-functional teams to translate business requirements into technical specifications for data-driven applications.
- Implement machine learning algorithms and optimize the performance of existing models through rigorous testing and data analysis.
- Integrate AI/ML solutions into production environments, ensuring robustness and scalability.
- Mentor junior developers, providing guidance on best practices and the latest industry trends.
- Stay updated with emerging AI technologies and frameworks while innovating new solutions that enhance our product offerings.

Skills & Qualifications
Must-Have
- 7+ years of experience in AI/ML development with a solid understanding of algorithms.
- Proficiency in Python and libraries such as TensorFlow, PyTorch, and Scikit-learn.
- Experience with natural language processing (NLP) and image processing techniques.
- Strong background in data analysis and statistical modeling.
- Familiarity with deployment tools and practices in cloud environments.
- Hands-on experience with big data tools like Apache Spark or Hadoop.

Preferred
- Experience with cloud platforms such as AWS or Azure for ML deployments.
- Knowledge of model versioning and monitoring strategies.
- Familiarity with RESTful APIs and microservices architecture.

Benefits & Culture Highlights
- Collaborative work environment focused on innovation and continuous learning.
- Opportunities for professional development and career advancement.
- Competitive compensation package and employee benefits.

Skills: Apache Spark, Hadoop, model deployment, Python, model versioning, monitoring strategies, Scikit-learn, AWS, microservices architecture, deep learning, GCP, image processing, PyTorch, statistical modeling, machine learning, cloud environments, TensorFlow, AI/ML development, natural language processing (NLP), RESTful APIs, Azure, AI/ML, data analysis, deployment tools
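As a small illustration of the train-evaluate-persist loop the role describes, here is a scikit-learn sketch. The data is synthetic; a real project would substitute domain features, proper cross-validation, and a deployment step.

```python
# Minimal model-building sketch: train, evaluate, and persist a classifier.
# Uses synthetic data purely for illustration.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
import joblib

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
joblib.dump(model, "model.joblib")  # persist the trained model for serving
```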
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Greetings from LTIMindtree!!
About the job
Are you looking for a new career challenge? With LTIMindtree, are you ready to embark on a data-driven career? Working for a global leading manufacturing client, providing an engaging product experience through best-in-class PIM implementation and building rich, relevant, and trusted product information across channels and digital touchpoints so their end customers can make an informed purchase decision, will surely be a fulfilling experience.
Location: Pan India
Key Skills: Hadoop, Spark, SparkSQL, Scala
Interested candidates kindly apply via the link below and share an updated CV to Hemalatha1@ltimindtree.com
https://forms.office.com/r/zQucNTxa2U
Job Description
Experience in the Scala programming language
Experience in Big Data technologies including Spark, Scala, and Kafka
A good understanding of organizational strategy, architecture patterns (Microservices, Event Driven), and technology choices, with the ability to coach the team in executing in alignment with these guidelines
Ability to apply organizational technology patterns effectively in projects and make recommendations on alternate options
Hands-on experience working with large volumes of data, including different patterns of data ingestion, processing (batch and real-time), movement, storage, and access, both internal and external to the BU, with the ability to make independent decisions within the scope of a project (illustrated in the sketch after this posting)
A good understanding of data structures and algorithms
Ability to test, debug, and fix issues within established SLAs
Ability to design software that is easily testable and observable
Understanding of how team goals fit a business need
Ability to identify business problems at the project level and provide solutions
Understanding of data access patterns, streaming technology, data validation, data performance, and cost optimization
Strong SQL skills
Why join us?
Work in industry-leading implementations for Tier-1 clients
Accelerated career growth and global exposure
Collaborative, inclusive work environment rooted in innovation
Exposure to a best-in-class automation framework
Innovation-first culture: we embrace automation, AI insights, and clean data
Know someone who fits this perfectly? Tag them – let's connect the right talent with the right opportunity. DM or email to know more. Let's build something great together!
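The posting calls for Scala, but to keep all sketches in this document in one language, here is a hedged PySpark illustration of the two ingestion patterns named above, batch and real-time; the bucket paths, Kafka broker, and topic name are hypothetical, and the streaming read assumes the spark-sql-kafka connector is on the classpath.

```python
# Illustrative sketch of batch vs. real-time ingestion in Spark.
# Paths, broker address, and topic are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingestion-patterns").getOrCreate()

# Batch ingestion: load a day's files, aggregate, and store.
batch = spark.read.parquet("s3://example-bucket/products/2024-01-01/")
daily = batch.groupBy("category").agg(F.count("*").alias("n_products"))
daily.write.mode("overwrite").parquet("s3://example-bucket/agg/daily/")

# Real-time ingestion: consume a Kafka topic and stream results out.
stream = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical
          .option("subscribe", "product-updates")            # hypothetical
          .load())

query = (stream.selectExpr("CAST(value AS STRING) AS payload")
         .writeStream
         .format("parquet")
         .option("path", "s3://example-bucket/stream/products/")
         .option("checkpointLocation", "s3://example-bucket/chk/products/")
         .start())
```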
Posted 1 week ago
5.0 years
0 Lacs
New Delhi, Delhi, India
Remote
About Agoda
Agoda is an online travel booking platform for accommodations, flights, and more. We build and deploy cutting-edge technology that connects travelers with a global network of 4.7M hotels and holiday properties worldwide, plus flights, activities, and more. Based in Asia and part of Booking Holdings, our 7,100+ employees representing 95+ nationalities in 27 markets foster a work environment rich in diversity, creativity, and collaboration. We innovate through a culture of experimentation and ownership, enhancing the ability for our customers to experience the world.
Our Purpose – Bridging the World Through Travel
We believe travel allows people to enjoy, learn and experience more of the amazing world we live in. It brings individuals and cultures closer together, fostering empathy, understanding and happiness. We are a skillful, driven and diverse team from across the globe, united by a passion to make an impact. Harnessing our innovative technologies and strong partnerships, we aim to make travel easy and rewarding for everyone.
Get to Know our Team:
The Security Department oversees security, governance, risk management, compliance, and security operations for all of Agoda. We are vigilant in ensuring there is no breach or vulnerability threatening our company or endangering our employees, to keep Agoda safe and protected. Given that the security ecosystem is moving forward at tremendous speed, we like to be early adopters of recent technology and products. This is a great challenge for those who want to work with the best technology in a dynamic and advanced environment.
The Opportunity:
As a Security Analyst, you will focus on identifying, analyzing, and remediating vulnerabilities across our environment. You will be hands-on with penetration testing and vulnerability management, ensuring our systems remain secure and resilient.
In this Role, you'll get to:
Develop security automation tools to implement solutions at scale
Triage security findings from multiple tools and work with hundreds of teams to get them remediated within the right SLA (a minimal triage sketch follows this posting)
Conduct security assessments through code reviews, vulnerability assessments, penetration testing and risk analysis
Research the negative effects of a vulnerability, from minimizing the impact to altering security controls for future prevention
Identify potential threats so that the organization can protect itself from malicious hackers; this includes vulnerability management, a bug bounty program, and penetration testing
Be responsible for developing security trainings for developers
Work with the DevSecOps team on integrating tools into CI/CD, as well as fine-tuning rules and precision
What you'll Need to Succeed:
5+ years in the information security field
5+ years of experience with penetration testing (web, infra, mobile, APIs, etc.) and vulnerability management
Minimum 1 year of experience running a bug bounty platform
Minimum 2 years of experience with any public/private cloud environment (OpenShift, Rancher, K8s, AWS, GCP, Azure, etc.)
Experience performing security testing, e.g. code review and web application security testing
Familiarity with GitLab, DefectDojo, JIRA, Confluence
Proficient in one or more programming languages such as Python, Go, Node.js, etc.
Familiar with analytics platforms, APIs, and data stores such as GraphQL, REST APIs, Postgres, MSSQL, Kafka, Hadoop, S3, etc.
Strong knowledge of security assessment tools such as security scanners (Nessus, Acunetix and similar platforms) and fuzzers
It's great if you have:
Knowledge of container image security, dependency checking, fuzzing and license scanning
Familiarity with security incident response processes and 0-days
Security certifications
A relocation package is provided in case you prefer to relocate to Bangkok, Thailand.
Our benefits are:
Hybrid Working Model
WFH Set Up Allowance
30 Days of Remote Working from anywhere globally every year
Employee discount for accommodation globally
Global team of 90+ nationalities
40+ offices and 25+ countries
Annual CSR / Volunteer Time Off
Benevity subscription for employee donations
Volunteering opportunities globally
Free Headspace subscription
Free Odilo & Udemy subscriptions
Access to Employee Assistance Program (third party for personal and workplace support)
Enhanced Parental Leave
Life, TPD & Accident Insurance
Equal Opportunity Employer
At Agoda, we pride ourselves on being a company represented by people of all different backgrounds and orientations. We prioritize attracting diverse talent and cultivating an inclusive environment that encourages collaboration and innovation. Employment at Agoda is based solely on a person's merit and qualifications. We are committed to providing equal employment opportunity regardless of sex, age, race, color, national origin, religion, marital status, pregnancy, sexual orientation, gender identity, disability, citizenship, veteran or military status, and other legally protected characteristics.
We will keep your application on file so that we can consider you for future vacancies, and you can always ask to have your details removed from the file. For more details please read our privacy policy.
Disclaimer
We do not accept any terms or conditions, nor do we recognize any agency's representation of a candidate, from unsolicited third-party or agency submissions. If we receive unsolicited or speculative CVs, we reserve the right to contact and hire the candidate directly without any obligation to pay a recruitment fee.
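As a rough illustration of the SLA-driven triage responsibility mentioned above, here is a minimal sketch that buckets scanner findings by severity and assigns a remediation due date; the finding format and the SLA values are assumptions for illustration, not Agoda's actual policy or tooling.

```python
# Minimal triage sketch: annotate vulnerability findings with a
# remediation due date based on severity. The finding shape and the
# SLA policy below are assumptions, not a real program's rules.
import json
from datetime import date, timedelta

SLA_DAYS = {"critical": 7, "high": 30, "medium": 90, "low": 180}  # assumed

def triage(findings):
    """Return findings annotated with a due date, most urgent first."""
    triaged = []
    for f in findings:
        severity = f.get("severity", "low").lower()
        days = SLA_DAYS.get(severity, 180)
        triaged.append({**f,
                        "severity": severity,
                        "due": str(date.today() + timedelta(days=days))})
    return sorted(triaged, key=lambda f: SLA_DAYS.get(f["severity"], 180))

if __name__ == "__main__":
    sample = json.loads('[{"id": "VULN-1", "severity": "critical"},'
                        ' {"id": "VULN-2", "severity": "medium"}]')
    for f in triage(sample):
        print(f["id"], f["severity"], "due", f["due"])
```

In practice a tool like this would sit between the scanners and the ticketing system (the posting names DefectDojo and JIRA), filing one tracked item per finding with the due date attached.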
Posted 1 week ago