6304 Scala Jobs - Page 8

Set up a Job Alert
JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

The Zendesk Core Services Packaging and Consumption team is seeking a Senior Software Engineer - Backend to drive successful feature adoption for Zendesk customers. The ideal candidate has experience analyzing varied data sources with strong SQL skills, a solid understanding of domain-driven design, and a willingness to explore the unknown.

Your responsibilities will include collaborating with product management, architecture, and engineers to design simple solutions to complex problems, delivering on commitments, following best practices in frameworks and tools, championing proper test coverage, participating in code reviews and design discussions, and partnering across all areas of the software development life cycle. You will also work across team and organization boundaries to standardize and integrate services, tools, and workflows.

To excel in this role, you should have at least 4 years of relevant experience in an object-oriented language such as Scala or Java; proficiency with databases like MySQL and/or DynamoDB; an analytical mindset and good articulation skills; experience with CI/CD and delivery systems; knowledge of API design, distributed systems, and Kafka; familiarity with log aggregation tools like Datadog; and a customer-first mentality in service incident management and data analysis. You should also have a hunger for learning new technologies, a collaborative attitude, and excellent written and verbal communication skills. Bonus skills include experience with JavaScript/TypeScript, SaaS-based products, the AWS stack (e.g., Aurora), data warehouse technologies like Snowflake, and Ruby on Rails.

Please note that candidates must be physically located in, and plan to work from, Karnataka or Maharashtra. The role offers a hybrid experience with a mix of onsite and remote work; the specific in-office schedule will be determined by the hiring manager.

Zendesk's software aims to bring calm to the world of customer service, powering billions of conversations with renowned brands. We believe in providing a fulfilling and inclusive experience for our employees through hybrid working, allowing for in-person connection, collaboration, and learning, along with remote work flexibility. Zendesk is committed to making reasonable accommodations for applicants with disabilities and disabled veterans. If you require an accommodation for the application process or testing, please reach out to peopleandplaces@zendesk.com with your specific request.
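For context on the Kafka experience this role calls for, here is a minimal Scala sketch of a consumer polling a usage-events topic with the standard Kafka client. The broker address, consumer group, and topic name are hypothetical placeholders, not details from the posting.

```scala
import java.time.Duration
import java.util.Properties
import scala.jdk.CollectionConverters._
import org.apache.kafka.clients.consumer.KafkaConsumer
import org.apache.kafka.common.serialization.StringDeserializer

object UsageEventConsumer {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.put("bootstrap.servers", "localhost:9092")   // assumed broker address
    props.put("group.id", "packaging-consumption")     // hypothetical consumer group
    props.put("key.deserializer", classOf[StringDeserializer].getName)
    props.put("value.deserializer", classOf[StringDeserializer].getName)

    val consumer = new KafkaConsumer[String, String](props)
    consumer.subscribe(List("feature-usage").asJava)   // hypothetical topic

    try {
      while (true) {
        // Poll in a loop; a production service would also manage offsets,
        // errors, and graceful shutdown
        val records = consumer.poll(Duration.ofMillis(500))
        records.asScala.foreach { r =>
          println(s"offset=${r.offset} key=${r.key} value=${r.value}")
        }
      }
    } finally consumer.close()
  }
}
```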

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

You should have about 3-4 years of strong software development experience in a product company, with a minimum of two years of hands-on experience in each of the skills below.

Your responsibilities will involve working on large-scale distributed and highly scalable Big Data processing systems. You must have hands-on experience with tools such as Hadoop, HBase, MapReduce, Hive, and Big Data SQL. You should be proficient in developing software that scales for large-volume batch and online systems, and have experience writing code to process large amounts of structured and unstructured data using MapReduce/Spark or batch processing systems. Proficiency in solving problems using an object-oriented programming language, preferably Java, is required. Additionally, you should have good experience with and exposure to data modeling, querying, and optimization for handling big table data stores. A working knowledge of J2EE technologies, exposure to analytics, and proficiency in Python, Scala, or a functional language is preferred. Experience in R would be a plus.

You must possess excellent analytical capability and problem-solving abilities. Experience working with public and private clouds, *nix environments, scripting, and other toolsets is also necessary. Preferred qualifications include experience in Agile processes with smaller/quicker releases, exposure to predictive analytics and machine learning, and familiarity with virtual environments like VirtualBox. Any experience with Test-Driven Development, continuous integration, and release management will be considered a great advantage for this role.
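As an illustration of the batch-processing experience described above, the following is a small Spark job in Scala that filters and aggregates event records. The input path, field names, and output location are hypothetical.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object BatchAggregation {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("batch-aggregation")
      .getOrCreate()

    // Hypothetical input: one JSON event per line on HDFS
    val events = spark.read.json("hdfs:///data/events/2024-01-01")

    val perUser = events
      .filter(col("status") === "ok")                 // assumed field names
      .groupBy(col("userId"))
      .agg(count("*").as("events"), sum("bytes").as("totalBytes"))

    perUser.write.mode("overwrite").parquet("hdfs:///data/daily-usage")
    spark.stop()
  }
}
```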

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Delhi

On-site

As a Backend Developer at Eskimi, you will play a crucial role in shaping our Ad Exchange by developing features that directly impact performance and drive revenue. Your work will be at the core of our platform, from fine-tuning real-time bidding to scaling systems that handle massive traffic. You will collaborate closely with a cross-functional team of engineers, product managers, and data analysts in an Agile environment with frequent iterations and releases. You will dive deep into the fast-paced world of Ad Tech, designing and implementing features that generate revenue. You will work with various storage systems and with modern programming languages such as Java, C#, Go, or Scala, so strong communication skills are essential. You will continuously improve and refactor the existing codebase to ensure it remains clean, scalable, and robust.

At Eskimi, we offer flexible work arrangements, including hybrid work models and remote work options. Professional development opportunities are available through programs like Leaders Assembly and Mentorship programs, and you will have access to regular learning sessions and external consultants. Our recognition culture celebrates achievements, with bonus systems and the Bonusly recognition system highlighting accomplishments. Join us at Eskimi and be part of a fast-growing AdTech company that is changing the landscape of digital advertising globally. Together, we can achieve extraordinary things and reach new heights in the world of advertising.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

As a Backend Engineer at Shipmnts, you will work in a fast-paced, collaborative, and agile environment, following Scrum/Kanban methodologies to build custom Ruby internal and public-facing applications. Your responsibilities will include integrating user-facing elements developed by front-end developers with server-side logic, writing clean and efficient code, designing robust and scalable features, and contributing in all phases of the development lifecycle. You will analyze software requirements, provide solutions, and oversee the implementation of engineering best practices. Additionally, you will be responsible for optimizing the application for maximum speed and scalability, as well as designing and implementing data storage solutions.

Shipmnts is at the forefront of applying technology and problem-solving to the logistics industry in innovative ways. We encourage taking smart risks and championing new ideas. We prioritize creating beautifully architected apps that are polished, fast, and a joy to use, without compromising on quality.

To be successful in this role, you should have at least 2 years of experience in backend development. A strong background in Ruby and Python, along with familiarity with web frameworks, ORM libraries, modern databases, and browser-based technologies, is essential. Experience with Test-Driven Development (TDD), continuous integration, Scrum, and Kanban, plus knowledge of the major cloud providers, is also required. Proficiency in Ruby on Rails, along with common libraries such as RSpec and Resque, is a plus.

We are looking for candidates who are calm under pressure, have a great work ethic, and communicate effectively. You should be self-aware, always pushing for a higher standard, and open to new ideas and personal feedback. Being detail-oriented, uncompromising on quality, and a problem solver with a global mindset are qualities we value in our team members. Bonus qualifications include proficiency with NoSQL data stores and knowledge of GraphQL, container orchestration, and event sourcing. If you are anti-fragile, continuously strive for improvement, and have experience working on a DevOps team, we would love to hear from you. Join us in pushing the boundaries of what's possible with Rails, Python, and Scala at Shipmnts.

Posted 1 week ago

Apply

4.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Big Data Architect with 4 years of relevant experience, you will be responsible for designing and implementing scalable solutions using technologies such as Spark, Scala, Hadoop MapReduce/HDFS, Pig, Hive, and AWS cloud computing. Your role will involve hands-on work with tools like EMR, EC2, Pentaho BI, Impala, Elasticsearch, Apache Kafka, Node.js, Redis, Logstash, StatsD, Ganglia, Zeppelin, Hue, and Kettle. Additionally, you should have sound knowledge of machine learning, ZooKeeper, Bootstrap.js, Apache Flume, Fluentd, collectd, Sqoop, Presto, Tableau, R, Grok, MongoDB, Apache Storm, and HBase.

To excel in this role, you must have a strong development background in both Core Java and Advanced Java. A Bachelor's degree in Computer Science, Information Technology, or an MCA is required, along with 4 years of relevant experience. Your analytical and problem-solving skills will be put to the test as you tackle complex data challenges. Attention to detail is crucial, and you should possess excellent written and verbal communication skills. This position requires you to work independently while also being an effective team player. With 10 years of overall experience, you will be based in either Pune or Hyderabad, India. Join us in this dynamic role where you will have the opportunity to contribute to cutting-edge data architecture solutions.
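For a flavor of the Spark-and-Hive stack this listing centers on, here is a minimal Scala sketch that runs an aggregate query against a Hive table from Spark. The database, table, and column names are hypothetical.

```scala
import org.apache.spark.sql.SparkSession

object HiveReport {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("hive-report")
      .enableHiveSupport()  // assumes a Hive metastore is reachable from the cluster
      .getOrCreate()

    // Hypothetical warehouse table sales.orders(region, ...)
    val topRegions = spark.sql(
      """SELECT region, COUNT(*) AS orders
        |FROM sales.orders
        |GROUP BY region
        |ORDER BY orders DESC
        |LIMIT 10""".stripMargin)

    topRegions.show()
    spark.stop()
  }
}
```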

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

R1 India is proud to be recognized as one of the Top 25 Best Companies to Work For in 2024 by the Great Place to Work Institute. This accolade marks our second consecutive appearance on this esteemed Best Workplaces list, following our Top 50 recognition in 2023. At R1 India, we prioritize employee wellbeing, inclusion, and diversity, as evidenced by our various prestigious recognitions: R1 India has been ranked among the Best in Healthcare, the Top 100 Best Companies for Women by Avtar & Seramount, and the Top 10 Best Workplaces in Health & Wellness. Our mission is to revolutionize the healthcare industry through our innovative revenue cycle management services. We are dedicated to enhancing the efficiency of healthcare systems, hospitals, and physician practices to ultimately improve healthcare for all. With a global workforce exceeding 30,000 employees, our team in India comprises over 16,000 professionals located in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee is valued, respected, and appreciated, supported by a comprehensive range of employee benefits and engagement activities.

**Position Title:** Specialist
**Reports to:** Program Manager - Analytics BI
**Location:** Noida

**Position Summary:** As a Specialist, you will collaborate with the development team and take on individual development tasks. You are expected to possess strong technical skills and communicate effectively with clients.

**Key Duties & Responsibilities:**
- Work on Specialist data engineering projects for end-to-end (E2E) analytics.
- Ensure timely project delivery.
- Mentor and guide other team members.
- Gather requirements from clients and maintain effective communication.
- Create timely documentation for the knowledge base, user guides, and various communication systems.
- Deliver against business needs, team goals, and objectives.
- Handle large datasets in various formats, conduct integrity/QA checks, and reconcile accounting systems.
- Lead efforts to troubleshoot and resolve process or system-related issues.
- Uphold and comply with company policies, procedures, and the Standards of Business Ethics and Conduct.
- Work within an Agile methodology.

**Experience, Skills, and Knowledge:**
- Bachelor's degree in computer science or equivalent experience required; B.Tech/MCA preferred.
- Minimum of 3-4 years of experience.
- Excellent communication skills and a strong commitment to delivering top-notch service.
- Technical proficiency in Spark, Scala, Azure Data Factory, Azure Databricks, Data Lake, SQL, Snowflake, SSIS, ADF, Python, and Astronomer Airflow.
- Exposure to Microsoft Azure Data Fundamentals.

**Key Competency Profile:**
- Own your development by implementing and sharing your learnings.
- Motivate team members to perform at their highest level.
- Act with integrity and adhere to the company's values.
- Proactively identify problems and solutions for yourself and others.
- Communicate effectively when facing challenges.
- Demonstrate accountability and responsibility.

In our dynamic healthcare environment, we leverage our collective expertise to deliver innovative solutions. Our rapidly expanding team provides opportunities for learning and growth through collaboration, meaningful interactions, and the freedom to explore professional interests. At R1 India, associates are empowered to contribute, innovate, and create impactful work that serves communities worldwide. We foster a culture of excellence that drives customer success and enhances patient care. Additionally, we believe in giving back to the community and offer a competitive benefits package. Learn more about us at r1rcm.com. Connect with us on Facebook for more updates and insights.
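Since the role highlights integrity/QA checks over large datasets, the sketch below shows one simple form such a check might take in Spark with Scala: a row-count reconciliation between a staged extract and its loaded copy. The storage paths and dataset are hypothetical.

```scala
import org.apache.spark.sql.SparkSession

object ReconciliationCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("recon-check").getOrCreate()

    // Hypothetical locations: raw extract vs. curated copy in ADLS
    val source = spark.read.parquet("abfss://raw@exampleacct.dfs.core.windows.net/claims")
    val target = spark.read.parquet("abfss://curated@exampleacct.dfs.core.windows.net/claims")

    val srcCount = source.count()
    val tgtCount = target.count()

    // Fail the run on any row-count drift; real checks would also compare
    // checksums or key-level differences
    require(srcCount == tgtCount, s"Row count mismatch: source=$srcCount target=$tgtCount")
    println(s"Reconciliation passed: $srcCount rows")
    spark.stop()
  }
}
```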

Posted 1 week ago

Apply

0.0 - 4.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an IT intern at Qualcomm India Private Limited, you will collaborate with a team of IT professionals and engineers to develop, implement, and maintain various technologies within the organization. With a background in computer science, engineering, or information technology, you will have the opportunity to contribute to a range of projects.

During your internship, you may be involved in tasks such as framework rollouts, system-level integration issues, designing and integrating new features, project documentation, data analysis, network security, vendor management, development, testing, application, database, and infrastructure maintenance and support, project management, as well as server and system administration.

The technologies you may work with include operating systems like Android, Linux, Windows, and Chrome, as well as native platforms such as RIM. You will also engage with the Microsoft Office Suite; packaged/cloud services like Salesforce, ServiceNow, and Workday; enterprise service management tools; cloud computing services like AWS and Azure; version control tools; high-performance computing, virtualization, firewalls, VPN technologies, storage, monitoring tools, and proxy services; frameworks like Hadoop, Ruby on Rails, Grails, Angular, and React; programming languages such as Java, Python, JavaScript, Objective-C, Go, Scala, and .NET; databases like Oracle, MySQL, PostgreSQL, MongoDB, Elasticsearch, and MapR-DB; analytics tools like ETL and visualization; and DevOps tools like containers, Jenkins, Ansible, Chef, and Azure DevOps.

Qualcomm is an equal opportunity employer committed to providing accessibility accommodations for individuals with disabilities during the application and hiring process. If you require any accommodations, you can reach out to myhr.support@qualcomm.com or Qualcomm's toll-free number. The company also expects its employees to comply with all applicable policies and procedures, including security measures for protecting confidential information. Please note that Qualcomm's Careers Site is for individuals seeking jobs directly at Qualcomm, and submissions from staffing and recruiting agencies will be considered unsolicited. For more information about this internship role, you can contact Qualcomm Careers directly.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

You have experience with Azure Databricks, Azure Data Factory, and Azure Data components such as Azure SQL Database, Azure SQL Data Warehouse, and Synapse Analytics. You are proficient in Python, PySpark, Scala, and Hive programming, and you have experience building CI/CD pipelines in data environments. Your primary skills include ADF (Azure Data Factory) or ADB (Azure Databricks). Additionally, you possess excellent verbal and written communication skills, along with the ability to work both independently and within a team environment.

For the AWS Data Engineer role, you should have at least 2 years of experience working on the AWS Cloud platform with strong knowledge of Python. Knowledge of AWS services such as S3, Glue, API Gateway, Crawler, Athena, Lambda, DynamoDB, and Redshift is advantageous. Experience or knowledge of streaming technologies, particularly Kafka, is essential. Familiarity with SQL, good analytical skills, experience working on Linux platforms, and an understanding of the pros, cons, and cost impact of the AWS services being used are also required, as are strong communication skills.

For the GCP Data Engineer position, you must have a minimum of 4 years' experience in GCP data engineering. You should possess strong data engineering skills using Java or Python, or Spark on Google Cloud. Experience in handling big data, Agile methodologies, ETL/ELT, data movement, and data processing is essential. Certification as a Professional Google Cloud Data Engineer would be an added advantage, as would proven analytical skills, a problem-solving attitude, and the ability to function effectively in a cross-team environment. Your primary skills in this role include GCP data engineering with Java/Python/Spark on GCP, programming experience in Python, Java, or PySpark, experience with GCS (Cloud Storage), Composer (Airflow), and BigQuery, and experience building data pipelines using these skills.
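To make the GCP pipeline skills concrete, here is a minimal Spark (Scala) sketch that reads Parquet from Cloud Storage and writes an aggregate to BigQuery. It assumes the open-source spark-bigquery connector is on the classpath; the bucket, dataset, and column names are hypothetical.

```scala
import org.apache.spark.sql.SparkSession

object GcsToBigQuery {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("gcs-to-bq").getOrCreate()

    // Hypothetical landing bucket with Parquet files
    val orders = spark.read.parquet("gs://example-landing/orders/")

    orders.createOrReplaceTempView("orders")
    val summary = spark.sql(
      "SELECT customer_id, SUM(amount) AS total FROM orders GROUP BY customer_id")

    summary.write
      .format("bigquery")
      .option("table", "analytics.customer_totals")  // assumed dataset.table
      .option("temporaryGcsBucket", "example-tmp")   // staging bucket used by the connector
      .mode("overwrite")
      .save()

    spark.stop()
  }
}
```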

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

The Applications Development Intermediate Programmer Analyst position is an intermediate-level role in which you will contribute to the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your main objective will be to assist in applications systems analysis and programming activities.

You will utilize your knowledge of applications development procedures and concepts, along with basic knowledge of technical areas, to identify and define necessary system enhancements. This includes using script tools, analyzing code, and consulting with users, clients, and other technology groups to recommend programming solutions. Additionally, you will install and support customer exposure systems and apply fundamental knowledge of programming languages to design specifications. As an Intermediate Programmer Analyst, you will analyze applications to identify vulnerabilities and security issues, conduct testing and debugging, and serve as an advisor or coach to new or lower-level analysts. You will be responsible for identifying problems, analyzing information, and making evaluative judgments to recommend and implement solutions. Operating with a limited level of direct supervision, you will exercise independence of judgment and autonomy while acting as a subject matter expert to senior stakeholders and/or other team members.

In this role, it is crucial to appropriately assess risk when making business decisions, with a focus on safeguarding Citigroup, its clients, and assets. This includes driving compliance with applicable laws, rules, and regulations, adhering to policies, applying sound ethical judgment, and escalating, managing, and reporting control issues with transparency.

Qualifications:
- 4-6 years of proven experience in developing and managing Big Data solutions using Apache Spark and Scala is required
- Strong programming skills in Scala, Java, or Python
- Hands-on experience with technologies like Apache Hive, Apache Kafka, HBase, Couchbase, Sqoop, Flume, etc.
- Proficiency in SQL and experience with relational databases (Oracle/PL-SQL)
- Experience working on Kafka and JMS/MQ applications
- Familiarity with data warehousing concepts and ETL processes
- Knowledge of data modeling, data architecture, and data integration techniques
- Experience with Java, web services, XML, JavaScript, microservices, SOA, etc.
- Strong technical knowledge of Apache Spark, Hive, SQL, and the Hadoop ecosystem
- Experience developing frameworks and utility services, logging/monitoring, and delivering high-quality software
- Experience creating large-scale, multi-tiered, distributed applications with Hadoop and Spark
- Profound knowledge of implementing different data storage solutions such as RDBMS, Hive, HBase, Impala, and NoSQL databases

Education:
- Bachelor's degree or equivalent experience

This job description provides a high-level overview of the responsibilities and qualifications for the Applications Development Intermediate Programmer Analyst position. Other job-related duties may be assigned as required.
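Reflecting the mix of Spark and relational databases in these qualifications, the following Scala sketch reads an Oracle table over JDBC and lands a filtered result in a Hive-managed table. The connection details, schema, and table names are hypothetical, and the Oracle JDBC driver is assumed to be on the classpath.

```scala
import org.apache.spark.sql.SparkSession

object OracleExtract {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("oracle-extract")
      .enableHiveSupport()  // needed so saveAsTable targets the Hive metastore
      .getOrCreate()

    // Hypothetical connection; credentials are read from the environment
    val trades = spark.read.format("jdbc")
      .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB")
      .option("dbtable", "TRADING.TRADES")
      .option("user", sys.env("DB_USER"))
      .option("password", sys.env("DB_PASS"))
      .load()

    trades.filter("STATUS = 'SETTLED'")
      .write.mode("overwrite")
      .saveAsTable("curated.settled_trades")  // hypothetical target table
    spark.stop()
  }
}
```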

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Haryana

On-site

About KPMG in India
KPMG entities in India are professional services firms affiliated with KPMG International Limited; KPMG was established in India in August 1993. Leveraging the global network of firms, professionals at KPMG in India are well-versed in local laws, regulations, markets, and competition. With offices across various cities in India, including Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara, and Vijayawada, KPMG entities offer services to national and international clients across sectors. The services aim to be rapid, performance-based, industry-focused, and technology-enabled, reflecting a deep understanding of global and local industries and the Indian business environment.

Major Duties & Responsibilities
- Collaborate with business stakeholders and cross-functional SMEs to understand business context and key questions.
- Develop proofs of concept (POCs) / minimum viable products (MVPs) and guide them through production deployment.
- Influence machine learning strategy for Digital programs and projects.
- Provide solution recommendations balancing speed to market and analytical soundness.
- Explore design options, assess efficiency and impact, and develop approaches for improvement.
- Develop analytical/modelling solutions using commercial and open-source tools like Python, R, and TensorFlow.
- Formulate model-based solutions by combining machine learning algorithms with other techniques.
- Design, adapt, and visualize solutions based on evolving requirements.
- Create algorithms to extract information from large, multiparametric data sets.
- Deploy algorithms to production for actionable insights from large databases.
- Compare and recommend optimal techniques from various methodologies.
- Develop and embed automated processes for predictive model validation, deployment, and implementation.
- Work on multiple pillars of AI, including cognitive engineering, conversational bots, and data science.
- Ensure solutions meet high standards of performance, security, scalability, maintainability, and reliability.
- Lead discussions and provide thought leadership and expertise in machine learning techniques.
- Facilitate sharing of ideas, learnings, and best practices across geographies.

Required Qualifications
- Minimum Bachelor of Science or Bachelor of Engineering.
- Strong analytical and problem-solving skills and programming knowledge.
- Proficiency in statistical/programming software (e.g., R, Python) and data querying languages (e.g., SQL, Hadoop/Hive, Scala).
- Hands-on skills in feature engineering and hyperparameter optimization.
- Experience in producing high-quality code, tests, and documentation.
- Familiarity with Microsoft Azure or AWS data management tools.
- Understanding of descriptive and exploratory statistics, predictive modelling, decision trees, machine learning algorithms, and deep learning methodologies.
- Proficiency in statistical concepts and ML algorithms.
- Knowledge of Agile principles and process.
- Ability to lead, manage, build, and deliver customer business results through data scientists or a professional services team.
- Strong communication skills and ability to share ideas effectively.
- Self-motivated, proactive problem solver, capable of working independently and in teams.

Qualifications: B.Tech/M.Tech/MCA/M.Sc

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

R1's ETL Development team is seeking a Staff Software Engineer (ETL) with 8-10 years of experience. In this role, you will play a crucial part in designing, developing, and leading the implementation of ETL processes and data architecture solutions. Reporting to the Engineering Manager, you will plan, design, and implement a centralized data warehouse solution for data acquisition, ingestion, large-scale data processing, and automation/optimization across all company products.

Your key responsibilities will include leading the design and architecture of ETL processes and data integration solutions; developing and maintaining ETL workflows using tools like SSIS, Azure Databricks, SparkSQL, or similar; collaborating with stakeholders to ensure seamless integration, transformation, and loading of data; optimizing ETL processes for performance, scalability, and reliability; conducting code reviews, providing technical guidance, and mentoring junior developers; troubleshooting and resolving issues related to ETL processes and data integration; ensuring compliance with data governance, security policies, and best practices; documenting ETL processes; and staying current with the latest trends and technologies in data integration and ETL.

To qualify for this role, you should have a Bachelor's degree in computer science, Information Technology, or a related field, along with 10-12 years of experience in ETL development and data integration. You should possess expertise in ETL tools such as SSIS, T-SQL, Azure Databricks, or similar, and knowledge of SQL/NoSQL data storage mechanisms and Big Data technologies. Experience in data modeling and familiarity with Azure Data Factory, Azure Databricks, and Azure Data Lake are expected, and experience in Scala, SparkSQL, and Airflow is preferred. Strong problem-solving and analytical skills, excellent communication and leadership abilities, the ability to work effectively in a team-oriented environment, experience with agile methodology, and healthcare industry experience are also preferred qualifications.

At R1, you will have the opportunity to work in an evolving healthcare setting where shared expertise is used to deliver innovative solutions. The fast-growing team provides opportunities to learn and grow through rewarding interactions, collaboration, and the freedom to explore professional interests. Associates are encouraged to contribute, innovate, and create meaningful work that impacts the communities served globally. R1 also offers a culture of excellence that drives customer success, improves patient care, and believes in giving back to the community, with a competitive benefits package. To learn more, visit r1rcm.com.
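For a sense of the incremental ETL work described here, the sketch below shows a Scala upsert into a Delta table of the kind commonly run on Azure Databricks. It assumes the Delta Lake library is available (as it is on Databricks); the mount paths and the claim_id business key are hypothetical.

```scala
import io.delta.tables.DeltaTable
import org.apache.spark.sql.SparkSession

object IncrementalLoad {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("incremental-load").getOrCreate()

    // Hypothetical staging extract keyed by claim_id
    val updates = spark.read.parquet("/mnt/staging/claims_delta")

    // Merge (upsert) the changes into the warehouse table
    DeltaTable.forPath(spark, "/mnt/warehouse/claims")
      .as("t")
      .merge(updates.as("u"), "t.claim_id = u.claim_id")
      .whenMatched.updateAll()
      .whenNotMatched.insertAll()
      .execute()

    spark.stop()
  }
}
```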

Posted 1 week ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

Remote

About Fusemachines
Fusemachines is a leading AI strategy, talent, and education services provider. Founded by Sameer Maskey, Ph.D., Adjunct Associate Professor at Columbia University, Fusemachines has a core mission of democratizing AI. With a presence in four countries (Nepal, the United States, Canada, and the Dominican Republic) and more than 450 employees, Fusemachines seeks to bring its global expertise in AI to transform companies around the world.

About The Role
This is a remote full-time position, responsible for designing, building, testing, optimizing, and maintaining the infrastructure and code required for data integration, storage, processing, pipelines, and analytics (BI, visualization, and advanced analytics) from ingestion to consumption, implementing data flow controls, and ensuring high data quality and accessibility for analytics and business intelligence purposes. This role requires a strong foundation in programming and a keen understanding of how to integrate and manage data effectively across various storage systems and technologies. We're looking for someone who can quickly ramp up, contribute right away, and lead the work in Data & Analytics, helping with everything from backlog definition to architecture decisions, and technically leading the rest of the team with minimal oversight. We are looking for a skilled Sr. Data Engineer/Technical Lead with a strong background in Python, SQL, PySpark, Redshift, and AWS cloud-based large-scale data solutions, with a passion for data quality, performance, and cost optimization. The ideal candidate will develop in an Agile environment and would ideally have GCP experience as well, to contribute to the migration from AWS to GCP. This role is perfect for an individual passionate about leading, leveraging data to drive insights, improving decision-making, and supporting the strategic goals of the organization through innovative data engineering solutions.

Qualification / Skill Set Requirement:
Must have a full-time Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field
5+ years of real-world data engineering development experience in AWS and GCP (certifications preferred)
Strong expertise in Python, SQL, PySpark, and AWS in an Agile environment, with a proven track record of building and optimizing data pipelines, architectures, and datasets, and proven experience in data storage, modeling, management, lakes, warehousing, processing/transformation, integration, cleansing, validation, and analytics
A senior person who can understand requirements and design end-to-end solutions with minimal oversight
Strong programming skills in one or more languages such as Python or Scala, and proficiency in writing efficient, optimized code for data integration, storage, processing, and manipulation
Strong knowledge of SDLC tools and technologies, including project management software (Jira or similar), source code management (GitHub or similar), CI/CD systems (GitHub Actions, AWS CodeBuild, or similar), and binary repository managers (AWS CodeArtifact or similar)
Good understanding of data modeling and database design principles; able to design and implement efficient database schemas that meet the requirements of the data architecture to support data solutions
Strong SQL skills and experience working with complex data sets, enterprise data warehouses, and advanced SQL queries; proficient with relational databases (RDS, MySQL, Postgres, or similar) and NoSQL databases (Cassandra, MongoDB, Neo4j, etc.)
Skilled in data integration from different sources such as APIs, databases, flat files, and event streaming
Strong experience implementing data pipelines and efficient ELT/ETL processes, batch and real-time, in AWS and using open-source solutions, and able to develop custom integration solutions as needed, including data integration from different sources such as APIs (PoS integrations are a plus), ERPs (Oracle and Allegra are a plus), databases, flat files, Apache Parquet, and event streaming, including cleansing, transformation, and validation of the data
Strong experience with scalable and distributed data technologies such as Spark/PySpark, DBT, and Kafka, to handle large volumes of data
Experience with stream-processing systems (Storm, Spark Streaming, etc.) is a plus
Strong experience designing and implementing Data Warehousing solutions in AWS with Redshift; demonstrated experience designing and implementing efficient ELT/ETL processes that extract data from source systems, transform it (DBT), and load it into the data warehouse
Strong experience in orchestration using Apache Airflow
Expert in cloud computing in AWS, including deep knowledge of a variety of AWS services like Lambda, Kinesis, S3, Lake Formation, EC2, EMR, ECS/ECR, IAM, CloudWatch, etc.
Good understanding of data quality and governance, including implementation of data quality checks and monitoring processes to ensure that data is accurate, complete, and consistent
Good understanding of BI solutions, including Looker and LookML (Looker Modeling Language)
Strong knowledge and hands-on experience of DevOps principles, tools, and technologies (GitHub and AWS DevOps), including continuous integration and continuous delivery (CI/CD), infrastructure as code (IaC with Terraform), configuration management, automated testing, performance tuning, and cost management and optimization
Good problem-solving skills: able to troubleshoot data processing pipelines and identify performance bottlenecks and other issues
Strong leadership skills with a willingness to lead, create ideas, and be assertive
Strong project management and organizational skills
Excellent communication skills to collaborate with cross-functional teams, including business users, data architects, DevOps/DataOps/MLOps engineers, data analysts, data scientists, developers, and operations teams
Able to convey complex technical concepts and insights to non-technical stakeholders effectively
Able to document processes, procedures, and deployment configurations

Responsibilities:
Design, implement, deploy, test, and maintain highly scalable and efficient data architectures, defining and maintaining standards and best practices for data management independently with minimal guidance
Ensure the scalability, reliability, quality, and performance of data systems
Mentor and guide junior/mid-level data engineers
Collaborate with Product, Engineering, Data Scientists, and Analysts to understand data requirements and develop data solutions, including reusable components
Evaluate and implement new technologies and tools to improve data integration, data processing, and analysis
Design architecture, observability, and testing strategies, and build reliable infrastructure and data pipelines
Take ownership of the storage layer and data management tasks, including schema design, indexing, and performance tuning
Swiftly address and resolve complex data engineering issues and incidents, and resolve bottlenecks in SQL queries and database operations
Conduct discovery on existing data infrastructure and proposed architecture
Evaluate and implement cutting-edge technologies and methodologies, and continue learning and expanding skills in data engineering and cloud platforms, to improve and modernize existing data systems
Evaluate, design, and implement data governance solutions: cataloging, lineage, quality, and data governance frameworks suitable for a modern analytics solution, considering industry-standard best practices and patterns
Define and document data engineering architectures, processes, and data flows
Assess best practices and design schemas that match business needs for delivering a modern analytics solution (descriptive, diagnostic, predictive, prescriptive)
Be an active member of our Agile team, participating in all ceremonies and continuous improvement activities

Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.
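Since API ingestion features prominently in this listing, here is a minimal Scala sketch that pulls a page of records from a hypothetical REST endpoint using the JDK 11+ HTTP client; the URL, token variable, and response handling are illustrative assumptions only.

```scala
import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

object ApiIngestion {
  def main(args: Array[String]): Unit = {
    val client = HttpClient.newHttpClient()

    // Hypothetical endpoint; the bearer token is read from the environment
    val request = HttpRequest.newBuilder()
      .uri(URI.create("https://api.example.com/v1/orders?page=1"))
      .header("Authorization", s"Bearer ${sys.env("API_TOKEN")}")
      .header("Accept", "application/json")
      .GET()
      .build()

    val response = client.send(request, HttpResponse.BodyHandlers.ofString())

    if (response.statusCode() == 200) {
      // A real pipeline would parse, validate, and stage the payload here
      println(s"Fetched ${response.body().length} bytes")
    } else {
      sys.error(s"Ingestion failed with HTTP ${response.statusCode()}")
    }
  }
}
```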

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Senior Quality Engineer at Mastercard Data & Services, you will be an integral part of the Advanced Analytics Program team. Acting in a technical test lead capacity, you will drive quality and implement test approaches, automation improvements, and monitoring early in the development cycle. You will undertake requirements analysis to identify business scenarios and user stories, creating applicable test scenarios and managing test data assets. Your role will involve developing and executing both manual exploratory tests and automated tests for APIs and GUIs. Collaborating closely with development teams, you will enhance existing software development processes and partner with developers to improve and automate test and release processes. Your contribution will be vital in making quality an integral part of the development process.

To excel in this role, you must possess solid professional software testing experience in complex distributed systems. Your expertise should include creating and maintaining data-driven automated testing for distributed systems, with a good understanding of Page Object Model frameworks. Proficiency in various types of testing, such as smoke, functional, regression, backend, browser, and non-functional testing, is essential. You should have a strong command of Selenium WebDriver (Java 11-17) for automated GUI testing and experience with tools like Postman or SoapUI Pro for automated API testing. Knowledge of Test- or Behaviour-Driven Development, along with experience using TestNG, Maven, and Ant build tools, is required. Familiarity with executing stress and load test suites using the Gatling tool with Scala is a plus. Experience working in a continuous integration environment, configuring Jenkins builds, and executing tests using Jenkins is necessary. You should have a comprehensive understanding of test and project delivery life cycles, the ability to analyze application logs, and the skills to assist engineers using basic debugging techniques. Proficiency with defect/requirements management tools and version control tools like Subversion and Git is expected.

In addition to technical skills, excellent written and verbal communication is crucial for this role. If you are looking to make a significant impact in a dynamic environment with a focus on innovation and quality, this is your opportunity to join a fast-growing engineering team at Mastercard Data & Services.
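Because the listing calls out Gatling with Scala for stress and load testing, here is a minimal Gatling simulation sketch. The base URL, endpoint, and injection profile are hypothetical placeholders.

```scala
import io.gatling.core.Predef._
import io.gatling.http.Predef._
import scala.concurrent.duration._

// A minimal load-test sketch against an assumed test environment
class ReportLoadSimulation extends Simulation {

  val httpProtocol = http
    .baseUrl("https://test.example.com")  // hypothetical base URL
    .acceptHeader("application/json")

  val scn = scenario("Fetch reports")
    .exec(
      http("get reports")
        .get("/api/reports")              // hypothetical endpoint
        .check(status.is(200))
    )

  setUp(
    scn.inject(rampUsers(50).during(30.seconds))  // 50 virtual users over 30s
  ).protocols(httpProtocol)
}
```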

Posted 1 week ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Company Description
Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, and create marketing solutions, all using our unique combination of data, analytics, and software. We also assist millions of people to realise their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com

Job Description
We are looking for an enthusiastic Engineering Manager to work within our Consumer Information business. This is a hybrid role requiring travel to the Hyderabad office at least twice per month. We're using Java and C++ based distributed batch systems alongside .NET, Python, and Scala based cloud-native systems, and we are always keen to evaluate new features to see if we can take advantage of them. We use a wide variety of testing and quality techniques, and our services are pushed through our environments using a combination of Git and Jenkins for source control, build, and release as the backbone of our CI pipeline. As we develop into the future, targeting cloud infrastructure and scalability, we want to bring you along on the journey with us over the coming years.

Key Responsibilities
You will be based in Hyderabad, reporting to the Director of Engineering.
Leading an agile team to develop quality solutions
Collaborating effectively to support and enhance the full product lifecycle
Reviewing proposals, evaluating alternatives, providing estimates and making recommendations
Serving as an expert on applications and providing technical support
Revising, updating, refactoring and debugging greenfield and established codebases
Supporting other team members' development

Qualifications
10+ years of relevant experience, including leading teams
Ability to pick up new skills and proprietary code
Proficient at delivering in Linux server based environments
Proficient in scripting languages (PowerShell, Bash, etc.)
Experience with security protocols
Experience developing with AWS cloud services including (but not limited to) AWS Glue, S3, Step Functions, Lambda, EventBridge and SQS
Experience programming with Scala and Python at enterprise level
Experience working with multiple architectures (monolith and microservice)
Experience working in an Agile team

Useful Skills
Implemented and used a variety of automated testing approaches and techniques
Serverless development and deployment (desirable)
Knowledge of AWS infrastructure (desirable)

Additional Information
Our uniqueness is that we celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what matters: DEI, work/life balance, development, authenticity, collaboration, wellness, reward & recognition, volunteering... the list goes on. Experian's people-first approach is award-winning: World's Best Workplaces™ 2024 (Fortune Global Top 25), Great Place To Work™ in 24 countries, and Glassdoor Best Places to Work 2024, to name a few. Check out Experian Life on social or our Careers Site and Glassdoor to understand why.
Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.

Benefits
Experian cares for employees' work-life balance, health, safety and wellbeing. In support of this endeavour, we offer best-in-class family wellbeing benefits, enhanced medical benefits and paid time off. This is a hybrid remote/in-office role.

Experian Careers - Creating a better tomorrow together. Find out what it's like to work for Experian on our Careers Site.
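As a small illustration of the Scala-on-AWS work this role mentions, the sketch below lists objects under an S3 prefix using the AWS SDK for Java v2 called from Scala. The bucket and prefix are hypothetical, and credentials are assumed to come from the default provider chain.

```scala
import scala.jdk.CollectionConverters._
import software.amazon.awssdk.services.s3.S3Client
import software.amazon.awssdk.services.s3.model.ListObjectsV2Request

object S3Inventory {
  def main(args: Array[String]): Unit = {
    val s3 = S3Client.create()  // region/credentials from the default chain

    val request = ListObjectsV2Request.builder()
      .bucket("example-consumer-info")  // hypothetical bucket
      .prefix("batch/2024/")            // hypothetical prefix
      .build()

    s3.listObjectsV2(request).contents().asScala.foreach { obj =>
      println(s"${obj.key} (${obj.size} bytes)")
    }
    s3.close()
  }
}
```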

Posted 1 week ago

Apply

4.0 years

0 Lacs

Greater Kolkata Area

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
· Analyse current business practices, processes, and procedures, and identify future business opportunities for leveraging Microsoft Azure Data & Analytics Services.
· Provide technical leadership and thought leadership as a senior member of the Analytics Practice in areas such as data access & ingestion, data processing, data integration, data modeling, database design & implementation, data visualization, and advanced analytics.
· Engage and collaborate with customers to understand business requirements/use cases and translate them into detailed technical specifications.
· Develop best practices including reusable code, libraries, patterns, and consumable frameworks for cloud-based data warehousing and ETL.
· Maintain best practice standards for the development of cloud-based data warehouse solutions, including naming standards.
· Design and implement highly performant data pipelines from multiple sources using Apache Spark and/or Azure Databricks.
· Integrate the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is always maintained.
· Work with other members of the project team to support delivery of additional project components (API interfaces).
· Evaluate the performance and applicability of multiple tools against customer requirements.
· Work within an Agile delivery / DevOps methodology to deliver proof of concept and production implementation in iterative sprints.
· Integrate Databricks with other technologies (ingestion tools, visualization tools).
· Proven experience working as a data engineer
· Highly proficient in using the Spark framework (Python and/or Scala)
· Extensive knowledge of Data Warehousing concepts, strategies, and methodologies
· Direct experience building data pipelines using Azure Data Factory and Apache Spark (preferably in Databricks)
· Hands-on experience designing and delivering solutions using Azure, including Azure Storage, Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB, and Azure Stream Analytics
· Experience in designing and hands-on development of cloud-based analytics solutions
· Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required
· Experience designing and building data pipelines using API ingestion and streaming ingestion methods
· Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential
· Thorough understanding of Azure Cloud Infrastructure offerings
· Strong experience in common data warehouse modeling principles, including Kimball
· Working knowledge of Python is desirable
· Experience developing security models
· Databricks & Azure Big Data Architecture Certification would be a plus

Mandatory skill sets: ADE, ADB, ADF
Preferred skill sets: ADE, ADB, ADF
Years of experience required: 4-8 years
Education qualification: BE, B.Tech, MCA, M.Tech

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)

Required Skills: Microsoft Azure
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}

Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
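To ground the requirement for highly performant pipelines from multiple sources, here is a minimal Spark (Scala) sketch that joins a large fact extract with a small reference file via a broadcast join before writing a partitioned output. The storage account, container names, and columns are hypothetical.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.broadcast

object MultiSourcePipeline {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("multi-source-pipeline").getOrCreate()

    // Hypothetical sources in ADLS: a large fact extract and a small lookup
    val transactions = spark.read.parquet(
      "abfss://raw@exampleacct.dfs.core.windows.net/transactions/")
    val merchants = spark.read.option("header", "true").csv(
      "abfss://reference@exampleacct.dfs.core.windows.net/merchants.csv")

    // Broadcasting the small side avoids shuffling the large fact table
    val enriched = transactions.join(broadcast(merchants), Seq("merchant_id"))

    enriched.write
      .partitionBy("txn_date")  // assumed partition column
      .mode("overwrite")
      .parquet("abfss://curated@exampleacct.dfs.core.windows.net/transactions_enriched/")

    spark.stop()
  }
}
```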

Posted 1 week ago

Apply


3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. Primary Responsibilities Propose Innovative Solutions for the existing problems of the Product Develop and Manage Big Data Platform using Public and Private Cloud infrastructure Develop high performing Transactional and Non-Transactional Big Data Platform Participate and contribute in Architecture discussion with Senior Architects, Product Owners, Dev and QE Managers. Propose Innovative Solutions for the existing problems of Products Come up with long term architecture roadmap for the existing products Constantly look at security aspects and propose architectural solution for security vulnerabilities Propose and implement best in class architectural solution for big and complex systems hosted on cloud infrastructure Define and create system architecture design document and discuss different pros and cons of any particular approach with Architecture council Develop detailed design specifications, data mapping and data transformation rules as per Product need Define, use and communicate design patterns and best practices Communicate with internal and external business partners. Should be able to handle Onshore or Offshore model Act as the liaison between Business and Developers and project management groups Present and evaluate design solutions objectively and facilitate conflict resolution Develops pioneering approaches to emerging technology and industry trends Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications B. Tech or M. Tech or MCA or equivalent experience 3+ years of experience in designing and developing or coding software components in Bigdata, that includes various tools like Map Reduce, Hive, Sqoop, HBase, Spark, Scala, Kafka. Solid knowledge in Azure or AWS Solid working experience of Java and RDBMS At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone-of every race, gender, sexuality, age, location and income-deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. 
We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes — an enterprise priority reflected in our mission.
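As a rough sketch of the Spark/Scala/Kafka combination listed in the qualifications, the following Structured Streaming job consumes a Kafka topic and lands micro-batches as parquet. The broker address, topic, and paths are placeholders, not details from the posting, and the kafka source requires the spark-sql-kafka connector.

```scala
import org.apache.spark.sql.SparkSession

object ClaimsStreamJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("claims-stream").getOrCreate()

    // Read a Kafka topic as an unbounded DataFrame (placeholder broker/topic).
    // Needs the spark-sql-kafka-0-10 package on the classpath.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker-1:9092")
      .option("subscribe", "claims-events")
      .load()
      .selectExpr("CAST(value AS STRING) AS json")

    // Write micro-batches out; the checkpoint dir makes the query restartable.
    val query = events.writeStream
      .format("parquet")
      .option("path", "/data/claims/raw")
      .option("checkpointLocation", "/data/claims/_checkpoints")
      .start()

    query.awaitTermination()
  }
}
```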

Posted 1 week ago

Apply

2.0 years

0 Lacs

Delhi, India

On-site

Job Description

About The Job
Help our clients (internal and external) understand and use RMS services better by understanding their requirements and queries, and helping address them through knowledge of data science and RMS.

Responsibilities
· Building knowledge of the Nielsen suite of products and demonstrating the same
· Understanding client concerns
· Able to put forth ways and means of solving client concerns with supervision
· Automation and development of solutions for existing processes
· Taking initiative to understand concerns/problems in the RMS product and participating in product improvement initiatives

Qualifications
· Professionals with degrees in Maths, Data Science, Statistics, or related fields involving statistical analysis of large data sets
· 2-3 years of experience in market research or a relevant field

Mindset and Approach to Work
· Embraces change, innovation, and iterative processes in order to continuously improve the product's value to clients
· Continuously collaborates and supports to improve the product
· Active interest in arriving at collaboration and consensus in communication plans, deliverables, and deadlines
· Plans and completes assignments independently within an established framework, breaking down complex tasks and making reasonable decisions; work is reviewed for overall technical soundness
· Participates in data experiments and PoCs, setting measurable goals, timelines, and reproducible outcomes
· Applies critical thinking and takes initiative
· Continuously reviews the latest industry innovations and effectively applies them to their work
· Consistently challenges and analyzes data to ensure accuracy

Functional Skills
· Ability to manipulate, analyze, and interpret large data sources
· Experienced in high-level programming languages (e.g., Python, R, SQL, Scala), as well as with data visualization tools (e.g., Power BI, Spotfire, Tableau, MicroStrategy)
· Able to work in a virtual environment; familiar with git/Bitbucket processes
· People with at least some experience in RMS, NIQ, will have an advantage
· Can use a logical reasoning process to break down and work through increasingly challenging situations or problems to arrive at positive outcomes
· Identify and use data from various sources to influence decisions
· Interpret the data effectively in relation to business objectives

Soft Skills
· Ability to engage/communicate with team and extended team members
· Can adapt to change and new ideas or ways of working
· Exhibits emotional intelligence when partnering with internal and external stakeholders

Additional Information
Our Benefits
· Flexible working environment
· Volunteer time off
· LinkedIn Learning
· Employee Assistance Program (EAP)

About NIQ
NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights, delivered with advanced analytics through state-of-the-art platforms, NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com.

Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook

Our Commitment to Diversity, Equity, and Inclusion
NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion
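As a small, hedged example of the "manipulate, analyze and interpret large data sources" skill, plain Scala (2.13+) can go a long way before reaching for Spark. The CSV layout here is invented for the sketch:

```scala
import scala.io.Source
import scala.util.Using

object SalesSummary {
  def main(args: Array[String]): Unit = {
    // Hypothetical CSV layout: region,product,units_sold
    val totals = Using.resource(Source.fromFile("sales.csv")) { src =>
      src.getLines()
        .drop(1)                     // skip the header row
        .map(_.split(","))
        .collect { case Array(region, _, units) => region -> units.toInt }
        .toList
        .groupMapReduce(_._1)(_._2)(_ + _)   // total units per region
    }
    // Print regions in descending order of volume.
    totals.toSeq.sortBy(-_._2).foreach { case (region, units) =>
      println(f"$region%-12s $units%,d units")
    }
  }
}
```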

Posted 1 week ago

Apply

6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

About The Role
Grade Level (for internal use): 11

The Role
As a Principal Software Engineer, you will design, create, and maintain high-quality cloud-based applications to help S&P Global’s Private Valuations and Assessments business grow. You will be working with a team of experienced application/web developers (in multiple global locations), business analysts, and business stakeholders. Day to day, you will contribute to current and future technology architectural designs and development efforts across the various modules of S&P Global Private Valuations and Assessments solutions. You will have access to market-leading tools and services, and be part of a wider group that values technical excellence.

The Team
We are an established development team, looking to grow, innovate, and push the boundaries of what's possible in software development. We prioritize providing an excellent developer experience, which means you'll have access to market-leading tools, services, and learning, as well as expertise from both our team and the wider organization. You'll work in a strong DevOps culture, where we've automated every stage of our deployment pipeline, allowing you to focus on what you do best. You’ll have the opportunity to work with cutting-edge technology and use AI productivity tools and AI models as standard.

Responsibilities
The candidate needs to:
· Be an active player in system architecture and design discussions
· Have a passion for technology and offer new ideas and approaches
· Recommend product, process, and tooling improvements
· Analyse business requirements; design, implement, and test features
· Understand non-functional requirements, such as the performance and scalability aspects of the application
· Be delivery focused
· Participate in agile product development through all SDLC phases
· Build high-quality, reusable code and libraries for future use in a timely and efficient manner
· Work closely with the QA and business teams and highlight issues and risks proactively
· Embrace quality standards including code conventions, code reviews, unit testing, static analysis, and revision control
· Coordinate with QA/QC staff for product functional and system testing
· Maintain and support all areas of the application
· Work in an individual capacity as well as in teams across geographies
· Work under the general supervision of the Development Manager and take direction from other leads within the organization where required

Business Competencies
Education and experience: 6-10 years of strong technical and platform knowledge, including some or all of:

Required
· Java
· Thorough understanding of agile software development methodology and industry best practices
· Thorough understanding of fundamental software engineering and computer science principles: object-oriented design, functional programming, structured design, databases, algorithms, data structures, usability, refactoring, debugging, and configuration management
· Excellent design and problem-solving skills and a disciplined, engineering-oriented mindset
· Minimum 5 years of experience developing commercial web applications or services used by external customers
· Excellent verbal and written communication skills, including presentation skills
· Experience of cloud development, preferably AWS and AWS cloud products

Desirable
· AWS Lambda and serverless architecture
· SQL
· AWS managed services, Linux, containerization platforms
· Scala (or other functional programming experience)
· Angular 18+
· JavaScript/TypeScript
· Spring
· Other JVM languages
· Experience with continuous integration and build systems
· Git + GitLab/GitHub
· Good degree in Computer Science, Engineering, or a numerate field
· Experience with configuration languages (CloudFormation, Terraform, etc.)

Commercial Awareness
Knowledge of private market valuations and/or the financial sector is advantageous.

Leadership
· Ability to lead design and development of a new or existing project/service
· Display commitment to own features and take them to successful delivery

Personal Competencies
Personal impact:
· Must work effectively with a diverse team spread across the globe
· Strong analytical, investigative, and problem-solving skills
· Proactive, organised, and able to work independently with minimal supervision
· Self-motivated and enthusiastic
· Open minded, flexible, and willing to adapt to changing situations
· Delivery focused

Communication: Must be an excellent communicator, both written and verbally.

Teamwork: Strong interpersonal skills, with the ability to work with a diverse team, including team members overseas.

About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep, and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What’s In It For You?

Our Purpose
Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People
We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values
Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits
We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you and your career need to thrive at S&P Global. Our benefits include:
· Health & Wellness: Health care coverage designed for the mind and body
· Flexible Downtime: Generous time off helps keep you energized for your time on
· Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills
· Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs
· Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families
· Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference

For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert
If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning)

Job ID: 315919
Posted On: 2025-08-04
Location: Noida, Uttar Pradesh, India

Posted 1 week ago

Apply

1.0 - 3.0 years

0 Lacs

India

Remote

Role: Data Engineer
Location: Remote
Shift Timing: 2:00 PM - 11:00 PM
Experience: 1-3 years relevant

Who we are: Randstad Sourceright’s global talent solutions provides instant access to experienced recruitment and contingent workforce management support by combining technology, analytics, and deep global and local expertise. Our operations consist of client-aligned service delivery teams operating across RPO, MSP, and Blended Workforce Solutions. We have been certified as a “great place to work” for the last 3 consecutive years and are recognized as a best place to work by Glassdoor.

Group Objective
The mission of the business intelligence team is to create a data-driven culture that empowers leaders to integrate data into daily decisions and strategic planning. We aim to provide visibility, transparency, and guidance regarding the quantity and quality of results, activities, financial KPIs, and leading indicators, making it easy to identify trends for data-based decision-making.

Position Objective
As a Senior Data Engineer, you will be responsible for designing, architecting, and implementing robust data solutions in a cloud-based environment (GCP). You will partner with other data engineers and technical teams to ensure the availability, reliability, and performance of our data systems.

Position Summary

Programming & Code Writing
· Architect and build complex data pipelines using advanced cloud data technologies
· Lead efforts to optimize data pipelines for performance, scalability, and cost-efficiency
· Define industry best practices for building data pipelines
· Ensure data security, compliance, and governance standards are met
· Partner with the leadership team to define and implement agile and DevOps methodologies

Consulting & Partnership
· Serve as a subject matter expert and define data architecture and infrastructure requirements
· Partner with business analysts to plan project execution, including appropriate product and technical specifications, direction, resources, and establishing realistic completion times
· Understand data technology trends, identify opportunities to implement new technologies, and provide forward-thinking recommendations
· Proactively partner with internal stakeholders to bridge gaps, provide historical references, and design the appropriate processes

Troubleshooting & Continuous Improvement
· Design and implement a robust data observability process
· Resolve escalated reporting requests and communicate proactively and in a timely manner
· Troubleshoot and provide technical guidance to resolve issues related to misaligned or inaccurate data or data fields, or new customer requirements
· Maintain new release, migration, and sprint schedules for software upgrades, enhancements, and fixes to aid with product evolution
· Write QA/QC scripts to conduct a first round of testing, and partner with the BA team for test validation of new developments prior to moving to production
· Use industry knowledge and feedback to aid in the development of the technology roadmap and future product vision
· Document standard ways of working via QRGs, intranet pages, and video series

Senior Activities
· Drive day-to-day development activities of the development team in close collaboration with on-site and offshore resources, scrum masters, and product owners
· Bootstrap a data engineering team at an early stage in the team’s evolution
· Provide leadership on the technical front in difficult situations, facilitate contentious discussions, and report up when necessary
· Guide, mentor, and coach offshore resources
· Provide input in forming a long-term data strategy

Education
Master’s degree in Computer Science / Information Technology or a related field, highly preferred

Experience
· Extensive knowledge of BI concepts and related technologies that help drive sustainable technical solutions
· Extensive experience with data lakes, ETL, and data warehouses
· Advanced experience building data pipelines
· Passion for building quality BI software
· Project management and/or process improvement experience highly preferred

Knowledge, Skills, and Abilities
· Polyglot coder with expert-level skills in multiple languages, including Python, R, Java, and SQL; experience with relational databases, ERP systems, and DOMO or other data visualization tools (e.g., Tableau)
· Advanced and proven experience with Google Cloud Platform (GCP) is preferred, but experience with Microsoft Azure / Amazon will be considered
· Any exposure to Kafka, Spark, and Scala will be an added advantage
· Should demonstrate a strong understanding of OOPS concepts and methodologies
· Expert-level understanding of data engineering
· Intrinsic motivation and problem-solving
· Proactive leadership, project management, time management, and problem-solving skills
· Demonstrated continuous improvement, process documentation, and workflow skills
· Extensive experience with data analysis, modeling, and data pipelining, including data cleaning, standardizing, scaling, tuning, scheduling, and deployment
· Experience composing detailed technical documentation and procedures for data models
· Ability to prioritize and manage multiple projects and tasks, meeting deadlines while maintaining quality
· Strong drive and commitment to delivering outstanding results
· Strong follow-up and service orientation

Supervisory Responsibility
☒ Provides guidance, leadership, or training to junior employees
☐ Directly responsible for supervising non-exempt, clerical, or office administrative personnel
☐ Directly responsible for supervising exempt, professional, or technical employees
☒ Directly responsible for supervising supervisory/managerial employees

Organizational Structure
Job title this position reports to: Manager, Data Engineering
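The QA/QC scripting responsibility above could, for instance, take the shape of a small Spark data-quality harness. The following Scala sketch is illustrative only; the table path and column names are invented:

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._

object QualityChecks {
  // Returns (checkName, passed) pairs so results can be logged or alerted on.
  def run(df: DataFrame): Seq[(String, Boolean)] = {
    val total = df.count()
    Seq(
      ("non_empty",    total > 0),
      ("no_null_keys", df.filter(col("customer_id").isNull).count() == 0),
      ("unique_keys",  df.select("customer_id").distinct().count() == total)
    )
  }

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("dq-checks").getOrCreate()
    val df = spark.read.parquet("/data/curated/customers") // placeholder path
    run(df).foreach { case (name, ok) =>
      println(s"$name: ${if (ok) "PASS" else "FAIL"}")
    }
  }
}
```

A harness like this is typically the "first round of testing" step, run before handing a new development over for business validation.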

Posted 1 week ago

Apply

5.0 years

0 Lacs

India

Remote

This role is for one of Weekday's clients
Min Experience: 5 years
Location: Remote (India)
Job Type: Full-time

Requirements
· Proficient in programming languages: Python, PySpark, Scala
· Azure environment: Azure Data Factory, Databricks, Key Vault, DevOps CI/CD
· Storage/databases: ADLS Gen 2, Azure SQL DB, Delta Lake
· Data engineering: Apache Spark, Hadoop, optimization, performance tuning, data modelling
· Experience working with data sources such as Kafka and MongoDB is preferred
· Experience with automation of test cases for Big Data and ETL pipelines, and with Agile methodology
· Basic understanding of ETL pipelines
· A strong understanding of AI, machine learning, and data science concepts is highly beneficial
· Strong analytical and problem-solving skills with attention to detail
· Ability to work independently and as part of a team in a fast-paced environment
· Excellent communication skills; able to collaborate with both technical and non-technical stakeholders
· Experience designing and implementing scalable and optimized data architectures, following all best practices
· Strong understanding of data warehousing concepts, data lakes, and data modeling
· Familiarity with data governance, data quality, and privacy regulations

Key Responsibilities
· Data Pipeline Development: Design, develop, and maintain scalable and efficient data pipelines to collect, process, and store data from various sources (e.g., databases, APIs, third-party services)
· Data Integration: Integrate and transform raw data into clean, usable formats for analytics and reporting, ensuring consistency, quality, and integrity
· Data Warehousing: Build and optimize data warehouses to store structured and unstructured data, ensuring data is organized, reliable, and accessible
· ETL Processes: Develop and manage ETL (Extract, Transform, Load) processes for data ingestion, cleaning, transformation, and loading into databases or data lakes
· Performance Optimization: Monitor and optimize data pipeline performance to handle large volumes of data with low latency, ensuring reliability and scalability
· Collaboration: Work closely with other product teams, TSO, and business stakeholders to understand data requirements and ensure that the data infrastructure supports analytical needs
· Data Quality & Security: Ensure that data systems meet security and privacy standards, and implement best practices for data governance, monitoring, and error handling
· Automation & Monitoring: Automate data workflows and establish monitoring systems to detect and resolve data issues proactively
· Understand the broad architecture of GEP's entire system as well as Analytics
· Take full accountability for the role, own development, and results
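Given the Delta Lake requirement, one common concrete form of the ETL responsibility above is an idempotent upsert. Here is a minimal, hedged Scala sketch using Delta Lake's merge API; the paths and the join key are assumptions, and it requires the delta-spark dependency (built into Databricks runtimes):

```scala
import io.delta.tables.DeltaTable
import org.apache.spark.sql.SparkSession

object CustomerUpsert {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("customer-upsert").getOrCreate()

    // Incremental batch extracted earlier in the pipeline (placeholder path).
    val updates = spark.read.parquet("/landing/customers_delta")

    // MERGE makes re-runs safe: matched rows update, new rows insert.
    DeltaTable.forPath(spark, "/curated/customers")
      .as("t")
      .merge(updates.as("u"), "t.customer_id = u.customer_id")
      .whenMatched().updateAll()
      .whenNotMatched().insertAll()
      .execute()
  }
}
```

Because the merge is keyed, replaying the same landing batch (for example after an Azure Data Factory retry) does not create duplicates, which is what makes the pipeline safe to automate.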

Posted 1 week ago

Apply

6.0 years

0 Lacs

Greater Kolkata Area

On-site

Overview

Working at Atlassian
Atlassians can choose where they work, whether in an office, from home, or a combination of the two. That way, Atlassians have more control over supporting their family, personal goals, and other priorities. We can hire people in any country where we have a legal entity. Interviews and onboarding are conducted virtually, a part of being a distributed-first company.

Responsibilities
What you'll do:
· Build and ship features and capabilities daily in a highly scalable, cross-geo distributed environment
· Be part of an amazing open and collaborative work environment with other experienced engineers, architects, product managers, and designers
· Review code with best practices of readability, testing patterns, documentation, reliability, security, and performance considerations in mind
· Mentor and level up the skills of your teammates by sharing your expertise in formal and informal knowledge sharing sessions
· Ensure full visibility, error reporting, and monitoring of high-performing backend services
· Participate in Agile software development, including daily stand-ups, sprint planning, team retrospectives, and show-and-tell demo sessions

Qualifications
Your background:
· 6+ years of experience building and developing backend applications
· Bachelor's or Master's degree, with a preference for a Computer Science degree
· Experience crafting and implementing highly scalable and performant RESTful micro-services
· Proficiency in any modern object-oriented programming language (e.g., Java, Kotlin, Go, Scala, Python, etc.)
· Fluency in any one database technology (e.g., RDBMS like Oracle or Postgres and/or NoSQL like DynamoDB or Cassandra)
· Real passion for collaboration and strong interpersonal and communication skills
· Broad knowledge and understanding of the SaaS, PaaS, and IaaS industry, with hands-on experience of public cloud offerings (AWS, GAE, Azure)
· Familiarity with cloud architecture patterns and an engineering discipline to produce software with quality
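To make the RESTful micro-services line concrete, here is a deliberately tiny Scala sketch of a service health endpoint using only the JDK's built-in HTTP server. Services at real scale would use a proper framework, so treat this purely as an illustration; the port and response body are arbitrary:

```scala
import com.sun.net.httpserver.{HttpExchange, HttpServer}
import java.net.InetSocketAddress

object HealthService {
  def main(args: Array[String]): Unit = {
    val server = HttpServer.create(new InetSocketAddress(8080), 0)

    // A minimal GET /health endpoint, the kind monitoring systems poll.
    server.createContext("/health", (exchange: HttpExchange) => {
      val body = """{"status":"ok"}""".getBytes("UTF-8")
      exchange.getResponseHeaders.add("Content-Type", "application/json")
      exchange.sendResponseHeaders(200, body.length.toLong)
      exchange.getResponseBody.write(body)
      exchange.close()
    })

    server.start()
    println("listening on :8080")
  }
}
```

Endpoints like this are also what the "full visibility, error reporting, and monitoring" responsibility hooks into: load balancers and alerting systems poll them to decide whether an instance is healthy.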

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Why We Work at Dun & Bradstreet
Dun & Bradstreet unlocks the power of data through analytics, creating a better tomorrow. Each day, we are finding new ways to strengthen our award-winning culture and accelerate creativity, innovation and growth. Our 6,000+ global team members are passionate about what we do. We are dedicated to helping clients turn uncertainty into confidence, risk into opportunity and potential into prosperity. Bold and diverse thinkers are always welcome. Come join us! Learn more at dnb.com/careers.

The Role
The Senior Software Engineer will analyze and develop systems that support Dun & Bradstreet’s core services hosted in legacy datacenters, AWS, and GCP. Duties include software development of our Big Data platform, ensuring you develop unit tests for your code and collaborate in performing daily pull request reviews. The right candidate is passionate about development and curious about Big Data platforms, with a development and problem-solving mindset. This role will collaborate with Development, SRE, and DevOps teams to translate business requirements and functional specifications into innovative solutions, implementing performant, scalable program designs, code modules, and stable systems.

Key Responsibilities
· Develop capabilities to meet business requirements and develop the necessary tools to help us be fully automated
· Engineer solutions on the GCP foundation platform using Infrastructure as Code methods (e.g., Terraform)
· Focus on making builds and deployments fully automated
· Implement optimizations in the cloud software development life cycle process to improve productivity, tools, and techniques
· Build and configure the necessary instrumentation (monitoring, metering, reporting, logging, observability, tracing) to give runtime insights for resolving problems
· Manage the code repo, ensure the team follows release branch strategies, and set up the needed development and runtime environments
· Collaborate across different teams and areas such as Cloud Platform, Security, Data, and Risk & Compliance to create cost-effective, optimal solutions for the business

Key Requirements
· 5+ years of experience developing commercial software in an agile SDLC environment with a focus on DevOps and automation in large-scale distributed systems
· Proven experience managing platforms on GCP or AWS utilizing a broad set of the services available
· Strong understanding of performance issues and how to resolve them on GCP / AWS, at both the application runtime and infrastructure levels
· Experience in Java/Python to build tooling as needed; working knowledge of Scala/Spark is beneficial
· Capability to conduct root cause analysis for production incidents and resolution for future prevention
· Experience creating automated CI/CD pipelines using Harness, GitHub Actions, GitLab, Jenkins, etc.
· Knowledge of building and maintaining source code branches packaged into artifacts using tools such as JFrog; version control tools such as GitHub, Bitbucket, or similar are required
· Monitoring and logging in Splunk, the ELK Stack, or similar is required
· Experience deploying and operating Big Data service platforms is highly desirable
· Show an ownership mindset in everything you do; be a problem solver, be curious, and be inspired to take action
· Be proactive; seek ways to collaborate and connect with people and teams in support of driving success
· Where applicable, fluency in English and languages relevant to the working market
All Dun & Bradstreet job postings can be found at https://www.dnb.com/about-us/careers-and-people/joblistings.html and https://jobs.lever.co/dnb. Official communication from Dun & Bradstreet will come from an email address ending in @dnb.com.

Notice to Applicants: Please be advised that this job posting page is hosted and powered by Lever. Your use of this page is subject to Lever's Privacy Notice and Cookie Policy, which governs the processing of visitor data on this platform.

Posted 1 week ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Position Overview
We are looking for a Senior Engineer with 10-15 years of experience to help design, build, and maintain enterprise-grade data and AI platforms. This role requires strong technical expertise in distributed systems, real-time data integration, metadata-driven design, and cloud-native development. The Senior Engineer will work closely with Solutions and Data Architects, contributing to the technical implementation of scalable, secure, and well-governed systems while collaborating across engineering, data, and governance teams.

Role & Responsibilities
· Collaborate with Solutions and Data Architects to implement platform-level components, including metadata management, data integration, and semantic modeling
· Develop and maintain reliable, scalable, and observable services supporting both operational and analytical workloads
· Contribute to the design and development of real-time data pipelines, schema evolution strategies, and lineage tracking mechanisms
· Participate in architectural reviews, POCs, and production rollouts for key platform capabilities
· Implement and optimize APIs, orchestration patterns, and data services for internal and external consumers
· Build tooling and infrastructure to support governance automation, quality enforcement, and data access controls
· Take ownership of feature delivery from design to deployment, ensuring alignment with technical standards and business needs
· Troubleshoot and resolve performance bottlenecks, integration issues, and infrastructure gaps in collaboration with DevOps and architecture teams
· Document engineering components, workflows, and dependencies following architecture and compliance guidelines

Must Have
· 10+ years of hands-on experience in backend, data, or platform engineering, preferably in cloud-native environments
· Strong knowledge of distributed systems, microservices, and scalable data architecture
· Proven experience implementing real-time data ingestion, streaming pipelines, and event-driven architectures (e.g., Kafka, Pulsar)
· Proficient in Python, Java/Scala, and SQL for building high-performance data services and APIs
· Experience integrating with REST, GraphQL, and gRPC APIs across multiple services
· Hands-on experience with Kubernetes, GitOps, and infrastructure-as-code (e.g., Terraform, Helm)
· Familiarity with metadata management, data lineage, and semantic models
· Understanding of data governance practices, including access control, quality rules, and auditability
· Strong experience with CI/CD pipelines, version control, and observability tools (logging, tracing, monitoring)
· Exposure to AI/ML pipelines, feature stores, and model integration
· Ability to translate architectural designs into working prototypes, production-grade services, and reusable modules
· Excellent problem-solving and debugging skills in distributed and high-concurrency systems
· Strong communication skills and the ability to work with architects, data engineers, product owners, and governance teams

Nice to Have
· Experience working in federated data platforms, data mesh, or knowledge graph environments
· Understanding of multi-cloud architectures and cloud-native best practices
· Familiarity with data classification, vector search, or ML-based discovery techniques

Qualifications
· Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
· Certifications in cloud platforms (AWS/GCP/Azure), Kubernetes, or Data Engineering preferred
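As a sketch of the event-driven ingestion side of this role (Kafka is named above), here is a minimal Scala consumer loop using the standard kafka-clients API. The broker address, group id, and topic name are placeholders, and a production service would add deserialization, error handling, and async downstream processing:

```scala
import java.time.Duration
import java.util.Properties
import org.apache.kafka.clients.consumer.{ConsumerConfig, KafkaConsumer}
import scala.jdk.CollectionConverters._

object ChangeEventConsumer {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker-1:9092")
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "platform-ingest")
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
      "org.apache.kafka.common.serialization.StringDeserializer")
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
      "org.apache.kafka.common.serialization.StringDeserializer")

    val consumer = new KafkaConsumer[String, String](props)
    consumer.subscribe(java.util.Collections.singletonList("entity-change-events"))

    try {
      while (true) {
        // Poll in small batches; lineage/metadata hooks would sit in the handler.
        val records = consumer.poll(Duration.ofMillis(500)).asScala
        records.foreach { r =>
          println(s"offset=${r.offset} key=${r.key} value=${r.value}")
        }
      }
    } finally consumer.close()
  }
}
```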

Posted 1 week ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Entity: Technology
Job Family Group: IT&S Group

Job Description

You will work with
This team is responsible for the response and management of cyber incidents, applying an intelligence-led approach for identification, mitigation, and rapid response to safeguard bp on a global scale. By applying lessons learned and data analytics, the team establishes engineering principles and enhances the technology stack to continuously bolster bp’s cybersecurity posture.

Let me tell you about the role
We are looking for a Security Engineering Specialist who will support a team dedicated to enabling security experts and software engineers to write, deploy, integrate, and maintain security standards and develop secure applications and automations. You will advocate for and help ensure that cloud, infrastructure, and data teams adhere to secure policies, uncover vulnerabilities and provide remediation insights, and contribute to the adoption of secure practices. You will stay informed on industry and technology trends to strengthen bp’s security posture and contribute to a culture of excellence.

What you will deliver
· Support the development of and implement platform security standards, co-design schemas, ensure quality at the source of infrastructure build and configuration, and find opportunities to automate manual secure processes wherever possible
· Work with business partners to implement security strategies and to coordinate remediation activities to ensure products safely meet business requirements
· Contribute as a subject matter expert in at least one domain (cloud, infrastructure, or data)
· Provide hands-on support to teams on secure configuration and remediation strategies
· Align strategy, processes, and decision-making across teams
· Actively participate in a positive engagement and governance framework and contribute to an inclusive work environment with teams and collaborators, including engineers, developers, product owners, product managers, and portfolio managers
· Evolve the security roadmap to meet anticipated future requirements and needs
· Provide support to the squads and teams through technical guidance and by managing dependencies and risks
· Create and articulate materials on how to embed and measure security in our cloud, infrastructure, or data environments
· Contribute to mentoring and promote a culture of continuous development

What you will need to be successful (experience and qualifications)
· 3+ years of experience in security engineering or technical infrastructure roles
· A minimum of 3 years of cybersecurity experience in one of the following areas: cloud (AWS and Azure), infrastructure (IAM, network, endpoint, etc.), or data (DLP, data lifecycle management, etc.)
· Deep, hands-on experience designing security architectures and solutions for reliable and scalable data infrastructure, cloud, and data products in complex environments
· Development experience in one or more object-oriented programming languages (e.g., Python, Scala, Java, C#) and/or development experience in one or more cloud environments (including AWS, Azure, Alibaba, etc.)
· Exposure to or experience with full-stack development
· Experience with automation and scripting for security tasks (e.g., IaC, CI/CD integration) and security tooling (e.g., vulnerability scanners, CNAPP, endpoint, and/or DLP)
· Deep knowledge and hands-on experience in technologies across all data lifecycle stages
· Foundational knowledge of security standards, industry laws, and regulations such as the Payment Card Industry Data Security Standard (PCI-DSS), General Data Protection Regulation (GDPR), California Consumer Privacy Act (CCPA), and Sarbanes-Oxley (SOX)
· Strong collaborator management and the ability to influence teams through technical guidance
· A continuous learning and improvement approach

About bp
Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate. We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner!

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. Even though the job is advertised as full time, please contact the hiring manager or the recruiter, as flexible working arrangements may be considered.

Travel Requirement: Up to 10% travel should be expected with this role
Relocation Assistance: This role is eligible for relocation within country
Remote Type: This position is a hybrid of office/remote working

Skills: Automation system digital security, Client Counseling, Conformance review, Digital Forensics, Incident management, incident investigation and response, Information Assurance, Information Security, Information security behaviour change, Intrusion detection and analysis, Legal and regulatory environment and compliance, Risk Management, Secure development, Security administration, Security architecture, Security evaluation and functionality testing, Solution Architecture, Stakeholder Management, Supplier security management, Technical specialism

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp’s recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.

Posted 1 week ago

Apply