
Novelty detection, also known as one-class classification or outlier detection, is typically employed for analysing signals when few examples of "abnormal" data are available, such that a multi-class approach cannot be taken. Multivariate, multimodal density estimation can be used to construct a model of the distribution of normal data. However, setting a decision boundary such that test data can be classified "normal" or "abnormal" with respect to the model of normality is typically performed using heuristic methods, such as thresholding the unconditional data density, p(x). This paper describes two principled methods of setting a decision boundary based on extreme value statistics: (i) a numerical method that produces an "optimal" solution, and (ii) an analytical approximation in closed form. We compare the performance of both approaches using large datasets from biomedical patient monitoring and jet engine health monitoring, and conclude that the analytical approach performs novelty detection as successfully as the "optimal" numerical approach, both of which outperform the conventional method. © 2009 IEEE.
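The conventional heuristic the abstract refers to can be illustrated with a minimal sketch: fit a density model to "normal" training data, then flag test points whose unconditional density p(x) falls below a hand-chosen threshold. The sketch below uses a single multivariate Gaussian and a percentile-based threshold purely for illustration; the paper's contribution is to replace this heuristic cut-off with thresholds derived from extreme value statistics, which are not reproduced here. All data and parameter choices are assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative "normal" training data: a 2-D Gaussian cluster.
X = rng.normal(loc=0.0, scale=1.0, size=(500, 2))

# Fit a single multivariate Gaussian density model to the normal data.
mu = X.mean(axis=0)
cov = np.cov(X, rowvar=False)
cov_inv = np.linalg.inv(cov)
norm_const = 1.0 / np.sqrt((2 * np.pi) ** 2 * np.linalg.det(cov))

def density(points):
    """Unconditional density p(x) under the fitted Gaussian model."""
    d = points - mu
    mahal_sq = np.einsum("ij,jk,ik->i", d, cov_inv, d)
    return norm_const * np.exp(-0.5 * mahal_sq)

# Conventional heuristic: threshold p(x) at, e.g., the 1st percentile of
# the training densities; test points below it are classified "abnormal".
threshold = np.quantile(density(X), 0.01)

test = np.array([[0.1, -0.2],   # near the normal cluster
                 [6.0, 6.0]])   # far from it
abnormal = density(test) < threshold
print(abnormal.tolist())  # → [False, True]
```

The weakness this paper addresses is precisely the arbitrariness of the 1st-percentile choice above: extreme value theory instead models the distribution of the most extreme observations expected under the normal model, yielding a principled decision boundary.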

Original publication

Conference paper

Publication Date

13 - 16