Over the last several decades, classification has proved useful in many spheres of science and daily life. Its continued application in numerous fields requires constant improvement and development of its methodological basis. This book investigates the latest advances in data classification that exploit the recently emerged concept of the statistical depth function. The first chapter provides a concise introduction to depth and classification. Chapter 2 introduces a depth-based methodology for supervised learning, the so-called DDalpha-classifier. This is a two-step procedure: it first maps the learning sample into the space of its depths with respect to the training classes (the DD-plot), and then separates it, close to optimally, using the alpha-procedure, a projective-invariant separation method. The developed technique is completely nonparametric, robust, and very fast. Chapter 3 then addresses in detail the problem of "outsiders", points that cannot be assigned to any class on the basis of depth values alone; it suggests a number of outsider treatments and explores how to configure the classifier. Chapter 4 extends this machinery to functional data. Dedicated sections prove the Bayes optimality of the procedure under standard settings and conduct a broad experimental analysis with synthetic data as well as 50 real-world classification problems. Chapter 5 proposes two algorithms for the exact computation of the Tukey (halfspace, location) depth. One performs a breadth-first traversal of the cone segmentation of the multivariate space using linear programming; the other is combinatorial in nature and admits a simple implementation.
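To illustrate the two-step idea behind the DDalpha-classifier described above, here is a minimal sketch: points are mapped to DD-plot coordinates (depth with respect to each training class) and then separated. For simplicity the sketch uses Mahalanobis depth and the plain max-depth rule in place of the alpha-procedure; the synthetic two-class data, class locations, and all function names are illustrative assumptions, not the book's implementation.

```python
import numpy as np

def mahalanobis_depth(x, X):
    """Mahalanobis depth of x w.r.t. sample X: 1 / (1 + squared Mahalanobis distance)."""
    mu = X.mean(axis=0)
    S_inv = np.linalg.inv(np.cov(X, rowvar=False))
    d = x - mu
    return 1.0 / (1.0 + d @ S_inv @ d)

# Illustrative training classes (two well-separated Gaussian samples).
rng = np.random.default_rng(0)
X0 = rng.normal(loc=0.0, scale=1.0, size=(200, 2))
X1 = rng.normal(loc=4.0, scale=1.0, size=(200, 2))

def dd_plot_coords(x):
    """Step 1: map x into depth space -- its DD-plot coordinates."""
    return mahalanobis_depth(x, X0), mahalanobis_depth(x, X1)

def classify(x):
    """Step 2 (stand-in): max-depth rule instead of the alpha-procedure."""
    d0, d1 = dd_plot_coords(x)
    return 0 if d0 >= d1 else 1
```

Note that with this smooth parametric depth no outsiders arise; the outsider problem of Chapter 3 appears with depths such as the Tukey depth, which vanish outside the convex hull of each training class.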
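The combinatorial flavor of exact Tukey depth computation can be conveyed by a small bivariate sketch (the book's algorithms target the general multivariate case; this toy version, including all names and the perturbation trick, is an assumption for illustration only). It exploits the fact that a minimizing closed halfplane can be rotated about the query point until its boundary passes through a data point, so only finitely many candidate orientations need checking.

```python
import numpy as np

def tukey_depth(z, X, eps=1e-7):
    """Exact Tukey (halfspace) depth of z w.r.t. the rows of X, in 2D,
    returned as a point count (divide by len(X) for the normalized depth).

    For every data point, the line through z and that point is a candidate
    boundary; the two generic orientations obtained by perturbing it by
    +/- eps cover all locally minimal closed halfplanes.
    """
    D = X - z
    on_z = np.all(np.abs(D) < eps, axis=1)   # points coinciding with z
    D = D[~on_z]
    m = len(D)
    if m == 0:
        return len(X)
    angles = np.arctan2(D[:, 1], D[:, 0])
    best = m
    for a in angles:
        for s in (eps, -eps):                # perturb to a generic orientation
            phi = a + np.pi / 2 + s          # inward-normal angle of the halfplane
            u = np.array([np.cos(phi), np.sin(phi)])
            c = int(np.sum(D @ u >= 0.0))    # points in the closed halfplane
            best = min(best, c, m - c)       # also check the opposite side
    return best + int(on_z.sum())
```

For the four vertices of a diamond, the center has depth 2 (every closed halfplane through it contains at least two vertices), while each vertex has depth 1.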