Large-scale data processing is important. Most successful applications of modern science and engineering, from discovering the human genome to predicting weather to controlling space missions, involve processing large amounts of data and large knowledge bases. The corresponding large-scale data and knowledge processing requires intensive use of computers. Computers are based on processing exact data values and truth values from the traditional two-valued logic. The ability of computers to perform fast data and knowledge processing is based on the hardware support for super-fast elementary computer operations, such as performing arithmetic operations with (exactly known) numbers and performing logical operations with binary ("true"-"false") logical values.

In practical applications, we need to go beyond such operations: input data is usually known only with uncertainty. Let us first illustrate this need on the example of operations with numbers. Hardware-supported computer operations (implicitly) assume that we know the exact values of the input quantities. In reality, the input data usually comes from measurements, and measurements are never 100% accurate. Due to such factors as the imperfection of measurement instruments and the impossibility of reducing the noise level to 0, the measured value x̃ of each input quantity is, in general, different from the (unknown) actual value x of this quantity: the measurement error Δx = x̃ − x is, in general, different from 0. It is therefore necessary to find out how this input uncertainty affects the results of data processing.
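The effect of a bounded measurement error on a computed result can be sketched as follows. This is a minimal illustration, not taken from the book; the bound `delta` and the function passed to `apply_monotone` are hypothetical examples:

```python
# If all we know about the measurement error dx = x_tilde - x is a bound
# |dx| <= delta, then the (unknown) actual value x is guaranteed to lie
# in the interval [x_tilde - delta, x_tilde + delta].

def enclosure(x_tilde, delta):
    """Interval guaranteed to contain the actual value x."""
    return (x_tilde - delta, x_tilde + delta)

def apply_monotone(f, interval):
    """Propagate an interval through a monotonically increasing function f:
    the image of [lo, hi] under f is exactly [f(lo), f(hi)]."""
    lo, hi = interval
    return (f(lo), f(hi))

# Measured value 10.0 with guaranteed accuracy 0.1: actual x is in [9.9, 10.1].
x_interval = enclosure(10.0, 0.1)
# The result of processing, y = 2x + 1, is then guaranteed to lie in [20.8, 21.2].
y_interval = apply_monotone(lambda x: 2 * x + 1, x_interval)
print(y_interval)
```

For non-monotone functions or multiple inputs, full interval arithmetic is needed instead of this endpoint rule, but the idea is the same: bounded input uncertainty yields guaranteed bounds on the result.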
Sometimes, we know the corresponding probability distribution of the measurement error; sometimes, we only know the upper bounds on the measurement error -- which leads to interval bounds on the (unknown) actual value. Also, experts are usually not 100% certain about the statements included in the knowledge bases. A natural way to describe this uncertainty is to use non-classical logics (probabilistic, fuzzy, etc.).
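As a small illustration of the non-classical-logics side (a generic sketch of standard fuzzy logic, not the specific formalisms developed in this volume), expert certainty can be modeled by degrees in [0, 1] instead of binary truth values, with min/max as the standard fuzzy "and"/"or":

```python
# Classical logic: truth values in {0, 1}.
# Fuzzy logic: degrees of certainty in the interval [0, 1].

def fuzzy_and(a, b):
    return min(a, b)      # standard (min) t-norm for "and"

def fuzzy_or(a, b):
    return max(a, b)      # standard (max) t-conorm for "or"

def fuzzy_not(a):
    return 1.0 - a        # standard negation

# An expert is 0.8 certain of statement P and 0.6 certain of statement Q:
p, q = 0.8, 0.6
print(fuzzy_and(p, q))               # 0.6
print(fuzzy_or(p, fuzzy_not(q)))     # 0.8
```

Other t-norms (product, Łukasiewicz) give alternative fuzzy logics; which one is appropriate depends on the application.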
This book contains proceedings of the first international workshop that brought together researchers working on interval and probabilistic uncertainty and on non-classical logics. We hope that this workshop will lead to a boost in the much-needed collaboration between the uncertainty analysis and non-classical logic communities, and thus, to better processing of uncertainty.