The general concept of information is here, for the first time, defined mathematically by adding a single axiom to probability theory. This Mathematical Theory of Information is explored in fourteen chapters:

1. Information can be measured in different units, in anything from bits to dollars. We argue that any measure is acceptable as long as it does not violate the Law of Diminishing Information. This law is supported by two independent arguments: one derived from the Bar-Hillel ideal receiver, the other based on Shannon's noisy channel. The entropy of 'classical information theory' is one of the measures conforming to the Law of Diminishing Information, but it has properties, such as being symmetric, that make it unsuitable for some applications. The reliability measure is found to be a universal information measure.

2. For discrete and finite signals, the Law of Diminishing Information is defined mathematically, using probability theory and matrix algebra (see the sketch after this list).

3. The Law of Diminishing Information is used as an axiom to derive essential properties of information: Byron's law (there is more information in a lie than in gibberish), preservation (no information is lost in a reversible channel), and so on. The Mathematical Theory of Information supports colligation, i.e. the ability to bind facts together, making 'two plus two greater than four'. Colligation is essential when the information carries knowledge or serves as a basis for decisions; in such cases, reliability is always a useful information measure. Entropy does not allow colligation.
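As a hedged illustration of point 2 above (not taken from the book): for discrete, finite signals described by probability vectors and row-stochastic channel matrices, the entropy-based measure obeys the classical data-processing inequality, one concrete instance of information that can only diminish when a signal passes through a further channel. The sketch below checks this numerically; the source distribution and the channel matrices A and B are made-up example values.

```python
# Minimal numerical sketch (not the book's formalism): mutual information
# cannot increase when a second discrete channel is cascaded after the first.
import numpy as np

def mutual_information(p_x, channel):
    """I(X;Y) in bits for source distribution p_x and a row-stochastic
    channel matrix channel[i, j] = P(Y=j | X=i)."""
    p_xy = p_x[:, None] * channel          # joint distribution P(X=i, Y=j)
    p_y = p_xy.sum(axis=0)                 # marginal P(Y=j)
    mask = p_xy > 0                        # skip zero-probability cells
    ratio = p_xy[mask] / (p_x[:, None] * p_y[None, :])[mask]
    return float(np.sum(p_xy[mask] * np.log2(ratio)))

# Source distribution over a 3-symbol alphabet (arbitrary example values).
p_x = np.array([0.5, 0.3, 0.2])

# First channel A: X -> Y, second channel B: Y -> Z (rows sum to 1).
A = np.array([[0.9, 0.05, 0.05],
              [0.1, 0.8,  0.1 ],
              [0.2, 0.2,  0.6 ]])
B = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.7, 0.2],
              [0.3, 0.3, 0.4]])

# The cascade X -> Z is described by the matrix product A @ B.
i_xy = mutual_information(p_x, A)
i_xz = mutual_information(p_x, A @ B)
print(f"I(X;Y) = {i_xy:.4f} bits")
print(f"I(X;Z) = {i_xz:.4f} bits")
assert i_xz <= i_xy + 1e-12  # information has not increased through channel B
```

Running the script prints both values and asserts that the cascaded channel never yields more mutual information about the source than the first channel alone.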
The Mathematical Theory of Information presents a new mathematical theory of information, built on a single powerful postulate: the Law of Diminishing Information. The concept of information is here, for the first time, defined mathematically by adding this postulate to the axioms of probability theory. The Law of Diminishing Information is founded on a fusion of two fundamental ideas: Carnap and Bar-Hillel's 'Ideal Receiver' and Shannon's 'Noisy Channel'. The Law of Diminishing Information is applied to information technology, game theory, legislation, the logic of research, algorithmic information, chaos theory, control engineering, medical tests, and biological evolution. In physics, both the Second Law of Thermodynamics and Schrödinger's wave function are derived from the Law of Diminishing Information. Conventional information theory, that of telecommunications, is analyzed as a special case, and eight conditions for its applicability are listed. The reader will gain the essential ideas needed to understand and use the concept of information. The Mathematical Theory of Information is suitable as a textbook in general information theory for students of technical, scientific, and mathematical subjects, and it is ideal as a supplementary textbook in traditional courses on telecommunications information theory at all levels. The website of the book is www.matheory.info.