Product Description
This book is an introduction to the mathematical description of information in science. The necessary mathematical theory is treated in a more vivid way than in the usual theorem-proof structure, which makes it possible to develop an idea of the connections between different information measures and to understand the trains of thought behind their derivation, a crucial point for correct applications. Our intention in the mathematical descriptions is therefore to develop the important ideas of the derivations, so that we obtain the resulting functions as well as the main thoughts and the conditions for the validity of the results. This simplifies the handling of the information measures, which can be hard to classify without additional background information. Although the mathematical descriptions are the exact formulations of the measures examined, we do not restrict ourselves to rigorous mathematical considerations; we also place the different measures in the structure and context of possible information measures. Nevertheless, the mathematical approach is unavoidable when we are looking for an objective description and for possible applications in optimization.
About the Author
Information Measures introduces the mathematical description of information in science and engineering, treating the necessary mathematical theory in a more vivid way than in the usual theorem-proof structure. This enables readers to develop an idea of the connections between different information measures and to understand the trains of thought behind their derivation. As there exists a great number of different possible ways to describe information, these measures are presented in a coherent manner. Examples of the measures treated: Shannon information, applied in coding theory; the Akaike information criterion, used in system identification to determine auto-regressive models and in neural networks to identify the number of neurons; and the Cramér-Rao bound, or Fisher information, describing the minimal variances achieved by unbiased estimators.
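To give a flavor of two of the measures named above, here is a minimal illustrative sketch in Python (not taken from the book): the Shannon entropy of a discrete distribution and the Akaike information criterion computed from a model's log-likelihood. The function names and the example values are our own choices for illustration.

```python
import math

def shannon_entropy(p):
    """Shannon information (entropy) of a discrete distribution p, in bits:
    H(p) = -sum_i p_i * log2(p_i), skipping zero-probability outcomes."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def aic(log_likelihood, k):
    """Akaike information criterion: AIC = 2k - 2 ln L,
    where k is the number of model parameters and ln L the
    maximized log-likelihood. Lower AIC indicates a better
    trade-off between fit and model complexity."""
    return 2 * k - 2 * log_likelihood

# A fair coin carries exactly one bit of information per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# Comparing two hypothetical auto-regressive models: the second has a
# better likelihood but pays a penalty for its extra parameters.
print(aic(-10.0, 3))                 # 26.0
print(aic(-9.5, 6))                  # 31.0
```

In the AIC comparison, the simpler model wins despite its slightly worse likelihood, which is exactly the kind of model-order selection the book describes for auto-regressive models and neural networks.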
Reviews
"Bioinformaticians are facing the challenge of how to handle immense amounts of raw data, such as are generated from genome mapping, make sense of them, and render them accessible to scientists working on a wide variety of problems. "Information Measures: Information and its Description in Science and Engineering" can be such a tool."

IEEE Engineering in Medicine and Biology