Artificial neural networks have been recognized as a powerful tool to learn and reproduce systems in various fields of application. Neural networks are inspired by the behavior of the brain and consist of one or several layers of neurons, or computing units, connected by links. Each artificial neuron receives an input value from the input layer or from the neurons in the previous layer. It then computes a scalar output from a linear combination of the received inputs using a given scalar function (the activation function), which is assumed to be the same for all neurons. One of the main properties of neural networks is their ability to learn from data. There are two types of learning: structural and parametric. Structural learning consists of learning the topology of the network, that is, the number of layers, the number of neurons in each layer, and which neurons are connected. This process is done by trial and error until a good fit to the data is obtained. Parametric learning consists of learning the weight values for a given topology of the network. Since the neural functions are given, this learning process is achieved by estimating the connection weights based on the given information. To this end, an error function is minimized using one of several well-known learning methods, such as the backpropagation algorithm. Unfortunately, for these methods: (a) the function resulting from the learning process has no physical or engineering interpretation, so neural networks are seen as black boxes.
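As a rough illustration of the neuron model described above (not code from the book), the forward computation of a single neuron can be sketched in Java as follows; the class name, the sample weights, and the choice of a logistic sigmoid activation are assumptions made only for this example.

    // Sketch of one artificial neuron: a weighted linear combination of its
    // inputs passed through an activation function. The sigmoid choice and
    // the sample weights below are illustrative assumptions.
    public class Neuron {
        private final double[] weights; // one weight per incoming link
        private final double bias;

        public Neuron(double[] weights, double bias) {
            this.weights = weights;
            this.bias = bias;
        }

        // Activation function, assumed here to be the logistic sigmoid.
        private static double sigmoid(double z) {
            return 1.0 / (1.0 + Math.exp(-z));
        }

        // Scalar output computed from the linear combination of the inputs.
        public double output(double[] inputs) {
            double z = bias;
            for (int i = 0; i < weights.length; i++) {
                z += weights[i] * inputs[i];
            }
            return sigmoid(z);
        }

        public static void main(String[] args) {
            Neuron n = new Neuron(new double[]{0.5, -0.3}, 0.1);
            System.out.println(n.output(new double[]{1.0, 2.0}));
        }
    }

In parametric learning, the weights and bias of such neurons are the quantities adjusted so as to minimize an error function over the data, for example by backpropagation.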
This book introduces 'functional networks', a novel neural-based paradigm, and shows that functional network architectures can be efficiently applied to solve many interesting practical problems. Included are an introduction to neural networks, a description of functional networks, examples of applications, and computer programs in the Mathematica and Java languages implementing the various algorithms and methodologies. Special emphasis is given to applications in several areas, such as:

* Box-Jenkins AR(p), MA(q), ARMA(p,q), and ARIMA(p,d,q) models, with application to real-life economic problems such as the consumer price index, electric power consumption, and international airline passenger data. Random time series and chaotic series are considered in relation to the Hénon, Lozi, Holmes, and Burger maps, as well as the problems of noise reduction and information masking (a brief sketch of the Hénon map is given after this list).

* Learning differential equations from data and deriving the corresponding equivalent difference and functional equations. Examples of a mass supported by two springs and a viscous damper (dashpot), and of a loaded beam, are used to illustrate the concepts.

* The problem of obtaining the most general family of implicit, explicit, and parametric surfaces, as used in Computer Aided Design (CAD).

* Applications of functional networks to obtaining general nonlinear regression models, which are compared with standard techniques.

Functional Networks with Applications: A Neural-Based Paradigm will be of interest to individuals who work in computer science, physics, engineering, applied mathematics, statistics, economics, and other fields related to neural networks and data analysis.
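For readers unfamiliar with the chaotic series mentioned above, the Hénon map is one of the simplest examples. The short Java sketch below generates such a series using the classical parameter values a = 1.4 and b = 0.3; these values, the initial condition, and the class name are assumptions for illustration and are not taken from the book.

    // Sketch: generate a chaotic time series from the Henon map
    //   x(n+1) = 1 - a*x(n)^2 + y(n),   y(n+1) = b*x(n).
    // The parameters a = 1.4, b = 0.3 are the classical choice, used here
    // only for illustration.
    public class HenonSeries {
        public static void main(String[] args) {
            double a = 1.4, b = 0.3;
            double x = 0.1, y = 0.0; // arbitrary initial condition
            for (int n = 0; n < 20; n++) {
                double xNext = 1.0 - a * x * x + y;
                double yNext = b * x;
                x = xNext;
                y = yNext;
                System.out.printf("%d\t%.6f%n", n, x);
            }
        }
    }

A series of this kind is the type of data to which the time-series and noise-reduction applications described above are addressed.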