
Saturday, June 1, 2013

Examples of Artificial Neural Networks

Artificial Neural Networks (ANNs) are part of Artificial Intelligence (AI), the area of computer science concerned with making computers behave more intelligently. ANNs are modeled on the brain, where neurons are connected in complex patterns to process data from the senses, establish memories and control the body. ANNs process data and exhibit intelligent behaviors such as learning, generalization and pattern recognition.

Backpropagation Nets

    Backpropagation nets are the most common kind of ANN. The basic topology is layers of neurons connected to one another. A presented pattern causes information to flow forward through the layers; the resulting errors then "backpropagate" in the other direction, adjusting the strength of the interconnections between layers. Backpropagation nets learn to classify patterns: when a pattern is presented, the interconnections between the artificial neurons are adjusted until the net gives a correct response. After sufficient training with a number of patterns, the net will also give the correct response to a pattern it has never seen; that is, it learns to generalize. One of the most successful backpropagation nets is NETtalk, invented by Terry Sejnowski, professor and head of the Computational Neurobiology Laboratory at the Salk Institute in La Jolla, California. This net learns to read English (or any other language) and is used all over the world to read to blind people.
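    A minimal sketch of this idea in Python with NumPy is shown below. It is not NETtalk or any net mentioned above; the XOR training patterns, the single hidden layer of 8 units and the learning rate are illustrative assumptions.

# Sketch of a one-hidden-layer backpropagation net trained on the XOR
# classification task (illustrative data, not from the article).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Training patterns and their target classes (XOR).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Interconnection strengths (weights) between the layers.
W1 = rng.normal(size=(2, 8))   # input -> hidden
b1 = np.zeros((1, 8))
W2 = rng.normal(size=(8, 1))   # hidden -> output
b2 = np.zeros((1, 1))

lr = 1.0
for epoch in range(5000):
    # Forward pass: information flows input -> hidden -> output.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: the error "backpropagates" from the output layer
    # to the hidden layer, and the weights between layers are adjusted.
    err_out = (out - y) * out * (1 - out)
    err_hid = (err_out @ W2.T) * h * (1 - h)

    W2 -= lr * h.T @ err_out
    b2 -= lr * err_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ err_hid
    b1 -= lr * err_hid.sum(axis=0, keepdims=True)

print(out.round(2))   # should approach the targets 0, 1, 1, 0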

Hopfield Nets

    John Hopfield, a Nobel Prize-winning physicist at the California Institute of Technology (Caltech), invented Hopfield nets. The basic topology is that every artificial neuron is connected to every other artificial neuron. These nets memorize collections of patterns: given part of a stored pattern, or a badly distorted version of it, the net delivers the complete pattern. Hopfield nets have found application in fingerprint recognition, where a partial or smudged print can be restored to the complete fingerprint. NASA uses Hopfield nets to orient deep-space craft by visual star fields: when the craft images the stars, a Hopfield net matches the view against known star pictures to determine the craft's orientation.
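    The sketch below shows the same mechanism in miniature, again in Python with NumPy; the two random 25-unit patterns stand in for fingerprints or star fields and are purely illustrative assumptions.

# Sketch of a Hopfield net: memorize bipolar patterns with a Hebbian
# rule, then recall a stored pattern from a distorted copy.
import numpy as np

rng = np.random.default_rng(1)

def train(patterns):
    """Every neuron is connected to every other neuron (no self-connections)."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)
    return W / len(patterns)

def recall(W, state, sweeps=10):
    """Update neurons asynchronously until the net settles on a pattern."""
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Two stored patterns of 25 bipolar (+1/-1) units.
patterns = np.array([rng.choice([-1, 1], size=25) for _ in range(2)])
W = train(patterns)

# Distort the first pattern by flipping five units, then recall it.
probe = patterns[0].copy()
probe[:5] *= -1
restored = recall(W, probe)
print(np.array_equal(restored, patterns[0]))   # typically True

    Because every unit feeds back to every other unit, the state tends to settle into the nearest stored pattern, which is what makes recall from a partial or distorted input possible.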

Self-Organizing Maps

    Finnish professor Teuvo Kohonen invented self-organizing maps, also known as Kohonen nets. The basic topology is that each artificial neuron is connected only to its neighbors. Kohonen nets reduce the complexity of data, especially experimentally obtained data. Repeatedly "training" a Kohonen net on an n-dimensional data set can produce a lower-dimensional data set that captures the essential structure of the original in a much simpler form. A major application of self-organizing maps is in projects that seek a simpler way to understand the Internet, and Kohonen nets are regularly used as a preprocessor for other types of ANN.
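    A rough sketch of the idea follows, in Python with NumPy: a small 10x10 Kohonen grid is trained on random 3-dimensional data so that each input ends up mapped to a 2-D grid location. The grid size, the decay schedules and the random data are all illustrative assumptions.

# Sketch of a self-organizing map: a 10x10 grid of units learns a
# 2-D representation of 3-dimensional data (random data for illustration).
import numpy as np

rng = np.random.default_rng(2)

grid_h, grid_w, dim = 10, 10, 3
weights = rng.random((grid_h, grid_w, dim))   # one weight vector per grid unit
data = rng.random((500, dim))                 # the "n-dimensional" input data

# Grid coordinates, used to measure neighborhood distance on the map.
coords = np.stack(np.meshgrid(np.arange(grid_h), np.arange(grid_w),
                              indexing="ij"), axis=-1)

epochs = 20
for epoch in range(epochs):
    lr = 0.5 * (1 - epoch / epochs)                         # decaying learning rate
    radius = max(1.0, (grid_h / 2) * (1 - epoch / epochs))  # shrinking neighborhood
    for x in rng.permutation(data):
        # Find the best-matching unit (the closest weight vector).
        dists = np.linalg.norm(weights - x, axis=-1)
        bmu = np.unravel_index(dists.argmin(), dists.shape)
        # Pull the winner and its grid neighbors toward the input;
        # only nearby units move much, mirroring the local connectivity.
        grid_dist = np.linalg.norm(coords - np.array(bmu), axis=-1)
        influence = np.exp(-(grid_dist ** 2) / (2 * radius ** 2))
        weights += lr * influence[..., None] * (x - weights)

# Each 3-D input now maps to a 2-D grid position: a simpler view of the data.
sample = data[0]
print(np.unravel_index(np.linalg.norm(weights - sample, axis=-1).argmin(),
                       (grid_h, grid_w)))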
