Artificial Neural Network Theses (4)

Artificial neural network learning mechanism enhancements.

McCormack, C. University College Cork (Ireland), ProQuest Dissertations Publishing, 1977. U100888.

ABSTRACT

Neural network learning rules have undergone a continuous process of refinement and improvement over the last few years. Unfortunately, in spite of improvements to the efficiency of learning rules, it remains necessary to manually select appropriate learning rule parameter values in order to achieve an acceptable solution. The learning rule parameters which yield the highest-quality performance (where quality can be defined as the speed of convergence and the accuracy of the resultant network) are usually unique to each problem, with no effective method of judging which parameter value is suitable for which problem. This is a significant shortcoming in the area of learning rule implementation, as the use of an inappropriate parameter can have a marked effect on the performance of most learning rules. This work outlines an application-independent method of automating learning rule parameter selection using a form of supervisor neural network, known as a Meta Neural Network, to alter the value of a learning rule parameter during training. The Meta Neural Network is trained using data generated by observing the training of a neural network and recording the effects of selecting various parameter values.

The Meta Neural Network is then combined with a learning rule and used to augment the learning rule's performance. Experiments were undertaken to see how the method performs by using it to adapt a global parameter of the RPROP and Quickprop learning rules. The method was found to yield consistently superior performance over conventional methods. This method of creating a Meta Neural Network is proposed as a first step in an attempt to develop a self-modifying neural network; it requires no intervention, performs consistently, and incurs minimal computational overhead. Two improvements of the Meta Neural Network scheme are discussed: one investigates the result of combining training sets from several Meta Neural Networks, while the other involves polling a number of individual Meta Neural Networks to produce a consensus on how learning rule parameters should be adapted.
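
Neither the abstract nor this record reproduces the author's code, so the following is only a hedged sketch of the general idea under assumed details: a tiny supervisor ("meta") network observes coarse training statistics (current loss, change in loss, gradient norm) and rescales one global learning-rule parameter, here a plain gradient-descent learning rate standing in for the RPROP/Quickprop parameters studied in the thesis. The MetaNet class, its input features, the XOR base task, and the clipping range are all illustrative assumptions; in the thesis the supervisor network is trained on recorded observations, whereas here it is left untrained purely to show the control loop.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MetaNet:
    """Illustrative supervisor network (untrained here): maps training
    observations to a multiplicative adjustment of one global parameter."""
    def __init__(self, n_in=3, n_hidden=5):
        self.W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
        self.W2 = rng.normal(scale=0.5, size=(n_hidden, 1))

    def adjust(self, features):
        h = np.tanh(features @ self.W1)
        # conservative output range: scale factor in (0.5, 1.5)
        return 0.5 + sigmoid(h @ self.W2)[0]

# Toy base network trained by gradient descent on XOR; the MetaNet rescales
# the global learning rate after each epoch based on observed statistics.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4))
W2 = rng.normal(size=(4, 1))
lr = 0.5                       # the global parameter being adapted
meta = MetaNet()
prev_loss = None

for _ in range(2000):
    # forward pass
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    loss = float(np.mean((out - y) ** 2))

    # backward pass (constant factors folded into the learning rate)
    d_out = (out - y) * out * (1 - out)
    grad_W2 = h.T @ d_out
    d_h = (d_out @ W2.T) * h * (1 - h)
    grad_W1 = X.T @ d_h
    grad_norm = float(np.sqrt((grad_W1 ** 2).sum() + (grad_W2 ** 2).sum()))

    # meta step: observe training, rescale the parameter, clip to a safe range
    if prev_loss is not None:
        features = np.array([loss, prev_loss - loss, grad_norm])
        lr = float(np.clip(lr * meta.adjust(features), 1e-3, 2.0))
    prev_loss = loss

    # ordinary weight update with the adapted learning rate
    W1 -= lr * grad_W1
    W2 -= lr * grad_W2

print(f"final loss {loss:.4f}, final learning rate {lr:.3f}")
```

The point of the sketch is that the meta step touches only the global parameter; the base learning rule's weight update is left unchanged.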

 


DETAILS

Subject: Artificial intelligence
Classification: 0800: Artificial intelligence
Identifier / keyword: (UMI) AAIU100888; Applied sciences
Number of pages: 1
Publication year: 1977
Degree date: 1977
School code: 1269
Source: DAI-C 70/21, Dissertation Abstracts International
Place of publication: Ann Arbor
Country of publication: United States
Publication subject: Psychology--Abstracting, Bibliographies, Statistics
University/institution: University College Cork (Ireland)
University location: Ireland
Degree: Ph.D.
Source type: Dissertations & Theses
Language: English
Document type: Dissertation/Thesis
Dissertation/thesis number: U100888
ProQuest document ID: 301400872
Document URL: https://search.proquest.com/docview/301400872?accountid=8243
Copyright: Database copyright ProQuest LLC; ProQuest does not claim copyright in the individual underlying works.
Database: ProQuest Dissertations & Theses Global

Artificial Brain Again Seen As a Guide To the Mind

Johnson, George. New York Times, Late Edition (East Coast); New York, N.Y. [New York, N.Y.] 16 Aug 1988: C.1.

ABSTRACT

"The possibility is really before us to start making models that bridge the gap between neurons and complex behavior," said Patricia Smith Churchland, a philosopher at the University of California at San Diego. "For years there has been this great no-man's-land between the two levels. Now we're really beginning to investigate that no-man's-land, and in the long haul we might get a theory of how the brain works."

The new developments in neural networks "are tremendously important for philosophers," she said. "Plato's question was, 'What is knowledge and how is it possible?' Now we have a real shot at answering that."

"Neural networks are serving as a medium or a mathematical language for describing all these phenomena," said Terrence J. Sejnowski of Johns Hopkins University's Department of Biophysics. "There is still a big gulf between psychology and neuroscience, but at least they're now in the same mathematical ballpark."

 

 


DETAILS

Subject: DATA PROCESSING (COMPUTERS); BRAIN; PSYCHOLOGY AND PSYCHOLOGISTS
People: Minsky, Marvin; Johnson, George
Publication title: New York Times, Late Edition (East Coast); New York, N.Y.
Pages: C.1
Publication year: 1988
Publication date: Aug 16, 1988
Section: C
Publisher: New York Times Company
Place of publication: New York, N.Y.
Country of publication: United States
Publication subject: General Interest Periodicals--United States
ISSN: 03624331
CODEN: NYTIAO
Source type: Newspapers
Language of publication: English
Document type: NEWSPAPER
ProQuest document ID: 426918985
Document URL: https://search.proquest.com/docview/426918985?accountid=8243
Copyright: Copyright New York Times Company Aug 16, 1988
Last updated: 2017-11-15
Database: Global Newsstream

Computers bridging gap in artificial intelligence

Reuters. Financial Post; Toronto, Ont. [Toronto, Ont.] 23 Sep 1988: 14.

ABSTRACT

Neural networks are a type of artificial intelligence. Expert systems, the most common artificial intelligence approach, are programmed with a set of rules and make determinations by applying those rules.

Neural network consultant Tom Schwartz of Schwartz Associates estimates that sales of neural network modeling tools, the building blocks for applications, will grow 50% a year for the next five years, to US$150 million by 1992.

Japan's Ministry of International Trade and Industry received funding requests last month for a major effort to develop neural network technologies. [Edward Rosenfeld] said that although Japan lags behind the United States and Europe in this area now, indications are that neural networks will be a major part of its Sixth Generation Human Frontier Project, a consortium of government and industry to develop advanced technologies.

 


DETAILS

Company / organization: Name: Nestor Inc; Ticker: NEST; NAICS: 511210; SIC: 7372; DUNS: 11-600-9499
Publication title: Financial Post; Toronto, Ont.
Pages: 14
Number of pages: 0
Publication year: 1988
Publication date: Sep 23, 1988
Dateline: Boston, MA
Section: 1, News
Publisher: The Financial Times Limited
Place of publication: Toronto, Ont.
Country of publication: United Kingdom
Publication subject: Business And Economics--Banking And Finance
ISSN: 08388431
Source type: Newspapers
Language of publication: English
Document type: NEWS
ProQuest document ID: 436739422
Document URL: https://search.proquest.com/docview/436739422?accountid=8243
Copyright: (Copyright The Financial Post 1988)
Last updated: 2010-06-29
Database: Global Newsstream

Digital implementation issues of artificial neural networks

Pesulima, Edward Elisha. Florida Atlantic University, ProQuest Dissertations Publishing, 1990. 1341366.

ABSTRACT

Recent years have seen a renaissance of the neural network field. Significant advances in our understanding of neural networks and their possible applications necessitate investigations into possible implementation strategies. Among the presently available implementation media, digital VLSI hardware is one of the more promising because of its maturity and availability. We discuss various issues connected with implementing neural networks in digital VLSI hardware. A new sigmoidal transfer function is proposed with that implementation in mind. Possible realizations of the function for stochastic and deterministic neural networks are discussed. Simulation studies of applying neural networks to constraint optimization and learning problems are carried out. These simulations were performed strictly in integer arithmetic. The simulation results provide an encouraging outlook for implementing these neural network applications in digital VLSI hardware. Important results concerning the sizes of various network values were found for the learning algorithms.
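
The abstract proposes a new sigmoidal transfer function for digital hardware and reports simulations run strictly in integer arithmetic, but its exact form is not given here. As a hedged illustration of the kind of hardware-friendly approximation such work typically relies on (not the thesis's actual function), the sketch below evaluates a piecewise-linear sigmoid entirely in Q8.8 fixed-point integer arithmetic using only shifts, adds, and comparisons; the number format, breakpoints, and slope constants are assumptions chosen for illustration.

```python
import math

# Q8.8 fixed point: an integer n represents the real value n / 2**FRAC_BITS.
# The format and the piecewise-linear constants below are illustrative only,
# not the transfer function proposed in the thesis.
FRAC_BITS = 8
ONE = 1 << FRAC_BITS           # fixed-point 1.0 (= 256)

def fixed_sigmoid(x_fx):
    """Piecewise-linear sigmoid on a Q8.8 integer input, shifts and adds only."""
    neg = x_fx < 0
    x = -x_fx if neg else x_fx          # exploit symmetry: s(-x) = 1 - s(x)
    if x >= 5 * ONE:                    # saturate beyond |x| = 5
        y = ONE
    elif x >= 608:                      # 2.375 <= |x| < 5: slope 1/32, offset 0.84375
        y = (x >> 5) + 216
    elif x >= ONE:                      # 1 <= |x| < 2.375: slope 1/8, offset 0.625
        y = (x >> 3) + 160
    else:                               # 0 <= |x| < 1: slope 1/4, offset 0.5
        y = (x >> 2) + 128
    y = min(y, ONE)
    return ONE - y if neg else y

# Quick comparison against the floating-point sigmoid.
for v in (-4.0, -1.0, 0.0, 0.5, 2.0, 4.0):
    approx = fixed_sigmoid(int(round(v * ONE))) / ONE
    exact = 1.0 / (1.0 + math.exp(-v))
    print(f"x={v:+.2f}  approx={approx:.3f}  exact={exact:.3f}")
```

In actual digital VLSI the same breakpoints would map to comparators, shifters, and adders; the Python above only mimics that integer data path.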

 


DETAILS

Subject: Computer science; Electrical engineering; Neurology
Classification: 0984: Computer science; 0544: Electrical engineering; 0317: Neurology
Identifier / keyword: Applied sciences; Biological sciences
Number of pages: 224
Publication year: 1990
Degree date: 1990
School code: 0119

 
