Artificial Neural Network Theses (11)

U.S. Seeks Brain-Like Computers

ANDREW POLLACK, Special to the New York Times. New York Times, Late Edition (East Coast); New York, N.Y. [New York, N.Y.] 18 Aug 1988: D.1.

ProQuest document link

 

 

 

 

ABSTRACT

''Bees are pretty smart compared to smart weapons,'' Dr. Fields said. ''Bees can evade. Bees can choose routes and choose targets.''

 

''There's a great deal of concern that they not do what they did with A.I., which was widely regarded as a disaster,'' said James Anderson, professor of cognitive and linguistic sciences and psychology at Brown University. He was referring to the artificial intelligence program, in which more money was allocated than could be used. ''They dumped a huge amount of money, almost without warning. So much money was involved that people couldn't cope with it.''

 

There is also some concern that Darpa's involvement would shift the focus of work toward military goals. ''We suck all of our best scientists into doing military research because that's where the money is,'' said David Rumelhart, professor of psychology and computer science at Stanford University.

 


DETAILS

 

Subject:

DATA PROCESSING (COMPUTERS); NEW MODELS, DESIGN AND PRODUCTS; FINANCES; RESEARCH; BRAIN

 

People:

POLLACK, ANDREW

 

Company / organization:

Name: Defense Advanced Research Projects Agency; NAICS: 928110

 

Publication title:

New York Times, Late Edition (East Coast); New York, N.Y.

 

Pages:

D.1

 

Publication year:

1988

 

Publication date:

Aug 18, 1988

 

Dateline:

SAN FRANCISCO, Aug. 17

 

Section:

D

 

Publisher:

New York Times Company

 

Place of publication:

New York, N.Y.

 

Country of publication:

United States

 

Publication subject:

General Interest Periodicals--United States

 

ISSN:

03624331

 

CODEN:

NYTIAO

 

Source type:

Newspapers

 

Language of publication:

English

 

Document type:

NEWSPAPER

 

ProQuest document ID:

426923754

 

Document URL:

https://search.proquest.com/docview/426923754?accountid=8243

 

Copyright:

Copyright New York Times Company Aug 18, 1988

 

Last updated:

2017-11-15

 

Database:

Global Newsstream

 

 

 

Computers you can train work somewhat like a brain

Garfinkel, Simson L. The Christian Science Monitor; Boston, Mass. [Boston, Mass.] 15 Nov 1988.

ProQuest document link

 

 

 

 

ABSTRACT

 

THE mind-vs.-brain controversy is coming to a head in the world of artificial intelligence and computer science. While researchers admit they are far from a machine that can think - estimates range from 20 to 50 years, or even never - computers already developed replicate many tasks, such as learning, that were once the sole province of biological brains. ''In AI there are two goals,'' says Bernardo Huberman, a scientist at Xerox's Palo Alto Research Center.

 

 


DETAILS

 

Subject:

Neural networks; Studies; Robots; Computers; Artificial intelligence

 

Company / organization:

Name: Defense Advanced Research Projects Agency; NAICS: 928110; Name: Massachusetts Institute of Technology; NAICS: 611310

 

Publication title:

The Christian Science Monitor; Boston, Mass.

 

Publication year:

1988

 

Publication date:

Nov 15, 1988

 

Section:

NEWS

 

Publisher:

The Christian Science Publishing Society (d/b/a "The Christian Science Monitor"), trusteeship under the laws of the Commonwealth of Massachusetts

 

Place of publication:

Boston, Mass.

 

Country of publication:

United States

 

Publication subject:

General Interest Periodicals--United States

 

ISSN:

08827729

 

Source type:

Newspapers

 

Language of publication:

English

 

Document type:

News

 

ProQuest document ID:

1034597637

 

Document URL:

https://search.proquest.com/docview/1034597637?accountid=8243

 

Copyright:

Copyright 1988 The Christian Science Publishing Society

 

Last updated:

2017-11-19

 

Database:

Global Newsstream

 

 

 

A self-organizing neural network for representing sequences

Tolat, Viral Vipin. Stanford University, ProQuest Dissertations Publishing, 1989. 9011589.

ProQuest document link

 

 

 

 

ABSTRACT

A neural network model called the "representation network" is developed. The network differs from other networks in that information is represented by the structure of the network as well as by its weights. Through an unsupervised learning method, the network forms a homeomorphic (topology-preserving) mapping of the input pattern space. Absolute pattern information is represented by the weights of the network, while relational pattern information, e.g., pattern topology, is represented by the structure of the network. Arbitrary spatial topologies can be learned with the choice of an adequate similarity measure. The ability of the network to correctly form a map is proven analytically by describing the behavior of the network with a system of energy equations.
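
To make the topology-preserving idea concrete, here is a minimal sketch of a Kohonen-style self-organizing map in Python. It illustrates the general mechanism the abstract describes, not the dissertation's representation network: node weights store absolute pattern information, while the fixed chain neighborhood stands in for the structural (relational) information. All names and parameter values below are invented for the example.

```python
# Minimal self-organizing-map sketch (illustrative only, not the
# dissertation's representation network). A 1-D chain of nodes learns,
# without supervision, a topology-preserving layout of 2-D input patterns.
import numpy as np

def train_som(patterns, n_nodes=20, epochs=50, lr=0.2, radius=3.0, seed=0):
    rng = np.random.default_rng(seed)
    weights = rng.random((n_nodes, patterns.shape[1]))  # absolute pattern info
    for _ in range(epochs):
        for x in patterns:
            winner = np.argmin(np.linalg.norm(weights - x, axis=1))
            # Nodes near the winner along the chain move more, so neighboring
            # nodes come to represent neighboring regions of pattern space.
            dist = np.abs(np.arange(n_nodes) - winner)
            h = np.exp(-(dist ** 2) / (2 * radius ** 2))
            weights += lr * h[:, None] * (x - weights)
        lr *= 0.95                        # anneal the learning rate
        radius = max(0.5, radius * 0.95)  # shrink the neighborhood
    return weights

patterns = np.random.default_rng(1).random((200, 2))
som_weights = train_som(patterns)
```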

For sequences of patterns, the topology of the patterns is determined by their temporal order; the similarity measure is therefore temporal. Extensions to the network are presented that allow it to form maps of an arbitrary sequence of n-dimensional patterns. Once such a map is formed, the output of the network can be used for sequence recognition and classification. An identification network model is developed for this purpose. The identification network examines the output of the representation network and computes a score that reflects the level of match between the input sequence and the learned sequence; a high score results when the input sequence is the learned sequence. When this model was tested on the task of speaker-dependent isolated digit recognition, close to 100% recognition was obtained.
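
The following is a hypothetical sketch of the sequence-scoring step, under the simplifying assumption that each pattern in a sequence is quantized to its best-matching node in a previously trained map (such as som_weights above) and that the score is simply the fraction of positions where the input and learned sequences activate the same node. The actual identification network is more elaborate; this only illustrates the idea of a score that reflects the level of match.

```python
# Hypothetical sequence-match score (illustrative only). Each pattern is
# quantized to its nearest node; the score is the fraction of time steps at
# which the input sequence and the learned sequence select the same node.
import numpy as np

def to_node_sequence(patterns, weights):
    return [int(np.argmin(np.linalg.norm(weights - x, axis=1))) for x in patterns]

def match_score(input_seq, learned_seq, weights):
    a = to_node_sequence(input_seq, weights)
    b = to_node_sequence(learned_seq, weights)
    matches = sum(1 for i in range(min(len(a), len(b))) if a[i] == b[i])
    return matches / max(len(a), len(b))  # 1.0 when the sequences coincide
```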

Because the topology of the input patterns is preserved by the network's output, that output can be used for nonlinear mapping with a minimal amount of supervised training. A mapping network is developed for this purpose. Supervised training is used to learn a small number of exemplar mappings. Unsupervised learning is then used to train the weights so that unknown mappings are interpolated from the exemplar mappings. Unsupervised learning algorithms are presented for linear and cubic interpolation. Example mapping problems are used to demonstrate the effectiveness of this approach.
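
Below is a stripped-down illustration of interpolating unknown mappings from a few supervised exemplars, using ordinary 1-D piecewise-linear interpolation. In the dissertation the interpolation itself is learned by further unsupervised training of the network weights; the exemplar inputs and outputs here are made-up values.

```python
# Interpolating unknown mappings from a small set of supervised exemplar
# mappings (plain linear interpolation, standing in for the learned version).
import numpy as np

exemplar_in = np.array([0.0, 0.5, 1.0])    # inputs with supervised targets
exemplar_out = np.array([0.0, 0.25, 1.0])  # corresponding target outputs

def mapped(x):
    # Outputs for inputs never seen during supervised training are
    # interpolated from the exemplar mappings.
    return np.interp(x, exemplar_in, exemplar_out)

print(mapped(0.75))  # interpolated output; no supervision was given at 0.75
```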

 


DETAILS

 

Subject:

Electrical engineering; Computer science; Artificial intelligence

 

Classification:

0544: Electrical engineering; 0984: Computer science; 0800: Artificial intelligence

 

Identifier / keyword:

Applied sciences

 

Number of pages:

167

 

Publication year:

1989

 

Degree date:

1989

 

School code:

0212

 

Source:

DAI-B 50/12, Dissertation Abstracts International

 

Place of publication:

Ann Arbor

 

Country of publication:

United States

 

Advisor:

Peterson, Allen M.

 

University/institution:

Stanford University

 

University location:

United States -- California

 

Degree:

Ph.D.

 

Source type:

Dissertations & Theses

 

Language:

English

 

Document type:

Dissertation/Thesis

 

Dissertation/thesis number:

9011589

 

ProQuest document ID:

303723351

 

Document URL:

https://search.proquest.com/docview/303723351?accountid=8243

 

Copyright:

Database copyright ProQuest LLC; ProQuest does not claim copyright in the individual underlying works.

 

Database:

ProQuest Dissertations & Theses Global

 

 

 

Adaptive control of nonlinear systems using neural networks

Chen, Fu-Chuang. Michigan State University, ProQuest Dissertations Publishing, 1990. 9117798.

ProQuest document link

 

 

 

 

ABSTRACT (ENGLISH)

Layered neural networks are used in the adaptive control of nonlinear discrete-time systems. The control algorithm is described and two convergence results are provided. The first result shows that the plant output converges to zero in the adaptive regulation system. The second result shows that the error between the plant output and the reference command converges to a bounded ball in the adaptive tracking system. Computer simulations at the end of the thesis verify these theoretical results.
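
The sketch below illustrates the general scheme the abstract describes, under assumptions of my own: a scalar plant y[k+1] = f(y[k]) + u[k] with an invented nonlinearity f, a small layered network trained online from the one-step prediction error, and a control input that cancels the estimated nonlinearity so the output is driven toward zero in regulation. This is not the dissertation's algorithm or its convergence proof.

```python
# Neural-network adaptive regulation sketch (illustrative, not the thesis's
# algorithm). Plant: y[k+1] = f(y[k]) + u[k], with f unknown to the controller.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(scale=0.5, size=(8, 1)), np.zeros((8, 1))
W2, b2 = rng.normal(scale=0.5, size=(1, 8)), np.zeros((1, 1))

def f_true(y):                        # "unknown" plant nonlinearity (made up)
    return 0.6 * np.sin(y) + 0.3 * y

def f_hat(y):                         # one-hidden-layer estimate of f
    h = np.tanh(W1 * y + b1)
    return (W2 @ h + b2).item(), h

y, lr = 1.5, 0.05
for k in range(300):
    est, h = f_hat(y)
    u = -est                          # control: cancel the estimated nonlinearity
    y_next = f_true(y) + u            # plant step; equals f_true(y) - f_hat(y)
    err = y_next - (est + u)          # one-step prediction error
    # Online gradient step on the squared prediction error.
    W2 += lr * err * h.T
    b2 += lr * err
    dh = (1 - h ** 2) * (W2.T * err)  # backprop through the tanh hidden layer
    W1 += lr * dh * y
    b1 += lr * dh
    y = y_next
print(round(float(y), 4))             # |y| should shrink as f_hat improves
```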

 


DETAILS

 

Subject:

Electrical engineering

 

Classification:

0544: Electrical engineering

 

Identifier / keyword:

Applied sciences

 

Number of pages:

113

 

Publication year:

1990

 

Degree date:

1990

 

School code:

0128

 

Source:

DAI-B 52/01, Dissertation Abstracts International

 

Place of publication:

Ann Arbor

 
