Neuron Digest   Sunday,  9 Sep 1990                Volume 6 : Issue 53 

Today's Topics:
Results of Second Order Survey
Book recently published
CMU Benchmark collection
voice discrimination
Mactivation 3.3 on new ftp site
Grossberg model image processing
connectionism conference
PSYCHOLOGICAL PROCESSES


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).

------------------------------------------------------------

Subject: Results of Second Order Survey
From: "Eric A. Wan" <wan@whirlwind.Stanford.EDU>
Date: Fri, 31 Aug 90 13:17:00 -0700

Here is a list of references I received for the survey on second-order
techniques for training neural networks. Most fall under the categories
of Newton, Quasi-Newton, and Conjugate Gradient methods. I have noted who
sent each reference for your information. I did not include everyone's
comments on the subject. If you want someone's comments, e-mail your
request to me and I will forward them to you. (If you are one of the
people who sent me comments and do not want them broadcast, just let me
know.)

My general comment on the subject is that while people have started
looking beyond gradient descent to more sophisticated methods, the
results are far from conclusive. These methods seem promising, but there
has not yet been adequate benchmarking or any comprehensive comparison of
the trade-offs between the different algorithms.
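
To make the distinction concrete, here is a minimal illustrative sketch
(written in present-day NumPy/SciPy purely for exposition; it is not code
from any of the papers listed below) that trains the same tiny network
first by plain gradient descent and then with quasi-Newton (BFGS) and
conjugate-gradient routines on the identical loss surface:

# Illustrative comparison of gradient descent vs. second-order-flavoured
# optimizers (BFGS, CG) on one small network.  Not from any cited paper.
import numpy as np
from scipy.optimize import minimize

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([0.0, 1.0, 1.0, 0.0])          # XOR targets

def unpack(w):
    W1 = w[:6].reshape(2, 3)   # hidden layer: 2 units x (2 inputs + bias)
    W2 = w[6:].reshape(1, 3)   # output layer: 1 unit x (2 hidden + bias)
    return W1, W2

def loss(w):
    W1, W2 = unpack(w)
    Xb = np.hstack([X, np.ones((4, 1))])
    h = 1.0 / (1.0 + np.exp(-Xb @ W1.T))
    hb = np.hstack([h, np.ones((4, 1))])
    y = (1.0 / (1.0 + np.exp(-hb @ W2.T))).ravel()
    return 0.5 * np.sum((y - t) ** 2)

def grad(w, eps=1e-6):
    # Central-difference gradient keeps the sketch short; a real
    # implementation would use back-propagation here.
    g = np.zeros_like(w)
    for i in range(len(w)):
        wp, wm = w.copy(), w.copy()
        wp[i] += eps
        wm[i] -= eps
        g[i] = (loss(wp) - loss(wm)) / (2.0 * eps)
    return g

w0 = np.random.default_rng(0).normal(scale=0.5, size=9)

# Baseline: plain (batch) gradient descent with a fixed step size.
w = w0.copy()
for epoch in range(5000):
    w -= 0.5 * grad(w)
print("gradient descent  loss:", loss(w))

# The same loss handed to quasi-Newton and conjugate-gradient routines.
for method in ("BFGS", "CG"):
    res = minimize(loss, w0, jac=grad, method=method)
    print(f"{method:>16}  loss:", res.fun, " function evals:", res.nfev)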

Thanks to all who replied. I hope this is of some use to people in the
field:


>From: shastri@central.cis.upenn.edu (Lokendra Shastri)

Watrous, R.L. Learning algorithms for connectionist networks: Applied
gradient descent methods for non-linear optimization.
In Proc. of ICNN-87.

-------- Phoneme discrimination using connectionist networks. Journal
of the Acoustical Society of America, 87(3).


-------- & Shastri, L. Learning Phonetic Features using Connectionist
Networks: An Experiment in Speech Recognition.
In Proc. of ICNN-87.


>From: LAI-WAN CHAN <LWCHAN%CUCSD.CUHK.HK@Forsythe.Stanford.EDU>

L.W. Chan & F. Fallside, "An adaptive training algorithm for back
propagation networks", Computer Speech and Language (1987) 2, p205-218.

R.A. Jacobs, "Increased rates of convergence through learning rate
adaptation", Neural Networks, Vol 1, 1988, p295-307.

Finally, a comparative study of the learning algorithms will be published
in the proceedings of the IEEE Region 10 Conference on Computer and
Communication Systems (TENCON'90), 1990:

L.W. Chan, "Efficacy of different learning algorithms of the back
propagation network".

>From: becker@cs.toronto.edu (Sue Becker)

Becker, S. and le Cun, Y. (1988). Improving The Convergence Of
Back-Propagation Learning With Second-Order Methods.
Proceedings of the 1988 Connectionist Models Summer School.


>From: bennett@cs.wisc.edu (Kristin Bennett)

Kristin P. Bennett and Olvi L. Mangasarian,
"Neural Network Training Via Linear Programming",
Computer Science Technical Report #948,
University of Wisconsin - Madison, 1990.

>From: Venkatesh Murthy <venk@blake.acs.washington.edu>

Broyden-Fletcher-Goldfarb-Shanno algorithm.

Raymond L.Watrous. 1987. Learning algorithms for connectionist
networks: Applied gradient methods of nonlinear optimization.
U. Penn Tech. Report: MS-CIS-87-51.

A very similar paper can be found in IEEE First Conf. on Neural
Nets, June 1987. II-619-627.

Another tech report for which I don't have the exact citation (but
have a copy of the report itself) is:

Raymond L.Watrous. 1987. Learning acoustic features from speech
data using connectionist networks. U. Penn Tech. Report:?

Finally, our short paper, which uses this algorithm to perform
pattern transformations in order to simulate some data obtained from
electrophysiological experiments in my advisor's lab, is:

Fetz, E.E., Shupe, L. and Murthy, V.N. 1990. Neural Networks
controlling wrist movements. IJCNN in San Diego, June 1990,
II-675-679.

>From: Ron Cole <cole@cse.ogi.edu>

My speech group has published about 10 papers using backprop with
conjugate gradient optimization, all relating to speech recognition.

(This is the one I know about)
%A E. Barnard
%A R. Cole
%T A neural-net training program based on
conjugate-gradient optimization
%I Oregon Graduate Center
%R CSE 89--014
%D 1988

We have also made the OPT simulator available and it is being used
in about 30 different laboratories.
(Available via anonymous ftp)

>From: OWENS%ESVAX%dupont.com@RELAY.CS.NET

A. J. Owens and D. L. Filkin, Efficient training of the Back
Propagation Network by solving a system of stiff ordinary
differential equations, IJCNN June 1989 (Washington), II, 381-386.

>From: chrisp@bilby.cs.uwa.OZ.AU

@techreport{webb_lowe_hybrid,
TITLE ="A Hybrid Optimization Strategy for Adaptive Feed-forward Layered Networks",
AUTHOR ="A. R. Webb and David Lowe",
INSTITUTION ="Royal Signals and Radar Establishment",
YEAR =1988,
TYPE ="RSRE Memo",
NUMBER ="4193",
ADDRESS ="Malvern, England"}

@techreport{webb_etal_comparison,
TITLE ="A Comparison of Non-linear Optimization Strategies for Adaptive Feed-forward Layered Networks",
AUTHOR ="A. R. Webb and David Lowe and M. D. Bedworth",
INSTITUTION ="Royal Signals and Radar Establishment",
YEAR =1988,
TYPE ="RSRE Memo",
NUMBER ="4157",
ADDRESS ="Malvern, England"}

They may be obtained by writing to

The Librarian,
Royal Signals and Radar Establishment
Malvern, Worcestershire, UK.

>From: @neural.att.com:yann@lamoon. (Yann Le Cun)

@phdthesis (lecun-87,
AUTHOR = "Y. {Le Cun}",
TITLE = "Mod\`{e}les Connexionnistes de l'Apprentissage",
SCHOOL = "Universit\'{e} Pierre et Marie Curie",
YEAR = "1987",
ADDRESS = "Paris, France"
)

@techreport {becker-lecun-88,
author = "Becker, S. and {Le Cun}, Y.",
title = "Improving the Convergence of Back-Propagation Learning with Second-Order Methods",
institution = "University of Toronto Connectionist Research Group",
year = "1988",
number = "CRG-TR-88-5",
month = "September"
}

@techreport ( lecun-89,
author = "{Le Cun}, Y." ,
title = "Generalization and Network Design Strategies",
institution = "University of Toronto Connectionist Research Group",
year = "1989",
number = "CRG-TR-89-4",
month = "June"
)

@inproceedings ( lecun-90,
author = "{Le Cun}, Y. and Boser, B. and Denker, J. S. and Henderson, D.
and Howard, R. E. and Hubbard, W. and Jackel, L. D."
,
title = "Handwritten Digit Recognition with a Back-Propagation Network",
booktitle= NIPS,
address = "(Denver, 1989)",
year = "1990",
editor = "Touretzky, David",
volume = 2,
publisher= "Morgan Kaufman"
)


Finally, a few others that I can add (wan@isl.stanford.edu):

S. Fahlman, "Faster-Learning Variations on Back-Propagation: An
Empirical Study", Proceedings of the 1988 Connectionist Models Summer
School, p38.
(quick-prop: quadratic fit through two points under the assumption
that all the weights are independent)
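
For concreteness, here is a rough sketch of the per-weight quick-prop
step (my paraphrase of the quadratic-fit idea described above, not
Fahlman's own code; his algorithm adds further refinements such as a
weight-decay term, so treat this purely as an illustration):

# Sketch of one quick-prop update.  slope and prev_slope are dE/dw at the
# current and previous weights (normally obtained by back-propagation);
# prev_step is the step that produced the current weights.
import numpy as np

def quickprop_step(w, slope, prev_slope, prev_step, lr=0.1, mu=1.75):
    denom = prev_slope - slope
    with np.errstate(divide="ignore", invalid="ignore"):
        # Quadratic fit through the two slopes, treating each weight as
        # independent of all the others.
        quad = np.where(denom != 0.0, slope / denom * prev_step, 0.0)
    # Growth-factor safeguard: no step may exceed mu times the previous one.
    quad = np.clip(quad, -np.abs(mu * prev_step), np.abs(mu * prev_step))
    # Weights that did not move last time fall back to plain gradient descent.
    step = np.where(prev_step != 0.0, quad, -lr * slope)
    return w + step, step

In use, the slopes would be recomputed by ordinary backprop each epoch and
the same update applied to every weight independently.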

R. Battiti, "Accelerated Backpropagation Learning: Two Optimization
Methods", Complex Systems 3 (1989) 331-342.
(CG-type methods)

D. Parker, "Optimal Algorithms for Adaptive Networks: Second Order
Back Propagation, Second Order Direct Propagation, and Second Order
Hebbian Learning", ??? (sorry)
(Second-order methods in continuous time)

P. Gawthrop, D. Sbarbaro, "Stochastic Approximation and Multilayer
Perceptrons: The Gain Backpropagation Algorithm", Complex Systems 4
(1990) 51-74.
(An RLS-type algorithm)

S. Kollias, D. Anastassiou, "Adaptive Training of Multilayer Neural
Networks Using a Least Squares Estimation Technique", ICNN 88, I-383.
(An RLS-type algorithm)

S. Makram-Ebeid, J. Sirat, J. Viala, "A Rationalized Error
Back-Propagation Learning Algorithm", IJCNN 89, Washington, II-373.
(CG-based algorithms)

Also, two good books that explain most of these methods from the
nonlinear programming viewpoint:

D. Luenberger, "Linear and Nonlinear Programming", Addison-Wesley 1984.

P. Gill, W. Murray, M. Wright, "Practical Optimization", New York:
Academic Press, 1981.

------------------------------

Subject: Book recently published
From: Eduardo Sontag <sontag@hilbert.rutgers.edu>
Date: Sun, 02 Sep 90 11:11:44 -0400

The following textbook in control and systems theory may be useful to
those working on neural nets, especially those interested in recurrent
nets and other dynamic behavior. The level is beginning-graduate; it is
written in a careful mathematical style, but its contents should be
accessible to anyone with a good undergraduate-level math background
including calculus, linear algebra, and differential equations:

Eduardo D. Sontag,
__Mathematical Control Theory: Deterministic Finite Dimensional Systems__
Springer, New York, 1990. (396+xiii pages)

Some highlights:

** Introductory chapter giving an intuitive description of modern control theory

** Automata and linear systems covered in a *unified* fashion

** Dynamic programming, including variants such as forward programming

** Passing from dynamic i/o data to internal recurrent state representations

** Stability, including Lyapunov functions

** Tracking of time-varying signals

** Kalman filtering as deterministic optimal observation

** Linear optimal control, including Riccati equations

** Determining internal states from input/output experiments

** Classification of internal state representations under equivalence

** Frequency domain considerations: Nyquist criterion, transfer functions

** Feedback, as a general concept, and linear feedback; pole-shifting

** Volterra series

** Appendix: differential equation theorems

** Appendix: singular values and related matters


** Detailed bibliography (400 up-to-date entries)
** Large computer-generated index

Some data:

Springer-Verlag, ISBN: 0-387-97366-4; 3-540-97366-4
Series: Textbooks in Applied Mathematics, Number 6. Hardcover, $39.00
[Can be ordered in the USA from 1-800-SPRINGER (in NJ, 201-348-4033)]


------------------------------

Subject: CMU Benchmark collection
From: Scott.Fahlman@SEF1.SLISP.CS.CMU.EDU
Date: Mon, 03 Sep 90 22:15:32 -0400

[[ Editor's Note: My thanks to Scott for this complete and thoughtful
reply. The subject of benchmarking arises periodically, hence my last
reference to Scott's valiant efforts. As always, I assume readers will
scan this message carefully and follow the directions. If anyone
volunteers to be an additional repository for the files, especially if
they are willing to make up tapes or diskettes and/or provide UUCP
access, please contact me or Scott. -PM ]]

Since the topic of the CMU benchmark collection was raised here, let me
post this information as a way of avoiding dozens of individual questions
and answers.

The benchmark collection is available via internet FTP -- directions for
how to access the collection are included below. I have hesitated to
advertise it to this newsgroup because so many people out on usenet have
no FTP access. As a rule, I don't have time to E-mail these files to
individuals (some are quite large and would have to be chopped up), and
we certainly are not in a position to send out copies on mag tape or
floppy disk. However, any of you who are able to access this material
via FTP are welcome to do so.

I set up the collection a couple of years ago as part of my own empirical
research on neural network learning algorithms. An important question is
how to measure one algorithm against another, even when they deal with
problem domains of similar size and shape. The typical paper in this
field describes some new algorithm and then presents an experiment or two
comparing the new algorithm vs. vanilla backprop. Unfortunately, no two
researchers seem to run the same problem in the same way. Not
surprisingly, everyone beats backprop by at least an order of magnitude,
and usually more. Of course, backprop is very sensitive to the choice of
training parameters, so with a new problem there is always the question
of whether backprop was given a fair chance. The more interesting
question of how a new algorithm stacks up against other post-backprop
algorithms is seldom addressed at all.

So my goal has been to collect a variety of benchmark problems, including
some small, artificial ones (e.g. parity) and some larger, more realistic
ones (e.g. nettalk). For each of these, the collection contains a
problem description, data sets for testing and training (or an algorithm
for generating the same), and a summary of results that people have
obtained on the problem in question using various algorithms. These
results make it possible for people with new algorithms to compare them
against the best results reported to date, and not just against vanilla
backprop. This material is provided solely for the benefit of
researchers; we have no desire to become the "Guinness Book of World
Records" for neural networks. Since my goal is to compare learning
algorithms, not machines, these results are expressed in epochs or
floating-point operations rather than "seconds on a Cray Y/MP" or
whatever.

There is a mailing list for frequent users of this collection and others
interested in benchmarking issues. It is named "nn-bench@cs.cmu.edu"
(Internet address). Mail to this address goes to a couple of hundred
places worldwide, so "add me" requests and other administrative messages
should not go there. Instead they should go to
"nn-bench-request@cs.cmu.edu".

Unfortunately, not too many people have contributed problems to this
collection, and I have been too busy with other things to spend a lot of
time promoting this effort and combing the literature for good problems.
Consequently, the collection and the mailing list have been dormant of
late. I am enclosing a list of files currently in the collection. I
have a couple of other data sets that need some work to get them into
usable form. I hope to find some time for this in the near future, but
that is hard to predict. If someone with lots of time and energy, lots
of online file storage, and good connections to the Internet wants to
take over this effort and run with it, please contact me and we can
discuss this.

Scott Fahlman
School of Computer Science
Carnegie Mellon University
Pittsburgh, PA 15213
Internet: fahlman@cs.cmu.edu
.......................................................................
Current contents of Neural Net Benchmark directory.
First number is file size in bytes.

8094 Mar 15 1989 FORMAT
11778 Aug 13 1989 GLOSSARY
13704 Dec 5 1989 nettalk
541269 Jul 15 17:55 nettalk.data
7382 Oct 16 1989 nettalk.list
5570 Apr 16 1989 parity
1911 Oct 16 1989 protein
14586 Aug 22 1989 protein.test
73489 Aug 22 1989 protein.train
5872 Dec 23 1989 sonar
49217 Dec 23 1989 sonar.mines
43052 Dec 23 1989 sonar.rocks
7103 Feb 27 22:20 two-spirals
16245 Mar 4 23:01 vowel
61542 Apr 23 1989 vowel.data
6197 Apr 15 1989 xor
.......................................................................
FTP access instructions:

For people (at CMU, MIT, and soon some other places) with access to the
Andrew File System (AFS), you can access the files directly from
directory "/afs/cs.cmu.edu/project/connect/bench". This file system uses
the same syntactic conventions as BSD Unix: case sensitive names, slashes
for subdirectories, no version numbers, etc. The protection scheme is a
bit different, but that shouldn't matter to people just trying to read
these files.

For people accessing these files via FTP:

1. Create an FTP connection from wherever you are to machine
"pt.cs.cmu.edu". The internet address of this machine is
128.2.254.155, for those who need it.

2. Log in as user "anonymous" with no password. You may see an error
message that says "filenames may not have /.. in them" or something
like that. Just ignore it.

3. Change remote directory to "/afs/cs/project/connect/bench". Any
subdirectories of this one should also be accessible. Parent
directories should not be.

4. At this point FTP should be able to get a listing of files in this
directory and fetch the ones you want (a sample session is sketched
below, after step 5).

5. The directory "/afs/cs/project/connect/code" contains public-domain
programs implementing the Quickprop and Cascade-Correlation
algorithms, among other things. Access it in the same way.
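
Putting steps 1 through 4 together, a typical session looks roughly like
this (the exact prompts and output vary from one FTP client to another;
the file names are taken from the directory listing above):

ftp pt.cs.cmu.edu
Name: anonymous            (no password; just press return if asked)
ftp> cd /afs/cs/project/connect/bench
ftp> ls
ftp> get nettalk
ftp> get nettalk.data
ftp> quit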

I've tested this access method, but it's still possible that some of our
ever vigilant protection demons will interfere with access from out in
net-land. If you try to access this directory by FTP and have trouble,
please contact me.

The exact FTP commands you use to change directories, list files, etc.,
will vary from one version of FTP to another.
...........................................................................

------------------------------

Subject: voice discrimination
From: fritz_dg%ncsd.dnet@gte.com
Date: Wed, 05 Sep 90 09:42:34 -0400


I'm looking for references on neural network research on
voice discrimination, that is, telling one language and/or speaker apart
from another without necessarily understanding the words. Any leads at
all will be appreciated. I will summarize any responses and return the
summary to the list. Thanks.

Dave Fritz
fritz_dg%ncsd@gte.com

------------------------------

Subject: Mactivation 3.3 on new ftp site
From: Mike Kranzdorf <mikek@boulder.Colorado.EDU>
Date: Wed, 05 Sep 90 10:14:27 -0600

Mactivation version 3.3 is available via anonymous ftp on
alumni.Colorado.EDU (internet address 128.138.240.32). The file is in /pub
and is called mactivation.3.3.sit.hqx (it is StuffIt-compressed and
BinHexed). To get it, try this:

ftp alumni.Colorado.EDU
anonymous
<any password will do>
binary
cd /pub
get mactivation.3.3.sit.hqx

Then get it to your Mac and use StuffIt to uncompress it and BinHex 4.0
to make it back into an application.

If you can't make ftp work, or you want a copy with the nice MS Word
docs, then send $5 to:

Mike Kranzdorf
P.O. Box 1379
Nederland, CO 80466-1379
USA

For those who don't know about Mactivation, here's the summary:

Mactivation is an introductory neural network simulator which runs on all
Apple Macintosh computers. A graphical interface provides direct access
to units, connections, and patterns. Basic concepts of network operations
can be explored, with many low-level parameters available for
modification. Back-propagation is not supported (coming in 4.0). A user's
manual containing an introduction to connectionist networks and program
documentation is included. The ftp version includes a plain text file,
while the MS Word version available from the author contains nice
graphics and footnotes. The program may be freely copied, including for
classroom distribution.

--mikek

internet: mikek@boulder.colorado.edu
uucp:{ncar|nbires}!boulder!mikek
AppleLink: oblio


------------------------------

Subject: Grossberg model image processing
From: slehar@thalamus.bu.edu
Date: Wed, 05 Sep 90 15:01:20 -0400


daft@debussy.crd.ge.com (Chris Daft) wrote in the last Neuron Digest:

-----------------------------------------------------------------------
| Some time ago I posted a request for references on neural networks and
| image processing/image understanding ... here are the results of that
| and a literature search.
| Conspicuously absent from my list is any mention of Grossberg's work...
-----------------------------------------------------------------------

I am sorry I missed your original request. As it happens, for the
last several years I have been implementing, modifying and extending
Grossberg's Boundary Contour System / Feature Contour System (BCS/FCS)
with particular emphasis on image processing applications. You can
read about my work in the following:

Lehar S. & Worth A. APPLICATION OF BOUNDARY CONTOUR/FEATURE CONTOUR
SYSTEM TO MAGNETIC RESONANCE BRAIN SCAN IMAGERY. Proceedings of IJCNN,
June 1990, San Diego.

Lehar S., Howells T., & Smotroff I. APPLICATION OF GROSSBERG AND
MINGOLLA NEURAL VISION MODEL TO SATELLITE WEATHER IMAGERY.
Proceedings of the INNC, July 1990, Paris.

I will also be presenting an extension to the BCS at the SPIE
conference in Florida in April 1991.

The BCS/FCS is a very interesting model, mostly because it does not
just try to perform image processing with neural techniques, but
actually attempts to duplicate the exact neural architecture used by
the brain. The model is based not only on neurophysiological
findings; much of it is directly based on visual illusions,
things that the human eye sees that aren't really there! The idea is
that if we can model the illusions as well as the vision, then we will
have a mechanism that not only does the same job as the eye, but does
it the same way the eye does.

Imagine you were given a primitive pocket calculator and asked to
figure out how it works without taking it apart. Giving it
calculations like 1+1= will not make you any the wiser. When you ask
it to compute (1/3)*3=, however, you will learn not only how it works
but also how it fails: a typical calculator returns 0.9999999 rather
than 1, revealing its finite-precision arithmetic. The BCS/FCS is the
only model that can explain a wide range of psychophysical phenomena
such as neon color spreading, pre-attentive perceptual grouping, Mach
bands, brightness and color illusions, illusory boundaries, and
illusory motions of various sorts.

Application of this strategy to human vision has resulted in a neural
model that is both complicated and bizarre. Studying that model
reveals an elegant and improbable mechanism with very interesting
properties.

My own work has focused on applying Grossberg's algorithm to natural
imagery in order to explore its potential for image enhancement and
image understanding, and the results have been very encouraging. If
anyone is interested in further information please send me email and I
will be happy to provide it. (Hurry up, because our e-mail address is
about to be changed!)

(O)((O))(((O)))((((O))))(((((O)))))(((((O)))))((((O))))(((O)))((O))(O)
(O)((O))((( slehar@bucasb.bu.edu )))((O))(O)
(O)((O))((( Steve Lehar Boston University Boston MA )))((O))(O)
(O)((O))((( (617) 424-7035 (H) (617) 353-6741 (W) )))((O))(O)
(O)((O))(((O)))((((O))))(((((O)))))(((((O)))))((((O))))(((O)))((O))(O)

------------------------------

Subject: connectionism conference
From: ai-vie!georg@relay.EU.net (Georg Dorffner)
Date: Sat, 01 Sep 90 13:18:47 -0100





Sixth Austrian Artificial Intelligence Conference

---------------------------------------------------------------
Connectionism in Artificial Intelligence
and Cognitive Science
---------------------------------------------------------------

organized by the Austrian Society for
Artificial Intelligence (OGAI)
in cooperation with the Gesellschaft fuer Informatik
(GI, German Society for Computer Science),
Section for Connectionism

Sep 18 - 21, 1990
Hotel Schaffenrath
Salzburg, Austria

Conference chair: Georg Dorffner (Univ. of Vienna, Austria)

Program committee: J. Diederich (GMD St. Augustin, Germany)
C. Freksa (Techn. Univ. Munich, Germany)
Ch. Lischka (GMD St.Augustin, Germany)
A. Kobsa (Univ. of Saarland, Germany)
M. Koehle (Techn. Univ. Vienna, Austria)
B. Neumann (Univ. Hamburg, Germany)
H. Schnelle (Univ. Bochum, Germany)
Z. Schreter (Univ. Zurich, Switzerland)

invited lectures:
Paul Churchland (UCSD)
Gary Cottrell (UCSD)
Noel Sharkey (Univ. of Exeter)

Workshops:
Massive Parallelism and Cognition
Localist Network Models
Connectionism and Language Processing

Panel: Explanation and Transparency of Connectionist Systems

IMPORTANT! The conference languages are German and English.
Below, the letter 'E' indicates that a talk or workshop will be
held in English.
=====================================================================

Scientific Program (Wed, Sep 19 to Fri, Sep 21):

Wednesday, Sep 19, 1990:

U. Schade (Univ. Bielefeld)
Coherence and Monitor in Connectionist Language Production Models

C. Kunze (Ruhr-Univ. Bochum)
A Syllable-Based Net-Linguistic Approach to Lexical Access

R. Wilkens, H. Schnelle (Ruhr-Univ. Bochum)
A Connectionist Parser for Context-Free Phrase Structure Grammars

S.C.Kwasny (Washington Univ. St.Louis),
K.A.Faisal (King Fahd Univ. Dhahran)
Overcoming Limitations of Rule-based Systems: An Example of a
Hybrid Deterministic Parser (E)

N. Sharkey (Univ. of Exeter), invited talk
Connectionist Representation for Natural Language: Old and New (E)


Workshop: Connectionism and Language Processing (chair: H.
Schnelle) (E)

T. van Gelder (Indiana University)
Connectionism and Language Processing

H. Schnelle (Ruhr-Univ. Bochum)
Connectionism for Cognitive Linguistics

G. Dorffner (Univ. Wien, Oest. Forschungsinst. f. AI)
A Radical View on Connectionist Language Modeling

R. Deffner, K. Eder, H. Geiger (Kratzer Automatisierung
Muenchen) Word Recognition as a First Step Towards Natural
Language Processing with Artificial Neural Networks

N. Sharkey (Univ. of Exeter)
Implementing Soft Preferences for Structural Disambiguation


Paul Churchland, UCSD (invited talk)
Some Further Thoughts on Learning and Conceptual Change (E)

-----------------------------------------------------------

Thursday, Sep 20,1990:

G. Cottrell, UCSD (invited talk)
Will Connectionism replace symbolic AI? (E)

T. van Gelder (Indiana Univ.)
Why Distributed Representation is Inherently Non-Symbolic (E)

M. Kurthen, D.B. Linke, P. Hamilton (Univ. Bonn)
Connectionist Cognition

M. Mohnhaupt (Univ. Hamburg)
On the Importance of Pictorial Representations for the
Symbolic/Subsymbolic Distinction

M. Rotter, G. Dorffner (Univ. Wien, Oest. Forschungsinst. f. AI)
Structure and Concept Relations in Distributed Networks

C. Mannes (Oest. Forschungsinst. f. AI)
Learning Sensory-Motor Coordination by Experimentation and
Reinforcement Learning

A. Standfuss, K. Moeller, J. Funke (Univ. Bonn)
Knowledge Acquisition about Dynamic Systems: Findings from
Connectionist Modelling


Workshop: Massive Parallelism and Cognition (chair: C. Lischka) (E)

C. Lischka (GMD St. Augustin)
Massive Parallelism and Cognition: An Introduction

T. Goschke (Univ. Osnabrueck) Representation of Implicit
Knowledge in Massively Parallel Architectures

G. Helm (Univ. Muenchen)
Pictorial Representations in Connectionist Systems

M. Kurthen (Univ. Bonn)
Connectionist Cognition: A Summary

S. Thrun, K. Moeller (Univ. Bonn), A. Linden (GMD St. Augustin)
Adaptive Look-Ahead Planning


Panel:
Explanation and Transparency of Connectionist Systems (E)

speakers: J. Diederich, C. Lischka (GMD),
G. Goerz (Univ. Hamburg), P. Churchland (UCSD)

---------------------------------------------------------------------

Friday, Sep 21, 1990:


Workshop: Localist Network Models (chair: J. Diederich) (E)

S. Hoelldobler (ICSI Berkeley)
On High-Level Inferencing and the Variable Binding Problem
in Connectionist Networks

J. Diederich (GMD St.Augustin, UC Davis)
Recruitment vs. Backpropagation Learning:
An Empirical Study on Re-Learning in Connectionist Networks

W.M. Rayburn, J. Diederich (UC Davis)
Some Remarks on Emotion, Cognition, and Connectionist Systems

G. Paass (GMD St. Augustin)
A Stochastic EM Learning Algorithm for Structured Probabilistic
Neural Networks

T. Waschulzik, H. Geiger (Kratzer Automatisierung Muenchen)
A Development Methodology for Structured Connectionist Systems

G. Cottrell (UCSD)
Why Localist Connectionism is a Mistake


A.N. Refenes (Univ. College London)
ConSTrainer: A Generic Toolkit for Connectionist Dataset Selection (E)

J.L. van Hemmen, W. Gerstner(TU Muenchen), A. Herz, R. Kuehn,
B. Sulzer, M. Vaas (Univ. Heidelberg)
Encoding and Decoding of Patterns which are Correlated in Space and
Time

R. Salomon (TU Berlin)
Accelerated Learning through Adaptive Control of the Learning Rate in
Back-Propagation in Feed-Forward Networks

T. Waschulzik, H. Geiger (Kratzer Automatisierung Muenchen)
Theory and Application of Structured Connectionist Systems

H. Bischof, A. Pinz (Univ.f.Bodenkultur Wien)
Using Neural Networks for the Classification of Natural Objects,
Illustrated by Tree Recognition from Colour-Infrared Aerial Images.

H.G. Ziegeler, K.W. Kratky (Univ. Wien)
A Connectionist Realization Applying Knowledge-Compilation and
Auto-Segmentation in a Symbolic Assignment Problem

A. Lebeda, M. Koehle (TU Wien)
Letter Recognition Taking Contextual Information into Account

=================================================================
Registration:

Please send the following form to:

Georg Dorffner
Inst.f. Med. Kybernetik und Artificial Intelligence
Universitaet Wien
Freyung 6/2
A-1010 Vienna, Austria

For further questions write to the same address or contact directly
Georg Dorffner (Tel: +43 1 535 32 810, Fax: +43 1 63 06 52,
email: georg@ai-vie.uucp)

-----------------------------------------------------------------

Connectionism in AI and Cognitive Science
(KONNAI)
Registration Application Form:

I herewith apply for registration at the 6th Austrian AI conference

Name: _______________________________________________________________

Address: ____________________________________________________________

____________________________________________________________

_____________________________________________________________

Telephone: __________________________________ email: ________________

I will participate in the following events:

o Plenary lectures, scient. program, Panel AS 1.950,--
(DM 280,--)

reduced price for OGAI members AS 1.800,--
(DM 260,--)

reduced price for students (with ID!) AS 1.000,--
(DM 150,--)
---------------

Amount: _______________

o Workshops (price is included in conference fee)

o Massive Parallelism and Cognition
o Localist Network Models
o Connectionism and Language Processing


o I want to demonstrate a program and need the following
hardware and software:
__________________________________________________


o I will transfer the money to the checking account of the OGAI at the
Erste Oesterreichische Spar-Casse-Bank, No. 004-71186

o I am sending a eurocheque

o I need an invoice


signature: ____________________________________


======================================================================

Accommodation:
The conference will be held at
Hotel Schaffenrath, Alpenstrasse 115, A-5020 Salzburg.

No rooms are available any more at that hotel. You can, however, send
the form below to the Hotel Schaffenrath, who will forward the
reservation to another nearby hotel.

=====================================================================

Connectionism in AI and Cognitive Science
(KONNAI)
Hotel reservation

I want a room from __________________ to _______________________
(day of arrival) (day of departure)

o single      AS 640,-- incl. breakfast

o double AS 990,-- incl. breakfast

o three beds AS 1200,-- incl. breakfast


Name: ________________________________________________________________

Address: _____________________________________________________________

_____________________________________________________________

_____________________________________________________________

Telephone: __________________________________


------------------------------

Subject: PSYCHOLOGICAL PROCESSES
From: Noel Sharkey <N.E.Sharkey@cs.exeter.ac.uk>
Date: Tue, 04 Sep 90 14:28:22 +0100



I have been getting a lot of enquiries about the special issue of
Connection Science on psychological processes (I posted the announcement
months ago and of course people have lost it). So here it is again, folks.

noel



******************** CALL FOR PAPERS ******************


CONNECTION SCIENCE SPECIAL ISSUE


CONNECTIONIST MODELLING OF PSYCHOLOGICAL PROCESSES

EDITOR
Noel Sharkey

SPECIAL BOARD
Jim Anderson
Andy Barto
Thomas Bever
Glyn Humphreys
Walter Kintsch
Dennis Norris
Kim Plunkett
Ronan Reilly
Dave Rumelhart
Antony Sanford


The journal Connection Science would like to encourage submissions from
researchers modelling psychological data or conducting experiments
comparing models within the connectionist framework. Papers of this
nature may be submitted to our regular issues or to the special issue.

Authors wishing to submit papers to the special issue should mark them
SPECIAL PSYCHOLOGY ISSUE. Good quality papers not accepted for the
special issue may appear in later regular issues.


DEADLINE FOR SUBMISSION 12th October, 1990.


Notification of acceptance or rejection will be by the end of
December/beginning of January.

------------------------------

End of Neuron Digest [Volume 6 Issue 53]
****************************************
