Neuron Digest	Friday, 21 Jul 1989		Volume 5 : Issue 31 

Today's Topics:
"Transformations" tech report
Abstract for CNLS Conference
Computational Neuroscience Symposium
EURASIP Workshop on Neural Networks
Call for Papers: INNS/IEEE Conference on Neural Networks, Jan. 1990
journal reviewers
Neural Computation, Vol. 1, No. 2
Tech Report available
Preprint available
Seminar notice
6 month post-doc job
REPORTS ON SPARSE DISTRIBUTED MEMORY


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
ARPANET users can get old issues via ftp from hplpm.hpl.hp.com (15.255.16.205).

------------------------------------------------------------

Subject: "Transformations" tech report
From: Eric Mjolsness <mjolsness-eric@YALE.ARPA>
Date: Tue, 07 Mar 89 21:23:16 -0500

A new technical report is available:

"Algebraic Transformations of Objective Functions"

(YALEU/DCS/RR-686)

by Eric Mjolsness and Charles Garrett
Yale Department of Computer Science
P.O. Box 2158 Yale Station
New Haven CT 06520

Abstract: A standard neural network design trick reduces the number of
connections in the winner-take-all (WTA) network from O(N^2) to O(N). We
explain the trick as a general fixpoint-preserving transformation applied to
the particular objective function associated with the WTA network. The key
idea is to introduce new interneurons which act to maximize the objective,
so that the network seeks a saddle point rather than a minimum. A number of
fixpoint-preserving transformations are derived, allowing the simplification
of such algebraic forms as products of expressions, functions of one or two
expressions, and sparse matrix products. The transformations may be applied
to reduce or simplify the implementation of a great many structured neural
networks, as we demonstrate for inexact graph-matching, convolutions and
coordinate transformations, and sorting. Simulations show that
fixpoint-preserving transformations may be applied repeatedly and
elaborately, and the example networks still robustly converge. We discuss
implications for circuit design.
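
For illustration, here is a minimal sketch of the kind of transformation the
abstract describes; the quadratic penalty and the constant c below are
assumptions made for the sketch, not details taken from the report.

    % WTA objective with a quadratic constraint penalty; expanding the
    % square couples every pair (v_i, v_j), giving O(N^2) connections:
    E(v) = \cdots + \frac{c}{2}\Bigl(\sum_i v_i - 1\Bigr)^2
    % The identity  X^2 = \max_{\sigma}\,(2\sigma X - \sigma^2)  replaces the
    % squared sum by a term involving one new interneuron \sigma:
    E(v,\sigma) = \cdots + c\,\sigma\Bigl(\sum_i v_i - 1\Bigr) - \frac{c}{2}\,\sigma^2
    % The v_i descend on E while \sigma ascends, so the network settles at a
    % saddle point; each v_i now couples only to \sigma: O(N) connections.

At the saddle point \sigma = \sum_i v_i - 1, the transformed objective agrees
with the original, which is why fixpoints are preserved.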

To request a copy, please send your physical address by e-mail to
mjolsness-eric@cs.yale.edu
OR mjolsness-eric@yale.arpa (old style)
Thank you.


------------------------------

Subject: Abstract for CNLS Conference
From: Stevan Harnad <harnad@Princeton.EDU>
Date: Mon, 13 Mar 89 13:57:26 -0500

Here is the abstract for my contribution to the session on the "Emergence of
Symbolic Structures" at the 9th Annual International Conference on Emergent
Computation, CNLS, Los Alamos National Laboratory, May 22-26, 1989.

Grounding Symbols in a Nonsymbolic Substrate

Stevan Harnad
Behavioral and Brain Sciences
Princeton NJ

There has been much discussion recently about the scope and limits of purely
symbolic models of the mind and of the proper role of connectionism in
mental modeling. In this paper the "symbol grounding problem" -- the problem
of how the meanings of meaningless symbols, manipulated only on the basis of
their shapes, can be grounded in anything but more meaningless symbols in a
purely symbolic system -- is described, and then a potential solution is
sketched: Symbolic representations must be grounded bottom-up in nonsymbolic
representations of two kinds: (1) iconic representations are analogs of the
sensory projections of objects and events and (2) categorical
representations are learned or innate feature-detectors that pick out the
invariant features of object and event categories. Elementary symbols are
the names of object and event categories, picked out by their (nonsymbolic)
categorical representations. Higher-order symbols are then grounded in these
elementary symbols. Connectionism is a natural candidate for the mechanism
that learns the invariant features. In this way connectionism can be seen
as a complementary component in a hybrid nonsymbolic/symbolic model of the
mind, rather than a rival to purely symbolic modeling. Such a hybrid model
would not have an autonomous symbolic module, however; the symbolic
functions would emerge as an intrinsically "dedicated" symbol system as a
consequence of the bottom-up grounding of categories and their names.

------------------------------

Subject: Computational Neuroscience Symposium
From: jfbrule@cmx.npac.syr.edu (Jim Brule)
Organization: Northeast Parallel Architectures Center, Syracuse NY
Date: Wed, 31 May 89 20:17:28 +0000


Preliminary Announcement:

Computational Neuroscience and Parallel Computing

October 23-24, 1989
Sheraton University Inn and Conference Center
Syracuse, NY

sponsored by:
Northeast Parallel Architectures Center (NPAC)
Syracuse University
Syracuse, NY

Symposium Chair:
Erich Harth, Syracuse University

Program Committee:
Michael Arbib, USC
James Brule', NPAC
Erich Harth, SU
J. Alan Robinson, SU
Charles Stormon, Coherent Research Inc.
Thomas Schwartz, TJ Schwartz Associates


Great strides are being made in the fields of neuroscience and parallel
computing, due in part to technological advances in each field that allow
scientists to pursue their work more effectively.

The rapid progress in each field has led to an overlap between them. Work
that takes place in this overlap is beginning to gain stature as a field in
its own right. This fledgling discipline has come to be known as
"Computational Neuroscience." It has found itself at the center of much
attention and controversy. As such, Computational Neuroscience has generated
both enthusiasm and caution among researchers.

The goal of this Symposium is to explore this overlap with the intent of
discovering the richest opportunities for research there. Invited
neuroscientists and computer scientists will speak, and lead panel
discussions and roundtable exchanges. A total of seven invited lectures and
two panels will be presented. The following topics are a partial
representation of the final program:

Connectionism and Massively Parallel Systems
Neural Networks
Computational Neuroscience
Dynamic Link Architectures

Application Areas (panel)
Implementation Issues (panel)


In an effort to promote meaningful exchange, attendance will
be limited to 125.

Fees: $385 until August 31, 1989; $450 thereafter.
5% discount for members in good standing of IEEE or INNS

For more information contact:

James F. Brule', Ass't Dir. for Research Programs
Northeast Parallel Architectures Center (NPAC)
Center for Science and Technology
111 College Place
Syracuse University
Syracuse, NY 13244
(315) 443-3924


"Thirty thousand mighty suns shone down in a soul-searing splendor that was
more frighteningly cold in its awful indifference than the bitter wind that
shivered across the cold, horribly bleak world."
        - Isaac Asimov, "Nightfall"          jfbrule@nova.npac.syr.edu

------------------------------

Subject: EURASIP Workshop on Neural Networks
From: Connectionists-Request@cs.cmu.edu
Date: Thu, 01 Jun 89 10:13:45 -0400


CALL FOR PAPERS
EURASIP WORKSHOP ON NEURAL NETWORKS


Sesimbra, Portugal
February 15-17, 1990

The workshop will be held at the Hotel do Mar in Sesimbra, Portugal. It
will take place in 1990, from February 15 morning to 17 noon, and will be
sponsored by EURASIP, the European Association for Signal Processing. It
will be open to participants from all countries, both from inside and
outside of Europe.

Contributions from all fields related to the neural network area are
welcome. A (non-exclusive) list of topics is given below. Care is being
taken to ensure that the workshop will have a high level of quality.
Proposed contributions will be evaluated by an international technical
committee. A proceedings volume will be published, and will be handed to
participants at the beginning of the workshop. The number of participants
will be limited to 50. Full contributions will take the form of oral
presentations, and will correspond to papers in the proceedings. Some short
contributions will also be accepted, for presentation of ongoing work,
projects (ESPRIT, BRAIN, DARPA,...), etc. They will be presented in poster
format, and will not appear in the proceedings. A small number of
non-contributing participants may also be accepted. The official language of
the workshop will be English.


TOPICS:

- signal processing (speech, image, ...)
- pattern recognition
- algorithms (training procedures, new structures, speedups, ...)
- generalization
- implementation
- specific applications where NNs have proved better than other approaches
- industrial projects and realizations

SUBMISSION PROCEDURES:

Submissions, both for long and for short contributions, will consist of
(strictly) 2-page summaries. Three copies should be sent directly to the
Technical Chairman, at the address given below. The calendar for
contributions is as follows:

                         Full contributions   Short contributions
Deadline for submission  June 15, 1989        Oct 1, 1989
Notif. of acceptance     Sept 1, 1989         Nov 15, 1989
Camera-ready paper       Nov 1, 1989


ORGANIZING COMMITTEE

General Chairman: Luis B. Almeida, INESC, Apartado 10105,
P-1017 Lisboa, Codex, Portugal
Phone: +351-1-544607;
Fax: +351-1-525843;
E-mail: {any backbone, uunet}!mcvax!inesc!lba

Technical Chairman: Christian J. Wellekens,
Philips Research Laboratory Brussels,
Av. Van Becelaere 2, Box 8, B-1170 Brussels, Belgium
Phone: +32-2-6742275;
Fax: +32-2-6742299;
E-mail: wlk@prlb2.uucp

Technical committee:

John Bridle (Royal Signal and Radar Establishment, Malvern, UK),
Herve Bourlard (Intern. Computer Science Institute, Berkeley, USA),
Frank Fallside (University of Cambridge, Cambridge, UK),
Francoise Fogelman (Ecole des Hautes Etudes en Informatique, Paris, France),
Jeanny Herault (Institut Nat. Polytech. de Grenoble, Grenoble, France),
Larry Jackel (AT&T Bell Labs, Holmdel, NJ, USA),
Renato de Mori (McGill University, Montreal, Canada),
H. Muehlenbein (GMD, Sankt Augustin, FRG).

REGISTRATION, FINANCE, LOCAL ARRANGEMENTS:

Joao Bilhim, INESC, Apartado 10105, P-1017 Lisboa, Codex, Portugal
Phone: +351-1-545150; Fax: +351-1-525843.

WORKSHOP SPONSOR

EURASIP - European Association for Signal Processing

CO-SPONSORS:

INESC - Instituto de Engenharia de Sistemas e Computadores, Lisbon,
Portugal
IEEE, Portugal Section

THE LOCATION:

Sesimbra is a fishermen's village in a pleasant region about 30 km south of
Lisbon. Special transportation to and from Lisbon will be arranged. The
workshop will end on a Saturday at lunch time, so participants will have the
option of either flying back home in the afternoon or staying in Sesimbra
and/or Lisbon for sightseeing over the remainder of the weekend.
An optional program for accompanying persons is being organized.



------------------------------

Subject: Call for Papers: INNS/IEEE Conference on Neural Networks, Jan. 1990
From: lehr@isl.Stanford.EDU (Michael Lehr)
Organization: Stanford University EE Dept.
Date: Tue, 13 Jun 89 05:36:53 +0000



CALL FOR PAPERS

International Joint Conference on Neural Networks
IJCNN-90-WASH DC

January 15-19, 1990,
Washington, DC


The Winter 1990 session of the International Joint Conference on Neural
Networks (IJCNN-90-WASH DC) will be held on January 15-19, 1990 at the Omni
Shoreham Hotel in Washington, DC, USA. The International Neural Network
Society (INNS) and the Institute of Electrical and Electronics Engineers
(IEEE) invite all those interested in the field of neural networks to submit
papers for possible publication at this meeting. Brief papers of no more
than 4 pages may be submitted for consideration for oral or poster
presentation in any of the following sessions:

APPLICATIONS TRACK:

* Expert System Applications
* Robotics and Machine Vision
* Signal Processing Applications (including speech)
* Neural Network Implementations: VLSI and Optical
* Applications Systems (including Neurocomputers & Network
Definition Languages)

NEUROBIOLOGY TRACK:

* Cognitive and Neural Sciences
* Biological Neurons and Networks
* Sensorimotor Transformations
* Speech, Audition, Vestibular Functions
* Systems Neuroscience
* Neurobiology of Vision

THEORY TRACK:

* Analysis of Network Dynamics
* Brain Theory
* Computational Vision
* Learning: Backpropagation
* Learning: Non-backpropagation
* Pattern Recognition


**Papers must be postmarked by August 1, 1989 and received by August 10,
1989 to be considered for presentation. Submissions received after August
10, 1989 will be returned unopened.**

International authors should be particularly careful to submit their work
via Air Mail or Express Mail to ensure timely arrival. Papers will be
reviewed by senior researchers in the field, and author notifications of the
review decisions will be mailed approximately October 15, 1989. A limited
number of papers will be accepted for oral and poster presentation. All
accepted papers will be published in full in the meeting proceedings, which
is expected to be available at the conference. Authors must submit five (5)
copies of the paper: one in camera-ready format (specified below) and four
review copies. Do not fold your paper for mailing. Submit papers to:

IJCNN-90-WASH DC
Adaptics
16776 Bernardo Center Drive, Suite 110 B
San Diego, CA 92128 UNITED STATES

(619) 451-3752


SUBMISSION FORMAT:

Papers should be written in English and submitted on 8-1/2 x 11 inch or
International A4 size paper. The print area on the page should be 6-1/2 x 9
inches (16.5 x 23 cm on A4 paper). All text and figures must fit into no
more than 4 pages. The title should be centered at the top of the first
page, and it should be followed by the names of the authors and their
affiliations and mailing addresses (also centered on the page). Skip one
line, and then begin the text of the paper. We request that the paper be
printed by typewriter or letter-quality printer with clear black ribbon,
toner, or ink on plain bond paper. We cannot guarantee the reproduction
quality of color photographs, so we recommend black and white only. The
font should be Times Roman or a similar font, in 12 point type (typewriter
pica); type as small as 10 point (typewriter elite) may be used if
necessary. The paper should be single-spaced, one column, and on
one side of the paper only. Fax submissions are not acceptable.

**Be sure to specify which track and session you are submitting your paper
to and whether you prefer an Oral or Poster presentation. Also include the
name, complete mailing address and phone number (or fax number) of the
author we should communicate with regarding your paper.**

If you would like to receive an acknowledgment that your paper has been
received, include a self-addressed, stamped post-card or envelope for reply,
and write the title and authors of the paper on the back. We will mark it
with the received date and mail it back to you within 48 hours of receipt of
the paper. Submission of the paper to the meeting implies copyright
approval to publish it as part of the conference proceedings. Authors are
responsible for obtaining any clearances or permissions necessary prior to
submission of the paper.

------------------------------

Subject: journal reviewers
From: Lyn Shackleton <lyn@CS.EXETER.AC.UK>
Date: Fri, 16 Jun 89 10:50:29 -0000


******* CONNECTION SCIENCE ******

Editor: Noel E. Sharkey

Because of the number of specialist submissions, the journal is currently
expanding its review panel. This is an interdisciplinary journal with an
emphasis on replicability of results.

If you wish to volunteer please send details of your review area to the
address below. Or write for further details.

lyn shackleton
(assistant editor)

Centre for Connection Science    JANET:  lyn@uk.ac.exeter.cs
Dept. Computer Science
University of Exeter             UUCP:   !ukc!expya!lyn
Exeter EX4 4PT
Devon                            BITNET: lyn@cs.exeter.ac.uk.UKACRL
U.K.


------------------------------

Subject: Neural Computation, Vol. 1, No. 2
From: terry%sdbio2@ucsd.edu (Terry Sejnowski)
Date: Thu, 22 Jun 89 19:16:14 -0700

NEURAL COMPUTATION -- Issue #2 -- July 1, 1989

Views:

Recurrent backpropagation and the dynamical approach to
adaptive neural computation. F. J. Pineda

New models for motor control. J. S. Altman and J. Kien

Seeing chips: Analog VLSI circuits for computer vision. C. Koch

A proposal for more powerful learning algorithms. E. B. Baum

Letters:

A possible neural mechanism for computing shape from shading.
A. Pentland

Optimization in model matching and perceptual organization.
E. Mjolsness, G. Gindi and P. Anandan

Distributed parallel processing in the vestibulo-oculomotor
system. T. J. Anastasio and D. A. Robinson

A neural model for generation of some behaviors in the
fictive scratch reflex. R. Shadmehr

A robot that walks: Emergent behaviors from a carefully
evolved network. R. A. Brooks

Learning state space trajectories in recurrent neural
networks. B. A. Pearlmutter.

A learning algorithm for continually running fully recurrent
neural networks. R. J. Williams and D. Zipser.

Fast learning in networks of locally-tuned processing units.
J. Moody and C. J. Darken.

-----

SUBSCRIPTIONS:

______ $35. Student
______ $45. Individual
______ $90. Institution

Add $9. for postage outside USA and Canada surface mail
or $17. for air mail.

MIT Press Journals, 55 Hayward Street, Cambridge, MA 02142.
(617) 253-2889.


------------------------------

Subject: Tech Report available
From: munnari!cs.flinders.oz.au!guy@uunet.UU.NET (Guy Smith)
Date: Thu, 29 Jun 89 17:56:22 -0600


The Tech Report "Back Propagation with Discrete Weights and Activations"
describes a modification of BP which generates a net with discrete (but not
integral) weights and activations. The modification is simple: weights and
activations are restricted to discrete values. The weights/activations
calculated by BP are rounded to one of the neighbouring discrete values.

For simple discrete problems, the learning performance of the net was not
much affected until the granularity of the legal weight/activation values
was as coarse as ten values per integer (i.e. 0.0, 0.1, 0.2, ...).
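
Here is a minimal sketch of the scheme as described; the toy task, network
shape, and learning rate are illustrative choices, not details taken from
the report.

    import numpy as np

    def quantize(x, step=0.1):
        # Round to the nearest legal discrete value (ten values per integer).
        return np.round(x / step) * step

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # XOR inputs
    T = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets
    W1 = rng.normal(size=(2, 4))
    W2 = rng.normal(size=(4, 1))
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    for _ in range(5000):
        H = quantize(sigmoid(X @ W1))         # discrete hidden activations
        Y = quantize(sigmoid(H @ W2))         # discrete outputs
        dY = (Y - T) * Y * (1 - Y)            # usual BP deltas, computed
        dH = (dY @ W2.T) * H * (1 - H)        # from the rounded values
        W2 = quantize(W2 - 0.5 * (H.T @ dY))  # weights rounded after update
        W1 = quantize(W1 - 0.5 * (X.T @ dH))

Note that a weight update smaller than half a step rounds away entirely,
which suggests why learning eventually suffers as the granularity becomes
coarser.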

To request a copy, mail to "guy@cs.flinders.oz..." or write to

Guy Smith,
Computer Science Department,
Flinders University,
Adelaide 5042,
AUSTRALIA.

Guy Smith.

------------------------------

Subject: Preprint available
From: "Harel Shouval, Tal Grossman" <FEGROSS%WEIZMANN.BITNET@VMA.CC.CMU.EDU>
Date: Fri, 30 Jun 89 09:52:40 +0300


The following preprint describes theoretical and experimental work on an
optical neural network based on a negative-weights model. Please send your
requests by email to: feshouva@weizmann (BITNET), or write to:
Harel Shouval, Electronics Dept., Weizmann Inst. Rehovot 76100, ISRAEL.

---------------------

An All-Optical Hopfield Network: Theory and Experiment
-------------------------------------------------------
Harel Shouval, Itzhak Shariv, Tal Grossman,
Asher A. Friesem and Eytan Domany.
Dept. of Electronics, Weizmann Institute of Science,
Rehovot 76100 Israel.

--- ABSTRACT ---

Realization of an all-optical Hopfield-type neural network is made possible
by eliminating the need for subtracting light intensities. This can be done
without significantly degrading the network's performance, if only inhibitory
connections (i.e. $J_{ij}<0$) are used. We present a theoretical analysis of
such a network, and its experimental implementation, which uses a liquid
crystal light valve for the neurons and an array of sub-holograms for the
interconnections.
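
The authors' construction is detailed in the preprint; the sketch below is
only one algebraic illustration (not necessarily their scheme) of why
inhibitory-only weights need not degrade performance: shifting every weight
below zero can be compensated exactly by a single global activity term,
leaving the standard Hopfield dynamics unchanged.

    import numpy as np

    rng = np.random.default_rng(1)
    N, P = 100, 5
    xi = rng.choice([-1, 1], size=(P, N))      # stored +/-1 patterns
    J = (xi.T @ xi) / N                        # Hebbian weight matrix
    np.fill_diagonal(J, 0)

    c = J.max() + 1e-3                         # shift constant
    J_neg = J - c                              # every weight now strictly < 0

    s = xi[0].copy()
    s[rng.choice(N, 15, replace=False)] *= -1  # corrupt 15 bits of pattern 0

    for _ in range(20):
        A = s.sum()                            # one global activity readout
        s = np.sign(J_neg @ s + c * A)         # identical to np.sign(J @ s)

    print("pattern recovered:", bool((s == xi[0]).all()))

Since (J_neg @ s)_i = (J @ s)_i - c * sum(s), adding c * A restores the
original local field exactly, so the negative-weight network retraces the
standard dynamics step for step.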


------------------------------

Subject: Seminar notice
From: jacobs@marfak.crd.ge.com (jacobs)
Date: Mon, 03 Jul 89 13:37:32 -0400



Neural Networks and High-Level Cognitive Tasks
Robert B. Allen, Bellcore
Thursday, July 6, 10am, Guest House
GE Research and Development Center, Schenectady, NY

While connectionist networks are clearly applicable to signal processing
tasks, they have been claimed not to be relevant to high-level cognitive
tasks. However, the networks' ability to adapt to context and the parsimony
of a vertically integrated cognitive model make their use for high-level
tasks worth careful investigation. This talk reviews the author's work with
temporal networks on applications including 4-term analogies, agent
modeling, agent interaction, grammars, planning, plan recognition, and
'language use'. In addition, novel architectures and procedures such as
adaptive training and a new reinforcement technique will be described.
While the models to be reported have substantial limitations, the scope and
relative ease with which results have been obtained seem promising.

------------------------------

Subject: 6 month post-doc job
From: Geoffrey Hinton <hinton@ai.toronto.edu>
Date: Thu, 06 Jul 89 08:18:29 -0400


CONNECTIONIST POST-DOC POSITION

(If you know of individuals who might be interested but are not on the
connectionists mailing list, please forward this to them.)

The connectionist research group at the University of Toronto is looking for
a post-doctoral researcher for a period of six months starting on January 1
1990. The ideal candidate would have the following qualifications:

1. A significant amount of experience at running connectionist simulations,
preferably in a unix/C environment, and a willingness to use the Toronto
Research Simulator (not publicly available).

2. Some knowledge of neuropsychology.

3. A genuine desire to spend six months working intensively on connectionist
simulations that explain neuropsychological phenomena. Examples of the
types of syndrome we are interested in are given in Shallice, T., "From
Neuropsychology to Mental Structure", Cambridge, 1988.

4. A PhD that is already completed or will clearly be completed by Jan 1
1990. The starting date is inflexible because the job is designed to
coincide with a six month visit to the University of Toronto by Tim
Shallice. Also, it will not be possible to finish off a PhD or convert a
recent PhD into journal papers during the six month period.


For the right person, this would be an excellent opportunity to work in a
leading connectionist group with excellent simulation facilities and with
close collaboration with a neuropsychologist who has a detailed
understanding of connectionist models. One example of the kind of research
we have in mind is described in "Lesioning a connectionist network:
Investigations of acquired dyslexia" by Hinton and Shallice. To order this
TR, send email requesting CRG-TR-89-3 to carol@ai.toronto.edu.

Applications should be made in writing to

Geoffrey Hinton
Department of Computer Science
University of Toronto
10 King's College Road
Toronto, Ontario, M5S 1A4
Canada

Please enclose a full CV, a copy of a recent relevant TR or paper, and the
names, addresses, and phone numbers of three referees. The salary is
negotiable, but will be approximately $20,000 for six months for a person
with a PhD. I will be in Europe until the end of July, so no replies will
be forthcoming for a while.

Geoff Hinton

------------------------------

Subject: REPORTS ON SPARSE DISTRIBUTED MEMORY
From: Michael R. Raugh <raugh@riacs.edu>
Date: Thu, 20 Jul 89 15:13:24 -0700


ANNOUNCING A SERIES OF RIACS REPORTS ON KANERVA'S SPARSE DISTRIBUTED MEMORY

The Sparse Distributed Memory (SDM) Project is now in its fourth year at the
Research Institute for Advanced Computer Science (RIACS) at the NASA Ames
Research Center forty miles southeast of San Francisco. We are studying a
massively parallel architecture invented by the Project Principal
Investigator, Pentti Kanerva. The basic theory is set forth in Kanerva's
book "Sparse Distributed Memory" (MIT Press, 1988).

In brief, SDM is an associative, random-access memory that uses very large
patterns (hundreds to thousands of bits long) as both addresses and data.
When writing a pattern at an address in the memory, the pattern is added to
existing information at each of many nearby memory locations. When reading
from an address in the memory, information stored at nearby memory locations
is pooled and thresholded for output. The memory's potential utility is a
result of its statistical properties and of several factors: (1) A large
pattern representing an object or a scene or a moment of experience can
encode a large amount of information about what it represents. (2) This
information can serve as an address to the memory, and it can also serve as
data. (3) The memory can interpolate and extrapolate from existing data and
is fault tolerant. (4) The memory is also noise tolerant -- the information
need not be exact. (5) The memory can be made very large, and large amounts
of information can be stored in it. (6) The memory can store long sequences
of patterns and can "predict" (recall) the remaining portion of a sequence
when prompted by an earlier segment of the sequence. (7) The architecture
is inherently parallel, allowing large memories to be fast. (8) The
mathematical theory is clearcut and is well understood. (9) Learning is
fast -- only a small number of training cycles are necessary.
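
As a concrete, if tiny, illustration of the write/read cycle described
above, here is a sketch; the dimensions, activation radius, and threshold
are illustrative choices, not the project's parameters.

    import numpy as np

    rng = np.random.default_rng(0)
    n, M, r = 256, 2000, 111      # pattern length, hard locations, radius

    hard_addrs = rng.integers(0, 2, size=(M, n))  # fixed random locations
    counters = np.zeros((M, n), dtype=int)        # one counter per stored bit

    def nearby(addr):
        # Indices of hard locations within Hamming distance r of addr.
        return np.where((hard_addrs != addr).sum(axis=1) <= r)[0]

    def write(addr, data):
        # Add +/-1 into every counter of every nearby location.
        counters[nearby(addr)] += 2 * data - 1    # map bits 0/1 to -1/+1

    def read(addr):
        # Pool the counters of nearby locations and threshold at zero.
        return (counters[nearby(addr)].sum(axis=0) > 0).astype(int)

    p = rng.integers(0, 2, size=n)
    write(p, p)                       # store the pattern autoassociatively
    noisy = p.copy()
    noisy[rng.choice(n, 30, replace=False)] ^= 1  # corrupt 30 address bits
    print("bits recovered:", (read(noisy) == p).mean())

Reading from the noisy address pools the counters of locations near it; the
locations it shares with the write set all voted for p, so the thresholded
sum reproduces the stored pattern.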

We have developed a theory of SDM-based autonomous learning systems, have
built prototypes (including a large-scale simulator on the CM-2 and a
hardware digital prototype built for us at Stanford University), and are
studying the applicability of SDM to speech and shape recognition. We are
also investigating an important relationship between SDM and the mammalian
cerebellum, important because the cerebellum coordinates a myriad of sensory
inputs and motor outputs with far more sophistication than is possible with
present-day man-made computers. We expect our combined studies of the
cerebellum and of SDM-style associative memories to lead to useful results
for controlling robots.

If you would like to learn more about the project, please ask for our
publications list by sending email to sdmpubs@riacs.edu. The list also
provides information on how to order reports.

Michael Raugh
RIACS Assistant Director and
SDM Project Manager

------------------------------

End of Neuron Digest
********************
