Neuron Digest   Monday, 25 Mar 1991                Volume 7 : Issue 15 

Today's Topics:
Re: Artificial Neural Nets, AI and ...
Back-propagation S/W........
Introduction
Another distribution point
Request for info on simulators related to fuzzy logic
Off-line Signature recognition
Triangle NN talk: Dan Levine
Classification and Regression Trees
Second Printing of BackPercolation Manuscript
abstracts - J. of Ideas, Vol. 2 #1
Award Nominations Due 3/31/91
Feature map and vector quantization bibliography
LMS-tree source code available
Looking for Backprop in Lisp
looking for references

[[Editor's Note: This issue has no paper or conference announcements.
Please check the articles carefully for discussion items, requests for
information, or software availability. -PM ]]

Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).


------------------------------------------------------------

Subject: Re: Artificial Neural Nets, AI and ...
From: Salih Kabay <ska@sun.engineering.leicester.ac.uk>
Date: Tue, 05 Mar 91 16:39:20 +0000

Does anyone know of any ongoing projects, commercial or academic, that
attempt to integrate a Neural Network environment with an Expert System
(or similar) and with more conventional procedural signal processing
algorithms in order to solve a complex pattern recognition problem?

My current area of research is BlackBoard Systems, which are concerned
with coupling multiple domain-specific knowledge sources to solve a
relatively complex and ill-defined practical problem. A knowledge source
could be an expert system or a procedural algorithm that performs, say,
filtering, FFT, or spectral analysis. The natural extension to this would
be to add a neural network to perform more complex and powerful signal
processing functions.

Our application is in simulation and diagnosis of neurophysiological
disorders.
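
To make the coupling concrete, here is a rough sketch in Python of a
blackboard with one procedural knowledge source (an FFT) and one
neural-network knowledge source posting a hypothesis. The class names,
blackboard levels, and the throwaway single-layer net are illustrative
assumptions, not the Leicester system.

# Minimal blackboard sketch (illustrative only -- not the Leicester system).
# Knowledge sources inspect the shared blackboard and post new entries.
import numpy as np

class Blackboard:
    def __init__(self):
        self.entries = {}                 # level name -> data

    def post(self, level, data):
        self.entries[level] = data

    def read(self, level):
        return self.entries.get(level)

class FFTSource:
    """Procedural knowledge source: raw signal -> power spectrum."""
    def run(self, bb):
        signal = bb.read("raw_signal")
        if signal is not None:
            bb.post("spectrum", np.abs(np.fft.rfft(signal)) ** 2)

class NeuralNetSource:
    """Neural-network knowledge source: spectrum -> class scores.
    A fixed random single-layer net stands in for a trained network."""
    def __init__(self, n_in, n_out, rng):
        self.W = rng.normal(scale=0.1, size=(n_out, n_in))

    def run(self, bb):
        spectrum = bb.read("spectrum")
        if spectrum is not None:
            bb.post("hypothesis", 1.0 / (1.0 + np.exp(-self.W @ spectrum)))

rng = np.random.default_rng(0)
bb = Blackboard()
bb.post("raw_signal", rng.normal(size=128))        # 128-sample test signal
sources = [FFTSource(), NeuralNetSource(n_in=65, n_out=3, rng=rng)]  # rfft of 128 -> 65 bins
for ks in sources:                                  # trivial fixed-order control loop
    ks.run(bb)
print(bb.read("hypothesis"))

A real control component would of course schedule knowledge sources by
examining what is already on the blackboard rather than running them in a
fixed order.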

Any thoughts or opinions would be most welcome. One good reference is:
"Blackboard Systems", Engelmore & Morgan, Addison, 1989.

Salih Kabay
University of Leicester, UK.

ska@uk.ac.le.engg.sun


------------------------------

Subject: Back-propagation S/W........
From: Barak Pearlmutter <barak@james.psych.yale.edu>
Date: Tue, 05 Mar 91 14:35:31 -0500

My "cbp" simulator has a recurrent backprop mode. It can be FTPed from
f.gp.cs.cmu.edu, user anonymous, file /usr/bap/afs/src/cbp/cbp.tar.Z, use
binary mode. There isn't much documentation available, but poke around
that directory for it. I can give a very limited amount of assistance.
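
For readers new to anonymous ftp, the retrieval would look roughly like
the following (assuming the host and path above are still current; binary
mode is needed for the compressed tar file):

> ftp f.gp.cs.cmu.edu
login: anonymous
password: <ret>
ftp> cd /usr/bap/afs/src/cbp
ftp> binary
ftp> get cbp.tar.Z
ftp> quit
> uncompress cbp.tar.Z
> tar xvf cbp.tar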

--Barak.

------------------------------

Subject: Introduction
From: Darrell Duane <duane@xanth.cs.odu.edu>
Date: Wed, 06 Mar 91 19:01:20 -0500


I'm new to neuron-request.

I'm a senior in Electrical Engineering at Old Dominion University in
Norfolk, VA with a curriculum emphasis in Communications. I have held an
interest in NN for a while now, and will be attending the weeklong class
at Boston University in May. Currently I am working on implementing a
back-propagation algorithm in Pascal... any advice about where to find
some code already written or specific books would be greatly appreciated.

--Darrell Duane

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Darrell Duane, Jr. % If life is a plateful of oysters, then it %
% duane@xanth.cs.odu.edu % is only the empty ones which open easily. %
% ^(Internet) 1:109/331 <-(Fidonet) % -- C.H. Waddington, Tools for Thought %
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

[[Editor's Note: Back issues of the Digest have listed a number of books
for introducing the algorithms as well as simulators (see Vol 7 #4 & #5,
most recently). The PDP Vol 3 book (MIT Press) has C code for
back-propagation. The ftp directory on hplpm.hpl.hp.com has a little bit
of software. Of course, finding Pascal code itself may be a challenge. -PM ]]
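
As a rough starting point for an implementation like Darrell's, a
bare-bones batch back-propagation pass for one hidden layer of sigmoid
units looks like the sketch below (written in Python for brevity; the
learning rate, hidden-layer size, and XOR test problem are illustrative
choices, and translating the loops to Pascal is largely mechanical).

# Bare-bones batch back-propagation, one hidden layer of sigmoid units.
# Illustrative sketch only; parameter values are arbitrary.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def add_bias(A):
    return np.hstack([A, np.ones((A.shape[0], 1))])   # append a bias column

def train(X, T, n_hidden=4, lr=0.5, epochs=10000, seed=0):
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1] + 1, n_hidden))
    W2 = rng.normal(scale=0.5, size=(n_hidden + 1, T.shape[1]))
    Xb = add_bias(X)
    for _ in range(epochs):
        H = sigmoid(Xb @ W1)                  # forward pass
        Hb = add_bias(H)
        Y = sigmoid(Hb @ W2)
        dY = (Y - T) * Y * (1 - Y)            # output deltas (squared error)
        dH = (dY @ W2[:-1].T) * H * (1 - H)   # back-propagated hidden deltas
        W2 -= lr * Hb.T @ dY                  # gradient-descent weight updates
        W1 -= lr * Xb.T @ dH
    return W1, W2

def predict(X, W1, W2):
    return sigmoid(add_bias(sigmoid(add_bias(X) @ W1)) @ W2)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)   # XOR test problem
T = np.array([[0], [1], [1], [0]], dtype=float)
W1, W2 = train(X, T)
print(np.round(predict(X, W1, W2), 2))        # should move toward 0, 1, 1, 0

The essential steps are the forward pass, the output and hidden deltas,
and the gradient-descent weight updates; everything else is bookkeeping.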

------------------------------

Subject: Another distribution point
From: elsberry@evax.uta.edu (Wesley R Elsberry)
Date: Fri, 08 Mar 91 07:24:55 -0600


Alternate Distribution Point

There is another distribution point for various and sundry ANN related
files. Central Neural System BBS provides a free access system for the
exchange of files and discussion of ANN related topics. This is
available to anyone with a modem and terminal program, making it one of
the few online resources open to those without Internet or Bitnet access.
I have collected quite a few packages, tutorials, and text files for
distribution, including the Neural Network benchmark files, several
different simulators for a wide variety of machines, and past Neuron
Digest articles.

I'm always looking for more interesting items to add to the collection,
though. I'm especially interested in tutorials and simulations that
include source code. If you have or know of a file to add to the system,
I can be contacted via email at <elsberry@arrisun3.uta.edu>,
<elsberry@evax.uta.edu>, or <wesley.elsberry@f303.n930.z8.rbbs-net.org>.

Central Neural System is also home to the NEURAL_NET Echo, which is a
nationally available EchoMail discussion area. Any of over six thousand
FidoNet and RBBS-Net BBS's can add NEURAL_NET to the list of Echoes
carried, which means that you probably are within local calling distance
of a board that carries or may be persuaded to carry the Echo. I try to
port the comp.ai.neural-nets newsgroup to the Echo on a regular basis.

The following is a list of files in the ANNSIM area.

Central Neural System, (817) 551-9363
RBBS-Net Node 8:930/303.0
BinkleyTerm 2.40, TPBoard 6.2
ANNSIM Section Directory Listing as of 04:00 Wednesday 6-Mar-91

ADALINE.ZIP 49k (DOS) ADALINE ANN model tutorial, demo simulation
AIEXP_NN.ZIP 113k (DOS) AI Expert Mag's Neural Network demo disk.
ARTICLE2.TXT 21k (TEXT) An unorganized NN related bibliography
ART_CODE.ANN 23k (SOURCE IN C) Source code for ART 1 & 2 from P. van der Smagt
BAM.ZIP 30k (DOS) Bidirectional Associative Memory Simulation
BENCH.ZIP 318k (TEXT) Scott Fahlman's Neural Network Benchmarks
BPS_100.ZIP 208k (DOS,UNIX,MAC) Back-Propagation Simulation package
CASCOR.ZIP 109k (TEXT) Scott Fahlman's Cascade Correlation NNs in LISP and C
CORLITHM.ZIP 11k Correlithm theory overview and software modules, some refs
DDJ-AI.ZIP 200k Dr. Dobb's AI programs, including SILOAM -- a perceptron ANN
DDJ-APR.ZIP 31k Dr. Dobb's programs, including some ANN simulations
DESIRE.ZIP 161k (DEMO: DOS) Korn's DESIRE modelling package demo
DYNSYS.ZIP 325k (DEMO: DOS) Dynamical System's modelling package demo
ET.ZIP 54k (DOS) Perceptron simulator
GENALG.ZIP 32k (DOS) A simple genetic algorithm demo program
GENESIS.ZIP 58k (UNIX) Genesis Genetic Algorithm package
HOPFIELD.ZIP 31k Hopfield neural net - need UNIX tar command to unpack it
JEDI.ZOO 70k (DOS) Jet engine simulation system using ANNs
KNOWREP.ZIP 15k (TEXT) MIND Workshop on NNs for Knowledge Rep. abstracts, etc
MAC33.HQX 141k (MAC) Mactivation 3.3 in binhex/Stuffit format
MATH'ICA.ZIP 355k (DEMO: DOS w/VGA) Mathematica demonstration program
MINDNOTE.ZIP 44k (TEXT) Metroplex Inst. for Neural Dynamics (MIND) Meeting Notes
MIND_TXT.ZIP 48k (TEXT) MIND Newsletter announcements
N'WARE.ZIP 380k (DEMO: DOS) Neuralware's Neuralworks II demo
NDV6_0.ZIP 109k (TEXT) Neuron Digest V6 I:0-9
NDV6_1.ZIP 108k (TEXT) Neuron Digest V6 I:10-19
NDV6_2.ZIP 129k (TEXT) Neuron Digest V6 I:20-29
NDV6_3.ZIP 128k (TEXT) Neuron Digest V6 I:30-39
NDV6_4.ZIP 133k (TEXT) Neuron Digest V6 I:40-49
NDV6_5.ZIP 119k (TEXT) Neuron Digest V6 I:50-59
NDV6_6.ZIP 121k (TEXT) Neuron Digest V6 I:60-69
NDV6_7.ZIP 53k (TEXT) Neuron Digest V6 I:70-79
NDV7_0.ZIP 106k (TEXT) Neuron Digest V7 I:1-6
NDV7_1.ZIP 13k (TEXT) Neuron Digest V7 I:10-?
ND_V5.ZIP 201k (TEXT) Neuron Digest Volume 5
NEURTTT.ZIP 63k (DOS) Neural Network Tic-Tac-Toe demonstration
NEWRULE.BAS 8k Gary Coulter's NEWRULE source code, GWBASIC ASCII text
NEWRULE.ZIP 34k Gary Coulter's NEWRULE BP ANN simulation with GWBASIC source
NEWRULEC.ZIP 27k Edward Nicol's C port of Gary Coulter's NEWRULE BP ANN
NNS.ZIP 30k Neural Net Simulator written by Fred Mitchell (c) 1987.
NN_FTP.LST 22k (TEXT) FTP sites for NN simulators.
NRLNET10.ZIP 45k (DOS) Back-propagation network construction kit
NSHELL.ZIP 22k (DEMO: DOS) NeuroShell demo
RESOURCE.TXT 12k (TEXT) ANN Resources for Research, Study, & Play
TOGAI.ZIP 137k (DEMO: WIN3) Togai Infralogic's fuzzy & neural network demos
USENET01.ZIP 69k Usenet text discussions on AI and ANNs
USENET02.ZIP 41k More Usenet neural network discussions
V7I10.ND 35k (TEXT) Neuron Digest V7 I:10
WORKABST.MND 23k (TEXT) Abstracts from MIND's 4th Annual Workshop
WRE_THES.ZIP 265k (DOS) Master's thesis, with ANN source, exec, and text.

I'm willing to provide diskette service given the following conditions:

1) The user provides MS-DOS diskette(s) (5.25" 360K or 1.2M; 3.5" 720K or
1.44M), a self-addressed stamped envelope, and a prioritized list of
requested files

-OR-

2) The user provides a specification of the MS-DOS disk type requested, a
return address, a prioritized list of files requested, and $5 US per disk,
and I'll provide the disk(s) and return mailer.

Please note that option 1 is preferred, as it places the least strain on
my spare time.

If you know of students or other folk who are interested in ANNs, but who
don't have an Internet account, consider passing on the word concerning
the existence of Central Neural System BBS and the NEURAL_NET Echo.

Wesley R. Elsberry


[[Editor's Note: Mr. Elsberry has been a faithful reader of Neuron Digest
for years and is also a nice guy. His resources are well worth the
effort. Of course, if you are *only* on the Internet, I don't know if he
can arrange ftp or other "direct" access. Wesley? -PM ]]

------------------------------

Subject: Request for info on simulators related to fuzzy logic
From: FUZZY LOGIC GROUP / CORIMME / SGS-THOMSON MICROELECTRONICS <DUDZIAKM@isnet.inmos.COM>
Date: Fri, 08 Mar 91 07:34:04 -0700

I am in the process of compiling an annotated list of simulators and
emulators of fuzzy logic and other related non-deterministic logics.
This list will be made available to the network community. I welcome any
information about products and especially distributable, public-domain
prototypes. I am familiar with a few of the commercial products but do
not know much about what is available through the academic research
community. There seem to be quite a number of neural/connectionist
simulators, such as those recently described on this and other mailing
lists.

Your assistance is appreciated.

Martin Dudziak
SGS-THOMSON Microelectronics
dudziakm@isnet.inmos.com
(alternate: dudziakm@agrclu.st.it in Europe)
fax: 301-290-7047
phone: 301-995-6952


------------------------------

Subject: Off-line Signature recognition
From: Eliezer Dekel <dekel@utdallas.edu>
Date: Tue, 12 Mar 91 19:00:53 -0600

I am looking for references to work on off-line signature recognition.
I'm aware of some work that was done before 1985. I would greatly
appreciate information about more recent work. I'll summarize and post
to the list.

Eliezer Dekel
The University of Texas at Dallas
dekel@utdallas.edu


------------------------------

Subject: Triangle NN talk: Dan Levine
From: Jonathan Marshall <marshall@cs.unc.edu>
Date: Wed, 13 Mar 91 11:34:50 -0500

[[Editor's Note: As usual, few readers will actually be able to attend,
but Levine's work is quite interesting. Much of his modeling is inspired
by Grossberg. A year or two ago, he wrote a paper modeling frontal lobe
damage in the Wisconsin Card Sorting task, for example. Wesley Elsberry,
above, also worked with Levine. -PM]]


====== TRIANGLE AREA NEURAL NETWORK INTEREST GROUP presents: ======

Prof. DANIEL S. LEVINE
Department of Mathematics
University of Texas at Arlington

Tuesday, April 2, 1991
5:30 p.m.
Refreshments will be served at 5:15.

Sitterson Hall (Computer Science), room 011
UNC Chapel Hill

=----------------------------------------------------------------------
NETWORK MODELING OF NEUROPSYCHOLOGICAL DATA

A general class of neural network architectures will be discussed, based
on such principles as associative learning, competition, and opponent
processing. Examples of this sort of architecture will be introduced
that model data on neuropsychological deficits arising from frontal lobe
damage. These deficits include inability to switch criteria on a card
sorting task; excessive attraction to novel stimuli; loss of verbal
fluency; and difficulty in learning a flexible motor sequence. Frontal
lobe damage is modeled in these networks by weakening of a specified
connection.
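
As a toy illustration of the lesioning idea only (these are not Levine's
equations), consider two competing "rule" nodes whose activation combines
a slowly learned habit strength with a reward-driven bias; simulated
damage is a reduced gain on the reward connection, and the damaged
network keeps sorting by the old rule after the criterion changes. All
parameter values below are made up for the sketch.

# Toy illustration only -- not Levine's model.  Two "rule" nodes compete;
# each activation is a slowly learned habit strength plus a reward-driven
# bias scaled by a gain on the reward connection.  Simulated damage = a
# weakened gain, which produces perseveration after the criterion shift.
import numpy as np

def card_sort(reward_gain, trials=40, lr=0.3):
    habit = np.zeros(2)              # habit strengths for rules A and B
    bias = np.zeros(2)               # recent reward/punishment trace
    choices = []
    for t in range(trials):
        correct = 0 if t < trials // 2 else 1        # criterion switches halfway
        choice = int(np.argmax(habit + reward_gain * bias))
        choices.append("AB"[choice])
        r = 1.0 if choice == correct else -1.0       # feedback on this trial
        habit[choice] += lr * max(r, 0.0) * (1 - habit[choice])  # grows with success
        bias[choice] += lr * (r - bias[choice])      # tracks recent feedback
    return "".join(choices)

print("intact   :", card_sort(reward_gain=3.0))      # switches soon after the shift
print("lesioned :", card_sort(reward_gain=0.2))      # keeps choosing the old rule

With these illustrative settings, the intact run switches rules after a
few errors while the weakened-gain run perseverates for the rest of the
session.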

Dan Levine is author of the forthcoming textbook Introduction to Neural
and Cognitive Modeling, published by L. Erlbaum Associates, 1991. He is
a co-founder of the Dallas-Ft.Worth area neural network interest group
M.I.N.D.
=----------------------------------------------------------------------

Co-Sponsored by:
Department of Electrical and Computer Eng., NCSU
Department of Computer Science, UNC-CH
Humanities Computing Facility, Duke Univ.

For more information:
Jonathan Marshall (UNC-CH, 962-1887, marshall@cs.unc.edu) or
John Sutton (NCSU, 737-5065, sutton@eceugs.ece.ncsu.edu).

Directions:
Sitterson Hall is located across the street from the Carolina Inn,
on South Columbia Street (Route 86), which is the main north-south
street through downtown Chapel Hill. Free parking is available in
the UNC lots, two of which are adjacent to Sitterson Hall.
Municipal parking lots are located 2-3 blocks north, in downtown
Chapel Hill.

=----------------------------------------------------------------------
We invite you to participate in organizing and running the new
Triangle-area neural network (NN) interest group. It is our hope that
the group will foster communication and collaboration among the local NN
researchers, students, businesses, and the public.



------------------------------

Subject: Classification and Regression Trees
From: Roysam <roysam@ecse.rpi.edu>
Date: Sat, 16 Mar 91 17:49:21 -0500


I am looking for a (preferably inexpensive/non-commercial) computer
program that builds Classification and Regression Trees (CART).

Would appreciate any suggestions on this.

Thanks

Badri Roysam
Department of ECSE
RPI, Troy, NY 12180.
roysam@ecse.rpi.edu
(518)-276-8067


------------------------------

Subject: Second Printing of BackPercolation Manuscript
From: mgj@cup.portal.com
Date: Sat, 16 Mar 91 16:13:24 -0800

SHORT ANNOUNCEMENT

Due to a large number of requests for BackPercolation, a second printing
is now underway. If you have already requested a copy and did not yet
receive it, fear not. Another shipment will leave on 20 March 91. All
remaining requests will be fulfilled at that time.

Mark Jurik, JURIK RESEARCH & CONSULTING, PO 2379, Aptos, CA 95001, USA

------------------------------

Subject: abstracts - J. of Ideas, Vol. 2 #1
From: well!moritz@apple.com (Elan Moritz)
Date: Mon, 18 Mar 91 18:18:57 -0800



+=++=++=++=++=++=++=++=++=++=++=++=++=++=++=+

please post & circulate
Announcement
.........

Abstracts of papers appearing in
Volume 2 # 1 of the Journal of Ideas



THOUGHT CONTAGION AS ABSTRACT EVOLUTION


Aaron Lynch


Abstract: Memory abstractions, or mnemons, form the basis of a
memetic evolution theory where generalized self-replicating ideas
give rise to thought contagion. A framework is presented for
describing mnemon propagation, combination, and competition. It is
observed that the transition from individual level considerations to
population level considerations can act to cancel individual
variations and may result in population behaviors. Equations for
population memetics are presented for the case of two-idea
interactions. It is argued that creativity via innovation of ideas is
a population phenomenon. Keywords: mnemon, meme, evolution,
replication, idea, psychology, equation.

...................


CULTURE AS A SEMANTIC FRACTAL:
Sociobiology and Thick Description


Charles J. Lumsden


Department of Medicine, University of Toronto
Toronto, Ontario, Canada M5S 1A8


Abstract: This report considers the problem of modeling culture as a
thick symbolic system: a system of reference and association
possessing multiple levels of meaning and interpretation. I suggest
that thickness, in the sense intended by symbolic anthropologists
like Geertz, can be treated mathematically by bringing together two
lines of formal development, that of semantic networks, and that of
fractal mathematics. The resulting semantic fractals offer many
advantages for modeling human culture. The properties of semantic
fractals as a class are described, and their role within
sociobiology and symbolic anthropology considered. Provisional
empirical evidence for the hypothesis of a semantic fractal
organization for culture is discussed, together with the prospects
for further testing of the fractal hypothesis. Keywords: culture,
culturgen, meme, fractal, semantic network.


...................

MODELING THE DISTRIBUTION OF A "MEME" IN A SIMPLE AGE DISTRIBUTION
POPULATION: I. A KINETICS APPROACH AND SOME ALTERNATIVE MODELS


Matthew Witten


Center for High Performance Computing
University of Texas System, Austin, TX 78758-4497


Abstract. Although there is a growing historical body of literature
relating to the mathematical modeling of social and historical
processes, little effort has been placed upon modeling the spread of
an idea element "meme" in such a population. In this paper we
review some of the literature and we then consider a simple kinetics
approach, drawn from demography, to model the distribution of a
hypothetical "meme" in a population consisting of three major age
groups. KEYWORDS: Meme, idea, age-structure, compartment,
sociobiology, kinetics model.


...................


THE PRINCIPIA CYBERNETICA PROJECT


Francis Heylighen, Cliff Joslyn, and Valentin Turchin



The Principia Cybernetica Project


Abstract: This note describes an effort underway by a group of
researchers to build a complete and consistent system of philosophy.
The system will address issues of general philosophical concern,
including epistemology, metaphysics, and ethics, or the supreme
human values. The aim of the project is to move towards conceptual
unification of the relatively fragmented fields of Systems and
Cybernetics through consensually-based philosophical development.
Keywords: cybernetics, culture, evolution, system transition,
networks, hypermedia, ethics, epistemology.


...................


Brain and Mind: The Ultimate Grand Challenge


Elan Moritz


The Institute for Memetic Research
P. O. Box 16327, Panama City, Florida 32406


Abstract: Questions about the nature of brain and mind are raised.
It is argued that the fundamental understanding of the functions and
operation of the brain and its relationship to mind must be regarded
as the Ultimate Grand Challenge problem of science. National
research initiatives such as the Decade of the Brain are discussed.
Keywords: brain, mind, awareness, consciousness, computers,
artificial intelligence, meme, evolution, mental health, virtual
reality, cyberspace, supercomputers.



+=++=++=++=++=++=++=++=++=++=++=++=++=++=++=+



The Journal of Ideas is an archival forum for discussion of 1) the
evolution and spread of ideas, 2) the creative process, and 3) biological
and electronic implementations of idea/knowledge generation and
processing.



The Journal of Ideas, ISSN 1049-6335, is published quarterly by the
Institute for Memetic Research, Inc., P. O. Box 16327, Panama City,
Florida 32406-1327.


>----------- FOR MORE INFORMATION ------->

E-mail requests to Elan Moritz, Editor, at moritz@well.sf.ca.us.


------------------------------

Subject: Award Nominations Due 3/31/91
From: Bradley Dickinson <bradley@ivy.Princeton.EDU>
Date: Tue, 19 Mar 91 16:46:29 -0500

Nominations Sought for IEEE Neural Networks Council Award

The IEEE Neural Networks Council is soliciting nominations for an award,
to be presented for the first time at the July 1991 International Joint
Conference on Neural Networks.


IEEE Transactions on Neural Networks Outstanding Paper Award

This is an award of $500 for the outstanding paper published in the
IEEE Transactions on Neural Networks in the previous two-year period.
For 1991, all papers published in 1990 (Volume 1) in the IEEE Transactions
on Neural Networks are eligible. For a paper with multiple authors, the
award will be shared by the coauthors.

Nominations must include a written statement describing the outstanding
characteristics of the paper. The deadline for receipt of nominations is
March 31, 1991. Nominations should be sent to Prof. Bradley W.
Dickinson, NNC Awards Chair, Dept. of Electrical Engineering, Princeton
University, Princeton, NJ 08544-5263.

Questions about the award may be directed to Prof. Bradley W. Dickinson:
telephone: (609)-258-2916, electronic mail: bradley@ivy.princeton.edu


------------------------------

Subject: Feature map and vector quantization bibliography
From: JJ Merelo <jmerelo@ugr.es>
Date: 20 Mar 91 20:31:00 +0200

I have collected this bibliography on vector quantization and Kohonen
maps. I hope it is useful for starters. If there is interest, I could
also compile a databank of authors' postal and e-mail addresses for
requesting further information. If anybody knows of more references on
feature maps, Kohonen maps, LVQ, their mathematical derivations, or
related topics, please post them or send them to my e-mail address.
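
For readers starting from scratch, the core updates behind most of the
papers below are compact. The following Python sketch shows a
one-dimensional self-organizing map trained on random 2-D data, plus the
basic LVQ1 step; the learning-rate and neighbourhood schedules are
illustrative choices, not values from any particular paper.

# Rough sketch of a 1-D Kohonen self-organizing map and the LVQ1 step.
# Schedules and parameter values are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n_units, dim = 20, 2
W = rng.uniform(size=(n_units, dim))               # codebook vectors on a 1-D map
data = rng.uniform(size=(2000, dim))               # stand-in training data

for t, x in enumerate(data):
    c = np.argmin(np.linalg.norm(W - x, axis=1))   # best-matching unit
    alpha = 0.5 * (1 - t / len(data))              # decaying learning rate
    sigma = max(3.0 * (1 - t / len(data)), 0.5)    # shrinking neighbourhood width
    h = np.exp(-((np.arange(n_units) - c) ** 2) / (2 * sigma ** 2))
    W += alpha * h[:, None] * (x - W)              # move units toward the input

def lvq1_step(W, labels, x, y, alpha=0.05):
    """One LVQ1 update: attract the nearest codebook vector if its label
    matches the sample's label y, otherwise repel it."""
    c = np.argmin(np.linalg.norm(W - x, axis=1))
    sign = 1.0 if labels[c] == y else -1.0
    W[c] += sign * alpha * (x - W[c])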


Acker, R; Kurz, A
On the biologically motivated derivation of Kohonen's
self-organizing feature maps. Parallel Processing in Neural Systems and
Computers, pp 229-232 - 1990

Iwamida, H; Katagiri, S; McDermott, E; Tohkura, Y
A hybrid speech recognition system using HMMs with an LVQ-trained
codebook. ICASSP-90, S10.1, pp489 ff. - 1990

Kangas, J; Kohonen, T
Transient Map Method in Stop Consonant Discrimination

Knaghenhjelm, P; Brauer, P
Classification of vowels in continuous speech using MLP and a
hybrid net. Speech Communication 9 ( 1990 ), 31-34 - 1990

Kangas, J; Kohonen, T; Laaksonen, J; Simula, O; Vent, O
Variants of Self-Organizing Maps. IJCNN-89, Washington DC, June 18-22,
1989; IEEE Trans. on Neural Networks, vol 1, no. 1 - 1990

Kangas, J
Time-Delayed Self-Organizing Maps. 1990

Kimber, DG; Bush, M; Tajchman, GN
Speaker-Independent Vowel Classification Using Hidden Markov Models and
LVQ2. ICASSP-90, S10.3 - 1990

Kohonen, T
Self-organized formation of topologically correct feature maps.
Biological Cybernetics, 43:59-69 - 1982

Kohonen, T; Mäkisara, K; Saramäki, T
Phonotopic map: insightful representation of phonological
features for speech recognition. IEEE 7th Conf on Pattern Recognition,
Montreal, Canada - 1984

Kohonen, T
An Introduction to Neural Computing. Neural Networks, 1, pp3-16, 1988 -
1988

Kohonen, T; Barna, G; Chrisley, R
Statistical Pattern Recognition with Neural Networks:
Benchmarking Studies. Proc. IEEE Int. Conf. on NN, ICNN-88 - 1988

Kohonen, T
Speech Recognition Based on Topology-Preserving Neural Maps. Neural
Computing Architectures ( Aleksander, Igor, ed ), pp 26 ff - 1989

Kohonen, T
Learning Vector Quantization. INNC Vol 1 - 1990

Kohonen, T
Statistical Pattern Recognition Revisited. Advanced Neural Computers, R.
Eckmiller, ( Ed ), pp 137 ff - 1990

Lippmann, RP
Review of Neural Networks for Speech Recognition. Neural Computation 1,
1-38 - 1989

Makhoul, J; Roucos, S; Gish, H
Vector Quantization in Speech Coding. Procs. IEEE, vol 73, no. 11 - 1985
Nov

McDermott, E; Katagiri, S
Shift-Invariant, Multi-Category Phoneme Recognition using Kohonen's LVQ2.
ICASSP-89, pp 81ff - 1989

Nakagawa, S; Hirata, Y
Comparison Among Time-Delay Neural Networks, LVQ2, Discrete Parameter HMM
and Continuous Parameter HMM. ICASSP-90, S10.6, pp 509 ff - 1990

Obermayer, K; Ritter, H; Schulten, K
Large-Scale Simulation of a Self-Organizing Neural Network: Formation of
a Somatotopic Map. Parallel Processing in Neural Systems and Computers,
pp 71-74 - 1990

Pope, C; Atlas, L; Nelson, C
A comparison between neural network and conventional vector quantization
codebook algorithms. IEEE PacRim Conf. on Communications, Computers and
Signal Processing, pp 521ff - 1989

Ritter, H
A Spatial Approach to Feature Linking

Ritter, H; Kohonen, T
Self-Organizing Semantic Maps. Biol. Cybern. 61, 241-254 - 1989

Visa, A
Stability Study of Learning Vector Quantization. INNC Vol 2, pp 729 ff -
1990

JJ Merelo
Depto. Electronica y Tecnologia de Computadores
Facultad de Ciencias
Campus Fuentenueva, s/n
18071-Granada ( Spain )
e-mail JMERELO@ugr.es



------------------------------

Subject: LMS-tree source code available
From: "Terence D. Sanger" <tds@ai.mit.edu>
Date: Thu, 21 Mar 91 02:32:37 -0500


Source code for a sample implementation of the LMS-tree algorithm is now
available by anonymous ftp. The code includes a bare-bones
implementation of the algorithm incorporated into a demo program which
predicts future values of the Mackey-Glass differential delay equation.
The demo will run under X11R3 or higher, and has been tested on Sun-3 and
Sun-4 machines. Since this is a deliberately simple implementation not
including tree pruning or other optimizations, many improvements are
possible. I encourage any and all suggestions, comments, or questions.

Terry Sanger (tds@ai.mit.edu)

To obtain and execute the code:

> mkdir lms-trees
> cd lms-trees
> ftp ftp.ai.mit.edu
login: anonymous
password: <ret>
ftp> cd pub
ftp> binary
ftp> get sanger.mackey.tar.Z
ftp> quit
> uncompress sanger.mackey.tar.Z
> tar xvf sanger.mackey.tar
> make mackey
> mackey
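
For reference, the Mackey-Glass series that the demo predicts can be
generated independently by a coarse Euler integration of the delay
equation dx/dt = a*x(t-tau)/(1 + x(t-tau)^10) - b*x(t); the Python sketch
below uses the commonly quoted parameters a = 0.2, b = 0.1, tau = 17,
which are illustrative and not necessarily the demo's own settings.

# Generating a Mackey-Glass series by coarse Euler integration of
#   dx/dt = a*x(t - tau) / (1 + x(t - tau)**10) - b*x(t)
# with the commonly quoted parameters a = 0.2, b = 0.1, tau = 17.
import numpy as np

def mackey_glass(n_points=2000, a=0.2, b=0.1, tau=17, dt=1.0, x0=1.2):
    history = int(round(tau / dt))                 # number of delayed samples
    x = np.full(n_points + history, x0)            # constant initial history
    for t in range(history, n_points + history - 1):
        x_tau = x[t - history]
        x[t + 1] = x[t] + dt * (a * x_tau / (1.0 + x_tau ** 10) - b * x[t])
    return x[history:]

series = mackey_glass()
print(series[:5])                                  # first few values of the series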


Some documentation is included in the header of the file mackey.c . A
description of the algorithm can be found in the paper I recently
announced on this network:

Basis-Function Trees as a Generalization of Local Variable Selection
Methods for Function Approximation

Abstract

Local variable selection has proven to be a powerful technique for
approximating functions in high-dimensional spaces. It is used in
several statistical methods, including CART, ID3, C4, MARS, and others.
In this paper I present a tree-structured network which is a
generalization of these techniques. The network provides a framework for
understanding the behavior of such algorithms and for modifying them to
suit particular applications.


To obtain the paper:

>ftp cheops.cis.ohio-state.edu
login: anonymous
password: <ret>
ftp> cd pub/neuroprose
ftp> binary
ftp> get sanger.trees.ps.Z
ftp> quit
>uncompress sanger.trees.ps.Z
>lpr -h -s sanger.trees.ps

Good Luck!


------------------------------

Subject: Looking for Backprop in Lisp
From: "BAYNES ROBERT T" <baynes@nusc-npt.navy.mil>
Date: 25 Mar 91 10:10:00 +0000


I am looking for an implementation of the PDP software (backprop)
in Lisp, public domain please.

I am not on the mailing list yet so please send responses to
baynes@nusc-npt.navy.mil.

Thanks,


Bob Baynes

[[Editor's Note: See the first article in this Digest. Again, actual LISP
code may be a challenge to find, but it should be quite straightforward to
write based on either existing C code or straight from any of the many
excellent ANN books. -PM]]

------------------------------

Subject: looking for references
From: <GANKW%NUSDISCS.BITNET@CUNYVM.CUNY.EDU>
Date: Mon, 25 Mar 91 16:37:00 -0800


I am looking for articles on the application of ART in *supervised*
learning situations. Can anyone help?

Thanks.

Kok Wee Gan
Department of Information Systems and Computer Science
National University of Singapore
bitnet address: gankw@nusdiscs.bitnet

[[Editor's Note: Perhaps someone from Boston U. could answer in a future
Digest? I thought ART was, strictly speaking, unsupervised only. -PM]]
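
For readers unfamiliar with the algorithm, the unsupervised ART 1 search
cycle is small enough to sketch; supervised use generally couples ART
modules to a teaching signal, as in the ARTMAP architecture described by
Carpenter, Grossberg, and Reynolds. The Python sketch below uses
illustrative vigilance and choice parameters with fast learning on binary
inputs, and is not a supervised method by itself.

# Rough sketch of unsupervised ART 1 category search on binary inputs
# (fast learning; vigilance rho and choice parameter beta are illustrative).
import numpy as np

def art1(inputs, rho=0.5, beta=1.0, max_categories=10):
    w = []                                          # learned binary templates
    assignments = []
    for I in inputs:
        I = np.asarray(I, dtype=float)
        # choice function T_j = |I AND w_j| / (beta + |w_j|) for each category
        scores = [np.sum(np.minimum(I, wj)) / (beta + np.sum(wj)) for wj in w]
        chosen = None
        for j in np.argsort(scores)[::-1]:          # search best category first
            match = np.sum(np.minimum(I, w[j])) / np.sum(I)
            if match >= rho:                        # vigilance test
                w[j] = np.minimum(I, w[j])          # fast learning: template AND input
                chosen = int(j)
                break
        if chosen is None and len(w) < max_categories:
            w.append(I.copy())                      # commit a new category
            chosen = len(w) - 1
        assignments.append(chosen)
    return assignments, w

patterns = [[1, 1, 0, 0], [1, 1, 1, 0], [0, 0, 1, 1], [0, 1, 1, 1]]
print(art1(patterns)[0])                            # e.g. [0, 0, 1, 1] with rho = 0.5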

------------------------------

End of Neuron Digest [Volume 7 Issue 15]
****************************************
