AIList Digest           Saturday, 13 Aug 1988      Volume 8 : Issue 46 

Query Responses:

Feigenbaum's citation
Sigmoid transfer function

----------------------------------------------------------------------

Date: 11 Aug 88 06:07:38 GMT
From: mcvax!inria!crin!napoli@uunet.uu.net (Amedeo NAPOLI)
Subject: Feigenbaum's citation

Can anybody tell me the title of the book in which E. Feigenbaum
says:

``AI focused its attention almost exclusively on the development of clever
inference methods. But the power of its systems does not reside in the
inference methods; almost any inference method will do. The power
resides in the knowledge''

Many thanx in advance,
--
--- Amedeo Napoli @ CRIN / Centre de Recherche en Informatique de Nancy
EMAIL : napoli@crin.crin.fr - POST : BP 239, 54506 VANDOEUVRE CEDEX, France

------------------------------

Date: 11 Aug 88 16:29:22 GMT
From: glacier!jbn@labrea.stanford.edu (John B. Nagle)
Subject: Re: Feigenbaum's citation


I heard him say things very similar to that around Stanford in 1983.
In the early days of expert systems, that was a common remark. It reflects
a turf battle with the logicians that was taking place at the time,
theorem-proving having been a dominant paradigm in AI in the late 1970s
and early 1980s.

It's not clear that such a remark has relevance today. The optimistic
hope that dumping lots of rules into a dumb inference engine would produce
something profound has faded. Experience with that approach has produced
more understanding of what can and cannot be accomplished in that way.
More work is taking place on the underlying machinery again. But now,
there is the realization that the machinery exists to process the knowledge
base, not to implement some interesting logical function. In retrospect,
both camps (and there were camps, at Stanford, in separate buildings)
were taking extreme positions, neither of which turned out to be entirely
satisfactory. Work today lies somewhere between those poles.

Plans are underway, amusingly, to get both groups at Stanford under
one roof again in a new building.

John Nagle

------------------------------

Date: 11 Aug 88 17:17:26 GMT
From: pasteur!agate!garnet.berkeley.edu!ked@ames.arpa (Earl H. Kinmonth)
Subject: Re: Feigenbaum's citation

As I remember, Feigenbaum achieved notoriety for his (probably
largely ghosted) book on the Japanese "Fifth Generation Project."
Did anything ever come out of the Fifth Generation project other
than big lecture fees for Feigenbaum to go around warning about
the Japanese peril?

Is he really a pompous twit (the impression given by the book) or
is that due to the scatter-brained ghost writer?

------------------------------

Date: 11 Aug 88 23:35:44 GMT
From: prost.llnl.gov!daven@lll-winken.llnl.gov (David Nelson)
Subject: Re: Feigenbaum's citation

In article <17626@glacier.STANFORD.EDU> jbn@glacier.UUCP (John B. Nagle) writes:
>

[stuff about Feigenbaum's remark omitted]

> It's not clear that such a remark has relevance today. The optimistic
>hope that dumping lots of rules into a dumb inference engine would produce
>something profound has faded. ....

To be replaced by the optimistic hope that dumping lots of examples into a
dumb neural net will produce something profound :-)

daven


daven (Dave Nelson)
arpa: daven @ lll-crg.llnl.gov
uucp: ...{seismo,mordor,sun,lll-lcc}!lll-crg!daven

------------------------------

Date: 8 Aug 88 03:00:35 GMT
From: glacier!jbn@labrea.stanford.edu (John B. Nagle)
Subject: Re: Sigmoid transfer function

In article <25516@ucbvax.BERKELEY.EDU> munro@icsia.UUCP (Paul Munro) writes:
>
>Try this one : f(x) = x / (1 + |x|)
>

The graph looks OK, although some scaling is needed to make it comparable
to the sigmoid. Someone should try it in one of the popular neural net
simulators and see how the results change.
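A quick numerical comparison is easy to run (a sketch, not from the original posting; Munro's function is rescaled from (-1, 1) onto (0, 1) so the comparison with the sigmoid is fair):

```python
import math

def munro(x):
    # Munro's f(x) = x / (1 + |x|), rescaled from (-1, 1) to (0, 1)
    return 0.5 * (1.0 + x / (1.0 + abs(x)))

def logistic(x):
    # the standard sigmoid, for comparison
    return 1.0 / (1.0 + math.exp(-x))

for x in (-4.0, -1.0, 0.0, 1.0, 4.0):
    print("x=%+.1f  munro=%.3f  logistic=%.3f" % (x, munro(x), logistic(x)))
```

Both pass through 0.5 at the origin, but the rescaled Munro function approaches its asymptotes much more slowly: 0.9 at x = 4, where the logistic is already near 0.98.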

John Nagle

------------------------------

Date: 10 Aug 88 17:46:01 GMT
From: amdahl!pyramid!prls!philabs!aecom!krishna@ames.arpa (Krishna Ambati)
Subject: Re: Sigmoid transfer function

In a previous article, John B. Nagle writes:
> In article <25516@ucbvax.BERKELEY.EDU> munro@icsia.UUCP (Paul Munro) writes:
> >
> >Try this one : f(x) = x / (1 + |x|)
> >
>
> The graph looks OK, although some scaling is needed to make it
> comparable
> to the sigmoid. Someone should try it in one of the popular neural net
> simulators and see how the results change.
>
> John Nagle




I did try it out in a Traveling Salesman Problem simulation using
the Hopfield-Tank model. Unfortunately, it yields pretty poor results,
probably because it does not rise quickly in the middle region and,
furthermore, its convergence to 1 (after scaling) is pretty slow.
I would be happy to hear more positive results.
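The slow convergence is easy to quantify (my arithmetic, not from the posting): for the rescaled x / (1 + |x|) to reach an output of 0.99 takes x = 49, while the logistic sigmoid gets there at x = ln 99, about 4.6.

```python
import math

# x/(1+|x|) rescaled to (0,1): 0.5*(1 + x/(1+x)) = 0.99  =>  x/(1+x) = 0.98
x_munro = 0.98 / 0.02        # x = 49
# logistic: 1/(1+exp(-x)) = 0.99  =>  x = ln(0.99/0.01) = ln 99
x_logistic = math.log(99.0)  # about 4.6
print(x_munro, x_logistic)
```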


Krishna Ambati
krishna@aecom.uucp

------------------------------

Date: 11 Aug 88 16:14:21 GMT
From: glacier!jbn@labrea.stanford.edu (John B. Nagle)
Subject: Re: Sigmoid transfer function

In article <1960@aecom.YU.EDU> krishna@aecom.YU.EDU (Krishna Ambati) reports
that

f(x) = x / (1 + |x|)

is a poor transfer function for neural net units, not rising steeply enough
near the transition point. This seems reasonable.

What we may need is something that looks like a step function fed through
a low-pass filter. The idea is to come up with a function that works but
can be computed with less hardware (analog or digital) than the sigmoid.
Try again?
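One cheap candidate in that spirit (a sketch, not from the thread): a linear ramp hard-limited at 0 and 1, i.e. a step function whose corner has been "filtered" into a finite slope. It is monotonic, rises steeply in the middle, and costs only a multiply, an add, and two clamps:

```python
def ramp(x, slope=0.25):
    # linear for |x| < 1/(2*slope), clamped to [0, 1] outside that band
    return min(1.0, max(0.0, 0.5 + slope * x))

for x in (-4.0, -1.0, 0.0, 1.0, 4.0):
    print("x=%+.1f  ramp=%.2f" % (x, ramp(x)))
```

With slope = 0.25 it matches the logistic sigmoid's value and derivative at the origin; whether the hard corners at x = +/-2 hurt in practice is exactly what a run in one of the simulators would show.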

John Nagle

------------------------------

Date: 11 Aug 88 20:09:22 GMT
From: phri!cooper!gene@nyu.edu (Gene (the Spook) )
Subject: Re: Sigmoid transfer function

in article <17615@glacier.STANFORD.EDU>, John B. Nagle says:
> Xref: cooper sci.electronics:2943 comp.ai:1519 comp.ai.neural-nets:158
>
>
> Recognize that the transfer function in a neural network threshold unit
> doesn't really have to be a sigmoid function. It just has to look roughly
> like one. The behavior of the net is not all that sensitive to the
> exact form of that function. It has to be continuous and monotonic,
> reasonably smooth, and rise rapidly in the middle of the working range.
> The trigonometric form of the transfer function is really just a notational
> convenience.
>
> It would be a worthwhile exercise to come up with some other forms
> of transfer function with roughly the same graph, but better matched to
> hardware implementation. How do real neurons do it?

Oooh, yeah! Why not make a differential amplifier out of two transistors
or so? Just look up how ECL gates are constructed to get the basic design.
If you want, you can take a basic op amp and "compare" an input reference
of the median voltage, then scale up/down and level shift if necessary.
I'm assuming that you mostly care about just the three input levels you
mentioned. Try this:

Vi = -oo Vo = 0.0
Vi = 0.0 Vo = 0.5
Vi = +oo Vo = 1.0

so take an op amp and run it off of, say, +-10V.
With a gain of around +10, an absolute value of around 1V will saturate
the output at the respective supply rail. For all practical purposes,
you'll have a linear output within +-1.0V, with maximum output being the
supply voltages. To increase the linear "spread", lower the gain; to
decrease it, increase the gain.

So fine, that'll get you a +-10V output. Now use two resistors to make a
voltage divider. A 1k and 9k will give you a /10 divider, now giving you
a +-1.0V output, for example. Use a trimmer if you want to and get the
right voltage swing for your purposes. In your case, a /20 divider will
get you a +-0.5V swing. Use a second op amp as a level shifter, set the
shift at 0.5V, and voila! Now you have a 0.0 to 1.0 voltage swing!

If that's acceptable for your purposes, fine. If you want to "soften" the
corners, use a pair of inverse-parallel diodes, which will start to
conduct as you get near their corner- or knee-voltage (V-sub-gamma).
In short, just play around with whatever comes to mind, and see if it
suits your purpose. Have fun!
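The gain / divider / level-shift chain above can be sanity-checked with idealized arithmetic (my numbers for the /20 case, treating the op amp as a perfect clipper at the rails):

```python
GAIN = 10.0     # gain-of-10 stage
RAIL = 10.0     # +/-10 V supplies; the op-amp output clips here

def stage(vin):
    # linear within about +/-1 V of input, saturated at the rails beyond
    return max(-RAIL, min(RAIL, GAIN * vin))

def vout(vin):
    # /20 divider, then a +0.5 V level shift
    return stage(vin) / 20.0 + 0.5

for vin in (-2.0, -0.5, 0.0, 0.5, 2.0):
    print("Vi=%+.1f V  Vo=%.2f V" % (vin, vout(vin)))
```

This reproduces the three target points: a large negative input gives 0.0, zero gives 0.5, and a large positive input gives 1.0.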

Spookfully yours,
Gene

...!cmcl2!cooper!gene

------------------------------

Date: 11 Aug 88 21:30:52 GMT
From: ankleand@athena.mit.edu (Andy Karanicolas)
Subject: Re: Sigmoid transfer function (long)


THE VIEWS AND OPINIONS HERE ARE NOT NECESSARILY THOSE OF M.I.T.

Here is a schematic of a circuit that should perform the "sigmoid" transfer
function talked about. The op-amp could be replaced with current mirrors to
perform a subtraction but this circuit is easier (to draw!). I'm sure there
are plenty of other (better, simpler) ways to accomplish the task. Maintaining
voltage as the analog variable adds to circuit complexity.

* PLEASE, NO FLAMES; THIS IS JUST A SUGGESTION *

[The original posting contained an ASCII schematic here whose alignment
was lost in reproduction. As described below, it showed an NPN
differential pair Q1/Q2 with emitter pot RP0 and collector loads R0
from VC1; op-amp A1 (powered from +/-VC2) inverting VIN; op-amp A2
wired as a differencing amplifier through R1 networks, with reference
pot RP1 to VC1 setting VREF and output pot RP2 driving VOUT; and a
Q3/Q4 current source supplying IX (set by R3 and R4) to the diff
pair's emitter node, returned to -VC2.]

The transfer function of this circuit is:

VOUT ~= B2 * { B1 * VC1 + IX * R0 * TANH[ VIN / VTH ] }

where VTH = kT/q and is about 25mV at room temp.

The constants B2 and B1 are less than unity and are set by potentiometers
RP2 and RP1 respectively.

Circuit description:

Q1 and Q2 form a differential amplifier that provides the tanh
function. The potentiometer RP0 helps to equalize transistor
mismatches. RP0 should be as small as possible to maintain the
tanh function of this amplifier. Choosing a large RP0 will cut
down the gain at the midpoint of the 'S'; the tanh function gets
'linearized' and the above transfer equation becomes invalid.


The op-amp A1 provides an inverted version of the input voltage;
together with the input itself, the input to the diff. amp is a
differential-mode signal (within component tolerances) equal to
2 * VIN. The input should come from a low-impedance source, or an
input buffer will be needed.

The op-amp A2 is configured in a differencing mode. The transfer
function of this amplifier is: VOUT = V1 - V2 + VREF. The
adjustable reference VREF sets the constant B1, and the attenuation
pot. on the output of A2 sets the constant B2.

Q3 and Q4 form a current source IX. It can be replaced by a simple
resistor but the current source should help maintain the tanh
function for large input signals. IX is set by R3, R4 and VC2.

One design example:

VC1 = 5V
VC2 = 15V (typical supply is +5, +15, -15)
set IX * R0 = 0.5
set B1 * VC1 = 0.5 (VREF = 0.5)
use RP0 = 25 ohms
set IX = 0.5mA so that R0 = 1K
to cut down loading effects, set R1 = 47K (arbitrarily)
set RP1 to 1K (much smaller than R1)
set RP2 to 1K
for IX = 0.5mA, set R3 = 100 ohms; R4 then is about (15 - .65)/.5mA
set R4 to 27K
the choice of R2 is not critical; use 1K
Q1-4 can be standard 2N2222 or 2N3904 NPN's
A1 and A2 can be LM301s (or, UGHH!!, 741's even..)
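Plugging the design values into the transfer equation shows the output swinging from roughly 0 to 1 V (assuming the output pot RP2 is set for B2 = 1):

```python
import math

VTH = 0.025      # kT/q at room temperature, in volts
IX_R0 = 0.5      # IX * R0 per the design example
B1_VC1 = 0.5     # B1 * VC1 per the design example (VREF = 0.5)
B2 = 1.0         # assumed: RP2 set for unity

def vout(vin):
    return B2 * (B1_VC1 + IX_R0 * math.tanh(vin / VTH))

for vin in (-0.2, -0.025, 0.0, 0.025, 0.2):
    print("VIN=%+.3f V  VOUT=%.3f V" % (vin, vout(vin)))
```

Note how narrow the linear region is: the 'S' spans only a few times VTH, i.e. roughly +/-100 mV around zero.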


Have fun and good luck!

Andy Karanicolas
Microsystems Technology Laboratory
ankleand@caf.mit.edu

------------------------------

End of AIList Digest
********************
