Machine Learning List: Vol. 4 No. 20
Friday, Oct 2, 1992

Contents:

ML93 format
UCI Machine Learning Repository
ECML93 Workshop
MORGAN KAUFMANN PUBLISHERS special offer
MIT Press Books on Machine Learning
SUMMARY: Multi-Algorithm Machine Learning Systems


The Machine Learning List is moderated. Contributions should be relevant to
the scientific study of machine learning. Mail contributions to ml@ics.uci.edu.
Mail requests to be added or deleted to ml-request@ics.uci.edu. Back issues
may be FTP'd from ics.uci.edu in pub/ml-list/V<X>/<N> or N.Z where X and N are
the volume and number of the issue; ID: anonymous PASSWORD: <your mail address>

----------------------------------------------------------------------

Date: Mon, 28 Sep 92 20:42:28 EDT
From: utgoff%zinc@cs.umass.EDU
Subject: ML93 format

The format for ML93 will be four (4) invited talks, plus a mixture
of parallel and plenary paper sessions. All accepted papers will be
allocated eight (8) proceedings pages and a thirty (30) minute talk,
parallel or plenary. There will be no posters. See call-for-papers
for additional information.

Paul Utgoff, chair ML93

------------------------------

To: ml@focl.ICS.UCI.EDU
Subject: UCI Machine Learning Repository
From: "Patrick M. Murphy" <pmurphy@focl.ICS.UCI.EDU>

Based on responses to a previous posting of updates to the repository,
I'm posting information on how to access the repository via ftp and
a brief overview of the contents.

- Patrick

============================================================================
This is the UCI Repository Of Machine Learning Databases and Domain Theories
23 September 1992
ics.uci.edu: pub/machine-learning-databases
Site Librarian: Patrick M. Murphy (ml-repository@ics.uci.edu)
Off-Site Assistant: David W. Aha (aha@insight.cs.jhu.edu)
83 databases and domain theories (30000K)
============================================================================

This database contains data sets, domain theories and a few misc. items.

The contents of this repository can be remotely copied to other network
sites via ftp to ics.uci.edu. Enter "anonymous" for userid, and e-mail
address (user@host) for password. These databases can be found by
executing "cd pub/machine-learning-databases".

_____________________________________________________________________
Brief Overview of Databases and Domain Theories:

Quick Listing:
1. annealing (David Sterling and Wray Buntine)
2-3. audiology (Ray Bareiss and Bruce Porter, used in Protos)
     1. Original Version
     2. Standardized-Attribute Version of the Original.
4. autos (Jeff Schlimmer)
5. breast-cancer (Ljubljana Institute of Oncology, restricted access)
6. breast-cancer-wisconsin (Wisconsin Breast Cancer D'base, Olvi Mangasarian)
7. bridges (Yoram Reich)
8-15. chess
     1. Partial generator of Quinlan's chess-end-game data (kr-vs-kn) (Schlimmer)
     2. Shapiro's endgame database (kr-vs-kp) (Rob Holte)
     3-8. Six domain theories (Nick Flann)
16-17. Credit Screening Database
     1. Japanese Credit Screening Data and domain theory (Chiharu Sano)
     2. Credit Card Application Approval Database (Ross Quinlan)
18. Ein-Dor and Feldmesser's cpu-performance database (David Aha)
19. dgp-2 data generation program (Powell Benedict)
20. Nine small EBL domain theories and examples in sub-directory ebl
21. Evlin Kinney's echocardiogram database (Steven Salzberg)
22. flags (Richard Forsyth)
23. function-finding (Cullen Schaffer's 352 case studies)
24. glass (Vina Spiehler)
25. hayes-roth (from Hayes-Roth & Hayes-Roth's paper)
26-29. heart-disease (Robert Detrano)
30. hepatitis (G. Gong)
31. horse colic database (Mary McLeish & Matt Cecile)
32. Image segmentation database (Carla Brodley)
33. ionosphere information (Vince Sigillito)
34. iris (R.A. Fisher, 1936)
35. kinship (J. Ross Quinlan)
36. labor-negotiations (Stan Matwin)
37-38. led-display-creator (from the CART book)
39. lenses (Cendrowska's database donated by Benoit Julien)
40. letter-recognition database (created and donated by David Slate)
41. liver-disorders (BUPA Medical's database donated by Richard Forsyth)
42. logic-theorist (Paul O'Rorke)
43. lung cancer (Stefan Aeberhard)
44. lymphography (Ljubljana Institute of Oncology, restricted access)
45-46. mechanical-analysis (Francesco Bergadano)
     1. Original Mechanical Analysis Data Set
     2. Pumps Data Set
47-48. molecular-biology
     1. promoter sequences (Towell, Shavlik, & Noordewier; domain theory also)
     2. splice-junction sequences (Towell, Noordewier, & Shavlik; domain
        theory also)
49. mushroom (Jeff Schlimmer)
50. othello domain theory (Tom Fawcett)
51. Pima Indians diabetes diagnoses (Vince Sigillito)
52. Primary Tumor (Ljubljana Institute of Oncology, restricted access)
53. Quadruped Animals (John H. Gennari)
54. shuttle-landing-control (Bojan Cestnik)
55. solar flare (Gary Bradshaw)
56-57. soybean (from Ryszard Michalski's groups)
58. spectrometer (Infra-Red Astronomy Satellite Project Database, John Stutz)
59. tic-tac-toe endgame database (Turing Institute, David W. Aha)
60-69. thyroid-disease (Garavan Institute, J. Ross Quinlan; Stefan Aeberhard)
70-77. Undocumented databases: sub-directory undocumented
     1. Information retrieval (IR) data collection (David Lewis)
     2. Economic sanctions database (domain theory included, Mike Pazzani)
     3. Cloud cover images (Philippe Collard)
     4. DNA secondary structure (Qian and Sejnowski, donated by Vince Sigillito)
     5. Nettalk data (Sejnowski and Rosenberg, taken from connectionist-bench)
     6. Sonar data (Gorman and Sejnowski, taken from connectionist-bench)
     7. Protein folding data (see connectionist-bench)
     8. Vowel data (Qian and Sejnowski, taken from connectionist-bench)
78. university (Michael Lebowitz, donated by Steve Souders)
79. voting-records (Jeff Schlimmer)
80-81. waveform domain (taken from CART book)
82. Wine Recognition Database (donated by Stefan Aeberhard)
83. Zoological database (Richard Forsyth)


------------------------------

Date: Thu, 1 Oct 92 12:32:43 EDT
From: spears@aic.nrl.navy.MIL
Subject: ECML93 Workshop

CALL FOR PAPERS
Workshop on ``Foundations of Evolutionary Computation''
To be held after ECML93
Thursday April 8, 1993 Vienna, Austria

Evolutionary computation refers to the simulated evolution of
structures based on their performance in an environment. A variety
of evolutionary computation approaches have emerged in the last few
decades, including "evolutionary programming" (Fogel, 1966),
"evolution strategies" (Rechenberg, 1973), "genetic algorithms"
(Holland, 1975), and "genetic programming" (de Garis, 1990; Koza, 1990).
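
Since the call itself contains no code, the following minimal Python
sketch (illustrative only; the bit-string encoding and one-max fitness
below are arbitrary stand-ins, not drawn from any of the cited works)
shows the loop these approaches share: evaluate a population of
structures, select the fitter ones, and vary them.

  import random

  def fitness(bits):
      # Toy objective: count the 1-bits in the structure.
      return sum(bits)

  def evolve(pop_size=20, length=16, generations=50, p_mut=0.05):
      pop = [[random.randint(0, 1) for _ in range(length)]
             for _ in range(pop_size)]
      for _ in range(generations):
          pop.sort(key=fitness, reverse=True)
          parents = pop[:pop_size // 2]          # truncation selection
          children = []
          while len(children) < pop_size:
              a, b = random.sample(parents, 2)
              cut = random.randrange(1, length)  # one-point crossover
              child = a[:cut] + b[cut:]
              child = [1 - g if random.random() < p_mut else g
                       for g in child]           # bitwise mutation
              children.append(child)
          pop = children
      return max(pop, key=fitness)

  print(fitness(evolve()))  # should approach the maximum of 16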

The goal of this workshop is to focus on the more general topic
of evolutionary computation and to draw researchers from diverse
areas to discuss its foundations. The topic of this workshop is a
unifying theme for researchers working in the different evolutionary
computation approaches. It will also be of interest to related
research communities, such as artificial life. The workshop
encourages papers on the following topics:

- Theories of evolutionary computation. The theories should contrast
  and compare different evolutionary computation approaches, such as
  genetic algorithms, evolution strategies, evolutionary programming,
  and genetic programming.

- Comparisons of different evolutionary computation approaches on
  machine learning tasks. The comparisons may be theoretical and/or
  experimental.

Please send 4 hard copies of a paper (10-15 double-spaced pages,
ECML-93 format) or (if you do not wish to present a paper) a
description of your current research to:

Commanding Officer
Naval Research Laboratory
Code 5510, Attn: William M. Spears
4555 Overlook Avenue, SW
Washington, DC 20375-5320

Email submissions to spears@aic.nrl.navy.mil are also acceptable,
but they must be in PostScript. FAX submissions will not be
accepted. If you have any questions about the workshop, please send
email to William M. Spears at spears@aic.nrl.navy.mil or call
202-767-9006.

Important Dates (all deadlines will be strict):

January 11 - Papers and research descriptions due
February 1 - Acceptance notification
February 22 - Final version of papers due

Program Committee:

William M. Spears, Naval Research Laboratory (USA, chair)
Kenneth A. De Jong, George Mason University (USA, co-chair)
Gilles Venturini, Universite de Paris-Sud (France, co-chair)
Diana F. Gordon, Naval Research Laboratory (USA)
David Fogel, ORINCON Corporation (USA)
Hugo de Garis, Electro Technical Lab (Japan)
Thomas Baeck, University of Dortmund (Germany)

------------------------------

Date: Fri, 2 Oct 92 16:30:37 PDT
From: Morgan Kaufmann <morgan@unix.sri.COM>
Subject: MORGAN KAUFMANN PUBLISHERS special offer


MORGAN KAUFMANN PUBLISHERS

MACHINE LEARNING LIST SPECIAL OFFER

PRICES SHOWN REFLECT 10% DISCOUNT
GOOD THROUGH OCTOBER 31, 1992



C4.5: PROGRAMS FOR MACHINE LEARNING, by J. Ross Quinlan
(University of Sydney), October 1992,
Book only: ISBN 1-55860-238-0; $40.46
Book & Software: ISBN 1-55860-240-2; $62.96

Classifier systems play a major role in machine learning and
knowledge-based systems, and Ross Quinlan's work on ID3 and C4.5 is
widely acknowledged to have made some of the most significant
contributions to their development. This book is a complete guide to
the C4.5 system as implemented in C for the UNIX environment. It
contains a comprehensive guide to its use, the source code (about
8,800 lines), and implementation notes. The source code and sample
datasets are also available on a 3.5-inch floppy diskette for a Sun
workstation.

C4.5 starts with large sets of cases belonging to known classes. The
cases, described by any mixture of nominal and numeric properties, are
scrutinized for patterns that allow the classes to be reliably
discriminated. These patterns are then expressed as models, in the form
of decision trees or sets of if-then rules, that can be used to classify
new cases, with emphasis on making the models understandable as well as
accurate. The system has been applied successfully to tasks involving
tens of thousands of cases described by hundreds of properties. The book
starts from simple core learning methods and shows how they can be
elaborated and extended to deal with typical problems such as missing
data and overfitting. Advantages and disadvantages of the C4.5 approach
are discussed and illustrated with several case studies.
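
The book ships Quinlan's actual C source; purely as a rough Python
sketch of the core idea (choosing the attribute whose test most
reduces class entropy), and not the C4.5 code itself:

  from math import log2
  from collections import Counter

  def entropy(cases):
      # Class entropy of a set of cases; each case is a dict with a
      # "class" key plus nominal attributes.
      counts = Counter(c["class"] for c in cases)
      n = len(cases)
      return -sum((k / n) * log2(k / n) for k in counts.values())

  def information_gain(cases, attr):
      # Entropy reduction from splitting the cases on a nominal attribute.
      n = len(cases)
      by_value = {}
      for c in cases:
          by_value.setdefault(c[attr], []).append(c)
      remainder = sum(len(s) / n * entropy(s) for s in by_value.values())
      return entropy(cases) - remainder

  cases = [
      {"outlook": "sunny",    "windy": "no",  "class": "dont_play"},
      {"outlook": "sunny",    "windy": "yes", "class": "dont_play"},
      {"outlook": "rain",     "windy": "no",  "class": "play"},
      {"outlook": "rain",     "windy": "yes", "class": "dont_play"},
      {"outlook": "overcast", "windy": "no",  "class": "play"},
  ]
  print(max(["outlook", "windy"], key=lambda a: information_gain(cases, a)))

C4.5 itself refines this with the gain-ratio criterion, numeric
thresholds, and pruning, covered in the chapters listed below.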

This book and software should be of interest to developers of
classification-based intelligent systems and to students in machine
learning and expert systems courses.

Introduction * Constructing Decision Trees * Unknown Attribute Values *
Pruning Decision Trees * From Trees to Rules * Windowing * Grouping
Attribute Values * Interacting with Classification Models * Guide to
Using the System * Limitations * Desirable Additions * Appendix *
References & Bibliography * Author Index * Subject Index *



ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 4, Proceedings of the
1991 conference, edited by John E. Moody (Yale University), Steven J.
Hanson (Siemens Research Center), and Richard P. Lippmann (MIT Lincoln
Laboratory)

Volume 4: April 1992, 1187 pages; cloth; ISBN 1-55860-222-4; $44.96
Volume 3: 1991, 1200 pages; cloth; ISBN 1-55860-184-8; $44.96
Volume 2: 1990; 853 pages; cloth; ISBN 1-55860-100-7; $35.96
Volume 1: 1989; 819 pages; cloth; ISBN 1-55860-015-9; $35.96

Research in neural networks comprises results from many disciplines.
These volumes contain the collected papers from the premier forum for
neural networks research, the IEEE Conference on Neural Information
Processing Systems - Natural and Synthetic, held annually in Denver,
Colorado. Papers are rigorously evaluated for scientific merit and
revised for publication. Topics from these books include: rules and
connectionist models, speech, vision, neural network dynamics,
neurobiology, computational complexity issues, fault tolerance in neural
networks, benchmarking and comparing neural network applications,
architectural issues, fast training techniques, VLSI, control,
optimization, statistical inference, and genetic algorithms.



MACHINE LEARNING: PROCEEDINGS OF THE NINTH INTERNATIONAL WORKSHOP (1992),
edited by Derek Sleeman and Peter Edwards (Both of Aberdeen University),
June 1992; ISBN 1-55860-247-X; 488 pages; $35.96

MACHINE LEARNING: PROCEEDINGS OF THE EIGHTH INTERNATIONAL WORKSHOP
(1991), edited by Lawrence Birnbaum and Gregg Collins (Both of
Northwestern University), June 1991, ISBN 1-55860-200-3; $35.96



READINGS IN MODEL-BASED DIAGNOSIS edited by Walter Hamscher, Luca Console
and Johan de Kleer, July 1992, ISBN 1-55860-249-6, 520 pages; $40.46

Automated diagnosis has always been an important AI problem not only for
its potential practical applications but because it exposes issues common
to all automated reasoning efforts and presents real challenges to
existing paradigms. Diagnosis is probably the single largest category
of expert systems in use, a substantial fraction of which are concerned
with the diagnosis of engineered systems and devices.

This Readings book is about new artificial intelligence techniques for
the diagnosis of engineered systems based on a general purpose model of
the internal structure and behavior of the target device. These general
purpose models can be constructed using standard AI technologies such as
predicate logic, frames, constraints and rules. Complementing the
modeling technology are algorithms for diagnosis that are also based on
standard AI techniques such as theorem proving, heuristic search,
qualitative simulation, and Bayes nets. The 42 papers reprinted in the
volume reflect the recent maturation of the field and include the most
seminal and frequently referenced sources. The editors have provided
introductions to the papers and an annotated bibliography of over 350
works.

Readings in Model-based Diagnosis will be of interest to a wide range of
professionals in the AI and engineering communities concerned with the
building of diagnostic systems and with understanding the technology
underlying them.


* Introduction to Model-Based Diagnosis * Logical Foundations * The
General Diagnostic Engine * Fault Models * Analog Systems * Diagnosing
Devices with State * Hierarchies * Relaxation of Diagnostic Assumptions
* Probabilistic Approaches * Annotated Bibliography * Credits * Authors
Index * Subject Index *



PARADIGMS OF ARTIFICIAL INTELLIGENCE PROGRAMMING: CASE STUDIES IN COMMON
LISP, by Peter Norvig (University of California at Berkeley), October
1991, ISBN 1-55860-191-0; 946 pages; $40.46

"This book will become one of the classics. It goes further than any
other text currently available for teaching advanced Lisp coding
techniques and giving the reader a perspective on AI programming. This
is definitely the best text I have seen...I think this book will be a
classic for Common Lisp as Abelson and Sussman's is for Scheme."
- David C. Loeffler (MCC)

"...I think (this book's) appeal should be tremendous because the student
studies and builds real, working programs. (It) contains far and away the
best example (programs) I have seen to date."
- Prof. Bruce D'Ambrosio (Oregon State University)

"(This is) a book that I can recommend to experienced software developers
that they will use and enjoy...the honesty and matter-of-fact style will
appeal to those who have `seen it all' in software development..."
- Mary Boelk (Johnson Controls)

Paradigms of AI Programming is the first text to teach advanced Common
Lisp techniques in the context of building major AI systems.

By reconstructing authentic, complex AI programs using state-of-the-art
Common Lisp, the book teaches students and professionals how to build and
debug robust practical programs, while demonstrating superior programming
style and important AI concepts.

The author strongly emphasizes the practical, performance issues of
writing real working programs of significant size, including chapters on
troubleshooting and efficiency. Also included is a discussion of the
fundamentals of object-oriented programming and a description of the main
CLOS functions. This volume is an excellent text for a course on AI
programming, a useful supplement for general AI courses and an
indispensable reference for the professional programmer.

Preface * Introduction to Lisp * A Simple Lisp Program * Overview of Lisp
* GPS: The General Problem Solver * Eliza: Dialog with a Machine *
Building Software Tools * Student: Solving Algebra Word Problems *
Symbolic Mathematics: A Simplification Program * Efficiency Issues *
Low-Level Efficiency Issues * Logic Programming * Compiling Logic Programs
* Object-Oriented Programming * Knowledge Representation and Reasoning
* Symbolic Mathematics and Canonical Forms * Expert Systems * Line
Diagram Labeling by Constraint Satisfaction * Search and the Game of
Othello * Introduction to Natural Language * Unification Grammars * A
Grammar of English * Scheme: An Uncommon Lisp * Compiling LISP * ANSI
Common Lisp * Troubleshooting * Appendix * Bibliography * Index *



MACHINE LEARNING: A THEORETICAL APPROACH, by Balas K. Natarajan
(Hewlett-Packard Laboratories), July 1991, ISBN 1-55860-148-1; $38.66

This is the first comprehensive introduction to computational learning
theory. The author's uniform presentation of fundamental results and
their applications offers AI researchers a theoretical perspective on
the problems they study. The book presents tools for the analysis of
probabilistic models of learning, tools that crisply classify what is
and is not efficiently learnable. After a general introduction to
Valiant's PAC paradigm and the important notion of the Vapnik-
Chervonenkis dimension, the author explores specific topics such as
finite automata and neural networks. The presentation is intended for
a broad audience; the author's ability to motivate and pace discussions
for beginners has been praised by reviewers. Each chapter contains
numerous examples and exercises, as well as a useful summary of
important results. An excellent introduction to the area, suitable
either for a first course, or as a component in general machine learning
and advanced AI courses. Also an important reference for AI researchers.

Introduction * Learning Concepts on Countable Domains * Time-Complexity
of Concept Learning * Learning Concepts on Uncountable Domains * Learning
Functions * Finite Automata * Neural Networks * Generalizing the Learning
Model * Conclusion * Notation * Bibliography



FOUNDATIONS OF GENETIC ALGORITHMS, Edited by Gregory J. E. Rawlins
(Indiana University), July 1991, ISBN 1-55860-170-8; $41.36

Genetic Algorithms (GAs) are becoming an important tool in machine
learning research. GAs have been applied to problems such as design of
semiconductor layout and factory control, and have been used in AI
systems and neural networks to model processes of cognition such as
language processing and induction. They are the principal heuristic
search method of classifier systems and they have been used on NP-hard
combinatorial optimization problems. Although much is known about their
basic behavior, there are many aspects of GAs that have not been
rigorously defined or studied formally. This book addresses the need
for a principled approach to understanding the foundations of genetic
algorithms and classifier systems as a way of enhancing their further
development and application. Each paper presents original research, and
most are accessible to anyone with general training in computer science
or mathematics. This book will be of interest to a variety of fields
including machine learning, neural networks, theory of computation,
mathematics and biology.

Contributors: Clayton L. Bridges, David E. Goldberg, Yuval Davidor,
Carol A. Ankenbrandt, Kalyanmoy Deb, Gilbert Syswerda, Steven Y.
Goldsmith, Delores M. Etter, Robert E. Smith, Thomas H. Westerdale,
Gunar E. Liepins, Lashon Booker, Rick Riolo, John R. Koza, H. James
Antonisse, Alden H. Wright, Ping-Chung Chi, Petrus Handoko, Darrell
Whitley, J. David Schaffer, Larry J. Eshelman, Daniel Offutt, Piet
Spiessens, Jon T. Richardson, David L. Battle, Michael D. Vose,
Stephanie Forrest, Melanie Mitchell, John Grefenstette, Larry J.
Eshelman, Barry R. Fox, Mary Beth McMahon, Kenneth De Jong, William
Spears, Heinz Muehlenbein



CONCEPT FORMATION: KNOWLEDGE AND EXPERIENCE IN UNSUPERVISED LEARNING,
Edited by Douglas Fisher (Vanderbilt University) and Michael Pazzani
(University of California, Irvine), July 1991; ISBN 1-55860-201-1; $38.66

Concept formation lies at the center of learning and cognition. Unlike
much work in machine learning and cognitive psychology, research on this
topic focuses on the unsupervised and incremental acquisition of
conceptual knowledge. Recent work on concept formation addresses a
number of important issues. Foremost among these are the principles
of similarity that guide concept learning and retrieval in human and
machine, including the contribution of surface features, goals, and
`deep' features. Another active area of research explores mechanisms for
efficiently reorganizing memory in response to the ongoing experiences
that confront intelligent agents. Finally, methods for concept formation
play an increasing role in work on problem solving and planning,
developmental psychology, engineering applications, and constructive
induction. This book brings together results on concept formation from
cognitive psychology and machine learning, including explanation-based
and inductive approaches. Chapters from these differing perspectives are
intermingled to highlight the commonality of their research agendas.
In addition to cognitive scientists and AI researchers, the book will
interest data analysts involved in clustering, philosophers concerned
with the nature and origin of concepts, and any researcher dealing with
issues of similarity, memory organization, and problem solving.

Computational Models of Concept Learning * An Iterative Bayesian
Algorithm for Categorization * Representational Specificity and Learning
* Discrimination Net Models of Concept Formation * Concept Formation in
Structured Domains * Theory-Driven Concept Formation * Explanation-Based
Learning as Concept Formation * Some Influences of Instance Comparisons
in Concept Formation * Harpoons and Long Sticks: Theory and Similarity
in Rule Induction * Concept Formation over Problem-Solving Experiences
* Concept Formation in Context * The Formation and Use of Abstract
Concepts in Design * Learning to Recognize Movements * Representation
Generation in an Exploratory Learning System * Q-SOAR: A Computational
Account of Children's Learning About Number Conservation



GENETIC ALGORITHMS: PROCEEDINGS OF THE FOURTH INTERNATIONAL CONFERENCE
(1991), Edited by Richard K. Belew (Univ. of California, San Diego),
Lashon Booker (Naval Research Laboratory) and J. David Schaffer (Philips
Laboratories), July 1991; ISBN 1-55860-208-9; $35.96

Also Available: Proceedings of the Third Int'l. Conference (1989):
Edited by J. David Schaffer (Philips Laboratories), 1989, ISBN 1-55860-
066-3; 452 pages; $35.96

This volume contains the papers presented at the Fourth International
Conference on Genetic Algorithms. The papers, each written by a
leading researcher in the field, reflect the growth and diversity of
this field. Topics include: Holland's Genetic Algorithm and Classifier
Systems, machine learning and optimization using these systems,
relations to other learning paradigms (such as connectionist networks),
parallel implementations, related biological modeling issues and
practical applications.


CONNECTIONIST MODELS: PROCEEDINGS OF THE 1990 SUMMER SCHOOL WORKSHOP,
edited by David S. Touretzky (Carnegie Mellon University), Jeffrey L.
Elman (University of California, San Diego), Terrence J. Sejnowski (Salk
Institute) and Geoffrey E. Hinton (University of Toronto), 1990; 404
pages; paper; ISBN 1-55860-156-2; $31.46

1988 Proceedings also available: edited by David S. Touretzky (Carnegie
Mellon University), Geoffrey Hinton (University of Toronto), and
Terrence J. Sejnowski (Salk Institute), 1988; 527 pages; paper; ISBN
1-55860-035-3; $26.96

The Connectionist Models Summer Schools bring together distinguished
researchers and outstanding graduate students to evaluate current
research results at the forefront of connectionist models in neural
networks. The papers, rigorously selected from faculty and student
entries, have been updated and revised to incorporate workshop
discussions, as well as the authors' and editors' interaction. The
selections serve to summarize a wide variety of efforts in VLSI design,
optimization methods, learning theory, vision, speech, neuroscience,
linguistics, and cognitive psychology. This collection, like its
successful predecessor, will be a valuable reference for researchers and
students.



CASE BASED REASONING: PROCEEDINGS OF THE 1991 DARPA WORKSHOP, May 1991;
500 pages; paper; ISBN 1-55860-199-6; $36.00




COLT 1991: Proceedings of the Fourth Annual Workshop on Computational
Learning Theory, edited by Leslie Valiant (Harvard University) and
Manfred Warmuth (University of California, Santa Cruz), July 1991; 395
pages; Paper; ISBN 1-55860-213-5; $35.96



MACHINE LEARNING: AN ARTIFICIAL INTELLIGENCE APPROACH, VOLUME III, edited
by Yves Kodratoff (French National Scientific Research Council) and
Ryszard Michalski (George Mason University). June 1990; 825 pages;
Cloth; ISBN 1-55860-119-8; $49.46

General Issues * Empirical Learning Methods * Analytical Learning Methods
* Integrated Learning Systems * Subsymbolic Learning Systems * Formal
Analysis



READINGS IN MACHINE LEARNING, edited by Jude Shavlik (University of
Wisconsin, Madison) and Thomas Dietterich (Oregon State
University). June 1990; 853 pages; Paper; ISBN 1-55860-143-0; $40.46

General Aspects of Machine Learning * Inductive Learning Using
Pre-Classified Training Examples * Unsupervised Concept Learning and
Discovery * Improving the Efficiency of a Problem Solver * Using
Pre-Existing Domain Knowledge Inductively * Explanatory/Inductive Hybrids



COMPUTATIONAL MODELS OF SCIENTIFIC DISCOVERY AND THEORY FORMATION, edited
by Jeff Shrager (Xerox PARC) and Pat Langley (NASA Ames Research Center).
June 1990; 498 pages; Cloth; ISBN 1-55860-131-7; $38.66

Introduction * The Conceptual Structure of the Geologic Revolution * A
Unified Analogy Model of Explanation and Theory Formulation * An
Integrated Approach to Empirical Discovery * Deriving Basic Laws by
Analysis of Process and Equations * Theory Formation by
Abduction: Initial Results of a Case Study Based on the Chemical
Revolution * Diagnosing and Fixing Faults in Theories * Hypothesis
Formation as Design * Towards a Computational Model of Theory Revision
* Evaluation of KEKADA as an AI Program * Scientific Discovery in the Lay
Person * Designing Good Experiments to Test Bad Hypotheses * On Finding
the Most Probable Model * Commonsense Perception and the Psychology of
Theory Formation * Five Questions for Computationalists



COMPUTER SYSTEMS THAT LEARN: CLASSIFICATION AND PREDICTION METHODS FROM
STATISTICS, NEURAL NETS, MACHINE LEARNING AND EXPERT SYSTEMS, by Sholom
Weiss and Casimir Kulikowski (Both of Rutgers
University). October 1990; approx. 250 pages; Cloth; ISBN
1-55860-065-5; $40.46

Overview of Learning Systems * How to Estimate the True Performance of
a Learning System * Statistical Pattern Recognition * Neural Nets *
Machine Learning: Easily Understood Decision Rules * Which Technique is
Best? * Expert Systems



A GENERAL EXPLANATION-BASED LEARNING MECHANISM AND ITS APPLICATION TO
NARRATIVE UNDERSTANDING, by Raymond J. Mooney (University of Texas,
Austin). 1989; 243 pages; Paper; ISBN 1-55860-091-4, $26.96



EXTENDING EXPLANATION-BASED LEARNING BY GENERALIZING THE STRUCTURE OF
EXPLANATIONS, by Jude W. Shavlik (University of Wisconsin, Madison).
1990; 219 pages; Paper; ISBN 1-55860-109-0, $26.96.



MATHEMATICAL FOUNDATIONS OF LEARNING MACHINES, by Nils Nilsson (Stanford
University), with a New Introduction by Terrence J. Sejnowski (Salk
Institute) and Hal White (University of California, San Diego). 1990; 138
pages; Paper; ISBN 1-55860-123-6; $23.36

Introduction by Terrence J. Sejnowski and Hal White * Trainable Pattern
Classifiers * Some Important Discriminant Functions: Their Properties and
Their Implementations * Parametric Training Methods * Some Nonparametric
Training Methods for Learning Machines * Training Theorems * Layered
Machines * Piecewise Linear Machines * Appendix

=================================================================
MORGAN KAUFMANN PUBLISHERS
MACHINE LEARNING LIST SPECIAL OFFER
10% DISCOUNT THROUGH OCTOBER 31, 1992

ORDER FORM

Please send me the following books:
No. Copies Author/Title ISBN# Price


_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
_________________________________________________________________________
SUBTOTAL: ___________
CALIFORNIA RESIDENTS ADD APPROPRIATE SALES TAX: ___________
SHIPPING & HANDLING: ___________

U.S. __ add $3.50 for 1st book, $2.50 for each additional;
Foreign __ add $6.50 for 1st book; $3.50 for each additional:

TOTAL: _____________

Check enclosed _____ Charge my VISA _____ MasterCard _____

Account #__________________________________ Expires ___________

Signature ________________________________
Phone_______________________

Name as on card_____________________________________

Send books to:
__________________________________________________________
__________________________________________________________
__________________________________________________________
__________________________________________________________

Send form to:
Morgan Kaufmann, 2929 Campus Drive, Suite 260, Dept. E3
San Mateo, CA 94403. Telephone Orders: (800) 745-7323 (US and Canada),
415-578-9911, Fax: (415) 578-0672.


------------------------------

From: Brooke Stevens <bsteve@MIT.EDU>
Subject: MIT Press Books on Machine Learning
Date: Thu, 01 Oct 92 14:32:19


THE MIT PRESS

MACHINE LEARNING LIST OFFER

GENETIC PROGRAMMING: ON THE PROGRAMMING OF COMPUTERS BY MEANS OF NATURAL
SELECTION, by John R. Koza (Stanford University), November 1992, 840
pp., 270 illus., $55.00, ISBN 0-262-11170-5, bookcode KOZGH

Genetic programming may be more powerful than neural networks and other
machine learning techniques, able to solve problems in a wider range of
disciplines. John Koza shows how this remarkable paradigm works and
provides substantial empirical evidence that solutions to a great
variety of problems from many different fields can be found by
genetically breeding populations of computer programs. The book contains
a great many worked examples and includes sample computer code that
will allow readers to run their own programs.

ADAPTATION IN NATURAL AND ARTIFICIAL SYSTEMS: AN INTRODUCTORY ANALYSIS
WITH APPLICATIONS TO BIOLOGY, CONTROL, AND ARTIFICIAL INTELLIGENCE, by
John H. Holland (University of Michigan and Santa Fe Institute), May
1992, 228 pp., paper, $14.95, ISBN 0-262-58111-6, bookcode HOLPP/Cloth,
$30.00, ISBN 0-262-08213-6, bookcode HOLPH

Genetic algorithms are playing an increasingly important role in studies
of complex adaptive systems, ranging from adaptive agents in economic
theory to the use of machine learning techniques in the design of
complex devices such as aircraft turbines and integrated circuits.
Adaptation in Natural and Artificial Systems is the book that initiated
this field of study, presenting the theoretical foundations and
exploring applications.

DESIGNING AUTONOMOUS AGENTS: THEORY AND PRACTICE FROM BIOLOGY TO
ENGINEERING AND BACK, edited by Pattie Maes,
1991, 200 pp., paper, $19.95, ISBN 0-262-63135-0, bookcode MAEDP

Designing Autonomous Agents provides a summary and overview of the
radically different architectures that have been developed over the past
few years for organizing robots. These architectures have led to major
breakthroughs that promise to revolutionize the study of autonomous
agents and perhaps artificial intelligence in general.


EMERGENT COMPUTATION, edited by Stephanie Forrest, 1991, 450 pp., paper
$32.50, ISBN 0-262-56057-7, bookcode FOREP

These 31 essays define and explore the concept of emergent computation
in such areas as artificial networks, adaptive systems, classifier
systems, connectionist learning, other learning, and biological networks
to determine what properties are required of the supporting
architectures that generate them. Many of the essays share the themes of
design (how to construct such systems), the importance of preexisting
structure to learning and the role of parallelism, and the tension
between cooperative and competitive models of interaction. In the
introduction, Stephanie Forrest presents several detailed examples of
the kinds of problems emergent computation can address. Special Issues
of Physica D.

CELLULAR AUTOMATA: THEORY AND EXPERIMENT, edited by Howard Gutowitz,
1991, 488 pp., paper, $37.50, ISBN 0-262-57086-6, bookcode GUTCP

Cellular automata, dynamic systems in which space and time are discrete,
are yielding interesting applications in both the physical and natural
sciences. The thirty-four contributions in this book cover many aspects
of contemporary studies on cellular automata and include reviews,
research reports, and guides to recent literature and available
software. Chapters cover mathematical analysis, the structure of the
space of cellular automata, learning rules with specified properties;
cellular automata in biology, physics, chemistry, and computation
theory; and generalizations of cellular automata in neural nets, Boolean
nets, and coupled map lattices. Special Issues of Physica D.

TOWARD A PRACTICE OF AUTONOMOUS SYSTEMS: PROCEEDINGS OF THE FIRST
EUROPEAN CONFERENCE ON ARTIFICIAL LIFE, edited by Francisco J. Varela
and Paul Bourgine, 1992, 550 pp., paper, $55.00, ISBN
0-262-72019-1, bookcode VARTP

Artificial life embodies a recent and important conceptual step in
modern science: asserting that the core of intelligence and cognitive
abilities is the same as the capacity for living. These proceedings serve
two important functions: they address bottom-up theories of artificial
intelligence and explore what can be learned from simple models such as
insects about the cognitive processes and characteristic autonomy of
living organisms, while also engaging researchers and philosophers in an
exciting examination of the epistemological basis of this new trend.

FROM ANIMALS TO ANIMATS: PROCEEDINGS OF THE FIRST INTERNATIONAL
CONFERENCE ON SIMULATION OF ADAPTIVE BEHAVIOR, edited by Jean-Arcady
Meyer and Stewart W. Wilson, 1991, 562 pp., paper, $60.00
ISBN 0-262-63138-5, bookcode MEYAP

These 60 contributions from researchers in ethology, ecology,
cybernetics, artificial intelligence, robotics, and related fields delve
into the behaviors and underlying mechanisms that allow animals and,
potentially, robots to adapt and survive in uncertain environments. They
focus in particular on simulation models in order to help characterize
and compare various organizational principles or architectures capable
of inducing adaptive behavior in real or artificial animals.


KNOWLEDGE REPRESENTATION, edited by Ronald J. Brachman, Hector J.
Levesque, and Raymond Reiter, 1992, 416 pp., $29.00, ISBN 0-262-52168-7,
bookcode BRANH

Growing interest in symbolic representation and reasoning has pushed
this backstage activity into the spotlight as a clearly identifiable and
technically rich subfield in artificial intelligence. This collection of
extended versions of 12 papers from the First International Conference
on Principles of Knowledge Representation and Reasoning provides a
snapshot of the best current work in AI on formal methods and principles
of representation and reasoning. The topics range from temporal
reasoning to default reasoning to representations for natural language.
Special Issues of Artificial Intelligence.

FOUNDATIONS OF ARTIFICIAL INTELLIGENCE, edited by David Kirsh, 1992, 358
pp., $25.00, ISBN 0-262-61075-2, bookcode KIRFP

Have the classical methods and ideas of AI outlived their usefulness?
Foundations of Artificial Intelligence critically evaluates the
fundamental assumptions underpinning the dominant approaches to AI. In
the 11 contributions, theorists historically associated with each
position identify the basic tenets of their position. They discuss the
underlying principles, describe the natural types of problems and tasks
in which their approach succeeds, explain where its power comes from,
and what its scope and limits are. Theorists generally skeptical of
these positions evaluate the effectiveness of the method or approach and
explain why it works - to the extent they believe it does - and why it
eventually fails. Special Issues of Artificial Intelligence.

NEURAL NETWORKS AND NATURAL INTELLIGENCE, edited by Stephen Grossberg,
1988, 656 pp., paper, $27.50, ISBN 0-262-57091-2, bookcode GROEP

Stephen Grossberg and his colleagues at Boston University's Center for
Adaptive Systems are producing some of the most exciting research in the
neural network approach to making computers "think." Packed with
real-time computer simulations and rigorous demonstrations of these
phenomena, this book includes results on vision, speech, cognitive
information processing, adaptive pattern recognition, adaptive robotics,
conditioning and attention, cognitive-emotional interactions, and
decision making under risk.

NEURAL NETWORKS FOR CONTROL, edited by W. Thomas Miller, III, Richard S.
Sutton, and Paul J. Werbos, 1991, 542 pp., $52.50, ISBN 0-262-13261-3,
bookcode MILNH

Neural Networks for Control highlights key issues in learning control
and identifies research directions for practical solutions to control
problems in critical application domains. It brings together examples of
the most important paradigms along with evaluations of the possible
applications by experts in each application area. Special emphasis is
placed on designs based on optimization or reinforcement, which are
important in dealing with complex engineering challenges or real
biological control problems. Two of the editors are among the primary
developers of reinforcement learning in artificial neural networks.

NATURALLY INTELLIGENT SYSTEMS, Maureen Caudill and Charles Butler,
September 1992, 320 pp., $10.95 paper, ISBN 0-262-53113-5, bookcode
CAUNP

Naturally Intelligent Systems provides a technically accurate, yet
down-to-earth discussion of neural networks, clearly explaining the
underlying concepts of key neural network designs, how they are trained,
and why they work. Throughout, the authors present actual applications
that illustrate neural networks' utility in the real world.

____________________________________________________________________
THE MIT PRESS
ML_LIST Order Form


Please send me the following book(s):

Qty Author Bookcode Price
____ Brachman BRANH $29.00
____ Caudill CAUNP 10.95
____ Forrest FOREP 32.50
____ Grossberg GROEP 27.50
____ Gutowitz GUTCP 37.50
____ Holland HOLPH 30.00cloth
____ Holland HOLPP 14.95 paper
____ Kirsh KIRFP 25.00
____ Koza KOZGH 55.00
____ Maes MAEDP 19.95
____ Meyer MEYAP 60.00
____ Miller MILNH 52.50
____ Varela VARTP 55.00


___ Payment Enclosed ___ Purchase Order Attached

Charge to my ___ Master Card ___ Visa

Card# _______________________________

Exp.Date _______________

Signature _________________________________________________

_____ Total for books
$2.75 Postage for 1st book
_____ Please add 50c postage for each additional book
_____ Canadian customers Add 7% GST
_____ TOTAL due MIT Press

Send To:

Name ______________________________________________________

Address ___________________________________________________

City ________________________ State ________ Zip __________

Daytime Phone ________________ Fax ________________________

Make checks payable and send order to:
The MIT Press * 55 Hayward Street * Cambridge, MA 02142

For fastest service call (617) 625-8569
or toll-free 1-800-356-0343

ML-LIST

The MIT Guarantee: If for any reason you are not completely satisfied,
return your book(s) within ten days of receipt for a full refund or
credit.

------------------------------

Date: Wed, 23 Sep 1992 09:17:27 -0600
From: Dick Jackson <Dick_Jackson@qm.ibd.nrc.ca>
Subject: SUMMARY: Multi-Algorithm Machine Learning Systems



Patient readers,

What follows is a summary of replies to my request for information on
existing work on Multi-Algorithm Machine Learning systems. I would like
to thank all the kind folks whose replies I am summarizing here.
From my original posting...

Our Informatics group has been discussing the need for a software
system for doing Multivariate Analysis, primarily for classification
and clustering tasks, making techniques from different areas of Machine
Learning available to the user.

What I would like to know is: has something of this kind been developed
already? Many excellent individual machine-learning programs are
available from different sources, but has anyone made a system which
allows combination and comparison of different algorithms?
[...]


Summary of the Summary:
=======================
A few groups are in fact developing systems which incorporate different
machine learning algorithms. Of the ones I've heard about, the most
wide-ranging projects seem to be:
- the AIMS project of University of Illinois at Urbana-Champaign
- the PARIS project of James Cook University (Australia)
- the Machine Learning Toolbox project of many collaborators in the
European Community
- LIB LEARNERS of University of Sussex (England)
These systems cross the domains of inductive, statistical and
connectionist learning.

Other systems are mentioned with somewhat narrower scope:

Inductive:
     IND, ALEX, KDW
Connectionist:
     too many to mention; see the FAQ for comp.ai.neural-nets

Here are some excerpts from some of the replies I received,
which pertain to systems which incorporate diverse learning algorithms
(I also thank those who sent informative notes about other related work).

I regret that I was not yet able to get addresses or complete availability
information on all of these systems, but I think the time to post a summary
has come. I hope you find this information useful!

-Dick


========== Excerpts of replies: ==========
========================================

From: Stephen Lu <lu@kbesrl.me.uiuc.edu>:

[...]
We developed a machine learning TOOLBOX that contains multiple
inductive learning algorithms and, more importantly, uses multiple
objective optimization to automatically select the best algorithm and
its control parameters for a given learning problem.


[Other points from speaking with Stephen:
- their system is known as the Adaptive Interactive Modelling System (AIMS)
- it has decision tree, neural net and regression models at present
  (more modules can be, and are being, added)
- while this development was originally done in Lisp, most has now been
  converted to C]
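
As a rough illustration of that selection idea (this is not AIMS code;
every name below is hypothetical), choosing an algorithm and its
control parameters can be sketched as a scored search over candidate
configurations:

  from itertools import product

  def select_configuration(algorithms, score):
      # algorithms: {name: (train_fn, {param: [candidate values]})}
      # score: evaluates a (train_fn, params) pair on the learning
      # problem, e.g. by cross validation; higher is better.
      best = None
      for name, (train_fn, grid) in algorithms.items():
          for combo in product(*grid.values()):
              params = dict(zip(grid.keys(), combo))
              s = score(train_fn, params)
              if best is None or s > best[0]:
                  best = (s, name, params)
      return best   # (score, algorithm name, parameter setting)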

==========
From: Lothar WINKELBAUER <lothar@iiasa.ac.at>

[...]
Over the last year(s) I have been working on the development of a
multi-algorithm learning system. Up to now I have a first prototype
implemented which is so far restricted to working with TDIDT
(top-down induction of decision trees) learning algorithms. The
structure of the system is object-oriented, so there is no problem
extending it to other learning algorithms as well.


************************************
* ALEX *
* An Adaptive Learning Environment *
************************************

Lothar Winkelbauer
Advanced Computer Applications (ACA)
International Institute for Applied Systems Analysis (IIASA)
A-2361 Laxenburg, Austria


Abstract
========

The major bottleneck in the development and widespread use of expert systems
is knowledge acquisition, i.e., the transfer of domain-specific
knowledge from a human expert or another computer system to the machine.
Supporting knowledge acquisition by the machine itself,
automatic learning aims at quantitative and qualitative performance
improvement over more rigid, traditional approaches.

Instead of developing yet another learning algorithm,
an environment for automatic learning (ALEX) has been
designed and implemented in a modular fashion:
the system consists of an example generation module (i.e., a tutor software
system representing the application domain); the learning subsystem;
an analysis component; and the user interface and control structure
integrating these components.

As the core of the learning subsystem the incremental learning
algorithm ID-H has been developed, based on the incremental
application of hybrid clustering.
This extends Quinlan's ID3 concept in
terms of applicability to different problem domains and transparency and
concept-orientation of its internal representation of knowledge.

To improve the overall performance of the learning environment
a feedback loop between the results of a learning step and the input
of the next learning step has been introduced. The learning
environment can automatically direct its learning strategy
according to its assessment of its current performance from learning
step to learning step.

The learning domains to which ALEX has been applied are
river water quality management and urban air pollution control.

Future work will be directed to enhancing the feedback loop with
criteria for qualitative evaluation of the learning results,
and to applying ALEX to new learning tasks.

==========
From: simoudis@titan.rdd.lmsc.lockheed.com (Evangelos Simoudis)

[...]
Michalski and Kerschberg at George Mason University created such a
tool a couple of years ago. It was called INLEN.


==========
From: Peter Turney <peter@ai.iit.nrc.ca>

[...]
(1) Thomas Hoppe (hoppet@cs.tu-berlin.de) is maintaining and extending
a library of machine learning algorithms for the "Gesellcshaft fur
Informatik" at the Technical University of Berlin. This project started
in 1988. You may want to contact him.
[This is quite an extensive library of Prolog routines. -Dick]

[...]
(3) Wray Buntine has implemented a C program called IND that does several
different decision tree algorithms, including ID3.

The postscript file "About the IND Tree Package" may
be obtained by anonymous ftp from "cheops.cis.ohio-state.edu",
directory "pub/neuroprose", file "buntine.treecode.ps.Z".
Don't forget to use binary mode to transfer the file.
Wray Buntine's e-mail address is wray@ptolemy.arc.nasa.gov.

(4) Stefan Keller (keller@ifi.unizh.ch) is working on a project at the
Institute for Informatics of the University of Zurich, called
"portable AI Lab". The goal is to provide software to teach AI
to university students. Stefan is responsible for collecting
machine learning algorithms for the project. The project is at
least a year old by now.
[This is another extensive library of Lisp routines! -Dick]

[from a previous posting on the portable AI Lab:
>Version 2.1 is now available via anonymous ftp at the network address
>pobox.cscs.ch in the directory /pub/ai/pail-2.1. This directory
>contains a README and a compressed tar file of the whole system in
>source form. This is the standard version of the system. Copies taken
>from other ftp locations are not necessarily standard. The
>installation procedure should be self explanatory.
]


==========
From: Chris Thornton <christ@cogs.sussex.ac.uk>

At Sussex we have constructed a large package of the type you envisage
called LIB LEARNERS. At the moment it incorporates the following files,
which correspond to algorithms in a fairly obvious way.

id3.p
bayes_classifier.p
nearest_neighbours.p
quickprop.p
pdp_backprop.p
tl_backprop.p
tl_recurrent.p
conjgrad.p
cascade_correlation.p
hopfield_net.p
competitive_learning.p
back_propagation.p
schema_model.p
pattern_associator.p
kohonen_net.p
boltzmann_machine.p
ebg.p
classifier_system.p
focussing.p
perceptron.p
lms.p
wisard.p

It allows training sets in more or less any format to be fed into any of
these algorithms. It also provides a large range of analysis methods
often with graphical output. It is still a bit lacking in statistical
algorithms.

Unfortunately, it runs under Poplog so you have to get that as well to
run it.

Chris Thornton
School of Cognitive and Computing Sciences
University of Sussex
Falmer
Brighton
BN1 9QN

christ@cogs.susx.uk.ac

==========
From: elkan@cs.UCSD.EDU (Charles Elkan)

[...]
Alberto Segre at Cornell has an ongoing project to implement
speedup learning algorithms in a common environment, and maybe
also some induction algorithms. He is segre@cs.cornell.edu

There is a large multinational European project called "Machine
Learning Toolbox" led by Derek Sleeman of the University of Aberdeen,
project # P2154. Try robin@turing.ac.uk or mlt@csd.abdn.ac.uk.

Charles Elkan


==========
From: olivier@curacoa.cs.jcu.edu.au (Olivier de Vel)

[...]
In fact, we are developing a workbench-based system for classification
and clustering here at the Computer Science Department at the James
Cook University, Townsville, Queensland, Australia. The workbench is
called PARIS, for "Pattern Analysis and Recognition - an Interactive
System".

The system is being developed as part of ongoing research of several
postgraduate students, under the joint supervision of Dr. Danny Coomans
(Dept. of Mathematics and Statistics) and Dr. Olivier de Vel (Dept. of
Computer Science). The research projects are:

a.) Classification and modelling in high dimensional settings (PhD)
(Statistical and Neural Network based approaches).
b.) Object recognition using model-based Neural Networks (PhD).
c.) Cluster algorithms in high dimensional settings (Honours).
d.) Nonparametric statistical classification (Honours).

The software is being written in C++ on two platforms simultaneously.
The two platforms are :

- a 386/486 PC-compatible (using Turbo C++ and a MS-Windows interface)
- a UNIX platform (Ultrix 4.2 on a Digital 5000).

The PC version is a commercial product that incorporates a Windows
look-and-feel, true multitasking interface for interactive data input,
processing and graphical display. Here, we are developing a very
user-friendly environment for a variety of end-users. The basic system
consists of a modular generic workbench, which can be tailor-made to
suit different applications by adding different software modules, e.g.
different graphical displays for chemometric and environmetric
applications. A beta version should be ready at the end of the first
quarter of 1993.

The Unix version is a research platform that deals with larger data sets
(in terms of dimensionality). Most of the algorithms are developed and
tested for correctness on this platform.

The following is a functional description of the system.

Pre-processing of raw data file:
- Present :
- missing data handling (mean or regression based)
- transformations (PC-based, Fisher's discriminant plane,
  Fisher-Fukunaga-Koontz transform, Fisher-Radius transform)
- Variable selection/elimination
- Class pooling/elimination
- Categorisation (i.e. transform continuous into nominal variables)
- Future :
- Transformation of individual variables

Classifier Evaluation :
- Classifiers included presently :
Regularised Discriminant Analysis, Linear and
Quadratic Discriminant Analysis, KNN, SIMCA, DIRC, SMAC, SLAK
(the last three have all been developed as part of my research).
Most classifiers allow classes to be modelled with several
subpopulations, and class weights, a priori probabilities and
misclassification costs can be specified.

- Classifiers to be included in future :
Neural Network Based (Back-prop, ART, etc.), RBFs, Kernel Methods,
CART and MARS (the last two can also be used for function approximation
and for time-series analysis).

- Evaluation methods : For each classifier, one of four evaluation
methods can be chosen (already implemented): Leave-one-out,
Cross validation, Resubstitution and Independent test sets.
The different sets generated with cross validation may be saved.
Both overall and class specific classification accuracies are
reported, and misclassified objects are indicated.
Results may be saved to a file. Furthermore, any of the four
evaluation methods allows the use of multiple files, which is
very useful for comparing different classification methods on
artificial data with many replicates. When several files are used,
the variance in the results is also computed.
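
As a minimal sketch of one of these methods (not PARIS code;
"classify" stands in for any classifier above, and cases are assumed
to be dicts with a "class" key), k-fold cross validation can be
written as:

  import random
  from collections import Counter

  def cross_validate(cases, classify, k=5, seed=0):
      # Shuffle once, split into k folds, train on k-1 and test on the rest.
      cases = cases[:]
      random.Random(seed).shuffle(cases)
      folds = [cases[i::k] for i in range(k)]
      correct = 0
      for i, test in enumerate(folds):
          train = [c for j, fold in enumerate(folds) if j != i for c in fold]
          correct += sum(classify(train, c) == c["class"] for c in test)
      return correct / len(cases)

  def majority_class(train, case):
      # Trivial baseline classifier: always predicts the commonest class.
      return Counter(c["class"] for c in train).most_common(1)[0][0]

Leave-one-out is the special case k = number of cases; resubstitution
amounts to testing on the training set itself.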

Classification :
Any of the classifiers may also be used to allocate unknown
samples.

Parameters :
There are default values for the parameters of all classifiers,
transformations, evaluation techniques etc., which can be set by
the user (both the default values and the values for a specific
run).

Clustering :
Algorithms are being developed and will be integrated in the
system in the near future.

Random Data Generation :
Part of the system is an extensive multivariate random data generator.
Distributions include normal, uniform, exponential, gamma,
lognormal and mixtures thereof (i.e. the distributions may be
different from variable to variable or class to class).
Dimensionalities, number of classes and number of replicates can
be specified. In future, it will be possible to just
save the parameters, such that the same data can be regenerated
and does not need to be saved explicitly.
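
A modern NumPy sketch of such a generator (illustrative only; the
distribution choices and names below are hypothetical, not those of
PARIS):

  import numpy as np

  rng = np.random.default_rng(42)

  def generate(spec, n):
      # spec: one sampling function per variable; n: number of replicates.
      # Returns an (n, len(spec)) array, one column per variable.
      return np.column_stack([draw(n) for draw in spec])

  # Distributions may differ per variable and per class, as described.
  class_a = generate([lambda n: rng.normal(0, 1, n),
                      lambda n: rng.exponential(2.0, n)], n=100)
  class_b = generate([lambda n: rng.normal(3, 1, n),
                      lambda n: rng.uniform(0, 5, n)], n=100)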

Outlier detection :
An outlier detection method based on SIMCA is implemented. The user
may delete the outliers found. Alternative outlier detection
techniques will be implemented.

Class modelling :
Based on SIMCA, interclass distances, modelling and
discriminant powers of variables etc. are computed.

Variable selection :
Various variable selection procedures are provided (F-test based,
Wilks lambda-based, SIMCA-based and classifier performance-based).
Evaluation of the variable subset with test samples set aside
at the start is possible.

Data Types :
The PARIS workbench handles various data formats e.g. ASCII files,
Lotus 1-2-3 files etc.. Facilities for handling missing data and
performing exploratory data analysis are being made available.

Data visualization :
A facility exists to convert the file format to formats
used by Xgobi, and Xgobi (a public domain data visualization
tool) can be used from within the system (research based system only).
A 2-D and 3-D graphical interface is being developed for the PC
platform.

Interface :
A graphical user interface (GUI) is being developed for the commercial
PC version.

From your description of what you are looking for, it seems that
our system could be of interest to you. Should this indeed be
the case, two options exist :

- you may consider purchasing the PC version once released, or

- you may want to get involved in the further development
of the research based software.


Please contact me if you wish to discuss any matter or for any
further information.

I thank you in advance.

Regards.

Olivier de Vel. (olivier@curacoa.cs.jcu.edu.au)


==========
From: pleung@mprgate.mpr.ca (Peter Leung)

[...] I've come across such a system in my work at MPR Teltech:
Knowledge Discovery Workbench (KDW), developed by GTE Labs. The KDW
seems to address most of the "requirements" you listed:

1. pre-processing input data,

2. analysis using decision-tree induction algorithms (aka ID3), clustering,
visualization, anomaly detection,

3. testing/using classifier systems.

[...]
The principal investigator of the Knowledge Discovery Workbench project
at GTE Labs is Dr. Gregory Piatetsky-Shapiro.

Peter
MPR Teltech Ltd.
pleung@mprgate.mpr.ca (Peter Leung)

P.S. MPR Teltech is the R & D subsidiary of BC Tel.



==========
From: gps0%eureka@gte.com (Gregory Piatetsky-Shapiro)

Dick

There are various experimental systems that do Multi-Algorithm
Machine Learning (see, for example, our book Knowledge Discovery in
Databases, eds. G. Piatetsky-Shapiro and W. Frawley, MIT Press, 1991).
I am not aware of any commercial products that fit the bill.
[...]

== Gregory


==========
From: morik@kilo.informatik.uni-dortmund.de

The European Community funds a project, the Machine Learning
Toolbox (MLT), where several learning algorithms and a statistical
package are made available in an environment with a common look and
feel. Moreover, the Common Knowledge Representation Language
CKRL is the data format which can be read by all the integrated
systems. It is translated into the internal representation so that
the system uses its appropriate representation form for learning.
The learning result is then re-translated into CKRL. So, the user
of the MLT represents the data in CKRL and calls various learning
or statistical algorithms. The user is supported in choosing the
best suited algorithm by a CONSULTANT.
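
Purely as an illustration of that translate-in / learn / translate-out
pattern (CKRL itself is not reproduced here, so this Python sketch is
hypothetical in every detail):

  class LearnerAdapter:
      # Wraps one learning system behind a common data format, in the
      # spirit of the MLT's CKRL translation described above.
      def __init__(self, to_internal, learn, to_common):
          self.to_internal = to_internal   # common record -> internal form
          self.learn = learn               # internal data -> internal model
          self.to_common = to_common       # internal model -> common form

      def run(self, common_records):
          internal = [self.to_internal(r) for r in common_records]
          return self.to_common(self.learn(internal))
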
The MLT includes statistical algorithms (provided by INRIA, France),
a knowledge-intensive clustering system using a restricted first
order logic, KBG (Univ. Paris-Sud,F),
a variant of ID3 called NewID (Turing Institute, UK),
a variant of AQ and ID called CN2 (Turing Institute, UK),
a learning apprentice in a restricted first-order logic called APT
(ISoft and Univ. Paris-Sud, F),
an algorithm for similarity based discrimination called LASH
(British Aerospace, UK),
a multistrategy learning system called MOBAL, which integrates
model-based learning in first-order logic and the construction of a
lattice of constant terms into a knowledge acquisition system
(GMD, Germany).
The project coordinator is Marc Uszynski, Alcatel Alsthom Recherche,
e-mail: mlt@aar.alcatel-alsthom.fr


========== End of Excerpts ==========
===================================


-Dick

Dick Jackson
Institute for Biodiagnostics National Research Council Canada
Winnipeg, Manitoba Dick_Jackson@ibd.nrc.ca


------------------------------

End of ML-LIST (Digest format)
****************************************
