Machine Learning List: Vol. 6 No. 26
Friday, October 7, 1994

Contents:
JAIR ML paper
Kolmogorov complexity, priors, algorithmic art
Machine Learning course available
New release of PEBLS system now available
AI Faculty positions at UC Irvine
Preliminary Call for Papers ML95
The AI and Statistics conference in '95 is strong on learning!
AI/Stats Workshop



The Machine Learning List is moderated. Contributions should be relevant to
the scientific study of machine learning. Mail contributions to ml@ics.uci.edu.
Mail requests to be added or deleted to ml-request@ics.uci.edu. Back issues
may be FTP'd from ics.uci.edu in pub/ml-list/V<X>/<N> or N.Z where X and N are
the volume and number of the issue; ID: anonymous PASSWORD: <your mail address>
URL: http://www.ics.uci.edu/AI/ML/Machine-Learning.html

----------------------------------------------------------------------

Date: Mon, 26 Sep 94 15:44:13 PDT
From: Steve Minton <minton@ptolemy-ethernet.arc.nasa.gov>
Subject: JAIR ML paper

Readers of ML-list may be interested in the following paper recently
published in JAIR:

Safra, S. and Tennenholtz, M. (1994)
"On Planning while Learning", Volume 2, pages 111-129.
PostScript: volume2/safra94a.ps (202K)

Abstract: This paper introduces a framework for Planning while
Learning where an agent is given a goal to achieve in an environment
whose behavior is only partially known to the agent.
We discuss the tractability of various plan-design processes. We
show that for a large natural class of Planning while Learning
systems, a plan can be presented and verified in a reasonable time.
However, coming up algorithmically with a plan, even for simple
classes of systems, is apparently intractable.
We emphasize the role of off-line plan-design processes, and show
that, in most natural cases, the verification (projection) part can be
carried out in an efficient algorithmic manner.

The PostScript file is available via:
-- comp.ai.jair.papers
-- World Wide Web at http://www.cs.washington.edu/research/jair/home.html
-- Anonymous FTP from either of the two sites below:
CMU: p.gp.cs.cmu.edu directory: /usr/jair/pub/volume2
Genoa: ftp.mrg.dist.unige.it directory: pub/jair/pub/volume2
-- automated email. Send mail to jair@cs.cmu.edu or jair@ftp.mrg.dist.unige.it
with the subject AUTORESPOND, and the body GET VOLUME2/SAFRA94A.PS
(either upper or lowercase is fine).
-- JAIR Gopher server: At p.gp.cs.cmu.edu, port 70.

For more information about JAIR, check out our WWW or FTP sites, or
send electronic mail to jair@cs.cmu.edu with the subject AUTORESPOND
and the message body HELP, or contact jair-ed@ptolemy.arc.nasa.gov.


------------------------------

From: Juergen Schmidhuber <schmidhu@informatik.tu-muenchen.de>
Subject: Kolmogorov complexity, priors, algorithmic art
Date: Fri, 30 Sep 1994 18:33:54 +0100


Wolpert writes:

>> Kolmogorov complexity theory has nothing to do with the
>> nfl statements. The statements hold independent of where on the
>> Chomsky hierarchy the learning algorithms lie. In fact, the
>> algorithms can have different computational abilities and the
>> results still hold.

My little complexity argument does not address the issue of ``where on
the Chomsky hierarchy the learning algorithms lie''. It does not care.
It just goes like this: consider all finite relations between finite
(bit)strings and finite (bit)strings, and all possible ways of choosing
training sets and (non-overlapping) test sets. Without reference to any
particular prior, a simple counting argument shows that in almost all
cases, the shortest algorithm computing the test set from the training
set will essentially have the size of the trivial algorithm that lists
the whole test set (the choice of programming language does not
matter). Therefore, in almost all cases, (1) knowledge of the training
set does not tell us anything about the test set, and (2) there is no
hope for generalization.
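
Spelled out, the counting step runs as follows. Fix any training set
and suppose the test set comprises n bits. There are 2^n possible
test sets, but fewer than 2^{n-c} programs shorter than n-c bits:

    number of p with l(p) < n-c  =  2^0 + 2^1 + ... + 2^{n-c-1}
                                 =  2^{n-c} - 1  <  2^{-c} * 2^n

So, whatever the training set, at most a fraction 2^{-c} of the
possible test sets can be computed by a program even c bits shorter
than the trivial listing.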

This is similar in spirit to what has been said in the recent
discussion, and indicates that one should not be surprised by negative
results concerning generalization capability in almost all cases.

Admittedly, however, Wolpert is right in saying that this
``does not _directly_ address the same scenario as the nfl results''.
Perhaps somebody out there is interested in working out precise
formal relationships.

An additional remark on the prior problem: With infinitely many (but
enumerable) solution candidates, but without problem specific knowledge,
it seems that we ought to be glad about a discrete enumerable prior that
assigns to every solution candidate a probability at least as high as the
one assigned by any other such prior P (ignoring a constant factor
depending only on P). A remarkable property of the Solomonoff-Levin
distribution (or universal prior) P_U is this: P_U dominates all discrete
enumerable semimeasures P (including probability distributions) in the
sense that for all P there is a constant c such that P_U(x) >= cP(x) for
all x.
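
A standard construction (textbook material, sketched here only to show
where the constant comes from) is to write P_U as a mixture over an
effective enumeration P_1, P_2, ... of all discrete enumerable
semimeasures:

    P_U(x) = sum_i 2^{-K(i)} P_i(x),

where K(i) is the prefix Kolmogorov complexity of the index i (the
weights sum to at most 1 by the Kraft inequality). Dropping all terms
except i = j gives P_U(x) >= 2^{-K(j)} P_j(x) for all x; that is, for
P = P_j one may take c = 2^{-K(j)}, a constant depending only on P.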

------------------------------

Date: Thu, 29 Sep 94 8:10:28 EDT
From: Tom.Mitchell@cs.cmu.edu
Subject: Machine Learning course available


MACHINE LEARNING COURSE NOTES AVAILABLE ON MOSAIC

The lecture slides and syllabus for CMU's course on Machine Learning are now
available on the web. Feel free to use the slides or handouts in your own
courses if you find them helpful. This material is from the fall 1994 course
at CMU, offered to upper-level undergraduates and graduate students.

Suggestions for improvements are solicited, as are good homework
problems (send suggestions to Tom.Mitchell@cmu.edu). The URL is
http://www.cs.cmu.edu:8001/afs/cs.cmu.edu/usr/avrim/www/ML94/courseinfo.html

Tom Mitchell and Avrim Blum

------------------------------

From: Steven Salzberg <salzberg@blaze.cs.jhu.edu>
Sender: salzberg@blaze.cs.jhu.edu
Date: Tue, 4 Oct 94 08:16:33 EDT
Subject: new release of PEBLS system now available



A new release of the PEBLS system, PEBLS 3.0,
is now available via anonymous FTP.

PEBLS is a nearest-neighbor learning system designed for
applications where the instances have symbolic feature values. PEBLS
has been applied to the prediction of protein secondary structure and
to the identification of DNA promoter sequences. A technical
description appears in the article by Cost and Salzberg, Machine
Learning journal 10:1 (1993).

PEBLS 3.0 is written entirely in ANSI C. It is thus capable of
running on a wide range of platforms. Version 3.0 incorporates a
number of additions to version 2.1 (released in 1993) and to the
original PEBLS described in the paper:

S. Cost and S. Salzberg. A Weighted Nearest Neighbor
Algorithm for Learning with Symbolic Features,
Machine Learning, 10:1, 57-78 (1993).

PEBLS 3.0 now makes it possible to draw more comparisons between
nearest-neighbor and probabilistic approaches to machine learning by
incorporating the ability to track the statistics needed for Bayesian
inference. The system can thus show specifically where
nearest-neighbor and Bayesian methods differ. The system is also able
to perform tests using simple distance metrics (overlap, Euclidean,
Manhattan) for baseline comparisons. Research along these lines was
described in the following paper:

J. Rachlin, S. Kasif, S. Salzberg, and D. Aha. Towards a Better
Understanding of Memory-Based and Bayesian Classifiers. Proceedings
of the Eleventh International Conference on Machine Learning
(pp. 242-250). New Brunswick, NJ, July 1994, Morgan Kaufmann
Publishers.
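
For readers who want a concrete feel for the baseline metrics, here is
a minimal sketch in Python (illustrative only, not code from the PEBLS
distribution; all names here are made up) of 1-nearest-neighbor
classification with the simple overlap metric mentioned above:

    # Overlap distance over symbolic features: count mismatches.
    def overlap_distance(x, y):
        return sum(1 for a, b in zip(x, y) if a != b)

    # Return the label of the training instance nearest to the query.
    # training_set is a list of (features, label) pairs, where
    # features is a tuple of symbols, e.g. ('a', 'g', 't').
    def classify(query, training_set):
        features, label = min(
            training_set,
            key=lambda ex: overlap_distance(query, ex[0]))
        return label

    # Toy example in the spirit of the DNA applications above:
    train = [(('a', 'g', 'c'), 'promoter'),
             (('t', 'g', 'a'), 'non-promoter')]
    print(classify(('a', 'g', 'c'), train))   # -> promoter

PEBLS itself goes well beyond this baseline, learning a weighted
value-difference metric from training statistics, as described in the
Cost and Salzberg paper cited above.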

TO OBTAIN PEBLS BY ANONYMOUS FTP
________________________________

The latest version of PEBLS is available free of charge, and may
be obtained via anonymous FTP from the Johns Hopkins University
Computer Science Department.

To obtain a copy of PEBLS, type the following commands:

UNIX_prompt> ftp blaze.cs.jhu.edu
[Note: the Internet address of blaze.cs.jhu.edu is 128.220.13.50]
Name: anonymous
Password: [enter your email address]

ftp> bin
ftp> cd pub/pebls
ftp> get pebls.tar.Z
ftp> bye

[Place the file pebls.tar.Z in a convenient subdirectory.]

UNIX_prompt> uncompress pebls.tar.Z
UNIX_prompt> tar -xf pebls.tar

[Read the files "README" and "pebls_3.doc"]


For further information, contact:

Prof. Steven Salzberg
Department of Computer Science
Johns Hopkins University
Baltimore, Maryland 21218
Email: salzberg@cs.jhu.edu

PEBLS 3.0 IS INTENDED FOR RESEARCH AND EDUCATIONAL PURPOSES ONLY.
PEBLS 3.0 may be used, copied, and modified freely for this purpose.
Any commercial or for-profit use of PEBLS 3.0 is strictly prohibited
without the express written consent of Prof. Steven Salzberg,
Department of Computer Science, The Johns Hopkins University.

------------------------------

Subject: AI Faculty positions at UC Irvine
Date: Thu, 06 Oct 1994 16:31:35 -0700
From: Michael Pazzani <pazzani@super-pan.ICS.UCI.EDU>


UNIVERSITY OF CALIFORNIA, IRVINE
Department of Information and Computer Science


Faculty Positions in Artificial Intelligence

The Department of Information and Computer Science (ICS) is
seeking to fill a possible assistant professor position and a
possible associate professor position in the area of Artificial
Intelligence.
Research emphases of interest include, but are not limited to,
automated reasoning, machine learning, neural networks, and
planning. We are looking for candidates with strong research
records who would thrive in a highly productive setting. Duties
include undergraduate and graduate teaching in computer science.
Applicants must possess a Ph.D. Candidates should show
excellent promise of a distinguished research career.

There are currently four faculty and 25 students pursuing Ph.D.s in
artificial intelligence, as well as several international scholars
working with the artificial intelligence group. The Artificial Intelligence
faculty have research funding from agencies such as ARPA,
AFOSR, NSF, and ONR as well as industrial partners.

In addition to artificial intelligence, the ICS Department has
research groups in the areas of algorithms and data structures,
computer networks and distributed systems, computer systems
design, educational technology, parallel processing, social and
managerial analysis of computing, and software.

The ICS Department is an independent campus unit reporting to
the Executive Vice Chancellor. ICS faculty emphasize core
computer science as well as research in emerging areas of the
discipline, with effective inter-disciplinary ties to colleagues in
management, neurobiology, cognitive science, engineering, and
the social sciences. The department currently has 25 full-time
faculty and 130 Ph.D. students.

Graduate student, research, and administrative computing equipment
includes 90 Macintoshes, a Sequent multiprocessor, more than 200 Sun
workstations (Sparc 1s and 2s, Sun-3s and Sun-4s), more than 20
fileservers, a MasPar, and an assortment of PCs. Departmental
undergraduate instructional computing equipment consists of 150
Macintoshes, a Sequent multiprocessor, 25 Sun workstations, and a
large SPARC fileserver. All our major workstations and computers are
tied together with networks, which are gatewayed to the campus
network, and from there to the Internet. In addition, department
members have access to campus-wide computing resources as well as
regional super-computer access.

UC-Irvine is located in Orange County, three miles from the
Pacific Ocean near Newport Beach, and approximately forty
miles south of Los Angeles. The campus is situated in the heart
of a national center of high-technology enterprise. Both the
campus and the enterprise area offer exciting professional and
cultural opportunities. Salaries and benefits are competitive.
Mortgage and housing assistance are available. Housing options
include newly built, for-sale housing located on campus and
within short walking distance from the Department.

Send resume and contact information for four references to:
Artificial Intelligence Position
Lisa Tellier
Department of Information and Computer Science
University of California, Irvine
Irvine, CA 92717-3425

Application screening will begin immediately upon receipt of
curriculum vitae. Maximum consideration will be given to
applications received by December 15, 1994.

The University of California is an Affirmative Action/Equal
Opportunity Employer, committed to excellence through diversity.

------------------------------

Date: Mon, 3 Oct 1994 16:29:39 -0700
From: "Jeffrey C. Schlimmer" <schlimme@eecs.wsu.edu>
Subject: Preliminary Call for Papers ML95

PRELIMINARY CALL FOR PAPERS
Twelfth International Conference on Machine Learning

Tahoe City, California
July 9-12, 1995

The Twelfth International Conference on Machine Learning (ML95)
will be held at the Granlibakken Resort in Tahoe City, California
during July 9-12, 1995, with informal workshops on July 9. We invite
paper submissions from researchers in all areas of machine learning.
The conference will include presentations of refereed papers and
invited talks.


REVIEW CRITERIA

Each submitted paper will be reviewed by at least two members of
the program committee and will be judged on significance, originality,
and clarity. Papers submitted simultaneously to other conferences must
state this clearly on the title page.


PAPER FORMAT

Submissions must be clearly legible, with good quality print.
Papers are limited to a total of twelve (12) pages, EXCLUDING title
page and bibliography, but INCLUDING all tables and figures. Papers
must be printed on 8-1/2 x 11 inch paper or A4 paper using 12 point
type (10 characters per inch) with no more than 38 lines per page and
75 characters per line (e.g., LaTeX 12 point article style). The
title page must include an abstract and email and postal addresses of
all authors. Papers that do not follow this format will not be
reviewed. To save paper and postage costs, please use DOUBLE-SIDED
printing.
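
For LaTeX users, a skeleton along the following lines (one possible
setup, not an official ML95 template) gives the 12 point article
style mentioned above:

    \documentstyle[12pt]{article}   % 12 point article style
    \begin{document}
    \title{Paper Title}
    \author{First Author \\ author@site.edu \\ Postal address}
    \maketitle
    \begin{abstract}
    The title page carries the abstract and each author's email
    and postal address.
    \end{abstract}
    \section{Introduction}
    Body text: at most twelve pages, excluding the title page and
    bibliography but including all tables and figures.
    \end{document}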


REQUIREMENTS FOR SUBMISSION

Send four (4) copies of each submitted paper to one of the
conference co-chairs. Papers must be received by

FEBRUARY 7, 1995.

Electronic or FAX submissions are not acceptable. Notification of
acceptance or rejection will be mailed to the first (or designated)
author by March 22, 1995. Camera-ready accepted papers are due on
April 25, 1995.


INFORMAL WORKSHOPS

Proposals for informal workshops are invited in all areas of
machine learning. Send a two (2) page description of the proposed
workshop, its objectives, organizer(s), and expected number of
attendees to the workshop chair. Proposals must be received by
DECEMBER 1, 1994.


Conference Co-Chairs

Armand Prieditis
Department of Computer Science
University of California
Davis, CA 95616
priediti@cs.ucdavis.edu

Stuart Russell
Computer Science Division
University of California
Berkeley, CA 94720
russell@cs.berkeley.edu

Program Committee

(To Be Announced).

Workshop Chair

Sridhar Mahadevan
Department of Computer Science and Engineering
University of South Florida
4202 East Fowler Avenue, EBG 118
Tampa, Florida 33620
mahadeva@csee.usf.edu

Publicity Chair

Jeff Schlimmer
School of Electrical Engineering and Computer Science
Washington State University
Pullman, WA 99164-2752
schlimme@eecs.wsu.edu
http://www.eecs.wsu.edu/~schlimme

Local Arrangements

Debbie Chadwick
Department of Computer Science
University of California
Davis, CA 95616
chadwick@cs.ucdavis.edu


GENERAL INQUIRIES

Please send general inquiries to ml95@cs.ucdavis.edu.

To receive future conference announcements, please send a note to
the publicity chair. Current conference information is available on
the World-Wide Web at http://www.eecs.wsu.edu/~schlimme/ml95.html.


Jeffrey C. Schlimmer, Asst. Prof., School of EE & CS, Washington State
University, Pullman, WA 99164-2752, (509) 335-2399, (509) 335-3818 FAX




------------------------------

Date: Sun, 25 Sep 94 13:14:15 PDT
From: Wray Buntine <wray@ptolemy-ethernet.arc.nasa.gov>
Subject: the AI and Statistics conference in '95 is strong on learning!

I have just been looking at the program of AI and Statistics in
Florida, Jan. '95. This year's primary theme is "learning from data".
Not only is there a strong contingent of learning papers, there is
also a very impressive selection of tutorials on learning, or in
areas related to learning:
Machine Learning (Aha)
Statistical Methods for Inducing Models from Data (Steffen Lauritzen)
Probabilistic Models of Causality (Glenn Shafer)
Statistical Models for Function Estimation and Classification (Trevor Hastie)
Steffen Lauritzen and Trevor Hastie largely cover different areas in
statistics, so if you're interested in a statistical view of machine learning,
I'd recommend attending both.

So if you are interested in finding out about the interface between
machine learning and statistics, I'd highly recommend AI and
Statistics in '95. The deadline for early (cheap) registration is
1 December.


------------------------------

Date: Sat, 1 Oct 1994 10:40:52 +0600
From: "Douglas H. Fisher" <dfisher@vuse.vanderbilt.edu>
Subject: AI/Stats Workshop


Preliminary Call for Participation

Fifth International Workshop on
ARTIFICIAL INTELLIGENCE and STATISTICS

January 4-7, 1995
Ft. Lauderdale, Florida


TECHNICAL and TUTORIAL PROGRAM:
This is the fifth in a series of workshops that has brought
together researchers in Artificial Intelligence and in
Statistics to discuss problems of mutual interest. To
encourage interaction and a broad exchange of ideas, there
will be 20 discussion papers in single session meetings over
three days (Jan. 5-7). Two poster sessions will provide the
means for presenting and discussing the remaining research
papers. Attendance at the workshop is *not* limited to paper
presenters.

The three days of research presentations will be preceded by
a day of tutorials (Jan. 4). The tutorial topics, presenters,
and approximate times are:

(1) Machine Learning, 9:00AM - 12:15PM
    (Dr. David Aha, Naval Research Lab)

(2) Statistical Methods for Inducing Models from Data, 9:00AM - 12:15PM
    (Prof. Steffen Lauritzen, Aalborg U.)

(3) Probabilistic Models of Causality, 2:00PM - 5:15PM
    (Prof. Glenn Shafer, Rutgers U.)

(4) Statistical Models for Function Estimation and Classification, 2:00PM - 5:15PM
    (Prof. Trevor Hastie, Stanford U.)

Notes prepared by the tutorial presenters will be made available
at the Workshop.

LOCATION:
The 1995 Workshop will be held at

Pier Sixty Six Resort & Marina
2301 SE 17th Street Causeway
Fort Lauderdale, Florida, 33316
USA.

Phone: 800-327-3796 (outside Florida)
305-525-6666
Fax : 305-728-3541

The hotel is a 22-acre resort located on the Intracoastal Waterway.
Available amenities include two pools, a 40 person hydrotherapy
pool, spa, tennis courts, a children's activity club, seven
restaurants and lounges, and water shuttle service to the beach.

The Hotel is most conveniently reached from Fort Lauderdale
International Airport, which is about 5-10 minutes by car/cab.
The Hotel is approximately 45-60 minutes by car from Miami
International Airport.

The Resort is holding a block of rooms at the rate of $95 US
dollars (for single/double) until Dec. 10, 1994. Reservations
should be made before this date. The block is held under the
name `SOCIETY for ARTIficial Intelligence and Statistics'
(or SOCIETY ARTI).


REGISTRATION:
Registration for the Technical Program (plenary and poster
sessions) includes a proceedings of papers submitted by authors,
continental breakfasts each day of the technical program,
and tentatively, two lunches and one dinner. The Workshop
offers student rates and an early-registration discount.
Registration rates and instructions can be found on the
Registration Form at the end of this Call. Registration
for tutorials can also be made in advance using the
Registration Form.


PROGRAM COMMITTEE:

General Chair:  D. Fisher (Vanderbilt U., USA)
Program Chair:  H. Lenz (Free U. Berlin, Germany)
Members:        W. Buntine (NASA Ames, USA)
                J. Catlett (AT&T Bell Labs, USA)
                P. Cheeseman (NASA Ames, USA)
                P. Cohen (U. of Mass., USA)
                D. Draper (U. of Bath, UK)
                Wm. DuMouchel (Columbia U., USA)
                A. Gammerman (U. of London, UK)
                D. J. Hand (Open U., UK)
                P. Hietala (U. Tampere, Finland)
                R. Kruse (TU Braunschweig, Germany)
                S. Lauritzen (Aalborg U., Denmark)
                W. Oldford (U. of Waterloo, Canada)
                J. Pearl (UCLA, USA)
                D. Pregibon (AT&T Bell Labs, USA)
                E. Roedel (Humboldt U., Germany)
                G. Shafer (Rutgers U., USA)
                P. Smyth (JPL, USA)
Tutorial Chair: P. Shenoy (U. Kansas, USA)


MORE INFORMATION:
For more information write dfisher@vuse.vanderbilt.edu
or call 615-343-4111.


SPONSORS: Society for Artificial Intelligence and Statistics
International Association for Statistical Computing


***********


Papers accepted for Technical Program

Fifth International Workshop on
Artificial Intelligence
and
Statistics



PLENARY PAPERS


Almond, Schimert (MathSoft): Missing data models as meta-data

Brent, Murthy, Lundberg (Johns Hopkins U): Minimum description length
induction for discovering morphemic suffixes

Buntine (NASA Ames): Software for data analysis with graphical models:
basic tools

Chickering, Geiger, Heckerman (Microsoft): Learning Bayesian networks:
search methods and experimental results

Cohen, Gregory, Ballesteros, St Amant (U Mass): Two algorithms for
inducing structural equation models from data

Cooper (U Pitt): Causal discovery from observational data in the
presence of selection bias

Cox (US West): Using causal knowledge to learn more useful decision
rules from data

Decatur (Harvard U): Learning in hybrid noise environments using
statistical queries

Elder (Rice U): Heuristic search for model structure

Gebhardt, Kruse (U Braunschweig): Learning possibilistic networks
from data

Kasahara, Ishikawa, Matsuzawa, Kawaoka (Nippon TT): Viewpoint-based
measurement of semantic similarity between words

Lubinsky (U Witwatersrand SA): Structured interpretable regression

Madigan, Almond (U Washington): Test selection strategies for belief
networks

Malvestuto (U L'Aquila, IT): Derivation DAGs for inferring interaction
models

Merz (U Cal Irvine): Dynamic learning bias selection

Pearl (UCLA): A causal calculus for statistical research with
applications to observational and experimental studies

Riddle, Frenedo, Newman (Boeing): Framework for a generic knowledge
discovery tool

Shafer, Kogan, Spirtes (Rutgers): A generalization of the Tetrad
representation theorem

St Amant, Cohen (U Mass): Preliminary design for an EDA assistant

Yao, Tritchler (U Toronto): Likelihood-based causal inference




POSTER PAPERS


Aha, Bankert (NRL): A comparative evaluation of sequential feature
selection algorithms

Ali, Brunk, Pazzani (U Cal Irvine): Learning multiple relational
rule-based models

Almond (MathSoft): Hypergraph grammars for knowledge-based model
construction

Anderson, Carlson, Westbrook, Hart, Cohen (U Mass): Tools for
analyzing AI programs

Bergman, Rivest (MIT): Picking the best expert from a sequence

Blau (U Rochester): Ploxoma: Test-bed for uncertain inference

Breese, Heckerman (Microsoft): Probabilistic case-based reasoning

Burke (U Nevada): Comparing the prediction accuracy of statistical
models and artificial neural networks in breast cancer

Catlett (ATT): Tailoring rulesets to misclassification cost

Chen, Yeh (National Chengchi U): Predicting stock returns with
genetic programming

Cheng (U Cincinnati): Analysis and application of the generalized
mean-shift process

Cozman, Krotkov (CMU): Truncated Gaussians as tolerance sets

Cunningham (U Waikato): Textual data mining

De Vel, Li, Coomans (U James Cook, Australia): Non-linear
dimensionality reduction: A comparative performance study

DuMouchel, Friedman, Johnson, Hripcsak (Columbia U): Natural language
processing of radiology reports

Esposito, Malerba, Semeraro (U degli Studi, IT): A further study of
pruning methods in decision tree induction

Feelders, Verkooijen (U Twente, Netherlands): Which method learns most
from the data?

Franz (CMU): Classifying new words for robust parsing

Gelsema (Erasmus U, The Netherlands): Abductive reasoning in Bayesian
belief networks using a genetic algorithm

Harner, Galfalvy (West Virginia U): Omega-Stat: An environment for
implementing intelligent modeling strategies

Heckerman, Shachter (Microsoft): A decision-based view of causality

Howe (Colorado St U): Finding dependencies in event streams using
local search

Jenzarli (U Tampa): Solving influence diagrams using Gibbs sampling

John (Stanford U): Robust linear discriminant trees

Ketterlin, Gancarski, Korczak (U Louis Pasteur): Hierarchical
clustering of composite objects with a variable number of components

Kim (Korea Adv. Inst. of Sci. and Eng.): An approach to fitting large
influence diagrams

Kim, Moon (Syracuse U): Modeling lifetime data by neural networks

Kloesgen (German Nat. Rsch.): Learning from data: Pattern evaluations
and search strategies

Larranaga, Murga, Poza, Kuijpers (U Basque, Spain): Structure learning
of Bayesian networks by hybrid genetic algorithms

Lekuona, Lacruz, Lasala (U de Zaragoza, Spain): Graphical models for
dynamic systems

Liu (U Kansas): Propagation of Gaussian belief functions

Martin (U Cal Irvine): A hypergeometric null hypothesis probability
test for feature selection and stopping

Martin (U Cal Irvine): Evaluating and comparing classifiers:
Complexity measures

Murthy (Johns Hopkins U): Statistical preprocessing of decision trees

Neufeld, Adams, Choy, Philip, Tawfik (U Saskatchewan): Part-of-speech
tagging from small data sets

Oates, Gregory, Cohen (U Mass): Detecting complex dependencies in
categorical data

Pazzani (U Cal Irvine): Searching for attribute dependencies in
Bayesian classifiers

Provan, Singh (Inst. for Decision Systems Res.): Learning
``Predictively-Optimal'' Bayesian networks

Risius, Seidelmann (Hahn-Meitner Inst): Combining statistics and AI in
the optimization of semiconductor films for solar cells

Shenoy (U Kansas): Representing and solving asymmetric decision
problems using valuation networks

Srikantan, Srihari (SUNY Buffalo): Data representations in learning

Sun, Qiu, Cox (US West): A hill-climbing approach to construct
near-optimal decision trees

Valtorta (U South Carolina): MENTOR: A Bayesian model for prediction
and intervention in mental retardation

Young, Lubinsky (UNC): Learning from data by guiding the analyst: On
the representation, use, and creation of visual statistical strategies


***********


Registration Form

Fifth International Workshop on
Artificial Intelligence
and
Statistics


Participants may register on site. To register in advance of
the Workshop send this form and a check (in US dollars) made
to the order of **Society for Artificial Intelligence and
Statistics** in the appropriate amount to:

Doug Fisher
Department of Computer Science
Box 1679, Station B
Vanderbilt University
Nashville, Tennessee 37235
USA

Advance registration discounts apply if registration is received
by Dec. 1, 1994.


Name: ________________________________________

Affiliation: _________________________________

Phone: _______________________________________

Fax: _________________________________________

Email: _______________________________________

Address: _____________________________________

_____________________________________

_____________________________________


Technical Program -- check one:

____ Technical Program (regular, by Dec. 1, 1994): $245

____ Technical Program (student, by Dec. 1, 1994): $155

____ Technical Program (regular, after Dec. 1, 1994): $295

____ Technical Program (student, after Dec. 1, 1994): $195


Technical Program Subtotal: $____


Tutorial Program -- check applicable tutorials, if any.
Note that the tutorial times conflict: at most one
selection may be made from (1) and (2), and at most
one from (3) and (4).


____ (1) Machine Learning

____ (regular, by Dec. 1): $ 70

____ (student, by Dec. 1): $ 45

____ (regular, after Dec. 1): $ 80

____ (student, after Dec. 1): $ 55


____ (2) Statistical Methods for Inducing Models from Data

____ (regular, by Dec. 1): $ 70

____ (student, by Dec. 1): $ 45

____ (regular, after Dec. 1): $ 80

____ (student, after Dec. 1): $ 55


____ (3) Probabilistic Models of Causality

____ (regular, by Dec. 1): $ 70

____ (student, by Dec. 1): $ 45

____ (regular, after Dec. 1): $ 80

____ (student, after Dec. 1): $ 55


____ (4) Statistical Models for Function Estimation and Classification

____ (regular, by Dec. 1): $ 70

____ (student, by Dec. 1): $ 45

____ (regular, after Dec. 1): $ 80

____ (student, after Dec. 1): $ 55


Tutorial Program Subtotal: $____


Technical and Tutorial Total: $____





------------------------------

End of ML-LIST (Digest format)
****************************************
