AIList Digest            Sunday, 16 Sep 1984      Volume 2 : Issue 119 

Today's Topics:
LISP - VAX Lisps & CP/M Lisp,
Philosophy - Syllogism Correction,
Scientific Method - Induction vs. Deduction,
Course - Logic Programming,
Conference - Database Systems
----------------------------------------------------------------------

Date: Sun, 16 Sep 84 14:28 BST
From: TONY HASEMER (on ALVEY at Teddington) <TONH%alvey@ucl-cs.arpa>
Subject: Lisp on the VAX

We have a VAX 11/750 with four Mb of memory, running NIL. We also have
four Lisp hackers of several years' standing who are likely to write
quite substantial programs. We have to decide whether to buy some extra
memory, or to spend the money on Golden Common Lisp, which someone told
us is much more efficient than NIL.

Can anyone please advise us? Thank you.

Tony.

------------------------------

Date: 11 Sep 84 17:36:37-PDT (Tue)
From: hplabs!sdcrdcf!sdcsvax!stan @ Ucb-Vax.arpa
Subject: Lisp under CP/M
Article-I.D.: sdcsvax.52

I recently purchased a copy of ProCode's Waltz Lisp for the Z80 and CP/M
and found it to be a very good imitation of Franz Lisp.

I downloaded some rather substantial programs I'd written over the
past two years and within 20 minutes had them up and running on my
Kaypro. Surprisingly, there was little speed degradation unless a
major amount of computation was involved.

All that was required (for my programs) were a few support routines to
implement defun, terpri, etc.

The manual is very complete and well written. (For example, it includes
examples of how to write defun.)

Cost was just under $100.00, and well worth it.

Now, if only my Kaypro could handle background processes like the VAX...

Stan Tomlinson

------------------------------

Date: 11 Sep 84 11:06:09-PDT (Tue)
From: hplabs!hao!seismo!rochester!rocksvax!rocksanne!sunybcs!gloria!colonel
@ Ucb-Vax.arpa
Subject: Re: Now and Then
Article-I.D.: gloria.535

>> All swans are white.
>> This is a swan.
>> Therefore it is white.
>>
>> Notice that the conclusion (3rd sentence) is only true iff the two
>> premises (sentences 1 and 2) are true.

A minor correction: "iff" does not belong here. The premises do not follow
from the conclusion.
--
Col. G. L. Sicherman
...seismo!rochester!rocksanne!rocksvax!sunybcs!gloria!colonel

------------------------------

Date: 14 Sep 84 09:01 PDT
From: Feuerman.pasa@XEROX.ARPA
Subject: Re: Inductive Proof - The Heap Problem

At the risk of getting involved.....

One thing bothers me about the inductive proof that all heaps are small.
I will claim that it is NOT an inductive proof after all. The second
requirement for a (mathematical) proof by induction states that one must
show that P(n) implies P(n+1). I see nothing in the fact that one
"speck" is small that NECESSARILY implies that two "specks" constitute
a small heap. One seems to conclude that a two-speck heap is
small from some sort of outside judgment of size. Thus, Small(1 Speck)
does NOT imply Small(2 Specks); something else implies that.
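
For reference, the standard schema of mathematical induction is

    \[ P(1) \;\land\; \forall n\,\bigl(P(n) \Rightarrow P(n+1)\bigr) \;\vdash\; \forall n\, P(n). \]

The heap argument supplies the base case, Small(1 speck), but the
inductive step, that Small(n) implies Small(n+1), is precisely what
is never established from the premises themselves; it is imported
from an outside judgment of size.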

Lest we get into an argument about the fact that large for one could be
small for another, I'll bring up another mathematical point: The
Archimedean Principle. It basically says that given any number (size,
number of specks, what have you), one can ALWAYS find a natural number
that is greater. Applying that to the heap problem, given anyone's
threshold of what constitutes a large heap and what constitutes a small
heap, one can ALWAYS make a large heap out of a small heap by adding one
speck at a time. I'll further note that one need not make that
transition between small and large heaps a discrete number; as long as
you can put a number on some sense of a large heap (regardless of
whether that is the smallest large heap), you can always exceed it. For
example, I will arbitrarily say that 10**47 specks in a heap makes it
large. I don't have to say that 10**47 - 1 is small. Yet we will still
be able to create a large heap (eventually).
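
Stated formally, the property invoked here is

    \[ \forall x \in \mathbb{R}\;\; \exists n \in \mathbb{N} : \; n > x, \]

so for any threshold T one cares to name for "large," a heap of s
specks grown one speck at a time reaches a size s + k > T after
finitely many additions k.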

Now, anyone interested in speculating about what happens if someone's
size function is not constant, but varies with time, mood, money in the
bank, etc.?

As further proof of my Archimedean Principle, we will note that I have
just in fact turned a small heap/argument (Ken Laws' four line Heap
Problem) into a large one (this message).

--Ken <Feuerman.pasa@Xerox.arpa>

------------------------------

Date: Fri 14 Sep 84 14:30:14-PDT
From: BARNARD@SRI-AI.ARPA
Subject: induction vs. deduction

The discussion of induction vs. deduction has taken a curious turn.
Normally, when we speak of induction, we don't mean *mathematical
induction*, which is a formally adequate proof technique. We mean
instead the inductive mode of reasoning, which is quite different.
Inductive reasoning can never be equated to deductive reasoning
because it begins with totally different premises. Inductive
reasoning involves two principles:

(1) The principle of insufficient reason, which holds that in the
absence of other information, the expectation over an ensemble of
possibilities is uniform (heads and tails are equally probable; a
formal reading follows principle (2)).

(2) The principle of Occam's razor, which holds that given a variety of
theories about some data, the one that is "simplest" is preferred.
(We prefer the Copernican model of the solar system to the Ptolemaic
one, even though they both account for the astronomical data.)
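
A brief formal reading of principle (1): with n mutually exclusive
and exhaustive possibilities and no other information, it assigns

    \[ p_i = \frac{1}{n}, \qquad i = 1, \dots, n, \]

the uniform distribution, which is also the maximum-entropy choice
in Jaynes's formulation (for a coin, n = 2 and heads and tails each
get probability 1/2).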

The relationship of time, causality, and induction has been
investigated by the Nobel Laureate, Ilya Prigogine. The laws of
classical physics, with one exception, are neutral with respect to the
direction of time. The exception is the Second Law of Thermodynamics,
which states that the entropy of a closed system must increase, or
equivalently, that a closed system will tend toward more and more
disordered states. For a long time, physicists tried to prove the
Second Law in terms of Newtonian principles, but with no success.
Eventually, Boltzmann and Gibbs explained the Second Law
satisfactorily by using inductive principles to show that the
probability of a system entering a disordered, high-entropy state is
far higher than the converse. Prigogine proposes that random,
microscopic events cause macroscopic events to unfold in a
fundamentally unpredictable way. He extends thermodynamics to open
systems, and particularly to "dissipative systems" that, through
entropy exchange, evolve toward or maintain orderly, low-entropy
states.

Inductive reasoning is also closely connected with information theory.
Recall that Shannon uses entropy as the measure of information.
Brillouin, Carnap, and Jaynes have shown that these two meanings of
entropy (information in a message and disorder of a physical system)
are equivalent.
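
A compact way to state that equivalence: Shannon's entropy of a
source with outcome probabilities p_i and the Boltzmann-Gibbs
entropy of a physical system have the same form,

    \[ H = -\sum_i p_i \log_2 p_i \quad \text{(bits)}, \qquad
       S = -k_B \sum_i p_i \ln p_i, \]

differing only by the constant factor k_B ln 2 per bit; when the
microstates are equiprobable the second expression reduces to
Boltzmann's S = k_B ln W.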

Steve Barnard

------------------------------

Date: Wed 12 Sep 84 21:16:28-EDT
From: Michael J. Beckerle <BECKERLE@MIT-XX.ARPA>
Subject: Course Offering - Logic Programming

[Forwarded from the MIT bboard by Laws@SRI-AI.]


TECHNOLOGY OF LOGIC PROGRAMMING

CS 270
Fall 1984

Professor Henryk Jan Komorowski
Harvard University
Aiken Computation Lab. 105
495-5973

Meeting: Mondays, Wednesdays - 12:30 to 2 PM, Pierce Hall 209

This year the course will focus on presenting basic concepts
of logic programming by deriving them from logic. We shall
study definite clause programs (a small illustrative sketch
follows this list):

- What they specify (the least Herbrand model).
- How they are used: a logical view of the notion of query.
- What computations of logic programs are: the resolution
principle, SLD-refutability, completeness, and negation by
failure.
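
As a small illustration (a minimal sketch, not course material; the
predicates and facts are made up), the least Herbrand model of a
ground definite clause program can be computed bottom-up as the
least fixed point of the immediate-consequence operator:

    # Minimal sketch: least Herbrand model of a *ground* definite clause
    # program, computed as the least fixed point of the immediate-
    # consequence operator T_P.  A rule is (head, [body atoms]); facts
    # have empty bodies.
    program = [
        ("parent(tom,bob)",   []),
        ("parent(bob,ann)",   []),
        ("ancestor(tom,bob)", ["parent(tom,bob)"]),
        ("ancestor(bob,ann)", ["parent(bob,ann)"]),
        ("ancestor(tom,ann)", ["parent(tom,bob)", "ancestor(bob,ann)"]),
    ]

    def least_herbrand_model(rules):
        """Iterate T_P from the empty interpretation until it stops growing."""
        model = set()
        while True:
            step = {head for head, body in rules
                    if all(atom in model for atom in body)}
            if step == model:
                return model
            model = step

    print(sorted(least_herbrand_model(program)))
    # A ground query succeeds exactly when its atom is in this set,
    # e.g. "ancestor(tom,ann)" is a consequence of the program.

Prolog itself answers queries top-down by SLD-resolution rather than
by this bottom-up construction, but for definite clause programs the
two agree on which ground atoms are provable.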

This general background will serve as a basis for introducing the
logic programming language Prolog and will be accompanied by a
number of assignments to master specification programming. It will
be followed by implementation issues such as interpreting,
compiling, debugging, and other programmer's support tools. We
shall then critically investigate a number of applications of
Prolog to software specification, compiler writing, expert system
programming, embedded language implementation, database
programming, program transformations, etc., and study the
language's power and limitations. The course will end with a
comparison of definite clause programming to other formalisms,
e.g., attribute grammars, functional programming, and rule-based
programming. Time permitting, parallelism, complexity, and other
topics of interest will be studied.

REQUIREMENTS: A background in propositional logic, some
familiarity with predicate calculus, and a general background in
computer science (reasonable acquaintance with parsing, compiling,
databases, programming in recursive languages, etc.) are expected.

WORKLOAD
- One problem set on logic.
- Two sets of Prolog assignments.
- A mid-term, mid-size, single-person Prolog project.
- A substantial number of papers to read: core papers
and selected one-topic papers (the latter to be reviewed
in sections).
- A final research paper on an individually selected topic
(with the instructor's consent).

LITERATURE, REQUIRED

PROGRAMMING IN PROLOG, by Clocksin and Mellish.
RESEARCH PAPERS distributed in class.


LITERATURE, OPTIONAL

LOGIC FOR PROBLEM SOLVING, by Kowalski.
MICRO-PROLOG: LOGIC PROGRAMMING, by Clark and McCabe.
LOGIC AND DATABASES, edited by Gallaire and Minker.
IMPLEMENTATIONS OF PROLOG, edited by Campbell.


TENTATIVE PLAN
25 meetings

- Introduction: declarative and imperative programming, the
goals of the Vth Generation Project.

- Informal notions of model, truth, and provability. The syntax
of predicate calculus; proof systems for predicate calculus;
completeness, soundness, models.

- Transformation to clausal form, resolution and its completeness.

- Definite clause programs:

* operational semantics
* proof-theoretic semantics
* fixed point semantics

- Introduction to programming in Prolog.
- Data structures.
- Negation by failure and cut.
- Specification programming methodology.
- Advanced Prolog programming.
- Algorithmic debugging.
- Parsing and compiling in Prolog.
- Abstract data type specification in Prolog.
- Logic programming and attribute grammars, data flow
analysis.
- Interpretation and compilation of logic programs

- Artificial intelligence applications:

* metalevel programming
* expert systems programming
* Natural language processing

- Alternatives to Prolog: breadth-first search, coroutines,
LOGLISP, AND- and OR-parallelism.
- Concurrent Prolog.
- Relations between LP and functional programming.
- LP and term rewriting.
- Program transformation and derivation.
- Object oriented programming.
- Some complexity issues.
- LP and databases.
- Architecture for LP.

------------------------------

Date: Wed, 12 Sep 84 10:40:23 pdt
From: Jeff Ullman <ullman@diablo>
Subject: Conference - Database Systems

CALL FOR PAPERS

FOURTH ANNUAL ACM SIGACT/SIGMOD SYMPOSIUM ON
PRINCIPLES OF DATABASE SYSTEMS

Portland, Oregon March 25-27, 1985


The conference will cover new developments in both the
theoretical and practical aspects of database systems. Papers
are solicited that describe original and novel research into
the theory, design, or implementation of database systems.

Some suggested but not exclusive topics of interest are:
application of AI techniques to database systems, concurrency
control, database and database scheme design, data models,
data structures for physical database implementation,
dependency theory, distributed database systems, logic-based
query languages and other applications of logic to database
systems, office automation theory, performance evaluation of
database systems, query language optimization and
implementation, and security of database systems.

You are invited to submit 9 copies of a detailed
abstract (not a complete paper) to the program chairman:

Jeffrey D. Ullman
Dept. of Computer Science
Stanford University
Stanford, CA 94305

Submissions will be evaluated on the basis of significance,
originality, and overall quality. Each abstract should 1)
contain enough information for the program committee to
identify the main contribution of the work; 2) explain the
importance of the work, its novelty, and its relevance to the
theory and/or practice of database management; 3) include
comparisons with and references to relevant literature.
Abstracts should be no longer than 10 typed, double-spaced
pages (12,000 bytes of source text). Deviations from these
guidelines may affect the program committee's evaluation of
the paper.

Program Committee

Jim Gray, Richard Hull, Frank Manola, Stott Parker,
Avi Silberschatz, Jeff Ullman, Moshe Vardi, Peter Weinberger,
Harry Wong

The deadline for submission of abstracts is October 12,
1984. Authors will be notified of acceptance or rejection
by December 7, 1984. The accepted papers, typed on special
forms or typeset camera-ready in the reduced-size model page
format, will be due at the above address by January 11,
1985. All authors of accepted papers will be expected to
sign copyright release forms. Proceedings will be distributed
at the conference and will be available for subsequent purchase
through ACM. The proceedings of this conference will not be
widely disseminated. As such, publication of papers in this
record will not, of itself, inhibit republication in ACM's
refereed publications.


General Chairman:
Seymour Ginsburg
Dept. of CS, USC
Los Angeles, CA 90007

Local Arrangements Chairman:
David Maier
Dept. of CS, Oregon Graduate Center
19600 N. W. Walker Rd.
Beaverton, OR 97006

------------------------------

End of AIList Digest
********************
