💾 Archived View for helion.cybersavior.dev › lisp.gmi captured on 2024-06-20 at 12:12:08. Gemini links have been rewritten to link to archived content


⬅️ Previous capture (2023-04-19)


-=-=-=-=-=-=-

HOME

host

oldgate

tags

#black - completeness, singularity, silence, ego death

#law - order, cathedral, stagnation

#mind - it's not just rational.

#safe - probably not harmful. Maybe.

#technology - bits and such

LISP

notes on anatomy of lisp chapter 1

lisp[1]

[1] lisp.mp3 [audio]

Audio Log

This terminal cannot view this object.

I think it makes the most sense to keep a record of what I'm thinking about everything I'm going over, in case it provides insight into this truly ancient document.

Anatomy of LISP - Preface

"It is important not to lose sight of the fact that there is a difference between training and education. If computer science is a fundamental discipline, then university education in this field should emphasize enduring fundamental principles rather than transient current technology."

Peter Wegner, Three Computer Cultures [Weg 70]

This text is nominally about LISP and data structures. However, in the process it covers much broader areas of computer science. The author has long felt that the beginning student of computer science has been getting a distorted and disjointed picture of the field. In some ways this confusion is natural; the field has been growing at such a rapid rate that few are prepared to be judged experts in all areas of the discipline. The current alternative seems to be to give a few introductory courses in programming and machine organization followed by relatively specialized courses in more technical areas. The difficulty with this approach is that much of the technical material never gets related; the student's perspective and motivation suffer in the process. This book uses LISP as a means for relating topics which normally get treated in several separate courses. The point is not that we can do this in LISP, but rather that it is natural to do it in LISP. The high-level notation for algorithms is beneficial in explaining and understanding complex algorithms. The use of abstract data structures and abstract LISP programs shows the intent of structured programming and step-wise refinement. Much of the current work in the mathematical theories of computation is based on LISP-like languages. Thus LISP is a formalism for describing algorithms, for writing programs, and for proving properties of algorithms. We use data structures as the main thread in our discussions because a proper appreciation of data structures as abstract objects is a necessary prerequisite to an understanding of modern computer science.

The importance of abstraction obviously goes much farther than its appearance in LISP. Abstraction has often been used in other disciplines as a means for controlling complexity. In mathematics, we frequently are able to gain new insights by recasting a particularly intransigent problem in a more general setting. Similarly, the intent of an algorithm expressed in a high-level language like Fortran or PL/1 is more readily apparent than its machine-language equivalent. These are both examples of the use of abstraction. Our use of abstraction will impinge on both the mathematical and the programming aspects. Initially, we will talk about data structures as abstract objects just as the mathematician takes the natural numbers as abstract entities. We will attempt to categorize properties common to data structures and introduce notation for describing functions defined on these abstractions. At this level of discussion we are thinking of our LISP-like language primarily as a notational convenience rather than a computational device. However, after a certain familiarity has been established it is important to look at our work from the viewpoint of computer science. Here we must think of the computational aspects of our notation. We must be concerned with the representational problems: implementation on realistic machines, and efficiency of algorithms and data structures. However, it cannot be over-emphasized that our need for understanding is best served at the higher level of abstraction; the advantage of a high-level language is notational rather than computational. That is, it allows us to think and represent our algorithms in mathematical terms rather than in terms of the machine. It is after a clear understanding of the problem is attained that we should begin thinking about representation.

We can exploit the analogy with traditional mathematics a bit further. When we write sqrt(x) in Fortran, for example, we are initially only concerned with sqrt as a mathematical function defined such that x = sqrt(x) * sqrt(x). We are not interested in the specific algorithm used to approximate the function intended in the notation. Indeed, thought of as a mathematical notation, it doesn't matter how sqrt is computed. We might wish to prove some properties of the algorithm which we are encoding. If so, we would only use the mathematical properties of the idealized square root function. Only later, after we had convinced ourselves of the correct encoding of our intention in the Fortran program, would we worry about the computational aspects of the Fortran implementation of sqrt. The typical user will never proceed deeper into the representation than this; only if the computation is lethargic due to inefficiencies, or inaccurate due to uncooperative approximations, will they look at the actual implementation of sqrt.
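The sqrt discussion can be made concrete. This is my own sketch in Python (the book's examples are in LISP); it shows two unrelated implementations being judged only against the defining property x = sqrt(x) * sqrt(x), ignoring how either one computes its answer.

```python
import math

def newton_sqrt(x, iters=40):
    """One possible algorithm: Newton's iteration for the square root."""
    guess = x if x > 1 else 1.0
    for _ in range(iters):
        guess = (guess + x / guess) / 2.0
    return guess

def satisfies_sqrt_spec(sqrt_fn, x, tol=1e-9):
    """Check only the mathematical property x == sqrt(x) * sqrt(x),
    treating sqrt_fn as an abstract function, not an algorithm."""
    r = sqrt_fn(x)
    return abs(r * r - x) <= tol * max(1.0, abs(x))

# Two different implementations satisfy the same idealized specification:
print(satisfies_sqrt_spec(math.sqrt, 2.0))    # True
print(satisfies_sqrt_spec(newton_sqrt, 2.0))  # True
```

Any reasoning done in terms of the specification holds for both implementations; only when efficiency or accuracy becomes a problem would we look inside.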

Just as it is unnecessary to learn machine language to study numerical algorithms, it is also unnecessary to learn machine language to understand non-numerical or data structure processes. We make a distinction between data structures and storage structures. Data structures are abstractions, independent of how they are implemented on a machine. Data structures are representations of information chosen to exhibit certain ordering and accessibility relationships between data items. Storage structures are particular implementations of the abstract ideas. Certainly we cannot ignore storage structures when we are deciding upon the data structures which will encode the algorithm, but the interesting aspects of the representation of information can be discussed at the level of data structures with no loss of generality. The mapping of data structures to storage structures is usually quite machine dependent and we are more interested in ideas than coding tricks. We will see that it is possible, and most beneficial, to structure our programs such that there is a very clean interface between the abstract algorithm and the chosen representation. That is, there will be a set of representation-manipulating programs to test, select or construct elements of the domain; and there will be a program encoding the algorithm. Changes of the representation only require changes to the programs which access the representation, not to the basic program. One important insight which should be cultivated in this process is the distinction between the concepts of function and algorithm. The idea of function is mathematical and is independent of any notion of computation; the meaning of "algorithm" is computational, the effect of an algorithm being to compute a function.
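The "clean interface" between algorithm and representation can be sketched as follows. This is my illustration, not the book's (and in Python rather than LISP): a small layer of representation-manipulating functions, plus a basic program written purely in terms of that layer, so that swapping the representation never touches the algorithm.

```python
# Representation-manipulating layer: the only code allowed to know
# that a queue is (in this particular representation) a Python list.
def make_queue():
    return []

def put(queue, item):
    queue.append(item)

def take(queue):
    return queue.pop(0)

def is_empty(queue):
    return len(queue) == 0

# The basic program: written entirely in terms of the interface above,
# so a change of representation requires no change here.
def drain(queue):
    items = []
    while not is_empty(queue):
        items.append(take(queue))
    return items

q = make_queue()
put(q, 1)
put(q, 2)
print(drain(q))  # [1, 2]
```

Replacing the list with, say, a pair of stacks would mean rewriting only the four accessor functions; drain, and anything else built on the interface, is untouched.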

This text is not meant to be a programming manual for LISP. A certain amount of time is spent giving insights into techniques for writing LISP functions. There are two reasons for this. First, the style of LISP programming is quite different from that of "normal" programming. LISP was one of the first languages to exploit the virtues of recursive programming and explore the power of procedure-valued variables. Second, we will spend a great deal of time discussing various levels of implementation of the language. LISP is an excellent medium for introducing standard techniques in data structure manipulation. Techniques for implementation of recursion, implementation of complex data structures, storage management, and symbol table manipulation are easily motivated in the context of language implementation. Many of these standard techniques first arose in the implementation of LISP. But it is pointless to attempt a discussion of implementation unless the reader has a thorough grasp of the language.
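The two stylistic points above, recursive programming and procedure-valued variables, can be sketched briefly. These are my own Python approximations of the LISP idioms the preface has in mind; mapcar here imitates LISP's MAPCAR.

```python
def length(xs):
    """Recursive definition: the length of the empty list is 0;
    otherwise it is 1 plus the length of the rest."""
    if not xs:
        return 0
    return 1 + length(xs[1:])

def mapcar(fn, xs):
    """A procedure passed as a value, in the style of LISP's MAPCAR:
    apply fn to each element, collecting the results."""
    if not xs:
        return []
    return [fn(xs[0])] + mapcar(fn, xs[1:])

print(length([1, 2, 3]))                    # 3
print(mapcar(lambda n: n * n, [1, 2, 3]))   # [1, 4, 9]
```

Both definitions follow the shape of the data (empty list, or first element plus rest) rather than an explicit loop over storage, which is the essence of the style being contrasted with "normal" programming.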

Granting the efficacy of our endeavor in abstraction, why study LISP? LISP is at least fifteen years old and many new languages have been proposed. [ 15 years my ass. This shit was old when this document was first created, who knows how long it has been since then? ] The difficulty is that the appropriate combination of these features is not present in any other language. LISP unifies and rationalizes many divergent formulations of language constructs. One might surmise that a new language, profiting from LISP's experience, would make a better pedagogical tool. A strong successor has not arrived, and toy languages are suspect for several reasons. The student may suspect that she is a subject in a not-too-clever experiment being performed upon her by her instructor. Having a backlog of fifteen years of experience and example programs should do much to alleviate this discomfort. The development of LISP also shows many of the mistakes that the original implementers and designers made. We will point out the flaws and pitfalls awaiting the unwary language designer.

We claim the more interesting aspects of LISP for students of computer science lie not in its features as a programming language, but in what it can show about the structure of computer science. There is a rapidly expanding body of knowledge unique to computer science, neither mathematical nor engineering per se. Much of this area is presented most clearly by studying LISP.

Again there are two ways to look at a high level language: as a mathematical formalism, and as a programming language. LISP is a better formalism than most of its mathematical rivals because there is sufficient organizational complexity present in LISP so as to make its implementation a realistic computer science task and not just an interesting mathematical curiosity. Much of the power of LISP lies in its simplicity. The data structures are rich enough to easily describe sophisticated algorithms but not so rich as to become obfuscatory. Most every aspect of the implementation of LISP and its translators has immediate implications to the implementation of other languages and to the design of programming languages in general.

We will describe language translators (interpreters and compilers) as LISP functions. The structure of these translators when exposed as LISP functions aids immensely in understanding the essential character of such translators. This is partly due to the simplicity of the language, but perhaps more due to our ability to go right to the essential translating algorithm without becoming bogged down in details of syntax.
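A flavor of "a translator written as a function" can be given in miniature. This is my own sketch, not the book's: an evaluator for prefix arithmetic expressions represented as nested tuples. Because the input is already structured, there is no syntax to get bogged down in; the function is only the essential dispatch on expression shape.

```python
import operator

OPS = {'+': operator.add, '-': operator.sub, '*': operator.mul}

def evaluate(expr, env):
    """Evaluate a prefix expression given an environment of variables."""
    if isinstance(expr, (int, float)):   # a constant evaluates to itself
        return expr
    if isinstance(expr, str):            # a variable is looked up in env
        return env[expr]
    op, *args = expr                     # an application: evaluate the
    return OPS[op](*(evaluate(a, env)    # arguments, then apply the operator
                     for a in args))

# ('+', 'x', ('*', 2, 3)) with x = 4 means 4 + (2 * 3)
print(evaluate(('+', 'x', ('*', 2, 3)), {'x': 4}))  # 10
```

Extending this shape with conditionals, function definition, and application is essentially how an interpreter for a LISP-like language is built up.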

LISP has very important implications in the field of programming language semantics, and is the dominant language in the closely related study of provability of properties of programs. The idea of proving properties of programs has been around for a very long time. Goldstine and von Neumann were aware of the practical benefits of such endeavors. J. McCarthy's work in LISP and the theory of computation sought to establish formalisms and rules of inference for reasoning about programs. However, the working programmers recognized debugging as the only tool with which to generate a "correct" program, though clearly the non-occurrence of bugs is no guarantee of correctness. Until very recently, techniques for establishing correctness of practical programs simply did not exist.

A recent set of events is beginning to change this.

Programs are becoming so large and complex that, even though we write in a high-level language, our intuitions are not sufficient to sustain us when we try to find bugs. We are literally being forced to look beyond debugging.

The formalisms are maturing. We know a lot more about how to write "structured programs"; we know how to design languages whose constructs are more amenable to proof techniques. And most importantly, the tools we need for expressing properties of programs are finally being developed.

The development of on-line techniques. The on-line system, with its sophisticated display editors, debuggers and file handlers, is the only reason that the traditional means of construction and modification of complex programs and systems has been able to survive this long. The interactive experience can now be adapted to program verifiers and synthesizers.

This view of the programming process blends well with the LISP philosophy. We will show that the most natural way to write LISP programs is "Structured" in the best sense of the word, being clean in control structure, concise by not attempting to do too much, and independent of a particular data representation.

Many of the existing techniques for establishing correctness originated in McCarthy's investigations of LISP; and some very recent work on mathematical models for programming languages is easily motivated from a discussion of LISP.

LISP is the starting point for those interested in Artificial Intelligence. [ so blatantly admitting that? Is this before the convention?! Just how old is this?! ] It is no longer the "Research" language, but has become the "Systems" language for A.I. Today's research languages are built on LISP, using LISP as a machine language.

Finally there are certain properties of LISP-like languages which make them the natural candidate for interactive program specification. In the chapter on implications of LISP we will characterize "LISP-like" and show how interactive methods can be developed.

This text is primarily designed for undergraduates and therefore an attempt is made to make it self-contained. There are basically five areas in which to partition topics: the mechanics of the language, the evaluation of expressions in LISP, the static structure of LISP, the dynamic structure of LISP, and the efficient representation of data structures and algorithms. Each area builds on the previous. Taken as a group these topics introduce much of what is interesting in computer science.

The first area develops the programming philosophy of LISP: the use of data structures in programming, the language primitives, recursion, and other control structures. The second area, involving a careful study of the meaning of evaluation in LISP, gives insights into other languages and into the general question of implementation. The next two areas are involved with implementation. The section on static structure deals with the basic organization of memory for a LISP machine -- be it hardware or simulated in software. The dynamics of LISP discusses the primitive control structures necessary for implementation of the LISP control structures and procedure calls. LISP compilers are discussed here. The final section relates our discussion of LISP and its implementation to the more traditional material of a data structures course. We discuss the problems of efficient representation of data structures. By this point the student should have a better understanding of the uses of data structures and should be motivated to examine these issues in greater depth.

A large collection of problems has been included. The reader is urged to do as many as possible. The problems are mostly non-trivial; they attempt to be realistic, introducing some new information which the readers should be able to discover themselves. There are also a few rather substantial projects. At least one should be attempted. There is a significant difference between being able to program small problems and being able to handle large projects. Small programming projects can be accomplished in spite of any admonitions about "good programming style". A large project is an effective demonstration of the need for elegant programming techniques. The text is large and covers much more than is recommended for a one-semester course. A typical one-semester course on data structures covers:

Chapter 1: all

Chapter 2: without 2.4, 2.5, and 2.10

Chapter 3: without the mathematical aspects of 3.13

Chapter 4: without 4.7, 4.8, and the mathematical aspects of 3.13

Chapter 5: without 5.8, 5.19, and 5.20

Chapter 6: without 6.8, and 6.12 through 6.20

Chapter 7: without 7.5, 7.6, and 7.10 through 7.14

Chapter 8 is also optional.

If a good interactive LISP implementation is available, then the pace can be quickened and the projects enlarged. However, if only a poor or mediocre implementation is accessible, then the course time is better spent without any actual programming, or the course should be augmented to include an implementation laboratory. LISP is an interactive language; attempts at other modes of operation do a disservice to both the language and the user.

Finally, a note on the structure of the text. The emphasis flows from the abstract to the specific, beginning with a description of the domain of LISP functions and the operations defined over that domain, and moving to a discussion of the details of efficient implementation of LISP-like languages. The practical-minded programmer might be put off by the "irrelevant" theory and the theoretical-minded mathematician might be put off by the "irrelevant" details of the implementation. If you lie somewhere between these two extremes, then welcome.

Notes

It seems like this is the real deal. I have sent some new search terms to the archivists to see if they can find anything related to LISP and artificial intelligence. It seems like this may have been the term they used to refer to thinking machines.

Looking at the next section, it may not be possible to directly include the full text. This terminal is somewhat limited, so I am not sure exactly how to encode all the pictographic information.


incoming references

INDEX - hierarchical view of every page as it relates to its host.