D1: Imperative programming is a programming paradigm that is carried
out by issuing statements to a computer, in order to change that
computer's internal state.
D2: Functional programming is a declarative programming paradigm. The
fundamental unit of functional programming is not the statement but
the 'expression'. Statements go through the process of 'execution',
whereas expressions go through the process of 'evaluation'. Evaluation
is the process of determining the value that an expression yields when
it is computed.
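As a minimal Common Lisp sketch of the distinction (the names here are purely illustrative):

```lisp
;; Imperative style: statements executed for their side effects,
;; mutating the machine's state step by step.
(defvar *total* 0)
(setf *total* (+ *total* 5))   ; the state of *total* is changed to 5

;; Functional style: an expression evaluated for the value it yields,
;; mutating nothing outside itself.
(defun add-five (n)
  (+ n 5))

(add-five 0)                   ; evaluates to 5
```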
D3: The Von Neumann model of computation is the organisation of a
computer that is best suited for imperative programming. A Von Neumann
machine is composed of registers, arithmetic/logic units (ALUs),
computer memory, and a control unit. The purpose of a Von Neumann
machine is to _execute human-programmed statements_.
D4: A LISP machine is a computer, whether physical or virtual, which
is able to compute the LISP programming language. Its structure is
singularly suited to evaluating LISP functions, expressions, and
objects. A LISP machine is a stack machine. A stack machine holds the
operands of its instructions on a stack instead of in registers.
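To illustrate, here is a toy stack evaluator sketched in Common Lisp. It is a hypothetical miniature, not the instruction set of any real LISP machine: operands are pushed onto a stack and operators consume them, rather than being shuffled between named registers.

```lisp
;; A toy stack machine: the program is a list of instructions.
;; Numbers are pushed onto the stack; operators pop their operands
;; from it and push the result back.
(defun run-stack-machine (program)
  (let ((stack '()))
    (dolist (instruction program (first stack))
      (if (numberp instruction)
          (push instruction stack)
          (let ((b (pop stack))
                (a (pop stack)))
            (push (funcall instruction a b) stack))))))

;; The expression (+ 1 (* 2 3)) as postfix instructions:
(run-stack-machine (list 1 2 3 '* '+))   ; => 7
```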
D5: Another term for 'Operating System' is 'Main Control Program'. I
believe this is a better label.
D6: LISP machines do not possess Central Processing Units (CPUs) like
Von Neumann machines. A LISP machine is built around the concept of a
'Symbolic Processing Unit' (SPU).
D7: LISP machines are designed to process data as symbols. Von Neumann
machines are in the business of processing data as numbers.
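A small Common Lisp sketch of what 'processing data as symbols' means in practice; the expression language handled here is deliberately tiny and illustrative:

```lisp
;; Symbolic differentiation over a tiny expression language: the
;; input is a tree of symbols, not a numeric value, and the result
;; is another symbolic tree. (Handles only sums and variables.)
(defun deriv (expr var)
  (cond ((numberp expr) 0)                        ; d/dx of a constant
        ((eq expr var) 1)                         ; d/dx of x itself
        ((and (consp expr) (eq (first expr) '+))  ; d/dx (u + v)
         (list '+ (deriv (second expr) var)
                  (deriv (third expr) var)))
        (t (error "Unhandled expression: ~a" expr))))

(deriv '(+ x 3) 'x)   ; => (+ 1 0)
```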
A1: All computer technology that finds its tradition in, and results
from, work directed by capitalism and the state is fundamentally
immoral.
A2: Despite its shortcomings, the movement for computer freedom, be it
hardware- or software-based, is intrinsically good.
A3: The project will exclude itself from any relationship to
capitalism and imperialism.
To put it bluntly: the people involved in carrying out this project
will not end up like Linus Torvalds, Moxie Marlinspike, and so on.
No-one on this project will ever work for any firm or other organ of
the military-ideological-industrial complex. No-one will go to work at
Google, Twitter, Facebook, IBM, Xerox, or any office of any political
state. None of the ideas we use will be willingly implemented for
capitalist industry.
The paradigm of the fundamental language in which a main control
program (MCP) is written determines the character of that MCP.
The Von Neumann model of computation owes its dominance to the great
prevalence of imperative programming languages. Indeed, Von Neumann
machines are virtually always programmed in imperative languages. They
are tied, either by logical necessity or by the inertia of a long and
rigid culture, to the history of the C programming language. They
cannot escape the following engineering pitfalls.
Imperative languages and Von Neumann machine computers cannot:
- Rid themselves of kernel-based MCPs; therefore they cannot
- Rid themselves of the need for context switching between superuser
mode and user mode. This is an enormous security attack surface, and
it is also the main reason why kernel-based MCPs constantly obsess
over how fast they can process data: context switching is an
intrinsically slow operation.
- Rid themselves of a model of MCP processes which conceives of them
as greedy, resource-hungry entities in constant selfish competition
with each other. The model of inter-process communication (IPC) that
kernel MCPs are supposed to carry out is famously captured by the
"Dining Philosophers Problem", which reifies MCP resource management
as a "tragedy of the commons" thought experiment. This model of IPC
was adapted from politically reactionary models of distributive
justice concerning _human_ consumption of physical resources.
- Rid themselves of a Von Neumann model of computation modelled after
the PDP-11. This is due to the overwhelming use of the C programming
language in today's UNIX kernel development. As Strandh observes in
the [CLOSOS Manifesto](http://metamodular.com/lispos.pdf), despite the
fact that our current microcomputers possess many orders of magnitude
more resources and computational power than the PDP-11, we still carry
out MCP process scheduling as if the resources of our computer systems
could not all be loaded into memory at once. Our kernel MCPs still
treat programs as if they will completely fill up the computer's
address space.
Currently, EPICURICS is designed around three small C libraries from
the University of Utah's [Operating System
Toolkit](https://www.cs.utah.edu/flux/oskit/), developed by the 'Flux'
research group in the late 1990s and early 2000s.
This decision was made to allow rapid prototyping of a LISP Main
Control Program. These C libraries must be replaced with LISP
equivalents as soon as possible.
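One plausible way to bridge those C libraries during prototyping is a foreign-function layer such as CFFI. The sketch below is hypothetical: the library and function names are placeholders for whatever the Operating System Toolkit actually exports, not its real API.

```lisp
;; Hypothetical sketch: wrapping an OSKit-style C routine from Lisp
;; via CFFI, so the MCP prototype can call it until a native LISP
;; replacement exists. Names are placeholders, not the real OSKit API.
(cffi:define-foreign-library oskit-memory
  (t (:default "liboskit_memory")))   ; placeholder shared-library name

(cffi:use-foreign-library oskit-memory)

;; Placeholder binding for a C function of the shape
;;   void *osenv_mem_alloc(unsigned long size);
(cffi:defcfun ("osenv_mem_alloc" osenv-mem-alloc) :pointer
  (size :unsigned-long))

;; The Lisp side then calls it like any ordinary function:
;; (osenv-mem-alloc 4096)
```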
The first kind of reasoning for this proposition is widely understood
and influential:
- Intel and AMD are both central pillars of the military-industrial
complex. To buy amd64 systems is to make the state fat and rich while
simultaneously immiserating yourself.
- Speculative execution in amd64 processors makes one vulnerable to
the _Spectre_ and _Meltdown_ families of security vulnerabilities.
There is no complete remedy for anyone running an amd64 CPU; the
available software mitigations are partial and carry a performance
cost.
I add the following argument in support of this proposition:
- The amd64 CPU architecture is a Von Neumann machine, and it is
therefore inadequate for the radical transition of MCP design from
imperative-paradigm UNIX to functional-paradigm LISP MCPs.
This proposition is a consequence of the validity of P2. Due to the
inadequacy of the imperative-paradigm amd64 CPU architecture, hardware
designed to run LISP must also be developed in order to rid ourselves
of C, UNIX, and the tradition of Intel x86 CPUs.
Luckily it is well-known how to construct hardware designed for
interpreting LISP. There exist many and various historical prototype
and production-quality hardware LISP machines, such as:
- Thomas Knight's _CONS_ (1970s)
- MIT's _CADR_ (1970s)
- Symbolics Corporation's _3600 series_ LISP machine workstations (1980s)
The term "computing power" continues to this day to be erroneously
taken to have a meaning which can only be expressed quantitatively.
In fact the true "power" of a computer has nothing to do with the
amount of instructions per second, or how long it takes to perform a
specific project or process. The actual "power" of a computer system
is only tangentially related to the amount of data it can transmit,
process, or record and access.
This quantitative aspect, upon which virtually all reverence for
computers rests, is merely a symptom of what truly gives computers
their "power". Computer power is, at its most fundamental level,
qualitative. The extent to which a computer can decrease the
unpleasant labour that humans must perform, and the degree to which it
can both lengthen and enhance the quality of our leisure time, is the
real sense of the concept "computing power".
Therefore, this project will not measure its success against the
criterion that it must compute as fast as possible.
The consequence of this deliberate rejection of calculation speed and
data-processing throughput is that this MCP and hardware ecosystem
will be orders of magnitude more "powerful" than the C and x86
ecosystem.
No-one from the Intel/UNIX/C computer philosophy tradition will have
even the slightest conception of how this is possible: how computers
with avowedly mediocre calculation speeds will be able to compete
with, or overtake, the raw calculation-power advantage of the current
imperative programming paradigm.
The functional programming paradigm of LISP allows one to modify the
MCP in real time, with instantaneous results. Virtually all of the
major imperative programming languages lack the ability to construct
MCPs that can be changed without stopping, recompiling or
reinstalling, and restarting their fundamental structure whenever the
user desires a change.
The reason for this is that imperative computing paradigms lack
"reflection". _Reflection_ is the ability of a computer system to
inspect, edit, and change its own structure. In addition to being
declarative, functional programming languages such as LISP are also
_reflective_.
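A minimal sketch of what this looks like inside a running Common Lisp image; the function names are invented for illustration:

```lisp
;; Definitions in a live Lisp image can be inspected and replaced on
;; the fly; no stop/recompile/restart cycle is required.
(defun scheduler-policy (process)
  (declare (ignore process))
  :round-robin)

(scheduler-policy 'editor)      ; => :ROUND-ROBIN

;; Redefine the running system without restarting it:
(defun scheduler-policy (process)
  (declare (ignore process))
  :fair-share)

(scheduler-policy 'editor)      ; => :FAIR-SHARE

;; Reflection: the image can be asked about its own structure.
(describe #'scheduler-policy)
(fboundp 'scheduler-policy)     ; => T
```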
This manifesto argues that programming languages must possess the
ability to perform reflection if they are to be considered acceptable
tools for computer systems development. This is because reflection is
a necessary component of assembling a sufficiently "powerful"
human-computer interface.
There is a reason why LISP was historically deployed to research and
explore computer-automated intelligence. Its ability to perform
reflection means it can treat the symbols in its programs either as
data to be manipulated or as code to be evaluated. This is a powerful
feature that, for complex reasons which cannot be elucidated here,
allows for much deeper and richer human-computer interfacing.
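A minimal sketch of that dual treatment in Common Lisp, where the very same list is handled first as inert data and then handed to the evaluator as code:

```lisp
;; The same symbolic structure treated as data...
(defvar *expr* '(+ 1 2 3))
(length *expr*)    ; => 4, inspected as a plain list
(first *expr*)     ; => +, which is just a symbol

;; ...and then as code, by handing it to the evaluator.
(eval *expr*)      ; => 6
```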
Any philosophy of sentience and consciousness which dismisses or
eliminates rationalism, non-reductive forms of materialism
(Epicureanism, Lucretianism), and even idealism (such as Hegel and
Kant) in favour of Democritean reductive or eliminativist materialism,
such as physicalism or mechanistic conceptions of evolutionary
biology, is toxic chauvinism and is to be excluded from the project.
There are to be no hierarchies within the organisational structures of
the EPICURICS project. It is a very worrying commonplace today that
the people who write the code are regarded as the ones "making" the
computer software, while the people who produce the project
documentation merely "talk about it".
The EPICURICS project regards this as fundamentally untrue.
This project will practise "literate coding", a form of software
development which mixes documentation and executable code into
documents that resemble discursive essays.
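A rough sketch of the intended flavour, assuming some WEB-style literate tooling; in a real literate source the prose would dominate, and the code would appear as illustrations inside the argument it belongs to:

```lisp
;;;; (Hypothetical fragment; the scheduler described does not exist.)
;;;;
;;;; The scheduler must never starve an interactive process, because
;;;; the whole point of the machine is to serve the person at the
;;;; keyboard. We therefore weight time spent waiting more heavily
;;;; than time already spent running.

(defun interactive-priority (waiting-time run-time)
  "Return a priority score that favours processes which have waited."
  (- (* 2 waiting-time) run-time))
```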
It is dismaying that web browsers are, today, practically their own
kind of Main Control Program. What is the point of writing another MCP
within MCPs like Windows and UNIX?
Web browsers are also the main tool that enormous military-industrial
and political capitalist state components use to control and destroy
innocent human life. The nefariousness of corporate social media
platforms in enthusiastically censoring the internet and assisting
reactionary political organisations in winning control over large
portions of the capitalist state is well known. I need not enumerate
their specific crimes, which are reason enough to have their empires
liquidated and dismantled.
Although not an immediate priority, the EPICURICS project will attempt
to dismantle the distinction between its MCP and "the internet", and
will explore federated, distributed, and other kinds of decentralised
computing systems.
Further, what we now take to be "hypertext", the web over HTTP, is a
very dim shadow of how deep and rich the internet could be. Ideally, a
great deal of effort will eventually be put into exploring how to
integrate the internet into EPICURICS without ever needing a discrete
web-browser system component.
It goes without saying that the EPICURICS project needs to put its
money where its mouth is, and demonstrate how its conception of
computer "power" is superior to the quantitative conception to which
the imperative programming paradigm subscribes.
This project predicts that if EPICURICS is able to function on cheap,
accessible hardware, then it has a much greater chance of being
adopted by regular computer users. In addition, because the EPICURICS
system is based on computer science that interfaces better with human
cognition, this project predicts that users will prefer the EPICURICS
MCP.