Newsgroups: comp.theory.self-org-sys
From: aa954@Freenet.carleton.ca (T. Dunlop)
Subject: E.D.O.E a Unified View of Computing
Message-ID: <1993Mar6.072411.3794@freenet.carleton.ca>
Organization: National Capital Freenet, Ottawa, Canada
Date: Sat, 6 Mar 1993 07:24:11 GMT
Lines: 316

E.D.O.E a Unified View of Computing
-----------------------------------
{ Event-Driven, Data-Oriented, Object-Environment }

There are only two ways to look at the world:

1 - Serially: algorithmic, one step at a time, synchronously.
2 - In parallel: multiprocessing, many things happening at once,
    asynchronously.

Most things in the world are neither all synchronous nor all
asynchronous, but a combination of both.  So why are computers thought
of in terms of algorithmic processing?  "Process": even the words we
use to talk about computers imply single steps, one after another, all
following along in synchronous fashion.  This gets in the way of a
simple, understandable description of problems.  The algorithmic rut we
seem to be in forces convoluted solutions onto what should be simple
problems, elegantly described.  Real-time programming is made very
difficult by this rut.  Even object-oriented programming, which seems
to be an attempt to get out of it, suffers because algorithms are being
used to describe an event system.  It would be much easier to do it the
other way around and use an event approach to describe algorithms.  Put
another way: algorithms are naturally a subset of events, but events
cannot be made a subset of algorithms.  There lies the problem, and the
reason there is a need for a Unified View of Computing.

The separation of the two views lies at the bottom of the problem and
at the heart of all computing.  To fix the problem and unify the view
of computing means rethinking from the ground up, starting at the
processor.  E.D.O.E as a Unified View of Computing can sit just above
any single-processor or multiprocessor design, and to the user, the
programmer, and the programs all such designs will appear no different
except in speed of execution.

The flip side of any computer is the interrupt system: an event system.
This is what must be merged with the algorithmic instruction pointer to
build a Unified View of Computing.  Whether we build such hardware in
the future is not the issue (though the CM-5 may be very close), for
the programming environment built upon the hardware can easily merge
the two concepts into a single programming model.  The difficulty in
doing so is small, but it means rethinking much of what computing is
today.

The heart of the problem is in how instruction pointers are handled and
viewed in today's approaches to computing.  If we rebuild from the
instruction pointer upward, we can unify the view of programming into a
single simple model.  The instruction pointer needs to be under the
programmer's control.  This requires methods to create, terminate,
suspend, and restart the instruction pointer(s) as required by the
description of the problem being solved, described as a program on the
system.  The four methods can be done as three; this removes the need
to know what type of object is being addressed or used by a method, and
allows their use in synchronous as well as asynchronous solutions to
problems.  I refer to the fact that objects can be either active or
passive; the same holds true for messages.

The three methods required to handle instruction pointers are:

1 - Start processing at a location.
2 - Terminate the instruction pointer, freeing it for reassignment.
3 - Suspend execution for restart at a location, but do not reassign.
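To make the three methods a little more concrete, here is one way the
interface might be sketched in C.  This is only an illustration under
my own assumptions; the names ip_t, ip_start, ip_terminate and
ip_suspend are invented for the sketch and are not taken from any
existing system:

    /* One instruction pointer: a single thread of execution inside
       the object environment.                                         */
    typedef struct ip ip_t;

    /* 1 - Start processing at a location.  A free IP is reused if one
           is available, otherwise a new one is created; restarting a
           quiescent IP looks the same to the caller.                  */
    ip_t *ip_start(void (*location)(void *context), void *context);

    /* 2 - Terminate the IP and place it on the free list, ready for
           reassignment.                                               */
    void ip_terminate(ip_t *ip);

    /* 3 - Suspend execution, recording where to restart, but keep the
           IP assigned to its object.                                  */
    void ip_suspend(ip_t *ip, void (*restart_at)(void *context));

Read this way, "create" and "restart" collapse into the one start
operation, which is how the original four methods become three.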
With the use of Environment Variables and nesting counts, the three
methods can easily determine what is required and free the programmer
from having to be overly specific about whether a new instruction
pointer should be used or an existing one reassigned.  This is
accomplished with a message object placed in the object environment, a
high-level view of the instruction pointer.  With a Message Object in
the object environment the two views of computing are merged and yet
retain their separate identities.  The object environment is the event
system, and the methods are the algorithmic sub-elements within.  The
programmer is freed to choose whatever approach is most appropriate for
the description of a problem.

---

The current object approach folds the methods upward into the place
where the message-addressing hierarchy needs to be.  It is a result of
using algorithmic approaches to deal with events, and it needs to be
unfolded, placing the methods and their associated algorithms below the
level of objects.  Algorithms are the building blocks from which
methods are constructed; algorithms are not part of the descriptive
hierarchy of type, group, class, category, ...  When I write a letter I
use a hierarchy to address it, from specific to general; a phone call
uses a hierarchy of codes to reach the correct destination, from
general to specific.

This in turn causes a minor problem with subroutines, function calls
and macro expansion.  The solution is to use a boundary that encloses
groups of objects and their associated methods into functional elements
such as devices, programs, event handlers, GUIs, ...  These boundaries
can then be viewed as larger self-contained black boxes.  The
algorithmic elements contained within are for the most part not useful
beyond the larger boundary.  To have the E.D.O.E approach work, this
folding has to be undone.  Methods cannot be thought of as belonging to
super-classes; they must be thought of as building blocks, resources to
be used within the system.  To this end, methods are the storehouse
boundary for IP algorithms.

Events can also be used to describe algorithmic solutions to problems,
so where is the dividing line?  At what point do instruction-pointer
algorithms end and event algorithms begin?  That point is at the
subroutine call in BASIC or the function call in 'C'.  Below this,
event-messages are not efficient; it is more practical to use the
instruction pointer's assembler instructions, which consume fewer
cycles than processing a message requires.

----

Objects as Black-Box Elements are what the event system is made of: a
message environment filled with active, passive and quiescent objects.
The instruction pointer rides along as part of the message; yes, all
messages are active in this respect.  The instruction pointer at the
destination is used to run the initial method, and what happens after
that depends upon the type of object and the type of message, active or
passive.  Sending a passive message to a passive object doesn't make a
great deal of sense, but there are uses for it, such as setting up the
context within which an instance is to be used before activating the
method to be applied to that instance of an object.
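As a small illustration of those active/passive combinations, here is
one way a delivery routine might look in C.  The names message_t,
object_t and dispatch, and the exact rules chosen, are my own
assumptions for the sketch, not definitions taken from the posting:

    #include <stddef.h>

    typedef enum { PASSIVE, ACTIVE } kind_t;

    typedef struct ip ip_t;                    /* as sketched earlier  */
    ip_t *ip_start(void (*location)(void *), void *context);

    typedef struct object {
        kind_t kind;                           /* active or passive    */
        void (*initial_method)(void *payload); /* run at destination   */
        void  *pending_context;                /* preset by passive msgs */
    } object_t;

    typedef struct message {
        kind_t    kind;      /* an active message carries its own IP   */
        object_t *destination;
        void     *payload;   /* instance data for the initial method   */
    } message_t;

    /* Deliver one message from the object environment.  What happens
       depends on the active/passive combination of message and object. */
    static void dispatch(message_t *m)
    {
        if (m->kind == ACTIVE) {
            /* The IP riding with the message simply continues into the
               destination's initial method (modelled here as a direct
               call on the current thread).                             */
            m->destination->initial_method(m->payload);
        } else if (m->destination->kind == ACTIVE) {
            /* Passive message to an active object: give the object a
               fresh (or woken) IP starting at its initial method.      */
            ip_start(m->destination->initial_method, m->payload);
        } else {
            /* Passive message to a passive object: rarely needed, but
               it can preset the context an instance will be used in
               before a later activation.                               */
            m->destination->pending_context = m->payload;
        }
    }

The point of the sketch is only that a single delivery routine can
cover all the combinations, so the sender does not need to know what
kind of thing it is talking to.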
----

Methods as Black-Box Elements are the storehouse of the algorithmic
instruction-pointer elements, the pseudo-assembler language of E.D.O.E.
With the inclusion of target assemblers as part of the E.D.O.E system,
any program could be quickly assembled and pre-bound into the
environment.  What to do with methods that are not associated with, do
not belong to, any object is a problem.  Placing them in a method
hierarchy, a separate class structure from that of the objects, is a
reasonable solution.  It eliminates another headache in computer
programming, understanding, and use.  It also allows each user to put
together a custom environment to meet their individual needs,
requirements, and equipment.  The user would be free to pick and choose
among programs, methods, and pieces written by themselves.  Late or
early binding of the objects would not matter, except for the penalty
in speed incurred in looking for objects not pre-bound.  I see this as
a virtual-memory system which banishes the problems of fetching,
accessing and binding objects on disk to a separate peripheral domain,
rather than being a part of the E.D.O.E environment.

----

I am in a partial void here.  I can read books and magazines and watch
T.V. programs, but I have no one to discuss my ideas with.  Hopefully
UseNet will be a reasonably safe place to meet and talk with others
whose knowledge equals or exceeds my own.  It is difficult to know if I
am going in the right direction, or how the world will accept my ideas,
theories, and designs.  I think E.D.O.E is the first step in building a
Thinking Reasoning Machine; that is where this need for a unified view
comes from, the reverse designing of ACI concepts.  There is just too
much, too many pieces.  I am not good at making assumptions about my
audience.  And this is a first posting, make that the first posting to
any UseNet news group.  I am trying to guess some of the questions and
answer them; dumb idea, I suppose?  I am trying not to lose everybody
with new concepts, or new definitions applied to old words...etc...

Sincerely
T. Dunlop
aa954@freenet.carleton.ca
March 01st, 1993
Ottawa, Ontario, Canada
Copyright T. Dunlop 1993


E.D.O.E objects
---------------

Environment Variables, Unbounded Global Variables - can be used to
share information with other objects or to set up the context
(environment) within which an instance is to function.  All Environment
Variables are owned by some object; they are the simplest way to pass
information among objects.  They can also be used to wake up a
quiescent object.

Report Objects - are the high-level structure of interrupts.  They send
events into the environment (report upon what goes on without): a disk
is inserted in a drive, the printer wants more attention, the user has
pressed a mouse button, or there are key strokes, ...etc...

Control Objects - complement the report objects by sending commands out
of the environment to the external hardware: read the track, turn on
the motor, place the printer in underline mode, ...etc...  (A small C
sketch of these object kinds appears after the IP method list below.)

IP Objects, Instruction Pointer - a machine-specific implementation,
and at the heart of building a Unified Environment!  The IP object is
just one part of a collection of pieces that together allow the event
and algorithmic aspects of the E.D.O.E environment to co-exist.

Note: If the instruction pointers are not handled properly here and in
the Message Objects, the whole environment will suffer.

There are three methods that apply to the instruction pointers:

1 - Start at a location, or restart a quiescent IP.
2 - Terminate the IP, freeing it for reassignment.
3 - Suspend processing but do not reassign; set up a restart location.
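For what it is worth, here is how I would render the object kinds
described above as C data structures.  Everything here (env_var_t,
report_object_t, control_object_t, post_event, and so on) is my own
illustrative naming, assuming the message_t type sketched earlier; it
is a sketch, not a definition:

    #include <stddef.h>

    typedef struct object  object_t;
    typedef struct message message_t;   /* as sketched earlier         */
    void post_event(message_t *m);      /* place a message into the
                                           object environment          */

    /* An Environment Variable: a named slot owned by some object, used
       to share information or to preset the context of an instance.   */
    typedef struct {
        const char *name;
        void       *value;
        object_t   *owner;
    } env_var_t;

    /* A Report Object: the high-level face of an interrupt.  Its
       handler is entered when the hardware interrupts, and it reports
       the event into the environment ("a key was pressed", ...).      */
    typedef struct {
        message_t *(*make_event)(int irq_data);
    } report_object_t;

    static void on_interrupt(report_object_t *r, int irq_data)
    {
        post_event(r->make_event(irq_data));
    }

    /* A Control Object: the complement of a report object, sending
       commands out of the environment to the device ("turn on the
       motor", "read the track", "underline mode on", ...).            */
    typedef struct {
        void (*command)(int opcode, void *args);
    } control_object_t;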
On a single-processor system the CPU's action needs to be time-sliced.
This is done with a hardware clock and a queue of active instruction
pointers.  The terminated IPs are kept in a list for reassignment, and
IPs in a suspended state are marked with a flag.  Leaving the quiescent
IPs in the queue means testing every IP; taking them out means the
bother of placing them back in each time the object they are assigned
to is activated.  Which is better depends upon the application: if you
have a large number of IPs and only a few are quiescent, then testing
each one wastes CPU cycles; but if you have a few IPs and most are
quiescent, then the overhead of removing them and placing them back
into the queue can be large.  Somewhere in between, either will work
equally well.  I think the removal of quiescent IPs from the queue is,
in the long run, the better choice.  It has the advantage of speed with
a large number of IPs, and the extra processor cycles freed between
events that may happen infrequently should offset the removal and
re-insertion time.  Yes, there are of course exceptions to this
reasoning somewhere, I expect.  (A rough sketch of such a queue appears
at the end of this section.)

Note: The quiescent IPs are usually assigned to report objects
associated with incoming interrupts as a means of speeding interrupt
processing and response, but for some object types they are equally
valuable within the environment.

Message Objects - are the high-level representation (program objects)
of the instruction pointers!  The problem is: where do events stop and
algorithms begin?  Events are flexible enough that algorithms can be
defined as a subset of events!  Using events to describe an algorithm
is possible, but not always very efficient because of the overhead of
message processing.  I think you would agree that placing a message
between each assembler instruction to pass the IP along is just not
very efficient.  What about test-and-branch, or a subroutine jump and
return?  How about a function call in 'C' or a GoSub in BASIC?  You see
the problem.  The function call is about the right point for one to end
and the other to take over; that acts as a reasonable guide.

What about synchronous and asynchronous operation?  The same ability to
use events to describe algorithms allows a simple way to synchronize
events, if the programmer has control of the message's IP type.  For
method A to synchronize with method B, method A can pass its IP to
method B, which will return it when method B has completed.  Sort of a
game of Tag, You're It.  Asynchronous operation comes in several forms
depending upon the type of object being activated: a passive object
needs an IP to become active, while an active object can be quiescent
and require waking first.  With careful design of the IP object and the
three methods associated with it, some of these problems, such as the
need to know just what type of object is being activated, can be
eliminated, while at the same time allowing the programmer to choose
whether the object is to be used synchronously or asynchronously.  This
involves environment flags and a nesting count that the three methods
can consult.  To me the problem here is not how to activate an object,
but ensuring that it will do what the programmer wants with the
instance it is asked to operate upon.  This is where the Environment
Variables are most helpful, allowing the context to be preset before
the object:instance is activated.

Note: The activating object may not be responsible for setting the
Environment Variables; they could be set by other objects, or even by
the user selecting options in a GUI interface.
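Since the trade-off matters, here is the rough C sketch referred to
above, following the removal strategy I prefer.  The structures, the
cooperative "run one slice" model, and the names clock_tick and wake
are all assumptions made for illustration:

    #include <stddef.h>

    typedef enum { RUNNABLE, QUIESCENT, TERMINATED } ip_state_t;

    typedef struct ip {
        struct ip  *next;
        ip_state_t  state;
        void      (*resume_at)(void *context);  /* restart location    */
        void       *context;
    } ip_t;

    typedef struct { ip_t *head, *tail; } ip_list_t;

    static void enqueue(ip_list_t *q, ip_t *ip)
    {
        ip->next = NULL;
        if (q->tail) q->tail->next = ip; else q->head = ip;
        q->tail = ip;
    }

    static ip_t *dequeue(ip_list_t *q)
    {
        ip_t *ip = q->head;
        if (ip) {
            q->head = ip->next;
            if (q->head == NULL) q->tail = NULL;
        }
        return ip;
    }

    static ip_list_t run_queue;   /* active IPs, time-sliced by clock  */
    static ip_list_t free_list;   /* terminated IPs, for reassignment  */

    /* Called on every hardware clock tick: round-robin the active IPs.
       Quiescent IPs are not in this queue, so none have to be tested. */
    static void clock_tick(void)
    {
        ip_t *ip = dequeue(&run_queue);
        if (ip == NULL)
            return;                      /* nothing active right now   */
        ip->resume_at(ip->context);      /* run one slice (cooperative) */
        if (ip->state == RUNNABLE)
            enqueue(&run_queue, ip);     /* still active: back in line */
        else if (ip->state == TERMINATED)
            enqueue(&free_list, ip);     /* free for reassignment      */
        /* QUIESCENT: left out of the queue entirely, held by its
           object, until an event arrives and wake() puts it back.     */
    }

    /* The object a quiescent IP was waiting on has been activated.    */
    static void wake(ip_t *ip)
    {
        ip->state = RUNNABLE;
        enqueue(&run_queue, ip);
    }

On one reading of the Tag, You're It game above, method A hands its
ip_t to method B and parks itself as QUIESCENT, and method B's last act
is to call wake() on it; that is an interpretation of the posting, not
something it spells out.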
I suppose there are plenty of other pieces I could describe, but to me
these are the important parts: the pieces that show how events and
algorithms can live happily together in a single unified environment,
allowing the user to choose the most natural approach to describing the
solution to a given problem.  A single, simple Unified View of
Computing!  That is what E.D.O.E is designed to be!!!

{ Event-Driven, Data-Oriented, Object-Environment } = E.D.O.E !

Sincerely
T. Dunlop, aa954@Freenet.carleton.ca
Ottawa, Ontario, Canada.
February 15th 1993.


The Law of Complexity
---------------------

1 - All systems are driven.  No driving force, no change.
    - It's just a matter of finding the 'Force' => lambda factor.

2 - Self-organisation allows the driving force to be more easily
    resisted, thus the system remains unchanged longer.
    - '2 is stronger than 1 and 3 is better than 2'.
    2a - Some systems can only passively resist the driving force.
    2b - Others can actively resist: use the force, convert it into
         something else, change or redirect it, or emit it outward
         again.

3 - The lambda point => when the driving force (lambda factor) is equal
    to the boundary to chaos.
    3a - Increasing the force drives the system into chaos.
         - The order (organisation) will collapse, break down.
         - (?) Is entropy a form of excess force?
    3b - Decreasing the force, the system becomes cyclic.
         - When the force is below that needed to drive the system into
           chaos, the system will store the force and cycle as the
           value exceeds the lambda point, at the edge of chaos.
    3c - Remove the driving force and the system will solidify or
         stagnate.
         - The system will cease to change.
         - Action driven by the lambda force factor will terminate.
         - And (?) entropy takes over.

4 - Entropy can be viewed as the removal, the bleeding off, of any
    stored lambda force in the system.

5 - Large systems can be viewed as a whole, with the lambda force being
    applied from without, externally; or as a collection of smaller
    interacting subsystems, where the lambda force is moved around
    within the larger whole and affects each of the parts separately.

----------------------------------------------------------------------
T. Dunlop  aa954@Freenet.carleton.ca  February 10, 1993.

I suppose you could call it a new 3rd law of thermodynamics, or a new
2nd law, but 'The Law of Complexity' seems to ring true.  Where are
you, Stuart Kauffman, Chris Langton, John Holland, Doyne Farmer, Murray
Gell-Mann, and George Cowan?  I hope you get a chance to read this.

Sincerely
T. Dunlop, Ottawa, Ontario, Canada
----------------------------------------------------------------------
--