It shouldn't need pointing out that positive feedback can be a terribly bad thing. Let's consider some very different cases.
Ploum recently made some interesting remarks about the sensory takeover of portable augmented reality. Let me say that I share Ploum's healthy skepticism of monopoly capitalism, but I'll go off on a tangent.
Wearing an audio and visual headset in the street will sooner or later become an acceptable norm. That would not be a problem if the technology were not completely controlled by these morbid monopolies that want to turn humans into users, into passive customers.
gemini://ploum.net/2023-06-07-privatisation-de-nos-sens.gmi
But yes, it still _would be_ a problem even if the users had full ownership of this technology. The scenario that comes to my mind is that people already use headphones to mask out an undesirable, harsh sonic environment. In response to this shielding, I suppose, sound signals and announcements have to be made louder to reach even those listening to music in their headsets. In any case, I can think of no equally plausible explanation for the absurdly loud next-stop announcements on regional trains, and occasionally on buses. And since the population's hearing gradually deteriorates with frequent exposure to loud sound levels, the signals that must penetrate the noise have to be made louder yet again.
The Acoustic Ecology movement (R. Murray Schafer and others) would ordain increased awareness of our sounding surroundings instead of trying to block them out. Which is not to say that a noisy city, dominated by the diffuse, persistent hum of motors and rubber tyres against roads, blurring into vehicles at every distance, is an ideal or particularly fascinating soundscape. Schafer would prefer a quieter backdrop in which delicate sounds are given a chance to be heard: a rural, pre-industrial soundscape.
I'm not sure where the acoustic ecology movement stands today, although there is a journal devoted to soundscape studies. The focus on the sonic environment as such may feel reductionist or limited, but it must be admitted that sound pollution has received only modest attention in ecology.
With the brutality of cities, it is no wonder citizens prefer to be attentionally absorbed into their little scroll-screens. The few trees there used to be are felled, and formerly static advertising is replaced by moving images trying to pull viewers' attention in the direction of commerce. Again there is competition: the contraptions for blocking out the surroundings must become more immersive, which leads to ever more eye-catching attempts by advertisers (as well as street vendors and others craving your attention), in a positive feedback loop.
The point has been made before, also here in the geminispace, that generative AI models will be vulnerable to being fed their own output. Now some researchers have diagnosed the problem.
https://arxiv.org/abs/2305.17493v2
Data poisoning, the insertion of "malicious data" into a training data set, can evidently throw a model out of kilter. But the phenomenon of model collapse is different: it happens when part of a model's output is used in training an updated version of the model. And that, of course, is the likely outcome as folks start using the chattery GPTs to write stuff on the web, which is then scraped as input for the very same large language models.
Intuitively, the thing modelled ("reality") can be thought of as a distribution. Generating derived material from the model corresponds to sampling the distribution. The tail is less likely to be sampled than the more common elements, so the new distribution will begin to deviate from reality. After a few iterations and a stroke of bad luck, you'll end up with a delta spike.
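To make this concrete, here is a toy sketch in Python (my own illustration, not the paper's experiment). "Reality" is a flat distribution over twenty outcomes; each new "generation" is estimated only from forty samples of the previous one. Once an outcome fails to be sampled, its probability stays zero forever, so the support can only shrink, and the process eventually absorbs into a delta spike.

```python
import random
from collections import Counter

rng = random.Random(1)
probs = [1 / 20] * 20  # "reality": 20 equally likely outcomes
generation = 0
# Re-estimate the distribution from its own samples until it collapses.
while sum(p > 0 for p in probs) > 1:
    draws = rng.choices(range(len(probs)), weights=probs, k=40)
    counts = Counter(draws)
    probs = [counts[i] / 40 for i in range(len(probs))]
    generation += 1
print(f"collapsed to a delta after {generation} generations")
```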
For a brief news release version:
https://www.lightbluetouchpaper.org/2023/06/06/will-gpt-models-choke-on-their-own-exhaust/
LLMs are like fire – a useful tool, but one that pollutes the environment.
Pollution indeed. However, these authors do not consider the environmental (resource & energy use) aspects of machine learning at all.
Despite all the generative AI skepticism, I would also like to mention a potentially useful application. A television scriptwriter trained a model on his own scripts and generated a few versions. Sure enough, he was confronted with some of his own clichés and over-used ideas and expressions, which he would henceforth be able to avoid. He could have reached the same insights by asking a colleague for critique, or by waiting until someone made a sarcastic parody.
Iterated probability distributions also seem promising for stochastic algorithmic composition. Flat distributions (of pitches, durations, dynamics, what have you) may sound bland or unprofiled, though some ear-catching events such as identical note repetitions might sneak in. Highly uneven distributions, in contrast, create a differentiation between common and rare events, which is crucial for building up expectations. One way to randomly create such an uneven distribution is to sample a few items from an equiprobable distribution (say, twice as many samples as there are elements), then calculate the relative frequencies of the sampled items and make these the new distribution. Iterating the process will bring the distribution to the certainty of a delta, though in an unpredictable way, as in the sketch below.
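A minimal sketch of this recipe, assuming twelve pitch classes and twice as many samples as elements:

```python
import random
from collections import Counter

def iterate_weights(weights, rng, oversample=2):
    """One step: draw 2*N items from the current distribution and
    return their relative frequencies as the new distribution."""
    n = len(weights)
    k = oversample * n
    draws = rng.choices(range(n), weights=weights, k=k)
    counts = Counter(draws)
    return [counts[i] / k for i in range(n)]

rng = random.Random(7)
weights = [1 / 12] * 12  # flat distribution over 12 pitch classes
for step in range(10):
    weights = iterate_weights(weights, rng)
    print(step, [round(w, 2) for w in weights])
```

Each printed row is a usable distribution for note choice, progressively more profiled, until it eventually freezes on a single pitch.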
Un coup de dés jamais n'abolira le hasard. (Mallarmé: a throw of the dice will never abolish chance.)
The opposite goal, stochastic music approaching serialism, can be achieved with James Tenney's statistical feedback algorithm. Randomly drawn notes may exhibit more repetition than people expect from randomness. In serialism, on the other hand, you use all twelve pitches before you reuse any of them (to simplify a bit). Suppose the twelve chromatic notes are initially equiprobable. Draw one note, set its probability to zero, and increase the probability of every other note (including previously drawn ones) so that the probabilities again sum to 1. Iterate. Recently played notes are thereby avoided, and every note eventually returns.
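Here is a minimal sketch of that scheme, assuming an additive increment for the weights (published variants of the algorithm differ in how quickly a note's probability grows back):

```python
import random

def tenney_sequence(steps=24, n_notes=12, increment=1.0, seed=None):
    # Statistical feedback: a drawn note's weight drops to zero and
    # all other weights grow, so recent notes are avoided and the
    # output drifts towards serialism's use-all-twelve behaviour.
    rng = random.Random(seed)
    weights = [1.0] * n_notes
    sequence = []
    for _ in range(steps):
        note = rng.choices(range(n_notes), weights=weights)[0]
        sequence.append(note)
        weights = [0.0 if i == note else w + increment
                   for i, w in enumerate(weights)]
    return sequence

print(tenney_sequence(seed=3))
```

Explicit normalization to a sum of 1 is omitted here, since random.choices only needs relative weights.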