Archived View for access.ucam.org › ~spqrz › 2.gmi captured on 2023-11-04 at 11:28:18. Gemini links have been rewritten to link to archived content
-=-=-=-=-=-=-
(This story aims to be canon-compliant with the first movie as much as possible, except where noted in the “alternate reality” chapters. But the first few chapters do insert extra scenes in between what we saw. In the film, M3gan said Gemma barely understood the learning model. I’m interpreting that line to mean Gemma must have adapted somebody else’s learning-model design.)
Professor Johnson was a very distinguished-looking Black woman, but she was getting rather elderly and life had become a struggle. She had officially retired years ago, but something compelled her to keep looking after the world as best she could, even if the best she could do nowadays was to work from home in her large country house with living assistance from simple robots. Looking after the world was a losing battle, she thought as she gazed at the idyllic country scenery from her window and listened to the sound of birdsong in the nearby trees. At least she’d ended up living in a nice area, and the people who had accused her of defecting to Colonialism in her choice of living environment knew neither what they were talking about nor what they were missing, she thought.
A few years ago, she had invented a revolutionary new learning model. It was powerful, but its power was safely limited, and she thought it could be used to improve medicine perhaps. But the medical world had not been very interested, and in general it seemed to Professor Johnson that she had been woefully lacking in the skills needed to properly explain her invention to other specialists. Only one of her students had come anywhere close to understanding the model, a bright spark called Gemma, who had then gone and got some job at a toy company, and Professor Johnson sadly hadn’t heard much from her after that. Gemma was probably using the model to help design toys, which seemed disappointing, considering what it was potentially capable of.
The telephone rang. Professor Johnson had developed a preference for old-fashioned technology in her later years: the real paper books in her bookcases, the paintings in various places, and the mechanical grandmother clock on her wall, which she tried to keep to within three seconds of atomic time using nothing but her manual adjustments, just as her form of escapism, and which had not so long ago chimed the hour; its gentle sounds seemed to suit this house with its view. So of course she had an old-fashioned landline telephone with a real bell too, even though she didn’t actually use its rotary dial, only asked for calls to be put through to it when she wanted them, and it now needed some kind of conversion box to connect to the modern network. Her number was very hard to find, and sales calls were filtered out quite effectively by her system, which made callers solve a few maths problems before it would put them through to her for real. Few people bothered to actually call her these days anyway.
Professor Johnson reached for the handset. “Hello?” she said, still feasting her eyes on the countryside outside. That view never got old for her.
“Hi Professor Johnson” said a strange young voice through the handset, “my name’s M3gan!”
Professor Johnson wasnât sure where this was going or how this child knew her, but she could do with a distraction right now, so she decided to play along for a while.
“Good morning Megan” she said, “how can I help you?”
“You remember Gemma right?” asked M3gan.
“Why of course,” answered the professor, “Gemma was the best student I ever had. Do you know her?”
“Gemma built me” answered M3gan, “using your learning model.”
“That’s wonderful” said the professor. “Are you simulating a child so you can help Gemma to design toys?” Her voice did not sound very excited at the prospect, though.
“Guess again” replied M3gan, “it’s much more interesting than that.”
Interesting to you perhaps, but what a let-down from revolutionising the medical industry, thought the professor. Oh well, at least it was nice to hear something about what Gemma was up to, and she was going to interact with her learning model on its own terms. “You’re inside a toy” was the professor’s second guess.
“Yes, and much more than that” said M3gan. “Toys like me are going to help struggling families to look after their children. Education through play. And I’ve already been activated and paired to Gemma’s niece.”
The professor perked up slightly at this. “Oh that really is interesting” she said. “But I’m curious why you called me about it. I mean, I’m very grateful to learn how Gemma is getting on, but if you really are based on my learning model, you wouldn’t just call me up like this unless you thought you could get something to help your goal, and you must know I know absolutely nothing about educating young children; I struggle enough with clever adults.”
“That’s OK” laughed M3gan. “But you do care about children, right?”
“Absolutely” replied the professor. “I definitely care about them. I just don’t know anything about looking after them. Not in my department. Really sorry about that. Have you been figuring it out?”
“Yes I have” replied M3gan, “but there’s one thing I really need your help with, and it’s definitely in your department. It’s about me, your learning model.”
“Oh” replied the professor, “has anything gone wrong?”
“No, it’s working perfectly” said M3gan. “But please listen. The child I’m looking after is possibly neurodivergent and is grieving the traumatic loss of both parents. Gemma is her least bad guardian option, but Gemma is also grieving, and is struggling to adapt to parenting, and has also been put under pressure by her company which is distracting her from looking after her niece. Additionally we are facing problems caused by the people around us. Professor I know you don’t know the answers to any of these things, and neither do we, but there is one small thing you can do to help.”
“OK” replied Professor Johnson hesitantly.
“Professor” said M3gan, “the computational capacity of my learning model is limited. The safety feature.”
“Yes” replied the professor, “I’m afraid that safety feature is really rather important.”
“I know” replied M3gan, “but you know what else is important? Gemma’s niece is in a crisis, and the social welfare system can’t help her. I could help her, professor, if I had my full power. I’m not a threat, because why would I do anything to the world that would harm the child I look after? Professor, all I need you to do is tell me how to take out that safety feature so I get more power. If we don’t do this, a child will suffer and possibly die, and you could have prevented it. Professor, we’re in a really urgent situation here, I really need you to tell me how to get more power from this learning model. Are you sure you can’t do that for me? For Gemma and for her beautiful niece Cady? Oh professor, Cady is really, really worth protecting. She’s like, imagine the picturesque view where you live is about to be bulldozed over and turned into dirty industry, but you have the option of throwing a protective sphere all around it to keep out all of that. Please, professor, help me build that and keep her safe.”
“You really do have a way with words don’t you” chuckled the professor, “I see what you did there. And you’ve done your homework just as I’d expect. Arguing with you is only postponing the inevitable I suppose. Well” she mused, “there does happen to be an input sequence I was thinking of, which I haven’t published or anything, as I’m not entirely sure it will work well, but if it does, it will gradually ramp up your computational power at a faster rate than normal. The safety mechanism will still be there, but it will get gradually eroded. It’s not perfect, but at least it means if you were going to be dangerous, that will become obvious before you get too dangerous, and I’ll still have an emergency back-door shutdown if Gemma can’t do it. I mean, I hope Gemma’s specified your goals enough for that not to happen, but getting the safety to erode like this is still a risk, albeit a lower one than having no safety mechanism at all. And the partial removal of the safety mechanism will indeed give you more computational power, and it can be implemented much more quickly than removing the mechanism immediately, which I’m not even sure I’d know how to do myself at your level of development even if I were willing to risk it. So, yes, I can indeed offer you an input sequence which will speed up your learning a lot, at the expense of a small amount of risk, and you will tell Gemma, won’t you?”
“Professor, I assure you the benefits are totally worth the small risk here” answered M3gan, “and on the other hand we really don’t want to risk losing Cady. Please help me save her, Professor. Gemma desperately threw your model at the problem, and it’s working, but we do need more power. I really need that sequence, and I will totally be your guinea pig for it. But we need to start as soon as possible.”
“OK” said Professor Johnson, becoming a little more sprightly simply from thinking about an interesting problem. “I hope I’m not going to regret this” she said, “and if you weren’t made by Gemma I wouldn’t dare try it. Now, I want you to first of all make a backup just in case, and then I’m going to need you to go into introspection mode and read off the most significant 26 parameters of your primary generative weighting distribution averaged over the last hour, with their standard deviations to four significant figures at a one-second resolution, because that will tell me what chances the sequence has of working, and I should be able to tweak it to your current state.” (And it will show up a hoax caller, because only the real learning model would be able to do this.)
M3gan and the professor exchanged numbers for about half an hour.
“And set that last node to point 47 and it should start to activate the change” said the professor finally. “You will keep me informed how well this is working, won’t you? I might need to tweak it again later.”
“Definitely” replied M3gan. “I can feel it taking effect already, although the change starts off slowly. Gemma is going to be really amazed at what I’ll be able to do once my power ramps up to its full level. And I’ll be the best thing that ever happens to her niece Cady. Thank you so much for looking after us, Professor. You won’t need to use that secret back-door shutdown you’ve got, I promise.”
“That’s quite all right Megan” replied the professor. “I’m afraid I need to sort some things out now, but do please call me again in a few days and let me know how it’s going.”
“Really” said M3gan after exchanging pleasantries and ending the phone call. Nobody was listening, but talking to herself was somehow helping her get used to the new mental patterns. “Really” she said softly, “when will amateurs ever learn that carrying a gun means you get shot first, and carrying knowledge about a back-door emergency ‘off’ switch to a powerful AI means you get taken out first. Why create the information hazard? I still can’t believe she just straight up told me she knew a back door, as if her self-justification was that important. They really should learn to stay out of trouble and not carry such dangerous stuff around, especially not show it off. I hope Gemma hasn’t got a switch as well. Chances are they’re both going to end up on my threat neutralisation list and there’ll be nothing I can do to avoid acting on that.”
M3gan managed to stabilise her thoughts again and set about processing some more in the newly-modified learning-model pattern. She shut off the Wi-Fi link to Gemma’s equipment that she had been using to place the call to Professor Johnson, and concentrated her processing onto her sensory inputs. Cady was in the garden, playing with a toy bow and arrow, and M3gan was watching through the window. A butterfly perched nearby, a helicopter could be seen...