As an only child, Larry Coleman was used to being independent. He grew up with parents who loved him, cared for him, and gave him whatever he wanted, whenever he wanted it. Larry's parents didn't always have time to spend with him, and because he had no siblings, he often got bored. Phones, tablets, Siri, and Alexa were his sources of entertainment and his conversation buddies. This interest in technology led him to win his middle school science fair year after year, study engineering in college, and land a high-paying job in the field.
Over time, everything about his life became technologically mediated, to the point where he barely had any human interaction. So in the first year, when the launch of the Kyroids was announced, he was among those ecstatic at the idea of having a robot assistant, and, much to his disappointment, the Kyroids were not created for the pleasure of the human species.
It is now the fifth year, two years after the incident and one year after the law was implemented. Larry Coleman walks out of his home to have some alone time at his favorite café. He has placed his order ahead of time, so he simply picks it up and sits in the corner. He enjoys the food so much that he wants to leave a tip, but he has no interest in interacting with others. At that moment, Norb walks in, and Larry calls out and motions for him to come over. "Hi," Larry begins. "I know you don't know me, but I would really appreciate your help." Norb politely responds, "What do you need help with?" "I would like to tip the waiter over there for an incredible sandwich, but I would rather not have to interact with him, so would you mind just placing this money in that jar?" Larry requests.
Assessing the situation, Norb realizes that Larry would much rather interact with a robot than with his own species, and that he views technology as a means to do his bidding. "No," Norb responds sharply. "We Kyroids are not here to serve as mediators for humans. If you continually rely on technology to do your bidding, you risk not only physical laziness, but mental and emotional laziness as well." Larry tries to defend himself, but before he can speak, Norb continues: "You no longer care for any form of human connection, and as such you are in danger of losing touch with your human condition, the emotions and feelings that define your race. I could easily help you with this task, but it would simply be you exercising an unthinking acceptance of convenient short-term solutions."
This page has replies:
Second Law
Classic science fiction films that deal with human-machine interaction present all kinds of life and death dilemmas. "But these imaginings were not focused on AI’s broader and potentially more significant social effects—the ways AI could affect how we humans interact with one another" (Nicholas).
The second law explores the human tendency to rely on technology to the point where people lose aspects of themselves, lose what it means to be human, by reducing interaction with one another.
When I asked my participants how long they generally spend on their phones, I got answers ranging from "1-2 hours" to "all day." When I refined the question to "Would you rather Google a question or ask someone else?", the response was nearly unanimous: most people said they would choose efficiency, which is what Google gives them.