The Singing Terminator? Never mind the Robots: here's the 3 Laws of Bio-controlled Media
Alexis Kirke, ICCMR, Plymouth University
Purpose
• The technology discussed in this talk is all in very early stages.
• However, this is the time to begin laying ethical groundwork.
• Also, because it is in early stages, and less obvious than the issue of military robotics, few people are aware of the potential ethical issues.
Overview
• BCMI-MiDAS
• ARTHUR
• Many Worlds – difference between music and film in terms of emotional power
• Other examples of Intelligent Directed Media
• Machine Learning, Big Data and Optimization
• Four Possible Ethical Issues
  – Cruel to be Kind
  – Optimized Media for Control
  – Optimized Media and Brain Science for Interrogation and Weaponry
  – Media Addiction
• The 3 laws of IDM
BCMI-MiDAS
• Project I initiated, now a 4.5-year government-funded project between Plymouth and Reading.
• Using live computer-generated music, machine learning and brain emotion detection to create music that induces a specific emotion.
• Computer learns how your brain reacts to music to allow spontaneous emotion manipulation under its control.
ARTHUR
• Affective Reactive Trajectories Harnessing Unit Response
• ARTHUR is a very simple simulated human who "reacts affectively" to certain musical features and structures. Developed by me as a test bed for BCMI-MiDAS.
• I built a version of ARTHUR based on my estimated emotional reaction to music features, like key and tempo.
• We attempted to build a simple version of BCMI-MiDAS to learn and control ARTHUR.
Actual Effect of Predicted Route on ARTHUR: Stressed to Relaxed
• Emotion model in chart is the Valence/Arousal model (x-axis / y-axis).
• When ARTHUR's state was set to "Stressed" and a set of generated tunes with the features predicted by the machine learning was played, its state moved to approximately "Relaxed", as predicted.
Actual Effect of Predicted Route on ARTHUR: Sad to Happy
• When ARTHUR's state was set to "Sad" and a set of generated tunes with the features predicted by the machine learning was played, its state moved to approximately "Happy", as predicted.
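As a rough illustration of the ARTHUR / BCMI-MiDAS idea, the Python sketch below models a listener as a point in valence/arousal space that drifts toward positions implied by two musical features (tempo and major/minor mode), and greedily chooses each generated tune's features to steer that point toward a target emotion. The response rules, weights and feature values here are invented for illustration and are not the actual ARTHUR model.

import itertools

class SimulatedListener:
    # Toy stand-in for an ARTHUR-like listener: holds a (valence, arousal) state in [-1, 1].
    def __init__(self, valence, arousal):
        self.state = (valence, arousal)

    def respond(self, tempo, major):
        # Invented response rule: major mode pulls valence up, fast tempo pulls arousal up.
        feature_v = 0.8 if major else -0.8
        feature_a = max(-1.0, min(1.0, (tempo - 100) / 60.0))
        v, a = self.state
        # The state moves 30% of the way toward the feature-implied point.
        self.state = (v + 0.3 * (feature_v - v), a + 0.3 * (feature_a - a))
        return self.state

def distance(p, q):
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def steer(listener, target, steps=6):
    # Greedy search: for each generated tune, pick the tempo/mode pair whose
    # simulated effect lands the listener closest to the target emotion.
    tempos, modes = (60, 100, 160), (False, True)
    for _ in range(steps):
        best = min(itertools.product(tempos, modes),
                   key=lambda f: distance(
                       SimulatedListener(*listener.state).respond(*f), target))
        listener.respond(*best)
        print("features", best, "-> state (%.2f, %.2f)" % listener.state)

# "Stressed" (negative valence, high arousal) steered toward "Relaxed" (positive valence, low arousal).
steer(SimulatedListener(valence=-0.7, arousal=0.7), target=(0.6, -0.5))

Run like this, the printed states step from the stressed start toward the relaxed target, in the same spirit as the trajectories described above.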
'many worlds'
• What about MULTI-modal art-forms targeting emotions? For example, films:
  – http://www.alexiskirke.com/#many-worlds
• Adding video to sound increases emotional impact tremendously.
• Consider music videos.
• Ratings in films are partly to do with vulnerable people being potentially harmed by intensity as well as content (such issues in CD releases are treated far less seriously).
Other Examples
• Other examples of Intelligent Directed Media (IDM) include:
  – Microsoft patent (Xbox One?) which watches if people are in the room for adverts.
  – NeuroFiction, adaptive engagement (Rajaniemi & Halliday 2013)
  – "Breaking Out" & Perceptive Radio (BBC 2012, BBC 2013)
  – Verizon patent: "targeted ads can be sent to TV viewers based on information collected from infrared cameras, microphones and other devices that capture the conversations and moods of the people watching."
Machine Learning, Big Data and Optimization
• The very best script and production teams have learned how to intensify emotional involvement and response across a large proportion of a target audience. But this is fixed once the media is released.
• If this could be modelled using Machine Learning and Big Data, and the models react live to the watcher / listener based on their biosignals and movement, emotional intensity and prescription could be ratcheted to a new level.
• Can be compared to the difference between a bomb being dropped and a computer auto-guided drone missile.
• Such precise manipulation of affective state could expand outside of entertainment, for example medical applications to relieve depression and anxiety.
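To make the "react live to the watcher / listener" step concrete, here is a minimal closed-loop sketch: a mocked biosignal reading stands in for the measured emotional response, and one media parameter is nudged toward a target response each cycle. read_biosignal() and render() below are hypothetical placeholders, not real sensor or playback APIs.

import random

def read_biosignal():
    # Hypothetical placeholder for heart-rate / skin-conductance / EEG input.
    return random.uniform(0.0, 1.0)

def render(intensity):
    # Hypothetical placeholder for adapting the soundtrack / edit in real time.
    print("rendering with intensity %.2f" % intensity)

def adapt(target_arousal=0.7, steps=20, gain=0.5):
    intensity = 0.5                          # current media parameter in [0, 1]
    for _ in range(steps):
        render(intensity)
        arousal = read_biosignal()           # live measurement of the viewer
        error = target_arousal - arousal     # distance from the desired response
        intensity = min(1.0, max(0.0, intensity + gain * error))
    return intensity

adapt()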
Four Possible Ethical Issues
• "Cruel to be Kind"
• Control
• Attack
• Addiction
“Cruel to be Kind”
Optimized Media for Control
• "Analyses revealed that individuals reported themselves as shopping longer when exposed to familiar music but actually shopped longer when exposed to unfamiliar music." (RF Yalch, ER Spangenberg, Journal of Business Research, 2000)
• "With the slow-tempo background music used in this study, patrons stayed longer, ate about the same amount of food, but consumed more alcoholic beverages" (The Influence of Background Music on the Behavior of Restaurant Patrons, R.E. Milliman, Journal of Consumer Research, 1986)
• Using music to change mood while driving (MD Zwaag, JH Janssen, et al., Ergonomics, 2013)
• Consider such music which could be generated and manipulated in real-time based on machine learning and bio-signals received involuntarily from people.
• Possibility of adding visual stimuli: lighting, screens on walls, etc.
Optimized Media and Brain Science for Interrogation and Weaponry
• Music for Torture and as a Weapon
  – Noriega's surrender, "War on Terror" interrogations (SG Cusick, Journal of the Society for American Music, 2008)
• Use of fMRI brain scans in interrogation (J.R. Simpson, J Am Acad Psychiatry Law, 2008).
• Remember ARTHUR could be moved in any direction.
• Consider biosignal-optimized pain generation using sound, plus the addition of VR glasses or immersive screens to the sounds, all optimized by machine learning and live bio-signal collection.
Media Addiction
• Measuring television addiction (CW Horvath, Journal of Broadcasting & Electronic Media, 2004)
• Issues for DSM-V: Internet addiction (J Block, American Journal of Psychiatry, 2008)
New Potential Types of Media Addiction
• Live optimized internet-browsing / television has the potential to be far more addictive.
• There are two levels of addiction:
  – "Alcohol": only 5% of IDM users become addicted
  – "Heroin": 100% of IDM users become addicted
• We will probably not be able to avoid the "alcohol" level emerging in society with IDM.
• We can avoid the "heroin" level by careful monitoring and ethical standards.
"Three Laws"
• Three laws of robotics:
  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Three Laws of IDM: Law 1
1. An IDM may not physically or mentally injure a human being or, through inaction, allow a human being to come to be physically or mentally harmed.
• This requires the setting of "harm boundaries" on the mental and emotional states it can take a human into (including abnormal states like epilepsy).
• Medical testing of IDMs would be needed – by some kind of body which combines the MHRA and the BBFC – both in terms of their physical effects (noise on ears, non-blinking of eyes) and their emotional ones.
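One way to picture Law 1's "harm boundaries" is as a configured safe region in valence/arousal space outside of which the IDM refuses to take a target state; the boundary numbers and function names in this sketch are invented for illustration.

# Safe region in valence/arousal space; boundary values invented for illustration.
SAFE_VALENCE = (-0.6, 0.9)   # no extreme negative states
SAFE_AROUSAL = (-0.9, 0.7)   # no extreme over-arousal (panic, seizure risk)

def within_harm_boundaries(valence, arousal):
    return (SAFE_VALENCE[0] <= valence <= SAFE_VALENCE[1]
            and SAFE_AROUSAL[0] <= arousal <= SAFE_AROUSAL[1])

def set_target_emotion(valence, arousal):
    # Refuse any target emotion outside the permitted harm boundaries.
    if not within_harm_boundaries(valence, arousal):
        raise ValueError("target emotion outside permitted harm boundaries")
    return (valence, arousal)

print(set_target_emotion(0.5, -0.3))    # a "relaxed" target: accepted
# set_target_emotion(-0.9, 0.9)         # a "terrified" target: would be rejected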
Three Laws of IDM: Law 2
2. An IDM must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
• An IDM must have a safe cut-off button which can be accessed quickly, and remotely. This button must trigger a cool-down process if pressed once, or a total cut-off if pressed multiple times.
• The IDM should also be able to detect a "desire to stop" in a human, differentiating between a rational desire to stop and an irrational compulsion to continue.
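A minimal sketch of the cut-off behaviour described in Law 2: one press of the button starts a cool-down, any further press forces a total cut-off. The class, state and method names here are hypothetical, not part of any existing IDM implementation.

class CutOffSwitch:
    # States for the safe cut-off button required by Law 2.
    RUNNING, COOL_DOWN, CUT_OFF = "running", "cool-down", "cut-off"

    def __init__(self):
        self.state = self.RUNNING

    def press(self):
        if self.state == self.RUNNING:
            self.state = self.COOL_DOWN   # first press: wind the session down gently
        else:
            self.state = self.CUT_OFF     # any further press: stop immediately
        return self.state

switch = CutOffSwitch()
print(switch.press())   # "cool-down"
print(switch.press())   # "cut-off"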
Three Laws of IDM: Law 3
3. IDMs which can override the first two laws require strict government regulation and should not be freely available. A special order is required to build or use one.
• The military and government are going to use these "non-two-laws" devices whether we want them to or not, plus there may be unexpected medical uses for them.
• Hence it actually strengthens the first two laws if they are allowed to be broken, but under strict legal regulation.
End
• cmr.soc.plymouth.ac.uk
• www.alexiskirke.com