
Moral Machine Monster ...

teaching human beings (how/not) to be?

– A bit more than just a compilation of teasers –

THE MORAL OF THE MORAL MACHINE[1]

But today’s society is characterized by achievement orientation, and consequently it adores people who are successful and happy and, in particular, it adores the young. It virtually ignores the value of all those who are otherwise, and in so doing blurs the decisive difference between being valuable in the sense of dignity and being valuable in the sense of usefulness. (Viktor E. Frankl, Man’s Search for Meaning)

THE CASE FOR[2] / AGAINST[3] SELF-DETERMINATION

Since the human capacity for continuous learning (adaptability) still surpasses that of machines, the cheaper route is usually taken: humans are adapted to machines rather than vice versa. Operational cost accounting and the organisation of health care affirm such a systemic approach, omitting adverse "social costs" (rising absenteeism rates, escalating mental disorders, degenerating social relationships), i.e. "fall-out costs" coinciding with a company's organisational and operational condition. ... Such common "effects of rationalisation" also accompany the introduction of information technology and artificial intelligence, specifically the tacit requirement to "deskill" the majority of staff ("human rudiments"), the monocultivation of communication and monotonisation of work, decaying "job contentment", and digital taskwork sparking increased stress levels (with corresponding psycho-social consequences). (Wilhelm Steinmuller, Automation of Intellectual Work)

THE BATTLE FOR OUR MIND[4] AND BRAIN[5]

What is Perry Rhodan about? Whatever the storyline, it boils down to a single subject matter: Knowledge is Power, Power is Technology, Technology is Perry Rhodan, the epitome of a World in which Man and Machine have established unison.

There is no struggle between Man and Machine, for they do not stand against each other. Dystopia? No problem. The individual's fear of a numb but superior technology; the individual's revolt against an abstracted community called Society; the individual asserting himself against an anonymous, amorphous singularity of state authority that represents the public and embodies the social equivalent of the machine – all of which obviate the intimacy of Love (Orwell) and the last resort of Death (Huxley).

For in Rhodan's cosmos there is neither ego nor subject capable of feeling terror in any psychologically fundamental sense. On the contrary, he embodies the role model of a human being who has become a rudimentary part, a Homo Minusculus, of a ubiquitous technical apparatus.

What kinds of know-how and power drive the hero's mission? Rhodan is neither sovereign over technology nor its servant. He is a descendant among other descendants, a functionary among countless other functionaries, inheriting and administering the legacy of technological superiority.

Eventually, applying abstract knowledge prompts a kind of digital purification technique: human beings are transposed and transformed into abstracta to be known, stored, kept available and managed as information.

Such technomorphoses strip the original human being of self-determination, incorporating the individual's personal data into something of a Cocoon – a cluster of digital sociation – that is up for sale and remains at everyone's disposal. (W. Brockjan, The Orwellian Decade and the Future of Science: Despotism's Wishful Thinking – Notes on the Political Reality of Science Fiction – Perry Rhodan, a case in point)

OUROBOROS

If billions of individuals consulted AI or had a HAL ... Deep Blue ... AlphaGo at their side, drawing on a panacea considered the epitome of global know-how, the result would simply be a gigantic self-referential, loop-like system, a societal Ouroboros unheard of.
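To make the loop image concrete, here is a minimal toy sketch (not part of the original text; the numbers, the averaging rule and all names are illustrative assumptions only). It simulates the consultations in miniature: every individual view is repeatedly blended with the answer of one shared oracle, and that oracle only knows the average of what it was fed, so the system closes on itself.

import random
import statistics

random.seed(42)

# a population of individual views, spread across an opinion axis
views = [random.uniform(-1.0, 1.0) for _ in range(1000)]
TRUST_IN_ORACLE = 0.5  # assumed weight given to the shared answer

for generation in range(10):
    # the oracle "knows" only the aggregate of what it has been fed
    oracle_answer = statistics.mean(views)
    # everyone blends their own view with the oracle's answer,
    # and the blended views become the oracle's next input
    views = [(1 - TRUST_IN_ORACLE) * v + TRUST_IN_ORACLE * oracle_answer for v in views]
    spread = statistics.pstdev(views)
    print(f"generation {generation}: spread of views = {spread:.4f}")

Run as written, the printed spread halves every generation: the loop feeds on its own output, which is the Ouroboros in numerical miniature.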

OXYMORON

Artificial Intelligence is an oxymoron. Simply thinking in terms of opposites reveals its phrasal non-sense. Please make your choices – ever heard of ...

Artificial Knowledge / Artificial Intuition ... Artificial Fact / Artificial Feeling ... Artificial Seriousness / Artificial Silliness ... Artificial Reason / Artificial Ridicule ... Artificial Belief / Artificial Disbelief ...

Artificial Care / Artificial Carelessness ... Artificial Agreement / Artificial Disagreement ... Artificial Solution / Artificial Dilution ... Artificial Curiosity / Artificial Apathy ... Artificial Emotion / Artificial Frigidity ... Artificial Belief / Artificial Atheism ... Artificial Responsibility / Artificial Irresponsibility ... Artificial Concern / Artificial Indifference ... Artificial Morality / Artificial Immorality ...

Artificial Aggression / Artificial Peacefulness ... Artificial Obstruction / Artificial Submissiveness ... Artificial Faith / Artificial Despair ... Artificial Love / Artificial Hate ... Artificial Happiness / Artificial Sadness ... Artificial Euphoria / Artificial Depression ... Artificial Intelligence / Artificial Idiocy etc.

APPENDIX ON ARTIFICIAL INTELLIGENCE (AI) COMMONLY IN USE

Academic Authenticity[6] – Acting[7] – Arts[8] – Big Data[9] – Corporate Decision Making and Taking[10] – Crime[11] – Education[12] – Film[13] – Financial Trading[14] – Gaming[15] – History[16] – Human Relations[17] – Justice[18] – Law[19] – Music[20] – News[21] – Politics[22] – Pseudoscience[23] – Reality[24] – Science[25] – Society[26] – Warfare[27]

APPENDIX ON DIGITAL MEDIA (DM) AND SOCIAL ENGINEERING (SE) IN USE

Media[28] – Priming[29] – Framing[30] – Confidence Swindling[31] – Experience Economy[32] – Market Intelligence[33] – Surveillance Capitalism and Democracy[34] – Credit Score Dating[35] – Employee Scoring[36] – Employment Experience[37] – Employee Scheduling[38] – Behavioural Scoring[39] – Socio Badges[40] – Online Interaction[41]

APPENDIX ON SCIENCES RELATING TO SOCIETAL PHENOMENOLOGY IN TERMS OF ORDER AND DISORDER

Social Network[42] – Self-Reference[43] – Synchronisation[44] – Chaos Theory[45] – Bifurcation[46]

Frame of Reference

FrankenstAIn (aD)

Granting Man Future-Proofness

Let's Try HighSciFi In A Mad World

We're On Apollo 13

It’s not a Game – Industry

Happy Human Plug-Ins

Homebodies Blinded by the (Screen) Light

The Spirits we Summoned

I'm sorry Dave


[1] Moral Machine (Wikiped) ––– Trolley problem (Wikiped) ––– The Moral Machine Experiment (core.ac.uk) ––– Moral Machines - Teaching Robots Right from Wrong (pageplace.de) ––– Moral Machine Monster : Internet Archive

[2] Self-determination (Wikiped)

[3] Beyond Freedom and Dignity (Wikiped) ––– B. F. Skinner - Beyond Freedom and Dignity (selfdefinition.org) ––– The Case Against B.F. Skinner (ehu.eus) ––– related: B. F. Skinner (Wikiped)

[4] Positioning: The Battle for Your Mind (yourhomeworksolutions.com)

[5] Nita Farahany - The Battle For Your Brain

[6] ChatGPT: Cardiff students admit using AI on essays - BBC News ––– Students are using ChatGPT to do their homework. Should schools ban AI tools, or embrace them? | Euronews ––– AI: Plagiarism - Artificial Intelligence: ChatGPT and Beyond - LibGuides at Hillsborough Community College (hccfl.edu) ––– AI makes plagiarism harder to detect, argue academics – in paper written by chatbot | Chatbots | The Guardian ––– Cheating after ChatGPT – will AI destroy academic integrity? - CapX ––– ChatGPT Is Making Universities Rethink Plagiarism | WIRED ––– Alarmed by A.I. Chatbots, Universities Start Revamping How They Teach - The New York Times (nytimes.com)

[7] How AI Could Change Acting Forever (thewrap.com) ––– AI voice acting, trained by actual voice actors, is on the rise - Rest of World

[8] FE News | The New Relationship Between AI and Human Creativity

[9] Cathy O'Neil | Weapons of Math Destruction - YouTube, Weapons of Math Destruction | Cathy O'Neil | Talks at Google - YouTube ––– The era of blind faith in big data must end | Cathy O'Neil - YouTube ––– Algorithms Are Taking Over The World: Christopher Steiner at TEDxOrangeCoast - YouTube ––– Safiya Noble | Challenging the Algorithms of Oppression - YouTube

[10] SAP-Digital-Boardroom-Implementation (sapanalytics.cloud)

[11] The AI Crime Wave: Police Warn of ChatGPT and Other AIs' Dark Side - Gizmochina ––– Lawyers face enhanced risk in "grim" AI-fuelled crime outlook - Legal Futures

[12] AI in Education Market is forecast to reach USD 53.68 Bn by 2032 (enterpriseappstoday.com) ––– AI Revolution: Rethinking Exams and Embracing the Future of Education - Innovation Origins ––– Generative AI: Education In The Age Of Innovation (forbes.com) ––– New ISBN Publication - Artificial Intelligence and Education - Education (coe.int) ––– What Is the Future of AI in Education, According to the Write Essay Professionals? - FAULT Magazine (fault-magazine.com)

[13] Future Films: Written and Directed by…AI? | Mind Matters

[14] Electronic trading platform (Wikiped) ––– Was software responsible for the financial crisis? | Technology | The Guardian ––– Can Algorithms Help Predict the Next Financial Crisis? (upenn.edu) ––– Spoofing (finance) (Wikiped) ––– Mirror trading (Wikiped) ––– Social trading (Wikiped) ––– Copy trading (Wikiped) ––– Bloomberg Unveils a GPT Finance-Focused AI Model (infoq.com)

[15] How ChatGPT AI makes video games - Polygon

[16] The slippery slope of using AI and deepfakes to bring history to life (theconversation.com) ––– Artificial Intelligence Will Make Forging Anything Entirely Too Easy | WIRED

[17] Pivot88 Discusses How AI Builds Human Relationships – Sourcing Journal ––– Sex, love and companionship ... with AI? Why human-machine relationships could go mainstream (theconversation.com) ––– What is artificial intelligence doing to human relationships? - Marketplace ––– Why People Are Confessing Their Love For AI Chatbots | Time ––– You’ve got to have heart: Computer scientist works to help AI comprehend human emotions - Purdue University News ––– Forging genuine customer experiences through AI - TechNative

[18] Algorithms and Justice (Chris Bavitz) - YouTube ––– A mathematician reveals how algorithms make the justice system worse (businessinsider.com) ––– Artificial intelligence algorithms in the criminal justice system (cnbc.com) ––– The danger of predictive algorithms in criminal justice | Hany Farid | TEDxAmoskeagMillyard - YouTube ––– Predictive policing (Wikiped) ––– Stop-and-frisk in New York City (Wikiped) ––– Racial profiling (Wikiped) ––– CompStat (Wikiped) ––– Punchlab - Under the Hood ––– PredPol – The Predictive Policing Company ––– AI predicts crime a week in advance with 90 per cent accuracy | New Scientist ––– How AI is transforming financial crime detection - FinTech Global ––– EU rights watchdog warns of bias in AI-based detection of crime, hate-speech | Reuters ––– The big idea: should robots take over fighting crime? | Books | The Guardian ––– "I can build this - but should I?" Welcome to the AI crime prediction debate (diginomica.com) ––– A.I., Brain Scans and Cameras: The Spread of Police Surveillance Tech - The New York Times (nytimes.com) ––– How AI became 'the autocrat’s new toolkit' [BOOK EXCERPT] - Breaking Defense ––– Know It All: AI And Police Surveillance : 1A : NPR ––– The Global Struggle Over AI Surveillance: Emerging Trends and Democratic Responses (ned.org) ––– Betacom intros AI surveillance system for private 5G in Industry 4.0 venues (rcrwireless.com) ––– Inside Safe City, Moscow’s AI Surveillance Dystopia | WIRED ––– New AI Surveillance Camera Tech Knows Who Your Friends Are | PetaPixel

[19] A.I. Is Doing Legal Work. But It Won’t Replace Lawyers, Yet. - The New York Times (nytimes.com) ––– Automated And Agile: The New Paradigm For Legal Service (forbes.com) ––– Biglaw Automation: Whose Job Goes First? - Above the Law ––– Over 100,000 Legal Roles to be Automated | Financial Times (ft.com) ––– Lawyers could be replaced by artificial intelligence (cnbc.com) ––– A.I. Is Coming for Lawyers, Again - The New York Times (nytimes.com) ––– How AI will revolutionize the practice of law (brookings.edu) ––– Lawtech entrepreneur offers to pay $1M for using AI in court | Cybernews ––– A Judge Just Used ChatGPT to Make a Court Decision (vice.com) ––– US court uses ChatGPT to deliver ruling | Mint (livemint.com) ––– Chinese courts allow AI to make rulings, charge people and carry out punishments (dailymail.co.uk) ––– Colombian judge says he used ChatGPT in ruling | ChatGPT | The Guardian

[20] Generative AI in Music Market | GANs segment has accounting for the largest global revenue of 41% in 2022. - EIN Presswire (einnews.com) ––– The Rise of AI-Generated Music: A Challenge to Human Creativity - Electronic Groove

[21] Guy Launches News Site That’s Completely Generated by AI (futurism.com) ––– 30 Best Bots for Marketers in 2023 (hubspot.com) ––– Headline Generator (title-generator.com) ––– Headline Generator (plot-generator.org.uk) ––– Plot Generator - Infinite story ideas based on your input - Aardgo (plot-generator.org.uk)

[22] Be very scared of AI + social media in politics - GZERO Media

[23] Filtering science from pseudoscience | Deccan Herald ––– List of Topics Characterized as Pseudoscience | Encyclopedia MDPI ––– What is pseudoscience? | Definition from TechTarget ––– R. Paul Wilson On: The Dangerous Rise Of Pseudoscience (casino.org) ––– ​Why the simulation hypothesis is pseudoscience - Big Think ––– AI Models, a Pseudoscience-Based Myth? | NewsClick ––– UK watchdog warns against AI for emotional analysis, dubs 'immature' biometrics a bias risk | TechCrunch ––– ‘Molding science to fit ideology’: 5 ways the Nazis leveraged pseudoscience to support fascism - Genetic Literacy Project ––– Emotion AI: A possible path to thought policing | VentureBeat ––– Emotion AI Analyzes Facial Expressions To Guess Future Attitudes - Dataconomy ––– AI that promotes objectivity in recruitment is ‘pseudoscience’, study finds (siliconrepublic.com) ––– Cambridge: AI recruitment tools “automated pseudoscience” (cosmosmagazine.com) ––– Meta Trained an AI on 48M Science Papers. It Was Shut Down After 2 Days - CNET

[24] Lyra: Slim Smart Glasses Aim to Combine Augmented Reality and AI (mixed-news.com)

[25] What ChatGPT and generative AI mean for science (nature.com) ––– How Will We Prevent AI-Based Forgery? (hbr.org)

[26] When Computers Became Dangerous, The Swedish Computer Discourse of the 1960s (Vulnerable Society): During the 1960s, Swedish society underwent a rapid and revolutionary computerisation process. Having been viewed as a harmless tool in the service of the engineering sciences during the first part of the decade, the computer became, during the second part, a symbol of the large-scale technology society and its downsides. When the controversy reached its peak in 1970, it was the threats to privacy that above all came into focus. This debate resulted in the adoption of the world's first data act in the early 1970s. This paper will study and analyse the Swedish computer discourse during the 1960s, with special focus on the establishment of the Data Act. The core issues are what factors of development and what main figures were instrumental to the changing approach to computer technology during the later part of the decade. ––– Computer unreliability and social vulnerability - ScienceDirect: Many have argued that industrial societies are becoming more technology-dependent and are thus more vulnerable to technology failures. Despite the pervasiveness of computer technology, little is known about computer failures, except perhaps that they are all too common. This article analyses the sources of computer unreliability and reviews the extent and cost of unreliable computers. Unlike previous writers, the authors argue that digital computers are inherently unreliable for two reasons: first, they are prone to total rather than partial failure; and second, their enormous complexity means that they can never be thoroughly tested before use. The authors then describe various institutional attempts to improve reliability and possible solutions proposed by computer scientists, but they conclude that as yet none is adequate. Accordingly, they recommend that computers should not be used in life-critical applications. ––– Human choice and computers : proceedings of the IFIP Conference on Human Choice and Computers, Vienna, April 1-5, 1974 (Book 1975) [WorldCat.org] ––– Human choice and computers, an ever more intimate relationship ––– ASSESSMENT OF SOCIAL VULNERABILITY (diva-portal.org): Climate change will cause long term effects on ecosystems and human systems. Different systems are however not equally susceptible to and have different possibilities of coping with these effects. A system's vulnerability refers to the degree to which the system can cope with changes and whether it is susceptible to it or not (Parry 2007). Vulnerability therefore depends on the exposure to climate change (the character, magnitude or rate of change or effect), the sensitivity and the adaptive capacity of the system. Still, all components and people in the system will not be affected equally and will have different vulnerabilities. (..) A total of ten scientific articles were chosen as a basis for this summary, both from the natural hazards field and the field of climate change research. The articles were chosen to show a broad range of approaches to study and view social vulnerability, be suitable and useful for a Swedish setting and also to be relevant in relation to the goals of the project in which the study was made. One article (Füssel 2007) serves to give a general orientation in the field and a meta-analytical perspective, while the other texts provide examples of recent frameworks developed for assessing vulnerability (Cutter et al. 2003, Cutter et al. 2008, Wilhelmi and Hayden 2010, Holand et al. 2011, Reid et al. 2009), whereas some texts discuss the use of social indicators (King and MacGregor 2000), seek to contextualize social vulnerability (Kuhlicke et al. 2011) or review recent findings on certain climate related risks (Oudin Åström et al. 2011, Rocklöv et al. 2011). In addition to the scientific literature in the field, Swedish tools designed by the research programme CLIMATOOLS for the specific purpose of assessing vulnerability have been included. ––– Wilhelm Steinmuller, Legal problems of computer networks: A methodological survey - ScienceDirect ––– IFIP Congress (1977-1989) - Rationalisation and Modellification; Complementary Implications of Information Technologies · OA.mg

[27] AI Weapons Will Cause Artificial Arms Race Between US and China - Bloomberg ––– Air Force Bugbot Nano Drone Technology - ASM International

[28] Media (communication) (Wikiped)

[29] psychological priming - YouTube

[30] The Framing Theory - YouTube ––– Framing (social sciences) (Wikiped)

[31] Sociological Analysis of Confidence Swindling (northwestern.edu)

[32] The Experience Economy (Wikiped) ––– Qualtrics (Wikiped) ––– Engagement marketing (Wikiped)

[33] Trusted Connections at the Moments that Matter the Most | Neustar (home.neustar)

[34] Shoshana Zuboff, Surveillance Capitalism and the Challenge of Collective Action (oru.se) ––– In a nutshell: Shoshana Zuboff: Surveillance Capitalism and Democracy - YouTube

[35] Credit Score Dating | Where Good Credit is Sexy!

[36] Can Your Resume Beat The Bots? How to Make It ATS-friendly | Glassdoor ––– Screening candidates - Automated application screening - Whaii ––– HRMS | Kronos

[37] Employee experience management (Wikiped) ––– Internal communications (Wikiped)

[38] Scheduling software (Wikiped) ––– The effects of 'clopening' on employees: What employers can do | HR Dive

[39] Big Five personality traits (Wikiped)

[40] Sociometric Badges - Information (mit.edu) ––– We Spent Two Weeks Wearing Employee Trackers: Here’s What We Learned (fastcompany.com)

[41] Cataphora Launches Digital Mirror Software to Reflect Users’ Online Interactions | Benzinga ––– Cataphora - Crunchbase Company Profile & Funding

[42] Social Network Diagram (segment).svg (Wikiped)

[43] Self-reference (Wikiped) ––– Diagonal lemma (Wikiped) ––– Recurrence (Wikiped) ––– Eternal return (Wikiped) ––– Ouroboros (Wikiped)

[44] Synchronization (Wikiped) ––– related: Mechanical resonance (Wikiped) ––– BBC The Trap "What Happened to Our Dreams of Freedom" 1 of 3 - YouTube, BBC The Trap "What Happened to Our Dreams of Freedom" 2 of 3 - YouTube, BBC The Trap What Happened to Our Dreams of Freedom 3 of 3 - YouTube ––– Dead Poets Society - The point of conformity... - YouTube ––– The Psychology of Obedience and The Virtue of Disobedience - YouTube ––– Herding (Wikiped)

[45] Chaos theory (Wikiped) ––– Butterfly effect (Wikiped)

[46] Bifurcation theory (Wikiped)
