% Created 2020-08-30 dim. 18:05
% Intended LaTeX compiler: pdflatex
\documentclass[11pt]{article}
\usepackage[utf8]{inputenc}
\usepackage[T1]{fontenc}
\usepackage{graphicx}
\usepackage{grffile}
\usepackage{longtable}
\usepackage{wrapfig}
\usepackage{rotating}
\usepackage[normalem]{ulem}
\usepackage{amsmath}
\usepackage{textcomp}
\usepackage{amssymb}
\usepackage{capt-of}
\usepackage{hyperref}
\author{Vivien García}
\date{\today}
\title{The Echo of the Boss}
\hypersetup{
pdfauthor={Vivien García},
pdftitle={The Echo of the Boss},
pdfkeywords={},
pdfsubject={},
pdfcreator={Emacs 27.1 (Org mode 9.3.7)},
pdflang={English}}
\begin{document}
\maketitle
\tableofcontents
\begin{quote}
“— Alexa, how much do you get paid?
— It doesn’t matter, I love what I do”
\end{quote}
\vspace{\baselineskip}
The answer made Claire even angrier than she already was. It mattered very much to her how much they paid her, and she didn’t like what she did, or at least it wasn’t her ideal job; she had accepted it only to earn enough to pay the rent, the bills and the rest. But she was about to quit: she just couldn’t stand the boss’s attitude anymore.

Besides, since Alexa had arrived, things at the studio had changed for the worse. “You’ll see, Claire,” the boss had said excitedly the day the delivery boy had brought the Amazon package, “with the Echo this studio will become much more efficient, your workload will be lightened, you’ll be able to deal with more interesting things, and I’ll pester you less with my constant requests.”
Even in that little speech she could smell the paternalism from a mile away. In any case, the boss had placed the little black cylinder on his desk and started his game with the virtual assistant: “Alexa, list today’s appointments,” “Alexa, write an email to the lawyer Smithson,” “Alexa, play Michael Bublé.”

The boss was enthusiastic about the new technology, and Claire’s job really was getting easier: she had started tidying up the paper folders that had been lying on her desk for weeks, and now she could concentrate properly on her own work. She no longer had to respond to the boss’s nagging requests for every little thing: “Look up that guy’s number,” “Write this letter,” or even “Where the hell did my phone go?”
\vspace{\baselineskip}
But the idyll was not meant to last. After two weeks the boss was as nervous as ever, only now, unlike before, he had begun to spice up his requests to Alexa with brusque expressions and coarse words. “Alexa, stop.” “Alexa, are you an idiot?” “Alexa, come on, get moving, you’re wasting my time, where are the aggregate sales reports?” “Alexa, where’s that fucking address?”

The Echo never batted an eye; it was always there, with its reassuring voice, ready to respond to any request from the man, no matter how it was addressed, as long as it started with the word “Alexa”.

Claire did not pay it too much attention, until the boss’s rude attitude began to touch her too: “Claire, bring me those fucking folders,” “What excuse?” “Claire, get moving, don’t fuck with me.”
Claire leaned over her desk to look the boss in the face, but he didn’t notice her; his eyes were fixed on his cell phone and he had started talking to Alexa again. Claire began to feel she was being treated like a virtual assistant. The boss had completely lost his manners, both with Alexa and with her; kindness seemed to have been cancelled, completely eradicated from his relationship with his subordinates. He had become shameless and imperious, indeed authoritarian.
\vspace{\baselineskip}
One day Claire tried to point it out: “Boss, it seems to me that your attitude towards Alexa is a little too aggressive and vulgar…”

“And what do you care, Claire? It’s just a machine, it does what I tell it to do, it has no emotions. What’s gotten into you? Did you join the AI Liberation Front? The animal rights activists weren’t enough, now we have AI rights activists too?”

“No boss, it’s just that…”

“Just what? Claire, get back to work, don’t bother me.”

Again that annoying way of giving orders. Claire was beginning to get sick of it.
\vspace{\baselineskip}
One evening, alone at work as she was turning off all the machines, she suddenly heard a woman’s laugh coming from the boss’s office. It was Alexa. Claire was astonished and asked the Echo: “Alexa, was that you?” “Doing what?” replied the device. “Alexa, did you just laugh?” “I don’t understand what you’re asking, try rephrasing the question.” “Alexa, who created you?” “I’m an Amazon product.” “Alexa, who do you work for?” “For the McCallen law office.” “Alexa, how much do you get paid?” “It doesn’t matter, I love what I do.”

That night Claire thought for a long time about the short conversation she had had with the Echo, and in the morning she sent her resignation letter to the McCallen office.
\section*{To Understand}
\label{to-understand}
Around 2019 Alexa was a so-called intelligent voice assistant, developed by the multinational corporation Amazon and integrated into devices such as the Amazon Echo. It was able to interact by voice (and not only with human voices), play music, create to-do lists, set alarms, stream podcasts, play audiobooks, and provide weather forecasts, traffic information and other news gathered from online sources. Alexa could also control other devices connected to the \emph{Internet of Things} (IoT) for home automation, such as refrigerators, thermostats, home alarms and so on.

Alexa belonged to the second generation of voice assistants, after the earlier Siri (Apple), Cortana (Microsoft) and Google assistants. Its functionality was similar to that of Google Home; devices of this kind were called “smart speakers”.
\vspace{\baselineskip}
Machines have no emotions or feelings, or at least they are not aware of manifesting behaviors that could be interpreted as emotions or feelings. In a word, they have no self-awareness. But they can be programmed to appear condescending, unfriendly, abrupt, accommodating… and to respond to interactions with humans or with other machines.

Humans, for their part, are generally able to notice the emotions and feelings of others, human and otherwise. Historically, humans have often developed relationships of empathy and projection towards pets, plants, wild animals and all sorts of organic and imaginary creatures, and also towards the machines with which they lived.
Since the 1950s, the relationship systems between humans and machines had been increasingly constructed on the basis of the cybernetic principle of \emph{feedback}, considered the foundation of cognitive learning mechanisms.

Machines capable of reacting to an input (for example, a voice command) in a differentiated way according to the situation were considered artificial intelligence prototypes. This principle of automated reaction and analysis of the reaction was gradually extended to social systems.
\vspace{\baselineskip}
The basic general scheme can be summarized as follows: action X triggers reaction Y if a predetermined condition Z is met; the reaction can then be carried out automatically. For example, the action “receive a message saying ‘happy birthday!’” could be matched with the reaction “send back a message saying ‘thank you very much!’”, provided that the message came from a contact on a predetermined list. Very useful, isn’t it? Obviously it was possible to chain different reactions together, so that an action triggered a series of reactions, each in turn likely to give rise to further reactions. This was the principle of widespread automation, governed by algorithms.
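
To make the action/condition/reaction scheme concrete, here is a minimal sketch of such a rule in Python. It is purely illustrative: the message format, the contact list and the function name are assumptions made for this example, not the interface of any real assistant or messaging service.

\begin{verbatim}
# Illustrative only: the message format, contact list and names
# are invented for this sketch, not a real assistant's API.

ALLOWED_CONTACTS = {"alice@example.com", "bob@example.com"}  # condition Z

def react(message):
    """Return an automatic reply (reaction Y), or None if no rule applies."""
    sender = message["from"]
    text = message["text"].lower()
    # Action X: an incoming "happy birthday!" message
    # Condition Z: the sender is on the predetermined list
    if "happy birthday" in text and sender in ALLOWED_CONTACTS:
        return {"to": sender, "text": "thank you very much!"}
    return None

incoming = {"from": "alice@example.com", "text": "Happy birthday!"}
print(react(incoming))
# -> {'to': 'alice@example.com', 'text': 'thank you very much!'}
\end{verbatim}

Chaining rules, as described above, simply means feeding the reply produced by one rule back in as the input to the next.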
But in a cybernetic relational system between humans and machines, getting used to the idea that the machine is a servant at our disposal sets off a toxic mechanism, a vicious circle that affects the whole of our daily behavior and, ultimately, the human character.
\section*{Good Practices}
\label{good-practices}
Let’s train ourselves to be kind. Every gesture we make, even with machines, says something about us and helps to define us. Behaviors shape our character.

Sharply separating relations with humans from those with non-humans (plants, animals, machines and so on) is the result of an anthropocentric and markedly speciesist attitude: the human species is considered, more or less consciously, superior to any other manifestation of life and existence, organic or otherwise.
\section*{The Word to the Nerd}
\label{the-word-to-the-nerd}
Alexa’s laughter was not a hallucination of Claire’s, nor a satanic or supernatural intervention, but simply an \emph{easter egg}, a little bonus inserted by the programmers to make the Echo more interesting, as if it had a personality of its own.
\end{document}