Why You Should Unsubscribe From Vsauce

CW: Psychological trauma, lack of consent, and racism, along with other forms of bigotry.

Pretty provocative title, right? Why the hell should anyone unsubscribe from Vsauce? I mean, he's a universal constant of wholesome pop-science!

Sadly, not quite.

A bit of preface is necessary before diving into an almost completely unknown problem with Michael Stevens. A lot of preface, actually.

The year is 1946: the BBC resumes television broadcasting after its wartime hiatus, Winston Churchill gives his famous "Iron Curtain" speech, and UNESCO is created.

And the Nuremberg Trials are taking place.

Various Nazi physicians were on trial for the unethical, nonconsensual experiments they had carried out on prisoners in concentration camps. This led to the creation of the famous Nuremberg Code, which demanded the consent of subjects along with a utilitarian weighing of the risks and benefits of the research. [1]

Expansions on research ethics soon followed in the Nuremberg Code's footsteps. The Declaration of Helsinki, for instance, adopted in 1964 (and revised several times over the years), built substantially on the Nuremberg Code, elaborating on its conditions of consent and mandating safer procedures for test subjects. [2]

Likewise, the Belmont Report expanded on these protections, having been written explicitly in response to a notoriously unethical study. [1]

This study was, of course, the Tuskegee Syphilis Study, begun in 1932 and rooted in racism against Black people. It followed 600 men for 40 years, 399 of whom had syphilis while the rest did not, with the aim of studying the progression of the disease. The problem was that the methods were extremely unethical: the researchers lied about giving the participants proper medical treatment, barred them from quitting the study, and never informed them of any of this in the first place. More than a hundred of the men died before the study was publicly condemned, and many others suffered severe health problems because of it. [3, 4]

Nor was this an uncommon practice in the earlier days of science; many more examples can be seen in the Stanford Prison Experiment, the Robbers Cave study, and, most notably and most relevant to the current discussion, the Milgram experiment.

The Milgram Experiment was a study conducted in 1961 by Stanley Milgram, a social psychologist and professor at Yale University. It was actually a set of 24 smaller experiments [5] falsely advertised as a study about memory. The volunteer was "the teacher," whose job was to read strings of words to another individual, "the learner." The learner, strapped into a chair in a separate room and hooked up to electrodes, had to repeat the exact string of words back, with any mistake earning a shock, starting at 15 volts and progressively going up to 450. [6]

The thing is, no one was actually being shocked.

The teacher was played prerecorded sounds of feigned agony by a confederate helping the study along. The end goal was to explain the actions of Nazi soldiers: to show that people would follow any direction given by an authority figure, no matter how extreme. [5] And while the study had numerous methodological issues, such as some participants seeing through the hoax, others not being given enough information, procedures not being followed consistently, and Milgram's own questionable interpretation of participants' actions, [5, 6] the primary concern here lies with its ethics.

Indeed, Milgram's study is seen as a textbook example of unethical research, due to the sheer psychological distress the participants went through in believing they were torturing an innocent man. As Cari Romm at The Atlantic writes,

"Some people, horrified at what they were being asked to do, stopped the experiment early, defying their supervisor's urging to go on; others continued up to 450 volts, even as the learner pled for mercy, yelled a warning about his heart condition -- and then fell alarmingly silent. In the most well-known variation of the experiment, a full 65 percent of people went all the way." [6]

It's pretty clear why this experiment remains notorious, and why it helped spur the push for a universal set of research ethics.

Which brings us back to Michael Stevens.

Stevens, as readers of this likely know, hosts the YouTube channel Vsauce, dedicated to videos explaining aspects of science and history to the general public. He managed to score his own YouTube Red show, "Mind Field," which takes a more documentary-style approach compared to the informal style of his main videos.

My specific concern lies with the first episode of the second season, "The Greater Good." The episode starts with Stevens introducing the main topic of ethics in research and how we weigh costs against benefits, using the famous Trolley Problem, a common thought experiment in moral philosophy. It puts you in control of a lever that switches the track a trolley is moving on. Five railroad workers stand directly in the trolley's path, and the collision will inevitably kill them; however, you can pull the lever to divert the trolley onto an alternate track. On that track, though, stands one railroad worker who would be killed instead.

The question then comes down to this: do you pull the lever and kill one man, or not pull it and kill five?

While there's an abundance of variations on it, Stevens never dives into them in the video, instead using it as an opportunity to talk with scientists and members of ethical review boards (including discussing the Milgram study) about a proposition of his own.

A real life version of the Trolley Problem.

Essentially, Stevens aimed to run an experiment showing what people would do in a real-life version of the Trolley Problem, where they believed they had to either flip a switch to save the lives of five (killing one) or do nothing and spare the one (letting five die). The primary motivation was how common the Trolley Problem has become: dozens upon dozens of studies have been published on it, yet most are criticized as not accurately reflecting real-world moral situations, with many undergraduate philosophy students laughing at the absurdity of the scenarios presented. [7]

So, with this in mind, Stevens aimed to address these shortcomings by staging a real-life Trolley Problem.

His "study" was conducted after contacting various professionals who gave a tentative go-ahead, one of whom, Dr. Greg Cason, a clinical psychologist, agreed to help with the experiment. The two then gathered a group of participants under the guise of a focus group, which let them collect psychiatric information and filter out anyone with a prior history of psychological trauma or mental illness, thereby (so it's claimed) minimizing risk. This allowed Cason to select seven people he believed could handle the study at hand.

To conduct it, Stevens and (presumably) a behind-the-scenes crew working with YouTube hired a freight train on an abandoned stretch of railroad, filming it running over the tracks corresponding to the unflipped and flipped positions of a switch. This footage served two purposes: to show participants how the switch worked, and to be overlaid with separate shots of actors playing railroad workers on the same tracks, creating the illusion that a train was about to hit them.

With this footage captured, the actual experiment could take place. Subjects believed they were signing up for a focus group and, because of the heat on the day chosen, were taken to a remote railway switching station to cool down and, of course, for the real study to begin. There, an actor with over 20 years of genuine experience working on railways kindly explained his job to them and showed them how he controls the tracks, most notably how he can divert a train from going straight ahead onto an entirely different track.

The actor then left each participant alone under the pretext of having business to take care of, leaving them to watch the controls while the prerecorded footage played as if it were a live feed. This is when they were placed in the situation of a real-life Trolley Problem.

Of course, the train never hit any of the workers. Just before it would have, and after the subject had been forced to make a decision, the screen cut to black with white text, coupled with an automated voice, reading "END OF TEST. EVERYONE IS SAFE." Stevens and Cason would then approach the subject and interview them about what happened and how they reacted. This was repeated seven times over, though five of the participants had their experiences cut together to save time; only two were featured at length, the latter of whom, Corey, will come up again later.

So, after my little spiel on the history of ethics in science, I'd imagine some people are already seeing red flags in Stevens's experiment. There are parallels to be drawn with prior experiments such as Milgram's, but first it's worth seeing how Stevens's study stacks up against the official U.S. Department of Health & Human Services human research guidelines.

For an institutional review board (or ethical review board) to approve a study, the following criteria have to be met:

"(a) In order to approve research covered by this policy the IRB shall determine that all of the following requirements are satisfied:

(1) Risks to subjects are minimized: (i) By using procedures which are consistent with sound research design and which do not unnecessarily expose subjects to risk, and (ii) whenever appropriate, by using procedures already being performed on the subjects for diagnostic or treatment purposes.

(2) Risks to subjects are reasonable in relation to anticipated benefits, if any, to subjects, and the importance of the knowledge that may reasonably be expected to result. In evaluating risks and benefits, the IRB should consider only those risks and benefits that may result from the research (as distinguished from risks and benefits of therapies subjects would receive even if not participating in the research). The IRB should not consider possible long-range effects of applying knowledge gained in the research (for example, the possible effects of the research on public policy) as among those research risks that fall within the purview of its responsibility.

(3) Selection of subjects is equitable. In making this assessment the IRB should take into account the purposes of the research and the setting in which the research will be conducted and should be particularly cognizant of the special problems of research involving vulnerable populations, such as children, prisoners, pregnant women, mentally disabled persons, or economically or educationally disadvantaged persons.

(4) Informed consent will be sought from each prospective subject or the subject's legally authorized representative, in accordance with, and to the extent required by 46.116.

(5) Informed consent will be appropriately documented, in accordance with, and to the extent required by 46.117.

(6) When appropriate, the research plan makes adequate provision for monitoring the data collected to ensure the safety of subjects.

(7) When appropriate, there are adequate provisions to protect the privacy of subjects and to maintain the confidentiality of data.

(b) When some or all of the subjects are likely to be vulnerable to coercion or undue influence, such as children, prisoners, pregnant women, mentally disabled persons, or economically or educationally disadvantaged persons, additional safeguards have been included in the study to protect the rights and welfare of these subjects." [8]

Consent is referenced a lot in this, and it's generally seen as validly given only under these circumstances:

"(a) Basic elements of informed consent. Except as provided in paragraph (c) or (d) of this section, in seeking informed consent the following information shall be provided to each subject:

(1) A statement that the study involves research, an explanation of the purposes of the research and the expected duration of the subject's participation, a description of the procedures to be followed, and identification of any procedures which are experimental;

(2) A description of any reasonably foreseeable risks or discomforts to the subject;

(3) A description of any benefits to the subject or to others which may reasonably be expected from the research;

(4) A disclosure of appropriate alternative procedures or courses of treatment, if any, that might be advantageous to the subject;

(5) A statement describing the extent, if any, to which confidentiality of records identifying the subject will be maintained;

(6) For research involving more than minimal risk, an explanation as to whether any compensation and an explanation as to whether any medical treatments are available if injury occurs and, if so, what they consist of, or where further information may be obtained;

(7) An explanation of whom to contact for answers to pertinent questions about the research and research subjects' rights, and whom to contact in the event of a research-related injury to the subject; and

(8) A statement that participation is voluntary, refusal to participate will involve no penalty or loss of benefits to which the subject is otherwise entitled, and the subject may discontinue participation at any time without penalty or loss of benefits to which the subject is otherwise entitled.

(b) Additional elements of informed consent. When appropriate, one or more of the following elements of information shall also be provided to each subject:

(1) A statement that the particular treatment or procedure may involve risks to the subject (or to the embryo or fetus, if the subject is or may become pregnant) which are currently unforeseeable;

(2) Anticipated circumstances under which the subject's participation may be terminated by the investigator without regard to the subject's consent;

(3) Any additional costs to the subject that may result from participation in the research;

(4) The consequences of a subject's decision to withdraw from the research and procedures for orderly termination of participation by the subject;

(5) A statement that significant new findings developed during the course of the research which may relate to the subject's willingness to continue participation will be provided to the subject; and

(6) The approximate number of subjects involved in the study." [8]

And, of course, the requirement for informed consent can only be waived or altered under the following circumstances:

"(c) An IRB may approve a consent procedure which does not include, or which alters, some or all of the elements of informed consent set forth above, or waive the requirement to obtain informed consent provided the IRB finds and documents that:

(1) The research or demonstration project is to be conducted by or subject to the approval of state or local government officials and is designed to study, evaluate, or otherwise examine: (i) public benefit or service programs; (ii) procedures for obtaining benefits or services under those programs; (iii) possible changes in or alternatives to those programs or procedures; or (iv) possible changes in methods or levels of payment for benefits or services under those programs; and

(2) The research could not practicably be carried out without the waiver or alteration.

(d) An IRB may approve a consent procedure which does not include, or which alters, some or all of the elements of informed consent set forth in this section, or waive the requirements to obtain informed consent provided the IRB finds and documents that:

(1) The research involves no more than minimal risk to the subjects;

(2) The waiver or alteration will not adversely affect the rights and welfare of the subjects;

(3) The research could not practicably be carried out without the waiver or alteration; and

(4) Whenever appropriate, the subjects will be provided with additional pertinent information after participation." [8]

Clearly a lot of this is irrelevant to what Stevens did, but the general emphasis is on minimizing risk to participants and obtaining their informed consent, with the consent requirement waivable only in circumstances involving the absolute bare minimum of risk. So, the question becomes: was there risk?

There absolutely was, and the risk was incredibly severe; Stevens flat out acknowledges it. While he emphasizes the risk throughout the video, it's stated most explicitly at 3 minutes and 45 seconds in, where he says,

"But wait, there's a greater-good dilemma about doing an experiment on the greater good. By forcing people to truly believe they might kill someone, are we risking serious psychological damage to them? Yes, it might be beneficial to all of us to see what happens, but would those benefits be worth potential trauma to a few?"

It's unclear to me how this experiment could be for the "greater good." He argues in the video that it's ethical and worth the risk; at 8 minutes and 3 seconds in, he explains to a few members of an institutional review board that,

"Right, and so my hope is that the good this experiment does is in revealing the difference between instinct and philosophical reflection, and I think that there could be enormous benefit in learning the difference so that we can train people to act in the way they wish they would."

In other words, he believes the tradeoff of potentially permanent psychological damage is worth it because we could, at best, help some people manage themselves better in risky situations, something that would entail placing even more people in potentially traumatic situations!

It's important to remember that consent is sometimes seen as unnecessary when waiving it serves a larger good, such as in studying blood or DNA samples for potential health benefits [9] and in emergency research, where requiring consent could compromise the very lives the research aims to save, albeit with protections given to the patients involved. [10]

Indeed, this is exactly why the authors of the only peer-reviewed study on a real-life Trolley Problem used fake recordings of mice being shocked rather than humans dying. They explicitly described their conditions:

"Before participants entered the lab, an experimenter read a briefing about the general nature of the experiment to the participant. Each participant was informed that he or she would be required to make a real-life ethical decision. Because electroshocks were used and we assumed that most participants would be familiar with the Milgram studies, the briefing also included a one-sentence statement that the experiment was not about obedience and that they should feel free to make whichever decision they felt was most appropriate. We further told all participants that they could quit the study at any point and would still receive full credit for their participation." [11]

This is a far better setup than what Stevens did, and certainly a more ethical one. The sheer problems with what Stevens implemented are illustrated in an article by Brianna Renix and Nathan Robinson, who write that, in reacting to a real-life Trolley Problem,

"We would panic, do something rashly, and then watch in horror as one or more persons died a gruesome death before our eyes. We would probably end up with PTSD. Whatever we had ended up doing in the moment, we would probably feel guilty about for the rest of our lives: even if we had somehow miraculously managed to comply with a consistent set of consequentialist ethics, this would bring us little comfort." [20]

What they say isn't without empirical basis either: post-traumatic stress disorder can arise from combat and even from the unexpected loss of a loved one. [12] Killing, in particular, is notoriously traumatic, leaving people damaged for life. [13] You can find an old Reddit thread filled with anecdotes from people who witnessed others die and were scarred by it; while it isn't scientific, it helps characterize what's at stake.

To Stevens's credit, he did attempt to vet the ethics of his experiment, consulting a neuroscientist, Dr. Aaron Blaisdell, who gave tentative support for the idea while emphasizing the risk of harm, as well as several members of the Pepperdine University institutional review board.

While Stevens claims he was in contact with the university's review board, that isn't quite the case. The board has a total of eight members, and he only spoke with three of them (presumably they have since left the board): Dr. David Levy, Dr. Natasha Thapar-Olmos, and Dr. Judy Ho. Ho specifically advised Stevens on what she would consider necessary for the study to be ethical: screening out people with a history of psychological trauma and having an on-site trauma therapist. There will be more on this in a moment.

Stevens did, however, manage to sway the three individual members with the speech quoted above, and he eventually went on to recruit Cason for the study at hand.

Now, the primary problem with his assessment of psychiatric history is that it's based on self-report rather than official medical records. Research shows that, while generally reliable, self-reports in medicine can nonetheless be quite inaccurate, particularly for precise measurements. [14] This raises an especially large concern for Stevens's methodology: with a sample size of only seven, there's a huge risk of the run-of-the-mill problems in the social sciences, such as sampling biases and large error rates, that make interpretation, especially in this case, difficult if not impossible. [15, 16] The biggest risk, however, is false reporting by participants: it would only take one participant giving inaccurate information about their medical history, mistakenly or otherwise, for the whole experiment to fall in on itself.
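To make the error-rate point concrete, here is a minimal sketch in Python of how wide the uncertainty is around any proportion measured on just seven people, using the standard Wilson score interval. The two-of-seven split below is a purely hypothetical figure chosen for illustration, not a number reported in the episode or in this article.

```python
import math

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score confidence interval for a binomial proportion (95% when z = 1.96)."""
    if n == 0:
        return (0.0, 1.0)
    p_hat = successes / n
    denom = 1 + z ** 2 / n
    centre = (p_hat + z ** 2 / (2 * n)) / denom
    half_width = z * math.sqrt(p_hat * (1 - p_hat) / n + z ** 2 / (4 * n ** 2)) / denom
    return (max(0.0, centre - half_width), min(1.0, centre + half_width))

# Hypothetical split, for illustration only: suppose 2 of the 7 participants pulled the switch.
successes, n = 2, 7
low, high = wilson_interval(successes, n)
print(f"observed: {successes / n:.0%}, 95% CI: {low:.0%} to {high:.0%}")
# prints roughly: observed: 29%, 95% CI: 8% to 64%
```

An interval stretching from roughly 8% to 64% is consistent with almost any claim about how the wider population would behave, which is the error-rate problem in miniature, before even touching sampling bias or self-report accuracy.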

The on-site trauma therapist is another concern. While I don't doubt Cason's qualifications, it's really just a lazy way to handle the participants' psychological well-being. Proper treatment for trauma takes an extensive amount of time and involves intense emotional sessions between therapist and patient; [17, 18] it's not something that, as we see Cason do with Corey, you can resolve by saying "there, there, you're strong" and having them shake hands with the actors. This isn't exactly a method that does no harm.

There's one more thing Stevens did, though it was incredibly brief and confusing. Towards the end of the video, at the 31 minute and 40 second mark, Stevens says,

"We've since followed up with our subjects, and all of them are doing well."

This is known as a long-term follow-up, a practice used for decades in fields such as psychiatry to see how patients are doing after an extended period of time. [19] The problem is that he never detailed how it was done or what was even looked for. Established long-term studies, such as the St. Jude Long-Term Follow-Up Study and Add Health, give detailed information about their methods: how long it has been since participants were last contacted, what strict measurement criteria are applied, and so on. We get none of this in Mind Field. Instead, all we receive is an "oh, they're fine," without any clarification of what that means. Are they fine according to a casual "how are you since the study?" question? Did they use psychiatric instruments and evaluations? We simply don't get an answer, and the way it's brushed off into a five-second mention is troubling.

While it should be clear by now that the ethics of the study are questionable, there are plenty of issues with the methodology as well, beyond the sample-size problems mentioned before, that obscure any potential "knowledge" that could be gained. For one, participants were never asked beforehand how they thought they would react to the hypothetical, and asking afterwards would be of little use: having just believed they lived through the real thing, and knowing they were being filmed, their answers would very likely be biased. Further, none of the quantitative instruments used within the field were employed; all we get is a heavily edited qualitative interview. This prevents comparability or even any sort of straightforward interpretation, unlike the earlier Trolley Problem study with the mice, which had its human participants answer various questionnaires. [11]

There's also a clear bias in interpretation, with Stevens and Cason inclined to view participants strictly in terms of whether they were deflecting responsibility, offering ad hoc justifications for freezing up. This leads to a problem reminiscent of 20th-century psychotherapy: ignoring the participants' own experiences and overriding them with your preferred interpretation. Stevens and Cason assumed this was a quite literal real-life Trolley Problem, when it simply wasn't; the participants weren't removed from what was happening. In their minds, the rail workers could easily have turned around, the foreman could easily have returned, someone somewhere could have done something; they had no idea of the full structure of the railroad system. This ties in nicely to the critique by Brianna Renix and Nathan Robinson mentioned earlier, that the Trolley Problem often ignores the structural and real-life factors that influence how and why people make the choices they do, presuming instead an unrealistically dichotomous choice. [20]

I'm not the only one who had these concerns. In an "Ask Me Anything" on Reddit, user "YKMR3000" said to Michael Stevens,

"What do you think of the following criticisms of the Trolley Problem episode? (Taken from YouTube comments):

This is definitely not ethical, because there's no way people wouldn't be mentally harmed by the test.

The sample size was too small to matter.

Screening out people with risks of PTSD compromises the data

It was obviously faked, because nobody swore, nobody ran out, and nobody realized that this was the trolley problem once the people walked onto the tracks.

This was obviously faked, because of all the editing mistakes (i.e. people looking at the camera and the switch being pulled down at the wrong times).

It's wrong to murder, regardless if you're saving more lives to do so, and you never address this.

The quick editing and dramatization doesn't portray the data in an accurate way.

It's obviously faked because no one would ever ACTUALLY let you do this."

While I don't share all of these criticisms, they are echoed often. In any case, this is how Stevens responded:

GREAT questions.

"1.) I believe it was ethical. Studying the mind is not an easy thing and should not be done lightly. We turned to the experts and, though they may be more strict about studies at their own institutions, we didn't find anyone who told us not to do it. I stand by my belief that the benefits here outweighed the costs. Everyone who participated left being very glad to have had the experience and to have helped us explore human behavior.

2.) Totally. Given our budget and time-constraints, our sample size was very small. So it functioned more as a trial study to see what variables matter, what issues we hadn't foreseen, and to help future studies and experiments be designed with better and better approaches.

3.) Screening out people DEFINITELY affected our results. Unfortunately, I wouldn't feel comfortable not screening. So when evaluating what we observed, it's important to consider how the population we studied might skew results. This is a major question for almost all psychology studies. Many, for example, are done at universities and disproportionately involve young adults. Science is all about pointing these issues out and working to correct for them or avoid them. Whatever it takes to keep reducing uncertainty.

4.) Not a single person we ran left the room or outrageously swore or admitted ever connecting what was going on to the famous thought-experiment. Perhaps they weren't very familiar with it? I was very surprised that no one ran out -- perhaps they all felt there wasn't time or were afraid to run? People often act quite differently in moments of crisis than they do in movies or in our own minds when we calmly consider what we'd do.

5.) Editing discontinuities are unfortunate, but some slipped by without me noticing. I'm disappointed that our editor(s) thought doing those things were appropriate because they were totally unnecessary. But trust me, every person you see acted on their own and only learned what was going on AFTER the study ended.

6.) That's a great point that it would have been fun to emphasize more. It's part of why the Trolley problem is, well, a problem. Is it murder-by-inaction to do nothing? Or, at least, might it cause some similar regret to murder?

7.) We're releasing uncut clips to show things as they happened. To make the events fit into our episode lengths, it's sometimes necessary to speed up the action, though I agree that this should be done minimally.

8.) YouTube and our lawyers and insurance providers and every expert we talked to allowed us to do this."

In other words, he simply acknowledges the methodological complaints and assumes they don't matter. He calls it a "trial study," even though no quantitative variables were ever identified, so it's unclear how it could reveal which "variables matter" or what the other issues are. And considering he had the backing of YouTube itself, while most scholars work with far smaller university funding and will never see a YouTube Red show, the budget excuse makes for a questionable defense.

What's most concerning, however, is his response about ethics: he simply dismissed the criticism and repeated what was said in the video, justifying the experiment in the name of "science." While I don't believe he set out with the explicit purpose of harming people, it is apparent that he holds a very questionable and dangerous view of how science should treat permanent risks to people; it's a negligent and harmful view. This was echoed by Redditor "IAmZelkar," who responded by saying,

"It is not up to you to decide whether the benefits outweigh the costs when the costs are other people's mental health.

I also fail to see the benefits of this experiment, even excluding the small sample size, how can you even draw a conclusion from this? You assumed everyone would say they would switch the track when presented with the theoretical trolley problem but that is not the case, and the subjects were never presented with the hypothetical, in order to compare. You also dismissed the subjects' legitimate excuses for not switching the tracks. Expecting someone to notice a train is coming is reasonable.

Finally, this is not the same case as the thought experiment. In this case, 5 people irresponsibly put themselves into a dangerous position (from which they could also get out by themselves) where they would get killed, and one person did not. Deciding to kill the one person out of danger is not as easy of a task as when everyone is tied up to the tracks against their will.

Whether you did this to satisfy your curiosity or just for views, don't try to pass it off as some 'greater good' thing, you have not helped move forward the study of human behaviour."

Stevens responded by saying,

"You're right that this is quite different from the thought-experiment. I believe that's what makes it so important.

The thought-experiment conveniently removes the messiness of real life (ARE the people there irresponsibly? Who are they? How is it that you KNOW what the lever does? Is the train REALLY out of control? Why is it that the people on the track can't get out of the way? etc.).

A complete understanding of our moral psychology requires studying not only all of these variables separately, but also studying the complications of real life. It helps us better prepare others, and better understand ourselves. Far from dismissing our participants' excuses, we learned a lot about moral reasoning by observing it in the messy world of real life where many more things affect how a situation is perceived.

This all must be done carefully. For us that meant lowering the risk even if we had to limit our population to participants who had been screened. It meant debriefing them to let them know the purpose and benefits of the study, and ending the experience before it got too frightening. It meant listening to them and following up with them. I'm very happy with those measures and hope that future studies can learn from and expand on what we found and how we did it -- always being careful to prioritize the participants' well-being.

For more on this, I recommend the upcoming Mind Field episode about Heroism in which I talk with Philip Zimbardo who ran the infamous Stanford Prison experiment.

As for surveying our participants about the hypothetical trolley problem before putting them in the switching station, that would have unfortunately affected how they reacted later in real life, so we had to instead compare their behavior to how other, similar, people have responded to the thought-experiment."

His points about the methodology amount to justifying it as "because knowledge, because we learned something" rather than actually admitting it was flawed. And his claim that asking beforehand would have "affected how they reacted later in real life" is a weak one: participants would be unlikely to remember a small moral question that could have been slipped into, say, the forms they filled out weeks before the experiment.

More important, however, is how blatantly he ignores the ethical rebuttal raised by the user. It certainly isn't Stevens's right to put other people's mental health on the line for science, let alone for a very poorly conducted "study," nor is it the right of anyone else involved. The fact that he defends this strikes me as awful.

But not many people seem to feel this way. As of the time I'm writing this, the video has over 5.7 million views, with a like/dislike ratio similar to that of most popular YouTube videos. There were, however, various people who shared my concerns.

On the VeryBadWizards subreddit, quoting Stevens's statement that he pursued an IRB, user "rolante" said,

"To translate: 'If you were a researcher affiliated with Pepperdine University our official answer would be: No Fucking Way. That said, you are just some random dude with a YouTube channel, so We Want To Know the Answer Too, We Think This Would Be Totally Awesome'."

Another user by the name of "PM_me_your_BaldEagle" said,

"I mean, this was cool and all, but ethically speaking, wtf? What kind of an agreement/waiver did these people need to sign and how is the show not opening itself up to potential lawsuits?

Still, it was awesome and reminiscent of some 60s and 70s (unethical by today's standards) experiments (e.g. Milgram)."

In replying to this, "nocapitalletter" said,

"i just watched this and am now searching the internet for others opinions, and i was def fascinated to find out how people would actually react, but the last guy made me think this test was a terrible idea."

Various other tentative criticisms can be seen in the thread, with the last participant shown, Corey, convincing people that the experiment was almost certainly unethical and leaving them hoping it had been staged.

This wasn't the only place such reactions could be found. On /r/Vsauce, a therapist expressed concern over the experiment; however, the replies there generally either defend Vsauce or express apathy towards the whole thing.

In general, though, there seems to be more concern about the experiment being fake (with some comments in the latter thread, of course, worrying about ethics) than about whether it was ethical, except on the Penny Arcade forums.

A forum thread on that site reaches a consensus, save for one person, that the experiment was unethical, with responses such as:

"There is no IRB in the world that would ever approve that 'experiment'.

Your friend is absolutely right. Without informed consent you just can't do something like that.

Like, working in the research field I'm almost physically recoiling because that is so holy shit unethical."

"Yeah, for starters there is no consent given by the test subjects. You are imposing harm and, even if you have a counselor, that is a treatment not an absolvement.

It's akin to shooting a man in the leg, but it's ok because you have a doctor standing by.

...

The longer answer is that this sort of experiment can cause lasting psychological harm to the subject. It's not just 'oh, whelp, I guess I chose to do nothing and murder five people/or/kill a man in cold blood.' That stuff haunts you for the rest of your life, even when revealed it was fake, because you didn't make the choice knowing it was fake. You made the choice knowing full well the damage it would cause and did it anyway, only to have that later corrected by the screen.

Having a psychologist walk in and go 'there there' isn't going to stop the self loathing or doubt that will now follow those you have tortured with this test. They will carry that for the rest of their lives and their counseling will only be there to help them deal with those emotions.

It is cruel, amoral, and unethical. There is a reason IRB boards exist (to ensure minimal harm occurs from these sorts of studies) and clearly vSauce or whatever decided that such oversight wasn't worth it in favor of shock value and ratings, regardless of the harm they caused. Maybe it was simply in ignorance, but probably not. The failings of the early 1930s-1960s with these sort of studies are widely known and I really can't see a situation where the trolly situation is used without the knowledge of the rest of the baggage that follows it."

"Is... all of Mind Field like that? I'm now scared to watch it. Anything I've ever seen exploring these kinds of morality issues has inflicted it onto a knowing audience. To choose an unknowing audience (believing they are doing research on trains, no less) is just... ugh! No one's gonna laugh that off. At the very least I would be fucking livid at the people conducting the experiment, and refuse to let my name, images, or video or anything be attached to it.

As for Michael... I am curious as to his thoughts on it. Who produced this? Was it the normal vSauce crew? Or was it more akin to, say, some other company hiring him for name recognition? Is there backlash out there?"

"That's pretty fucked! Also kind of useless. It's not really doing much that the problem as it is normally posed doesn't do in terms of illuminating moral philosophies or thought processes.

It's like mythbusters if instead of finding excuses to use ballistic gel they psychologically scarred people."

"Holy shit that's awful.

Imagine having to carry that around for the rest of your life. Especially if you were like, paralyzed by fear and froze.

Yep, it was all fake but also you're the kind of person who would let 5 people die out of fear."

"'Minimal risk' refers to the risk level typically encountered in daily life. That's in recognition that nothing is risk free, so the comparison is really between the elevation involved in participation over a very low ambient baseline. So, yeah, if someone were to try to run this as a research experiment I doubt there's an IRB in the world that would count involvement in a mock death as no more than minimal risk."

"I want to know what experts they talked to who said 'yeah sure, that seems totally fine'.

Either they weren't experts of any sort or he's just making that shit up."

There are many more like this that you can see for yourself, but the point stands: people in the thread agree that this was unethical. So why the lack of an outcry, as some in the thread themselves wonder?

This same thread gives us an answer. User "nowwhat" says,

"There should be more hullabaloo, but the fact he's doing it independently for 'entertainment' means nobody really has any authority over what he does. He probably also has people sign some kind of waiver before they do it, and most of his victims (hence the absence of informed consent) wouldn't realize that what he did was terribly wrong and unethical."

This explanation fits perfectly with the original poster of the thread: they had talked to their husband, someone who studied psychology at university, and his repulsion is what led to the thread's creation. Only after other people expressed the same repulsion did the poster say,

"That was my husbands reaction too. And now I feel bad because I guess... I didn't understand the context enough to be horrified... :/"

This lines up with my own experience too. When I first finished the video, I was disgusted, to the point of sending it to a good number of people I know, trying to spread the word about how one of my once-favorite YouTubers turned out to be a complete arse. The responses were incredibly mixed: those versed in the sciences and personally concerned about research ethics shared my outcry, while those without much knowledge of either, people I know personally and consider fantastic individuals, seemed confused or simply didn't care. Only after I walked them through the background and what ethical research entails did they see the problem with the video.

Who can blame them, after all? A reputable channel presents something as ethical, with numerous scientists seemingly approving it; how could it be problematic?

This sort of thinking is mirrored again on the Penny Arcade forums, with user "Tynnan" saying in response to another poster,

"Don't beat yourself up about your initial reaction to the episode. Research ethics is a topic the lay public receives zero useful training in, which makes it difficult for us to explain all the things that go into something like this without, well, a dedicated discussion thread (what constitutes informed consent, what are the possible sources of harm to the participant, when is it appropriate to mislead the participant, what constitutes a meaningful experiment, and so forth)."

Ultimately, at the heart of it lies a lack of knowledge about proper ethics in science, combined with the assurance from scientific authorities that it's completely ethical, that it's okay, that there's nothing wrong.

It's not as if the YouTube community doesn't care about ethics, either. The YouTuber "cr1tikal," now "penguinz0," made a video openly condemning YouTube's prank culture, which has unfortunately led to people dying. There have been plenty of other concerns too, such as the YouTube shooting, the Logan Paul fiasco, and so on. People aren't dismissing this out of a blind affinity for Michael Stevens; there's simply a lack of information on why it's problematic.

This, ultimately, reflects a larger problem with YouTube as a whole: a valuing of profit above all else, a culture driven by the hunger for views to feed the algorithms, so that creators can reach the site's spotlight and generate monetary gains for the platform and, most crucially, themselves. It's a culture that has legitimately harmed people, that puts people at risk, even people with no involvement in what's going on.

This isn't just something we need to unsubscribe from Vsauce over; it's something we need to push back against YouTube for.

[1] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3593469/

[2] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1884510/

[3] https://www.cdc.gov/tuskegee/timeline.htm

[4] https://www.history.com/news/the-infamous-40-year-tuskegee-study

[5] http://blogs.discovermagazine.com/crux/2013/10/02/the-shocking-truth-of-the-notorious-milgram-obedience-experiments/#.W0Ge7tJKiUk

[6] https://www.theatlantic.com/health/archive/2015/01/rethinking-one-of-psychologys-most-infamous-experiments/384913/

[7] https://www.theatlantic.com/health/archive/2014/07/what-if-one-of-the-most-popular-experiments-in-psychology-is-worthless/374931/

[8] https://www.hhs.gov/ohrp/regulations-and-policy/regulations/45-cfr-46/index.html

[9] https://www.npr.org/sections/health-shots/2017/01/18/510442240/scientists-neednt-get-a-patients-consent-to-study-blood-or-dna

[10] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4744424/

[11] http://journals.sagepub.com/doi/10.1177/0956797617752640

[12] https://www.nimh.nih.gov/health/topics/post-traumatic-stress-disorder-ptsd/index.shtml

[13] https://www.psychologytoday.com/us/blog/talking-about-trauma/201402/death-becomes-us-the-psychological-trauma-killing

[14] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2745402/

[15] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4296634/

[16] https://www.nature.com/articles/nrn3475

[17] https://cctasi.northwestern.edu/family/trauma-focused-therapy/

[18] https://www.goodtherapy.org/learn-about-therapy/issues/ptsd/get-help

[19] https://www.ncbi.nlm.nih.gov/pubmed/1155202

[20] https://www.currentaffairs.org/2017/11/the-trolley-problem-will-tell-you-nothing-useful-about-morality
