Human Research
Humans in Research
When we do research in family science and related fields, we often (but not always) need to interact in some way with research “subjects,” or the people or things being studied. Often, because this is behavioral science and it is behaviors and attitudes and such that we’re interested in, we are studying living beings, usually humans (though occasionally animals serve our purposes). These individuals deserve to be protected as they participate in research. They have rights that must be recognized and honored.
The recognition of these rights and the need to protect them comes from an unfortunate history of not having great ethical protections in place and having sometimes disastrous results. We’ll talk about some real-life examples of how this has gone wrong, but first, consider a fictionalized scenario:
In 1998, actor Jim Carrey starred in the movie The Truman Show. At first glance, the film appears to depict a perfect research experiment. Just imagine the possibilities if we could control every aspect of a person’s life, from where that person lives, to where they work, their lifestyle, and whom they marry. Of course, keeping someone in a bubble of your creation and sitting back to watch how they fare would be highly unethical, not to mention illegal. However, the movie clearly inspires thoughts about the differences between research on humans and research on nonhumans. One of the most exciting, albeit challenging, aspects of conducting social science research is that most of our studies involve human subjects. The free will and human rights of the people we study will always have an impact on what we are able to research and how we are able to conduct that research.
Human Research Versus Nonhuman Research
While all research comes with its own set of ethical concerns, those associated with research conducted on human subjects vary dramatically from those of research conducted on nonliving entities. The US Department of Health and Human Services (USDHHS) defines a human subject as “a living individual about whom an investigator (whether professional or student) conducting research obtains (1) data through intervention or interaction with the individual, or (2) identifiable private information” (USDHHS, 1993, para. 1). Some researchers prefer to use the term “participants” as opposed to “subjects,” as it acknowledges the agency of the people who participate in the study. For our purposes, we will use the two terms interchangeably.
In some states, human subjects also include deceased individuals and human fetal materials. On the other hand, nonhuman research subjects are objects or entities that investigators manipulate or analyze in the process of conducting unobtrusive research projects. Nonhuman research subjects can include sources such as newspapers, historical documents, clinical notes, television shows, buildings, and even garbage. Unsurprisingly, research on human subjects is regulated much more heavily than research on nonhuman subjects. However, there are ethical considerations that all researchers must consider regardless of their research subject. We’ll discuss those considerations in addition to concerns that are unique to research on human subjects.
A Historical Look at Research on Humans
Research on humans hasn’t always been regulated in the way that it is today. The earliest documented cases of research using human subjects are of medical vaccination trials (Rothman, 1987). One such case took place in the late 1700s, when scientist Edward Jenner exposed an 8-year-old boy to smallpox in order to identify a vaccine for the devastating disease. Medical research on human subjects continued without much law or policy intervention until the end of World War II, when Nazi doctors and scientists were put on trial for conducting human experimentation, during the course of which they tortured and murdered many concentration camp inmates (Faden & Beauchamp, 1986). One little-known fact, as described by Faden and Beauchamp in their 1986 book, is that during the time the Nazis conducted their horrendous experiments, Germany had written regulations specifying that human subjects must clearly and willingly consent to their participation in medical research. Obviously, these regulations were completely disregarded by the Nazi experimenters, but the fact that they existed suggests that efforts to regulate the ethical conduct of research, while necessary, are certainly not sufficient for ensuring that human subjects’ rights will be honored. The trials conducted after the war in Nuremberg, Germany, resulted in the creation of the Nuremberg Code, a 10-point set of research principles designed to guide doctors and scientists who conduct research on human subjects. Today, the Nuremberg Code guides medical and other research conducted on human subjects, including social scientific research.
Medical scientists are not the only researchers who have conducted questionable research on humans. In the 1960s, psychologist Stanley Milgram (1974) conducted a series of experiments designed to understand obedience to authority, in which he tricked subjects into believing they were administering an electric shock to other subjects. The electric shocks were not real; however, some of Milgram’s research participants experienced extreme emotional distress after the experiment (Ogden, 2008). A reaction of emotional distress is understandable. The realization that you are willing to administer painful shocks to another human being, just because someone who looked authoritative told you to do so, might indeed be traumatizing. This can be true even after you learn that the shocks you administered were not real.
Around the same time that Milgram conducted his experiments, sociology graduate student Laud Humphreys (1970) was collecting data for his dissertation research on the tearoom trade, which was the practice of men engaging in anonymous sexual encounters in public restrooms. Humphreys wished to understand who these men were and why they participated in the trade. To conduct his research, Humphreys offered to serve as a “watch queen,” the person who watches for police and gets to watch the sexual encounters, in a local park restroom where the tearoom trade was known to occur. What Humphreys did not do was identify himself as a researcher to his subjects. Instead, he watched them for several months, getting to know them while learning more about the tearoom trade practice. And, without the knowledge of his research subjects, he would jot down their license plate numbers as they entered and exited the parking lot near the restroom.
After participating as a watch queen, Humphreys utilized the license plate numbers and his insider connections with the local motor vehicle registry to obtain the names and home addresses of his research subjects. Then, disguised as a public health researcher, Humphreys visited his subjects in their homes and interviewed them about their lives and their health. Humphreys’ research dispelled a good number of myths and stereotypes about the tearoom trade and its participants. He learned, for example, that over half of his subjects were married to women and many of them did not identify as gay or bisexual.
When Humphreys’ work became public, it was met with much controversy from his home university, fellow scientists, and the general public, as his study raised many concerns about the purpose and conduct of social science research. His work was so ethically problematic that the chancellor of his university even tried to have his degree revoked. In addition, Washington Post journalist Nicholas von Hoffman wrote the following warning about “sociological snoopers”:
We’re so preoccupied with defending our privacy against insurance investigators, dope sleuths, counterespionage men, divorce detectives and credit checkers, that we overlook the social scientists behind the hunting blinds who’re also peeping into what we thought were our most private and secret lives. But they are there, studying us, taking notes, getting to know us, as indifferent as everybody else to the feeling that to be a complete human involves having an aspect of ourselves that’s unknown (von Hoffman, 1970).
In the original version of his report, Humphreys defended the ethics of his actions. In 2008, years after Humphreys’ death, his book was reprinted with the addition of a retrospect on the ethical implications of his work. In his written reflections on his research and its resulting fallout, Humphreys maintained that his tearoom observations constituted ethical research on the grounds that those interactions occurred in public places. But Humphreys added that he would conduct the second part of his research differently. Rather than trace license numbers and interview unwitting tearoom participants in their homes under the guise of public health research, Humphreys instead would spend more time in the field and work to cultivate a pool of informants. Those informants would know that he was a researcher and would be able to fully consent to being interviewed. In the end, Humphreys concluded “there is no reason to believe that any research subjects have suffered because of my efforts, or that the resultant demystification of impersonal sex has harmed society” (Humphreys, 2008, p. 231).
With the increased regulation of social scientific research, it is unlikely that researchers would be permitted to conduct projects like Humphreys’ in today’s world. Some argue that Humphreys’ research was deceptive, put his subjects at risk of losing their families and their positions in society, and was therefore unethical (Warwick, 1973; Warwick, 1982). Others suggest that Humphreys’ research “did not violate any premise of either beneficence or the sociological interest in social justice” and that the benefits of Humphreys’ research, namely the dissolution of myths about the tearoom trade specifically and human sexual practice more generally, outweigh the potential risks associated with the work (Lenza, 2004, p. 23). What do you think, and why?
Another example of a disregard for human dignity is the Tuskegee Syphilis Experiment, conducted in Alabama from the 1930s to the 1970s. The goal of the study was to understand the natural progression of syphilis in human beings. Investigators working for the Public Health Service enrolled hundreds of poor African American men in the study, some of whom had been diagnosed with syphilis and others who had not. Even after effective syphilis treatment was identified in the 1940s, research participants were denied treatment so that researchers could continue to observe the progression of the disease. The study came to an end in 1972 after knowledge of the experiment became public. In 1997, President Clinton publicly apologized on behalf of the American people for the study (Reverby, 2009). These and other studies led to increasing public awareness and concern regarding research on human subjects. In 1974, the US Congress enacted the National Research Act, which created the National Commission for the Protection of Human Subjects in Biomedical and Behavioral Research. The commission produced The Belmont Report, a document outlining basic ethical principles for research on human subjects (National Commission for the Protection of Human Subjects in Biomedical and Behavioral Research, 1979). The National Research Act (1974) also required that all institutions receiving federal support establish institutional review boards (IRBs) to protect the rights of human research subjects. Since that time, many private research organizations that do not receive federal support have also established their own review boards to evaluate the ethics of the research that they conduct.
The Belmont Report is an especially important part of this history; the principles of Respect for Persons, Beneficence, and Justice guide ethical practice today. You’ll learn more about each of those shortly.
Because we do have protections in place today, you might think that serious violations of research ethics are a thing of the past. Unfortunately, that’s not at all the case. There are still major ethical violations happening all the time, though hopefully they are recognized and dealt with more swiftly than in the past thanks to more regulations and better training of scientists.
Consider an example that you yourself, or someone you know, may have been included in as a Facebook user: Without informing users of what they were doing, a group of researchers spent one week in January 2012 manipulating the algorithm that Facebook used to create News Feeds. Some randomly selected users were shown fewer positive posts than normal, and others were shown fewer negative posts than normal. As hypothesized, users who saw fewer positive things in turn posted fewer positive things (evidence of an emotional contagion effect, according to the researchers) and vice versa (Kramer et al., 2014). The researchers said they thought the study fell within Facebook’s Data Use Policy, which all users agree to when they create an account. However, some felt this study, in which emotions were directly and purposefully manipulated, was different from normal use, and that failing to inform participants or give them a chance to decline to participate was ethically wrong. As with many ethical quandaries, there is not a clear-cut answer here, and there has been continued discussion of whether this study was ethical or not (Meyer, 2014).
What do you think of this? Were you or a family member on Facebook during those years? How would you feel if you found out you were involved in a study without your knowledge, especially one that might have manipulated your emotions, mental health, or physical health?
Institutional Review Boards (IRBs)
There is a governing body at nearly every research institution that is required to oversee human research to be sure ethical guidelines are being followed and general human decency is being practiced: the Institutional Review Board, affectionately known to many as the IRB.
Institutional Review Boards are tasked with ensuring that the rights and welfare of human research subjects will be protected at all institutions, including universities, hospitals, nonprofit research institutions, and other organizations that receive federal support for research. IRBs typically consist of members from a variety of disciplines, such as sociology, economics, education, social work, and communications. Most IRBs also include representatives from the community in which they reside. For example, representatives from nearby prisons, hospitals, or treatment centers might sit on the IRBs of university campuses near them. The diversity of membership ensures that the complex ethical issues of human subjects research will be considered fully by a knowledgeable, experienced panel. Investigators conducting research on human subjects are required to submit proposals outlining their research plans to IRBs for review and approval prior to beginning their research. Even students who conduct research on human subjects must have their proposed work reviewed and approved by the IRB before beginning any research (though, on some campuses, some exceptions are made for classroom projects that will not be shared outside of the classroom).
The IRB has three levels of review, defined in federal regulations by the USDHHS. Exempt review is the lowest level of review. Exempt studies expose participants to the least potential for harm and often involve little participation by the human subjects. In social sciences, exempt studies often examine data that is publicly available or secondary data from another researcher that has been de-identified by the person who collected it. Expedited review is the middle level of review. Studies considered under expedited review do not have to go before the full board because they expose participants to minimal risk. However, the studies must be thoroughly reviewed by a member of the IRB. While there are many types of studies that qualify for expedited review, the most relevant to social scientists include the use of existing medical records, recordings (such as interviews) gathered for research purposes, and research on individual or group characteristics or behavior. Finally, the highest level of review is called a full board review. When researchers submit a proposal under full board review, the full board will meet, discuss any questions or concerns with the study, invite the researcher to answer questions and defend their proposal, and vote to approve the study or send it back for revision. Full board proposals pose greater than minimal risk to participants. They may also involve the participation of vulnerable populations, or people who need additional protection from the IRB. Vulnerable populations include pregnant women, prisoners, children, people with cognitive impairments, people with physical disabilities, employees, and students. While some of these populations can fall under expedited review in some cases, they will often require the full IRB’s approval to study.
It may surprise you to hear that IRBs are not always popular or appreciated by researchers. Sometimes, researchers are concerned that IRB members are well-versed in biomedical and experimental research but less familiar with the qualitative, open-ended nature of social science research. The members of IRBs often want specific details about your study. They may require you to describe aspects of your study, including but not limited to: the specific population you will be studying, observation methods, potential interview questions for participants, and any predictions you have about your findings. For a large-scale group participant observation study, for example, it could be extraordinarily frustrating or nearly impossible to provide this level of detail.
Oftentimes, social science researchers cannot study controversial topics or use certain data collection techniques due to ethical concerns of the IRB. When important social research is not permitted by review boards, researchers may become frustrated (and rightfully so). The solution is not to do away with review boards, which serve a necessary and important function. Instead, an effort should be made to educate IRB members about the importance of social science research methods and topics.
The Researcher’s Toolkit: How I Apply What I Teach
When I was preparing for my dissertation, I proposed a study involving a vulnerable population: adolescent sexual offenders. Because of the sensitive nature of the topic and the elevated risk involved, my study required a full board IRB review. As you have just read, this means the entire Institutional Review Board met to evaluate my proposal, ask questions, discuss ethical concerns, and ultimately vote on whether the study could move forward.
I was invited to attend the meeting to defend my proposal, which focused on interviewing adolescents about their home environments. While I expected concerns, especially given the population’s vulnerability, the board’s main issue was that I was not a trained therapist. They were understandably cautious about the ethical implications of asking minors sensitive questions without proper clinical training or safety protocols. Their concern reflected the principle of Beneficence, which requires that researchers work to minimize potential risks and maximize the benefits of their research, especially when working with vulnerable participants.
When I left that meeting, I genuinely believed my study might not move forward. However, what I learned was invaluable. The IRB wasn’t trying to stop my research—they were working to protect both my participants and me. Their feedback led me to revise my study design. Instead of interviewing the adolescents directly, I shifted to interviewing the therapists who worked with them. This allowed me to gain meaningful insight while maintaining ethical integrity and participant protection.
What students can take away from this experience is this: IRB review is not a barrier to research—it’s a safeguard that upholds ethical standards and ensures your study is sound, respectful, and responsible. Even when proposals are sent back for revision, the process helps you become a stronger, more thoughtful researcher. I’m grateful for what I learned through the IRB’s rigorous review and their commitment to ethical research.
CITI Training
As part of being approved by an institution’s IRB, researchers usually have to undergo ethics training. You should seek out the requirements of your institution to be sure. Many institutions require their professional and student researchers to be CITI-trained – that is, to complete the appropriate course through the Collaborative Institutional Training Initiative. Work with your professor and IRB to find out which course(s) you are required to take, how long the certification lasts, and how to sign up through your institution so that your profile is linked with your CITI status; if you later change institutions and the new institution also uses CITI for its ethics training, your records should come with you!
The Three Principles
If you complete a CITI course, you’ll get more in-depth information about each of the three principles of the Belmont Report. Let’s quickly review the three here, though, just in case you need a refresher or to prepare you for your CITI readings and quizzes!
Respect for Persons
The first principle of the report is Respect for Persons. This principle is mainly concerned with respecting the autonomy of individuals and protecting those with diminished autonomy (for example, minors or those with restrictions on their ability to consent for themselves). One of the primary ways that this principle is upheld is through gaining informed consent from each research participant. We’ll talk more about that specific document later, so for now, just think about what you’d want to know before you decided to participate in a research study – what things would you want to be sure you reviewed first? Would you need to ask certain questions? Would you, for example, want to know what risks you were more and less likely to encounter as part of the study? What benefits you might receive? How to quit if you started to feel uncomfortable? These are all parts of the informed consent process.
For individuals who cannot consent for themselves, we need to be especially careful to ensure that these individuals are protected and that their legal guardians have the chance to be informed and provide consent in their stead. In these cases, we will still seek assent from the individuals themselves whenever possible. Assent is a verbal agreement to participate, as opposed to informed consent, which (unless the IRB specifically approves waiving the signature requirement) is documented on a physical or electronic form signed by the consent-giver.
Let’s Break it Down
Respect for Persons
Tap on each heart to learn how it connects to the Belmont Report’s Respect for Persons.
* This image was created using napkin.ai; however, the concept, design direction, and creative vision were conceived by Dr. Knight
Beneficence
The principle of Beneficence requires that researchers work to minimize risks and maximize benefits of research. Whenever possible, the goal should be to do no harm; when research does carry potential risks, the benefits should clearly outweigh those risks, and the risks should be reduced as much as possible. It can sometimes be challenging to think about research this way – is taking a 5-minute survey risky, for example? Maybe not, but is it beneficial either? Some studies do carry their own benefits, of course (like participating in a novel therapy method – if it works, then the participants have gained a benefit!), but not all do. In those cases, it can be helpful to consider the overall benefits to society that stand to result from the research, since individual benefits may not be obvious (or may not really exist at all). Carefully conducting a risk-benefit analysis (more on that soon) and then communicating both the risks and benefits in the informed consent process is vital.
Let’s Break it Down
Beneficence in Research
Tap on each person to discover how the Belmont Report addresses the principle of beneficence in research.
* This image was created using napkin.ai; however, the concept, design direction, and creative vision were conceived by Dr. Knight
Justice
Finally, the principle of Justice rounds out the three. You might have learned about justice in other settings – punishment, for example, or distribution of aid. In those discussions, you probably became familiar with various models of justice – is it just, for example, for aid to be given equally to everyone, or is it more just for it to be distributed according to need, or even to effort? In research, justice means that we must consider how the risks and benefits of research are shared among different people and populations. Again, this grows out of obvious problems in past research studies (such as studies in which those exposed to the risks were poor or otherwise marginalized populations, whereas those who benefited from the findings were not) as well as a broader understanding that when we select our research participants, we need to carefully consider who should be involved, not just who is easiest to access or to coerce into being in a study. This principle is also supported by informed consent and by risk-benefit analysis, but also by carefully justifying the selection of research participants when designing a study.
Let’s Break it Down
Justice in Research
Tap on the scales to learn more about how fairness is practiced in research, including how participants are chosen and how risks and benefits are shared.
* This image was created using napkin.ai; however, the concept, design direction, and creative vision were conceived by Dr. Knight
Image attributions
roundtable meeting by Debora Cartagena CC-0
References
Faden, R. R., & Beauchamp, T. L. (1986). A history and theory of informed consent. Oxford, UK: Oxford University Press.
Humphreys, L. (1970). Tearoom trade: Impersonal sex in public places. London, UK: Duckworth.
Humphreys, L. (2008). Tearoom trade: Impersonal sex in public places, enlarged edition with a retrospect on ethical issues. New Brunswick, NJ: Aldine Transaction.
Kramer, A. D. L., Guillory, J. E., & Hancock, J. T. (2014). Experimental evidence of massive-scale emotional contagion through social networks. PNAS, 111(24), 8788-8790. https://www.pnas.org/doi/full/10.1073/pnas.1320040111
Lenza, M. (2004). Controversies surrounding Laud Humphreys’ tearoom trade: An unsettling example of politics and power in methodological critiques. International Journal of Sociology and Social Policy, 24, 20–31.
Meyer, M. N. (2014). How an IRB could have legitimately approved the Facebook experiment – and why that may be a good thing. The Faculty Lounge: Conversations about Law, Culture, and Academia. https://www.thefacultylounge.org/2014/06/how-an-irb-could-have-legitimately-approved-the-facebook-experimentand-why-that-may-be-a-good-thing.html
Milgram, S. (1974). Obedience to authority: An experimental view. New York, NY: Harper & Row.
National Commission for the Protection of Human Subjects in Biomedical and Behavioral Research. (1979). The Belmont report: Ethical principles and guidelines for the protection of human subjects of research. Retrieved from https://www.hhs.gov/ohrp/regulations-and-policy/belmont-report/index.html
National Research Act of 1974, Pub. L. No. 93-348, 88 Stat. 342 (1974). The act can be read at https://history.nih.gov/research/downloads/PL93-348.pdf
Ogden, R. (2008). Harm. In L. M. Given (Ed.), The SAGE encyclopedia of qualitative research methods (pp. 379–380). Los Angeles, CA: Sage.
Reverby, S. M. (2009). Examining Tuskegee: The infamous syphilis study and its legacy. Chapel Hill, NC: University of North Carolina Press.
Rothman, D. J. (1987). Ethics and human experimentation. The New England Journal of Medicine, 317, 1195–1199.
US Department of Health and Human Services. (1993). Institutional review board guidebook glossary. Retrieved from https://ori.hhs.gov/education/products/ucla/chapter2/page00b.htm
von Hoffman, N. (1970, January 30). Sociological snoopers. The Washington Post, p. B1.
Warwick, D. P. (1973). Tearoom trade: Means and ends in social research. Hastings Center Studies, 1, 39–49.
Warwick, D. P. (1982). Types of harm in social research. In T. L. Beauchamp, R. R. Faden, R. J. Wallace Jr., & L. Walters (Eds.), Ethical issues in social science research. Baltimore, MD: Johns Hopkins University Press.