Good morning, everyone in the room and everywhere else that you're watching from. As Holly mentioned, I'm going to talk about a framework for the ethical conduct of clinical research. And as Holly already alluded to, this is a framework that the course is built on, so the things I'm going to talk about in the next 40 minutes you're going to hear over and over again throughout the rest of the course, and I'll make some references to those. Before I start, I want to thank Holly for doing such a great job of organizing this course, and Alex as well; thank you both. There's my disclaimer, and I have no conflicts of interest. So when we think about the ethics of clinical research, there are several big-picture questions we should ask ourselves.
Before I ask any of them, I want to make sure we understand what we're talking about when we talk about clinical research, because there are lots of kinds of research we might be interested in. The ethics of clinical research concerns research about health in humans: research either to better understand health and illness in humans, or to intervene, by preventing, diagnosing, or treating. That's the domain of research we're talking about. It excludes, for example, basic research, preclinical research, research with non-human animals, and all other kinds of research not related to human health. So, three important questions. What's the value? Why do we do research at all? That's one question to ask. The second, a subset of that, is: okay, if we do research to try to understand humans, why do we have to include humans in it? Why do we have to involve humans in experiments?
And if the answer to both of those is, well, it's important and we need to involve humans, then how should we do it in the most ethical manner? I'm going to try to cover each of these very briefly. With respect to the value, this is just one example: the NIH website that talks about the impact of NIH research. Now, NIH is one funder of research; it happens to be probably the largest public funder of research in the world, but it's just one funder. You can see on the left-hand side here that it talks about improving health, revolutionizing science, and serving society. So the value of research is to understand and improve health, for sure; that's what we're going to be talking about mostly. But there are a couple of other things. We learn how to do science better, and we learn new techniques that will help improve health in the future. And then there's a way in which research actually has a huge impact on other aspects of society.
So, for example, if you go to that website under serving society, it talks about the economic value of doing research. Research is a boon to the economy: in places where research is funded, the economy improves, and by helping people be healthier, we improve the economy because people can work. So there are a lot of ways in which research itself is valuable. And that's just the NIH; everywhere else that funds or does research has the same kinds of goals. So what about including humans? One thing that's probably true is that the kinds of results we get from doing clinical research have compelling societal health benefits: new therapies, diagnostic and preventive strategies, improvements in quality of life, and so on. I think all of us can probably recognize examples of people who have survived a devastating disease, or never got one, because of research discoveries.
So, for example, treatments for HIV have changed the trajectory of HIV from a rapidly fatal disease to one that people live a normal life with, as a chronic, controlled disease. Treatments for childhood leukemia have changed the course from, again, rapid decline and death to 90-plus percent survival. Lipid-lowering drugs, which many, many people are on, have improved morbidity and mortality from cardiovascular disease. Vaccines: for measles, for polio, for HPV, for Covid, and so many others. So you can think about the kinds of things that have been shown to work through clinical research and that have made a major impact on the health of society and of people, saving lives and improving the quality of health. There's also the same idea at a more micro level.
When you go to a physician's office to get treated for something, you hope the physician has some evidence upon which to base a suggestion about what you have and what to do about it. That evidence, again, comes from clinical research. So most of what we encounter when we go to a doctor's office is based on research that has been done in the past. There's also a pretty clear understanding that human participation is necessary in order to gain accurate and relevant data about humans. The normal trajectory of research is basic in vitro work, then preclinical research in non-human animals. But most non-human animal models are not perfect models for humans, and so testing things in humans, or understanding humans as part of research, is really important for learning how humans work and what works in humans. And that's the reason we do it. So why is it challenging, then? Okay, research is good.
Humans are necessary. What's the problem? Right. The problem is that the primary goal of doing research is to generate knowledge, useful knowledge about human health and illness. It's not to benefit the people who are part of the study. Now, that doesn't mean that people in studies don't benefit; they often do, but that's not the goal, and they often don't benefit. So we're really asking a small number of people to accept risk and burden in order to answer important questions that might be beneficial to others. And therefore, the people being enrolled in research, or participating in research, are the means to developing useful knowledge, and therefore at risk of being exploited. There's a lot of public feeling about this issue, and if you talk to people who don't know much about research, they will say, oh my gosh, you can't do research on humans; that's making humans into guinea pigs.
And, you know, there's a lot of misunderstanding about the goals of research, the reasons research is done, and how it's done, but there's a lot of feeling about not using people as human guinea pigs. So the ethics of clinical research, if what I've said so far makes sense, really has two important parts, or goals. One is that we want to promote responsible and useful research that's going to benefit society and future patients. That's a really important ethical imperative of doing good clinical research. At the same time, we want to minimize any harm and the possibility of exploitation by protecting the rights and welfare of the participants: protecting and respecting them. Both of those are important, and sometimes, by the way, they come into tension. The ethical requirements that you're going to learn about in this course, and some of which you might already know, provide guidance on how to ethically conduct research: how to promote responsible research that seeks progress, minimize exploitation and harm, ensure that the rights and welfare of participants are respected, and help maintain the public's trust in the research enterprise. Those are the important pieces of it. So how do we conduct research ethically? How do we promote useful research, protect the rights and welfare of participants, and maintain public trust? Well, we've learned over the course of history, through many historical lessons, some of the things that are important in terms of how to conduct research, and we can also reason through it ethically. I'm going to do a little bit of each. I'm not going to spend much time on history because Susan Lederer is going to speak to you later, and she's a historian.
She's a historian who has done a huge amount of work on the history of the ethics of research, so I'll let her give you the specific examples; she's a wonderful lecturer as well. But years ago, Zeke Emanuel and I wrote a paper, which was rewritten for the Oxford Textbook of Clinical Research Ethics in 2008, that tried to divide the history of the ethics of clinical research into different eras. Just briefly, to give you a sense of how this went: for a long time, hundreds of years, there were no rules, there were no guidelines, there were no limits on what people could do. And in fact, there wasn't much money, and there weren't many resources, for clinical research either. So research was often done by individual scientists or clinicians who had some extra time or money, but mostly with the motive of trying to benefit the individuals they were responsible for.
And because there were so few guidelines, sometimes the experiments were worse than no experiment, and there were a lot of negative outcomes rather than only positive ones. When there was more money, more resources, and more attention to research and the value of research, we went into what people have, in retrospect, called a very utilitarian era, where people who were marginalized, captive, or seen as unimportant to society were asked to participate in research that was seen as good for the common good, good for the majority. And you'll hear about some of those examples from Susan later, and you probably know some of them anyway from history. That led to an era of scrutiny. With the exposure of those kinds of experiments, done with prisoners and orphans and other very vulnerable people, the scientific and lay communities started to say: what are we doing?
What is happening with this research? And so there was intense scrutiny, driven by a couple of very public cases, that led to questions about what the limits and the scope of clinical research ought to be. From that scrutiny were derived a number of rules and regulations which emphasized, and have focused on, protection of the participants: protection of human participants in research. And so you'll see, or maybe already know, that a lot of the rules and regulations have that as their primary focus. I'd say in the last almost 30 years, there's been a little bit of a pendulum swing, in the sense that clinical research is recognized as a benefit: sometimes a benefit to people who want to be in clinical trials, because it's a way for them to access experimental drugs they couldn't get elsewhere, but also a benefit for populations.
So there's been, for example, recognition that children have been what some people call therapeutic orphans, because there's been an insufficient amount of research to generate evidence on how to treat children appropriately, and therefore including them in research is a benefit, because then we learn how to treat children. So that's been a very important insight that's happened over the last 30 or so years. During this time, and currently, a number of codes and guidelines and regulations were derived that guide clinical research. Some of these you've probably heard about, and I'm just going to mention a few of them briefly. The Nuremberg Code was written at the end of World War II, at the trial of the Nazi doctors who performed a number of experiments on prisoners during the war.
The Declaration of Helsinki followed almost 15 years later, from the World Medical Association, which began to realize that although the tenets of the Nuremberg Code were important, they didn't apply so obviously to research done by doctors with patients to try to improve the understanding of what patients need. So the Declaration of Helsinki, in 1964, started as a document by doctors, for doctors, about how to do research with patients and people who are ill. That declaration has been revised multiple times. The most recent published version is 2013, but they are in the process of revising it at this moment; there have been a number of meetings and consultations around the world to try to get it revised again. So we'll see a new Declaration of Helsinki, I'm assuming this year. The Belmont Report was a report from the United States National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research, a commission that was put into place after the exposure of the Tuskegee Public Health Service syphilis study, which you're going to hear more about later and have probably already heard about. The Belmont Report is not a set of rules; it's an explication of the underlying principles for the ethics of clinical research. It's a very useful document. It's also short and easy to read, and it is the basis on which our regulations have been made. I'll come to that in a minute. CIOMS is the Council for International Organizations of Medical Sciences, which has collaborated over the years with the World Health Organization to provide international guidelines.
And the goal those guidelines originally took was to apply the principles that were enunciated in the Declaration of Helsinki specifically to the kind of research that's sponsored by a resource-rich country and conducted in resource-poorer countries. One of the things I really like about the CIOMS guidelines is that they not only have tenets, which all these others do too, but they explain them. They have many paragraphs of explanation: why they think a tenet is important, and how you would actually implement it. So it's a useful document to look at for more than just international research. And then there's the ICH GCP: the International Conference on Harmonization's Good Clinical Practice guidelines.
The International Conference on Harmonization was a group that got together in the 1990s and said: we're doing clinical research all around the world, and when we do research in the United States and then want to adopt the findings in the European Union, it has to go through the whole process all over again. So there was an effort to harmonize those regulations, so that research done in one jurisdiction could be accepted by the regulatory authorities of another jurisdiction. The ICH documents have multiple volumes about many things related to how to do clinical research. One important guidance for how to do research ethically is the Good Clinical Practice guideline, and that has become, in many jurisdictions around the world, the guiding regulatory framework for how research is done in that jurisdiction. So in addition to codes and guidelines, we have regulations.
In the United States we have the Common Rule, which is a set of regulations about how to protect participants in research that has been adopted by 17 US federal agencies, one of which is the Department of Health and Human Services, the agency that NIH belongs to. The Common Rule is found at Title 45 of the Code of Federal Regulations, part 46. You might say, well, who cares about that? But you'll find around here, for example, that people ask, what does 45 CFR 46 say? And you have to know what that means: that's the Common Rule. Interestingly, the FDA, the Food and Drug Administration, has its own set of regulations that does not fall under the Common Rule: several sets of regulations relevant to the ethics of doing research. Most relevant are probably the ones found at 21 CFR parts 50 and 56.
479 00:18:47,393 --> 00:18:49,962 And they're very similar to what 480 00:18:49,962 --> 00:18:52,965 you see in 45 CFR 46, but not identical. 481 00:18:53,499 --> 00:18:56,734 And so it's important to think about the differences 482 00:18:56,734 --> 00:18:57,169 there. 483 00:18:57,436 --> 00:19:00,001 And if somebody is developing a product 484 00:19:00,001 --> 00:19:01,974 for which they will ultimately want to 485 00:19:02,341 --> 00:19:05,344 apply for FDA approval, 486 00:19:05,644 --> 00:19:08,647 then they have to follow the FDA regs 487 00:19:09,048 --> 00:19:12,351 in addition to the regulations that govern the agencies. 488 00:19:12,685 --> 00:19:16,122 There are often, or maybe usually, 489 00:19:16,555 --> 00:19:20,126 institutional policies that people have to adhere to. 490 00:19:20,126 --> 00:19:23,229 So the NIH, for example, has a number of policies about 491 00:19:23,229 --> 00:19:26,038 who to include and how research ought to be 492 00:19:26,038 --> 00:19:26,365 done 493 00:19:26,365 --> 00:19:29,702 ethically that are relevant to research 494 00:19:29,702 --> 00:19:33,072 that's funded by or done within the NIH. 495 00:19:33,572 --> 00:19:35,993 And Holly mentioned earlier that, 496 00:19:35,993 --> 00:19:38,911 you know, most of our focus is domestic, 497 00:19:38,911 --> 00:19:41,685 but many research projects 498 00:19:41,685 --> 00:19:44,617 in this day and age are done across borders. 499 00:19:44,617 --> 00:19:46,852 And so there are laws and 500 00:19:46,852 --> 00:19:50,105 regulations in many, many other jurisdictions that are also 501 00:19:50,105 --> 00:19:50,656 relevant. 502 00:19:51,123 --> 00:19:54,226 If somebody is doing research that's funded by the NIH, 503 00:19:54,493 --> 00:19:56,295 it doesn't matter where it's being done.
504 00:19:56,295 --> 00:19:58,726 They still have to follow NIH rules, 505 00:19:58,726 --> 00:20:01,033 but they may also have to follow 506 00:20:01,300 --> 00:20:03,909 the laws and regulations of the jurisdiction in which 507 00:20:03,909 --> 00:20:04,303 they're 508 00:20:04,837 --> 00:20:06,639 conducting their research. 509 00:20:06,639 --> 00:20:09,108 So lots of details. 510 00:20:09,108 --> 00:20:10,943 I went over that pretty fast. 511 00:20:10,943 --> 00:20:14,395 I do want to focus just a moment on what the Belmont Report does 512 00:20:14,395 --> 00:20:14,880 include, 513 00:20:14,880 --> 00:20:17,053 because, as I mentioned, it's 514 00:20:17,053 --> 00:20:19,518 not a code in the same way as the others, 515 00:20:19,518 --> 00:20:22,242 but it has ethical principles underlying the conduct of 516 00:20:22,242 --> 00:20:22,688 research 517 00:20:22,688 --> 00:20:25,006 and explicates what they are 518 00:20:25,006 --> 00:20:28,027 and how they apply. So respect for persons 519 00:20:28,027 --> 00:20:30,182 applies to the process of informed 520 00:20:30,182 --> 00:20:31,464 consent; beneficence, 521 00:20:31,464 --> 00:20:34,600 to the process of assessing risks and benefits in research; 522 00:20:34,867 --> 00:20:38,164 and justice, to the process of selecting participants 523 00:20:38,164 --> 00:20:38,671 fairly. 524 00:20:38,971 --> 00:20:41,173 And you're going to hear about all of those more. 525 00:20:41,173 --> 00:20:43,096 The other important thing, 526 00:20:43,096 --> 00:20:45,311 I think, that the Belmont Report does, 527 00:20:45,311 --> 00:20:48,173 and very few of these other documents 528 00:20:48,173 --> 00:20:50,649 do, is describe the distinction 529 00:20:50,649 --> 00:20:54,595 and the boundaries between clinical practice and clinical 530 00:20:54,595 --> 00:20:55,287 research.
531 00:20:55,955 --> 00:20:58,424 And interestingly, 532 00:20:58,424 --> 00:21:01,660 when you think about the kinds of ethical issues 533 00:21:01,660 --> 00:21:04,300 or tensions that come up in a clinical research 534 00:21:04,300 --> 00:21:05,030 environment, 535 00:21:05,397 --> 00:21:08,768 they often arise because of tension 536 00:21:08,768 --> 00:21:11,770 or confusion about these boundaries. 537 00:21:12,438 --> 00:21:15,805 So I wanted to spend one minute talking about what I mean by 538 00:21:15,805 --> 00:21:16,142 that. 539 00:21:17,042 --> 00:21:18,962 So the difference between clinical 540 00:21:18,962 --> 00:21:20,713 research and clinical practice 541 00:21:20,713 --> 00:21:22,781 is multifold. 542 00:21:22,781 --> 00:21:24,283 First of all, there are different goals. 543 00:21:24,283 --> 00:21:25,651 And I've already mentioned that 544 00:21:25,651 --> 00:21:28,024 the goal of clinical research is to answer 545 00:21:28,024 --> 00:21:29,889 useful, generalizable questions. 546 00:21:30,289 --> 00:21:32,342 The goal of clinical practice is to help 547 00:21:32,342 --> 00:21:34,293 the person or the people in 548 00:21:34,293 --> 00:21:36,503 front of you, based on evidence 549 00:21:36,503 --> 00:21:38,397 that somebody else collected before. 550 00:21:39,899 --> 00:21:41,467 They also use different methods. 551 00:21:41,467 --> 00:21:46,005 So in research, as you may be aware, and you will hear more about, 552 00:21:46,705 --> 00:21:48,274 we randomize people. 553 00:21:48,274 --> 00:21:50,259 We randomize people to get this drug 554 00:21:50,259 --> 00:21:51,010 or that drug. 555 00:21:51,010 --> 00:21:53,384 We blind them, meaning they're not allowed 556 00:21:53,384 --> 00:21:55,080 to know what they're getting. 557 00:21:55,381 --> 00:21:57,524 Sometimes the investigator doesn't know what they're 558 00:21:57,524 --> 00:21:58,184 getting either. 559 00:21:58,184 --> 00:21:59,852 That's a double blind.
560 00:21:59,852 --> 00:22:03,722 We offer placebos as one part of a control group. 561 00:22:04,089 --> 00:22:06,692 That's very common in research. 562 00:22:06,692 --> 00:22:10,062 We do dose escalation in trials. 563 00:22:10,062 --> 00:22:12,187 So, you know, these three people will get this 564 00:22:12,187 --> 00:22:12,464 dose. 565 00:22:12,464 --> 00:22:14,600 The next three people will get a higher dose. 566 00:22:14,600 --> 00:22:17,603 So there are a number of really important 567 00:22:18,204 --> 00:22:21,840 methodologies in research that are critical to 568 00:22:23,209 --> 00:22:25,144 getting good unbiased data 569 00:22:25,144 --> 00:22:28,614 that you would find anathema in clinical practice. 570 00:22:28,614 --> 00:22:30,813 So if you went to your doctor's office 571 00:22:30,813 --> 00:22:33,185 with a sore throat and your doctor said, 572 00:22:33,552 --> 00:22:34,653 I'm going to give you a drug, but 573 00:22:35,821 --> 00:22:37,289 I'm not going to tell you what it is, 574 00:22:37,289 --> 00:22:40,125 I don't even know what it is, you would be like, 575 00:22:40,125 --> 00:22:41,794 I think I need a new doctor. 576 00:22:41,794 --> 00:22:44,344 So I think there is really an interesting 577 00:22:44,344 --> 00:22:45,464 dichotomy between 578 00:22:45,464 --> 00:22:48,927 the methods that are common to each of those 579 00:22:48,927 --> 00:22:49,635 domains. 580 00:22:50,402 --> 00:22:52,549 There's also a different justification for 581 00:22:52,549 --> 00:22:52,871 risk. 582 00:22:52,871 --> 00:22:55,562 So in clinical practice, certainly, 583 00:22:55,562 --> 00:22:58,410 people accept whatever risk. 584 00:22:58,410 --> 00:23:01,413 There are risks with interventions of all kinds. 585 00:23:01,880 --> 00:23:05,317 But those risks are justified because 586 00:23:06,452 --> 00:23:11,523 the benefit that is presumed to come 587 00:23:11,523 --> 00:23:14,793 from giving this intervention outweighs the risk.
588 00:23:15,094 --> 00:23:17,663 And so the risk is worth it for that individual. 589 00:23:17,663 --> 00:23:20,399 In research, we ask people sometimes 590 00:23:20,399 --> 00:23:23,159 to do things for which there are no benefits 591 00:23:23,159 --> 00:23:23,535 to them. 592 00:23:23,535 --> 00:23:26,639 So we may ask them, for example, to have an extra biopsy 593 00:23:26,639 --> 00:23:29,305 or an extra blood draw or an extra X-ray, 594 00:23:29,305 --> 00:23:30,476 or to take a drug 595 00:23:30,809 --> 00:23:34,647 just to get a pharmacokinetic reading on it. 596 00:23:35,114 --> 00:23:37,668 Those are things that are done for no benefit to those 597 00:23:37,668 --> 00:23:38,284 individuals, 598 00:23:38,517 --> 00:23:40,720 and they accept those risks 599 00:23:40,720 --> 00:23:41,287 in the 600 00:23:41,287 --> 00:23:44,290 interests of science. 601 00:23:44,523 --> 00:23:47,126 There are also different levels of uncertainty. 602 00:23:47,126 --> 00:23:49,128 Again, not everything in clinical practice 603 00:23:49,128 --> 00:23:50,129 is certain, for sure. 604 00:23:50,129 --> 00:23:53,024 We often have to say, well, 605 00:23:53,024 --> 00:23:54,667 here's what we know. 606 00:23:54,667 --> 00:23:55,958 Here's what we don't know, here's 607 00:23:55,958 --> 00:23:56,935 what we're going to try. 608 00:23:56,935 --> 00:24:00,072 But for the most part in clinical practice, 609 00:24:00,406 --> 00:24:03,909 what is done is based on prior evidence 610 00:24:03,909 --> 00:24:06,619 or prior experience or the standards in the 611 00:24:06,619 --> 00:24:07,313 community. 612 00:24:08,047 --> 00:24:10,364 In research, the starting point is 613 00:24:10,364 --> 00:24:11,250 uncertainty.
614 00:24:11,250 --> 00:24:14,453 In other words, we wouldn't do a clinical trial if we knew. 615 00:24:15,020 --> 00:24:17,082 If we're testing A versus 616 00:24:17,082 --> 00:24:18,490 B and we knew A was better, 617 00:24:18,957 --> 00:24:20,759 why would we do the trial? 618 00:24:20,759 --> 00:24:23,128 So the starting point is we don't know. 619 00:24:23,128 --> 00:24:24,431 We don't know which one is better, 620 00:24:24,431 --> 00:24:25,197 which one is safer, 621 00:24:25,197 --> 00:24:26,899 which one is more effective, whatever. 622 00:24:26,899 --> 00:24:28,500 And that's the starting point. 623 00:24:28,500 --> 00:24:30,775 And because of all these differences 624 00:24:30,775 --> 00:24:31,470 in goals, 625 00:24:31,470 --> 00:24:33,852 methods, justification for risk, and levels of 626 00:24:33,852 --> 00:24:34,540 uncertainty, 627 00:24:34,540 --> 00:24:37,255 we also have different obligations for 628 00:24:37,255 --> 00:24:38,977 the players, if you will. 629 00:24:39,211 --> 00:24:42,114 So clinicians have an obligation 630 00:24:42,114 --> 00:24:43,820 to competently offer care and treatment 631 00:24:43,820 --> 00:24:45,351 in their patients' best interests. 632 00:24:45,918 --> 00:24:48,087 That's their obligation. 633 00:24:48,087 --> 00:24:50,949 Whereas researchers have an obligation 634 00:24:50,949 --> 00:24:53,359 to competently conduct research 635 00:24:53,659 --> 00:24:55,544 while respecting the rights 636 00:24:55,544 --> 00:24:57,429 and welfare of their participants. 637 00:24:57,896 --> 00:25:00,232 So very different obligations as well. 638 00:25:01,633 --> 00:25:02,568 So now you get to this 639 00:25:02,568 --> 00:25:04,665 point in my talk, and you may say, whoa, 640 00:25:04,665 --> 00:25:05,871 that's a lot of stuff, 641 00:25:05,871 --> 00:25:08,107 and it doesn't all make sense.
642 00:25:08,107 --> 00:25:10,909 And I think oftentimes 643 00:25:10,909 --> 00:25:13,487 for investigators, for people who are involved in 644 00:25:13,487 --> 00:25:14,012 research, 645 00:25:14,012 --> 00:25:18,866 for people who are on IRBs, they're like, okay, so how do we 646 00:25:18,866 --> 00:25:19,351 know? 647 00:25:19,351 --> 00:25:21,854 How do we really know what to do? 648 00:25:21,854 --> 00:25:25,663 Do we follow the common rule, or do we follow the Declaration of 649 00:25:25,663 --> 00:25:26,258 Helsinki, 650 00:25:26,625 --> 00:25:28,537 or do we have to know both? 651 00:25:28,537 --> 00:25:29,828 And what if they conflict? 652 00:25:29,828 --> 00:25:32,030 How do we figure out what to do? 653 00:25:32,030 --> 00:25:35,033 So there's a lot of unfortunate confusion. 654 00:25:35,634 --> 00:25:38,036 And to add to that, 655 00:25:38,036 --> 00:25:41,073 a lot of the guidance that I've mentioned, the rules 656 00:25:41,073 --> 00:25:43,558 and the regulations and the codes of ethics, were 657 00:25:43,558 --> 00:25:44,076 developed 658 00:25:44,076 --> 00:25:46,753 in response to a particular historical 659 00:25:46,753 --> 00:25:47,246 event. 660 00:25:47,246 --> 00:25:49,960 So, as I mentioned, Nuremberg was a response to Nazi 661 00:25:49,960 --> 00:25:50,449 Germany. 662 00:25:50,883 --> 00:25:54,453 Helsinki was a response to the Nuremberg Code. 663 00:25:54,887 --> 00:25:57,286 Belmont was a response to Tuskegee, 664 00:25:57,286 --> 00:25:58,657 among other things. 665 00:25:58,657 --> 00:26:01,860 I mean, each of them sort of came about historically 666 00:26:01,860 --> 00:26:04,363 because of something that had happened, 667 00:26:04,363 --> 00:26:06,865 and therefore their main focus 668 00:26:06,865 --> 00:26:10,330 is often in response to that historical 669 00:26:10,330 --> 00:26:10,869 event.
670 00:26:13,172 --> 00:26:13,972 The end 671 00:26:13,972 --> 00:26:16,445 result is that different regulations 672 00:26:16,445 --> 00:26:17,476 and different 673 00:26:17,876 --> 00:26:20,646 guidance sometimes come not into conflict, 674 00:26:20,646 --> 00:26:22,426 but have different interpretations 675 00:26:22,426 --> 00:26:24,049 or emphasize different things. 676 00:26:24,283 --> 00:26:27,286 And so in understanding them, there's some divergence. 677 00:26:27,686 --> 00:26:30,882 And I also mentioned already that different regulations 678 00:26:30,882 --> 00:26:31,623 might apply. 679 00:26:32,057 --> 00:26:34,447 So for example, if you're an NIH-funded 680 00:26:34,447 --> 00:26:35,327 investigator, 681 00:26:35,594 --> 00:26:39,375 you have to follow the common rule, 682 00:26:39,375 --> 00:26:40,365 45 CFR 46. 683 00:26:40,599 --> 00:26:44,369 If you're an NIH investigator and you're developing a product 684 00:26:44,369 --> 00:26:46,738 that you want to ultimately get approved by the FDA, 685 00:26:46,738 --> 00:26:49,374 you also have to follow the FDA regulations. 686 00:26:49,374 --> 00:26:52,382 If you're an NIH investigator following the common rule and FDA 687 00:26:52,382 --> 00:26:53,045 regulations, 688 00:26:53,312 --> 00:26:56,982 and you want to get your paper published in a journal, 689 00:26:57,616 --> 00:26:58,968 you have to follow the Declaration 690 00:26:58,968 --> 00:26:59,485 of Helsinki. 691 00:26:59,485 --> 00:27:01,572 In fact, many journals ask you to say 692 00:27:01,572 --> 00:27:02,588 you have followed 693 00:27:02,588 --> 00:27:05,023 the Declaration of Helsinki, so you have to know what it is. 694 00:27:05,023 --> 00:27:07,143 So all of these things are important 695 00:27:07,143 --> 00:27:08,026 to know about.
696 00:27:08,694 --> 00:27:12,637 But now I'm going to take you to a different place, because I 697 00:27:12,637 --> 00:27:13,031 think 698 00:27:13,732 --> 00:27:16,902 recognizing the number of different 699 00:27:16,902 --> 00:27:19,737 guidances and regulations and the confusion that they 700 00:27:19,737 --> 00:27:20,539 might generate 701 00:27:20,973 --> 00:27:24,543 led us 20-some-odd years ago to develop 702 00:27:24,877 --> 00:27:27,078 what we called a systematic, coherent, 703 00:27:27,078 --> 00:27:29,047 universally applicable framework. 704 00:27:29,047 --> 00:27:31,149 And this is the framework that you're going to 705 00:27:31,149 --> 00:27:34,536 learn about now and hear about repeatedly throughout the 706 00:27:34,536 --> 00:27:35,020 course. 707 00:27:35,521 --> 00:27:37,656 And this framework has eight principles. 708 00:27:38,757 --> 00:27:39,057 You can 709 00:27:39,057 --> 00:27:41,885 see what they are: collaborative partnership, valuable scientific 710 00:27:41,885 --> 00:27:42,327 question, 711 00:27:42,828 --> 00:27:45,821 valid scientific methodology, fair subject or participant 712 00:27:45,821 --> 00:27:46,398 selection, 713 00:27:46,398 --> 00:27:48,200 favorable risk-benefit ratio, 714 00:27:48,200 --> 00:27:49,743 independent review, informed consent, 715 00:27:49,743 --> 00:27:51,203 and respect for enrolled subjects. 716 00:27:51,537 --> 00:27:54,540 And I'm going to really briefly say what each of those means 717 00:27:54,940 --> 00:27:56,842 and why we think it's important, 718 00:27:56,842 --> 00:27:59,845 but also some of the challenges with it.
719 00:28:00,112 --> 00:28:02,570 So collaborative partnership: basically, 720 00:28:02,570 --> 00:28:04,082 if the goal of research 721 00:28:04,082 --> 00:28:08,001 is to try to improve health, then we have to realize that it 722 00:28:08,001 --> 00:28:08,654 should be 723 00:28:08,654 --> 00:28:11,614 a partnership with the relevant partners, 724 00:28:11,614 --> 00:28:13,058 those, for example, 725 00:28:13,058 --> 00:28:15,050 who are responsible for planning, 726 00:28:15,050 --> 00:28:17,162 conducting and overseeing research 727 00:28:17,162 --> 00:28:19,274 and integrating the results of research 728 00:28:19,274 --> 00:28:20,465 into a health system. 729 00:28:20,866 --> 00:28:23,218 It means recognizing the contribution of the various partners 730 00:28:23,218 --> 00:28:24,336 that are involved, 731 00:28:24,703 --> 00:28:27,706 and collaborating with existing systems of health. 732 00:28:28,373 --> 00:28:31,843 And this means collaborating with policymakers, 733 00:28:31,843 --> 00:28:34,484 health systems, community advisory boards 734 00:28:34,484 --> 00:28:35,514 or communities, 735 00:28:36,014 --> 00:28:39,017 patient advocates, advocates for research funding, 736 00:28:39,251 --> 00:28:42,497 collaborating investigators, practicing clinicians, 737 00:28:42,497 --> 00:28:43,388 participants. 738 00:28:43,388 --> 00:28:46,391 I mean, the list could go on, but thinking about 739 00:28:46,592 --> 00:28:49,628 who are the important partners in any research endeavor, 740 00:28:49,861 --> 00:28:54,132 so that the research is responsive 741 00:28:54,132 --> 00:28:57,517 to what partners need and respects the contributions 742 00:28:57,517 --> 00:28:58,103 of each. 743 00:28:59,071 --> 00:29:00,973 There are lots of examples of 744 00:29:00,973 --> 00:29:03,887 collaborative partnership, some probably more effective 745 00:29:03,887 --> 00:29:04,576 than others. 746 00:29:05,310 --> 00:29:08,313 Many studies have community advisory boards.
747 00:29:08,513 --> 00:29:11,516 There are many advocacy groups, 748 00:29:11,950 --> 00:29:14,528 like advocacy groups for rare diseases, 749 00:29:14,528 --> 00:29:16,088 or the Friends of Cancer Research. 750 00:29:16,455 --> 00:29:17,940 The Clinical Center, for example, 751 00:29:17,940 --> 00:29:19,291 has a patient advisory group. 752 00:29:19,291 --> 00:29:21,285 There are a lot of different examples 753 00:29:21,285 --> 00:29:22,995 of the kinds of collaboration 754 00:29:22,995 --> 00:29:25,097 that make sense. 755 00:29:25,097 --> 00:29:27,532 But collaboration is also complex. 756 00:29:27,532 --> 00:29:30,669 If you go, for example, to a 757 00:29:30,669 --> 00:29:33,173 recent issue of the New England Journal 758 00:29:33,173 --> 00:29:34,072 and just look 759 00:29:34,072 --> 00:29:37,075 at some of the randomized trials that might be reported in there, 760 00:29:37,309 --> 00:29:40,850 you know, sometimes you'll see a study 761 00:29:40,850 --> 00:29:42,714 that has 40 authors 762 00:29:43,415 --> 00:29:46,618 and was done in 83 countries with, you know, 763 00:29:47,686 --> 00:29:51,156 600 sites and 10,000 people. 764 00:29:51,757 --> 00:29:55,746 And, you know, in one way, they're all collaborators, 765 00:29:55,746 --> 00:29:56,228 right? 766 00:29:56,228 --> 00:29:59,911 But that's a huge, complex mass of people to try to figure 767 00:29:59,911 --> 00:30:00,165 out 768 00:30:00,165 --> 00:30:02,934 how to collaborate with. So it's challenging. 769 00:30:02,934 --> 00:30:04,756 So the challenges are identifying 770 00:30:04,756 --> 00:30:05,971 who the partners are, 771 00:30:05,971 --> 00:30:10,008 who the right partners are, and then how to engage them.
772 00:30:11,076 --> 00:30:12,811 There's a lot of information, 773 00:30:12,811 --> 00:30:14,653 a lot of literature now on patient 774 00:30:14,653 --> 00:30:16,982 engagement, public engagement, 775 00:30:16,982 --> 00:30:20,619 community-based engagement, all really important concepts, 776 00:30:20,919 --> 00:30:23,236 but all over the map in terms of what 777 00:30:23,236 --> 00:30:25,490 that mode of engagement looks like. 778 00:30:27,459 --> 00:30:30,362 The second principle is valuable scientific question. 779 00:30:30,362 --> 00:30:34,666 And this we often talk about in our world as social value. 780 00:30:35,000 --> 00:30:37,669 And basically, 781 00:30:37,669 --> 00:30:39,538 what this is, is that 782 00:30:39,538 --> 00:30:41,644 the question has to be worth asking. 783 00:30:41,644 --> 00:30:43,809 Are we going to learn something? 784 00:30:43,809 --> 00:30:46,044 Are we going to learn something that's useful to somebody? 785 00:30:46,044 --> 00:30:48,490 If the goal is to find useful information 786 00:30:48,490 --> 00:30:49,981 to understand or improve 787 00:30:49,981 --> 00:30:53,385 human health, is the question that we're asking in this study 788 00:30:53,952 --> 00:30:56,955 or in this program of research going to do that? 789 00:30:57,622 --> 00:31:00,201 So it should answer a valuable question, 790 00:31:00,201 --> 00:31:01,626 one that can or will 791 00:31:02,027 --> 00:31:04,536 generate new knowledge or understanding 792 00:31:04,536 --> 00:31:06,531 about human health or illness. 793 00:31:06,898 --> 00:31:08,711 And that could be socially valuable, 794 00:31:08,711 --> 00:31:09,768 clinically valuable, 795 00:31:09,768 --> 00:31:11,303 or scientifically valuable. 796 00:31:14,239 --> 00:31:15,841 We often ask, 797 00:31:15,841 --> 00:31:18,247 you know, then, what is the value of answering a specific 798 00:31:18,247 --> 00:31:18,677 question?
799 00:31:18,677 --> 00:31:20,927 It's not always so easy to judge, 800 00:31:20,927 --> 00:31:21,813 but it means thinking 801 00:31:21,813 --> 00:31:23,468 carefully about what we will learn 802 00:31:23,468 --> 00:31:24,783 and how useful it will be. 803 00:31:25,016 --> 00:31:27,753 How will we know if it's useful, 804 00:31:27,753 --> 00:31:30,756 and to whom will the knowledge be useful? 805 00:31:30,756 --> 00:31:33,759 Is it to the participants in the study? 806 00:31:34,025 --> 00:31:35,060 Sometimes it is. 807 00:31:35,060 --> 00:31:36,928 Is it to the community in which they live? 808 00:31:36,928 --> 00:31:39,231 Is it to people with similar conditions? 809 00:31:39,231 --> 00:31:42,901 Future people, society in general, science, sponsors? 810 00:31:42,901 --> 00:31:44,600 I mean, there could be lots of answers 811 00:31:44,600 --> 00:31:46,638 to the question, but when it comes to trying to figure out 812 00:31:46,638 --> 00:31:49,641 what the value is of a specific study, 813 00:31:50,142 --> 00:31:53,145 there is room for more 814 00:31:54,312 --> 00:31:55,614 work on this topic. 815 00:31:55,614 --> 00:31:57,883 And there was, I think, a very important 816 00:31:59,684 --> 00:32:01,419 issue of Bioethics 817 00:32:01,419 --> 00:32:04,368 about eight years ago that tried to ask the question 818 00:32:04,368 --> 00:32:04,790 about, 819 00:32:05,190 --> 00:32:06,855 you know, what really is social value 820 00:32:06,855 --> 00:32:08,160 and how do we understand it. 821 00:32:08,160 --> 00:32:10,428 And there were a number of important 822 00:32:10,428 --> 00:32:13,799 papers in that volume. 823 00:32:13,799 --> 00:32:16,802 This is one from people in our group 824 00:32:16,802 --> 00:32:18,770 who were defending the social value 825 00:32:18,770 --> 00:32:20,739 requirement, another from a person 826 00:32:20,739 --> 00:32:23,975 in our group who was questioning the social value requirement 827 00:32:24,209 --> 00:32:27,212 and thinking about the limits of it.
828 00:32:27,479 --> 00:32:29,518 People have built on this, and there was, 829 00:32:29,518 --> 00:32:31,149 I think, a very useful paper 830 00:32:31,149 --> 00:32:35,594 in the Hastings Center Report in 2018 by Danielle Wenner, who talked 831 00:32:35,594 --> 00:32:36,121 about, 832 00:32:36,121 --> 00:32:38,476 you know, how to think about social value 833 00:32:38,476 --> 00:32:39,624 in a different way. 834 00:32:40,458 --> 00:32:43,188 And then one of my fellows and I 835 00:32:43,188 --> 00:32:43,461 wrote 836 00:32:43,461 --> 00:32:48,107 a chapter for a book recently which tried 837 00:32:48,107 --> 00:32:48,967 to select 838 00:32:48,967 --> 00:32:52,777 cases to exemplify what value means in specific 839 00:32:52,777 --> 00:32:53,271 cases. 840 00:32:53,271 --> 00:32:55,810 And we picked what we called high-value 841 00:32:55,810 --> 00:32:58,610 cases or low-value cases as an exploration. 842 00:32:59,811 --> 00:33:02,210 So there's a lot more to be done about what 843 00:33:02,210 --> 00:33:03,181 social value is. 844 00:33:03,181 --> 00:33:05,951 But you'll hear about it throughout this course. 845 00:33:05,951 --> 00:33:08,987 The third is valid scientific methodology. 846 00:33:09,387 --> 00:33:10,956 So you've got partners in place. 847 00:33:10,956 --> 00:33:13,258 You've got a question that's worth asking. 848 00:33:13,258 --> 00:33:16,494 The next really important ethical requirement 849 00:33:16,862 --> 00:33:20,003 is setting up the study so that it's valid, 850 00:33:20,003 --> 00:33:21,199 so that you get 851 00:33:21,800 --> 00:33:25,203 reliable, understandable information 852 00:33:25,203 --> 00:33:27,744 that you can make sense of, so that you can answer the 853 00:33:27,744 --> 00:33:28,206 question. 854 00:33:28,540 --> 00:33:31,810 That doesn't mean it has to be a positive trial.
855 00:33:32,210 --> 00:33:34,520 It just means you have to be able to know 856 00:33:34,520 --> 00:33:36,548 what the answer to the question is 857 00:33:37,649 --> 00:33:39,618 when you're done. 858 00:33:39,618 --> 00:33:41,450 And that includes a bunch of things, 859 00:33:41,450 --> 00:33:43,021 like how to design the trial, 860 00:33:43,588 --> 00:33:46,607 what the methods are that you're going to be using, what 861 00:33:46,607 --> 00:33:47,192 statistics 862 00:33:47,192 --> 00:33:50,061 you're going to be using, what kind of statistical power. 863 00:33:51,396 --> 00:33:53,832 The goal being 864 00:33:53,832 --> 00:33:56,868 valid, reliable, generalizable 865 00:33:56,868 --> 00:34:01,373 and interpretable data. Also an important part 866 00:34:01,373 --> 00:34:04,643 of valid scientific methodology is feasibility. 867 00:34:05,176 --> 00:34:08,747 There are some questions that cannot be answered because 868 00:34:09,180 --> 00:34:10,869 you can't design the study in a way 869 00:34:10,869 --> 00:34:12,751 that will ever come up with an answer. 870 00:34:13,485 --> 00:34:17,489 And so making sure that what you're asking 871 00:34:17,489 --> 00:34:20,659 and how you're designing it is feasible is an important part 872 00:34:20,659 --> 00:34:24,262 of justifying asking people to participate. 873 00:34:26,798 --> 00:34:28,839 Sometimes when we talk about scientific 874 00:34:28,839 --> 00:34:29,467 methodology 875 00:34:29,467 --> 00:34:31,104 as an ethical requirement, people say, 876 00:34:31,104 --> 00:34:32,871 well, that's science, that's not ethics. 877 00:34:32,871 --> 00:34:35,265 And my slide here is just to tell you, 878 00:34:35,265 --> 00:34:36,841 you can't pull them apart. 879 00:34:37,208 --> 00:34:39,744 They are totally integrated. 880 00:34:39,744 --> 00:34:42,580 For something 881 00:34:42,580 --> 00:34:45,103 to be ethically okay, it has to be scientifically 882 00:34:45,103 --> 00:34:45,750 rigorous.
883 00:34:45,750 --> 00:34:47,346 And something that's scientifically 884 00:34:47,346 --> 00:34:48,987 rigorous should be ethical as well. 885 00:34:50,188 --> 00:34:51,530 There are a number of different 886 00:34:51,530 --> 00:34:52,223 considerations 887 00:34:52,223 --> 00:34:55,660 in thinking about scientific validity. 888 00:34:56,127 --> 00:34:58,730 And these are just some examples. 889 00:34:58,730 --> 00:35:01,132 Choosing the endpoints: you know, 890 00:35:01,132 --> 00:35:03,601 how do you measure 891 00:35:03,601 --> 00:35:06,605 the effect of an antiviral 892 00:35:07,205 --> 00:35:10,021 or a vaccine? Is it antibodies, or is it infection, or is it 893 00:35:10,021 --> 00:35:10,508 disease? 894 00:35:10,508 --> 00:35:12,777 And it matters in terms of the numbers of people 895 00:35:12,777 --> 00:35:16,715 you would need, the certainty at the end of the study, etc. 896 00:35:17,582 --> 00:35:19,851 When do you use a randomized, 897 00:35:19,851 --> 00:35:22,554 double-blind, placebo-controlled trial? 898 00:35:22,554 --> 00:35:24,920 And do you make trials noninferiority 899 00:35:24,920 --> 00:35:26,391 or superiority trials? 900 00:35:26,825 --> 00:35:29,961 When do you use qualitative and quantitative methods? 901 00:35:30,462 --> 00:35:32,197 How do you measure the outcome? 902 00:35:32,197 --> 00:35:33,447 What's the measure of the outcome 903 00:35:33,447 --> 00:35:35,000 that you're going to build into a study? 904 00:35:35,000 --> 00:35:36,601 How long do you follow people? 905 00:35:36,601 --> 00:35:37,802 What kind of power do you need? 906 00:35:37,802 --> 00:35:40,939 What kind of sample sizes, what other 907 00:35:40,939 --> 00:35:43,942 statistical methods do you need, etc.? 908 00:35:43,942 --> 00:35:46,344 So those are all interesting, important questions. 909 00:35:46,344 --> 00:35:48,513 And there have been some cartoons out there. 910 00:35:48,513 --> 00:35:50,615 There's the control group and the out-of-control group.
911 00:35:51,683 --> 00:35:53,952 There's the member of the placebo 912 00:35:53,952 --> 00:35:56,955 group who died. 913 00:35:57,455 --> 00:36:00,258 And this one I think is really fun: 914 00:36:00,258 --> 00:36:03,395 you know, the 915 00:36:03,395 --> 00:36:05,463 possible ways statistics 916 00:36:05,463 --> 00:36:08,566 can be misused. 917 00:36:08,566 --> 00:36:10,402 So all of that's important. 918 00:36:10,402 --> 00:36:12,061 There are also really interesting 919 00:36:12,061 --> 00:36:13,772 challenges right now as we're 920 00:36:13,772 --> 00:36:17,008 sort of learning more about decentralized trials, 921 00:36:17,776 --> 00:36:20,011 pragmatic trials, platform 922 00:36:20,011 --> 00:36:23,662 trials, and secondary analysis of data and specimens, all 923 00:36:23,662 --> 00:36:24,282 of which 924 00:36:24,282 --> 00:36:28,420 have methods that are different from the sort of standard, 925 00:36:28,787 --> 00:36:31,589 you know, in-person randomized controlled trial, 926 00:36:31,589 --> 00:36:35,193 but require some very careful thought 927 00:36:35,193 --> 00:36:38,630 about how to set them up so that they're rigorous, valid, 928 00:36:38,997 --> 00:36:42,000 and feasible, and you'll get interpretable data. 929 00:36:43,134 --> 00:36:44,632 You'll hear about some of these, but 930 00:36:44,632 --> 00:36:46,337 I don't have time to go over them today. 931 00:36:47,138 --> 00:36:50,030 Fair subject selection, or participant selection, is 932 00:36:50,030 --> 00:36:50,608 basically 933 00:36:51,376 --> 00:36:52,710 what it sounds like. 934 00:36:52,710 --> 00:36:56,938 The main issue is that the scientific 935 00:36:56,938 --> 00:36:57,949 objectives 936 00:36:58,249 --> 00:37:01,352 should guide inclusion criteria, 937 00:37:01,352 --> 00:37:04,322 recruitment strategies and selection.
938 00:37:04,322 --> 00:37:06,912 And this is based, you know, some of this is in the Belmont 939 00:37:06,912 --> 00:37:07,225 Report, 940 00:37:07,225 --> 00:37:11,162 on a response to previous kinds of research that were done 941 00:37:11,529 --> 00:37:14,966 where privilege or easy availability 942 00:37:14,966 --> 00:37:17,502 or vulnerability were used as criteria 943 00:37:17,502 --> 00:37:19,170 for selecting participants. 944 00:37:19,170 --> 00:37:19,971 And those are all 945 00:37:22,107 --> 00:37:24,542 wrong criteria. 946 00:37:24,542 --> 00:37:27,512 Dave Wendler, who you're going to hear from 947 00:37:27,912 --> 00:37:29,876 later in the course, talks about flipping it 948 00:37:29,876 --> 00:37:31,783 on its side and saying the best way 949 00:37:31,783 --> 00:37:36,214 to think about fair participant selection is to say there is no 950 00:37:36,214 --> 00:37:36,988 exclusion. 951 00:37:36,988 --> 00:37:39,924 Everybody's 952 00:37:39,924 --> 00:37:42,961 eligible, and exclusion requires justification. 953 00:37:43,294 --> 00:37:44,362 And so then you might say, well, 954 00:37:44,362 --> 00:37:46,856 what are the justifications that sometimes could be used to 955 00:37:46,856 --> 00:37:47,532 exclude people? 956 00:37:47,832 --> 00:37:49,762 Sometimes it's that they're scientifically 957 00:37:49,762 --> 00:37:51,002 inappropriate: you can't study 958 00:37:51,236 --> 00:37:52,793 treatment of breast cancer in people who 959 00:37:52,793 --> 00:37:54,239 don't have breast cancer, for example. 960 00:37:54,806 --> 00:37:56,341 And sometimes it's risk. 961 00:37:56,341 --> 00:37:59,310 That's contentious, but also very important: 962 00:37:59,310 --> 00:38:01,805 if the risk is too high for a certain group of people, 963 00:38:01,805 --> 00:38:02,313 maybe it's 964 00:38:02,580 --> 00:38:05,416 appropriate and justifiable to exclude them. 965 00:38:05,416 --> 00:38:08,253 There's even more 966 00:38:08,253 --> 00:38:10,455 controversy about vulnerability.
967 00:38:10,455 --> 00:38:12,204 Some people say you should exclude people 968 00:38:12,204 --> 00:38:13,057 who are vulnerable, 969 00:38:13,057 --> 00:38:14,592 and I'll talk about that again in a minute. 970 00:38:14,592 --> 00:38:16,861 But, again, it's contentious. 971 00:38:18,096 --> 00:38:19,063 It's also a part of 972 00:38:19,063 --> 00:38:22,567 fair participant selection that has us think about 973 00:38:22,567 --> 00:38:25,627 how we fairly distribute the harms and benefits of 974 00:38:25,627 --> 00:38:26,204 research. 975 00:38:26,204 --> 00:38:27,983 In other words, don't pick 976 00:38:27,983 --> 00:38:29,808 the same people over and over again, 977 00:38:29,808 --> 00:38:31,900 because they then accept all the risks 978 00:38:31,900 --> 00:38:34,212 without necessarily getting the benefits. 979 00:38:34,512 --> 00:38:35,613 Don't select a group 980 00:38:35,613 --> 00:38:37,119 that's always going to take the risks 981 00:38:37,119 --> 00:38:38,716 when the benefits go to somebody else. 982 00:38:39,517 --> 00:38:43,060 Those kinds of considerations are really 983 00:38:43,060 --> 00:38:43,755 important 984 00:38:43,755 --> 00:38:47,058 at the front end in selecting participants. 985 00:38:47,358 --> 00:38:51,886 And I mentioned this earlier, but there's a really interesting 986 00:38:51,886 --> 00:38:52,397 switch 987 00:38:52,397 --> 00:38:56,239 that's happened over time, where both of these end up being 988 00:38:56,239 --> 00:38:56,968 important, 989 00:38:56,968 --> 00:39:00,251 but they really are historically situated in 990 00:39:00,251 --> 00:39:01,172 different ways. 991 00:39:01,172 --> 00:39:03,648 So research was seen for a long time 992 00:39:03,648 --> 00:39:04,542 as a burden, 993 00:39:04,909 --> 00:39:08,889 and participants needed protection from that burden or from the 994 00:39:08,889 --> 00:39:09,280 risk.
995 00:39:09,814 --> 00:39:12,984 And as I mentioned earlier, if it's seen as a benefit, 996 00:39:12,984 --> 00:39:14,919 then people will need access to research. 997 00:39:14,919 --> 00:39:17,589 And so both of these are still true, 998 00:39:17,589 --> 00:39:20,542 but the pendulum swings back and forth a little bit between 999 00:39:20,542 --> 00:39:20,992 the two. 1000 00:39:21,960 --> 00:39:23,928 So I mentioned protecting vulnerable groups. 1001 00:39:23,928 --> 00:39:25,096 There's a lot of question about 1002 00:39:25,096 --> 00:39:28,145 what vulnerability actually means in the context of 1003 00:39:28,145 --> 00:39:28,700 research. 1004 00:39:29,200 --> 00:39:31,703 And if we protect people 1005 00:39:31,703 --> 00:39:34,706 too much, do we keep them out of research and 1006 00:39:35,106 --> 00:39:37,520 prevent them from getting the benefits of 1007 00:39:37,520 --> 00:39:38,109 research? 1008 00:39:38,376 --> 00:39:39,410 I'm not sure. 1009 00:39:39,410 --> 00:39:41,923 Is there another session on vulnerability 1010 00:39:41,923 --> 00:39:42,413 at all? 1011 00:39:43,081 --> 00:39:44,048 Not specifically. 1012 00:39:44,048 --> 00:39:47,318 It comes up for children. 1013 00:39:47,619 --> 00:39:49,654 Yeah. Okay. 1014 00:39:49,654 --> 00:39:52,590 There are some real challenges to selecting the appropriate 1015 00:39:52,590 --> 00:39:53,925 participants for a study. 1016 00:39:53,925 --> 00:39:55,919 For example, this is one that comes up all 1017 00:39:55,919 --> 00:39:56,394 the time. 1018 00:39:56,394 --> 00:39:59,192 Is it preferable to test an early, 1019 00:39:59,192 --> 00:40:01,332 potentially risky therapy 1020 00:40:01,733 --> 00:40:05,603 in relatively healthy affected adults who can give their own consent? 1021 00:40:06,271 --> 00:40:09,274 Or in severely ill infants 1022 00:40:09,607 --> 00:40:12,017 who are otherwise likely to die as infants 1023 00:40:12,017 --> 00:40:14,312 if they don't get something that works?
1024 00:40:14,312 --> 00:40:17,448 That's a real dilemma that has happened in studies. 1025 00:40:17,849 --> 00:40:19,923 Another example of a challenge is: 1026 00:40:19,923 --> 00:40:22,186 when do you enroll pregnant persons 1027 00:40:22,186 --> 00:40:22,687 in research? 1028 00:40:22,687 --> 00:40:26,424 This is another topic that could take a whole lecture. 1029 00:40:26,424 --> 00:40:27,292 So, 1030 00:40:28,626 --> 00:40:31,001 the next element is a favorable 1031 00:40:31,001 --> 00:40:31,629 risk-benefit ratio. 1032 00:40:32,130 --> 00:40:35,533 And the questions really to ask are: are the risks to subjects necessary, 1033 00:40:36,301 --> 00:40:38,494 and are they minimized to the extent 1034 00:40:38,494 --> 00:40:39,103 possible? 1035 00:40:39,103 --> 00:40:41,090 And then there's a judgment that all of us 1036 00:40:41,090 --> 00:40:42,840 have to make, as investigators, 1037 00:40:42,840 --> 00:40:45,843 research teams, IRB members, even participants: 1038 00:40:46,077 --> 00:40:49,914 are the risks justified by either 1039 00:40:49,914 --> 00:40:52,737 benefit to the individual or the social value of the 1040 00:40:52,737 --> 00:40:53,117 study? 1041 00:40:53,785 --> 00:40:56,054 And that's, you know, an incommensurate 1042 00:40:56,054 --> 00:40:59,273 evaluation, people say, but it's an important one to 1043 00:40:59,273 --> 00:40:59,624 make. 1044 00:40:59,957 --> 00:41:02,960 And then many, 1045 00:41:03,161 --> 00:41:06,164 many argue for the need also to enhance benefits. 1046 00:41:07,031 --> 00:41:08,488 Importantly, as the Belmont Report 1047 00:41:08,488 --> 00:41:10,201 points out, and as I've mentioned only 1048 00:41:10,201 --> 00:41:12,473 a couple of times, sometimes interests 1049 00:41:12,473 --> 00:41:14,505 other than those of the subjects.
1050 00:41:14,505 --> 00:41:18,409 In other words, interests other than those of the participants in the study can justify risks, 1051 00:41:18,409 --> 00:41:20,682 as long as the rights of the participants 1052 00:41:20,682 --> 00:41:21,846 have been protected. 1053 00:41:24,148 --> 00:41:25,683 Lots of challenges. 1054 00:41:25,683 --> 00:41:29,821 One is: how do you identify which risks and which benefits 1055 00:41:29,821 --> 00:41:30,922 count? 1056 00:41:30,922 --> 00:41:33,246 And there's been a lot of interesting 1057 00:41:33,246 --> 00:41:34,125 debate about, 1058 00:41:34,125 --> 00:41:35,794 you know, some of the guidance 1059 00:41:35,794 --> 00:41:37,261 that says we should consider 1060 00:41:37,261 --> 00:41:41,457 physical, psychosocial, and even economic and legal 1061 00:41:41,457 --> 00:41:42,033 risks, 1062 00:41:42,400 --> 00:41:46,411 but we don't consider physical, psychosocial, economic, and legal 1063 00:41:46,411 --> 00:41:47,038 benefits. 1064 00:41:47,038 --> 00:41:49,707 So how do we deal with that? 1065 00:41:49,707 --> 00:41:51,242 How do we minimize risks? 1066 00:41:51,242 --> 00:41:54,121 And you'll hear from some of the 1067 00:41:54,121 --> 00:41:57,348 other speakers that in protecting certain 1068 00:41:57,348 --> 00:41:59,627 groups of people, one of the 1069 00:41:59,627 --> 00:42:01,786 strategies we use is regulation: 1070 00:42:01,786 --> 00:42:04,555 we limit the amount of risk that people can be exposed to. 1071 00:42:04,555 --> 00:42:06,752 That's what happens with children, with 1072 00:42:06,752 --> 00:42:08,893 pregnant persons, and with prisoners.
1073 00:42:09,894 --> 00:42:12,611 There's a huge debate about direct versus 1074 00:42:12,611 --> 00:42:14,532 indirect benefits, you know, 1075 00:42:14,532 --> 00:42:18,302 direct benefits being the kind of benefit that can 1076 00:42:18,302 --> 00:42:19,926 justify a certain amount of risk, 1077 00:42:19,926 --> 00:42:20,738 whereas indirect 1078 00:42:20,738 --> 00:42:24,075 benefits like money or access to good clinicians or 1079 00:42:24,609 --> 00:42:29,881 access to other kinds of care may not be the kind of benefit 1080 00:42:29,881 --> 00:42:32,311 that you want to justify a risk based on, 1081 00:42:32,311 --> 00:42:34,385 but they still happen in research. 1082 00:42:35,353 --> 00:42:35,853 And then 1083 00:42:35,853 --> 00:42:38,823 determining that level is a challenge. 1084 00:42:39,123 --> 00:42:41,014 Just two more, three more, I guess. 1085 00:42:41,014 --> 00:42:42,126 Independent review, 1086 00:42:42,593 --> 00:42:44,522 which ensures that the regulatory and 1087 00:42:44,522 --> 00:42:46,497 ethical requirements have been fulfilled, 1088 00:42:46,864 --> 00:42:51,156 is a process that checks investigator biases and 1089 00:42:51,156 --> 00:42:51,936 conflicts 1090 00:42:52,403 --> 00:42:54,611 and assures the public that the research 1091 00:42:54,611 --> 00:42:56,874 is not exploiting individuals or groups, 1092 00:42:56,874 --> 00:42:59,382 and you're going to learn more about IRBs 1093 00:42:59,382 --> 00:43:00,178 later today.
1094 00:43:00,945 --> 00:43:03,217 I just wanted to put up the regulatory 1095 00:43:03,217 --> 00:43:03,815 criteria, 1096 00:43:03,815 --> 00:43:07,952 both in 45 CFR 46 and 21 CFR 56, 1097 00:43:08,252 --> 00:43:10,313 because you can see that they mimic 1098 00:43:10,313 --> 00:43:12,256 a lot of what I've already said: 1099 00:43:12,790 --> 00:43:15,652 that risks are minimized and justified by anticipated 1100 00:43:15,652 --> 00:43:16,260 benefits, 1101 00:43:16,661 --> 00:43:19,530 even if those are only to the participants, 1102 00:43:19,530 --> 00:43:22,989 that subject selection is fair, and that informed consent is 1103 00:43:22,989 --> 00:43:23,668 adequate. 1104 00:43:23,668 --> 00:43:26,971 And so our next... oops, I forgot about this. 1105 00:43:29,173 --> 00:43:31,402 Two regulatory bodies you should know 1106 00:43:31,402 --> 00:43:33,511 about: the Office for Human Research 1107 00:43:33,511 --> 00:43:36,023 Protections, which is the Department 1108 00:43:36,023 --> 00:43:38,116 of Health and Human Services 1109 00:43:39,217 --> 00:43:40,118 office 1110 00:43:40,118 --> 00:43:43,688 that oversees research funded by HHS. 1111 00:43:44,021 --> 00:43:45,776 And then we have an intramural Office 1112 00:43:45,776 --> 00:43:47,625 of Human Subjects Research Protections, 1113 00:43:48,025 --> 00:43:51,395 which is the Federalwide 1114 00:43:51,395 --> 00:43:54,398 Assurance holder for the intramural program. 1115 00:43:56,501 --> 00:43:57,568 Let me just get that. 1116 00:43:57,568 --> 00:44:00,571 So the next one is informed consent. 1117 00:44:00,771 --> 00:44:04,075 And this basically is a process that ensures that individuals 1118 00:44:04,075 --> 00:44:05,943 have the opportunity to decide 1119 00:44:05,943 --> 00:44:07,566 whether they want to participate or 1120 00:44:07,566 --> 00:44:09,514 continue to participate, and whether it's 1121 00:44:09,514 --> 00:44:12,517 compatible with their goals, their interests, their values.
1122 00:44:13,618 --> 00:44:15,660 When we first put out this 1123 00:44:15,660 --> 00:44:17,021 framework, many people, 1124 00:44:17,021 --> 00:44:18,523 asked what makes clinical research 1125 00:44:18,523 --> 00:44:20,191 ethical, said, oh, informed consent. 1126 00:44:20,691 --> 00:44:22,928 Well, as you can see, this is the seventh 1127 00:44:22,928 --> 00:44:24,729 on the list of eight principles. 1128 00:44:24,729 --> 00:44:28,397 So the point is that you don't ask people to 1129 00:44:28,397 --> 00:44:29,333 participate 1130 00:44:29,700 --> 00:44:32,043 until all of the other things are 1131 00:44:32,043 --> 00:44:32,703 satisfied. 1132 00:44:33,070 --> 00:44:34,472 You have to have a good question 1133 00:44:34,472 --> 00:44:37,074 and a valid scientific methodology, 1134 00:44:38,109 --> 00:44:39,727 make sure you understand the risks and 1135 00:44:39,727 --> 00:44:40,111 benefits, 1136 00:44:40,111 --> 00:44:43,518 and have an independent review before you invite people to 1137 00:44:43,518 --> 00:44:44,282 participate. 1138 00:44:45,183 --> 00:44:47,552 And informed consent has a lot of pieces. 1139 00:44:47,552 --> 00:44:48,753 I think I'm coming next week 1140 00:44:48,753 --> 00:44:50,679 to speak to you in more depth about it, 1141 00:44:50,679 --> 00:44:52,456 so I'll probably skip this for now. 1142 00:44:53,057 --> 00:44:55,693 There are a lot of interesting challenges right now 1143 00:44:55,693 --> 00:44:58,696 in terms of informed consent, focusing on 1144 00:45:00,431 --> 00:45:02,767 long-standing problems: the quality 1145 00:45:02,767 --> 00:45:05,905 of informed consent, and how we assess capacity to 1146 00:45:05,905 --> 00:45:06,470 consent. 1147 00:45:06,837 --> 00:45:09,890 But there are also lots of changes in how we obtain informed 1148 00:45:09,890 --> 00:45:10,575 consent now.
1149 00:45:10,975 --> 00:45:12,998 And some of that is based on changes 1150 00:45:12,998 --> 00:45:14,178 in research methods. 1151 00:45:16,247 --> 00:45:19,016 The eighth of the principles 1152 00:45:19,016 --> 00:45:21,852 is what's called respect for enrolled participants. 1153 00:45:21,852 --> 00:45:25,189 And this is probably another one that needs a lot of attention. 1154 00:45:25,456 --> 00:45:27,391 And here's what I mean by it. 1155 00:45:27,391 --> 00:45:29,353 Early on, when we had just developed this 1156 00:45:29,353 --> 00:45:30,027 framework, 1157 00:45:30,027 --> 00:45:31,028 there was this idea, 1158 00:45:31,028 --> 00:45:32,657 as I just mentioned, that some people say, 1159 00:45:32,657 --> 00:45:34,131 what makes clinical research ethical? 1160 00:45:34,131 --> 00:45:35,466 You get your IRB to review it 1161 00:45:35,466 --> 00:45:36,133 and then you get 1162 00:45:36,133 --> 00:45:39,203 the person's informed consent and you're done; ethics is done. 1163 00:45:40,004 --> 00:45:41,798 And that just makes no sense at all, 1164 00:45:41,798 --> 00:45:43,741 because at the beginning of the trial, 1165 00:45:43,741 --> 00:45:47,315 you're just beginning to engage with the people that are in the 1166 00:45:47,315 --> 00:45:47,712 study, 1167 00:45:48,045 --> 00:45:51,271 and you have responsibilities, ethical responsibilities, to 1168 00:45:51,271 --> 00:45:52,049 those people. 1169 00:45:52,383 --> 00:45:54,964 And you have ethical responsibilities 1170 00:45:54,964 --> 00:45:56,220 to, you know, 1171 00:45:56,220 --> 00:45:58,698 the world in terms of reporting your data 1172 00:45:58,698 --> 00:46:00,391 and reporting your science. 1173 00:46:00,391 --> 00:46:03,723 So there's a lot that happens after informed consent is 1174 00:46:03,723 --> 00:46:04,328 obtained.
1175 00:46:04,862 --> 00:46:08,699 And this is just a partial list: protecting confidentiality, 1176 00:46:08,699 --> 00:46:12,213 monitoring welfare, the right to withdraw, providing 1177 00:46:12,213 --> 00:46:13,604 information, etc. 1178 00:46:14,805 --> 00:46:17,408 And there are some interesting challenges. 1179 00:46:17,408 --> 00:46:18,809 We don't do a good job 1180 00:46:18,809 --> 00:46:20,642 of providing results or information 1181 00:46:20,642 --> 00:46:22,580 to participants or their communities 1182 00:46:22,580 --> 00:46:25,583 after studies are done. 1183 00:46:25,883 --> 00:46:27,385 So this is our framework. 1184 00:46:27,385 --> 00:46:31,022 Systematic, sequential, and, we argue, necessary, 1185 00:46:31,389 --> 00:46:33,363 though sometimes there are waivers of certain 1186 00:46:33,363 --> 00:46:35,026 of the procedural requirements. 1187 00:46:35,326 --> 00:46:36,460 Universal, 1188 00:46:36,460 --> 00:46:39,697 although it has to be adapted to the context. 1189 00:46:40,264 --> 00:46:42,900 It does require balancing, specification, 1190 00:46:42,900 --> 00:46:44,807 and judgment, and you'll hear about that 1191 00:46:44,807 --> 00:46:45,903 throughout the course. 1192 00:46:45,903 --> 00:46:49,810 And we need people, informed investigators, IRB members, and 1193 00:46:49,810 --> 00:46:50,307 others 1194 00:46:50,307 --> 00:46:52,091 interested in research, to be able 1195 00:46:52,091 --> 00:46:54,145 to understand what the principles are 1196 00:46:54,145 --> 00:46:54,945 and how to apply them.