I'm happy to say that, as far as I can tell, the Clinical Center was one of the very first institutions in the U.S., and probably in the world, to develop a policy on research with adults who can't give informed consent. If any of the thousands of people who watch this online have other examples, I'd be very interested in hearing about them. We call the policy 87-4: it started in 1987, and 4 is just its number. That policy has been revised a number of times. Basically, what that policy aims to do is implement four important safeguards for research with individuals like Miss P, adults who can't give informed consent.

First is what you might call the compelling reason, or the necessity, requirement. The idea here is that if you can do your research just as well with people who can give informed consent, then don't enroll people who can't give informed consent. If those investigators could have done their research equally well with people who can consent, they shouldn't enroll Miss P. Only enroll Miss P, only enroll a 3 year old, only enroll somebody who's unconscious, when you have to enroll people from that population in order to answer an important scientific question.
The way our policy puts that is that the IRB has to document that there's a compelling reason to enroll individuals who can't give informed consent, adults in this case.

The second safeguard is a surrogate decision maker. If you're going to enroll somebody, and you have a compelling reason to enroll them for an important scientific reason, but they can't make their own decisions about whether or not to be in research, you should have somebody else who decides whether or not they're in the research. Again, the buzz term for this in the U.S. regulations is a legally authorized representative, or an LAR; I'll just call them a surrogate decision maker for today. So you want somebody else who decides for that person. And the crucial point is that surrogates aren't supposed to be asking whether they want the person in the study. Miss P's husband is not supposed to say, do I want her in the study; Miss P's husband is supposed to say, would she want to be in the study. It's the test that I mentioned before: if she were able to make her own decisions, what would she say? That's sometimes described as a substituted judgment standard. If Miss P could make this decision for herself, what would she say?
If we're confident she would say yes, then that provides us good reason to think that, again, we're not taking advantage of her, exploiting her inability to understand. If she would say no, then that's a pretty compelling reason not to enroll her in the study. Unfortunately, there will be a lot of cases where you're not sure what the answer to that question is, and there what we're typically supposed to do is apply the best interest standard: just figure out what would be best for this particular person.

So here's the way our policy does it. It uses substituted judgment: there has to be a compelling reason, then there needs to be a surrogate, and the surrogate should have sufficient reason to believe participation in this study is consistent with the individual's preferences and values. So you might ask things like: have you ever talked to her about being in research? Do you think this is the kind of thing she would want to do? Why do you think she would want to be in this study? Why do you think she would be willing to undergo, say, a lumbar puncture in order for investigators to collect data that might help other patients with dementia?

So I mentioned task specificity before.
That was task specificity with respect to understanding a particular study. Here the point is that it's possible that somebody can't give informed consent for a study, but nonetheless they retain the capacity to assign a surrogate to make decisions for them. I think that probably was the case with Miss P. She couldn't possibly understand that research study, but she recognized her husband, she knew who her husband was, she trusted her husband, she trusted her husband to make medical decisions for her, she trusted her husband to decide how her life went.

This is another nice study by Scott Kim, whom I mentioned. According to their data, a little more than one-third of people who were deemed not able to consent for themselves were able to identify and name a surrogate, somebody else to make decisions for them. So again, the important point is that the fact that somebody can't consent doesn't mean that they can't make any decisions and you just have to pick somebody for them. It might be that they can pick a surrogate for themselves.

Okay, so those are the first two protections, compelling reason and surrogate. The third one, and this is taken from the pediatric regulations, is assent and dissent.
Basically, the idea here is that even if a person can't understand fully enough to make an independent, autonomous decision, it doesn't mean we should just ignore them and not care about what they want to do, and that's respected by the assent and dissent requirements. The idea is that you should still explain the study, and what's going to happen, to the extent the person can understand it, and solicit their affirmative agreement, their willingness to be in it. And if somebody dissents, that should be respected. There's a nice paper on this by Betty Black and some of her colleagues at Hopkins from a couple of years ago.

Here's one of the real challenges, particularly when you start thinking about people who can't give informed consent. They don't fully understand, and they don't always communicate clearly, and it's not always obvious when they're doing something whether what they're doing is an expression of dissent. So imagine somebody who just pulls their arm away when you want to do a blood draw. Does that mean they don't want the blood draw? Or does that mean they don't want to be in research? Or does that mean they don't understand what you're doing?
Or does that just mean they're worried, that they're scared, and there's maybe something you could do to try to reassure them? So what I try to tell people is that in cases like this, when you get some maybe unclear sign, stop and evaluate it. See what's going on, and then see if there's something that you can do to address it. Maybe the person was just surprised that you grabbed their arm, so it just needs to be explained. Maybe they just need a break. Maybe they're just nervous and you need to explain what's happening. In a lot of cases, by working through it, you can figure out what's going on and address their concerns.

Okay. The last safeguard is a risk/benefit evaluation. Here I'll give you the background, and people can think about this as a bonus quiz question: at least in the U.S. regulations, there are no risk limits on research studies with adults who can give informed consent. Now, that doesn't mean these studies are going to be approved, but at least in principle, the regulations don't block IRBs from approving studies that offer no chance of benefit and have relatively high risk, if the subjects are competent and give voluntary and informed consent.
Now, when we talk about people who can't give informed consent, there's almost unanimous agreement that even if that standard, no upper threshold on risks, is appropriate for competent adults, when we go to people who can't give informed consent, such as adults like Miss P, we shouldn't have that kind of allowance. We should have much stricter limits on the level of risks that these people can be exposed to.

Let's talk about this. Basically, the way it's done right now is roughly the idea that either the risks have to be low or there has to be a compensating potential for benefit, so the risks are justified by a potential for benefit. And then, of course, that raises a further challenge for people who work in research ethics: developing a systematic framework to help people make these judgments, to figure out what the risks are, what the potential benefits are, and to what extent the potential benefits do or don't justify the risks.

So to give you that example, remember the study with Miss P. Do people remember what the procedures were? Is anybody's short-term memory functioning? Do you remember?
Female Speaker: MRI.
MRI. [inaudible commentary] Behavioral testing. Computer testing.
And lumbar puncture. All right, so here's the standard. You should only do research with somebody like Miss P when either it's going to help her, and this research isn't going to help her, or the risks are sufficiently low. So the question, if I put IRB hats on you guys, would be whether or not the risks of that study are sufficiently low that it's acceptable to do it in somebody who can't give informed consent. Is it okay to have them do an MRI? Okay to have them do an LP? Okay to have them do blood draws? And then, of course, is it okay to have them exposed to the risks of all of those things in the context of a single study?

All right. The threatened part of the lecture: quizzes. So here's what we'll do. You have to talk about one of them before you guys are allowed to leave here. I'll give you three quiz questions, and we can see if there's one that strikes anybody's fancy, or I can pick one. So here's number one. As I mentioned, remember the compelling justification.
The compelling justification requirement says you shouldn't enroll individuals who can't give informed consent when you can conduct the research equally well with people who can give informed consent. Now here's the question: what counts as a compelling justification? What if you're the IRB and the investigator says, "I don't need to enroll this person who can't give informed consent. I can do my study with people who can give informed consent, but this is a very promising drug, and it might help this person"? So imagine now you've got somebody who has cognitive impairment, they can't give informed consent, they have diabetes, and here's a potentially very promising drug for diabetes. What would you say in terms of enrolling them?

And I can tell you this is a very lively debate; we actually debate this in our department. It seems, at least on the face of it, you could say one of two things. You could say, well, if it really has a chance to help her, why would you exclude the person just because they can't give informed consent? That almost seems like turning protection into a kind of discrimination.
If it really has potential benefit, we should let the person enroll. Or you might say no: this is research, we can never be sure, there are all these risks. We can collect the data from people who can give informed consent. Informed consent is a crucial safeguard, and we shouldn't enroll people who can't give informed consent when we can do the research with people who can give informed consent. And I can tell you, those are both widely endorsed views, both in our department here and also in the literature. That's quiz question number one.

Quiz question number two. Imagine a study; I can give you a couple of examples of these. We do conferences around the world on research ethics, and we were in the Philippines about five or six years ago, where I was giving a lecture very similar to this one. Afterwards, an investigator came up to me and told me about a disease that they were studying, a very rare neurological disease. And they had this potential new treatment. Very new, very experimental; it hadn't been given to very many people, might be helpful, but also could be very risky. And here's the challenge.
You could give that drug and test it in people who can't give consent, but for whom it poses low risk. So the idea in this case was that this investigator had two populations. He had people who had very early disease: the disease hadn't progressed very far, and they were perfectly competent and able to understand. And he said to me, "I could do the research in that group. But their brains are basically fully intact, and the risks of this study, of this drug, are cognitive effects. So there's great risk in that population, because their brains are intact, and if this drug turns out to be harmful, it could harm their brains." But they could understand and give consent, and they could understand those risks. That's the first population.

The second population is people who have advanced disease. The condition has already largely done its damage to their brains. So their brains have already been damaged in this way, and as a result, the drug doesn't pose as many risks to those people, because they've already suffered most of the deficits that they're going to get. So even if the drug is risky, it's not going to hurt them much more than they've already been harmed.
So does everybody have this setup? You've got one population that can understand but faces high risks, and another population that can't understand but faces relatively low risks, and the question is: which group should you do it in? There's actually a very interesting historical example we could talk about for this one, but I'm going to give you the next one.

Quiz question three. This is the one where I said there should be a surrogate, somebody, a legally authorized representative, who is the one who makes decisions for people like Miss P, adults who can't understand or make their own decisions. And the question is: who is an appropriate surrogate in this regard? This is a very challenging question in the U.S. What the U.S. federal regulations say is you need a legally authorized representative. And then you say, okay, well, let's look at the regulations: who's a legally authorized representative? The answer, according to the federal regulations, is you look to state law, and state law determines who counts as a legally authorized representative. The problem is that most states in the U.S. typically don't have laws with respect to research.
So what you have to do then is start looking at the laws with respect to surrogates for clinical care. And that raises some questions: to what extent is a clinical surrogate appropriate to enroll somebody in research? Almost every state, I think maybe every state, though I'm not a lawyer, has a next-of-kin hierarchy. So if I don't fill out an advance directive, if I don't name a surrogate for myself, and I'm in a car accident, then there's the next-of-kin hierarchy, and somebody on it is chosen to make clinical care decisions for me. Is that an appropriate person to also make research decisions? Should we just leave it to the next of kin to make decisions for them?

All right. That was quiz question one, quiz question two, and quiz question three. We've got six minutes, which means I'm not going to let you guys out too early. So does somebody want to give me the answer, so I can write up a paper on one of these three? I don't have a clear answer to any of these questions. Quiz question one: a prospect of benefit when people can't consent. Do you enroll them or exclude them? Quiz question two: do you do it in the people who can't consent, with low risks, or the people who can consent, with high risks?
Quiz question three: surrogates. Anybody have thoughts? Are any of these easier or harder? More interesting? Should I give you one more example? You tell me if this is a mistake or not. So I'll talk about this one. This one's contentious because it ended up in court, and getting the facts isn't necessarily easy. I know people know the case of Jesse Gelsinger, who was an older teenager who enrolled in a study. Basically, at least on one description of the case, what was going on was there was an option. This was a gene transfer study, a very early gene transfer study. And the option was, again, it's all contentious, who knows if this is right, but on one description, they could do the study in very young kids who couldn't understand it because they were very young, but for whom, because of their immune systems and their disease condition, the research would likely not be terribly risky, although it still would be risky. Or they could do it in older teenagers who, according to most data, can understand, but who would face much higher risks from it. So who do you do that study in?
A first, early gene transfer study: you could do it in 3 year olds, who can't possibly understand, but you could get the permission of their parents, and the risks are relatively low. Or you do it in 19 year olds who face much higher risks but can agree for themselves, consent for themselves. All right, ready? Show of hands; everybody's got the question? All right, who says you do it in the 3 year olds? Can't understand, but risks, let's call them relatively low.

All right. How about you do it in the 19 year olds? Higher risks, but they can understand, and they're making the decision for themselves; nobody else is making the decision for them. So, almost an even split. Somebody want to give me an argument one way or the other? A strong reason? All right, I'll try to repeat it so we get it into the mic. Yeah.
Male Speaker: Are the benefits equal in these examples?
Yep, benefits are equal. Let's just imagine the scientific benefits are equal. So the researcher will tell you, I'll learn as much from the 19 year old as the 3 year old, in terms of whether this works and everything. And also, to that person, similar benefits.
Now, of course, there is a slight difference: a 3 year old who gets the benefits will have them for longer in their life than a 19 year old. I don't know if you want to take that into account. Any other questions? Any arguments? So who was on the 3 year old side? Somebody want to give the argument for the 3 year old side? Anybody? Why are you doing it in the 3 year olds? You don't care about informed consent?
Female Speaker: It's just less risk.
So less risk. So the important thing is protecting the subject from being harmed. Yeah.
Female Speaker: They have a surrogate.
And they have a surrogate. They have a parent. You figure, hopefully, the parent's a pretty good decision-maker for them.
Female Speaker: And we have a lot of experience with a treatment like that, where [inaudible].
Yeah, okay. Are you guys convinced, the ones who wanted the teenagers? Everybody? I don't know. Did they win? Or did they just cow you guys into silence? Anybody want to argue for doing it with a teenager?
[inaudible commentary]
It all sounds good as long as you know what?
Female Speaker: That all the [inaudible] is correct.
Yeah, yeah. So, good. One of the things that you might be worried about is that this is research: we think these are the risks, and we think these are the potential benefits, but we never can be sure, right? And you might think that, to the extent that you're not sure, maybe it shouldn't be up to us. Maybe it shouldn't even be up to the parents, and maybe that's a reason for letting people who can understand decide for themselves.

I was going to leave everybody with this one, because I think this one is so great, and I don't know the answer to this question. But I think what's crucial and fundamental about thinking about the ethics of clinical research, thinking about vulnerable subjects, is that it presses on what I take to be two of the most important protections. Informed consent is really important, and protecting people from risks and harm is really important. And what's challenging about this case is that it doesn't let you have both, right? It says, look, I'll give you informed consent, or I'll give you low risks, but you don't get both.
381 00:20:24,323 --> 00:20:27,893 And then what you have to decide, or what we have to decide, is 382 00:20:27,893 --> 00:20:30,929 which one we're going to go with. 383 00:20:30,929 --> 00:20:34,466 And I'll give you an argument on both sides quickly before we end. 384 00:20:34,466 --> 00:20:38,270 One is, you might think, and this is true with parents -- 385 00:20:38,270 --> 00:20:41,306 we give parents a lot of leeway in making decisions for kids. 386 00:20:41,573 --> 00:20:45,544 Or you might think, in the case of Miss P and Mr. 387 00:20:45,544 --> 00:20:47,846 P, they were married for 40 years. 388 00:20:47,846 --> 00:20:51,149 You might think, okay, it's somebody else, but it's somebody 389 00:20:51,149 --> 00:20:54,453 who really knows her, and probably knows what she wants. 390 00:20:54,453 --> 00:20:55,787 Well, it turns out, 391 00:20:55,787 --> 00:20:59,391 unfortunately, that there's at least a fair amount of empirical data 392 00:20:59,391 --> 00:21:03,362 that we're not very good predictors of the preferences of other people, 393 00:21:03,362 --> 00:21:04,997 including people 394 00:21:04,997 --> 00:21:09,601 who are very close to us, people we've known for a really long time. 395 00:21:09,601 --> 00:21:13,305 So it's hard to know how much we can rely on other people. 396 00:21:13,305 --> 00:21:15,307 So that's the consent worry, 397 00:21:15,307 --> 00:21:18,877 about how much we can rely on surrogates. 398 00:21:18,877 --> 00:21:21,346 On the other hand, unfortunately, in the Jesse Gelsinger case 399 00:21:21,346 --> 00:21:25,984 -- this is an actual case -- Jesse Gelsinger died as a result of being in this study.
400 00:21:25,984 --> 00:21:28,320 Now, of course, that's not a clear answer about what we should have done, 401 00:21:28,320 --> 00:21:33,125 because maybe the 3 year olds would have died too -- we're never going to know that -- 402 00:21:33,125 --> 00:21:36,328 but it shows that these are really challenging, important questions 403 00:21:36,328 --> 00:21:41,333 that people are faced with, and that we have to try to think hard about and get right, okay? 404 00:21:41,333 --> 00:21:43,101 All right. Any last thoughts? 405 00:21:43,101 --> 00:21:44,703 Things I should have said? 406 00:21:44,703 --> 00:21:47,005 Questions we should have answered? Yes. 407 00:21:47,005 --> 00:21:49,941 Female Speaker: So a lot of times when dealing with this topic, 408 00:21:49,941 --> 00:21:51,610 people discuss informed consent, 409 00:21:51,610 --> 00:21:54,046 and this is one of the most important [inaudible] -- 410 00:21:54,046 --> 00:21:56,782 Yep. Female Speaker: -- but I very rarely hear -- 411 00:21:57,549 --> 00:21:59,284 all the protections are stacked up -- 412 00:22:02,821 --> 00:22:04,589 Yep. Female Speaker: -- at the beginning of the study. 413 00:22:04,589 --> 00:22:08,260 Right. Female Speaker: This makes perfect sense. 414 00:22:08,260 --> 00:22:09,828 But what happens with pregnant women, 415 00:22:09,828 --> 00:22:12,197 with children, with prisoners, once they're in the study and it's approved? 416 00:22:12,197 --> 00:22:14,933 Are there still additional -- Yeah. Female Speaker: -- [inaudible] over 417 00:22:14,933 --> 00:22:15,567 other subjects? 418 00:22:15,567 --> 00:22:18,203 Or at that point is everybody equally considered? Good. 419 00:22:18,203 --> 00:22:18,837 Great question. 420 00:22:18,837 --> 00:22:23,608 So as you pointed out, most of the protections 421 00:22:23,608 --> 00:22:27,713 that we talked about are at the point of enrollment, or before enrollment. 422 00:22:27,713 --> 00:22:29,948 So IRB review, that happens before enrollment.
423 00:22:29,948 --> 00:22:33,452 And then at the point of enrollment, a surrogate, or extra 424 00:22:33,452 --> 00:22:36,621 -- for Maki, it was an independent assessment of her voluntariness. 425 00:22:36,855 --> 00:22:41,827 So a lot of this goes on before a person is enrolled and starts in a study. 426 00:22:41,827 --> 00:22:45,197 So that's where I think typically most of the protections are. 427 00:22:45,197 --> 00:22:49,167 So your question is, but what about added protections as the study goes along? 428 00:22:49,167 --> 00:22:52,371 So there are protections for everybody as the study goes along. 429 00:22:52,371 --> 00:22:56,174 So for instance, typically in the U.S., each study has to be re-reviewed 430 00:22:56,174 --> 00:22:57,642 at least once a year. 431 00:22:58,877 --> 00:22:59,478 You have 432 00:22:59,478 --> 00:23:02,547 to find out how things are going, whether or not 433 00:23:02,547 --> 00:23:05,917 anybody's been hurt, whether or not there's any new information on risks. 434 00:23:05,917 --> 00:23:09,921 So there's ongoing evaluation of the study on the part of the IRB. 435 00:23:09,921 --> 00:23:12,057 Also, as is always said, informed consent 436 00:23:12,057 --> 00:23:14,826 isn't just at the point of enrollment, it's ongoing. 437 00:23:14,826 --> 00:23:16,661 So somebody can always say no. 438 00:23:16,661 --> 00:23:18,163 Somebody can always withdraw. 439 00:23:18,163 --> 00:23:19,898 That's true of a competent adult. 440 00:23:19,898 --> 00:23:22,501 For a child, it would be that they could dissent. 441 00:23:22,501 --> 00:23:24,803 Or with Miss P, she could dissent. 442 00:23:24,803 --> 00:23:26,538 She enrolls, but a week later 443 00:23:26,538 --> 00:23:30,609 she might want to say, I want to go home, so there's that protection. 444 00:23:30,609 --> 00:23:32,844 But that's also a protection that applies to competent adults as well.
445 00:23:32,844 --> 00:23:33,578 So typically, 446 00:23:33,578 --> 00:23:39,184 I would say that most of the protections for vulnerable subjects happen early on. 447 00:23:39,184 --> 00:23:42,821 What you could do -- there are possibilities for added ones 448 00:23:42,821 --> 00:23:44,456 as the study goes along. 449 00:23:44,456 --> 00:23:46,458 So for instance, in the U.S., 450 00:23:46,458 --> 00:23:51,396 which I know the best, there are provisions for things like independent monitors. 451 00:23:51,863 --> 00:23:56,802 So what you could do is you could say, okay, we're worried about Miss P. 452 00:23:56,802 --> 00:24:01,072 We're going to enroll her, but I'm going to have somebody check in 453 00:24:01,072 --> 00:24:05,343 on her once every couple of days just to make sure she's okay. 454 00:24:05,343 --> 00:24:08,814 There is a group, actually, called the Human Subjects Protection Unit, 455 00:24:08,814 --> 00:24:12,284 run out of the office of the clinical director of the Mental Health 456 00:24:12,284 --> 00:24:15,454 Institute here, and that's a very novel way of doing that. 457 00:24:15,454 --> 00:24:17,656 And they do that for some studies. 458 00:24:17,656 --> 00:24:20,525 Studies where there are people who are identified as vulnerable. 459 00:24:20,525 --> 00:24:23,695 There are monitors, and they stay with the people throughout 460 00:24:23,695 --> 00:24:27,165 their participation in the study, and they check in on them 461 00:24:27,165 --> 00:24:30,969 every once in a while to make sure things are going okay. 462 00:24:30,969 --> 00:24:35,073 So that is one case where there's, I think, a valuable added protection 463 00:24:35,073 --> 00:24:36,007 for vulnerable subjects. 464 00:24:36,007 --> 00:24:39,144 So there are things like that you can do. 465 00:24:39,144 --> 00:24:41,580 Without having any good data on it, 466 00:24:41,580 --> 00:24:44,583 I think it's probably relatively rare that that happens.
467 00:24:44,583 --> 00:24:47,118 Although it is happening here in a number of studies. 468 00:24:47,118 --> 00:24:51,456 Anything else? All right. Thank you.