Good morning. I'm going to try to do two things in this talk, which I hope don't come across as contradictory. If they do, please let me know afterwards and I'll try not to do it, or at least I'll try to avoid the apparent contradiction in the future.

So here are the two things. The first is to argue that doing risk-benefit assessment is obviously central to the ethics of clinical research, but it's not done very often, it's not done systematically, and I don't think it's done very well. So the first thing I want to do is present a framework for doing systematic assessments of research risks, and then make the argument that it's important to do this systematically. This is anecdotal, but I'm convinced by now — this is where I've ended up after doing this for 30 years — that people just aren't doing this, and they're not doing it in a systematic way. So that's the first part: we're not doing it systematically, and we need to do it systematically.

For people who just care about the basics of this, that framework is enough; you can just not pay attention to the rest of the talk. The second part of the talk, which I hope doesn't seem in tension with the first part, is to think about some of the different challenges and complexities that arise when you start thinking about how to do risk-benefit assessment. I think the basics are pretty straightforward, but I think there are also a lot of really interesting questions that arise once you start thinking about this carefully, and a lot of these questions haven't been addressed.
So people know the Belmont Report — the Belmont Report is one of, if not the, founding document for research ethics, particularly research ethics in the United States. And here are some of the things it has to say with respect to risk-benefit assessment. It says things like: we should try to do it as systematically as we can, as clearly as we can. I like this last one, which is just never done: you shouldn't do it based on intuition. Risk-benefit assessments are empirical questions, right? They're questions of: if I take a certain drug, what are the chances it's going to help me, and what are the chances it's going to hurt me? Those are very empirical questions, and presumably, in order to address them, it's valuable to have as much of the relevant data as you can. Belmont emphasized that 50 years ago now, and basically nobody does it. So one of the things we've been trying to do is convince people of the value and the importance of doing that.

So I'm just going to give — when I give talks, I usually say I go off on tangents, but what I've realized is that if I put some of them on the slides, they're not actually tangents, and then I get to do more of them. So I'm sneaking in more tangents here, leading into this talk. This is a paper, you can see, that we did 20 years ago now. And Seema was actually a postbac fellow here — she came through the department three times, and this was her first time. This was a survey of IRB chairpersons who had worked in pediatric research and reviewed pediatric research studies. We gave them a list of common research interventions — a blood draw, allergy skin testing, a bone marrow biopsy — and we just asked them: how risky are these? Are they minimal risk?
Are they a minor increase over minimal risk? Are they more than a minor increase over minimal risk? You'll learn more about those categories, and why they're the relevant ones, in another talk. But that's the assessment we did. And the data everybody focuses on from that paper are two things. One, IRB chairs tend to be really cautious: a big chunk of them said that even a single blood draw was more than minimal risk, which it just isn't. That was the first thing. The second thing was the dispersion — just an amazing amount of disagreement. For instance, for allergy skin testing, almost a third thought it was minimal risk, almost a third thought it was a minor increase over minimal risk, and almost a third thought it was more than a minor increase over minimal risk. So why are we getting that much dispersion?

Well, the other aspect of this study, which we couldn't publish because we didn't have tests of it, but which I thought was especially interesting: these were interviews. We called up 188 IRB chairpersons, gave them this list of interventions, and said, tell us, is this minimal risk, a minor increase over minimal risk, or more than a minor increase over minimal risk? So you multiply 188 times 8 or 9 interventions — whatever that turns out to be, it's roughly 1,500 to 2,000 specific cases in which you're asking somebody to categorize the risks of a procedure or intervention. And not in one case — zero cases — did any of the chairpersons say, "I don't know."

And that was really striking, because a lot of these things they didn't know about, they didn't have data on, and they should have said: I don't know, because I can't do what Belmont tells me to do. I can't do this systematic assessment of all of the relevant data, because I don't have it and I don't know it.
I'd have to look it up and get back to you. Nobody did that. So what's going on? I think part of the problem is — my previous career was spent studying evolutionary biology — that one of the things we learn from evolution is that the reason we're here right now is that our predecessors were pretty good assessors of risks, and they made those assessments really, really fast.

So imagine two groups of our ancestors hearing a sound that could easily be a saber-tooth tiger. One group was very thoughtful about risks. They started discussing it and thinking: could this be a saber-tooth tiger? I don't know, maybe we should think about it for a while. Maybe we should investigate. The other group ran away when they heard the sound. Well, guess which one survived and reproduced: the second one, right? So what we've inherited is this response where we just make risk assessments really fast. That can be valuable in certain cases, but in other cases it leads us astray — particularly in research, where we're often dealing with unfamiliar things and where the heuristics we rely on aren't necessarily applicable.

So that's IRB chairpersons. Just to give you one other example: I've been doing this for a long time, so people sometimes call me up and say, could you help us decide whether this or that procedure is minimal risk or greater than minimal risk? There are some investigators who study metabolism, and they often do IV glucose tolerance tests. The head of a national consortium called me up once and said, Dave, you've got to help us. All these IRBs think that the IV glucose tolerance test is more than minimal risk,
and you have to help us convince them that it's actually minimal risk, because we're confident that it is. And I said, great — so when you submit these protocols arguing that IV glucose tolerance tests are minimal risk, what data do you present to the IRB to convince them of that? You know what the guy said to me? What are you talking about? Why would we present data? This is a nationally renowned clinical researcher who spends his career collecting data, but when it comes to ethics, he's like: what's data got to do with it, Dave? They're just minimal risk.

So nobody's doing this. IRBs aren't doing it; investigators aren't doing it. And I should point out, you can notice that the paper was 30 years after Belmont, and this example was almost 45 years after Belmont. It's been a long time, and people still aren't doing it. So that's the setup for trying to justify, or give grounds for, why we need a framework. Then I'm going to propose one and talk through it some.

Just a little background first — this is for the philosophers in the crowd; don't get too worked up about it. I'm going to talk about things like risks and benefits, and people get confused. These are the terms the regulations use. But people will say: well, a risk is a chance of a harm. So if I'm going to stick a kid with a needle, and that's going to hurt, that must not count as a risk, right? Because a risk is some probability of a harm, whereas this is basically guaranteed — if I stick a kid with a needle, it's going to hurt — so that must not be a risk; that must be something else. Now, these are really just terms of art. "Risks" applies both to cases where you're not sure whether a bad thing is going to happen and to cases where you're sure some bad things are going to happen.
And benefits aren't just good things that definitely will happen; they also include potential benefits, speculative benefits, chances of benefit. So think about these terms very broadly, okay?

So here's the framework I'm going to go through. And since we're talking in this talk about our wonderful alums: the first author was Annette Rid, who was a postdoc fellow at the time — she's now a member of the department. What I'm going to do is basically just go through each of the steps in this framework and describe how I think it works, or should work. And then, as I mentioned at the top, I'm also going to go into some of the complexities that get raised here, which I think are still really interesting questions to address for anybody interested in doing research ethics.

Okay. To start, one thing to notice is that basically every clinical trial is a composite, or an aggregate, of lots of different things. You bring people in, you weigh them, you measure them, you ask them some questions, you take two blood draws, you give them a drug, you do an MRI scan. There are all these different components that make up a given study. So the question, when it comes to risk-benefit assessment, is: what are you doing the risk-benefit assessment on, if there are all these different components? The answer is that you do both. From the perspective of an IRB, first they should look at the risk-benefit profile of each individual component — assess each one of those and make sure it's acceptable from a risk-benefit perspective — and then look at the aggregate, the study as a whole. So do both: each component, and the study as a whole.
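To make that two-level structure concrete, here is a minimal sketch in Python of a component-by-component check followed by a whole-study check. The component names, risk categories, and acceptability rules are hypothetical placeholders for the judgments the talk says an IRB has to make; they are not part of the framework itself.

```python
from dataclasses import dataclass

# Risk categories borrowed from the pediatric-research discussion above.
RISK_LEVELS = ("minimal", "minor_increase", "more_than_minor_increase")

@dataclass
class Component:
    name: str            # e.g., "blood draw", "MRI scan", "experimental drug"
    is_research: bool    # research intervention vs. clinical care
    risk_level: str      # one of RISK_LEVELS, judged from the available data

def component_acceptable(c: Component) -> bool:
    # Placeholder for the per-component judgment an IRB has to make.
    return c.risk_level in RISK_LEVELS

def whole_study_acceptable(components: list[Component]) -> bool:
    # Placeholder for the aggregate judgment: do the combined risks of the
    # whole package remain acceptable given the study's overall social value?
    return True

def assess_study(components: list[Component]) -> bool:
    """Two-level assessment: every research component must be acceptable
    on its own, and then the study is reviewed as a whole."""
    research = [c for c in components if c.is_research]
    return (all(component_acceptable(c) for c in research)
            and whole_study_acceptable(components))
```

The only point of the sketch is the shape of the procedure: per-component review first, then review of the aggregate.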
Now, not only are there lots of interventions, they're often different kinds of interventions. A lot of studies will include both clinical interventions and research interventions. This is what's standardly called an add-on trial, where you give everybody the standard of care and then, for one arm, you add something experimental on top to see if that improves on the standard of care. For the most part, when IRBs are doing this component analysis, they should be evaluating the components that are research, not the components that are clinical care. And this is a statement from the U.S. regulations basically directing IRBs to do that: what they should be looking at is the risks and benefits of the research, as distinguished from things like therapies that subjects would receive even if they weren't in the research.

Why do it that way? Well, the thought is that if an intervention is clinically indicated, then its risk-benefit profile is going to be acceptable — that's essentially what it means for it to be clinically indicated. So for the most part, IRBs don't have to worry about the clinical interventions. The caveat here — and the arrows will mark different caveats as we go through this talk — is that IRBs at least need to worry about the possibility that the add-on treatment in the one arm of the study might influence the risk-benefit profile of the standard of care. It might make it riskier, it might make it less beneficial, it might make it more beneficial — who knows. So you can't completely wall off the clinical interventions; you have to think about whether there are any interactions between the research interventions and the clinical interventions. Okay. So now here's the framework.
The first step: we have this — it's the shtick for the department — that clinical research isn't ethically acceptable unless it's socially valuable. The way you apply that here is that you first make sure, for each of the research interventions, that there's a socially valuable reason why that intervention is in the study. So you ask the investigator: you have three MRIs in this study — why are you doing three MRIs? What are you going to learn from, for instance, the third MRI that you couldn't learn from the second MRI? And you want a good explanation for that.

One of the things I'm going to emphasize is how important this actually is in practice. I sit on IRBs, and I rattle on about this stuff in IRB meetings, and then what the IRB actually says is: okay, here's the protocol, Dave — you tell us whether the risks and the benefits of this study are acceptable. And my answer is almost always: I have absolutely no idea. Because to make these assessments — whether you really need the third MRI, whether testing this new kind of treatment for leukemia is socially valuable — you have to know a lot, a lot of stuff that I just don't know about, for instance, leukemia. So whenever you do this, you really need people who have content expertise to make these judgments. You can rely on philosophers for the framework, but you can't rely on philosophers to implement the framework.

So once you've done the individual interventions, as I mentioned, you put them all together as the package that they are, and you evaluate the overall study to see whether it has social value.
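Here is a similarly hedged sketch of the per-intervention social-value screen just described — the "why the third MRI?" question. The data structure and function are illustrative assumptions, not an actual IRB procedure, and deciding whether a stated reason is good enough still requires content expertise.

```python
def interventions_lacking_justification(interventions: dict[str, str]) -> list[str]:
    """`interventions` maps each research intervention (e.g. 'third MRI') to the
    investigator's stated answer to: what will this teach us that the other
    interventions won't? Returns the ones with no substantive answer."""
    return [name for name, reason in interventions.items() if not reason.strip()]

# Hypothetical example mirroring the three-MRI case above.
proposed = {
    "first MRI": "baseline imaging needed to interpret any treatment effect",
    "second MRI": "post-treatment comparison against the baseline scan",
    "third MRI": "",  # no stated incremental value -- ask the investigator why
}
print(interventions_lacking_justification(proposed))  # ['third MRI']
```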
Typically, people will point out that you hope to gather important data from every given trial, but it's rare that any single trial gives you definitive data. Typically, you have to combine the data from one trial with data you collect from other trials. And again, this requires a lot of expertise.

Okay, so let's talk about some of the challenges at this step. One of the things I think is always interesting — and in my experience of being on IRBs for 25 years, IRBs differ on the answer to this question — is this. Imagine an investigator submits a protocol to you; that's how IRBs work, the investigator submits a protocol and the IRB reviews it. So here's the question: what's the job of the IRB? Is it the job of the IRB just to look at the study that got submitted to them and see whether or not that study is ethically appropriate? Or should somebody be able to say something like: yes, you want to do X with this patient population, but here's this other thing you could do with the same patient population that would be more valuable, more interesting, that would give us more valuable data? Is it the job of the IRB, or of somebody else, to make these comparative judgments across studies — do this study rather than that study? That's something the IRBs I've sat on almost never do. Maybe they should be doing it, or somebody should be doing it. And the other question is about comparative judgments within a trial: well, you designed it with just two MRIs, but if you added this third MRI, look at all this other information you could get that would make your trial even more valuable. What should the IRB do in that case? They think the study that was proposed is okay, but here's a much better study.
Should they say: you have to do the better one — you can't do this one, because you could do this better one? Or does a study just have to meet some sufficiency standard? And then, as you go through this, there's the question of who should be making these judgments. Or we could talk about enhancements — Annette and I used to have this debate when we wrote that paper. Typically, most clinical research focuses on diseases and illnesses, like cancers and things like that. But what about trials of what people call enhancements — not treating an illness or a disease, but making people who are healthy even better off: making them happier, healthier, longer-lived? Is that equally as valuable as doing something like treating cancer? And again, should people be making those assessments, and if so, who should be making them?

These are just some suggestions. When I give that previous slide, I get a lot of pushback from IRB folks: you shouldn't be making these comparative judgments; that's not what IRBs are supposed to be doing. So what I started doing is trying to collect at least some evidence to suggest that, in some cases, IRBs should be doing that. And these cases, interestingly, come from pandemics or epidemics. The first one was in West Africa during the Ebola outbreak, where the problem was that there were only so many patients, and all of a sudden you've got all of this money and all of these resources going into the area to study Ebola. The point these authors made is that there's a limited number of patients for all the studies we want to do.
And so if we really want to get the most social value out of this situation, we're going to need to prioritize which of the studies we allow to access these patients and which ones we don't. And then a similar thing — these are arguments from more recently, during Covid — where, early in the pandemic, we wanted to learn as much as we could about Covid. Obviously, in the long run there were lots and lots of patients, but initially there weren't that many. So should somebody be prioritizing and deciding which studies to do and which studies not to do? This group argued that, at least within institutions — so, for instance, the NIH Clinical Center should be prioritizing, Johns Hopkins should be prioritizing — there should be some prioritization. Okay. So that's potential benefits, social value.

The next thing to look at is the risks. The idea is to identify the risks of each of the individual interventions and then try to reduce them to the extent that you can. Now, I'll come back to this in a second, but one thing everybody agrees on at this step is that you should look at all of the risks in a very, very broad way. Obviously that means health risks — is this drug potentially going to knock out my kidneys, so that I'm going to have to be on dialysis for the rest of my life? That's obviously a risk of being in a clinical trial. But, as people emphasize, there are other sorts of risks. There are, for instance, economic risks: what if, once my kidneys are knocked out, I have a $1,000 copay and then have to start paying thousands of dollars out of pocket to get that kidney problem fixed, or at least treated?
What about the psychological harms of being on dialysis and how that makes me feel? What about the stigma of being on dialysis? The argument here with respect to risks is that all of those should be considered — and I'm going to come back to that when we talk about potential benefits. It seems right, right? People care about more than just their bodies. They care about their bodies, but they also care about their bank accounts — some people care about their bank accounts more than they do about their bodies — so we should be worried about all the different types of risks.

Now, there are a couple of challenges with this. What this step says is that, in order to decide whether a study is acceptable before it can begin, the IRB has to make sure that the risks are acceptable. There's an obvious conundrum there, which is that finding out the risks is typically the point of clinical trials, right? At least early-phase clinical trials are designed to figure out what the risks of a given intervention are. So if the goal of the trial is to find out what the risks are, how can you prospectively evaluate whether those risks are acceptable? It seems like you've got to do the trial before you can decide whether the trial is ethical to do. So basically, what IRBs need to do is be creative; they need to look beyond the given intervention if there are no data on it. Sometimes there are: sometimes the intervention has been used before in other populations, and you can see how to extrapolate to this population. Sometimes this specific drug hasn't been given to this population, but drugs in the same class have been, and you can try to extrapolate from those data.
Or maybe it's just a similarity in the mechanism of action between this drug and other drugs. Basically, you're trying to look for relevantly similar cases for which there are data, and then seeing how you can apply them to this one — with the thought that the further you're extrapolating, the greater the uncertainty, and that you should factor that uncertainty into the risk assessment: the more uncertain you are, the riskier you should treat the intervention or the study as being.

So here's another challenge. The first one was that you've got to know something about the intervention. Here, the point is that you don't just have to know about the intervention — you have to know who the intervention is going to be given to. The risks of a certain drug can vary dramatically depending on who receives it, but you don't know who's going to receive it until the trial starts. The example I have here is drugs that are cleared by the kidneys. They might be very safe for people who have good kidney function and can excrete the drug properly, but really dangerous — the drug might build up in their systems — for people who have impaired kidney function. So another thing IRBs need to be aware of is the extent to which risks and potential benefits depend on who enrolls in the study. The way IRBs try to handle this is with inclusion and exclusion criteria: you try to figure out who's going to face excessive risks, and then you try to exclude them. The other thing I'll mention here is that sometimes you just don't know how a given drug is going to affect people, and so there's the importance of monitoring the trial as you go along — and this goes back to Holly's talk.
If you're monitoring and, by the third or fourth participant, you start seeing a trend of significant harms, then you need to stop the study, reevaluate, and see whether or not it's okay to keep going.

All right, this is going to get even more into the weeds. This is another one — I added it yesterday just to make sure I really drive you guys nuts. For the most part, when I've been talking about risks, I've been talking about risks to the individuals who are in the study, and that's how standard risk-benefit assessment in clinical research is done. More recently, people have started to worry about risks to what are generally called third parties. Here's an obvious example. Challenge trials are trials where you take an infectious agent, like a virus, inject it into some research participants, and see what happens — maybe you're going to test different treatments, or maybe you're going to study the natural history of the disorder. Obviously, injecting viruses into the participants poses risks to them; that's what we've been talking about. But the point here is that it can also pose risks to other people. If you're infecting the participants, and then the participants interact with the nurses, go out on weekend passes, go home — they could transmit that infection to lots of other people.

So what do you think? I won't force people to answer, but it seems like we should take that into account, right? Does everybody agree? Yeah — of course you should take that into account. So it seems like you should take risks to third parties into account. But now the question is: how broad do you think that principle should be?
These are actually some real examples. We did an ethics grand rounds on this about ten years ago, where there was a participant who was a perfect candidate for a study, and the study offered the participant a potential benefit. But it turned out that the participant was the caretaker for two people who were really sick, and the participant couldn't leave those people behind. So what the potential participant was going to do was drive from out west to Bethesda with those two people in the car, and the home physician was really worried that that drive could be dangerous to the other two people — not to the participant, but to the other two people.

Another example, which we just heard about in a recent ethics grand rounds, was a participant undergoing a bone marrow transplant that required them to stay in the Clinical Center here for about three months. It turned out they had little kids who were very attached to them, and people were worried that the participant being away from those little kids for three months would be traumatic for the kids, who couldn't come here and had to stay at home and be watched by the grandparents. As we heard the story, in that case they did come, they did participate in the study, and it was really traumatic for their two little kids — this was a couple of years ago, and apparently the kids are still experiencing the effects of it.

So in the second bullet, we said: yeah, of course you should take risks to third parties into account. But now what do we do with these cases? Do we say, well, we're going to have inclusion and exclusion criteria — if you have lots of little kids, you can't enroll in our trial; if you have to drive more than 50 miles to get here,
you can't be in our trial? Then we'd have to collect all this information about people — how many kids they have, where they live, whether they fly or drive. Not only does that seem intrusive, it starts to seem like it's just too much; IRBs can't be doing that, right? So I think this is a really interesting challenge: on the one hand, it seems like something we should be doing, but when you generalize the principle, it seems like something that's almost impossible to do. And I have absolutely no idea what the answer to this puzzle is. If anybody knows the answer, I'd be very interested in hearing it afterwards or during the question-and-answer. Okay.

Now a little bit of philosophy — philosophy of science, that is — that I think is practically relevant, so it's worth at least thinking about a little. We've been talking about the risks and benefits of an intervention, like giving somebody a drug, as if this were a straightforward thing to figure out. Right? You inject somebody with a drug — what are the risks? Well, it's the potential that bad things, harms, are going to be experienced as a result. That seems pretty easy to imagine. What's important to notice, though — in some cases it doesn't matter, it's implicit, but in other cases it's really important — is that whenever we do risk-benefit assessment, we're appealing to some baseline, some comparison, against which we're making those judgments. Here's an example to try to show you that. There's a phase 2 study of a treatment that has been shown to be safe — so we know it's safe, let's say — and that offers a small chance of helping participants medically. They have a serious illness, and there's a small chance this will help them.
Is that a potentially beneficial study? Think about the risks and benefits of that trial. Okay — it's supposed to be a trick question, right? Anybody want to take a shot and fall into the trick, or address the trick? Or should I just click to the next slide? It depends, right? That's the point about what the comparison is. It depends on what? Well, it depends on what's going to happen to these people if they don't enroll in my trial. If outside of my trial they wouldn't get anything — they'd just sit at home and watch TV — then arguably being in the trial is potentially beneficial for them, because a small chance of benefit is better than the no chance of benefit they get from watching TV. So against that comparison, the trial offers potential benefit. But in contrast, imagine there's some drug out there that cures this disease: you take it once and it cures you. If the choice is between taking that drug and being in my trial, then not only isn't the trial beneficial, it looks really risky, right? Being in my trial denies people the opportunity of being cured, and that seems like a very bad thing. So now it looks like a really risky trial. The point here is that it depends on what you're comparing it to.

Okay, so that's important, and I hope it's clear. Now, why is this practically relevant and not just philosophically relevant? Because what the comparison is — what is available to people otherwise — varies dramatically across populations. It varies dramatically within the U.S., and it varies dramatically across the world. And what that means, in the end, is that whether a study is potentially beneficial or risky is going to depend on where you do it. If you do it in certain populations, it might be potentially beneficial;
If you do it in other populations, it might be really risky. That's complicated, and it also drives people absolutely nuts. So to evaluate the risks of research, it's important to know: if these people weren't in our trial, what would happen to them instead, and how does what we're proposing to do to them in the trial compare to that? A trial may be risky in some places but potentially beneficial in others.

So where does this come up? Is there going to be a session on international research? Okay, so somebody's probably going to talk about this. It becomes a big deal especially when you talk about research in international settings. And it was a big deal in some HIV studies about 30 years ago; they turned into huge fights. This is still true, though not as much as it used to be, unfortunately: 15 or 20 years ago, if you lived in certain places, there were very, very good treatments for HIV. In other parts of the world, where there was lots of HIV, those treatments weren't available.

So now imagine you have an idea for a super cheap drug that's not as good as the fancy ones you can get in Bethesda, Maryland, but it's better than nothing. Notice that if you ran that trial in Bethesda, Maryland, it's going to be like the example I gave before, where there's a chance of a cure outside (not a cure, but a really good treatment). So if you do it in Bethesda, it looks like a risky and maybe an unethical study. But if you do it in sub-Saharan Africa, where they don't have access to those treatments, it looks potentially beneficial. That's what risk-benefit assessment suggests, right? But now what you're saying is that a trial that's unethical to do here because of the risks is okay to do in sub-Saharan Africa. And that made people really nervous to say.
And some people thought it made them so nervous that it must be wrong somehow, and we have to figure out how. There were huge fights about this. It sounds like maybe you guys will talk more about it. So the important point here is just that the comparison can have really practical implications for some of these trials. Okay, that's the end of the philosophy for a second.

So now we've identified the risks of the interventions, and we want to minimize them. I include these examples because this is personal trauma for me. I still get very freaked out by blood draws. I really had an incredibly hard time when I was a little kid, and I still do, actually. I've gotten tons of blood draws in my life, and then at one point some nurse says to me, look out the window. I looked out the window, and she did it. I said, okay, are you going to do it? She says, no, we're done. It turns out that for at least a lot of people like me, it's not the physical sensation that's bad; it's seeing this thing getting stuck into your body that's horrible. And it just completely went away. So the point of this is just that we need people who know what they're doing, and we need people to be creative when we're doing this stuff. This just shows one way of doing that.

I don't know if people know the cheek-jiggling one. My dentist taught me this one. All right, so I know these are my tangents, sorry. If I'm late, if I go over my time, Holly, yell at me. This one is really interesting; it has to do with nerve pathways. If somebody sticks you down here, why does it hurt?
There are all these nerve pathways that go up into your brain, and your brain does some complicated things and says ouch. Well, these pathways come together at various points, and they can't transmit more than one signal at a time. And what's really interesting, and evolutionarily seems weird, is that some of those nerve pathways prioritize dull, pounding sensations over sharp, sticking ones. So what my dentist used to do, when he was sticking that big needle into my cheek, was jiggle my cheek at the same time, and that dull sensation would get prioritized by the nerve pathways over the needle stick, and I wouldn't feel the needle going in. It was just fantastic. I think I was 40 before somebody did that to me. So I was like, people should be doing this more often. Okay.

The second point is that sometimes the way you minimize risk is to do less: fewer blood draws, fewer MRIs. That can reduce risk, but it also can potentially undermine the social value of the study. So that's a point where we're going to have to use judgment and figure out how to balance these competing considerations.

The fairness issue is one we've had huge debates about in our department over the years. Here's another way to minimize the aggregate risks of a study: exclude anybody who faces really high risks from it. If you do that, you're going to really minimize the aggregate risk, but at the same time you're excluding people from the study. And if that study offers a chance of potential benefit, then there's a way in which it's bad for the people you're excluding, and some people argue that that's unfair.

Okay, so here's an example: an experimental drug for a serious condition with very few effective options.
This is a consult we had a couple of years ago. The drug poses a risk of significant bleeding that requires transfusion. So the investigator comes to us and says: I think this is a really important trial, this is a really important drug to test, but I'm terrified that I'm going to kill some of my participants because of this risk of bleeding. What I want to do is exclude anybody who faces a really high risk of bleeding. So who is that? It's people with bleeding disorders. It's people who have low platelets for whatever reason. And, interestingly, it's also people who decline blood transfusions for religious reasons. And so the question was: if you want to minimize the risks of this study, what you should do is absolutely exclude all of those people, right? So is that the right thing to do, or is that unfair, particularly given a serious condition with no other treatments? These are the kind of really important debates and questions, I think, that arise when we start thinking about risk-benefit assessment and about balancing it against other values like fairness. Okay.

So, potential benefits. We've done social value, we've done risks; now we look at the potential benefits of the interventions. And again, as with risks, we look at the potential benefits above and beyond what people would get if they were outside of the trial.

Now I want to come back to the point I was harping on earlier. Remember when I talked about financial risks, psychological risks, stigma risks, and we said you should count all of them? Seems obvious: you should count all of them. So here's the question: does the analogous point apply to potential benefits?
Should you include not just the potential benefits of clinically getting better, but the potential benefits of other sorts of things that could happen to you? Before, we talked about economic risk: if there's a chance this drug is going to knock out my kidneys and I'm going to need treatment that I have to pay for out of pocket, that's an economic risk of being in the study. Well, now take a different study that offers everybody who participates $500. Is that a benefit of being in the study? Should it count as a benefit? If there are economic risks over here, are there economic benefits over there? And I'll just say that the near-consensus view in the literature is: yes, the economic risks count, and everybody's got to take those into consideration, but the economic benefits don't; you shouldn't consider them, or basically shouldn't consider them at all. So obviously there's this question of whether that makes sense. Should we treat them differently, and if so, why? That's just one of the other questions I'm going to leave us with for now.

So, to recap: the risks, identify them and minimize them; the potential benefits, identify them and try to make them larger to the extent you can. And this puts us back into the worry about fairness again. When I was on the infectious disease IRB, they were just starting to test protease inhibitors, which, people might know, turned out to be the initial blockbuster for treatment of HIV. They really revolutionized the treatment of HIV, where HIV went from a disease that basically killed everybody to a disease that people lived with. Now, we didn't know all that at the time, but we knew these drugs looked really good.
And the question was: who do we do the early-phase trials of protease inhibitors in? Do we do them in people who are relatively healthy, with the assumption that it will be less risky for them because they're healthier, they're in better shape? Or do we do them in people who are really, really sick, who probably face greater risks from an experimental drug, but who, if it helps them, if it actually works, are going to gain a lot more than the people who are relatively healthy? We actually debated that. And it's interesting: when I think about it now, I always say that IRBs tend to be on the cautious side, where they want us to reduce risk, but in this case we didn't. In this case we actually mandated that the studies be done in people who were sick. We required people to have low CD4 counts, I think under 50, to be in that trial. So we focused on the potential benefit rather than minimizing the risks.

Okay. So now we know what the risks are, we know what the benefits are, and we know what the social value is, and we're supposed to put it all together. This step is basically asking: do the potential benefits to the participants, plus the social value, put together, justify the risks and the burdens and the costs of doing the study? If the answer is yes, then at least from a risk-benefit perspective you have an acceptable study; you could basically stop doing the risk-benefit assessment at this point. If no, then you need a couple more steps, and I'll talk about those additional steps in a minute.

The first thing is that people agonize over this all the time, and this is a case in point.
There's this big debate that always goes on in research ethics about the ways in which you should treat clinical research like clinical care, and the ways in which you should treat them differently. I think it depends, obviously, on the context. This is a case where I think people get too confused when it comes to research ethics. When I say that the IRB should look at whether the potential benefits of the drug to the participants justify the risks, people ask: how do you do that? What does "outweigh" mean? The same for "justify." What do any of these things mean? And my response is: yeah, I know this is hard, but notice that it's what doctors and clinicians do every day, right? That's the job of clinicians: to ask themselves, for this patient, do the potential benefits of this drug outweigh the risks? Now, it's often harder to do in research because we don't have as much data, but I think the approach, the basic idea, is exactly the same. So I've termed this the informed clinician test. Basically, what the IRB should do is either pretend they're an informed clinician or call one up and actually ask them: here's the situation, here are the patients, here's the drug we want to give them, here's what we know about it. Do you think the potential benefits for these patients justify the risks? If they say yes, it's a prospect of benefit study. If they say no, it's what I call a net risk study.

A little bit more philosophy here. I've been saying you look at the individual interventions, right, and then you look at the whole package.
What are you supposed to do, and I sort of highlighted this, when you're looking at the individual interventions? Here's the question: do the potential benefits of each intervention have to justify that intervention's risks? Imagine a study like this: I'm going to give people a potential treatment for kidney disease or, say, liver disease. It really could help them, it looks really promising, lots of potential benefits. But in order to do the study scientifically, I have to do a biopsy, like a liver biopsy or a kidney biopsy, which poses significant risks, and I'm only doing it for research purposes. If you look at the liver biopsy or the kidney biopsy on its own, it looks like a very risky, not potentially beneficial, intervention. At least in children you wouldn't be allowed to do it, and people would probably be really worried about doing it in adults. So is that the way you're supposed to do it, where each individual intervention's potential benefits have to justify its risks? Or do you put the whole thing together and say: yeah, I know that, but look, the potential benefits of getting the drug justify the risks not only of the drug but also of the liver or kidney biopsy, so that being in the study overall offers a prospect of benefit for the people who are going to be in it? So that's the question: whether you have to look at each intervention individually, or whether the potential benefits of one intervention in a study can justify the risks of another. Hopefully that's clear; if it's not, we can definitely come back to it. I think that's a really interesting challenge.

Here's a practical example of this; we wrote a paper on it. Imagine a placebo-controlled trial in kids.
Some people get a potentially beneficial intervention and some people get a placebo. What's the risk-benefit assessment, or the profile, of that overall study? What some people will say is: well, at the outset, before randomization, there's a chance you'll get the drug, and if you get the drug and it's beneficial, then that justifies even being randomized to the placebo arm, so the overall study offers a prospect of benefit. What we argued is that that's actually a mistake, and that you need to look individually at the risk-benefit profile of the two arms of the trial. That means the risks of the placebo arm have to be acceptable on their own, and you can't justify them by the fact that there was a chance the person could have gotten into, but didn't, the active treatment arm. So that's just a specific example of that issue.

Okay. So we were just talking about whether or not the potential benefits justify the risks. If they do, you have a prospect of benefit study. If they don't, you have what I just call a net risk trial. So the last challenge here is thinking about net risks. When are net risks acceptable? How much net risk is acceptable? When it comes to individuals who can't consent, like kids, there are some people who think it's just unethical; this debate has been going on for 60, 100 years. Some people say it's just unacceptable, it's unethical, to pose any risks to kids for the benefit of other people; it's always exploitation if they can't give informed consent. We can talk about that if people are interested. Lots of people have that worry. But in fact, if you look at almost all the guidelines I know of around the world, they allow some risks.
In kids particularly, and this is the buzzword you'll hear more about from CMO, the question is whether or not the risks are minimal or greater than minimal. I'll just go through this quickly; I think you guys are going to hear a lot more about it. So what is minimal risk? The standard based on the U.S. regulations, which a lot of people have adopted, is what IRBs are supposed to apply: they're supposed to compare the risks of the interventions in the study to the risks that are ordinarily encountered in daily life. If the risks of the intervention are equal to or lower than the risks ordinarily encountered in daily life, it counts as a minimal risk intervention or study, and if they're greater, it counts as a more than minimal risk intervention or study. That's the standard. In the study I talked about before that we did with CMO, when we were talking to the IRB chairs, that's the question we were asking them. And notice that if I just ask you whether the risks of allergy skin testing in 11-year-olds are less than or greater than the risks of daily life, I hope all of you would say: I have no idea. You've got to tell me what the risks of allergy skin testing are, and you've got to tell me what the risks of daily life are, before I can compare them. And that's the thing I was pointing out: none of those IRB chairs did that, even the experienced ones.

Okay, so this is a study we finished a couple of years ago with Will Shukman, who is now in the sociology program at UCLA and was a post-bac here. We wanted to get data. There's been this debate for a long time, but there haven't been any data on what the public thinks about exposing kids to risks for the benefit of others.
So we did a survey of this with a nationally representative sample. And what we found is really strong support among the public, including parents, for exposing kids to research risks for the benefit of others, including things like a bone marrow biopsy, which poses more risk, I think, than most IRBs would be willing to countenance.

Okay, that's kids; you're going to get more of that. That leaves the question of competent adults. Should there be limits on the extent to which researchers can expose competent adults to net risks, to risks for the benefit of others? How do we decide that? And does it depend on things like whether the participants are healthy or sick? People debate that a lot.

Here are some common answers, and you'll see they run the whole range. If you're sort of libertarian-minded, you might say: look, you said competent adults, and competent adults get to decide what they do with their lives, including facing risks for the benefit of others, so if they're competent, it's just up to them. Other people have really strict standards; I think IRBs sometimes apply these. They think the net risks to which we expose competent adults shouldn't be much greater, if at all, than the risks to which we're willing to expose children. Some people are really strict about this. And then other people have argued differently. Frank Miller used to be in this department, and he argues that research participation is kind of an altruistic endeavor, so let's look at what risks we think are acceptable in other altruistic endeavors, like organ donation. Notice that looks like it would allow pretty high risks.
It's interesting: at the end of that paper, although those guys suggest that standard, they're kind of like, maybe that's too high, we're not sure. Alex London, who's at Carnegie Mellon, suggests a similar thing, but uses a public service analogy, like firefighters: what risks do we think are acceptable there? Those would set the limits for research risks with competent adults. Okay.

So, a very brief summary. Doing a systematic risk-benefit assessment is right; the Belmont Report was right. It's important to do this right, and I don't think people are doing it right. I think that if we did it more systematically, it would make the research better and it would make the research more ethical. And then the second part, which I hope didn't get too confusing and detract from understanding the framework, is that I think there are really interesting, challenging questions here for us, for anybody who's interested in research ethics, to figure out.