That's a whirlwind through a little bit of history and a little bit of the rules and regulations that exist. What I want to spend the rest of the time talking with you about are some conceptual ideas about how you think about research in a way that helps you answer the question, "Is this research ethical?" And I want to preface it by saying that in our department, some fifteen or so years ago now, we realized that a lot of the guidance found in the codes and the regulations was developed in response to specific problems. They were developed historically, at a time when they were responding to particular events, and so they understandably focused on the details of the event that prompted their implementation. For the same reason, perhaps, some issues are probably incompletely addressed in these guidances, and some of the guidances and regulations are at least interpreted as having divergent recommendations, as not agreeing on specific things.

So we thought there was a need for investigators, for people who are interested in understanding the ethics of research, and for people who review research, IRB members and others, to have a systematic, coherent, and universally applicable framework that they could use to make judgments about whether or not a particular piece of research they were writing or reviewing was ethical.

And so we proposed a seven-principle framework. It's published in a couple of different places: the first paper was in JAMA in 2000, it's also in a textbook, and there are additions to it for the international context. So there are a number of references I could provide, if you want, that explicate this in some detail, but I'm going to go through it today.

There are seven principles: a valuable scientific question, valid scientific methodology, fair subject selection, favorable risk-benefit evaluation, independent review, informed consent, and respect for enrolled subjects. One thing I want to say before I go into each of them is that we didn't make any of these up.
If you look at the various guidelines and codes, you will find vestiges of each of these. Maybe not these exact words, but the concepts and the ideas are there. We've just put them together in a way that seemed more systematic to us. And what I mean by that is that it seems right that you have to start at the top. You have to answer the question, "Is this a valuable scientific question?" before you go down the list. So informed consent is on there, and it's a very important part of doing research, but you don't even get near it until you've gone through lots of other considerations about a project.

So, what does it mean to have a valuable scientific question? Basically, it means the research has to be worth doing; that's the shorthand for it. Ethical clinical research should answer a valuable question, one that will generate new knowledge or understanding about human health or illness that's socially, clinically, or scientifically useful.

A good example of this is an article written by a very prolific writer in research ethics, Benjamin Freedman, in which he was writing about social value in a certain way. He said he had a nightmare in which he was walking down the hall and ran into one of his colleagues, who came to him excitedly and said, "I just successfully transplanted an appendix." In the article, Freedman says he kind of scratched his head and asked, "Well, was that ethical?" And his colleague responded, "Well, I got the guy's informed consent." He uses that as an example of how informed consent doesn't solve the problem. If there wasn't a good reason, a good question to answer by transplanting an appendix, something you might learn, something you might use to do some important work, then it wasn't ethical to do it.

And it's an interesting concept, because it seems like the most important first question to ask: "Is this research worth doing?" Now, importantly, it doesn't mean that the answer to the question has to be positive. You can ask a question, get a negative answer, and that's still socially valuable.
It still gives you information that can be very useful, and in many cases it is. But if you can't say why the question is worth asking, then you probably have to ask yourself again, "Is this research ethical?" There are many reasons that social value is important. It promotes benefit to society, the one side of the balance that I showed you before. It also minimizes exploitation: you're asking people to do things for a good reason, not just wasting their time for nothing, and that justifies asking them to accept some risk and burden. And it's also a responsible way to use resources.

The second principle in the framework is valid scientific methodology. This basically says that ethical research should be designed in a methodologically rigorous manner, with the design, the methods, the statistical power, and so on that will yield valid, reliable, generalizable, and interpretable data. The shorthand here is: you have a question that's worth asking; the second ethical requirement is to design the study in a way that gets you an answer you can interpret, so that you know what you found and your data are reasonably understandable. Again, the answer doesn't have to be positive; it just has to be understandable. And the study has to be designed in a way that's feasible as well.

Now, this requires lots of detailed choices: choices of endpoints, choices of designs, choices of procedures, statistical methods, et cetera. You might look at this list and say, "Well, those are all scientific considerations." And in fact, they are scientific considerations. But every scientific choice that you make has ethical implications. So designing your study, picking the right outcomes, picking the right design to give you the best and most rigorous answer to your question, choosing the procedures that will be used to measure outcomes or data points, and not only planning your statistical methods but managing your data well are all ethical requirements of good clinical research.

There are lots of controversies about this.
Not so much about the idea that studies need to be designed in a way that gives you a rigorous answer, but about how you balance the best, most rigorous design with some of the other things you need to care about, like the fact that you don't want to exploit people or expose them to excessive risk. Some of you may have followed the debates that occurred over clinical trials related to Ebola in the last year. There were a lot of discussions about what kind of design both gives us answers to questions that so urgently need to be answered and respects the rights and the welfare of the participants in those trials.

There's also a lot of really important ethical weight that goes into feasibility. Imagine, for example, designing a trial where you know you're not going to be able to get the participants you need to answer the question. That's ethically a non-starter, because you can't ask people to accept risk and burden to answer a question that you know you're not going to be able to answer, whether because it's not feasible to get all the people you need, or the equipment you need, or the right freezer capabilities, or whatever. There can be lots of details involved in feasibility.

The third principle is fair subject selection. So, you have a question that's worth answering and a design and methodology that's rigorous and will give you an answer you can make sense of. Now you have to ask, "Who do I include? Who do I go after to be part of this study?" And I think, critically, especially looking historically, the first guide should be the scientific objectives: the scientific objectives of the study should guide the inclusion criteria, the recruitment strategies, and the selection of subjects, not the fact that people are easily available, or vulnerable, or marginalized, or that they're privileged, either. There needs to be a fairness grounded in the scientific objectives; scientific objectives are where you start. And then, when you're thinking about fair subject selection, there's also a consideration of fairly distributing the harms and benefits.
So, let me give you an example. Imagine you're doing a study to test a treatment for breast cancer. Scientifically, you're going to need people in that study who have breast cancer, and maybe it's a very specific type of breast cancer. When you're thinking about fairly distributing the harms and the benefits, you may think about which subgroups within the population of people who have breast cancer are more susceptible to the kinds of risks this intervention might have. Maybe kidney disease or liver disease or some other conditions might need to be used as exclusion criteria in order to reduce the risks and protect some of those subgroups.

One of my colleagues says that the best way to think about fair subject selection is to flip it on its head. In other words, don't start with who you scientifically should include in the study; start with the idea that everybody should be included unless there's a reason to exclude them, a justification for excluding them. Then the question becomes, "What are acceptable justifications for excluding people?" Well, one might be that they can't help answer the question; they don't meet the scientific objectives. Another might be that they are at higher risk than everybody else who could meet those scientific objectives. And a third, very important one is that they may be more vulnerable than other populations, and therefore we should think at the front end about whether and how to include them. I understand there will be a whole lecture on vulnerability, so I won't say more about that, but if you have questions, we can talk about them.

I think it's important to go back to something I've already alluded to, and that is, in thinking about fair distribution of the risks and benefits, it's important to keep in mind the pendulum swing that's been occurring between thinking of research as always a burden, with subjects always in need of protection, and the opposite swing of the pendulum, where research can be, and in many cases is, a benefit, and subjects need and want access to participation.
And so, in thinking about the risks and the benefits, you need to keep in mind those different poles and where any particular project might fall on that spectrum.

So, you've got a question worth asking, a design that's rigorous, and a population selected appropriately given the scientific objectives, the risks and benefits, and vulnerability. Now the question is, "What does the risk-benefit balance look like for the participants in the study?" This is a different kind of risk-benefit analysis from the first one, which asked, "Is the study worth doing?" Even in a study that's worth doing, there are still ways to make sure that the risks to the individuals in the study are minimized and necessary. You may see a proposal, for example, that asks to do ten MRIs over the course of six months. An interesting question to ask is, "Are ten necessary?" Could the question be answered by doing six or five, or by spreading them out over a longer period of time? Are there other ways to minimize the burden of those procedures to the participants? Are the risks to the individuals in the study justified either by the potential benefit to them or by the importance of the knowledge, the social value, or both? And are benefits enhanced? Are there ways to enhance benefits in the process of doing the research?

You might say, "Well, these are pretty normal and understandable questions, no big deal." But the process of doing this is not so straightforward. Just think about risks, for example. There's a very interesting challenge in defining what a risk is in research. What counts as a research risk? How do we know how bad a risk is, or how to think about it on some kind of scale? And there are certainly lots of reasons to think about both the probability and the magnitude of risk, and about different types of risk. One of the things the regulations talk about is physical, psychological, social, and financial risks. It's interesting to think about how often in research we consider financial risks, as an example.
Some people raise questions that fall into the following sort of category. If somebody has to travel through a very tough neighborhood, or park really far away from the clinic and walk to the clinic, and there are some hazards in that walk, are those risks of the research, or are they separate from the risks of the research? How do we take those things into account when we're making an assessment of what the risks are?

And I think, importantly, in all cases of research there is some uncertainty about risks. If there wasn't some uncertainty about risks, we probably wouldn't be doing the research, because part of doing research is helping to identify what the risks are, the magnitude of the risks, the comparative risks, or some other nuance about the risks. So, defining risks and measuring risks is a very important part of figuring this out, but it's not easy in many cases.

Minimizing risks is also something we need to pay attention to, and some things are very straightforward. I mentioned already reducing the number of procedures if you can do so and still get an answer. Other things people talk about are relying on procedures that are clinically required; in other words, instead of asking for additional blood for research purposes, have it drawn at the time when clinical bloods are being drawn. Making sure that the people who are doing procedures have the right kind of training is another way to minimize risk. So, there are lots of really important, sometimes very obvious, ways to minimize risk.

And limiting risk really gets down to how we protect certain groups of people. One of the ways, for example, that the regulations on research with children work is by limiting the amount of risk that children can be exposed to in research. Minimal-risk procedures in research with children are generally fine. More risk can be accepted when there is a prospect of benefit for the children in the study. And the rules for anything more than minimal risk without a prospect of benefit to the children are much more stringent.
And in some cases, the research cannot be done at all.

There are also similar challenges in defining benefits. What is a prospect of benefit in research? If you have a certain kind of serious illness, and there are few interventions available to treat that illness, and you are entering a study in which an intervention is being offered, is that by definition a prospect of benefit? Is it only a prospect of benefit if we already know something about the drug, if it's in Phase III versus Phase I, for example? How do we decide what counts as a prospect of benefit in particular cases?

There's also a lot of discussion about distinguishing what are called direct benefits from secondary, or sometimes ancillary, benefits. The direct benefits are the possible therapeutic benefits that one might derive from an intervention. The secondary benefits might be other medical care that you receive as part of the study, payment that you receive as part of the study, or the feeling that you're contributing to society, which might be psychologically beneficial. Those are generally considered secondary benefits.

Importantly, and going back to my original balancing scale, most guidance recognizes that interests other than those of the subjects may on some occasions be sufficient by themselves to justify risks. What this basically means is that research in which we know in advance that the individuals who are participating will not benefit, where there is no prospect of benefit, can sometimes be ethically acceptable as long as the rights of the individuals who are participating have been adequately protected.

So: a question worth asking, a design that's rigorous, the right population chosen, and efforts made to minimize risks and enhance benefits. The next thing on the list is independent review. This is a process that allows an investigator or a research team, who has investments in the research, to have somebody else ask whether it is acceptable: to ensure that the ethical requirements have been fulfilled, to check any biases the investigator might bring, and to assure the public that research is not exploiting individuals or groups.
And so, most research in the United States is independently reviewed by Institutional Review Boards, or IRBs. There is research that's exempt from IRB review, and research that's expedited through IRB review, so there are other methods of reviewing. Most countries around the world have an equivalent kind of research ethics review committee that looks at research with humans before it starts.

The criteria found in both the Common Rule and the FDA regulations that an IRB is supposed to use to determine whether a particular study is approvable include these, and you'll see flavors of things I've already talked about: that risks are minimized (there's more to that statement than I put on this slide); that risks are justified by any anticipated benefits to the subjects or the importance of the knowledge; that subjects will be selected and treated fairly; and that informed consent is adequate.

There are lots of interesting, and right now very much in the news, challenges with IRB review. One is that many IRBs have a huge volume of protocols to review. How many of you have ever been to an IRB meeting? Anybody? Only one. Okay, two. Okay. So, oftentimes an IRB meeting can last four hours, but you may have ten protocols to review. Some of those protocols are 300 pages long and dense in terms of their scientific background, et cetera. So the challenge is not trivial: understanding what a protocol is about, having a process for asking questions, often of the investigator, and then applying these principles and deciding, "Is this approvable in its current form, or does it need to be changed?"

Consequently, there's plenty of evidence to show that IRBs tend to be inconsistent in how they make their judgments. And this has created problems in particular for multicenter or multi-site studies. If you have a study that has 100 different sites, oftentimes 100 different IRBs have to review it.
And they may reach grossly different decisions, or sometimes just mildly different decisions, but ones that impact the entire set of sites conducting the study. So this is a problem that a lot of people have talked about, and there's a lot of attention right now to using single IRB review for multi-site studies.

This year, in September of 2015, the Department of Health and Human Services published what's called a Notice of Proposed Rulemaking; that's what NPRM stands for. Maybe some of you have heard about it. Basically, it's a proposal for changing the Common Rule, and it includes some very interesting changes, some minor and some huge. One of them is this: for multicenter studies that are domestic, not international, a single IRB review will be required. Different institutions' IRBs won't each review the study; only one will be able to do it. And there's a lot of consternation about how this is going to look, whether it will work, whether it will be adequately protective, how you'll make arrangements between institutions, and lots of other details.

The sixth of the seven principles on our list is informed consent. As someone in the audience said, when a lot of people think abstractly about what it is that makes clinical research ethical, the answer often is informed consent. And certainly, in most cases, informed consent is an important part of doing ethical research. But what's interesting is that, if you think about it, until you have a proposal that's worth doing and well designed, and you've minimized the risks and had somebody else review it, there is no reason to ask anybody for their informed consent. You don't even get there until you've gone through those other steps. But once you're there, informed consent has the goal of making sure that the people you're offering participation to have an opportunity to decide whether they want to participate, or to continue participating, and whether participating is compatible with their goals, their interests, and their values.
So, it's a really important concept. Most of the documents, the codes, guidelines, and regulations, have something related to informed consent in them, and some of them are very strong. The Nuremberg Code is famously quoted as saying, "The voluntary consent of the human subject is absolutely essential." CIOMS says, "For all biomedical research involving human subjects, the investigator must obtain the informed consent of the prospective subject or an authorized representative." And the regulations have language very similar to this as well.

There are some exceptions, some ways to waive informed consent. Emergencies are one example. Think of the study I showed you at the beginning, where you're encountering and doing an intervention on people in the emergency room who have traumatic brain injury. There is a regulatorily described process that one needs to go through in order to do that, but that kind of research can sometimes be done without getting the consent of individuals who, for the most part, are not in a position to be able to give informed consent. I will skip that one.

So, what does informed consent involve? It involves a number of steps: disclosure of information to people, usually written and oral; some sense that they understand the information well enough to be able to make a decision; a choice that they can make which is hopefully voluntary, in other words not coerced or unduly influenced by others or by other factors; and then their authorization.

I don't know if you can see my picture, but it's a cartoon, which is not really that funny. There are people standing around a table looking at the guy who's being asked to sign a consent form. You can see them with their trench coats and their hats, putting a lot of pressure on this poor guy, who's sweating. Unfortunately, sometimes informed consent in clinical research does look like that, or feels like that to people. Other times, it's done in a very open, transparent, and iterative way, so that people really can engage with the information, ask questions, understand it, and make a decision.
There are lots of challenges to this. You can tell I spend a lot of time thinking about it; as I said, some of my own research is on informed consent. But just as an example, how many of you have seen consent forms that are used in the intramural program? Many of you have? Okay. So, oftentimes, not always, but oftentimes, a consent form in the intramural program is 15 or 18 pages long. Those are long documents for people to read. Choices about what to put in those consent documents and how to say things are ethically very important and very difficult: you want to give people the information they need without making the document so long and so cumbersome that they can't read it or can't absorb it.

The IRB also has an important role in reviewing consent: both the plan for informing participants about the various aspects of the study and how the investigators or the research team are going to know whether somebody understands and whether their choice is voluntary.

The seventh principle of our seven-principle framework is what we called respect for enrolled subjects. This is an element, or a principle, that I think really needs some more work. Let me say where it came from first and then why I think it needs more work. In reviewing the documents, the codes of ethics, the literature on research ethics, and the regulations, they all talk about what you do upfront. You get review, you make sure the risks and benefits are justified and the risks are minimized, you get the person's informed consent, and then the study begins. And much of the guidance, not all of it, but much of it, doesn't say much else about what happens after that. And that doesn't make any sense, right? Your ethical obligations to the people you're enrolling in research can't end the minute you get their informed consent. You still have obligations to people.
And so, these obligations include at least things like protecting their confidentiality, monitoring their welfare, recognizing that they have a right to withdraw, providing them with new information as it becomes available that might affect their decision to participate, informing them of new findings, and planning for what happens at the end of the trial. There are other things that are not on this list, such as compensation for research-related injury. There are a lot of things that people talk about under this rubric of what happens after the study begins. So, hopefully you can see both why this is an important element and why there needs to be more work on it, because some of these things are done without a lot of guidance.

There is some guidance. For example, the Nuremberg Code says, "During the course of the experiment, the subject should be at liberty to bring the experiment to an end," so you can withdraw at any time. And Helsinki talks about privacy and confidentiality, and about minimizing the impact of the study on a person's physical and mental integrity.

So, those are the seven. Hopefully, they made some sense. I think, importantly, and I've said this already, we think of them as systematic and sequential, so you start at the top and work your way down, and they're all necessary. So when you're thinking about a study, whether you're writing it, reviewing it, or thinking about entering it as a participant, it's worth thinking them through one at a time: Does this study have value? Is it well designed? Are the right people being asked to participate? Are the inclusion criteria right? Are the risks minimized? Has it been reviewed by an IRB? Is there a good process in place for informed consent? And are all the things that happen to people once the study starts, the monitoring, the confidentiality, the plans for what happens at the end, adequate? You can ask those questions of every single study that you consider in any way. It also seems important to recognize that these principles can be universal.
In other words, they apply to all studies, everywhere, across time. What's important, however, is thinking about how they're actually manifested. Risks, for example, might vary over time, from one time point to another, or from one geographic location to another. The method by which you disclose information and the process of informed consent might vary by population, and appropriately so. That doesn't mean the principle of informed consent, giving people information they can use to make a decision, changes; it's how you implement it that changes from study to study.

And, of course, these principles are going to require some balancing and specifying. The idea of having an adequate justification for the risks requires a lot of specification: determining which risks count in this case, how we balance them with the benefits, which benefits count in this case, how we weigh them, and how we balance them against the rigor of the study.

And there are a lot of interesting examples of these challenges, for example in randomized controlled trials. People worry about things like randomizing people, especially if one of the arms to which people might be randomized is a placebo arm. People worry about that from an ethical perspective. And yet, if the question is worth asking and the design is rigorous, it might be that placebo is the best way to answer the question, and not only the best way to answer the question, but the fairest thing for the individuals. Because if you know upfront that the drug is much better than placebo, then maybe you shouldn't be doing the study. So, it's that kind of balancing that needs to be done.
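To make the sequential, all-necessary structure of the framework a little more concrete, here is a minimal sketch, not part of the lecture's own materials, of how the seven principles could be represented as an ordered checklist evaluated from the top down. The principle names come from the talk; the function name, data structure, and example judgments are illustrative assumptions only.

```python
# A minimal sketch of the seven-principle framework as an ordered, sequential
# checklist. The principle names come from the lecture; the function and data
# structure here are illustrative assumptions, not an established tool.

SEVEN_PRINCIPLES = [
    "valuable scientific question",
    "valid scientific methodology",
    "fair subject selection",
    "favorable risk-benefit evaluation",
    "independent review",
    "informed consent",
    "respect for enrolled subjects",
]

def evaluate_protocol(assessments: dict) -> tuple:
    """Walk the principles in order, top to bottom.

    `assessments` maps each principle to the reviewer's judgment (True if
    satisfied). Every principle is necessary, and the order matters: a failure
    early in the list (for example, no valuable question) means the later
    questions, such as informed consent, are never even reached.
    """
    unmet = []
    for principle in SEVEN_PRINCIPLES:
        if not assessments.get(principle, False):
            unmet.append(principle)
            break  # stop at the first unmet principle; the rest are moot
    return (len(unmet) == 0, unmet)

# Example: a well-designed, valuable study that has not yet had independent review.
judgments = {p: True for p in SEVEN_PRINCIPLES}
judgments["independent review"] = False
ethical, unmet = evaluate_protocol(judgments)
print(ethical)  # False
print(unmet)    # ['independent review']
```

This is only a mnemonic for the top-down, every-principle-is-necessary structure described above; the real work lies in the substantive, context-dependent judgments behind each yes-or-no answer.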