I wanted to raise the fact that you have "health inequities" at the top, which is the term used in non-U.S. contexts, and "disparities," which is used in the U.S. context. Does the difference in definitions reflect anything about how the U.S. sees health disparities, relative to the rest of the world?

Yeah. This is consistent with how the terms are used in the United States. It's interesting: over the past decade or so, many people have started to use "inequities" even in the United States, so the ethical part is more explicit. I think early on, scientists, health researchers, thought that once you start to say "inequity" it no longer sounds like science, because you're talking about morality and ethics, and that's not science. As a scientist, I want to talk about numbers, so maybe "disparity" sounds better; it's a kind of bypass, because it's a bit ambiguous. That's how it started. "Disparities" actually sat in between "inequality" and "inequity" for a long time. Now I think people are becoming more comfortable saying that when we say "health disparities" we mean differences tied to social disadvantage, so ethics is front and center, in a way. It's an interesting history.

It is, yeah. Thank you. Here is a question from Jake: could you give an example of a health inequity that is not a health disparity? He's asking, you had a green circle and an olive circle; what's in that green circle versus what's in that olive circle?

Yeah, that's an interesting, good question. There are different philosophical perspectives on how to define health inequity, so I'll give you just one or two.
One is the equal-opportunity view, from philosophers in that realm (la-la land, as it were). On that view, what is wrongful is differences driven by factors beyond individual control; those are inequities. Something within your control that just happens is different. A typical example I give: you didn't need to, but you insisted, and you did the bungee jumping without safety precautions. You really didn't need to, it was your choice, and you happened to break your leg. Then, I'm sorry that happened, but maybe society is not responsible for what you did, so that health outcome is, on this view, okay. Something like that. So then you can think it through: on that view, the distinction is not so much about which group it is. You can imagine that even among low-income people there are those who did the bungee jumping and those who didn't, so there are differences within the group; it doesn't fit neatly into groups. So that's one example, and the theories go broader than that. Then often people think about it in terms of goods, and that's another part; I hope we'll get to find out more about that.

So, Jake, let us know if you have a follow-up. NRMA, hi.

Hi, thank you for your talk. I have a question, sort of broadly, about intersectionality, because it seems like in the past 10 or 20 years we've started to talk more about the ways that people can be part of more than one group.
So when researchers are doing a study about the effects of income on some other outcome, it seems like, in addition to grouping by low income, middle income, and high income, they should have some sort of understanding of the fact that gender influences income, or maybe disability influences income. So how are researchers teasing apart those sorts of issues, and how are they mapping, perhaps retrospectively, those sorts of complex interactions between groups?

Yeah, that's a very good question, and there are different ways to answer it. It depends on which researchers we're talking about; the ones I know are interested in empirical work, like trials. As I said, the choice of groups is often driven by what intrigues the researcher: I'm very interested in gender, I'm interested in race, so let's study that. Nothing wrong with that, but then what about this factor, and this one, and this one? There are almost endless things you could consider, and it's not as if you can sit down, come up with a theory, and pick out the most important one; it doesn't work that way. That's one issue. Another is how, and whether, a factor is measured. For example, even measuring sex is getting very complex now: how do we measure it, do we have the data, do we know how to measure it? That becomes a consideration. Also, people typically have their own "-ism": I'm interested in sexism, I'm interested in racism, I'm interested in capitalism. So they go all in on one. When everything is important, it's very difficult to say which one to choose, and that kind of weighing is not the way empirical work usually goes. I would like to see that conversation happen more, but for now there is a gap, which is why we have to keep asking that question.
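
As a concrete illustration of the interaction point just discussed (this is an editorial sketch, not something shown in the talk), here is a minimal example of what modelling an income-by-gender interaction could look like. The data, column names, and library choice are assumptions made for the example; the point is only that an interaction term lets the income gradient differ by gender instead of treating the two effects as simply additive.

```python
# Hypothetical sketch (not from the talk): an income-by-gender interaction
# in an ordinary least squares model, using invented toy data.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "outcome":      [72, 68, 75, 61, 66, 70, 58, 64, 73, 60, 67, 71],
    "income_group": ["low", "low", "high", "low", "mid", "high",
                     "low", "mid", "high", "low", "mid", "high"],
    "gender":       ["f", "m", "f", "f", "m", "m",
                     "f", "f", "m", "m", "f", "m"],
})

# C(income_group) * C(gender) expands to both main effects plus their
# interaction, so the fitted income gradient is allowed to differ by gender.
model = smf.ols("outcome ~ C(income_group) * C(gender)", data=df).fit()
print(model.summary())
```

The syntax is the easy part; the difficulties raised in the answer above, which axes to include and whether they can be measured at all, are what actually constrain this kind of analysis.
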
Thank you.

Go ahead.

Hi, thanks again for your talk. My question is about, so you mentioned the reasons why you conduct a study, right, whether it's about patterns of practice, or biological differences, or social differences. My question is how you really study that when there is conflation between biology and social factors. If we think about the history of science, in many ways the problem has been conflating things, between biology and social history, for example. To go further, in a lot of ways race and gender are one primary consequence of conflating these two reasons for study. For example, we have a history of attributing intelligence to race or to gender, and the consequences can be as simple as stereotyping or as severe as the eugenics movement, which was based on the belief that there are biological differences between these groups. So my question is, how do we study these differences, knowing whether it's a biological difference or a social, cultural, and historical difference, when we're comparing two groups? And how can we ethically communicate these results in a way that's uplifting rather than degrading, making sure we are conscious of these differences?

That's a really good question. And it may sound kind of silly, but we just have to be very critical and very careful.
The way we categorize is not stable, and you can always say, "I believe intelligence is biological, so I'm going to study the biology." Maybe you genuinely think that way, but that could be wrong. These are just different ways you might see it, and it doesn't give you a license to go ahead regardless; you have to keep questioning it, every day. It's a good question, and it makes sense; I really do agree with you. At the same time, right now we try to include diverse populations and do a somewhat more careful examination. Before, we didn't; most medical studies came from very homogeneous populations, and that is a problem too. And at the same time, if you start from an assumption and set out only to prove that assumption, that is wrong as well. So I know I'm not quite answering, but I totally agree with you, and we just have to be very careful.

And so, two quick things, right? I think transparency and, as you said, being critical. If you're looking at the literature, it's important for you to be able to say: okay, let me think about what they were interested in when they were designing this study, and let me critically look at whether or not they're basing their hypothesis on very old, incorrect assumptions. So if you're doing a meta-analysis or something, you would be able to say, to be hyperbolic, okay, that's garbage; I'm not going to include that in my meta-analysis, because they weren't asking an appropriate question. Maybe something like that.

Yeah. So I think maybe it's helpful to go back; it's in the eight principles, and the first one is a valuable scientific question. And the reason I said that the equity consideration is not a checkbox is that it's woven in.
That's actually a good example of how it's not the case that if you do this, you're fine; it's not like that. You have to think: I could do this, but where does this question come from? Is it a really valuable, meaningful question? That always needs to be examined.

Yeah. Thank you. So this is another question from Jake: can equity considerations ever justify not carrying out a study, due to limited resources or other feasibility constraints? A study that lacks the necessary resources to sufficiently mitigate the risk of harm, or to obtain adequate informed consent from participants, should not be conducted. But should we ever stop a study from moving forward due to equity concerns?

Right. Let me just say that another way. Say you have a great research question, and you have thought very carefully about the most appropriate, valid scientific methodology and about which groups you're going to sample from. But the sample size would be so large that you can't get funding to do a study of that size, so you might have to compromise.

And you say, okay, yeah. I'm trying to go back to my slide. Yes, that's happening. And again, I'm sorry, there are no black-and-white answers. What I was getting at before is that a study like this can be treated as simply impossible: too expensive, not feasible. And also, as I indicated, it's typically not just a matter of sample size. These are often the very populations that, for historical and understandable reasons, don't want to participate in a study, because they're suspicious; and who are we to blame them, given how wrong we did by them. So just getting the necessary sample costs more.
You need more effort, you need more people to go in and talk with people, all of that. So if we say we won't do it because it's so expensive, I think that's not right; we have to push on it. It's very nice that the NIH has said that cost is not the concern: you can write the grant and justify that we need these participants because of this concern and that we need this much money, and they give the money, as they should. So that's good. But at the same time, there are many methodological things you could do. Here I talked about meta-analysis and related methods; those are things each researcher can try that are still not sufficient on their own, but maybe collectively we could still make some difference. So it's not either/or, I think; we have to do everything we can.

Yeah. And, right, cost can't be the deciding issue, but there are finite resources as well. So let's say we design a study that is going to cost $5 million, and there are five other studies that could each be done for $1 million, that kind of thing. So I guess that just reinforces that there's no easy answer.
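
To make the cost point above concrete (this is an editorial sketch, not a calculation from the talk), here is a small example of how the required sample size, and therefore the budget, grows once a study is powered to detect an effect within a subgroup rather than only overall. The effect sizes, power target, and per-participant cost are all invented for illustration.

```python
# Hypothetical sketch (not from the talk): sample size and rough cost when
# powering for a smaller subgroup effect, using statsmodels' power tools.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Participants per arm to detect a modest overall effect (Cohen's d = 0.3)
# at 80% power and alpha = 0.05.
n_overall = analysis.solve_power(effect_size=0.3, power=0.8, alpha=0.05)

# A smaller within-subgroup difference (d = 0.2) needs many more participants.
n_subgroup = analysis.solve_power(effect_size=0.2, power=0.8, alpha=0.05)

cost_per_participant = 500  # invented recruitment and follow-up cost, in dollars
for label, n in [("overall effect", n_overall), ("subgroup effect", n_subgroup)]:
    n = int(round(n))
    print(f"{label}: about {n} per arm, rough cost ${2 * n * cost_per_participant:,}")
```

The exact numbers matter less than the shape of the trade-off described above: recruiting enough participants from under-represented or distrustful populations costs more, and the speakers' point is that this cost should be justified in the grant rather than used as a reason not to do the study.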