It's my pleasure to welcome you to our workshop, the NIH Workshop Toward an Ethical Framework for Artificial Intelligence in Biomedical and Behavioral Research: Transparency for Data and Model Reuse. We're thinking about a framework for ethical A.I., and today attacking how to build in transparency as a first step to building out ethical A.I.

We have coffee in the cafeteria; please help yourself. That should be available throughout the day. For bathrooms, go to the lobby, head toward that door, and follow around to the right or left depending on which you need. Breakouts will be on this floor and the second floor; stay tuned for adjustments to the handouts right before we break. For the speakers, my colleague Jessica in the front will give you a warning if you're about to hit your time limit.

It's my pleasure to introduce Dr. Susan Gregurick, Associate Director for Data Science for NIH, also my boss and director of the Office of Data Science Strategy. I'm so pleased she's here to talk about how this meeting fits with the broader picture and goals of NIH. Susan?

[APPLAUSE]

>> Thank you, Laura. Gosh, it's so nice to be here, so nice to see so many friends and colleagues I haven't seen in years -- looking at you, Tom, Dan. It's just so nice. And to welcome you to our workshop. Laura said I'm her boss; I think of it as a colleague. We have a wonderful small team. I'll tell you about the work writ large and how the work fits into a larger perspective with the A.I. executive order, because that's important for you to hear as well. So let's see what happens if I -- what do I do? Okay.
I need some technical help. I'm Susan Gregurick, Associate Director of Data Science. Our role, as a small, wonderful team, a tiger team, is to catalyze data science and related activities within the community, within this community. Our role is also to coordinate data science and artificial intelligence activities across the 27 institutes, centers, and offices that make up NIH. And our role is also to provide support, that's funding support, to you, the community, but also to our NIH institutes, centers, and offices, to do things like develop infrastructure for data science and cloud computing, to develop programs to support data repositories and knowledge bases, and programs to support A.I. and the next generation of researchers and data scientists. We're a small team of I think about 30 folks, most feds but not all; hopefully you'll meet a few of them today.

So, I'll start with the end: where do we see ourselves going? I could tell you about all the things we've done over the past three years, but what you might find more interesting is to ask where we are going at NIH. This is the data science plan that was developed in collaboration with everybody from the NIH, all institutes, centers, and offices, and with the leadership of Dr. Monica Bertagnolli. The first goal is what the IC directors felt was important, what was on their minds: to improve capabilities to sustain the NIH policy for data management and data sharing. I'll tell you what some of the challenges and goals are. A big priority for Dr. Bertagnolli and myself is to develop programs, and these are large programs, to enhance using human-derived data for research and beyond. I'm a computer person.
My passion is going to be in software, computational methods, and A.I., and of course the goal is to improve these opportunities in software, modeling, and artificial intelligence. We also want to make it easier to find data, so a federated biomedical research data infrastructure is the fourth goal. And of course, providing opportunities for the next generation of data science and A.I. researchers.

I'll dig in a little bit on some future-thinking goals. You probably remember last October -- well, actually last January 25, I believe -- we released the NIH final policy for data management and data sharing. This requires NIH researchers to develop a data management and sharing plan when they submit their grant, contract, or other transactional authority. This creates a need for researchers to understand data management in ways that will foster greater FAIR sharing, and it creates a need for the data to go somewhere, hence new strategies for biomedical repositories and knowledge bases writ large. So what are we going to do? We need to support, and will support, researchers to create FAIR data, and we've had a number of supplements creating additional opportunities for researchers to share data, not just in the extramural community but also in the intramural research community, with new opportunities in electronic lab notebooks, for example. We need to think about that data in a way that makes it usable for others, so that's data harmonization.
There are really great opportunities that have happened -- and I'm looking at Chris -- in data harmonization, N3C, the phenomenal work that Chris and his colleagues did, but we need to think about that at scale: how can we really utilize ontologies and common data elements? You'll see us leverage harmonization of common data elements in the future, more opportunities like some of the work Chris did, but we need to think about that. I want to put a plug in for A.I.: there's some neat work happening in large language models for data curation and harmonization that we can explore more. Finally, we're supporting the NIH-funded data repositories and knowledge bases in ways that make them resources, with new programs and special emphasis study sections.

An important goal for Dr. Bertagnolli and myself is to think about patient-derived data from the clinics and health care systems, and real-world data. There's a need for at-scale acquisition of that data, protecting that data, and utilizing that data from electronic and real-world health data in a way that enhances patient trust and addresses patient confidentiality and privacy concerns. If you've worked with electronic health care data, as Chris has, there's a real data issue there: data quality. So we need to think about that, and really how we use that data in a way that makes sense for the many use cases that we're going to have. And at the same time, there's a need to link that data; that's something that was really interesting with N3C and all of us, ways we can link data that respect the ethical and legal implications, which is important for underrepresented and at-risk peoples.
So what can we do? We will improve the ability to utilize clinical data -- electronic and real-world data. You may have seen funding announcements, and I suspect you might be seeing a few more of those types of activities. We, my office and others, understand there's a whole health I.T. system out there with its own data standards, FHIR, UNC; we can leverage health and administrative data into the research platforms. Finally, and this is really important -- looking at Debra for sure -- ways to think about adopting social and environmental determinants of health. ScHARe has been piloting this, and we can look at scale for NIH. You'll be seeing more of this in the future.

I'm a computer person. I love computational science. So there's so much opportunity in trustworthy and ethical A.I.; that's why we're here. We need help. We need to know how we can actually take these ideas and instantiate them. There are new opportunities in A.I., taking advantage of generative A.I. and foundational models on the horizon, new software and technologies. Our office does support sustainable software, and so you'll see two funding announcements coming out soon to support robust software development, for what Dan Katz has been talking about: bridging the gap, real software enabling technologies, as well as supporting research software engineers -- so, actual career software people in academia.

So let me tell you, because I think I have enough time, about what is happening in trustworthy A.I. There's so much happening in the federal space.
There's the Bill of Rights for A.I., the trustworthy A.I. playbook, tons of strategic plans for A.I., but the most important thing is really the executive order on safe, secure, and trustworthy development that came out on October 30 this past year. How does this impact our work and HHS? There are four main objectives in that executive order for HHS writ large: establishing policies for Health and Human Services, advancing quality and safety of A.I. in health, leveraging grantmaking to advance A.I. use -- that sounds like NIH -- and promoting compliance with non-discrimination and privacy laws.

How does this map into what we're actually doing? I love that slide. What you see is the objectives, the actions that are in the executive order, and then the time frame they are supposed to happen in. I won't go through each one. I just want to show you so you have it and you can look at this in the future. There's a lot of activities. Some of these are years long -- developing the framework is a real long activity. Others were due a day ago, like establishing an A.I. task force. We have established an A.I. task force with leadership from NIH, from the Office of the National Coordinator, from the Deputy Secretary of HHS, and FDA. We have eight working groups taking each of these actions, as well as the overall strategy for A.I., and working through those goals to deliver a strategic plan that is required at the end of 365 days, as well as concrete deliverables due even in 180 days. So you'll see a lot of activity happening. If you're at NIH, you are involved in a lot of activity from the HHS working groups.
I don't want you to think that A.I. is new to NIH. In fact, it's not. The data we've collected go back much farther than 2019; that's just when we started labeling it as A.I. Over several years, the funding we have supported in artificial intelligence grants and contracts has gone up quite a bit. You see a number of the activities here. Most of the folks who are leading these activities are also here, so you can connect with them. We've supported $29 million to NIH awardees to basically make their existing data A.I.-ready, to develop new training opportunities, skills, and competencies, and to take a first look at ethics when thinking about A.I. data. A few of the awardee grants are listed on the far right. This involved a collaboration with 20 institutes, centers, and offices, and it's the foundation of the work we did that's now going to be built upon by this workshop as well as other activities happening in the future. This is a general theme: we do a number of supplements, see what works and where the gaps and opportunities are, and move forward. We'll be moving forward.

Here's one of the ways we're moving forward: with the National A.I. Research Resource. This is led by NSF, with a significant number of agency partnerships as well as industry partnerships, to provide -- think of this as a resource like a BTRR: you come with a proposal, it's reviewed, and you get access to computing power, datasets, and other software stacks. That's really what this is. We're super thrilled to be part of it.
We're part of both the open NAIRR and the secure NAIRR, which we co-lead with our main NAIRR Secure partners at this moment. We'll hopefully be leveraging CloudLab and opportunities to develop large networks for the NAIRR. Federation is a key word you'll hear from Monica as she's speaking in the future about how to bring together all the data and the data assets so researchers can use them; that's something we're working on with her and in collaboration with the National Library of Medicine. Finally, it really takes a village to create A.I. So we want to strengthen and nurture that talent from diverse scientific interests and diverse institutions, something we'll be continuing in the future.

Because I'm running out of time, this is the slide I want to leave you with. It asks you to give us input on what you've heard. I really mean it. I hope you will take the time to provide us input: where are we hitting the mark, missing the mark, what should be in there that isn't in there. That's really important. So I'm going to leave this slide up, and I don't know if there's time for questions. Thank you for coming and making the trek to somewhat rainy and gray D.C.

[APPLAUSE]

>> Thank you.

>> Okay, wonderful. My job now is to talk a little bit about what you'll be doing over the next couple of days, and why it's important to us, and how we came to this point, and hopefully do a little bit of a mind meld; this is the beginning of a journey.
We have a series of exercises mapped out for you that hopefully will result in the sort of input and deep thinking that we're hoping you all will come along with us to do. So just in terms of background, how we got here: it's not a secret. I think there are many examples where A.I. has had unintended consequences in biomedical and healthcare settings. And the root causes of that could be in the data that are used for training a model, could be in the metadata, in the way the model was trained, or any combination of these things. So we really need to understand more about not only the sources of those impacts but also how to remediate them. What we heard from the community is that at the point when an A.I. model is applied in a particular situation, if it doesn't perform well or as expected, it's very hard for people to sort of see upstream where the causes might be.

There's also a lot of attention around A.I. assurance, so thinking about how to measure a model's performance with respect to accuracy but also some more social questions and legal questions. So we want to have a better handle on how to do that. And finally, in open science and federally funded research, perhaps in contrast to what you might see in industry, there's much more reuse of research products. Reuse of data, reuse of models, algorithms, and workflows is a hallmark of the research we fund. So that makes it more difficult to follow those bread crumbs upstream.
And finally, we're getting asked by our research community, by all of you, by our colleagues within NIH who are funding research: what can the research community do to help solve some of these problems? One of the main things I hope we come out of this workshop with is some answer to that question. What can researchers do to help make A.I. more transparent, so people can make more informed, responsible, ethical decisions about how they reuse data and models?

This is the wonderful cast of characters that helped us shape this workshop. Our three co-chairs -- Tina Hernandez-Boussard, Julia Stoyanovich -- it's been wonderful to work with them. The first goal is to begin to develop some of these transparency guidelines for NIH awardees, maybe those using, developing, or contributing to A.I. Think about what we mean by an NIH awardee: it could be an R01 P.I., could be someone who is managing a repository or developing a new search capability. We also want to identify some of the tools and capability gaps; there might be a vision of the future for how we want A.I. development to work in a transparent way, and there's a part we can do now and a part that might be contingent on building out more capability. Finally, we want to tap into your brains and have you tell us about where the future of this field of transparency is going, what the drivers might be, and what NIH should be paying attention to.

In terms of where we might go, our thinking is we'll come out with maybe not a full skeleton but the bones of guidance for NIH researchers, and we would put that out for further comment from the research community.
So possibly through a request for information or some other means; sort of by definition, this room can't do it alone, and we'll have broader input onto the output from this workshop. And then the other two goals will feed into the strategy for how we build out this data ecosystem.

Let me now do a mind meld with you. This is a schematic that I hope is helpful in terms of thinking about what we're doing together today. The blue bar in the middle is the data and A.I. development cycle, the development path. Think of it going from data generation and collection all the way up to deployment of an A.I. model in perhaps a healthcare setting or a research environment. The little heads around that are meant to depict stakeholders. Some of these stakeholders on the top -- I think of it this way -- on the top are people with information needs. These are people along that cycle who might need to know something about the data they are about to reuse or the model they are about to reuse. And on the bottom are stakeholders that have information. These are maybe the P.I. that created the data, who understands the nuances of how the data were processed and what cleaning they did, or maybe someone who developed a model and understands the details of the training mechanism. Our goal is to figure out how information should flow from the people who have it to the people who need it.
So from this diagram, one of the things you should take out of this is that our thinking and our approach to transparency is that transparency is a sort of characteristic of a system, and we're rooting this in a practical understanding of what the stakeholders' needs are. Not just transparency for transparency's sake, but which stakeholders need what information, how we can get it to them, and really help them make informed and responsible decisions.

Just to give a sense -- this afternoon you'll be going into a stakeholder mapping activity -- here is what we thought about in terms of questions people might be asking. There might be someone who is a patient, and there's going to be a health decision made from, you know, using an A.I. model. That patient might be, let's say, from a diverse or minority group, either ethnic or racial, or because they have a rare disease or some other reason. They want to know how well the A.I. model is going to serve their needs, reflect their health condition and preferences about care. They will want to know what sort of factors went into the model, what performance they can expect. These will be available online so you can refer back. We wanted to give you a sense of the questions and information needs that exist around this model, and really thinking about the entire data landscape.

In terms of guidelines for NIH awardees, we'll ask you to think through this stakeholder mapping and pivot to, okay, what are the actions that different awardees take, whether data sharing, sharing a model, or reusing a model; we'll think through it in that sense as well.
The exercises that we'll lead you through will guide you to thinking about, now that you've mapped out the stakeholder needs, how do you get information to flow from one to the other and what are the capability gaps; that will be our session 2. Finally, what does transparency look like in the future? We'll have done this exercise, you know, in the present, but what are the drivers out there -- maybe technology drivers, cultural drivers, they might be policy drivers -- so what can we expect from this field in the next few years?

So, our agenda has this tempo to it: breakout sessions, and then we'll come back into the room for a plenary readout. The red boxes here are for the online participants; that's when you'll have a chance to talk to us through Slido. And after the third session, we'll come back tomorrow afternoon for a sort of big group discussion about what's been created in the breakouts, and we'll continue that discussion into the morning of Friday. So if you're online, those are the most interactive sessions that we'll have: tomorrow afternoon and the beginning of Friday.

Let me go quickly into a bit more detail on what sorts of artifacts we'll be creating, and I'll spend more time here because we'll go straight to the breakouts after the next few talks. For the first breakout session, the goal is thematic breakouts: consider your themes, identify the relevant stakeholders and their information needs.
So what we're imagining is that your leads will guide you through the exercise, and you'll come out with some ideas of, okay, here's a stakeholder, here are their decision needs, here is the information they need in order to make informed, ethical decisions about data and model reuse. And so you'll come out with a collection of those. That will help going into the next breakout session, where you'll then link those needs with information sources, thinking about how the information can flow and what system capabilities are needed. In the third session, the leads will tell you we're pivoting: instead of thinking about individuals like a patient or a researcher, you'll be thinking about their activities. So for data sharing, for example, building out some of the guidance that you would want to give to that person who is sharing data: this is what you should be thinking about, these are the principles you should be following, to make this whole system more transparent.

After that session we'll come back into the plenary room here, and we'll have a discussion sort of thinking about, okay, did we get the collection of activities right? Are there things missing? Are there actions we might want to provide guidance for that we haven't thought of yet? For each of those, do we have the right guidance for each of those actions? We'll lead you through that discussion as well. Finally, for the sake of time I'll be quick here: this is what we'll do for thinking about the future, a really open-ended conversation; we want to have your brainstorming about where this field may be going.
So, a couple of top tips, random comments. The first is that our approach to transparency is based on this practical thinking around individuals' information and decision needs: what do different people around this sort of pipeline, different stakeholders, really need to know? In the course of your conversation, I'm sure it will come up thinking about not just how to make this system transparent but also right from wrong. What should people be doing, what should people not be doing? I would say, if you have a choice, err on the side of thinking about transparency. The primary goal here is not to decide what's right and what's wrong, but to give people the information they need to make those decisions for themselves.

The third bullet is around bias versus causality versus prejudice. We've had conversations that go around this. A model can display bias with respect to certain categories even when those categories are not necessarily causally related to the outcome. And that also doesn't mean that the model was developed in any kind of biased or prejudiced way. So just thinking about those sort of separately, and being careful about how we use our language, is something I would suggest paying attention to.

The breakout themes we've given you, and the instructions I've gone through, are starting points. These are not meant to be restrictive or constraints. They are a launching pad. Go where the conversation takes you. Think about the full data and model development cycle. That means thinking about the data ecosystem.
We're really thinking about data from where it's created or generated, perhaps in a clinic or research setting, to where it's deployed. And so you should be thinking about, you know, data repositories, workspaces, search engines, all the components of our ecosystem in a holistic way. Consider health applications as well as research applications. I think it's very tempting to think about A.I. models in a health application or clinical setting, but many times an A.I. model's purpose is to be deployed in a research setting, so think about that too. And finally, one of the goals of our breakouts is to make sure we hear from all of you. That's why we invited you, why we're so happy to have you, and why there are so many excited faces around. So chime in. Don't feel shy. Our breakout leads will do their best to draw that out of you. So I hope you will contribute.

We have themes for all of the breakouts. You saw this online, where we asked you to pick your top choices; these are roughly mapped out across the data and model development cycle. These are around synthetic data, data sharing for general reuse, multi-modal data, foundation models, and proxy variables, and we hope to sort of cover a wide range of the space of ethical challenges and questions that can arise from A.I. But of course it's not the full space, so use it as a launching pad, not a restriction.

Okay. These are your breakout leads. I have some new and breaking information: these room numbers are not correct. So get out a pen and I'm going to tell you which room you're going to instead.
If you are in the foundation model group -- and by the way, if you didn't pick it up, the group assignments are on a sheet outside -- you're now in 260-C. If you're in the general reuse group assigned to 270-B, you're now in 260-D, for David. If you're in the multi-modal group and were assigned to 260-F, you're now in 260-E, for Edward. Proxy variables, you lucked out, and you're in the room I thought you would be in. And for synthetic data, you're not in 270-A, you're in 150-B. So if you didn't write that down, don't worry, we have it out front, and I have my trusty mapping here so I can let you know. This is for the first three sessions; you'll be in your thematic breakout. For the last session on Friday, we'll mix you all up so you'll be with a new group of friends, and we will reassign your rooms as well. I'll show you the breakout for that.

So, I want to end by thanking again all of the co-leads and the breakout leads. This has been a joyful experience, to work with all of you. I really appreciate it. I was telling them last week that I've never been so relaxed one week out from a workshop, because I have such faith in this group. I think you're going to have a lot of fun. I also want to thank my colleagues who have been helping with the organization and really helped bring this workshop into being; this is a subset of our NIH A.I. Ethics Working Group, and so I really want to thank all of them. And finally, the support team you'll see today -- feel free to ask them questions.
952 00:33:08,353 --> 00:33:10,488 JESSICA ST. LOUIS HAS BEEN 953 00:33:10,555 --> 00:33:12,290 HELPING US, SIMON TWIGGER, 954 00:33:12,357 --> 00:33:14,425 MICHELLE BAILEY, KELLY WILSON AS 955 00:33:14,492 --> 00:33:17,929 WELL AS COLLEAGUES FROM SCG 956 00:33:17,995 --> 00:33:19,130 MANNING THE REGISTRATION DESK 957 00:33:19,197 --> 00:33:19,497 OUTSIDE. 958 00:33:19,564 --> 00:33:21,432 WITH THAT, I WOULD LOVE TO HAND 959 00:33:21,499 --> 00:33:24,268 IT OVER TO JULIA WHO IS GOING TO 960 00:33:24,335 --> 00:33:34,679 INTRODUCE US INTO THE SCIENCE OF 961 00:33:34,746 --> 00:33:35,813 TRANSPARENCY. 962 00:33:35,880 --> 00:33:37,515 >> ALL RIGHT. 963 00:33:37,582 --> 00:33:38,282 THANK YOU. 964 00:33:38,349 --> 00:33:38,816 [APPLAUSE] 965 00:33:38,883 --> 00:33:41,786 IT'S SO MUCH A PLEASURE TO BE 966 00:33:41,853 --> 00:33:42,186 HERE. 967 00:33:42,253 --> 00:33:45,056 AND TO HAVE THIS IMPOSSIBLE TASK 968 00:33:45,123 --> 00:33:50,461 OF GIVING AN INTRODUCTION TO 969 00:33:50,528 --> 00:33:51,829 TRANSPARENCY IN 20 MINUTES. 970 00:33:51,896 --> 00:33:54,365 I DECIDED NOT TO FOLLOW THE 971 00:33:54,432 --> 00:33:57,101 PROMPT PRECISELY BUT RATHER GIVE 972 00:33:57,168 --> 00:33:58,403 MY OPINION ON WHAT TRANSPARENCY 973 00:33:58,469 --> 00:34:00,171 MAY MEAN AND APPLY TO OUR 974 00:34:00,238 --> 00:34:01,806 CONVERSATION IN THE NEXT THREE 975 00:34:01,873 --> 00:34:02,640 DAYS, TO UNDERSCORE AND 976 00:34:02,707 --> 00:34:05,843 REINFORCE SOME OF THE THINGS 977 00:34:05,910 --> 00:34:07,945 SAID ALREADY BUT MAKE SURE SOME 978 00:34:08,012 --> 00:34:09,147 TOPICS WITHIN TRANSPARENCY REALM 979 00:34:09,213 --> 00:34:11,983 THAT I OFTEN SEE ARE MISSED THAT 980 00:34:12,049 --> 00:34:14,051 WE PAY ATTENTION TO THEM AS WE 981 00:34:14,118 --> 00:34:17,121 MOVE AHEAD. 982 00:34:17,188 --> 00:34:19,323 I'M A COMPUTER SCIENTIST, I KNOW 983 00:34:19,390 --> 00:34:21,959 I'M IN A MINORITY IN THIS 984 00:34:22,026 --> 00:34:23,928 MEETING, IT'S REALLY EXCITING TO 985 00:34:23,995 --> 00:34:25,863 MEET ALL THESE AMAZING PEOPLE 986 00:34:25,930 --> 00:34:29,801 WHO WILL TEACH ME ABOUT HOW 987 00:34:29,867 --> 00:34:32,670 TRANSPARENCY APPLIES IN THIS 988 00:34:32,737 --> 00:34:33,971 EXTREMELY IMPORTANT DOMAIN OF 989 00:34:34,038 --> 00:34:36,407 MEDICINE, OF HEALTH, RIGHT? 990 00:34:36,474 --> 00:34:38,576 AND I SEE THIS IS REALLY AN 991 00:34:38,643 --> 00:34:39,210 OPPORTUNITY TO ADVANCE THE STATE 992 00:34:39,277 --> 00:34:43,080 OF THE ART BOTH WITHIN COMPUTER 993 00:34:43,147 --> 00:34:44,582 SCIENCE AND WITHIN BIOMEDICAL 994 00:34:44,649 --> 00:34:44,982 RESEARCH. 995 00:34:45,049 --> 00:34:46,451 AS YOU'LL SEE MANY THINGS YOU 996 00:34:46,517 --> 00:34:48,653 MAY ASSUME ARE KIND OF 997 00:34:48,719 --> 00:34:49,854 OPERATIONALIZED AND DONE, JUST A 998 00:34:49,921 --> 00:34:50,955 MATTER OF FIGURING OUT WHAT IS 999 00:34:51,022 --> 00:34:55,993 THE RIGHT TOOL THAT I SHOULD BE 1000 00:34:56,060 --> 00:34:58,396 USING, THEY ARE WITHIN COMPUTER 1001 00:34:58,463 --> 00:35:01,466 SCIENCE, I'M GLAD THERE'S THIS 1002 00:35:01,532 --> 00:35:02,033 COLLABORATION AMPLIFYING AND 1003 00:35:02,099 --> 00:35:03,434 STARTING AND HOPEFULLY GOING 1004 00:35:03,501 --> 00:35:03,668 FURTHER. 1005 00:35:03,734 --> 00:35:05,837 TO START I WANT TO TELL YOU THAT 1006 00:35:05,903 --> 00:35:08,239 I HAD THE PLEASURE OF DIRECTING 1007 00:35:08,306 --> 00:35:10,107 THE CENTER FOR RESPONSIBLE A.I. 1008 00:35:10,174 --> 00:35:12,844 AT NYU, OUR GOAL IS TO MAKE 1009 00:35:12,910 --> 00:35:16,881 RESPONSIBLE A.I. AND A.I. 
1010 00:35:16,948 --> 00:35:22,153 SYNONYMOUS, IN THE NOT-TOO- 1011 00:35:22,220 --> 00:35:23,020 DISTANT FUTURE. 1012 00:35:23,087 --> 00:35:24,655 OUR HOME PAGE, YOU'LL FIND 1013 00:35:24,722 --> 00:35:25,656 INFORMATION ABOUT OUR RESEARCH, 1014 00:35:25,723 --> 00:35:27,191 A LOT OF BASIC RESEARCH ON 1015 00:35:27,258 --> 00:35:28,292 COMPUTER SCIENCE, AS I 1016 00:35:28,359 --> 00:35:31,229 MENTIONED, THAT IS NECESSARY AND 1017 00:35:31,295 --> 00:35:31,896 ONGOING. 1018 00:35:31,963 --> 00:35:33,664 A LOT IS SOCIOTECHNICAL WORK, A 1019 00:35:33,731 --> 00:35:35,933 LOT IS WORK ON TECHNOLOGY 1020 00:35:36,000 --> 00:35:36,167 POLICY. 1021 00:35:36,234 --> 00:35:38,469 AND FINALLY, AS I'LL TELL YOU AT 1022 00:35:38,536 --> 00:35:41,072 THE END OF MY PRESENTATION WE DO 1023 00:35:41,138 --> 00:35:43,274 WORK ON EDUCATION AND TRAINING. 1024 00:35:43,341 --> 00:35:44,041 ALL OF THESE INTERVENTIONS ARE 1025 00:35:44,108 --> 00:35:47,545 IMPORTANT TO BE ABLE TO REALLY 1026 00:35:47,612 --> 00:35:48,546 ENACT LASTING SOCIETAL CHANGE IN 1027 00:35:48,613 --> 00:35:53,251 THE WAY WE USE A.I. AND RELATED 1028 00:35:53,317 --> 00:35:53,818 TECHNOLOGIES. 1029 00:35:53,885 --> 00:35:56,754 THE OTHER THING IS MY SLIDES 1030 00:35:56,821 --> 00:35:57,889 WILL BE VISUALLY INTERESTING, 1031 00:35:57,955 --> 00:36:00,825 AND THIS IS A BY-PRODUCT OF OUR 1032 00:36:00,892 --> 00:36:02,126 PUBLIC EDUCATION ACTIVITIES, AND 1033 00:36:02,193 --> 00:36:04,428 I ALSO THINK IT'S REALLY 1034 00:36:04,495 --> 00:36:07,798 IMPORTANT THAT WE BRING SOME 1035 00:36:07,865 --> 00:36:09,901 HUMAN, FUN, ART, AND OTHER WAYS 1036 00:36:09,967 --> 00:36:12,303 TO PRESENT INFORMATION INTO THIS 1037 00:36:12,370 --> 00:36:13,271 CONVERSATION. 1038 00:36:13,337 --> 00:36:15,473 BECAUSE THE CONVERSATION ABOUT 1039 00:36:15,540 --> 00:36:16,340 A.I. AND ETHICS WHICH 1040 00:36:16,407 --> 00:36:19,310 TRANSPARENCY IS A MAJOR 1041 00:36:19,377 --> 00:36:20,511 COMPONENT REALLY, IT'S A HEAVY 1042 00:36:20,578 --> 00:36:22,146 CONVERSATION FOR PEOPLE TO HAVE. 1043 00:36:22,213 --> 00:36:25,416 AND SO BY USING ART, THE GOAL IS 1044 00:36:25,483 --> 00:36:28,052 TO KIND OF SHORTEN THIS DISTANCE 1045 00:36:28,119 --> 00:36:28,719 BETWEEN PEOPLE AND 1046 00:36:28,786 --> 00:36:29,587 UNDERSTANDING, AND ACTIONS THAT 1047 00:36:29,654 --> 00:36:33,524 WE NEED TO BE TAKING 1048 00:36:33,591 --> 00:36:34,425 COLLECTIVELY. 1049 00:36:34,492 --> 00:36:35,359 SO, AS A DATA SCIENTIST I WILL 1050 00:36:35,426 --> 00:36:38,563 OF COURSE START WITH AN EXAMPLE. 1051 00:36:38,629 --> 00:36:40,064 I HOPE TO MOTIVATE A LOT OF WHAT 1052 00:36:40,131 --> 00:36:50,274 WE'LL BE TALKING ABOUT AND TO 1053 00:36:50,341 --> 00:36:52,209 MOTIVATE TRANSPARENCY, ONE ROOM 1054 00:36:52,276 --> 00:36:54,478 WILL CONTINUE ON DATA SYNTHESIS, 1055 00:36:54,545 --> 00:36:59,016 SYNTHETIC DATA GENERATION. 1056 00:36:59,083 --> 00:37:01,953 THE SPECIFIC EXAMPLE HERE IS 1057 00:37:02,019 --> 00:37:06,791 THIS PROJECT CALLED FASTMRI, A 1058 00:37:06,857 --> 00:37:07,658 COLLABORATION BETWEEN FORMERLY 1059 00:37:07,725 --> 00:37:10,161 FACEBOOK A.I. NOW META A.I. AND 1060 00:37:10,227 --> 00:37:11,362 NYU MEDICAL SCHOOL. 1061 00:37:11,429 --> 00:37:14,699 AND THE GOAL OF THIS PROJECT 1062 00:37:14,765 --> 00:37:17,501 ESSENTIALLY IS TO ACCELERATE AND 1063 00:37:17,568 --> 00:37:20,705 IMPROVE ACCESS AND IMPROVE 1064 00:37:20,771 --> 00:37:21,973 THROUGHPUT TO MRI TECHNOLOGY. 1065 00:37:22,039 --> 00:37:23,074 WHAT HAPPENS HERE? 
1066 00:37:23,140 --> 00:37:26,077 THERE'S A CLEAR NEED FOR 1067 00:37:26,143 --> 00:37:30,448 IMPROVEMENT IN THIS DOMAIN. 1068 00:37:30,514 --> 00:37:37,555 ACQUISITION TIME IN MRI CAN 1069 00:37:37,622 --> 00:37:38,756 EXCEED 30 MINUTES, LEADING TO 1070 00:37:38,823 --> 00:37:40,625 PROBLEMS WITH COMFORT AND 1071 00:37:40,691 --> 00:37:41,292 COMPLIANCE AND ARTIFACTS FROM 1072 00:37:41,359 --> 00:37:45,663 PATIENT MOTION. 1073 00:37:45,730 --> 00:37:47,531 A CHALLENGE FOR MRI SYNTHESIS IS 1074 00:37:47,598 --> 00:37:49,233 THE LACK OF PUBLIC DATA AND 1075 00:37:49,300 --> 00:37:50,167 VALIDATION METHODS. 1076 00:37:50,234 --> 00:37:51,902 THIS IS WHAT THIS PROJECT IS 1077 00:37:51,969 --> 00:37:52,837 ATTEMPTING TO ADDRESS. 1078 00:37:52,903 --> 00:37:56,273 WHAT DO THEY DO? 1079 00:37:56,340 --> 00:37:58,943 THEY GENERATE SOME PRELIMINARY 1080 00:37:59,010 --> 00:38:01,245 SCANS THAT ARE GENERATED MUCH 1081 00:38:01,312 --> 00:38:02,313 MORE FAST THAN TRADITIONAL MRI 1082 00:38:02,380 --> 00:38:04,515 AND FILL IN THE BLANKS USING 1083 00:38:04,582 --> 00:38:04,949 DATA, RIGHT? 1084 00:38:05,016 --> 00:38:09,587 SO THIS IS A KIND OF SYNTHETIC 1085 00:38:09,654 --> 00:38:10,621 DATASET YOU GET AT. 1086 00:38:10,688 --> 00:38:12,957 THERE'S A CLEAR NEED FOR 1087 00:38:13,024 --> 00:38:13,591 IMPROVEMENT HERE. 1088 00:38:13,658 --> 00:38:15,593 THIS TECHNOLOGY HAS A RIGHT TO 1089 00:38:15,660 --> 00:38:16,560 EXIST AND THESE DATASETS ARE 1090 00:38:16,627 --> 00:38:20,064 VERY NECESSARY TO HELP US CREATE 1091 00:38:20,131 --> 00:38:21,699 SYNTHETIC MRI SCANS. 1092 00:38:21,766 --> 00:38:25,302 THE OTHER THING HERE THAT MAKES 1093 00:38:25,369 --> 00:38:26,170 THIS PARTICULAR PROJECT 1094 00:38:26,237 --> 00:38:27,038 APPROPRIATE WITHIN RESPONSIBLE 1095 00:38:27,104 --> 00:38:30,007 A.I. REALM IS WE CAN VALIDATE 1096 00:38:30,074 --> 00:38:31,409 THE QUALITY AND USEFULNESS OF 1097 00:38:31,475 --> 00:38:33,611 THE DATA AND OF ANY MODELS THAT 1098 00:38:33,678 --> 00:38:35,046 MAY BE TRAINED ON THIS DATA. 1099 00:38:35,112 --> 00:38:37,214 HOW DO WE DO THIS? 1100 00:38:37,281 --> 00:38:40,151 THERE'S A GROUND TRUTH, YOU CAN 1101 00:38:40,217 --> 00:38:45,423 COMPARE SYNTHETIC MRI IMAGE WITH 1102 00:38:45,489 --> 00:38:48,826 GROUND TRUTH IMAGE AND SEE IF 1103 00:38:48,893 --> 00:38:51,128 THEY DIFFERN WHAT WAY. 1104 00:38:51,195 --> 00:38:52,963 IN THIS PROJECT AND PUBLICATIONS 1105 00:38:53,030 --> 00:38:55,733 AUTHORS SPEAK ABOUT WAYS THEY 1106 00:38:55,800 --> 00:38:58,235 VALIDATE THE QUALITY OF THE 1107 00:38:58,302 --> 00:38:59,570 DATASETS GENERATED. 1108 00:38:59,637 --> 00:39:02,840 THESE VALIDATION METHODS ARE 1109 00:39:02,907 --> 00:39:04,942 BASED ESSENTIALLY ON STATISTICS, 1110 00:39:05,009 --> 00:39:07,244 BASED ON COMPARING WHETHER 1111 00:39:07,311 --> 00:39:09,413 PICTURES ARE SIMILAR AND 1112 00:39:09,480 --> 00:39:09,947 COMPARISONS BETWEEN BASIC 1113 00:39:10,014 --> 00:39:10,414 STRUCTURE, RIGHT? 1114 00:39:10,481 --> 00:39:12,016 ONE OF THE THINGS WE NEED TO 1115 00:39:12,083 --> 00:39:13,684 THINK ABOUT WHETHER THESE 1116 00:39:13,751 --> 00:39:14,285 COMPARISONS ARE APPROPRIATE, 1117 00:39:14,351 --> 00:39:17,054 WE'LL COME BACK IN JUST A 1118 00:39:17,121 --> 00:39:17,321 SECOND. 
1119 00:39:17,388 --> 00:39:18,689 ANOTHER REASON WHY I LIKE TO 1120 00:39:18,756 --> 00:39:19,924 UNDERSCORE THIS PROJECT TO BRING 1121 00:39:19,990 --> 00:39:23,828 IT UP IS THAT WE ARE TECHNICALLY 1122 00:39:23,894 --> 00:39:27,131 READY TO BE ABLE TO GENERATE 1123 00:39:27,198 --> 00:39:28,132 THESE SYNTHETIC MRI SCANS. 1124 00:39:28,199 --> 00:39:30,835 THIS IS BECAUSE WE'RE ABLE TO 1125 00:39:30,901 --> 00:39:32,937 COLLECT THE DATA, WE HAVE 1126 00:39:33,003 --> 00:39:33,804 ALGORITHMIC TOOLS, THE HARDWARE, 1127 00:39:33,871 --> 00:39:36,340 TO BE ABLE TO ACTUALLY COME UP 1128 00:39:36,407 --> 00:39:37,641 WITH THESE DATASETS, RIGHT? 1129 00:39:37,708 --> 00:39:38,909 IT'S NOT A PIE IN THE SKY KIND 1130 00:39:38,976 --> 00:39:39,477 OF THING. 1131 00:39:39,543 --> 00:39:43,114 THIS IS SOMETHING WE CAN DO AS A 1132 00:39:43,180 --> 00:39:43,781 COMMUNITY. 1133 00:39:43,848 --> 00:39:45,750 AND THE FINAL POINT, EXTREMELY 1134 00:39:45,816 --> 00:39:47,818 IMPORTANT, IS THAT IN THIS 1135 00:39:47,885 --> 00:39:50,087 PARTICULAR DOMAIN UNLIKE MANY 1136 00:39:50,154 --> 00:39:52,189 OTHER DOMAINS WHERE A.I. AND 1137 00:39:52,256 --> 00:39:54,058 SYNTHETIC DATA ARE USED 1138 00:39:54,125 --> 00:39:57,628 AMBITIOUSLY, WE HAVE OR HOPE TO 1139 00:39:57,695 --> 00:39:58,963 HAVE DECISION MAKER READINESS. 1140 00:39:59,029 --> 00:40:01,565 THE CLINICIANS WHO ARE THEN 1141 00:40:01,632 --> 00:40:03,467 LOOKING AT SYNTHETIC SCANS OR 1142 00:40:03,534 --> 00:40:04,068 RESEARCHERS, THEY UNDERSTAND 1143 00:40:04,135 --> 00:40:07,071 WHAT THEY ARE DEALING WITH, 1144 00:40:07,138 --> 00:40:08,139 TRAINED IN MEDICAL ETHICS, IN 1145 00:40:08,205 --> 00:40:09,039 THE CASE OF RADIOLOGIST, 1146 00:40:09,106 --> 00:40:10,274 UNDERSTAND THE DECISION THEY ARE 1147 00:40:10,341 --> 00:40:12,309 GOING TO MAKE AS A RESULT TO 1148 00:40:12,376 --> 00:40:15,279 DIAGNOSE A PATIENT IN A 1149 00:40:15,346 --> 00:40:17,982 PARTICULAR WAY IS THEIR 1150 00:40:18,048 --> 00:40:18,516 DECISION. 1151 00:40:18,582 --> 00:40:19,416 SAYING THE COMPUTER TOLD ME SO 1152 00:40:19,483 --> 00:40:22,386 IS NOT GOING TO CUT IT, RIGHT? 1153 00:40:22,453 --> 00:40:26,323 AND THIS IS REALLY WHERE WE GET 1154 00:40:26,390 --> 00:40:28,092 TO OUR CONVERSATION OF 1155 00:40:28,159 --> 00:40:29,860 TRANSPARENCY, RIGHT? 1156 00:40:29,927 --> 00:40:34,331 IT'S BEING ABLE TO REASON 1157 00:40:34,398 --> 00:40:36,066 CAREFULLY ABOUT WHETHER THIS 1158 00:40:36,133 --> 00:40:36,734 PARTICULAR TECHNOLOGY OR 1159 00:40:36,801 --> 00:40:39,370 TECHNOLOGY LIKE IT AND WHETHER 1160 00:40:39,436 --> 00:40:41,338 THIS PARTICULAR DATASET IS 1161 00:40:41,405 --> 00:40:42,740 SOMETHING THAT A HUMAN DECISION 1162 00:40:42,807 --> 00:40:44,275 MAKER WOULD BE COMFORTABLE 1163 00:40:44,341 --> 00:40:44,675 USING. 1164 00:40:44,742 --> 00:40:44,875 RIGHT? 1165 00:40:44,942 --> 00:40:49,013 SO WHAT COULD BE SOME OF THE 1166 00:40:49,079 --> 00:40:51,382 CONCERNS THAT MAYBE SURFACE IN A 1167 00:40:51,448 --> 00:40:54,952 SYNTHETIC MRI DATASET? 1168 00:40:55,019 --> 00:41:02,126 DOES ANYBODY HAVE THOUGHTS? 1169 00:41:02,193 --> 00:41:04,762 IT MAY NOT BE REPRESENTATIVE, 1170 00:41:04,829 --> 00:41:07,731 RIGHT, SO THIS PARTICULAR STUDY 1171 00:41:07,798 --> 00:41:09,800 SAYS IS ABOUT KNEE MRI AND BRAIN 1172 00:41:09,867 --> 00:41:12,203 MRI, DOES NOT REPRESENT OTHER 1173 00:41:12,269 --> 00:41:14,672 TYPES OF ORGANS OR TISSUES. 
1174 00:41:14,738 --> 00:41:17,241 IT MAY NOT BE REPRESENTATIVE OF 1175 00:41:17,308 --> 00:41:19,410 A VARIETY OF POPULATIONS OF 1176 00:41:19,476 --> 00:41:22,379 PATIENTS, THAT WE MAY WANT TO BE 1177 00:41:22,446 --> 00:41:23,247 USING OUR MODELS ON, ON WHICH 1178 00:41:23,314 --> 00:41:24,248 THE DATA IS TRAINED. 1179 00:41:24,315 --> 00:41:31,355 WHAT ELSE MAY BE A CONCERN HERE? 1180 00:41:31,422 --> 00:41:33,390 HOW GOOD -- RIGHT, HOW GOOD IS 1181 00:41:33,457 --> 00:41:34,592 THIS DATA ACTUALLY, IS IT 1182 00:41:34,658 --> 00:41:35,426 SUFFICIENT FOR US, IN OTHER 1183 00:41:35,492 --> 00:41:39,997 WORDS, TO BE USING EXISTING 1184 00:41:40,064 --> 00:41:40,998 METHODS TO EVALUATE DATA 1185 00:41:41,065 --> 00:41:42,967 SYNTHESIS TO TELL WHETHER THIS 1186 00:41:43,033 --> 00:41:45,135 DATA WILL ULTIMATELY BE -- THIS 1187 00:41:45,202 --> 00:41:48,339 IS THE HOLY GRAIL, 1188 00:41:48,405 --> 00:41:49,240 DIAGNOSTICALLY INTERCHANGEABLE 1189 00:41:49,306 --> 00:41:50,941 WITH GROUND TRUTH MRI SCANS, 1190 00:41:51,008 --> 00:41:51,408 RIGHT? 1191 00:41:51,475 --> 00:41:53,043 WHAT DO WE NEED TO DO TO MAKE 1192 00:41:53,110 --> 00:41:55,446 SURE WE CAN CHECK IN FACT THAT 1193 00:41:55,512 --> 00:41:57,314 THESE SYNTHETIC IMAGES ARE GOOD. 1194 00:41:57,381 --> 00:41:59,650 WHAT ELSE MAY BE SOME CONCERNS? 1195 00:41:59,717 --> 00:42:04,989 >> [OFF MICROPHONE] 1196 00:42:05,055 --> 00:42:05,723 >> YES, EXACTLY. 1197 00:42:05,789 --> 00:42:09,894 SO ARE WE ABLE TO GET PATIENT 1198 00:42:09,960 --> 00:42:10,527 BUY-IN, RIGHT? 1199 00:42:10,594 --> 00:42:11,962 ARE WE ABLE TO CONVINCE THE 1200 00:42:12,029 --> 00:42:14,932 PATIENTS THE DATA IS IN FACT AS 1201 00:42:14,999 --> 00:42:17,801 GOOD, THAT IT'S DIAGNOSTICALLY 1202 00:42:17,868 --> 00:42:18,636 INTERCHANGEABLE FOR ALL INTENTS 1203 00:42:18,702 --> 00:42:21,572 AND PURPOSES WITH THE REAL DATA. 1204 00:42:21,639 --> 00:42:22,907 OTHER STAKEHOLDERS, AND WE'LL DO 1205 00:42:22,973 --> 00:42:25,009 A LOT OF STAKEHOLDER ANALYSIS, 1206 00:42:25,075 --> 00:42:27,611 ARE THE CLINICIANS, RIGHT? 1207 00:42:27,678 --> 00:42:28,979 THE RADIOLOGISTS THEMSELVES, ARE 1208 00:42:29,046 --> 00:42:30,147 THEY TRUSTING THIS DATA. 1209 00:42:30,214 --> 00:42:31,582 AND WHAT DO WE NEED TO DO TO 1210 00:42:31,649 --> 00:42:33,684 MAKE SURE THEY ARE ABLE TO 1211 00:42:33,751 --> 00:42:37,288 REASON ABOUT THE QUALITY OF THIS 1212 00:42:37,354 --> 00:42:39,356 DATA AND FITNESS FOR USE FOR THE 1213 00:42:39,423 --> 00:42:40,991 TASK THEY HAVE AT HAND? 1214 00:42:41,058 --> 00:42:44,161 DO WE NEED TO TEACH THEM, TRAIN 1215 00:42:44,228 --> 00:42:46,997 THEM, SHOW SOMETHING IN THE WAY 1216 00:42:47,064 --> 00:42:47,598 OF CONFIDENCE INTERVALS FOR 1217 00:42:47,665 --> 00:42:49,133 EXAMPLE TOGETHER WITH THE DATA? 1218 00:42:49,199 --> 00:42:51,735 AND THEN ANOTHER CONCERN I WILL 1219 00:42:51,802 --> 00:42:52,536 RAISE IS PRIVACY. 1220 00:42:52,603 --> 00:42:54,505 SO THIS DATA THAT IS BEING 1221 00:42:54,571 --> 00:42:56,440 RELEASED HERE, GROUND TRUTH DATA 1222 00:42:56,507 --> 00:42:58,108 WAS ANONYMIZED BUT IS THIS 1223 00:42:58,175 --> 00:42:58,342 ENOUGH? 
1224 00:42:58,409 --> 00:43:01,378 AND WE WILL TALK ABOUT LOTS AND 1225 00:43:01,445 --> 00:43:05,416 LOTS OF OTHER ASPECTS OF THIS 1226 00:43:05,482 --> 00:43:06,083 AND SIMILAR USE CASES, BUT WHAT 1227 00:43:06,150 --> 00:43:08,919 I WANT US TO BE THINKING ABOUT 1228 00:43:08,986 --> 00:43:11,989 FROM HERE IS THAT ULTIMATELY, 1229 00:43:12,056 --> 00:43:13,290 THIS USE CASE AND MANY OTHERS 1230 00:43:13,357 --> 00:43:15,626 WILL LEAD US INTO THE 1231 00:43:15,693 --> 00:43:16,460 CONVERSATIONS ABOUT 1232 00:43:16,527 --> 00:43:17,695 TRANSPARENCY, SO WHAT IS 1233 00:43:17,761 --> 00:43:18,228 TRANSPARENCY? 1234 00:43:18,295 --> 00:43:20,864 I WAS ASKED TO DEFINE THE TERM. 1235 00:43:20,931 --> 00:43:24,034 AND UNFORTUNATELY, THERE IS NO 1236 00:43:24,101 --> 00:43:26,103 CONSENSUS IN THE COMMUNITY ABOUT 1237 00:43:26,170 --> 00:43:29,840 WHAT TERMS TO USE TO DENOTE, IS 1238 00:43:29,907 --> 00:43:36,313 IT TRANSPARENCY? INTERPRETABILITY? 1239 00:43:36,380 --> 00:43:36,880 EXPLAINABILITY? 1240 00:43:36,947 --> 00:43:37,247 INTELLIGIBILITY? 1241 00:43:37,314 --> 00:43:40,351 AND WHAT DO THE TERMS MEAN? 1242 00:43:40,417 --> 00:43:43,620 THEY ARE USED OFTEN 1243 00:43:43,687 --> 00:43:44,488 INTERCHANGEABLY. 1244 00:43:44,555 --> 00:43:47,658 USUALLY, THERE'S A DISTINCTION 1245 00:43:47,725 --> 00:43:48,993 BETWEEN TERMS THAT DENOTE 1246 00:43:49,059 --> 00:43:53,030 PROPERTIES OF A MODEL AS BEING 1247 00:43:53,097 --> 00:43:53,364 INTERPRETABLE. 1248 00:43:53,430 --> 00:43:55,099 SO, FOR EXAMPLE, CINDY RUDIN 1249 00:43:55,165 --> 00:43:55,866 SPEAKS ABOUT INTERPRETABLE 1250 00:43:55,933 --> 00:43:57,301 MACHINE LEARNING AS HAVING A 1251 00:43:57,368 --> 00:43:59,703 FOCUS ON DESIGNING MODELS THAT 1252 00:43:59,770 --> 00:44:00,504 ARE INHERENTLY INTERPRETABLE. 1253 00:44:00,571 --> 00:44:03,741 SO THIS IS A BIT OF A CIRCULAR 1254 00:44:03,807 --> 00:44:04,508 DEFINITION, RIGHT? 1255 00:44:04,575 --> 00:44:13,283 LIKE A DECISION TREE WOULD BE 1256 00:44:13,350 --> 00:44:14,018 INTERPRETABLE BUT THE NEURAL 1257 00:44:14,084 --> 00:44:16,453 NETWORK IS NOT. 1258 00:44:16,520 --> 00:44:18,288 AND WHETHER A PERSON CAN PREDICT 1259 00:44:18,355 --> 00:44:20,157 AND UNDERSTAND THE BEHAVIOR OF A MODEL -- MILLER'S 1260 00:44:20,224 --> 00:44:22,026 DEFINITION -- SHIFTS THE FOCUS 1261 00:44:22,092 --> 00:44:23,994 AWAY FROM A MODEL AND BRINGS IN 1262 00:44:24,061 --> 00:44:25,629 THE PERSPECTIVE OF A HUMAN. 1263 00:44:25,696 --> 00:44:27,498 AND THIS IS A PERSPECTIVE THAT I 1264 00:44:27,564 --> 00:44:30,434 REALLY WANT US TO HAVE HERE. 1265 00:44:30,501 --> 00:44:32,503 THE WAY I MAKE SENSE OF THESE 1266 00:44:32,569 --> 00:44:37,007 TERMS IS ARRANGING THEM ON A 1267 00:44:37,074 --> 00:44:38,108 CONTINUUM, FROM TRANSPARENCY 1268 00:44:38,175 --> 00:44:38,742 AND INTERPRETABILITY, 1269 00:44:38,809 --> 00:44:42,813 REFERRING TO PROPERTIES OF A 1270 00:44:42,880 --> 00:44:44,014 TECHNICAL COMPONENT, ALGORITHM, 1271 00:44:44,081 --> 00:44:53,424 DATASET, OR SYSTEM, TO 1272 00:44:53,490 --> 00:44:54,024 EXPLAINABILITY AND 1273 00:44:54,091 --> 00:44:54,691 INTELLIGIBILITY, WHICH CENTER ON WHAT THE SYSTEM DOES FOR A PERSON RATHER THAN 1274 00:44:54,758 --> 00:44:56,827 HOW IT WORKS.
1275 00:44:56,894 --> 00:44:59,563 I SUGGEST WE DON'T GET TANGLED 1276 00:44:59,630 --> 00:45:02,299 UP IN TERMINOLOGY, IF WE'RE TO 1277 00:45:02,366 --> 00:45:06,003 PICK ONE TERM LET'S TAKE 1278 00:45:06,070 --> 00:45:07,204 TRANSPARENCY BUT KEEP IN MIND 1279 00:45:07,271 --> 00:45:10,074 THAT OUR GOAL HERE IS TO ALLOW 1280 00:45:10,140 --> 00:45:12,242 PEOPLE, DIFFERENT KINDS OF 1281 00:45:12,309 --> 00:45:13,577 PEOPLE, PATIENTS, DOCTORS, 1282 00:45:13,644 --> 00:45:18,816 SOCIETY AT LARGE, REGULATORS, 1283 00:45:18,882 --> 00:45:20,451 AUDITORS, RESEARCHERS, TO 1284 00:45:20,517 --> 00:45:21,318 UNDERSTAND DATA, MODEL, OUTPUT, 1285 00:45:21,385 --> 00:45:23,087 TO ACT IN WHATEVER WAYS THAT ARE 1286 00:45:23,153 --> 00:45:33,263 APPROPRIATE FOR THEM TO ACT. 1287 00:45:33,330 --> 00:45:38,969 THE OVERARCHING GOAL IS NODES OF 1288 00:45:39,036 --> 00:45:40,404 RESPONSIBILITY TO PEOPLE, WHAT 1289 00:45:40,471 --> 00:45:42,139 LAURA TOLD US, TO SUPPORT 1290 00:45:42,206 --> 00:45:42,806 RESPONSIBLE DESIGN, DEVELOPMENT, 1291 00:45:42,873 --> 00:45:45,509 USE OF A.I., IN THIS CASE 1292 00:45:45,576 --> 00:45:49,379 COLLECTION AND USE OF BIOMEDICAL 1293 00:45:49,446 --> 00:45:49,713 DATA. 1294 00:45:49,780 --> 00:45:51,882 SO WHEN WE THINK ABOUT 1295 00:45:51,949 --> 00:45:52,549 TRANSPARENCY MORE CONCRETELY 1296 00:45:52,616 --> 00:45:53,817 THERE ARE A COUPLE QUESTIONS 1297 00:45:53,884 --> 00:45:57,254 THAT WE NEED TO BE ASKING. 1298 00:45:57,321 --> 00:46:01,325 ONE OF THEM IS TRANSPARENCY OF 1299 00:46:01,391 --> 00:46:02,192 WHAT? 1300 00:46:02,259 --> 00:46:03,594 TRANSPARENCY IS NOT ANALYTIC 1301 00:46:03,660 --> 00:46:03,827 CONCEPT. 1302 00:46:03,894 --> 00:46:05,796 WE NEED TO START SPLITTING IT UP 1303 00:46:05,863 --> 00:46:07,965 INTO COMPONENTS THAT WE CAN MORE 1304 00:46:08,031 --> 00:46:09,800 EASILY UNDERSTAND. 1305 00:46:09,867 --> 00:46:12,336 ARE WE INTERESTED IN DATA 1306 00:46:12,402 --> 00:46:12,736 TRANSPARENCY? 1307 00:46:12,803 --> 00:46:15,606 UNDERSTANDING HOW THE DATA CAME 1308 00:46:15,672 --> 00:46:17,374 TO BE? 1309 00:46:17,441 --> 00:46:18,575 WHETHER IT'S PRIVACY PRESERVING, 1310 00:46:18,642 --> 00:46:20,577 WHAT IT CONTAINS, WHAT IT'S 1311 00:46:20,644 --> 00:46:21,879 MISSING, WHAT IS APPROPRIATE 1312 00:46:21,945 --> 00:46:22,980 CONTEXT OF USE? 1313 00:46:23,046 --> 00:46:29,653 IS IT UNDERSTANDING THE MODELS 1314 00:46:29,720 --> 00:46:31,522 SO WE CAN ASSESS THEIR 1315 00:46:31,588 --> 00:46:33,056 PROPERTIES IN TERMS OF 1316 00:46:33,123 --> 00:46:34,458 ROBUSTNESS, IN TERMS OF 1317 00:46:34,525 --> 00:46:37,194 CORRECTNESS, IN TERMS OF 1318 00:46:37,261 --> 00:46:38,495 PERFORMANCE OVERALL AS SPECIFIC 1319 00:46:38,562 --> 00:46:41,031 SUBSETS OF DATA. 1320 00:46:41,098 --> 00:46:44,001 OR MAYBE WE'RE WANTING TO BRING 1321 00:46:44,067 --> 00:46:44,868 TRANSPARENCY OF THE OVERALL 1322 00:46:44,935 --> 00:46:45,235 PROCESS. 1323 00:46:45,302 --> 00:46:46,270 AND THIS IS SOMETHING THAT I 1324 00:46:46,336 --> 00:46:48,338 WOULD LIKE TO FOCUS ON HERE 1325 00:46:48,405 --> 00:46:51,508 BECAUSE WE DON'T REALLY THINK 1326 00:46:51,575 --> 00:46:56,880 HOLISTICALLY WHEN WE'RE STARTING 1327 00:46:56,947 --> 00:46:57,481 CONVERSATIONS. 
1328 00:46:57,548 --> 00:46:59,850 IN THE STATE OF THE ART, 1329 00:46:59,917 --> 00:47:01,285 THINKING ABOUT TRANSPARENCY AND THE 1330 00:47:01,351 --> 00:47:02,252 DIFFERENT TOOLS AND ARTIFACTS 1331 00:47:02,319 --> 00:47:04,454 THAT HAVE BEEN DEVELOPED, USUALLY 1332 00:47:04,521 --> 00:47:07,724 WE THINK ABOUT DATASHEETS FOR 1333 00:47:07,791 --> 00:47:09,693 DATASETS OR MODEL CARDS FOR 1334 00:47:09,760 --> 00:47:10,561 MODEL PERFORMANCE, RIGHT? 1335 00:47:10,627 --> 00:47:12,930 BUT REALLY IT'S A BIG ECOSYSTEM 1336 00:47:12,996 --> 00:47:14,498 LIKE LAURA TOLD US ALSO, RIGHT? 1337 00:47:14,565 --> 00:47:16,033 THIS IS SOMETHING THAT I THINK 1338 00:47:16,099 --> 00:47:17,234 IS MISSING FROM THE 1339 00:47:17,301 --> 00:47:20,204 CONVERSATION, AND WHERE WE HAVE 1340 00:47:20,270 --> 00:47:22,940 AN OPPORTUNITY TO DEVELOP NEW 1341 00:47:23,006 --> 00:47:25,475 METHODS, AND THINK ABOUT 1342 00:47:25,542 --> 00:47:26,810 TRANSPARENCY THROUGH THE LIFE 1343 00:47:26,877 --> 00:47:29,012 CYCLE OF OUR ENTIRE SYSTEM 1344 00:47:29,079 --> 00:47:31,682 STARTING FROM THE QUESTION OF 1345 00:47:31,748 --> 00:47:33,483 WHY THE SYSTEM IS HERE, MOVING 1346 00:47:33,550 --> 00:47:35,886 TO DECIDING WHAT DATA TO 1347 00:47:35,953 --> 00:47:37,854 COLLECT, HOW TO ANNOTATE DATA, 1348 00:47:37,921 --> 00:47:39,590 PROCESS THE DATA, PREPARING IT 1349 00:47:39,656 --> 00:47:41,458 PERHAPS TO TRAIN A MODEL, AND 1350 00:47:41,525 --> 00:47:45,062 HOW TO REASON ABOUT MODELS' 1351 00:47:45,128 --> 00:47:45,362 PREDICTIONS. 1352 00:47:45,429 --> 00:47:46,730 SO, HERE I WANT TO GIVE ANOTHER 1353 00:47:46,797 --> 00:47:47,164 EXAMPLE. 1354 00:47:47,231 --> 00:47:50,334 HOW MUCH TIME DO I HAVE? 1355 00:47:50,400 --> 00:47:53,503 PERFECT. 1356 00:47:53,570 --> 00:47:56,273 AND THIS EXAMPLE -- I DON'T 1357 00:47:56,340 --> 00:47:57,774 ACTUALLY HAVE A LOT OF 1358 00:47:57,841 --> 00:47:59,209 EXPERIENCE IN THIS DOMAIN 1359 00:47:59,276 --> 00:48:00,410 SPECIFICALLY BUT I HOPE WE ALL 1360 00:48:00,477 --> 00:48:03,247 WILL BE ABLE TO RELATE TO THE 1361 00:48:03,313 --> 00:48:05,849 EXAMPLE I'M SHOWING. 1362 00:48:05,916 --> 00:48:07,417 AND IT'S FROM ALGORITHMIC 1363 00:48:07,484 --> 00:48:10,320 HIRING, A DOMAIN I'VE BEEN 1364 00:48:10,387 --> 00:48:11,989 DEEPLY ENGAGED IN. 1365 00:48:12,055 --> 00:48:13,957 SO, HERE CONSIDER ANN, A DATA 1366 00:48:14,024 --> 00:48:16,260 SCIENTIST AT A MAJOR RETAILER, 1367 00:48:16,326 --> 00:48:19,329 WHO WAS ASKED TO DEVELOP A 1368 00:48:19,396 --> 00:48:21,198 PREDICTOR TO DECIDE WHAT SALARY 1369 00:48:21,265 --> 00:48:23,066 TO OFFER TO INDIVIDUALS WHO 1370 00:48:23,133 --> 00:48:25,235 APPLIED FOR THE JOB AND WHO THE 1371 00:48:25,302 --> 00:48:29,840 COMPANY WANTS TO HIRE. 1372 00:48:29,906 --> 00:48:31,908 AND SHE HAS A REPORT OF 1373 00:48:31,975 --> 00:48:33,343 DEMOGRAPHICS AND EMPLOYMENT 1374 00:48:33,410 --> 00:48:34,945 HISTORY OF JOB APPLICANTS. 1375 00:48:35,012 --> 00:48:36,380 FOLLOWING HER COMPANY'S BEST 1376 00:48:36,446 --> 00:48:37,681 PRACTICES SHE'S GOING TO SPLIT 1377 00:48:37,748 --> 00:48:38,949 HER DATA INTO TRAINING, 1378 00:48:39,016 --> 00:48:39,983 VALIDATION AND TEST, THIS IS 1379 00:48:40,050 --> 00:48:42,853 WHAT WE DO IN ANY DOMAIN, RIGHT?
1380 00:48:42,919 --> 00:48:44,821 AND SHE'S GOING TO DO 1381 00:48:44,888 --> 00:48:46,123 PRE-PROCESSING OF DATA SO 1382 00:48:46,189 --> 00:48:47,891 SPECIFICALLY THE TYPE OF 1383 00:48:47,958 --> 00:48:48,859 PRE-PROCESSING I WANT TO FOCUS 1384 00:48:48,925 --> 00:48:59,469 ON SHE WILL FILL IN SOME MISSING 1385 00:49:05,509 --> 00:49:08,011 VALUES, INTERPOLATE MISSING 1386 00:49:08,078 --> 00:49:09,546 VALUES, MODEL TRAINING, TUNING, 1387 00:49:09,613 --> 00:49:09,846 VALIDATION. 1388 00:49:09,913 --> 00:49:12,949 WHEN LOOKING AT 1389 00:49:13,016 --> 00:49:15,585 PERFORMANCE SHE'S GOING TO 1390 00:49:15,652 --> 00:49:18,855 NOTICE AN ISSUE, THAT ACCURACY 1391 00:49:18,922 --> 00:49:24,661 OF PREDICTION, SALARY LEVEL, 1392 00:49:24,728 --> 00:49:25,562 WILL BE SYSTEMATICALLY 1393 00:49:25,629 --> 00:49:26,663 LOWER FOR WOMEN WHO HAVE MORE 1394 00:49:26,730 --> 00:49:28,732 EXPERIENCE ON THE JOB. 1395 00:49:28,799 --> 00:49:32,002 THIS IS A FAIRNESS CONCERN, 1396 00:49:32,069 --> 00:49:32,202 RIGHT? 1397 00:49:32,269 --> 00:49:35,305 SO NOW BASED ON HER OWN LIVED 1398 00:49:35,372 --> 00:49:36,940 EXPERIENCE, ANN IS GOING TO TRY 1399 00:49:37,007 --> 00:49:38,775 TO FIGURE OUT WHY THIS IS 1400 00:49:38,842 --> 00:49:39,276 HAPPENING. 1401 00:49:39,343 --> 00:49:43,947 AND SHE WILL PAY ATTENTION 1402 00:49:44,014 --> 00:49:47,451 SPECIFICALLY TO THE DATA 1403 00:49:47,517 --> 00:49:48,518 INTERPOLATION STEP FOR THE 1404 00:49:48,585 --> 00:49:54,791 NUMBER OF YEARS OF EXPERIENCE. 1405 00:49:54,858 --> 00:49:56,426 AND SHE INTERPOLATES BY DEFAULT 1406 00:49:56,493 --> 00:49:58,628 IN THE PIPELINE PICKING THE MEAN 1407 00:49:58,695 --> 00:50:02,999 VALUE FROM ALL THE KNOWN VALUES, 1408 00:50:03,066 --> 00:50:03,233 RIGHT? 1409 00:50:03,300 --> 00:50:08,472 HER RESULT IS PRESENTED AT TOP 1410 00:50:08,538 --> 00:50:14,811 RIGHT, BUT GROUND TRUTH DATA IS 1411 00:50:14,878 --> 00:50:17,314 OLDER, NUMBER OF YEARS OF 1412 00:50:17,381 --> 00:50:20,283 EXPERIENCE IS A STRONG PROXY FOR 1413 00:50:20,350 --> 00:50:21,618 AGE, AND THAT'S WHAT WE'RE USING 1414 00:50:21,685 --> 00:50:24,187 TO DEPICT THE SITUATION. 1415 00:50:24,254 --> 00:50:26,823 IN THIS PARTICULAR DOMAIN, WHY 1416 00:50:26,890 --> 00:50:29,793 WOULD THIS BE HAPPENING, THAT 1417 00:50:29,860 --> 00:50:31,228 MISSING VALUE IMPUTATION THAT 1418 00:50:31,294 --> 00:50:35,599 ASSUMES VALUES ARE MISSING AT 1419 00:50:35,665 --> 00:50:36,800 RANDOM DOESN'T WORK, LEADING US 1420 00:50:36,867 --> 00:50:39,202 TO TAKE THE MEAN OF THE 1421 00:50:39,269 --> 00:50:40,003 DISTRIBUTION, AND MISSING AT 1422 00:50:40,070 --> 00:50:42,038 RANDOM MEANS WHETHER OR NOT THE 1423 00:50:42,105 --> 00:50:42,806 VALUE IS MISSING DOESN'T DEPEND 1424 00:50:42,873 --> 00:50:45,075 ON WHAT THE VALUE IS. 1425 00:50:45,142 --> 00:50:47,611 BUT HERE VALUES ARE NOT MISSING 1426 00:50:47,677 --> 00:50:48,879 AT RANDOM BECAUSE OF AGE-BASED 1427 00:50:48,945 --> 00:50:50,347 DISCRIMINATION IN HIRING. 1428 00:50:50,414 --> 00:50:52,883 PEOPLE WHO ARE OLDER, WHO HAVE 1429 00:50:52,949 --> 00:50:54,851 MORE YEARS OF EXPERIENCE, THEY 1430 00:50:54,918 --> 00:50:56,253 ARE WITHHOLDING THAT 1431 00:50:56,319 --> 00:50:58,121 INFORMATION, RIGHT?
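A minimal sketch of the failure mode just described, with invented numbers (the column name and missingness rates are illustrative, not from the talk): when applicants with more experience, a proxy for age, withhold that value, mean imputation silently assumes the values are missing at random and pulls the imputed column toward the younger end of the distribution.

import numpy as np
import pandas as pd

rng = np.random.default_rng(42)

# Illustrative applicant pool: years of experience between 1 and 40.
true_experience = rng.integers(1, 41, size=1_000).astype(float)

# Not missing at random: the more experience (a proxy for age), the more
# likely the applicant is to leave the field blank.
p_missing = np.clip(true_experience / 40 * 0.6, 0, 1)
observed = true_experience.copy()
observed[rng.random(1_000) < p_missing] = np.nan

df = pd.DataFrame({"years_experience": observed})

# Default pipeline behavior: fill blanks with the mean of the observed values.
imputed = df["years_experience"].fillna(df["years_experience"].mean())

print("true mean experience:  ", round(float(true_experience.mean()), 1))
print("mean after imputation: ", round(float(imputed.mean()), 1))
# The imputed column skews toward less experience, which is what then drags
# the predicted salary offers down for the affected applicants.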
1432 00:50:58,188 --> 00:51:00,991 AND SO THIS INTERPOLATION 1433 00:51:01,057 --> 00:51:05,328 STEP IS SKEWING THIS DATA YOUNGER, 1434 00:51:05,395 --> 00:51:07,330 AND SALARIES ARE LOWER, AND IF 1435 00:51:07,397 --> 00:51:09,599 THESE PEOPLE, MANY OF THEM WOMEN, WERE TO 1436 00:51:09,666 --> 00:51:11,701 ACCEPT THE OFFERS BEING MADE BY 1437 00:51:11,768 --> 00:51:14,104 THIS COMPANY, THEN THIS WOULD 1438 00:51:14,171 --> 00:51:19,810 REINFORCE THE GENDER WAGE GAP. 1439 00:51:19,876 --> 00:51:23,079 THIS IS SIMPLE BUT OFTEN 1440 00:51:23,146 --> 00:51:24,147 OVERLOOKED, BUT TRUE REALLY IF 1441 00:51:24,214 --> 00:51:25,916 YOU THINK ABOUT IT, IN THE WAY 1442 00:51:25,982 --> 00:51:27,217 THAT WE'RE ASSESSING THE QUALITY 1443 00:51:27,284 --> 00:51:31,188 OF THE DATA AND THE WAY IN WHICH 1444 00:51:31,254 --> 00:51:31,822 WE'RE PRE-PROCESSING DATA 1445 00:51:31,888 --> 00:51:34,624 LEADING UP TO MODEL TRAINING, 1446 00:51:34,691 --> 00:51:35,759 THERE ARE ADDITIONAL QUESTIONS 1447 00:51:35,826 --> 00:51:36,293 HERE. 1448 00:51:36,359 --> 00:51:39,930 FOR EXAMPLE, HOW ARE WE 1449 00:51:39,996 --> 00:51:41,231 INTERPOLATING WHEN VALUES ARE CATEGORIES -- 1450 00:51:41,298 --> 00:51:42,332 ARE WE INTERPOLATING MISSING 1451 00:51:42,399 --> 00:51:44,968 VALUES AS BELONGING TO ONE OF 1452 00:51:45,035 --> 00:51:47,037 THE RARE CATEGORIES? IF NOT, 1453 00:51:47,103 --> 00:51:47,971 WE'LL BE SKEWING TO IMPUTE 1454 00:51:48,038 --> 00:51:49,973 VALUES ONLY IN THE MAJORITY 1455 00:51:50,040 --> 00:51:53,376 CATEGORIES. 1456 00:51:53,443 --> 00:51:55,879 THE EXAMPLE HERE IS THE NATIVE 1457 00:51:55,946 --> 00:51:56,847 AMERICAN CATEGORY. 1458 00:51:56,913 --> 00:51:58,982 ARE WE REPRESENTING ALL THE 1459 00:51:59,049 --> 00:52:02,819 FEATURE VALUES? THE EXAMPLE HERE 1460 00:52:02,886 --> 00:52:04,988 IS GENDER, WHERE SOME IDENTIFY 1461 00:52:05,055 --> 00:52:07,858 AS NON-BINARY BUT THE INPUT FORM 1462 00:52:07,924 --> 00:52:09,059 SAYS MALE, FEMALE, OR LEAVE 1463 00:52:09,125 --> 00:52:11,695 BLANK OR OTHER, RIGHT? 1464 00:52:11,761 --> 00:52:13,430 AND EVERYBODY WHO IDENTIFIES AS 1465 00:52:13,497 --> 00:52:14,965 NON-BINARY IS GOING TO SELECT 1466 00:52:15,031 --> 00:52:18,134 OTHER OR LEAVE THE VALUE BLANK 1467 00:52:18,201 --> 00:52:19,769 AND BE INTERPOLATED AS MALE OR 1468 00:52:19,836 --> 00:52:22,305 FEMALE, JUST AN INCORRECT DATA 1469 00:52:22,372 --> 00:52:22,839 REPRESENTATION. 1470 00:52:22,906 --> 00:52:24,508 SO, SIMPLE THINGS LIKE THIS ARE 1471 00:52:24,574 --> 00:52:26,243 VERY IMPORTANT TO BUILD INTO OUR 1472 00:52:26,309 --> 00:52:28,845 THINKING ABOUT HOW WE PROCESS 1473 00:52:28,912 --> 00:52:30,280 DATA AND HOW WE DISCLOSE TO 1474 00:52:30,347 --> 00:52:32,883 OURSELVES AND ALSO TO THOSE 1475 00:52:32,949 --> 00:52:35,585 CONSUMING THE DATA, SOME OF 1476 00:52:35,652 --> 00:52:40,123 THESE STEPS THAT WENT INTO DATA 1477 00:52:40,190 --> 00:52:40,423 PROCESSING. 1478 00:52:40,490 --> 00:52:47,898 OTHERS ARE SIMPLE, FOR EXAMPLE 1479 00:52:47,964 --> 00:52:50,734 DATA FILTERING, WHERE WE CHANGE 1480 00:52:50,800 --> 00:52:51,535 DEMOGRAPHIC GROUP PROPORTIONS -- 1481 00:52:51,601 --> 00:52:52,602 IF THESE OPERATIONS ARE AMONG 1482 00:52:52,669 --> 00:52:55,238 THE TENS OR HUNDREDS OF DATA 1483 00:52:55,305 --> 00:52:55,872 PRE-PROCESSING OPERATIONS, YOU 1484 00:52:55,939 --> 00:52:57,173 KNOW YOU'RE GOING TO LOSE THAT 1485 00:52:57,240 --> 00:52:59,809 INFORMATION BY THE TIME YOUR 1486 00:52:59,876 --> 00:53:00,677 MODEL IS TRAINED.
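Checks like the ones just described can be computed mechanically. Here is a minimal sketch, with invented column names and categories, that compares group proportions before and after two common steps: mode imputation of a categorical field and row filtering.

import pandas as pd

# Illustrative applicant records with a rare category and some blanks.
df = pd.DataFrame({
    "gender": ["female", "male", "other", None, "female", None, "male", "other"],
    "race": ["White", "Black", "Native American", "White", "Asian",
             "White", "Black", "White"],
    "years_experience": [3, 12, 25, 8, 4, 31, 9, 6],
})

def group_shares(frame, column):
    # Share of each category, counting blanks explicitly.
    return frame[column].value_counts(normalize=True, dropna=False).round(2)

print("gender shares before:  ", group_shares(df, "gender").to_dict())

# Step 1: mode imputation -- blanks, including people who chose not to
# disclose, silently become the most common category.
imputed = df.assign(gender=df["gender"].fillna(df["gender"].mode()[0]))
print("after mode imputation: ", group_shares(imputed, "gender").to_dict())

# Step 2: filtering -- dropping more senior applicants also changes the
# demographic mix of what remains (here the rare category disappears).
filtered = imputed[imputed["years_experience"] < 20]
print("race shares before:    ", group_shares(df, "race").to_dict())
print("race shares after:     ", group_shares(filtered, "race").to_dict())

Recording summaries like these at every step is essentially what the pipeline-tracing tools described below are meant to automate.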
1487 00:53:00,744 --> 00:53:02,779 SO SOME OF THE CHALLENGES HERE 1488 00:53:02,846 --> 00:53:04,347 THAT WE'RE FACING IS BUILDING 1489 00:53:04,414 --> 00:53:07,284 TOOLS THAT WILL ALLOW US TO BE 1490 00:53:07,350 --> 00:53:08,685 TRANSPARENT EXPLAINING TO THE 1491 00:53:08,752 --> 00:53:12,289 DATA SCIENTIST BUT ALSO TO THOSE 1492 00:53:12,355 --> 00:53:14,824 CONSUMING THE RESULTS HOW DATA 1493 00:53:14,891 --> 00:53:17,327 PRE-PROCESSING AND WHETHER DATA 1494 00:53:17,394 --> 00:53:19,162 PRE-PROCESSING CHANGES SOME 1495 00:53:19,229 --> 00:53:20,730 BASIC DEMOGRAPHIC STATISTICS OF 1496 00:53:20,797 --> 00:53:22,032 THE POPULATIONS THAT WE'RE 1497 00:53:22,098 --> 00:53:23,133 WORKING WITH. 1498 00:53:23,199 --> 00:53:24,568 HERE I'M CORRELATING ONE OF THE 1499 00:53:24,634 --> 00:53:26,636 TOOLS THAT MY COLLABORATORS AND 1500 00:53:26,703 --> 00:53:30,006 I HAVE BUILT, THAT ALLOW YOU TO 1501 00:53:30,073 --> 00:53:31,575 TRACE INFORMATION ABOUT DATA 1502 00:53:31,641 --> 00:53:32,108 PRE-PROCESSING THROUGH THE 1503 00:53:32,175 --> 00:53:34,277 PIPELINE BUT THIS IS ONLY THE 1504 00:53:34,344 --> 00:53:35,278 FIRST STEP. 1505 00:53:35,345 --> 00:53:37,881 WE NEED MUCH MORE CAPABLE TOOLS 1506 00:53:37,948 --> 00:53:38,081 HERE. 1507 00:53:38,148 --> 00:53:39,716 THE OTHER QUESTION I WILL END ON 1508 00:53:39,783 --> 00:53:43,587 THIS, WE NEED TO BE ASKING, IS 1509 00:53:43,653 --> 00:53:46,323 TRANSPARENCY FOR WHOM AND WHY? 1510 00:53:46,389 --> 00:53:47,991 TO WHOM ARE WE EXPLAINING WHAT 1511 00:53:48,058 --> 00:53:49,593 WE NEED TO EXPLAIN? 1512 00:53:49,659 --> 00:53:51,795 AND WHY ARE WE EXPLAINING? 1513 00:53:51,861 --> 00:53:53,530 WHAT DO WE WANT THE RECIPIENTS 1514 00:53:53,597 --> 00:53:55,932 OF THE INFORMATION TO DO WITH 1515 00:53:55,999 --> 00:53:58,134 THIS INFORMATION? 1516 00:53:58,201 --> 00:54:00,303 AND FINALLY ARE THESE 1517 00:54:00,370 --> 00:54:01,404 EXPLANATIONS EFFECTIVE, ARE 1518 00:54:01,471 --> 00:54:02,405 PEOPLE UNDERSTANDING WHAT WE'RE 1519 00:54:02,472 --> 00:54:03,640 EXPLAINING TO THEM? 1520 00:54:03,707 --> 00:54:06,876 AND I HOPE THAT WE ALL WILL HAVE 1521 00:54:06,943 --> 00:54:08,545 TIME TO REFLECT ON THE 1522 00:54:08,612 --> 00:54:10,714 PARTICULAR TRANSPARENCY 1523 00:54:10,780 --> 00:54:11,615 MODALITIES IN OUR BREAKOUTS, 1524 00:54:11,681 --> 00:54:14,217 THAT WOULD BE APPROPRIATE FOR 1525 00:54:14,284 --> 00:54:15,885 DIFFERENT STAKEHOLDERS, ALSO 1526 00:54:15,952 --> 00:54:18,021 WILL REFLECT ON WAYS WE CAN 1527 00:54:18,088 --> 00:54:20,056 CHECK INFORMATION WAS ACTUALLY 1528 00:54:20,123 --> 00:54:22,993 REACHING THE RECIPIENT. 1529 00:54:23,059 --> 00:54:26,096 HERE TO COMPLEMENT WHAT AARON 1530 00:54:26,162 --> 00:54:28,298 WILL TELL US, I WANT TO SHOW TO 1531 00:54:28,365 --> 00:54:31,901 YOU THIS METAPHOR OF NUTRITIONAL 1532 00:54:31,968 --> 00:54:35,605 LABEL, THAT I THINK CAPTURES 1533 00:54:35,672 --> 00:54:36,940 BOTH PUBLIC FACING, PATIENT 1534 00:54:37,007 --> 00:54:37,607 FACING DISCLOSURE OF 1535 00:54:37,674 --> 00:54:41,878 INFORMATION, BUT ALSO CAN BE 1536 00:54:41,945 --> 00:54:43,279 USED VERY FLEXIBLY TO TELL US 1537 00:54:43,346 --> 00:54:47,584 PROPERTIES OF THE DATA, THE 1538 00:54:47,651 --> 00:54:50,620 MODEL, AND OVERALL PROCESS. 1539 00:54:50,687 --> 00:54:53,056 NUTRITIONAL LABELS SHOULD BE 1540 00:54:53,123 --> 00:54:55,358 COMPREHENSIBLE, SHORT, SIMPLE, 1541 00:54:55,425 --> 00:54:55,692 CLEAR. 1542 00:54:55,759 --> 00:55:03,233 THEY SHOULD BE CONSULTATIVE, 1543 00:55:03,299 --> 00:55:04,534 AND COMPUTABLE. 
1544 00:55:04,601 --> 00:55:05,468 AUTOMATICALLY COMPUTABLE, TO 1545 00:55:05,535 --> 00:55:08,838 MAKE IT SO A PERSON WHO IS 1546 00:55:08,905 --> 00:55:09,272 IMPLEMENTING A DATA 1547 00:55:09,339 --> 00:55:10,340 TRANSFORMATION PIPELINE DOESN'T 1548 00:55:10,407 --> 00:55:13,076 ALSO HAVE TO WORK EXTRA HARD TO 1549 00:55:13,143 --> 00:55:13,977 DOCUMENT IT. 1550 00:55:14,044 --> 00:55:17,147 TO GENERATE THE NECESSARY 1551 00:55:17,213 --> 00:55:20,350 INFORMATION FOR TRANSPARENCY. 1552 00:55:20,417 --> 00:55:22,185 AND A CHALLENGE HERE IS BOTH 1553 00:55:22,252 --> 00:55:25,221 CREATING TOOLS THAT WILL ALLOW 1554 00:55:25,288 --> 00:55:28,358 US MANY STAKEHOLDERS TO HAVE AN 1555 00:55:28,425 --> 00:55:30,360 UNDERSTANDING OF THE DATA, 1556 00:55:30,427 --> 00:55:32,929 MODELS, THE PROCESS, BUT ALSO TO 1557 00:55:32,996 --> 00:55:36,099 MAKE SURE THAT THESE NUTRITIONAL 1558 00:55:36,166 --> 00:55:36,866 LABELS OR OTHER TRANSPARENCY 1559 00:55:36,933 --> 00:55:38,168 TOOLS WORK. 1560 00:55:38,234 --> 00:55:39,469 FOR THIS I'M HIGHLIGHTING 1561 00:55:39,536 --> 00:55:41,104 THERE'S A HUGE CHALLENGE, A BIG 1562 00:55:41,171 --> 00:55:45,141 GAP IN THE WAY THAT WE ARE 1563 00:55:45,208 --> 00:55:47,811 EDUCATING PEOPLE WHO ARE EITHER 1564 00:55:47,877 --> 00:55:52,916 BEING SUBJECTED TO A.I. OR ARE 1565 00:55:52,982 --> 00:55:54,584 PRODUCING OR CONSUMING DATA THAT 1566 00:55:54,651 --> 00:55:57,420 WAS COLLECTED AS WELL AS MODELS 1567 00:55:57,487 --> 00:55:58,188 AND RESULTS. 1568 00:55:58,254 --> 00:55:59,823 AND SPECIFICALLY HERE I WOULD 1569 00:55:59,889 --> 00:56:01,357 LIKE TO HIGHLIGHT THAT WE HAVE 1570 00:56:01,424 --> 00:56:03,426 BEEN DOING SOME WORK ON 1571 00:56:03,493 --> 00:56:04,527 ALGORITHMIC TRANSPARENCY AND 1572 00:56:04,594 --> 00:56:06,262 TEACHING DIFFERENT PEOPLE ABOUT 1573 00:56:06,329 --> 00:56:07,797 ALGORITHMIC TRANSPARENCY, BUT WE 1574 00:56:07,864 --> 00:56:10,433 HAVE NOT YET HAD THE PLEASURE OF 1575 00:56:10,500 --> 00:56:11,835 DEVELOPING WORKSHOPS AND 1576 00:56:11,901 --> 00:56:13,169 TRAINING PEOPLE IN THE 1577 00:56:13,236 --> 00:56:14,938 BIOMEDICAL DOMAIN SO THIS WOULD 1578 00:56:15,004 --> 00:56:16,339 BE THE NEXT FRONTIER. 1579 00:56:16,406 --> 00:56:20,410 HERE ARE A COUPLE REFERENCES 1580 00:56:20,477 --> 00:56:22,812 THAT I'M HAPPY TO SHARE AND SOME 1581 00:56:22,879 --> 00:56:24,147 PEOPLE WHO CONTRIBUTED TO THE 1582 00:56:24,214 --> 00:56:27,851 WORK THAT I DISCUSSED TODAY AND 1583 00:56:27,917 --> 00:56:30,086 IT WAS SUPPORTED BY THE NATIONAL 1584 00:56:30,153 --> 00:56:31,454 SCIENCE FOUNDATION. 1585 00:56:31,521 --> 00:56:34,190 I'D LIKE TO HIGHLIGHT THE 1586 00:56:34,257 --> 00:56:35,291 ALGORITHMIC TRANSPARENCY THAT IS 1587 00:56:35,358 --> 00:56:37,927 AVAILABLE IF YOU'D LIKE TO TAKE 1588 00:56:37,994 --> 00:56:39,996 A LOOK. 1589 00:56:40,063 --> 00:56:40,296 THANKS. 1590 00:56:40,363 --> 00:56:45,668 [APPLAUSE] 1591 00:56:45,735 --> 00:56:48,204 >> PLEASE PUT SOME QUESTIONS IN 1592 00:56:48,271 --> 00:56:49,405 THE SLIDO ONLINE OR IN THE ROOM 1593 00:56:49,472 --> 00:56:50,807 AND WE'LL TRY AND GET ANSWERS 1594 00:56:50,874 --> 00:56:54,110 AND NOW IT'S MY PLEASURE TO 1595 00:56:54,177 --> 00:56:54,978 INTRODUCE TINA 1596 00:56:55,044 --> 00:56:56,613 HERNANDEZ-BOUSSARD FOR THE NEXT 1597 00:56:56,679 --> 00:56:56,946 PRESENTATION. 1598 00:56:57,013 --> 00:56:57,714 >> THANK YOU. 1599 00:56:57,781 --> 00:56:58,581 OKAY, GOOD MORNING. 1600 00:56:58,648 --> 00:57:01,217 IT'S SUCH A PLEASURE TO BE HERE. 
1601 00:57:01,284 --> 00:57:03,620 THANK YOU TO THE ORGANIZERS FOR 1602 00:57:03,686 --> 00:57:06,656 HAVING ME START OFF THIS 1603 00:57:06,723 --> 00:57:07,690 WONDERFUL WORKSHOP. 1604 00:57:07,757 --> 00:57:10,059 IT'S A PLEASURE TO BE HERE WITH 1605 00:57:10,126 --> 00:57:10,827 FRIENDS, COLLEAGUES, NEW FACES 1606 00:57:10,894 --> 00:57:13,797 FROM WHOM I HOPE TO LEARN SO MUCH. 1607 00:57:13,863 --> 00:57:15,532 I WAS TASKED WITH GIVING A TALK 1608 00:57:15,598 --> 00:57:18,701 ABOUT WHAT CAN GO WRONG. 1609 00:57:18,768 --> 00:57:20,236 I WAS LIKE, WELL, GOSH, THERE'S 1610 00:57:20,303 --> 00:57:21,437 SO MUCH THAT IS WRONG AND 1611 00:57:21,504 --> 00:57:22,872 THERE'S SO MUCH TO TALK ABOUT, I 1612 00:57:22,939 --> 00:57:26,176 WASN'T SURE HOW TO REALLY TACKLE 1613 00:57:26,242 --> 00:57:26,376 THIS. 1614 00:57:26,442 --> 00:57:28,378 I'M GOING THROUGH A COUPLE 1615 00:57:28,444 --> 00:57:30,480 EXAMPLES AND HIGHLIGHT WHERE 1616 00:57:30,547 --> 00:57:32,348 THINGS HAVE GONE WRONG, AND WHAT 1617 00:57:32,415 --> 00:57:35,185 THIS HAS CAUSED 1618 00:57:35,251 --> 00:57:36,152 REGARDING SOCIETAL AND 1619 00:57:36,219 --> 00:57:36,820 POPULATION HARM. 1620 00:57:36,886 --> 00:57:39,756 SO, WE KNOW WITH A.I. THERE'S SO 1621 00:57:39,823 --> 00:57:40,757 MUCH OPPORTUNITY, I DON'T NEED 1622 00:57:40,824 --> 00:57:43,059 TO EXPLAIN THIS TO THIS GROUP 1623 00:57:43,126 --> 00:57:45,695 BUT A.I. HAS THE OPPORTUNITY TO 1624 00:57:45,762 --> 00:57:48,965 REALLY TRANSFORM OUR ABILITY TO 1625 00:57:49,032 --> 00:57:50,834 DIAGNOSE, TREAT, AND REALLY 1626 00:57:50,900 --> 00:57:52,202 IMPROVE PATIENT OUTCOMES, CAN 1627 00:57:52,268 --> 00:57:53,503 HELP WITH CLINICAL DECISION 1628 00:57:53,570 --> 00:57:56,206 MAKING, GETTING THE RIGHT TYPE 1629 00:57:56,272 --> 00:57:56,873 OF INFORMATION, QUICKLY AT POINT 1630 00:57:56,940 --> 00:57:58,875 OF CARE IN THE HANDS OF 1631 00:57:58,942 --> 00:58:00,577 PATIENTS, CLINICIANS, HEALTH 1632 00:58:00,643 --> 00:58:01,544 CARE SETTINGS, AND GENERATIVE 1633 00:58:01,611 --> 00:58:03,746 A.I. IS A NEW ENDEAVOR THAT 1634 00:58:03,813 --> 00:58:05,481 WE'RE SO EXCITED ABOUT. 1635 00:58:05,548 --> 00:58:09,319 WE DON'T EVEN KNOW ALL THE 1636 00:58:09,385 --> 00:58:09,652 OPPORTUNITIES. 1637 00:58:09,719 --> 00:58:10,954 IT'S IMPORTANT TO NOTE THIS IS 1638 00:58:11,020 --> 00:58:12,689 REALLY A DOUBLE-EDGED SWORD. 1639 00:58:12,755 --> 00:58:18,094 BECAUSE WHILE A.I. HAS ITS 1640 00:58:18,161 --> 00:58:19,729 PROMISE TO IMPROVE DATA-DRIVEN 1641 00:58:19,796 --> 00:58:24,767 MEDICINE, IT HAS THE POTENTIAL TO 1642 00:58:24,834 --> 00:58:26,202 WIDEN DISPARITIES AND 1643 00:58:26,269 --> 00:58:27,837 CREATE SOCIETAL AND POPULATION 1644 00:58:27,904 --> 00:58:28,304 HARM. 1645 00:58:28,371 --> 00:58:29,906 LAURA SAID WE'RE NOT JUST 1646 00:58:29,973 --> 00:58:31,774 FOCUSED ON A.I. IN MEDICAL 1647 00:58:31,841 --> 00:58:33,509 PRACTICE BUT THROUGHOUT OUR 1648 00:58:33,576 --> 00:58:34,744 ENTIRE RESEARCH SYSTEM. 1649 00:58:34,811 --> 00:58:39,849 WHEN WE HAVE A.I. THAT IS 1650 00:58:39,916 --> 00:58:42,051 UNETHICAL OR BIASED OR FLAWED IN 1651 00:58:42,118 --> 00:58:43,319 SOME SENSE IT TRICKLES THROUGH 1652 00:58:43,386 --> 00:58:45,021 THE ENTIRE HEALTH CARE SYSTEM. 1653 00:58:45,088 --> 00:58:50,260 SO WE'RE USING A.I.
IN 1654 00:58:50,326 --> 00:58:51,794 BIOMEDICAL RESEARCH, AUTOMATED 1655 00:58:51,861 --> 00:58:52,896 EXPERIMENTS, SYNTHETIC DATA 1656 00:58:52,962 --> 00:58:56,833 GENERATION, ET CETERA, IN 1657 00:58:56,900 --> 00:58:58,468 TRANSLATIONAL RESEARCH 1658 00:58:58,534 --> 00:59:00,103 BIOMEDICAL DISCOVERY, DRUG 1659 00:59:00,169 --> 00:59:01,204 TARGETING, PRIORITIZATION, AND 1660 00:59:01,271 --> 00:59:02,739 THEN ALSO THIS ALL LEADS TO 1661 00:59:02,805 --> 00:59:04,607 MEDICAL PRACTICE. 1662 00:59:04,674 --> 00:59:08,544 HOW WE USE THESE RESEARCH AND 1663 00:59:08,611 --> 00:59:11,281 TRANSLATIONAL DATA TO REALLY 1664 00:59:11,347 --> 00:59:12,515 IMPROVE DIAGNOSIS, TREATMENT 1665 00:59:12,582 --> 00:59:15,685 SELECTION, ET CETERA. 1666 00:59:15,752 --> 00:59:17,520 SO, WHERE DOES HEALTH CARE -- 1667 00:59:17,587 --> 00:59:19,055 A.I. AND HEALTH CARE GO WRONG? 1668 00:59:19,122 --> 00:59:20,056 THERE'S SO MANY EXAMPLES. 1669 00:59:20,123 --> 00:59:22,358 I DON'T WANT TO GIVE YOU A 1670 00:59:22,425 --> 00:59:24,227 SYSTEMATIC REVIEW OF ALL OF 1671 00:59:24,294 --> 00:59:25,528 THESE DIFFERENT ASPECTS BUT 1672 00:59:25,595 --> 00:59:27,664 THERE'S SO MANY AREAS AND 1673 00:59:27,730 --> 00:59:31,234 CONCRETE EXAMPLES WHERE A.I. HAS 1674 00:59:31,301 --> 00:59:35,171 ACTUALLY BEEN IMPLEMENTED AND 1675 00:59:35,238 --> 00:59:36,940 CAUSED HARM. 1676 00:59:37,006 --> 00:59:38,441 THERE'S BIAS IN ALGORITHMIC 1677 00:59:38,508 --> 00:59:41,044 DECISION MAKING, A PAPER FROM 1678 00:59:41,110 --> 00:59:42,612 2019 WHERE THEY SHOWED AN 1679 00:59:42,679 --> 00:59:47,016 ALGORITHM DEPLOYED IN A 1680 00:59:47,083 --> 00:59:49,085 HEALTHCARE SETTING, ACTUALLY THE 1681 00:59:49,152 --> 00:59:51,487 OUTCOME WAS USED, THERE WAS A 1682 00:59:51,554 --> 00:59:54,791 PROXY, SO THE ALGORITHM WAS 1683 00:59:54,857 --> 00:59:56,826 DEVELOPED TO PREDICT COST, THEY 1684 00:59:56,893 --> 00:59:58,594 IMPLEMENTED IT TO PREDICT NEED. 1685 00:59:58,661 --> 01:00:00,563 SO THEY USED COST AS A PROXY FOR 1686 01:00:00,630 --> 01:00:02,966 NEED WHILE WE KNOW THAT WAS 1687 01:00:03,032 --> 01:00:03,733 BIASED BECAUSE DIFFERENT 1688 01:00:03,800 --> 01:00:05,601 INDIVIDUALS IN A HEALTHCARE 1689 01:00:05,668 --> 01:00:06,903 SETTING HAVE LESS ACCESS TO 1690 01:00:06,970 --> 01:00:08,671 CARE, SPEND LESS MONEY ON CARE, 1691 01:00:08,738 --> 01:00:11,240 SO THAT ENDED UP THE RESULT OF 1692 01:00:11,307 --> 01:00:12,608 THAT WAS THAT BLACK AMERICANS, 1693 01:00:12,675 --> 01:00:15,244 FOR EXAMPLE, HAD TO BE TWICE 1694 01:00:15,311 --> 01:00:17,547 AS -- MUCH MORE SICKER THAN 1695 01:00:17,613 --> 01:00:18,748 THEIR COUNTERPART AS 1696 01:00:18,815 --> 01:00:21,050 NON-HISPANIC WHITE TO GET THE 1697 01:00:21,117 --> 01:00:23,553 SAME AMOUNT OF RESOURCES. 1698 01:00:23,619 --> 01:00:25,088 WHERE ELSE HEALTH CARE GOES -- 1699 01:00:25,154 --> 01:00:27,924 OTHER PLACES WHETHER WE SEE 1700 01:00:27,991 --> 01:00:29,692 PROBLEMS IS IN PRIVACY BREACH, 1701 01:00:29,759 --> 01:00:31,127 DATA MISUSE. 1702 01:00:31,194 --> 01:00:33,429 SO THERE'S -- I'M IN SILICON 1703 01:00:33,496 --> 01:00:35,064 VALLEY, THERE'S A HUGE COMPONENT 1704 01:00:35,131 --> 01:00:39,302 OF HEALTH CARE COMPANIES, TECH 1705 01:00:39,369 --> 01:00:40,870 COMPANYING EATING UP AND REUSING 1706 01:00:40,937 --> 01:00:41,304 DATA. 1707 01:00:41,371 --> 01:00:45,008 WE SEE PROBLEMS WITH DATA 1708 01:00:45,074 --> 01:00:45,842 BREACHES. 
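To make the cost-versus-need failure described a moment ago concrete, here is a minimal simulation with invented numbers: two groups have the same distribution of true clinical need, but one faces access barriers and spends less, so selecting patients for extra care by (predicted) cost under-selects that group.

import numpy as np

rng = np.random.default_rng(7)
n = 10_000

# Two groups of equal size with identical distributions of true clinical need.
group = rng.integers(0, 2, size=n)            # 0 = group A, 1 = group B
need = rng.gamma(shape=2.0, scale=1.0, size=n)

# Observed cost tracks need, but group B has less access to care and spends
# roughly 40% less for the same need (illustrative factor).
access = np.where(group == 1, 0.6, 1.0)
cost = need * access * rng.lognormal(mean=0.0, sigma=0.2, size=n)

# The program is offered to the top 10% by cost -- the proxy the algorithm
# was actually trained to predict.
selected = cost >= np.quantile(cost, 0.90)

for g, name in [(0, "group A"), (1, "group B")]:
    mask = group == g
    print(f"{name}: mean need {need[mask].mean():.2f}, "
          f"share selected {selected[mask].mean():.1%}")
# Equal need, unequal selection: members of group B have to be considerably
# sicker before they cross the cost threshold.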
1709 01:00:45,908 --> 01:00:46,976 UNAUTHORIZED DATA SHARING, AND 1710 01:00:47,043 --> 01:00:49,612 UNDERSTANDING HOW THIS CAN HARM 1711 01:00:49,679 --> 01:00:51,347 POPULATIONS AND IN PATIENTS AND 1712 01:00:51,414 --> 01:00:52,882 PEOPLE AND TRANSPARENCY HOW WE 1713 01:00:52,949 --> 01:00:55,518 THINK ABOUT THESE PRIVACY 1714 01:00:55,585 --> 01:00:57,687 BREACHES IS REALLY IMPORTANT. 1715 01:00:57,754 --> 01:01:05,128 WE HAVE FAULTY OR INACCURATE 1716 01:01:05,194 --> 01:01:09,432 MEDICAL DEVICES, WE HEARD ABOUT 1717 01:01:09,499 --> 01:01:10,933 PULSE OXIMETER IN COVID, 1718 01:01:11,000 --> 01:01:12,335 PIGMENTATION, WORKING BETTER FOR 1719 01:01:12,402 --> 01:01:14,003 FAIR-SKINNED PATIENTS AND SO 1720 01:01:14,070 --> 01:01:15,204 THERE'S A HUGE BIAS AND 1721 01:01:15,271 --> 01:01:16,973 DISCREPANCY THAT HAPPENED THAT 1722 01:01:17,040 --> 01:01:19,609 I'LL TALK MORE ABOUT. 1723 01:01:19,675 --> 01:01:20,676 THERE'S A RELIANCE ON TECHNOLOGY 1724 01:01:20,743 --> 01:01:23,079 SO WE'RE SEEING MORE AND MORE 1725 01:01:23,146 --> 01:01:26,916 EVIDENCE COMING OUT WHERE 1726 01:01:26,983 --> 01:01:28,317 CLINICIANS WITH BLINDLY USING 1727 01:01:28,384 --> 01:01:30,319 THE OUTPUTS OF A.I. ALGORITHM 1728 01:01:30,386 --> 01:01:35,425 EVEN WHEN ALGORITHMS GIVE THEM 1729 01:01:35,491 --> 01:01:36,526 FALSE INFORMATION THEY ARE 1730 01:01:36,592 --> 01:01:37,760 RELYING ON THIS INFORMATION, 1731 01:01:37,827 --> 01:01:39,729 REALLY IMPORTANT TO THINK ABOUT 1732 01:01:39,796 --> 01:01:40,463 OVERRELIANCE OF TECHNOLOGY AND 1733 01:01:40,530 --> 01:01:50,940 HOW THINGS CAN GO WRONG. 1734 01:01:59,015 --> 01:02:01,050 FIRST AN A.I. ALGORITHM, TRIAGE 1735 01:02:01,117 --> 01:02:03,119 TOOL IN E.D. TO DETECT PATIENTS 1736 01:02:03,186 --> 01:02:04,754 WITH HEART ATTACK SYMPTOMS AND 1737 01:02:04,821 --> 01:02:07,590 TRIAGE THEM TO DIFFERENT TIMES 1738 01:02:07,657 --> 01:02:08,825 OF CARE. 1739 01:02:08,891 --> 01:02:13,096 WE KNOW FROM MEDICAL LITERATURE 1740 01:02:13,162 --> 01:02:14,530 AND CLINICAL TRIALS MAJORITY OF 1741 01:02:14,597 --> 01:02:16,165 PATIENTS IN THE POPULATIONS ARE 1742 01:02:16,232 --> 01:02:24,507 MALE PATIENTS, FROM HISTORICALLY 1743 01:02:24,574 --> 01:02:26,142 LOOKING BACK IN CLINICAL, USING 1744 01:02:26,209 --> 01:02:28,778 DATA TO TRAIN THE ALGORITHM, 1745 01:02:28,845 --> 01:02:31,814 REALLY GOOD AT DETECTING HEART 1746 01:02:31,881 --> 01:02:33,382 ATTACK SYMPTOMS FOR MEN, LESS 1747 01:02:33,449 --> 01:02:34,584 ACCURATE FOR WOMEN, WOMEN HAVE 1748 01:02:34,650 --> 01:02:35,384 DIFFERENT SYMPTOMS. 1749 01:02:35,451 --> 01:02:37,120 THIS WAS DEPLOYED IN THE E.D. 1750 01:02:37,186 --> 01:02:40,289 SYSTEM AND THERE WAS THIS 1751 01:02:40,356 --> 01:02:43,126 SYSTEMATIC BIAS IN THE ABILITY 1752 01:02:43,192 --> 01:02:44,694 TO DETECT FEMALES WHO HAD HEART 1753 01:02:44,760 --> 01:02:55,238 ATTACK SYMPTOMS COMPARED TO 1754 01:02:57,473 --> 01:03:00,209 MALES. 1755 01:03:00,276 --> 01:03:02,044 DERMATOLOGY DIFFERENCES, 1756 01:03:02,111 --> 01:03:03,579 ACCURACY, DEVICES TO DETECT 1757 01:03:03,646 --> 01:03:03,846 MELANOMA. 1758 01:03:03,913 --> 01:03:08,017 A LOT OF THE A.I. TECHNOLOGY ON 1759 01:03:08,084 --> 01:03:13,756 SKIN LESIONS HAS BEEN TRAINED ON 1760 01:03:13,823 --> 01:03:14,557 FAIR SKINNED PATIENTS. 
1761 01:03:14,624 --> 01:03:17,493 HEALTH CARE OR ANY TYPE OF 1762 01:03:17,560 --> 01:03:19,128 SETTING, THE ABILITY TO DETECT 1763 01:03:19,195 --> 01:03:21,430 MELANOMA ON A FAIR SKINNED 1764 01:03:21,497 --> 01:03:22,732 PATIENT IS MUCH MORE HIGHER THAN 1765 01:03:22,798 --> 01:03:27,103 THE ABILITY FOR THESE TOOLS TO 1766 01:03:27,170 --> 01:03:37,713 DETECT MELANOMA ON A PATIENT 1767 01:03:37,780 --> 01:03:40,449 WITH DARKER SKIN, OF THE 1768 01:03:40,516 --> 01:03:42,318 CLINICIANS ARE RELYING MORE ON 1769 01:03:42,385 --> 01:03:43,419 ALGORITHMS SO UNDERSTANDING WHO 1770 01:03:43,486 --> 01:03:45,821 THEY ARE TRAINED ON, HOW THESE 1771 01:03:45,888 --> 01:03:55,031 ALGORITHMS WORK ACROSS DIFFERENT 1772 01:03:55,097 --> 01:03:55,898 POPULATIONS IS KEY. 1773 01:03:55,965 --> 01:03:57,667 COVID-19 WE SAW NEED FOR A.I. TO 1774 01:03:57,733 --> 01:03:59,302 IMPROVE DECISION MAKING AND HELP 1775 01:03:59,368 --> 01:04:00,903 US UNDERSTAND THE UNKNOWN BUT 1776 01:04:00,970 --> 01:04:04,874 THERE WERE A LOT OF FAILURES FOR 1777 01:04:04,941 --> 01:04:07,810 A.I. IN COVID THAT HARMED 1778 01:04:07,877 --> 01:04:11,781 PATIENT TRUST, PROVIDER TRUST, 1779 01:04:11,847 --> 01:04:14,050 AND THE WHOLE AREA FOR THIS. 1780 01:04:14,116 --> 01:04:15,885 AND SO WHAT HAPPENED WAS A LOT 1781 01:04:15,952 --> 01:04:17,687 OF THESE ALGORITHMS IN THE NEED 1782 01:04:17,753 --> 01:04:20,389 TO GET SOMETHING OUT VERY 1783 01:04:20,456 --> 01:04:25,328 RAPIDLY WERE TRAINED ON SMALL 1784 01:04:25,394 --> 01:04:26,429 DATASETS, VERY 1785 01:04:26,495 --> 01:04:28,064 UNDERREPRESENTATIVE DATA SETS, 1786 01:04:28,130 --> 01:04:28,864 RAPIDLY DEPLOYED INTO HEALTH 1787 01:04:28,931 --> 01:04:30,600 CARE SYSTEMS, SO YOU SEE THINGS 1788 01:04:30,666 --> 01:04:32,535 LIKE THIS FROM THE "NEW YORK 1789 01:04:32,602 --> 01:04:36,172 TIMES" WHERE IS A.I. WORSENING 1790 01:04:36,239 --> 01:04:37,006 COVID-19 ON BLACK AMERICANS? 1791 01:04:37,073 --> 01:04:38,908 THESE COME OUT IN THE POPULATION 1792 01:04:38,975 --> 01:04:41,844 THAT BREAK DOWN THE TRUST 1793 01:04:41,911 --> 01:04:42,078 BARRIER. 1794 01:04:42,144 --> 01:04:43,946 MORE RECENTLY, WE'VE DONE SOME 1795 01:04:44,013 --> 01:04:45,748 WORK LOOKING AT RACE-BASED 1796 01:04:45,815 --> 01:04:48,317 MEDICINE, A LOT OF PEOPLE HERE 1797 01:04:48,384 --> 01:04:51,354 TODAY, WHERE WE WERE LOOKING AT 1798 01:04:51,420 --> 01:04:54,323 HOW OFTEN -- HOW IS RACE USED IN 1799 01:04:54,390 --> 01:04:56,726 THESE ALGORITHMS, AND WHAT IS IT 1800 01:04:56,792 --> 01:04:57,827 A PROXY FOR? 1801 01:04:57,893 --> 01:05:00,229 AND SO THIS WORK IS -- 1802 01:05:00,296 --> 01:05:01,797 RACE-BASED MEDICINE IS MEDICAL 1803 01:05:01,864 --> 01:05:02,865 PRACTICE THAT'S GUIDED BY 1804 01:05:02,932 --> 01:05:04,600 ALGORITHMS THAT INCLUDE RACE OR 1805 01:05:04,667 --> 01:05:05,334 ETHNICITY. 1806 01:05:05,401 --> 01:05:07,903 BUT IT'S IMPORTANT TO NOTE RACE 1807 01:05:07,970 --> 01:05:09,772 IS MORE THAN LIKELY -- MORE 1808 01:05:09,839 --> 01:05:15,244 OFTEN THAN NOT A PROXY FOR OTHER 1809 01:05:15,311 --> 01:05:16,579 TYPES OF VARIABLES THAT 1810 01:05:16,646 --> 01:05:17,546 CONTRIBUTE TO DISPARITIES AND 1811 01:05:17,613 --> 01:05:19,181 OUTCOME OF DISEASE. 1812 01:05:19,248 --> 01:05:21,284 RACE FAILS TO ACCOUNT FOR OTHER 1813 01:05:21,350 --> 01:05:23,919 FACTORS ASSOCIATED WITH HEALTH 1814 01:05:23,986 --> 01:05:26,522 OUTCOMES, AND REINFORCES BIAS. 
1815 01:05:26,589 --> 01:05:28,090 SO, IT'S REALLY IMPORTANT TO 1816 01:05:28,157 --> 01:05:30,860 UNDERSTAND HOW THIS HAS BEEN 1817 01:05:30,926 --> 01:05:31,794 USED HISTORICALLY IN SOME OF THE 1818 01:05:31,861 --> 01:05:34,530 ALGORITHMS THAT HAVE BEEN PUT IN 1819 01:05:34,597 --> 01:05:36,365 PLACE AND HOW THIS TRANSPARENCY 1820 01:05:36,432 --> 01:05:38,167 COULD REALLY IMPROVE THE 1821 01:05:38,234 --> 01:05:38,601 SITUATION. 1822 01:05:38,668 --> 01:05:41,737 LET ME GIVE TWO EXAMPLES. 1823 01:05:41,804 --> 01:05:43,372 SO, KIDNEY DISEASE AND EGFR, 1824 01:05:43,439 --> 01:05:45,341 THIS IS A VERY WELL-KNOWN 1825 01:05:45,408 --> 01:05:46,976 EXAMPLE IN THE LITERATURE. 1826 01:05:47,043 --> 01:05:49,812 SO EGFR IS USED TO DIAGNOSE AND 1827 01:05:49,879 --> 01:05:50,813 MONITOR KIDNEY DISEASE. 1828 01:05:50,880 --> 01:05:54,517 AND PREVIOUSLY IT WAS ADJUSTED 1829 01:05:54,583 --> 01:05:58,421 FOR RACE, BASED ON THE 1830 01:05:58,487 --> 01:06:00,189 ASSUMPTION THAT BLACK PATIENTS HAD 1831 01:06:00,256 --> 01:06:01,424 INHERENTLY HIGHER LEVELS OF 1832 01:06:01,490 --> 01:06:01,724 CREATININE. 1833 01:06:01,791 --> 01:06:02,958 I DRILLED INTO THIS AND WE'VE 1834 01:06:03,025 --> 01:06:04,593 DRILLED INTO THIS AND IT GOES 1835 01:06:04,660 --> 01:06:07,296 BACK TO A STUDY OVER 200 YEARS 1836 01:06:07,363 --> 01:06:09,832 AGO THAT HAD I THINK MAYBE 30 1837 01:06:09,899 --> 01:06:12,802 PATIENTS IN THE STUDY, AND THIS 1838 01:06:12,868 --> 01:06:13,569 ASSUMPTION CARRIED FORWARD 1839 01:06:13,636 --> 01:06:14,670 THROUGH YEARS AND YEARS. 1840 01:06:14,737 --> 01:06:17,139 WHAT HAPPENED WAS THIS RESULTED 1841 01:06:17,206 --> 01:06:19,041 IN BLACK PATIENTS BEING 1842 01:06:19,108 --> 01:06:20,543 DIAGNOSED LATER AND WITH MORE SEVERE 1843 01:06:20,609 --> 01:06:21,844 DISEASE, WHICH THEN REDUCED THEIR 1844 01:06:21,911 --> 01:06:23,979 CHANCES OF BEING PLACED ON 1845 01:06:24,046 --> 01:06:24,714 KIDNEY TRANSPLANT LISTS. 1846 01:06:24,780 --> 01:06:27,116 SO THIS IS A CLEAR EXAMPLE OF 1847 01:06:27,183 --> 01:06:30,519 HOW THIS A.I. ALGORITHM HAS 1848 01:06:30,586 --> 01:06:31,854 HARMED ENTIRE POPULATIONS. 1849 01:06:31,921 --> 01:06:38,627 ANOTHER ONE IS VAGINAL BIRTH 1850 01:06:38,694 --> 01:06:40,129 AFTER CESAREAN (VBAC) CALCULATORS, 1851 01:06:40,196 --> 01:06:43,132 RISK OF ADVERSE OUTCOME FOR 1852 01:06:43,199 --> 01:06:44,967 VAGINAL BIRTH FOLLOWING CESAREAN 1853 01:06:45,034 --> 01:06:45,735 BIRTH. 1854 01:06:45,801 --> 01:06:46,736 AGAIN, THIS CALCULATOR WAS 1855 01:06:46,802 --> 01:06:47,937 ADJUSTED FOR BLACK PATIENTS 1856 01:06:48,003 --> 01:06:50,106 BECAUSE IT WAS ASSUMED BLACK 1857 01:06:50,172 --> 01:06:53,175 PATIENTS HAD A HIGHER RISK OF 1858 01:06:53,242 --> 01:06:54,176 ADVERSE OUTCOME FOLLOWING 1859 01:06:54,243 --> 01:06:54,944 CESAREAN BIRTH. 1860 01:06:55,010 --> 01:06:58,114 SO WHAT HAPPENED WITH THIS WAS IT 1861 01:06:58,180 --> 01:07:00,282 OVERESTIMATED RISK OF THESE 1862 01:07:00,349 --> 01:07:01,484 ADVERSE EVENTS IN BLACK 1863 01:07:01,550 --> 01:07:03,586 PATIENTS, WHICH LED TO 1864 01:07:03,652 --> 01:07:05,755 DIFFERENTIAL RECOMMENDATIONS AND 1865 01:07:05,821 --> 01:07:06,856 LIMITED TREATMENT OPTIONS. 1866 01:07:06,922 --> 01:07:08,524 SO IT'S IMPORTANT TO REMEMBER 1867 01:07:08,591 --> 01:07:10,459 WHEN WE'RE USING THESE TYPES OF 1868 01:07:10,526 --> 01:07:14,430 VARIABLES IN A DATASET THAT RACE 1869 01:07:14,497 --> 01:07:16,766 IS A SOCIAL CONSTRUCT, WITH MORE 1870 01:07:16,832 --> 01:07:18,567 VARIATION WITHIN RACE THAN BETWEEN RACES.
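The mechanism behind the eGFR example can be shown in a few lines. This sketch uses the older MDRD study equation as it was widely published, including its race coefficient of roughly 1.21; the patient values are invented, and the point is only that the multiplier raises the reported eGFR for Black patients, which can keep them above the thresholds used for staging and transplant referral.

def egfr_mdrd(creatinine_mg_dl, age, female, black, race_adjusted=True):
    # Older MDRD study equation (mL/min/1.73 m^2), shown for illustration only.
    egfr = 175.0 * creatinine_mg_dl ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742
    if black and race_adjusted:
        egfr *= 1.212   # the race coefficient later removed from practice
    return egfr

# Same lab value and age, reported with and without the race adjustment.
creatinine, age = 1.7, 55
with_adj = egfr_mdrd(creatinine, age, female=False, black=True)
without_adj = egfr_mdrd(creatinine, age, female=False, black=True,
                        race_adjusted=False)

print(f"with race coefficient:    eGFR ~ {with_adj:.0f}")
print(f"without race coefficient: eGFR ~ {without_adj:.0f}")
# The inflated estimate can delay a chronic kidney disease diagnosis and
# push back referral for transplant listing.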
WHEN WE TALK ABOUT TRANSPARENCY IN DATA, IT'S IMPORTANT TO REMEMBER WHICH VARIABLES ARE PROXIES, WHICH ARE CAUSALLY RELATED TO THE OUTCOME, AND HOW WE'RE TRANSPARENT ABOUT THAT.
AND SO BECAUSE OF THESE ISSUES, I THINK THERE ARE REAL ETHICAL AND SOCIETAL IMPLICATIONS OF A.I. IN HEALTH CARE. ONE KEY POINT, I THINK, IS THAT WE'RE REALLY BREAKING DOWN THE PUBLIC TRUST AND PERCEPTION OF A.I. AND HOW IT CAN BE USED. THERE ARE SO MANY OPPORTUNITIES FOR A.I. TO BENEFIT THE HEALTH CARE SETTING, BUT WE'VE BROKEN DOWN THE TRUST. THERE ARE GROWING CONCERNS ABOUT DATA PRIVACY AND SECURITY AND A LACK OF STANDARDS TO COMMUNICATE HOW DATA ARE USED AND REUSED, RIGHT? SO WHEN A PATIENT COMES IN, HOW DO YOU COMMUNICATE THAT I'M GIVING YOU THIS DIAGNOSIS BASED ON AN A.I. ALGORITHM, OR THAT I'M GOING TO TAKE YOUR DATA AND SELL IT TO THE NEXT COMPANY TO MAKE A PROFIT FOR IMPROVING A TOOL? THERE'S A LACK OF GUIDELINES FOR HOW WE COMMUNICATE THAT.
THE REGULATORY AND LEGAL LANDSCAPE IS EVOLVING RIGHT NOW TO ADDRESS THESE NOVEL ISSUES, BUT THERE ARE LIMITED POLICIES TO PROTECT INDIVIDUALS AND TO ENSURE A.I. ALGORITHMS ARE EQUITABLE ACROSS DIFFERENT POPULATIONS.
IN ADDITION, THERE ARE SOCIAL, CULTURAL, AND ETHICAL NORMS WE NEED TO THINK ABOUT. WE MENTIONED SOCIAL DETERMINANTS OF HEALTH AND HOW WE INCORPORATE THOSE INTO OUR ALGORITHMS. WHERE IS THAT DATA COMING FROM? WHO IS IT COLLECTED FROM? HOW ARE WE CAPTURING THAT INFORMATION? IT'S REALLY IMPORTANT TO THINK ABOUT HOW WE CAN BE MORE PROACTIVE IN DEVELOPING TECHNOLOGIES THAT CAN SYSTEMATICALLY CAPTURE THIS INFORMATION.
THERE'S A BALANCE BETWEEN TECHNOLOGY AND INNOVATION THAT WE NEED TO THINK ABOUT. WE ALWAYS THINK ABOUT HAVING THE TECHNOLOGY WITH THE HIGHEST PERFORMANCE AND BEST ACCURACY, AND WE OFTEN DON'T THINK ABOUT THE PARITY OF THE ALGORITHM: HOW DOES THE ALGORITHM PERFORM ACROSS DIFFERENT POPULATIONS? OFTENTIMES WE TAKE A HIT IN ACCURACY FOR AN ALGORITHM WHEN WE MAKE IT MORE EQUITABLE ACROSS POPULATIONS, SO THAT'S SOMETHING ELSE WE NEED TO THINK ABOUT.
WE ALSO NEED TO THINK ABOUT INEQUITIES IN HEALTH CARE OUTCOMES AND THE DIGITAL DIVIDE. WE'RE DEVELOPING THESE ALGORITHMS, BUT WE DON'T PUT A LOT OF EMPHASIS ON HOW WE CAN GET THEM TO COMMUNITY HEALTH CARE SETTINGS, HOW WE CAN GET THEM TO RURAL HEALTH, AND WHAT THE TRANSPARENCY IS AROUND HOW THESE TECHNOLOGIES MIGHT BE INTEGRATED, PERFORMED, AND DEPLOYED ACROSS DIFFERENT COMMUNITIES.
SO HOW DO WE DEVELOP FAIR AND RELIABLE ALGORITHMS? THIS IS WORK WE'VE DONE WITH MADALENA, WHO IS HERE; WE'VE TALKED ABOUT THE LIFE CYCLE, HOW IT'S NOT JUST ABOUT THE DATA, NOT JUST ABOUT THE MODEL, IT'S ABOUT THE WHOLE LIFE CYCLE. YOU HAVE TO THINK ABOUT HOW THE DATA ARE CREATED AND HOW THE DATA ARE ACQUIRED ONCE THEY'RE CREATED. MODEL DEVELOPMENT IS A KEY ASPECT. MODEL EVALUATION IS SOMETHING WE JUST DON'T HAVE ENOUGH TRANSPARENCY ON. AND THEN MODEL DEPLOYMENT GOES BACK INTO CREATING MORE DATA, SO I WANT TO TALK ABOUT THESE STAGES IN RELATION TO TRANSPARENCY. OKAY.
SO, FIRST, FOR DATA CREATION WE NEED TO KNOW HOW THE DATA ARE PRODUCED. IS IT AN ACADEMIC MEDICAL CENTER? WHO HAS ACCESS TO THAT ACADEMIC MEDICAL CENTER, AND, MORE IMPORTANTLY, WHO DOESN'T? WHAT'S THE DATA QUALITY AND DATA DIVERSITY?
FOR ACQUISITION, WE HAVE TO GET AN IRB TO ACCESS DATA, SO THERE ARE CONSTRAINTS, AND THERE ARE REGULATORY AND LEGAL MANDATES FOR USING DATA. WHAT ARE THOSE? HOW DO THEY CHANGE THE DATA YOU ACTUALLY GET TO USE IN YOUR MODEL? OFTENTIMES FOR IRBs WE HAVE TO AGGREGATE GROUPS BECAUSE THE NUMBERS ARE TOO SMALL. THAT'S IMPORTANT TO UNDERSTAND WHEN WE MOVE FORWARD, BECAUSE HOW DO WE SEPARATE THOSE OUT TO ENSURE THE ALGORITHM WORKS ACROSS THESE SMALLER GROUPS? JULIA MENTIONED GROUND TRUTH: WHO IS DEVELOPING THE GROUND TRUTH? IS IT MEDICAL STUDENTS ANNOTATING RECORDS? IS IT CLINICIANS? DIVERSITY IN DEVELOPING GROUND TRUTH IS REALLY IMPORTANT IN UNDERSTANDING THAT.
FOR MODEL EVALUATION, I THINK THERE'S IMPORTANT INFORMATION NEEDED TO KNOW WHAT TYPES OF EVALUATIONS WERE PERFORMED AND WHAT THE PERFORMANCES WERE. THAT SHOULDN'T BE STUCK IN A MANUSCRIPT BUT HAS TO BE TRANSPARENTLY REPORTED.
AND THEN DEPLOYMENT: THINKING ABOUT TECHNICAL BARRIERS TO DEPLOYING THIS, WHAT ARE THEY? COMING FROM A HOSPITAL, WHAT ARE THE FINANCIAL CONSTRAINTS, FINANCIAL MOTIVES, RETURN ON INVESTMENT? ALL OF THESE ARE REALLY IMPORTANT ACROSS THE LIFE CYCLE OF A.I., AND SO I'M GOING TO SKIP THAT GIVEN TIME.
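One way to read the evaluation-transparency point above is that subgroup performance should travel with the model in a structured, machine-readable form rather than living only in a manuscript. The sketch below is a hypothetical illustration, not an NIH or regulatory schema; the field names, metric choices, and toy records are invented for the example.

# Minimal sketch: compute and emit per-subgroup performance for a binary classifier.
import json
from collections import defaultdict

def subgroup_report(records, model_name):
    """records: iterable of (subgroup, y_true, y_pred) with 0/1 labels."""
    tallies = defaultdict(lambda: {"n": 0, "correct": 0, "positives": 0, "true_positives": 0})
    for group, y_true, y_pred in records:
        t = tallies[group]
        t["n"] += 1
        t["correct"] += int(y_true == y_pred)
        t["positives"] += int(y_true == 1)
        t["true_positives"] += int(y_true == 1 and y_pred == 1)
    report = {"model": model_name, "subgroups": {}}
    for group, t in tallies.items():
        report["subgroups"][group] = {
            "n": t["n"],
            "accuracy": t["correct"] / t["n"],
            "sensitivity": (t["true_positives"] / t["positives"]) if t["positives"] else None,
        }
    return report

# Toy records, invented for the example.
records = [("group_a", 1, 1), ("group_a", 0, 0), ("group_b", 1, 0), ("group_b", 0, 0)]
print(json.dumps(subgroup_report(records, "melanoma-triage-v0"), indent=2))

A report like this, published alongside the model, lets a deploying site see immediately that sensitivity in "group_b" is far below "group_a" before the tool ever reaches patients.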
WE'VE TAKEN THIS: THIS IS THE PAPER WE JUST GOT OUT WITH MARSHALL, WHO IS HERE, AND OTHERS, ABOUT HOW WE BUILD GUIDING PRINCIPLES ONTO THIS LIFE CYCLE, THINKING ABOUT TRANSPARENCY. ONE OF THE OVERARCHING VARIABLES IS TRANSPARENCY; IT HAS TO BE AT EVERY STAGE. THERE'S A LOT OF NEED TO GET THESE GUIDING PRINCIPLES INTO CONCRETE RULES, OR I GUESS CONCRETE GUIDELINES, TO MOVE FORWARD WITH DEVELOPING A MORE DIVERSE AND EQUITABLE A.I. HEALTH CARE SYSTEM.
SO I'M GOING TO FINISH BY TALKING ABOUT TRANSPARENCY IN THE LIFE CYCLE. I'M GOING TO GO TO THIS SLIDE, AND SORRY I'M NOT ARTISTIC, YOU HAVE TO HELP ME WITH THIS, BUT IT'S REALLY THIS CYCLE WHERE WE HAVE LIMITED TRAINING DATA FOR MODELS THAT WORK ON ALL POPULATIONS, AND WE'RE NOT SURE WHERE WE CAN AND CANNOT APPLY THESE MODELS. THIS LEADS TO DISPARITIES IN OUR OUTCOMES AND TO TREATMENTS THAT ARE INEQUITABLE ACROSS POPULATIONS; THE MODELS CAN BE PREJUDICED, INACCURATE, AND NOT PERFORM WELL ACROSS MINORITY GROUPS, WHICH BREAKS DOWN TRUST AND BREAKS DOWN THE CONFIDENCE IN OUR A.I. SYSTEMS. THAT BRINGS US BACK TO THE FACT THAT, BECAUSE WE HAVE BROKEN THIS TRUST, PEOPLE DON'T WANT TO PARTICIPATE IN OUR RESEARCH. THEY DON'T WANT TO PARTICIPATE IN CLINICAL TRIALS. SO WE BREAK DOWN THIS ENGAGEMENT, WHICH THEN GOES BACK TO THE FACT THAT WE DON'T HAVE DATA TO REPRESENT ALL THESE POPULATIONS AND WE DON'T HAVE GOOD TECHNOLOGIES TO GET THEM MORE INVOLVED.
SO, WITH THAT, THANK YOU SO MUCH.
I HOPE I KIND OF HELPED PUT THINGS INTO SOME CONCRETE EXAMPLES OF WHERE THINGS HAVE GONE WRONG AND WHERE WE NEED TO THINK ABOUT MOVING THE TRANSPARENCY TO. THANK YOU.
[APPLAUSE]
>> THANKS TO ALL OUR SPEAKERS. WE'RE ABOUT TO HEAD INTO A BREAK. LET ME SKIP THROUGH THESE. HERE IS WHERE WE ARE. SO YOU CAN TAKE 15 MINUTES, AND YOU ARE RESPONSIBLE FOR GETTING YOURSELF INTO YOUR BREAKOUT ROOM. LET ME JUST TELL YOU WHAT THOSE ARE AGAIN; THEY ARE NOT CORRECT IN WHAT WAS WRITTEN DOWN. FOUNDATION MODEL THEMES IN ROOM 260-C. GENERAL REUSE IN 260-D, FOR DAVID. MULTI-MODAL IN 260-E, FOR EDWARD. PROXY, YOU'RE IN THE ROOM YOU WERE ASSIGNED, 150-A. SYNTHETIC DATA IN 150-B.
>> [OFF MICROPHONE]
>> 260-C, AS IN CAT, MY APOLOGIES. AND YOU HAVE A 15-MINUTE BREAK NOW, AND YOU'RE RESPONSIBLE FOR GETTING INTO YOUR BREAKOUT ROOMS BY 2:30. BREAKOUT LEADS WILL START YOU ON THE JOURNEY. IF YOU'RE DIALING IN REMOTELY, THEN THE NEXT OPPORTUNITY TO COME BACK WITH US WILL BE A LITTLE AFTER 4:00 FOR THE PLENARY READOUT AND DISCUSSION. THERE ARE ELEVATORS. THANK YOU.
>> WE'RE RIGHT ON TIME. I HOPE YOU WERE ABLE TO GET COFFEE AND ENJOYED THE BREAKOUT SESSIONS. I WAS ABLE TO BE A FLY ON THE WALL, AND IT WAS AN INTERESTING DISCUSSION. BEFORE I HAND OVER TO AARON AND THE DEBRIEFS FROM EACH OF THOSE, I WANTED TO ANSWER A COUPLE OF QUESTIONS THAT I HEARD COME UP.
ONE IN PARTICULAR WAS AROUND THE ENORMITY OF THE TASK THAT WE'VE JUST PUT IN FRONT OF YOU AND HOW IT IS IMPOSSIBLE TO SOLVE THIS IN THE 2 1/2 OR 3 DAYS WE HAVE TOGETHER. WE RECOGNIZE THAT. AND SO MAYBE THE BEST THING I CAN SAY IS JUST DON'T WORRY. YOU ARE HELPING US GET HOWEVER FAR WE GET IN THE NEXT DAY AND A HALF. WE RECOGNIZE THAT WHAT WE COME OUT WITH IS NOT GOING TO BE PRISTINE GUIDANCE THAT WE CAN HAND TO THE COMMUNITY. WE'LL WANT MORE DISCUSSION ABOUT IT; THAT'S EXPECTED. WE HAVE THAT IN OUR PLANS, SO JUST BE AT EASE WITH THAT AND KNOW YOU ARE STILL HELPING US AS WE START TO PUT THESE BONES TOGETHER. SO WITH THAT, I'M GOING TO HAND OVER TO AARON, WHO IS GOING TO LEAD US THROUGH THE DEBRIEFING SESSION AND HAVE SOME CLOSING REMARKS FOR US.
>> ALL RIGHT. GOOD TO SEE EVERYONE BACK. WE'RE GOING TO DO THIS BY ASKING ONE PERSON FROM EACH OF THE GROUPS TO COME UP AND GIVE A 5- TO 6-MINUTE SUMMARY, WITH A FEW MINUTES FOR QUESTIONS, AND WE'LL GO THROUGH EACH IN ORDER. FIRST UP IS THE SYNTHETIC DATA GROUP; HOPEFULLY YOU PICKED A PERSON TO COME UP AND SPEAK.
>> SO, AT THE RISK OF DOING THIS EXTEMPORANEOUSLY, I MIGHT MISS SOME THINGS. EVERYBODY WAS CONTRIBUTING; IT WAS A RICH DISCUSSION AROUND CHALLENGES RELATED TO SYNTHETIC DATA WITH RESPECT TO ETHICS AND TRANSPARENCY. WE HAD A NICE MIX OF PEOPLE IN THE ROOM. WE ACTUALLY STARTED WITH A PROVOCATIVE QUESTION: WAIT A SECOND, HOW DO WE DEFINE SYNTHETIC DATA? WE TALKED ABOUT PROVENANCE: CAN YOU TRACE BACK TO A COMPUTATIONAL SOURCE OR A REAL-WORLD SOURCE?
WE RAPIDLY STARTED TALKING ABOUT ISSUES OF TRUST AND ISSUES OF ACCESS, AND WE WERE ONE OF THE GROUPS TALKING FRANKLY ABOUT THE ENORMITY OF THE TASK. WE TALKED ABOUT FAST MRI; JULIA ORIENTED THAT AS OUR USE CASE, AND I'LL SPEAK TO THE STAKEHOLDERS IN A MINUTE, BUT USING THAT LENS WE STARTED TALKING ABOUT THE ENORMITY OF THE TASK: WHAT GUIDANCE, AND HOW DO WE MOVE FROM UNDERSTANDING THE STAKEHOLDERS AND THEIR NEEDS TO THINKING ABOUT WHAT THE CAPABILITIES AND GAPS ARE, USING A COUPLE OF EXAMPLES? WE RISK LEAVING THINGS OUT, BUT WE PUSHED FORWARD.
FOR FAST MRI, AT THE END A COUPLE OF PEOPLE WERE STILL ADDING STAKEHOLDERS BECAUSE THERE WERE MORE. WE BROUGHT IN THE IDEA OF STAKEHOLDERS; THERE'S AN IMPLICATION THAT STAKEHOLDERS MAY HAVE A STAKE, MIGHT BE A FINANCIAL STAKE, MAYBE ANOTHER STAKE, BUT AT THE END OF THE DAY WE HAD THE INDIVIDUAL UNDERGOING THE MRI, POTENTIALLY A PATIENT, THEIR SUPPORT NETWORK, THE CLINICAL WORLD AROUND THEM, THE RADIOLOGISTS AND ANCILLARY FOLKS USING THE TECHNOLOGY. WE MOVED INTO THE REGULATORY PIECE OF IT, THE INSURANCE AND PAYOR PIECE, AND WE GOT TO THE COMMUNITY, TO THE PUBLIC AND PERCEPTION. DEBRA SAID THE PERCEPTION PIECE IS CRITICAL BECAUSE IT TAKES ONE OR TWO OF THESE TO GO WRONG AND THEN THE KNEES ARE CUT OUT FROM UNDER THE REST OF THE TECHNOLOGY, WITH BROAD IMPLICATIONS.
WE STARTED GETTING INTO THE NEEDS THAT WILL FEED US INTO THE GAPS. AS FAR AS KEY THEMES, ACCURACY WAS CRITICAL; WE TALKED ABOUT ERROR, UNCERTAINTY, AND QUANTIFYING THEM.
ONE KEY THEME OF THE GROUP, WHICH WAS REALLY GREAT, IS THAT AT EVERY STEP OF THE WAY WE WERE THINKING IN THE BACK OF OUR MINDS, AND WE'LL MAKE THIS EXPLICIT NEXT SESSION: WHEN WE'RE WORRIED ABOUT A POTENTIAL GAP, HOW DO WE MEASURE AND QUANTIFY IT? WHETHER IT'S UNCERTAINTY OR FAIRNESS OR ACCURACY, HOW ARE WE GOING TO MEASURE IT? THOSE ARE THE HIGH-LEVEL THEMES. IS THERE ANYTHING I MISSED, FOLKS IN THE ROOM? KEEP ME HONEST. OKAY, THANK YOU.
>> BEFORE YOU LEAVE, ARE THERE ANY QUESTIONS FOR THAT GROUP?
>> I WAS THERE FOR THE VERY BEGINNING. YOU WERE STARTING WITH WHAT WAS MEANT BY THE TERM SYNTHETIC DATA; WHERE DID YOU LAND?
>> WE PAINTED SORT OF MULTI-FACTORIAL ATTRIBUTES OF SYNTHETIC DATA: IS IT DRAWN FROM A REAL DISTRIBUTION, AND IS IT IMPOSSIBLE, RIGHT? WE TALKED ABOUT THE FACT THAT REAL-WORLD DATA, INCLUDING ELECTRONIC HEALTH RECORDS, CONTAIN IMPOSSIBLE DATA; WE CAN FIND REAL DATA THAT EVIDENCE THINGS THAT DON'T EXIST IN PHYSICS OR HEALTH. WE FLESHED IT OUT; I DON'T THINK WE SOLVED IT. THE LINCHPIN IS THE PROVENANCE: CAN YOU TRACE IT BACK? REPRODUCIBILITY AND PROVENANCE, DOES IT REPRESENT SOMETHING REALISTIC, CAN YOU SHOW THIS COULD BE SOMETHING THAT COULD EXIST IN REALITY, AND FAIRNESS AND REPRESENTATIVENESS, BUT NOT DETERMINING IN THAT SENSE. PLEASE.
>> THERE'S A QUESTION FROM ONLINE AROUND THE RELATIONSHIP BETWEEN (INDISCERNIBLE) AND TRANSPARENCY. SINCE YOU JUST MENTIONED IT, HOW DO YOU THINK ABOUT THAT, ESPECIALLY WITH SYNTHETIC DATA (INAUDIBLE).
>> YEAH, ABSOLUTELY.
THE EXAMPLE WE UNPACKED WAS FAST MRI. IMAGINE THE RADIOLOGIST INTERPRETING IT AND THE PATIENT UNDERGOING A SCAN; SOME ELEMENTS WILL BE DERIVED, SOME DIRECTLY MEASURED, AND SOME GENERATED, SO HAVING THE ABILITY TO TRACE BACK WHICH PIXELS WERE GENERATED AND WHICH WERE MEASURED IS KEY. THE OTHER THING IS THAT THERE'S A ROLE FOR THIS TECHNOLOGY TO DO THE SCAN WHERE EVERYTHING LOOKS REALLY GOOD AND FALLS WITHIN NORMS AND THE PROBABILITY DISTRIBUTION, VERSUS THE ONE WHERE WE'RE NOT SURE, WHERE THE MEASUREMENT ISN'T SUFFICIENT AND NOW THAT PATIENT NEEDS TO GO FOR THE HIGH-TESLA MRI SCAN TO GET THE NEXT ONE DONE. SO HOW FAR DOWN THE PATH ARE WE GOING HERE? IT FITS WITHIN A DIAGNOSTIC WORKFLOW, BUT IT WAS VERY CLEAR THAT THE ABILITY TO REALLY TRACE BACK AND POINT YOUR FINGER AT EACH STEP WAS CRITICAL FOR TRANSPARENCY, ESPECIALLY HERE, USING SYNTHETIC DATA FOR A NUMBER OF USES. IF IT'S A SYNTHETIC TRIAL GENERATING THAT TRIAL ARM, THERE'S A SYNTHETIC PIECE: IS THIS THE PATIENT THAT WALKED IN, OR IS IT COMPUTATIONALLY DERIVED FROM A DISTRIBUTION OF PATIENTS WHO MAYBE LOOKED LIKE THIS ONE DID?
PLEASE, PLEASE. DO YOU WANT A MIC?
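A minimal sketch of the trace-back idea for accelerated MRI: keep a sampling mask alongside the reconstruction so anyone downstream can ask which measurements were actually acquired and which were filled in. The array sizes, the column-wise undersampling, and the zero-fill "reconstruction" below are illustrative placeholders, not the fastMRI pipeline or any speaker's method.

# Hypothetical provenance record for an undersampled MRI reconstruction.
import numpy as np

def undersample(kspace, acceleration=4):
    """Keep every `acceleration`-th k-space column; return masked data and the mask."""
    mask = np.zeros(kspace.shape[-1], dtype=bool)
    mask[::acceleration] = True
    return kspace * mask, mask

# Pretend fully sampled k-space (random stand-in for illustration only).
kspace = np.random.randn(64, 64) + 1j * np.random.randn(64, 64)
measured, mask = undersample(kspace)

# A real pipeline would use a learned model to fill unmeasured columns;
# zero-filling keeps this sketch self-contained.
reconstruction = np.abs(np.fft.ifft2(measured))

provenance = {
    "sampling_mask": mask.tolist(),           # which k-space columns were measured
    "fraction_measured": float(mask.mean()),  # e.g. 0.25 at 4x acceleration
    "generator": "zero-fill (placeholder for a learned reconstruction)",
}
print(provenance["fraction_measured"])

Carrying a record like this with every image is one concrete way to "point your finger at each step," and it also makes the privacy question raised next more precise, since the record says exactly how much of the image came from a model.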
>> JUST TO ADD TO THE TRANSPARENCY AND REPRODUCIBILITY QUESTION, I THINK THE NOTION OF PROVENANCE THAT YOU RAISED IS VERY INTERESTING, BECAUSE THERE'S THIS QUESTION OF: IS MY PATIENT REPRESENTATIVE OF THE POPULATION FROM WHICH THIS SYNTHETIC DATA WERE DERIVED IN THE FIRST PLACE? IF I HAD AN ASSESSMENT OF THAT, WHICH COULD BE ENABLED BY A.I. OR AUTOMATED METHODS, TO SAY, YOU KNOW, THIS IS WHAT WE THINK YOUR MRI LOOKS LIKE, BUT THE CONFIDENCE LEVEL IS VERY LOW BECAUSE THIS PATIENT IN THESE SPECIFIC WAYS DOES NOT ALIGN WITH THE ORIGINAL PATIENT POPULATION FROM WHICH THE SYNTHESIS WAS CREATED. AND SO THAT OF COURSE RAISES THE TENSION WITH PRIVACY. SO WHEN YOU TALK ABOUT PROVENANCE, THERE'S ALWAYS THE QUESTION OF WHETHER WE ARE REVEALING MORE INFORMATION THAN WE SHOULD BE.
>> THANK YOU FOR HIGHLIGHTING THE PRIVACY PIECE. WE TALKED ABOUT THAT QUITE A BIT.
>> ANOTHER HAND RAISED (INAUDIBLE).
>> ON THE EXAMPLE OF IMAGING DATA, DID YOU TALK ABOUT SYNTHETIC DATA FROM THE PERSPECTIVE OF CREATING IMAGING DATA THAT'S HUMAN-CONSUMABLE VERSUS CREATING THE RAW SYNTHETIC DATA THAT IS ACTUALLY GOING TO GIVE YOU THE QUANTITATIVE NUMBERS?
>> THAT'S A REALLY INTERESTING ONE. WE DIDN'T TALK A LOT ABOUT THAT THEME. IF YOU HAVE THOUGHTS, IT WOULD BE GREAT TO DISCUSS, BUT WE DIDN'T GET INTO THAT.
>> ALL RIGHT.
>> GREAT, THANK YOU.
>> NEXT UP IS DATA SHARING FOR GENERAL REUSE.
>> ALL RIGHT. I WILL SUMMARIZE TODAY AS ONE OF THE SESSION CO-LEADS, BUT TOMORROW WE'LL IDENTIFY A GROUP MEMBER, PERHAPS. SOME OF OUR THEMES WERE SIMILAR TO WHAT COLIN ADDRESSED, BUT OUR USE CASE WAS A LITTLE BIT FURTHER AWAY FROM A DIRECT PATIENT, YOU KNOW, KIND OF LIKE AN MRI, SO WE CAME AT IT FROM SECONDARY REUSE, OR REUSE OF DATA, WHICH I THINK MADE DATA THE ESSENTIAL PART OF WHAT WE WERE DISCUSSING.
WE STARTED WITH SOME POTENTIAL USE CASES: FOR EXAMPLE, CLINICAL TRIAL DATA PLACED INTO A REPOSITORY VERSUS REAL-WORLD DATA LIKE EHR SYSTEMS. WE QUICKLY DISTILLED THAT DOWN INTO DIFFERENCES BETWEEN DATA WHICH WERE COLLECTED AS PART OF A HYPOTHESIS-DRIVEN RESEARCH EFFORT VERSUS DATA WHICH MIGHT HAVE BEEN COLLECTED, SAY, IN THE PROCESS OF CLINICAL CARE AND WHICH ARE NOW HYPOTHESIS-GENERATING, AND WE TALKED THROUGH THE NIH SHIFT IN HOW THEY ARE APPROACHING THE FUNDING OF DATA RESOURCES, MOVING FROM KIND OF HYPOTHESIS-DRIVEN PROPOSALS NOW INTO NEWER A.I. WORK, POTENTIALLY JUST HAVING LARGER HYPOTHESIS-GENERATING DATASETS.
QUICKLY CAME THE QUESTION OF CONSENT FROM THAT. FOR THE SUBJECT WHOSE DATA WAS IN THESE REPOSITORIES, EITHER HYPOTHESIS-DRIVEN OR HYPOTHESIS-GENERATING, WHAT LEVEL OF CONSENT DID THEY PROVIDE? ONLY FOR THE SPECIFIC QUESTION WHICH WAS DRIVEN BY THE HYPOTHESIS? DID THEY PROVIDE, FOR EXAMPLE, IF THEY WERE IN A HEALTH SYSTEM, CONSENT FOR CARE AND FOR DATA TO BE USED BUT NOT IN A RESEARCH CONTEXT, AND KIND OF EVERYTHING IN BETWEEN? WE JUST THOUGHT THOSE WERE REALLY IMPORTANT CONSIDERATIONS FOR THE WORK THAT WE WERE GOING TO BE DOING OVER THE NEXT SESSIONS. YOU KNOW, WHAT WAS THE ORIGINAL INTENT OF THE DATA? AND TO WHAT DEGREE DID THAT RESEARCH SUBJECT UNDERSTAND HOW THAT REUSE WOULD HAPPEN? THAT INFLUENCES WHETHER REUSE OF DATA FOR SUBSEQUENT INVESTIGATION SHOULD BE BROADLY AVAILABLE OR DYNAMIC AND CASE-SPECIFIC, AND WHAT MECHANISMS WE EVEN HAVE TO MAKE SUCH A THING A REALITY.
MAYBE I'LL SWITCH TO STAKEHOLDERS. WE STARTED KIND OF FROM THE DATA PERSPECTIVE AGAIN AND THEN GOT DOWN TO THE PATIENT LEVEL. WE TALKED ABOUT INSTITUTIONS, WHETHER THOSE ARE RESEARCH-PERFORMING ORGANIZATIONS, LET'S SAY A UNIVERSITY, OTHER ACADEMIC CENTERS, MAYBE HEALTH SYSTEMS. INSTITUTIONS COULD OBVIOUSLY INCLUDE COMPANIES, RIGHT, WHO ARE DEVELOPING OR COMMERCIALIZING TECHNOLOGY THAT ORIGINATED FROM HYPOTHESIS-DRIVEN WORK OR SECONDARY USES OF DATA. INSTITUTIONS WOULD INCLUDE FUNDING AGENCIES, AND OBVIOUSLY INSTITUTIONS WOULD ALSO INCLUDE PUBLISHERS. WE TALKED ABOUT THE SPECIFICS OF THE GRANT REVIEW PROCESS, BECAUSE I FEEL LIKE THAT'S WHERE WE'RE HEADING: GUIDANCE ABOUT HOW AN APPLICANT, OR MAYBE A STUDY SECTION MEMBER, IS TO ADDRESS SOME OF THESE ISSUES. SO WE THOUGHT THERE WERE STAKEHOLDERS THERE.
AND THEN WE ULTIMATELY TALKED ABOUT PROBABLY THE RISING DOMAIN, WHICH IS THOSE ENTITIES RESPONSIBLE FOR ORGANIZING AND MAINTAINING THE DATA. THOSE COULD BE REPOSITORIES, THEY COULD BE INSTITUTIONS, THEY COULD ALSO BE BROKERS. AND THEY ARE GOING TO BE IMPORTANT STAKEHOLDERS HERE, SITTING KIND OF IN THE MIDDLE OF THE GENERATION OF THE DATA, THE CURATION OF THE DATA, AND ULTIMATELY THE REUSE.
FINALLY, DOWN TO PATIENTS: IN SOME CASES THEY ARE RESEARCH SUBJECTS, IN OTHER CASES THEY ARE PATIENTS, IN OTHER CASES THEY ARE CITIZENS. AGAIN, DEPENDING ON HOW THESE DATA WERE COLLECTED, LIKE SOCIAL MEDIA DATA, RIGHT, THEY ARE NOT PATIENTS; THOSE MIGHT BE CITIZENS.
HEALTH SYSTEM SUBJECTS, THOSE ARE PATIENTS. AND THEN THERE ARE RESEARCH SUBJECTS, WHO REALLY ARISE FROM SOME HYPOTHESIS. AND THEN WE EXPANDED OUT FROM THERE: TOWARD WHAT THE INTERESTS OF DIFFERENT COUNTRIES ARE, LIKE THE E.U. AND SOME OF THEIR GUIDELINES AROUND DATA REUSE OR AGGREGATION; QUESTIONS ABOUT TRIBAL NATIONS OR ENTITIES WITHIN THE UNITED STATES THAT HAVE A DIFFERENT TYPE OF SOVEREIGNTY OVER DATA AND OTHER PRACTICES; FAMILY MEMBERS, YOU KNOW, THE UNITS THAT SURROUND A PERSON, WHOSE PRIVACY DATA REUSE COULD HAVE IMPLICATIONS FOR; AND THEN PATIENTS WITH SPECIFIC CONDITIONS WHO CONSENT TO PARTICIPATE IN A TRIAL BECAUSE THEY BELIEVE THEY WILL BE BENEFITING OTHER PATIENTS IN THE FUTURE WITH THEIR CONTRIBUTIONS. THERE'S A LOT MORE THERE, FUNDING, RESPONSIBILITY, CONSENT, LICENSING, BIAS, WHICH I HOPE WE'LL GET TO IN THE SUBSEQUENT SESSIONS.
>> ALL RIGHT, THANK YOU.
>> COULD I JUST ASK YOU A QUESTION? DID YOU ALL TALK ABOUT WHAT THE COMMON RULE PERMITS? THERE'S A GOOD BIT OF REUSE PERMITTED UNDER THE COMMON RULE, WITH AND WITHOUT CONSENT, WHERE THE IRB CAN ACTUALLY REDIRECT THE DATA. I AM JUST WONDERING IF YOU ALL LOOKED AT THAT AT ALL. NOT THAT I THINK THAT IS, I MEAN, I THINK THAT'S THE FLOOR, NOT THE CEILING, BUT I WAS CURIOUS.
>> WE DIDN'T SPEAK SPECIFICALLY ABOUT THAT, BUT I THINK WE'LL BRING THAT UP IN OUR NEXT BREAKOUT.
>> THANK YOU.
>> A CLOSELY RELATED QUESTION: I UNDERSTAND YOU LOOKED AT THE SCOPE OF CONSENT IN TERMS OF DATA REUSE, BUT DID YOUR GROUP CONSIDER THE CONSEQUENCES OF REQUIRING CONSENT, AS OPPOSED TO THE ALTERNATIVE OF ETHICAL MANAGEMENT OF UNCONSENTED DATA, WITH RESPECT TO THE BIAS THAT THAT INTRODUCES?
>> SO, YOU KNOW, I THINK WE WERE TRYING TO JUST MAKE SURE WE HAD A GOOD GRASP ON USE CASES AND STAKEHOLDERS BEFORE WE APPROACHED CHALLENGING QUESTIONS LIKE THAT. BUT, YEAH, WE DIDN'T SPEAK ABOUT THAT SPECIFICALLY; WE KIND OF DANCED AROUND IT.
>> I WAS WONDERING IF YOU GOT A CHANCE TO DISCUSS LICENSING, COPYRIGHTS, THINGS LIKE THAT. THE OTHER ONE IS REGARDING CONSENT: WHETHER YOU HAD A CHANCE TO EXPLORE BLANKET STATEMENTS LIKE "WE WILL USE THIS DATA FOR ANY FUTURE RESEARCH." GIVEN THAT, AS YOU WRITE THESE UP, A.I. CATAPULTS ALL KINDS OF FUTURE USE CASES, SHOULD THE USE OF BLANKET STATEMENTS BE ALLOWED OR NOT? DID YOU EVER...
>> THAT CAME UP. I DON'T KNOW THAT WE CAME TO A CONSENSUS. IN TERMS OF LICENSING, WE TALKED ABOUT AN EXAMPLE FROM, I THINK, NASA, WHERE THE LICENSING IS SUCH THAT THOSE DATA ARE FREELY AVAILABLE, THEY ARE OPEN, AND THERE ARE NO CHECKS ON WHO GETS TO USE THOSE DATA. I DON'T THINK THAT'S PROBABLY THE RIGHT PLACE FOR HEALTH DATA TO GO, BUT WE DID DISCUSS IT IN THAT CONTEXT.
>> ANY OTHER QUESTIONS?
>> (INAUDIBLE) DATA USE AGREEMENTS BETWEEN ACADEMIC ORGANIZATIONS?
>> WE DIDN'T DISCUSS THOSE.
>> (INAUDIBLE).
>> FOR THE ONLINE FOLKS, COULD YOU USE THE MIC?
SORRY. MAYBE REPEAT THE QUESTION.
>> YEAH, I THINK THE QUESTION WAS: DID WE DISCUSS DATA USE AGREEMENTS BETWEEN ACADEMIC ORGANIZATIONS?
>> YES, I WAS THE OTHER CO-CHAIR. WE DID LAY OUT THE STAKEHOLDERS, CONCERNS, AND NEEDS, RECOGNIZING THAT TOMORROW WE'LL FOCUS ON THE CAPABILITIES AND GAPS. SO FOR EVERYTHING THAT YOU TALKED ABOUT, LIKE EVEN THE LICENSING, WE KNOW THERE'S A GAP THERE; AROUND DATA USE AGREEMENTS, WE KNOW THERE'S A GAP THERE. CONSENTING, CONSENTED VERSUS UNCONSENTED DATA: THERE ARE METHODS, AND WE NEED TO IDENTIFY METHODS AND GAPS TO MAKE IT CONSISTENT ACROSS DIFFERENT DATA TYPES.
>> NEXT UP, MULTI-MODAL DATA.
>> ALL RIGHT. WE ALSO HAD A VERY RICH DISCUSSION AND A REALLY DIVERSE GROUP OF EXPERTS IN THE ROOM. TO START OFF WITH MULTI-MODAL DATA, I THINK SOME OF THE ISSUES THAT WE SURFACED AS A GROUP WERE THAT, CLEARLY, WITH MULTI-MODAL DATA, LINKING BETWEEN DATASETS IS MORE PRIMARY THAN IN MAYBE A FEW OF THE OTHER USE CASES. SO THERE WERE A LOT OF THOUGHTS ABOUT HOW LINKAGES OF DATA REQUIRE MORE INFORMATION ABOUT THE DATA FORMATS, DATA DOCUMENTATION, AND SAMPLE SIZES OF THE DATA, AND ABOUT TRYING TO KEEP AN EYE ON ALL OF THAT, KNOWING THAT THE MORE YOU LINK, THE MORE YOU CAN LOSE PATIENTS IN THE SAMPLE AND LOSE INFORMATION IN THE TYPES AND AMOUNTS OF DATA AVAILABLE TO YOU. THAT CAME UP QUITE A BIT. I'M TRYING TO HIGHLIGHT THINGS THAT HAVEN'T BEEN SAID IN OTHER GROUPS. WE ALSO TALKED A LOT ABOUT MULTI-MODAL DATA PUTTING THE COMMUNICATION BETWEEN TEAMS CENTRAL TO THE ISSUE, SO THAT CAME UP A NUMBER OF TIMES.
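To illustrate the point about linkage shrinking the sample: only patients present in every linked modality survive an inner join, and who drops out is usually tied to access rather than chance. The patient IDs and modalities below are invented for the example.

# Hypothetical cohorts per modality; linking keeps only the intersection.
ehr       = {"p1", "p2", "p3", "p4", "p5", "p6"}
genomics  = {"p1", "p2", "p4"}            # e.g., only patients seen where samples are banked
wearables = {"p2", "p4", "p5"}            # e.g., only patients who own a device

linked = ehr & genomics & wearables       # patients with all three modalities
print(sorted(linked))                     # ['p2', 'p4'] -- 2 of 6 remain
print(f"retained: {len(linked) / len(ehr):.0%}")

Reporting who was lost at each link, not just the final sample size, is one simple transparency practice that follows from this.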
THAT COMMUNICATION PIECE ALSO AFFECTED OUR STAKEHOLDER LIST, WHICH I'LL TALK ABOUT IN A SECOND. SOME OTHER MAIN COMMON CHALLENGES, AND ALSO SUCCESSES AND OPPORTUNITIES IN THE SPACE, HAVE MOSTLY BEEN MENTIONED PREVIOUSLY: A LOT OF NEED FOR GREATER DATA AVAILABILITY; MORE DETAILING OF MODELING TO BE ABLE TO REPRODUCE MULTI-MODAL MODELS IN OTHER SETTINGS OR INDICATIONS; CONCERNS FOR PRIVACY AND IDENTITY MANAGEMENT, AND I WOULD SAY THAT ONE IS PARTICULAR TO MULTI-MODAL DATA, GIVEN THAT THE MORE YOU LINK DATA, POTENTIALLY THE HIGHER THE RISK OF RE-IDENTIFICATION; THE LACK OF PATIENT AND COMMUNITY ENGAGEMENT ACROSS ALL PHASES OF THE A.I. DATA-ACQUISITION-TO-DEPLOYMENT CYCLE; AND THIS IDEA OF RISK AND RISK APPROXIMATION, WITH GENOMIC DATA HAVING A DIFFERENT RISK THAN EHR DATA VERSUS WEARABLE DATA, ET CETERA. THOSE ARE SOME THEMES THAT I WOULD SAY SURFACED FROM OUR GROUP DISCUSSION.
AND THEN WE SWITCHED TO THE STAKEHOLDER CONVERSATION, LIKE MANY OF THE OTHER GROUPS DISCUSSED, AND I WOULD SAY WE ADDED MORE STAKEHOLDER GROUPS IN OUR DISCUSSION. SOME OF THE ONES I THINK ARE PARTICULARLY INTERESTING: WE TALKED A LOT ABOUT DIFFERENT DISCIPLINARY KNOWLEDGE IN THE CREATION OF MODELS, SO BASIC SCIENCE VERSUS HEALTH SERVICES RESEARCH VERSUS CLINICAL RESEARCH, AND WHEN YOU'RE THINKING ABOUT MULTI-MODAL DATA, EACH OF THOSE GROUPS HAS ITS OWN LANGUAGE AND TERMINOLOGY FOR HOW THEY THINK ABOUT DATA AND CONCEPTS. SO, NOT JUST LUMPING DATA SCIENTISTS TOGETHER, BUT TEASING THAT INTO DISCIPLINARY KNOWLEDGE SPACES, AND MOVING THROUGH TO ORGANIZATIONAL LEADERSHIP. IN THAT BUCKET WE TALKED A LOT ABOUT THE PEOPLE DEPLOYING THE MULTI-MODAL MODELS ONCE THEY ARE DEVELOPED, WITH MACHINE LEARNING ENGINEERS BEING A DIFFERENT GROUP THAN THE DATA SCIENTISTS GENERATING MODELS; CREATING A MODEL OVER MANY TYPES OF DATA MIGHT LOOK DIFFERENT THAN THE FINAL MODEL THAT CAN BE DEPLOYED IN AN EHR AT THE POINT OF CARE. WE TALKED ABOUT REGULATORS, WHICH CAME UP QUITE A BIT BEFORE, AND THE ABILITY FOR US TO HAVE A HIGHER LEVEL OF CONTENT EXPERTISE ACROSS DOMAINS, INCLUDING REGULATORS AND PAYERS AND FUNDERS, GIVEN THE COMPLEXITIES ACROSS DATA TYPES. THE THEMES OF COMMUNICATION AND TRUST, I WOULD SAY, WERE RELEVANT ACROSS EVERY STAKEHOLDER GROUP THAT EMERGED. I THINK I'LL STOP THERE AND KEEP IT BRIEF. YEAH, QUESTIONS?
>> SO I HAVE A QUESTION.
DID YOU GUYS TALK ABOUT... WHEN WE THINK ABOUT MULTI-MODAL, WE'RE PULLING TOGETHER DIFFERENT TYPES OF DATA MODALITIES ASSOCIATED WITH ACCESS TO CARE, WHICH IS GOING TO INFLUENCE THE ACCURACY OF THE MODEL. HOW DO YOU THINK ABOUT BEING TRANSPARENT ABOUT THIS MISSINGNESS ACROSS DIFFERENT TYPES OF MODALITIES ACROSS A PATIENT TRAJECTORY?
>> THAT CAME UP; WE STARTED OUR CONVERSATION THERE. RATHER THAN TEASING IT APART, WE STARTED THE CONVERSATION FROM: OF COURSE WE WANT TO USE ALL THE DATA AVAILABLE TO US TO HAVE BETTER DISCOVERY OR A BETTER TRUTH OF THE INDICATION, BUT WE KNOW THAT PEOPLE SHOW UP IN SAMPLES DIFFERENTLY, AND THE MORE YOU LINK, FRANKLY, THE MORE YOUR SAMPLE SIZE OFTEN DIMINISHES. EVEN IF YOU JUST TOOK SIMPLE EXAMPLES OF SPECIALTY CARE AND PRIMARY CARE, YOU'RE LOSING PEOPLE WHO HAVE LOSS OF ACCESS, BUT IF YOU'RE ADDING MANY MORE TYPES OF DATA, GENOMIC DATA, WEARABLE DATA, OF COURSE THE POPULATION-LEVEL VIEW OF THE DATA IS MORE LIMITED. SO WE STARTED FROM THAT, TO BE HONEST, STARTED FROM THAT USE CASE, AND THAT WENT TO OUR COMMUNICATION NEEDS: THE ABILITY FOR DIFFERENT SCIENTISTS TO TALK TO ONE ANOTHER AND REALLY UNDERSTAND WHAT DATA ARE AVAILABLE TO THEM, WHERE THEY WERE SOURCED, AND HOW THEY CAME TO BE. YEAH.
>> YOU MENTIONED HOW LINKING DATA ACROSS DOMAINS AND MODALITIES INCREASES THE RISK OF RE-IDENTIFICATION. I WAS CURIOUS HOW THAT INFLUENCED YOUR LIST OR YOUR TALK ABOUT TRANSPARENCY, HOW THAT MAY HAVE INFLUENCED STAKEHOLDER... (INAUDIBLE).
>> I THINK IT INFLUENCED THE THEMES THAT CAME UP ACROSS THE STAKEHOLDERS, SO THIS IDEA OF TRUST AND PRIVACY. NOT THAT IT DIDN'T COME UP IN THE OTHER GROUPS, BUT I THINK IT WAS REALLY CENTRAL TO OUR CONVERSATION. THE MOST INTERESTING PART, WHICH I THINK IS ANSWERING YOUR QUESTION, IS THIS IDEA THAT EVEN THOUGH YOU'RE DOING MULTI-MODAL DATA, THE SAME PERSON COULD HAVE A DIFFERENT VIEW OF THE LINKING OF THAT DATA DEPENDING ON WHAT TYPE OF DATA IT IS AND WHAT THE MODEL IS BEING USED FOR, SO HOW NUANCED THAT IS. WE HAD MORE CONVERSATION IN THAT DOMAIN: THAT TRUST IS NEEDED, BUT FRANKLY, AS YOU'RE LINKING ACROSS DATA TYPES, IT'S NOT THE SAME. IT'S NOT THAT EACH STAKEHOLDER GROUP FOR EACH CASE WILL HAVE THE SAME RISK TOLERANCE OR RISK PROFILE. I THINK THAT'S A ROUNDABOUT WAY OF ANSWERING YOUR QUESTION.
>> ANY OTHER QUESTIONS? THANK YOU.
>> AWESOME, THANKS.
>> ALL RIGHT. NEXT UP IS FOUNDATION MODELS.
>> HI, I CO-LED THIS SESSION ON FOUNDATION MODELS. WE HAD A REALLY ACTIVE GROUP THAT CONTRIBUTED TREMENDOUSLY, WITH A LOT OF ENERGY, SO THANK YOU ALL FOR THAT. I WOULD BREAK OUR DISCUSSION UP INTO TWO PARTS. IN THE FIRST PART WE DEFINED OUR SCOPE IN TERMS OF WHAT A FOUNDATION MODEL IS. IN PARTICULAR, WE AGREED THAT FOUNDATION MODELS ARE GENERAL-PURPOSE A.I. MODELS CREATED OVER LARGE AMOUNTS OF DATA THAT CAN THEN BE FURTHER CUSTOMIZED FOR USE FOR PARTICULAR PURPOSES, AND THAT THE CHATBOTS THAT WE LOOK AT ARE A SPECIAL CASE, ONE USE OF FOUNDATION MODELS, SO WE'RE NOT JUST THINKING ABOUT CHATGPT IN TERMS OF THE SCOPE OF OUR CASE.
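TO MAKE "FURTHER CUSTOMIZED FOR PARTICULAR PURPOSES" CONCRETE, HERE IS A MINIMAL, HYPOTHETICAL PYTHON SKETCH, NOT ANYTHING THE GROUP ACTUALLY BUILT: A GENERAL-PURPOSE PRETRAINED ENCODER IS TREATED AS A FROZEN FEATURE EXTRACTOR, AND ONLY A SMALL TASK-SPECIFIC HEAD IS TRAINED FOR THE DOWNSTREAM PURPOSE. THE ENCODER, DATA, AND LABELS ARE ALL STAND-INS.

    # A minimal, hypothetical sketch of customizing a general-purpose model:
    # a frozen "pretrained" encoder supplies embeddings, and only a small
    # task-specific head is trained for the downstream purpose.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Stand-in for a frozen foundation-model encoder (weights never updated below).
    W_frozen = rng.normal(size=(100, 32))
    def encode(x_raw):
        return np.tanh(x_raw @ W_frozen)

    # Hypothetical downstream task data, e.g. a small labeled clinical cohort.
    x_raw = rng.normal(size=(200, 100))
    y = (x_raw[:, 0] + 0.5 * rng.normal(size=200) > 0).astype(int)

    # "Customization": train only the lightweight head on the frozen embeddings.
    head = LogisticRegression(max_iter=1000).fit(encode(x_raw), y)
    print("task-specific head, training accuracy:", head.score(encode(x_raw), y))

FULL FINE-TUNING OF THE ENCODER ITSELF IS ANOTHER FORM OF CUSTOMIZATION; THE POINT OF THE SKETCH IS ONLY THAT ONE GENERAL-PURPOSE MODEL CAN BE ADAPTED TO MANY DOWNSTREAM USES, CHATBOTS INCLUDED.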
WE WORKED THROUGH SEVERAL USE CASES OF FOUNDATION MODELS WITHIN THE BROAD SCOPE OF BIOMEDICINE AND HEALTH CARE. HAVING SET THAT STAGE, WE MOVED ON TO OUR PHASE 2, WHICH WAS IDENTIFYING STAKEHOLDERS, AND WE WENT TO TOWN AND HAD, YOU KNOW, 40-SOME DIFFERENT STAKEHOLDERS THAT WE IDENTIFIED.
[LAUGHTER]
AND THEN WE BEGAN TO BOIL THAT DOWN TO A MORE MANAGEABLE SIZE, WHICH IS WHEN WE RAN OUT OF TIME. SO WHAT WE ACTUALLY DID WAS MOVE ON FROM THERE TO START TALKING ABOUT THE OBJECTIVES AND INCENTIVES OF THE DIFFERENT STAKEHOLDERS, WITH A VIEW TO POINTING OUT PLACES WHERE THERE ARE TRADEOFFS TO BE MADE OR CONFLICTS TO BE RESOLVED. WE HAVE A PARTIAL LIST OF HALF A DOZEN OR SO SUCH THINGS THAT WE'VE IDENTIFIED, WHICH WE PLAN TO REPEAT TOMORROW MORNING AND THEN USE AS THE BASIS FOR OUR GAP ANALYSIS AND GUIDANCE LATER IN THE DAY.
>> ANY QUESTIONS?
>> THANK YOU.
>> LAST BUT NOT LEAST, THE PROXY VARIABLES GROUP.
>> THANK YOU. WE ALSO HAD A DIVERSE GROUP OF PARTICIPANTS AND A LIVELY AND ENGAGING CONVERSATION. WHEN WE WERE TALKING ABOUT THE USE CASE, WE REALLY HASHED OUT THOSE TWO EXAMPLES THAT TINA GAVE IN HER TALK, WHICH WERE THE VBAC CALCULATOR AND THE OBERMEYER STUDY. FOR PROXY VARIABLES, RACE AND ETHNICITY ARE A GOOD EXAMPLE OF A PROXY VARIABLE. WE SAW THAT IN THE VBAC CALCULATOR, BUT RACE OR ETHNICITY WILL NOT ALWAYS BE THE PROXY VARIABLE; IN THE OBERMEYER STUDY, THE TARGET WAS HEALTH CARE NEED AND THE PROXY WAS COST.
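THE COST-AS-PROXY-FOR-NEED MECHANISM CAN BE ILLUSTRATED WITH A STYLIZED SIMULATION, USING HYPOTHETICAL NUMBERS RATHER THAN THE ACTUAL OBERMEYER DATA OR ANALYSIS: TWO GROUPS HAVE THE SAME UNDERLYING NEED, BUT ONE GROUP ACCRUES LOWER COSTS BECAUSE OF REDUCED ACCESS TO CARE, SO A PROGRAM THAT SELECTS ON THE PROXY UNDER-IDENTIFIES TRULY HIGH-NEED PATIENTS IN THAT GROUP.

    # Stylized simulation of a proxy-label problem (made-up numbers):
    # "cost" is used as a proxy target for "need", but one group accrues
    # lower cost for the same need because of access barriers.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 10_000
    group = rng.integers(0, 2, size=n)              # 0 = full access, 1 = reduced access
    need = rng.gamma(shape=2.0, scale=1.0, size=n)  # true underlying health need

    # Observed cost reflects need, but is damped for the reduced-access group.
    access = np.where(group == 1, 0.6, 1.0)
    cost = need * access + rng.normal(0.0, 0.1, size=n)

    # A program selects the top 10% by the proxy (cost) rather than by true need.
    selected = cost >= np.quantile(cost, 0.90)
    high_need = need >= np.quantile(need, 0.90)

    for g in (0, 1):
        mask = group == g
        share = (selected & high_need & mask).sum() / (high_need & mask).sum()
        print(f"group {g}: share of truly high-need patients selected = {share:.2f}")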
WE HAD A FUN CONVERSATION ABOUT HOW, EVEN IN THIS FIGURE, PROXY VARIABLES ARE SOMETHING WE FIND AT THE END OF THE DAY, BUT THEY ARE REALLY SOMETHING THAT COULD BE IDENTIFIED EARLIER, AND SOMETIMES USED INTENTIONALLY EARLIER. SO IT REALLY GOES BACK TO TINA'S SLIDE ABOUT THE DATA LIFE CYCLE AND HOW CONTINUOUS THIS IS: REALLY UNDERSTANDING THE DATA FROM WHICH MODELS ARE DERIVED. WE FOCUSED ON STAKEHOLDERS. AS WAS EXPECTED, WE ALSO ENDED UP IDENTIFYING AREAS OF NEEDS AND GAPS, SPILLING OVER INTO THE NEXT SESSION, AND I INVITE ANYBODY FROM THE GROUP TO CHIME IN. WE MADE THESE SLIDES AS A REGURGITATION OF IDEAS. I WON'T READ EVERYTHING, BUT JUST TO START WITH THE FIRST STAKEHOLDER, WHETHER IT'S THE INVESTIGATOR OR THE ANALYST: THE PRESSURES THEY FACE MAY VARY BASED ON WHO IS FUNDING THEM AND THE LENS THROUGH WHICH THEY ARE LOOKING AT THE DATA; AND REALLY, IN THE DATA CREATION STAGE, THE NEED FOR DATA USERS TO UNDERSTAND EACH VARIABLE, AS WELL AS THE LIMITED TIME AND RESOURCES TO COLLECT VARIABLES OF INTEREST. WE ALSO TALKED ABOUT WHAT THE INTENDED USE OF THE DATASET WAS VERSUS THE ACTUAL APPLIED USE. THEN WE GOT INTO THIS CONVERSATION ABOUT WHETHER WE ARE UNINTENTIONALLY USING PROXY VARIABLES, OR WHETHER IT IS INTENTIONAL AND WHEN IT IS JUSTIFIED; THOSE ARE DEFINITELY UNRESOLVED ISSUES, I WOULD SAY. WE STARTED TO HAVE EARLY CONVERSATIONS ABOUT WHAT'S NEEDED, AND WHAT CAME UP WAS THE AVAILABILITY OF NETWORKS AND MULTI-DISCIPLINARY SETTINGS: MANY EXPERTS, NEXT SLIDE, CAN REALLY HELP IDENTIFY THESE PROXY VARIABLES EARLIER ON IN THE MODEL CREATION STAGES.
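ONE LIGHTWEIGHT WAY A MULTI-DISCIPLINARY TEAM MIGHT SURFACE CANDIDATE PROXIES EARLIER IN MODEL CREATION IS TO SCREEN HOW WELL EACH FEATURE, ON ITS OWN, PREDICTS A SENSITIVE ATTRIBUTE. THE SKETCH BELOW IS A MINIMAL, HYPOTHETICAL EXAMPLE WITH MADE-UP COLUMN NAMES AND SIMULATED DATA; A HIGH SCORE IS A PROMPT FOR DOMAIN AND HISTORICAL REVIEW, NOT AN AUTOMATIC VERDICT.

    # Minimal screening sketch (hypothetical columns, simulated data): for each
    # candidate feature, ask how well it alone predicts a sensitive attribute.
    # Features that do so strongly deserve early review as possible proxies.
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(2)
    n = 2_000
    df = pd.DataFrame({
        "zip3_median_income": rng.normal(60, 15, n),
        "prior_year_cost": rng.gamma(2.0, 1.0, n),
        "age": rng.integers(18, 90, n),
    })
    # Hypothetical sensitive attribute, correlated with income in this toy data.
    sensitive = (df["zip3_median_income"] + rng.normal(0, 10, n) < 55).astype(int)

    for col in df.columns:
        auc = cross_val_score(LogisticRegression(max_iter=1000),
                              df[[col]].to_numpy(), sensitive,
                              cv=5, scoring="roc_auc").mean()
        print(f"{col}: single-feature AUC for the sensitive attribute = {auc:.2f}")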
AND THEN, YOU KNOW, DATA USERS ALSO HAVE A LOT OF COSTS OF COLLECTING AND USING DATA, AND THAT MAY AFFECT HOW THEY USE THE DATA IN A WAY WHERE THEY MAY NOT BE CATCHING SOME OF THESE PROXY VARIABLES THAT ARE HAPPENING. WE TALKED ABOUT THE NEED FOR REASSESSMENT AT THE END OF THE MODEL LIFE CYCLE, BECAUSE IT'S A CONTINUOUS PROCESS, AND THE NEED FOR HYPOTHESIS-DRIVEN MODELS, UNDERSTANDING HOW VARIABLES IN THE DATASET ARE RELATED TO THE OUTCOMES OF INTEREST. WE HAD A STAKEHOLDER HERE THAT NO OTHER GROUP MENTIONED, AND I'M REALLY EXCITED BECAUSE THIS WAS ONE OF OUR CORE STAKEHOLDERS: THIS GROUP OF SOCIOLOGISTS, HISTORICAL EXPERTS, PHILOSOPHERS, DOT-DOT-DOT, PEOPLE WITH SIMILAR EXPERTISE WHO NEED TO BE INCLUDED ALONGSIDE BIOMEDICAL RESEARCHERS BECAUSE THEY HAVE A STRONGER UNDERSTANDING OF HOW VARIABLES LIKE RACE AND ETHNICITY WERE CREATED, FOR EXAMPLE. THEY PLAY A REALLY VALUABLE ROLE IN RISK EVALUATION: UNDERSTANDING THE POSSIBLE DOWNSTREAM IMPACT WHEN WE GO TO TRANSLATE THE MODEL AT THE POPULATION LEVEL, SCOPING THE INTENDED USE OF THE MODEL, BEING ABLE TO IDENTIFY CLEARLY WHERE THE MODEL SHOULD NOT BE USED AS WELL, AND THE GOAL AND THE WHY BEHIND THE MODEL, THE STORIES BEHIND IT AS WELL. SO THIS WAS REALLY SOMETHING WE SPENT A LOT OF TIME ON, AN IMPORTANT CORE STAKEHOLDER FOR US. THE LAST ONE WE DISCUSSED IN DETAIL WAS THE POPULATIONS AFFECTED BY THE MODELS, OFTEN THE PATIENTS; WE HAD A SUBGROUP DISCUSSION ABOUT MARGINALIZED POPULATIONS IN PARTICULAR, WHO ARE MORE LIKELY TO BE NEGATIVELY AFFECTED GIVEN HISTORICAL MARGINALIZATION.
SO WE TALKED ABOUT MANY OF THESE OTHER ISSUES THAT CAME UP, AND I WON'T REPEAT THEM: HOW MODELS ARE EXPLAINED TO PATIENTS, TRUSTWORTHINESS, CONSENT, DATA SOVEREIGNTY, PRIVACY. WE DIDN'T GET FAR ENOUGH TO GET INTO GAPS OR SOLUTIONS, BUT MANY THEMES WERE SIMILAR TO THE OTHER SESSIONS. AND THIS IS A LIST OF OTHER STAKEHOLDERS THAT WE UNDERSTAND ARE INVOLVED SIGNIFICANTLY WITH THE USE OF PROXY VARIABLES, INCLUDING PAYERS, SPONSORS, HEALTH SYSTEMS, AND ORGANIZATIONS. DOES ANYBODY HAVE ANYTHING TO ADD? ANY QUESTIONS? YES?
>> I THINK THIS WAS ALREADY BROUGHT UP, BUT I WANT TO REINFORCE IT. I HEARD OBERMEYER GIVE A TALK ON RACIAL PROXIES AND BIAS. AT THE TIME THE DATA WAS COLLECTED, EVERYONE KNEW IT WAS AN IMPERFECT PROXY; WHAT HAPPENED WAS THAT INFORMATION GOT LOST BY THE TIME THE MODEL WAS BUILT, RIGHT? SO MAYBE JUST A NUDGE WHEN YOU THINK ABOUT TRANSPARENCY INTERVENTIONS. BUT THIS IS JUST A GREAT EXAMPLE OF, YOU KNOW, TRANSPARENCY THROUGH THE LIFE CYCLE.
>> YEAH, THAT'S REALLY HELPFUL CONTEXT, THANK YOU.
>> I'M GOING TO ASK A LOADED QUESTION. SO, I TALKED ABOUT -- I LOVE THAT YOU BROUGHT UP HISTORICAL EXPERTS, THE FACT THAT IF YOU DON'T LOOK BACK AT HISTORY YOU'LL REPEAT THE SAME MISTAKE OVER AND OVER AGAIN. DO YOU THINK WE NEED TO SAY THAT FUNDED RESEARCH NEEDS TO PUBLISH NEGATIVE RESULTS, AND HOW DO WE FACILITATE THAT? ESPECIALLY WITH LARGE LANGUAGE MODELS COMBING THE PUBLISHED LITERATURE, THEY ARE ONLY COMBING POSITIVE RESULTS BECAUSE THERE'S A PUBLICATION BIAS. WHAT ARE WE GOING TO REPEAT BECAUSE NEGATIVE RESULTS HAVE NOT BEEN SHARED?
>> YEAH.
>> FOOD FOR THOUGHT.
>> SUCH AN IMPORTANT POINT. I LIVE IN THE GUIDELINE DEVELOPMENT WORLD. YOU KNOW HOW EVERYONE POINTS FINGERS? THE EVIDENCE BASE: YOU GUYS ARE STUDYING ALL THESE THINGS, AND THAT'S WHAT'S FEEDING INTO OUR EVIDENCE BASE. WE NEED TO DO A BETTER JOB IN THE GUIDELINE WORLD OF OUTLINING THE LIMITATIONS OF THE EVIDENCE BASE. THERE'S A HUGE PUBLICATION BIAS THAT I THINK NEEDS TO BE ADDRESSED, AND I LIKE THOSE IDEAS OF INTERVENTIONS, DEFINITELY SOMETHING WE CAN EXPAND ON IN FUTURE SESSIONS. THANK YOU.
>> A CONVERSATION FROM MY GROUP, WITH COURTNEY, ABOUT WHO COUNTS AS A HISTORIAN: TRACKING DOWN IN HER HOSPITAL NETWORK WHO KNEW WHY THEY WERE RECORDING RACE A PARTICULAR WAY, WHY THEY SHIFTED TO PATIENT-REPORTED RACE FROM THE CLINICIAN MARKING SOMEONE DOWN AS SOMETHING. SO, YEAH, WE OVERLOOKED THE HISTORIAN STAKEHOLDER, BUT DEFINING THAT A LITTLE BIT CREATIVELY, AS WHOEVER REALLY HAS THAT HISTORICAL CONTEXT, IS INTERESTING.
>> YEAH, ABSOLUTELY, THOSE CONVERSATIONS CAME UP IN OUR ROOM AS WELL.
>> SO YOU MENTIONED SOMETHING EARLIER, LIKE YOU HAD -- THERE ARE SEVERAL PROXIES. DO YOU THINK THERE'S GOING TO BE A FINITE LIST OF PROXIES YOU'RE GOING TO FOCUS ON, SEEING WHAT EACH PROXY IS FOR, OR DO YOU THINK IT'S TOO BROAD IF WE JUST TALK ABOUT RACE AS A PROXY? HOW DO YOU SEE THAT LIST GROWING, OR IS IT FINITE, OR WHAT DO YOU THINK THERE?
>> YEAH, THAT'S A HELPFUL POINT.
IT'S MAYBE HELPFUL TO THINK ABOUT, YOU KNOW, SOME VERY COMMON PROXY VARIABLES AND HAVE GUIDANCE SPECIFIC TO THOSE, UNDERSTANDING THERE MAY BE OTHER ONES THAT WE'RE GOING TO CONTINUE TO IDENTIFY, AND THAT WE NEED TO SHARE THAT INFORMATION BACK FOR OTHER PEOPLE USING THE SAME DATASETS OR SIMILAR MODELS AS WELL. YOU KNOW, I THINK IT ALSO COMES DOWN TO THE INTENTIONALITY, TO THE COMMENT ABOUT HOW WE KNEW IT WAS A PROXY BUT THAT GOT LOST BY THE PEOPLE WHO WERE IMPLEMENTING IT. SOMETIMES PEOPLE ARE ALSO USING IT AS A PROXY FOR SOMETHING GOOD, TO TACKLE DISPARITIES: SAY WE'RE USING RACE AS A PROXY FOR SYSTEMIC RACISM BECAUSE WE KNOW THAT, YOU KNOW, KIDNEY TRANSPLANTS ARE BEING DISPROPORTIONATELY ALLOCATED TO CERTAIN GROUPS, AND THE SOCIAL DETERMINANTS DATA ARE NOT STRONG ENOUGH, SO WE'RE GOING TO USE RACE AS A PROXY FOR SYSTEMIC RACISM. BUT IS THERE TRANSPARENCY ABOUT WHY YOU ARE PUTTING IT IN THE MODEL, FOR THAT REASON? WE ALSO TALKED ABOUT THE CASE WHERE YOU BELIEVE SOMETHING IS A PROXY FOR A SPECIFIC GENE THAT HASN'T BEEN IDENTIFIED YET: WHEN THAT DATA BECOMES AVAILABLE, HOW DO YOU MAKE SURE THE PROXY GETS REPLACED AND THESE MODELS CONTINUE TO BE UPDATED?
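THE "THAT INFORMATION GOT LOST" PROBLEM IS PARTLY A DOCUMENTATION PROBLEM, SO ONE OPTION IS TO CARRY MACHINE-READABLE NOTES ALONG WITH THE DATA. BELOW IS A MINIMAL SKETCH WITH MADE-UP FIELD NAMES, NOT AN EXISTING STANDARD: EACH VARIABLE RECORDS WHAT CONCEPT IT STANDS IN FOR, WHY IT WAS USED, AND WHAT DIRECT MEASURE SHOULD REPLACE IT ONCE AVAILABLE, SO DOWNSTREAM USERS AND MODEL UPDATES CAN ACT ON THAT.

    # A minimal, hypothetical "variable note" that travels with a dataset so a
    # feature's proxy status is not lost between data creators, model builders,
    # and implementers. Field names are illustrative, not an existing standard.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class VariableNote:
        name: str
        concept: str                        # concept the variable is meant to capture
        is_proxy: bool = False
        proxy_for: Optional[str] = None     # e.g. "systemic racism", "healthcare need"
        rationale: str = ""                 # why the proxy was used, and by whom
        replace_with: Optional[str] = None  # direct measure to swap in when available

    def choose_column(note: VariableNote, available_columns: set) -> str:
        """Prefer the direct measure once it exists in the dataset; otherwise
        keep the documented proxy and surface its note to the model builder."""
        if note.replace_with and note.replace_with in available_columns:
            return note.replace_with
        return note.name

    # Example usage with hypothetical columns.
    note = VariableNote(
        name="race",
        concept="exposure to systemic racism",
        is_proxy=True,
        proxy_for="systemic racism",
        rationale="used intentionally to examine disparities; documented at collection",
        replace_with="structural_racism_index",
    )
    print(choose_column(note, {"race", "age", "prior_year_cost"}))
    print(choose_column(note, {"race", "age", "structural_racism_index"}))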
>> I THINK THIS IS A GREAT DISCUSSION. I WANT TO SUGGEST THAT SOME THINGS YOU MIGHT NOT THINK OF AS PROXY VARIABLES ACTUALLY ARE. SEX IS ONE. OBESITY IS ANOTHER. I MEAN, YOU CAN DEFINE BOTH VERY CLEARLY IN BIOLOGICAL TERMS, BUT WHAT THEY MEAN IS VERY MUCH SITUATED IN WHERE YOU ARE IN SOCIETY. AND SO I JUST WANT TO THROW THAT OUT: WE TEND TO SAY WE CAN'T USE A PROXY VARIABLE BECAUSE WE DON'T UNDERSTAND IT, BUT I THINK WE NEED TO REALIZE THAT LOTS OF THINGS ARE PROXY VARIABLES, EVEN IF WE HAVE A CLEAR BIOLOGICAL DEFINITION. SO I THOUGHT I WOULD SUGGEST THAT FOR YOUR CONSIDERATION.
>> THANK YOU FOR THAT POINT.
>> SO YOU DID A GREAT JOB SUMMARIZING OUR DISCUSSION.
>> THANK YOU.
>> WE HAD A GREAT DISCUSSION. TO ELLEN'S POINT, THAT'S WHY IT'S IMPORTANT THAT WE TALK ABOUT CONCEPTUALIZATION: PROXY VARIABLES JUST EXIST, SO WE HAD TO THINK ABOUT THE CONCEPTS WE'RE TRYING TO MAP THESE THINGS ONTO, BUT ALSO BE CLEAR, AGAIN, AS WAS SAID EARLIER, ABOUT THE ORIGINAL PURPOSE, SO THAT WHEN WE PASS THESE DATA ON THERE ARE NOTES ON THE DATA TO PROVIDE THAT CONTEXT TO USERS, AND THEN THAT CAN ADVANCE THE SCIENCE AND NOT JUST ESSENTIALLY REPLICATE MISTAKES THAT MAY HAVE BEEN MADE EARLIER.
>> YEAH, THANK YOU, SUCH AN IMPORTANT POINT. ANYTHING ELSE? NO? OKAY.
>> GREAT.
>> OKAY. LET ME SEE IF I CAN GET BACK TO OUR SLIDES HERE. LAURA, IF YOU'RE GOING TO HELP ME, AM I GETTING CLOSE?
>> YEP. YOU'RE GOOD.
>> OKAY, GREAT. SO, I HAVE THE DOUBTFUL PRIVILEGE OF TRYING TO SUMMARIZE OUR FIRST HALF DAY OF CONVERSATIONS. I THINK, YOU KNOW, IT'S A REALLY AMAZING OPPORTUNITY TO GET ALL OF THESE -- ALL OF YOU HERE IN THIS ROOM TO HAVE THIS DISCUSSION. IT'S A MOMENT IN TIME WHERE WE'RE AT THE RIGHT TIME AND PLACE TO HAVE THESE DISCUSSIONS, AND THEY ARE BECOMING VERY, VERY PRESSING.
AND SO I'M JUST GOING TO GIVE SORT OF MY MENTAL RECAP OF THE DAY TODAY, HOPEFULLY A GOOD WAY TO CLOSE OUT THE FIRST HALF DAY. WE STARTED WITH A REALLY GREAT KIND OF GROUNDING SESSION WHERE WE HEARD WHAT THE GOALS OF THE WORKSHOP ARE: TO EVENTUALLY DEVELOP A DRAFT GUIDANCE THAT COULD BE GIVEN TO THE NIH, THAT WOULD EVENTUALLY TURN INTO AN RFI FOR PUBLIC COMMENT, AND THEN HOPEFULLY BE DEPLOYED WIDELY ACROSS NIH GRANT RECIPIENTS IN THE FUTURE. JULIA GAVE US A GREAT TALK ON THE DEFINITIONS OF TRANSPARENCY AND SOME CHALLENGES OF THOSE DEFINITIONS. I THOUGHT THAT TALK WAS EXTREMELY ILLUMINATING, AND IT REALLY SPELLED OUT THE NUANCES OF SOME OF THE TERMINOLOGY. TINA GAVE AN AMAZING TALK ABOUT THE PROBLEMS WE ALL FACE AS, YOU KNOW, THESE ALGORITHMS AND MODELS, EVEN JUST TRADITIONAL STATISTICAL MODELS, HAVE BEEN USED OVER THE HISTORY OF MEDICINE, AND, YOU KNOW, THERE'S A BOOK CALLED "WEAPONS OF MATH DESTRUCTION" THAT SORT OF CAME TO MIND AS I HEARD HER TALK. AND IT BOILS DOWN, YOU KNOW, TO WHAT SOME OF THE CENTRAL ISSUES ARE, AND I THINK THAT GENERATED A LOT OF DISCUSSION IN THE BREAKOUT SESSIONS THAT WE HAD. I HAD AN OPPORTUNITY TO VISIT ALMOST ALL THE DIFFERENT BREAKOUT ROOMS, AND WHAT I WAS STRUCK BY WAS THE GREAT DIVERSITY OF WORK STYLES. THAT WAS SOMETHING I DIDN'T EXPECT. THERE WERE SOME ROOMS THAT HAD LIKE A FREE-FLOWING GROUP DISCUSSION AND THEN TRIED TO SYNTHESIZE THAT INFORMATION. THERE WERE OTHER GROUPS THAT TOOK A VERY, YOU KNOW, FLIP-CHART STYLE, LET'S WRITE DOWN OUR THOUGHTS AS WE GO.
THERE WAS EVEN CROWDSOURCING: THEY HAD ALL THE PARTICIPANTS IN THE ROOM WRITE DOWN THEIR IDEAS ON STICKIES INDEPENDENTLY AND THEN TRIED TO PULL THAT ALL TOGETHER. AND THEN WE SAW THE RESULTS OF THE LIVE GOOGLE SLIDE CREATION, WHICH IS A SKILL I DON'T HAVE: BEING ABLE TO, YOU KNOW, SYNTHESIZE WHAT'S BEING SAID AND PUT IT INTO BULLET POINTS ON A SLIDE. THERE WERE ALSO DIFFERENCES IN STRATEGIES. ONE GROUP TOOK A VERY DEPTH-FIRST SEARCH APPROACH, WHERE THEY REALLY ZOOMED IN ON ONE STAKEHOLDER AND THEN HAD AN IN-DEPTH DISCUSSION ABOUT WHAT THAT STAKEHOLDER'S NEEDS WOULD BE, AND THAT GAVE RISE TO OTHER STAKEHOLDERS THAT WERE ADJACENT TO THAT STAKEHOLDER. OTHER GROUPS TOOK A MORE PRINCIPLED, BREADTH-FIRST SEARCH, LAYING OUT THE LARGER FRAMEWORK AND THEN THINKING ABOUT EACH ONE OF THOSE INDIVIDUALLY. I HAVE TO SAY, I THINK THE MOST COMMON THEME I SAW IN THE BREAKOUTS WAS THAT EVERYONE RAN OUT OF TIME. IT WAS ALMOST LIKE MORE THAN HALF THE TIME WAS SPENT TRYING TO UNDERSTAND THE SCOPE AND DEFINITION OF WHAT THE GROUP WAS SUPPOSED TO TACKLE IN THE THREE DAYS THAT WE'RE HERE TOGETHER, AND THAT LEFT VERY LITTLE TIME AT THE END TO ACTUALLY TACKLE THE STAKEHOLDER LIST THAT YOU GENERATED. BUT IT WAS CLEAR THAT THERE WAS A VERY COMMON, YOU KNOW, LIST THAT EVERYBODY CAME UP WITH. SO EVEN THOUGH EVERYBODY TOOK DIFFERENT APPROACHES TO TACKLE THIS PROBLEM, THEY CAME UP WITH LISTS THAT WERE FOR THE MOST PART VERY SIMILAR TO ONE ANOTHER, AND SO I THINK THAT WILL BE REALLY INTERESTING.
I HOPE THAT THE BREAKOUT LEADS WERE LISTENING TO THE OTHER BREAKOUTS AND THEIR LISTS OF STAKEHOLDERS, BECAUSE, YOU KNOW, IT'S QUITE LIKELY THAT THERE'S A STAKEHOLDER THAT YOU SHOULD PROBABLY INCORPORATE IN YOUR DISCUSSIONS FOR TOMORROW. IF I COULD BREAK IT DOWN, IT SEEMED LIKE EVERYBODY WAS THINKING ABOUT THE WHOLE LIFE CYCLE PROCESS OF WHAT IT TAKES TO BUILD AND DEPLOY AN A.I. MODEL IN THIS SPACE: EVERYTHING FROM THE DATA GENERATION AND DATA CREATION, YOU KNOW, THE INPUTS OF THAT DATA FROM CONSENT ALL THE WAY TO THE HARMONIZATION OF THAT DATA, AND THEN TO THE MODEL BUILDERS, THE MODEL DEVELOPERS, AND THEN THE FOLKS WHO MAY NOT HAVE MODEL TRAINING EXPERTISE BUT ARE IN CHARGE OF DEPLOYING IT IN THE HEALTH CARE SYSTEM, AND FINALLY THE PATIENTS ACTUALLY AFFECTED BY THE OUTCOMES OF THESE MODELS. I WAS ALSO STRUCK BY THE META-LEVEL DISCUSSION AFTERWARDS. WE HEARD ABOUT THE ANTHROPOLOGISTS, WHO ARE STAKEHOLDERS I HAD NOT PREVIOUSLY THOUGHT ABOUT UNTIL THAT WAS BROUGHT UP, AND IT WAS ALMOST SO THAT IT COULD ENABLE, LIKE, AN M AND M (MORBIDITY AND MORTALITY) TYPE DISSECTION OF WHAT WENT WRONG IN THE FUTURE, BECAUSE INEVITABLY, NO MATTER ALL THE EFFORTS IN THIS ROOM, THERE ARE GOING TO BE MISTAKES THAT WILL HAPPEN. IN ORDER TO ENABLE A CAREFUL DISSECTION OF WHAT WENT WRONG AND FIX IT IN THE FUTURE, I THINK, YOU KNOW, IDENTIFYING THE NEEDS OF THOSE PEOPLE IS GOING TO BE REALLY CRITICAL. ALL RIGHT. SO, THAT WAS MY RECAP FOR THE DAY.
AND I AM VERY EXCITED TO THINK ABOUT WHAT'S GOING TO COME NEXT, AS YOU START TO THINK ABOUT THE GAPS FROM THE CURRENT STATE OF WHERE WE ARE WITH TRANSPARENCY. SO TOMORROW MORNING I'LL START WITH A TALK WHERE I TRY TO DO MY BEST TO EXPLAIN WHERE THINGS ARE IN TERMS OF DATA SHEETS AND MODEL CARDS, AND THEN YOU'LL GET INTO YOUR BREAKOUT SESSIONS, AND I PROMISE YOU'LL HAVE TIME, HOPEFULLY, TO CHEW ON THESE IDEAS, AND HOPEFULLY EVEN DURING THE EVENING YOU MIGHT HAVE ADDITIONAL THOUGHTS THAT YOU CAN BRING FORWARD TO YOUR GROUPS. SO, THANK YOU. AND LAURA, I DON'T KNOW IF YOU HAVE ANYTHING YOU WANT TO SAY.
[APPLAUSE]
>> THANK YOU TO AARON, ALL OUR SPEAKERS, AND ALL OF YOU FOR THE WORK YOU'VE BEEN DOING TODAY. I THINK IT'S PAYING ENORMOUS DIVIDENDS ALREADY; IT'S REALLY EXCITING. JUST A COUPLE OF LOGISTICS THINGS: WE START TOMORROW AT NINE, WITH COFFEE OUT FRONT BEFORE THAT. MANY OF YOU HAVE ORDERED LUNCH, WHICH WILL COME AND BE DELIVERED. IF YOU HAVEN'T ORDERED LUNCH, YOU MIGHT WANT TO THINK ABOUT HOW TO BRING SOMETHING TO EAT, BECAUSE WE ARE A LITTLE BIT OF A FOOD DESERT. COME TO US IF YOU HAVE QUESTIONS ABOUT HOW TO SOLVE THAT PROBLEM AND WE'LL GET CREATIVE WITH YOU IF YOU NEED US TO. OTHERWISE, IF YOU ORDERED LUNCH ALREADY, IT WILL JUST COME DURING THE LUNCH BREAK. HAVE A VERY NICE EVENING. REST UP. ENJOY EACH OTHER. AND RUMINATE ON SOME OF THIS, BECAUSE WE'LL BE PICKING IT UP TOMORROW. THANK YOU SO MUCH.
[APPLAUSE]
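FOR TOMORROW'S DATA SHEETS AND MODEL CARDS DISCUSSION, HERE IS A MINIMAL, HYPOTHETICAL SKETCH OF THE KIND OF STRUCTURED RECORD THOSE ARTIFACTS CAPTURE. THE FIELDS LOOSELY PARAPHRASE COMMON MODEL-CARD AND DATASHEET HEADINGS AND ARE NOT AN NIH TEMPLATE OR ANY OTHER EXISTING STANDARD.

    # A minimal, hypothetical model-card-style record (fields loosely paraphrase
    # common model card / datasheet headings; not an NIH or regulatory template).
    import json
    from dataclasses import dataclass, asdict, field

    @dataclass
    class MiniModelCard:
        model_name: str
        intended_use: str
        out_of_scope_use: str                 # where the model should NOT be used
        training_data_summary: str
        known_proxy_variables: list = field(default_factory=list)
        evaluation_populations: list = field(default_factory=list)
        limitations: str = ""
        contact: str = ""

    card = MiniModelCard(
        model_name="example-risk-model-v0",
        intended_use="research-only risk stratification in adult primary care",
        out_of_scope_use="pediatric populations; individual treatment decisions",
        training_data_summary="single-site EHR extract, 2015-2020 (hypothetical)",
        known_proxy_variables=["prior_year_cost (proxy for health care need)"],
        evaluation_populations=["overall", "by self-reported race", "by insurance type"],
        limitations="publication and access-to-care biases in the source data",
        contact="model-stewards@example.org",
    )
    print(json.dumps(asdict(card), indent=2))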