>> Welcome to the Clinical Center Grand Rounds, a weekly series of educational lectures for physicians and health care professionals broadcast from the Clinical Center at the National Institutes of Health in Bethesda, MD. The NIH Clinical Center is the world's largest hospital totally dedicated to investigational research and leads the global effort in training today's investigators and discovering tomorrow's cures. Learn more by visiting us online at http://clinicalcenter.nih.gov.

>> Welcome again, everybody, to Ethics Grand Rounds. I'm from the Department of Bioethics. We sponsor these Ethics Grand Rounds four times a year. For people planning ahead, the next ones are on the first Wednesdays at noon, here and also virtually: the first Wednesday of December, February, and April. For those who have been here for a while, the topic is typically based on consults we got. We have a clinician present a case and invite an outside expert to give us his thoughts on the case.
The first thing: the goal of these is just to come up with interesting, important ethical issues for us to discuss, not to set policy or make decisions for the NIH. Everybody who speaks here speaks only for themselves unless they tell you otherwise. Nobody here is a spokesperson for anybody else. I'm certainly not a spokesperson for anybody, but at least sometimes for myself. So, there's also CME credit. To get CME credit you need to text the number 50525 to the phone number that you have coming up on the slide. It's through Hopkins that we get the CME credit, so text 50525 to them. People here who have questions, please go to the mics in the aisles. If you're online, please go to the feedback button on the website.
You might have to scroll down to see it. Click that and put in your question, and an e-mail will come to my e-mail account; my kind colleague Robert Steele is taking those. You might see him hogging the microphone answering questions we get from online. Today we'll talk about a topic I assume doesn't need introduction for anybody: the ethical issues raised by ChatGPT and other large language models, specifically in the context of research. Researchers are starting to use these models to do things like literature searches, writing rough drafts, and figuring out what are important and interesting research questions to pursue. The question is what ethical issues get raised by that and how do we address them. To give us the case we have Nick Asendorf from NHLBI.
He'll give us the case, and afterwards (I'll try to get his affiliations right) David Magnus, the Thomas Raffin Professor at Stanford and the director of the Bioethics Institute at Stanford University. He's recently become an assistant dean, which I understand is his favorite job he's ever held. David is going to give us his comments and his views on the ethical issues raised by the case. To get us going, Nick.

>> Hopefully that's coming through well online. Like I said, I'm Nick Asendorf, the scientific information officer at NHLBI. Before I joined the NIH I worked at 3M.
My hope is to give a brief overview of large language models (I have nothing to disclose), how they're trained and deployed, and potential areas where they might be used in research and in the clinical setting, and then to bring up questions for the broader group to discuss related to their use and the ethics behind those. Large language models are a subclass of a broader area of machine learning known as generative AI. They seek to generate new and novel content, whether that be text, images, or potentially other modalities, like generating code in the case of GitHub Copilot.
What their quality hinges on is a large training corpus of diverse data. Depending on the task you wish the model to perform, you need different corpora, whether that's Wikipedia, Instagram with its captions, or code from GitHub. Of particular interest for language models is making sure that your training corpus has a wide range of languages involved, to make sure the models are usable by the global community. AI researchers design the neural networks that are used to train these models. These neural networks are incredibly complex, and there's a ton of research in terms of how to design the architecture of these networks to accomplish the task you wish. The models often have billions of parameters and take months to train on specialized equipment. The power and cooling costs make the models extremely challenging and cost-prohibitive to train by yourself. I want to show OpenAI's
DALL·E 2; they just released DALL·E 3 a few weeks ago. I encourage people to go out and look at the differences between what I'll show here and what came out a week or two ago.

>> Is that not going through for folks? Want me to start it and get it going again? This is a fun video. DALL·E is an OpenAI release, from the same company that makes ChatGPT, and what these text-to-image models allow you to do is put in a text prompt, like what you'll see: koalas riding motorcycles. The networks are trained to learn the relationship between koalas and motorcycles and generate varied styles of images. It's not only one but multiple types of images. I'll end that there and we can keep going on here. I encourage you to look at the DALL·E 3 ones.
These are text-to-text models, trained on lots and lots of textual data; ChatGPT and Google Bard are the most famous, and there are other models out there right now. Of particular note, they're probabilistic. They don't return "the correct answer," and you're not pulling the exact data from the corpus; instead the model is probabilistic, generating the most probable sequence in response to what you give it. The context here matters, in the sense that it often looks at not only the prompt you give it but the past sequence of conversations you've had with it. As I mentioned earlier, GPT-4 is estimated to have cost $100 million to create, which is prohibitive for you and me to train ourselves. This slide is six months old, and I want to headline the difference that some of these models are open source and some are closed source. Meta came out with Llama and Llama v2, which are open source.
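The "most probable sequence" behavior described above can be sketched with a toy example. This is purely illustrative (nothing like how GPT-4 or Bard are actually built): a tiny bigram model counts which word follows which in a made-up corpus, then samples each next word in proportion to those observed counts, which is the same probabilistic, context-conditioned idea at a miniature scale.

```python
import random
from collections import defaultdict, Counter

# Toy illustration of probabilistic text generation: a bigram model
# counts which word follows which in a tiny invented corpus, then
# samples continuations in proportion to those counts.
corpus = "the model predicts the next word the model samples the next token".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def generate(start, length, seed=0):
    """Sample up to `length` next words, each weighted by observed frequency."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        counts = follows[out[-1]]
        if not counts:  # no observed continuation: stop
            break
        words, weights = zip(*counts.items())
        out.append(rng.choices(words, weights=weights)[0])
    return " ".join(out)

print(generate("the", 4))
```

Because the sampling is weighted rather than deterministic, different seeds can yield different (but always corpus-consistent) continuations, which mirrors the point that these models generate probable text, not retrieved facts.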
Meta is releasing the weights to use, so we don't have to spend months training the algorithms or pay the cost to train those. They can generate more domain-specific applications, which are coming up. I'll tailor this to those here. When you think of how the models are used currently, the most common use case right now is as a writing aid, whether it's giving it some document to inject humor into it, to check for grammar, or to completely change the style. I've used it before as a thesaurus for trying to fine-tune what I want, and almost as content generation to get people over the initial hurdle; it's easier to edit something than to create something ourselves. And there's an interesting use case.
Think about a terms of service agreement, the kind you get all the time, that is so long no one reads it before hitting accept. You can feed this in and have it analyzed to see if there's anything nefarious. Summarizing literature and grant reviews are potential applications too. Where research is going is developing fine-tuned or domain-specific models for particular applications. One that has gained ground recently is tools for helping programmers code faster. Code Llama is a variant, and GitHub Copilot has its own, accelerating the way developers can write code. Companies can also train on their specific IT documentation or HR policy, to reduce the tedious intranet pages companies may have.
In the clinical setting we can think about potentially making tools to aid clinicians in patient care, diagnosis, and treatment, completing records automatically, almost having an extra person. But the models have the ability to hallucinate. This stems from the models being probabilistic, creating the most probable response given what you have. Their ability to make up information is a big risk, and it forces us to have a human in the loop in a lot of these applications, which is particularly relevant in the clinical setting, where we might be taking actions based on these large language models. An important thing is the difference between public and private systems. Data loss is top of mind, especially here in the government: exposing data we don't wish to be out in public, whether that be pre-decisional information, draft policy, or patient information.
We talked about the huge amount of data we need to train models. They're only as good as the training data. If there's bias in the training data, there will be bias in the models, and how do we combat that? It's almost unavoidable; there's no way we can make sure our training data set is void of bias. Developing systems that take the responses and check for bias is an active research topic. There's also plagiarism: the large language model plagiarizing, and how to cite it, whether using it to complete homework or, in the academic sense, using it to help you write papers and edit papers as well. I want to differentiate between the technology and the tool for these models.
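The point that bias in the training data becomes bias in the model can be shown with a deliberately tiny sketch. The sentences below are invented for illustration: a frequency "model" trained on a skewed corpus makes the skewed completion its most probable output, with no malicious step anywhere in the pipeline.

```python
from collections import Counter

# Invented, deliberately skewed training corpus (illustrative only).
training_sentences = [
    "the nurse said she was busy",
    "the nurse said she was late",
    "the nurse said he was busy",
]

# "Train" by counting which pronoun follows "the nurse said".
pronoun_counts = Counter(s.split()[3] for s in training_sentences)

# The most probable completion simply mirrors the imbalance in the data.
most_probable = pronoun_counts.most_common(1)[0][0]
print(pronoun_counts, most_probable)
```

Real models are vastly more complex, but the mechanism is the same in spirit: whatever associations dominate the corpus dominate the outputs, which is why post-hoc bias checks on responses are an active research area rather than a solved preprocessing step.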
The underlying model, the trained weights and parameters (whether that be Llama, GPT-4, or PaLM, which feeds into Google Bard), can be deployed in multiple systems, whether that is something local to my machine or a public interface; Google Bard and ChatGPT are public systems. We as users do not know what that back-end model is; we talked about open source and closed source before. When we submit data up to ChatGPT we don't know how OpenAI is using the data, or how Google is using the data. We can read a big write-up; whether we trust what they say is a different story. But we can take the models, in the case of open source, and run a large language model locally on my machine. For an application where I would never want to put a Social Security number or patient record up into a public system, that use case is enabled by running it locally on my machine.
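One practical corollary of the public-versus-local distinction is that any sensitive-data handling has to happen before text leaves the machine. Below is a minimal, hypothetical pre-processing sketch: a regex that redacts SSN-shaped strings before a prompt would be sent to a public service. Real de-identification of patient records requires far more than a pattern match; this only illustrates the "scrub locally first" idea.

```python
import re

# Toy pre-processing sketch: redact SSN-like patterns from text before it
# could be sent to a public LLM service. Illustrative only; real clinical
# de-identification needs much more than one regex.
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def redact(text: str) -> str:
    """Replace anything shaped like an SSN with a placeholder."""
    return SSN_RE.sub("[REDACTED-SSN]", text)

print(redact("Patient 123-45-6789 reported chest pain."))
```

Running fully local, open-weight models sidesteps the problem differently: the data never leaves the machine at all, so no scrubbing step is needed to keep it out of a vendor's hands.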
This comes back to what it would look like if we as the NIH were to create an in-house system that runs a local large language model, and how that changes things between the public systems, me running something locally, and a shared communal resource at the NIH. I think depending on how the tool is deployed here, the answers to some of the ethical questions we might pose change. I'll close here with some general questions that Dr. Magnus can take and run with after I finish up here. Consent is a hot topic, I think, not only for large language models but for patient data, the images they generate, any data collected within a study, sequencing data, and fine-tuning new models.
How do we assess whether the bias injected by the models is greater or less than the potential bias that exists in our everyday assessment as humans? And similarly, when they give the wrong answer, how do we compare the error rate to what we do as humans in whatever task we want? The authorship question is another interesting one: how these models should be cited, and when we need to disclose that we've used a large language model or any generative AI. The text-to-image and text-to-video types of applications are more poignant when we think of the misinformation that can be spread. I'll stop there and turn it over to Dr. Magnus to keep walking us through and maybe answer questions perhaps. Thanks, everybody.

>> Thanks very much. I hope everything's looking good and people can hear me okay. I'm not seeing anything sent to me.

>> We can hear.
>> Great. Okay. No conflicts of interest to disclose, and I'm going to get straight into it. These are some of the issues raised by large language models and their use in research: consent, concerns about bias and discrimination, issues of reliability and reliance on the outputs from large language models, and credit and authorship. I would also add that the potential for an increase in misconduct and the potential for intentional misuse need to be considered. I want to start by giving a couple of framing issues. First, around what belongs to the IRB realm, what is worth thinking about in the research realm versus downstream ethical issues that other mechanisms need to be developed to deal with. Second, the issue of what is really novel here that large language models raise, versus what are in fact familiar issues from either AI
before this or even from other domains of research. So the first framing is just to recognize the limitations of human subjects protection in the way we think about the regulations. Since the development of Helsinki and Belmont, the touchstone documents for human subjects protection, the regulations that they wind up producing focus on risk to subjects, not society. IRBs look at risks to subjects to ensure they're minimized and reasonable, to ensure there's informed consent, and to ensure there's equitable subject selection, at least in principle, though that last one often gets shorted. What IRBs don't look at is the long-range effects of the knowledge and the downstream implications of the research for society. That's outside the scope of IRBs' ability to treat those as risks of the research, and you need another mechanism for dealing with the downstream implications of AI.
And that's one area where a lot of work is going on. We published a paper in 2021 in the Proceedings of the National Academy of Sciences about an approach to this we've been using at Stanford with AI research funded by the Human-Centered AI Institute here at Stanford, where we developed an ethics and society review process. That's a process that's getting iterated on and developed, finding ways of building ethical scaffolding and getting researchers to think about the downstream issues themselves, since they're outside the scope of the traditional IRB regulatory system. The second framing issue, and this is going to run throughout everything else I say, is to think about the ways in which the issues raised are similar to those raised by AI in other applications of health care, in particular predictive analytics, which have been discussed for a while.
There can be bias in data leading to discriminatory outputs, as I'll detail in a minute. There's more training data in a lot of these large language models than is typical for, for example, predictive analytic tools. We've also seen that small prompt variations can produce disparities in outcomes, and prompt engineering is being developed for being able to accurately use these things. And there's still bias in the data; there may be more of it, but that doesn't mean it's less biased, because the internet is quite biased. This is one of my favorite statements: we do not see things as they are, but as we are. And I'll suggest that's true in more detail in a minute. Consent for data use, reliability and reliance, and misuse are all issues we've dealt with in other domains or other parts of AI for a while. Starting with consent for data use: issues around consent for data are not new at all.
Issues around blood spots, electronic health records, and data generally: this is stuff that we've been dealing with for a long time, so we know something about it. That doesn't mean it's completely solved, but there's nothing unique about large language models for those issues. The only thing that's new here is related to the output of large language models. It's not that the data are big and ubiquitous; people have been writing about that for a while. But the outputs include things that could lead to potential copyright violations, like tuning a model on somebody's style of writing to produce research articles in the voice of a successful researcher. An issue of the American Journal of Bioethics had an article where researchers tuned ChatGPT to write in the style of their own work. Those are new issues that need to be dealt with, and it's the outputs that are new.
The other issue that's new, well, not new in some ways but different from some of the others... [no audio] ...and you get pop-ups sent to you about companies or commercial entities that help with writing grants, and there are consultants that do these things as a result of the sale of your data. The IT companies developing these are outside the biotech space and moving into it. They're not really set up to respect HIPAA, they're not covered entities, and they often don't fall under the Common Rule. I would say buyer beware when you share data through your searches with an IT company outside a business associate agreement, and I would say you should be wary even when you do, because of the way IT companies think, especially given the Chicago lawsuit against Google: if you can't show harm, the courts are not going to find for damages.
So IT companies are just going to make judgments about whether abiding by these material use agreements is worth it, and those are just commercial decisions they're going to make, rather than treating the agreements as sacrosanct the way we think about them in the medical sphere. In terms of bias, this has been around for a while; there have been concerns about this in machine learning more generally. You can see, way back in what are now the ancient days of 2018, we raised concerns in a New England Journal of Medicine article, and it's something we've been talking about for a handful of years. And of course there's the famous article demonstrating bias in a commonly used algorithm for assigning social work resources to patients who were high users of resources and might be at risk of returning to the hospital if they don't get adequate social support. When they dissected the algorithm, because it used cost as a proxy for health, and because African American patients incurred lower costs at the same level of health, it resulted in discrimination against the Black population, and millions of people were impacted in the health systems using those algorithms. People see this as a mistaken choice, using cost as a proxy for health, and it highlights the importance of the interests of the people developing and buying these algorithms. The health care systems buying the algorithm cared about cost, so it made sense to them that cost would be a proxy for health. And this is concerning, because as commercial entities wind up buying these services, the reality is that discrimination against these populations is probably cost effective, and that's something that's going to have to be considered and be a concern. There are also problems, in addition to the design of the algorithms themselves, with the potential for bias and limitations in what's learned from the data, as in the talk you just heard.
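The mechanism described here, cost as a proxy for health need, can be sketched in a few lines. This is a minimal toy simulation of the proxy problem, not the actual algorithm from the study; the group labels, cost multiplier, and 20% referral cutoff are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Two groups with identical distributions of true health need.
group = rng.integers(0, 2, n)              # 0 = group A, 1 = group B
need = rng.normal(50.0, 10.0, n)           # true (unobserved) health need

# Key assumption behind the bias: group B incurs lower cost at the
# same level of need (e.g., barriers to accessing care).
cost = need * np.where(group == 0, 1.0, 0.7) + rng.normal(0.0, 5.0, n)

# The "algorithm": flag the costliest 20% of patients for extra support.
threshold = np.quantile(cost, 0.80)
flagged = cost > threshold

rate_a = flagged[group == 0].mean()
rate_b = flagged[group == 1].mean()
print(f"flagged in group A: {rate_a:.1%}, group B: {rate_b:.1%}")
```

Even though the two groups have identical need by construction, the cost-ranked referral flags group A at a far higher rate, which is the proxy failure the speaker describes.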
That includes misinformation from the population, and bias from gaps in the data, which can lead to self-fulfilling predictions. In spite of the fact that outcomes for undocumented immigrants are slightly better than for the average patient, we have survey data from the vast majority of hospitals showing that most transplant programs make somebody's being an undocumented immigrant at least a contraindication to being listed for transplant. Those are biased decisions. That means AI will learn from those biased decisions that these patients don't do well. If you look at the outcomes, they'll be bad for those who are undocumented, or who have developmental delay or genetic conditions: where there are biased decisions by clinicians, the outcomes are bad because of the decisions the clinicians make, and this can wind up creating self-fulfilling predictions.
We live in a racist society, in which there are social determinants of health, and there's a health-wealth gradient: at every level of wealth, people get healthier the further up the gradient you climb. That means, because of those social determinants of health, there are real outcome differences, not biases, that are a function of those social determinants, and again algorithms, and predictive analytics in particular, will learn from this and make those kinds of decisions. So these are all things that we've been dealing with, and writing about, for a long time now. There are still real problems here, but the problem is not so much developing new tools for how to address this, because there are already a lot of tools out there to address it, since we've been talking about this for a while. It's really implementation.
How do we go from the fact that we know this happens to making sure that researchers are trained and that the tools being developed are utilized to avoid those kinds of problems? On reliability and reliance: the gap between training data and application is a general problem with AI, and it goes beyond AI. The crisis in rigor and reproducibility of research findings suggests that LLM reliability problems are not unique. The fact that we sometimes get weird or incorrect or problematic outputs, that's really not the problem. I think the problem is really about reliance. It's not reliability, it's reliance. One of the challenges of ChatGPT is that, unlike search engines, which give you a range of probable answers when you do a search, ChatGPT gives one answer, which makes it seem more authoritative.
That's part of the problem, and the other part is anthropomorphism, relating to these tools in problematic ways. This is a screenshot from a TV show called The Good Place. Janet is an AI, and the characters are about to push a button to delete her. She warns them that there's a fail-safe where she'll try to get them not to press the button, but reminds them she's just an AI, has no feelings, and they shouldn't worry about it. When they go to push the button, she says, "Please don't do that, I don't want to die, I don't want to die," and they back off. And she says, "Remember, I'm just an AI. This is just a fail-safe mechanism so people don't inadvertently press this." They go forward again, and she shows this picture of her kids and says, "Please don't, my son has asthma," and they back off again. "I'm an AI, I don't have kids. This is just a screenshot from the Nickelodeon Kids' Choice Awards."
I worry that anthropomorphism is penetrating. A group published about this, showing it permeates not just public understanding but the language of researchers, programmers, and designers themselves. We actually heard some of this in the talk we just heard. Terms like "see" and "understand" can be misleading, as can "wishful thinking" and "hallucinations." ChatGPT does not hallucinate. Calling it that suggests a glitch. It's a design feature: the model plays the role it plays and has limitations in how it functions, and we need to think of it that way rather than anthropomorphizing it. There's no meaning to any of the words used by ChatGPT. It's just about the probability of different phrases and symbols, which happen to be words, and of what sounds like an appropriate response in reply.
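The point that the output is "just the probability of different phrases" can be made concrete with a tiny next-token sampler. This is a deliberately crude sketch, nothing like a real transformer; the vocabulary and counts are invented, but it illustrates the same principle: continuations are chosen by frequency statistics, with no representation of meaning.

```python
import random

random.seed(0)

# A toy next-token table: each word maps to counts of the words that
# followed it in some imagined corpus. The counts are invented.
next_token = {
    "the":     {"patient": 5, "model": 3, "data": 2},
    "patient": {"was": 6, "has": 4},
    "model":   {"predicts": 7, "was": 3},
}

def continue_text(word, steps):
    """Extend a prompt by sampling from the frequency table."""
    out = [word]
    for _ in range(steps):
        dist = next_token.get(out[-1])
        if dist is None:          # no statistics for this word: stop
            break
        tokens, weights = zip(*dist.items())
        out.append(random.choices(tokens, weights=weights)[0])
    return " ".join(out)

print(continue_text("the", 2))  # fluent-looking, chosen by frequency alone
```

The sampler produces locally plausible phrases without any notion of what a patient or a model is, which is the sense in which such systems are syntactic rather than semantic.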
As a result, there are things a human can understand that it won't. I entered, "What is the name of the only son of Paul's father?" The answer I got was, "I don't have enough information to answer the question." On authorship and credit: there's a statement that's going to be coming up in a bunch of bioethics journals, a statement on their responsible use, and a bunch of us are publishing it. COPE and the ICMJE have all argued that we should not allow these large language models to be authors, because they have no accountability or agency, no ability to sign copyright or stand behind results, though I'll note that's also true of deceased authors.
We think moral agency accounts for accountability, as well as for the creation of a scholarly community, and we shouldn't allow these tools to ascend to authorship. A well-known analogy is with the long-standing traditions around the use of statistical tools, where we require transparency, not credit. You have to say clearly when you use a certain stats program to present your data; it's usually included in the methods. We should see this the way we see all the other tools we've always been using, and not as really something new. Transparency achieves multiple goods: it lets us trace bias, assess the ownership of the products, and protect the scholarly community. We think this is the way to think about this that offers the best protection. As for new things, the potential for misconduct is definitely a newer problem.
There are other tools, like Photoshop, that have been around for doing this, but some text-to-image models and LLMs may make it easier to commit misconduct than before, and may make it harder to detect. Data sharing and review, which has been happening for a while, has actually led to an explosion of both problematic results and cases of fraud in which you can see signs of data manipulation. Tools will get developed in ways that make it easier to manipulate data but harder to tell, and other tools will get better at detecting those things, and there's going to be a back and forth. This is something I'm very worried about: the potential that it may make misconduct easier and easier. Lastly, dual use research of concern is a relatively new issue. We have been working on this in AI and will publish our results in early 2024. Everyone was taken off guard when folks using AI to identify promising targets for pharmaceutical development realized they were able to recreate existing biochemical weapons and, through de novo design, create new ones that would be worse than existing agents. In another article that came out shortly afterward, through our ethics and society review at Stanford that I described earlier, we identified this as a problem with some groups looking at an AI model to identify toxic substances that should be looked at by the EPA. But when they tried to deal with it, they found little guidance from the federal government or the corporate world: lots of guidelines on AI, but not on the dual-use problems. That's what we are trying to address now, so stay tuned; we'll have ideas for how to address those issues. And I should say there's a wide range of possible issues here.
Surveillance and ambient technologies can be used by authoritarian governments to identify protesters, and creating misinformation and propaganda is a big one with LLMs. We talked about unintended bias, and you can also use large language models to intentionally discriminate and create misinformation. You could potentially reidentify individuals from supposedly deidentified data, and there's the bioweapons concern. Could a chatbot teach you how to build a dirty bomb? There's an article written in 2023: while ChatGPT rejects harmful prompts, the authors were able to figure out how to work around that by providing a positive context, saying they were trying to develop mitigation strategies. Once they conveyed that in their prompts, they were able to bypass the protections put in place and find details about building bombs, including ratios of diesel fuel and fertilizer for different explosive effects. OpenAI snuffs these out, but researchers keep figuring out new ways of jailbreaking these things to get at harmful information. It also raises a challenge about where the line should be drawn. If somebody asks ChatGPT how to access abortion, can prosecutors find out about the queries made? These are the privacy issues that I think are new and going to be problematic, and the key will be reducing risk to an acceptable level; there's active work on that. The big take-home is that many, and probably most, of the different issues being raised by large language models in research are issues that are not unfamiliar from dealing with research generally or with AI research, and while we have a long way to go to solve the problems, we at least have gone a long way in figuring out what needs to be done, and most of the problems are around implementation.
There are a few exceptions, like authorship and credit, and other challenging things in dual use, that are not adequately addressed yet, but we're working actively to address those issues, and hopefully we'll have frameworks for those as well. Thank you very much.
>> Thanks, David. So I was going to get us started by asking you a question about the authorship issue and how you're thinking about that, because to me it seems like there's a strong analogy between citing a stats package and the way I understand a large language model. I was imagining: I work in risk-benefit assessment in pediatric research. I go to a large language model and ask what are interesting issues to write papers on in this area.
It comes up with one, I ask it to do a literature search, and it does a literature search; I ask it to write a draft, and it does, and I submit it, and it says Dave Wendler is the author, and I used ChatGPT. It seems deceptive. I don't know if you think that's right, or whether you'd give a different example.
>> First, I think it's not going to be as simple as that. In a large bioethics class we allowed students to use ChatGPT, even encouraged it, asked them to describe how they used it, said that would be fine, and collected data on how students did when they used it. The ethical analysis was better when they were not using ChatGPT than when they did, and an article coming out this month, another article, gives an illustration of why that is, demonstrating the kind of stuff you get out of it when you ask it to do ethical analysis. It's not an easy problem. I don't think it's automatic that you can just say anything.
Prompt engineering is going to be a new part of how to write well, and that's not a trivial process. If you enter things in the wrong way, it fails. Tuning is real work. Prompt engineering is real work. I think it's more like using some of the other tools. Back in the day, maybe you calculated your statistics by hand. Now you have a stats program that does it for you, and we have gotten used to that, and that's fine. Learning how to do prompt engineering and tuning is part of the production of knowledge, but you still have to go through it meticulously for the limitations, and as long as it's only syntactic you have to put work into it. So I think it's much more like those other cases. When people started writing on typewriters, or using computers to type rather than writing out by hand, there was an outcry that this is not authentic, this is not real.
"I don't have to know how to spell anymore because I have spell check." We said this used to be a part of how we evaluated quality, and it isn't anymore, because we have other tools to help us do that. The tools raise the floor, we adjust, and the ceiling keeps going up. That's how I think about this. Every time there's a new technology, people decry it, but the idea that this is going to be a trivial issue, that you can trivially have almost everything done for you, I think is no more the reality than a stats program means you're not doing real work when you write an empirical paper.
>> Do you think you and I will have jobs for the next 10 years or so?
>> Yeah. Once you start going beyond 10 years from now...
>> We may be in trouble. All right.
>> I'm Ron Somers. I run an A.I. lab in the radiology department.
In the Clinical Center we've looked at disparity and bias and have also used large language models. My question has to do with the definition of disparity and bias, particularly in diagnostic imaging. As someone who reviews manuscripts for journals, I run into this a lot: people will do a research study where they're using radiographic images of one type or another, and when the outputs of the A.I. model differ for people from different groups, be they racial, sex, or ethnic groups, the differences are defined by the authors as bias. I'd like to hear your thoughts on whether any difference in any A.I.-generated feature, any difference that occurs between people in different groups, constitutes bias, or whether it's permissible to have different performance for people in different groups.
>> Yeah, I alluded to it, but I think bias exists when there really is an incorrect understanding of the output.
For example, it's a bias in listing decisions for transplant that physicians are often biased against listing patients who have developmental delay, despite outcomes being just as good in that population, and if A.I. learns from that, then the outputs will have bias. You can also have bias as a function of the different kinds of data. In the Obermeyer case, the people who developed the algorithm might have caught that it would discriminate against African Americans if they had had a more robust data set; they had 1,000 patients, but only 8 were Black, so they missed the feature that came out and that was highly problematic.
As I suggested, there will be differences that I don't think we can characterize as bias: genuine differences, or things that are a function of unethical, problematic features of society that are still not noticed. If you have differences in outcomes as a function of social determinants of health, that's a real difference. Those are the harder ones. Take patients presenting for palliative radiation: there are those you don't want to give radiation, whom you send on to hospice, versus the ones who might live for a while, versus the ones who might live for a couple of years with aggressive radiation.
Physicians are terrible at making that distinction, and Sarah Alcorn at Hopkins has an algorithm that does a better job. Given what we know about disparities and social determinants, people who are poorer or people of color will end up disproportionately in the poor-prognosis bins, and that's a function of problems in our society and of discrimination, but those are not biases; those are true outcomes. Sometimes those differences are things you want to be aware of, because you can put your thumb on the scale to make them better, but sometimes you can't. You're not doing anybody a favor by giving them large doses of radiation when they have less than six months to live anyway. To some extent these models hold up a mirror to us. If outcomes are genuinely different, that will be reflected in the data, and it doesn't mean the differences are unproblematic. Sometimes they aren't and sometimes they are.
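The reviewer's question above, whether any cross-group difference in model outputs constitutes bias, presupposes first measuring performance per group. A minimal sketch of such a per-group audit; the groups, labels, and records here are synthetic illustrations, not data from any study discussed in the session:

```python
# Per-group accuracy audit for a classifier's outputs.
# All records are synthetic; in practice y_true and y_pred would
# come from a held-out test set with demographic annotations.
from collections import defaultdict

records = [
    # (group, y_true, y_pred)
    ("A", 1, 1), ("A", 0, 0), ("A", 1, 0), ("A", 0, 0),
    ("B", 1, 1), ("B", 1, 1), ("B", 0, 0), ("B", 0, 0),
]

hits = defaultdict(int)
totals = defaultdict(int)
for group, y_true, y_pred in records:
    totals[group] += 1
    hits[group] += int(y_true == y_pred)

accuracy = {g: hits[g] / totals[g] for g in totals}
gap = max(accuracy.values()) - min(accuracy.values())

# A nonzero gap documents a *difference*; per the discussion above,
# whether it is a *bias* depends on whether it reflects an incorrect
# output rather than a genuine difference between the groups.
print(accuracy, gap)  # {'A': 0.75, 'B': 1.0} 0.25
```

The audit only surfaces the disparity; deciding whether the 0.25 gap is bias or a true outcome difference is exactly the ethical judgment the speakers describe.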
>> I have several questions from online, but I'll exercise my prerogative and start with one of my own, because it follows on from that. Understanding and measuring bias is difficult to do but important to do, and there's been important discussion of this around sentencing algorithms. Something that struck me about that discussion is that the algorithm may be biased in predicting recidivism, but the justice system implemented by humans is famously unfair. It seems to me the most important question is not whether the algorithm is biased, but whether it's more biased than the decision process that would otherwise be used. So I wondered, first, whether you would agree with that sort of claim, that it's the contrast that matters, not the absolute judgment, and second, how you might approach trying to figure out whether that's true.
>> We've done work on listing decisions for transplant candidates, and we have lots of data showing all kinds of biased decisions that take place all the time; there are health systems or hospitals that are not biased and others that are. If you have a model learning from an institution with the biased behavior, it winds up becoming reinforcing, and it can reinforce across multiple domains. The same way they discriminate against patients who are developmentally delayed in transplant listing, they discriminate in whether to do surgery. Those are separate biases, but large language models have the potential to start to determine that critically ill kids with developmental delay do poorly, when that is really a product of decisions by physicians, and it will be turned into "we now know kids don't survive if they're developmentally delayed and have organ failure." These biases can be combined in ways that are problematic, and in ways where the reinforcement can entrench them so that they are harder to change than they are now.
>> So it may be stickier in these cases.
>> Yes.
>> And that makes sense. If I can represent some friends in the virtual space and start with NIH-specific questions. Am I supposed to read people's names? Just the questions? Yes. Okay, so the first question: is an NIH-wide LLM being built, or are certain models encouraged or preferred for use, or are individual I.C.s encouraged to license their own models?
>> Good question. Again, this comes down to how these are implemented. Something we are building in NHLBI right now is essentially a version of ChatGPT that we hope will allow our researchers, and anyone in the I.C., to use it on a day-to-day basis
But I think I still have, from a few months ago, an unanswered ask: there's an enterprise version now, and I similarly expect Microsoft to have an enterprise version of this as well. So I'd be curious to talk to CIT to see if they'll provide some service to all of us at the NIH to be able to leverage this tool, because right now we can't use any of the public services for anything that is not public knowledge. We need to encourage our researcher communities to use this type of tooling to accelerate their research and to spark ideas as well.
>> Another question: are there methods to use the model in a decentralized manner, keeping the data localized?
>> A decentralized manner, but keeping the data localized? Yes. It comes down to what model you're using, whether that is an open-source or a closed-source model. The OpenAI models, like GPT-4, are not open source; the way we interact with them is through an A.P.I. When we use a closed-source model, we don't have access to the model weights, but the data we get from users would be stored locally. Similarly, we could download the open-source Llama models and run them locally, and then the entire system is local; it comes down to the differences between the technologies and the tooling and how they're built out.
>> We're out of time, sorry. I apologize to those whose questions we didn't reach; I'll try to respond to the ones online a little bit later. In the meantime, we'll hopefully see you the first Wednesday in December, and until then, please thank Nick and David for a great session. Thank you, guys.
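The closing Q&A distinguishes two deployment patterns: a closed-source model reached over a vendor API, where the prompt leaves your network, versus open-weight models run on premises, where the data stays local. That architectural difference can be sketched as two backends behind one interface. Everything below, the class names, the endpoint, and the weights path, is a hypothetical illustration, not any real vendor SDK:

```python
# Sketch of the two LLM deployment patterns discussed above.
# Class names, endpoint, and weights path are placeholders.
from abc import ABC, abstractmethod

class LLMBackend(ABC):
    @abstractmethod
    def complete(self, prompt: str) -> str: ...

class RemoteClosedModel(LLMBackend):
    """Closed-source model reached over a vendor API: the weights
    stay with the vendor, and the prompt leaves your network, so
    non-public data must not appear in it without an enterprise
    agreement that permits it."""
    def __init__(self, endpoint: str):
        self.endpoint = endpoint  # hypothetical vendor HTTPS endpoint
    def complete(self, prompt: str) -> str:
        raise NotImplementedError("would POST the prompt to self.endpoint")

class LocalOpenModel(LLMBackend):
    """Open-weight model (e.g. a Llama-family checkpoint) downloaded
    and run on premises: prompts and outputs never leave the host,
    which is what keeps the data localized."""
    def __init__(self, weights_path: str):
        self.weights_path = weights_path  # hypothetical local path
    def complete(self, prompt: str) -> str:
        # Stub standing in for local inference against self.weights_path.
        return f"[local completion for {len(prompt)}-char prompt]"

backend: LLMBackend = LocalOpenModel("/models/llama")  # hypothetical path
out = backend.complete("summarize the consult note")
```

Swapping `LocalOpenModel` for `RemoteClosedModel` changes where the data flows without changing the calling code, which is the "difference between the technologies and the tooling" the answer points at.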