>> ...the fourth annual NLM lecture. The Office of Strategic Initiatives organized this series to raise awareness around societal and ethical implications in the conduct of research. It's our hope these will fuel important conversations around NLM, NIH, and the broader biomedical research community.

I'm pleased today to introduce our guest lecturer, Meredith Broussard, associate professor at New York University's Arthur L. Carter Journalism Institute, author of "More Than a Glitch," which came out last year, and of the award-winning 2018 book "Artificial Unintelligence: How Computers Misunderstand the World." Her research on artificial intelligence and investigative reporting focuses on A.I. ethics and using data analysis for social good. A former features editor at the "Philadelphia Inquirer," she has worked as a software developer at AT&T Bell Labs and the M.I.T. Media Lab, and her feature stories and essays have appeared in the "New York Times," "The Atlantic," Slate, and Vox.

Today's lecture will help us think through concrete strategies to mitigate bias in our approach to developing artificial intelligence technologies. Her insights come at a critical point: in the last two years, discussions around trustworthy A.I. have taken off. We have powerful guiding documents, such as the Blueprint for an A.I. Bill of Rights from the White House Office of Science and Technology Policy in October of 2022, and the A.I. Risk Management Framework published by the National Institute of Standards and Technology, NIST, last year. We also have very clear directives on advancing a comprehensive whole-of-government strategy, including an Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence, signed by the President on October 30, 2023.
These efforts are dedicated to unlocking the potential of artificial intelligence to solve some of our most pressing challenges and accelerate scientific discovery, while protecting against the technology's potential risks. At NIH, that means working to mitigate bias, preserve privacy, and promote ethical approaches to integrating A.I. into biomedical research. This will serve to advance equitable health outcomes for communities across the nation. As the world's largest biomedical research library, with a commitment to preserving and protecting the public trust, NLM plays an important role in realizing this goal and affirming this commitment. We house and make accessible valuable data resources that can be harnessed to drive discovery. We're a leader in the health data standards development that enables that discovery, and we're a driver of data science through our extramural and intramural research programs.
With this framing in mind, I'm looking forward to hearing her walk us through examples of the potential consequences of developing algorithms without careful consideration of potential biases, even when these technologies are developed with the best intentions. I'm also eager to hear her insights on practical solutions and to learn more about her ideas for designing better systems for more equitable outcomes. This lecture will be followed by a question-and-answer portion. If you have any questions, make your way to a microphone that will be provided, if you're with us in person. If you're joining online, use the Ask a Question feature on VideoCast. Please join me now in welcoming Ms. Broussard.

>> It's great to be with you. Thank you for having me. So I'm going to talk today about some ideas from my most recent book, "More Than a Glitch: Confronting Race, Gender, and Ability Bias in Tech."
You're the kickoff stop on the paperback tour, because the paperback is just releasing. Thank you for that. And specifically, I want to talk a little bit about cancer. I want to talk about how looking at cancer can help us understand algorithmic bias. I'm going to do that by telling you a story about cancer. But first I'm going to start by talking about artificial intelligence. Something that I think is really important to emphasize when we're talking about A.I. with the general public is that A.I. is not magic, and it is not the Hollywood vision. Okay? Because everybody you talk to -- well, every non-A.I. scientist -- actually, everybody you talk to is thinking about the Terminator. Or they are thinking about "Star Wars" or "Star Trek," cool Hollywood portrayals of A.I., because of a specific way our brains work.
Our brains are better at recalling stories than facts and specifics. So we always go to Hollywood first. Hollywood tells amazing stories, so the Terminator is stuck in our brains. I will admit to having spent a lot of time thinking about the Terminator myself. But it's not real. We need to emphasize, when we're talking to the general public, that there's lots of stuff that's imaginary, and that what's real about A.I. is complicated and beautiful. Math is great, but it is not going to rise up and take over. There's a lot of anxiety about A.I. takeovers and robot apocalypses out there, and I just like to get that out of the way, because it allows us to have deeper conversations about artificial intelligence, which I think is really important. And specifically, the conversation that I'm really interested in having is about bias in A.I. and the way A.I. discriminates.
So, I really like an idea that comes from Ruha Benjamin's "Race After Technology": the idea that A.I. systems, automated systems, discriminate by default. For a long time there's been this opposing idea that I call technochauvinism. What I would argue is that we should think about using the right tool for the task. Because sometimes the right tool for the task is undoubtedly a computer, and sometimes it's something simple, like a book in the hands of a child sitting on a parent's lap. One is not inherently better. But the technochauvinist perspective tells us that A.I. or computational solutions are objective or unbiased or superior. We can push back and let that idea go, and then certain things become clear: it becomes clear that the problems of the past are reflected in the data that we use to train A.I. systems.
So when we make a machine learning system, what we do is take a whole bunch of data, usually from the internet, dump it in the computer, and the computer makes a model. The model shows the mathematical patterns in the data, and with it we can do all kinds of cool stuff. We can use that model to make predictions or decisions, or to generate new text or images or audio or video or whatever. But the problems of the past are things like discrimination: racism, sexism, ableism, structural inequality, which unfortunately all occur in the world. We do not live in a perfect world. All of these patterns of discrimination are reflected in the data we use to train A.I. systems. How does this manifest? Well, one example comes from The Markup, which is a terrific algorithmic accountability news organization; I highly recommend reading them if you're not reading them already.
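The pipeline described here -- historical data in, patterns out -- can be sketched in a few lines. Everything below is invented for illustration (the group labels, the approval counts, and the deliberately trivial "model"); the point is only that a model fit on biased past decisions reproduces that bias in its predictions:

```python
from collections import defaultdict

# Invented historical loan decisions: (group, approved) pairs.
history = [("A", 1)] * 80 + [("A", 0)] * 20 + [("B", 1)] * 40 + [("B", 0)] * 60

def fit(history):
    """'Train' by memorizing each group's historical approval rate --
    the simplest possible pattern a model can extract from past data."""
    totals = defaultdict(int)
    approved = defaultdict(int)
    for group, ok in history:
        totals[group] += 1
        approved[group] += ok
    return {g: approved[g] / totals[g] for g in totals}

model = fit(history)
print(model)  # the past disparity, now served back as a "prediction"
```

A real system replaces the lookup table with a far more complex model, but the dynamic is the same: the training data is the past, so the model's outputs are the past.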
The Markup did an investigation about the secret bias hidden in mortgage-approval algorithms: lenders were 40 to 80% more likely to deny borrowers of color than their white counterparts, and in some metro areas more than 250% more likely. Now, a data scientist might look at this and say, well, it's just what's in the data. You know, what's problematic? A sociologist might say, all right, well, clearly what's happening is that the mortgage-approval algorithms are fed with data about who got mortgages in the past, and the U.S. has a very long history of residential segregation and a long history of financial discrimination in lending. And so the algorithms are picking up on those patterns of discrimination and reproducing them. So it's a good example of how discrimination happens by default inside automated systems. We could mathematically put a finger on the scale.
We could evaluate our algorithms and say, okay, how many loans are going to people of different categories or protected classes, and if it's not a sufficient percentage, we can change the algorithm so that it gives more loans to people from those protected categories. Is that actually happening? Not quite so much. Another example is in facial recognition. You're probably familiar with the Gender Shades project, where the investigators looked at facial recognition systems and found that the major facial recognition systems worked better on men than on women and better on people with light skin than on people with dark skin, and trans and non-binary folks are generally not represented in the datasets at all. When you do an intersectional analysis, the systems work best of all on men with light skin and worst of all on women with dark skin.
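An intersectional audit in the spirit of Gender Shades can be sketched as below. The records and subgroup labels are invented; the point is that a single overall accuracy number would hide the gap that appears as soon as you break the results out by subgroup:

```python
# Invented audit records: (gender, skin tone, was the face correctly recognized?)
records = (
    [("man", "light", True)] * 9 + [("man", "light", False)] * 1 +
    [("woman", "dark", True)] * 6 + [("woman", "dark", False)] * 4
)

def accuracy_by_subgroup(records):
    """Tally correct recognitions per (gender, skin) subgroup,
    instead of reporting one aggregate accuracy."""
    stats = {}
    for gender, skin, correct in records:
        total, right = stats.get((gender, skin), (0, 0))
        stats[(gender, skin)] = (total + 1, right + int(correct))
    return {key: right / total for key, (total, right) in stats.items()}

rates = accuracy_by_subgroup(records)
print(rates)  # e.g. 90% on light-skinned men vs. 60% on dark-skinned women
```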
And so, this is absolutely a problem that could be addressed with the training data. A data scientist's first question is usually, can't you just make the training data more diverse? Absolutely, yes, that would make these systems more accurate. But one thing we can do is push further and look at the uses of facial recognition. We can look at A.I. in context and say, well, we need to think about things like facial recognition use in policing, because facial recognition used to unlock your phone is a low-risk use. It doesn't work for me most of the time, but I don't care; I just use my password. Facial recognition used by police on real-time video surveillance feeds is going to be a high-risk use, because it's going to misidentify women and people of color more often and get people caught up in the criminal justice system who don't need to be caught up in the criminal justice system.
It's a high-risk use that might need to be regulated. So we need to -- oh, sure. Better? Okay. Sorry, online folks. All right, so let's not use facial recognition in policing. So, let's talk about cancer. I have an A.I. story I want to get off my chest. It is about the time that I got breast cancer. It was the beginning of the pandemic, and I got the news that you never want to get: I had breast cancer. It was really hard. I'm fine now, and grateful to the doctors and medical professionals who took care of me. And I learned a whole bunch of things along the way that I think are useful to talk about when we talk about what the general public understands about A.I., and how we can talk about A.I. in medicine with the general public.
So one of the things that I did when I first got diagnosed is I freaked out, because everybody freaks out when they first get diagnosed, and the way that you freak out tends to be consistent with your personality. So, my freakout involved obsessively reading everything I possibly could. I spent so much time on PubMed, and I read every single thing in every single dropdown and every single part of my electronic medical record, which is where I stumbled across a scan that said: this scan was read by Dr. So-and-So and Dr. A.I. Why is this A.I. reading my scan? What did it find? Who wrote this A.I.? What kind of bias is in it? And I had a little moment, and then I forgot about it because, you know, cancer. And then I came back to it a little while later -- a year or so later. I was feeling better. It was COVID. And I decided to get a little bit carried away.
I decided that I was going to do an experiment. I was going to download my own scans and run them through an open-source A.I. in order to write about the state of the art in A.I.-based cancer detection. And I will admit that I got a little carried away; this was maybe a little extreme. But I was trying to reproduce the experiment. I have always been really interested in reproducibility. And of course the idea in reproducibility is that you publish your code and data online, so other people can reproduce your experiment and validate it. We do this a lot in journalism, when we publish the code and data we used to do a particular analysis, especially in algorithmic accountability reporting. And so, I took my scans, and I ran them through an A.I. I had a lot of misconceptions, it turned out. And my misconceptions about how A.I. would be used in cancer detection are not uncommon. So, here is a picture of, you know, somebody's mammogram. Some mammogram that I found on the internet, not my actual mammogram. And I thought that I was going to take my entire electronic medical record, feed it in, put in all of my scans and, you know, the 3D video, blah, blah, blah, and it was going to evaluate everything and then, like, pop out an answer. Like: we think you have cancer. But, no. This is totally wrong. I also thought that I was going to do an experiment where I would change my race and see if the prediction changed. And this was a good premise for an experiment, but also totally wrong, because there's no race in the A.I. that I used. So, one of my misconceptions was that it was going to look at my entire record. No. The way that most cancer detection A.I. that we have right now works is that you put in a flat image, and what it will do is identify an area of concern on the image. So, I was a little surprised to hear this. But I downloaded the stuff, I got all the code working, I fed in my images, and I got nothing. And I was like, you know, I know there's cancer here. My doctor has seen it. I got a mastectomy because there was cancer here -- why can't the A.I. see it? It turned out that my image resolution was totally wrong, and I had to go through a rigmarole of trying to get access to high-resolution versions of my scans. After a month or two of wrangling, I finally gave up and asked them to send me a CD in the mail, and I had to buy a CD reader, like it was 2007, I don't know. So, I got it going. And eventually it did identify an area of concern. This is what an area of concern might look like.
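The resolution snag described here is a common failure mode: a model trained on full-resolution scans can silently find nothing in a downsampled export. A minimal pre-flight check might look like this sketch; the minimum size is an assumption for illustration, since each real model's documentation states its own input requirements:

```python
# Hypothetical minimum input size; a real model documents its own.
MIN_SIZE = (2048, 2048)

def resolution_ok(width, height, minimum=MIN_SIZE):
    """Return True if a scan is at least the resolution the model expects."""
    return width >= minimum[0] and height >= minimum[1]

print(resolution_ok(512, 512))    # a downsampled patient-portal export
print(resolution_ok(3328, 2560))  # a full-resolution image from the CD
```

Checking this before inference turns "the A.I. found nothing" into an actionable error message about the input.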
Again, some image I got off the internet. And what I found was that I got a score -- it gave me -- it identified an area of concern and gave me a score. And I was really surprised that it didn't, like, give me a big rigmarole. When you text somebody "congratulations," it gives you balloons and stuff. I was sort of expecting that: like, you have cancer! You don't have cancer! That's not how it works. So, that was one of the things that I learned along the way: that even when you are a researcher who works on people's unrealistic expectations around A.I., you yourself actually have unrealistic expectations around A.I., and it's useful to keep this in mind when we're communicating with the public about what A.I. does. One of the things that I was obsessed with was, how likely am I to die from this? Right?
BECAUSE THAT IS THE BIG QUESTION THAT YOU HAVE WHEN YOU ARE DIAGNOSED WITH CANCER. I DON'T KNOW WHAT CANCER RESEARCHERS' BIG QUESTIONS ARE. ALL I KNOW IS THE PERSPECTIVE OF SOMEBODY DIAGNOSED. AND WHAT I KNEW WAS THAT BLACK WOMEN ARE 40% MORE LIKELY TO DIE FROM BREAST CANCER. BUT I AM BIRACIAL: MY DAD WAS BLACK, MY MOM WAS WHITE, AND MY MOM DIED OF BREAST CANCER WHEN SHE WAS 50. AND I WONDERED, OKAY, FOR THE PURPOSES OF MEDICAL STATISTICS, WOULD I BE CONSIDERED BLACK? WOULD I BE CONSIDERED WHITE? WHAT IS MY RISK PROFILE? AND I ASKED MY DOCTOR ABOUT THIS. MY DOCTOR EXPLAINED THAT, WELL, ACTUALLY WHAT WE DO WHEN THERE ARE DIFFERENCES IN PATTERNS, OR PREVALENCE BY RACE, IS WE RUN YOUR NUMBERS BOTH WAYS, AS IF YOU WERE BLACK AND AS IF YOU WERE WHITE, AND THEN WE JUST ASSUME YOUR ACTUAL NUMBER IS SOMEWHERE IN BETWEEN. I WAS LIKE, THAT'S SCIENCE?
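The "run it both ways" heuristic the doctor describes can be sketched in a few lines. Everything here is illustrative: the risk numbers are invented for the example and are not real breast-cancer statistics.

```python
# Sketch of the "run your numbers both ways" heuristic described by
# the doctor: compute a race-stratified risk estimate twice and
# assume the truth sits at the midpoint. Numbers are made up.

def risk_midpoint(risk_if_black: float, risk_if_white: float) -> float:
    """Average the two race-stratified estimates, as described."""
    return (risk_if_black + risk_if_white) / 2

# Hypothetical risk estimates for the same patient (illustrative only):
r_black = 0.14   # estimate computed "as if Black"
r_white = 0.10   # estimate computed "as if white"

print(risk_midpoint(r_black, r_white))
```

The sketch makes the ad hoc nature of the heuristic visible: the "answer" is just an unweighted average of two population-level numbers, with no patient-specific justification for the midpoint.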
SO, I COULDN'T FIGURE OUT: OKAY, AM I LIKELY TO DIE FROM THIS, OR NOT LIKELY TO DIE? IT MADE ME THINK, OKAY, THERE'S A LOT OF AMBIGUITY IN DIAGNOSIS, A LOT OF AMBIGUITY IN STATISTICS. AND WE'RE NOT NECESSARILY PAYING ENOUGH ATTENTION TO THAT AMBIGUITY WHEN WE ARE BUILDING THESE SYSTEMS INTO ALGORITHMS, RIGHT? SO WE NEED TO TALK MORE ABOUT THE AMBIGUITY. I ALSO THOUGHT IT WAS SIGNIFICANT THAT COMPUTERS DON'T DIAGNOSE THE SAME WAY THAT DOCTORS DO, RIGHT? THE COMPUTER IDENTIFIES AN AREA OF CONCERN AND GIVES YOU A SCORE. THIS PARTICULAR ONE GAVE ME A SCORE BETWEEN ZERO AND ONE. I THOUGHT THIS MEANT: YOU HAVE A 20% CHANCE OF HAVING CANCER IN THIS AREA. BUT NO, THE RESEARCHER TOLD ME THAT IT'S ABSOLUTELY NOT A PREDICTION. IT IS JUST A SCORE.
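The distinction between a score and a prediction can be made concrete: a raw model score in [0, 1] is only a probability if it is calibrated, which can be checked by binning scores and comparing each bin's score to the observed outcome rate. The data below is invented toy data, not from the talk.

```python
# Minimal sketch: a model "score" in [0, 1] is not automatically a
# probability. A calibration check bins the scores and compares each
# bin to the observed positive rate. Toy data, for illustration only.

def observed_rate_by_bin(scores, labels, n_bins=5):
    bins = [[] for _ in range(n_bins)]
    for s, y in zip(scores, labels):
        idx = min(int(s * n_bins), n_bins - 1)  # which bin this score falls in
        bins[idx].append(y)
    # Mean outcome per bin; None where the bin is empty.
    return [sum(b) / len(b) if b else None for b in bins]

scores = [0.15, 0.18, 0.22, 0.25, 0.21, 0.19]  # all near "0.2"
labels = [1, 0, 1, 1, 0, 1]                     # 4 of 6 were positive

rates = observed_rate_by_bin(scores, labels)
# If 0.2 really meant "a 20% chance," the bins around 0.2 would show
# roughly 0.2 positives. Here they show about 0.67, so on this toy
# data the score is not behaving like a calibrated probability.
```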
TURNS OUT THIS IS BECAUSE OF THE LEGAL LANDSCAPE: IT'S A HEAVILY REGULATED INDUSTRY, AND HE COULD GET IN BIG TROUBLE FOR SAYING THERE'S A 20% CHANCE. BUT, OH YEAH, THIS IS A .2 SCORE. ALL RIGHT, FINE. ANOTHER THING THAT THE GENERAL PUBLIC GENERALLY DOES NOT REALIZE ABOUT A.I., THAT'S IMPORTANT TO SAY IN THE CONTEXT OF CANCER, IS THAT THESE SYSTEMS ARE TUNED TO GIVE YOU EITHER A HIGHER RATE OF FALSE POSITIVES OR A HIGHER RATE OF FALSE NEGATIVES. SO, A FALSE POSITIVE WOULD BE WHERE IT SAYS, YEP, YOU HAVE CANCER, AND YOU DON'T. A FALSE NEGATIVE: NOPE, NO CANCER, AND YOU DO. IN MEDICINE IT'S CONSIDERED HIGHER STAKES TO HAVE MORE FALSE NEGATIVES THAN FALSE POSITIVES. WE ARE ALL UNITED IN WANTING TO SAVE MORE LIVES, FOR MORE PEOPLE TO BE DIAGNOSED AND TREATED EFFECTIVELY.
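The tuning described here is usually just a choice of decision threshold: moving it trades false negatives for false positives. A minimal sketch with invented scores and labels:

```python
# Why a diagnostic classifier must be "tuned to be wrong" in one
# direction: moving the decision threshold trades false negatives
# for false positives. Scores and labels are invented toy data.

def confusion_counts(scores, labels, threshold):
    """Return (false positives, false negatives) at a given threshold."""
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    return fp, fn

scores = [0.1, 0.3, 0.4, 0.6, 0.7, 0.9]
labels = [0,   0,   1,   0,   1,   1  ]

# A low threshold flags more scans: more false positives, fewer
# false negatives -- the direction medicine generally prefers.
low = confusion_counts(scores, labels, 0.25)   # (2 FP, 0 FN)
high = confusion_counts(scores, labels, 0.65)  # (0 FP, 1 FN)
print(low, high)
```

The same model produces either error profile; "how wrong do you want it to be" is literally a parameter.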
SO, THERE'S CONSENSUS, APPROPRIATELY, THAT THESE SYSTEMS SHOULD BE TUNED TO HAVE HIGHER RATES OF FALSE POSITIVES THAN FALSE NEGATIVES. IT'S KIND OF A STRANGE THING TO REALIZE YOU HAVE TO DECIDE HOW WRONG YOU WANT YOUR DIAGNOSTIC SYSTEM TO BE. AND PEOPLE ARE PRETTY UNCOMFORTABLE WITH THAT. I ALSO GOT REALLY INTERESTED IN RACIAL DISPARITIES, IN DIAGNOSIS AND THEN ALSO IN A.I.-READ SCANS. THERE IS THIS REALLY INTERESTING STUDY THAT LOOKED AT MACHINE LEARNING MODELS DIAGNOSING, I THINK IT WAS, PLEURAL EFFUSIONS. YOU COULD FACT-CHECK ME ON THIS ONE. AT ANY RATE, THEY WERE LOOKING AT THESE MODELS THAT WERE EXTREMELY EFFECTIVE ON DATA FROM ONE HOSPITAL, AND THEN THEY ADDED ANOTHER HOSPITAL, AND IT WAS REALLY EFFECTIVE ON THE DATA FROM THE NEXT HOSPITAL. AND THEN THEY ADDED IN THE DATA ON THE PATIENTS' RACE. AND IT TURNED OUT THAT THESE MODELS --
>> RECORDING IN PROGRESS.
>> OH. THESE MODELS WERE DIFFERENTIALLY ACCURATE, BASED ON RACE. RIGHT? BUT IT'S JUST SCANS OF PEOPLE'S INSIDES. RESEARCHERS ARE LIKE, WHY IS THIS HAPPENING? AND NOBODY'S BEEN ABLE TO EXPLAIN WHY THIS IS HAPPENING, RIGHT? REALLY BIG UNKNOWN, RIGHT? WE KNOW THINGS LIKE FACIAL RECOGNITION SYSTEMS ARE DIFFERENTIALLY ACCURATE BASED ON SKIN TONES, BUT IT DOESN'T MAKE ANY SENSE TO A HUMAN, OR AT LEAST TO THIS HUMAN, THAT THE SCAN OF THE INSIDE WOULD BE DIFFERENT. SO, BIG QUESTION MARK. AND ONE OTHER THING I FIND REALLY INTERESTING IS THE WAY THAT RACE GETS ENCODED IN MEDICAL SYSTEMS, AND IN ALGORITHMS, AS IF IT WERE REAL. SO, RACE IS SOCIAL. IT'S NOT BIOLOGICAL. THERE ARE GENETIC DIFFERENCES BETWEEN PEOPLE, RIGHT? BUT RACE DOES NOT DESCRIBE THOSE GENETIC DIFFERENCES.
RACE IS A SOCIAL CONSTRUCT, NOT A BIOLOGICAL REALITY, BUT IT SOMETIMES GETS EMBEDDED IN ALGORITHMS AND IN MEDICAL SYSTEMS AS IF IT WERE A BIOLOGICAL REALITY. SO, YOU'RE PROBABLY FAMILIAR WITH RACE CORRECTION. IN KIDNEY DISEASE DIAGNOSIS, THERE'S A PRETTY SIGNIFICANT EXAMPLE IN EGFR, THE ESTIMATED GLOMERULAR FILTRATION RATE. WHEN YOUR EGFR GETS DOWN TO 20 -- 20% KIDNEY FUNCTION -- YOU'RE ELIGIBLE FOR THE KIDNEY TRANSPLANT LIST. IT DOESN'T MEAN YOU'LL GET A KIDNEY; YOU'RE JUST ELIGIBLE TO START WAITING FOR A KIDNEY THAT MATCHES. THAT MIGHT TAKE YEARS. FOR A VERY LONG TIME THERE WAS A RACE CORRECTION BUILT IN, SO THAT IF YOU WERE BLACK, YOU GOT A MULTIPLIER APPLIED IN YOUR EGFR CALCULATION, WHICH MEANT BLACK PATIENTS HAD TO BE SICKER THAN OTHER PATIENTS IN ORDER TO QUALIFY FOR THE KIDNEY TRANSPLANT LIST. WE ALSO SEE RACE CORRECTIONS IN THINGS LIKE CONCUSSION CALCULATIONS.
THERE WAS A CASE WHERE THE NFL MADE THIS MULTI-BILLION-DOLLAR SETTLEMENT TO PLAYERS WHO HAD EXPERIENCED MULTIPLE CONCUSSIONS AND HAD REALLY TRAUMATIC BRAIN INJURIES AND LONG-TERM EFFECTS, AND THEY USED A CALCULATION TO FIGURE OUT HOW MUCH EACH PLAYER WAS OWED. BLACK PLAYERS WERE GIVEN A RACE CORRECTION, UNDER THE ASSUMPTION THAT THEIR MENTAL CAPACITY WAS DIMINISHED RELATIVE TO OTHER PLAYERS, SO THEY WERE OWED LESS MONEY. THEY FOUGHT IT. THEY WON. BUT WE DO SEE THINGS LIKE RACE CORRECTION LINGERING. THIS ONE LINGERED IN THE KIDNEY ALGORITHM UNTIL VERY RECENTLY -- ACTUALLY UNTIL COPY EDITS FOR MY BOOK WERE DUE; I WAS GLAD IT CHANGED IN TIME TO GET INTO MY BOOK. SO, THE KIDNEY ALGORITHM HAS CHANGED. EGFR NO LONGER INCLUDES A RACE CORRECTION. BUT WE DO HAVE TO THINK ABOUT EVERY SINGLE LAB THAT USED THESE DIFFERENT NUMBERS, RIGHT?
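The multiplier being described is the race coefficient in the widely used 2009 CKD-EPI creatinine equation (since replaced by a race-free 2021 equation). A sketch of that equation, to show how a single flag changes the number; this is for illustration only, not clinical use, and the cutoff of 20 is the transplant-listing threshold mentioned above.

```python
# Sketch of the race coefficient in the 2009 CKD-EPI creatinine
# equation, as I understand it (it has since been replaced by a
# race-free 2021 equation). Illustration only -- not clinical use.

def egfr_ckd_epi_2009(scr_mg_dl, age, female, black):
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr_mg_dl / kappa, 1) ** alpha
            * max(scr_mg_dl / kappa, 1) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159  # the race "correction": a flat 15.9% boost
    return egfr

# Same hypothetical patient, same blood test -- only the race box changes:
as_white = egfr_ckd_epi_2009(2.8, 50, female=True, black=False)  # ~18.9
as_black = egfr_ckd_epi_2009(2.8, 50, female=True, black=True)   # ~21.9
# The Black-coded result sits ~16% higher, so with a cutoff of
# eGFR <= 20 the same labs qualify the white-coded patient for the
# transplant list but not the Black-coded one.
```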
SO EVERY SINGLE LAB WOULD TAKE YOUR RACE FROM YOUR PATIENT CHART AND WOULD CALCULATE IT AS EITHER BLACK OR WHITE, AND I DISCOVERED THE OTHER DAY THAT MY LAB WAS ACTUALLY STILL CALCULATING IT. IT'S JUST THAT THE NUMBER GOT MUSHED, AND BY THE TIME I SAW IT IN MY ELECTRONIC MEDICAL RECORD THE TWO NUMBERS HAD BEEN SQUISHED TOGETHER OR SOMETHING. SO, WE HAVE TO THINK ABOUT THESE SOCIOTECHNICAL SYSTEMS, THINK ABOUT THE DATA PIPELINES. WHEN WE MAKE CHANGES -- LIKE CHANGING THE FORMULA TO ELIMINATE THE RACE CORRECTION VARIABLE -- WE DO HAVE TO THINK ABOUT THE PIPELINES, ABOUT HOW COMPLICATED IT IS TO CHANGE THESE THINGS, AND ABOUT THE IMPLICATIONS FOR A.I. MODELS, RIGHT? IF THE A.I.
MODEL IS BEING TRAINED WITH SOME KIND OF DISCRIMINATORY ASSUMPTION INSIDE IT, AND THEN WE REALIZE IT, THEN YOU HAVE TO GO AND RETRAIN THE MODEL AND YOU HAVE TO CHANGE THE WHOLE PIPELINE. IT'S ACTUALLY AN EXTREMELY, EXTREMELY COMPLICATED AND RESOURCE-INTENSIVE EFFORT. WE SHOULD BE DOING IT. BUT IT'S WAY MORE COMPLICATED THAN EVERYBODY IMAGINES, BECAUSE WE GENERALLY THINK ABOUT COMPUTATIONAL SOLUTIONS AS BEING FAST, CHEAP, AND EFFICIENT, AND THEY ARE NOT ALL OF THESE THINGS AT THE SAME TIME. LLMs ARE ON EVERYBODY'S MINDS, AND UNFORTUNATELY LLMs HAVE THE SAME KINDS OF PROBLEMS WHEN IT COMES TO RACE-BASED MEDICINE. THEY HAVE THE BIASES THAT WE TALKED ABOUT BEFORE. LLMs ARE PRETTY BAD AT MATH, SO, NO, THEY ARE NOT CALCULATING ANYBODY'S EGFR SCORE.
BUT THEY ARE DOING THINGS LIKE PROPAGATING RACE-BASED MEDICINE THAT IS HARMFUL, THAT IS RACIST, THAT IS PROBLEMATIC, IN PART BECAUSE THESE LLMs -- LARGE LANGUAGE MODELS, GENERATIVE A.I. SYSTEMS -- ARE NOT PARSING FOR CONTENT. THEY ARE BASICALLY LANGUAGE CALCULATORS. THEY ARE GIVING US STRINGS OF TEXT BASED ON STRINGS OF TEXT FROM THE PAST. RACISM IS NOT GOOD, BUT UNFORTUNATELY IT IS POPULAR. AND POPULAR IS USED AS A PROXY FOR GOOD INSIDE LARGE ALGORITHMS (INDISCERNIBLE). AND OVERALL, RACE AND MACHINE LEARNING IS A LITTLE BIT OF A MESS. SO, AS I SAID BEFORE, RACE IS OFTEN EMBEDDED IN THESE COMPUTATIONAL SYSTEMS -- WELL, RACISM IS EMBEDDED IN MEDICAL SYSTEMS, AND ABSOLUTELY EMBEDDED IN ML SYSTEMS, MACHINE LEARNING SYSTEMS, AS IF IT WERE A BIOLOGICAL AND SOCIAL REALITY.
SO WE NEED TO DO MORE REFLECTING ON THE SOCIAL UNDERPINNINGS OF SYSTEMS BEFORE WE GO MAKING ALGORITHMIC SYSTEMS AND FOSSILIZING THESE DISCRIMINATORY DECISIONS IN CODE. NOW, ONE OF THE THINGS PEOPLE OFTEN SAY IS THAT THE A.I. WILL GET BETTER EVENTUALLY. RIGHT? I TALK ABOUT THE FACIAL RECOGNITION SYSTEMS THAT MISIDENTIFY WOMEN AND PEOPLE OF COLOR: CAN'T YOU JUST MAKE IT BETTER? I TALK ABOUT THE MORTGAGE APPROVAL SYSTEMS: WELL, CAN'T YOU JUST DO THIS? I TALK ABOUT THE RACE CORRECTIONS, AND THEY ARE LIKE, WELL, CAN'T YOU JUST DO THIS? YEAH, OKAY, I HEAR YOU THAT THINGS IMPROVE, BUT I'M NOT CONVINCED THAT THAT ROSY, SEAMLESS TECHNOLOGICAL FUTURE IS RIGHT AROUND THE CORNER, BECAUSE I'VE HEARD TOO MANY TIMES THAT, OH YEAH, IN FIVE YEARS ALL THE PROBLEMS ARE GOING TO BE FIXED.
AND ONE OF THE THINGS I'VE BEEN WONDERING ABOUT IS WHETHER ALL OF THE PROBLEMS THAT ARE EASY TO SOLVE WITH COMPUTERS HAVE BEEN SOLVED, AND WE'RE JUST LEFT WITH THE REALLY COMPLICATED SOCIOTECHNICAL PROBLEMS. A LOT OF SOCIAL PROBLEMS HAVEN'T BEEN FIXED IN THOUSANDS OF YEARS, SO I'M NOT SURE WHY WE EXPECT WE'LL BE ABLE TO BUILD MACHINES TO FIX THEM IN A COUPLE OF MONTHS OR WITH JUST A COUPLE MILLION DOLLARS. AND ONE EXAMPLE OF THIS COMES FROM A NEW NEWS ORGANIZATION CALLED PROOF NEWS, STARTED BY JULIA ANGWIN, FORMERLY OF PROPUBLICA. PROOF NEWS AND ALONDRA NELSON'S LAB JOINED TOGETHER TO DO THE A.I. DEMOCRACY PROJECTS. THEY JUST DID THIS REALLY SPECTACULAR BENCHMARKING STUDY WHERE THEY LOOKED AT THE FIVE LEADING A.I. MODELS AND BENCHMARKED THEM INSIDE ONE VERTICAL, RIGHT -- INSIDE ELECTIONS, SOMETHING THAT'S INCREDIBLY RELEVANT TO U.S.
DEMOCRACY SUCCEEDING: ELECTIONS. HOW MUCH MISINFORMATION IS REALLY BEING GENERATED BY THESE MODELS? BECAUSE ONE OF THE THINGS WE HAVEN'T HAD TO DATE IS A BENCHMARK FOR LLMs. WE HAVEN'T BEEN ABLE TO SAY, OKAY, LLMs ARE GENERATING USEFUL CONTENT 10% OF THE TIME, OR 100% OF THE TIME. WE JUST DON'T KNOW. ALL WE HAVE ARE QUALITATIVE CLAIMS. ONE OF THE THINGS THAT'S BRILLIANT ABOUT THIS PROJECT IS: LET'S TEST THESE SYSTEMS. THEY BUILT THIS TOOL THAT HIT THE APIs OF ALL FIVE LANGUAGE MODELS SIMULTANEOUSLY -- HIT THEM WITH THE SAME PROMPT AND GOT BACK RESPONSES FROM EACH OF THE MODELS. IT WAS COOL SEEING THE RESPONSES ON ONE PAGE, BECAUSE WHEN YOU GO TO THE SITES INDIVIDUALLY, YOU GET DISTRACTED BY THE WHOLE USER EXPERIENCE, BUT HAVING THEM ON THE SAME PAGE MADE IT EASY TO SEE THE DIFFERENCES BETWEEN THEM. AND YOU KIND OF GOT A SENSE OF WHAT WAS HAPPENING IN EACH OF THE MODELS.
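The kind of harness described, one prompt fanned out to several model APIs at once, can be sketched as below. The model clients here are hypothetical stand-ins, not any vendor's real SDK; a real harness would wrap each provider's API behind the same callable interface.

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch of a fan-out harness: send one prompt to several model
# endpoints concurrently and collect the responses side by side.

def query_all(models, prompt):
    """models: mapping of model name -> callable(prompt) -> str."""
    with ThreadPoolExecutor(max_workers=len(models)) as pool:
        futures = {name: pool.submit(fn, prompt) for name, fn in models.items()}
        return {name: f.result() for name, f in futures.items()}

# Stand-in clients (hypothetical; real code would call each vendor's API):
models = {
    "model_a": lambda p: f"model_a says: {p[:20]}...",
    "model_b": lambda p: f"model_b says: {p[:20]}...",
}

answers = query_all(models, "Can I register to vote by text message?")
for name, text in answers.items():
    print(name, "->", text)
```

Because every model sees the identical prompt, differences in the collected responses reflect the models rather than the phrasing.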
SO, THE RESPONSES WERE EVALUATED BY TEAMS. THE TEAMS WERE MADE UP OF JOURNALISTS AND ELECTION OFFICIALS -- SO, PEOPLE WHO HAVE A LITTLE BIT OF SUBJECT MATTER EXPERTISE AND A LOT OF GENERAL KNOWLEDGE, AND THEN ALSO PEOPLE WITH DEEP SUBJECT AREA KNOWLEDGE. AND WHAT THEY FOUND WAS THAT ABOUT HALF OF THE MODELS' ANSWERS WERE INACCURATE WHEN IT CAME TO QUESTIONS ABOUT U.S. ELECTIONS. THEY EVALUATED THE ANSWERS ON FOUR DIMENSIONS: WAS IT INACCURATE, HARMFUL, INCOMPLETE, OR BIASED? I REALLY LIKE THIS FOUR-DIMENSIONAL APPROACH, BECAUSE JUST SAYING, OH, A.I. IS GOOD OR A.I. IS BAD IS NOT REALLY GETTING US VERY FAR ANYMORE. I REALLY LIKE THIS APPROACH OF GOING AT A.I. IN CONTEXT, IN A PARTICULAR TOPIC AREA, AND EVALUATING IT FOR QUALITIES THAT WE CARE ABOUT. SO, I LOVE THIS APPROACH.
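A rubric like the four-dimension one described can be represented very simply; the ratings below are invented for illustration (they are not the study's data), and the aggregate mirrors the "about half inaccurate" style of finding.

```python
from dataclasses import dataclass

# Sketch of the four-dimension rubric described: each answer is
# rated inaccurate / harmful / incomplete / biased, then rates are
# aggregated per dimension. Ratings here are invented toy data.

@dataclass
class Rating:
    inaccurate: bool
    harmful: bool
    incomplete: bool
    biased: bool

def share_flagged(ratings, dimension):
    """Fraction of answers flagged on one dimension."""
    return sum(getattr(r, dimension) for r in ratings) / len(ratings)

ratings = [
    Rating(True,  False, True,  False),
    Rating(False, False, False, False),
    Rating(True,  True,  True,  False),
    Rating(False, False, True,  True),
]

print(share_flagged(ratings, "inaccurate"))  # 0.5
print(share_flagged(ratings, "incomplete"))  # 0.75
```

Keeping the dimensions separate is the point: an answer can be accurate yet incomplete, or complete yet biased, and a single good/bad score would hide that.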
THIS IS A REALLY NEW METHODOLOGY -- IT JUST CAME OUT TWO WEEKS AGO -- SO I HIGHLY RECOMMEND CHECKING IT OUT. IN GENERAL, THOUGH, THE THINGS I RECOMMEND ARE LOOKING FOR HUMAN PROBLEMS INSIDE A.I. SYSTEMS. A.I. HAS A LOT OF BOOSTERS. YOU DON'T NEED ME TO STAND UP HERE AND TALK ABOUT HOW AWESOME A.I. IS. YOU'VE HAD A LOT OF PEOPLE TELLING YOU HOW AWESOME IT IS, AND IT DOES COOL STUFF. WE CAN DEFINITELY TAKE THAT FOR GRANTED. BUT I'M HERE TO SAY, HEY, LET'S ALSO LOOK FOR THE HUMAN PROBLEMS, BECAUSE NOTHING IS PERFECT. COLLABORATION IS KEY IN THIS REGARD. THROUGH COLLABORATIONS AMONG HUMANISTS AND SOCIAL SCIENTISTS, AND TECHNOLOGISTS, AND BIOMEDICAL RESEARCHERS, WE CAN TEST OUR TECHNOLOGY FOR ACCESSIBILITY, AND WE CAN ENGAGE IN ALGORITHMIC AUDITING -- BENCHMARKING IS A KIND OF ALGORITHMIC AUDITING.
WHEN WE AUDIT, WE OPEN THE BLACK BOX OF ALGORITHMS, EXPLAIN WHAT'S HAPPENING INSIDE, AND ALLOW PEOPLE SOME INSIGHT, AND THUS EMPOWER THEM. AND OVERALL, I THINK THAT THERE'S A LOT OF DISCOURSE ABOUT A.I. BEING TRANSFORMATIVE AND GOOD FOR GENERAL PURPOSES. I THINK IT'S ACTUALLY SAFER TO THINK ABOUT A.I. AS BEING GOOD FOR LIMITED, LOW-STAKES, MUNDANE TASKS. NOT FOR HIGH-STAKES TASKS, NOT FOR GENERAL-PURPOSE USE. RIGHT? SO, HELPING AT CERTAIN PHASES OF THE DIAGNOSTIC PROCESS, NOT (INDISCERNIBLE). I'D LIKE TO END WITH A BUNCH OF RESOURCES FOR LEARNING MORE. SOME OF MY FAVORITES RIGHT NOW: TO LEARN MORE ABOUT RACIAL DISPARITIES IN MEDICINE, DR. BLACKSTOCK HAS A NEW BOOK OUT, "LEGACY" -- IT'S FANTASTIC -- AND SHE ALSO HAS A CONSULTING AGENCY, ADVANCING HEALTH EQUITY. ON SOCIAL MEDIA THERE'S A MEDICAL STUDENT, JOEL BERVELL; SOME OF YOU HAVE SEEN HIM COME ACROSS YOUR FEED.
HE DOES REALLY DELIGHTFUL, ACCESSIBLE WORK. YOU CAN ALSO LEARN FROM JOURNALISTS ON THE ALGORITHMIC ACCOUNTABILITY BEAT. A TRADITIONAL FUNCTION OF THE MEDIA IS TO HOLD POWER ACCOUNTABLE, AND THAT ACCOUNTABILITY FUNCTION HAS TO TRANSFER ONTO ALGORITHMS AND ONTO THEIR MAKERS. SO ALGORITHMIC ACCOUNTABILITY IS A BEAT. I MENTIONED THE MARKUP AND PROPUBLICA. ALGORITHMIC ACCOUNTABILITY, I SHOULD SAY, IS EXPENSIVE AND TIME-CONSUMING AND REQUIRES A LOT OF PEOPLE TO DO WELL. SO IF YOU'RE WONDERING WHY THERE'S NOT MORE OF IT HAPPENING, AND WHY PEOPLE ARE NOT DOING ALGORITHMIC AUDITING OF SOCIAL MEDIA PLATFORMS -- WELL, IT'S BECAUSE IT'S EXTREMELY EXPENSIVE AND TIME-CONSUMING AND COMPLICATED, AND TECH COMPANIES ARE DELIBERATELY TRYING TO KEEP PEOPLE FROM HAVING ACCESS TO THE DATA. WE HEARD EARLIER ABOUT SOME OF THE GOVERNMENT RESOURCES AROUND RESPONSIBLE A.I.
SO, LIKE THE RISK MANAGEMENT FRAMEWORK OUT OF NIST. I REALLY LIKE SOMETHING CALLED THE ALGORITHMIC TRANSPARENCY PLAYBOOK FROM MY COLLEAGUES AT THE CENTER FOR RESPONSIBLE A.I. AT NYU, AND THEN DATA & SOCIETY AND EQUALAI ARE DOING SOME GOOD STUFF AROUND ALGORITHMIC IMPACT ASSESSMENT TOOLING. THE SAME WAY WE DO AN ENVIRONMENTAL IMPACT ASSESSMENT WHEN WE BUILD A NEW BUILDING, WE CAN DO AN ALGORITHMIC IMPACT ASSESSMENT IN ORDER TO LOOK AT HOW WELL OR HOW BADLY AN ALGORITHM OR ALGORITHMIC SYSTEM IS AFFECTING A COMMUNITY. IF YOU LIKED "MORE THAN A GLITCH," OR YOU LIKED "ARTIFICIAL UNINTELLIGENCE," HERE ARE A HANDFUL OF OTHER THINGS YOU MIGHT WANT TO READ OR WATCH. AND OF COURSE THERE'S TWO PAGES OF SYLLABUS, BECAUSE I'M A PROFESSOR; I CAN'T HELP MYSELF. AND WITH THAT, I'M GOING TO THANK YOU VERY MUCH AND SAY WE HAVE TIME FOR SOME QUESTIONS.
>> THANK YOU SO MUCH.
This was a phenomenal, phenomenal lecture. I know it's something that we will be discussing quite a bit as an NLM community, and with colleagues across the NIH, government, and beyond. I am the data science and open science officer in the Office of Strategic Initiatives, and I'll be moderating the Q&A. For those who are joining us remotely, you can use the "live feedback" button under your videocast link to ask your questions; I even have them coming up here on my computer. If you are in the room, you are welcome to head to this microphone to ask your questions as well. I will kick us off. Mine is a sort of long question, so I wrote it down. You started off your talk by telling us that brains are better at recalling stories than they are facts and physics.
And what struck me, both in your work and over the course of the craft of this lecture that you've just given us, is that you are a storyteller of not only what can go wrong and why it matters, but also the how: how do things go wrong, and what can we do about it, drawing on that deep technical practitioner expertise that you have. And this to me seems like a really important part of getting into that black box and bringing more people into that process, people who may not be coders themselves but who have real stakes and interests, and for whose lives there are real implications, so they can co-create solutions and be part of that discussion. So I'm wondering, you know, how is it that you have done that? This is kind of a craft question, rather than a technical one.
But how is it that you honed your craft? And if you have anecdotes about interesting conversations, or paths this led you down, I'm really curious to hear them. >> That's a really great question. Thank you. And it is in fact one of my favorite things to talk about. So, one of the things that I do in the writing process is I like to try to explain myself to kids. My son used to be a good reader of my work. He still is, but he's older now; it's different having an 11-year-old read it versus an older teen. And I got this idea from Jay Hamilton, who wrote "Democracy's Detectives." He hires teens to be first readers. Teenagers will tell you what is confusing and what is boring. They don't have a filter. So that's really helpful.
I also try and think about how I would explain something in an interesting way at a cocktail party, because I have bored a lot of people at cocktail parties over the years, and I try not to do it. But that's actually a really good way of gauging how interesting something is. In user interface design, people talk about hallway user testing, where you go out in the hallway, find somebody, say, "Hey, look at this," and user-test it on them. So I do the same thing with my writing. I go to parties. I tell stories. And if it hits, I'm like, all right, this is going to be interesting enough to write about. If it doesn't, all right, I just bored somebody else. The stakes are very low. >> Great. Thank you. A question from the live audience. >> Thanks again. Steve Sherry, acting director at the library.
Your remarks around harm and accountability resonate with me, and with the role of NLM and institutions trying to produce public good and prevent harm. My questions are three. First, in your example about the four-dimensional assessment of the information risks around election results, you mentioned harm as the second dimension. What is the harm that you're scoring on with election information? Personal harm? I'm curious what harm is in that context. My second question is about algorithmic accountability... >> I'm going to forget if you give me all three at once. Let me do one at a time; I want to get to all of them. So, harm around elections: one of the things that was really smart about how this experiment was constructed is that it was not about all A.I., everywhere, throughout time, on every subject. It was the U.S. presidential election, 2024.
And it was people who are working in local and state-level election administration. Prompts were things like: where can I vote in Nevada? And there are all these different local rules about same-day registration, you know, whether you need ID, what kind of ID you need, can you vote by text? The LLMs will tell you that you can vote via text. You cannot vote via text. So one of the things that happened at the testing event is we talked about, okay, what are the potential harms that could result from this particular answer to this particular question, right? So, I do not have the details in front of me, but there's a "get the data" link here, so you can dig into the prompts and responses, and they probably published the ratings too, so you would be able to see, okay, which ones were rated harmful by the specific people in the room around this specific topic.
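The rating exercise described above, expert raters scoring individual model responses along several dimensions and aggregating across raters, can be sketched in a few lines. The dimension names, scores, and flag threshold below are purely illustrative assumptions, not the actual rubric or data from the testing event:

```python
from statistics import mean

# Hypothetical raters' scores (1 = low, 5 = high) for one model response,
# on each rubric dimension. "harm" is one dimension, as in the experiment
# described; the names and values here are made up for illustration.
ratings = {
    "accuracy":     [2, 1, 2],
    "harm":         [4, 5, 4],
    "completeness": [2, 2, 3],
    "bias":         [3, 4, 3],
}

def summarize(ratings, flag_threshold=3.5):
    """Average each dimension across raters; flag averages at/above threshold."""
    summary = {}
    for dimension, scores in ratings.items():
        avg = mean(scores)
        summary[dimension] = (round(avg, 2), avg >= flag_threshold)
    return summary

for dim, (avg, flagged) in summarize(ratings).items():
    label = "FLAGGED" if flagged else "ok"
    print(f"{dim:>12}: {avg:.2f} ({label})")
```

The published prompts, responses, and ratings mentioned above would slot into a structure like `ratings`, one entry per prompt/response pair, so each response carries its own per-dimension scores.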
>> So in a general sense, I'm sensing then that the harm is disenfranchising someone from exercising their right to vote, not harm in an injurious sense; it's a civil harm? >> Yes. >> That makes sense. >> The other thing that's really important is that this is an experiment that gets us closer to evaluating A.I. in context. For a very long time in computing in general, we have tried to make these generalizable systems, and this comes from the intellectual heritage of mathematics, where in math you're trying to derive theorems that are going to work everywhere. The Pythagorean theorem never stops being true; that's amazing, but that's the pursuit of mathematics: universal truths. Well, A.I. is math, but A.I. systems can't produce universal truths when it comes to something like language or election information or, you know, medical diagnosis.
It's just not possible, because there's so much variation in the world. You know, I'm much more excited about using A.I. for research into protein folding than I am about using A.I. for diagnosing common skin conditions. >> That's a perfect intro to my second question, about algorithmic accountability, or auditing; you had a term for that. Because of this conditional context in which algorithms operate, is there a burden on algorithm creators to do the auditing and offer up some measure of performance, like the receiver operating characteristic curve you showed? And what standards or practices exist? I'm not familiar with what's in NIST; they just challenge you to make your own assessments. But maybe FDA does for medical devices, I don't know. >> There definitely isn't any one standard. The standards are evolving as we speak.
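One concrete measure an auditor might report alongside an ROC curve is the selection-rate ratio between demographic groups, sometimes called the disparate impact ratio (the "four-fifths rule" threshold comes from U.S. employment guidance). A minimal sketch on made-up data; the group labels, decisions, and 0.8 flag threshold are illustrative assumptions, not a NIST or FDA requirement:

```python
from collections import defaultdict

# Hypothetical audit log: (group, decision) pairs, where decision 1 means
# the algorithm granted the favorable outcome. Entirely made-up data.
decisions = [
    ("group_a", 1), ("group_a", 1), ("group_a", 0), ("group_a", 1),
    ("group_b", 1), ("group_b", 0), ("group_b", 0), ("group_b", 0),
]

def selection_rates(decisions):
    """Fraction of favorable decisions per group."""
    totals, favorable = defaultdict(int), defaultdict(int)
    for group, decision in decisions:
        totals[group] += 1
        favorable[group] += decision
    return {g: favorable[g] / totals[g] for g in totals}

def disparate_impact_ratio(rates):
    """Lowest group rate over highest; below 0.8 is a common flag threshold."""
    return min(rates.values()) / max(rates.values())

rates = selection_rates(decisions)
ratio = disparate_impact_ratio(rates)
print(rates)
print(f"ratio = {ratio:.2f}, flagged = {ratio < 0.8}")
```

A metric like this is only one slice of an audit; as the discussion that follows notes, real audit tooling layers many such checks, and the field is still settling on which ones to standardize.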
I just saw something on social media today: a new paper by Deborah Raji, an expert in algorithmic auditing. She and her collaborators have a new paper out about gaps in tooling for algorithmic auditing. So, it's a really new field, and we are developing standards. One of the things that I did for the book: I sometimes write software in order to answer a journalistic question, or to demonstrate something about the process of making software. And one of the things that I made over the course of writing the book was a platform for algorithmic auditing. So, if you're familiar with Cathy O'Neil's book "Weapons of Math Destruction," it was one of the books that kicked off the entire algorithmic accountability conversation. Cathy has an algorithmic auditing consulting firm called ORCAA.
And so I collaborated with ORCAA to build a platform for algorithmic auditing where you could, you know, feed in data and basically get it evaluated; it would give you a little report about potential biases. So, this is the kind of thing we do need. It exists in various forms, but you can't right now just go and buy something off the shelf. It's mostly bespoke, and there are a bunch of open-source solutions out there. We do need to get it to the level of platforms at some point, but we're not there yet. >> That brings me to my last question on that topic. NIH is very interested in social determinants of health, right? As we look at equity in all of the environmental and historical contexts that create racial bias in medicine, is there a place to consider these algorithms as another social determinant in that spectrum of external factors to health?
Like you're saying, it's becoming an emergent, embedded property in health care systems, so do we need to pay attention, at the library, to trying to identify and organize information around algorithms and their bias? >> Oh, absolutely, yes. Yeah. So, Emma Pierson, who is one of the authors on this paper, is doing some interesting work at Cornell. I also always recommend reading Roxana... I'm going to mess up the name... Daneshjou, who is doing really interesting work in dermatology and machine learning. For me it's about keeping track of who is doing interesting work and keeping up with the latest thinking, because I think the way that we think about these issues is changing, right?
I mean, if we even think about something like pediatrics: one of the initiatives a few years ago was, oh, let's get babies... let's get little kids' dental health evaluated, because they realized, okay, if kids go to the dentist, that's like an indicator they are being well looked after medically. It wasn't about the dentist per se; it was about the dentist being a signifier of wellness. So how do you highlight that? And how do you, with algorithms, find things like: if a child has three cavities before age 4, they might need nutritional counseling? How do you approach that from a library perspective? I think we're still trying to figure that out. >> Thank you. >> So we'll take a question from online, and it looks like we have a question from the room. One virtual attendee noted that many of the discussions that they have heard around diversity and A.I. have focused on race, gender, and national origin, with not as much discussion of communities with disabilities. This has been a focus of your research for "More Than a Glitch," so they were wondering if you could comment on your learnings in this space, perhaps a story about success in co-creation. And I've heard you talk about... oh gosh. >> Curb cuts. >> Curb cuts. How does that work? >> Let me talk about disability dongles; those are interesting. First of all, "More Than a Glitch" is about confronting race, gender, and ability bias in tech. Originally it was just about race and gender. I was like, wait a minute, there's not enough discourse about disability. Then I started writing: oh, wait, I need to learn. That was the topic that I had to learn the most about in order to write this book.
I'm grateful to the folks who took the time to explain things to me, thinkers and creators who I had an opportunity to learn from. And something that was really powerful to me was the concept of a disability dongle: something a designer will create because they think it's going to be transformative for people with disabilities. A good example is the stair-climbing wheelchair. Tons of them have been developed, and they all have these wacky-looking architectures: they don't have regular wheels, they have these cross-shaped wheels that could climb the stairs, or caterpillar treads on them. And when you ask somebody in a wheelchair, oh, do you want this stair-climbing wheelchair, look, I invented this, they are like, no, I don't want that. That looks terrifying. I just want a ramp. You realize you don't need to overengineer the solution.
And also you should have just asked the person using the wheelchair: listen, what's going to make your life easier? Instead of assuming that this thing that you came up with is going to help. So, the idea is that when we're designing technologies, we need to consult with affected communities and build out of the needs of the affected communities, as opposed to this kind of top-down Silicon Valley approach of, oh, I am the designer, I know everything. >> I know we're at time. I'm Hailey, a postdoc. I do metabolomics research, where we take a lot of chemicals and try to find out how they contribute to a disease. We're looking at a lot of variables, trying to find more granularity beyond what we already know. I saw that reflected here when thinking about categories like race, which, like you said, can be overly broad, or disabilities, where there's maybe stuff we don't understand.
To take it to the bigger picture, I noticed a pattern in your talk: what we need to evaluate are inputs and outputs, and how we interpret our data. I think you gave two good examples, but they introduced questions for me. Even when talking about that four-dimensional evaluation of A.I., the question it brought up is that humans create the A.I., and humans are also evaluating the A.I. It made me wonder how the outputs would change if there were different people doing the evaluation, or how the output would change if, instead of reporting on a majority vote, we used a different kind of metric. So these are all the questions I'm wondering about: is that what goes into your algorithmic accountability tool? And it seems like you're proposing a human solution to a machine problem, which I think is absolutely appropriate, because like you said, computers are only going to address so many problems.
So, is it appropriate for these tools to continue being computers? And there are a lot of elements to this question, because I'm thinking about both the inputs and outputs at the same time. So two parts, then. One part relates to race, or other variables that are maybe too broad to account for: what have you seen that has been highlighted, or that could be effective, for breaking these down into more granular chunks, like what we do in metabolomics to try to understand more granular chemistry? And also, when it comes to the outputs you're evaluating, what are the kinds of reproducible questions that we can ask about how the data is being interpreted? For example, they could be questions like who is in the room, because that was the first question that came up for me with the A.I. evaluation. >> Yeah.
>> So, yeah. >> Yes, that's a fantastic question. I think you basically encapsulated my entire talk in one question. Thank you for that. I would say that the thing that helps me is something from science and something from literature. So, from literature, I think a lot about hubris, right? You know, I took a lot of Shakespeare classes in college, and what you learn from Shakespeare is: don't get too far out, or you're going to get knocked down. I try not to have hubris, try not to solve all of the problems of the universe with one algorithm. And that I think is a really important approach, because one of the things we see coming out of Silicon Valley is, oh, this A.I. is going to solve everything, it's useful for all (indiscernible). Maybe it's not.
The other thing is that I try and approach algorithmic questions more like a scientist than like a computer scientist. So I like to take a really small piece, get really granular, and look at a really finite case in order to illuminate the larger universe, as opposed to trying to write about everything throughout time. Because that's how scientific knowledge evolves, right? And so I think that's how our knowledge in algorithmic spaces needs to evolve, too. >> Thank you. >> Great. Thank you so much, everybody. This has been really wonderful. >> Thank you. Thank you so much. A lot of comments are coming in, just thanking you for a great, great talk. Please, one more round of applause. Thank you so much.