...associate director from 1965 to 1983, a contributor at the National Cancer Institute, a librarian, and an informatics engineer. This is to stimulate liaison between your organizations. So in the spirit of that growth and collaboration, you are all invited to participate in a conversation. You will see a live feedback button under the video stream shortly. Please use this feature to submit questions for our speaker. Following the lecture, there will be a Q&A session moderated by NLM's Ken Koyle, who serves as deputy chief. Please use the live feedback button below the video to submit your questions.

With that housekeeping note out of the way, I'm honored to introduce our esteemed speaker today, Dr. Maia Hightower, CEO and co-founder of Equality AI and chief information officer of University of Chicago Medicine. Dr. Hightower is a leading voice at the intersection of healthcare, digital transformation, and health equity. She is a champion for responsible AI, ensuring the digital future of healthcare is equitable and just. Her company seeks to detect and remediate AI bias in healthcare and align AI strategy with health outcomes and equity. She is a four-time C-suite executive, spanning healthcare IT, medical affairs and population health, clinically integrated networks, and healthcare tech companies. She received her B.A. from Cornell and her M.D. and Master of Public Health from the University of Rochester School of Medicine. She completed her residency in internal medicine and pediatrics at the University of California, San Diego, and holds an MBA from the Wharton School of the University of Pennsylvania.
I'm very proud to have Dr. Hightower with us today. NLM plays an important role in the responsible use of AI in biomedicine, and in recognizing and eliminating AI bias to drive discovery and improve health for all people. With this in mind, I look forward to learning from Dr. Hightower's insights on responsible AI. Please join me in welcoming Dr. Hightower to the lecture.

>> Maia Hightower: Hello, thanks for having me. This is one of my favorite topics to talk about. Little did you know, when you clicked that join button you were signing up — you were saying, host, let me on the team. I want to play a role in making sure the digital future, the AI future, of healthcare is equitable and fair. So by the end of today's session you will have a good sense of what your role is to ensure that, whomever you represent, you serve everyone and not just a select few.

We are in this amazing time in healthcare, where AI can unlock unprecedented value. We measure value by the value equation: quality over cost, multiplied by experience. And many of the innovations are really driven by improving care, driving down the cost, and improving the experience of all stakeholders in the healthcare ecosystem — providers, patients, care team members. AI has the potential to drive each of these metrics, really defining how AI can be measured in terms of the value it provides. Examples include improving clinical quality with AI in diagnostic imaging; there are incredible new diagnostics that are actually embedded within devices,
some of which are FDA approved or FDA regulated, that improve the ability to diagnose, say, diabetic retinopathy using artificial intelligence. Treatment recommendations: AI has this ability to sift through vast amounts of clinical data and provide suggestions on what treatments may be most appropriate for a particular patient, especially when it comes to oncology and clinical trial matching. And predictive analytics in general — predicting who is at risk, who may most benefit from an intervention in order to avoid an adverse outcome.

AI, especially generative AI, has now also introduced the promise of reducing healthcare's administrative burden. We know healthcare has an incredible amount of administrative burden that may be driving physician burnout, nurse burnout, clinician burnout, patient burnout. AI has the potential to reduce that burden. Imagine the day when the EHR is a joy to use, as joyful as, say, one's iPhone or some of the other delightful digital experiences. Now, the EHR is a little bit far from that, but the advances in clinical documentation assistants really offer that promise. There are other administrative burdens where AI can help augment or improve operational efficiency, including scheduling, billing and coding, and workflow optimization.

But there's got to be a catch, right? Because of course that's too good to be true. If it's all good, are there any things we should be concerned about? Of course, the skeptics in the room are correct.
There are things we do have to consider. There are challenges; AI brings a new set of challenges we need to consider. One is the clinician, or human in the loop, versus autopilot. How do we ensure the human in the loop — who is actually the driver, in the driver's seat — is properly equipped and has the capability to guide AI and have it augment, and not become overly reliant on an automated system, or a system that hasn't yet earned that trust? Regulations: they are moving quite fast, and I will talk about that, but they may not be moving fast enough. The whole goal of regulation is to create some floor, a foundational acceptable use, to ensure that standards in safety are established. So regulation is moving fast, but maybe not fast enough. There are questions about data privacy, intellectual property, and algorithmic bias — all the way to the way an AI system is deployed into a care delivery system and workflow. All of these can perpetuate and amplify bias that already exists. So while AI offers incredible potential to healthcare, there are these challenges.

When we think of Dr. Martin Luther King, over 50 years ago he had this quote: "Of all the forms of inequality, injustice in health care is the most shocking and inhumane." How much have we closed the gap in the last 50 years? We haven't made that much progress — incremental at best. As we enter into this new AI era, our tools can either widen health disparities, really amplify them, or this is an incredible opportunity to use these new tools to narrow and eliminate health disparities.
When it comes to AI and algorithmic bias in machine learning, this is not a new concern. Weizenbaum's book Computer Power and Human Reason suggested that bias could arise from the data and coding assumptions embedded in computer programs. A medical school in the U.K. developed an algorithm to automate a portion of their admissions process, and the output of that algorithm discriminated against women and members of ethnic minority groups — not surprising. In 2016, we became more aware, as AI continued to increase in adoption across industries, when ProPublica found bias in the COMPAS algorithm for criminal recidivism. Then a paper discovered bias in a healthcare model that used cost as a proxy for a prediction of risk and amplified health disparities, and since 2020 there have been real questions about how we use race as a variable in clinical algorithms.

So to dive deeper, algorithmic bias is the dark side of AI. In Obermeyer's paper, what they found was that this model had been deployed in hundreds of health systems, and on a national scale Black patients were 50% less likely to be referred to case management despite being equally sick. They could measure the decline in case management rates on a national level, using a national database. They were able to measure this anomaly where rates of referral were going down, and they couldn't figure it out until they looked and saw it was an algorithm. The good news is they were able to detect the bias and mitigate the bias.
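As a concrete illustration of what that kind of detection can look like, here is a minimal sketch in Python, assuming a hypothetical pandas DataFrame with columns for patient group, a rough illness measure, and the algorithm's referral decision; none of these names come from the Obermeyer study itself.

```python
# Minimal sketch: compare referral rates between groups among "equally sick"
# patients. Column names (`group`, `n_chronic_conditions`, `referred`) are
# illustrative assumptions, not taken from the study.
import pandas as pd

def referral_rate_by_group(df: pd.DataFrame) -> pd.DataFrame:
    """Referral rate per illness stratum and group."""
    df = df.copy()
    # Bin patients by how sick they are so comparisons happen within strata
    # of roughly equal illness burden.
    df["illness_bin"] = pd.qcut(df["n_chronic_conditions"], q=4, duplicates="drop")
    # A large gap between groups inside the same illness stratum is the kind
    # of signal that flagged the biased referral algorithm.
    return (
        df.groupby(["illness_bin", "group"], observed=True)["referred"]
          .mean()
          .unstack("group")
    )
```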
So there are good tools out there in machine learning where you can actually measure bias, and you can mitigate that bias and create a fair model. That's what they did. They measured the output of the model using metrics we often call fairness metrics. They were able to better estimate what the model should look like and then find the better label — the better label to predict who would benefit from case management. What they did, instead of using cost as a proxy for risk, or risk of an adverse health outcome, was use the number of diseases on the problem list. In a clinical sense we all intuitively know that someone is higher risk if they have more medical problems, and someone is higher risk if diseases like diabetes or high blood pressure are not controlled. And so those were the labels they used to predict who is at risk, versus cost, or how many dollars are spent on healthcare — because we know that, systemically, Black patients spend less when equally sick, for a wide variety of socioeconomic and historical factors. So they fixed the algorithm: they used the number of problems on the problem list and whether or not disease was controlled, and they created a de-biased model that was equitable to both white patients and Black patients.

That particular model was at one point exposed to an estimated 80 million patients. The total lost opportunity cost was $1 billion. So yes, bias — algorithmic bias — has a financial toll. We also talk about the human toll.
I'm a physician, an internist, a primary care doctor, and so many of my patients have complex diseases, and navigating the healthcare system is challenging. When patients are offered case management to help them navigate, it can be such a relief to patients and families. That opportunity cost was not measured, but we could intuitively calculate that it was a burden on patients that was unfair and unjust.

Actually, at the time this model was released I was chief population health officer at a large health system, and this model was most likely deployed in my health system. And I didn't have the tools to measure or mitigate. But the developers fixed the model, and I thought, phew! Good thing they fixed that model. But later on, in 2020 and since then, the question was: was this a one-off or a frequent experience?

So AHRQ and NIH did a study. They did a broad literature search, and that search identified over 11,500 citations. After full review, 58 articles met criteria. Many of them included a wide variety of healthcare models and algorithms, but only 58 met the inclusion criteria, and only 14 studies met enough of the multiple inclusion criteria to answer the first question on the impact on racial and ethnic disparities — do algorithms increase disparities or not? Can you imagine: over 10,000 studies, and only 58 stratified their outcomes by population and stratified the model performance by population. That is basically what was needed to know whether or not a model affects health inequities.
They found that 10 out of 14 perpetuated disparities and 4 out of 14 decreased them. Of the four that decreased disparities, it was by design. So the take-home message is that when intentional, algorithms can decrease health disparities. When left to chance, most likely they increase health disparities. But the biggest travesty is that most do not even measure stratified performance, let alone the effect on outcomes. Of course, that was a different era. Now there have been a lot of changes in how even the journals will review a new model and decide whether it's worthy of publication, and they really demand some of that stratified performance.

So bias occurs throughout the AI life cycle. It starts at the moment of data creation. From that data — real-world data in many cases; especially in healthcare algorithms we are using EHR data — sometimes we are lucky to curate data from a clinical trial, which is higher quality than any electronic medical record data. But that data creation definitely has real-world bias that is reflected in the dataset. The next part of the AI life cycle is problem formulation: who gets to decide what kinds of problems to solve? Does the community determine what problems are important enough to address with AI systems, or is it the healthcare systems that determine that? And what are the underlying drivers of those decisions?
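One concrete check at that data-creation end of the life cycle is simply quantifying who is in the dataset. A minimal sketch, assuming a hypothetical training DataFrame with a `group` column and an externally supplied set of population shares (both illustrative, not from the talk):

```python
# Minimal sketch: compare each subgroup's share of the training data with its
# share of the population the model will serve. All names here are
# illustrative assumptions.
import pandas as pd

def representation_gap(train: pd.DataFrame, population_share: dict) -> pd.DataFrame:
    """Ratio of data share to population share for each subgroup."""
    report = pd.DataFrame({
        "data_share": train["group"].value_counts(normalize=True),
        "population_share": pd.Series(population_share),
    })
    # A ratio well below 1 flags a group the model may not have enough
    # signal to serve well.
    report["ratio"] = report["data_share"] / report["population_share"]
    return report.sort_values("ratio")
```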
Once that data is acquired and development is under way — a model is to be developed — then it's left to the individual data scientists, and they are making hundreds, if not thousands, of decisions during model development. Then there is model evaluation: whether or not the performance of the model was stratified by demographics in order to assess if the model is accurate, or whether the outcome of the model is fair across subpopulations. That could be done, and oftentimes it isn't. And you could have the perfect model and then deploy it in an imperfect world. All across the AI life cycle, bias can occur.

The good news, again, is that there are really good mitigation methods that can be deployed throughout the AI life cycle. Think of the technical ones — it's easy to think AI, let's jump to technical bias mitigation methods — but there are some social mitigation methods that are very effective as well. One of the goals of NLM is workforce development, and definitely workforce development means having diverse teams. It's not even just diverse teams when it comes to, say, demographic diversity; it's lived-experience diversity, and it's diversity in the types of skills that people bring to the table. So that team diversity can help balance out the bias that's in the dataset or in the way that we model. Then there is governance — making sure you have local policies and procedures; the role of regulation is to create that floor — and some of the more technical mitigation methods.
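As a minimal sketch of one of those technical checks — the stratified evaluation step just described — here is what reporting a model's discrimination performance by subgroup can look like, assuming hypothetical arrays of true labels, model scores, and demographic group labels:

```python
# Minimal sketch: overall AUC plus AUC within each subgroup. Array names are
# illustrative assumptions.
import numpy as np
from sklearn.metrics import roc_auc_score

def stratified_auc(y_true, y_score, groups) -> dict:
    """Return overall and per-subgroup AUC so gaps are visible, not averaged away."""
    y_true, y_score, groups = map(np.asarray, (y_true, y_score, groups))
    results = {"overall": roc_auc_score(y_true, y_score)}
    for g in np.unique(groups):
        mask = groups == g
        # AUC is only defined when both outcome classes appear in the subgroup.
        if len(np.unique(y_true[mask])) == 2:
            results[str(g)] = roc_auc_score(y_true[mask], y_score[mask])
        else:
            results[str(g)] = float("nan")
    return results
```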
From regulations, it moves on to frameworks and standards, and then to actual methods where you are deploying statistical methods and machine learning methods to both detect bias and mitigate that bias, and then monitoring for outcomes over time. A lot of tools are emerging as well for validation and for monitoring models over time for drift or degradation of performance, and for making sure they are continually refreshed to meet institutional thresholds of performance.

So we talked about diversity, and we are talking about concern about health equity, and that really is one of the key drivers. When I talk to healthcare leaders, they are really worried about AI bias and about magnifying health inequity when they deploy AI. Yet some 80% of health equity leaders report they have little or no decision-making role in their organization's AI strategy. So there's that disconnect. Often you have AI strategy in one pillar and your health equity and diversity, equity, and inclusion work in another pillar, and the two don't meet. So it's really important that health equity leaders — if not the leaders themselves, then those who have health equity as a core principle of what they are doing — are included in AI decision making and AI strategy.

So we talked about the risks; now the regulations. In the last six months there have been a number of regulations that have started to provide that foundational floor. HTI-1 was the first on the healthcare front: ONC released that final rule, quite a monumental bit of regulation requiring AI transparency. They would describe it as six feet wide and a mile deep.
It's a very deep set of rules when it comes to AI transparency requirements for certification, but the scope is narrow — basically it's just the 600 or so EHRs that are certified by ONC. CMS followed suit in February, mostly limiting AI use for Medicare Advantage coverage determinations. And similarly, the ACA nondiscrimination final rule released last week focuses on AI non-discrimination protections. One of those other monumental types of regulation was the EU AI Act, which was made final on March 13, 2024 — so just a few months ago. That one is a mile high and a mile deep. The penalties are stiff for multinational corporations, and it has wide-reaching impact; for most of our pharma industry, it has been incredibly impactful legislation.

So the risks include an AI system perpetuating that bias, and underperformance of AI investments. And the solution has AI governance at the core. Most fundamental is having the ability to monitor your AI models and algorithms, making sure they comply with institutional policy and standards — and if the institution deems health equity a high priority, along with all these responsibilities, that would be reflected in an audit. That is followed by validation: the ongoing ability to provide those audits, not just a one-off or two-off, but for every model being contemplated or designed in the health system, and then monitoring over time. These are some of the quality assurance and governance frameworks, all the way from U-Chicago, where Obermeyer and his team wrote the Algorithmic Bias Playbook.
It's a comprehensive framework to mitigate risk when it comes to AI. And there is ISO, the international standard for artificial intelligence management systems. Threading the needle for AI standards: ISO is the broadest, the international standard. NIST, the National Institute of Standards and Technology, is industry agnostic. The National Academy of Medicine is for our industry. And when it comes to our highly regulated healthcare industry, those industry standards, as well as the standards up above, are driving the federal regulations and how they are formulated. And then institutions develop institutional standards that line up — that thread the needle from that institutional standard all the way, perhaps, to the NIST standard.

So an example of AI strategy alignment is prioritizing health equity and making sure compliance is that foundational component. But that's just the foundation. We are more than just compliance, which is important; we also want to make sure we are living up to our most important priorities. An audit would have that technical audit of models using metrics, as well as performance stratified by population. That stratification could be race, gender, age, disability status, rural versus urban — there are so many ways you can stratify, looking at vulnerable populations where a model may not work. And then you make sure that you have enough data, enough input, enough information to be able to build a better model for everyone the model may have an impact on.
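Before turning to the other side of an audit, here is a minimal sketch of what the over-time, stratified piece of such a technical audit could look like, assuming a hypothetical scored DataFrame with month, group, true outcome, and model score columns, and an illustrative institutional AUC floor:

```python
# Minimal sketch: AUC per month and subgroup, flagging drift below an
# institution-chosen floor. Column names and the 0.75 floor are illustrative
# assumptions, not values from the talk.
import pandas as pd
from sklearn.metrics import roc_auc_score

AUC_FLOOR = 0.75  # hypothetical institutional performance threshold

def monthly_subgroup_audit(scored: pd.DataFrame) -> pd.DataFrame:
    """Track stratified performance over time against the institutional floor."""
    rows = []
    for (month, group), chunk in scored.groupby(["month", "group"]):
        if chunk["y_true"].nunique() < 2:
            continue  # AUC undefined when only one outcome class is present
        auc = roc_auc_score(chunk["y_true"], chunk["y_score"])
        rows.append({"month": month, "group": group, "auc": auc,
                     "below_floor": auc < AUC_FLOOR})
    return pd.DataFrame(rows).sort_values(["month", "group"])
```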
The other side of an audit — and this is definitely in alignment with ISO or NIST — is the process audit. Maybe as an institution you can technically do something, but more importantly, are you following your policies and procedures?

So when it comes to responsible AI, there are certain components, or steps, that you can take during the machine learning ops process to measure fairness and measure explainability in a model. In a typical machine learning development practice, you start with your data, you train your model, and you check the area under the curve — the performance. Sometimes that's where the traditional process kind of ends: okay, my area under the curve is 0.85, great. Well, in a fairness or responsible AI framework, you would do that subpopulation-stratified performance check. You are looking to make sure that, okay, at the aggregate level the model performs at, say, a certain accuracy level; then you stratify by race, ethnicity, gender, et cetera, and see how the model performs. You can also do it not only by performance but also by fairness metrics, where you are actually measuring the output of the model. Rather than model accuracy, the output of the model may be predicting whether or not somebody would benefit from a particular good or service — so is the distribution of that good or service equitable across subpopulations? Then if it's not — if the performance of the model or the output of the model falls below a threshold — you could apply various mitigation methods.
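One minimal sketch of such a mitigation step is a simple post-processing adjustment: choosing group-specific decision thresholds so that roughly the same share of each group is flagged for the good or service. The variable names and target rate are illustrative assumptions, and this is only one of many possible methods.

```python
# Minimal sketch of a post-processing mitigation: per-group thresholds chosen
# so each group is flagged at about the same rate. Names and the 20% target
# are illustrative assumptions.
import numpy as np

def equalized_selection_thresholds(y_score, groups, target_rate=0.2) -> dict:
    """Return a score threshold per group that flags roughly `target_rate` of it."""
    y_score, groups = np.asarray(y_score), np.asarray(groups)
    thresholds = {}
    for g in np.unique(groups):
        scores_g = y_score[groups == g]
        # The (1 - target_rate) quantile flags about target_rate of the group.
        thresholds[str(g)] = float(np.quantile(scores_g, 1 - target_rate))
    return thresholds
```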
And these — some are machine learning methods, some are statistical methods; there are a lot of different methods out there, a lot of different ways of rectifying your model and then recycling it through this whole process again. We can't do a talk about fairness metrics without acknowledging that there are often trade-offs. It's very difficult to find that precise balance between accuracy and fairness. Often what we will find, especially with a large dataset that has enough of your small sample size or subpopulation represented, is that your performance may not go down significantly. Or if it does, you may find you need a larger population. There are a lot of different things you need to do, but the important thing is that these fairness metrics do have trade-offs.

So we are getting close to the roles. At the very beginning, you said I had a role to play — I want to know what my role is. What am I supposed to be doing to make sure the future of AI in healthcare is equitable and fair? If you are a contributor to NLM and library science, there's an incredible amount of opportunity for NLM and librarians to contribute to responsible AI. One is the sustainable digital ecosystem and fostering that robust transparency environment for responsible AI development. The open science initiative is front and center, and open science is about ensuring AI research is transparent, reproducible, and accessible. There's an opportunity for increased transparency — meeting institutional guidelines on stratified populations.
So there is an incredible opportunity to marry the open science concept with responsible AI, with products that could help the nation at large.

We talked a bit about workforce development. This is one of the most amazing things about being in tech and healthcare IT compared to IT in general: you look at tech companies, and one thing that sets health IT apart is the diversity. Historically that has been because of the diverse pathways into informatics. Nursing informatics programs across the country produce some of the most lived-experience-diverse people in healthcare IT. They don't stay in classical informatics roles; they are data scientists, they are programmers, they are part of the broader tech ecosystem. And librarians who are informatics-trained as well bring the same broad lived experience that really helps to augment our teams.

Then there is community engagement — the role of medical library science when it comes to disseminating information, not only to medical practitioners but to patients and to the community, through relationships. At the University of Utah, where I was the chief information officer, we had a wonderful patient library staffed by librarians who were, of course, medical-library trained. That ability to translate digital literacy to patients, to provide that conduit, was incredibly important, and we are going to need it in this new AI world. Patients don't understand AI, and we need to provide that capacity building to give the broader community the resources to thrive in AI decision making. And then partnerships in general.
So there are lots of wonderful opportunities for NLM, and for librarians across the world, really, to drive responsible AI practices.

For developers, it really is designing your machine learning models with subpopulations in mind; testing thoroughly, not just in a box; making sure you are listening to stakeholders; and adopting new methods. New methods are being developed frequently, often specifically to address fairness and responsible AI principles — explainability, transparency — so keep up to date.

For health equity advocates, it is going from change agents to innovators. Often when we think of health equity, advocates are in the room, but there's this great opportunity to participate in AI and technology governance, making it an incredibly high priority and having that seat at the table. When you have that seat at the table, make sure you are expanding that seat to others, including diverse community voices, and recognizing who is missing from a lot of the decision making when it comes to AI. Measure inequity and fairness, and ensure that equity gaps drive innovation: when you see a gap, make sure we are actually addressing it, using our innovation engine to address that gap.

For community members, it's demanding transparency and accountability. So often the current state of AI governance is actually pretty immature, yet AI governance is the way for an institution to be accountable for an AI system it has deployed. Patients need to demand that accountability and provide that feedback.
It could be gentle feedback, or it could be red-teaming type feedback — I won't go into red-teaming right now — but some of the most interesting ways of driving change have come from patient advocates who have found, essentially hacked, and made visible vulnerabilities that put patient privacy and safety at risk, and then advocated for responsible AI.

The role of clinicians is understanding the limitations of AI, getting involved in development, and then making sure that we clinicians remain the human in the loop.

With that, thank you very much. I really look forward to answering your questions. That was kind of a whirlwind tour — I did a little of everything there — and I look forward to your questions.

>> Ken Koyle: Thank you, we appreciate it. We do have time for questions from our audience. I'm Ken Koyle, and I will be moderating the Q&A session. As a reminder for those watching the NIH videocast, please click the live feedback button to send any questions for Dr. Hightower. The first question we have comes from one of our viewers: Are you seeing any positive swing already in health equity as AI is implemented? Can AI help prevent our human bias, and if yes, how is that data being shared with clinicians? How do we point out the benefits to help prevent clinical bias, in things like creating patient notes where the clinician may inadvertently introduce a biased tone?

>> Maia Hightower: That was like five questions in there.

>> Ken Koyle: Sorry about that.
Starting with — so AI is relatively new in clinical care — but are you seeing a positive trend so far?
>> Maia Hightower: Yes, I'm definitely seeing a positive trend when it comes to stratified measurement of the performance of models. Right now, AI is a relatively new science when it comes to clinical trials and outcomes-based measurement of the impact of AI in healthcare, even though it is being deployed. It's still in its infancy in clinical trials; that's a whole other talk. People talk about the need for more clinical trials. But as trials are actually being launched right now, they almost universally have health equity in mind, especially those that are NIH funded or funded by major donors. Almost all of them in the last two years include a health-equity-by-design component, which is extremely promising. We have better mechanisms today, for AI moving from, say, the last year forward, than we did two or three years ago. So the trend is positive. The challenge, though, is that a lot of our mechanisms for measuring health equity really were, and are, based on predictive models — less so on LLMs and generative AI. We don't really have good mechanisms for measuring the impacts and outcomes of generative AI. These models are being deployed for clinical documentation, sort of lower-risk output. But even to your caller's question about the tone of those generative AI models as they help with clinical documentation, or even chatbots for patients, those studies are still yet to be done.
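Those tone studies are tractable with standard NLP tooling. A minimal sketch of the kind of analysis described next — scoring sentiment over a batch of clinical notes and comparing across demographic groups — is below. The model name, column names, and file name are illustrative assumptions, not a description of any specific study.

```python
import pandas as pd
from transformers import pipeline  # assumes the Hugging Face transformers package

# Illustrative: a generic off-the-shelf sentiment model; a real study would
# validate a clinical-domain model and de-identify the notes first.
sentiment = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

def sentiment_by_group(df: pd.DataFrame) -> pd.DataFrame:
    """Attach a sentiment score to each note, then summarize by demographic group.

    Assumes df has columns: note_text, race, sex (illustrative schema).
    """
    results = sentiment(df["note_text"].tolist(), truncation=True)
    df = df.copy()
    # Map POSITIVE/NEGATIVE labels to a signed score so group means are comparable.
    df["sentiment"] = [
        r["score"] if r["label"] == "POSITIVE" else -r["score"] for r in results
    ]
    return df.groupby(["race", "sex"])["sentiment"].agg(["mean", "count"])

# notes = pd.read_csv("deidentified_notes.csv")   # illustrative file name
# print(sentiment_by_group(notes))
```

A gap in mean sentiment between groups is not proof of bias on its own, but it is exactly the kind of signal that motivates the deeper review described next.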
And they can be done. We have good methods for taking a clinical note and being able to detect sentiment, and there's a good probability that study is under way, concurrent with the deployment of generative models in healthcare. It's pretty easy, in our minds, to already think about what methods you could use: take an aggregate of notes and assess, say, sentiment and tone — some of the work at UChicago has done that kind of sentiment analysis by race and demographics.
>> KEN KOYLE: Okay, wonderful, thank you. Our next question: this was a fascinating and important talk, thank you. To what extent, if at all, do you see value in the contributions of the field of the health humanities in helping us navigate the current and future landscape of AI?
>> Maia Hightower: Health humanities — when I think of health humanities — I was a science and technology major in college, so health and humanities were a big component of my undergraduate training. When I think of health humanities I think of the philosophy of science, the anthropology of science — there are like two others — the sociology of science. Those all play a big role in how we train the workforce that is developing models and AI governance, and in having that broad perspective. When it comes to our AI governance, almost all have ethicists; definitely on the national panels and work groups I've been a part of, ethicists play a huge role in AI governance, in defining the principles that are important, and in ensuring that health equity is one of the driving principles by design.
So whether it's ethics, anthropology and understanding human behavior, or sociology and how we work together, they play an important role, either in the workforce or in how models are developed and deployed.
>> KEN KOYLE: Thank you. Next question. This is a little more direct and perhaps less complex. A general question: how are the healthcare language models trained? How do we decide what to train with?
>> Maia Hightower: Great question. Most are general-purpose models. They can be fine-tuned by healthcare systems, but especially the first-generation, second-generation OpenAI large language models, they are generative, general-purpose models, not purpose-built for healthcare. There are newer ones being developed that are healthcare-specific, but they have yet to be as widely deployed or tested. At the health system level, many are doing local fine-tuning using their own data, whatever data they have access to, to improve the accuracy for healthcare.
>> KEN KOYLE: Okay, thank you. This one is maybe a little more complicated. Great talk, this writer writes. When there's low-hanging fruit in healthcare — making sure everyone who needs insulin can afford it, problems we know how to solve if we just spent the money — why should we explore AI? Does AI have a role in solving some of these problems that, I think, are widespread and easily recognized but hard to solve?
>> Maia Hightower: Yeah. When it comes to the unhoused, and the incredibly high cost of medications for those who may be underinsured or lack insurance, AI may have a role to play. But is that the best use — should we be focusing on these more complex issues versus AI? Is it competing priorities that result in a diversion of resources? That is my interpretation: this is a resource allocation question, and more of a resource deployment type of question. I think that's one of the challenges with healthcare. You have so many stakeholders driving their strategic priorities, and ultimately decisions are made by those who are driving whatever institution they are part of toward those strategic priorities. So for a healthcare delivery system, housing the unhoused isn't a strategic priority, but it is a strategic priority to understand how social determinants of health may be affecting patients and to translate that impact into a unit that is addressable for that component of the healthcare delivery system. And actually, AI may be able to help identify and connect information. There may be that ability — say a patient needs some sort of social determinant of health risk factor addressed — to connect them to the appropriate resources. That's where AI may actually be beneficial. Now whether or not — since often in decision making at the health system level, the level I can most talk about, those types of decision making — but that's at the very local healthcare delivery sort of unit level.
On a national level, that's very different, right? That's our policy makers. That's the funding mechanism for NIH and who is driving that decision making on what gets funded. And so, I think the political reality is that determining, or having influence on, who is making those political decisions about how we spend broad taxpayer dollars is something we all need to reckon with, trying to make sure that our policy makers really are driving the decisions that are most important to us as a collective.
>> KEN KOYLE: Thank you. Thank you, Dr. Hightower. Our next question, I think, drills down a little bit into the roles you talked about in your lecture. This is a question regarding the constituency represented by the two sponsoring organizations of this lecture, NLM and MLA. The question is: as librarians and informationists, we support and advise researchers who will be increasingly relying on AI-generated search results for their research. You talked about the institutional role for institutes, but how can we make sure their queries return equitable and unbiased results?
>> Maia Hightower: That's a great question. When it comes to querying, one part is the tool — like understanding the limitations of the tools. So for your researchers, or even in your library, are you accessing large language models to help with that research procurement, like the literature procurement process? Understand the limitations of such tools. And then understand how to prompt.
There needs to be a huge amount of training on prompting to ensure that the output of a prompt actually matches the intention of the investigator or researcher or librarian. There's this whole emerging set of methods around how best to prompt — prompt engineering. Whoever had heard of prompt engineering as of one year ago? This whole thing, prompt engineering. I think those tools, and that deployed understanding, will be increasingly important, not only at the library level or for individual librarians helping researchers, but even when it comes to students and the general public — like who is getting the most benefit out of generative AI, and being able to detect problems. Because really the whole point of generative AI is that the human in the loop should have enough knowledge about the subject matter they are researching to be able to identify when something may be a hallucination or may be a falsehood. That's the human in the loop. The role of the librarian as well is to be that bit of a filter, especially if you have a novice doing the query. I think there's a lot of benefit to be had in understanding the tools, understanding prompting, understanding the limitations. And there's more, still evolving: like how do we use corpora of trust, knowledge banks of trust, versus a whole repository like a large language model that is unrestricted? And it depends on how your health system or library has deployed that generative model within your environment — I think Stephen talked a little about knowledge graphs.
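A minimal sketch of what "querying against a corpus of trust" can look like in practice — retrieving passages from a small vetted collection and asking the model to answer only from those passages. Everything here (the collection, model name, and prompt wording) is an illustrative assumption, not a description of any NLM or MLA system.

```python
from sentence_transformers import SentenceTransformer, util  # assumes sentence-transformers

# Illustrative vetted collection; in practice this would be a curated set of
# guideline excerpts, systematic reviews, or a knowledge-graph export.
trusted_passages = [
    "Passage 1: ...",
    "Passage 2: ...",
]

embedder = SentenceTransformer("all-MiniLM-L6-v2")
passage_vecs = embedder.encode(trusted_passages, convert_to_tensor=True)

def build_grounded_prompt(question: str, top_k: int = 3) -> str:
    """Retrieve the most relevant trusted passages and wrap them in a prompt
    that instructs the model to answer only from the provided sources."""
    q_vec = embedder.encode(question, convert_to_tensor=True)
    hits = util.semantic_search(q_vec, passage_vecs, top_k=top_k)[0]
    context = "\n\n".join(trusted_passages[h["corpus_id"]] for h in hits)
    return (
        "Answer the question using ONLY the sources below. "
        "If the sources do not contain the answer, say so.\n\n"
        f"Sources:\n{context}\n\nQuestion: {question}"
    )

# prompt = build_grounded_prompt("What does the guideline say about X?")
# The prompt is then sent to whatever enterprise LLM endpoint the library uses.
```

Restricting generation to retrieved, curated sources doesn't remove the need for a human in the loop, but it narrows what the model can hallucinate about and keeps the provenance visible to the librarian or researcher.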
And I think there's going to be an increasing role in making sure we are leveraging curated knowledge graphs when it comes to the data, the output, of some of these query searches.
>> KEN KOYLE: Okay, thank you. Next question. Can you speak a little about the potential negative effects of AI on the privacy and confidentiality of patients and populations at large?
>> Maia Hightower: Absolutely. Right now there are a lot of concerns about privacy. HIPAA still stands. There are still huge concerns, especially with generative models, about what happens to personal health data when you just enter that information into a query, into a prompt. So there are huge concerns about privacy and security. I would say it depends on who you are. If you are a patient, definitely take care: you would want to limit the amount of information you provide to, say, a generative model — say OpenAI's ChatGPT, or something like that. And then for health systems, it's making sure you are creating environments that are as secure as possible. There's a difference between open AI tools and an enterprise-grade large language model within Azure or a Microsoft or other cloud environment that is more locked down. It's not 100%, but at least the data doesn't leave, or isn't further reused or repurposed for further fine-tuning of the model. So the way the model is deployed in your system, whether your library or your health system, is really important to minimize some of the privacy concerns and the potential for HIPAA violations.
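One concrete guardrail health systems and libraries put in front of such deployments is scrubbing obvious identifiers before a prompt ever leaves the local environment. The sketch below is a deliberately simple, assumption-laden illustration — a few regex patterns standing in for real de-identification tooling, which is far more thorough.

```python
import re

# Illustrative patterns only; production de-identification uses validated tools
# and covers all 18 HIPAA identifier categories, not just these.
PHI_PATTERNS = {
    "MRN": re.compile(r"\bMRN[:#]?\s*\d{6,10}\b", re.IGNORECASE),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
    "DOB": re.compile(r"\b\d{1,2}/\d{1,2}/\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scrub(text: str) -> str:
    """Replace recognizable identifiers with typed placeholders before the text
    is sent to any external model endpoint."""
    for label, pattern in PHI_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

prompt = "Summarize: pt John, MRN 00123456, DOB 04/07/1961, called from 555-867-5309."
print(scrub(prompt))
# Summarize: pt John, [MRN], DOB [DOB], called from [PHONE].
```

Names and other free-text identifiers need NER-based de-identification rather than regexes, which is one reason the "talk to your IT team first" advice that follows matters.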
There are a lot of do's and don'ts; it's easy for patients and clinicians to accidentally cross that line and create some pretty blatant HIPAA violations. So be careful. Talk to your IT team beforehand. If you are just opening up a web browser, just know that's the same as opening it up to the world, right? You just opened your whole front door. If you enter your information into an open web browser, just assume you have no privacy.
>> KEN KOYLE: Okay, understood, thank you. This next question refers back to what you were saying in your response a moment ago about researchers having the expertise to evaluate what they are seeing from generative AI. The question comes in: if we have in practice accepted that most individual providers don't have the time to critically appraise published evidence, point-of-care reference tools such as UpToDate, and guidelines built into the electronic health record, is it realistic to expect providers will have the time, and develop the skill, to evaluate the various implications of AI? And maybe this gets back to the people who are in the profession of helping providers to use these tools effectively and equitably.
>> Maia Hightower: Yeah, that's a great question. So, for good or bad, many of the health systems that are evaluating the output of generative AI recognize that doctors can be a little lazy at times — being a doctor myself, I will admit it as well — so they are on the lookout.
Many of the evaluation methods are looking for whether doctors revert to automaticity and automatically accept the output of generative models. There was a paper in the last couple of weeks — they were looking at the output, I can't remember exactly what the output was, but what they found was a lot of automaticity, hence that concern, but also a lot of errors. And how do we then create some guardrails? So that will be the next step: if somebody is automatically accepting every output of the model, that kind of tells you they are accepting the output without question, versus those modifying the output. With MyChart messages, you have one doctor who just 100% accepts and sends, and another doctor who 60% of the time is modifying the message, and there's variation in there. That variation in performance may later on result in prompting the clinician to revert back to the mean — check a few more message responses before sending them along. Researchers are aware of that problem and are trying to both measure it and develop methods for mitigating it. And it may be some sort of prompt for those who vary from the mean; we don't know yet.
>> KEN KOYLE: It's a new world. I think there are, as you said, some boundaries and kind of guide rails we need to establish. This is certainly one of them, one area where I think there's a lot more thought needed, and maybe this is where those humanists come in to really manage that and ensure the quality is still there, and laziness doesn't compromise it.
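The "who accepts every draft without editing" signal described above is straightforward to compute from portal message logs. A minimal sketch, assuming an illustrative log schema (clinician_id, draft_text, sent_text); the 0.95 threshold is an arbitrary placeholder, not a published standard.

```python
import pandas as pd

def acceptance_report(log: pd.DataFrame, flag_threshold: float = 0.95) -> pd.DataFrame:
    """Per-clinician rate of AI-drafted messages sent without any edit.

    Assumes log has columns: clinician_id, draft_text (AI suggestion),
    sent_text (what the clinician actually sent).
    """
    log = log.copy()
    log["accepted_verbatim"] = log["draft_text"].str.strip() == log["sent_text"].str.strip()
    report = (
        log.groupby("clinician_id")["accepted_verbatim"]
        .agg(acceptance_rate="mean", messages="count")
        .reset_index()
    )
    # Clinicians accepting nearly every draft unchanged are candidates for a
    # gentle nudge to review a few more responses before sending.
    report["flag_for_review"] = report["acceptance_rate"] >= flag_threshold
    return report

# log = pd.read_parquet("message_draft_log.parquet")   # illustrative file name
# print(acceptance_report(log).sort_values("acceptance_rate", ascending=False))
```

Verbatim acceptance is only a proxy — some drafts are genuinely fine as written — which is why the response described above is framed as prompting outliers rather than penalizing them.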
>> KEN KOYLE: Kind of related, and I think perhaps a very loaded and complex question: are there any legal systems in place to address using AI for patient treatment? For example, if AI were to suggest a specific treatment for a patient and they take that advice but experience adverse results, who is responsible? I think we are talking about a member of the public taking medical advice from an AI tool directly — obviously if a clinician is involved, the clinician is responsible. But if there's no clinician in the loop, is there anything to keep the AI models, kind of keep them honest?
>> Maia Hightower: Yeah, you are right, the question was more at the patient level, but I think it's important to address the clinician as well. So for the clinician: right now there are a couple of lawsuits that are actually under way. One of the lawsuits involves a clinician who followed an alert, an AI-generated alert, that prevented a patient from getting an intervention, or the care that was probably needed; but the patient fell outside the performance range of the model — the age range — a young person with chest pain. So the physician may still be found liable — we don't know yet — because the model suggested something that was outside the standard of care. And so the foundation for physicians is still to follow the standard of care. If a model tells you to do something outside the standard of care and you follow it blindly, you may be held liable.
If the model tells you to do something within the standard of care and you ignore it, you may still be held liable. Standard of care still trumps — or triumphs. Follow your training and the standard of care. In an ideal situation you get the co-pilot effect, where the model is providing you information in a spot where you may have missed the standard of care; then you pause, you have to think about it, and you say, oh, I might have missed that. And there's gray area — there are a lot of decisions that are within the standard of care. That's sort of how that augmented-physician role works in a legal framework right now. As far as patients, now this is a whole other thing, right? Where a patient is getting medical advice from an AI-derived system. This is where, even for OpenAI — if you put a lot of information into ChatGPT, it doesn't make a diagnosis; what it suggests is, this is something you may want to consider, but I'm not a clinician, talk to your doctor. So what patients really need to understand, from a legal perspective: a lot of the direct-to-consumer digital health solutions, when it comes to AI, if they are FDA approved, they have an incredibly high bar, and they are usually augmentative. I don't know of many systems that are FDA approved direct to patients. And then you are left with this wild west of consumer health. So it actually falls into a different sort of legal framework than actual diagnosis or treatment — what do you call it, digital health and wellness.
So yeah, it's a very gray area what patients will interpret as being medical advice, right? Even though there may be some small disclaimer that says this is not medical information, not a medical diagnosis or treatment, this is just information you should perhaps be sharing with a clinician. And right now, I think that is a bit of a gray area, because from a legal perspective there often are these disclaimers that it's not actually medical advice or medical information. It's kind of like for those who may get a stock tip or something: this is not financial advice, this is for information purposes only. Both finance and healthcare are highly regulated, so to give medical advice, there are only limited roles where that can occur legally.
>> KEN KOYLE: Right. Great answer to that question and to all the questions. Thank you very much. That is all the time that we have. I will note that we had quite a few more questions that came in. So, on your last slide, you provided some contact information; maybe we could put that back up again. Would you be okay, Dr. Hightower, if the people who weren't able to get their questions answered send them to you at the contact information on that slide?
>> Maia Hightower: Absolutely.
>> KEN KOYLE: We will put that slide back up. Thank you for the wonderful talk and for answering the questions from our audience.
I would also like to thank Dr. Stephen Sherry for introducing today's speaker, and thank you to the NLM and MLA; they produce and support several other lecture series, so please visit their websites to learn more. Thank you for attending, and have a safe and pleasant day. Goodbye.