Should we be scared of ChatGPT?
Dr. Caroline Ritter, MBA, DBA
IT Strategy | Global Service Delivery | Agile Transformation | DevOps | IT Application Delivery | Risk Management | Cost Control | Operational Efficiencies/Automation | Data Privacy/Security
Pt. 2 of 2
As a continuation of my previous post about the use of ChatGPT in education, where else could it be problematic?
The topic of my doctoral dissertation was understanding the user-based barriers to the adoption of AI in healthcare. My research was based on a survey of emergency room physicians to get their opinions on the various applications of AI in healthcare and why they would or wouldn't want to use it. I expected one of the primary answers to be that the doctors would feel threatened by the technology and fear for their jobs. In fact, this wasn't a concern at all. Instead, the doctors were mainly concerned about trusting the outcomes of the technology for their patients. So how do we improve that trust?
I spoke to Cliff Lee this week, who created a ChatGPT demo for a healthcare use case with controlled training data, in this case for multiple sclerosis (MS). You can try the demo here. Controlling the output by using clinically valid training data ensures the responses are predictable and, therefore, can be trusted. Such a deterministic solution could receive regulatory approval. Similar to the educational use case, it does seem that we can use the tools… but, again, change the rules.
Finally, the topic that I discussed this week with Allison Todd from Gartner was the data privacy implications of using ChatGPT. After two years of leading Cigna's GDPR efforts, it should have occurred to me earlier, but since my conversation with Allison, this is now my largest trust concern - what about the data that we enter into ChatGPT?
If I'm working on a confidential work project for a client and I use ChatGPT to do some research, could the input be aggregated with other information and searched to obtain information that would otherwise be confidential?
Or, from a personal standpoint, what if I search for the implications of a medical diagnosis that I just received? Could this information be made available in a way that it could be used against me?
As privacy regulations continue to evolve, the rules will continue to change, but will they be able to keep up with the tools?
What other topics have come up in your conversations? As we know, these tools will be used; what other rules need to change?
Disclaimer: the opinions expressed in this article are my own and do not represent those of my employers, past or present.