The Amazon Web Services (AWS) Publishing Symposium session on quality, trust, and authenticity in #genAI takes a lot of interesting turns:
Annie Callahan, CEO, EBSCO Information Services – #GenAI for academic research: Structure, structure, structure. Years of cataloging and taxonomy have paid off for leveraging AI and confidently attributing sources. This is key to trust. Use case for librarians: AI insights into the full array of outputs from queries. This has given libraries more comfort. And users are not satisfied with one-line answers; they want to interrogate answers and determine how they were derived. That's a data literacy lesson for all of us to take note of.
Todd A. Carpenter, Executive Director, National Information Standards Organization (NISO) – There are technical problems, but also social ones: attribution needs attention from our community. With the original abstraction and organization systems, you wouldn't buy something if you didn't know what was in it. Searching the Internet changed that entirely. There's a trust problem for technologies; users are only going to use or pay for ones they can trust.
Safety and trust exist on a spectrum: for a bedtime fairy tale, quality takes a backseat. An academic paper? Quality is key. A doctor giving a major cardiac diagnosis? Safety and trust are beyond critical.
Manu Singh, VP of Data Science & Analytics, News Corp – AI can generate 10 different headlines, but at NYPost we're known for our headlines; that flavor comes from humans. Operational effectiveness can compress the time, but there's always a journalist, a human, at the helm. Compression causes information loss. Journalists are inclined to use AI, but skeptical about quality, safety, and trust.
Andrew Jones, Director of AI/ML Engineering, Wiley – We are encouraging writers to use AI in safe environments, and they want to! But how do you qualify the outputs from your AI? You'd better be tracking all your AI inputs, because what if someone comes after you legally? You'd better have an audit trail. Expertise is critical in judging the quality of outputs: the lower the end user's expertise, the more expertise is required elsewhere.
Lots of cases where a light lift takes nothing and turns it into something decent (on the way to perfect!). For example: visual models to improve alt text. Key at Wiley is the editor/writer relationship. We don't want to replace writers; we want to empower them with tools that will create better content.
Ed Klaris, Managing Partner, Klaris Law – Use anti-piracy software to check your AI-enhanced work before you put it out into the world... before someone else does and finds something off. The rule in the US is that AI-made content is not copyrightable. Even if the prompting is super sophisticated, it's not copyrightable; you're not considered an author for legal purposes. Once content is generated, you can manipulate it and potentially make it viable for copyright. But what you're putting out there could still infringe on someone else's work.
Amazon #ai #llm #awspubsymposium