Why I fired ChatGPT as my research assistant

In writing my recent article on mindset and self-tracking, I turned to ChatGPT to help me find academic journal articles on the effectiveness of self-tracking in behavior change.

At first glance, I was super impressed. Within 10 seconds, it generated a list of academic journal articles I could cite to support the power of self-tracking.

Did I just save hours of researching articles?

Before I was ready to cite the articles, I was curious whether it had pulled studies with positive results... or simply papers whose titles fit my query. I wanted a glimpse of the key results from each study, so I asked ChatGPT to summarize the articles.

My research assistant let me down. So I tried a simpler request: just pull the abstract from each study.

Something seemed off. To figure out where ChatGPT was getting stuck, I decided to pull up the articles myself, and I quickly found the problem: the first article didn't seem to exist.

So I looked for the second... Strike 2!

There was an article with a similar title, but the lead author didn't match the citation. Where was K.M. Lichtman?

I found Lichtman, but K.M. seemed to work in a completely different field of research.

Despite the journal citations looking incredibly legitimate, ChatGPT had completely fabricated them.

Here are my 3 key takeaways from this week's use of ChatGPT:

  1. ChatGPT can be a very convincing liar. It didn't hesitate to respond to my prompt requesting academic sources. It even formatted the citations in proper APA style, identified multiple authors, and generated the name of the publication. However, when I fact-checked the sources, the articles simply didn't exist, and while the authors may have existed, they weren't in the field of study I was looking for.
  2. ChatGPT is self-critical about some of its limitations. I found the response that the task "goes against my capabilities as a language AI model" very interesting as a disclosure of its own limits. I'm curious to discover which other requests trigger this kind of response.
  3. ChatGPT requires the operator to know its limitations and how to develop prompts that produce useful outputs. Trying to find some redemption for my AI friend, I gave it one last chance. Summarizing a research article seemed like a reasonable task, so maybe I just needed to ask in a different way. I took the abstract of a research article I was an author on and pasted it into ChatGPT. It actually did a really nice job when I asked, "Can you summarize this to a 12 year old?" (a rough sketch of this workflow is included below).

I asked again, but this time for a more mature audience.

It did a great job adjusting the description for the intended PhD audience.
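
For anyone who would rather script this step than work in the chat window, here is a minimal sketch of that summarize-the-pasted-abstract workflow. It assumes the OpenAI Python client (the openai package) and an OPENAI_API_KEY environment variable; the model name, prompt wording, and function name are illustrative placeholders rather than anything from my actual ChatGPT session, and the abstract still has to come from a paper you've verified actually exists.

```python
# Minimal sketch (not my actual ChatGPT session): summarize a pasted abstract
# for a chosen audience. Assumes `pip install openai` and OPENAI_API_KEY set.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def summarize_abstract(abstract: str, audience: str) -> str:
    """Ask the model to summarize a verbatim abstract for a given audience."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {
                "role": "system",
                "content": "Summarize only the text the user pastes. "
                           "Do not add facts or citations that are not in it.",
            },
            {
                "role": "user",
                "content": f"Can you summarize this for a {audience}?\n\n{abstract}",
            },
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    abstract = "(paste the abstract of a paper you have verified exists)"
    print(summarize_abstract(abstract, "12 year old"))
    print(summarize_abstract(abstract, "PhD-level research audience"))
```

The key difference from my failed experiment is that the model is only summarizing text I supplied and verified, not retrieving sources on its own.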

Even with that redemption, ChatGPT remains fired as my Research Assistant...but I'm keeping it around as my Writer's Assistant for now.

Frederick Heim, Adjunct Professor, MBA

NAVY // NIKE // Marketing // MBA // Pepperdine Graziadio

2y

AI could be the Balderdash champion of the world. Great post, Josh!

Ramsay Brown

Building and Securing Synthetic Labor. CEO @ Mission Control AI. Research @ Cambridge.

2y

As my shop teacher's shop teacher told him: "For every tool, a purpose. For every purpose, a tool"

Steven Antturi

A Lazy Perfectionist - Doing things with excellence in the most efficient way possible!

2y

Another example of people not understanding what ChatGPT actually is and does. (Sorry for the tone; I don't mean to be condescending in any way.) Once something like ChatGPT is connected in real time to the internet, then it might start to function as a research assistant. Right now it would only be of help to a human research assistant to write summaries. It's really good at summarizing or at comparing two articles and finding common ideas. What should amaze us with the current state of ChatGPT is not that it gets the answers correct, but that it understood us at all. This will be an interesting year to see how all these AI tools develop.

Brian Kelly

STEM.org (CTO) | #GiveFirst | Techstars Mentor | Building with …. Helping democratize education and tech. | Mental Health | Sports Philanthropy | #Rugby #Esports #EdTech #SportsTech #STEAM

2y

I’ve requested more information from Jasper Knoop; my theory is these journals/papers were removed for one reason or another - indexed in 2021 and no longer available in 2023. I will do my best to update this comment once I receive more information from the first author.

This is awesome! I think you said it well - to really leverage the technology, you have to be hyper aware of its limitations. Excited to read the next one!
