Another reason to be wary of ChatGPT and the like -- they can guess your personal information based on what you type
Tate Jarrow
VP, Product @ ID.me | Product Leader | Cyber Security & Privacy Expert | Startup Founder (acquired) | Startup Advisor | x-USSS Special Agent | x-Army | x-Google | MBA | West Point Grad | Board Member
A recent study showed this information includes a person's race, gender, location, age, place of birth, job, and more...
ChatGPT and other generative AI chatbots can be pretty useful, but like a lot of technology products, especially free ones, they often come with hidden costs (like giving up your privacy). There has been no shortage of articles about the privacy dangers of using these AI chatbots:
"Individual users, or basically anybody who leaves textual traces on the internet, should be more concerned as malicious actors could abuse the models to infer their private information."
What does this mean for you?
I’m a firm believer that we should all be aware of the risks of the technology we use and make proactive decisions, rather than accidentally (or unknowingly) giving up our privacy or handing over our data for other uses.
It’s also important to understand when you may be giving up your anonymity. It’s easy to assume that common mitigations keep us private and anonymous (e.g., incognito mode in Chrome, which does NOT keep you private).
This means that if you’re using an AI chatbot, you should assume you have no privacy or anonymity.
And remember that AI chatbots are showing up in more and more places. Basically, anywhere on the internet where you type text and hit “search” likely already has AI behind it, or will soon.
Stay Safe!
Tate