ChatGPT Is Not Our Best Friend
Paul Major
English Language Specialist/International English Business Consultant/Translator and Interpreter
ChatGPT and other large language models have changed the game. One of the key features of ChatGPT and other newer models is the ability to generate human-like responses in real time based on the user's input. It responds to questions in a conversational tone as well as creating stories, essays, and poems.
All of this innovation brings privacy concerns, however. When interacting with such a "chatty" system, it is necessary to be mindful about what you share with it; it is just a chatbot, after all. So, to help you with your chats with ChatGPT, Gemini, Perplexity, Copilot, Llama or Claude, here is a list of things you should avoid sharing with AI chatbots for now:
Personal Identification Information: This includes your name, address, phone number, and even seemingly harmless details like your pet's name. This data can be used for impersonation scams or identity theft. Birthdays and social security numbers should also never be shared with our chatbot buddies. Despite OpenAI implementing privacy features to prevent prompt data from being used for training, these measures are not foolproof. Bugs or user settings can inadvertently expose sensitive data, which can then be used to train AI models and potentially be exploited by malicious actors. So do not fall into the trap of asking ChatGPT, or whichever chatbot you favour, to wish you a "Happy Birthday"!
Financial Information: Do not, under any circumstances, share your bank details, credit card numbers, or any other financial information with chatbots. They really don't need it, even if you are asking for financial advice, and moreover a data breach could expose your sensitive data. Sharing such data can lead to severe financial repercussions if it falls into the wrong hands. Also be on your guard against fake AI chatbot platforms that might attempt to steal financial information.
Company Secrets: Sharing confidential work information and sensitive company data with chatbots is another area where you run the risk of serious consequences. It could lead to a data leak or even put your employment in jeopardy. Several companies, including Samsung and Apple, have banned the use of AI chatbots for sharing proprietary information due to security leaks. The danger is too great, for now, so don't be lulled into passing on confidential company information.
Health Information: We've all been there - googling our sniffles at 3 am, convinced we have some rare disease. But while chatbots can be fun to play with, spilling your deepest health woes to them might not be the best idea. Think of it like talking to a stranger on a bus - interesting for a quick chat, but maybe not for your entire medical history. Until these AI bots are a bit more watertight, it's best to keep your own health info on lockdown.
Usernames and Passwords: This should be a no-brainer. Never share your login credentials with any chatbot. Hackers lurk around AI platforms, just waiting for low-hanging fruit. OpenAI experienced a data breach in May, highlighting the risks of data being accessed by unauthorised parties. Hackers often target login credentials, and reusing the same passwords across multiple platforms exacerbates the risk. Using a password manager like Proton Pass or 1Password is advised for secure storage.
Chat History: Irony of ironies, even your conversations with chatbots might not be entirely private. Some platforms have experienced data leaks or indexing issues that exposed user chats. The future of AI is promising, with potential for on-device, privacy-focused chatbots. Until then, however, be extremely wary of what you tell the chatbot. Don't share anything you wouldn't tell a stranger. Remember, the information you share can be used to train the AI and might not be completely confidential.
Vocabulary
Fall into the trap = Make a mistake or get into a difficult situation by doing something or by trusting someone
Lulled into something = To make someone feel safe in order to trick them
Watertight = Having no errors or loopholes; impossible to fault
No-brainer = A decision or choice that is very easy to make
Lurk around = To wait or move in a secret way so that you cannot be seen
Spill your woes = Tell someone your problems
#NewVocabulary #ArtificialIntelligence #DataSecrecy