Are kids using ChatGPT for school projects?

Students and parents must know that ChatGPT and its variations come with a whole list of associated risks.

The short answer is yes; students are using ChatGPT at school. Even though most AI tools state a minimum age for unsupervised use, many students bypass the rule and use ChatGPT to generate content that they later submit as their own. OpenAI’s ChatGPT is only one of many options; other tools are either built on ChatGPT or developed by OpenAI’s competitors, e.g., Microsoft’s Bing Chat and Google’s Bard AI. University students use these tools for assignments, too. It is an undisputed fact that AI tools are being used by children in the USA.

ChatGPT’s risks

However, students and parents must know that ChatGPT and its variations come with a whole list of associated risks. Educational institutions are getting up to speed fast, and students could be accused of cheating if caught using AI tools such as ChatGPT. Even though there are no perfect plagiarism checkers yet, multiple tools that can detect AI-generated content already exist, and they are readily available to teachers. Such checkers are often unnecessary anyway: teachers familiar with their field of study can easily recognize fabricated or untruthful content. AI chatbots sometimes make up dates, facts, and even entire articles, and students often learn this the hard way.

ALSO READ: Back-to-school cybersecurity tips for parents and children

Other risks associated with such tools stem from the recently reported ‘dumbing down’ of ChatGPT and similar services. OpenAI has openly denied such claims, but many users have noticed a significant decrease in the quality of ChatGPT’s answers over the last few months. Even though the ‘dumbing down’ is disputed by many, students must understand that chatbots are not a cure-all and that they still need to research their topics thoroughly. Some people describe chatbots as glorified typewriters, unable to generate genuinely new content and ideas and limited to reusing what’s already available. And the availability of AI bots is controlled by the companies that own them.

Future-proofing must also be a concern for students using the new technology. Even if they manage to pass an exam or submit an assignment now, it may backfire later. Universities and schools may one day reexamine the work and realize that the content was AI-generated and plagiarized, which could void a diploma or a certificate. Students must remember that ChatGPT remains largely unregulated.

ALSO READ: Back to school cyber security tips

Conversing with an AI-based chatbot may be exciting, but students need to know that the replies they receive might not be reliable. Answers could be dangerously misleading. The bot might be pushing a political agenda and/or misrepresenting facts. ChatGPT has shown political bias and has been biased on many sensitive topics. Chatbots often notify users that they could potentially display inaccurate or offensive information… and this is the cold truth.

Students might feel tempted to take advantage of the tool and use it to write a book report to save time. However, actions have consequences, and relying on this new, still-maturing technology comes with risks. At least for now, reading the book and doing the work the old-fashioned way is the safest approach. Many lawyers and “journalists” have already learned this lesson the hard way.

The post Are kids using ChatGPT for school projects? appeared first on Panda Security Mediacenter.
