Why banning disruptive technologies (like ChatGPT) in Education is wrong.
Upon release, ChatGPT became the fastest-adopted technology platform in history; it was backed by one of the largest technology companies and was created by the people behind some of the world’s most impressive breakthroughs in the field of AI. It is also a student’s dream come true: it can churn out homework and test answers in a matter of seconds and score better than their peers.
Whenever a new disruptive technology is released, it is met with resistance. ChatGPT is no different; soon after its release came a flood of articles claiming the technology would replace human jobs, attempts to ban it, and arguments that it harms society in other ways.
On ChatGPT and Education
The Western education system is in need of serious reform regardless of the “challenges” it faces from AI, but perhaps ChatGPT could highlight some of its shortcomings and be a catalyst for improving the system. When policymakers put rules around the use of new technology, they need to consider whether it is the right thing to do or whether they are preserving the status quo because it is easy to do.
ChatGPT is the first widely available, publicly accessible large language model (LLM). However, it won’t be the last or the only LLM in the coming years. In fact, the underlying technology and algorithms that power LLMs are available to anyone, and given enough time and resources, most AI researchers and organisations could develop (and are developing) their own versions. ChatGPT simply has the advantage of being first. Like all new technology, it will take time for the school system to adapt and find checks and balances.
Why ChatGPT should be embraced in the classroom:
- Trying to ban it is fruitless and wastes a lot of time and effort — teachers and lawmakers can ban the user interface for ChatGPT, but for those with a little technological know-how, it’s easy to circumvent the ban or simply create an alternative interface to the underlying technology. Also, as stated…