Learning how to think

All through school we are required to write. Writing makes us think. Teaching us to write clearly is a significant goal of education.
There are now artificial intelligence (AI) programs that can create stories, novels, nonfiction books, answers to exams, blogs and whatever else you want written.
They can generate ideas, edit and proofread. Even though they are relatively new, they are pretty good. Imagine how much better they will be in the next couple of years.
It all sounds wonderful, except it leaves out what you gain by doing your own writing — exercising your brain and enhancing your ability to think and present your thoughts to others.
Let’s say Johnny took an English Lit class. He was required to write an article about Hamlet. He went to a frat party instead of doing the assignment.
No matter. He turned on his laptop, opened one of his three AI writing apps, and typed in Hamlet, Shakespeare, critical essay, 500 words, use Johnny’s style.
Before he could get over to the printer, the essay came out, double-spaced and printed on both sides. He turned in the paper and got an A on the assignment.
Johnny couldn’t participate in the class discussion because he hadn’t read the play. Still, he passed the course because the essay was so good.
Johnny got through most of his college classes using AI to do his homework. His laptop passed more courses than he did. He graduated — but wasn’t educated.
Johnny got a job. He was enthusiastic and eager. He bought a new, more powerful laptop and loaded it with advanced AI writing programs.
Johnny’s boss called a meeting. Everyone at the conference table was as dependent on AI as Johnny. Even the boss used an AI program to research solutions to the company’s problems.
The issue under discussion was clear. Each person in the meeting typed in some keywords, confident that the app would find a solution to the company's crisis. It didn't work.
Even though the AI programs could retrieve everything that had ever been posted on the internet, they couldn't tell good from bad or true from false. They couldn't think. They lacked judgment.
Some of the solutions they came up with had proven disastrous for other companies. Other solutions were downright weird.
Nobody at the table had gotten out of education what education is supposed to offer — an enhanced ability to think and present ideas. I wonder whether the company could survive with a laptop and a computer app acting as CEO.
Taken to its logical conclusion, if AI becomes dominant and permissible, we might end up with smart laptops and non-thinking humans. Seems to me we have enough non-thinking humans already.