By Alimat Aliyeva
According to a new study, using ChatGPT and other artificial intelligence tools may lead to a more superficial understanding of information compared to traditional Google searches, Azernews reports.
Researchers at the University of Pennsylvania conducted a series of experiments involving nearly 10,000 students, comparing learning outcomes when using AI versus search engines. They found that participants who used ChatGPT provided shorter and less detailed explanations, remembered fewer facts, and were less likely to summarize the information they received. Similar patterns emerged when studying topics related to health and gardening. Additionally, other students rated the advice generated by AI as less helpful.
Experts from the University of Agder in Norway also commented on the results.
“When we search for information on Google, we have to put in the work: evaluating results, selecting sources, reading and interpreting data, and then independently assembling individual fragments into a complete picture. This takes time and effort—and it’s precisely this effort that promotes deeper learning,” explains Morten Goodwin, a professor of artificial intelligence.
The study did not assess the accuracy of AI responses. However, other scientists, including experts from OpenAI, note that “hallucinations” remain a fundamental problem even for the most advanced AI models. AI often guesses answers and can provide incorrect information instead of admitting it doesn’t know. According to Erik Velddahl, a professor at the University of Oslo, ChatGPT is not a knowledge base but a language model that generates the most likely sequences of words.
Researchers emphasize that the issue is not the AI tools themselves, but how people use them. When used correctly—such as treating AI as a starting point or a tool for idea verification—it can complement learning rather than replace critical thinking. In other words, AI can serve as a helpful compass, but the journey to real knowledge still requires effort from the learner.