According to a statement issued by Europol earlier this year, the growing volume of “media created or altered using artificial intelligence” could mean that up to 90% of online content is artificially generated by 2026. What this trend means for our critical thinking abilities, and whether it should concern us, remain open questions.
While AI applications have the potential to advance human progress and well-being by enhancing productivity, the technology has a darker side. It is increasingly used to produce large volumes of propaganda and misinformation, both in written form and through manipulated images or videos, with the aim of distorting narratives, swaying public opinion, and diverting attention from pressing news. This manipulation has already had significant repercussions for society, from the events of January 6th to barriers to healthcare created by health-related misinformation. A study by the World Health Organization, for example, found that “between 20 to 30 percent of YouTube videos on emerging infectious diseases contained false or misleading information.”
While many AI systems respond only to individual queries, and some developers are working to limit misuse, these systems are not impervious to exploitation. Users have repeatedly demonstrated ways to bypass the safeguards of platforms such as ChatGPT, and experts caution that our resilience to manipulated content still hinges on maintaining robust critical thinking skills.
Recently, psychologists at the University of Cambridge developed the first validated “Misinformation Susceptibility Test” (MIST) to assess an individual’s vulnerability to fake news. Americans under 45 performed worse on the test than older adults, answering an average of 12 out of 20 items correctly, a result linked in part to the amount of recreational time they spend consuming online content.
Every day, individuals rely on their own judgment to separate truth from falsehood. The Europol report notes that people generally trust audio and visual recordings as accurate depictions of events, and warns that such media can be manipulated to fabricate or distort what actually happened.
A Forbes Advisor survey found that only 56% of consumers feel confident in their ability to distinguish real from AI-generated content, while 76% are concerned about false information originating from artificial intelligence. A UK study by Public First on public attitudes towards AI found a willingness to embrace certain uses, such as early health warnings, but resistance to AI involvement in decision-making, particularly in legal or military contexts.
In a landscape where up to 90% of content is AI-generated, traditional critical thinking tools, such as lateral reading and deliberately rotating one’s sources, may need to be reassessed. These techniques encourage individuals to actively seek out diverse information sources and to challenge their own beliefs, but their efficacy in a predominantly AI-generated environment warrants further evaluation.
To combat the proliferation of misinformation, particularly among younger generations, cultivating critical thinking skills is imperative. Empowering individuals to think critically helps them resist the propaganda and manipulative tactics that pervade daily life, from gaslighting to political radicalization. As Jon Atack, President of The Open Minds Foundation, argues, nurturing a culture of critical thinking is essential for navigating the complexities of the modern information age.