
These issues interest me as both a teacher and a historian.
Modern students read little, and only what’s available online. If a book isn’t online, it doesn’t exist for them. They hardly ever go to libraries.
A separate issue is the use of AI. For an intelligent person, AI is a tool that facilitates information gathering, but for students it often becomes a substitute for independent inquiry and for research as such. The thirst for discovery disappears, because AI can do the searching in their place. Humans are creatures evolutionarily built to set goals and achieve them by overcoming difficulties; AI deprives them of this. Moreover, AI does not produce creators, only average performers. Take away such a person's phone, and we are left with a tabula rasa.
Another problem is the shift to online classes. It began during the Covid-19 pandemic, but the practice has since taken hold. Yet learning, and the transmission of live emotion (emotion is a necessary component of learning and engagement), is possible only through personal contact between instructor and students. Today the instructor often lectures into a void, not seeing the students' faces or eyes, merely hoping that someone is listening on the other side of the screen. Students simply listen, and if the lecture is recorded, they do not take notes.
A related problem is the practice of recording lectures by leading professors so that students watch the recordings instead of attending in person. The result is no live interaction and no emotional engagement, which matters especially in a humanities education.
And, of course, such people are easily manipulated; we are dealing with indoctrination. Information warfare has origins going back centuries: recall the propaganda leaflets of the Livonian War and the "magazine wars" between Russia and France in the 19th century. The Crimean War can be seen as an example of a war driven by public opinion.
Modern information warfare, also called cognitive or mental warfare, is waged in the information space and plays no smaller a role than real wars, and often a greater one. Here we are dealing not with isolated instances of historical falsification but with falsification on a mass scale. Examples include the falsification of the history of World War II, the attribution of equal responsibility for the outbreak of the war to the USSR and Nazi Germany, the theory of the two totalitarianisms, and the idea of Russian/Soviet expansionism.
All of this has a specific goal: to exclude modern Russia from the ranks of the victorious powers and to strip it of its status as a permanent member of the UN Security Council. Already, according to opinion polls, a majority of the French believe that the United States, not the Soviet Union, made the decisive contribution to the defeat of Nazism. This, too, is the result of mass indoctrination.
What would have seemed absurd 50 to 80 years ago is now perceived as historical truth, and in the context of digitalization such falsifications are spreading widely. As a result, historical memory is being reformatted and history is being rewritten to suit political expediency and to shape a desired future. Deepfakes produced by modern technologies mean that people can no longer distinguish truth from lies.
To distinguish truth from lies, one must be able to think independently. AI, however, is eroding this ability in the average person, in the masses.
But we must understand that digitalization itself is not evil. AI is a human creation; it contains only what humans have put into it. "All is poison and all is medicine, but the dose determines each," said Paracelsus. So it is with AI: it is a wonderful assistant, but everything depends on how it is used and toward what goals it is directed.
