Vol 52 No 2 (2024): Published June 30, 2024

DOI https://doi.org/10.18799/26584956/2024/2/1676

ChatGPT and science: interaction between the scientific community and artificial intelligence

The modern information world is producing technologies that once seemed possible only in science fiction books and films but have now become reality. The emergence of generative neural networks such as ChatGPT has created a problematic situation in the scientific community and in education: whether and how neural network output can be used ethically in scientific work. A dilemma has arisen in both domestic and foreign intellectual discourse, making it difficult to determine the place of such tools and the actions permissible in scientific activity. Aim. To examine current attempts at interaction between the scientific community and artificial intelligence and to propose a model of successful communication. Relevance. The surge in public and scientific discussion about the impact of artificial intelligence and neural networks on the production of scientific knowledge. The article examines the origins of generative neural networks and their impact on human activity, including scientific activity. It investigates and criticizes the philosophical view that neural networks are a problem of modern society (N. Chomsky), and proposes a project for integrating neural networks and artificial intelligence into scientific discourse, based on the methodology of B. Latour and actor-network theory. The article defends the position that neural network products should be used as a tool for conducting scientific research. Conclusion. A solution to the problem of interaction between science and artificial intelligence is offered using B. Latour's actor-network theory, which allows the neural network to be considered an integral part of scientific activity.

Keywords:

ChatGPT, scientific activity, originality, actor‐network theory, artificial intelligence

Authors:

D.V. Malyarevich
