AI and ChatGPT in Science and the Humanities - DFG Formulates Guidelines for Dealing with Generative Models

The Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) has formulated initial guidelines for dealing with generative models for text and image creation. A statement now published by the Executive Committee of the largest research funding organisation and central self-governing organisation for science and the humanities in Germany sheds light on the influence of ChatGPT and other generative AI models on science and the humanities and on the DFG's funding activities. As a starting point for continuous monitoring and support, the paper seeks to provide guidance for researchers in their work as well as for applicants to the DFG and those involved in the review, evaluation and decision-making process.

In the view of the DFG Executive Committee, AI technologies are already significantly changing knowledge production, creativity and the entire working process in science and the humanities, and they are being used in various ways across the research disciplines, albeit for differing purposes. In terms of generative models for text and image creation, this development is still very much in its infancy.

"In view of its considerable opportunities and development potential, the use of generative models in the context of research work should by no means be ruled out," says the paper: "However, certain binding framework conditions will be required in order to ensure good research practice and the quality of research results." Here, too, the standards of good research practice generally established in science and the humanities are fundamental.

In terms of concrete guidelines, the DFG Executive Committee says that when making their results publicly available, researchers should disclose whether or not they have used generative models and, if so, which ones, for what purpose and to what extent. This also applies to funding proposals submitted to the DFG. The use of such models does not relieve researchers of their own content-related and formal responsibility to adhere to the basic principles of research integrity.

Only the natural persons responsible may appear as authors in research publications, states the paper. "They must ensure that the use of generative models does not infringe anyone else’s intellectual property and does not result in scientific misconduct, for example in the form of plagiarism," the paper goes on.

Based on these principles, the use of generative models is permissible when submitting proposals to the DFG. In the preparation of reviews, on the other hand, their use is inadmissible due to the confidentiality of the assessment process, states the paper, adding: "Documents provided for review are confidential and in particular may not be used as input for generative models."

Instructions to applicants and to those involved in the evaluation process are currently being added to the relevant documents and technical systems at the DFG Head Office.

Following on from these initial guidelines, the DFG intends to analyse and assess the opportunities and potential risks of using generative models in science and the humanities and in its own funding activities on an ongoing basis. A Senate Working Group on the Digital Turn is to address overarching epistemic and subject-specific issues in this context. Any possible impact in connection with acts of scientific misconduct is to be addressed by the DFG Commission on the Revision of the Rules of Procedure for Dealing with Scientific Misconduct. The DFG will also be issuing further statements in an effort to contribute to a "discursive and science-based process" in the use of generative models.

The text of the statement is available on the DFG website.
