Research management

Sponsored by Elsevier

The GenAI awakening: navigating the new frontier of research support

As Generative AI gains traction in the world of research, Ryan Henderson and Ayla Kruis shed light on using it responsibly in research support

Utrecht University, University Medical Center Utrecht
9 May 2024

“Your scientists were so preoccupied with whether or not they could that they didn’t stop to think if they should.” Ian Malcolm’s prescient words from Jurassic Park ring true decades later as we discover the far-reaching implications of Generative artificial intelligence (GenAI). How can we harness the strength of this modern-day dinosaur, while minimising the risk that our choices come back to bite us?

[Image: AI-generated cartoon of a Tyrannosaurus rex and a scientist]
The perils of GenAI. We generated this figure in less than a minute using AI (Adobe Firefly). On the surface, it’s a pretty convincing cartoon of a Tyrannosaurus rex and a scientist. But can you spot the errors? While GenAI can create a plausible-looking image (or text), it is not designed to have an actual understanding of the content.

GenAI is transforming the way we approach research support and is being touted by some as the most disruptive technology since the internet. Whether you agree with this or dismiss it as hype, an increasing number of researchers are already using GenAI to write grant applications and scientific papers. Recognising this, the European Commission recently published living guidelines on the responsible use of GenAI for researchers, research organisations and funding bodies. This is a great starting point as we consider how we can use this tool, as well as how we should use it. 

Here, we provide concrete advice for research support professionals on how to responsibly integrate GenAI into their work.

Protect privacy and data 

GenAI models are trained on large amounts of (often publicly available) data, which could include sensitive information, and it’s usually not possible to delete data from a trained model. It’s therefore safest to assume that most GenAI tools are not GDPR-compliant. We recommend:

  • Treating GenAI inputs as if you were posting them publicly. If you wouldn’t post it in a public online space, don’t put it in an openly available GenAI tool.
  • Educating yourself about how the AI service provider handles the data you enter. Crucially, will your data be used to further train the model? Many tools allow you to opt out of this.
  • Using a local, offline model for more sensitive information. This requires more tech-savviness but can be worth it for the added privacy it offers; one possible setup is sketched below.
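
A minimal sketch of such a setup in Python, assuming the llama-cpp-python package is installed and an open model has been downloaded in GGUF format; the file path, prompt and parameter values below are placeholders rather than recommendations:

    # Querying a model that runs entirely on your own machine,
    # so prompts and outputs never leave your computer.
    from llama_cpp import Llama

    # The model path is a placeholder; point it at whichever open model you downloaded.
    llm = Llama(model_path="models/local-model.gguf", n_ctx=2048)

    prompt = "Summarise the main privacy risks of pasting draft grant text into online AI tools."
    result = llm(prompt, max_tokens=256)

    print(result["choices"][0]["text"])

Local models are typically smaller and less capable than large hosted services, so this route is best reserved for material you would not share publicly.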

Review all outputs for accuracy and bias

GenAI models recognise patterns without understanding the content. This makes them prone to hallucination: confidently asserting a “fact” that seems plausible but is nonsensical or unrelated to the input (for instance, a reference to a journal article that doesn’t exist). Since factual errors can be easy to miss, check any AI-generated output carefully. Be aware that you are always responsible for what you publish in your name, as most funders and publishers agree that AI cannot legally claim authorship.  

Human review can also help mitigate AI bias. Large models such as ChatGPT are trained on text written by humans, so they have inherited our biases and stereotypes. If the training data is biased or incomplete, the AI's outputs will also be biased or incomplete. To avoid factual errors and bias in your work, always critically review any GenAI outputs and be aware of what data set was used to train the model.

Be transparent about using GenAI

Given these potential issues, not everyone will be keen to trust GenAI. Have an open discussion with the researchers you support, covering which tools you prefer and what information they are comfortable sharing with those tools. Together, you can set clear boundaries for when and how GenAI can be used in your collaboration.

This extends to submitting funding applications and reporting to funding agencies. Follow the guidelines provided by funding agencies, and honestly disclose any substantial use of GenAI (in other words, anything beyond basic author support) in proposals.

Educate yourself and others

There is still so much to learn about how to use GenAI responsibly and effectively, and most higher education staff are in the same boat. What can research support staff do?

Stay up to date: as the technology and the laws surrounding it evolve, knowledge quickly becomes outdated. Staying – and helping colleagues to stay – up to date is essential. At our research institutions (University Medical Center Utrecht and Utrecht University), we are currently developing training on the effective and responsible use of GenAI for support staff and researchers, and we will update this training frequently.

Find or create a GenAI community: connect to professional networks within or outside your organisation to share knowledge and exchange best practices. Or find colleagues around you who share an interest in GenAI and create your own!

Share your knowledge and experience: you can introduce newcomers and “non-techies” to GenAI by curating a list of prompts and use cases and by sharing the basic principles of prompt engineering.
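
To illustrate those basic principles (give the model a role, a clear task, an audience and explicit constraints), here is a small Python sketch of a reusable prompt template; the wording and the use case are invented examples, not an official template:

    # Illustrative prompt template for a research-support task.
    # The structure (role, task, audience, constraints) matters more than the exact wording.
    PROMPT_TEMPLATE = (
        "You are helping a grant advisor polish a funding proposal.\n"
        "Task: improve the clarity and concision of the text below without changing its meaning.\n"
        "Audience: reviewers at {funder}.\n"
        "Constraints: stay under {max_words} words and do not add any new claims or references.\n"
        "Text:\n{draft_text}"
    )

    print(PROMPT_TEMPLATE.format(
        funder="a national research council",
        max_words=300,
        draft_text="<paste the draft paragraph here>",
    ))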

Develop clear guidelines: create and communicate a GenAI policy and accompanying guidelines for your organisation. The living guidelines from the European Commission offer a great introduction, but organisations may wish to expand on these for their specific situations.

By approaching GenAI with a healthy mix of curiosity and scepticism, we can contribute significantly to integrating GenAI into our organisations – without looking back and wondering “what have we done?”.

Ryan Henderson is grant advisor at the University Medical Center Utrecht; Ayla Kruis is grant and career advisor at Utrecht University.

If you would like advice and insight from academics and university staff delivered direct to your inbox each week, sign up for the Campus newsletter.
