KIDA-Chat: a prototype of an AI assistant to interact with scientific models

Predictive models are helpful tools for research and decision-making in different fields, including food production and safety. Over the years, numerous predictive models have been published. However, their effective use is often limited by a lack of documentation, code availability, and repositories to access them. Recently, a standard format (FSKX: FAIR Scientific Knowledge eXchange) has been developed to reference and exchange predictive models. KIDA-Chat is a proof-of-concept software for interacting with predictive models available in the FSKX format via a chatbot user interface (UI) that makes use of a Large Language Model (LLM). It takes advantage of an agent-based architecture with access to a model service API. On request, an LLM agent queries the FSKX model repository for relevant models. If the request concerns creating a model-based prediction, a specific agent provides the appropriate UI elements inside the chatbot to allow the user to define the input parameter values for the desired simulation scenario. This information is then captured, and another agent triggers the model execution. The generated results are then provided back to the user inside the chatbot UI. KIDA-Chat can enhance the usability of predictive models: it allows interacting with FSKX models using natural language, and it can capture the chat session context to select the appropriate predictive model for the user’s needs. Combined with the LLM’s access to knowledge not directly contained in the model simulation results, this context-capturing ability can assist in understanding and interpreting simulation results, which in turn might help to form an informed decision. This research has been funded by the German Federal Ministry of Food and Agriculture (BMEL) in the research project “KI- & Daten-Akzelerator (KIDA)” with project number 28KIDA004.
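The agent workflow described above (search the repository, collect input parameters, trigger execution, return results) can be illustrated with a minimal Python sketch. Note that this is an assumption-laden illustration only: the endpoint paths, payload fields, and function names are hypothetical and do not reflect the actual KIDA-Chat implementation or the real FSKX model service API.

```python
# Minimal sketch of the agent workflow, under the assumption of a simple REST
# model service. All endpoint paths, payload fields and function names are
# illustrative, not the actual KIDA-Chat or FSKX model service API.
import requests

BASE_URL = "https://example.org/fskx-model-service"  # hypothetical model service


def search_models(query: str) -> list[dict]:
    """Step 1: query the model repository for models relevant to the request."""
    resp = requests.get(f"{BASE_URL}/models", params={"search": query}, timeout=30)
    resp.raise_for_status()
    return resp.json()


def get_parameters(model_id: str) -> list[dict]:
    """Step 2: fetch the model's input parameters so the chatbot UI can render
    input fields for the desired simulation scenario."""
    resp = requests.get(f"{BASE_URL}/models/{model_id}/parameters", timeout=30)
    resp.raise_for_status()
    return resp.json()


def run_simulation(model_id: str, parameter_values: dict) -> dict:
    """Step 3: trigger model execution with the user-defined parameter values
    and return the simulation results for display in the chat."""
    resp = requests.post(
        f"{BASE_URL}/models/{model_id}/simulations",
        json={"parameters": parameter_values},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    # Example flow: the LLM extracts a search phrase from the user's message,
    # the user fills in the rendered parameter fields, and results come back.
    models = search_models("microbial growth in food matrix")
    if models:
        model_id = models[0]["id"]
        params = get_parameters(model_id)
        print("Parameters to collect from the user:", [p["name"] for p in params])
        results = run_simulation(model_id, {"temperature": 7.0, "time": 72})
        print("Simulation results:", results)
```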

Rights

Use and reproduction:
All rights reserved