diff --git a/admin_manual/ai/app_context_chat.rst b/admin_manual/ai/app_context_chat.rst
index 2615cbccd..53eeb8c2c 100644
--- a/admin_manual/ai/app_context_chat.rst
+++ b/admin_manual/ai/app_context_chat.rst
@@ -71,6 +71,7 @@ Nextcloud customers should file bugs directly with our Customer Support.
 Known Limitations
 -----------------
 
+* The underlying language model used by Context Chat cannot be changed
 * We currently only support the English language
 * Language models are likely to generate false information and should thus only be used in situations that are not critical. It's recommended to only use AI at the beginning of a creation process and not at the end, so that outputs of AI serve as a draft for example and not as final product. Always check the output of language models before using it.
 * Make sure to test this app for whether it meets your use-case's quality requirements
diff --git a/admin_manual/ai/overview.rst b/admin_manual/ai/overview.rst
index 76a565bdd..266fcf3b3 100644
--- a/admin_manual/ai/overview.rst
+++ b/admin_manual/ai/overview.rst
@@ -164,3 +164,15 @@ Backend apps
 ~~~~~~~~~~~~
 
 * :ref:`context_chat + context_chat_backend` - (Customer support available upon request)
+
+
+Frequently Asked Questions
+--------------------------
+
+Why is my prompt slow?
+^^^^^^^^^^^^^^^^^^^^^^
+
+Reasons for slow performance from a user's perspective include:
+
+* Using CPU processing instead of GPU (sometimes this limitation is imposed by the app in use)
+* High user demand for the feature: User prompts and AI tasks are usually processed in the order they are received, which can cause delays when many users access these features at the same time.