diff --git a/admin_manual/ai/app_llm2.rst b/admin_manual/ai/app_llm2.rst
index b599711a7..e1dee0539 100644
--- a/admin_manual/ai/app_llm2.rst
+++ b/admin_manual/ai/app_llm2.rst
@@ -42,7 +42,12 @@ Installation
 Supplying alternate models
 ~~~~~~~~~~~~~~~~~~~~~~~~~~
 
-This app allows supplying alternate LLM models as *gguf* files in the ``/nc_app__data`` directory of the docker container.
+This app allows supplying alternate LLM models as *gguf* files in the ``/nc_app_llm2_data`` directory of the docker container.
+
+1. Download a **gguf** model, e.g. from Hugging Face
+2. Copy the **gguf** file to ``/nc_app_llm2_data`` inside the docker container
+3. Restart the llm2 ExApp
+4. Select the new model in the Nextcloud AI admin settings
 
 App store
 ---------
diff --git a/admin_manual/ai/app_stt_whisper2.rst b/admin_manual/ai/app_stt_whisper2.rst
index 248805c75..1cd555bcb 100644
--- a/admin_manual/ai/app_stt_whisper2.rst
+++ b/admin_manual/ai/app_stt_whisper2.rst
@@ -44,10 +44,12 @@ Installation
 Supplying alternate models
 ~~~~~~~~~~~~~~~~~~~~~~~~~~
 
-This app allows supplying alternate LLM models as *gguf* files in the ``/app/models`` directory of the docker container. You can use any `*faster-whisper* model by Systran on hugging face <https://huggingface.co/Systran>`_ by simply
+This app allows supplying alternate Whisper models as *faster-whisper* models in the ``/nc_app_stt_whisper2_data`` directory of the docker container. You can use any `*faster-whisper* model by Systran on hugging face <https://huggingface.co/Systran>`_ by simply
 
-1. git cloning the respective repository into the models directory
-2. Selecting the respective model in the Nextcloud AI admin settings
+1. git cloning the respective repository
+2. Copying the folder with the git repository to ``/nc_app_stt_whisper2_data`` inside the docker container
+3. Restarting the Whisper ExApp
+4. Selecting the respective model in the Nextcloud AI admin settings
 
 App store
 ---------
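
For reference, steps 2 and 3 of the llm2 hunk above amount to copying the model file into the ExApp's data directory and restarting the container. A minimal sketch from the Docker host, assuming the ExApp container is named ``nc_app_llm2`` and the model was downloaded locally as ``model.gguf`` (both names are illustrative and not taken from the patch)::

    # copy the downloaded model into the ExApp data directory
    docker cp model.gguf nc_app_llm2:/nc_app_llm2_data/
    # restart the ExApp so it picks up the new model
    docker restart nc_app_llm2

The stt_whisper2 hunk works analogously, except that a whole cloned *faster-whisper* model folder is copied into that app's data directory instead of a single **gguf** file.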