diff --git a/admin_manual/ai/ai_as_a_service.rst b/admin_manual/ai/ai_as_a_service.rst
index 8633eda78..c254e1caa 100644
--- a/admin_manual/ai/ai_as_a_service.rst
+++ b/admin_manual/ai/ai_as_a_service.rst
@@ -4,10 +4,10 @@ AI as a Service
 
 .. _ai-ai_as_a_service:
 
-At Nextcloud we focus on creating on-premise AI apps that run fully self-hosted on your own servers in order to preserve your privacy and data sovereignty.
+At Nextcloud, we focus on creating on-premise AI apps that run fully self-hosted on your own servers in order to preserve your privacy and data sovereignty.
 However, you can also offload these resource-heavy tasks to an "AI as a Service" provider offering API access in exchange for payment.
 Examples of such providers are `OpenAI `_, with its ChatGPT APIs providing language model access
-among other APIs as well as `Replicate `_.
+among other APIs, as well as `Replicate `_ and `IBM watsonx `_.
 
 Installation
 ------------
@@ -18,9 +18,11 @@ In order to use these providers you will need to install the respective app from
 
 * ``integration_replicate``
 
-You can then add your API token and rate limits in the administration settings and set the providers live in the "Artificial intelligence" section of the admins settings.
+* ``integration_watsonx``
 
-Optionally but recommended, setup background workers for faster pickup of tasks. See :ref:`the relevant section in AI Overview` for more information.
+You can then add your account information, set rate limits, and set the providers live in the "Artificial intelligence" section of the administration settings.
+
+Optionally (but recommended), set up background workers for faster pickup of tasks. See :ref:`the relevant section in AI Overview` for more information.
 
 OpenAI integration
 ------------------
@@ -29,11 +31,20 @@ With this application, you can also connect to a self-hosted LocalAI or Ollama i
 for example `IONOS AI Model Hub `_, `Plusserver `_, `Groqcloud `_, `MistralAI `_ or `Together AI `_.
 
-Do note however, that we test the Assistant tasks that this app implements only with OpenAI models and only against the OpenAI API, we thus cannot guarantee other models and APIs will work.
+Do note, however, that we test the Assistant tasks that this app implements only with OpenAI models and only against the OpenAI API; we thus cannot guarantee that other models and APIs will work.
 Some APIs claiming to be compatible with OpenAI might not be fully compatible so we cannot guarantee that they will work with this app.
 
+IBM watsonx.ai integration
+--------------------------
+
+With this application, you can also connect to a self-hosted cluster running the IBM watsonx.ai software.
+
+Do note, however, that we test the Assistant tasks that this app implements only with the provided foundation models and only against IBM Cloud servers.
+We thus cannot guarantee that other models or server instances will work.
+
 Improve performance
 -------------------
 
-Prompts from integration_openai and integration_replicate can have a delay of 5 minutes. This can be optimized and more information can be found in :ref:`the relevant section in AI Overview `.
+Prompts from these apps can have a delay of up to 5 minutes.
+This can be optimized and more information can be found in :ref:`the relevant section in AI Overview `.
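For admins who prefer the command line, a minimal sketch of the installation and background-worker steps described in this file, assuming shell access to the Nextcloud installation directory and the usual ``occ`` tool run as the web server user (the ``www-data`` user and the selection of apps are illustrative; the worker job class is the one recommended in the AI overview for faster task pickup):

.. code-block:: bash

    # Install (and enable) only the provider integration apps you actually need
    sudo -u www-data php occ app:install integration_openai
    sudo -u www-data php occ app:install integration_replicate
    sudo -u www-data php occ app:install integration_watsonx

    # Recommended: run one or more dedicated workers so queued AI tasks are
    # picked up quickly instead of waiting for the regular background-job run
    sudo -u www-data php occ background-job:worker 'OC\TaskProcessing\SynchronousBackgroundJob'

API tokens, account information, and rate limits are still entered in the "Artificial intelligence" section of the administration settings, as described above.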
diff --git a/admin_manual/ai/app_assistant.rst b/admin_manual/ai/app_assistant.rst
index a52efefcc..92490a87f 100644
--- a/admin_manual/ai/app_assistant.rst
+++ b/admin_manual/ai/app_assistant.rst
@@ -66,6 +66,7 @@ In order to make use of text processing features in the assistant, you will need
 
 * :ref:`llm2` - Runs open source AI language models locally on your own server hardware (Customer support available upon request)
 * *integration_openai* - Integrates with the OpenAI API to provide AI functionality from OpenAI servers (Customer support available upon request; see :ref:`AI as a Service`)
+* *integration_watsonx* - Integrates with the IBM watsonx.ai API to provide AI functionality from IBM Cloud servers (Customer support available upon request; see :ref:`AI as a Service`)
 
 These apps currently implement the following Assistant Tasks:
 
diff --git a/admin_manual/ai/app_summary_bot.rst b/admin_manual/ai/app_summary_bot.rst
index c9ff54560..38d087a1d 100644
--- a/admin_manual/ai/app_summary_bot.rst
+++ b/admin_manual/ai/app_summary_bot.rst
@@ -44,6 +44,8 @@ Installation
 
   - `Nextcloud OpenAI and LocalAI integration app `_
 
+  - `Nextcloud IBM watsonx.ai integration app `_
+
 Setup (via App Store)
 ~~~~~~~~~~~~~~~~~~~~~
 
diff --git a/admin_manual/ai/overview.rst b/admin_manual/ai/overview.rst
index 0607e595c..b62e4c249 100644
--- a/admin_manual/ai/overview.rst
+++ b/admin_manual/ai/overview.rst
@@ -33,6 +33,8 @@ Nextcloud uses modularity to separate raw AI functionality from the Graphical Us
    "","`OpenAI and LocalAI integration (via Plusserver) `_","Orange","No","Yes","No","No"
    "","`OpenAI and LocalAI integration (via Groqcloud) `_","Orange","No","Yes","No","No"
    "","`OpenAI and LocalAI integration (via MistralAI) `_","Orange","No","Yes","No","No"
+   "","`IBM watsonx.ai integration (via IBM watsonx.ai as a Service) `_","Yellow","No","Yes - e.g. Granite models by IBM","Yes","No"
+   "","`IBM watsonx.ai integration (via IBM watsonx.ai software) `_","Yellow","No","Yes - e.g. Granite models by IBM","Yes","Yes"
    "Machine translation","`Local Machine Translation 2 (ExApp) `_","Green","Yes","Yes - MADLAD models by Google","Yes","Yes"
    "","`DeepL integration `_","Red","No","No","No","No"
    "","`OpenAI and LocalAI integration (via OpenAI API) `_","Red","No","No","No","No"
@@ -113,6 +115,7 @@ Backend apps
 
 * :ref:`llm2` - Runs open source AI LLM models on your own server hardware (Customer support available upon request)
 * `OpenAI and LocalAI integration (via OpenAI API) `_ - Integrates with the OpenAI API to provide AI functionality from OpenAI servers (Customer support available upon request; see :ref:`AI as a Service`)
+* `IBM watsonx.ai integration (via IBM watsonx.ai as a Service) `_ - Integrates with the IBM watsonx.ai API to provide AI functionality from IBM Cloud servers (Customer support available upon request; see :ref:`AI as a Service`)
 
 Machine translation
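As a quick check against the backend app lists above, a short sketch of confirming from the command line that at least one text-processing backend is installed and enabled, again assuming shell access and the ``occ`` tool (the app IDs in the grep pattern are just the examples named in these files):

.. code-block:: bash

    # Show installed apps and look for one of the listed provider integrations
    sudo -u www-data php occ app:list | grep -E 'integration_(openai|watsonx)'

    # Enable a backend app if it is installed but currently disabled
    sudo -u www-data php occ app:enable integration_watsonx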