diff --git a/docs/assets/unraid-app-add-amd-devices.png b/docs/assets/unraid-app-add-amd-devices.png
new file mode 100644
index 00000000000..d899b443dfa
Binary files /dev/null and b/docs/assets/unraid-app-add-amd-devices.png differ
diff --git a/docs/assets/unraid-app-config.png b/docs/assets/unraid-app-config.png
new file mode 100644
index 00000000000..9644723cecf
Binary files /dev/null and b/docs/assets/unraid-app-config.png differ
diff --git a/docs/assets/unraid-app-details.png b/docs/assets/unraid-app-details.png
new file mode 100644
index 00000000000..1d26c1f3377
Binary files /dev/null and b/docs/assets/unraid-app-details.png differ
diff --git a/docs/index.md b/docs/index.md
index d55d6a14f8f..9c363a6603d 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -121,6 +121,7 @@ Mac and Linux machines, and runs on GPU cards with as little as 4 GB of RAM.
- [Automated Installer](installation/010_INSTALL_AUTOMATED.md)
- [Manual Installation](installation/020_INSTALL_MANUAL.md)
- [Docker Installation](installation/040_INSTALL_DOCKER.md)
+ - [Unraid Installation](installation/045_INSTALL_UNRAID.md)
### The InvokeAI Web Interface
- [WebUI overview](features/WEB.md)
@@ -160,9 +161,9 @@ get solutions for common installation problems and other issues.
Anyone who wishes to contribute to this project, whether documentation,
features, bug fixes, code cleanup, testing, or code reviews, is very much
-encouraged to do so.
+encouraged to do so.
-[Please take a look at our Contribution documentation to learn more about contributing to InvokeAI.
+[Please take a look at our Contribution documentation to learn more about contributing to InvokeAI.
](contributing/CONTRIBUTING.md)
## :octicons-person-24: Contributors
diff --git a/docs/installation/045_INSTALL_UNRAID.md b/docs/installation/045_INSTALL_UNRAID.md
new file mode 100644
index 00000000000..e6539ab8ffc
--- /dev/null
+++ b/docs/installation/045_INSTALL_UNRAID.md
@@ -0,0 +1,59 @@
+---
+title: Installing on Unraid
+---
+
+# :fontawesome-regular-server: Unraid
+
+## TL;DR
+
+Invoke AI is available in the Unraid Community Apps store. Search for "Invoke AI" and follow the on-screen instructions to install and configure it.
+
+Once the application starts up, click the Invoke AI icon, select "WebUI", install some models, and start generating.
+
+### Prerequisites
+
+#### Install the [Community Apps plugin](https://docs.unraid.net/unraid-os/manual/applications/#installing-the-community-applications-plugin)
+
+### Setup
+
+Search for "Invoke AI" in the Community Apps store (available from the "grtgbln" repository) and click "Install".
+
+![Invoke AI in the Community Apps store](../assets/unraid-app-details.png)
+
+Select the branch to use for the application. The "latest" branch is recommended for most users, as this will pull the latest stable release of Invoke AI.
+
+Enable "Advanced View" in the top-right of the "Add Container" window to see all available settings.
+
+![Advanced View of the "Add Container" window](../assets/unraid-app-config.png)
+
+Out of the box, no settings need to be adjusted. If port 9090 is already in use by another application, you can change the "WebUI" port setting to a different port.
+
+Provide a Hugging Face Hub token if you plan to download models from private Hugging Face repositories.
+
+Click "Apply" to start the container.
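+
+For reference, the template's defaults correspond roughly to the `docker run` invocation below. This is an illustrative sketch, not the exact command Unraid generates; the image name and the token variable name are assumptions.
+
+```sh
+# Sketch of the template's defaults (image name and env variable name are assumptions):
+# -p maps the WebUI port (change the host side if 9090 is taken),
+# -e passes a Hugging Face token for private repositories,
+# -v persists models and configuration on the array.
+docker run -d --name invoke-ai \
+  -p 9090:9090 \
+  -e HUGGING_FACE_HUB_TOKEN="<your-token>" \
+  -v /mnt/user/appdata/invoke_ai:/invokeai_root \
+  ghcr.io/invoke-ai/invokeai
+```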
+
+#### Nvidia GPU Support
+
+To use an Nvidia GPU, you will need to install the [Nvidia Driver](https://forums.unraid.net/topic/98978-plugin-nvidia-driver/) plugin, then add `--runtime=nvidia --gpus=all` to the "Extra Parameters" field in the container settings. This will pass all GPUs through to the container.
+
+Remove any AMD GPU devices from the container settings if you are using an Nvidia GPU.
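+
+In plain Docker terms, those Extra Parameters are equivalent to the flags below (a sketch; Unraid assembles the full command for you, and the container name used here is an assumption):
+
+```sh
+# Pass the NVIDIA runtime and all GPUs through to the container:
+docker run --runtime=nvidia --gpus=all ...
+
+# Once the container is running, verify the GPU is visible inside it:
+docker exec invoke-ai nvidia-smi
+```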
+
+#### AMD GPU Support
+
+To use an AMD GPU, add `/dev/kfd` and `/dev/dri` as devices in the container settings: click "Add another Path, Port, Variable, Label or Device" at the bottom of the "Add Container" window, select "Device" from the dropdown, and add each path as a separate device.
+
+![Adding /dev/kfd and /dev/dri as devices](../assets/unraid-app-add-amd-devices.png)
+
+Remove any Nvidia GPU devices from the container settings if you are using an AMD GPU.
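+
+The two devices map to plain Docker `--device` flags: `/dev/kfd` is the ROCm compute interface and `/dev/dri` holds the GPU render nodes. A sketch of the equivalent command line:
+
+```sh
+docker run --device /dev/kfd --device /dev/dri ...
+```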
+
+#### Advanced Configuration
+
+By default, persistent data such as downloaded models and configuration files are stored in the `/mnt/user/appdata/invoke_ai` directory on your system. This can be edited in the "Appdata and Model Storage Path" setting, available by clicking "Show more settings" in the "Add Container" window.
+
+Inside the container, this directory is mounted at `/invokeai_root`. That path is editable via the "Data root" setting, but changing it is not recommended unless you are familiar with the internal workings of Invoke AI.
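+
+The two settings together define a single Docker bind mount, with the host side freely editable and the container side best left at its default:
+
+```sh
+# "Appdata and Model Storage Path" (host) : "Data root" (container)
+docker run -v /mnt/user/appdata/invoke_ai:/invokeai_root ...
+```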