Update README.md
README.md
CHANGED
@@ -21,12 +21,7 @@ license: other
 
 These files are GGML format model files for [NousResearch's Redmond Hermes Coder](https://huggingface.co/NousResearch/Redmond-Hermes-Coder).
 
-
-* [text-generation-webui](https://github.com/oobabooga/text-generation-webui)
-* [KoboldCpp](https://github.com/LostRuins/koboldcpp)
-* [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui) using the `c_transformers` backend.
-* [llama-cpp-python](https://github.com/abetlen/llama-cpp-python)
-* [ctransformers](https://github.com/marella/ctransformers)
+Please note that these GGMLs are **not compatible with llama.cpp, or currently with text-generation-webui**. Please see below for a list of tools known to work with these model files.
 
 ## Repositories available
 
@@ -51,18 +46,12 @@ Below is an instruction that describes a task. Write a response that appropriate
 These files are **not** compatible with llama.cpp.
 
 Currently they can be used with:
-* KoboldCpp, a powerful inference engine based on llama.cpp, with good UI: [KoboldCpp](https://github.com/LostRuins/koboldcpp)
+* KoboldCpp, a powerful inference engine based on llama.cpp, with good UI and GPU acceleration: [KoboldCpp](https://github.com/LostRuins/koboldcpp)
 * The ctransformers Python library, which includes LangChain support: [ctransformers](https://github.com/marella/ctransformers)
-*
+* LoLLMS-UI which uses ctransformers: [LoLLMS-UI](https://github.com/ParisNeo/lollms-ui)
 * [rustformers' llm](https://github.com/rustformers/llm)
 * The example `starcoder` binary provided with [ggml](https://github.com/ggerganov/ggml)
 
-As other options become available I will endeavour to update them here (do let me know in the Community tab if I've missed something!)
-
-## Tutorial for using GPT4All-UI
-
-* [Text tutorial, written by **Lucas3DCG**](https://huggingface.co/TheBloke/MPT-7B-Storywriter-GGML/discussions/2#6475d914e9b57ce0caa68888)
-* [Video tutorial, by GPT4All-UI's author **ParisNeo**](https://www.youtube.com/watch?v=ds_U0TDzbzI)
 <!-- compatibility_ggml end -->
 
 ## Provided files
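For reference, a minimal sketch of the ctransformers route listed in the compatibility section above. The repo id (`TheBloke/Redmond-Hermes-Coder-GGML`), the `model_file` name, and the generation parameters are assumptions, not part of the commit; substitute the actual `.bin` file you download from this repo.

```python
# Minimal sketch, assuming a StarCoder-family GGML file from this repo and the
# ctransformers library referenced in the diff above.
from ctransformers import AutoModelForCausalLM

llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/Redmond-Hermes-Coder-GGML",                # assumed repo id
    model_file="redmond-hermes-coder.ggmlv3.q4_0.bin",   # placeholder filename
    model_type="starcoder",                              # these GGMLs are not llama-type
)

# Alpaca-style prompt, consistent with the template context visible in the hunk header.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nWrite a Python function that reverses a string.\n\n"
    "### Response:\n"
)
print(llm(prompt, max_new_tokens=128, temperature=0.2))
```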