Chat prompt generation for tool use
#2 opened by apepkuss79
According to the description in the model card, you "convert function definitions to a similar text to TypeScript definitions". My question is: does this conversion happen in llama-cpp-python, with some Python script performing it? Is it possible to run the model with llama.cpp directly? BTW, I found a draft PR in llama.cpp. Thanks a lot!
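For context, below is a minimal sketch of what such a conversion might look like, assuming OpenAI-style JSON function definitions as input. The function name and the exact output layout here are illustrative guesses, not the model's actual conversion script:

```python
def to_typescript_defs(functions):
    """Render OpenAI-style JSON function definitions as a
    TypeScript-like namespace block (hypothetical sketch; the real
    script used for the model's chat template may differ)."""
    lines = ["namespace functions {", ""]
    for fn in functions:
        params = fn.get("parameters", {})
        props = params.get("properties", {})
        required = set(params.get("required", []))
        if fn.get("description"):
            lines.append(f"// {fn['description']}")
        lines.append(f"type {fn['name']} = (_: {{")
        for name, schema in props.items():
            if schema.get("description"):
                lines.append(f"// {schema['description']}")
            opt = "" if name in required else "?"  # optional unless listed in "required"
            # Map JSON Schema types to TypeScript types where they differ
            ts_type = {"integer": "number"}.get(schema.get("type", "any"),
                                                schema.get("type", "any"))
            lines.append(f"{name}{opt}: {ts_type},")
        lines.append("}) => any;")
        lines.append("")
    lines.append("} // namespace functions")
    return "\n".join(lines)

functions = [{
    "name": "get_current_weather",
    "description": "Get the current weather in a given location",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {"type": "string", "description": "City name"},
            "unit": {"type": "string"},
        },
        "required": ["location"],
    },
}]

print(to_typescript_defs(functions))
```

The resulting text block would then be placed into the system prompt so the model can reference the available tools.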
apepkuss79 changed discussion title from "Chat prompt" to "Chat prompt generation for tool use"