
I tested this on an inf2.xlarge instance and was able to run through your example and a few others. I'm not sure there is any use case that will max out the instance's RAM.
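
For reference, here is a minimal sketch of what such a test might look like, assuming the example follows the optimum-neuron text-generation flow for Llama on Inferentia2; the checkpoint name, input shapes, and compiler settings below are illustrative, not taken from this PR:

```python
# Hedged sketch: assumes optimum-neuron is installed on the inf2.xlarge
# and that the example compiles a Llama checkpoint for the Neuron device.
from transformers import AutoTokenizer
from optimum.neuron import NeuronModelForCausalLM

model_id = "meta-llama/Llama-2-7b-hf"  # hypothetical checkpoint

# Export/compile the model for the two NeuronCores on an inf2.xlarge.
model = NeuronModelForCausalLM.from_pretrained(
    model_id,
    export=True,
    batch_size=1,
    sequence_length=2048,
    num_cores=2,
    auto_cast_type="fp16",
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
inputs = tokenizer("What is deep learning?", return_tensors="pt")

# Run generation on the Neuron device and decode the result.
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

With batch size 1 and a 2048-token sequence, a setup like this should stay well within the instance's memory, which is consistent with the observation above that the example runs comfortably on an inf2.xlarge.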

Cannot merge
This branch has merge conflicts in the following files:
  • README.md
