Fix: when using HTTP worker, only download if inference is on localhost 52fc709 lukestanley committed on Mar 4, 2024
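A minimal sketch of what such a guard could look like, assuming a hypothetical LLM_WORKER_URL setting and a download_model() helper (both names are illustrative, not taken from the repo):

```python
import os
from urllib.parse import urlparse

def download_model() -> None:
    """Placeholder for the real model download (hypothetical helper)."""
    print("downloading model weights...")

# LLM_WORKER_URL is an illustrative name, not the repo's actual setting.
inference_url = os.environ.get("LLM_WORKER_URL", "http://localhost:8000")

# Only fetch weights when inference runs on this machine; a remote
# HTTP worker already has its own copy of the model.
if urlparse(inference_url).hostname in ("localhost", "127.0.0.1"):
    download_model()
```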
Refactor code: Import libraries, compare moderation APIs, and remove unused code ac30aa7 lukestanley committed on Mar 2, 2024
WIP spicy Jigsaw - Wikipedia talk page dataset review and scoring 759e510 lukestanley committed on Mar 2, 2024
Add local logging option if SKIP_NETWORK environment variable is set 960dc11 lukestanley committed on Mar 1, 2024
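A rough sketch of how a SKIP_NETWORK switch might gate the logging path; log_event and send_to_remote are illustrative names, not the repo's actual functions:

```python
import json
import os

def send_to_remote(record: dict) -> None:
    """Placeholder for the real network logging call (hypothetical)."""
    print("would send over the network:", record)

def log_event(record: dict, path: str = "log.jsonl") -> None:
    """Log locally when SKIP_NETWORK is set; otherwise use the network path."""
    if os.environ.get("SKIP_NETWORK"):
        # Local fallback: append one JSON object per line to a JSONL file.
        with open(path, "a", encoding="utf-8") as f:
            f.write(json.dumps(record) + "\n")
    else:
        send_to_remote(record)

log_event({"original": "spicy text", "improved": "calmer text"})
```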
Add data capture endpoint using Gradio's API, hosted on HF's dynamic Gradio hostname 2c65c23 lukestanley committed on Mar 1, 2024
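One plausible way a client could call such a data-capture endpoint is via the gradio_client library; the Space id, payload shape, and api_name below are purely illustrative:

```python
from gradio_client import Client

# Illustrative Space id and api_name; the real values depend on the deployment.
client = Client("lukestanley/example-space")
result = client.predict(
    '{"original": "...", "improved": "..."}',  # payload passed as a JSON string
    api_name="/log",
)
print(result)
```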
Add log_to_jsonl function to data.py and remove duplicate function from utils.py 44bab49 lukestanley committed on Feb 29, 2024
Add JSONL disk file logging functionality to app.py and utils.py abbebf8 lukestanley committed on Feb 29, 2024
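A minimal sketch of the usual JSONL-append pattern a log_to_jsonl helper follows; the repo's actual signature and record fields may differ:

```python
import json
from datetime import datetime, timezone

def log_to_jsonl(path: str, record: dict) -> None:
    """Append one JSON object per line, stamped with the current UTC time."""
    record = {**record, "timestamp": datetime.now(timezone.utc).isoformat()}
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record, ensure_ascii=False) + "\n")

# Example usage (field names are illustrative):
log_to_jsonl("outputs.jsonl", {"prompt": "...", "result": "..."})
```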
Readme: Note on Mistral API used, serverless backend for reliability 8c64a35 lukestanley committed on Feb 29, 2024
Add Mistral API support due to my RunPod serverless system reliability issues 8093276 lukestanley committed on Feb 29, 2024
Add assert in improvement_loop function to make it more robust 4901d0f lukestanley committed on Feb 29, 2024
Assert RunPod env vars are set up before trying to use them 00af17e lukestanley committed on Feb 29, 2024
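A small sketch of that fail-fast check; the exact variable names the repo asserts may differ from the illustrative ones here:

```python
import os

# Fail fast with a clear message instead of an opaque error mid-request.
# Variable names are illustrative; the repo may use different ones.
for var in ("RUNPOD_API_KEY", "RUNPOD_ENDPOINT_ID"):
    assert os.environ.get(var), f"Missing required environment variable: {var}"
```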
Change return type of improvement_loop to dict in app.py 859cc57 lukestanley committed on Feb 29, 2024
Docs: Add local usage instructions for running the Gradio web server GUI 38a55db lukestanley committed on Feb 28, 2024
Clarify setup comments, remove unused global, increase max iterations c995e6d lukestanley committed on Feb 28, 2024
Doc: Idea for speed improvements and intermediate results display, grouping future directions 968cab3 lukestanley committed on Feb 28, 2024
Comment out llama-cpp-python installation command in Docker for HuggingFace Space 56e7667 Luke Stanley committed on Feb 28, 2024
Switch to serverless worker by default (PR #2 from lukestanley/serverless_json_llm) a054519 Luke Stanley committed on Feb 28, 2024