openai/clip-vit-large-patch14 Zero-Shot Image Classification • Updated Sep 15, 2023 • 45.2M • 1.57k
microsoft/BiomedCLIP-PubMedBERT_256-vit_base_patch16_224 Zero-Shot Image Classification • Updated Sep 27, 2024 • 94.4k • 249
laion/CLIP-ViT-H-14-laion2B-s32B-b79K Zero-Shot Image Classification • Updated Jan 16, 2024 • 1.15M • 344
laion/CLIP-ViT-B-32-laion2B-s34B-b79K Zero-Shot Image Classification • Updated Jan 15, 2024 • 8.02M • 105
laion/CLIP-ViT-L-14-laion2B-s32B-b82K Zero-Shot Image Classification • Updated Jan 16, 2024 • 33.5k • 46
OFA-Sys/chinese-clip-vit-base-patch16 Zero-Shot Image Classification • Updated Dec 9, 2022 • 129k • 95
OFA-Sys/chinese-clip-vit-large-patch14 Zero-Shot Image Classification • Updated Dec 9, 2022 • 705 • 29
OFA-Sys/chinese-clip-vit-large-patch14-336px Zero-Shot Image Classification • Updated Dec 9, 2022 • 1.01k • 23
OFA-Sys/chinese-clip-vit-huge-patch14 Zero-Shot Image Classification • Updated Dec 9, 2022 • 1.77k • 26
laion/CLIP-convnext_base_w-laion2B-s13B-b82K Zero-Shot Image Classification • Updated Apr 18, 2023 • 3.01k • 5
laion/CLIP-ViT-bigG-14-laion2B-39B-b160k Zero-Shot Image Classification • Updated Jan 16, 2024 • 208k • 243
laion/CLIP-convnext_large_d.laion2B-s26B-b102K-augreg Zero-Shot Image Classification • Updated Apr 18, 2023 • 1.98k • 5
laion/CLIP-ViT-g-14-laion2B-s34B-b88K Zero-Shot Image Classification • Updated Mar 22, 2024 • 38k • 23
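All of the checkpoints above are tagged for the same task, zero-shot image classification, and the CLIP-style ones can be driven through the `transformers` zero-shot-image-classification pipeline. The sketch below uses the first model ID from the list; the image URL and candidate labels are placeholders chosen for illustration, not taken from the listing.

```python
from transformers import pipeline

# Load one of the checkpoints listed above (here the first entry;
# the other CLIP-style models in the list work the same way).
classifier = pipeline(
    "zero-shot-image-classification",
    model="openai/clip-vit-large-patch14",
)

# Placeholder image URL and candidate labels (assumptions, not from the listing).
results = classifier(
    "https://example.com/cat.jpg",
    candidate_labels=["a photo of a cat", "a photo of a dog"],
)
print(results)  # list of {"label": ..., "score": ...} dicts, sorted by score
```

Swapping in any of the other listed model IDs (for example a `laion/CLIP-ViT-*` or `OFA-Sys/chinese-clip-*` checkpoint) only changes the `model` argument; the candidate labels should match the language the text encoder was trained on.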