Instructions to use timm/vit_base_patch16_siglip_256.webli with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- timm
How to use timm/vit_base_patch16_siglip_256.webli with timm:
```python
import timm

model = timm.create_model("hf_hub:timm/vit_base_patch16_siglip_256.webli", pretrained=True)
```

- Transformers
How to use timm/vit_base_patch16_siglip_256.webli with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("image-feature-extraction", model="timm/vit_base_patch16_siglip_256.webli")

# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("timm/vit_base_patch16_siglip_256.webli", dtype="auto")
```

- Notebooks
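For feature extraction with the directly loaded model, a minimal sketch looks like the following. It assumes a transformers version with timm-checkpoint (TimmWrapper) support, and it downloads the checkpoint from the Hub on first run:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModel

# Assumes transformers' TimmWrapper support for timm Hub checkpoints;
# downloads the checkpoint on first run.
model_id = "timm/vit_base_patch16_siglip_256.webli"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)
model.eval()

image = Image.new("RGB", (320, 320))  # stand-in for a real image
inputs = processor(image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Pooled image embedding: one vector per input image
embedding = outputs.pooler_output
print(embedding.shape)
```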
- Google Colab
- Kaggle
Model card for vit_base_patch16_siglip_256.webli
SigLIP image-encoder weights for timm (image encoder only, with the original attention pooling), from https://huggingface.co/timm/ViT-B-16-SigLIP-256.