Instructions for using transformers-community/custom_generate_example with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use transformers-community/custom_generate_example with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("transformers-community/custom_generate_example", dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
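Since this repo is the Transformers community's example of a Hub-hosted custom generation method, the typical way to exercise it is to point `generate()` at the repo rather than load it as a model. Below is a minimal sketch, assuming the custom decoding code is fetched from the Hub via the `custom_generate` argument; `sshleifer/tiny-gpt2` is only an illustrative base model, and any causal LM should work in its place.

```python
# Sketch: run generation with the custom decoding loop hosted in
# transformers-community/custom_generate_example.
# Assumption: "sshleifer/tiny-gpt2" is just a small stand-in model.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("sshleifer/tiny-gpt2")
model = AutoModelForCausalLM.from_pretrained("sshleifer/tiny-gpt2")

inputs = tokenizer("Hello, world", return_tensors="pt")

# trust_remote_code=True is required because the generation code
# is downloaded and executed from the Hub repo.
out = model.generate(
    **inputs,
    custom_generate="transformers-community/custom_generate_example",
    trust_remote_code=True,
)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

The call behaves like a normal `generate()` invocation, except the decoding strategy comes from the named Hub repo instead of a built-in method.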