hrithik sagar (hrithiksagar-tih)
AI & ML interests: Large Language Models, VLMs
Adding Offline and Online inference via vLLM Code · #8 opened 5 months ago by hrithiksagar-tih
vLLM Support Query · #2 opened 5 months ago by hrithiksagar-tih
Will smaller models like 7B be released? · #3 opened 5 months ago by CelYuan
Dataset Explanation · #2 opened 5 months ago by hrithiksagar-tih
STATS · #1 opened 5 months ago by hrithiksagar-tih
Added requirements file for transformers · #13 opened 6 months ago by hrithiksagar-tih
Document Images · #2 opened 6 months ago by hrithiksagar-tih
Stats · #2 opened 6 months ago by hrithiksagar-tih
Inference code · #2 opened 6 months ago by hrithiksagar-tih
Added vLLM Offline Serve working code · #107 opened 7 months ago by hrithiksagar-tih
FlashInfer requires sm75+ · #48 opened 7 months ago by hrithiksagar-tih
vLLM FlashAttention3 with A6000 · #33 opened 7 months ago by YieumYoon
ImportError of 'Qwen2_5_VLForConditionalGeneration' from transformers library · #17 opened about 1 year ago by akbarmq01
Dataset Format · #5 opened 7 months ago by hrithiksagar-tih
Is the dataset available? · #2 opened 7 months ago by hrithiksagar-tih
Upload modeling_patram.py · #5 opened 8 months ago by Prince-1
GhibliByVikas_PullReq · #4 opened 9 months ago by ipvikas
Error while loading AutoProcessor · #3 opened 9 months ago by Nitesh-95
Added Benchmark PNG · #2 opened 9 months ago by KingNish