oobabooga | d5c407cf35 | Use Vulkan instead of ROCm for llama.cpp on AMD | 2025-05-01 20:05:36 -07:00
oobabooga | c12a53c998 | Use turboderp's exllamav2 wheels | 2025-05-01 19:46:56 -07:00
oobabooga | a4bf339724 | Bump llama.cpp | 2025-04-30 11:13:14 -07:00
oobabooga | e9569c3984 | Fixes after c5fe92d152 | 2025-04-30 06:57:23 -07:00
oobabooga | 7f49e3c3ce | Bump ExLlamaV3 | 2025-04-30 05:25:09 -07:00
oobabooga | c5fe92d152 | Bump llama.cpp | 2025-04-30 05:24:58 -07:00
oobabooga | fa861de05b | Fix portable builds with Python 3.12 | 2025-04-26 18:52:44 -07:00
oobabooga | bf2aa19b21 | Bump llama.cpp | 2025-04-26 16:39:22 -07:00
oobabooga | 2c7ff86015 | Bump exllamav3 to de83084184 | 2025-04-25 05:28:22 -07:00
oobabooga | 5993ebeb1b | Bump exllamav2 to 0.2.9 | 2025-04-25 05:27:59 -07:00
oobabooga | 8ebe868916 | Fix typos in b313adf653 | 2025-04-24 09:32:17 -07:00
oobabooga | b313adf653 | Bump llama.cpp, make the wheels work with any Python >= 3.7 | 2025-04-24 08:26:12 -07:00
oobabooga | 06619e5f03 | Add vulkan requirements.txt files | 2025-04-22 17:46:54 -07:00
oobabooga | ee09e44c85 | Portable version (#6868) | 2025-04-22 09:25:57 -03:00
oobabooga | c178ea02fe | Revert "Move the requirements*.txt to a requirements folder" (This reverts commit 6117ef7d64.) | 2025-04-20 19:27:38 -07:00
oobabooga | 6117ef7d64 | Move the requirements*.txt to a requirements folder | 2025-04-20 19:12:04 -07:00