| Author | Commit | Message | Date |
| --- | --- | --- | --- |
| oobabooga | 941e0663da | Update README | 2025-05-05 14:18:16 -07:00 |
| oobabooga | 5f5569e9ac | Update README | 2025-05-04 06:20:36 -07:00 |
| oobabooga | f8aaf3c23a | Use ROCm 6.2.4 on AMD | 2025-05-01 19:50:46 -07:00 |
| oobabooga | 89090d9a61 | Update README | 2025-05-01 08:22:54 -07:00 |
| oobabooga | c5fb51e5d1 | Update README | 2025-04-28 22:40:26 -07:00 |
| oobabooga | 965ca7948f | Update README | 2025-04-27 07:33:08 -07:00 |
| oobabooga | a317450dfa | Update README | 2025-04-26 14:59:29 -07:00 |
| oobabooga | 3a207e7a57 | Improve the --help formatting a bit | 2025-04-26 07:31:04 -07:00 |
| oobabooga | d9de14d1f7 | Restructure the repository (#6904) | 2025-04-26 08:56:54 -03:00 |
| oobabooga | da1919baae | Update the README | 2025-04-22 08:03:22 -07:00 |
| oobabooga | 8d481ef9d5 | Update README | 2025-04-18 11:31:22 -07:00 |
| oobabooga | e52f62d3ff | Update README | 2025-04-18 09:29:57 -07:00 |
| oobabooga | 170ad3d3ec | Update the README | 2025-04-18 06:03:35 -07:00 |
| oobabooga | d7b336d37e | Update the README | 2025-04-09 20:12:14 -07:00 |
| oobabooga | 8b8d39ec4e | Add ExLlamaV3 support (#6832) | 2025-04-09 00:07:08 -03:00 |
| oobabooga | a8a64b6c1c | Update the README | 2025-04-06 17:40:18 -07:00 |
| oobabooga | cf9676c4d5 | Update README | 2025-02-14 18:05:36 -08:00 |
| oobabooga | 32cdaa540f | Update README | 2025-01-30 09:49:25 -08:00 |
| oobabooga | 3936589755 | Update README | 2025-01-28 12:53:55 -08:00 |
| Shay Molcho | b76b7f6bf5 | Minor README change (#6687) | 2025-01-22 12:02:43 -03:00 |
| oobabooga | c32f06d62f | Update README | 2025-01-17 07:03:22 -08:00 |
| oobabooga | d2f6c0f65f | Update README | 2025-01-10 13:25:40 -08:00 |
| oobabooga | ad118056b8 | Update README | 2025-01-08 14:29:46 -08:00 |
| oobabooga | 7157257c3f | Remove the AutoGPTQ loader (#6641) | 2025-01-08 19:28:56 -03:00 |
| oobabooga | fee23df1a5 | Update README.md | 2024-12-18 18:13:01 -08:00 |
| oobabooga | 9fd12605ac | Update README.md | 2024-12-18 17:58:53 -08:00 |
| oobabooga | 57160cd6fa | Update README | 2024-09-28 20:50:41 -07:00 |
| oobabooga | 3f0571b62b | Update README | 2024-09-28 20:48:30 -07:00 |
| oobabooga | 3fb02f43f6 | Update README | 2024-09-28 20:38:43 -07:00 |
| oobabooga | 65e5864084 | Update README | 2024-09-28 20:25:26 -07:00 |
| oobabooga | 85994e3ef0 | Bump pytorch to 2.4.1 | 2024-09-28 09:44:08 -07:00 |
| oobabooga | ca5a2dba72 | Bump rocm to 6.1.2 | 2024-09-28 09:39:53 -07:00 |
| oobabooga | 1124f71cf3 | Update README.md | 2024-08-20 11:19:46 -03:00 |
| oobabooga | d9a031fcad | Update README.md | 2024-08-20 01:52:30 -03:00 |
| oobabooga | 9d99156ca3 | Update README.md | 2024-08-20 01:27:02 -03:00 |
| oobabooga | 406995f722 | Update README | 2024-08-19 21:24:01 -07:00 |
| oobabooga | 1b1518aa6a | Update README.md | 2024-08-20 00:36:18 -03:00 |
| oobabooga | 8bac1a9382 | Update README.md | 2024-08-19 23:10:04 -03:00 |
| oobabooga | bb987ffe66 | Update README.md | 2024-08-19 23:06:52 -03:00 |
| oobabooga | 5c5e7264ec | Update README | 2024-07-22 18:20:01 -07:00 |
| oobabooga | 05676caf70 | Update README | 2024-07-11 16:25:52 -07:00 |
| oobabooga | f5599656b4 | Update README | 2024-07-11 16:22:00 -07:00 |
| oobabooga | a30ec2e7db | Update README | 2024-07-11 16:20:44 -07:00 |
| oobabooga | 53fbd2f245 | Add TensorRT-LLM to the README | 2024-06-25 14:45:37 -07:00 |
| oobabooga | 9420973b62 | Downgrade PyTorch to 2.2.2 (#6124) | 2024-06-14 16:42:03 -03:00 |
| oobabooga | 8930bfc5f4 | Bump PyTorch, ExLlamaV2, flash-attention (#6122) | 2024-06-13 20:38:31 -03:00 |
| oobabooga | bd7cc4234d | Backend cleanup (#6025) | 2024-05-21 13:32:02 -03:00 |
| oobabooga | 6a1682aa95 | README: update command-line flags with raw --help output ("This helps me keep this up-to-date more easily.") | 2024-05-19 20:28:46 -07:00 |
| oobabooga | 7a728a38eb | Update README | 2024-05-07 02:59:36 -07:00 |
| oobabooga | e61055253c | Bump llama-cpp-python to 0.2.69, add --flash-attn option | 2024-05-03 04:31:22 -07:00 |