Commit graph

123 commits

Author SHA1 Message Date
oobabooga 78b315344a Update exllamav3 2025-11-28 06:45:05 -08:00
oobabooga 3cad0cd4c1 Update llama.cpp 2025-11-28 03:52:37 -08:00
oobabooga 327a234d23 Add ROCm requirements.txt files 2025-11-18 16:24:56 -08:00
oobabooga 4e4abd0841 Merge remote-tracking branch 'refs/remotes/origin/dev' into dev 2025-11-18 14:07:05 -08:00
oobabooga c45f35ccc2 Remove the macos 13 wheels (deprecated by GitHub) 2025-11-18 14:06:42 -08:00
oobabooga d85b95bb15 Update llama.cpp 2025-11-18 14:06:04 -08:00
dependabot[bot] 4a36b7be5b Bump triton-windows in /requirements/full (#7311) 2025-11-18 18:51:26 -03:00
dependabot[bot] 3d7e9856a2 Update peft requirement from ==0.17.* to ==0.18.* in /requirements/full (#7310) 2025-11-18 18:51:15 -03:00
oobabooga a26e28bdea Update exllamav3 to 0.0.15 2025-11-18 11:24:16 -08:00
oobabooga 6a3bf1de92 Update exllamav3 to 0.0.14 2025-11-09 19:43:53 -08:00
oobabooga e7534a90d8 Update llama.cpp 2025-11-05 18:46:01 -08:00
oobabooga 92d9cd36a6 Update llama.cpp 2025-11-05 05:43:34 -08:00
oobabooga 67f9288891 Pin huggingface-hub to 0.36.0 (solves #7284 and #7289) 2025-11-02 14:01:00 -08:00
oobabooga cd645f80f8 Update exllamav3 to 0.0.12 2025-11-01 19:58:18 -07:00
dependabot[bot] c8cd840b24 Bump flash-linear-attention from 0.3.2 to 0.4.0 in /requirements/full (#7285) 2025-10-28 10:07:03 -03:00
oobabooga f4c9e67155 Update llama.cpp 2025-10-23 08:19:32 -07:00
Immanuel 9a84a828fc Fixed python requirements for apple devices with macos tahoe (#7273) 2025-10-22 14:59:27 -03:00
oobabooga 24fd2b4dec Update exllamav3 to 0.0.11 2025-10-21 07:26:38 -07:00
oobabooga be81f050a7 Merge remote-tracking branch 'refs/remotes/origin/dev' into dev 2025-10-20 19:43:36 -07:00
oobabooga 9476123ee6 Update llama.cpp 2025-10-20 19:43:26 -07:00
dependabot[bot] 0d85744205 Bump triton-windows in /requirements/full (#7274) 2025-10-20 20:36:55 -03:00
oobabooga 163d863443 Update llama.cpp 2025-10-15 11:23:10 -07:00
oobabooga c93d567f97 Update exllamav3 to 0.0.10 2025-10-15 06:41:09 -07:00
oobabooga efaf2aef3d Update exllamav3 to 0.0.9 2025-10-13 15:32:25 -07:00
oobabooga 047855c591 Update llama.cpp 2025-10-13 15:32:03 -07:00
oobabooga 1831b3fb51 Use my custom gradio_client build (small changes to work with pydantic 2.11) 2025-10-10 18:01:21 -07:00
oobabooga dd0b003493 Bump pydantic to 2.11.0 2025-10-10 17:52:16 -07:00
oobabooga a74596374d Reapply "Update exllamav3 to 0.0.8" (reverts 748007f6ee) 2025-10-10 17:51:31 -07:00
oobabooga 748007f6ee Revert "Update exllamav3 to 0.0.8" (reverts 977ffbaa04) 2025-10-09 16:50:00 -07:00
dependabot[bot] af3c70651c Update bitsandbytes requirement in /requirements/full (#7255) 2025-10-09 19:53:34 -03:00
oobabooga 977ffbaa04 Update exllamav3 to 0.0.8 2025-10-09 15:53:14 -07:00
oobabooga e0f0fae59d Exllamav3: Add fla to requirements for qwen3-next 2025-10-09 13:03:48 -07:00
oobabooga 0f3793d608 Update llama.cpp 2025-10-09 09:38:22 -07:00
Ionoclast Laboratories d229dfe991 Fix portable apple intel requirement for llama binaries (issue #7238) (#7239) 2025-10-08 12:40:53 -03:00
oobabooga 292c91abbb Update llama.cpp 2025-10-08 08:31:34 -07:00
oobabooga 64829071e0 Update llama.cpp 2025-10-05 07:32:41 -07:00
oobabooga 0eb8543d74 Update transformers 2025-10-05 07:30:33 -07:00
oobabooga b7effb22e0 Update exllamav3 2025-10-05 07:29:57 -07:00
oobabooga 8c9df34696 Update llama.cpp 2025-09-20 20:57:15 -07:00
oobabooga 9c0a833a0a Revert "Update bitsandbytes requirement in /requirements/full (#7193)" (reverts fe15b67160) 2025-09-17 11:58:54 -07:00
oobabooga 8087a57fd8 Bump transformers to 4.56 2025-09-17 08:19:18 -07:00
dependabot[bot] 7131a478b9 Update safetensors requirement in /requirements/full (#7192) 2025-09-17 12:18:13 -03:00
dependabot[bot] fe15b67160 Update bitsandbytes requirement in /requirements/full (#7193) 2025-09-17 12:17:58 -03:00
dependabot[bot] 8f731a566c Update peft requirement from ==0.16.* to ==0.17.* in /requirements/full (#7172) 2025-09-17 12:17:16 -03:00
oobabooga 483927a5be Update llama.cpp 2025-09-17 05:09:12 -07:00
oobabooga 557b78d31e Update llama.cpp 2025-09-03 16:50:03 -07:00
oobabooga d843afcf66 Update llama.cpp 2025-09-02 05:43:33 -07:00
oobabooga 00ebb295d3 Update llama.cpp 2025-08-31 16:27:23 -07:00
oobabooga 7b80e9a2ad Update llama.cpp 2025-08-30 20:22:11 -07:00
oobabooga 8042f76399 Make portable installs functional with Python 3.13 2025-08-27 05:37:01 -07:00