ollama: 0.5.1 -> 0.5.4 #365718

Conversation
Ollama is now at v0.5.4

Force-pushed from f8b1bfb to cf779e7
pkgs/by-name/ol/ollama/package.nix (Outdated)

```nix
postInstall = lib.optionalString stdenv.hostPlatform.isLinux ''
  # copy libggml_*.so and runners into lib
  # https://github.com/ollama/ollama/blob/v0.4.4/llama/make/gpu.make#L90
  mkdir -p $out/lib
  cp -r dist/*/lib/* $out/lib/
'';
```
This was failing, so I removed it. It now builds fine.
Not 100% sure about the eventual consequences.
This copies the built .so files; if they're not present, the package won't include CUDA/ROCm support. We will need to fix this as well.
Perhaps the condition `stdenv.hostPlatform.isLinux` should be replaced with `(enableRocm || enableCuda)`? Both `enableRocm` and `enableCuda` already check `stdenv.hostPlatform.isLinux`, and if this is unnecessary for CPU builds, then it shouldn't break anything (and may fix any issues from the `cp` failing due to the libraries being absent).
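A sketch of the suggested change, using the package's existing `enableRocm`/`enableCuda` flags (untested, for illustration only):

```nix
# Sketch: only run the copy step when a GPU backend is enabled, since
# those are the builds that produce the dist/*/lib libraries being copied.
postInstall = lib.optionalString (enableRocm || enableCuda) ''
  mkdir -p $out/lib
  cp -r dist/*/lib/* $out/lib/
'';
```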
`postInstall` is still needed and should not be removed. I'm getting an error for ROCm:
```
[29/322] Building CXX object libc/src/__support/StringUtil/CMakeFiles/libc.src.__support.Stri…
[30/322] Building CXX object libc/src/__support/File/CMakeFiles/libc.src.__support.File.file.…
[31/322] Building CXX object libc/test/UnitTest/CMakeFiles/LibcUnitTest.dir/LibcTest.cpp.o
FAILED: libc/test/UnitTest/CMakeFiles/LibcUnitTest.dir/LibcTest.cpp.o
/nix/store/7sqn5142gh5asri8aqxvsvib9ii37fyq-rocm-llvm-clang-unwrapped-wrapper-6.0.2/bin/clang…
/build/source/libc/test/UnitTest/LibcTest.cpp:16:10: fatal error: 'cassert' file not found
#include <cassert>
         ^~~~~~~~~
1 error generated.
[32/322] Building CXX object libc/test/UnitTest/CMakeFiles/TestLogger.dir/TestLogger.cpp.o
ninja: build stopped: subcommand failed.
```
Not sure if ollama/ollama#7392 is related.
Yes, why was it removed? It seems to have resulted in my CPU being used rather than my GPU, even when the GPU is detected.
@rjpcasalino This is fixed in #373234.
yeah, thanks so much! Wondering how it could have been avoided. Anyway, thanks again
I'm currently running
I'll also investigate after family festivities end... I did notice the error, and it looks like there may be some LLVM dependencies we need to add for ROCm.
I added
Looks good to me. I don't have a ROCm device to test with, but it does build for all three cases.
At 8bf5d83 (auto-rebased onto 3667e0b).
I only see the targets
@collares In 0.5.4 they refactored the entire make structure. On the CUDA side, according to the comment here https://github.com/ollama/ollama/blob/main/Makefile#L14, they try to build for both. In the package.nix we choose based on what's on the system for CUDA (that's how it was). Looking at it now, I think some more re-thinking needs to be done to match what the new make structure is doing.

EDIT: see nixpkgs/pkgs/by-name/ol/ollama/package.nix, lines 90 to 104, at 6084a82.
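For illustration only, that kind of backend selection might be sketched like this (the flag and target names here are hypothetical, not the actual package.nix contents):

```nix
# Hypothetical sketch of backend/target selection; target names are
# illustrative, mirroring upstream's Makefile trying multiple CUDA versions.
buildTargets =
  if enableCuda then [ "cuda_v11" "cuda_v12" ]
  else if enableRocm then [ "rocm" ]
  else [ "cpu" ];
```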
Thanks, I tweaked the target name and now it works with
Fails on
Diff: ollama/ollama@v0.5.1...v0.5.4
Changelog: https://github.com/ollama/ollama/releases/tag/v0.5.4
Co-authored-by: Mastoca <[email protected]>
Looks like the
Perhaps a different version of Go might work for Darwin (`buildGo124Module`)?
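Trying that could look something like this override sketch (assuming the package takes `buildGoModule` as an argument; untested):

```nix
# Sketch: rebuild ollama with a different Go toolchain by overriding
# the buildGoModule function the package receives.
ollama-go124 = pkgs.ollama.override {
  buildGoModule = pkgs.buildGo124Module;
};
```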
Same error unfortunately.
Pinging @NixOS/darwin-maintainers.
OK, merging since it's building fine outside the sandbox.
And 0.5.5 released two days ago 🤷🏻
And there already is a PR, so no need to be grumpy :)

Edit: also, 0.5.5 is a pre-release, so nothing to merge just yet.
Interesting, that renders as a shrug, not a grumpy person, on Darwin ;)
FYI, I think this may have broken GPU acceleration for me! Confirmed with a git bisect of nixpkgs; I'm running an Nvidia P100. See ollama/ollama#8349 (comment). @peperunas also seems to be running into this.
```nix
postInstall = lib.optionalString stdenv.hostPlatform.isLinux ''
  # copy libggml_*.so and runners into lib
  # https://github.com/ollama/ollama/blob/v0.4.4/llama/make/gpu.make#L90
  mkdir -p $out/lib
  cp -r dist/*/lib/* $out/lib/
'';
```
I think this might be the problem? I'm going to do some messing around.
Got it working locally! PR coming shortly
Yes, thanks!
Things done

Changelog: https://github.com/ollama/ollama/releases/tag/v0.5.2
cc @abysssol @dit7ya @elohmeier @RoyDubnium

- Tested using sandboxing (`nix.conf`: `sandbox = relaxed` or `sandbox = true`; see the Nix manual)
- Ran `nix-shell -p nixpkgs-review --run "nixpkgs-review rev HEAD"` (note: all changes have to be committed; also see nixpkgs-review usage)
- Tested basic functionality of the binaries (usually in `./result/bin/`)