
ollama: 0.5.4 -> 0.5.5 #373234

Merged 1 commit into NixOS:master on Jan 16, 2025

Conversation

@liberodark (Contributor) commented Jan 12, 2025

Changelog:

https://github.com/ollama/ollama/releases/tag/v0.5.4 > https://github.com/ollama/ollama/releases/tag/v0.5.5

Also, this is a pre-release, so it can probably wait for the stable release.

Things done

  • Built on platform(s)
    • x86_64-linux
    • aarch64-linux
    • x86_64-darwin
    • aarch64-darwin
  • For non-Linux: Is sandboxing enabled in nix.conf? (See Nix manual)
    • sandbox = relaxed
    • sandbox = true
  • Tested, as applicable:
  • Tested compilation of all packages that depend on this change using nix-shell -p nixpkgs-review --run "nixpkgs-review rev HEAD". Note: all changes have to be committed, also see nixpkgs-review usage
  • Tested basic functionality of all binary files (usually in ./result/bin/)
  • 25.05 Release Notes (or backporting 24.11 and 25.05 Release notes)
    • (Package updates) Added a release notes entry if the change is major or breaking
    • (Module updates) Added a release notes entry if the change is significant
    • (Module addition) Added a release notes entry if adding a new NixOS module
  • Fits CONTRIBUTING.md.

Add a 👍 reaction to pull requests you find important.

@liberodark liberodark marked this pull request as ready for review January 12, 2025 16:57
@liberodark liberodark marked this pull request as draft January 12, 2025 16:58
@niklaskorz (Contributor)

0.5.4 was just merged; please rebase.

@niklaskorz niklaskorz mentioned this pull request Jan 12, 2025
@liberodark liberodark changed the title from ollama: 0.5.1 > 0.5.5 to ollama: 0.5.4 > 0.5.5 Jan 12, 2025
@liberodark liberodark marked this pull request as ready for review January 12, 2025 19:16
@GaetanLepage (Contributor)

Yes, I would prefer that we wait for it to be tagged as a proper release.

@liberodark liberodark marked this pull request as draft January 12, 2025 19:58
@mastoca (Contributor) commented Jan 14, 2025

Looks like 0.5.5 is official now.

@liberodark liberodark marked this pull request as ready for review January 15, 2025 07:58
@pbsds pbsds mentioned this pull request Jan 15, 2025
@liberodark liberodark force-pushed the ollama branch 2 times, most recently from 22dc139 to ad52a9b, January 15, 2025 09:13
@GaetanLepage (Contributor)

nixpkgs-review result

Generated using nixpkgs-review.

Command: nixpkgs-review pr 373234


x86_64-linux

✅ 5 packages built:
  • alpaca
  • chatd
  • ollama
  • ollama-cuda
  • ollama-rocm

aarch64-linux

❌ 3 packages failed to build:
  • alpaca
  • chatd
  • ollama
✅ 1 package built:
  • ollama-cuda

x86_64-darwin

✅ 1 package built:
  • ollama

aarch64-darwin

❌ 1 package failed to build:
  • ollama

@GaetanLepage (Contributor)

Fails on aarch64-linux:

Running phase: installPhase
cp: missing destination file operand after '/nix/store/4wcrf57b0mavzidx6b9z78k4wcrh3jxa-ollama-0.5.5/lib/'
Try 'cp --help' for more information.
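
For context, that cp failure is the classic symptom of an empty glob: when dist/*/lib/* matches nothing, cp receives only the destination operand. A minimal sketch of the failure mode, assuming the phase at that point ran the copy unconditionally (the guarded version that was eventually merged appears in the review diff further down):

    # Hypothetical unguarded install step (an assumption; see the merged diff below).
    # On plain aarch64-linux no CUDA/ROCm dist libraries are produced, so the glob
    # expands to nothing and cp fails with "missing destination file operand".
    postInstall = ''
      mkdir -p $out/lib
      cp -r dist/*/lib/* $out/lib/
    '';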

@breakds (Contributor) commented Jan 15, 2025

Tested and can verify that 0.5.5 now runs well on x86_64-linux with an Nvidia GPU. More specifically, the postInstall (shown in the review diff below) addressed the issue that, since 0.5.2, without copying those libs ollama would still work but would not start the model on CUDA.

@prusnak (Member) commented Jan 15, 2025

The commit message should read ollama: 0.5.4 -> 0.5.5 (-> rather than >).

@GaetanLepage (Contributor)

Fails on aarch64-linux:

Running phase: installPhase
cp: missing destination file operand after '/nix/store/4wcrf57b0mavzidx6b9z78k4wcrh3jxa-ollama-0.5.5/lib/'
Try 'cp --help' for more information.

This issue is still making the aarch64-linux build fail.

@prusnak (Member) commented Jan 16, 2025

This issue is still making the aarch64-linux build fail.

Yes, because @liberodark still has not applied the suggestion from my comment #373234 (comment)

Can you check whether that fixes the build for you?

@liberodark liberodark changed the title from ollama: 0.5.4 > 0.5.5 to ollama: 0.5.4 -> 0.5.5 Jan 16, 2025
@GaetanLepage (Contributor)

nixpkgs-review result

Generated using nixpkgs-review.

Command: nixpkgs-review pr 373234


x86_64-linux

✅ 5 packages built:
  • alpaca
  • chatd
  • ollama
  • ollama-cuda
  • ollama-rocm

aarch64-linux

✅ 4 packages built:
  • alpaca
  • chatd
  • ollama
  • ollama-cuda

x86_64-darwin

❌ 1 package failed to build:
  • chatd
✅ 1 package built:
  • ollama

aarch64-darwin

❌ 2 packages failed to build:
  • chatd
  • ollama

@GaetanLepage (Contributor) left a comment

LGTM!

@LunNova (Member) commented Jan 16, 2025

nixpkgs-review result

Generated using nixpkgs-review.

Command: nixpkgs-review pr 373234


x86_64-linux

✅ 5 packages built:
  • alpaca
  • chatd
  • ollama
  • ollama-cuda
  • ollama-rocm

@GaetanLepage (Contributor)

nixpkgs-review result

Generated using nixpkgs-review.

Command: nixpkgs-review pr 373234


x86_64-linux

✅ 5 packages built:
  • alpaca
  • chatd
  • ollama
  • ollama-cuda
  • ollama-rocm

aarch64-linux

✅ 4 packages built:
  • alpaca
  • chatd
  • ollama
  • ollama-cuda

x86_64-darwin

❌ 1 package failed to build:
  • chatd
✅ 1 package built:
  • ollama

aarch64-darwin

❌ 2 packages failed to build:
  • chatd
  • ollama

@prusnak prusnak requested a review from abysssol January 16, 2025 17:12
@GaetanLepage (Contributor) left a comment

Good job, thanks!

@mastoca (Contributor) left a comment

Looks good.

@abysssol (Contributor) left a comment

Tested ollama-rocm: it appears to function correctly.

Comment on lines +202 to +207
postInstall = lib.optionalString (stdenv.hostPlatform.isx86 || enableRocm || enableCuda) ''
  # copy libggml_*.so and runners into lib
  # https://github.com/ollama/ollama/blob/v0.4.4/llama/make/gpu.make#L90
  mkdir -p $out/lib
  cp -r dist/*/lib/* $out/lib/
'';
Contributor

I think this is correct now, but I want to know @LunNova's thoughts. My previous suggestion on this (replacing the condition with (enableRocm || enableCuda)) turned out to be wrong, as @LunNova helpfully pointed out last time.
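
A rough sketch of why the narrower condition fails, assuming (per the comment in the diff above) that plain x86 builds also emit runner libraries under dist/*/lib that need copying:

    # Hypothetical plain x86_64-linux build with no GPU backend enabled.
    let
      isx86 = true;
      enableRocm = false;
      enableCuda = false;
    in {
      previousSuggestion = enableRocm || enableCuda;   # false: copy skipped, runners lost
      mergedGuard = isx86 || enableRocm || enableCuda; # true: copy runs
    }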

Member

Seems good enough!

@LunNova (Member) commented Jan 16, 2025

Oh, someone should check that x86_64-darwin's happy. I can't.

Contributor

My only lingering worry is that devices like Jetson combine CUDA and aarch64.

Contributor

My only lingering worry is that devices like Jetson combine CUDA and aarch64.

The condition (stdenv.hostPlatform.isx86 || enableRocm || enableCuda) only contains || (or), not && (and), so it should work fine if CUDA is requested on ARM (as long as ollama and llama-cpp correctly support that combination). However, enableCuda does require Linux; most programs aren't designed to run on bare metal, though, so that's probably not especially notable.
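
Evaluating the guard for a hypothetical Jetson-style host (aarch64-linux with CUDA requested) confirms this reading; this is just a sketch of the boolean logic, not the actual derivation:

    let
      isx86 = false;      # aarch64, not x86
      enableRocm = false;
      enableCuda = true;  # CUDA requested on ARM
    in isx86 || enableRocm || enableCuda
    # evaluates to true, so the postInstall copy still runs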

@abysssol abysssol requested a review from LunNova January 16, 2025 20:09
@GaetanLepage (Contributor)

nixpkgs-review result

Generated using nixpkgs-review.

Command: nixpkgs-review pr 373234


x86_64-linux

✅ 5 packages built:
  • alpaca
  • chatd
  • ollama
  • ollama-cuda
  • ollama-rocm

aarch64-linux

✅ 4 packages built:
  • alpaca
  • chatd
  • ollama
  • ollama-cuda

x86_64-darwin

❌ 1 package failed to build:
  • chatd
✅ 1 package built:
  • ollama

aarch64-darwin

❌ 2 packages failed to build:
  • chatd
  • ollama

@GaetanLepage GaetanLepage merged commit 9277281 into NixOS:master Jan 16, 2025
23 of 26 checks passed
@gshpychka (Contributor)

@GaetanLepage but why are the darwin build failures acceptable?

@prusnak (Member) commented Jan 17, 2025

@GaetanLepage but why are the darwin build failures acceptable?

They are spurious; both chatd and ollama build just fine on my aarch64-darwin and also on x86_64-darwin.

@GaetanLepage (Contributor)

@GaetanLepage but why are the darwin build failures acceptable?

Yes, in particular, the ollama failure on aarch64-darwin does not occur when building outside of the sandbox. Also, it was not introduced by this PR.

@HuzaifaTP commented Jan 22, 2025

Not sure if it's related, but I encountered this when updating unstable and rebuilding:

  llvm-libc++-shared.cfg.in :: std/atomics/atomics.types.operations/atomics.types.operations.req/atomic_load_explicit.pass.cpp
  llvm-libc++-shared.cfg.in :: std/atomics/atomics.types.operations/atomics.types.operations.req/atomic_store.pass.cpp
  llvm-libc++-shared.cfg.in :: std/atomics/atomics.types.operations/atomics.types.operations.req/atomic_store_explicit.pass.cpp
  llvm-libc++-shared.cfg.in :: std/atomics/atomics.types.operations/atomics.types.operations.wait/atomic_notify_all.pass.cpp
  llvm-libc++-shared.cfg.in :: std/atomics/atomics.types.operations/atomics.types.operations.wait/atomic_notify_one.pass.cpp
  llvm-libc++-shared.cfg.in :: std/atomics/atomics.types.operations/atomics.types.operations.wait/atomic_wait.pass.cpp
  llvm-libc++-shared.cfg.in :: std/atomics/atomics.types.operations/atomics.types.operations.wait/atomic_wait_explicit.pass.cpp
  llvm-libc++-shared.cfg.in :: std/input.output/filesystems/class.path/path.nonmember/path.io.unicode_bug.pass.cpp
  llvm-libc++-shared.cfg.in :: std/utilities/function.objects/func.invoke/invoke_r.temporary.verify.cpp

********************
Failed Tests (1):
  llvm-libc++-shared.cfg.in :: libcxx/transitive_includes.sh.cpp


Testing Time: 2782.86s

Total Discovered Tests: 7869
  Unsupported      :  428 (5.44%)
  Passed           : 7400 (94.04%)
  Expectedly Failed:   40 (0.51%)
  Failed           :    1 (0.01%)
FAILED: CMakeFiles/check-runtimes /build/source/runtimes/build/CMakeFiles/check-runtimes
cd /build/source/runtimes/build && /nix/store/inqsga0i9wl1z80xwiq4nijbrnpqz6b4-python3-3.12.8-env/bin/python3.12 /nix/store/9rqf51h9yhz3q2y1in0ql12xl2vibg00-python3.12-lit-18.1.8/bin/.lit-wrapped -sv --show-xfail --show-unsupported /build/source/runtimes/build/libcxx/test
ninja: build stopped: subcommand failed.
error: builder for '/nix/store/35a4dhaa1b27dcscz6qnj97a15qma2bx-rocm-llvm-libcxx-6.0.2.drv' failed with exit code 1
error: 1 dependencies of derivation '/nix/store/wjf41g7sfn87s9s1x0apffcb034s5k6b-rocm-llvm-clang-wrapper-6.0.2.drv' failed to build
error: 1 dependencies of derivation '/nix/store/lcsrng48sryasfhzcgm5r3m2zs16ri6a-clr-6.0.2.drv' failed to build
error: 1 dependencies of derivation '/nix/store/fl29p5hq0a638c791g8big77aaz03wg0-rocm-clang.drv' failed to build
error: 1 dependencies of derivation '/nix/store/hs892zry0nf41pp65xr991w4xjhfh378-rocm-llvm-openmp-6.0.2.drv' failed to build
error: 1 dependencies of derivation '/nix/store/1wqp3mwbqg5rz8f5hj26npf9j1xasf18-stdenv-linux.drv' failed to build
error: 1 dependencies of derivation '/nix/store/xs7k2bwasf72rz0mmy4dcvnhgrcp1icd-graphics-drivers.drv' failed to build
error: 1 dependencies of derivation '/nix/store/rnqmkd85jsx7jqj35y5j9j7x2pq7yrx8-ollama-0.5.7.drv' failed to build
error: 1 dependencies of derivation '/nix/store/q31qri5xc3fhvnhwrxwdfnkck89r6y8m-graphics-driver.conf.drv' failed to build
error: 1 dependencies of derivation '/nix/store/xjc78q4p9s5kclzqyl48g18phlmg4nwb-system-path.drv' failed to build
error: 1 dependencies of derivation '/nix/store/qlkindfrbccs2zy2ywdnfayjwcsprpc4-unit-ollama.service.drv' failed to build
error: 1 dependencies of derivation '/nix/store/r9q92vrslbhmr60yw8xh0iivqg3i2dv8-unit-script-ollama-model-loader-start.drv' failed to build
error: 1 dependencies of derivation '/nix/store/z99gdg97bhi73clfmh33lf6gfrimp7qp-nixos-system-latitude2-25.05beta742084.9e4d5190a948.drv' failed to build

AMD GPU + ROCm config:

{ pkgs, ... }:

{
  systemd.packages = with pkgs; [ lact ];
  systemd.services.lactd.wantedBy = [ "multi-user.target" ];

  hardware.amdgpu.amdvlk.enable = false;
  hardware.amdgpu.opencl.enable = true;

  services.ollama = {
    enable = true;
    loadModels = [ "codellama" "llama3.2" ];
    acceleration = "rocm";
    rocmOverrideGfx = "10.1.1";
    environmentVariables = {
      HCC_AMDGPU_TARGET = "gfx1100";
    };
  };

  hardware.graphics = {
    enable = true;
    enable32Bit = true;
    extraPackages = with pkgs; [
      mesa
      rocmPackages.rocblas
      rocmPackages.rocm-smi
      rocmPackages.rocminfo
      rocmPackages.hipblas
      rocmPackages.rocm-device-libs
      rocmPackages.rpp
      pcre2
      libselinux
      libcap
    ];
  };
}
