libedgetpu: package and provide nixos module #188719
Comments
I'm interested in this as well, but have never packaged something for NixOS. |
There is also the edgetpu-compiler, which is already wrapped: https://github.com/NixOS/nixpkgs/blob/master/pkgs/development/libraries/science/robotics/edgetpu-compiler/default.nix Maybe wrapping libedgetpu would be pretty similar (unpacking the deb and copying the files to the correct output location), plus the udev stuff |
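A deb-repacking derivation along the lines of edgetpu-compiler might look roughly like this sketch; the URL, hash, version, and paths inside the .deb are all placeholders, not verified values:

```nix
# Hypothetical sketch only: fetch Google's libedgetpu1-std .deb, unpack it,
# and copy the library plus udev rules into the output.
{ stdenv, lib, fetchurl, dpkg, autoPatchelfHook, libusb1 }:

stdenv.mkDerivation rec {
  pname = "libedgetpu";
  version = "16.0"; # placeholder version

  # Placeholder URL and hash; the real .deb would come from Coral's apt repo.
  src = fetchurl {
    url = "https://packages.cloud.google.com/apt/pool/libedgetpu1-std_${version}.deb";
    sha256 = lib.fakeSha256;
  };

  nativeBuildInputs = [ dpkg autoPatchelfHook ];
  # stdenv.cc.cc.lib provides libstdc++ for autoPatchelfHook to link against.
  buildInputs = [ libusb1 stdenv.cc.cc.lib ];

  unpackPhase = "dpkg-deb -x $src .";

  installPhase = ''
    mkdir -p $out/lib $out/lib/udev/rules.d
    # Paths inside the .deb are assumptions based on typical Debian layout.
    cp usr/lib/x86_64-linux-gnu/libedgetpu.so.1.0 $out/lib/
    ln -s $out/lib/libedgetpu.so.1.0 $out/lib/libedgetpu.so.1
    cp lib/udev/rules.d/*.rules $out/lib/udev/rules.d/
  '';
}
```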
Here's a start: default.nix:
libedgetpu-stddef.diff:
|
@yorickvP were you able to get anywhere with that derivation? Thanks for making a start! |
It works with the following udev rules:
However, it's not nicely packaged up; it should be possible to add something like
to the derivation and then add |
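The missing glue might look something like the following sketch. The rules file name is an assumption; the vendor/product IDs are the ones Coral's Debian packaging uses (they also appear in the rules quoted later in this thread):

```nix
# Sketch: install the udev rules into the derivation's output so NixOS can
# pick them up, then register the package with udev in the system config.
#
# In the libedgetpu derivation's installPhase:
#
#   mkdir -p $out/lib/udev/rules.d
#   cat > $out/lib/udev/rules.d/99-edgetpu-accelerator.rules <<'EOF'
#   SUBSYSTEM=="usb",ATTRS{idVendor}=="1a6e",ATTRS{idProduct}=="089a",GROUP="plugdev"
#   SUBSYSTEM=="usb",ATTRS{idVendor}=="18d1",ATTRS{idProduct}=="9302",GROUP="plugdev"
#   EOF

# Then, in a NixOS configuration:
{ pkgs, ... }:
{
  # services.udev.packages scans each package's lib/udev/rules.d.
  services.udev.packages = [ pkgs.libedgetpu ];
  users.groups.plugdev = { };
}
```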
Looks like we dropped this package internally because edgetpu's are out of stock anyways, and it depends on an out of date flatbuffers. cc @Lucus16 |
Is "Edge TPU" headed towards abandonment? Did we close this because its future is dead, no one plans to work on it, etc? Just curious, I keep carrying this Coral around with me wondering if I should sell it or keep ahold of it... |
@cpcloud might have more information. Personally, I would sell the Coral while it still has any value. |
I think closing this is very presumptuous. Could we please leave it open? There are lots of people with Corals (for frigate) and there are no indications that Google will stop making them, just supply issues. |
Thanks @yorickvP |
I've recently been able to buy one, and it doesn't seem to be EOL yet (they have a policy about it here: https://coral.ai/products/accelerator#:~:text=our%20technology.-,Product%20lifecycle,-Product%20line%20enhancements). |
Frigate 0.12.0 does support Intel GNA via openvino. But I would surely be interested in someone packaging libedgetpu and pycoral, so we can support that as well. @cpcloud offered to check up on them, but I have yet to hear back from him. The frigate PR at #214428 has been coming along nicely, and we're finishing up the tests currently. |
I'm looking at libedgetpu at the moment, as a background task. Currently
looking at upgrading tensorflow-lite.
|
See #217599 for my last try to get it to build again. |
Just came across this and wanted to note that I recently was able to get a few coral TPUs as well. It would be great to get this packaged as it seems there is still a lot of value having it included. |
I had real trouble getting this working (I'm a Nix noob), but this flake appears to work and I was able to run yolov8 images using the ultralytics python package, so I thought I'd post here in case someone can fix/improve/find it useful.
pyproject.toml
For running on the USB device I had to export the nano model from yolo like this
The 640 models would not load and would crash with something like: Deadline exceeded: USB transfer error 2 [LibUsbDataOutCallback]. For reasons I don't understand, I also have to manually set the LD_LIBRARY_PATH for the gcc/glib. I'm sure there's something stupid I'm missing in the flake here |
Thanks @Alan01252 that's progress! |
FWIW, I started working on a module over at https://github.com/jhvst/nix-config/blob/main/nixosModules/frigate/flake.nix It does not currently work (missing udev rules etc.), but at least it now loads the libraries properly. It's a container, so it is supposed to be run as described here: https://nixos.wiki/wiki/NixOS_Containers and https://www.tweag.io/blog/2020-07-31-nixos-flakes/ @naggie have you got Frigate to work with the libedgetpu flake? |
I did not -- I've been using Ubuntu on my CCTV VM as a workaround for now. Thanks for starting on a module for Frigate! That's great, I'll switch to it some time if it works out. |
I had another go at this, and I've managed to get my USB Coral TPU working with frigate and libedgetpu! I've packaged libedgetpu in my NUR, based on the flake above: https://github.com/graham33/nur-packages/blob/master/pkgs/libedgetpu/default.nix. The only things I've really changed are to bump the version of Tensorflow slightly (just to 2.7.4; I'm not sure how much newer a version libedgetpu would be compatible with), and to add the udev rules in the installation. Here's an abridged version of my NixOS config:
Then I get the magic lines in the frigate logs!:
|
Wow excellent well done and thanks! |
@graham33 How did you deal with the gasket dependency? AFAICT this just builds the library, but the gasket kernel driver is still needed. There's a gasket package in nixpkgs, but it seems to be having severe kernel compatibility issues. |
I didn't hit this at all. I'm using the USB Coral, is it possible this driver is only needed for the PCIe one (I don't know much about it)? |
I seem to have it working with a mini-PCIe Coral, using package tweaks from this issue as well as a separate GitHub issue for gasket. My config is here: Hope that helps. |
@colino17 libedgetpu package worked great for a couple months but now flatbuffers and libcoraltpu aren't building because of the switch to gcc13. Setting gcc12Stdenv seems to fix the issue. |
How does one set gcc12Stdenv? I'm struggling to figure this out. Any help would be super appreciated! |
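One common way (a sketch; `./libedgetpu.nix` is a placeholder path for the derivation discussed above) is to pass `gcc12Stdenv` as the `stdenv` argument when calling the package, e.g. from an overlay:

```nix
# Overlay sketch: callPackage fills the derivation's arguments from nixpkgs,
# so overriding stdenv here makes the whole build use GCC 12 instead of the
# default GCC 13.
final: prev: {
  libedgetpu = prev.callPackage ./libedgetpu.nix {
    stdenv = prev.gcc12Stdenv;
  };
}
```

The same trick works for flatbuffers or any other failing dependency: override its `stdenv` at the `callPackage` site rather than editing the derivation itself.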
Never mind my previous comment. I was able to get my PCIe Coral working with the gasket kernel module via the unstable channel. I thought I'd need libedgetpu as well, but Frigate is working and discovering both TPUs, so wonderful. |
That blows my mind, how is that working? I'm pretty sure you have to pass libedgetpu through as a delegate to tflite for it to work? Is frigate shipping with it? |
It does get installed in the container, yes, but I was under the impression that it needed to be installed on the host as well. |
I would keep an eye on your Frigate logs. I initially went down a similar route with only gasket on the host and thought I had everything working, but I remember there being some issue where my PCIE Coral kept disconnecting even though it was initially recognized. I didn't notice it for the longest time as I also had a USB Coral which was picking up the slack. Perhaps this is fixed now and working properly with that type of configuration. I think ultimately I'm going to migrate to dual USB Corals to avoid these types of issues entirely. |
Just a little update for anyone interested: |
Is anyone ever going to submit the packages into nixpkgs? I invested a lot of time into getting frigate packaged and openvino updated to a working state, and I hoped that something would manifest from this issue. But for some reason this package is being kept out of tree, and as such out of reach for many interested parties. |
Is there a configuration example around on how to get this all working? My goal is to have Frigate running with a Coral mini-PCIe card, currently using the nixpkgs-23.11 versions of gasket and frigate and a custom libedgetpu.nix:

{ stdenv
, lib
, fetchFromGitHub
, python3
, libusb1
, abseil-cpp
, flatbuffers
, xxd
}:
let
flatbuffers_1_12 = flatbuffers.overrideAttrs (oldAttrs: rec {
version = "1.12.1";
NIX_CFLAGS_COMPILE = "-Wno-error=class-memaccess -Wno-error=maybe-uninitialized";
cmakeFlags = (oldAttrs.cmakeFlags or []) ++ ["-DFLATBUFFERS_BUILD_SHAREDLIB=ON"];
NIX_CXXSTDLIB_COMPILE = "-std=c++17";
configureFlags = (oldAttrs.configureFlags or []) ++ ["--enable-shared"];
src = fetchFromGitHub {
owner = "google";
repo = "flatbuffers";
rev = "v${version}";
sha256 = "sha256-5sHddlqWx/5d5/FOMK7qRlR5xbUR47rfejuXI1jymWM=";
};
});
in stdenv.mkDerivation rec {
# pname = "libedgetpu";
# version = "ddfa7bde33c23afd8c2892182faa3e5b4e6ad94e";
# src = fetchFromGitHub {
# owner = "google-coral";
# repo = pname;
# rev = version;
# sha256 = "sha256-NidGjBPOLu5py7bakqvNQLDi72b5ig9QF9C1UuQldn0=";
# };
pname = "libedgetpu";
version = "90b03d96ed83412178ed6e6cfddbd40bb3f84925";
src = fetchFromGitHub {
owner = "feranick";
repo = pname;
rev = version;
sha256 = "sha256-/Eneik+v+juGsg/us+0YBQxkKeJUpGnFqrRPu5nKYWk=";
};
# patches = [ ./libedgetpu-stddef.patch ];
makeFlags = ["-f" "makefile_build/Makefile" "libedgetpu" ];
buildInputs = [
libusb1
abseil-cpp
flatbuffers_1_12
];
nativeBuildInputs = [
xxd
];
NIX_CXXSTDLIB_COMPILE = "-std=c++17";
TFROOT = "${fetchFromGitHub {
owner = "tensorflow";
repo = "tensorflow";
rev = "v2.8.4"; # latest rev providing tensorflow/lite/c/common.c
sha256 = "sha256-MFqsVdSqbNDNZSQtCQ4/4DRpJPG35I0La4MLtRp37Rk=";
# rev = "v2.13.1"; # latest rev providing tensorflow/lite/c/common.c
# sha256 = "sha256-fCwf7I76gdyeOPVnPPqEw4cI7RrcrshTSHjdfevUriY=";
}}";
# TFROOT = "${python3.pkgs.tensorflow}";
enableParallelBuilding = false;
installPhase = ''
mkdir -p $out/lib
cp out/direct/k8/libedgetpu.so.1.0 $out/lib
ln -s $out/lib/libedgetpu.so.1.0 $out/lib/libedgetpu.so.1
mkdir -p $out/lib/udev/rules.d
cp debian/edgetpu-accelerator.rules $out/lib/udev/rules.d/99-edgetpu-accelerator.rules
'';
}

Injecting it into Frigate like:

systemd.services.frigate.environment.LD_LIBRARY_PATH = "${libedgetpu}/lib";

Best I could do so far was a successful build when using TensorFlow 2.8.4; newer versions result in
which should be fixed in 2.13.1, but it made no difference. nixpkgs-23.11 includes TF 2.13.0, which is what I was aiming for. With 2.8.4, Frigate produces a segfault
Also I was surprised that the frigate package depends on tensorflow instead of tensorflow-lite. Thank you for the effort! |
Hi @mweinelt and @serpent213 , I can confirm - I have exactly the same problem. Have you been able to resolve it? Thank you.
|
Won't spend any time on this as long as it is not brought into nixpkgs. It is too hard to reason about the environments of everyone involved otherwise. I also don't own any coral device fwiw. |
No, didn't do further research, went with OpenVINO for now... |
Thank you @serpent213 . Could you please share the relevant Nix / Frigate configuration? I wonder if you need to somehow add OpenVINO Python module and also how the OpenVINO configuration looks like in Frigate config... Thank you. |
I also did not have success getting Coral running using the newer TensorFlow in 23.11. It used to work before my 23.11 upgrade and there's only so much time I want to put into getting it working again. So... I cheated by downgrading tensorflow/frigate in an overlay. It seems to work. Eventually I hope to remove this band-aid though.

nixpkgs-frigate.url = "github:NixOS/nixpkgs/5cfafa12d57374f48bcc36fda3274ada276cf69e";

final: prev:
let
system = prev.system;
frigatePkgs = inputs.nixpkgs-frigate.legacyPackages.${system};
in
{
# It seems that libedgetpu needs to be built with the newer version of tensorflow in nixpkgs
# but I am lazy so I instead just downgrade by using the old nixpkgs
libedgetpu = frigatePkgs.callPackage ./libedgetpu { };
frigate = frigatePkgs.frigate;
} |
A (very hacky) solution to this is to build libedgetpu using the docker+bazel method, then package the built binary as libedgetpu-bin. Steps:
libedgetpu-bin pkg:

{
stdenv,
}:
stdenv.mkDerivation {
src = ./libedgetpu.so.1.0;
pname = "libedgetpu";
version = "whatever";
dontBuild = true;
dontUnpack = true;
installPhase = ''
mkdir -p $out/lib
cp $src $out/lib/libedgetpu.so.1.0
ln -s $out/lib/libedgetpu.so.1.0 $out/lib/libedgetpu.so.1
mkdir -p $out/lib/udev/rules.d
cat >> $out/lib/udev/rules.d/99-edgetpu-accelerator.rules <<EOF
SUBSYSTEM=="usb",ATTRS{idVendor}=="1a6e",ATTRS{idProduct}=="089a",GROUP="plugdev"
SUBSYSTEM=="usb",ATTRS{idVendor}=="18d1",ATTRS{idProduct}=="9302",GROUP="plugdev"
EOF
'';
}

Frigate systemd config:

{ pkgs, lib, ... }:
{
services.frigate.settings.detectors.coral = {
type = "edgetpu";
device = "usb";
};
systemd.services.frigate.environment.LD_LIBRARY_PATH = lib.makeLibraryPath [
pkgs.libedgetpu-bin
pkgs.libusb # libusb
];
systemd.services.frigate.serviceConfig = {
SupplementaryGroups = "plugdev";
};
services.udev.packages = [ pkgs.libedgetpu-bin ];
users.groups.plugdev = { };
}

^1: "work" is defined as:
|
I think I have a working set of packages for getting the drivers running. So far, everything is stable on frigate 0.13.2.

To summarize, it seems like part of the problem is the incredibly old version of flatbuffers used, so I went and tried to mimic the build environment of the Ubuntu 22.04 LTS container that libedgetpu expects, since that's known to work (thanks @VTimofeenko). This meant using an old abseil version and gcc 12, which introduces some compiler bugs I had to work around with the extremely concerning

I'll caution I'm not sure which parts are necessary and which aren't anymore; it took me a while to get this working and there might be a slightly more elegant way to get this to all behave, but at least this is a start.

gasket.nix
libedgetpu.nix
|
I tried @pdelbarba's config and it didn't work for me with NixOS 24.05 and frigate 0.13.2, which uses tensorflow 2.13.0 and python 3.11. I got a segfault when loading the model file (could repro with a simple python script) when calling in

This sounds like some conflict between the tensorflow libraries linked in python vs. libedgetpu. @VTimofeenko's docker compile approach worked for me. Here's the ldd output from the lib compiled with @pdelbarba's config:
Here is the ldd output from @VTimofeenko's docker compile approach:
To make this work without "luck" (i.e. having a compatible tensorflow version on both ends), I think the config needs to be changed to compile the library with statically linked dependencies (like in the docker version). |
I'm wrapping edgetpu support for USB devices into

Would be cool if someone gave it a try. Basically configuring any detector with |
Thanks @mweinelt , really appreciate the effort. I intend to move my ubuntu frigate VM to NixOS, not sure when I'll get time though. Will post here when I do |
Project description
It would be nice to be able to use the Google Coral Edge TPU Usb device with NixOS.
I think it's mostly a matter of building libedgetpu (and maybe tensorflow?) and then bundling the udev rules with it?