multi-upscaler: specify map_location when loading negative embedding
Laurent2916 authored and deltheil committed Jul 12, 2024
1 parent af1b302 commit 88325c3
Showing 1 changed file with 3 additions and 1 deletion.
@@ -112,7 +112,9 @@ def load_negative_embedding(self, path: Path | None, key: str | None) -> str:
     if path is None:
         return ""
 
-    embeddings: Tensor | dict[str, Any] = torch.load(path, weights_only=True)  # type: ignore
+    embeddings: torch.Tensor | dict[str, Any] = torch.load(  # type: ignore
+        path, weights_only=True, map_location=self.device
+    )
 
     if isinstance(embeddings, dict):
         assert key is not None, "Key must be provided to access the negative embedding."
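The `map_location` argument matters when a checkpoint was saved on one device (e.g. a CUDA GPU) and loaded on another: without it, `torch.load` tries to restore tensors onto the device they were saved from, which fails on a CUDA-less machine. A minimal sketch of the pattern, assuming PyTorch is installed (the filename is hypothetical):

```python
import tempfile
from pathlib import Path

import torch

# Save a small tensor, then reload it. map_location remaps storages saved
# on another device onto the target device at load time, so a GPU-saved
# checkpoint still loads cleanly on a CPU-only machine.
path = Path(tempfile.mkdtemp()) / "neg_embedding.pt"  # hypothetical path
torch.save(torch.zeros(4, 8), path)

loaded = torch.load(path, weights_only=True, map_location=torch.device("cpu"))
```

Passing `weights_only=True` alongside `map_location` restricts unpickling to plain tensor data, which is why the patched call keeps both arguments.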
