dynamic node meta not interpolated for constraint values #24651

Closed
tgross opened this issue Dec 11, 2024 · 1 comment

tgross (Member) commented Dec 11, 2024

When looking into the state of #5764 (comment), I encountered a bug in our constraints for dynamic node metadata.

Consider the following setup:

$ nomad node status -verbose ae01 | grep hostname
unique.hostname                     = nomad0
$ nomad node meta apply --node-id ae01 host_machine=nomad0
jobspec:
job "example" {

  group "group" {

    constraint {
      attribute = "${attr.unique.hostname}"
      value     = "${meta.host_machine}"
    }

    task "task" {

      driver = "docker"

      config {
        image   = "busybox:1"
        command = "httpd"
        args    = ["-vv", "-f", "-p", "8001", "-h", "/local"]
      }

      resources {
        cpu    = 100
        memory = 100
      }

    }
  }
}

If you run that job, you'd expect it to plan successfully, but it fails with:

Scheduler dry-run:
- WARNING: Failed to place all allocations.
  Task Group "group" (failed to place 1 allocation):
    * Class "multipass": 1 nodes excluded by filter
    * Constraint "${attr.unique.hostname} = ${meta.example}": 1 nodes excluded by filter

If you instead set the node meta statically via client.meta in the agent configuration, the job plans successfully (see the config sketch at the end of this comment). As far as I can tell, the dynamic node metadata is present:

$ nomad node meta read --node-id ae01
All Meta
connect.gateway_image                           = docker.io/envoyproxy/envoy:v${NOMAD_envoy_version}
connect.log_level                               = info
connect.proxy_concurrency                       = 1
connect.sidecar_image                           = docker.io/envoyproxy/envoy:v${NOMAD_envoy_version}
connect.transparent_proxy.default_outbound_port = 15001
connect.transparent_proxy.default_uid           = 101
host_machine                                    = nomad0

Dynamic Meta
host_machine = nomad0

Static Meta
connect.gateway_image                           = docker.io/envoyproxy/envoy:v${NOMAD_envoy_version}
connect.log_level                               = info
connect.proxy_concurrency                       = 1
connect.sidecar_image                           = docker.io/envoyproxy/envoy:v${NOMAD_envoy_version}
connect.transparent_proxy.default_outbound_port = 15001
connect.transparent_proxy.default_uid           = 101

$ nomad node status -verbose ae01
...
Meta
connect.gateway_image                           = docker.io/envoyproxy/envoy:v${NOMAD_envoy_version}
connect.log_level                               = info
connect.proxy_concurrency                       = 1
connect.sidecar_image                           = docker.io/envoyproxy/envoy:v${NOMAD_envoy_version}
connect.transparent_proxy.default_outbound_port = 15001
connect.transparent_proxy.default_uid           = 101
host_machine                                    = nomad0

There's no weird denormalization happening in the RPC handlers either, so the key shows up fine through the API as well:

$ nomad operator api /v1/node/ae014ece-ae70-cec7-4b85-118ad61329b5 | jq .Meta
{
  "connect.log_level": "info",
  "connect.proxy_concurrency": "1",
  "connect.transparent_proxy.default_uid": "101",
  "connect.transparent_proxy.default_outbound_port": "15001",
  "host_machine": "nomad0",
  "connect.sidecar_image": "docker.io/envoyproxy/envoy:v${NOMAD_envoy_version}",
  "connect.gateway_image": "docker.io/envoyproxy/envoy:v${NOMAD_envoy_version}"
}
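
For reference, the static path I mentioned above lives in the agent configuration; a minimal sketch of what I'm comparing against (assuming an HCL client config file, with the same key name as the repro):

# static client meta, for comparison with the dynamic path above
client {
  enabled = true

  meta {
    host_machine = "nomad0"
  }
}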

tgross (Member, Author) commented Dec 11, 2024

"Skill issue", as the kids say. I changed my meta key and forgot to update it in the jobspec. 🤦

tgross closed this as not planned on Dec 11, 2024