fix(cvm): Fix optional parameters never being omitted from requests #2988
Initially noticed this problem when I tried to create a launch template containing the following block:

This resulted in an error during apply like this:

I figured okay, I'll just add what should be the defaults and get on with it. But ultimately I got the same error from the `BandwidthPackageId` field. There is no good default value for it, so I was unable to create the launch template.

Turns out that throughout the SDK, all scalar fields are always included, at least wherever `InterfacesHeadMap` is used. The value of the individual fields is never checked, so every field ends up being sent, and its value is not the default defined in the schema but the zero value for the data type, which is definitely wrong. Since the zero value for strings is the empty string, any field inside a block that takes some kind of ID parameter will fail with an API validation error.

In this commit I've only fixed the `internet_accessible` block for launch templates, but ultimately all locations where `InterfacesHeadMap` is used (and possibly other helpers, I haven't investigated that much) need to be updated. FWIW, the AWS Terraform provider uses this very same way of dealing with this issue.
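To illustrate the difference, here is a minimal, self-contained sketch of the two patterns. The struct and field names are hypothetical stand-ins (they are not the provider's actual types), and `get` merely mimics the Terraform SDK behavior of returning the type's zero value for an unset key:

```go
package main

import "fmt"

// Hypothetical request type: optional fields are pointers, so a nil
// field is omitted from the marshaled API request entirely.
type InternetAccessible struct {
	InternetChargeType *string
	BandwidthPackageId *string
}

// get mimics schema.ResourceData.Get: a missing key yields the
// zero value for the type, here the empty string.
func get(cfg map[string]interface{}, key string) string {
	if v, ok := cfg[key]; ok {
		return v.(string)
	}
	return ""
}

// buildBuggy copies every field unconditionally, so an unset
// bandwidth_package_id is sent as "" and fails API validation.
func buildBuggy(cfg map[string]interface{}) InternetAccessible {
	chargeType := get(cfg, "internet_charge_type")
	pkgID := get(cfg, "bandwidth_package_id")
	return InternetAccessible{
		InternetChargeType: &chargeType,
		BandwidthPackageId: &pkgID,
	}
}

// buildFixed only sets a field when the user actually supplied a
// non-empty value, in the spirit of the GetOk pattern the AWS
// provider uses, leaving unset fields out of the request.
func buildFixed(cfg map[string]interface{}) InternetAccessible {
	req := InternetAccessible{}
	if v := get(cfg, "internet_charge_type"); v != "" {
		req.InternetChargeType = &v
	}
	if v := get(cfg, "bandwidth_package_id"); v != "" {
		req.BandwidthPackageId = &v
	}
	return req
}

func main() {
	cfg := map[string]interface{}{"internet_charge_type": "TRAFFIC_POSTPAID_BY_HOUR"}
	buggy := buildBuggy(cfg)
	fixed := buildFixed(cfg)
	fmt.Println(*buggy.BandwidthPackageId == "") // empty string would be sent
	fmt.Println(fixed.BandwidthPackageId == nil) // field omitted as intended
}
```

The key point is that with pointer fields, "unset" (`nil`) and "explicitly empty" (`""`) are distinguishable, but only if the flatten/expand code actually checks before assigning.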