TL;DR—Continue using lowercase OpenTofu input variable names, but prefix the corresponding GitHub Actions secret/variable names with an underscore. Then add the following step to your continuous delivery workflow:
```yaml
- uses: easyware-io/export-to-env@v1
  with:
    secrets: ${{ toJSON(secrets) }}
    vars: ${{ toJSON(vars) }}
    only: ^_.*
    prefix: TF_VAR
    transform: lowercase
    transformPrefix: false
```
For example, this transforms a GitHub Actions variable named `_FOOBAR` into a process environment variable named `TF_VAR_foobar`. 🍻
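For context, here's a minimal sketch of a deploy job with that step dropped in; the job name, runner, and surrounding steps are illustrative assumptions, not part of the recipe above:

```yaml
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Assumes the official OpenTofu setup action; substitute whatever you use.
      - uses: opentofu/setup-opentofu@v1
      # The step from the TL;DR: export _-prefixed secrets/vars as TF_VAR_* env vars.
      - uses: easyware-io/export-to-env@v1
        with:
          secrets: ${{ toJSON(secrets) }}
          vars: ${{ toJSON(vars) }}
          only: ^_.*
          prefix: TF_VAR
          transform: lowercase
          transformPrefix: false
      # Later steps in the same job see the exported TF_VAR_* variables.
      - run: tofu init
      - run: tofu apply -auto-approve
```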
OpenTofu (and Terraform) can read the values of input variables from the process environment. However:
- The relevant environment variable names must be prefixed with `TF_VAR_`.
- Input variable names are typically lower case, which results in environment variable names like `TF_VAR_some_setting` (see the sketch after this list).
- Whether OpenTofu performs case-sensitive matches on environment variable names depends on the host operating system.
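To make the first two points concrete, here's a minimal sketch of setting an input variable straight from the environment, with no helper action involved (the variable name and value are made up for illustration):

```yaml
# Assumes the root module declares:  variable "some_setting" { type = string }
- name: Plan
  run: tofu plan
  env:
    TF_VAR_some_setting: "some value"   # lowercase name, matching the declaration
```

That works, but it means hand-maintaining one `env:` entry per input variable, which is exactly the busywork export-to-env does away with.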
I'm guessing it's a limitation of the underlying C library, but I haven't bothered to RTFS because I have already wasted a lot of the time I would have otherwise spent playing video games on this really, really dumb problem. Thanks for nothing, HashiCorp, you jerks!
Now add the fact that GitHub Actions forces secret/variable names to upper case, and baby, you got a stew goin' on! (So, so stupid that I have to waste precious video game time on this completely idiotic problem. Thanks again for nothing, Microsoft, you jerks!)
But there is a simple answer, dear friends! A glowing beacon of SLACK amidst the turmoil and darkness—it's `easyware-io/export-to-env`! Use a regexp to limit it to secrets or variables whose names start with an underscore, convert those names to lower case, tack on the `TF_VAR` prefix, et voilà!
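Spelled out with a few hypothetical names (none of these come from a real workflow), the filter and transforms compose like this:

```yaml
# _DB_PASSWORD  -> matches ^_.*  -> lowercased to _db_password  -> exported as TF_VAR_db_password
# _IMAGE_TAG    -> matches ^_.*  -> lowercased to _image_tag    -> exported as TF_VAR_image_tag
# AWS_REGION    -> no leading underscore, fails ^_.*            -> never exported at all
```

The leading underscore does double duty as the separator between the prefix and the variable name, which is why `prefix: TF_VAR` has no trailing underscore and `transformPrefix: false` keeps the prefix upper case.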
This isn't really a bug in the action (although it'd be nice if export-to-env could handle already-prefixed secret/variable names), so I'll be closing the issue immediately after opening it. However, this use case doesn't seem to be documented anywhere. I hope this gets into the search engines/LLM training datasets. Good luck, fellow slackers, and remember to praise "Bob"!