[bazel] Add scripts to validate pregeneration tools #7690
Base branch: 2027
Fork Sync: Update from parent repository
The reason we checked in generated code and don't generate at build time is the substantial set of additional requirements for doing so (particularly protobuf and the quickbuf plugin for it). Are these requirements still optional for the build with this change?
They are not optional, but they are all downloaded / compiled hermetically and sandboxed from the rest of your machine. If you notice, the CI job was not updated, but there is no additional need to do a […]. Since everything is downloaded at build time, the results become more deterministic (you don't need to worry whether you pip install'd a different version of jinja, or even have to pip install at all), and you don't need to worry about finding the appropriate version of something like the […].
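(As a concrete illustration of what that hermetic download typically looks like on the Bazel side: the dependency is pinned by URL and hash, fetched by Bazel itself, and run in the sandbox, so nothing has to be pre-installed on the machine. The repository name, URL, version, and hash below are placeholders, not this PR's actual pins.)

```starlark
# Illustrative sketch only -- placeholder values throughout
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "quickbuffers",  # placeholder repository name
    urls = ["https://example.com/quickbuffers-1.0.0.zip"],  # placeholder URL
    sha256 = "0000000000000000000000000000000000000000000000000000000000000000",  # placeholder hash
    strip_prefix = "quickbuffers-1.0.0",  # placeholder
)
```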
The downside is that this constrains the build to those specific host build machines, correct? What about arm64 hosts, for example?
So if there is a host architecture that doesn't have a pre-built dependency for itself (quickbuf, if one of the […]), I can probably add a […].
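(A hedged sketch of the kind of per-architecture fallback being discussed, with entirely made-up labels: a `select()` that picks a prebuilt binary where one exists and otherwise builds the tool from source.)

```starlark
# Illustrative sketch only -- all labels are placeholders
alias(
    name = "quickbuf_protoc_plugin",
    actual = select({
        # Use a prebuilt plugin where one exists for this CPU...
        "@platforms//cpu:x86_64": "@quickbuf_prebuilt_x86_64//:plugin",
        # ...and fall back to building from source everywhere else (e.g. arm64).
        "//conditions:default": "//third_party/quickbuf:plugin_from_source",
    }),
)
```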
Fork Sync: Update 2027 branch
This might feel like a lot just to duplicate the `.github/workflows/pregenerate.yml` action, but it shows a powerful thing that bazel can do, and this area is eating up a lot of diff space between wpilibsuite and my mega-fork.

This adds bazel'ability to the pregeneration (`generate_numbers.py`, `generate_nanopb.py`, etc., but not `upstream_utils`). Each individual script can now be run with `bazel run //wpimath:generate_numbers`, etc. But the really powerful thing is that they can be run automatically at build time, and the generation results can be compared against the source tree in a unit test (a sketch of that pattern is shown below). Bazel's cache is respected across builds, so this autogeneration will almost never result in a cache miss and have to be re-run. All of the pregen stuff can be run at once with `bazel run //:write_all`.
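As a rough sketch of that compare-against-the-source-tree test (the generator flag, file paths, and target names here are illustrative placeholders, not necessarily what this PR wires up), the pattern with bazel_skylib's `diff_test` looks roughly like:

```starlark
# BUILD.bazel -- illustrative sketch; names and paths are placeholders
load("@bazel_skylib//rules:diff_test.bzl", "diff_test")

# Run the generator inside the sandbox and capture its output.
genrule(
    name = "generated_numbers",
    outs = ["generated/Numbers.java"],
    cmd = "$(execpath :generate_numbers) --output $@",  # hypothetical --output flag
    tools = [":generate_numbers"],  # assumed py_binary wrapping generate_numbers.py
)

# Fail `bazel test` if the freshly generated file drifts from the checked-in copy.
diff_test(
    name = "numbers_up_to_date_test",
    file1 = ":generated_numbers",
    file2 = "src/generated/main/java/Numbers.java",  # placeholder checked-in path
)
```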
When the output is a folder instead of a single file, all of the contents of the folder must be update'able at once, which forced making a new script for subprojects that run multiple things, e.g. `wpimath` running `generate_numbers` + `generate_nanopb` + `generate_quickbuf`.
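For reference, one possible shape for that kind of grouped "write everything for this subproject" target is aspect_bazel_lib's `write_source_files`; this is only a hedged sketch of the general idea, not the script this PR actually adds, and every label and path below is a placeholder:

```starlark
# wpimath/BUILD.bazel -- illustrative sketch only; labels and paths are placeholders
load("@aspect_bazel_lib//lib:write_source_files.bzl", "write_source_files")

# A single runnable target that copies every generated output for this
# subproject back into the source tree with one `bazel run`.
write_source_files(
    name = "write_generated",
    files = {
        "src/generated/main/java": ":numbers_outputs",
        "src/generated/main/native/cpp": ":nanopb_outputs",
        "src/generated/main/proto": ":quickbuf_outputs",
    },
)
```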
In addition, this unlocked the ability to build the examples. You can't really read the `examples.json` file during the analysis phase to create targets, but you can add a task that generates a file which can define targets (a sketch of what such a generated file could look like is below). Eventually the examples will need to be built against shared libraries so they can function correctly, but this at least makes sure they are always compile'able in bazel. A common breaking point I've had is intellisense grabbing code across boundaries, like `ProjectB.Drivetrain` trying to import `ProjectA.Constants`. It seems like gradle allows that to happen, but bazel will not.
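As a rough illustration of the "generate a file which can define targets" idea (the file name, macro name, example names, and dependency label below are made up for illustration, not taken from this PR):

```starlark
# examples.bzl -- hypothetical generated-and-checked-in output of a script that
# reads examples.json. Because this is ordinary Starlark, a BUILD file can
# load() it and call the macro to declare one target per example.

EXAMPLE_NAMES = [
    "Elevator",     # placeholder example name
    "Drivetrain",   # placeholder example name
]

def declare_example_targets():
    """Declares one java_library per generated example name."""
    for name in EXAMPLE_NAMES:
        native.java_library(
            name = name,
            srcs = native.glob(["src/main/java/{}/**/*.java".format(name)]),
            deps = ["//wpilibj"],  # placeholder dependency
        )
```

A BUILD file next to the examples would then just `load(":examples.bzl", "declare_example_targets")` and call it, so adding an entry to `examples.json` and re-running the generator is enough to get a new buildable target.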