Investigate options for continuous integration #63
Thanks for the research work! I tried to put the ideas into practice and managed to build the ROM successfully using GitHub Actions: https://github.com/GiacomoGarbin/github-actions/actions/runs/8971522515.

Unless I'm missing something, workflow artifacts aren't exposed by default and the Steam credentials are stored as secrets, so there shouldn't be any leaks of sensitive data. But the concern about PR vulnerability seems serious. To test the changes and build the ROM, a PR would need access to the secrets, and a PR could also maliciously alter the Makefile to leak the ROM or other sensitive data. Maybe it's possible to make PRs safe for our scope, but I couldn't find an easy way around it. One possible solution would be to translate the Makefile into a GitHub Actions workflow, which in theory cannot be tampered with by a malicious PR, but that doesn't rule out a vulnerability at some step of the workflow.

However, I think CI can still be viable and useful even if it only runs on push and merge events, i.e. after PR code review. Additionally, contributors can set up CI on their fork repositories, using their own Steam credentials stored as secrets, to run workflows on their pushes before opening the PR.

The good news is that the entire workflow (downloading game data + building the ROM) took 7 minutes, and on other runs I've done it took less than 6 minutes, which is much less than I expected. There's also some room for improvement, since the download-game-data and setup-and-install-dependencies steps are independent and could run in parallel before the build-ROM step. So I think caching wouldn't be necessary.

As for logging into Steam via SteamCMD, since I have Steam Guard enabled, I have stored …
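For illustration, a minimal sketch of such a workflow might look like the following. It is not the real workflow file: the workflow name, branch, install directory, and build command are placeholders, 400 is Portal's Steam app ID used illustratively, and Steam Guard handling is omitted.

```yaml
# Minimal sketch, not the actual workflow: names, paths, and the make target are placeholders.
name: build-rom

on:
  push:
    branches: [master]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Download game data with SteamCMD
        env:
          # Stored as repository secrets, so they never appear in the repo or the logs.
          STEAM_USERNAME: ${{ secrets.STEAM_USERNAME }}
          STEAM_PASSWORD: ${{ secrets.STEAM_PASSWORD }}
        run: |
          # Install SteamCMD from Valve's tarball (verify the URL against the SteamCMD docs).
          mkdir -p "$HOME/steamcmd" && cd "$HOME/steamcmd"
          curl -sqL https://steamcdn-a.akamaihd.net/client/installer/steamcmd_linux.tar.gz | tar zx
          # 400 is Portal's Steam app ID; Steam Guard may need extra handling (see above).
          ./steamcmd.sh +force_install_dir "$GITHUB_WORKSPACE/portal" \
            +login "$STEAM_USERNAME" "$STEAM_PASSWORD" +app_update 400 validate +quit

      - name: Build ROM
        run: |
          # Dependency setup omitted; in practice this would follow the repo's build instructions.
          make
```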
Thanks for prototyping this! It looks very promising.
Yes, based on the documentation and the run you linked to, this is confirmed :)
I'm happy to see it's so quick, and I agree caching isn't worth it in the first pass given the game was downloaded in ~1m. Getting to it in the future would be a nice (albeit small) improvement. Similarly, installing dependencies at the same time the game data is downloaded would save ~1m based on the numbers in the run you shared (assuming no caching).

I did some more reading and there doesn't seem to be a nice way to run steps in parallel; we'd have to do some scripting to accomplish this, which wouldn't be difficult. So we have two possible ways to save that ~1m. My leaning is the former (caching), since it results in the workflow doing less work. But as mentioned it's not critical.
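In case it helps, the scripting could be as simple as backgrounding both tasks inside one step and waiting on each. A sketch of such a step, where `download_game_data.sh` and `install_dependencies.sh` are hypothetical stand-ins for the real commands:

```yaml
      - name: Download game data and install dependencies in parallel
        run: |
          # Both scripts are hypothetical placeholders for the real commands.
          ./download_game_data.sh &
          download_pid=$!
          ./install_dependencies.sh &
          deps_pid=$!
          # Fail the step if either background task fails.
          wait "$download_pid"
          wait "$deps_pid"
```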
Yes, it's tricky. If we re-created the Makefile logic inside the workflow, it would no longer be running the exact same steps a user would, meaning bugs could slip through or the two build processes could diverge over time. Regardless, there would still be the possibility of malicious code somewhere else we don't handle (e.g., in the various scripts called during the build).

I agree that CI is useful even if it only runs after PR review, but ideally contributors can be alerted to build issues before the merge, without having to configure their own CI. The less friction the better. I think it's good enough to just require approval to trigger workflow runs from forks, for example using Environments.
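For example, gating the job on an environment with required reviewers might look something like this. The environment name "ci" is an assumption; it would have to be created in the repository settings with required reviewers configured.

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    # "ci" is a hypothetical environment configured with required reviewers,
    # so runs triggered from forks wait for maintainer approval before the job
    # starts and before any environment secrets become available to it.
    environment: ci
    steps:
      - uses: actions/checkout@v4
      # ... download game data and build as usual ...
```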
Sorry, you're right: steps are executed sequentially. I was thinking of jobs, which do run in parallel, but I forgot that inter-job communication happens via uploading/downloading artifacts, which is what we want to avoid. That said, parallelization should still be possible via scripting.
Yes, the workflow approach is a bad idea from multiple points of view.
I haven't read up on caching yet, but it's definitely worth having a look.
Once we have the CI up and running, I think it would be useful to test the different setups ("recommended", Docker, and manual) on Ubuntu, assuming we want to maintain them all, and then maybe test them on Windows and macOS too, since GitHub-hosted runners offer all three operating systems. And all of these tests can be performed in parallel jobs.
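For instance, a build matrix could fan the same job out across operating systems and setups in parallel. A rough sketch, where the setup names and the per-setup wrapper script are placeholders rather than actual files in the repo:

```yaml
jobs:
  build:
    strategy:
      fail-fast: false
      matrix:
        os: [ubuntu-latest, windows-latest, macos-latest]
        setup: [recommended, docker, manual]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4
      - name: Run the selected setup and build
        # Hypothetical per-setup wrapper script; one job runs per os/setup combination.
        run: ./ci/setup-${{ matrix.setup }}.sh
```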
This sounds perfect.
Yes, agreed. That will help detect problems that only affect specific build setups. Right now Windows and Mac won't work as-is with the manual setup steps, and Docker should (in theory) be the same on all platforms. Let's start with native (…)
See discussion in #47 (comment).
It would be easier to avoid breaking the build if we implemented CI builds, for example using GitHub Actions to compile whenever a branch is updated. The complicating factor here is that the Portal game files and the compiled Portal 64 ROM must be kept private (see N64 Libraries).
SteamCMD can be used to download the game files, and credentials/API keys can be stored using GitHub Actions secrets. This could be slow, so caching is worth looking into. Based on the GitHub Actions documentation, files generated during jobs are not exposed by default, but this should be confirmed, particularly if caching is used.
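If caching turns out to be worthwhile, something along these lines could reuse the downloaded files between runs. This is a sketch: the path and cache key are placeholders, and who can read the cache should be confirmed before putting game data in it.

```yaml
      - name: Cache downloaded game files
        uses: actions/cache@v4
        with:
          # Placeholder path for wherever SteamCMD installs the game data.
          path: portal
          # A static key reuses the cache until it is evicted or the key is bumped.
          key: portal-game-data-v1
```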
As for keeping the compiled ROM private, according to "Storing workflow data as artifacts", again it appears that exposing build output is opt-in. Confirm this as well.
Important: Sensitive data must not be accessible through specially-crafted pull requests. See https://securitylab.github.com/research/github-actions-preventing-pwn-requests/. One way to avoid this is to require approval for workflow runs or to only trigger on events such as pull request approval.
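As one possible shape, the workflow could run on pushes and only on pull requests that have already received an approving review. This is a sketch; whether secrets are exposed to review-triggered runs from forks should be verified against the article above before relying on it.

```yaml
on:
  push:
    branches: [master]
  # Run for pull requests only once a review has been submitted...
  pull_request_review:
    types: [submitted]

jobs:
  build:
    # ...and only if that review approved the changes.
    if: github.event_name == 'push' || github.event.review.state == 'approved'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
```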
If GitHub-hosted runners can't provide the necessary data security, a few other options are:
In summary: