Adding support for the Julia language #151
I'm not aware of anyone doing this, so feel free to work on it! You can look at the C++ simple kit for the basic structure we would like for a kit.
I might push a draft soon to get some feedback, but I have mostly followed the Python kit. The object-oriented design makes it slightly awkward, so I made minor changes to use multiple dispatch, but it seems to be moving along. I have seen minor differences between the implementation and the public API definition, but looking at the kit helped me figure things out.
I have opened it as a draft PR, so it isn't ready for a complete review, but I would regardless appreciate some comments on how to start testing a new bot/port. I guess I would need to get a custom version of the CLI?
Since Julia can be compiled, as long as you can compile the bot code into a file, e.g. main.out, the CLI tool will try to execute it by running ./main.out, so no customization is necessary. A test command would be: lux-ai-2021 main.out main.out --replay=out.json
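A local test loop based on that command might look like the sketch below. The file name `main.out` comes from the thread; the guard is just a convenience so the script exits cleanly on machines where the CLI or the compiled bot isn't present yet.

```shell
#!/bin/sh
# Sketch: run a self-play match with the compiled Julia bot.
# Assumes the bot was compiled to ./main.out, per the discussion above.
BOT=./main.out
if command -v lux-ai-2021 >/dev/null 2>&1 && [ -x "$BOT" ]; then
  # Pit the bot against itself and save a replay for inspection.
  lux-ai-2021 "$BOT" "$BOT" --replay=out.json
else
  echo "lux-ai-2021 CLI or $BOT not available; skipping match"
fi
```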
Our team is also not fluent in Julia, so if you can document the differences from the Python API, that would be great; beyond that, feel free to implement it however best suits Julia programmers.
I think I understand the last component of the pipeline now. Probably the easiest solution would be to use PyJulia...
Can you not directly execute compiled Julia code? Or does it not compile to machine code?
One could potentially use PackageCompiler (see this section) to create an app that can be sent to and run on other machines without Julia being installed there. However, at least judging from the Kaggle submission / environment and the Dockerfile, it seems to be a lot easier to:
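For reference, the PackageCompiler route mentioned above is roughly a one-liner. This is a sketch, assuming the bot lives in a package directory `MyBot/` with a `Project.toml` and a `julia_main()` entry point (the convention `create_app` expects); the directory names are placeholders.

```shell
#!/bin/sh
# Sketch: build a relocatable app from a Julia bot package with
# PackageCompiler.jl. "MyBot" and "build" are hypothetical names.
if command -v julia >/dev/null 2>&1; then
  julia -e 'using PackageCompiler; create_app("MyBot", "build")'
  # The resulting executable would land under build/bin/
else
  echo "julia not installed; skipping app build"
fi
```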
I guess once I figure out the last kinks in the starter code, either way would work.
Ah, so compiled Julia code still requires some kind of Julia runner, in which case PyJulia looks like a better option.
As compiled code, it wouldn't require Julia, but at least for development it is likely that just calling Julia from Python for that one line in that file will be an easier experience. I am testing making the executable files as an option, and basically people could do the same setup at the end regardless. I was mostly thinking of making it easier for folks to develop / work with the code: that way they don't need to create the executables, but they have the option.
Sounds great! I only had concerns over potential speed, and I wasn't sure how PyJulia works (e.g., is it transpiling? is it 100% accurate?).
I have had some experience calling Python and R from Julia, as well as Julia from R and Python... not too much of that these days. In general, it's quite useful and works well in most cases, with rare exceptions. In this case, I imagine it would be the best-case scenario, since the only thing being passed around is a
I think I am about done with the starter kit... will push soon. I am building the test suite around a recorded game, using it to verify the state of the game at initiation and how it progresses. Inspecting that game, the first observation received from the API has lines indicating that each player has their initial Worker.
@Nosferican Thanks for your time and effort in creating the Julia kits. I am not sure, but is it possible to create a compile.sh that lets us install required packages? Or can we include a custom Dockerfile with the submission to customize the container image that runs the bot (agent)? As for PyJulia: a command call via Python (main.py), e.g. as in the Java kit, seems more intuitive to me. Is there any benefit to using PyJulia instead of a command call?
Currently the Julia simple kit only uses one package, for parsing JSON (JSON3.jl). When I compile the Julia code into the executable, it is embedded as part of the machine-code files, so everything is self-contained. Technically, depending on the compilation options, you can use the base system image (which includes the standard libraries), make a trimmed version including only the standard libraries the code uses, or a mix with any additional packages the code depends on.

The Julia packages are listed in the Project.toml and Manifest.toml files. The Project.toml gives the dependencies and compatibility requested; the Manifest.toml is the "lock file", as it allows one to fetch and generate the exact environment. Absent the Manifest.toml, it is generated from the Project.toml file. Users only need to run Pkg.instantiate().

I guess I have warmed up to the command call due to the … I have tested the command-line tool by feeding the …
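The environment-reproduction step described above is a single command. This sketch assumes the kit ships a Project.toml (and ideally a Manifest.toml) at the project root, as is standard for Julia projects:

```shell
#!/bin/sh
# Sketch: recreate the kit's exact package environment from
# Project.toml / Manifest.toml (the latter is the lock file).
if command -v julia >/dev/null 2>&1; then
  julia --project=. -e 'using Pkg; Pkg.instantiate()'
else
  echo "julia not installed; skipping instantiate"
fi
```

Note that `Pkg.instantiate()` downloads any missing packages, so it needs network access unless the depot is already populated, which is exactly the concern raised below.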
Do you mean using "PackageCompiler.jl" to build a custom Julia sysimage (or standalone executable?) and including it in the submission zip file? (link)
Yes, Pkg.instantiate() is a good idea, but I am not sure the bot container can access the Internet to download packages on the fly. (@StoneT2000, can you clarify whether the container has internet access when running a match?) To summarize: the "PackageCompiler.jl", "preinstall in Dockerfile", "Pkg.instantiate()", and "local installation" approaches all still seem unable to completely resolve the additional-packages issue.
Understood, thanks for your clarification.
There is no internet connection in the production containers. If there is, it's a bug and will probably be removed. I recommend trying to find a way to package all your libraries into your submission (which should be doable AFAIK). Moreover, this way you don't lose any time in the match.
I would be very surprised to learn that Julia does not have any kind of local packaging system; that would feel incredibly poorly designed. How would you develop local packages?
For the compiled version, once everything is running well, the ideal would be to generate the image while trimming down any unused standard library (e.g. LinearAlgebra) to make it as small as possible, running the codebase against a test / burn script (feeding it the first two observation lines of a game so that those code paths are also compiled into the image). The packages for Julia would already be embedded in the app, so there is no need to download / install anything at that step.

Hopefully that makes things a bit clearer. If not, let me know and I can try to clarify it.
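The trimmed build described above can be sketched with PackageCompiler's `filter_stdlibs` and `precompile_execution_file` options. The package name `MyBot` and the burn script `burn.jl` (which would replay the first observation lines) are hypothetical:

```shell
#!/bin/sh
# Sketch: build a size-trimmed app. filter_stdlibs drops standard
# libraries the code never uses; burn.jl (hypothetical) exercises the
# game loop so those paths are compiled into the image.
if command -v julia >/dev/null 2>&1; then
  julia -e 'using PackageCompiler;
            create_app("MyBot", "build";
                       filter_stdlibs=true,
                       precompile_execution_file="burn.jl")'
else
  echo "julia not installed; skipping trimmed build"
fi
```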
Sorry for my misleading wording. By "local" I mean the project / runtime level, in contrast to the system-wide (global) level. When you are developing / editing a package, there is still a "local" copy in your project directory, but it does not include its dependencies; those dependency packages remain "global".
I have another idea: I think we can make use of DEPOT_PATH to point the package repo at the running directory.
Steps to create a submission file (only needed when additional packages are required):
Corresponding changes in main.py:
Together with adding some selected preinstalled packages to the match container's Dockerfile, I think this can completely solve the package-dependency issue, even if it is a bit complicated.
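The DEPOT_PATH idea above can be sketched as follows: ship a pre-populated depot directory inside the submission and point Julia at it via the `JULIA_DEPOT_PATH` environment variable before launching the bot. The `depot/` layout and the launch line are hypothetical.

```shell
#!/bin/sh
# Sketch: redirect Julia's package depot to a folder bundled with the
# submission, so no network access is needed at match time.
mkdir -p depot                      # bundled, pre-populated in the zip
export JULIA_DEPOT_PATH="$PWD/depot"
echo "depot set to: $JULIA_DEPOT_PATH"
# main.py / compile.sh would then launch the bot, e.g.:
#   julia --project=. main.jl
```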
I think submission size can be upwards of a few hundred megabytes.
Hi, just bumping this thread to see whether it needs any more help / contributions from the Lux AI team (or anyone else). No rush though; thanks again for initiating this.
No worries. Had a few deadlines for work this week that ate some of my time, but I plan to take a look at it today.
Mea culpa. I was out of town for the last 10 days or so. Remind me at the end of the week if I haven't pushed an update; I should be getting to it in the next couple of days.
Bumped! @Nosferican |
Thanks for the bump. I have not forgotten, but I have had a pretty eventful few days. Will try to get back to it soon.
Planning to take a look at it; in case anyone else has the same idea, we can be aware of each other's efforts.