Use of depth image #55
Comments
Hi, thanks a lot for your question! I've tried reading out the depth data from an HDF5 file in that dataset, and I can confirm that it indeed doesn't contain depth data, but rather RGB data. That doesn't seem right; I'll follow up with some of the other folks on the project (@danielbear).
Re the training files: IIRC we didn't generate the depth data for the training set to keep the size manageable (the HDF5s can get quite large). If you need access to those fields, you could consider regenerating the data with the right flags set. The code to generate the dataset can be found here: https://github.com/neuroailab/tdw_physics/tree/Neurips2021
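For reference, a quick way to check what a given HDF5 actually contains is to inspect its keys with h5py. This is only a sketch: the file name is hypothetical, and the `frames/<idx>/images/<pass>` layout is an assumption based on how the tdw_physics writer typically organizes trials, so verify the paths against your own files.

```python
import io

import h5py
import numpy as np
from PIL import Image

# Hypothetical file name; adjust the group paths if your files are laid out differently.
with h5py.File("example_trial.hdf5", "r") as f:
    print(list(f.keys()))                      # top-level groups in the file
    images = f["frames"]["0000"]["images"]     # assumed frames/<idx>/images layout
    print(list(images.keys()))                 # which passes were actually written

    if "_depth" in images:
        raw = images["_depth"][()]
        # Passes may be stored as encoded image bytes rather than raw arrays.
        if isinstance(raw, bytes):
            arr = np.array(Image.open(io.BytesIO(raw)))
        else:
            arr = np.asarray(raw)
        # Three identical-looking channels here would suggest the RGB-instead-of-depth
        # issue discussed in this thread rather than a true depth pass.
        print(arr.shape, arr.dtype)
```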
Thanks for your reply!
Hi there,
Thanks for your interest in our work and the Physion dataset! Sorry about
the issue with the `_depth` field. I remember there was a bug in the
ThreeDWorld back end that generated RGB instead of depth when rendering on
macOS. It's possible that has since been fixed, and if not, submitting a PR to
https://github.com/threedworld-mit/tdw might get it resolved. Either way,
your best bet is to regenerate the data you need depth for; Linux or
Windows might be a safer choice if you have access to them.
The same goes for the training data. As Felix mentioned, we only generated
RGB for the training set to keep the file sizes smaller. But you can
regenerate it at whatever resolution you need and with whichever "passes"
you want (e.g. flow, depth, segmentation).
Please let us know if the instructions for generating data, which Felix
linked to, are unclear or don't work for you!
Dan
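For readers regenerating data: the render passes are controlled at the TDW level by the `set_pass_masks` command, which tdw_physics wraps, so check the linked Neurips2021 branch for the flags it actually exposes. Below is a minimal, version-dependent sketch; the scene setup is hypothetical and the exact command fields may differ across TDW releases.

```python
from tdw.controller import Controller
from tdw.tdw_utils import TDWUtils

c = Controller()
c.communicate([
    TDWUtils.create_empty_room(12, 12),
    {"$type": "create_avatar", "type": "A_Img_Caps_Kinematic", "id": "a"},
    # Request whichever passes you need in addition to the RGB "_img" pass,
    # e.g. "_depth", "_flow", "_id" (segmentation), "_normals".
    {"$type": "set_pass_masks",
     "pass_masks": ["_img", "_depth", "_flow"],
     "avatar_id": "a"},
    {"$type": "send_images", "frequency": "always"},
])
c.communicate({"$type": "terminate"})
```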
Hi,
I have followed some graph-based networks on Physion. It is excellent work, and I believe many researchers will build on it. Lately I have had some questions about the use of the depth data:
2. The keys `['_depth']`, `['_normal']`, and `['_flow']` only appear in the test HDF5 files, and I can't find them in the training files. How can these features be used during the training phase?
Thank you very much for any help you can offer!