Google Colab: TypeError in bsoid_assign() (Too many bodyparts?) #7
Comments
Hi, thanks for your issue. You guessed correctly: it was because we hard-coded 6 body parts (we've done HCA on all kinds of features derived from those 6 body parts, so 6 is what we are confident in at the moment), so 9 will cause an issue.

Implementation-wise: for your data, if you use only ['nose', 'left shoulder', 'right shoulder', 'left hip', 'right hip', 'tail root'], it should work. In other words, if you remove the extra columns before bsoid_assign, it should run as expected. The number of rows and the data file size are not a concern.

Your data specifically: 9 data points will obviously provide more detailed information for B-SOiD to parse, and will likely yield finer behavioral differences (though in my computation on the 6 points, I do estimate additional points to get around that). We can talk more about incorporating "neck" or "centroid" if you like. Having those additional 3 points lets you analyze a barrage of kinematics that 6 points lack, but it does not necessarily add weight to extracting new behaviors.

Note that the Python notebook is in beta and I have only tested it on my open-field data so far; I am currently still testing it on other experiments. Please keep me posted on any new issues using Google Colab! I am eager to fully implement the Python version of this algorithm (pip install bsoid will be great for Python users, as opposed to running it on the cloud!).

Alex
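For reference, a minimal sketch of the column-slicing workaround described above, assuming the tracking data sit in a pandas DataFrame read with DeepLabCut's standard three-row column header (scorer / bodyparts / coords). The file path is a placeholder, the bodypart labels are the underscore-style names used in this thread, and the notebook's own loading code may differ:

```python
import pandas as pd

# Placeholder path; read the raw DeepLabCut CSV with its three header rows.
data = pd.read_csv('example_DLC_output.csv', header=[0, 1, 2], index_col=0)

# Keep only the six body parts the notebook currently expects
# (level 1 of the column MultiIndex is the bodyparts level in DLC output).
keep = ['nose', 'left_shoulder', 'right_shoulder',
        'left_hip', 'right_hip', 'tail_root']
data = data.loc[:, data.columns.get_level_values(1).isin(keep)]

# Then call the notebook's function as before (assuming it accepts the
# sliced DataFrame directly).
f_10fps, tsne_feats, labels, tsne_fig = bsoid_assign(data, fps=30, comp=1,
                                                     kclass=50, it=30)
```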
Hi Alex, thanks for the quick reply. I will try out slicing the dataframe to 6 bodyparts!
Hi, I've fixed it. It was a pandas version difference; Colab recently updated to pandas 1.0.3. Let me know if this solves the issue. Alex
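If anyone else hits this, a quick way to check which pandas version a Colab runtime is actually running, and to pin one explicitly, is sketched below (1.0.3 is assumed to be the version referred to above; the runtime has to be restarted after reinstalling):

```python
import pandas as pd

# Shows the version currently installed on the Colab runtime.
print(pd.__version__)

# To pin a specific version from a Colab cell (restart the runtime afterwards):
# !pip install pandas==1.0.3
```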
Great work with the Google Colab notebook. As a python user, I was looking forward to this! Finally have the time to try it out!
Sadly, running your Google Colab notebook with original DeepLabCut data yields a TypeError at the following line:
f_10fps, tsne_feats, labels, tsne_fig = bsoid_assign(data, fps=30, comp=1, kclass=50, it=30)
As documented, I only changed fps (from 60 to 30).
I am using a single-animal model (top-down view) with 9 points; they are called ['nose', 'neck', 'left_shoulder', 'right_shoulder', 'left_hip', 'centroid', 'right_hip', 'tail_root', 'tail_tip'].
I was not able to tell whether you are using hard-coded bodyparts for the feature extraction; if so, that might be the issue.
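As a hypothetical sanity check (not part of the notebook), the number of tracked body parts can be read straight off the DeepLabCut column header, since each part contributes three columns (x, y, likelihood); the path below is a placeholder:

```python
import pandas as pd

# Placeholder path to one of the raw DeepLabCut CSVs.
df = pd.read_csv('example_DLC_output.csv', header=[0, 1, 2], index_col=0)

# Level 1 of the column MultiIndex holds the bodypart names.
bodyparts = df.columns.get_level_values(1).unique().tolist()
print(len(bodyparts), bodyparts)  # 9 parts here, vs. the 6 the notebook assumes
```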
More Details
I run the code entirely on Google Colab, with Google Drive connected. The DeepLabCut CSV files are raw but quite big (>50k rows each).
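For completeness, a minimal sketch of that setup (the Drive path is a placeholder); reading with the three-row header keeps DeepLabCut's scorer/bodyparts/coords structure intact, and per the reply above the file size itself is not a concern:

```python
from google.colab import drive
import pandas as pd

# Mount Google Drive inside the Colab runtime.
drive.mount('/content/drive')

# Placeholder path to one of the raw DeepLabCut CSVs on Drive.
csv_path = '/content/drive/My Drive/dlc_output/example_video_DLC.csv'
df = pd.read_csv(csv_path, header=[0, 1, 2], index_col=0)
print(df.shape)  # >50k rows per file in this case
```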
If you need any further info, I am happy to provide it.