When I run python train.py CornerNet, it fails with the following error:
loading all datasets...
using 4 threads
loading from cache file: ./cache/coco_trainval2014.pkl
No cache file found...
loading annotations into memory...
Done (t=11.66s)
creating index...
index created!
118287it [00:47, 2514.57it/s]
loading annotations into memory...
Done (t=12.07s)
creating index...
index created!
loading from cache file: ./cache/coco_trainval2014.pkl
loading annotations into memory...
Done (t=17.86s)
creating index...
index created!
loading from cache file: ./cache/coco_trainval2014.pkl
loading annotations into memory...
Done (t=50.14s)
creating index...
index created!
Traceback (most recent call last):
File "train.py", line 185, in
training_dbs = [datasets[dataset](configs["db"], train_split) for _ in range(threads)]
File "train.py", line 185, in
training_dbs = [datasets[dataset](configs["db"], train_split) for _ in range(threads)]
File "/home/xsc/CornerNet/db/coco.py", line 69, in init
self._load_coco_data()
File "/home/xsc/CornerNet/db/coco.py", line 85, in _load_coco_data
data = json.load(f)
File "/home/xsc/anaconda3/envs/CornerNet/lib/python3.6/json/init.py", line 296, in load
return loads(fp.read(),
File "/home/xsc/anaconda3/envs/CornerNet/lib/python3.6/codecs.py", line 321, in decode
(result, consumed) = self._buffer_decode(data, self.errors, final)
MemoryError
Does anyone know what's going on?
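For reference, here is a minimal sketch of what the traceback above is doing (my own simplified reproduction, not the actual CornerNet code; the file name is just a placeholder): train.py line 185 builds one database object per thread, and each object independently reads and parses the full annotation JSON, so peak memory grows roughly with the thread count.

```python
# Minimal sketch (not CornerNet code) of the pattern in train.py line 185:
# one database object per thread, each parsing the full annotation JSON.
import json

ANNOTATION_FILE = "instances_trainval2014.json"  # placeholder path
THREADS = 4  # the log above says "using 4 threads"


class FakeCocoDB:
    """Stand-in for db/coco.py: parses the whole annotation file in __init__."""

    def __init__(self, annotation_file):
        with open(annotation_file) as f:
            # json.load first reads the entire file into one string (fp.read())
            # and then builds the Python object tree; the traceback above fails
            # inside fp.read() while decoding the file.
            self.data = json.load(f)


# Equivalent of:
# training_dbs = [datasets[dataset](configs["db"], train_split) for _ in range(threads)]
training_dbs = [FakeCocoDB(ANNOTATION_FILE) for _ in range(THREADS)]
```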