This repository has been archived by the owner on May 9, 2020. It is now read-only.

Memory Error #1

Open
Hout1 opened this issue Jun 20, 2019 · 1 comment



Hout1 commented Jun 20, 2019

Hi,
I want to use PySESA to decimate my point cloud, but Python always raises a memory error. I ran the following code on the bundled 'example_100000pts.xyz' point cloud as a test:

```python
import pysesa

infile = '/home/radewane/Bureau/Point-cloud-Rambla/pysesa-master/pysesa/example_100000pts.xyz'
pysesa.process(infile, 1, 4, 1, 1024, 0.05, 20, 1, 64, 1, 0)
```

but I keep getting the same error:

```
memory error, using 1.0 max points
memory error, using 1.0 max points
memory error, using 1.0 max points
memory error, using 1.0 max points
memory error, using 1.0 max points
memory error, using 1.0 max points
^CTraceback (most recent call last):
  File "Code111.py", line 3, in <module>
    pysesa.process(infile, 1, 4, 1, 1024, 0.05, 20, 1, 64, 1, 0)
  File "/home/radewane/Bureau/Point-cloud-Rambla/local/lib/python2.7/site-packages/pysesa/_pysesa.py", line 452, in process
    nr_pts = pysesa.partition(toproc, out, mxpts, minpts, prc_overlap).getdata() #res, bp
  File "pysesa/partition.pyx", line 206, in pysesa.partition.partition.init
    mxpts = np.max([1,mxpts-2])
  File "/home/radewane/Bureau/Point-cloud-Rambla/local/lib/python2.7/site-packages/numpy/core/fromnumeric.py", line 2505, in amax
    initial=initial)
  File "/home/radewane/Bureau/Point-cloud-Rambla/local/lib/python2.7/site-packages/numpy/core/fromnumeric.py", line 86, in _wrapreduction
    return ufunc.reduce(obj, axis, dtype, out, **passkwargs)
KeyboardInterrupt
```
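One possible workaround while debugging (a sketch using plain numpy, not part of pysesa itself) is to thin the point cloud before handing it to `pysesa.process`; keeping every 10th point cuts memory use roughly tenfold. The array below is synthetic stand-in data; a real cloud would come from something like `np.loadtxt(infile)`:

```python
import numpy as np

# Synthetic stand-in for a 100,000-point x/y/z cloud (the real data
# would be loaded with np.loadtxt('example_100000pts.xyz'))
rng = np.random.default_rng(0)
pts = rng.random((100000, 3))

# Keep every 10th point: ~90% fewer points, much lower memory use
thinned = pts[::10]

print(thinned.shape)
```

The thinned array can then be saved back to a text file with `np.savetxt` and passed to pysesa in place of the original cloud.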

Can you help me solve this problem please
Thank you in advance

@dbuscombe-usgs (Owner) commented

@Hout1 my apologies for the delay. I have upgraded the whole package. Please try the new workflow and let me know if you have issues. The following changes have been made:

  1. Python 3
  2. new Cython modules and a simpler way to compile them
  3. removed the nifty package dependency; all power spectra are now computed using numpy
  4. simpler and faster computation of integral lengthscales
  5. removed the plotting libraries (too hard to maintain long-term)
  6. removed the Savitzky-Golay filter (overkill)
  7. removed the option to 'smooth' the power spectrum (it yielded inconsistent and sometimes bizarre results)
  8. spatial statistics no longer use the Knuth/Welford algorithm
  9. added a conda environment
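To illustrate the numpy-only spectral approach mentioned in item 3, here is a minimal sketch of a 2-D power spectrum computed with numpy alone (an illustrative stand-in on synthetic data, not the package's actual code):

```python
import numpy as np

# Synthetic detrended surface patch (stand-in for one pysesa window)
rng = np.random.default_rng(1)
z = rng.standard_normal((64, 64))
z -= z.mean()  # remove the mean before taking the FFT

# 2-D power spectral density via numpy's FFT, zero frequency centred
F = np.fft.fftshift(np.fft.fft2(z))
psd = np.abs(F) ** 2 / z.size

print(psd.shape)
```

Everything here is stock numpy, so no extra spectral library is needed.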
