Currently, PyHa appears to crash when generating automated labels for particularly large datasets. I suspect this is because the automated-labels dataframe grows too large to fit in memory.
Potential fixes:
- Convert stored floats to a smaller dtype such as float32. By default, Pandas stores floats as float64 (8 bytes per value), so downcasting would halve the dataframe's float memory footprint.
- Use the builtin csv Python library (or append-mode writes): create each clip's dataframe individually, then append it to a master CSV file instead of accumulating one giant dataframe. This would hopefully shift the burden from memory onto storage.
- Look into parallelization with Dask. This may speed things up, but I am skeptical that it addresses the memory problem.
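The float-downcasting idea can be sketched with pandas. The column names below are hypothetical stand-ins; PyHa's actual automated-label schema may differ:

```python
import numpy as np
import pandas as pd

# Hypothetical automated-label dataframe; PyHa's real columns may differ.
df = pd.DataFrame({
    "OFFSET": np.random.rand(1000) * 60.0,
    "DURATION": np.random.rand(1000) * 3.0,
    "CONFIDENCE": np.random.rand(1000),
})

before = df.memory_usage(deep=True).sum()

# Downcast every float64 column to float32, halving its footprint.
float_cols = df.select_dtypes(include="float64").columns
df[float_cols] = df[float_cols].astype(np.float32)

after = df.memory_usage(deep=True).sum()
print(before, after)
```

float32 still carries plenty of precision for offsets, durations, and confidence scores, so the halved footprint should be essentially free.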
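The append-to-CSV idea can be sketched as follows. Here the per-clip function and the output path are hypothetical stand-ins for PyHa's real per-clip label generation, and pandas' own append-mode `to_csv` is used in place of the raw csv module (either works for this pattern):

```python
import os
import pandas as pd

MASTER_CSV = "automated_labels.csv"  # hypothetical output path

def labels_for_clip(clip_name):
    # Stand-in for PyHa's per-clip label generation.
    return pd.DataFrame({
        "IN FILE": [clip_name, clip_name],
        "OFFSET": [0.0, 3.0],
        "DURATION": [3.0, 3.0],
    })

if os.path.exists(MASTER_CSV):
    os.remove(MASTER_CSV)

for clip in ["clip_a.wav", "clip_b.wav"]:
    df = labels_for_clip(clip)
    # Append each clip's labels; write the header only on the first write.
    df.to_csv(MASTER_CSV, mode="a",
              header=not os.path.exists(MASTER_CSV), index=False)

result = pd.read_csv(MASTER_CSV)
```

Only one clip's dataframe is ever held in memory at a time; everything already processed lives on disk.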
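The parallelization idea can be sketched with the standard library alone (Dask would distribute the same map-over-clips pattern across workers; the per-clip function here is a trivial hypothetical placeholder):

```python
from concurrent.futures import ThreadPoolExecutor

def label_clip(clip):
    # Stand-in for PyHa's per-clip inference; returns (clip, label count).
    return clip, len(clip)

clips = ["clip_a.wav", "clip_b.wav", "clip_c.wav"]

# Map the per-clip work across a worker pool; order is preserved.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(label_clip, clips))
```

Note that each worker holds its own partial results, so this improves throughput but does not by itself lower peak memory, consistent with the skepticism above.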