Plotting of large data sets fails #143
I don't want to be that guy, but why are you trying to create a plot with that many points? I would recommend using something like an LTTB (largest-triangle-three-buckets) algorithm to reduce the number of points to plot. Can you provide more information about your use case? What renderer are you using? Can you provide an example? What version of charming are you using?
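For reference, the LTTB downsampling suggested above can be sketched roughly as follows. This is a hypothetical standalone implementation, not part of charming or the reporter's code; it keeps the first and last points and, for each bucket, keeps the point forming the largest triangle with the previously kept point and the average of the next bucket:

```rust
/// Largest-triangle-three-buckets downsampling (illustrative sketch).
/// `threshold` is the desired number of output points.
fn lttb(data: &[(f64, f64)], threshold: usize) -> Vec<(f64, f64)> {
    if threshold >= data.len() || threshold < 3 {
        return data.to_vec();
    }
    let mut sampled = Vec::with_capacity(threshold);
    // Bucket size, excluding the fixed first and last points.
    let every = (data.len() - 2) as f64 / (threshold - 2) as f64;
    sampled.push(data[0]);
    let mut a = 0usize; // index of the previously selected point

    for i in 0..threshold - 2 {
        // Average point of the *next* bucket.
        let avg_start = ((i as f64 + 1.0) * every) as usize + 1;
        let avg_end = (((i as f64 + 2.0) * every) as usize + 1).min(data.len());
        let n = (avg_end - avg_start) as f64;
        let (mut avg_x, mut avg_y) = (0.0, 0.0);
        for p in &data[avg_start..avg_end] {
            avg_x += p.0;
            avg_y += p.1;
        }
        avg_x /= n;
        avg_y /= n;

        // Pick the point in the current bucket that maximizes the
        // triangle area with the previous point and the bucket average.
        let range_start = (i as f64 * every) as usize + 1;
        let range_end = ((i as f64 + 1.0) * every) as usize + 1;
        let (ax, ay) = data[a];
        let mut max_area = -1.0;
        let mut next_a = range_start;
        for (idx, p) in data[range_start..range_end].iter().enumerate() {
            // Twice the triangle area (cross product magnitude).
            let area = ((ax - avg_x) * (p.1 - ay) - (ax - p.0) * (avg_y - ay)).abs();
            if area > max_area {
                max_area = area;
                next_a = range_start + idx;
            }
        }
        sampled.push(data[next_a]);
        a = next_a;
    }
    sampled.push(*data.last().unwrap());
    sampled
}
```

Downsampling 200M points to, say, 10k per chromosome this way preserves visual extremes far better than naive striding, and would sidestep the memory limit entirely.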
I'm trying to visualize chromosome-wide genomics events. I could downsample, but if I understand that error correctly, it is failing at less than 2 GB, so we're not talking about an obscene amount of resources here. I'm using the most recent version, pulled from the repo. This is the function used for plotting:
The V8 engine seems to have a default memory limit of ~1.4 GB per process on 64-bit machines. Can you try running your code with this env set?
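The specific environment variable suggested in this comment was not preserved in the thread. As an illustration only: for a Node-based V8 process, the heap limit is commonly raised with the `--max-old-space-size` V8 flag (value in MiB), e.g.:

```shell
# Assumption: the renderer runs in a Node/V8 process and render.js is a
# hypothetical entry point; the exact variable from this thread is unknown.
NODE_OPTIONS="--max-old-space-size=4096" node render.js
```

If the renderer embeds V8 some other way (e.g. via Deno), the equivalent flag would need to be passed through that runtime's V8 flag mechanism instead.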
@bwbioinfo did it work? Do you need more help?
When plotting ~200M or more points, the plotting fails with the following error:
I don't see an option to adjust the memory settings, but this might be an avenue to explore.