possible file too long #172
Comments
UPDATE: nope, even dissolving partially filtered subsets does not work. There are five values in that column: two work out, one gives the limit error, and the other two give more trouble, with this explicit error:
I'll try dissolving/rbinding feature by feature...
Can you try directly on the source file with command-line mapshaper, e.g. something like the sketch below?
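A hypothetical reconstruction of that suggestion, run from R via `system2()`; the file name `regions.shp` and the field `REGION` are placeholders, and it assumes the Node.js mapshaper CLI is installed (`npm install -g mapshaper`):

```r
system2(
  "mapshaper",
  args = c(
    "regions.shp",          # read the source file directly from disk
    "-dissolve", "REGION",  # merge polygons sharing the same REGION value
    "-o", "dissolved.shp"   # write the result without round-tripping through R strings
  )
)
```

If your rmapshaper version has the `sys` argument, `ms_dissolve(polys, field = "REGION", sys = TRUE)` should shell out to the system mapshaper in a similar way.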
The initial error is (I think) due to the inability of R to serialize a single string that long. If we could write ndjson it might help, but we don't currently support that. The second error looks like a limit of mapshaper when exporting a single very long geometry. This issue might give you some pointers.
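For illustration, a hedged sketch of the ndjson idea (not a current package feature): since R cannot hold a single character string longer than 2^31 - 1 bytes, one could serialize one feature per line so that no individual string approaches that limit. `polys` stands in for the problematic `sf` object, and `geojsonsf` is just one package that can emit per-feature GeoJSON:

```r
library(sf)
library(geojsonsf)

con <- file("features.ndjson", open = "w")
for (i in seq_len(nrow(polys))) {
  # each row becomes its own, much shorter, GeoJSON Feature string
  writeLines(sf_geojson(polys[i, ], atomise = TRUE), con)
}
close(con)
```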
Hi, I can't actually create the JSON object from `sf`; I get the same error as above, even after stripping digits down to 6. I'll have a look tomorrow at your link below, but I think I need to deal with this in a different way.
Do you have the original source file (shp, gpkg, geojson, etc.), or only the `rds`?
Hi all,
While trying to dissolve an `sf` polygons object, grouping by one of its columns, I received this error:

I've found similar reports about data.table, and it looks like the problem is R's limit on reading/writing text larger than 2 GB? The object is 1,846,143,660 bytes when saved as `rds` (that's the reason I don't post/link to it), so presumably much larger as JSON. I reckon the only quick solution at the moment is to rbind partial dissolutions (roughly as sketched below), but is there a more elegant fix?
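For reference, a rough sketch of that rbind workaround, with `polys` standing in for the `sf` object and `REGION` as a placeholder for the real grouping column:

```r
library(sf)
library(rmapshaper)

# dissolve one group at a time so no single call serializes the whole layer
pieces <- lapply(unique(polys$REGION), function(v) {
  ms_dissolve(polys[polys$REGION == v, ], field = "REGION")
})
dissolved <- do.call(rbind, pieces)
```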
Thanks,
Luca