
How to generate RIM_ONE_v3.hdf5 and all_data.hdf5 #4

Open
LuShuaie opened this issue Jan 2, 2018 · 6 comments
LuShuaie commented Jan 2, 2018

How can RIM_ONE_v3.hdf5 and all_data.hdf5 be generated, and what are the requirements?

seva100 (Owner) commented Jan 2, 2018

@2017john, the HDF5 datasets can be recreated with the scripts/Organize datasets.ipynb notebook.

Nevertheless, I have uploaded all the datasets here.

LuShuaie (Author) commented Jan 3, 2018

@seva100 Thank you.

Geeks-Sid commented:

Hey @seva100, I am facing a similar issue at this particular line:
h5f = h5py.File("../data/hdf5_datasets/all_data.hdf5", "r+")
I can't find this particular dataset at all, not even in the folder you mentioned.

seva100 (Owner) commented Feb 3, 2018

@Geeks-Sid, all_data.hdf5 is just a union of all the data sets from the folder I mentioned.
For example, if you only need the DRIONS-DB data, you can just replace it with DRIONS-DB.hdf5.
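For instance, the substitution amounts to changing only the file name in the line quoted above (a minimal sketch, not the repository's actual loading code; the toy file is created here only so the snippet runs on its own):

```python
import h5py

# Toy DRIONS-DB.hdf5 built here only so the sketch is self-contained;
# in practice, use the file downloaded from the folder above.
with h5py.File("DRIONS-DB.hdf5", "w") as f:
    f.create_dataset("DRIONS-DB/orig/images", data=[[0, 1], [2, 3]])

# The only change versus the quoted line is the file name:
h5f = h5py.File("DRIONS-DB.hdf5", "r+")
print(list(h5f.keys()))  # ['DRIONS-DB']
h5f.close()
```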

Geeks-Sid commented:
@seva100 Hey, can you explain how to create a union of all the datasets?

seva100 (Owner) commented Mar 3, 2018

@Geeks-Sid, you can copy the items from all the data set files with something like this:

import h5py

h5f_in = h5py.File("DRIONS-DB.hdf5", "r")
h5f_out = h5py.File("all_data.hdf5", "w")
# Group.copy works across files; a plain assignment like
# h5f_out[name] = h5f_in[name] would try to create a cross-file
# hard link, which HDF5 does not allow.
h5f_in.copy('DRIONS-DB', h5f_out)  # recursively copies the whole group
... # copy all other items
h5f_in.close()
h5f_out.close()

Or you can recreate it with the scripts/Organize datasets.ipynb notebook.
But I think you won't really need a union of all the datasets for the sake of replication.
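Should the full union be needed, the per-file copying described above can be wrapped in a loop, along these lines (the file names and toy data below are placeholders; substitute the HDF5 files actually downloaded from the folder mentioned earlier):

```python
import h5py

# Toy per-dataset files so the sketch is self-contained; in practice
# these are the downloaded DRIONS-DB.hdf5, RIM_ONE_v3.hdf5, etc.
for name in ["DRIONS-DB", "RIM_ONE_v3"]:
    with h5py.File(name + ".hdf5", "w") as f:
        f.create_dataset(name + "/orig/images", data=[[0, 1], [2, 3]])

# Union: recursively copy every top-level group of each file
# into all_data.hdf5 (Group.copy works across files).
with h5py.File("all_data.hdf5", "w") as h5f_out:
    for path in ["DRIONS-DB.hdf5", "RIM_ONE_v3.hdf5"]:
        with h5py.File(path, "r") as h5f_in:
            for key in h5f_in:          # e.g. 'DRIONS-DB'
                h5f_in.copy(key, h5f_out)

with h5py.File("all_data.hdf5", "r") as f:
    print(sorted(f.keys()))  # ['DRIONS-DB', 'RIM_ONE_v3']
```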
