Can't import Wikidata - either goes idle without finishing (and stops using resources) or throws a DEADLOCK IMMANENT error #695
Comments
I tried running again after upgrading KGTK to 1.5.2. I used exactly the same command except that this time I changed one setting. I had several KGTK processes, one holding roughly 96% of the memory and occasionally flaring up to 100%. Then, rather than the silence of before, it actually threw an error:
I tried again with two processes. Same result as my first post. I did notice that the two processes spawned to do the work eventually became zombies.
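A generic way to spot those zombie workers from the shell (not KGTK-specific) is to check process state; a "Z" in the STAT column marks a zombie:

```
# List kgtk-related processes with their state; "Z" in STAT means zombie
# ([k]gtk keeps grep from matching its own command line)
ps -eo pid,ppid,stat,etime,cmd | grep -i '[k]gtk'
```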
Hi Brett, while I am trying to see who can help with this, we made the imported data for Wikidata 2022-11 available here:
Hi! Thanks for that link. I'll try using the pre-processed version, and that should get me unstuck for now.
Hello everyone! I have a similar problem. After about one hour of execution, my Mac shut down unexpectedly. This happens if the maximum number of processors is used. If, instead, I keep that number at 6, I get the same behavior reported above. My question is the following: I am looking at the Wikidata KGTK files provided by @filievski. Where can I find the node.tsv, edge.tsv, and qualifier.tsv files? I can see many files and this is a bit unintuitive.
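In case it helps while identifying the files: KGTK files are plain tab-separated text, so peeking at the header row of each file shows which one holds nodes, edges, or qualifiers. A minimal sketch, assuming the files are gzipped TSVs in the current directory:

```
# Print the header plus two sample rows of every compressed TSV
for f in *.tsv.gz; do
  echo "== $f =="
  zcat "$f" | head -n 3
done
```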
I come back to this issue. After several tries, I launched the import on a cluster at my institution, with 32 cores and 250 GB of RAM. I launched the same command as in this issue, but with one setting changed. At some point, the program stops producing output. The first time, I set a time limit of one day; the job got stuck before that time ran out. On the second try, I set a time limit of one week. As in the first trial, the program got stuck, and after it had been stuck for a day, the cluster node crashed, probably because it ran out of memory. Could you kindly explain how one can get this Wikidata dump successfully imported? Please also reply to my previous comment.
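For runs like this, it is worth logging memory alongside the job to confirm whether the stall ends in an out-of-memory kill. A minimal sketch (the 60-second interval and the mem.log path are arbitrary choices):

```
# Append a timestamped memory snapshot to mem.log every 60 seconds,
# running in the background while the import job is active
while true; do
  { date; free -h; echo; } >> mem.log
  sleep 60
done &
```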
Hi @filievski, the link you provided is no longer available. It was still working one month ago, but now everything has been deleted.
Hi @tommasocarraro, unfortunately I left ISI 14 months ago, and I am afraid the KGTK codebase is no longer maintained. This also means that the website is now offline. If it helps, here is a dump of Wikidata from 2022-11-02, which I think is the same version that was previously on our website:
I'm trying to import the full Wikidata dump, possibly just the English attributes, into the KGTK format for further analysis. The process runs for a few hours. I can see from the terminal that some of the processes have gotten up to 1.4 million lines processed (not sure out of how many). While it runs, I watch the system resources and see several kgtk processes using most of the machine's memory between them. The number of kgtk processes drops over time. Now there are only two kgtk processes, neither using even 1% of memory, and there is no CPU activity. It seems to have effectively stopped, yet the terminal still displays the last lines-processed output. So it appears to still be running, but it has ceased to do anything. It's been in this state for at least an hour.
To Reproduce
1. Install KGTK under Python 3.9.15 in a local conda env.
2. Download a zip of the Wikidata dump (~70 GB compressed, from within the last 12 months).
3. Activate the conda env.
4. Run the import command (see the sketch below).
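The command itself is missing above; for reference, an invocation of this general shape, based on KGTK's documented import-wikidata options, would look roughly like the following. The file names, --procs value, and --lang filter here are assumptions, and flags may differ between KGTK versions:

```
kgtk import-wikidata \
  -i wikidata-all.json.bz2 \
  --node nodes.tsv.gz \
  --edge edges.tsv.gz \
  --qual qualifiers.tsv.gz \
  --lang en \
  --procs 6
```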
Expected behavior
The process should continue running and using system resources, indicating it's doing something, until all the Wikidata has been converted to the TSV format, or some useful error is thrown.
Additional context
Not sure if the problem is caused by memory leaks or a deadlock issue 🤷‍♂️
After manually killing the process (Ctrl+C), the output throws an error. The same traceback appears three times, followed by this:
/home/ubuntu/anaconda3/envs/kgtk-env/lib/python3.9/multiprocessing/resource_tracker.py:216: UserWarning: resource_tracker: There appear to be 76 leaked shared_memory objects to clean up at shutdown
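Those leaked shared_memory objects come from Python's multiprocessing.shared_memory module, which on Linux backs each segment with a psm_* file under /dev/shm. After a crashed run, they can be inspected and, once no live process still uses them, removed by hand (a generic cleanup sketch, not a KGTK command):

```
# Leaked multiprocessing shared-memory segments appear as psm_* entries
ls -lh /dev/shm/psm_*
# Delete them only after confirming no running process still maps them
rm /dev/shm/psm_*
```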