Find an appropriate AC and/or AF cutoff for the LD data, such that the code does not encounter out-of-memory issues.

Wenhan mentioned errors at 10 million variants per run (i.e., per genetic ancestry group), but the prior gnomAD v2 code mentioned the same at 30 million variants, both on standard workers. Where will I encounter this when running?

For cutoffs, we are running on both NFE and AFR. Is there a single cutoff that would yield an appropriate variant count (Wenhan ballparked 7-9 million) for both?
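Not from the issue itself, but one way to ballpark a cutoff per ancestry group: given the per-variant allele frequencies for a group, find the smallest AF threshold that leaves at most the target number of variants (e.g., 7-9 million). A minimal sketch, assuming the AFs have already been exported to an array (the function name `af_cutoff_for_target` and the toy Beta-distributed AF spectrum are illustrative, not from the codebase):

```python
import numpy as np

def af_cutoff_for_target(afs, target_variants):
    """Return the smallest AF cutoff (keep variants with AF >= cutoff)
    that leaves at most `target_variants` variants."""
    afs = np.sort(np.asarray(afs))  # ascending order
    n = afs.size
    if target_variants >= n:
        return 0.0  # no filtering needed
    # Keeping AF >= afs[n - target_variants] retains the top `target_variants`
    # frequencies (assuming distinct values; ties can keep slightly more).
    return float(afs[n - target_variants])

# Toy example: 1,000 variants, aim to keep ~100.
rng = np.random.default_rng(0)
afs = rng.beta(0.2, 5.0, size=1000)  # skewed toward rare variants
cutoff = af_cutoff_for_target(afs, 100)
kept = int((afs >= cutoff).sum())
print(cutoff, kept)
```

Running this separately on the NFE and AFR frequency arrays would show whether a single shared cutoff can land both groups in the 7-9 million range, or whether per-group cutoffs are needed.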
Relevant to #1656