Feature gaps in graph-learn #1383
-
Does graph-learn currently support distributed training on Spark? What about multi-machine, multi-GPU distributed training?
Replies: 1 comment
-
graph-learn provides efficient support for sampling (via graphscope.learning.Graph()), and TensorFlow is the built-in supported ML framework; other engines can certainly be leveraged on top of graphlearn.
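For context, a minimal sketch of what a sampling query could look like with graph-learn's standalone Python API (GSL style), which is the engine the graphscope.learning.Graph() wrapper builds on. The file paths, node/edge schema, and attribute types below are placeholder assumptions, not taken from this discussion.

```python
# Rough sketch of a graph-learn (GSL) sampling query.
# Paths and decoder schemas are hypothetical placeholders.
import graphlearn as gl

g = gl.Graph()
# Register node and edge sources (tab-separated files with a known schema).
g.node("data/user.txt", node_type="user",
       decoder=gl.Decoder(attr_types=["float"] * 10))
g.node("data/item.txt", node_type="item",
       decoder=gl.Decoder(attr_types=["float"] * 10))
g.edge("data/buy.txt", edge_type=("user", "item", "buy"),
       decoder=gl.Decoder(weighted=True))
g.init()

# Sample a batch of 64 users and, for each, 10 "buy" neighbors at random.
query = g.V("user").batch(64).outV("buy").sample(10).by("random").values()
result = g.run(query)
```

The sampled neighborhoods are then fed to whichever ML framework is used for model training.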
Yes, distributed training is supported, and multi-GPU training depends on the framework itself. It would require a certain amount of setup effort, IMO.
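Since multi-GPU training is handled by the ML framework rather than by graph-learn itself, here is a rough sketch of the framework-side setup with TensorFlow's Keras API. The model and input tensors are placeholders; in a real job the inputs would come from the sampled subgraphs. For multi-machine training, tf.distribute.MultiWorkerMirroredStrategy plays the analogous role.

```python
# Sketch: multi-GPU data parallelism is configured on the TensorFlow side,
# independently of graph-learn. Model and inputs below are placeholders.
import tensorflow as tf

strategy = tf.distribute.MirroredStrategy()  # one replica per visible GPU
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")

# Placeholder features/labels standing in for sampled subgraph data.
x = tf.random.normal([1024, 10])
y = tf.random.normal([1024, 1])
model.fit(x, y, batch_size=256, epochs=1)
```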
GraphScope (as well as graphlearn) doesn't have those built-in algorithms, but users can certainly implement them on top of the current framework.
graphlearn hasn't been integrated with PyG yet.
Yes.
We haven't done any profiling yet, so we cannot give such numbers currently.
GraphScope requires the