I note that the feature matrix of all nodes is fed into the initialization of the model (encoder, aggregator), which consumes a great deal of memory when the feature matrix is large (for example, 10 million nodes with 1,000-dimensional float32 features already take about 40 GB).
The corresponding code is here:
agg = MeanAggregator(features, cuda=True)
enc = Encoder(features, n_feats_dim, args.n_hidden, refined_adj_lists, agg, gcn=args.gcn, cuda=args.cuda)
I think this method could support large feature matrices if we did not implement it like this: instead of handing the full matrix to the model at construction time, we could feed only each batch's feature rows into the model at every step (see the sketch below). This would make the code applicable to much larger graphs.
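For illustration, here is a minimal sketch of what I mean, assuming the aggregator/encoder can call a lookup function instead of indexing a pre-loaded tensor. `FeatureStore` and `feat_lookup` are hypothetical names I made up for this sketch, not identifiers from this repo:

```python
import torch

class FeatureStore:
    """Holds the full node feature matrix on CPU; moves only the rows
    requested for the current batch to the compute device."""

    def __init__(self, feat_matrix: torch.Tensor, device: str = "cpu"):
        self.feat_matrix = feat_matrix  # stays on CPU, however large
        self.device = device

    def __call__(self, node_ids: torch.Tensor) -> torch.Tensor:
        # Gather just the rows needed for this batch, then transfer them.
        return self.feat_matrix[node_ids].to(self.device)


# Usage: build the store once, then query per batch inside the training loop.
num_nodes, n_feats_dim = 100_000, 128
features = torch.randn(num_nodes, n_feats_dim)     # full matrix, CPU only
feat_lookup = FeatureStore(features, device="cpu") # "cuda" if available

batch_nodes = torch.tensor([3, 17, 42])            # nodes in this batch
batch_feats = feat_lookup(batch_nodes)             # only 3 rows are moved
print(batch_feats.shape)                           # torch.Size([3, 128])
```

Since MeanAggregator and Encoder already receive `features` as an argument, swapping in a callable like this might keep the full matrix off the GPU and move only the rows a batch actually touches.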
Any advice, guys?