
Does not support very large dataset #12

@xuChenSJTU

Description


I notice that the feature matrix for all nodes is fed into the model at initialization (encoder, aggregator), which consumes a great deal of memory when the feature matrix is very large.
The corresponding code is here:

agg = MeanAggregator(features, cuda=True)
enc = Encoder(features, n_feats_dim, args.n_hidden, refined_adj_lists, agg, gcn=args.gcn, cuda=args.cuda)

I think this method could support a large feature matrix if it were not implemented this way. Maybe we could instead pass each batch's slice of the feature matrix into the model at each step, as in the sketch below.

This would also make the code more widely applicable. Any advice?
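
Something like the following minimal sketch is what I have in mind. It assumes MeanAggregator and Encoder only ever call features(node_ids) internally, so a lookup callable can stand in for the full-matrix embedding; feat_path and the memory-mapped .npy layout are placeholders I made up, not part of this repo.

import numpy as np
import torch

# Sketch: keep the full feature matrix memory-mapped on disk (or on CPU)
# and hand the model a lookup callable instead of a GPU-resident matrix.
# `feat_path` is a placeholder for wherever the features are stored.
feat_data = np.load(feat_path, mmap_mode="r")  # shape (num_nodes, n_feats_dim)

def batch_features(nodes):
    # Accept a LongTensor (possibly on GPU) or a plain index list.
    idx = nodes.cpu().numpy() if torch.is_tensor(nodes) else np.asarray(nodes)
    rows = feat_data[idx]  # fancy indexing copies only the batch's rows
    return torch.from_numpy(np.ascontiguousarray(rows)).float().cuda()

agg = MeanAggregator(batch_features, cuda=True)
enc = Encoder(batch_features, n_feats_dim, args.n_hidden, refined_adj_lists,
              agg, gcn=args.gcn, cuda=args.cuda)

The trade-off is one host-to-GPU copy per batch, and it only works when the features are fixed inputs rather than trainable embeddings.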
