Ensemble for PyTorch Geometric #105
Hi @ParasKoundal, could you provide the code snippet showing how you use dataloaders with graph data, so that we can take a closer look?
@xuyxu
I have created a custom class to preprocess the dataset before loading it into the dataloader. After that, I followed the steps given in https://ensemble-pytorch.readthedocs.io/en/latest/quick_start.html. For regression I initially tried VotingRegressor, which doesn't work (the error is given in the initial issue). It is similar with the others too.
Could you also provide the full exception traceback? Thanks!
Here's that
This could possibly be a side effect of the commit from issue #75. Will see if this can be fixed in a few days, thanks for reporting @ParasKoundal!
@xuyxu Any update on this?
Hi @ParasKoundal, sorry, I am kind of busy these days and will take a look next weekend.
In torchensemble, at each iteration the input dataloader is expected to return a batch in one of the following forms: `(data, target)`, or `(data_1, data_2, ..., data_n, target)` with several input tensors followed by the target tensor as the last element. The first form is the most widely-used convention for dataloaders (i.e., `data, target = next(iter(dataloader))`).
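For concreteness, here is a minimal sketch of the two accepted forms using dummy tensors (the tensors and loaders below are illustrative only, not part of torchensemble):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

X = torch.randn(32, 4)        # input features
X_extra = torch.randn(32, 6)  # a second set of input features
y = torch.randn(32, 1)        # regression targets

# Form 1: each batch is (data, target)
loader_1 = DataLoader(TensorDataset(X, y), batch_size=8)
data, target = next(iter(loader_1))

# Form 2: each batch is (data_1, data_2, ..., data_n, target),
# i.e. several input tensors with the target as the last element
loader_2 = DataLoader(TensorDataset(X, X_extra, y), batch_size=8)
*data_list, target = next(iter(loader_2))
```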
However, the dataloader in PyTorch Geometric conforms to neither of them: for instance, the loader of `MetaPath2Vec` returns a positive batch and a negative batch, which does not contain a target tensor since the label is simply the index of the batch in the tuple returned. Here is a simple solution, please let me know if it solves your problem on using torchensemble models in PyTorch Geometric. The general idea is to override the `_sample` method of `MetaPath2Vec`:

```python
from typing import List, Tuple

import torch
from torch import Tensor
from torch_geometric.nn import MetaPath2Vec


class CustomMetaPath2Vec(MetaPath2Vec):

    def _sample(self, batch: List[int]) -> Tuple[Tensor, Tensor]:
        if not isinstance(batch, Tensor):
            batch = torch.tensor(batch, dtype=torch.long)
        pos_sample = self._pos_sample(batch)
        neg_sample = self._neg_sample(batch)
        # Concatenate the positive and negative samples into a single data tensor
        data = torch.cat((pos_sample, neg_sample), dim=0)
        # Label positive samples with 1 and negative samples with 0
        target = torch.cat(
            (torch.ones(pos_sample.size(0)), torch.zeros(neg_sample.size(0))),
            dim=0,
        )
        return [data, target]
```

Using this new class, the positive_batch and negative_batch will be concatenated as one tensor `data`, paired with a binary `target` tensor (1 for positive samples, 0 for negative samples), so that each batch matches the `(data, target)` form expected by torchensemble. In addition, some extra steps are required in the forward pass of your model, since the positive and negative samples now arrive concatenated in a single tensor. Looking forward to your kind reply @ParasKoundal
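If it helps, here is a rough usage sketch. It assumes a PyG version whose `loader()` collates batches through `_sample` (as in the snippet above); `edge_index_dict`, `metapath`, and the hyper-parameters are placeholders for your own heterogeneous data, not values from this thread:

```python
# Hypothetical usage: edge_index_dict and metapath come from your own
# heterogeneous graph; the hyper-parameters below are arbitrary.
model = CustomMetaPath2Vec(
    edge_index_dict,
    embedding_dim=64,
    metapath=metapath,
    walk_length=20,
    context_size=5,
    walks_per_node=1,
    num_negative_samples=1,
)
loader = model.loader(batch_size=32, shuffle=True)

# Each batch should now match the first accepted form
data, target = next(iter(loader))
```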
Hi,
I want to use Ensemble-PyTorch with PyTorch-Geometric. However, it doesn't recognize the dataloaders.
Is this under development, or is it a bug?
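For reference, a minimal sketch of the kind of setup being described (the dataset and loader here are placeholders built from a recent PyG release, not the original code):

```python
import torch
from torch_geometric.datasets import FakeDataset
from torch_geometric.loader import DataLoader

dataset = FakeDataset(num_graphs=100)  # any PyG dataset with graph-level targets
train_loader = DataLoader(dataset, batch_size=16, shuffle=True)

batch = next(iter(train_loader))
print(type(batch))  # a single Batch object rather than a (data, target) pair,
                    # so the loader does not match the form torchensemble expects
```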