Handling class imbalance

I am trying binary edge classification, following "5.2 Edge Classification/Regression" in the DGL 0.6.1 documentation.

The data has class imbalance. To handle that, I tried modifying the loss function to include a weight option, like this:

import torch
import torch.nn.functional as F

# pos_weight > 1 up-weights the positive (minority) class in the loss
class_weights = torch.tensor([90.0])

for e in range(epochs):
    ....

    loss = F.binary_cross_entropy_with_logits(
        logits[train_mask],
        edge_labels[train_mask],
        pos_weight=class_weights,
    )

    ....
    loss.backward()
    ....
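On toy inputs the pos_weight argument itself clearly works and scales the loss on positive examples (the logits and labels below are made up, just as a sanity check):

import torch
import torch.nn.functional as F

logits = torch.tensor([1.5, -0.5])   # one positive, one negative example
labels = torch.tensor([1.0, 0.0])

plain = F.binary_cross_entropy_with_logits(logits, labels)
weighted = F.binary_cross_entropy_with_logits(
    logits, labels, pos_weight=torch.tensor([90.0]))
print(plain.item(), weighted.item())   # the weighted loss is much larger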

But in my actual training the class weights don't seem to make any difference. Any idea why that could happen?


Sorry, I don't really get your question. Are you asking how to set weights for each class?

I want to train an edge classifier for two classes. The approach I am currently using was mentioned above: class_weights in the loss function. But somehow it is not affecting the training; the model is overfitting to the majority class.
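One way to see the collapse is to compare the predicted and actual positive rates (val_mask here is a hypothetical held-out mask, analogous to the train_mask above):

import torch

with torch.no_grad():
    # sigmoid(x) > 0.5 is equivalent to x > 0 for raw logits
    preds = (logits[val_mask] > 0).float()
print("predicted positive rate:", preds.mean().item())
print("actual positive rate:", edge_labels[val_mask].mean().item())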

If weighting does not work, you could try down-sampling the majority class.

Thanks for your response, but the problem at hand requires all the data to be considered (it's kind of a time series).

I may not have used the most accurate word here. By down-sampling I mean that during each iteration you could take N minority samples plus N randomly sampled majority samples to form a minibatch. As long as there are enough iterations, all the data will be considered.
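A rough sketch, reusing the names from the snippet above (model, graph, features, and optimizer are placeholders for whatever your training loop already has):

import torch
import torch.nn.functional as F

train_idx = torch.where(train_mask)[0]
pos_idx = train_idx[edge_labels[train_idx] == 1]   # minority class
neg_idx = train_idx[edge_labels[train_idx] == 0]   # majority class
N = len(pos_idx)

for e in range(epochs):
    # all minority edges plus N freshly sampled majority edges form
    # a balanced minibatch; since the majority edges are re-sampled
    # every epoch, all of them eventually get visited
    neg_batch = neg_idx[torch.randperm(len(neg_idx))[:N]]
    batch = torch.cat([pos_idx, neg_batch])

    logits = model(graph, features)   # forward pass as before
    loss = F.binary_cross_entropy_with_logits(
        logits[batch], edge_labels[batch])

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()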

