
how to implement softer-NMS by pytorch version #8

Closed
gittigxuy opened this issue Oct 27, 2018 · 4 comments

@gittigxuy

Is it just a small change to the soft-NMS code? Could you please explain in detail?

@LiDaiY

LiDaiY commented Apr 22, 2019

I looked at the source code; it seems we need to predict the standard deviation (variance) before running NMS, so more changes are needed than just the soft-NMS part.
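On the NMS side, the piece that differs from soft-NMS is the variance-voting update: each kept box is replaced by an average of its high-IoU neighbors, weighted by an IoU kernel divided by the predicted variance. A minimal PyTorch sketch, assuming x1y1x2y2 boxes and per-coordinate predicted variances (the function name and default thresholds here are mine, not from this repo):

```python
import torch

def variance_voting(boxes, variances, iou_thresh=0.5, sigma_t=0.02):
    """Sketch of softer-NMS variance voting.

    boxes:     (N, 4) tensor in x1y1x2y2 format
    variances: (N, 4) predicted per-coordinate variances
    """
    # pairwise IoU between all boxes
    x1 = torch.max(boxes[:, None, 0], boxes[None, :, 0])
    y1 = torch.max(boxes[:, None, 1], boxes[None, :, 1])
    x2 = torch.min(boxes[:, None, 2], boxes[None, :, 2])
    y2 = torch.min(boxes[:, None, 3], boxes[None, :, 3])
    inter = (x2 - x1).clamp(min=0) * (y2 - y1).clamp(min=0)
    areas = (boxes[:, 2] - boxes[:, 0]) * (boxes[:, 3] - boxes[:, 1])
    iou = inter / (areas[:, None] + areas[None, :] - inter)

    refined = boxes.clone()
    for i in range(boxes.shape[0]):
        neighbors = iou[i] > iou_thresh              # boxes overlapping box i
        # weight = IoU kernel / predicted variance
        p = torch.exp(-(1 - iou[i][neighbors]) ** 2 / sigma_t)
        w = p[:, None] / variances[neighbors]
        refined[i] = (w * boxes[neighbors]).sum(0) / w.sum(0)
    return refined
```

Boxes with low predicted variance (high confidence in localization) dominate the weighted average, which is the point of learning the variance in the first place.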

@JoyHuYY1412

I implemented only the KL-loss part in PyTorch.

I haven't tested it, but you can check whether it works. bbox_pred_std is the 'alpha' in the paper.

    def smooth_l1_loss(self, bbox_pred, bbox_targets, bbox_inside_weights, bbox_outside_weights, sigma=1.0):
        sigma_2 = sigma ** 2
        box_diff = bbox_pred - bbox_targets
        in_box_diff = bbox_inside_weights * box_diff
        abs_in_box_diff = torch.abs(in_box_diff)
        # 1 in the quadratic region (|x| < 1/sigma^2), 0 in the linear region
        smoothL1_sign = (abs_in_box_diff < 1. / sigma_2).detach().float()
        loss_box = (torch.pow(in_box_diff, 2) * (sigma_2 / 2.) * smoothL1_sign
                    + (abs_in_box_diff - (0.5 / sigma_2)) * (1. - smoothL1_sign)) * bbox_outside_weights
        return loss_box.sum() / loss_box.shape[0]

    def KL_loss(self, bbox_pred, bbox_targets, bbox_pred_std, bbox_inside_weights, bbox_outside_weights, sigma=1.0):
        # KL loss; bbox_pred_std is alpha = log(sigma^2), the predicted log-variance
        sigma_2 = sigma ** 2
        box_diff = bbox_pred - bbox_targets
        in_box_diff = bbox_inside_weights * box_diff
        bbox_l1abs = torch.abs(in_box_diff)
        # 1 in the quadratic region (|x| < 1/sigma^2), 0 in the linear region
        smoothL1_sign = (bbox_l1abs < 1. / sigma_2).detach().float()
        # smooth-L1 localization term, detached so only alpha gets gradient from it
        bbox_inws = (torch.pow(in_box_diff, 2) * (sigma_2 / 2.) * smoothL1_sign
                    + (bbox_l1abs - (0.5 / sigma_2)) * (1. - smoothL1_sign))
        bbox_inws = bbox_inws.detach().float()  # to be confirmed
        scale = 1
        bbox_pred_std_abs_log = bbox_pred_std * 0.5 * scale   # alpha/2 regularizer
        bbox_pred_std_nexp = torch.exp(-bbox_pred_std)        # exp(-alpha)
        bbox_inws_out = bbox_pred_std_nexp * bbox_inws        # exp(-alpha) * smoothL1, alpha side
        bbox_pred_std_abs_logw = bbox_pred_std_abs_log * bbox_outside_weights
        bbox_pred_std_abs_logwr = torch.mean(bbox_pred_std_abs_logw, dim=0)

        # bbox_pred gets gradient here; exp(-alpha) is detached so std is stopped
        loss_bbox = self.smooth_l1_loss(bbox_pred, bbox_targets, bbox_inside_weights, bbox_pred_std_nexp.detach())
        bbox_pred_std_abs_logw_loss = torch.sum(bbox_pred_std_abs_logwr)
        bbox_inws_out = bbox_inws_out * scale
        bbox_inws_outr = torch.mean(bbox_inws_out, dim=0)
        bbox_pred_std_abs_mulw_loss = torch.sum(bbox_inws_outr)
        return loss_bbox + bbox_pred_std_abs_logw_loss + bbox_pred_std_abs_mulw_loss
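For intuition, the loss above reduces per coordinate to exp(-alpha) * smoothL1(pred - target) + alpha/2, where alpha = log(sigma^2) is the predicted log-variance (bbox_pred_std). A standalone sketch of that closed form (function name is mine, not from the repo):

```python
import torch
import torch.nn.functional as F

def kl_loss_per_coord(pred, target, alpha):
    # exp(-alpha) * smooth_l1(pred - target) + 0.5 * alpha,
    # with alpha the predicted log-variance (bbox_pred_std above)
    sl1 = F.smooth_l1_loss(pred, target, reduction='none')
    return torch.exp(-alpha) * sl1 + 0.5 * alpha
```

With alpha = 0 (unit variance) this falls back to plain smooth L1, which is a quick sanity check; large predicted variance discounts the localization error but pays the alpha/2 penalty.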

@wysot

wysot commented Aug 12, 2019

@JoyHuYY1412
What are bbox_inside_weights and bbox_outside_weights in the KL_loss function?

@ethanhe42
Owner

3rd party reimplementation in PyTorch: ethanhe42/KL-Loss#20
BTW, this repo is deprecated. Please use our new repo: https://github.com/yihui-he/KL-Loss
