TGS-Salt

53rd place (top 2%) solution for the Kaggle TGS Salt Identification Challenge

General

This is a solid solution that reaches the top 2% without any post-processing.
According to the forum, there are many useful tricks: a binary empty vs. non-empty classifier by Heng, snapshot ensembling with a cyclic LR (+0.01 LB) by Peter, and so on.

My solution

Augmentation

I padded images from 101x101 to 128x128, but I did not compare this against simply resizing. Some participants said that resize+flip works better than pad+aug. You can check the code in transform.py.
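Below is a minimal sketch of the padding step, assuming an OpenCV reflective border; the 13/14-pixel split and the border mode are my assumptions, not necessarily what transform.py does.

```python
import cv2
import numpy as np

def pad_to_128(img: np.ndarray) -> np.ndarray:
    """Pad a 101x101 image to 128x128 with a reflective border.
    The 13/14 split and BORDER_REFLECT_101 are assumed; see transform.py."""
    top, left = 13, 13
    bottom, right = 128 - 101 - top, 128 - 101 - left  # 14 each
    return cv2.copyMakeBorder(img, top, bottom, left, right,
                              borderType=cv2.BORDER_REFLECT_101)
```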

pretrained model

I used pretrained resnet34, se-resnext50, and se-resnext101 models as the U-Net encoder. In my experiments, se-resnext50 was the best U-Net encoder, although some top Kagglers said their best model was resnet34.
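As an illustration, here is a minimal sketch of exposing a pretrained resnet34 as U-Net encoder stages, assuming a torchvision backbone; the stage split is a common convention, not necessarily the one used in this repository.

```python
import torch.nn as nn
from torchvision.models import resnet34

class ResNet34Encoder(nn.Module):
    """Expose resnet34 as feature stages that feed U-Net skip connections."""
    def __init__(self, pretrained: bool = True):
        super().__init__()
        net = resnet34(pretrained=pretrained)
        self.stages = nn.ModuleList([
            nn.Sequential(net.conv1, net.bn1, net.relu),  # stride 2
            nn.Sequential(net.maxpool, net.layer1),       # stride 4
            net.layer2,                                   # stride 8
            net.layer3,                                   # stride 16
            net.layer4,                                   # stride 32
        ])

    def forward(self, x):
        feats = []
        for stage in self.stages:
            x = stage(x)
            feats.append(x)
        return feats  # one skip connection per stage for the decoder
```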

scSE and hypercolumn

I used scSE blocks and a hypercolumn in the decoder. This raises the score a little.
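A sketch of both pieces follows: the scSE block is the standard concurrent spatial and channel squeeze-and-excitation, and the hypercolumn simply upsamples the decoder feature maps to a common size and concatenates them. The channel counts and reduction ratio are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SCSEBlock(nn.Module):
    """Concurrent spatial and channel squeeze-and-excitation (scSE)."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Channel SE: global pooling -> bottleneck -> per-channel gates
        self.cse = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Spatial SE: 1x1 conv -> per-pixel gates
        self.sse = nn.Sequential(nn.Conv2d(channels, 1, kernel_size=1),
                                 nn.Sigmoid())

    def forward(self, x):
        return x * self.cse(x) + x * self.sse(x)

def hypercolumn(features, size):
    """Upsample all decoder feature maps to `size` and concatenate them."""
    return torch.cat([F.interpolate(f, size=size, mode="bilinear",
                                    align_corners=False) for f in features],
                     dim=1)
```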

deep supervision

I added a binary empty vs. non-empty classifier as an auxiliary head. Deep supervision helps the model converge quickly and increases the LB score.
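Here is a minimal sketch of such an auxiliary head, assuming it pools the deepest encoder feature map; the attachment point and the loss weight are assumptions, not the repository's actual values.

```python
import torch.nn as nn
import torch.nn.functional as F

class EmptyMaskHead(nn.Module):
    """Auxiliary classifier: predicts whether the mask contains any salt."""
    def __init__(self, in_channels: int):
        super().__init__()
        self.fc = nn.Linear(in_channels, 1)

    def forward(self, deepest_features):
        pooled = F.adaptive_avg_pool2d(deepest_features, 1).flatten(1)
        return self.fc(pooled)  # one logit per image: "contains salt"

# Hypothetical combined objective:
#   loss = seg_loss(mask_logits, masks) + 0.1 * bce(cls_logits, has_salt)
# where 0.1 is an assumed auxiliary weight.
```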

Loss function

In my experiments, training with only lovasz_loss and elu+1 works better than training with BCE in stage #1 and Lovász in stage #2.
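Below is a sketch of the Lovász hinge with the elu+1 variant, adapted from the public reference implementation shared during the competition; the only change from the standard hinge is using elu(errors)+1 in place of relu(errors).

```python
import torch
import torch.nn.functional as F

def lovasz_grad(gt_sorted):
    """Gradient of the Lovász extension w.r.t. sorted errors (Berman et al.)."""
    p = len(gt_sorted)
    gts = gt_sorted.sum()
    intersection = gts - gt_sorted.cumsum(0)
    union = gts + (1.0 - gt_sorted).cumsum(0)
    jaccard = 1.0 - intersection / union
    if p > 1:
        jaccard[1:p] = jaccard[1:p] - jaccard[0:-1]
    return jaccard

def lovasz_hinge_elu(logits, labels):
    """Lovász hinge over flattened 1D logits/labels, with the elu+1 surrogate."""
    signs = 2.0 * labels.float() - 1.0
    errors = 1.0 - logits * signs
    errors_sorted, perm = torch.sort(errors, dim=0, descending=True)
    grad = lovasz_grad(labels.float()[perm])
    # elu+1 instead of relu: smooth and strictly positive
    return torch.dot(F.elu(errors_sorted) + 1.0, grad)
```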

LR_Scheduler

SGDR (stochastic gradient descent with warm restarts) with a cyclic learning rate.
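A minimal sketch using PyTorch's built-in warm-restart scheduler; the cycle length, LR range, and optimizer settings are assumed values, not the ones used here.

```python
import torch
import torch.nn as nn

model = nn.Conv2d(3, 1, kernel_size=3, padding=1)  # stand-in for the real U-Net
optimizer = torch.optim.SGD(model.parameters(), lr=0.01,
                            momentum=0.9, weight_decay=1e-4)
# SGDR: cosine annealing with warm restarts (Loshchilov & Hutter, 2017)
scheduler = torch.optim.lr_scheduler.CosineAnnealingWarmRestarts(
    optimizer, T_0=50, T_mult=1, eta_min=1e-4)

for epoch in range(150):  # three 50-epoch cycles
    # ... run one training epoch here ...
    scheduler.step()
```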

Other excellent Solutions

1st place by b.e.s.
4th place by SeuTao
5th place by AlexenderLiao
8th place by Igor Krashenyi
9th place by tugstugi
11th place by alexisrozhkov
22nd place by Vishunu
27th place by Roman Vlasov
32nd place by Oleg Yaroshevskyy
43rd place by n01z3
