
TinyYolov2Classification

PyTorch implementation of Tiny YOLOv2 classification on ImageNet-1K. I share different LR schedules and their performance metrics below. Results 1-7 use the Adam optimizer with a batch size of 210 and 40 epochs in total; results 8-10 are trained with Adam and a batch size of 64.

Data augmentation and the learning rate change across trials. All trials run on a Ryzen 1700X with a single GTX 1080, using Python 3.6 and PyTorch v1.
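Throughout, top-1/top-5 are the standard ImageNet metrics: the fraction of validation images whose true label is the highest-scoring prediction (top-1) or among the five highest (top-5). A minimal sketch of the computation (the helper name topk_accuracy is mine, not from this repo):

import torch

def topk_accuracy(output, target, ks=(1, 5)):
    # output: (batch, 1000) logits, target: (batch,) class indices
    maxk = max(ks)
    _, pred = output.topk(maxk, dim=1)      # top-k class indices per sample
    correct = pred.eq(target.unsqueeze(1))  # (batch, maxk) hit mask
    return [correct[:, :k].any(dim=1).float().mean().item() for k in ks]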

Result 1

  • best top-5 is around 67%. For data augmentation, only random crop, resize, and horizontal flip are applied (a sketch of such a pipeline follows below).
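The transform code for Results 1 and 2 is not included here; a minimal sketch consistent with the description (the variable name is illustrative, and the 224 crop size is an assumption carried over from the Result 3 code):

from torchvision import transforms

# assumed baseline pipeline: resize, random crop, horizontal flip
baselineTrainTransform = transforms.Compose([
    transforms.Resize(256),             # shorter side to 256 px
    transforms.RandomCrop(224),         # random 224x224 crop
    transforms.RandomHorizontalFlip(),  # flip with p = 0.5
    transforms.ToTensor(),
])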

Result 2

  • best top-5 is around 71.7%. For data augmentation, only random crop, resize, and horizontal flip are applied. The logged top-1/top-5 data is somewhat buggy but shows the general trend.

Result 3

  • best top-5 is around 76.6%. For data augmentation, the following code is applied.
from torch.utils.data import DataLoader, ConcatDataset
from torchvision import datasets, transforms

# pipeline 1: rotation + color jitter with a center crop
trainDataset1 = datasets.ImageFolder(
    args.dir + 'train',
    transforms.Compose([
        transforms.Resize(256),
        transforms.RandomRotation((-10, 10)),
        transforms.ColorJitter(brightness=1, contrast=1, saturation=1, hue=0.3),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
        # normalize,
    ]))

# pipeline 2: plain random crop
trainDataset2 = datasets.ImageFolder(
    args.dir + 'train',
    transforms.Compose([
        transforms.Resize(256),
        transforms.RandomCrop(224),
        transforms.ToTensor(),
        # normalize,
    ]))

# concatenate the two datasets so each epoch draws from both pipelines
trainLoader = DataLoader(
    ConcatDataset([trainDataset1, trainDataset2]),
    batch_size=args.batchSize, shuffle=True,
    num_workers=8, pin_memory=True, sampler=None)
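Note that ConcatDataset doubles the number of samples per epoch, so each "epoch" here is effectively two passes over the training set, one per augmentation pipeline. Also, normalization is commented out in both pipelines, so the network sees raw [0, 1] tensors from ToTensor.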


Result 4

  • best top-5 is around 75.9%. The only difference from Result 3 is that the learning rate is increased to 0.014.

Result 5

  • The difference from the 4th trial is that weight decay is applied to Adam. Not working; the run was terminated.

from torch import optim

optimizer = optim.Adam(net.parameters(), lr=args.lr, weight_decay=0.001)


Result 6

LR = 0.02, weight_decay = 0. Not working; the run was terminated.

Result 7

top-5 = 77.04%, LR = 0.012, with an aggressively changed LR schedule: for the first 50% of iterations the lr is fixed at 0.012, for the next 10% it is increased to 0.024, and cosine annealing is applied for the rest. A sketch of this schedule follows below.
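The schedule code itself is not shown here; a minimal sketch using PyTorch's LambdaLR, assuming the total number of iterations is known up front and that the cosine phase decays from the doubled rate down to zero (that endpoint is my assumption):

import math
from torch.optim.lr_scheduler import LambdaLR

def aggressiveSchedule(optimizer, total_steps):
    # the optimizer's base lr should be 0.012; factors are multiples of it
    def factor(step):
        frac = step / total_steps
        if frac < 0.5:
            return 1.0          # first 50%: hold at base lr (0.012)
        if frac < 0.6:
            return 2.0          # next 10%: double to 0.024
        t = (frac - 0.6) / 0.4  # remaining 40%: cosine anneal 0.024 -> 0
        return 2.0 * 0.5 * (1.0 + math.cos(math.pi * t))
    return LambdaLR(optimizer, lr_lambda=factor)

Call scheduler.step() once per iteration rather than per epoch, so that the step argument counts iterations.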

Result 8

top-5 = 76.13%, LR = 0.008, batch size = 64, with the same aggressive LR schedule.

Result 9

top-5 = 77.81%, LR = 0.006, batch size = 64, with the same aggressive LR schedule. Best result I had; there is some overfitting on the training data.

Result 10

top-5 = 75.5%, LR = 0.012, batch size = 64, with the same aggressive LR schedule.