Bill Psomas committed
Commit 7cdd705 · 1 Parent(s): 137e61b

update readme

Files changed (2)
  1. README.md +10 -0
  2. configs.yaml +5 -4
README.md CHANGED
@@ -23,6 +23,16 @@ ViT-S official model trained on ImageNet-1k for 100 epochs. Self-supervision wit
  SimPool is a simple attention-based pooling method at the end of network, released in this [repository](https://github.com/billpsomas/simpool/).
  Disclaimer: This model card is written by the author of SimPool, i.e. [Bill Psomas](http://users.ntua.gr/psomasbill/).
 
+ ## Evaluation with k-NN
+
+ | k | top1 | top5 |
+ | ------- | ------- | ------- |
+ | 10 | 68.918 | 85.432 |
+ | 20 | 68.738 | 87.278 |
+ | 100 | 66.746 | 88.52 |
+ | 200 | 65.33 | 88.26 |
+
+
  ## BibTeX entry and citation info
 
  ```
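
For context on the new table: accuracies like these are typically produced by a weighted k-NN classifier run on frozen backbone features, in the style of DINO's k-NN evaluation. The snippet below is only a minimal, illustrative sketch of that procedure, not code from this repository or commit; the feature/label tensors and the `knn_classify` helper are hypothetical.

```python
import torch
import torch.nn.functional as F

def knn_classify(train_feats, train_labels, test_feats, k=20, T=0.07, num_classes=1000):
    """Weighted k-NN on L2-normalized features (illustrative sketch only).

    Hypothetical inputs: train_feats [N_train, D], test_feats [N_test, D],
    train_labels [N_train] with integer class ids.
    """
    train_feats = F.normalize(train_feats, dim=1)
    test_feats = F.normalize(test_feats, dim=1)
    # Cosine similarity between each test feature and all training features.
    sims = test_feats @ train_feats.t()                 # [N_test, N_train]
    topk_sims, topk_idx = sims.topk(k, dim=1)           # k nearest neighbors per test sample
    topk_labels = train_labels[topk_idx]                # [N_test, k]
    weights = (topk_sims / T).exp()                     # temperature-scaled similarity weights
    votes = torch.zeros(test_feats.size(0), num_classes, device=test_feats.device)
    votes.scatter_add_(1, topk_labels, weights)         # accumulate weighted votes per class
    return votes.argmax(dim=1)                          # predicted class per test sample
```

Top-1 accuracy is then the fraction of predictions matching the ground-truth test labels, computed separately for each k in the table (10, 20, 100, 200).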
configs.yaml CHANGED
@@ -1,12 +1,12 @@
  arch: vit_small
  backend: nccl
- batch_size_per_gpu: 10
+ batch_size_per_gpu: 100
  clip_grad: 0.0
- data_path: /mnt/data/imagenet/
+ data_path: /path/to/imagenet/
  dist_url: env://
  drop_path_rate: 0.1
  epochs: 100
- eval_every: 5
+ eval_every: 15
  freeze_last_layer: 1
  global_crops_scale:
  - 0.25
@@ -18,6 +18,7 @@ local_crops_scale:
  local_rank: 0
  lr: 0.0005
  min_lr: 1.0e-05
+ mode: official
  momentum_teacher: 0.996
  nb_knn:
  - 10
@@ -28,7 +29,7 @@ norm_last_layer: false
  num_workers: 10
  optimizer: adamw
  out_dim: 65536
- output_dir: /scratch/bill.psomas/logs/temp/
+ output_dir: /path/to/output/
  patch_size: 16
  saveckp_freq: 20
  seed: 0
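
As a side note, configs.yaml is plain YAML, so the updated values can be inspected directly with PyYAML. The short snippet below is only an illustration and is not part of this commit; the comments restate values visible in the diff above.

```python
import yaml  # PyYAML

# Load the training/evaluation configuration shown in the diff above.
with open("configs.yaml") as f:
    cfg = yaml.safe_load(f)

print(cfg["batch_size_per_gpu"])  # 100 after this commit
print(cfg["eval_every"])          # 15; presumably an epoch interval for evaluation, judging by the key name
print(cfg["nb_knn"])              # list of k values (10, ...) used for the k-NN evaluation
```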