
Dropout layer ignored - "Warning: had one or more problems upgrading V1LayerParameter"

I am trying to add a dropout layer based on the ImageNet example (see the code below). However, it seems to be ignored: it is not printed as part of the network when I train the model, and I get the warning message below. I have the latest version of Caffe installed. What do I need to do to include it correctly?

Dropout layer:

layer {
    name: "drop4"
    type: "Dropout"
    bottom: "fc4"
    top: "fc4"
    dropout_param {
        dropout_ratio: 0.5
    }
}

Training log:

I0407 21:05:06.809166 5962 caffe.cpp:117] Use CPU. 
I0407 21:05:06.809468 5962 caffe.cpp:121] Starting Optimization 
I0407 21:05:06.809532 5962 solver.cpp:32] Initializing solver from parameters: 
test_iter: 100 
test_interval: 500 
base_lr: 0.01 
display: 100 
max_iter: 10000 
lr_policy: "fixed" 
gamma: 0.0001 
power: 0.75 
weight_decay: 0.0005 
snapshot: 1000 
snapshot_prefix: "hdf5_classification/data/train" 
solver_mode: CPU 
net: "hdf5_classification/cnn_train.prototxt" 
solver_type: ADAGRAD 
I0407 21:05:06.809566 5962 solver.cpp:70] Creating training net from net file: hdf5_classification/cnn_train.prototxt 
E0407 21:05:06.809836 5962 upgrade_proto.cpp:618] Attempting to upgrade input file specified using deprecated V1LayerParameter: hdf5_classification/cnn_train.prototxt 
E0407 21:05:06.809968 5962 upgrade_proto.cpp:636] Input NetParameter to be upgraded already specifies 'layer' fields; these will be ignored for the upgrade. 
E0407 21:05:06.810035 5962 upgrade_proto.cpp:623] Warning: had one or more problems upgrading V1LayerParameter (see above); continuing anyway. 
I0407 21:05:06.810108 5962 net.cpp:257] The NetState phase (0) differed from the phase (1) specified by a rule in layer data 
I0407 21:05:06.810132 5962 net.cpp:257] The NetState phase (0) differed from the phase (1) specified by a rule in layer accuracy 
I0407 21:05:06.810143 5962 net.cpp:257] The NetState phase (0) differed from the phase (1) specified by a rule in layer pred 
I0407 21:05:06.810266 5962 net.cpp:42] Initializing net from parameters: 
name: "CDR-CNN" 
state { 
    phase: TRAIN 
} 
layer { 
    name: "data" 
    type: "HDF5Data" 
    top: "data" 
    top: "label" 
    include { 
    phase: TRAIN 
    } 
    hdf5_data_param { 
    source: "hdf5_classification/data/train.txt" 
    batch_size: 10 
    } 
} 
layer { 
    name: "conv1" 
    type: "Convolution" 
    bottom: "data" 
    top: "conv1" 
    param { 
    lr_mult: 1 
    } 
    param { 
    lr_mult: 2 
    } 
    convolution_param { 
    num_output: 16 
    weight_filler { 
     type: "gaussian" 
     std: 0.01 
    } 
    bias_filler { 
     type: "constant" 
    } 
    kernel_h: 1 
    kernel_w: 3 
    stride_h: 1 
    stride_w: 1 
    } 
} 
layer { 
    name: "relu1" 
    type: "ReLU" 
    bottom: "conv1" 
    top: "conv1" 
} 
layer { 
    name: "pool1" 
    type: "Pooling" 
    bottom: "conv1" 
    top: "pool1" 
    pooling_param { 
    pool: MAX 
    kernel_h: 1 
    kernel_w: 2 
    stride_h: 1 
    stride_w: 2 
    } 
} 
layer { 
    name: "conv2" 
    type: "Convolution" 
    bottom: "pool1" 
    top: "conv2" 
    param { 
    lr_mult: 1 
    } 
    param { 
    lr_mult: 2 
    } 
    convolution_param { 
    num_output: 20 
    weight_filler { 
     type: "gaussian" 
     std: 0.01 
    } 
    bias_filler { 
     type: "constant" 
    } 
    kernel_h: 1 
    kernel_w: 11 
    stride_h: 1 
    stride_w: 1 
    } 
} 
layer { 
    name: "relu2" 
    type: "ReLU" 
    bottom: "conv2" 
    top: "conv2" 
} 
layer { 
    name: "conv3" 
    type: "Convolution" 
    bottom: "conv2" 
    top: "conv3" 
    param { 
    lr_mult: 1 
    } 
    param { 
    lr_mult: 2 
    } 
    convolution_param { 
    num_output: 120 
    weight_filler { 
     type: "gaussian" 
     std: 0.01 
    } 
    bias_filler { 
     type: "constant" 
    } 
    kernel_h: 7 
    kernel_w: 1 
    stride_h: 1 
    stride_w: 1 
    } 
} 
layer { 
    name: "relu3" 
    type: "ReLU" 
    bottom: "conv3" 
    top: "conv3" 
} 
layer { 
    name: "fc4" 
    type: "InnerProduct" 
    bottom: "conv3" 
    top: "fc4" 
    param { 
    lr_mult: 1 
    decay_mult: 1 
    } 
    param { 
    lr_mult: 2 
    decay_mult: 0 
    } 
    inner_product_param { 
    num_output: 84 
    weight_filler { 
     type: "gaussian" 
     std: 0.01 
    } 
    bias_filler { 
     type: "constant" 
     value: 0 
    } 
    } 
} 
layer { 
    name: "relu4" 
    type: "ReLU" 
    bottom: "fc4" 
    top: "fc4" 
} 
layer { 
    name: "fc5" 
    type: "InnerProduct" 
    bottom: "fc4" 
    top: "fc5" 
    param { 
    lr_mult: 1 
    decay_mult: 1 
    } 
    param { 
    lr_mult: 2 
    decay_mult: 0 
    } 
    inner_product_param { 
    num_output: 2 
    weight_filler { 
     type: "gaussian" 
     std: 0.01 
    } 
    bias_filler { 
     type: "constant" 
     value: 0 
    } 
    } 
} 
layer { 
    name: "loss" 
    type: "SoftmaxWithLoss" 
    bottom: "fc5" 
    bottom: "label" 
    top: "loss" 
    include { 
    phase: TRAIN 
    } 
} 
I0407 21:05:06.810355 5962 layer_factory.hpp:74] Creating layer data 
I0407 21:05:06.811199 5962 net.cpp:84] Creating Layer data 
I0407 21:05:06.811228 5962 net.cpp:338] data -> data 
I0407 21:05:06.811259 5962 net.cpp:338] data -> label 
I0407 21:05:06.811285 5962 net.cpp:113] Setting up data 
I0407 21:05:06.811301 5962 hdf5_data_layer.cpp:80] Loading list of HDF5 filenames from: hdf5_classification/data/train.txt 
I0407 21:05:06.811357 5962 hdf5_data_layer.cpp:94] Number of HDF5 files: 1 
I0407 21:05:06.853078 5962 net.cpp:120] Top shape: 10 14 7 24 (23520) 
I0407 21:05:06.853113 5962 net.cpp:120] Top shape: 10 (10) 
I0407 21:05:06.853132 5962 layer_factory.hpp:74] Creating layer conv1 
I0407 21:05:06.853159 5962 net.cpp:84] Creating Layer conv1 
I0407 21:05:06.853175 5962 net.cpp:380] conv1 <- data 
I0407 21:05:06.853199 5962 net.cpp:338] conv1 -> conv1 
I0407 21:05:06.853221 5962 net.cpp:113] Setting up conv1 
I0407 21:05:06.853638 5962 net.cpp:120] Top shape: 10 16 7 22 (24640) 
I0407 21:05:06.853665 5962 layer_factory.hpp:74] Creating layer relu1 
I0407 21:05:06.853684 5962 net.cpp:84] Creating Layer relu1 
I0407 21:05:06.853698 5962 net.cpp:380] relu1 <- conv1 
I0407 21:05:06.853713 5962 net.cpp:327] relu1 -> conv1 (in-place) 
I0407 21:05:06.853727 5962 net.cpp:113] Setting up relu1 
I0407 21:05:06.853744 5962 net.cpp:120] Top shape: 10 16 7 22 (24640) 
I0407 21:05:06.853757 5962 layer_factory.hpp:74] Creating layer pool1 
I0407 21:05:06.853772 5962 net.cpp:84] Creating Layer pool1 
I0407 21:05:06.853785 5962 net.cpp:380] pool1 <- conv1 
I0407 21:05:06.853799 5962 net.cpp:338] pool1 -> pool1 
I0407 21:05:06.853814 5962 net.cpp:113] Setting up pool1 
I0407 21:05:06.853837 5962 net.cpp:120] Top shape: 10 16 7 11 (12320) 
I0407 21:05:06.853849 5962 layer_factory.hpp:74] Creating layer conv2 
I0407 21:05:06.853867 5962 net.cpp:84] Creating Layer conv2 
I0407 21:05:06.853878 5962 net.cpp:380] conv2 <- pool1 
I0407 21:05:06.853893 5962 net.cpp:338] conv2 -> conv2 
I0407 21:05:06.853909 5962 net.cpp:113] Setting up conv2 
I0407 21:05:06.854030 5962 net.cpp:120] Top shape: 10 20 7 1 (1400) 
I0407 21:05:06.854048 5962 layer_factory.hpp:74] Creating layer relu2 
I0407 21:05:06.854063 5962 net.cpp:84] Creating Layer relu2 
I0407 21:05:06.854074 5962 net.cpp:380] relu2 <- conv2 
I0407 21:05:06.854087 5962 net.cpp:327] relu2 -> conv2 (in-place) 
I0407 21:05:06.854100 5962 net.cpp:113] Setting up relu2 
I0407 21:05:06.854113 5962 net.cpp:120] Top shape: 10 20 7 1 (1400) 
I0407 21:05:06.854125 5962 layer_factory.hpp:74] Creating layer conv3 
I0407 21:05:06.854140 5962 net.cpp:84] Creating Layer conv3 
I0407 21:05:06.854152 5962 net.cpp:380] conv3 <- conv2 
I0407 21:05:06.854166 5962 net.cpp:338] conv3 -> conv3 
I0407 21:05:06.854179 5962 net.cpp:113] Setting up conv3 
I0407 21:05:06.854748 5962 net.cpp:120] Top shape: 10 120 1 1 (1200) 
I0407 21:05:06.854771 5962 layer_factory.hpp:74] Creating layer relu3 
I0407 21:05:06.854785 5962 net.cpp:84] Creating Layer relu3 
I0407 21:05:06.854797 5962 net.cpp:380] relu3 <- conv3 
I0407 21:05:06.854811 5962 net.cpp:327] relu3 -> conv3 (in-place) 
I0407 21:05:06.854825 5962 net.cpp:113] Setting up relu3 
I0407 21:05:06.854838 5962 net.cpp:120] Top shape: 10 120 1 1 (1200) 
I0407 21:05:06.854851 5962 layer_factory.hpp:74] Creating layer fc4 
I0407 21:05:06.854871 5962 net.cpp:84] Creating Layer fc4 
I0407 21:05:06.854883 5962 net.cpp:380] fc4 <- conv3 
I0407 21:05:06.854897 5962 net.cpp:338] fc4 -> fc4 
I0407 21:05:06.854912 5962 net.cpp:113] Setting up fc4 
I0407 21:05:06.855232 5962 net.cpp:120] Top shape: 10 84 (840) 
I0407 21:05:06.855252 5962 layer_factory.hpp:74] Creating layer relu4 
I0407 21:05:06.855267 5962 net.cpp:84] Creating Layer relu4 
I0407 21:05:06.855278 5962 net.cpp:380] relu4 <- fc4 
I0407 21:05:06.855406 5962 net.cpp:327] relu4 -> fc4 (in-place) 
I0407 21:05:06.855432 5962 net.cpp:113] Setting up relu4 
I0407 21:05:06.855447 5962 net.cpp:120] Top shape: 10 84 (840) 
I0407 21:05:06.855458 5962 layer_factory.hpp:74] Creating layer fc5 
I0407 21:05:06.855582 5962 net.cpp:84] Creating Layer fc5 
I0407 21:05:06.855614 5962 net.cpp:380] fc5 <- fc4 
I0407 21:05:06.855631 5962 net.cpp:338] fc5 -> fc5 
I0407 21:05:06.855648 5962 net.cpp:113] Setting up fc5 
I0407 21:05:06.855674 5962 net.cpp:120] Top shape: 10 2 (20) 
I0407 21:05:06.855690 5962 layer_factory.hpp:74] Creating layer loss 
I0407 21:05:06.855710 5962 net.cpp:84] Creating Layer loss 
I0407 21:05:06.855721 5962 net.cpp:380] loss <- fc5 
I0407 21:05:06.855734 5962 net.cpp:380] loss <- label 
I0407 21:05:06.855751 5962 net.cpp:338] loss -> loss 
I0407 21:05:06.855768 5962 net.cpp:113] Setting up loss 
I0407 21:05:06.855785 5962 layer_factory.hpp:74] Creating layer loss 
I0407 21:05:06.855813 5962 net.cpp:120] Top shape: (1) 
I0407 21:05:06.855825 5962 net.cpp:122]  with loss weight 1 
I0407 21:05:06.855854 5962 net.cpp:167] loss needs backward computation. 
I0407 21:05:06.855865 5962 net.cpp:167] fc5 needs backward computation. 
I0407 21:05:06.855877 5962 net.cpp:167] relu4 needs backward computation. 
I0407 21:05:06.855890 5962 net.cpp:167] fc4 needs backward computation. 
I0407 21:05:06.855901 5962 net.cpp:167] relu3 needs backward computation. 
I0407 21:05:06.855912 5962 net.cpp:167] conv3 needs backward computation. 
I0407 21:05:06.855924 5962 net.cpp:167] relu2 needs backward computation. 
I0407 21:05:06.855937 5962 net.cpp:167] conv2 needs backward computation. 
I0407 21:05:06.855947 5962 net.cpp:167] pool1 needs backward computation. 
I0407 21:05:06.855959 5962 net.cpp:167] relu1 needs backward computation. 
I0407 21:05:06.855970 5962 net.cpp:167] conv1 needs backward computation. 
I0407 21:05:06.855983 5962 net.cpp:169] data does not need backward computation. 
I0407 21:05:06.855994 5962 net.cpp:205] This network produces output loss 
I0407 21:05:06.856011 5962 net.cpp:447] Collecting Learning Rate and Weight Decay. 
I0407 21:05:06.856029 5962 net.cpp:217] Network initialization done. 
I0407 21:05:06.856041 5962 net.cpp:218] Memory required for data: 368124 

Answer:

I had the same problem and stumbled upon this question:

What does 'Attempting to upgrade input file specified using deprecated transformation parameters' mean?

As far as I can tell, when Caffe tries to upgrade your layer definitions from the old syntax, it finds that the file already contains layers written in the new protobuf syntax; those new-style entries are ignored during the upgrade, which is why your Dropout layer disappears. I'm not 100% sure about this, though; it is just my guess from a quick look at the Caffe code.
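To make that mix concrete, here is a hypothetical minimal prototxt (an illustration only, not the asker's actual file) that combines both styles; a file like this is what produces the "already specifies 'layer' fields; these will be ignored for the upgrade" message seen in the log above:

# Hypothetical minimal example of mixing the two syntaxes (illustration only).
name: "MixedNet"
layers {                      # old V1LayerParameter syntax: plural "layers", enum type
    name: "fc4"
    type: INNER_PRODUCT
    bottom: "conv3"
    top: "fc4"
    inner_product_param {
        num_output: 84
    }
}
layer {                       # new LayerParameter syntax: singular "layer", string type
    name: "drop4"
    type: "Dropout"
    bottom: "fc4"
    top: "fc4"
    dropout_param {
        dropout_ratio: 0.5
    }
}

Keeping every layer in a single style (either all "layers" or all "layer") avoids the problem.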

For me, the solution was to change it as follows:

layers {
    name: "drop4"
    type: DROPOUT
    bottom: "fc4"
    top: "fc4"
    dropout_param {
        dropout_ratio: 0.5
    }
}

Hope this works =)


Alternatively, you can upgrade your prototxt into a tmp file (using the upgrade utility) and then add the dropout layer in the new syntax. In any case, it is better to move to the new format than to drop back to the old one; that could cause compatibility problems in the future... – Shai
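As a sketch of that suggestion: Caffe ships an upgrade_net_proto_text tool that rewrites an old-style prototxt into the current syntax. The paths below are assumptions about a default build layout, not something stated in the thread:

# Sketch only: the binary location assumes Caffe was built into ./build (adjust to your setup).
./build/tools/upgrade_net_proto_text \
    hdf5_classification/cnn_train.prototxt \
    hdf5_classification/cnn_train_upgraded.prototxt
# The upgraded file can then be edited to add the Dropout layer in the new
# layer { type: "Dropout" ... } syntax shown at the top of the question.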