
Is it possible to convert tensorflow code to theano code?

I have a function that uses some tensorflow functions. I need this function in Theano, because on the platform where I want to use this code only Theano is installed, not tensorflow. I mostly work with Keras, so tensorflow is fairly cryptic to me. The function looks like this:

import tensorflow as tf
from keras import backend as K

class WeightedBinaryCrossEntropy(object):

    def __init__(self, pos_ratio):
        neg_ratio = 1. - pos_ratio
        self.pos_ratio = tf.constant(pos_ratio, tf.float32)
        self.weights = tf.constant(neg_ratio / pos_ratio, tf.float32)
        self.__name__ = "weighted_binary_crossentropy({0})".format(pos_ratio)

    def __call__(self, y_true, y_pred):
        return self.weighted_binary_crossentropy(y_true, y_pred)

    def weighted_binary_crossentropy(self, y_true, y_pred):
        # Transform the predicted probabilities back to logits
        epsilon = tf.convert_to_tensor(K.epsilon(), y_pred.dtype.base_dtype)
        y_pred = tf.clip_by_value(y_pred, epsilon, 1 - epsilon)
        y_pred = tf.log(y_pred / (1 - y_pred))

        cost = tf.nn.weighted_cross_entropy_with_logits(y_true, y_pred, self.weights)
        return K.mean(cost * self.pos_ratio, axis=-1)

model.compile(loss=WeightedBinaryCrossEntropy(0.05), optimizer=optimizer, metrics=['accuracy']) 

Installing tensorflow on the platform is not possible. I got the code from here: https://github.com/fchollet/keras/issues/2115

Are there functions in Theano that work like these functions in Tensorflow?

Answer

Maybe you should use only keras and keep the model portable:
(Keras backend functions: https://keras.io/backend/)

from keras import backend as K

class WeightedBinaryCrossEntropy(object):

    def __init__(self, pos_ratio):
        neg_ratio = 1. - pos_ratio
        self.pos_ratio = K.constant([pos_ratio])
        self.weights = K.constant([neg_ratio / pos_ratio])
        self.__name__ = "weighted_binary_crossentropy({0})".format(pos_ratio)

    def __call__(self, y_true, y_pred):
        return self.weighted_binary_crossentropy(y_true, y_pred)

    def weighted_binary_crossentropy(self, y_true, y_pred):
        # Transform the predicted probabilities back to logits
        epsilon = K.epsilon()
        y_pred = K.clip(y_pred, epsilon, 1 - epsilon)
        y_pred = K.log(y_pred / (1 - y_pred))

        # For the crossentropy you can maybe (please double-check)
        # use K.binary_crossentropy and multiply by the weights afterwards:
        cost = self.approach1(y_true, y_pred)

        # Or you can reproduce the same formula as in tensorflow:
        # https://www.tensorflow.org/api_docs/python/tf/nn/weighted_cross_entropy_with_logits
        # (keep only one of the two assignments; this one overwrites the first)
        cost = self.approach2(y_true, y_pred)

        return K.mean(cost * self.pos_ratio, axis=-1)

    # I use a similar thing in my code, but I'm not sure my weights are
    # calculated the same way you do
    def approach1(self, y_true, y_pred):
        weights = (y_true * self.weights) + 1  # weights applied only to positive values
        return K.binary_crossentropy(y_true, y_pred, from_logits=True) * weights

    # seems more trustworthy, since it's exactly the tensorflow formula
    def approach2(self, y_true, y_pred):
        posPart = y_true * (-K.log(K.sigmoid(y_pred))) * self.weights
        negPart = (1 - y_true) * (-K.log(1 - K.sigmoid(y_pred)))

        return posPart + negPart


model.compile(loss=WeightedBinaryCrossEntropy(0.05), optimizer=optimizer, metrics=['accuracy']) 
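
A quick way to sanity-check the loss on a small made-up batch before training (a minimal sketch; the numbers are arbitrary, and K.eval works the same way under the Theano and TensorFlow backends):

import numpy as np
from keras import backend as K

loss = WeightedBinaryCrossEntropy(0.05)
y_true = K.constant(np.array([[0., 1., 1., 0.]]))
y_pred = K.constant(np.array([[0.1, 0.9, 0.4, 0.2]]))
print(K.eval(loss(y_true, y_pred)))  # per-sample weighted crossentropy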

Thank you, it will take some time to test it. After testing I will then probably accept it. –
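
To come back to the original question ("are there functions in Theano that work like these functions in Tensorflow?"): tf.clip_by_value and tf.log map directly to T.clip and T.log, but tf.nn.weighted_cross_entropy_with_logits has no single Theano equivalent and has to be written out from its documented formula, exactly as approach2 does. A rough, untested sketch of a pure-Theano version (the function name and default epsilon are only illustrative):

import theano.tensor as T

def weighted_binary_crossentropy_theano(y_true, y_pred, weights, pos_ratio, epsilon=1e-7):
    # Transform probabilities to logits: T.clip ~ tf.clip_by_value, T.log ~ tf.log
    y_pred = T.clip(y_pred, epsilon, 1 - epsilon)
    y_pred = T.log(y_pred / (1 - y_pred))
    # weighted_cross_entropy_with_logits written out by hand:
    # weights * targets * -log(sigmoid(logits)) + (1 - targets) * -log(1 - sigmoid(logits))
    cost = (weights * y_true * -T.log(T.nnet.sigmoid(y_pred))
            + (1 - y_true) * -T.log(1 - T.nnet.sigmoid(y_pred)))
    return T.mean(cost * pos_ratio, axis=-1)

Wrapped in a closure (or a class like the one above) that fixes weights and pos_ratio, it can be passed to model.compile in the same way when Keras runs on the Theano backend.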