
Reshape an intermediate layer in Keras

I have a problem reshaping an intermediate layer that comes after a Convolution1D layer. The output of the convolution layer is (None, 15, 30) and I want to reshape it to (None, 15, 30, 1). Here is the code snippet:

model = Sequential() 
model.add(Reshape((-1, 15), input_shape=(3, 5))) 
model.add(Flatten()) 
model.add(Embedding(20, 30, input_length=5, 
        embeddings_initializer=initializers.lecun_uniform())) 
model.add(Dropout(0.5)) 
model.add(Conv1D(30, 3, padding='same', activation='tanh')) 
model.add(Reshape((15, 30, 1))) 

And when I run this model, I get the following error:

Traceback (most recent call last): 
    File "main.py", line 98, in <module> 
    model_char.add(Reshape((15, 30, 1))) 
    File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/sites-packages/keras/models.py" line 475, in add 
    output_tensor = layer(self.outputs[0]) 
    File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/keras/engine/topology.py", line 602, in __call__ 
    output = self.call(inputs, **kwargs) 
    File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/keras/layers/core.py", line 391, in call 
    target_shape = self.compute_output_shape(input_shape)[1:] 
    File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/keras/layers/core.py", line 376, in compute_output_shape 
    input_shape[1:], self.target_shape) 
    File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/keras/layers/core.py", line 367, in _fix_unknown_dimension 
    raise ValueError(msg) 
ValueError: total size of new array must be unchanged 

I don't understand why I get this error, because the total number of elements stays the same: 15 * 30 = 15 * 30 * 1.

To make it clearer, here is an example. I want to turn this:

[ 
    [0.09311153 -0.05404745 0.50606126 -0.53520441 0.41794425 0.25171867 0.37156609 -0.26078582 -0.52433187 0.30997691 -0.95317268 0.94261593 0.56365687 0.21063149 0.13678613 0.52069747 -1.2293686 0.21358994 0.026743541 -0.69850469 0.0955583 -0.74581879 0.087773092 0.32595187 0.36514446 -0.49256474 0.053979304 -0.20099731 -0.71489674 0.36982241] 
    [-0.37371859 1.3228136 -1.7292237 0.29752117 0.24284564 -0.052321397 0.029835917 -0.80207682 0.890297 -0.73524159 -0.34133136 -0.35529694 -0.68623751 0.135149 0.66752154 -1.0757091 1.8932092 0.69630879 -0.0077456506 0.569112 -0.23351294 0.45924008 0.66961372 0.27655622 -1.4171039 -1.0308323 -0.012516789 0.23766349 0.23518381 -1.0214807] 
    [0.18247549 -0.78545868 -0.29058188 0.310718 0.62216043 0.91871214 0.88818812 0.98373055 -0.24269108 -0.90911531 0.4302102 -1.5717789 0.47344795 0.023262642 -0.66633916 0.020208906 0.23528977 0.007890353 -0.96216047 1.5474852 0.64963603 -0.61376446 -0.7431891 -0.9098407 -1.178241 0.029418966 -0.36299181 -0.18588017 -0.69983339 -1.2403098] 
    [-0.13907124 0.77455968 0.39949748 0.788373 -0.72379655 0.47056541 -0.10293989 -0.15979311 -0.13747469 0.49540848 1.3849578 -1.3091069 -1.5117174 0.10640619 -0.27224991 0.091859169 0.67863154 0.8041755 -0.950194 -0.56718034 -0.55053443 -0.31328604 -0.50172609 0.54067135 0.89353716 0.2819832 -1.4124448 -0.992626 0.27674574 -0.80656594] 
    [0.56941384 -0.52307433 -0.0014557609 -0.66424996 -0.1377265 -0.18622889 -0.27893946 -0.66485214 0.12398232 0.71157557 -0.052687954 -0.41807997 -0.86772496 -0.29088372 0.077019915 -0.1309973 -0.52031749 0.064666867 0.66582745 -0.38266385 0.11579557 -0.65265769 -0.92624879 0.30905446 0.2472447 -0.10585186 -0.4744457 -1.0572212 0.25316724 -0.56545675] 
    [-0.77612084 0.043356732 0.13998967 -0.19222628 0.03896746 0.2462908 -0.8805331 -0.98223174 -0.34058484 0.20567864 0.72753137 -0.33709928 -0.83310735 0.028127206 -0.49332875 0.37826887 0.5161047 -0.49551058 -1.2524649 0.62712663 -0.68346339 -0.53607458 -0.51002711 -0.081255086 0.16448797 0.052351121 -0.4282217 0.025603807 -0.0056298361 -0.70383257] 
    [-0.61657017 1.8046215 -0.63392532 -0.14126575 0.2719841 -0.49519449 -0.44583827 -0.63153619 0.36972648 0.07364779 -0.048049126 -0.74233592 0.67249382 1.0938615 -0.95455551 -0.018818242 -0.20023341 -0.0019912687 -0.29854766 -0.79328007 -1.489653 0.26387224 -0.26599407 0.50500441 0.97010094 0.28547913 -0.21993566 -0.32863438 0.55618548 -0.17297709] 
    [0.44050878 -0.17070045 -0.15680149 0.16606471 0.26279065 -0.29395095 -0.15396446 -0.47258478 -1.0586354 -0.65733254 -0.18761018 -0.57519519 0.81053412 0.59430808 0.042727698 1.1076951 -0.89282864 -0.027213758 0.96365273 0.40794414 -0.39281371 1.4093404 0.69447654 0.26741418 0.44905421 0.20673333 0.58201146 0.5447579 -0.77610832 0.50347972] 
    [0.23904422 1.8198916 -0.32127005 0.19016732 0.34059802 0.23843262 1.1336677 -0.87515587 1.0113547 0.12842716 -0.81049639 -1.4373963 0.49592969 0.73143089 -0.68618989 1.3859445 0.72093153 0.43309888 -1.3210315 0.2054542 0.048558954 -0.53217089 0.20947145 -0.4713847 0.50018942 -0.97975427 -0.16554829 -0.27636209 0.77673912 -0.17956567] 
    [0.10899831 1.0184085 -0.38978073 0.30730557 -0.69186461 0.49880576 0.63833827 -0.25505236 0.03314859 0.24156919 1.5816804 -0.43136993 -1.1080316 0.544221 0.029394349 0.37545529 0.045701563 -0.089494362 -0.65665811 -0.023093835 -1.4809865 -0.6045422 -0.32934037 0.52244681 -0.60119945 0.51370251 -0.80937928 0.47565889 0.85675395 1.063754] 
    [1.0831649 0.50079244 -0.30921742 -1.0280501 -0.23224308 -0.37653229 1.2576953 -1.2339777 -0.45807734 0.61158192 -0.09404964 -0.39444873 0.73790514 -0.31239685 -0.24413933 -0.567924 0.073173814 -0.64984584 0.64845377 -0.28436229 -1.5736789 -1.0005355 -0.70931745 0.871858 0.890601 0.2609764 -0.198215 -0.29297754 -0.2499871 0.038447738] 
    [1.1345748 0.62565184 -0.23180695 -1.0716654 -0.18751761 0.12118033 0.19484697 -1.2859532 0.59767735 0.57796526 -0.85377008 -0.77281773 1.1327845 -0.25830373 -0.43259808 0.41908079 0.023618467 -1.2751251 0.36925897 -0.0098595042 -0.31753924 -0.20560052 -0.43478742 0.97803283 0.38101795 -0.33882758 0.624989 -0.053847663 0.19007191 0.27741989] 
    [0.54994547 -0.94156992 -0.45858866 -0.045031343 -1.4409146 0.38791916 0.26716572 -0.5124234 -0.34381375 -0.7497589 -0.18270281 -0.56540006 -0.37611061 0.18021639 0.48165894 0.20478815 0.18975782 -0.61889905 0.7226029 -0.09786211 -0.14823321 -0.52758443 0.35245389 0.1051857 0.027430706 -0.11386452 -0.013007465 0.37519914 -0.941329 -0.70005715] 
    [0.39274859 -0.39766663 -0.70382959 -0.42397216 -1.480526 -0.68001819 0.3347947 -0.20469595 0.05962114 0.66365397 -0.77448803 -0.55189139 0.60185546 0.60116309 -0.17462984 0.32057542 -0.20577009 -1.1281829 0.027793719 -0.0533368 0.95331073 -1.3201096 0.15608752 0.056847919 0.39634478 0.1054792 0.50742185 -0.0057650856 0.25442541 0.54575342] 
    [0.8295418 0.65716922 0.32987151 -0.51133007 -0.959789 0.32615584 -0.82939672 -0.76657677 -0.5497672 -0.17259806 0.34265092 0.02187328 -0.33412746 -0.24402547 0.22011535 -0.052487958 -0.17772579 -0.066493936 -0.09538275 -0.76245272 -0.14960127 0.3900359 0.38033971 1.2810829 0.35877633 -0.19256417 0.72800469 0.25378975 -0.11317521 0.11448503] 
] 

Into this:

[ 
    [[0.09311153][-0.05404745][0.50606126][-0.53520441][0.41794425][0.25171867][0.37156609][-0.26078582][-0.52433187][0.30997691][-0.95317268][0.94261593][0.56365687][0.21063149][0.13678613][0.52069747][-1.2293686][0.21358994][0.026743541][-0.69850469][0.0955583][-0.74581879][0.087773092][0.32595187][0.36514446][-0.49256474][0.053979304][-0.20099731][-0.71489674][0.36982241]] 
    [[-0.37371859][1.3228136][-1.7292237][0.29752117][0.24284564][-0.052321397][0.029835917][-0.80207682][0.890297][-0.73524159][-0.34133136][-0.35529694][-0.68623751][0.135149][0.66752154][-1.0757091][1.8932092][0.69630879][-0.0077456506][0.569112][-0.23351294][0.45924008][0.66961372][0.27655622][-1.4171039][-1.0308323][-0.012516789][0.23766349][0.23518381][-1.0214807]] 
    [[0.18247549][-0.78545868][-0.29058188][0.310718][0.62216043][0.91871214][0.88818812][0.98373055][-0.24269108][-0.90911531][0.4302102][-1.5717789][0.47344795][0.023262642][-0.66633916][0.020208906][0.23528977][0.007890353][-0.96216047][1.5474852][0.64963603][-0.61376446][-0.7431891][-0.9098407][-1.178241][0.029418966][-0.36299181][-0.18588017][-0.69983339][-1.2403098]] 
    [[-0.13907124][0.77455968][0.39949748][0.788373][-0.72379655][0.47056541][-0.10293989][-0.15979311][-0.13747469][0.49540848][1.3849578][-1.3091069][-1.5117174][0.10640619][-0.27224991][0.091859169][0.67863154][0.8041755][-0.950194][-0.56718034][-0.55053443][-0.31328604][-0.50172609][0.54067135][0.89353716][0.2819832][-1.4124448][-0.992626][0.27674574][-0.80656594]] 
    [[0.56941384][-0.52307433][-0.0014557609][-0.66424996][-0.1377265][-0.18622889][-0.27893946][-0.66485214][0.12398232][0.71157557][-0.052687954][-0.41807997][-0.86772496][-0.29088372][0.077019915][-0.1309973][-0.52031749][0.064666867][0.66582745][-0.38266385][0.11579557][-0.65265769][-0.92624879][0.30905446][0.2472447][-0.10585186][-0.4744457][-1.0572212][0.25316724][-0.56545675]] 
    [[-0.77612084][0.043356732][0.13998967][-0.19222628][0.03896746][0.2462908][-0.8805331][-0.98223174][-0.34058484][0.20567864][0.72753137][-0.33709928][-0.83310735][0.028127206][-0.49332875][0.37826887][0.5161047][-0.49551058][-1.2524649][0.62712663][-0.68346339][-0.53607458][-0.51002711][-0.081255086][0.16448797][0.052351121][-0.4282217][0.025603807][-0.0056298361][-0.70383257]] 
    [[-0.61657017][1.8046215][-0.63392532][-0.14126575][0.2719841][-0.49519449][-0.44583827][-0.63153619][0.36972648][0.07364779][-0.048049126][-0.74233592][0.67249382][1.0938615][-0.95455551][-0.018818242][-0.20023341][-0.0019912687][-0.29854766][-0.79328007][-1.489653][0.26387224][-0.26599407][0.50500441][0.97010094][0.28547913][-0.21993566][-0.32863438][0.55618548][-0.17297709]] 
    [[0.44050878][-0.17070045][-0.15680149][0.16606471][0.26279065][-0.29395095][-0.15396446][-0.47258478][-1.0586354][-0.65733254][-0.18761018][-0.57519519][0.81053412][0.59430808][0.042727698][1.1076951][-0.89282864][-0.027213758][0.96365273][0.40794414][-0.39281371][1.4093404][0.69447654][0.26741418][0.44905421][0.20673333][0.58201146][0.5447579][-0.77610832][0.50347972]] 
    [[0.23904422][1.8198916][-0.32127005][0.19016732][0.34059802][0.23843262][1.1336677][-0.87515587][1.0113547][0.12842716][-0.81049639][-1.4373963][0.49592969][0.73143089][-0.68618989][1.3859445][0.72093153][0.43309888][-1.3210315][0.2054542][0.048558954][-0.53217089][0.20947145][-0.4713847][0.50018942][-0.97975427][-0.16554829][-0.27636209][0.77673912][-0.17956567]] 
    [[0.10899831][1.0184085][-0.38978073][0.30730557][-0.69186461][0.49880576][0.63833827][-0.25505236][0.03314859][0.24156919][1.5816804][-0.43136993][-1.1080316][0.544221][0.029394349][0.37545529][0.045701563][-0.089494362][-0.65665811][-0.023093835][-1.4809865][-0.6045422][-0.32934037][0.52244681][-0.60119945][0.51370251][-0.80937928][0.47565889][0.85675395][1.063754]] 
    [[1.0831649][0.50079244][-0.30921742][-1.0280501][-0.23224308][-0.37653229][1.2576953][-1.2339777][-0.45807734][0.61158192][-0.09404964][-0.39444873][0.73790514][-0.31239685][-0.24413933][-0.567924][0.073173814][-0.64984584][0.64845377][-0.28436229][-1.5736789][-1.0005355][-0.70931745][0.871858][0.890601][0.2609764][-0.198215][-0.29297754][-0.2499871][0.038447738]] 
    [[1.1345748][0.62565184][-0.23180695][-1.0716654][-0.18751761][0.12118033][0.19484697][-1.2859532][0.59767735][0.57796526][-0.85377008][-0.77281773][1.1327845][-0.25830373][-0.43259808][0.41908079][0.023618467][-1.2751251][0.36925897][-0.0098595042][-0.31753924][-0.20560052][-0.43478742][0.97803283][0.38101795][-0.33882758][0.624989][-0.053847663][0.19007191][0.27741989]] 
    [[0.54994547][-0.94156992][-0.45858866][-0.045031343][-1.4409146][0.38791916][0.26716572][-0.5124234][-0.34381375][-0.7497589][-0.18270281][-0.56540006][-0.37611061][0.18021639][0.48165894][0.20478815][0.18975782][-0.61889905][0.7226029][-0.09786211][-0.14823321][-0.52758443][0.35245389][0.1051857][0.027430706][-0.11386452][-0.013007465][0.37519914][-0.941329][-0.70005715]] 
    [[0.39274859][-0.39766663][-0.70382959][-0.42397216][-1.480526][-0.68001819][0.3347947][-0.20469595][0.05962114][0.66365397][-0.77448803][-0.55189139][0.60185546][0.60116309][-0.17462984][0.32057542][-0.20577009][-1.1281829][0.027793719][-0.0533368][0.95331073][-1.3201096][0.15608752][0.056847919][0.39634478][0.1054792][0.50742185][-0.0057650856][0.25442541][0.54575342]] 
    [[0.8295418][0.65716922][0.32987151][-0.51133007][-0.959789][0.32615584][-0.82939672][-0.76657677][-0.5497672][-0.17259806][0.34265092][0.02187328][-0.33412746][-0.24402547][0.22011535][-0.052487958][-0.17772579][-0.066493936][-0.09538275][-0.76245272][-0.14960127][0.3900359][0.38033971][1.2810829][0.35877633][-0.19256417][0.72800469][0.25378975][-0.11317521][0.11448503]] 
] 
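
In plain numpy terms, this is just adding a trailing axis of length 1. A minimal sketch of the intended transformation (the array x below is a stand-in for one sample of the convolution output, not a variable from the model):

import numpy as np

x = np.random.randn(15, 30)        # stand-in for one (15, 30) sample
y = np.expand_dims(x, axis=-1)     # equivalent to x.reshape(15, 30, 1)
print(y.shape)                     # (15, 30, 1)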

Any idea what is wrong in my code?

Thanks in advance.


You are using redundant layers at the beginning, Reshape and then Flatten. Just use Flatten. –


Thanks, I removed the first Reshape layer. Here is the summary I get: flatten_1 (Flatten) (None, 15) 0; embedding_1 (Embedding) (None, 5, 30) 630; dropout_1 (Dropout) (None, 5, 30) 0; conv1d_1 (Conv1D) (None, 5, 30) 2730; Total params: 3,360; Trainable params: 3,360; Non-trainable params: 0 – user2165135

Answer


The problem is the input_length passed to the Embedding layer. The output is not (None, 15, 30) as you think, but (None, 5, 30).

Just remove the input_length (or set it to 15, the actual length of the flattened sequence).

Here is the summary of your current model:

Layer (type)                 Output Shape              Param # 
================================================================= 
reshape_3 (Reshape)          (None, 1, 15)             0 
_________________________________________________________________ 
flatten_2 (Flatten)          (None, 15)                0 
_________________________________________________________________ 
embedding_2 (Embedding)      (None, 5, 30)             600 
_________________________________________________________________ 
dropout_2 (Dropout)          (None, 5, 30)             0 
_________________________________________________________________ 
conv1d_2 (Conv1D)            (None, 5, 30)             2730 
================================================================= 
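
To see why Keras reports "total size of new array must be unchanged", here is a small numpy analogy (not from the original thread): the tensor actually reaching Reshape holds 5 * 30 = 150 values per sample, which cannot be rearranged into 15 * 30 * 1 = 450.

import numpy as np

conv_out = np.zeros((5, 30))       # per-sample shape reported by the summary above
try:
    conv_out.reshape((15, 30, 1))  # same size mismatch the Keras Reshape layer hits
except ValueError as err:
    print(err)                     # e.g. "cannot reshape array of size 150 into shape (15,30,1)"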

I still get the same error message if I keep the last Reshape layer. When I try the model on a random input, I get an output shape of (2, 15, 30), two being the number of examples. – user2165135


This is the output I get:

[ 
[0.09311153 -0.05404745 0.50606126 -0.53520441 0.41794425 0.25171867 0.37156609 -0.26078582 -0.52433187 0.30997691 -0.95317268 0.94261593 0.56365687 0.21063149 0.13678613 0.52069747 -1.2293686 0.21358994 0.026743541 -0.69850469 0.0955583 -0.74581879 0.087773092 0.32595187 0.36514446 -0.49256474 0.053979304 -0.20099731 -0.71489674 0.36982241] 
[-0.37371859 1.3228136 -1.7292237 0.29752117 0.24284564 -0.052321397 0.029835917 -0.80207682 0.890297 -0.73524159 -0.34133136 -0.35529694 -0.68623751 0.135149 0.66752154 -1.0757091 1.8932092 0.69630879 -0.0077456506 0.569112 -0.23351294 0.45924008 0.66961372 0.27655622 -1.4171039 -1.0308323 -0.012516789 0.23766349 0.23518381 -1.0214807] 
[0.18247549 -0.78545868 -0.29058188 0.310718 0.62216043 0.91871214 0.88818812 0.98373055 -0.24269108 -0.90911531 0.4302102 -1.5717789 0.47344795 0.023262642 -0.66633916 0.020208906 0.23528977 0.007890353 -0.96216047 1.5474852 0.64963603 -0.61376446 -0.7431891 -0.9098407 -1.178241 0.029418966 -0.36299181 -0.18588017 -0.69983339 -1.2403098] 
[-0.13907124 0.77455968 0.39949748 0.788373 -0.72379655 0.47056541 -0.10293989 -0.15979311 -0.13747469 0.49540848 1.3849578 -1.3091069 -1.5117174 0.10640619 -0.27224991 0.091859169 0.67863154 0.8041755 -0.950194 -0.56718034 -0.55053443 -0.31328604 -0.50172609 0.54067135 0.89353716 0.2819832 -1.4124448 -0.992626 0.27674574 -0.80656594] 
[0.56941384 -0.52307433 -0.0014557609 -0.66424996 -0.1377265 -0.18622889 -0.27893946 -0.66485214 0.12398232 0.71157557 -0.052687954 -0.41807997 -0.86772496 -0.29088372 0.077019915 -0.1309973 -0.52031749 0.064666867 0.66582745 -0.38266385 0.11579557 -0.65265769 -0.92624879 0.30905446 0.2472447 -0.10585186 -0.4744457 -1.0572212 0.25316724 -0.56545675] 
[-0.77612084 0.043356732 0.13998967 -0.19222628 0.03896746 0.2462908 -0.8805331 -0.98223174 -0.34058484 0.20567864 0.72753137 -0.33709928 -0.83310735 0.028127206 -0.49332875 0.37826887 0.5161047 -0.49551058 -1.2524649 0.62712663 -0.68346339 -0.53607458 -0.51002711 -0.081255086 0.16448797 0.052351121 -0.4282217 0.025603807 -0.0056298361 -0.70383257] 
[-0.61657017 1.8046215 -0.63392532 -0.14126575 0.2719841 -0.49519449 -0.44583827 -0.63153619 0.36972648 0.07364779 -0.048049126 -0.74233592 0.67249382 1.0938615 -0.95455551 -0.018818242 -0.20023341 -0.0019912687 -0.29854766 -0.79328007 -1.489653 0.26387224 -0.26599407 0.50500441 0.97010094 0.28547913 -0.21993566 -0.32863438 0.55618548 -0.17297709] 
[0.44050878 -0.17070045 -0.15680149 0.16606471 0.26279065 -0.29395095 -0.15396446 -0.47258478 -1.0586354 -0.65733254 -0.18761018 -0.57519519 0.81053412 0.59430808 0.042727698 1.1076951 -0.89282864 -0.027213758 0.96365273 0.40794414 -0.39281371 1.4093404 0.69447654 0.26741418 0.44905421 0.20673333 0.58201146 0.5447579 -0.77610832 0.50347972] 
[0.23904422 1.8198916 -0.32127005 0.19016732 0.34059802 0.23843262 1.1336677 -0.87515587 1.0113547 0.12842716 -0.81049639 -1.4373963 0.49592969 0.73143089 -0.68618989 1.3859445 0.72093153 0.43309888 -1.3210315 0.2054542 0.048558954 -0.53217089 0.20947145 -0.4713847 0.50018942 -0.97975427 -0.16554829 -0.27636209 0.77673912 -0.17956567] 
[0.10899831 1.0184085 -0.38978073 0.30730557 -0.69186461 0.49880576 0.63833827 -0.25505236 0.03314859 0.24156919 1.5816804 -0.43136993 -1.1080316 0.544221 0.029394349 0.37545529 0.045701563 -0.089494362 -0.65665811 -0.023093835 -1.4809865 -0.6045422 -0.32934037 0.52244681 -0.60119945 0.51370251 -0.80937928 0.47565889 0.85675395 1.063754] 
[1.0831649 0.50079244 -0.30921742 -1.0280501 -0.23224308 -0.37653229 1.2576953 -1.2339777 -0.45807734 0.61158192 -0.09404964 -0.39444873 0.73790514 -0.31239685 -0.24413933 -0.567924 0.073173814 -0.64984584 0.64845377 -0.28436229 -1.5736789 -1.0005355 -0.70931745 0.871858 0.890601 0.2609764 -0.198215 -0.29297754 -0.2499871 0.038447738] 
[1.1345748 0.62565184 -0.23180695 -1.0716654 -0.18751761 0.12118033 0.19484697 -1.2859532 0.59767735 0.57796526 -0.85377008 -0.77281773 1.1327845 -0.25830373 -0.43259808 0.41908079 0.023618467 -1.2751251 0.36925897 -0.0098595042 -0.31753924 -0.20560052 -0.43478742 0.97803283 0.38101795 -0.33882758 0.624989 -0.053847663 0.19007191 0.27741989] 
[0.54994547 -0.94156992 -0.45858866 -0.045031343 -1.4409146 0.38791916 0.26716572 -0.5124234 -0.34381375 -0.7497589 -0.18270281 -0.56540006 -0.37611061 0.18021639 0.48165894 0.20478815 0.18975782 -0.61889905 0.7226029 -0.09786211 -0.14823321 -0.52758443 0.35245389 0.1051857 0.027430706 -0.11386452 -0.013007465 0.37519914 -0.941329 -0.70005715] 
[0.39274859 -0.39766663 -0.70382959 -0.42397216 -1.480526 -0.68001819 0.3347947 -0.20469595 0.05962114 0.66365397 -0.77448803 -0.55189139 0.60185546 0.60116309 -0.17462984 0.32057542 -0.20577009 -1.1281829 0.027793719 -0.0533368 0.95331073 -1.3201096 0.15608752 0.056847919 0.39634478 0.1054792 0.50742185 -0.0057650856 0.25442541 0.54575342] 
[0.8295418 0.65716922 0.32987151 -0.51133007 -0.959789 0.32615584 -0.82939672 -0.76657677 -0.5497672 -0.17259806 0.34265092 0.02187328 -0.33412746 -0.24402547 0.22011535 -0.052487958 -0.17772579 -0.066493936 -0.09538275 -0.76245272 -0.14960127 0.3900359 0.38033971 1.2810829 0.35877633 -0.19256417 0.72800469 0.25378975 -0.11317521 0.11448503] 
] 

With the following input:

[[[ 2  0  0  0  0] 
  [11  3 12  4  0] 
  [13  5 14  3 15]] 

 [[16 17  0  0  0] 
  [18  5 19  4  0] 
  [ 0  0  0  0  0]]] 

Please edit your question, not the answer. – Paddy


OK, I see what you meant; here is the new version:

model = Sequential() 
model.add(Flatten(input_shape=(3, 5)))                         # (None, 15) 
model.add(Embedding(20, 30, input_length=15, 
       embeddings_initializer=initializers.lecun_uniform()))   # (None, 15, 30) 
model.add(Dropout(0.5)) 
model.add(Conv1D(30, 3, padding='same', activation='tanh'))    # (None, 15, 30) 
model.add(Reshape((15, 30, 1)))                                # (None, 15, 30, 1) 

And it works as expected. Thank you very much!
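
As a quick sanity check (a sketch added for illustration, not from the original thread; it assumes the Sequential model defined just above and integer inputs below the vocabulary size of 20), you can run a random batch shaped like the (2, 3, 5) example input and confirm the trailing axis is added:

import numpy as np

x = np.random.randint(0, 20, size=(2, 3, 5))  # batch of 2 samples, as in the example input
print(model.predict(x).shape)                 # expected: (2, 15, 30, 1)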


Please edit your question, not the answer. – Paddy