1. Understanding the Keras Sequential model:

     Sequential can be thought of as a container: each layer of the network is appended to it with the add() method.
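
As a minimal sketch of this pattern (the layer sizes here are illustrative, not taken from the original text): each add() call appends one layer to the container, and only the first layer needs to declare its input shape.

from keras.models import Sequential
from keras.layers import Dense, Activation

model = Sequential()                 # the empty container
model.add(Dense(32, input_dim=784))  # first layer declares the input dimension
model.add(Activation('relu'))
model.add(Dense(10))
model.add(Activation('softmax'))     # 10-way classifier output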

2. Multi-input models and the Merge operation in Keras

# Keras 1.x API: the Merge layer and keras.utils.visualize_util were removed in Keras 2
from keras.models import Sequential
from keras.layers import Dense, Activation
from keras.layers import Merge
from keras.utils.visualize_util import plot

# first input branch: an independent Sequential model
left_branch = Sequential()
left_branch.add(Dense(32, input_dim=784))

# second input branch: another independent Sequential model
right_branch = Sequential()
right_branch.add(Dense(32, input_dim=784))

# concatenate the outputs of the two branches
merged = Merge([left_branch, right_branch], mode='concat')

# the combined model is a new Sequential that starts from the merged layer
final_model = Sequential()
final_model.add(merged)
final_model.add(Dense(10, activation='softmax'))

plot(final_model, to_file='model.png', show_shapes=True)
Notes:

1. For multiple inputs, each input pathway is itself a Sequential model; once each branch has finished its own layers, the two Sequentials are combined with Merge.

2. The combined model also needs a new Sequential, into which the merged layer is added as its first layer (a training sketch follows these notes).
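
As a rough sketch of how this merged model could then be trained, assuming the Keras 1.x API used above (the arrays below are random placeholder data, not from the original text): fit receives one input array per branch, passed as a list in the same order the branches were given to Merge.

import numpy as np
from keras.utils.np_utils import to_categorical

# placeholder data: 1000 samples for each of the two 784-dim inputs
data_1 = np.random.random((1000, 784))
data_2 = np.random.random((1000, 784))
labels = to_categorical(np.random.randint(10, size=(1000, 1)), 10)

final_model.compile(optimizer='rmsprop', loss='categorical_crossentropy')
# one input array per branch, matching the order given to Merge
final_model.fit([data_1, data_2], labels, nb_epoch=10, batch_size=32)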

The network structure diagram (the model.png generated above) is shown below:


3. A multi-input / multi-output example

The code is as follows:

# Keras 1.x API: merge() and Model(input=..., output=...) were renamed in Keras 2
from keras.utils.visualize_util import plot
from keras.layers import Input, Embedding, LSTM, Dense, merge
from keras.models import Model

# main input: a sequence of 100 integer word indices
main_input = Input(shape=(100,), dtype='int32', name='main_input')
# embed each index into a 512-dimensional vector
x = Embedding(output_dim=512, input_dim=10000, input_length=100)(main_input)
lstm_out = LSTM(32)(x)  # encode the sequence into a single 32-dim vector

# auxiliary output, branching off directly from the LSTM encoding
auxiliary_output = Dense(1, activation='sigmoid', name='aux_output')(lstm_out)

# auxiliary input: 5 extra features concatenated with the LSTM output
auxiliary_input = Input(shape=(5,), name='aux_input')
x = merge([lstm_out, auxiliary_input], mode='concat')

# fully connected stack on top of the merged vector
x = Dense(64, activation='relu')(x)
x = Dense(64, activation='relu')(x)
x = Dense(64, activation='relu')(x)
main_output = Dense(1, activation='sigmoid', name='main_output')(x)

# the Model ties the two inputs to the two outputs
model = Model(input=[main_input, auxiliary_input], output=[main_output, auxiliary_output])

plot(model, to_file='model.png', show_shapes=True)
The resulting network structure (model.png) is as follows:


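To train the two named outputs, here is a sketch assuming the Keras 1.x API used above (the arrays below are random placeholders, not from the original text): compile takes per-output losses and loss weights, and fit accepts dicts keyed by the input/output layer names.

import numpy as np

# placeholder data matching the two inputs and the shared binary label
main_data = np.random.randint(1, 10000, size=(1000, 100))
aux_data = np.random.random((1000, 5))
labels = np.random.randint(2, size=(1000, 1))

model.compile(optimizer='rmsprop',
              loss={'main_output': 'binary_crossentropy', 'aux_output': 'binary_crossentropy'},
              loss_weights={'main_output': 1., 'aux_output': 0.2})

# inputs and targets are matched to layers by their names
model.fit({'main_input': main_data, 'aux_input': aux_data},
          {'main_output': labels, 'aux_output': labels},
          nb_epoch=10, batch_size=32)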