Abstract: This post walks through two minimal Keras examples, one with Conv1D and one with Conv2D. Each example fixes the random seed for reproducibility, reshapes the input to match what Keras expects, builds a small Sequential model (convolution, max pooling, flatten, fully connected layer), prints the output of the final dense layer, and prints the network structure.
The network structure comes from https://github.com/nfmcclure/...
Conv1D

import numpy as np
import keras

# Fix the random seed so the results can be reproduced
seed = 13
np.random.seed(seed)

# Create a 1-D vector and expand its dimensions to match what Keras expects:
# data_1d ends up with shape (1, 25, 1), i.e. (batch, steps, channels)
data_1d = np.random.normal(size=25)
data_1d = np.expand_dims(data_1d, 0)
data_1d = np.expand_dims(data_1d, 2)

# Define the convolution layer
filters = 1      # number of convolution kernels
kernel_size = 5  # kernel size
convolution_1d_layer = keras.layers.Conv1D(filters, kernel_size, strides=1, padding="valid",
                                           input_shape=(25, 1), activation="relu",
                                           name="convolution_1d_layer")

# Define the max pooling layer
max_pooling_layer = keras.layers.MaxPool1D(pool_size=5, strides=1, padding="valid",
                                           name="max_pooling_layer")

# Flatten layer: adjusts the dimensions to fit the fully connected layer
reshape_layer = keras.layers.Flatten(name="reshape_layer")

# Define the fully connected layer
full_connect_layer = keras.layers.Dense(5,
                                        kernel_initializer=keras.initializers.RandomNormal(mean=0.0, stddev=0.1, seed=seed),
                                        bias_initializer="random_normal",
                                        use_bias=True,
                                        name="full_connect_layer")

# Assemble the model
model = keras.Sequential()
model.add(convolution_1d_layer)
model.add(max_pooling_layer)
model.add(reshape_layer)
model.add(full_connect_layer)

# Print the output of the full_connect_layer layer
output = keras.Model(inputs=model.input, outputs=model.get_layer("full_connect_layer").output).predict(data_1d)
print("======================Convolution result=========================")
print(output)

# Print the network structure; summary() prints the table itself and returns
# None, which is why a trailing "None" appears in the output below
print("======================Network structure=========================")
print(model.summary())
The final output is as follows:
======================Convolution result=========================
[[-0.0131043  -0.11734447  0.13395447 -0.75453871 -0.69782442]]
======================Network structure=========================
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
convolution_1d_layer (Conv1D (None, 21, 1)             6
_________________________________________________________________
max_pooling_layer (MaxPoolin (None, 17, 1)             0
_________________________________________________________________
reshape_layer (Flatten)      (None, 17)                0
_________________________________________________________________
full_connect_layer (Dense)   (None, 5)                 90
=================================================================
Total params: 96
Trainable params: 96
Non-trainable params: 0
_________________________________________________________________
None
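The shapes and parameter counts in the summary can be checked by hand. A "valid" window of size k over a length-n axis yields (n - k) / stride + 1 positions, so the convolution gives 25 - 5 + 1 = 21 steps and the pool gives 21 - 5 + 1 = 17. The Conv1D layer holds 1 × 5 × 1 weights plus 1 bias = 6 parameters, and the Dense layer holds 17 × 5 weights plus 5 biases = 90. Here is a minimal sketch of that arithmetic (valid_output_length is an illustrative helper, not a Keras function):

def valid_output_length(n, k, stride=1):
    # number of positions a size-k window visits along a length-n axis
    # when padding is "valid"
    return (n - k) // stride + 1

conv_steps = valid_output_length(25, 5)          # 21
pool_steps = valid_output_length(conv_steps, 5)  # 17
conv_params = 1 * 5 * 1 + 1                      # weights + bias = 6
dense_params = pool_steps * 5 + 5                # weights + biases = 90
print(conv_steps, pool_steps, conv_params, dense_params)  # 21 17 6 90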
Conv2D

data_size = [10, 10]
data_2d = np.random.normal(size=data_size)
data_2d = np.expand_dims(data_2d, 0)
data_2d = np.expand_dims(data_2d, 3)
print(data_2d.shape)  # (1, 10, 10, 1), i.e. (batch, rows, cols, channels)

# Define the convolution layer
conv_size = 2
conv_stride_size = 2
convolution_2d_layer = keras.layers.Conv2D(filters=1,
                                           kernel_size=(conv_size, conv_size),
                                           strides=(conv_stride_size, conv_stride_size),
                                           input_shape=(data_size[0], data_size[1], 1))

# Define the max pooling layer
pooling_size = (2, 2)
max_pooling_2d_layer = keras.layers.MaxPool2D(pool_size=pooling_size, strides=1, padding="valid",
                                              name="max_pooling_2d_layer")

# Flatten layer: adjusts the dimensions to fit the fully connected layer
reshape_layer = keras.layers.Flatten(name="reshape_layer")

# Define the fully connected layer
full_connect_layer = keras.layers.Dense(5,
                                        kernel_initializer=keras.initializers.RandomNormal(mean=0.0, stddev=0.1, seed=seed),
                                        bias_initializer="random_normal",
                                        use_bias=True,
                                        name="full_connect_layer")

model_2d = keras.Sequential()
model_2d.add(convolution_2d_layer)
model_2d.add(max_pooling_2d_layer)
model_2d.add(reshape_layer)
model_2d.add(full_connect_layer)

# Print the output of the full_connect_layer layer
output = keras.Model(inputs=model_2d.input, outputs=model_2d.get_layer("full_connect_layer").output).predict(data_2d)
print("======================Convolution result=========================")
print(output)

# Print the network structure
print("======================Network structure=========================")
print(model_2d.summary())
Output:
======================Convolution result=========================
[[ 0.30173036 -0.10435719 -0.03354734  0.24000235 -0.09962128]]
======================Network structure=========================
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d_1 (Conv2D)            (None, 5, 5, 1)           5
_________________________________________________________________
max_pooling_2d_layer (MaxPoo (None, 4, 4, 1)           0
_________________________________________________________________
reshape_layer (Flatten)      (None, 16)                0
_________________________________________________________________
full_connect_layer (Dense)   (None, 5)                 85
=================================================================
Total params: 90
Trainable params: 90
Non-trainable params: 0
_________________________________________________________________
None
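The same arithmetic holds in 2-D, applied per axis: the 2 × 2 kernel with stride 2 maps each 10-pixel side to (10 - 2) / 2 + 1 = 5, the 2 × 2 pool with stride 1 maps 5 to 4, and flattening gives 4 × 4 = 16 features. The Conv2D layer holds 2 × 2 × 1 weights plus 1 bias = 5 parameters, and the Dense layer 16 × 5 + 5 = 85, matching the summary. Reusing the valid_output_length helper sketched above:

conv_side = valid_output_length(10, 2, stride=2)  # 5
pool_side = valid_output_length(conv_side, 2)     # 4
flat_features = pool_side * pool_side             # 16
conv2d_params = 2 * 2 * 1 + 1                     # weights + bias = 5
dense_params = flat_features * 5 + 5              # weights + biases = 85
print(conv_side, pool_side, flat_features, conv2d_params, dense_params)  # 5 4 16 5 85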