(16) TensorFlow: Implementing a Fully Connected Layer

tech 2025-11-04

Fully Connected Layer Implementation

Function: Code
- Create the layer: layers.Dense(units, activation)
- Get the Dense layer's weight matrix: fc.kernel
- Get the Dense layer's bias vector: fc.bias
- Return the list of trainable parameters: fc.trainable_variables
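Behind these calls, a fully connected layer computes activation(X @ W + b), where W is the kernel and b is the bias. A minimal NumPy sketch of that computation (the function and variable names here are illustrative, not TensorFlow API):

```python
import numpy as np

def relu(z):
    # ReLU zeroes out negative values, like tf.nn.relu
    return np.maximum(z, 0.0)

def dense_forward(x, kernel, bias, activation=None):
    """Fully connected layer: activation(x @ kernel + bias)."""
    out = x @ kernel + bias
    if activation is not None:
        out = activation(out)
    return out

x = np.array([[1.0, -2.0, 3.0]])       # one sample, 3 features
kernel = np.array([[ 0.5,  -1.0],
                   [ 0.25,  0.5],
                   [-0.5,   1.0]])     # shape (3, 2): 3 inputs -> 2 outputs
bias = np.array([0.1, -0.2])

h = dense_forward(x, kernel, bias, activation=relu)
print(h)  # [[0.  0.8]]
```

Here x @ kernel gives [-1.5, 1.0], adding the bias gives [-1.4, 0.8], and ReLU clamps the negative entry to zero.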

Layer Implementation

layers.Dense(units, activation):
- units: the number of output nodes of the layer
- activation: the activation function

layers.Dense infers the number of input nodes automatically from the input data on the first call.

```python
import tensorflow as tf
from tensorflow.keras import layers

x = tf.random.normal([3, 50])  # simulate 3 samples with 50 features each
fc = layers.Dense(2, activation=tf.nn.relu)  # 2 output nodes, ReLU activation
h1 = fc(x)  # the kernel of shape (50, 2) is created on this first call
print('h1', h1)
print('K', fc.kernel)
print('b', fc.bias)
print('v', fc.trainable_variables)
```

Output (the kernel values are randomly initialized; the 50-row kernel array is trimmed here, and `v` repeats the same kernel and bias variables):

```
h1 tf.Tensor(
[[0.95606685 1.1911137 ]
 [0.         0.        ]
 [0.14124475 0.04009145]], shape=(3, 2), dtype=float32)
K <tf.Variable 'dense/kernel:0' shape=(50, 2) dtype=float32, numpy=
array([[ 0.0455344 , -0.19482273],
       [-0.26869717,  0.02412117],
       ...
       [ 0.02461901,  0.18401676]], dtype=float32)>
b <tf.Variable 'dense/bias:0' shape=(2,) dtype=float32, numpy=array([0., 0.], dtype=float32)>
v [<tf.Variable 'dense/kernel:0' shape=(50, 2) dtype=float32, ...>,
   <tf.Variable 'dense/bias:0' shape=(2,) dtype=float32, numpy=array([0., 0.], dtype=float32)>]
```
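The lazy shape inference shown above (the kernel only exists after the first call, once the input's feature dimension is known) can be sketched with a tiny toy class. This is an illustrative NumPy mock, not TensorFlow's actual implementation:

```python
import numpy as np

class ToyDense:
    """Toy Dense layer: builds its kernel on the first call, like layers.Dense."""
    def __init__(self, units, activation=None):
        self.units = units
        self.activation = activation
        self.kernel = None   # unknown until we see the input's feature count
        self.bias = None

    def build(self, input_dim):
        rng = np.random.default_rng(0)
        # small random init for the kernel, zeros for the bias
        self.kernel = rng.normal(0.0, 0.1, size=(input_dim, self.units))
        self.bias = np.zeros(self.units)

    def __call__(self, x):
        if self.kernel is None:          # first call: infer input nodes from x
            self.build(x.shape[-1])
        out = x @ self.kernel + self.bias
        return self.activation(out) if self.activation else out

    @property
    def trainable_variables(self):
        return [self.kernel, self.bias]

fc = ToyDense(2, activation=lambda z: np.maximum(z, 0.0))
x = np.random.normal(size=(3, 50))       # 3 samples, 50 features
h1 = fc(x)
print(h1.shape)            # (3, 2)
print(fc.kernel.shape)     # (50, 2), inferred on the first call
```

This mirrors why `fc.kernel` in the TensorFlow example has shape (50, 2): 50 comes from the input data, 2 from the `units` argument.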