Simple implementations of sigmoid, softmax, and tanh



The sigmoid function squashes its input into the range [0, 1].

import tensorflow as tf

a = tf.linspace(-6., 6., 13)  # 13 evenly spaced float32 values over [-6, 6]
x = tf.sigmoid(a)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())  # initialize variables (none defined here, so this is a no-op)
    print(sess.run(x))
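To see where the [0, 1] range comes from, the same values can also be computed straight from the definition sigmoid(x) = 1 / (1 + e^(-x)). A minimal sketch (assuming the same TensorFlow 1.x session API as above) comparing the hand-written formula with the built-in:

import tensorflow as tf

a = tf.linspace(-6., 6., 13)
manual = 1. / (1. + tf.exp(-a))  # sigmoid written out from its definition
builtin = tf.sigmoid(a)

with tf.Session() as sess:
    m, b = sess.run([manual, builtin])
    print(m)
    print(b)  # matches the manual result elementwise

Since e^(-x) > 0, the denominator is always greater than 1, so the output lies strictly between 0 and 1.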

The softmax function squashes values into [0, 1] and normalizes them so that they sum to 1.

import tensorflow as tf

logits = tf.random_uniform([1, 10], minval=-2, maxval=2)
prob = tf.nn.softmax(logits, axis=1)
prob_sum = tf.reduce_sum(prob, axis=1)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())  # initialize variables (none defined here, so this is a no-op)
    # fetch both tensors in one run: separate sess.run calls would
    # re-sample logits, so prob_sum would not match the printed prob
    p, s = sess.run([prob, prob_sum])
    print(p)
    print(s)  # [1.] up to float rounding
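Softmax is defined as softmax(x)_i = e^(x_i) / sum_j e^(x_j). When writing it out by hand, the usual trick is to subtract the row maximum first, which leaves the result unchanged but avoids overflow for large logits. A sketch of that (assuming TensorFlow >= 1.5 for the keepdims argument):

import tensorflow as tf

logits = tf.constant([[2., 1., 0.1]])
# subtracting the row max does not change softmax (it cancels in the
# ratio) but keeps tf.exp from overflowing on large logits
shifted = logits - tf.reduce_max(logits, axis=1, keepdims=True)
exp = tf.exp(shifted)
manual = exp / tf.reduce_sum(exp, axis=1, keepdims=True)

with tf.Session() as sess:
    print(sess.run(manual))                    # same as tf.nn.softmax(logits)
    print(sess.run(tf.reduce_sum(manual, 1)))  # [1.]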

The tanh function squashes values into [-1, 1].

import tensorflow as tf

a = tf.constant([-2., -1., 0., 1., 2.])
x = tf.tanh(a)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())  # initialize variables (none defined here, so this is a no-op)
    print(sess.run(x))
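tanh is just a shifted and rescaled sigmoid: tanh(x) = 2 * sigmoid(2x) - 1, which is why its range is [-1, 1] rather than [0, 1]. A small sketch verifying the identity on the same constants:

import tensorflow as tf

a = tf.constant([-2., -1., 0., 1., 2.])
builtin = tf.tanh(a)
via_sigmoid = 2. * tf.sigmoid(2. * a) - 1.  # tanh(x) = 2*sigmoid(2x) - 1

with tf.Session() as sess:
    t, s = sess.run([builtin, via_sigmoid])
    print(t)
    print(s)  # identical to the built-in up to float rounding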