TensorBoard Visualization #
What Is TensorBoard? #
text
┌─────────────────────────────────────────────────────────────┐
│ TensorBoard Features                                        │
├─────────────────────────────────────────────────────────────┤
│                                                             │
│ Scalars:                                                    │
│ ├── Loss curves                                             │
│ ├── Accuracy curves                                         │
│ └── Learning-rate schedule                                  │
│                                                             │
│ Model graph:                                                │
│ ├── Network architecture                                    │
│ └── Layer connections                                       │
│                                                             │
│ Histograms:                                                 │
│ ├── Weight distributions                                    │
│ └── Gradient distributions                                  │
│                                                             │
│ Images:                                                     │
│ ├── Input images                                            │
│ └── Feature maps                                            │
│                                                             │
└─────────────────────────────────────────────────────────────┘
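The scalar curves above often look noisy from step to step, so the Scalars dashboard provides a smoothing slider that applies an exponential moving average to each curve. A minimal sketch of that idea (TensorBoard's actual implementation also debiases the average, which this sketch omits):

```python
def smooth(values, weight=0.6):
    """Exponential moving average, similar in spirit to the
    smoothing slider in TensorBoard's Scalars dashboard."""
    smoothed = []
    last = values[0]
    for v in values:
        # Blend the previous smoothed value with the new raw value
        last = weight * last + (1 - weight) * v
        smoothed.append(last)
    return smoothed

noisy_loss = [2.0, 1.5, 1.7, 1.1, 1.2, 0.8, 0.9]
print(smooth(noisy_loss))
```

Dragging the slider toward 1 corresponds to a larger `weight`, i.e. heavier smoothing.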
Basic Usage #
Enabling the TensorBoard Callback #
python
import keras

# Assumes a compiled `model` and training data are already defined
tensorboard = keras.callbacks.TensorBoard(
    log_dir='./logs',        # where event files are written
    histogram_freq=1,        # record weight histograms every epoch
    write_graph=True,        # log the model graph
    write_images=True,       # log weights as images
    update_freq='epoch',     # write scalars once per epoch
    profile_batch=2          # profile the second batch
)

model.fit(
    x_train, y_train,
    validation_data=(x_val, y_val),
    epochs=100,
    callbacks=[tensorboard]
)
Launching TensorBoard #
bash
tensorboard --logdir=./logs
# Then open http://localhost:6006 in a browser
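TensorBoard overlays curves from multiple runs when each run writes to its own subdirectory under the log root, so `--logdir=./logs` picks all of them up for comparison. A small sketch for creating per-run directories; the `make_run_dir` helper is an illustrative name, not a Keras API:

```python
import datetime
import os

def make_run_dir(root='./logs'):
    """Create a timestamped subdirectory so each training run
    appears as a separate, comparable run in TensorBoard."""
    run_id = datetime.datetime.now().strftime('%Y%m%d-%H%M%S')
    path = os.path.join(root, run_id)
    os.makedirs(path, exist_ok=True)
    return path

# Pass the result to the callback:
# keras.callbacks.TensorBoard(log_dir=make_run_dir())
```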
Parameter Reference #
python
# Full signature with default values (profile_batch=0 disables profiling,
# which is the default in recent Keras/TensorFlow releases)
keras.callbacks.TensorBoard(
    log_dir='./logs',
    histogram_freq=0,
    write_graph=True,
    write_images=False,
    write_steps_per_second=False,
    update_freq='epoch',
    profile_batch=0,
    embeddings_freq=0,
    embeddings_metadata=None
)
text
┌─────────────────────────────────────────────────────────────┐
│ TensorBoard Parameters                                      │
├─────────────────────────────────────────────────────────────┤
│                                                             │
│ log_dir: directory for log files                            │
│                                                             │
│ histogram_freq: how often (in epochs) to log histograms     │
│ └── 0 disables histogram logging                            │
│                                                             │
│ write_graph: whether to log the model graph                 │
│                                                             │
│ write_images: whether to log weights as images              │
│                                                             │
│ update_freq: how often to write scalars                     │
│ ├── 'batch': every batch                                    │
│ ├── 'epoch': every epoch                                    │
│ └── integer N: every N batches                              │
│                                                             │
│ profile_batch: which batch to profile (0 disables)          │
│                                                             │
└─────────────────────────────────────────────────────────────┘
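To get a feel for the integer form of `update_freq`: with 60,000 MNIST training samples and `batch_size=128` there are ceil(60000 / 128) = 469 steps per epoch, so `update_freq=100` writes scalars about 4 times per epoch. The arithmetic (numbers chosen to match the MNIST example later in this page; no Keras needed):

```python
import math

samples, batch_size, update_freq = 60000, 128, 100

steps_per_epoch = math.ceil(samples / batch_size)
writes_per_epoch = steps_per_epoch // update_freq

print(steps_per_epoch, writes_per_epoch)   # 469 4
```

Very small `update_freq` values slow training down, since every write flushes event data to disk.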
Custom Scalars #
python
import keras
import tensorflow as tf

log_dir = './logs'
writer = tf.summary.create_file_writer(log_dir)

class CustomTensorBoard(keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        with writer.as_default():
            # Assumes 'custom_metric' appears in logs, e.g. because a
            # metric with that name was passed to model.compile()
            if 'custom_metric' in logs:
                tf.summary.scalar('custom_metric', logs['custom_metric'], step=epoch)
            writer.flush()

# Assumes a compiled `model` and training data are already defined
model.fit(
    x_train, y_train,
    epochs=10,
    callbacks=[
        keras.callbacks.TensorBoard(log_dir=log_dir),
        CustomTensorBoard()
    ]
)
Logging Images #
python
import keras
import tensorflow as tf

# Create the writer before the callback that uses it
writer = tf.summary.create_file_writer('./logs/images')

def log_images(epoch, logs):
    # Assumes x_test holds images shaped (H, W, C) with values in [0, 1]
    with writer.as_default():
        for i in range(5):
            img = x_test[i]
            # tf.summary.image expects a 4-D batch, so add a batch axis
            tf.summary.image(f'test_image_{i}', img[tf.newaxis, ...], step=epoch)
        writer.flush()

model.fit(
    x_train, y_train,
    epochs=10,
    callbacks=[
        keras.callbacks.LambdaCallback(on_epoch_end=log_images)
    ]
)
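`tf.summary.image` expects a 4-D batch shaped `(N, H, W, C)` with float values in `[0, 1]` or `uint8` values, so raw grayscale arrays like MNIST digits need reshaping and scaling first. A NumPy sketch of that preprocessing (the array here is random stand-in data):

```python
import numpy as np

# Stand-in for one raw grayscale image: (28, 28) uint8
img = np.random.randint(0, 256, size=(28, 28), dtype=np.uint8)

# Scale to [0, 1] floats and add batch and channel axes -> (1, 28, 28, 1)
batch = (img.astype('float32') / 255.0)[np.newaxis, ..., np.newaxis]
print(batch.shape)   # (1, 28, 28, 1)
```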
Logging Histograms #
python
import keras

# histogram_freq=1 records weight and bias histograms every epoch
tensorboard = keras.callbacks.TensorBoard(
    log_dir='./logs',
    histogram_freq=1,
    write_graph=True,
    write_images=True
)

model.fit(
    x_train, y_train,
    validation_data=(x_val, y_val),
    epochs=10,
    callbacks=[tensorboard]
)
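Conceptually, what `histogram_freq=1` records is just the distribution of each layer's flattened weight tensor, re-bucketed every epoch. A rough NumPy illustration (the bin count and random weights are arbitrary; TensorBoard's own binning scheme differs):

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(0.0, 0.05, size=(128, 64))   # e.g. a Dense kernel

# Bucket every weight into 30 bins, as a histogram view would
counts, edges = np.histogram(weights.ravel(), bins=30)
print(counts.sum())   # 8192 -- one entry per weight
```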
Complete Example #
python
import datetime

import keras

# Load and preprocess MNIST
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 28, 28, 1).astype('float32') / 255.0
x_test = x_test.reshape(-1, 28, 28, 1).astype('float32') / 255.0
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

# One timestamped log directory per run, so runs can be compared
log_dir = './logs/' + datetime.datetime.now().strftime('%Y%m%d-%H%M%S')

model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    keras.layers.Conv2D(32, 3, activation='relu'),
    keras.layers.MaxPooling2D(),
    keras.layers.Conv2D(64, 3, activation='relu'),
    keras.layers.MaxPooling2D(),
    keras.layers.Flatten(),
    keras.layers.Dense(128, activation='relu'),
    keras.layers.Dropout(0.5),
    keras.layers.Dense(10, activation='softmax')
])

model.compile(
    optimizer='adam',
    loss='categorical_crossentropy',
    metrics=['accuracy']
)

tensorboard = keras.callbacks.TensorBoard(
    log_dir=log_dir,
    histogram_freq=1,
    write_graph=True,
    write_images=True,
    update_freq='epoch'
)

history = model.fit(
    x_train, y_train,
    validation_data=(x_test, y_test),
    epochs=20,
    batch_size=128,
    callbacks=[
        tensorboard,
        keras.callbacks.EarlyStopping(patience=5, restore_best_weights=True)
    ]
)
Next Steps #
You now know how to visualize training with TensorBoard. Next up: multi-GPU training, to speed up your models!
Last updated: 2026-04-04