TensorBoard
💡 Introduction to TensorBoard
TensorBoard is an open-source toolkit designed for machine-learning experiments, used to visualize and analyze data and models. It was originally developed for TensorFlow, but it now also supports PyTorch, Hugging Face Transformers, and many other AI frameworks.
Main Features and Uses
TensorBoard provides a rich set of features that help developers understand, debug, and optimize their machine-learning models:
📈 Track and visualize metrics:
- Plot in real time how key metrics such as the loss, accuracy, and learning rate evolve over the course of training.
- Compare results across experiments (for example, different hyperparameter settings).
🕸️ Visualize the model's computation graph:
- Show the network architecture, the connections between layers, and the flow of data at a glance.
- This helps in understanding how the model is built and in debugging structural problems.
📊 View histograms of tensors:
- Show how the distributions of tensors such as weights, biases, or activations change over time.
- Helps detect potential problems such as vanishing or exploding gradients.
🖼️ Display image, text, and audio data:
- Useful for visualizing input data, model-generated images, or the feature maps output by convolutional layers.
🌌 Embedding Projector:
- Project high-dimensional embedding vectors into a lower-dimensional space (e.g., 2D or 3D) for interactive exploration of data structure and sample similarity.
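The metric-tracking and histogram features above can be sketched in a few lines with PyTorch's `SummaryWriter`. This is a minimal sketch, not a real training script: the toy linear model and the `runs/demo` log directory are made up for illustration.

```python
import torch
import torch.nn as nn
from torch.utils.tensorboard import SummaryWriter

# Toy model and writer; 'runs/demo' is an illustrative log directory
model = nn.Linear(4, 1)
writer = SummaryWriter(log_dir='runs/demo')

for step in range(10):
    x = torch.randn(8, 4)
    loss = model(x).pow(2).mean()
    loss.backward()

    # Track a scalar metric over time (shows up under "Scalars")
    writer.add_scalar('Loss/train', loss.item(), global_step=step)

    # Distributions of weights and their gradients (shows up under "Histograms")
    writer.add_histogram('weight', model.weight, global_step=step)
    writer.add_histogram('weight_grad', model.weight.grad, global_step=step)

    model.zero_grad()

writer.close()
```

Watching the `weight_grad` histogram collapse toward zero (or blow up) across steps is exactly the vanishing/exploding-gradient symptom mentioned above.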
Summary
In short, TensorBoard is the "dashboard" and "detective tool" of machine learning: it renders the complex training process and the model's internal state visually, allowing engineers to:
- Monitor training progress.
- Understand model structure and behavior.
- Debug and optimize model performance.
The SummaryWriter Class
✍️ Introduction to SummaryWriter in TensorBoard
SummaryWriter is the core class in PyTorch (and in some other deep-learning stacks, such as TensorFlow's Keras API) for writing data to be displayed by the TensorBoard visualization tool.
You can think of TensorBoard as a powerful "data recorder" and "display board", and SummaryWriter as the pen you use to record and organize that data.
Core Functions and Purpose
- Creating event files: the main job of SummaryWriter is to create, in the specified log directory (`log_dir`), the event files that TensorBoard reads.
- Recording data: it provides a family of `add_XXX()` methods that let you conveniently log many kinds of data during training and validation, for example:
  - Scalars: values that change with the training step, such as the loss, accuracy, or learning rate.
  - Histograms: the distributions of tensors such as model parameters (weights, biases) or gradients.
  - Images: input images, feature maps, generated images, and so on.
  - Graph: the model structure.
  - Text: configuration info, important log messages, etc.
- Visualization: TensorBoard reads these event files and renders the recorded data as charts, images, and more in its web UI, helping you understand and debug the training process.
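The `add_XXX()` family listed above can be exercised in one short script. This is a sketch only: the tag names, the tiny model, and the `runs/add_xxx_demo` directory are invented for illustration (and `add_image` additionally requires the `pillow` package).

```python
import torch
import torch.nn as nn
from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter(log_dir='runs/add_xxx_demo')
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))

writer.add_scalar('demo/loss', 0.5, global_step=0)            # scalar
writer.add_histogram('demo/weights', model[0].weight, 0)      # histogram
writer.add_image('demo/noise', torch.rand(3, 32, 32), 0)      # image, CHW layout
writer.add_text('demo/config', 'lr=0.01, batch_size=32', 0)   # text
writer.add_graph(model, torch.randn(1, 4))                    # model graph (traced)

writer.close()
```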
Common Usage (PyTorch)
Import and initialization:
In PyTorch, SummaryWriter is usually imported from `torch.utils.tensorboard` (or, in older code, from the `tensorboardX` library).

```python
from torch.utils.tensorboard import SummaryWriter

# Initialize the SummaryWriter, specifying the directory where log files are stored
writer = SummaryWriter(log_dir='runs/my_experiment_name')
# 'runs/my_experiment_name' is a typical log directory
```

- `log_dir`: the path where the TensorBoard event files are stored.
Logging scalar data (the most common case):
Inside the training loop, log the loss, accuracy, and so on:

```python
# Suppose loss and step are the loss value and step count of the current training step
writer.add_scalar('Training/Loss', loss, global_step=step)
writer.add_scalar('Training/Accuracy', accuracy, global_step=step)
```

- `'Training/Loss'`: the tag, used to group charts in the TensorBoard UI.
- `loss`: the scalar value.
- `global_step`: the step count, which becomes the x-axis of the TensorBoard chart.
Logging image data:

```python
# Suppose img_tensor is an image tensor and step is the current step count
writer.add_image('Input Image', img_tensor, global_step=step)
```
Closing the writer:
After training finishes, remember to close the writer so that all data is flushed to the event files:

```python
writer.close()
```
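As an alternative to calling `close()` manually, `SummaryWriter` also works as a context manager (it defines `__enter__`/`__exit__`, as the class help below shows), so the writer is closed even if the loop raises. A minimal sketch, with an illustrative tag and log directory:

```python
from torch.utils.tensorboard import SummaryWriter

with SummaryWriter(log_dir='runs/ctx_demo') as writer:
    for step in range(5):
        writer.add_scalar('demo/metric', step * 0.1, global_step=step)
# writer is closed automatically when the with-block exits
```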
How to View TensorBoard
In a terminal, navigate to your project root (or the directory containing the log_dir, e.g. runs), then run the following command to start the TensorBoard server:

```bash
tensorboard --logdir=runs
```

After TensorBoard starts, it prints a local URL (usually http://localhost:6006); open it in a browser to see the visualizations.
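Putting the pieces together: writing several runs under the same parent directory lets `tensorboard --logdir=runs` overlay their curves for comparison, as promised in the feature list above. A sketch with made-up run names and a fake loss curve:

```python
from torch.utils.tensorboard import SummaryWriter

# Two hypothetical experiments with different "learning rates"
for run_dir, lr in [('runs/exp_lr_0.1', 0.1), ('runs/exp_lr_0.01', 0.01)]:
    writer = SummaryWriter(log_dir=run_dir)
    for step in range(20):
        fake_loss = 1.0 / (1.0 + lr * step)  # stand-in for a real loss curve
        writer.add_scalar('Loss/train', fake_loss, global_step=step)
    writer.close()
```

Because both runs log under the same tag `'Loss/train'`, TensorBoard draws them on one chart with one line per run.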
```python
from torch.utils.tensorboard import SummaryWriter

help(SummaryWriter)
```
Help on class SummaryWriter in module torch.utils.tensorboard.writer:
class SummaryWriter(builtins.object)
| SummaryWriter(log_dir=None, comment='', purge_step=None, max_queue=10, flush_secs=120, filename_suffix='')
|
| Writes entries directly to event files in the log_dir to be consumed by TensorBoard.
|
| The `SummaryWriter` class provides a high-level API to create an event file
| in a given directory and add summaries and events to it. The class updates the
| file contents asynchronously. This allows a training program to call methods
| to add data to the file directly from the training loop, without slowing down
| training.
|
| Methods defined here:
|
| __enter__(self)
|
| __exit__(self, exc_type, exc_val, exc_tb)
|
| __init__(self, log_dir=None, comment='', purge_step=None, max_queue=10, flush_secs=120, filename_suffix='')
| Create a `SummaryWriter` that will write out events and summaries to the event file.
|
| Args:
| log_dir (str): Save directory location. Default is
| runs/**CURRENT_DATETIME_HOSTNAME**, which changes after each run.
| Use hierarchical folder structure to compare
| between runs easily. e.g. pass in 'runs/exp1', 'runs/exp2', etc.
| for each new experiment to compare across them.
| comment (str): Comment log_dir suffix appended to the default
| ``log_dir``. If ``log_dir`` is assigned, this argument has no effect.
| purge_step (int):
| When logging crashes at step :math:`T+X` and restarts at step :math:`T`,
| any events whose global_step larger or equal to :math:`T` will be
| purged and hidden from TensorBoard.
| Note that crashed and resumed experiments should have the same ``log_dir``.
| max_queue (int): Size of the queue for pending events and
| summaries before one of the 'add' calls forces a flush to disk.
| Default is ten items.
| flush_secs (int): How often, in seconds, to flush the
| pending events and summaries to disk. Default is every two minutes.
| filename_suffix (str): Suffix added to all event filenames in
| the log_dir directory. More details on filename construction in
| tensorboard.summary.writer.event_file_writer.EventFileWriter.
|
| Examples::
|
| from torch.utils.tensorboard import SummaryWriter
|
| # create a summary writer with automatically generated folder name.
| writer = SummaryWriter()
| # folder location: runs/May04_22-14-54_s-MacBook-Pro.local/
|
| # create a summary writer using the specified folder name.
| writer = SummaryWriter("my_experiment")
| # folder location: my_experiment
|
| # create a summary writer with comment appended.
| writer = SummaryWriter(comment="LR_0.1_BATCH_16")
| # folder location: runs/May04_22-14-54_s-MacBook-Pro.localLR_0.1_BATCH_16/
|
| add_audio(self, tag, snd_tensor, global_step=None, sample_rate=44100, walltime=None)
| Add audio data to summary.
|
| Args:
| tag (str): Data identifier
| snd_tensor (torch.Tensor): Sound data
| global_step (int): Global step value to record
| sample_rate (int): sample rate in Hz
| walltime (float): Optional override default walltime (time.time())
| seconds after epoch of event
| Shape:
| snd_tensor: :math:`(1, L)`. The values should lie between [-1, 1].
|
| add_custom_scalars(self, layout)
| Create special chart by collecting charts tags in 'scalars'.
|
| NOTE: This function can only be called once for each SummaryWriter() object.
|
| Because it only provides metadata to tensorboard, the function can be called before or after the training loop.
|
| Args:
| layout (dict): {categoryName: *charts*}, where *charts* is also a dictionary
| {chartName: *ListOfProperties*}. The first element in *ListOfProperties* is the chart's type
| (one of **Multiline** or **Margin**) and the second element should be a list containing the tags
| you have used in add_scalar function, which will be collected into the new chart.
|
| Examples::
|
| layout = {'Taiwan':{'twse':['Multiline',['twse/0050', 'twse/2330']]},
| 'USA':{ 'dow':['Margin', ['dow/aaa', 'dow/bbb', 'dow/ccc']],
| 'nasdaq':['Margin', ['nasdaq/aaa', 'nasdaq/bbb', 'nasdaq/ccc']]}}
|
| writer.add_custom_scalars(layout)
|
| add_custom_scalars_marginchart(self, tags, category='default', title='untitled')
| Shorthand for creating marginchart.
|
| Similar to ``add_custom_scalars()``, but the only necessary argument is *tags*,
| which should have exactly 3 elements.
|
| Args:
| tags (list): list of tags that have been used in ``add_scalar()``
|
| Examples::
|
| writer.add_custom_scalars_marginchart(['twse/0050', 'twse/2330', 'twse/2006'])
|
| add_custom_scalars_multilinechart(self, tags, category='default', title='untitled')
| Shorthand for creating multilinechart. Similar to ``add_custom_scalars()``, but the only necessary argument is *tags*.
|
| Args:
| tags (list): list of tags that have been used in ``add_scalar()``
|
| Examples::
|
| writer.add_custom_scalars_multilinechart(['twse/0050', 'twse/2330'])
|
| add_embedding(self, mat, metadata=None, label_img=None, global_step=None, tag='default', metadata_header=None)
| Add embedding projector data to summary.
|
| Args:
| mat (torch.Tensor or numpy.ndarray): A matrix which each row is the feature vector of the data point
| metadata (list): A list of labels, each element will be converted to string
| label_img (torch.Tensor): Images correspond to each data point
| global_step (int): Global step value to record
| tag (str): Name for the embedding
| metadata_header (list): A list of headers for multi-column metadata. If given, each metadata must be
| a list with values corresponding to headers.
| Shape:
| mat: :math:`(N, D)`, where N is number of data and D is feature dimension
|
| label_img: :math:`(N, C, H, W)`
|
| Examples::
|
| import keyword
| import torch
| meta = []
| while len(meta)<100:
| meta = meta+keyword.kwlist # get some strings
| meta = meta[:100]
|
| for i, v in enumerate(meta):
| meta[i] = v+str(i)
|
| label_img = torch.rand(100, 3, 10, 32)
| for i in range(100):
| label_img[i]*=i/100.0
|
| writer.add_embedding(torch.randn(100, 5), metadata=meta, label_img=label_img)
| writer.add_embedding(torch.randn(100, 5), label_img=label_img)
| writer.add_embedding(torch.randn(100, 5), metadata=meta)
|
| .. note::
| Categorical (i.e. non-numeric) metadata cannot have more than 50 unique values if they are to be used for
| coloring in the embedding projector.
|
| add_figure(self, tag: str, figure: Union[ForwardRef('Figure'), list['Figure']], global_step: Optional[int] = None, close: bool = True, walltime: Optional[float] = None) -> None
| Render matplotlib figure into an image and add it to summary.
|
| Note that this requires the ``matplotlib`` package.
|
| Args:
| tag: Data identifier
| figure: Figure or a list of figures
| global_step: Global step value to record
| close: Flag to automatically close the figure
| walltime: Optional override default walltime (time.time())
| seconds after epoch of event
|
| add_graph(self, model, input_to_model=None, verbose=False, use_strict_trace=True)
| Add graph data to summary.
|
| Args:
| model (torch.nn.Module): Model to draw.
| input_to_model (torch.Tensor or list of torch.Tensor): A variable or a tuple of
| variables to be fed.
| verbose (bool): Whether to print graph structure in console.
| use_strict_trace (bool): Whether to pass keyword argument `strict` to
| `torch.jit.trace`. Pass False when you want the tracer to
| record your mutable container types (list, dict)
|
| add_histogram(self, tag, values, global_step=None, bins='tensorflow', walltime=None, max_bins=None)
| Add histogram to summary.
|
| Args:
| tag (str): Data identifier
| values (torch.Tensor, numpy.ndarray, or string/blobname): Values to build histogram
| global_step (int): Global step value to record
| bins (str): One of {'tensorflow','auto', 'fd', ...}. This determines how the bins are made. You can find
| other options in: https://numpy.org/doc/stable/reference/generated/numpy.histogram.html
| walltime (float): Optional override default walltime (time.time())
| seconds after epoch of event
|
| Examples::
|
| from torch.utils.tensorboard import SummaryWriter
| import numpy as np
| writer = SummaryWriter()
| for i in range(10):
| x = np.random.random(1000)
| writer.add_histogram('distribution centers', x + i, i)
| writer.close()
|
| Expected result:
|
| .. image:: _static/img/tensorboard/add_histogram.png
| :scale: 50 %
|
| add_histogram_raw(self, tag, min, max, num, sum, sum_squares, bucket_limits, bucket_counts, global_step=None, walltime=None)
| Add histogram with raw data.
|
| Args:
| tag (str): Data identifier
| min (float or int): Min value
| max (float or int): Max value
| num (int): Number of values
| sum (float or int): Sum of all values
| sum_squares (float or int): Sum of squares for all values
| bucket_limits (torch.Tensor, numpy.ndarray): Upper value per bucket.
| The number of elements of it should be the same as `bucket_counts`.
| bucket_counts (torch.Tensor, numpy.ndarray): Number of values per bucket
| global_step (int): Global step value to record
| walltime (float): Optional override default walltime (time.time())
| seconds after epoch of event
| see: https://github.com/tensorflow/tensorboard/blob/master/tensorboard/plugins/histogram/README.md
|
| Examples::
|
| from torch.utils.tensorboard import SummaryWriter
| import numpy as np
| writer = SummaryWriter()
| dummy_data = []
| for idx, value in enumerate(range(50)):
| dummy_data += [idx + 0.001] * value
|
| bins = list(range(50+2))
| bins = np.array(bins)
| values = np.array(dummy_data).astype(float).reshape(-1)
| counts, limits = np.histogram(values, bins=bins)
| sum_sq = values.dot(values)
| writer.add_histogram_raw(
| tag='histogram_with_raw_data',
| min=values.min(),
| max=values.max(),
| num=len(values),
| sum=values.sum(),
| sum_squares=sum_sq,
| bucket_limits=limits[1:].tolist(),
| bucket_counts=counts.tolist(),
| global_step=0)
| writer.close()
|
| Expected result:
|
| .. image:: _static/img/tensorboard/add_histogram_raw.png
| :scale: 50 %
|
| add_hparams(self, hparam_dict, metric_dict, hparam_domain_discrete=None, run_name=None, global_step=None)
| Add a set of hyperparameters to be compared in TensorBoard.
|
| Args:
| hparam_dict (dict): Each key-value pair in the dictionary is the
| name of the hyper parameter and it's corresponding value.
| The type of the value can be one of `bool`, `string`, `float`,
| `int`, or `None`.
| metric_dict (dict): Each key-value pair in the dictionary is the
| name of the metric and it's corresponding value. Note that the key used
| here should be unique in the tensorboard record. Otherwise the value
| you added by ``add_scalar`` will be displayed in hparam plugin. In most
| cases, this is unwanted.
| hparam_domain_discrete: (Optional[Dict[str, List[Any]]]) A dictionary that
| contains names of the hyperparameters and all discrete values they can hold
| run_name (str): Name of the run, to be included as part of the logdir.
| If unspecified, will use current timestamp.
| global_step (int): Global step value to record
|
| Examples::
|
| from torch.utils.tensorboard import SummaryWriter
| with SummaryWriter() as w:
| for i in range(5):
| w.add_hparams({'lr': 0.1*i, 'bsize': i},
| {'hparam/accuracy': 10*i, 'hparam/loss': 10*i})
|
| Expected result:
|
| .. image:: _static/img/tensorboard/add_hparam.png
| :scale: 50 %
|
| add_image(self, tag, img_tensor, global_step=None, walltime=None, dataformats='CHW')
| Add image data to summary.
|
| Note that this requires the ``pillow`` package.
|
| Args:
| tag (str): Data identifier
| img_tensor (torch.Tensor, numpy.ndarray, or string/blobname): Image data
| global_step (int): Global step value to record
| walltime (float): Optional override default walltime (time.time())
| seconds after epoch of event
| dataformats (str): Image data format specification of the form
| CHW, HWC, HW, WH, etc.
| Shape:
| img_tensor: Default is :math:`(3, H, W)`. You can use ``torchvision.utils.make_grid()`` to
| convert a batch of tensor into 3xHxW format or call ``add_images`` and let us do the job.
| Tensor with :math:`(1, H, W)`, :math:`(H, W)`, :math:`(H, W, 3)` is also suitable as long as
| corresponding ``dataformats`` argument is passed, e.g. ``CHW``, ``HWC``, ``HW``.
|
| Examples::
|
| from torch.utils.tensorboard import SummaryWriter
| import numpy as np
| img = np.zeros((3, 100, 100))
| img[0] = np.arange(0, 10000).reshape(100, 100) / 10000
| img[1] = 1 - np.arange(0, 10000).reshape(100, 100) / 10000
|
| img_HWC = np.zeros((100, 100, 3))
| img_HWC[:, :, 0] = np.arange(0, 10000).reshape(100, 100) / 10000
| img_HWC[:, :, 1] = 1 - np.arange(0, 10000).reshape(100, 100) / 10000
|
| writer = SummaryWriter()
| writer.add_image('my_image', img, 0)
|
| # If you have non-default dimension setting, set the dataformats argument.
| writer.add_image('my_image_HWC', img_HWC, 0, dataformats='HWC')
| writer.close()
|
| Expected result:
|
| .. image:: _static/img/tensorboard/add_image.png
| :scale: 50 %
|
| add_image_with_boxes(self, tag, img_tensor, box_tensor, global_step=None, walltime=None, rescale=1, dataformats='CHW', labels=None)
| Add image and draw bounding boxes on the image.
|
| Args:
| tag (str): Data identifier
| img_tensor (torch.Tensor, numpy.ndarray, or string/blobname): Image data
| box_tensor (torch.Tensor, numpy.ndarray, or string/blobname): Box data (for detected objects)
| box should be represented as [x1, y1, x2, y2].
| global_step (int): Global step value to record
| walltime (float): Optional override default walltime (time.time())
| seconds after epoch of event
| rescale (float): Optional scale override
| dataformats (str): Image data format specification of the form
| NCHW, NHWC, CHW, HWC, HW, WH, etc.
| labels (list of string): The label to be shown for each bounding box.
| Shape:
| img_tensor: Default is :math:`(3, H, W)`. It can be specified with ``dataformats`` argument.
| e.g. CHW or HWC
|
| box_tensor: (torch.Tensor, numpy.ndarray, or string/blobname): NX4, where N is the number of
| boxes and each 4 elements in a row represents (xmin, ymin, xmax, ymax).
|
| add_images(self, tag, img_tensor, global_step=None, walltime=None, dataformats='NCHW')
| Add batched image data to summary.
|
| Note that this requires the ``pillow`` package.
|
| Args:
| tag (str): Data identifier
| img_tensor (torch.Tensor, numpy.ndarray, or string/blobname): Image data
| global_step (int): Global step value to record
| walltime (float): Optional override default walltime (time.time())
| seconds after epoch of event
| dataformats (str): Image data format specification of the form
| NCHW, NHWC, CHW, HWC, HW, WH, etc.
| Shape:
| img_tensor: Default is :math:`(N, 3, H, W)`. If ``dataformats`` is specified, other shape will be
| accepted. e.g. NCHW or NHWC.
|
| Examples::
|
| from torch.utils.tensorboard import SummaryWriter
| import numpy as np
|
| img_batch = np.zeros((16, 3, 100, 100))
| for i in range(16):
| img_batch[i, 0] = np.arange(0, 10000).reshape(100, 100) / 10000 / 16 * i
| img_batch[i, 1] = (1 - np.arange(0, 10000).reshape(100, 100) / 10000) / 16 * i
|
| writer = SummaryWriter()
| writer.add_images('my_image_batch', img_batch, 0)
| writer.close()
|
| Expected result:
|
| .. image:: _static/img/tensorboard/add_images.png
| :scale: 30 %
|
| add_mesh(self, tag, vertices, colors=None, faces=None, config_dict=None, global_step=None, walltime=None)
| Add meshes or 3D point clouds to TensorBoard.
|
| The visualization is based on Three.js,
| so it allows users to interact with the rendered object. Besides the basic definitions
| such as vertices, faces, users can further provide camera parameter, lighting condition, etc.
| Please check https://threejs.org/docs/index.html#manual/en/introduction/Creating-a-scene for
| advanced usage.
|
| Args:
| tag (str): Data identifier
| vertices (torch.Tensor): List of the 3D coordinates of vertices.
| colors (torch.Tensor): Colors for each vertex
| faces (torch.Tensor): Indices of vertices within each triangle. (Optional)
| config_dict: Dictionary with ThreeJS classes names and configuration.
| global_step (int): Global step value to record
| walltime (float): Optional override default walltime (time.time())
| seconds after epoch of event
|
| Shape:
| vertices: :math:`(B, N, 3)`. (batch, number_of_vertices, channels)
|
| colors: :math:`(B, N, 3)`. The values should lie in [0, 255] for type `uint8` or [0, 1] for type `float`.
|
| faces: :math:`(B, N, 3)`. The values should lie in [0, number_of_vertices] for type `uint8`.
|
| Examples::
|
| from torch.utils.tensorboard import SummaryWriter
| vertices_tensor = torch.as_tensor([
| [1, 1, 1],
| [-1, -1, 1],
| [1, -1, -1],
| [-1, 1, -1],
| ], dtype=torch.float).unsqueeze(0)
| colors_tensor = torch.as_tensor([
| [255, 0, 0],
| [0, 255, 0],
| [0, 0, 255],
| [255, 0, 255],
| ], dtype=torch.int).unsqueeze(0)
| faces_tensor = torch.as_tensor([
| [0, 2, 3],
| [0, 3, 1],
| [0, 1, 2],
| [1, 3, 2],
| ], dtype=torch.int).unsqueeze(0)
|
| writer = SummaryWriter()
| writer.add_mesh('my_mesh', vertices=vertices_tensor, colors=colors_tensor, faces=faces_tensor)
|
| writer.close()
|
| add_onnx_graph(self, prototxt)
|
| add_pr_curve(self, tag, labels, predictions, global_step=None, num_thresholds=127, weights=None, walltime=None)
| Add precision recall curve.
|
| Plotting a precision-recall curve lets you understand your model's
| performance under different threshold settings. With this function,
| you provide the ground truth labeling (T/F) and prediction confidence
| (usually the output of your model) for each target. The TensorBoard UI
| will let you choose the threshold interactively.
|
| Args:
| tag (str): Data identifier
| labels (torch.Tensor, numpy.ndarray, or string/blobname):
| Ground truth data. Binary label for each element.
| predictions (torch.Tensor, numpy.ndarray, or string/blobname):
| The probability that an element be classified as true.
| Value should be in [0, 1]
| global_step (int): Global step value to record
| num_thresholds (int): Number of thresholds used to draw the curve.
| walltime (float): Optional override default walltime (time.time())
| seconds after epoch of event
|
| Examples::
|
| from torch.utils.tensorboard import SummaryWriter
| import numpy as np
| labels = np.random.randint(2, size=100) # binary label
| predictions = np.random.rand(100)
| writer = SummaryWriter()
| writer.add_pr_curve('pr_curve', labels, predictions, 0)
| writer.close()
|
| add_pr_curve_raw(self, tag, true_positive_counts, false_positive_counts, true_negative_counts, false_negative_counts, precision, recall, global_step=None, num_thresholds=127, weights=None, walltime=None)
| Add precision recall curve with raw data.
|
| Args:
| tag (str): Data identifier
| true_positive_counts (torch.Tensor, numpy.ndarray, or string/blobname): true positive counts
| false_positive_counts (torch.Tensor, numpy.ndarray, or string/blobname): false positive counts
| true_negative_counts (torch.Tensor, numpy.ndarray, or string/blobname): true negative counts
| false_negative_counts (torch.Tensor, numpy.ndarray, or string/blobname): false negative counts
| precision (torch.Tensor, numpy.ndarray, or string/blobname): precision
| recall (torch.Tensor, numpy.ndarray, or string/blobname): recall
| global_step (int): Global step value to record
| num_thresholds (int): Number of thresholds used to draw the curve.
| walltime (float): Optional override default walltime (time.time())
| seconds after epoch of event
| see: https://github.com/tensorflow/tensorboard/blob/master/tensorboard/plugins/pr_curve/README.md
|
| add_scalar(self, tag, scalar_value, global_step=None, walltime=None, new_style=False, double_precision=False)
| Add scalar data to summary.
|
| Args:
| tag (str): Data identifier
| scalar_value (float or string/blobname): Value to save
| global_step (int): Global step value to record
| walltime (float): Optional override default walltime (time.time())
| with seconds after epoch of event
| new_style (boolean): Whether to use new style (tensor field) or old
| style (simple_value field). New style could lead to faster data loading.
| Examples::
|
| from torch.utils.tensorboard import SummaryWriter
| writer = SummaryWriter()
| x = range(100)
| for i in x:
| writer.add_scalar('y=2x', i * 2, i)
| writer.close()
|
| Expected result:
|
| .. image:: _static/img/tensorboard/add_scalar.png
| :scale: 50 %
|
| add_scalars(self, main_tag, tag_scalar_dict, global_step=None, walltime=None)
| Add many scalar data to summary.
|
| Args:
| main_tag (str): The parent name for the tags
| tag_scalar_dict (dict): Key-value pair storing the tag and corresponding values
| global_step (int): Global step value to record
| walltime (float): Optional override default walltime (time.time())
| seconds after epoch of event
|
| Examples::
|
| from torch.utils.tensorboard import SummaryWriter
| writer = SummaryWriter()
| r = 5
| for i in range(100):
| writer.add_scalars('run_14h', {'xsinx':i*np.sin(i/r),
| 'xcosx':i*np.cos(i/r),
| 'tanx': np.tan(i/r)}, i)
| writer.close()
| # This call adds three values to the same scalar plot with the tag
| # 'run_14h' in TensorBoard's scalar section.
|
| Expected result:
|
| .. image:: _static/img/tensorboard/add_scalars.png
| :scale: 50 %
|
| add_tensor(self, tag, tensor, global_step=None, walltime=None)
| Add tensor data to summary.
|
| Args:
| tag (str): Data identifier
| tensor (torch.Tensor): tensor to save
| global_step (int): Global step value to record
| Examples::
|
| from torch.utils.tensorboard import SummaryWriter
| writer = SummaryWriter()
| x = torch.tensor([1,2,3])
| writer.add_scalar('x', x)
| writer.close()
|
| Expected result:
| Summary::tensor::float_val [1,2,3]
| ::tensor::shape [3]
| ::tag 'x'
|
| add_text(self, tag, text_string, global_step=None, walltime=None)
| Add text data to summary.
|
| Args:
| tag (str): Data identifier
| text_string (str): String to save
| global_step (int): Global step value to record
| walltime (float): Optional override default walltime (time.time())
| seconds after epoch of event
| Examples::
|
| writer.add_text('lstm', 'This is an lstm', 0)
| writer.add_text('rnn', 'This is an rnn', 10)
|
| add_video(self, tag, vid_tensor, global_step=None, fps=4, walltime=None)
| Add video data to summary.
|
| Note that this requires the ``moviepy`` package.
|
| Args:
| tag (str): Data identifier
| vid_tensor (torch.Tensor): Video data
| global_step (int): Global step value to record
| fps (float or int): Frames per second
| walltime (float): Optional override default walltime (time.time())
| seconds after epoch of event
| Shape:
| vid_tensor: :math:`(N, T, C, H, W)`. The values should lie in [0, 255] for type `uint8` or [0, 1] for type `float`.
|
| close(self)
|
| flush(self)
| Flushes the event file to disk.
|
| Call this method to make sure that all pending events have been written to
| disk.
|
| get_logdir(self)
| Return the directory where event files will be written.
|
| ----------------------------------------------------------------------
| Data descriptors defined here:
|
| __dict__
| dictionary for instance variables (if defined)
|
| __weakref__
| list of weak references to the object (if defined)
```python
# Instantiate, specifying the log directory
writer = SummaryWriter("logs")
```
```python
help(SummaryWriter.add_scalar)
```
Help on function add_scalar in module torch.utils.tensorboard.writer:
add_scalar(self, tag, scalar_value, global_step=None, walltime=None, new_style=False, double_precision=False)
Add scalar data to summary.
Args:
tag (str): Data identifier
scalar_value (float or string/blobname): Value to save
global_step (int): Global step value to record
walltime (float): Optional override default walltime (time.time())
with seconds after epoch of event
new_style (boolean): Whether to use new style (tensor field) or old
style (simple_value field). New style could lead to faster data loading.
Examples::
from torch.utils.tensorboard import SummaryWriter
writer = SummaryWriter()
x = range(100)
for i in x:
writer.add_scalar('y=2x', i * 2, i)
writer.close()
Expected result:
.. image:: _static/img/tensorboard/add_scalar.png
:scale: 50 %
Notes:
- `scalar_value`: the y-axis value of the plotted point.
- `global_step`: the x-axis value of the plotted point.
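Under this reading, each `add_scalar` call plots one (global_step, scalar_value) point. For example, logging the line y = 2x, mirroring the docstring example but written against a fresh writer in the `logs` directory:

```python
from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter('logs')
for x in range(100):
    writer.add_scalar('y=2x', 2 * x, global_step=x)  # y-axis value, x-axis step
writer.close()
```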
```python
help(SummaryWriter.add_image)
```
Help on function add_image in module torch.utils.tensorboard.writer:
add_image(self, tag, img_tensor, global_step=None, walltime=None, dataformats='CHW')
Add image data to summary.
Note that this requires the ``pillow`` package.
Args:
tag (str): Data identifier
img_tensor (torch.Tensor, numpy.ndarray, or string/blobname): Image data
global_step (int): Global step value to record
walltime (float): Optional override default walltime (time.time())
seconds after epoch of event
dataformats (str): Image data format specification of the form
CHW, HWC, HW, WH, etc.
Shape:
img_tensor: Default is :math:`(3, H, W)`. You can use ``torchvision.utils.make_grid()`` to
convert a batch of tensor into 3xHxW format or call ``add_images`` and let us do the job.
Tensor with :math:`(1, H, W)`, :math:`(H, W)`, :math:`(H, W, 3)` is also suitable as long as
corresponding ``dataformats`` argument is passed, e.g. ``CHW``, ``HWC``, ``HW``.
Examples::
from torch.utils.tensorboard import SummaryWriter
import numpy as np
img = np.zeros((3, 100, 100))
img[0] = np.arange(0, 10000).reshape(100, 100) / 10000
img[1] = 1 - np.arange(0, 10000).reshape(100, 100) / 10000
img_HWC = np.zeros((100, 100, 3))
img_HWC[:, :, 0] = np.arange(0, 10000).reshape(100, 100) / 10000
img_HWC[:, :, 1] = 1 - np.arange(0, 10000).reshape(100, 100) / 10000
writer = SummaryWriter()
writer.add_image('my_image', img, 0)
# If you have non-default dimension setting, set the dataformats argument.
writer.add_image('my_image_HWC', img_HWC, 0, dataformats='HWC')
writer.close()
Expected result:
.. image:: _static/img/tensorboard/add_image.png
:scale: 50 %
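Following the docstring above, a quick way to try `add_image` is a synthetic gradient image in the default CHW layout. A sketch (it creates its own writer in the `logs` directory, and needs the `pillow` package; the tag name is illustrative):

```python
import numpy as np
from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter('logs')
img = np.zeros((3, 100, 100))                        # channels-first (CHW) layout
img[0] = np.arange(10000).reshape(100, 100) / 10000  # horizontal ramp in the red channel
writer.add_image('demo_image', img, global_step=0)   # default dataformats='CHW'
writer.close()
```

For an HWC-shaped array, pass `dataformats='HWC'` instead, as the docstring example shows.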
```python
# Close the writer
writer.close()
```