What is verbose?


When programming for deep learning you constantly run into the concept of `verbose`. I always understood the word to mean "controls what the program prints," but I never understood how exactly it controls the output. After looking it up, I found that this parameter is common in Keras, and Stack Overflow explains it as follows:

verbose: Integer. 0, 1, or 2. Verbosity mode.

  • Verbose=0 (silent)

  • Verbose=1 (progress bar)

Train on 186219 samples, validate on 20691 samples
Epoch 1/2
186219/186219 [==============================] - 85s 455us/step - loss: 0.5815 - acc: 0.7728 - val_loss: 0.4917 - val_acc: 0.8029
Train on 186219 samples, validate on 20691 samples
Epoch 2/2
186219/186219 [==============================] - 84s 451us/step - loss: 0.4921 - acc: 0.8071 - val_loss: 0.4617 - val_acc: 0.8168
  • Verbose=2 (one line per epoch)
Train on 186219 samples, validate on 20691 samples
Epoch 1/1
 - 88s - loss: 0.5746 - acc: 0.7753 - val_loss: 0.4816 - val_acc: 0.8075
Train on 186219 samples, validate on 20691 samples
Epoch 1/1
 - 88s - loss: 0.4880 - acc: 0.8076 - val_loss: 0.5199 - val_acc: 0.8046
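The three modes above can be sketched with a minimal training loop in plain Python (no Keras dependency). The function name `train`, the step counts, and the stand-in loss values are illustrative only; this only mimics how the `verbose` flag gates what gets printed:

```python
def train(epochs, steps_per_epoch, verbose=1):
    """Illustrative sketch of Keras-style verbosity.

    verbose=0 -> silent; verbose=1 -> progress line per step;
    verbose=2 -> one summary line per epoch.
    """
    lines = []
    for epoch in range(1, epochs + 1):
        if verbose != 0:
            lines.append(f"Epoch {epoch}/{epochs}")
        loss = 0.0
        for step in range(1, steps_per_epoch + 1):
            loss = 1.0 / (epoch * step)  # stand-in for a real loss value
            if verbose == 1:  # progress bar: one update per step
                bar = "=" * step + "." * (steps_per_epoch - step)
                lines.append(f"{step}/{steps_per_epoch} [{bar}] - loss: {loss:.4f}")
        if verbose == 2:  # one line per epoch
            lines.append(f" - loss: {loss:.4f}")
    return lines


for v in (0, 1, 2):
    print(f"--- verbose={v} ---")
    print("\n".join(train(epochs=2, steps_per_epoch=3, verbose=v)))
```

In real Keras code the flag is simply passed to training, e.g. `model.fit(x, y, epochs=2, verbose=2)`.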

Author: CarlYoung
Copyright notice: Unless otherwise stated, all posts on this blog are licensed under CC BY 4.0. Please credit CarlYoung as the source when reposting!
2021-04-11