Jingwen's Homepage

How Far Does BERT Look At: Distance-based Clustering and Analysis of BERT’s Attention
Yue Guan, Jingwen Leng, Chao Li, Quan Chen, Minyi Guo
In the International Conference on Computational Linguistics (COLING), 2020

ABSTRACT
Recent research on the multi-head attention mechanism, especially that in pre-trained models such as BERT, has offered heuristics and clues for analyzing various aspects of the mechanism. As most of this research focuses on probing tasks or hidden states, previous works have found some primitive patterns of attention head behavior through heuristic analytical methods, but a systematic analysis dedicated to the attention patterns themselves is still lacking. In this work, we cluster the attention heatmaps into significantly different patterns through unsupervised clustering on top of a set of proposed features, which corroborates previous observations. We further study the corresponding functions of these patterns through an analytical study. In addition, our proposed features can be used to explain and calibrate different attention heads in Transformer models.
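For a sense of what such a pipeline could look like, below is a minimal sketch, not the paper's exact features or clustering method: it summarizes each BERT attention head by a hypothetical distance-based feature (the expected token distance under the head's attention distribution) and groups the heads with k-means.

    # Sketch only: extract per-head attention maps from BERT, reduce each
    # head to a hypothetical distance-based feature, and cluster the heads.
    import numpy as np
    import torch
    from sklearn.cluster import KMeans
    from transformers import BertModel, BertTokenizer

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)
    model.eval()

    inputs = tokenizer("The quick brown fox jumps over the lazy dog.",
                       return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # outputs.attentions is a tuple of per-layer tensors, each shaped
    # [batch, heads, seq_len, seq_len]; stack into [layers, heads, L, L].
    attentions = torch.stack(outputs.attentions).squeeze(1)
    n_layers, n_heads, seq_len, _ = attentions.shape

    # Hypothetical feature: expected |i - j| under each head's attention
    # distribution, averaged over query positions i.
    positions = torch.arange(seq_len)
    dist = (positions[:, None] - positions[None, :]).abs().float()
    mean_dist = (attentions * dist).sum(-1).mean(-1)  # [layers, heads]

    # One scalar feature per head; cluster all layers' heads together.
    features = mean_dist.reshape(-1, 1).numpy()
    labels = KMeans(n_clusters=5, n_init=10).fit_predict(features)
    print(labels.reshape(n_layers, n_heads))

In practice one would average such features over many input sentences and use richer descriptors of the heatmaps; the single-sentence, single-feature setup above is only for illustration.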
