A Multiscale Visualization of Attention in the Transformer Model
Details
Florence, Italy. Date of Talk: 2019-07-28
Speakers
Jesse Vig
Event
The Transformer is a sequence model that forgoes traditional recurrent architectures in favor of a fully attention-based approach. Besides improving performance, attention has the advantage of aiding interpretability: it shows how the model assigns weight to different input elements. However, the multi-layer, multi-head attention mechanism in the Transformer can be difficult to decipher. To make the model more accessible, we introduce an open-source tool that visualizes attention at multiple scales, each of which provides a unique perspective on the attention mechanism. We demonstrate the tool on BERT and OpenAI GPT-2 and present three example use cases: detecting model bias, locating relevant attention heads, and linking neurons to model behavior.
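The visualizations described in the talk are built from the per-layer, per-head attention weights that pretrained Transformers expose. As a rough illustration (this is a minimal sketch, not the tool presented in the talk), the snippet below extracts those weights from a pretrained BERT model using the HuggingFace transformers library; the example sentence and the choice of layer and head are illustrative assumptions.

```python
# Sketch: inspect multi-layer, multi-head attention in BERT.
# Assumes the HuggingFace `transformers` library; not the authors' tool.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)
model.eval()

sentence = "The doctor asked the nurse a question."  # illustrative input
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions is a tuple with one tensor per layer, each of shape
# (batch_size, num_heads, seq_len, seq_len): a weight for every
# (query token, key token) pair in that layer and head.
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
layer, head = 5, 0  # illustrative choice of a single attention head
weights = outputs.attentions[layer][0, head]

for i, token in enumerate(tokens):
    strongest = weights[i].argmax().item()
    print(f"{token:>10} attends most to {tokens[strongest]} "
          f"({weights[i, strongest].item():.2f})")
```

Printing the strongest attended-to token per position gives a crude, single-head view; a multiscale visualization layers this same data at the neuron, head, and model levels.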