Transformers: "More than meets the eye" is not only the slogan for these guys ...
it's also a new way for AI to learn ... faster.
The transformer first appeared in 2017 in a paper that cryptically declared that “Attention Is All You Need.” In other approaches to AI, the system would first focus on local patches of input data and then build up to the whole. In a language model, for example, nearby words would first get grouped together. The transformer, by contrast, runs processes so that every element in the input data connects, or pays attention, to every other element. Researchers refer to this as “self-attention.” This means that as soon as it starts training, the transformer can see traces of the entire data set.
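To make that "every element pays attention to every other element" idea concrete, here is a minimal sketch of scaled dot-product self-attention in Python with NumPy. It is an illustration under my own assumptions, not the paper's full multi-head architecture or any particular library's API; the names (self_attention, w_q, w_k, w_v) are made up for the example.

import numpy as np

def self_attention(x, w_q, w_k, w_v):
    # x: (seq_len, d_model) input embeddings; w_q/w_k/w_v: learned projection matrices
    q = x @ w_q                        # queries
    k = x @ w_k                        # keys
    v = x @ w_v                        # values
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)    # every position scored against every other position
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the whole sequence
    return weights @ v                 # each output row is a weighted mix of all values

# Toy usage: 4 "words", 8-dimensional embeddings
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
w = [rng.normal(size=(8, 8)) for _ in range(3)]
out = self_attention(x, *w)
print(out.shape)   # (4, 8): every position has mixed in information from every other

The point of the sketch is the scores matrix: it is seq_len by seq_len, so even at the first training step every element is compared against the whole input rather than just its neighbors.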
I can see clearly now ...
The busy child looms ...