Primer: attention, transformers and the new architecture of language


Natural language is one of the most important challenges in AI and the frontier has moved rapidly over the past few years. This primer will help you get to grips with the basic concepts in a (mostly) non-technical way. If you haven't already read our primer on representing data, you might want to do that first as it explains how AI networks can be used to find patterns in data.

Language is unique. There are many shades of meaning for a given word that only emerge based on its situation in a sentence and its relationship with other words. Meaning happens as language is actually used. The challenge for AI is to capture these relationships mathematically.

Traditional approaches to natural language processing assume that a word's meaning is stable across sentences. This is far from the case, so researchers are on a quest for techniques that can include more relevant information about the context in which a word appears.
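To make the contrast concrete, here is a small, purely illustrative sketch (the words and vectors are invented for this example, not drawn from any real model). A static lookup table gives "bank" the same vector in every sentence, while a single simplified attention step lets each word's vector become a context-weighted mix of its neighbours, so "bank" ends up closer to "river" in one sentence and closer to "money" in another.

```python
import numpy as np

# Toy static embeddings, invented for illustration: a lookup table
# assigns each word one fixed vector regardless of context.
embeddings = {
    "river": np.array([1.0, 0.0]),
    "money": np.array([0.0, 1.0]),
    "bank":  np.array([0.5, 0.5]),
}

def contextualise(sentence):
    """One simplified self-attention pass: every word attends to all
    words in the sentence and takes a similarity-weighted average of
    their vectors."""
    X = np.stack([embeddings[w] for w in sentence])
    scores = X @ X.T                                  # pairwise similarity
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)     # softmax over each row
    return weights @ X                                # context-mixed vectors

a = contextualise(["river", "bank"])
b = contextualise(["money", "bank"])

# "bank" starts from the same static vector in both sentences but ends
# up different, pulled toward "river" in one and "money" in the other.
print(a[1])   # contextual vector for "bank" near "river"
print(b[1])   # contextual vector for "bank" near "money"
```

Real transformers use learned query, key and value projections and many such layers, but the core idea is the same: a word's representation is recomputed from its relationships with the words around it.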

A technique that was, until recently, state-of-the-art is to use an A...




At Sonder Scheme, we are on a mission to help humans win in the age of AI. We have created a human-centered AI design system that allows leading companies to create AI in an inclusive way. Companies like The New York Times, Starbucks, R/GA, Galp and the National Headstart Association tap us for engaging presentations at team/executive forums, in-depth workshops or long-term learning journeys. We support them with executive coaching and online access to our design system and up-to-the-minute insights into the frontiers of AI technology and ethics.
