From the course: Generative AI: Working with Large Language Models
Transformers in production
- [Instructor] If you're like me, the only time you watch the winter sport curling is every four years during the Winter Olympics. Now, whenever I've used Google Search in the past, I've often only entered keywords such as curling objective. In 2019, Google started using BERT as part of Search. Now BERT is an acronym for Bidirectional Encoder Representations from Transformers, and it was one of the first large language models developed by the Google research team. Now that I know that Google Search uses BERT, I know that I can enter a more natural, English-sounding phrase like what's the main objective of curling? And Google doesn't just return the most relevant page; the answer to my question is shown in bold. You can see it here: the goal is to deliver the stone from one side of the sheet to the circular scoring area on the other side, called the house. Here's another example of transformers in production, again using BERT. In the past, if you did a Google search using the phrase can you get medicine for someone pharmacy, it wouldn't have picked up on the fact that for someone was a really important part of the query, because you're looking for another person to pick up the medicine. Google Search would've returned results about getting a prescription filled, which isn't relevant in this context. Now with BERT, Google Search captures the important nuance that another person is picking up the medicine, and it returns results about having a friend or family member pick up a prescription. The quality of Google Search has improved significantly using BERT.
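To give a sense of what this kind of answer extraction looks like outside of Google's infrastructure, here is a minimal sketch using the Hugging Face transformers library with a small BERT-style question-answering model. The model name, the passage text, and the question are illustrative assumptions for this sketch; this is not what Google Search actually runs in production.

```python
# A minimal sketch (not Google's production system) of extractive question
# answering with a BERT-style model via the Hugging Face transformers library.
# The model name and passage below are illustrative choices, not from the course.
from transformers import pipeline

# Load a small distilled BERT-style model fine-tuned for question answering.
qa = pipeline(
    "question-answering",
    model="distilbert-base-cased-distilled-squad",
)

# A short passage that contains the answer, similar to a page Google might index.
passage = (
    "Curling is played on a sheet of ice. The goal is to deliver the stone "
    "from one side of the sheet to the circular scoring area on the other "
    "side, called the house."
)

# Ask the natural-language question instead of keyword search.
result = qa(question="What's the main objective of curling?", context=passage)

print(result["answer"])  # the extracted answer span from the passage
print(result["score"])   # the model's confidence in that span
```

The key idea mirrored here is the one the lecture describes: instead of matching keywords, the model reads the question and the passage together and highlights the span that answers the question, much like the bolded answer in the search results.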