Comments:
OMG the BEST transformers video EVER!
Very well explained. This video is a must-watch for anyone who wants to demystify the latest LLM technology. Wondering if this could be made into a more generic video with a quick high-level intro on neural networks for those who aren't in the field. I bet there are millions out there who want a basic understanding of how ChatGPT/Bard/Claude work without an in-depth technical deep dive.
woww, she's good at explaining things
I have more respect for Google after watching this video. Not only did they provide their engineers with the funding to research, but they also let other companies like OpenAI use said research. And they are opening up the knowledge to the general public with these video series.
why optimus prime?
Self-attention is better than attention because it is aware of the connections between all the words in a sentence?
Dr. Ashish Vaswani is a pioneer and nobody is talking about him. He is a scientist from Google Brain and the first author of the paper that introduced TRANSFORMERS, which is the backbone of all other recent models.
Optimum Pride Æ Æ Æ Æ Æ
How did you condense so many pieces of information into such a short time? This video is on another level, I loved it!
Great summary of the Transformers technology!
My only criticism: the background music got annoying after 3-4 minutes, but that might just be me.
How did you sync your talking cadence to the background music?
Very good high-level explanation of the major innovations that come along with transformers, but I want to stress that these large language models using the Transformer approach impose a risk as well. The large language models show more and more unexplainable phenomena, which can impose big risks on society. The arms race that has now started between Microsoft and Google is no good sign for deploying this innovation safely.
just sneaking in a bit of THE MESSAGE along the way...
Transformers. More than meets the eye
phenomenal video
thank u mam❤🔥
And that's how ChatGPT was born
They have to add that silly Optimus Prime image on every video about this subject? Are we 6 year olds? I used to love the TV show. When I was six. Then I grew up.
When I was a kid, I knew the trouble with translation was due to literally translating words, without contextual/sequential awareness. I knew it's important to distinguish between synonyms. I imagined there's a button that generates the translation output, then you can highlight the words that don't make sense or that you want improved, then regenerate the translation. This type of NLP probably existed before I programmed my first hello world (15+ years ago)!
Stooooooppp with the backtracks!!!!!!!
I wish someone could explain the concrete math behind transformers and attention
When I saw this title, I was hoping to better understand the mathematical workings of transformers such as matrices and the like. Maybe you could do a follow-up video explaining mathematically how transformers work.
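For anyone wanting the concrete math, the core of attention is short enough to fit in a comment. Here is a minimal NumPy sketch of scaled dot-product self-attention, the softmax(QK^T / sqrt(d)) V step from the Transformer paper (the weights below are random and purely illustrative, not from any real model):

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention: softmax(Q K^T / sqrt(d)) V."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv           # project tokens to queries/keys/values
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)              # one score for EVERY pair of tokens
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                         # each token becomes a weighted mix of values

rng = np.random.default_rng(0)
n_tokens, d_model = 4, 8                       # e.g. a 4-word sentence
X = rng.normal(size=(n_tokens, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)                               # one updated vector per token
```

The n x n score matrix is what makes self-attention "aware of the connections between all the words": every token scores every other token, regardless of distance.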
thank you for your time
Send me a cloud server so I can see its capabilities
I want to ask - when you translate a sentence from one language to another, it's fair enough to know how the sentence will read in the target language.
However, the bigger question is - did we totally lose the grammar of the sentence? Because grammar works differently for each individual language.
What's the point of translating a sentence from one language to another if the grammatical sense of the language is lost?
Thanks for the video. You mentioned that GPT-3 was trained on 45 terabytes of text. I have seen much smaller numbers, like 570 GB. Can you give me a reference for the training data size? I am working on a project and I would like to cite the correct number. Thanks
The pointless background music is so irritating, I gave up
That was really good. Thank you
Thanks, that was very interesting
Please remove the background music, it's really distracting when you're just listening to this otherwise great video
Wowww….thanks for clarifying my confusion.
So why did Google fire their entire AI Ethics Team earlier this year?
"In language, order of words matters" - well... that's probably why Google Translate struggled so much with languages like Russian, Polish, Hungarian and others where the grammar is dictated by inflection or other types of morphemes instead of their position in sentences.
explanation is good but too much background music
Any advice for how to get in front of the right people to show a brand-new, unique advance in machine learning? For example, if you had been the inventor of the first Transformer-based model, how would you get in front of the right people to popularize it, or do you feel it's enough just to write a paper?
the video is informative, thank you, but the music is very loud and annoying
Love the content and thanks for the great video! (One thing that might help is lowering the background music a bit; I found myself stopping the video because I thought another app was playing music)
music in a video trying to explain stuff is so distracting. stopped watching b/c of it
Great and worth watching - but what is with the elevator music in the background? Just a distraction.
background music is very disturbing
Can you provide us with an ultra-minimalistic Python version of a GPT chatbot as a single script, written as a plain function instead of a class method? I sometimes can only learn by looking at working code. Thanks!
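Not an actual GPT, but a toy sketch of the same predict-next-token loop, as a single plain function. Everything here (the function name, the tiny corpus, the bigram-count sampling) is made up for illustration; a real GPT replaces the bigram table with a trained Transformer:

```python
import random
from collections import defaultdict

def tiny_chatbot(prompt, corpus, n_words=8, seed=0):
    """Toy next-word sampler: counts word bigrams in `corpus`, then repeatedly
    samples a plausible next word. A stand-in for GPT's generation loop."""
    random.seed(seed)
    bigrams = defaultdict(list)
    words = corpus.split()
    for a, b in zip(words, words[1:]):
        bigrams[a].append(b)                   # record every observed follower
    out = prompt.split()
    for _ in range(n_words):
        followers = bigrams.get(out[-1])
        if not followers:                      # dead end: no known follower
            break
        out.append(random.choice(followers))   # sample proportionally to counts
    return " ".join(out)

corpus = "the model reads the text and the model writes the answer"
print(tiny_chatbot("the", corpus))
```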
Great, but the BGM is annoying.
Well done and informative video. Your music is too loud though. Hard to hear you over it.
Welp, it's April 2023, and last month "every few years" turned into "every few days." Well on the way to "every few hours" ...
...
"Transformers explained: What we hoped would be in the video"