Lesson 3: Practical Deep Learning for Coders 2022

Jeremy Howard

2 years ago

107,509 views


Comments:

Nam Nguyen - 24.09.2023 20:42

I think one way to improve the slow/fast issue is that it's actually sometimes both: the parts that need to go faster should go faster, or have the unnecessary bits trimmed out, while the complicated parts could slow down a bit.
Then add a very short, fast "teaching" for each topic, and go into the details after it. The short teaching is not a summary, so people who already get it can move ahead to the next topic.

Nam Nguyen - 24.09.2023 20:38

Skip the first 10 minutes to get to the start of the lesson.

Wadeed Ahmad - 23.09.2023 22:23

I am in love with this course.

Đức Master - 09.09.2023 17:19

I couldn't understand why ReLU was needed, and now I understand. I'm a programmer and I think this is the DL course for me. The explanation is very easy to understand. Thank you!

Silva - 02.09.2023 15:53

=IF([@Embarked]="S", 1, 0) and other IF statements like this don't seem to work for me.
Has anyone experienced the same thing?

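A note on the formula above: in Excel, the `[@Embarked]` structured-reference syntax only works inside a formatted Table (Insert → Table); outside a table you need a plain cell reference such as `=IF(B2="S",1,0)` (assuming Embarked sits in column B). The dummy-column idea itself, sketched in plain Python on made-up rows:

```python
# The spreadsheet formula =IF([@Embarked]="S", 1, 0) builds a 0/1 dummy
# column. The same idea in plain Python, on made-up rows:
rows = [{"Embarked": "S"}, {"Embarked": "C"}, {"Embarked": "S"}, {"Embarked": "Q"}]

def dummy(value, category):
    """Return 1 if value equals category, else 0 -- the IF() above."""
    return 1 if value == category else 0

embarked_s = [dummy(r["Embarked"], "S") for r in rows]
print(embarked_s)  # [1, 0, 1, 0]
```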
Abdelhak Saouli - 11.08.2023 10:53

How big a difference between train_loss and validation_loss is acceptable?

Pranav Deshpande - 04.07.2023 04:30

Simply amazing! Excellent lecture.

Kushal G - 21.06.2023 23:59

The excel example blew my mind. Loved this lesson. Thank you.

AQQIA - 09.06.2023 22:06

I was lucky to have good math teachers in high school. Jeremy explaining the concepts reminded me of them. Thanks.

Diet Tom's - 01.06.2023 10:30

I just made an NN in Excel. Wow. If you want to predict two different things, do you just have a separate set of weights and Lins for the second item?

MindShift - 26.05.2023 20:03

Where can I find the walk through of Gradio?

toromanow - 14.05.2023 02:33

So Paperspace appears not to be free. When I try starting a notebook it forces me to upgrade to 8/month. Is this still the recommended platform? Is it worth it?

Ananda Kumar - 30.04.2023 05:40

Love the explanation of ReLUs being the foundation of learning. So intuitive that you can't forget it or unsee it from the moment you have seen it. ❤

Devashish Jose - 18.04.2023 20:40

Thank you so much, Jeremy, for making this course. I am going slow but learning a lot every day. You are a very patient teacher. Thank you.

anonanonous - 11.04.2023 13:06

What a great lesson. Mind blown! Thank you so much! You are a great teacher!

matthew rice - 06.04.2023 22:15

I'm slightly confused about the intuition behind how multiple ReLUs can lead to a squiggly line. Wouldn't it more specifically lead to a line that is always either flat or gradually increasing, because the output must be >= 0?

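On the question above: the resolution is that each ReLU is applied to w·x + b, and w (as well as the weight the ReLU's output is multiplied by) can be negative, so the summed curve can fall as well as rise even though every ReLU output is non-negative. A minimal sketch with made-up weights:

```python
def relu(x):
    return max(0.0, x)

# Two ReLUs summed with weight 1 each. The second has a negative input
# slope, so the total falls and then rises, even though each ReLU's
# output is always >= 0.
def f(x):
    return relu(1.0 * x + 1.0) + relu(-2.0 * x + 1.0)

print([f(x) for x in (-2.0, -1.0, 0.0, 0.5, 1.0, 2.0)])
# [5.0, 3.0, 2.0, 1.5, 2.0, 3.0] -- decreasing, then increasing
```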
3duy buidoi - 03.04.2023 10:09

I am a newbie in machine learning, but the approach you took in this lesson to explain difficult concepts makes it so easy to understand. Great work.

ekbastu - 19.03.2023 23:01

Quadratic example was just superb. 🎉

E D - 12.03.2023 09:57

Great lesson!! Jeremy deciding to approach chapter 4 differently after seeing many students quit at this point really shows that he cares about students' learning. The effort is greatly appreciated!🙏

Manu G - 06.03.2023 14:17

I "knew" that deep learning models used the sum of wᵢ·xᵢ + b, and I "knew" it was supposedly used because it was an "all purpose" function, but now, thanks to you Jeremy, I know WHY it's an "all purpose" function.
10/10 explanation. Math should always be explained like this; it's actually beautiful to see it all unfold.

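For readers who want the idea above in code: a plain-Python sketch of a hypothetical two-input network, where each unit computes the weighted sum wᵢ·xᵢ + b and passes it through a ReLU, and the units are combined with output weights. All numbers here are made up for illustration:

```python
def relu(z):
    return max(0.0, z)

# One "unit": the weighted sum w.x + b from the comment, through a ReLU.
def unit(ws, xs, b):
    return relu(sum(w * x for w, x in zip(ws, xs)) + b)

# Three hidden units on two inputs, combined with output weights --
# this summed combination is what makes the model so flexible.
hidden = [([1.0, -1.0], 0.0), ([0.5, 0.5], -1.0), ([-2.0, 1.0], 2.0)]
out_w = [1.0, -3.0, 0.5]

def net(xs):
    return sum(v * unit(ws, xs, b) for v, (ws, b) in zip(out_w, hidden))

print(net([1.0, 2.0]))  # -0.5
```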
Mukhtar Bimurat - 11.02.2023 22:08

Wow, great explanation! Thanks!

Leo Medina - Javascript Developer - 22.12.2022 23:09

This is mind blowing! Great job explaining all these concepts.

Hüseyin ABANOZ - 21.12.2022 12:16

Unbelievable content! Thanks to all who have made it possible!

Daniel Hemmati - 18.12.2022 21:38

Basically, we have data; now let's create a general function (from that data) that can roughly reproduce it and also predict what the next data point would be.

Shravan Kumar - 17.11.2022 17:55

👏👏👏 applause from online

Matt McConaha - 11.11.2022 21:52

I tried to make a Paperspace account and accidentally mistyped the phone verification, so they decided that I'm no longer allowed to verify with my phone number. Disappointing.

Dingus - 02.11.2022 05:12

I've gone through many great courses in all sorts of subjects, but I think this course might be the best. Kudos for putting this fantastic content out there for free for everyone to learn from.

Manuel Araoz - 25.10.2022 02:33

This is god-tier educational content, sir. Thanks for sharing it!

voko axecer - 19.10.2022 20:06

I don't even know how to use Excel.

tumadrep00 - 06.10.2022 18:50

As always, an excellent video Jeremy.

Andres Pineda - 23.09.2022 21:34

Great foundational lecture. Jeremy has a relaxed, non-intimidating approach that works for me. Brilliant step by step walk into the deep end of the pool without getting us lost or scared :) Thank you for taking the time to put this together.

cantabr0 - 26.08.2022 12:26

Excellent tutorial! I have one question: in the Excel sheet, why are Parch and SibSp not normalized? Because they are not "big enough" to negatively interfere?

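On the question above: a common rule of thumb is that columns spanning a wide range (like Fare) benefit most from scaling, while small-integer columns like Parch and SibSp (roughly 0 to 8) change little either way, though scaling them is harmless. A sketch of the divide-by-max scaling used for Age in the lesson, with made-up values; it assumes a non-negative column:

```python
# Divide-by-max scaling: squeeze a non-negative column into [0, 1].
def scale(col):
    top = max(col)
    return [v / top for v in col]

fare = [7.25, 71.28, 8.05, 512.33]   # wide range: scaling matters here
sibsp = [1, 0, 0, 1]                 # already tiny integers

print(scale(sibsp))  # [1.0, 0.0, 0.0, 1.0]
print(scale(fare))
```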
Conor Donovan - 24.08.2022 03:05

The quadratic example was a really good illustration of how gradient descent works and great for building intuition. Then the Excel example cements the understanding with a solid dataset. This is my favourite of the 3 lectures so far.

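The gradient-descent intuition described above can be sketched in a few lines: minimize a 1-D quadratic loss L(w) = (w - 3)², whose gradient is 2(w - 3), by repeatedly stepping opposite the gradient. The starting point, learning rate, and step count here are arbitrary:

```python
# Minimize L(w) = (w - 3)**2 by stepping opposite its gradient 2*(w - 3).
def grad(w):
    return 2.0 * (w - 3.0)

w = 0.0          # arbitrary starting point
lr = 0.1         # arbitrary learning rate
for _ in range(100):
    w -= lr * grad(w)

print(round(w, 4))  # 3.0 -- converged to the minimum
```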
Nathan Glenn - 09.08.2022 16:36

I don't quite see how the Excel example qualifies as a "deep" neural network, since the layers were not stacked on top of each other but added together. The example is still great, though, and I could see how to stack the layers.

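To make the stacking point above concrete: a "deep" stack feeds one layer's ReLU outputs into a second linear layer, rather than summing parallel ReLUs. A minimal plain-Python sketch with made-up weights:

```python
def relu_vec(v):
    return [max(0.0, x) for x in v]

# One linear layer: each output is a weighted sum of the inputs plus a bias.
def layer(weights, biases, xs):
    return [sum(w * x for w, x in zip(row, xs)) + b
            for row, b in zip(weights, biases)]

x = [1.0, 2.0]
h = relu_vec(layer([[1.0, -1.0], [0.5, 0.5]], [0.0, -1.0], x))  # layer 1
y = layer([[1.0, -3.0]], [0.5], h)                              # layer 2 consumes layer 1's output
print(y)  # [-1.0]
```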
Mr John - 31.07.2022 12:51

New didactic and methodological ideas, and I like them very much. Still a bit rough in execution, but it opens up amazing new territory for approaching neural networks and deep learning. Well done!

Levy Nunes - 29.07.2022 20:49

loved the excelTorch!!

Goutam Garai - 29.07.2022 20:19

Great content.

Santiago - 28.07.2022 04:20

Amazing talk! Thanks, thanks, thanks! You're making the machine learning field so much easier to understand, and that's invaluable.

Egor Asirotiv - 27.07.2022 21:18

Excellent!

Leo - 27.07.2022 15:37

Dear professor, in the learning rate segment, while you were drawing, you mentioned a "theory" that says everything is quadratic from a certain resolution onwards. Could you please share a link to the paper that introduced this idea?

Luke Woods - 25.07.2022 19:43

The quadratic section is a beautifully crafted example. Thanks

AnalyticsRoot - 25.07.2022 11:04

Thanks Jeremy, great tutorial.

Oscar Rangel - 16.05.2022 00:03

Thanks, Jeremy! Great lecture. I never got into NLP before, but now I am understanding it.
