MIT 6.S192 - Lecture 22: Diffusion Probabilistic Models, Jascha Sohl-Dickstein

Nick Ali Jahanian

2 years ago

60,639 views

Comments:

@brendongong7295 - 06.12.2023 00:52

dog bark got me good

@guillemsala9116 - 25.09.2023 18:00

Thank you for this wonderful lecture -- but I almost had a heart attack when the dog started barking!

@tobir693 - 02.07.2023 18:31

"dog barking, that you may or may not be able to hear" Brother, that scared the living crap out of me.

@SnoopyDoofie - 31.05.2023 20:14

So if you use the same seed value, you always end up with the same image. But if you change the seed value, you get a different image. So what exactly is the seed value changing?
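
A minimal sketch of what the seed typically controls (an illustration with assumed shapes, not this lecture's code): it fixes the initial white-noise tensor and any noise injected during sampling, so a deterministic trained model retraces the same denoising trajectory and lands on the same image.

import numpy as np

# Illustrative only: the seed pins down the starting noise x_T (and any
# noise added at intermediate steps); that noise is the only thing that
# varies between runs of a fixed, deterministic model.
rng = np.random.default_rng(seed=42)
x_T = rng.standard_normal((64, 64, 3))  # same seed -> identical starting noise
print(x_T[0, 0])                        # reproducible values every run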

@MuhammetAliDede - 20.05.2023 15:52

This was not a super lecture. He should have prepared a bit more. And the dog is super annoying.

@aliasziken7847 - 03.04.2023 15:37

The dog barking! lol🤣

@6DAMMK9 - 31.03.2023 13:10

Mathematical proof of the sampling process, lol.
This is quite distinct from the previous lectures.
TL;DR: By formulating the latent space (Lecture 1 / 21 / some more) as high-dimensional Gaussian distributions, we can apply "old school" mathematical tools (the proof of nonnegativity of KL divergence via Jensen's inequality, together with maximum likelihood estimation) to convert the unsupervised image / caption clustering problem into a solvable supervised regression problem, simply by "destructing / reconstructing" the data structure over iterations (a Markov process).
The "decision" is iterable, in that you can extrapolate (Lecture 1 / some more) with extra sampling steps to "explore" (Lecture 7 / 16 / 21) the latent space instead of strictly reconstructing a desired image. Moreover, the "latent space" is highly abstract and depends solely on the dataset itself, so data labeling / categorizing matters more than simple preprocessing or cleansing, which coincides with the current problems facing NLP / "GPT"; both are topics in machine learning and data science.

The act of "exploration" would be an act of "creation" (Lecture 14 / 15), even if the machine itself is just solving mathematical equations, providing more uses and possibilities for different domains (Lecture 12), instead of the widely touted "disruptive innovation".
Beyond the highly academic / technical details (Lecture 22 / some others, point-cloud based), we could "reformulate" the situation from different perspectives, putting more focus on "quantitative art analysis" instead of a deterministic task (e.g. drawing via computer vision / graphics), which could be beneficial for people from different domains, instead of surviving in an industrial rat race.

Thank you for the lecture series by providing insights from the professionals and scholars.
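
A minimal numpy sketch of the "destruct" half described above (the beta schedule, shapes, and step count are illustrative assumptions, not the lecture's code): each step of the forward Markov chain mixes the signal a little further toward a standard Gaussian.

import numpy as np

def forward_diffuse(x0, betas, rng):
    """Forward chain x_t = sqrt(1 - beta_t) * x_{t-1} + sqrt(beta_t) * eps."""
    x = x0
    for beta in betas:
        eps = rng.standard_normal(x.shape)        # fresh Gaussian noise each step
        x = np.sqrt(1.0 - beta) * x + np.sqrt(beta) * eps
    return x                                      # approaches N(0, I) for long chains

x0 = np.ones(4)                                   # toy "data" vector
betas = np.linspace(1e-4, 0.02, 1000)             # an illustrative linear schedule
xT = forward_diffuse(x0, betas, np.random.default_rng(0))
print(xT)                                         # roughly standard-normal noise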

@1potdish271 - 28.02.2023 23:33

Where can we find these slides?

@VoltLover00 - 27.02.2023 02:11

Dog barking - what an idiot

@bibiworm - 05.02.2023 02:43

Are the slides available? Thanks!

@huveja9799 - 24.01.2023 23:18

I was almost diffused into tiny atoms when the dog barked unexpectedly ;-) ... great talk, thanks!

@anonymousperson9757 - 19.01.2023 11:33

Amazing lecture! Ngl though, that dog suddenly barking in the background almost gave me a heart attack.

@hansheng654 - 08.01.2023 10:43

I wonder if the dog was there to wake us up.

@xv0047 - 05.12.2022 06:29

Really good explanation.

@ceremonious_houseplant - 16.10.2022 12:16

Good lecture, but the audio needs work. That dog jump scare at 11:10 also doesn't help…

@_XY_ - 05.10.2022 18:01

👏👏

@Vikram-wx4hg - 03.10.2022 10:50

This was a super lecture by Jascha - and it was nice to see things from his perspective! Thanks for sharing.

@ckhalifazada - 28.09.2022 23:24

THAT GODDAMN DOG, MAN!!!!!!!!!!!!!!!!!!!!! NOT COOL!!!!!!!!!!!!!!!

@mustahidahmed4643 - 05.09.2022 14:49

Could anyone explain why the factor in the mean is sqrt(1 - beta) instead of simply (1 - beta) ?
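
A short worked step that may answer this (standard variance bookkeeping, not taken from the video): writing the update as x_t = sqrt(1 - beta) * x_{t-1} + sqrt(beta) * eps with eps ~ N(0, I) independent of x_{t-1},

\operatorname{Var}(x_t) = (1-\beta)\operatorname{Var}(x_{t-1}) + \beta I,

so if Var(x_{t-1}) = I then Var(x_t) = (1 - beta) I + beta I = I: the square root is exactly what keeps the variance fixed. A plain (1 - beta) factor would instead give (1 - beta)^2 I + beta I < I, and the chain would shrink rather than stay variance-preserving.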

@fredxu9826 - 31.08.2022 06:15

The dog decreases the entropy in the room for sure!

@anmolgupta6870 - 28.08.2022 21:59

Just a small question:
I couldn't understand the decay towards the origin controlled by beta, and how the chain finally ends up at a distribution with mean 0 and variance I.
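
A hedged worked example of that decay (standard algebra under the usual Gaussian forward process, not from the slides): unrolling x_t = sqrt(1 - beta_t) * x_{t-1} + sqrt(beta_t) * eps_t gives

\mathbb{E}[x_t \mid x_0] = x_0 \prod_{s=1}^{t} \sqrt{1-\beta_s} \longrightarrow 0,
\qquad
\operatorname{Var}(x_t \mid x_0) = \Bigl(1 - \prod_{s=1}^{t} (1-\beta_s)\Bigr) I \longrightarrow I,

and since each sqrt(1 - beta_s) < 1, the product vanishes as t grows, so every starting point decays to the same N(0, I) distribution.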

@onmountdoom - 23.08.2022 11:27

Jesus that dog scared the living crap out of me

@bingbingsun6304 - 09.08.2022 19:29

Diffusion models: I think the idea is more impressive than GANs, VAEs, and the rest. What does everyone think?

@bingbingsun6304 - 09.08.2022 18:16

Diffusion models are a work of genius, a perfect marriage of machine learning and physics.

@qrubmeeaz - 15.07.2022 14:39

I must admit this is hurting my brain. Pls. correct me if I'm wrong: It looks like they are solving an initial value problem for the backward heat equation, where initial data is white noise (no information), and somehow they are "evolving" it into an information rich image. Whaaaaa?!? How?
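
Roughly, yes: the intractable reverse of the noising chain is approximated by a learned model, and sampling runs that reversal step by step, starting from white noise. A minimal sketch of such a reverse (DDPM-style ancestral) loop, with a placeholder eps_model standing in for the trained noise-prediction network:

import numpy as np

def reverse_sample(eps_model, betas, shape, rng):
    x = rng.standard_normal(shape)            # start from pure white noise
    alphas = 1.0 - betas
    alpha_bars = np.cumprod(alphas)
    for t in reversed(range(len(betas))):
        eps_hat = eps_model(x, t)             # model's guess of the injected noise
        mean = (x - betas[t] / np.sqrt(1.0 - alpha_bars[t]) * eps_hat) / np.sqrt(alphas[t])
        noise = rng.standard_normal(shape) if t > 0 else 0.0
        x = mean + np.sqrt(betas[t]) * noise  # one stochastic denoising step
    return x

# Toy usage with a dummy model that always predicts zero noise:
betas = np.linspace(1e-4, 0.02, 50)
sample = reverse_sample(lambda x, t: np.zeros_like(x), betas, (4,), np.random.default_rng(0))
print(sample)

With a real trained eps_model, each step strips away a little of the predicted noise, which is how structure accumulates out of an initially information-free input.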

@fernandagontijo6720 - 08.07.2022 20:17

Perfect explanation!!! Thanks for sharing your knowledge!!!

@tho207 - 30.06.2022 16:05

thank you for the lecture! just what I needed, really enjoyed throughout! 🙏

@homataha5626 - 27.06.2022 13:53

I jumped out of my seat when I heard the dog!

@prabhavkaula9697 - 26.06.2022 12:21

Thank you so much for uploading the lecture, authentic sources explaining diffusion models are rare.

@WillTesler - 16.06.2022 23:44

Tldr

@InquilineKea - 27.05.2022 20:23

GOD, does it take a higher IQ to understand diffusion processes than it does to understand VAEs?

What percent of THIEL FELLOWS are smart enough to understand THIS? Or ISIR attendees?
