Comments:
dog bark got me good
Thank you for this wonderful lecture -- but I almost had a heart attack when the dog started barking!
"dog barking, that you may or may not be able to hear" Brother, that scared the living crap out of me.
So if you use the same seed value, you always end up with the same image. But if you change the seed value, you get a different image. So what exactly is the seed value changing?
This was not a super lecture. He should have prepared a bit more. And the dog is super annoying.
The dog barking! lol🤣
Mathematical proof on the sampling process lol
This is quite distinct from the previous lectures.
TLDR: By formulating the latent space (Lecture 1 / 21 / some more) as high-dimensional Gaussian distributions, we can apply "old school" mathematical tools (the proof of non-negativity of KL divergence via Jensen's inequality, together with maximum likelihood estimation) to convert the unsupervised image / caption clustering problem into a solvable supervised regression problem, by simply "destructing / reconstructing" the data structure in iterations (a Markov process).
The process is iterable, as you can extrapolate (lecture 1 / some more) with extra sampling steps to "explore" (lecture 7 / 16 / 21) the latent space instead of strictly reconstructing a desired image. Moreover, the "latent space" is highly abstract and depends solely on the dataset itself, so data labeling / categorizing matters more than simple preprocessing or cleansing, which coincides with the current problems facing NLP / "GPT"; both are topics in Machine Learning and Data Science.
The act of "exploration" would be an act of "creation" (lecture 14 / 15), even if the machine itself is just solving mathematical equations, providing more uses and possibilities for different domains (lecture 12), instead of the widely touted "disruptive innovation".
Besides the highly academic / technical details (lecture 22 / some others, point-cloud based), we could "reformulate" the situation from different perspectives, putting more focus on "quantitative art analysis" instead of a deterministic task (e.g. drawing via computer vision / graphics), which could be beneficial for people from different domains, instead of surviving an industrial rat race.
Thank you for the lecture series and for providing insights from professionals and scholars.
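(Reply) The "destruct" half of that destruct / reconstruct loop is easy to see numerically. Below is a minimal sketch (my own toy code, not anything from the lecture) of the forward noising step x_t = sqrt(1 - beta) * x_{t-1} + sqrt(beta) * eps with eps ~ N(0, I). The sqrt(1 - beta) factor is what makes the variance stable: if Var(x_{t-1}) = 1, then Var(x_t) = (1 - beta) * 1 + beta = 1, so iterating the Markov step drives any starting distribution toward N(0, I) instead of blowing up or collapsing:

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 0.02                                   # per-step noise schedule (constant here for simplicity)
x = rng.uniform(-1.0, 1.0, size=100_000)      # toy "data" distribution (mean 0, variance 1/3)

for _ in range(1000):                         # iterate the Markov noising step
    eps = rng.standard_normal(x.shape)
    x = np.sqrt(1.0 - beta) * x + np.sqrt(beta) * eps

# The samples end up approximately standard normal: mean near 0, std near 1.
print(x.mean(), x.std())
```

This also answers why the mean factor is sqrt(1 - beta) rather than (1 - beta): the variances, not the standard deviations, must add up to 1.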
Where can we find the slides?
Dog barking - what an idiot
Are the slides available? Thanks!
I was almost diffused into tiny atoms when the dog barked unexpectedly ;-) ... great talk, thanks!
Amazing lecture! Ngl, that dog suddenly barking in the background almost gave me a heart attack.
I wonder if the dog was there to wake us up.
Really good explanation.
Good lecture but the audio needs work. That dog jump scare at 11:10 also doesn’t help…
👏👏
This was a super lecture by Jascha - and it was nice to see things from his perspective! Thanks for sharing.
THAT GODDAMN DOG, MAN!!! NOT COOL!!!
Could anyone explain why the factor in the mean is sqrt(1 - beta) instead of simply (1 - beta)?
The dog decreases the entropy in the room for sure!
ОтветитьJust a small question-
I couldn't understand the decay towards origin thing using Beta and how it finally ends with getting a distribution with mean 0 and variance I?
Jesus that dog scared the living crap out of me
Diffusion models - I think the idea is more impressive than GANs, VAEs, and the rest. What does everyone think?
Diffusion models are a work of genius, a perfect marriage of machine learning and physics.
I must admit this is hurting my brain. Please correct me if I'm wrong: it looks like they are solving an initial value problem for the backward heat equation, where the initial data is white noise (no information), and somehow they are "evolving" it into an information-rich image. Whaaaaa?!? How?
Perfect explanation!!! Thanks for sharing your knowledge!!!
Thank you for the lecture! Just what I needed, really enjoyed it throughout! 🙏
I jumped out of my seat when I heard the dog!
Thank you so much for uploading the lecture, authentic sources explaining diffusion models are rare.
Tldr
GOD, does it take a higher IQ to understand diffusion processes than it does to understand VAEs?
What percent of THIEL FELLOWS are smart enough to understand THIS? Or ISIR attendees?