Building the Gradient Descent Algorithm in 15 Minutes | Coding Challenge

Nicholas Renotte

1 year ago

60,264 views


Comments:

Ultimate Saksham - 30.08.2023 16:13

how amazing it is that he set timer for 15 mins and the vid is 22 mins long

Victor Giustini Perez - 06.07.2023 12:07

Really nice video! Love the energy and the enthusiasm. Thanks for the help!

spicytuna08 - 19.05.2023 18:05

Wow, you make the subject come alive with excitement and simplicity. You are really gifted. I'd take you over hard-to-understand but smart Ivy League PhD professors any day.

Felix TheCat - 17.04.2023 23:48

Was too fast for me

Akumlong Longkumer - 15.04.2023 10:27

Pretty impressive. This is awesome. Cheers

Darshit Goyani - 12.04.2023 09:17

Lots of Thanks, Nick :)

Alexis Julián Rojas Huamaní - 09.04.2023 19:35

U R GOD MAN, so much thanks

lvjian lvj - 08.04.2023 16:10

I really like this video. It is great!

William Stephen Jones - 05.04.2023 17:46

This is a very novel and cool way to teach coding. I really enjoyed it, and it was good to see you troubleshoot and get stuff wrong.

robin vermillion - 18.03.2023 10:36

where is it used? why?

Dipendra Thakuri - 15.03.2023 05:11

I think you missed dividing the derivative by 2. In the formula for the cost function we have (1/(2 * no. of training examples)) * sum of squared errors, so when we take the derivative, the 2 from dldw and the 1/2 from the cost function cancel each other. Anyway, it was a cool video, keep up the good work brother
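The cancellation this comment describes can be checked numerically. A minimal sketch, assuming a simple one-feature model w*x + b and the 1/(2m) cost convention (the variable names here are illustrative, not taken from the video):

```python
import numpy as np

# Toy data and parameters (illustrative values only)
x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])
w, b = 0.5, 0.0
m = len(x)

# With the cost J = (1/(2m)) * sum((w*x + b - y)**2), the 2 from the
# power rule cancels the 1/2, so dJ/dw = (1/m) * sum((w*x + b - y) * x)
pred = w * x + b
dldw = (1.0 / m) * np.sum((pred - y) * x)

# Numerical check of that analytic gradient via central differences
def J(w_):
    return (1.0 / (2 * m)) * np.sum((w_ * x + b - y) ** 2)

eps = 1e-6
numeric = (J(w + eps) - J(w - eps)) / (2 * eps)
print(dldw, numeric)  # the two values should agree closely
```

If the cost is written without the 1/2 (plain mean squared error), the gradient simply carries an extra factor of 2, which in practice just rescales the learning rate.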

WRSNx - 18.02.2023 00:33

You should create a model to reduce the pressure during the last minutes, i.e. find an optimal time tolerance (15 ± b) 😂😂. 😢 But we need more videos like this to build a good dataset 😂🎉. Thanks man

D. V. - 13.02.2023 15:05

I've been following your channel for a while now and I always find new cool stuff here. Keep up the good work, it's really helpful. Also, I love your positive personality, you really make complex stuff look entertaining.

Jake Kisiel - 10.02.2023 00:05

Is there any other machine learning/NVIDIA Jetson video tutorials you would recommend?

विशाल कुमार - 09.02.2023 08:38

why is it necessary for x and y to be list of lists ?

Bir genç - 04.01.2023 14:06

Love it!

Дима Дмитрий - 03.01.2023 12:38

👍👍👍

Pedrommelos - 27.12.2022 11:30

hey man! I have a friend from Lyon and you guys have the same surname, haha
Any chance you have roots from there?

Ibrahim - 14.12.2022 04:11

ChatGPT won this challenge instantaneously lol:

import numpy as np

# Set the learning rate
learning_rate = 0.01

# Set the number of iterations
num_iterations = 1000

# Define the data points
X = np.array([[0, 1], [1, 0], [1, 1], [0, 0]])
y = np.array([1, 1, 0, 0])

# Initialize the weights
weights = np.zeros(X.shape[1])

# Train the model
for i in range(num_iterations):
    # Compute the predicted values
    y_pred = 1 / (1 + np.exp(-1 * np.dot(X, weights)))

    # Compute the error
    error = y - y_pred

    # Update the weights
    weights += learning_rate * np.dot(X.T, error)

# Print the weights
print("Weights:", weights)


A.I. description of the code: "This script defines a simple dataset with four data points and trains a model using the gradient descent algorithm to learn the weights that minimize the error between the predicted values and the true values. The model uses a sigmoid activation function to make predictions.

The script initializes the weights to zeros, and then iteratively updates the weights using the gradient descent algorithm, computing the predicted values, the error, and the gradient of the error with respect to the weights. The learning rate determines the size of the step taken in each iteration.

After training the model, the final weights are printed out. You can use these weights to make predictions on new data points by computing the dot product of the data points and the weights, and applying the sigmoid function."
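The prediction step described in that last paragraph can be sketched like this (the weights below are illustrative example values, not the ones the training loop would actually produce):

```python
import numpy as np

def predict(X, weights):
    # Sigmoid of the dot product, as the description above says
    return 1 / (1 + np.exp(-np.dot(X, weights)))

weights = np.array([0.5, -0.5])        # example weights, not trained ones
X_new = np.array([[1, 0], [0, 1]])     # two new data points
probs = predict(X_new, weights)
print(probs)
```

Each entry of `probs` is a value in (0, 1) that can be thresholded at 0.5 to get a class label.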

HARSHAD - 17.11.2022 05:55

I can do this more efficiently

20MSCAIML40 R.B.RITHANYA - 07.11.2022 16:38

oh god! you forgot to save and i involuntarily kept shouting SAVE IT! SAVE IT!

cavaliere oscuro - 29.10.2022 10:45

the essence of Deep learning in a few lines of code... awesome

Meguellati Younes - 28.10.2022 02:52

I wonder how long the backpropagation algorithm would take?

RRR Family ( Rash Ride Rockers) - 12.10.2022 14:05

so can you please do this algorithm for multiple variables

Felicia - 07.10.2022 10:34

Amazing video!! Thank you so much

Yu Chen - 28.09.2022 14:12

Where's my $50 gift card? Lol

Hari Nair - 25.09.2022 13:50

Thanks for the video, subscribed! A suggestion: this small change to your code would demonstrate a real-world gradient descent solution for linear regression with noisy data, e.g.:
x = np.random.randn(20,1)
noise = np.random.randn(20,1)/10
# w = 5.8, b = -231.9
y = 5.8*x - 231.9 + noise
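Completing that suggestion, here is a sketch of fitting the noisy data with plain gradient descent; the learning rate, iteration count, and fixed seed are assumptions, not part of the original comment:

```python
import numpy as np

np.random.seed(0)                      # fixed seed so the run is repeatable
x = np.random.randn(20, 1)
noise = np.random.randn(20, 1) / 10
y = 5.8 * x - 231.9 + noise            # true w = 5.8, b = -231.9

w, b = 0.0, 0.0
lr = 0.1                               # assumed learning rate
for _ in range(2000):
    pred = w * x + b
    # Gradients of the mean squared error with respect to w and b
    dldw = 2 * np.mean((pred - y) * x)
    dldb = 2 * np.mean(pred - y)
    w -= lr * dldw
    b -= lr * dldb

print(w, b)  # w, b should end up near 5.8 and -231.9
```

Because the noise is small, the recovered parameters land close to the true ones, which makes the convergence easy to eyeball.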

Luis M - 10.09.2022 05:21

Great video, I like this kind of video where you code some AI task against the clock; you teach us the concepts and show us the reality of implementing it 👏
Well explained 😄👍

Adi purnomo - 07.09.2022 10:06

Bro, how to implement gradient descent as weight in K nearest neighbor ?

Adi purnomo - 07.09.2022 10:03

Nice implementation bro

Msa720 - 01.09.2022 16:00

Please do a video building a NN from scratch!!

Terrence Jefferson Cimafranca - 24.08.2022 16:31

Can you explain the notears algorithm? It would be a great help.

Maksym Dmytruk - 21.08.2022 11:56

Thanks, waiting for part 5, forza!

carlos vasquez - 20.08.2022 15:56

Great video. Set time to 20 mins.

Tech updates - 19.08.2022 04:23

Nick, I thought there are existing algorithms that you can feed your data into? I love the way you're doing it, but is it better to build it your way or to use existing ones?

Leonard Püttmann - 19.08.2022 00:43

This was oddly intense. Great job Nicholas! Even though you ran out of time, this video is still a win to me. 😉

Sergio Quijano - 18.08.2022 20:38

You are so good at explaining these complicated concepts. Also, if you want to close the explore tab in VSCode try: Ctrl + b

javierjdaza - 18.08.2022 20:08

I need to say this: you are the game changer here!!
As a data scientist with 2+ years of experience, I ALWAYS learn something new from your content! Please Nich, never stop doing these things, and never lose that smile on your face, even when you're hitting bugs!!
Thanks for everything

Julian Steden - 18.08.2022 20:03

Great Video!
Would be cool to come back to this and add visualization during gradient descent using matplotlib to show what is actually happening.
For example, drawing the data points, the regression line, the individual loss between the line and each data point, and showing stats like the current step, w, b, and total loss! :)

MrElectrecity - 18.08.2022 18:39

Please check the Auto Save option in the File drop-down menu, it's a real time saver 😃
I need to watch the video many times to understand what you are doing
But great work
I love everything you do
Thumbs up 👍👍

Asoo Sroo - 18.08.2022 18:32

You can contact us on telegram

Miguel Fernandes - 18.08.2022 17:38

Awesome video!! It's pretty cool to see such theoretical concepts coded and explained like this. Keep going, Nich!!

Patrick M. - 18.08.2022 16:41

Are you reading my mind or something? Every time I'm stuck on a topic, you drop a video about it...

Quadrophenia Guy - 18.08.2022 15:28

Could you please upload the correct code to GitHub? I lost track of your logic after "def descend()".
