Comments:
TensorFlow has issues with M1/M2 MacBooks.
What about the M1 now, in 2023?
Thanks a lot for making such a helpful video, man.
Hi Jesper, and thank you for your video, informative as usual. I'd like to ask what you think of a laptop with a Ryzen 7 5825U and no GPU, but with the intention of connecting a "prosthetic" desktop GPU in the future through a PCIe connector; I'm talking about something like the EXP GDC "THE BEAST". Or do you think it's easier and better to just use an external GPU through Thunderbolt 4?
Hi. Given that I will do the learning in the cloud, but it may not always be available while travelling, what minimum configuration would you recommend? I am a beginner and a student. I travel, so I also have to carry a work laptop; portability is a concern. I thought of a MacBook Air with M1, but the RAM could be a concern. Can you suggest a minimum RAM, processor, GPU (if needed), and so on?
I understand your point, but I don't fully agree with your sentence when you say (in my words) "a CPU with just a load of RAM will be enough". I'll explain why:
You are right that we have to prioritize RAM, but the CPU is important too. Try training a model with the Weka workbench (Java-based) on your laptop or desktop computer: a fast CPU will help.
Students will do deep learning and not necessarily limit themselves to machine learning with scikit-learn or whatever framework. So:
a) Have a lot of RAM, yes, but with a very good CPU too. When working with ML models, you are probably working on an application that requires many components, not all of which are ML-based. You could design a NodeJS-driven UI that interacts with some back end that you also develop on your computer and that serves the model.
To do this in an efficient and organised way, you will end up with containers, and that is why you'll need CPU and RAM (even though containers are lightweight).
b) Because of (a), you will probably start diving into both DevOps techniques and the MLOps paradigm. Both require automation, which will also consume CPU, especially if you work on a C++ or Java application that must be built.
c) Because of (a) and (b), your computer will start to gain some load just from running all these things.
d) Though an NVIDIA RTX is quite expensive, it can help you a lot with deep learning tasks and allows TensorFlow to use the onboard GPU. There you will face interesting issues: during training you'll probably hit the VRAM limits and will have to work hard, but you will learn how to get a really good neural network architecture running on your machine.
e) You talk about using the cloud. I agree partly: that is only for experienced people. Others will have a hard time making it work (I am not talking about Google Colab or other fantasy stuff).
Therefore you will travel from (a) to (d) on your local machine.
Personally, I follow you and agree with what you say about using open systems and not using Macs. Two years ago I bought a Linux laptop with an onboard NVIDIA RTX. Because of the budget I could only afford an RTX 3060, but I could get a very good Intel i7 with 16 vCPUs and 32 GB of RAM, all for less than €1800 with a wide 17" screen. But that was two years ago; today I would go for a more robust RTX card and put in 64 GB or 128 GB of RAM directly.
The one other thing I would recommend is the battery: choose a good one, and choose a laptop with spare batteries. Also, because you will work with Docker containers and perhaps have many versions of Python virtual environments, think about disk space: today I recommend a MINIMUM of 2 TB of SSD. If you can afford more, so much the better.
Then yes, using a cloud solution is also elegant, but you'll still need to consider an efficient laptop because of (a) to (e).
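On point (d), the VRAM ceiling the commenter mentions can be sanity-checked before training ever starts. This is a rough back-of-the-envelope sketch, not a framework API: the per-parameter byte counts assume fp32 weights and an Adam-style optimizer, and the activation allowance is my own crude assumption (real usage depends heavily on batch size and architecture).

```python
# Rough rule-of-thumb estimate of training VRAM for a dense model.
# Assumptions (mine, not the commenter's): fp32 weights and gradients,
# Adam optimizer (two extra fp32 buffers per parameter), and a crude
# 1x-parameters allowance for activations.

def estimate_training_vram_gb(n_params: float) -> float:
    bytes_per_param = (
        4      # weights (fp32)
        + 4    # gradients (fp32)
        + 8    # Adam moment estimates (two fp32 buffers)
    )
    activation_bytes = 4 * n_params  # very rough activation allowance
    total_bytes = n_params * bytes_per_param + activation_bytes
    return total_bytes / 1024**3

# A 1-billion-parameter model needs on the order of:
print(f"{estimate_training_vram_gb(1e9):.1f} GB")  # → 18.6 GB
```

Even with these optimistic assumptions, a 1B-parameter model blows past the 8–24 GB found on consumer RTX cards without tricks like mixed precision or gradient checkpointing — which is exactly the "work hard but learn" the commenter describes.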
Can you suggest a Dell or Asus laptop for data science and AI?
I do not suggest a MacBook for ML. I have a fully specced-out M1 Max, and even with big ML frameworks like TensorFlow starting to support ARM, that's not the main issue. The main issue is the package manager. Your computer will be your "entire" development environment, and developers know the ass-pain of dependency hell. The Apple package manager is absolute garbage. You may be able to use a framework updated for M1, but that doesn't mean you'll be able to use the code everyone else is producing with other frameworks and libraries, or, more importantly, a prior version of PyTorch or TensorFlow. Get a PC and install a Linux distro with a good package manager.
How about a Mac mini for neural networks and ML?
Great advice. I've been looking for a second machine for my deep learning research. Now I will switch my strategy from a local machine to the cloud. Thanks.
Thank you for this. Is there a need to offer an update to this video given that it is now two years later?
This was the first video of yours I ever watched, and when I started I thought, naaah, a new MacBook Pro could surely be fine for training models. I can't tell you how wrong I was. The hype is very different from reality, and you are 100% correct. I have had to embed so many special cases into my training pipeline to support MPS (Metal), and even then support for torchvision is still incomplete in v2. I ended up going for an RTX 4090 in a separate headless Linux server, and it reduced training time in my use cases by an order of magnitude.
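The "special cases for MPS" this commenter mentions usually boil down to a device-selection fallback like the sketch below. The helper name is mine; the `torch.backends.mps` check is the standard PyTorch API (available since torch 1.12), and on a CUDA-less, non-Apple machine this simply falls through to the CPU.

```python
# Minimal device-selection fallback: prefer CUDA, then Apple's MPS
# backend, then CPU. Every tensor and module must then be moved to
# the chosen device explicitly -- forgetting one .to(device) call is
# the classic source of the special cases mentioned above.
import torch

def pick_device() -> torch.device:
    if torch.cuda.is_available():
        return torch.device("cuda")
    if torch.backends.mps.is_available():
        return torch.device("mps")
    return torch.device("cpu")

device = pick_device()
x = torch.randn(4, 3).to(device)  # inputs must follow the device
print(device)
```

Note the fallback only papers over availability, not completeness: an op missing from the MPS backend still fails at runtime, which is why people end up with per-op workarounds on Macs.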
Good price/performance Macs are only the base models; if you upgrade just one thing, it's going to cost you a fortune.
Very good, thanks.
You're right, my laptop is really aerodynamic... I find myself playing frisbee with it all the time.
You are so very correct. Especially for newer AI developers, long training times are not the norm. We use everything from RTX cards through H100s for most of our AI development, at least on the training side. However, for coding, data science work, inference, and UI/UX, we all use our favorite OS, whichever that is. One thing to keep in mind for pro-level, large-parameter/large-dataset AI dev: you will often be using a dedicated server running in the kilowatts with AI-grade TPUs/GPUs (e.g. V100s, H100s, etc.). Whether owned, hosted, or otherwise, few jobs will be run locally.
That was really helpful. Many thanks.
Nowadays, with LLaMA models, we might be able to run some models on our laptops.
I got myself an M1 Air a couple of months ago. One thing I dislike is that TensorFlow has multiple issues with Mac. It's better to learn about scaling and deploying first, because clouds are always available, rather than throwing a large amount of money at hardware. As for whether it's worth it when you're very advanced in the field, I'll update when I get there 😂
Side note: I have a 3070, but I've realised model design and preprocessing play more of a part in ML.
Laptops are aerodynamic? Did anyone else hear that? Also, he threw his laptop!
Very helpful! Thank you so much.
I'm starting machine learning with a neural network for a school project, heh 😅. First I started with Google Colab, because there are too many things to install/set up and I wanted to skip that tedious step. But... I exceeded the limits of Colab, so I decided to test the model on my laptop (I'm kind of stingy). I looked into how to use the GPU in the environment created for this task (again, I'm starting in ML and only know a bit of Python) and discovered more packages/libraries to install, plus dealing with compatibility between Windows, my Nvidia card, ... 🤯 (I'm a regular user). It's too overwhelming. I even considered partitioning the disk for Linux or using a VM, but I need some programs that don't run on Linux. Just too much. I'm so grateful for this video; it has been my lifeline in a moment of despair 🥹🥹🥹
Dude, I don't know, is this enough for me as a beginner: RTX 3060, 32 GB RAM, i7 12th gen, 512 GB SSD (HP Victus 16)? Is it OK for me? What do you think?
This video is gonna blow up because of ChatGPT.
That high-pitched sound you heard most likely came from capacitors.
Totally disagree... cloud GPUs are super expensive. An AWS GPU at $3 per hour × 24 hrs × 30 days = $2160/month.
Could you help me: are ERAZER MEDION laptops good for neural networks?
Agree with many things here, great video! However, while using cloud GPUs looks cheap at first sight, letting a model train for days on cloud GPUs can cost far more than your electricity bill would, running into the hundreds (with a sizeable model), and should be factored into the whole calculation. Cloud GPUs range from €0.2/hr (single 3090) up to €4/hr (multi-A100), so a discrete GPU might pay for itself in less than a year, depending on your project.
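The break-even point the comment alludes to is easy to work out. The hourly rates below come from the comment itself; the card price and the per-hour electricity cost are my own illustrative assumptions.

```python
# Break-even estimate: after how many training hours does buying a GPU
# beat renting one? Cloud rates from the comment above; the card price
# (~1500 EUR) and electricity cost are illustrative assumptions.

def breakeven_hours(card_price_eur: float,
                    cloud_rate_eur_per_hr: float,
                    electricity_eur_per_hr: float = 0.10) -> float:
    saving_per_hour = cloud_rate_eur_per_hr - electricity_eur_per_hr
    return card_price_eur / saving_per_hour

# A ~1500 EUR card vs a 0.50 EUR/hr cloud single-GPU instance:
hours = breakeven_hours(1500, 0.50)
print(round(hours))  # → 3750, i.e. ~10 hrs/day for roughly a year
```

Against a €4/hr multi-A100 rate the same card would break even in under 400 hours, which is why the answer depends so strongly on how much and what kind of training you actually do.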
Great video, Jesper. Can I ask you: which laptop would be better, one with an i7-1260P, 64 GB RAM and Intel Iris Xe graphics, or one with an i7-12700H, 32 GB RAM, Intel Iris Xe, plus an RTX 3070 8 GB GPU? The second costs a little more, about €250. Thanks for your help.
CUDA
What should I get for my graduation project? My teacher wants me to use the techniques below:
-Logistic Regression
-Support Vector Machine
-Random Forest
-Decision Trees
Any market recommendations? I was told the Lenovo Yoga is a great buy; I am new to this topic. What do I need in terms of memory etc.?
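For context on the hardware question: all four of those techniques are classical machine learning, not deep learning, and run comfortably on any modern laptop CPU. A minimal scikit-learn sketch (the synthetic dataset and settings are my own, purely for illustration):

```python
# The four techniques from the list above, on a small synthetic
# dataset. None of these needs a GPU -- they train in seconds on a CPU,
# so RAM and a decent CPU matter far more than graphics hardware here.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "Support Vector Machine": SVC(),
    "Random Forest": RandomForestClassifier(random_state=0),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
}
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
for name, acc in scores.items():
    print(f"{name}: {acc:.2f}")
```

So for this project, almost any current laptop with 8–16 GB of RAM would do; a discrete GPU would sit idle.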
I already own an Acer Nitro 5 with a mobile RTX 3070 + Ryzen 7 5800H. Still watched your full video :). My laptop can train 90% of model types after I cranked the virtual memory up to 80 GB (from 16 GB of RAM 😂). I'm very satisfied with my $1500 laptop.
You broke a laptop? You could give it to me!
How about running MindSpore on Macs and on laptops with Nvidia?
Loved this video, Jesper. Thank you!!! I'm happy I came across your channel.
Great video, thank you!
You're a prophet.
Laptops are aerodynamic 🤣🤣🤣🤣🤣🤣...
I'm preparing to train on two laptops: a Dell Precision 7550 with Quadro T1000 dedicated graphics, 64 GB RAM and Windows, or a Dell Precision 7520 with 16 GB RAM and a Quadro M1200 (it's not even supported by RAPIDS, so we'll see). Maybe this year I will buy a PC with an Nvidia 3070 or another new card, depending on what the market offers (we'll see how Intel Arc behaves with AI).
RAM is the key. At least, from my experience in ML.
Hello, I really loved your video. I have a question: I am a computer science master's student taking courses like machine learning, deep learning, and artificial intelligence. Do you recommend the MacBook Air M1, or should I go for a Pro or Max? I need it for my studies.
Thank you very much for your help.
Nice video!
This is a useful video, but having tried tensorflow-metal on macOS, it's just not ready (in 2022). In the particular piece of work I'm doing, the RNN training comes to a screeching halt after several hours; the issue has been replicated by Apple developers, but they have yet to offer a fix, and in the meantime I cannot do any development on my Mac. Because I have to run the model, I went for a CUDA-compatible gaming laptop; in 2022 I was able to pick up a laptop with 4 GB of video RAM for less than £800. That's a lot, but compared to running ML/AI GPUs with cloud providers it's a better and cheaper development environment (an EC2 P3 instance for Horovod costs £3.59 an hour in eu-west-2). I've only just found your channel and will subscribe, but I think this video could be updated for 2022, as there are some good gaming laptops with Nvidia chipsets that are fully CUDA compatible and let TensorFlow run natively.
Hello @Jesper, some Python libraries use Intel MKL and other Intel-specific optimizations. How much does this affect performance in your experience, compared to AMD? Should we just avoid AMD CPUs for ML work?
Do the new AMD GPUs work with deep learning?
Thank you so much for your exposition. I just got into machine learning at the start of the year (so 6 to 7 months at the time of writing). I have a gaming laptop with 6 GB of VRAM, but I find that it's not the GPU that gets utilized when I'm training ANNs. So I've been considering the M1 Macs because of the built-in Neural Engine. Could you make a video detailing them, just as you did for the GPU?
I've found that if you're just interested in accelerating inference, you can get away with murder in the hardware department.
So, what is your suggestion for a person who is planning to get a laptop for ML/DL workloads?