Serverless Doesn't Make Sense

Ben Awad

3 years ago

364,825 views

Comments:

Kahn - 13.11.2023 09:18

AWS is a great way (or a gateway) for you to get Jeff'ed.

Ana Skywalker - 30.09.2023 05:38

Fast forward to 2023: the Prime Video team agrees with Ben.

Jp Singh - 26.09.2023 17:35

2023: Amazon just confirmed you were right.

Piyush Gupta - 22.09.2023 14:23

This guy thinks Kubernetes doesn't run on servers.

mangotrails - 18.09.2023 19:05

Maybe serverless just doesn't make sense for your use case?

Obz - 14.09.2023 20:26

I like AWS Lambda and Google Firebase; both are great, honestly.

Enzo Coelho Albornoz - 01.09.2023 17:57

Wait 500ms for a single request to warm your AWS function (which can be a timed event) OR pay for an entire K8s cluster and destroy your wallet.

I'll probably stick with the first option.
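
A minimal sketch of what that timed warm-up could look like on the handler side (assumptions: a scheduled EventBridge rule invokes the function with a custom { "warmup": true } payload; the field name and the handleRequest helper are hypothetical):

```typescript
// Hypothetical warm-up pattern: a scheduled rule pings the function so an
// execution environment stays warm; real requests fall through to the handler.
export const handler = async (event: any) => {
  if (event && event.warmup) {
    // Scheduled ping: do no real work, just keep the environment alive.
    return { statusCode: 200, body: "warm" };
  }
  return handleRequest(event);
};

// Placeholder standing in for the actual business logic.
async function handleRequest(event: any) {
  return { statusCode: 200, body: "ok" };
}
```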

Daniel Gruszczyk - 31.08.2023 15:50

You are focusing too much on synchronous API calls. If you are supporting your API with serverless functions then yes - cold starts suck. If, on the other hand, you are using serverless for async operations in event-driven systems, where response times do not exist - then cold-starts do not matter that much.
Try this example - API running on a kubernetes cluster (or ECS, whatever you like), at the point of receiving requests it publishes an event to SNS, you have SQS subscribed to this and Lambda triggering off that event, processing some data asynchronously.
Or another one: API call -> kubernetes -> something saved into DynamoDB -> Lambda triggered from DynamoDB stream.
Real-world example: a user buys something in an online store; you do not need to generate an invoice "in real time" before showing them the purchase summary. A Lambda asynchronously triggers off that event, generates the invoice, and that gets emailed.
THIS IS where serverless TRULY SHINES.
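
For illustration, the invoice example above could boil down to something like this sketch (assumptions: the Lambda is subscribed to an SQS queue as described, the @types/aws-lambda package is available, and generateInvoice / emailInvoice are hypothetical helpers):

```typescript
import type { SQSEvent } from "aws-lambda"; // assumes @types/aws-lambda is installed

// Triggered asynchronously off the queue; no user is waiting on the response,
// so a cold start only delays the email by a moment.
export const handler = async (event: SQSEvent): Promise<void> => {
  for (const record of event.Records) {
    const order = JSON.parse(record.body); // order details published after checkout
    const invoice = await generateInvoice(order); // hypothetical helper
    await emailInvoice(order.customerEmail, invoice); // hypothetical helper
  }
};

async function generateInvoice(order: unknown): Promise<Buffer> {
  return Buffer.from("..."); // placeholder for real invoice generation
}

async function emailInvoice(to: string, invoice: Buffer): Promise<void> {
  // placeholder for a real email service call
}
```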

Salaheddin AbuEin - 27.08.2023 00:03

Thank you.

Pete - 23.08.2023 17:32

They sounded like a stupid idea the day they first appeared... because it IS a really stupid idea.

עומר פריאל - 06.08.2023 21:36

In the past, I would have agreed with you on the two things you said in the first sentence.
Today, I understand that there is sometimes an appropriate use for serverless - for example, for models or calculations. In my opinion, less so for routine things like an image-resizing service for customers.

curtisw0234 - 13.07.2023 06:31

Serverless requires you to trust the provider to not fuck you over

Shubhra P - 11.07.2023 18:14

I'm confused by headless ... girls

Daniel Van Niekerk - 28.06.2023 16:06

Would love your opinion on Google's Cloud Run, which can handle multiple requests per instance. You can also all but eliminate cold starts by handling the SIGTERM Cloud Run sends just before it terminates an instance and quickly having the service call itself, thus provisioning another instance to be in a ready state, while only paying when it is processing requests (assuming you allow CPU throttling when no requests are being processed).
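
If I'm reading the trick right, the shutdown side might look roughly like this sketch (heavily hedged: SELF_URL is a hypothetical environment variable pointing at the service's own HTTPS endpoint, and whether this reliably avoids cold starts is the commenter's claim, not something verified here):

```typescript
import * as http from "http";
import * as https from "https";

// Cloud Run passes the port to listen on via the PORT env var (8080 by default).
const port = Number(process.env.PORT ?? 8080);

const server = http.createServer((req, res) => {
  res.writeHead(200, { "Content-Type": "text/plain" });
  res.end("ok\n");
});
server.listen(port);

// Cloud Run sends SIGTERM shortly before terminating an instance. The idea in
// the comment: use that moment to call the service's own URL so the platform
// provisions a fresh instance while this one drains.
process.on("SIGTERM", () => {
  const selfUrl = process.env.SELF_URL; // hypothetical: the service's public URL
  if (selfUrl) {
    https.get(selfUrl, () => server.close());
  } else {
    server.close();
  }
});
```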

Temidayo - 22.06.2023 02:34

"I wanted that. And I wanted that serverlessly." 🤣

Aditya Yada - 16.06.2023 12:36

Somewhere I read/saw a video saying that a monolith is better and cheaper, and that Amazon saved a lot of money by switching to a monolithic architecture.
Not sure about it though.

John McWay - 10.06.2023 03:04

True serverless would be to run that image resizer on the client duh

Awakened Raccoon - 08.06.2023 01:22

Cold start doesn't matter practically. Having even 100 calls to the Firebase API per day will result in blazing speeds (pretty much equal to the ping between the two endpoints).

valentin krajzelman - 07.06.2023 17:01

this video aged beautifully

Alicia Sykes - 06.06.2023 01:17

Can vouch for Cloudflare Workers; they're the best of a bad bunch.

Childrens Health - 05.06.2023 17:16

"This thing, that I don't really know much about, seems to not work well for all use cases. Therefore, it is trash."

Luís Filipe - 04.06.2023 15:40

I wonder if people's opinion has changed after AWS itself said it's more expensive to run serverless stuff.

Richard Lucas - 01.06.2023 17:22

The point is to keep starting new things so there's a limit on how far you'll go with the completely useful tech we already have, and that being for the sake of opening new opportunities for business. It's kind of at the point where there is no net gain in usefulness or functionality from the end user perspective, and the change occurs solely for the sake of fog of war. You are free to resist and you should. Make something useful from fricken PHP just to spite the new crop of CS grads, lol.

Tacticool Rick - 30.05.2023 02:12

Look, you need to consider your use case before you decide to use a microservice. For medium/enterprise size business, using microservices for some things makes a lot of sense, and is a lot easier than maintaining monolith architecture.

Also, Azure functions can live as either standalone functions, or you can spin up an app service. You'd likely not run into cold start issues for that case, but you'd then be paying for an app service.

Third, pure speed isn't the only reason you'd go with one provider or another. For functions that get called frequently, you should be running something like an app service anyway.

ㅈ ㅊ - 21.05.2023 09:23

Never build purely on serverless or purely on containers... a hybrid will be better, so you don't end up fighting with the tool.

Renato Custódio Pereira - 19.05.2023 04:16

The real winner in GCP is actually Cloud Run, not Cloud Functions. Instead of using small functions, you just upload your whole Docker container with your app written in the language and framework of your choice, and that's it.

Pim Scheffers - 18.05.2023 18:23

My holy goodness! Resizing an image in native C/C++ takes a few milliseconds and uses at most 2x the uncompressed image size in memory (say 25 MB), and the binary size is like 100 KB max. Modern computing, I guess.

Stephen Muga - 17.05.2023 14:07

McLovin talks about serverless. YES!

Tom S. Gao - 16.05.2023 19:54

serverless... you mean a laggy pod w/ startup delay running on someone's server.

Nicolas Rodriguez - 16.05.2023 03:36

I've used Firebase and I like it a lot. Very easy to set up and deploy code. I'd say the biggest issue with serverless functions, at least with AWS, is just learning everything. When I first started with AWS it was like trying to solve a puzzle without a picture to work off of. Now I just use Terraform to automate all of the infrastructure setup. Makes it a breeze to provision anything I want in minutes.

As far as cold start latency goes, it doesn't affect much in actual production applications that don't require super fast speeds, I've found.

BitwiseMobile - 16.05.2023 03:29

"I don't understand something, so I avoid it like the plague." That's a winning mindset, let me tell you. /s

Javier Corado - 16.05.2023 01:49

Funny how well this aged. Amazon Prime team surely watched this video.

Aleksander Błaszkiewicz - 15.05.2023 21:05

It is worth noting that AWS Lambda CPU speed scales with RAM up to 1GB. Beyond that you get more cores, which in the case of Node.js is not useful.
So if your Node.js app consumes under 1GB of RAM, you will get maximum performance at 1GB of RAM.

Mohd Salman Ansari - 15.05.2023 14:26

he was a hero.

Ashim - 15.05.2023 10:30

Amazon was just 2 years behind Ben in figuring that out. The same Amazon that might reject him in a DSA interview. 😂😂😂

Rob Malford - 15.05.2023 09:18

Neither do React Hooks but your generation keeps on shitting them out.

吉井雄太朗 - 15.05.2023 04:21

Why can't you just do the resizing on the client?

Bjørn Otto Vasbotten - 15.05.2023 00:34

Have no idea why this two-year-old clip popped up in my feed; would be interested to hear Ben's opinion on the topic today.

EyesOpen - 14.05.2023 23:36

It's all marketing BS...

Kaleb Productions - 14.05.2023 17:36

#1 How often are you resizing an image? #2 Resizing an image can be done on the client side. Let the user's computer do that work; it's free for you. #3 Functions stay warm for a short period of time, which covers anything called more often than that window. For ones that are called less often but that you still want warm, you can tell them to stay warm.
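
Point #2 is easy to sketch in the browser with the Canvas API (a minimal illustrative example; error handling and output format choices are up to you):

```typescript
// Resize an image File in the browser before uploading it anywhere.
async function resizeImage(file: File, maxWidth: number): Promise<Blob> {
  const bitmap = await createImageBitmap(file);
  const scale = Math.min(1, maxWidth / bitmap.width); // never upscale
  const canvas = document.createElement("canvas");
  canvas.width = Math.round(bitmap.width * scale);
  canvas.height = Math.round(bitmap.height * scale);
  const ctx = canvas.getContext("2d");
  if (!ctx) throw new Error("2D canvas context unavailable");
  ctx.drawImage(bitmap, 0, 0, canvas.width, canvas.height);
  return new Promise((resolve, reject) =>
    canvas.toBlob(
      (blob) => (blob ? resolve(blob) : reject(new Error("toBlob failed"))),
      file.type,
      0.9 // quality hint for lossy formats
    )
  );
}
```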

DAoC Nostalgia - 14.05.2023 17:23

The language you use is never what makes things slow; that performance difference is in the microseconds. It's always the network.

Mel - 14.05.2023 16:28

the intro tho. haha

Artie Fischel - 14.05.2023 09:52

Yes, if you have enough steady traffic you probably should put your code on a server. If not, Lambda is an option for really spiky loads. You're going to be setting up API Gateway either way.

B - 14.05.2023 08:35

Your mistake with serverless on GCP was not using Go.

vander monke - 14.05.2023 07:54

my man knew....

mattmmilli - 14.05.2023 03:34

We have plenty of processes that take minutes if not hours to run. Waiting a few extra seconds on a few parts isn't a big deal. It all depends on what you are doing.

Vojin Milovic - 13.05.2023 23:37

Coming from the future, you were right

Ogo Okafor - 13.05.2023 22:28

THIS AGED WELL!
