
How Might Serverless Impact Node.js Ecosystem?

Since there are a lot of client-side developers who know JavaScript and are used to how the language works, it's very easy for them to walk up to Node.js and start building their own backends. Serverless has a similar appeal: it's very easy to walk up to, start building a cloud service, and get it up and running. You don't need to worry about setting up and managing VMs, antivirus software, firewalls, etc.; all of that is done for you by the managed service.

You can now use serverless frameworks to deploy systems at scale, not only for frontend-facing HTTP applications but also for back-end systems that might previously have been well outside a developer's area of expertise. It's a very accessible way to get started.
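To make that concrete, here is a minimal sketch of an HTTP-triggered serverless function, written against the AWS Lambda handler shape (other providers use slightly different signatures, and the greeting logic is purely illustrative):

// A minimal HTTP-triggered serverless function (AWS Lambda handler shape).
// There is no VM, web server, or firewall to configure; the platform routes
// the request to this handler and scales it automatically.
exports.handler = async (event) => {
  const name =
    (event.queryStringParameters && event.queryStringParameters.name) || 'world';
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ message: `Hello, ${name}!` }),
  };
};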

The impact of serverless on the Node.js ecosystem

Because Node.js is the star language of serverless, as serverless becomes more popular it is going to bring even more people into the Node.js ecosystem. However, there are a couple of gotchas that newcomers to the Node.js ecosystem have to deal with. Lately, the Node.js community has started ignoring some of the lessons we learned in the web space. We became too comfortable with the assumption that, since these applications are long-running on servers, it doesn't matter if our packages are huge and we load everything in the world at startup.

We also tend to think that as long as the application isn't taking up a huge memory footprint, who cares how many tiny little files come in with it, and so on. Node.js is great because, in the simple cases, it comes up very quickly. But when you decide to import Lodash and every giant framework you can think of into memory, it takes a while just to read those files from disk. The memory footprint then means that people using those libraries inside functions have to pay more, because they pay for that memory footprint to actually exist.

If you load giant libraries into memory, more than ever you have to pay for it. If I can choose a library that is one megabyte over a library that is 30 megabytes, that can mean a big difference in my end-of-month bill in the serverless case.

It really pays to think about how to reduce what actually gets loaded. Lots of things can help with this.
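One simple, low-effort example: import only the piece of a library you actually use instead of the whole package. The per-method require path below follows how lodash publishes its individual functions:

// Pulls the entire lodash package into memory even if you only need one function.
const _ = require('lodash');

// Pulls in just the one method; far fewer files are read from disk and the
// function's resident memory footprint stays smaller.
const get = require('lodash/get');

console.log(get({ a: { b: 42 } }, 'a.b')); // 42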

A classic example: Azure Functions Pack

Azure Functions Pack uses webpack to bundle your code together. It doesn't necessarily do as much to reduce the memory footprint; for that, you'd want to run Uglify on top of the webpack output. It does, however, reduce the load time.
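The exact configuration the tool generates isn't shown here, but the general idea can be sketched with a plain webpack setup that bundles a Node function entry point and minifies the result (TerserPlugin is webpack's current replacement for the Uglify step; file paths here are hypothetical):

// webpack.config.js -- a sketch of the bundle-and-minify idea, not the config
// Azure Functions Pack actually produces.
const path = require('path');
const TerserPlugin = require('terser-webpack-plugin');

module.exports = {
  target: 'node',                // keep Node built-ins (fs, http, ...) out of the bundle
  mode: 'production',
  entry: './src/index.js',       // hypothetical function entry point
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'index.js',
    libraryTarget: 'commonjs2',  // export the handler the way the runtime expects
  },
  optimization: {
    minimize: true,              // shrink the code on top of bundling it
    minimizer: [new TerserPlugin()],
  },
};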

Somebody from Microsoft Azure tested the pack with the four largest Node modules he could find, trying to understand the impact of serverless on Node.js. He put them into a function and measured it on an i7 with an SSD. It was a blazing-fast system, yet it took about two seconds for that function to come up the first time.

That two seconds was mostly Node.js reading all of that stuff into memory. When he webpacked it, the function came up in about a hundred milliseconds, a giant improvement.
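You can get a rough feel for this on your own machine by timing a cold require of a heavy dependency. The module name and the numbers you'll see are illustrative only; the point is that reading a large dependency tree from disk is not free:

// measure-load.js -- rough, illustrative timing of module load cost.
// Run with: node measure-load.js
const start = process.hrtime.bigint();

// Substitute any large dependency you actually have installed.
require('lodash');

const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;
console.log(`require() took ${elapsedMs.toFixed(1)} ms on a cold start`);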

When you're running inside serverless, it's going to take even longer to load those modules into memory. Many serverless providers also have limits on how much content you can upload in the first place. So as we write these packages, especially packages we want to work well inside serverless, we need to think about what they're going to do at load time.

Read the documentation before coding

In one instance, a socket.io application at a lead database company started consuming too many sockets. The Node.js developers working on the application realized that the function they had written kept opening new sockets on every invocation but never closed them. Of course, they were misusing the library; they hadn't read the documentation properly. They weren't using it as prescribed, but in the way that felt natural to them, inside their function.
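The actual code isn't shown, but the anti-pattern described looks roughly like the sketch below (socket.io-client, the endpoint URL, and the event name are purely illustrative):

// Anti-pattern (illustrative): a new connection is opened on every invocation
// and never closed, so sockets pile up as the function is called repeatedly.
const { io } = require('socket.io-client');

exports.handler = async (event) => {
  const socket = io('https://example.com'); // opens a fresh socket on each call
  socket.emit('lead', event);               // hypothetical event name
  return { statusCode: 202 };
  // The socket is never disconnected, so the connection leaks.
};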


A quick workaround: when writing a package that might be harmful to use in serverless, developers should include a warning, along with instructions on how to create a singleton instance of it outside the function so that it stays in memory across invocations. In the socket.io example above, instructions on how to write the code so the socket is closed before the function completes would have been greatly beneficial.
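A hedged sketch of what such instructions might look like, under the same illustrative assumptions as above: either reuse one connection created outside the handler, or close a per-invocation connection before returning.

const { io } = require('socket.io-client');

// Option 1: create the client once, outside the handler, so warm invocations
// reuse the same connection instead of opening a new socket every time.
const sharedSocket = io('https://example.com');

exports.handler = async (event) => {
  sharedSocket.emit('lead', event);
  return { statusCode: 202 };
};

// Option 2: if a connection really must be created per invocation,
// close it before the function completes.
exports.perInvocationHandler = async (event) => {
  const socket = io('https://example.com');
  try {
    socket.emit('lead', event);
    return { statusCode: 202 };
  } finally {
    socket.disconnect(); // release the socket before returning
  }
};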

These warnings and instructions can go a long way toward making sure that anyone writing Node.js on serverless doesn't feel alienated. It pays to make sure people are successful in that way.

Node.js and serverless have gotten this far, and done so well, because of the strong open-source nature of both and how easy it is to get access to the parts the system runs on.

Potential risks

Since the serverless implementations today are very vendor-specific, there is some risk there, so it's worth keeping in mind. There are people out there thinking about ways to address some of these problems.


Even though vendors are building implementations designed mainly to run on their own cloud platforms, they can still make them open source. They should try to build them as somewhat more portable systems, and make sure you don't feel locked in even though the implementation is built to run really well on their platform.

The effort should go in the direction of vendors at least building these systems in the open as open source, and eventually working toward an open platform.

The future is now!

When you're thinking about deploying services out into the cloud, consider the serverless path with whatever vendor feels right for you. It's all about improving the agility with which you can build and deploy services that you can trust to operate well at scale. You can build even more with Node.js on serverless, and if you already know the language of choice on these platforms, Node.js, then you're ahead of the game. Serverless has taken advantage of the fact that Node.js and JavaScript already had this great, big, open community.

What do you think?

