This is the first post in a tutorial series covering Node.js, from theory to practical implementation.
The word good is a subjective value term, especially in the context of programming, where comparisons between languages and platforms usually center on what may be the “best” tool for one’s engineering goals, with no absolutes assigned. Still, Node.js is a well-designed platform and a good representative example of what’s happening in the software and technology industry as a whole – a giant move toward non-linear processing systems that mimic the complex processing found in nature and that better reflect modern life. It is thus not surprising that we are seeing a torrent of new technologies come out, from Scala to Node to MongoDB, that are overturning the old and somewhat rigid order of things. They are tools for the new world and for the programmers in it, who think differently than those from the past. Art may imitate life in this case, if you consider language design to be an art.
Node is an event-driven, dynamic content server with asynchronous I/O. Event-driven means that, as a web server, it handles connections without spawning an entirely new thread or process for each client. Apache, by contrast, is not event-driven: it keeps each connection open as a thread held in memory for that particular client. Apache itself, and the computer it runs on, can therefore have problems scaling when there are too many connections – often on the order of 1,000 simultaneous connections. It simply freezes. Event-driven servers for STATIC content already exist, such as Lighttpd and NGINX. In fact, YouTube has used Lighttpd to serve up its static video files (simply pull the video and display it). But for actual processing, actual scripting, something different was needed. Enter Node.js, and you have a dynamic, event-driven web server – one that takes commands and connections from clients as they are received, without leaving a pile of idle threads behind. Fire away at it with all you’ve got, and the web server can handle it.
Secondly, the web server component, while important, is not all Node has to offer to improve efficiency and scalability. Node uses asynchronous I/O, meaning it doesn’t block a thread while waiting for input or output to complete. What does this mean? It means code continues executing after an I/O function is called. Think about that for a moment – if you are a hardcore PHP programmer, this concept will be completely foreign to you. You do not wait for a function to finish before the interpreter moves on to the next line of code; the rest of your code keeps running while the I/O happens in the background. I know from my experience with PHP that this would mean completely redesigning my code, and indeed in Node, functions are built inside functions, creating one long chain of a function, in a function, in a function, and so on. The reason this kind of processing works is that asynchronous calls in Node take a callback parameter, which the runtime invokes once the operation completes. There is no need for the thread-per-request model of a typical PHP setup, where each request occupies its own process or thread, sitting in memory and blocking while it waits for I/O. For huge web applications and services this is majorly beneficial: it simply takes up less memory, enables the web server/backend script combo to run on less memory overall and under less load, and reduces the chance the page will be frozen or lagged if too many hits, connections, or processes are happening at once.
The point is that Node has incredible benefits from a server-load and business standpoint (think electricity usage, RAM cost, number of servers) for large websites, which is why LinkedIn and the NYTimes are both switching to it from their current configurations.
In the old days, these kinds of processes, where you don’t do one thing at a time but many, would probably have been called chaos; they’re non-linear, and at first glance they don’t make sense. But these processes are actually carefully crafted, organized chaos – diffused procedures. This change in thinking is a step forward, and such changes happen all the time. Node.js is a step forward much like the many steps forward before it.
Thanks to the folks in #node.js on Freenode.