
Why does NodeJS scale insanely?

If you are new to NodeJS, you might have heard that NodeJS is single threaded. You might also have heard that it is insanely scalable, serving millions of users in real time. How does a single threaded application scale so well?

Single threading is half the truth

Yes, NodeJS follows a single-threaded event loop model, but it is not strictly single threaded. It works on an event-based execution architecture.

NodeJS has a main thread and additional worker threads. Tasks that do not have to be serviced synchronously can be passed on to the worker threads. When a worker thread's result is ready, it reports back to the event loop. The event loop picks up the event and places its callback on the main program stack, next in line for execution.

This provides a single threaded, but pseudo-parallel execution environment.

Understanding NodeJS Execution

const request = require('request'); // third-party package: npm install request

let f1 = function() {
  console.log('Hello at beginning');
  request('https://google.com', (err, res, body) => {
    console.log('Hello from function');
  });
  console.log('Hello at end');
};

f1();

If we executed the above code in a procedural manner, we would expect the following output.

Hello at beginning
Hello from function
Hello at end

However, your NodeJS application will show the following output.

Hello at beginning
Hello at end
Hello from function

Why is this so? Why does the request() callback run after the last console.log() statement? It is because request() performs an asynchronous task. The task gets allotted to a worker thread, and while that thread waits for the response from google.com, the main thread continues executing. As a result, the last console output is printed while the worker thread is still waiting for a response to the request.

When the worker thread receives a response, it puts an entry into the event loop. When the main thread is free, it picks up the event from the event loop and executes the callback that was registered for it. Event loop tasks are only executed when the main thread is free and not performing any other task.
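The same ordering can be reproduced without any third-party package. In this sketch, setTimeout stands in for request(): its callback is queued on the event loop and only runs after the synchronous code on the main stack has finished.

```javascript
const order = [];

order.push('Hello at beginning');

// The callback is queued on the event loop; it runs only after
// the synchronous code currently on the main stack has finished.
setTimeout(() => {
  order.push('Hello from function');
  console.log(order.join('\n'));
}, 0);

order.push('Hello at end');
```

Even with a delay of 0 milliseconds, the callback cannot jump ahead of the code already running on the main stack, which is exactly why "Hello at end" prints before "Hello from function".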

Figure: NodeJS async execution. A call to request() is passed on to a worker thread.

So why is NodeJS insanely scalable?

This unique event based model prevents NodeJS from being blocked by any single event. Each event is processed independently of the others. This holds only as long as you don't write code that blocks the main event thread.

Since async function calls report back to the event loop when they are ready to be executed, the main thread is always busy doing useful work and never waiting on any task. A properly designed NodeJS application can therefore keep the main event loop free by passing long running tasks to worker threads.

This concept is very different from spawning new threads to execute tasks in parallel. There is a physical limit to the number of threads a system can run. When this limit is reached and individual threads are waiting on long running operations, all threads essentially wait, making the complete application slow.

In NodeJS, by contrast, the main event loop only receives tasks that are ready to be executed. Millions of concurrent events can therefore be handled without affecting the performance of the main thread, allowing well designed applications to scale significantly.

NodeJS is turning out to be one of the preferred backend systems for web applications and web services.

BlobCity joins Docker Certification Program

An enterprise class multi-model and real-time analytics database can now be powered up right out of Docker containers.

India, 03 March 2017 – BlobCity is pleased to announce the availability of BlobCity DB Enterprise on the Docker Store. Today BlobCity DB has become the technology backbone chosen by many companies for their real-time analytics requirements. The enterprise license provides organisations of all sizes with access to a powerful, scalable and reliable database technology.

“We would like to congratulate BlobCity on their acceptance of the BlobCity DB into the Docker Certification Program,” said Marianna Tessel, EVP, Strategic Development. “Enterprise IT teams are looking to Docker to provide recommendations and assurances on the ecosystem of container content, infrastructure and extensions. BlobCity’s inclusion into the program indicates that BlobCity DB, a real-time analytics database, has been tested and verified by Docker, confirming for customers that BlobCity DB container images have been evaluated for security and are supported and built according to best practices.”

About BlobCity

BlobCity is a multi-model real-time analytics database. It removes database as a concern from application architectures. It not only processes stored data at high speeds but also processes data in motion during on-going transactions.

Complete In-memory & On-disk storage engines

BlobCity DB offers dual storage engines, one in-memory and the other on-disk. Querying across in-memory and on-disk data has never been this easy. Dual storage allows you to split data between disk and memory stores, an approach that significantly improves cross-query capabilities without straining backend infrastructure budgets.

Hybrid Transactional / Analytical Processing

BlobCity DB can be used as a sole database backend to perform both online transaction processing and online analytical processing for the purpose of real-time operational intelligence processing.

Availability

BlobCity DB is available on the Docker Store with a full enterprise edition license. A 1 month free trial is available, and the trial comes with standard enterprise support.

For more information on BlobCity DB, visit https://blobcity.com or visit BlobCity DB at the Docker Store – http://store.docker.com

All product and company names herein may be trademarks of their registered owners.

You will be charged on how you drive

Ever wondered how you drive, and how much you drive, could define the insurance premium for your vehicle?

Usage Based Insurance is seeing substantial adoption by insurance companies. It is replacing flat rate insurance structures with more dynamic pricing for your vehicles.

Usage Based Insurance systems encompass:

  1. Pay-As-You-Drive
  2. Pay-How-You-Drive
  3. Pay-As-You-Go
  4. Pay-By-Distance

Distance Analysis

Charges for your car could be based on the distance you drive. For the days or months that you are out of town and don’t use the car, you could end up paying near zero premiums.
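As a purely illustrative sketch of pay-by-distance pricing, the per-kilometre rate and the minimum monthly charge below are made-up numbers, not from any real policy:

```javascript
// Hypothetical numbers for illustration only.
const RATE_PER_KM = 0.05;    // assumed charge per kilometre driven
const MINIMUM_MONTHLY = 2.0; // assumed floor charged even for an idle car

function monthlyPremium(kmDriven) {
  return Math.max(kmDriven * RATE_PER_KM, MINIMUM_MONTHLY);
}

console.log(monthlyPremium(800)); // a regular month of driving: 40
console.log(monthlyPremium(0));   // out of town, car unused: 2
```

The point of the sketch is the shape of the pricing, not the numbers: months with little or no driving bottom out at a small minimum instead of a full flat premium.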

Behaviour Analysis

Your driving behaviour and discipline could define the insurance premium you pay. If you are a rash driver, then UBI policies are not for you. But if you are in your 50s, have a car, and are a peaceful and safe driver, a UBI policy could help you save a lot of money on your premiums.

Supporting Technology

A simple electronic telematics device can be connected to your car by your insurance provider. Some cars come pre-installed with such devices, and the insurance company can accept data from them for setting your UBI premiums.

Who offers this

Progressive Insurance Company and General Motors Assurance Company started a distance linked insurance discount scheme for car owners. It used telematics to capture the distance the car moved, with the insurance rate dynamically computed from the amount of time the car was actually running.

Who benefits

Both customers and insurance companies benefit equally. Customers get the option of choosing an incentive based policy, which translates to more accurately priced insurance premiums and discounts for good driving behaviour. Insurance companies benefit by reaping substantial returns on investment and by securing a strong book of business with reduced losses.


5 reasons why AI will not cost you your job

We have seen automation take up a lot of jobs, and AI is one of the biggest automations mankind has created.

With AI, did mankind create its own killer?

The answer is very definitely NO. The possibility of AI taking over the world and outsmarting humans is far fetched and unlikely even in the remotest of possibilities. Even if AI were built powerful enough to rule the world, why would the machines want humans to be extinct or to be ruled?

But having said that, can AI make us lose our jobs? How true is that?

1.  AI is not intelligent enough to build itself

While AI grows to automate some of the human labour jobs, it also creates jobs for humans who will build and maintain the AI system itself. As penetration of AI increases across industries, the jobs to support the penetration also increase. Humans who would do laborious jobs would now have a revised job profile of observing the work of AI or building and improving the AI system itself.

2.  Machines need training data to learn. Who produces this?

All AI systems need training data to learn. Yes, there are unsupervised learning systems, but they serve very specific requirements that do not cover most of the jobs AI will do. Automation with AI requires a training data set to be fed to the AI system, and the AI system will only be as good as the training data provided. This creates a classic need for the perfect training data set. Humans are the guinea pigs: they will have to produce all the training data for AI systems to learn from. What's more, producing training data is not a one person affair; it will need thousands of people doing the same thing. So rather than humans doing the actual laborious task, they will do the task in a controlled and recorded environment, with their movements and actions recorded to help machines learn.

3. Our kids will not be unemployed, they will just have different jobs than us

Evolution is inevitable and we have to embrace it. While we build AI systems and discover that AI can take over a lot of human jobs, we have to acknowledge that Generation Z has grown up seeing these technology advancements. While technology can baffle us, it is a part of life for Generation Z. Thinking that the next generation will be out of jobs is completely wrong. They will just have different jobs, most of which they will create for themselves. Their jobs will embrace technology advancements and will target the problems that technology has not yet solved.

4. We can’t yet travel through solar systems

It is good that AI can take over most of the work we do to keep our world running. This means that humans can now work on problems that are not yet solved. We have travelled to the moon, and that is just about as far as we have ever gotten. A human on Mars is still several years away. What's more, travelling to other solar systems in search of life, or to explore the expanse of the universe, would take another 2-3 generations of research. If we get AI to run our world as we know it, humans can spend their energies exploring the rest of the universe and building technological disruptions that allow faster travel between solar systems and possibly even teleportation.

5. Humans are intelligent, machines are pseudo-intelligent

Machines portray intelligence, but humans have intelligence. This is why the intelligence of machines is called “artificial” intelligence. We will never let machines take us over. We will just use them to make our lives simpler and to do things that are not physically possible for humans. So the thought of AI taking over our jobs is absolutely ridiculous. AI will do our jobs because we want it to, and we will want it to do our current jobs only because we have found ourselves something better to do instead!

Internet of Things: Revolutionising Retail

The interconnection between everyday objects through the internet, called the Internet of Things, has transformed the way we live and work. While debates continue on whether retail shops can survive amidst online e-commerce stalwarts, quite a few progressive retail shops have chosen to challenge the status quo, adopted e-commerce tactics, and been fairly successful in reaching their goals.

IoT enabled retail outlets are powered by technologies like Global Positioning Systems (GPS), Near Field Communication (NFC), beacons and Wi-Fi signals, allowing them to grab a customer’s attention when he is close to their shops. Imagine you walk past your favourite apparel store, where you shop regularly and have accumulated loyalty points. When you walk past the store today, headed to the coffee shop without wanting to buy anything, you get a notification on your phone with an offer or the new arrivals. It makes you stop and think for a second; you may very likely end up spending some time shopping and may make a purchase too. Voila! The retail outlet’s IoT enabled system could predict your location, give you a fair offer and win your purchase. Not to mention you are beaming with joy over your new purchase.

Consumers today adopt IoT devices at an unimaginably high rate. Wearable device and home automation purchases have risen exceptionally in recent times. With so much data floating around, it is crucial that the various IoT systems can communicate to process the data and enhance the user experience. Common use cases where retail has adopted and intelligently made use of IoT are listed below:

Smart Shelves

Along with maintaining stocks and inventory, smart shelves can predict whether you liked a particular shirt. If you did not find the right fit, you can request another size through an in-store app, and an attendant will find it and bring it to you. You have officially converted a drop-off at the changing room into a successful purchase.

In-Store targeting

Imagine walking into your favourite outlet to buy fashion apparel for your next office party, and getting a discount offer on recommended accessories. Wouldn’t that be just the type of deal you would love? Using beacon technologies and artificial intelligence, shops today can accurately predict whether you’d like to buy that black dress together with a pearl neckpiece. With a value added offer like that, who wouldn’t increase their chances of sales and better profits?

Smart Shopping

“Once a consumer, always a consumer”, isn’t it?

A shopping mall is a collaboration of consumers, sellers and loads of data to process. Why not share this data amongst a network of partners? A customer’s journey through the mall is a classic example of why you end up with more than you imagined when you are out shopping. A connected network of movie theatre, cafe and fashion apparel stores would bring in more data points to capture every moment of a consumer’s journey through the shopping mall. If retailers are open to partnerships, a smart shopping experience will very likely be a thing of the future.

There is a clear and strong shift towards retail outlets adopting technology and IoT as a means of competing with e-commerce. With all these improvements in the shopping experience, the sky is the limit for IoT in retail.

Best products for log analytics and comparisons

If you work in a decent sized company, then you likely have a ton of log data. And if you are the one responsible for analysing it, then times are tough for you!

Does this sound like your regular workday?

Logs have structure, yet no structure

64.242.88.10 - - [07/Mar/2004:16:47:12 -0800] "GET /robots.txt HTTP/1.1" 200 68
64.242.88.10 - - [07/Mar/2004:16:47:46 -0800] "GET /twiki/bin/rdiff/Know/ReadmeFirst?rev1=1.5&rev2=1.4 HTTP/1.1" 200 5724
64.242.88.10 - - [07/Mar/2004:16:49:04 -0800] "GET /twiki/bin/view/Main/TWikiGroups?rev=1.2 HTTP/1.1" 200 5162
64.242.88.10 - - [07/Mar/2004:16:50:54 -0800] "GET /twiki/bin/rdiff/Main/ConfigurationVariables HTTP/1.1" 200 59679

If you are responsible for analysing your company’s log data, then this probably looks very familiar.

There are two ways to look at this data: you can say that this log has a structure, or you can say that it does not. Most standard BI & analytics tools cannot make any sense of it. It just appears as textual lines, and you get basically nothing out of loading it into a self-service data discovery tool like Tableau, Qlik or Power BI.

The right tool is all you need

The log data actually does have structure. The problem is that common BI tools are not comfortable with its structure. If you had a tool that could understand this structure, things would work just fine.

Do all logs have the same structure? Well, no! Then what?

Logs from different software or hardware come with different structures. What you really need is a tool that lets you input the structure of your log, so the tool can interpret your custom logging format.
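As an illustration of "inputting the structure of your log", here is a sketch in JavaScript that describes the Apache access log format shown above as a regular expression and turns each line into a structured record:

```javascript
// Pattern for the Apache common log format: ip, identity, user,
// [timestamp], "method path protocol", status, bytes.
const LOG_PATTERN =
  /^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) (\S+)" (\d{3}) (\d+)$/;

function parseLogLine(line) {
  const m = LOG_PATTERN.exec(line);
  if (!m) return null; // line does not match this format
  const [, ip, timestamp, method, path, protocol, status, bytes] = m;
  return { ip, timestamp, method, path, protocol,
           status: Number(status), bytes: Number(bytes) };
}

const entry = parseLogLine(
  '64.242.88.10 - - [07/Mar/2004:16:47:12 -0800] "GET /robots.txt HTTP/1.1" 200 68'
);
console.log(entry.method, entry.path, entry.status); // GET /robots.txt 200
```

Once each line is a record like this, ordinary filtering and aggregation work again. Dedicated log tools generalise the same idea by letting you supply a pattern for whatever custom format your systems emit.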

Your options

Splunk: The market leader. If you can afford their services, Splunk is the way to go for even the most pesky log management and log analytics requirements.

Logstash: A popular open source tool for log management. You can store log events and retrieve them for future analysis. It is typically used together with Elasticsearch and Kibana (the ELK stack).

Loggly: A cloud based tool for log analytics. Very friendly for log management in DevOps, SysOps and several other engineering requirements.

BlobCity: Offers cloud and on-premise solutions for integrated analytics of log and structured data. If you need to analyse your log data alongside your structured relational or NoSQL data, then BlobCity is a good choice.