You will be charged for how you drive

Ever wondered how the way you drive, and how much you drive, could define the insurance premium for your vehicle?

Usage Based Insurance (UBI) is seeing substantial adoption by insurance companies. It replaces flat-rate premiums with more dynamic pricing structures for your vehicle.

Usage Based Insurance systems encompass:

  1. Pay-As-You-Drive
  2. Pay-How-You-Drive
  3. Pay-As-You-Go
  4. Pay-By-Distance

Distance Analysis

Charges for your car could be based on the distance you drive. For the days or months that you are out of town and don’t use the car, you could end up paying near-zero premiums.
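The arithmetic behind such a pay-by-distance policy can be sketched in a few lines. The rates below are invented purely for illustration; real insurers price in many more factors:

```java
public class PayPerMilePremium {
    public static void main(String[] args) {
        // Hypothetical rates, for illustration only
        double baseMonthly = 10.0; // flat monthly base charge, in dollars
        double perMile = 0.05;     // charge per mile actually driven

        int[] monthlyMiles = {1200, 900, 0}; // third month: car unused

        for (int miles : monthlyMiles) {
            double premium = baseMonthly + perMile * miles;
            System.out.println(miles + " miles -> $" + premium);
        }
    }
}
```

For the unused month only the base charge applies, which is the "near zero premium" effect described above.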

Behaviour Analysis

Your driving behaviour and discipline could define the insurance premium you pay. If you are a rash driver, then UBI policies are not for you. But if you are in your 50s, have a car, and are a calm and safe driver, a UBI policy could help you save a lot of money on your premiums.

Supporting Technology

A simple electronic telematics device can be connected to your car by your insurance provider. Some cars come pre-installed with such devices, and the insurance company will accept data from them for setting your UBI-based premiums.

Who offers this

Progressive Insurance Company and General Motors Assurance Company started a distance-linked insurance discount scheme for car owners. It used telematics to capture the distance the car moved, with the insurance rate computed dynamically from the amount of time the car was actually running.

Who benefits

Both customers and insurance companies benefit. Customers get the option of choosing an incentive-based policy, which translates to more accurately priced premiums and discounts for good driving behaviour. Insurance companies benefit by reaping substantial returns on investment and by securing a strong book of business with reduced losses.


5 reasons why AI will not cost you your job

We have seen automation take over a lot of jobs, and AI is one of the biggest automations mankind has created.

With AI, did mankind create its own killer?

The answer is very definitely NO. The possibility of AI taking over the world and outsmarting humans is far-fetched and unlikely even in the remotest of possibilities. Even if AI were built powerful enough to rule the world, why would the machines want humans to be extinct or ruled?

But having said that, can AI make us lose our jobs? How true is that?

1.  AI is not intelligent enough to build itself

While AI grows to automate some human labour jobs, it also creates jobs for the humans who build and maintain the AI systems themselves. As AI penetration increases across industries, the jobs needed to support that penetration also increase. Humans who once did laborious jobs will have a revised job profile: observing the work of AI, or building and improving the AI system itself.

2.  Machines need training data to learn; who produces it?

All AI systems need training data to learn. Yes, there are unsupervised learning systems, but they serve very specific requirements and are not widely applicable to the jobs AI will do. Automation with AI requires a training data set to be fed to the AI system, and the AI system will only be as good as the training data provided. This creates a classic need for the perfect training data set. Humans are the guinea pigs: they will have to produce all the training data for AI systems to learn from. Moreover, producing training data is not a one-person affair; it needs thousands of people doing the same thing. So rather than humans doing the actual laborious task, they will do the task in a controlled and recorded environment, with their movements and actions captured to help machines learn.

3. Our kids will not be unemployed, they will just have different jobs than us

Evolution is inevitable and we have to embrace it. While we are building AI systems and discovering that AI can take over a lot of human jobs, we have to acknowledge that Generation Z has grown up watching these technology advancements. While technology can baffle us, it is simply a part of life for Generation Z. If we think the next generation will be out of jobs, we are completely wrong. They will just have different jobs, most of which they will create for themselves. Their jobs will embrace technological advancement and will be targeted at solving the problems technology has not yet solved.

4. We can’t yet travel through solar systems

It is good that AI can take over most of the work we do to keep our world running. This means that humans can now work on solving problems that are not yet solved. We have travelled to the moon, and that is about as far as we have ever gotten. A human on Mars is still several years away. What’s worse, travelling to other solar systems in search of life, or to explore the expanse of the universe, would take another 2-3 generations of research. If we get AI to run our world as we know it, humans can spend their energies exploring the rest of the universe and building technological disruptions that allow faster travel between solar systems, and possibly even teleportation.

5. Humans are intelligent, machines are pseudo-intelligent

Machines portray intelligence, but humans have intelligence. This is why machine intelligence is called “artificial” intelligence. We will never let machines take us over. We will just use them to make our lives simpler and to do things that are not physically possible for humans. So the thought of AI taking over our jobs is absolutely ridiculous. AI will do our jobs for us because we want it to. And we will want it to do our current jobs only because we have found ourselves something better to do instead!

Internet of Things: Revolutionising Retail

The interconnection of everyday objects through the internet, called the Internet of Things (IoT), has transformed the way we live and work. While debates continue on whether retail shops can survive amidst the online e-commerce stalwarts, quite a few progressive retail shops have chosen to challenge the status quo, adopted e-commerce tactics, and been fairly successful in reaching their goals.

IoT-enabled retail outlets are powered by technologies like the Global Positioning System (GPS), Near Field Communication (NFC), beacons and Wi-Fi signals, allowing them to grab a customer’s attention when he or she is close to the shop. Imagine you walk past your favourite apparel store, where you shop regularly and have accumulated loyalty points. When you walk past this store today, towards the coffee shop and without wanting to buy anything, you get a notification on your phone with an offer or new arrivals. It makes you stop and think for a second; you may very likely end up spending some time shopping, and may make a purchase too. Voila! The retail outlet’s IoT-enabled system could predict your location, give you a fair offer and convert it into a purchase. Not to mention you are beaming with joy over your new purchase.

Consumers today adopt IoT devices at an unimaginably high rate. The number of wearable devices and home automation purchases has risen exceptionally in recent times. With so much data floating around, it’s crucial to enable communication between the various IoT systems that process it, to enhance the user experience. Common use cases where retail has adopted and intelligently made use of IoT are listed below:

Smart Shelves

Along with maintaining stocks and inventory, smart shelves can tell whether you liked a particular shirt. If you did not find the right fit, you can request another size through an in-store app, and an attendant will find it and bring it to you. You have officially converted a drop-off at the changing room into a successful purchase.

In-Store targeting

Imagine walking into your favourite outlet to buy fashion apparel for your next office party, and you get a discount offer on recommended accessories. Wouldn’t that be just the type of deal you would love? Using beacon technologies and artificial intelligence, shops today can accurately predict whether you’d like to buy that black dress together with a pearl neckpiece. With a value-added offer like that, who wouldn’t increase their chances of sales and better profits?

Smart Shopping

“Once a consumer, always a consumer”, isn’t it?

A shopping mall is a collaboration of consumers, sellers and loads of data to process. Why not share this data amongst a network of partners? A customer’s journey through the mall is a classic example of why you end up with more than you imagined when you are out shopping. A connected network of movie theatres, cafes and fashion apparel stores would bring in more data points to capture every moment of a consumer’s journey through the mall. If retailers are open to partnerships, a smart shopping experience will very likely be a thing of the near future.

There is a clear and strong shift towards retail outlets adopting technology and IoT as a means of competing with e-commerce. With all these improvements in the shopping experience, the sky is the limit for IoT in retail.

How to Dockerize an AngularJS App

Do you have an AngularJS project that you would like to build into a Docker container? If yes, you will have to start by creating a Dockerfile inside your project’s base folder.

### STAGE 1: Build the AngularJS app ###

FROM node:8-alpine as builder
COPY package.json package-lock.json ./
RUN npm set progress=false && npm config set depth 0 && npm cache clean --force
RUN npm i && mkdir /ng-app && cp -R ./node_modules ./ng-app
WORKDIR /ng-app
COPY . .
## Production mode build
RUN $(npm bin)/ng build --env=staging --prod --build-optimizer

### STAGE 2: Add Nginx for hosting the AngularJS app ###

FROM nginx:1.13.3-alpine
## Removes the default nginx html files
RUN rm -rf /usr/share/nginx/html/*
COPY --from=builder /ng-app/dist /usr/share/nginx/html
CMD ["nginx", "-g", "daemon off;"]
Place the above contents in a file called Dockerfile, in your project’s base folder.

Once you have the Dockerfile placed, you can use Docker itself to build the AngularJS project. You do not need to build your AngularJS project separately.

Building the Docker image

Run the following command from your project’s base folder. This is the same location as where you placed your Dockerfile.

docker build .

You can also name and tag the image with the following command

docker build -t blobcity/angular-app:latest .

This will create the image blobcity/angular-app with the tag latest

Running the Docker container

You can start the container using the following docker run command. Nginx inside the container listens on port 80, so publish it to a host port with -p. When the container is started, the AngularJS app is deployed and ready to use.

docker run -p 8080:80 blobcity/angular-app:latest

If you are running this command on your own computer, open your browser at http://localhost:8080 to use your AngularJS application.

If you have started the Docker container on a remote server, you will need the public IP address of that server to use your AngularJS app.



Nginx Proxy with Docker-Compose

version: '2'
services:
  nginx-proxy:
    restart: always
    image: jwilder/nginx-proxy
    volumes:
      - /var/run/docker.sock:/tmp/docker.sock:ro
    ports:
      - "80:80"

  angular-app:
    image: "blobcity/angular-app:latest"
    restart: always
    environment:
      VIRTUAL_HOST: angular-app.blobcity.com
      VIRTUAL_PORT: 80
    expose:
      - "80"

The above docker-compose.yml file provides an Nginx proxy configuration that forwards requests to an AngularJS application. Place the above contents in a file called docker-compose.yml.

Replace image: “blobcity/angular-app:latest” with the image name of your application. It can be an AngularJS, NodeJS or any other Docker-based application.

Configure your domain, by replacing angular-app.blobcity.com with your domain.

To start Nginx and Angular app

> docker-compose up

To start as a daemon

> docker-compose up -d

To stop

> docker-compose stop

retainIf and removeIf for Java Collections

Iterate over Java collection classes to conditionally retain or remove values from the collection.

CollectionUtil.retainIf(list, element -> element > 5);

Retains all elements in the collection that have a value greater than 5.

CollectionUtil.removeIf(list, element -> element > 5);

Removes all elements from the list that have a value greater than 5.

The CollectionUtil class is available as part of BlobCity java-commons open source distribution of useful Java utilities.


Jar file: https://github.com/blobcity/java-commons/tree/master/target 

Complete Implementation

List<Integer> list1 = new ArrayList<>(Arrays.asList(1,5,10,30,4,8,11));
List<Integer> list2 = new ArrayList<>(Arrays.asList(1,5,10,30,4,8,11));

/* Retains all elements in the list that have a value > 5 */
CollectionUtil.retainIf(list1, element -> element > 5);

System.out.println("List 1");
System.out.println(list1);

/* Removes all elements from the list that have a value > 5 */
CollectionUtil.removeIf(list2, element -> element > 5);

System.out.println("\nList 2");
System.out.println(list2);

Output:

List 1
[10, 30, 8, 11]

List 2
[1, 5, 4]
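For comparison, since Java 8 the JDK’s own java.util.Collection interface ships a removeIf method, and the retainIf behaviour can be had by negating the predicate. This sketch uses only the standard library, without java-commons:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class JdkRemoveIfDemo {
    public static void main(String[] args) {
        List<Integer> list1 = new ArrayList<>(Arrays.asList(1, 5, 10, 30, 4, 8, 11));
        List<Integer> list2 = new ArrayList<>(Arrays.asList(1, 5, 10, 30, 4, 8, 11));

        // Equivalent of retainIf(element -> element > 5): remove the complement
        list1.removeIf(element -> element <= 5);

        // Removes all elements with a value > 5
        list2.removeIf(element -> element > 5);

        System.out.println(list1); // [10, 30, 8, 11]
        System.out.println(list2); // [1, 5, 4]
    }
}
```

CollectionUtil remains handy on code bases that predate Java 8 or that prefer the explicit retainIf naming.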

Best products for log analytics and comparisons

If you are working in a decent-sized company, then you likely have a ton of log data. And if you are the one made responsible for analysing it, then times are tough for you!

Does this sound like your regular workday?

Logs have structure, yet no structure

- - [07/Mar/2004:16:47:12 -0800] "GET /robots.txt HTTP/1.1" 200 68
- - [07/Mar/2004:16:47:46 -0800] "GET /twiki/bin/rdiff/Know/ReadmeFirst?rev1=1.5&rev2=1.4 HTTP/1.1" 200 5724
- - [07/Mar/2004:16:49:04 -0800] "GET /twiki/bin/view/Main/TWikiGroups?rev=1.2 HTTP/1.1" 200 5162
- - [07/Mar/2004:16:50:54 -0800] "GET /twiki/bin/rdiff/Main/ConfigurationVariables HTTP/1.1" 200 59679

If you are responsible for analysing your company’s log data, then this probably looks very familiar to you.

There are two ways to look at this data. You can either say that this log has a structure, or you can say that it does not. Most standard BI and analytics tools cannot make any sense of it. It just appears as lines of text, and you get basically nothing by loading it into a self-service data discovery tool like Tableau, Qlik or Power BI.

The right tool is all you need

The log data actually does have structure. The problem is that common BI tools are not comfortable with its structure. If you had a tool that could understand this structure, things would work just fine.

Do all logs have the same structure? Well, no! Then what?

Logs from different software or hardware come in different structures. What you really need is a tool that lets you input the structure of your log, so the tool can interpret your custom logging format.
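To make that concrete, here is a minimal sketch of interpreting one line of the Apache-style log shown above with a regular expression. The pattern is hand-written for this specific format (with the client IP column blanked out, as in the sample) and is not a general-purpose parser:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class LogLineParse {
    public static void main(String[] args) {
        String line = "- - [07/Mar/2004:16:47:12 -0800] \"GET /robots.txt HTTP/1.1\" 200 68";

        // Groups: 1=timestamp, 2=method, 3=path, 4=protocol, 5=status, 6=bytes
        Pattern p = Pattern.compile(
            "\\S+ \\S+ \\[([^\\]]+)\\] \"(\\S+) (\\S+) (\\S+)\" (\\d{3}) (\\d+|-)");

        Matcher m = p.matcher(line);
        if (m.matches()) {
            System.out.println("time="   + m.group(1)); // 07/Mar/2004:16:47:12 -0800
            System.out.println("method=" + m.group(2)); // GET
            System.out.println("path="   + m.group(3)); // /robots.txt
            System.out.println("status=" + m.group(5)); // 200
        }
    }
}
```

Once every line is split into named fields like this, the log becomes ordinary tabular data that any analytics tool can aggregate; the products below do exactly this at scale, for configurable formats.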

Your options

Splunk: These guys are the market leader. If you can afford their services, Splunk is the way to go for even the most pesky log management and log analytics requirements.

Logstash: A popular open source tool for log collection and processing. You can store log events and retrieve them for future analysis. It is commonly paired with Elasticsearch and Kibana as the ELK stack.

Loggly: A cloud-based tool for log analytics. Very friendly for log management in DevOps, SysOps and several engineering requirements.

BlobCity: Offers cloud and on-premise solutions for integrated analytics of log and structured data. If you need to analyse your log data alongside your structured relational or NoSQL data, then BlobCity is a good choice.