
Nginx Proxy with Docker-Compose

version: '2'
services:
  nginxproxy:
    restart: always
    image: jwilder/nginx-proxy
    volumes:
      - /var/run/docker.sock:/tmp/docker.sock:ro
    ports:
      - "80:80"

  angular-app:
    image: "blobcity/angular-app:latest"
    restart: always
    environment:
      VIRTUAL_HOST: angular-app.blobcity.com
      VIRTUAL_PORT: 80
    expose:
      - "80"
docker-compose.yml

The above docker-compose.yml file provides an Nginx proxy configuration that forwards requests to an AngularJS application. Place the above contents in a file called docker-compose.yml.

Replace image: "blobcity/angular-app:latest" with the image name of your application. It can be an AngularJS, NodeJS or any other Docker-based application.

Configure your domain by replacing angular-app.blobcity.com with your own domain.
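For example, to proxy a different application you would change only the app service block. The service name, image, domain and port below are placeholders for illustration, not part of the original setup:

```yaml
  my-node-app:                              # placeholder service name
    image: "my-registry/my-node-app:latest" # replace with your image
    restart: always
    environment:
      VIRTUAL_HOST: app.example.com         # replace with your domain
      VIRTUAL_PORT: 3000                    # port your app listens on
    expose:
      - "3000"
```

The nginx-proxy container reads VIRTUAL_HOST and VIRTUAL_PORT from each container's environment and routes incoming requests by the Host header accordingly.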

To start the Nginx proxy and the Angular app

> docker-compose up

To start as a daemon

> docker-compose up -d

To stop

> docker-compose stop

retainIf and removeIf for Java Collections

Iterate over Java collection classes to conditionally retain or remove values from the collection.

CollectionUtil.retainIf(list, element -> element > 5);

Retains all elements in the collection that have a value greater than 5.


CollectionUtil.removeIf(list, element -> element > 5);

Removes all elements from the list that have a value greater than 5.
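The library's internals are not shown here, but as a rough sketch, both helpers can be expressed on top of the JDK's own Collection.removeIf (available since Java 8): a retain is simply a remove with a negated predicate. The class and method bodies below are illustrative, not the actual CollectionUtil implementation:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.List;
import java.util.function.Predicate;

public class RetainRemoveSketch {

    // Keep only elements matching the predicate, by removing all that don't.
    public static <T> void retainIf(Collection<T> c, Predicate<T> p) {
        c.removeIf(p.negate());
    }

    // Drop all elements matching the predicate.
    public static <T> void removeIf(Collection<T> c, Predicate<T> p) {
        c.removeIf(p);
    }

    public static void main(String[] args) {
        List<Integer> retained = new ArrayList<>(Arrays.asList(1, 5, 10, 30, 4, 8, 11));
        retainIf(retained, e -> e > 5);
        System.out.println(retained); // [10, 30, 8, 11]

        List<Integer> removed = new ArrayList<>(Arrays.asList(1, 5, 10, 30, 4, 8, 11));
        removeIf(removed, e -> e > 5);
        System.out.println(removed); // [1, 5, 4]
    }
}
```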


The CollectionUtil class is available as part of the BlobCity java-commons open source distribution of useful Java utilities.

https://github.com/blobcity/java-commons

Jar file: https://github.com/blobcity/java-commons/tree/master/target 

Complete Implementation

List<Integer> list1 = new ArrayList<>(Arrays.asList(1,5,10,30,4,8,11));
List<Integer> list2 = new ArrayList<>(Arrays.asList(1,5,10,30,4,8,11));

/* Retains all elements in list that have a value > 5 */
CollectionUtil.retainIf(list1, element -> element > 5);

System.out.println("List 1");
list1.forEach(System.out::println);

/* Removes all elements in list that have a value > 5 */
CollectionUtil.removeIf(list2, element -> element > 5);

System.out.println("\nList 2");
list2.forEach(System.out::println);

Output

List 1
10
30
8
11

List 2
1
5
4

Best products for log analytics and comparisons

If you work in a decent-sized company, then you likely have a ton of log data. And if you are the one responsible for analysing it, then times are tough for you!

Does this sound like your regular workday?

Logs have structure, yet no structure

64.242.88.10 - - [07/Mar/2004:16:47:12 -0800] "GET /robots.txt HTTP/1.1" 200 68
64.242.88.10 - - [07/Mar/2004:16:47:46 -0800] "GET /twiki/bin/rdiff/Know/ReadmeFirst?rev1=1.5&rev2=1.4 HTTP/1.1" 200 5724
64.242.88.10 - - [07/Mar/2004:16:49:04 -0800] "GET /twiki/bin/view/Main/TWikiGroups?rev=1.2 HTTP/1.1" 200 5162
64.242.88.10 - - [07/Mar/2004:16:50:54 -0800] "GET /twiki/bin/rdiff/Main/ConfigurationVariables HTTP/1.1" 200 59679

If you are responsible for analysing your company's log data, then this probably looks very familiar to you.

There are two ways to look at this data. You can either say that this log has a structure, or you can say that it does not. Most standard BI & Analytics tools cannot make any sense of this data. It just appears as textual lines, and you get essentially nothing out of loading it into a self-service data discovery tool such as Tableau, Qlik or Power BI.

The right tool is all you need

The log data actually does have structure. The problem is that common BI tools are not comfortable with its structure. If you had a tool that could understand this structure, things would work just fine.
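The sample lines above follow the Apache common log format, and their structure can be made explicit with a simple parser. The regular expression below is a sketch of one way to pull out the fields; field names are my own labels, not an official schema:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ApacheLogParse {

    // Apache common log format:
    // client identity user [timestamp] "request" status bytes
    public static final Pattern COMMON_LOG = Pattern.compile(
        "^(\\S+) (\\S+) (\\S+) \\[([^\\]]+)\\] \"([^\"]*)\" (\\d{3}) (\\S+)$");

    public static void main(String[] args) {
        String line = "64.242.88.10 - - [07/Mar/2004:16:47:12 -0800] "
                + "\"GET /robots.txt HTTP/1.1\" 200 68";
        Matcher m = COMMON_LOG.matcher(line);
        if (m.matches()) {
            System.out.println("client  = " + m.group(1)); // 64.242.88.10
            System.out.println("time    = " + m.group(4)); // 07/Mar/2004:16:47:12 -0800
            System.out.println("request = " + m.group(5)); // GET /robots.txt HTTP/1.1
            System.out.println("status  = " + m.group(6)); // 200
            System.out.println("bytes   = " + m.group(7)); // 68
        }
    }
}
```

Writing such a parser for every log format in your company is exactly the busywork that the tools below aim to remove.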

Do all logs have the same structure? No. So what then?

Logs from different software or hardware come in different structures. What you really need is a tool that lets you specify the structure of your log, so it can interpret your custom logging format.

Your options

Splunk: The market leader. If you can afford their services, Splunk is the way to go for even the most pesky log management and log analytics requirements.

Logstash: A popular open source tool for log management. You can store log events and retrieve them for future analysis. It is commonly deployed alongside Elasticsearch and Kibana as the ELK stack.

Loggly: A cloud-based tool for log analytics. Very friendly for log management in DevOps, SysOps and several engineering requirements.

BlobCity: Offers cloud and on-premise solutions for integrated analytics of log and structured data. If you need to analyse your log data alongside your structured relational or NoSQL data, then BlobCity is a good choice.