Friday, May 25, 2018

The Image Classifier Microservice is public on Docker repository

Using the fabric8 docker-maven-plugin and the project from the post A Java microservice for image classification using Thorntail and DeepLearning4J, I was able to build a Docker image and push it to my Docker repository!

In this post I share the steps I followed to build the Docker image and publish it. In the next post I hope to show you how to run it on OpenShift with a real-world application.

Step 1: Build an image from the Thorntail project


This is simple: you just need some configuration in the Maven descriptor; see the changes in the project's pom.xml.
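For reference, here is a minimal sketch of what a fabric8 docker-maven-plugin configuration can look like. The plugin version, base image, and entry point below are assumptions for illustration, not the exact values from the project; check the project's pom.xml for the real ones.

```xml
<plugin>
  <groupId>io.fabric8</groupId>
  <artifactId>docker-maven-plugin</artifactId>
  <!-- version is an assumption; use whatever the project pom.xml declares -->
  <version>0.26.1</version>
  <configuration>
    <images>
      <image>
        <!-- image name as used in the docker run command below -->
        <name>fxapps-image-classifier/app</name>
        <build>
          <!-- base image is an assumption -->
          <from>openjdk:8-jre</from>
          <ports>
            <port>8080</port>
          </ports>
          <assembly>
            <!-- copies the built artifact into the image under /maven -->
            <descriptorRef>artifact</descriptorRef>
          </assembly>
          <entryPoint>
            <exec>
              <arg>java</arg>
              <arg>-jar</arg>
              <arg>maven/${project.build.finalName}.jar</arg>
            </exec>
          </entryPoint>
        </build>
      </image>
    </images>
  </configuration>
</plugin>
```

With this in place, `mvn package docker:build` builds the image locally.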

Notice that I had to make a manual copy of a JAR file, as described in Thorntail's issue #951. At first I thought I was probably missing something and that it might not be a bug. Indeed, it wasn't a bug: Bob commented on the issue, and the solution is simply not to use the thin mode.

To get help I pinged Thorntail users on the #thorntail IRC channel on freenode, and Ken Finnigan helped me; Bob had also helped me get started with Thorntail when I wrote "". Both are Thorntail core contributors, and they (along with other Thorntail users) were so kind and patient with me that I had to mention them here!

Step 2: Test the image


Testing the image is simple. Run it using docker:

docker run -di -p 8080:8080 fxapps-image-classifier/app

It will start the app and bind it to your local port 8080. Once the container is running you can test it using curl:

curl  http://localhost:8080

It is working if you see a big JSON response!

Step 3: Push the image


I followed the instructions from the Docker documentation, and the image is now public, meaning that you can pull it and run it locally.
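The push itself boils down to tagging the local image with the repository name and pushing it. Here is a minimal sketch, assuming you have already authenticated with docker login and using the image names from this post; the docker check is just a guard so the snippet is safe to run on machines without Docker:

```shell
# Image names taken from this post; "docker login" is assumed to have been run already
LOCAL_IMAGE="fxapps-image-classifier/app"
REMOTE_IMAGE="docker.io/jesuino/image-classifier"

# Only run the docker commands if the docker CLI is available
if command -v docker >/dev/null 2>&1; then
  docker tag "$LOCAL_IMAGE" "$REMOTE_IMAGE"   # retag the local build for Docker Hub
  docker push "$REMOTE_IMAGE"                 # publish it to the public repository
fi
```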

If you have Docker and want to give it a try, first pull the image:

docker pull jesuino/image-classifier

Then run it:

docker run -di -p 8080:8080 docker.io/jesuino/image-classifier

Follow the logs until you see that Thorntail has started:

INFO  : Sat May 26 03:04:32 UTC 2018 [io.thorntail.kernel] THORN-000999: Thorntail started in 71.896s

Notice that it took more than a minute to start because I didn't use a custom model; instead, I let the app download one from the DeepLearning4J Zoo.

Once Thorntail has started, you can test the service: curl http://localhost:8080


Conclusion


Yes, Thorntail doesn't even have a final release yet (we have been using snapshot releases), and it is already cloud ready. The next steps are to improve the microservice by adding a health check and then deploy it to OpenShift with a real-world application!
