
Posts

How To Use Keras Trained CNN Models

Introduction
Keras is a popular deep learning API. It can run on top of the TensorFlow, CNTK, and Theano frameworks. Keras provides an easy-to-use interface that makes deep learning practice straightforward. It is widely used, so resources are easily accessible.

Objective
This article aims to give introductory information about using a Keras-trained CNN model for inference. It does not cover CNN training.

Audience
This article assumes basic knowledge of Python and Convolutional Neural Networks. Readers new to these topics may first start with the following resources. For Python, use Python For Beginners. For Convolutional Neural Networks, use CS231n Convolutional Neural Networks for Visual Recognition.

Software Installation
Keras is a high-level API. It requires a back-end framework to be installed. In this article, TensorFlow is used. Keras can transparently select CPU or GPU for processing. If use ...
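A minimal inference sketch along these lines is shown below. The model path "model.h5", the image "cat.jpg", and the 224x224 input size are placeholders, not values from the article; substitute your own trained model and input shape.

```python
# Sketch: inference with a Keras-trained CNN. "model.h5" and the
# 224x224 RGB input size are hypothetical placeholders.
import os

def scale_pixels(pixels):
    """Scale raw 0-255 pixel values into the [0, 1] range that CNNs
    are commonly trained on."""
    return [p / 255.0 for p in pixels]

MODEL_PATH = "model.h5"  # hypothetical path to the trained model

if os.path.exists(MODEL_PATH):
    import numpy as np
    from PIL import Image
    from keras.models import load_model

    model = load_model(MODEL_PATH)          # restores weights + architecture
    img = Image.open("cat.jpg").resize((224, 224))
    x = np.asarray(img, dtype=np.float32) / 255.0
    x = np.expand_dims(x, axis=0)           # Keras expects a batch dimension
    probs = model.predict(x)                # shape: (1, num_classes)
    print(int(np.argmax(probs[0])))         # index of the most likely class
```

Note that `load_model` restores both the architecture and the trained weights, so no rebuilding of the network is needed before calling `predict`.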

Using Docker Container for MySQL Service

Problem
Suppose you are working on a project that relies on the MySQL service, but you do not want to install MySQL on your system. Using a Docker image can be a good solution. Remember that you can use many services through Docker containers.

Steps
1- Install Docker
Docker Community Edition can be freely downloaded from the official page. It requires registration. https://www.docker.com/community-edition
Install Docker. It may require a restart; if it does, do so. In case any problems occur that prevent Docker for Windows from starting, check the troubleshooting page. https://docs.docker.com/docker-for-windows/troubleshoot/
In Windows 10, the hypervisor launch type needs to be "auto". To check the current status, the bcdedit command can be used via PowerShell (Administrator Mode). To set it to "auto", the following command can be used. bcdedit /set hypervisorlaunchtype auto
2- Find the Relevant Container
Go to Docker Store. Type M...
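The eventual goal of the steps above is a `docker run` against the official mysql image. As a sketch, the invocation can be composed and driven from Python; the container name, password, and port mapping below are placeholders, not values from the article.

```python
# Sketch: composing the `docker run` command for the official mysql
# image. Name, password, and port are hypothetical placeholders.
import shutil

def mysql_run_command(name, password, port=3306):
    """Compose the docker CLI invocation for a detached MySQL container."""
    return ["docker", "run", "--name", name,
            "-e", f"MYSQL_ROOT_PASSWORD={password}",
            "-p", f"{port}:3306",
            "-d", "mysql"]

cmd = mysql_run_command("my-mysql", "secret")
print(" ".join(cmd))

if shutil.which("docker") is None:
    print("docker CLI not found -- install Docker first")
# To actually launch the container: subprocess.run(cmd, check=True)
```

The `-p 3306:3306` mapping exposes the container's MySQL port on the host, so existing clients can connect as if MySQL were installed locally.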

Facial Landmark Detector

Dlib is a popular library. It can be used for face detection or face recognition. In this article I will use it for facial landmark detection. Facial landmarks are facial features like the nose, eyes, mouth, or jaw.
Start by installing the Dlib library. Dlib requires the Boost libraries.
sudo apt-get install libboost-all-dev
Now we can install Dlib.
sudo pip install dlib
The following example uses the PIL and numpy packages. Instead of Pillow, it is possible to use the skimage package.
pip install Pillow
pip install numpy
Note that in order to detect facial landmarks, a previously trained model file is needed. You can download one from the Dlib site, from this link: http://dlib.net/files/shape_predictor_68_face_landmarks.dat.bz2
After the download is completed, extract the archive and make sure its location is correctly referenced in the source file. The application first tries to detect faces in the given image. After that, for each face, it tries to detect landmarks. For ...
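The detect-faces-then-detect-landmarks flow can be sketched as follows. The image name "face.jpg" is a placeholder, and the model file is assumed to have been extracted next to the script; dlib's 68-point predictor returns one point per landmark via `shape.part(i)`.

```python
# Sketch: 68-point facial landmark detection with dlib. Assumes the
# extracted model file sits next to the script and "face.jpg" exists.
import os

def bounding_box(points):
    """Axis-aligned bounding box of a list of (x, y) landmark points."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

MODEL = "shape_predictor_68_face_landmarks.dat"

if os.path.exists(MODEL):
    import dlib
    import numpy as np
    from PIL import Image

    detector = dlib.get_frontal_face_detector()
    predictor = dlib.shape_predictor(MODEL)

    img = np.asarray(Image.open("face.jpg").convert("RGB"))
    for face in detector(img, 1):        # first: detect faces
        shape = predictor(img, face)     # then: landmarks per face
        points = [(shape.part(i).x, shape.part(i).y) for i in range(68)]
        print(bounding_box(points))      # rough extent of the landmarks
```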

Prepare a Ubuntu System for Deep Learning

An Ubuntu Deep Learning System
A. Install latest Nvidia drivers
1- Run the following commands to add the latest drivers from the PPA.
sudo add-apt-repository ppa:graphics-drivers/ppa
sudo apt update
2- Then use the Ubuntu Software & Updates > Additional Drivers application to update your driver. For my GTX-1070 I chose the driver with version 384.69.
3- After installation, restart your PC. You may need to disable secure boot using the BIOS menu.
4- Run the following command to ensure that the drivers are installed correctly.
lsmod | grep nvidia
5- If you have issues with the new driver, remove it with the following command.
sudo apt-get purge nvidia*
For more information see: https://askubuntu.com/questions/...

Java Custom ClassLoader

Problem
I want my previously written Java application to be run multiple times by a shell script with calculated parameters. However, target platforms may not contain bash, so I decided to write another class which acts like a bash script file and launches my application with generated parameters. My application has static variables and static classes. Before every launch I want the state information to be cleared. But since I work in the same JVM, static objects will not be removed. It turns out that using static fields and initializers may not be a good approach. There should be a way to solve this.
The solution is to use a custom classloader. The classloader will load all classes. At the end of the execution, the classloader and all the classes it loaded will be garbage collected. At the next execution, all classes will be reloaded.

Solution: Custom Class Loader
Class loaders can be considered a container to launch an application. Servlet containers like Tomcat use a custom classl...
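The article's solution is a Java custom ClassLoader; purely as an illustration of the underlying reset-by-reloading idea, the following Python analogue shows how module-level ("static") state survives within one load but is discarded when the module is loaded fresh, much like discarding a classloader and its classes. The module name and file are invented for the demo.

```python
# Analogue (not the article's Java code): reloading a module from
# scratch resets its module-level state, just as a new classloader
# gives Java classes fresh static state.
import importlib.util
import os, tempfile, textwrap

# A tiny module with module-level state (stand-in for static fields).
src = textwrap.dedent("""
    counter = 0
    def bump():
        global counter
        counter += 1
        return counter
""")
path = os.path.join(tempfile.mkdtemp(), "app_state.py")
with open(path, "w") as f:
    f.write(src)

def fresh_load(name, path):
    """Load the module from scratch, bypassing any cached copy."""
    spec = importlib.util.spec_from_file_location(name, path)
    mod = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(mod)
    return mod

m1 = fresh_load("app_state", path)
m1.bump(); m1.bump()
print(m1.counter)                    # 2 -- state accumulated

m2 = fresh_load("app_state", path)   # fresh "container": state is reset
print(m2.counter)                    # 0
```

In the Java version, the same effect comes from letting the old classloader (and every class it loaded) become unreachable so the garbage collector can discard the static state.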

Hadoop Cluster Installation Document

This document shows my experience following the Apache document titled “Hadoop Cluster Setup” [1], which is for Hadoop version 3.0.0-Alpha2. This document is the successor to the Hadoop Installation Document - Standalone [2]. The “ubuntul_hadoop_master” machine is used in the rest of this document. You will need to read and follow the Hadoop Installation Document - Standalone [2] before reading any further.

A. Prepare the guest environments for slave nodes.
It is easy to clone virtual machines using VirtualBox. Right click “ubuntul_hadoop_master” and clone. Name the new VM “ubuntul_hadoop_slave1”. You can have as many slaves as you like. Since we simply clone the master machine, much of the configuration comes ready. In practice, slave nodes need more disk space while the master node needs more memory, but this is an educational setup and these details are not necessary.

B. Install Hadoop
Hadoop comes installed with “ubuntul_hadoop_master”.

C. Running Cluster
In maste...
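For the master to start daemons on the slaves, their hostnames typically go into Hadoop's workers file on the master (etc/hadoop/workers in Hadoop 3.x; earlier releases called it slaves). A sketch assuming the VM name used above:

```
ubuntul_hadoop_slave1
```

One hostname per line; add a line for each additional cloned slave.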

Hadoop Installation Document - Standalone Mode

This document shows my experience following the Apache document titled “Hadoop: Setting up a Single Node Cluster” [1], which is for Hadoop version 3.0.0-Alpha2 [2].

A. Prepare the guest environment
Install VirtualBox. Create a virtual 64-bit Linux machine. Name it “ubuntul_hadoop_master”. Give it 500MB of memory. Create a VMDK disk which is dynamically allocated up to 30GB. In the network settings, in the first tab you should see Adapter 1 enabled and attached to “NAT”. In the second tab, enable Adapter 2 and attach it to “Host-only Adapter”. The first adapter is required for the internet connection; the second one is required for letting outside connections reach a guest service. In the storage settings, attach a Linux ISO file to the IDE channel. Use any distribution you like. Because of the small installation size, I chose the minimal Ubuntu ISO [1]. In the package selection menu, I left only the standard packages selected.
Log in to the system.
Set up the JDK.
$ sudo apt-get install openjdk-8-jdk
Install ssh and pdsh, if...
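After installing the JDK, Hadoop needs JAVA_HOME set in its etc/hadoop/hadoop-env.sh. A sketch assuming the openjdk-8-jdk package above, which installs to the standard Ubuntu path on amd64:

```shell
# In etc/hadoop/hadoop-env.sh -- point Hadoop at the installed JDK.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
```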