Sunday, April 26, 2020

Build googletest on Mac OS (solve -Werror,-Wc++11-extensions)

When I try to build googletest on my MacBook using CMake, the following errors appear:
error: deleted function definitions are a C++11 extension [-Werror,-Wc++11-extensions]
GTEST_DISALLOW_ASSIGN_(RE);
...
note: expanded from macro 'GTEST_DISALLOW_COPY_AND_ASSIGN_'
GTEST_DISALLOW_ASSIGN_(type)
note: expanded from macro 'GTEST_DISALLOW_ASSIGN_'
type& operator=(type const &) = delete
From the error message, it seems that the compiler is not using the correct C++ standard. Passing the CMake parameter -DCMAKE_CXX_STANDARD="17" solves the problem. Below are the full commands:
git clone https://github.com/google/googletest
cd googletest/
mkdir build
cd build/
cmake -DCMAKE_CXX_STANDARD="17" ../
make
sudo make install

Using static or extern as a global variable in C++

Good practice for using an extern variable:
Declare the extern variable in a .h file
extern int outputMonth;

Define and initialize it outside of any function (for example, before the main function) in exactly one .cpp file
int outputMonth = 50;

A static global variable has internal linkage (extern uses external linkage), so each .cpp file gets its own copy of the value. It is therefore not recommended for sharing a global variable across files.

Monday, April 20, 2020

Deploy C++ Program to CentOS Server with Similar Configuration

Here is a simple tutorial that shows how to deploy a C++ program to a Linux server that has a similar architecture and software packages as your development Linux workstation.

  1. Compile your C++ program into an executable
  2. Copy the shared library dependencies. A shell script can be used to copy all dependent dynamic libraries into one folder. (Note: you may want to keep only the custom libraries you use and remove the libraries provided by the operating system.)
  3. Copy your C++ program and the library folder to the Linux server you want to deploy on, then use the following command to add the library folder to the link library path
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/your_library_path
  • Note: if you want this setting to be persistent, you need to add this command to your .bashrc or .bash_profile file.
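The dependency-copying step (item 2 above) can be sketched with ldd; the function and folder names below are my own, not the original script:

```shell
#!/bin/bash
# Copy the shared libraries an executable depends on into one folder.
copy_deps() {
    local bin="$1" dest="$2"
    mkdir -p "$dest"
    # ldd prints lines like "libc.so.6 => /lib64/libc.so.6 (0x...)";
    # field 3 is the resolved path of each dependency.
    ldd "$bin" | awk '/=> \//{print $3}' | while read -r lib; do
        cp "$lib" "$dest/"
    done
}

# Example: collect the dependencies of /bin/ls into ./libs
copy_deps /bin/ls libs
```

On the server you would then point LD_LIBRARY_PATH at this folder as shown above.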

Saturday, April 18, 2020

Build CentOS7 C++ Compiling Environment with MPI and Boost Library

In this tutorial, I will show how to build a C++ compiling environment from a fresh CentOS installation. Specifically, OpenMPI and the Boost library are included in this environment.

First, use the following command to update the installed CentOS packages

yum update -y

Then, group-install the Development Tools package group, which includes gcc.

yum groupinstall "Development Tools" -y

Use the following command to install OpenMPI

yum install openmpi openmpi-devel -y

Since the default version of CMake in CentOS 7 is rather old, we will install a relatively new version. The commands below install some dependent libraries

yum install wget -y
yum install openssl-devel -y

Run the below commands to build and install CMake 3.17

wget https://github.com/Kitware/CMake/releases/download/v3.17.0/cmake-3.17.0.tar.gz
tar -zxvf cmake-3.17.0.tar.gz
cd cmake-3.17.0 
./bootstrap --prefix=/usr/local
make && make install

Now we download the Boost library 1.68.0

wget https://dl.bintray.com/boostorg/release/1.68.0/source/boost_1_68_0.tar.gz
tar -zxvf boost_1_68_0.tar.gz
cd boost_1_68_0

Since we need the Boost MPI library, make sure you load the OpenMPI module

source /etc/profile.d/modules.sh
module load mpi/openmpi-x86_64 

Use the commands below to compile and install the Boost library

./bootstrap.sh
echo "using mpi ;" >> project-config.jam
./b2 threading=multi --with-program_options --with-filesystem --with-serialization --with-mpi --with-system install

Note that the "using mpi ;" line is essential to tell the Boost build script to compile the Boost MPI library. If you need other Boost libraries, just add the option --with-xxx, where xxx is the library name.

Mission accomplished!

Thursday, April 16, 2020

Simple Facebook Prophet Tutorial: Predicting S&P Index

Stock prices are very difficult to predict, since they are influenced by many complex factors that are not easily quantified. In this post, I will use the Facebook Prophet package to do a simple S&P 500 Index prediction for tutorial purposes. Facebook Prophet is based on an additive model and accounts for non-linear components by including seasonality and holiday effects. The data I use is from Yahoo Finance, between 1998-01-01 and 2018-12-31. The majority of the data (1998-01-01 to 2017-12-31) is used as training data and the rest is used for validation.

First, load the necessary libraries
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
Load the S&P 500 data from yahoo finance
sp500 = pd.read_csv("sp500_yahoo_finance.csv", parse_dates=['Date'])
Let us first look at the data
sp500.head()

As we can see, the data contains different price columns like "Open", "High", and "Low", as well as "Volume". Here we use the "Close" column as our prediction target.

Next, the raw data is split into training data and test data

trainSp500 = sp500[sp500.Date<'2018-01-01'][['Date','Close']]
testSp500 = sp500[sp500.Date>='2018-01-01'][['Date','Close']]
The Facebook Prophet library is loaded, and the data is converted to the format required by Prophet (columns ds and y).
from fbprophet import Prophet
data = pd.DataFrame({'ds':trainSp500['Date'].values, 'y':trainSp500['Close'].values})
All the preparation is done. Here we fit the model on the training data
model = Prophet()
model.fit(data)
We make a one-year prediction.
future = model.make_future_dataframe(periods=365) # forecast one year ahead
forecast = model.predict(future)
In the end, the following graph is made to show how good the prediction is. Note that the vertical line represents the separation point between training and testing data. We can see Prophet did a pretty good job of fitting the training set. However, the prediction on the testing set is quite off, and the big dip is not reflected in the prediction at all.
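The graph can be reproduced with a sketch like the following. Since the Yahoo Finance file and the fitted model come from the code above, synthetic stand-ins are used here so the snippet runs on its own (the variables mirror the post's sp500 and forecast):

```python
import numpy as np
import pandas as pd
import matplotlib
matplotlib.use('Agg')  # headless backend so the script runs without a display
import matplotlib.pyplot as plt

# Synthetic stand-ins for the real series used in the post
dates = pd.date_range('2017-01-01', periods=730, freq='D')
actual = pd.Series(np.linspace(2200.0, 2700.0, 730), index=dates)
predicted = actual * 1.01  # placeholder for forecast['yhat']

fig, ax = plt.subplots()
ax.plot(actual.index, actual.values, label='Actual Close')
ax.plot(predicted.index, predicted.values, label='Prophet forecast (yhat)')
# Vertical line at the train/test split date used in the post
ax.axvline(pd.Timestamp('2018-01-01'), color='gray', linestyle='--')
ax.legend()
fig.savefig('sp500_forecast.png')
```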





Saturday, April 11, 2020

Comparison of np.dot and the * operator

np.dot calculates the dot product of two arrays. For 2-D arrays, the matrix multiplication result is returned; for 1-D arrays, it is the inner product of the vectors.

The * operator calculates the element-wise product. If the dimensions do not match, NumPy will expand them to compatible shapes (this is called broadcasting).

The best way to explain this is through some actual code
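A minimal illustration (the array values are my own):

```python
import numpy as np

a = np.array([[1, 2],
              [3, 4]])
b = np.array([[5, 6],
              [7, 8]])

# np.dot on 2-D arrays performs matrix multiplication
print(np.dot(a, b))  # [[19 22]
                     #  [43 50]]

# * performs element-wise multiplication on same-shaped arrays
print(a * b)         # [[ 5 12]
                     #  [21 32]]

# Broadcasting: the 1-D array [10, 100] is expanded across the rows of a
print(a * np.array([10, 100]))  # [[ 10 200]
                                #  [ 30 400]]
```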


Sunday, April 5, 2020

Docker Machine mapping host drive to container

To share files between the host and a Docker container, we need to map a directory on the host to a path in the container using the option below with docker run

-v path_in_the_host:path_in_the_container

This option works on a Linux host, but does not directly work with Docker Machine. One easy workaround is to put your folder in C:\Users on your Windows host, then use the command below

docker run -dit -v /c/Users/folder_name:/mnt/data --name container_name image_name

After this, your folder "folder_name" in C:\Users will be accessible inside the container at /mnt/data.