Logistic Regression with Keras
Logistic Regression (LR) is a simple yet quite effective method for binary classification tasks. There are many open-source machine learning libraries you can use to build LR models.
Keras (with TensorFlow as its back end) is a powerful tool for quickly coding up machine learning models. Its main use case is building and deploying deep neural networks, but an LR model can be viewed as a special case of a neural network (i.e., a single-layer model with no hidden layers), so Keras is naturally well-suited to the task.
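To make the "single-layer network" view concrete, here is a minimal NumPy sketch of what such a layer computes for the binary case: a weighted sum of the features passed through a sigmoid. The weights, bias, and input below are made-up values for illustration only.

```python
import numpy as np

def sigmoid(z):
    # The logistic function maps any real number into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical learned weights, bias, and a single input example
w = np.array([0.5, -0.25, 1.0])
b = 0.1
x = np.array([1.0, 2.0, 3.0])

# A logistic regression model is exactly one dense unit with a sigmoid:
p = sigmoid(np.dot(w, x) + b)  # predicted probability of the positive class
```

Training a one-layer Keras network with a cross-entropy loss fits exactly the `w` and `b` of this formula.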
Now let us see it in action. As with any machine learning project, remember to clean and pre-process your data first, and then compute informative features for the model to learn from.
import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.regularizers import L1L2

# Prepare your data, such as:
x_train = np.array([...])  # numpy array of shape (num_samples, num_features)
y_train = np.array([...])  # one-hot encoded labels, shape (num_samples, 2)
x_val = np.array([...])    # validation features
y_val = np.array([...])    # validation labels

num_features = x_train.shape[1]  # input dimension = number of features your data has

# Set up the logistic regression model
model = Sequential()
model.add(Dense(2,  # number of classes
                activation='softmax',
                kernel_regularizer=L1L2(l1=0.0, l2=0.1),
                input_dim=num_features))
model.compile(optimizer='sgd',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
model.fit(x_train, y_train, epochs=100, validation_data=(x_val, y_val))
The above code builds a single-layer, densely connected network. It uses L2 regularization with coefficient 0.1, but you should determine the best value by cross-validation on your own data. Similarly, you may want to train for more or fewer epochs depending on your situation. Such hyperparameter tuning should be done on a case-by-case basis.
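The tuning loop itself is simple to sketch: try several regularization strengths, score each model on held-out validation data, and keep the best. For a self-contained illustration, scikit-learn's LogisticRegression stands in for the Keras model here (its `C` parameter is the inverse of the L2 coefficient), and the data is a made-up pair of Gaussian blobs; the same loop structure applies to the Keras model above.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
# Hypothetical synthetic data: two well-separated Gaussian blobs
x_train = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y_train = np.array([0] * 50 + [1] * 50)
x_val = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(3, 1, (20, 2))])
y_val = np.array([0] * 20 + [1] * 20)

best_l2, best_acc = None, -1.0
for l2 in [0.001, 0.01, 0.1, 1.0, 10.0]:
    # In scikit-learn, C is the inverse of the regularization strength
    clf = LogisticRegression(C=1.0 / l2).fit(x_train, y_train)
    acc = accuracy_score(y_val, clf.predict(x_val))
    if acc > best_acc:
        best_l2, best_acc = l2, acc
```

For more thorough tuning, replace the single validation split with k-fold cross-validation and average the scores per candidate value.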
Thank you for reading.
I write about Data Science and Machine Learning topics. Follow me on Twitter.