john-philipp/ml-impl

ml-impl

Houses some machine learning and neural network implementations. Since GitHub Markdown restricts inline mathematics, I embed LaTeX-based PNGs for better readability.

Each dataset directory requires train and test subdirectories, used for training and inference, respectively.

Derivations

Logistic regression

As a basic example, I've implemented a logistic regression scheme using a sigmoid activation and gradient descent, in both NumPy and PyTorch. The code is maths-heavy. Below is a complete derivation of the single-sample case, as well as the vectorised extension that is actually implemented. We vectorise to reduce compute time by leveraging low-level optimised matrix multiplications instead of explicit top-level for-loops.

Full logistic regression derivation.
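The vectorised update described above can be sketched as follows. This is a minimal NumPy illustration of the technique, not the repository's actual code; the function names, learning rate, and epoch count are assumptions chosen for the example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic_regression(X, y, lr=0.1, epochs=1000):
    """Vectorised batch gradient descent for logistic regression.

    X: (m, n) design matrix; y: (m,) labels in {0, 1}.
    Illustrative sketch only; names and hyperparameters are assumed.
    """
    m, n = X.shape
    w = np.zeros(n)
    b = 0.0
    for _ in range(epochs):
        z = X @ w + b              # linear logit for all m samples at once
        a = sigmoid(z)             # predicted probabilities
        dz = a - y                 # gradient of cross-entropy loss w.r.t. z
        w -= lr * (X.T @ dz) / m   # batch-averaged gradient step
        b -= lr * dz.mean()
    return w, b

def predict(X, w, b):
    return (sigmoid(X @ w + b) >= 0.5).astype(int)
```

Note how the single matrix product `X.T @ dz` replaces an explicit loop over samples, which is exactly the vectorisation the derivation motivates.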

Hidden layer using ReLU

We extend the logistic regression case using a hidden layer based on the ReLU function.

Full derivation for ReLU hidden layer.
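The hidden-layer extension can be sketched in the same vectorised style: a ReLU hidden layer feeding a sigmoid output unit, with the ReLU gradient applied as a mask during backpropagation. Again, this is an assumed illustration; the layer width, initialisation scale, and hyperparameters are not taken from the repository.

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_one_hidden_layer(X, y, hidden=8, lr=0.5, epochs=2000, seed=0):
    """One ReLU hidden layer followed by a sigmoid output unit (sketch)."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W1 = rng.normal(0.0, 0.1, (n, hidden))  # input -> hidden weights
    b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.1, hidden)       # hidden -> output weights
    b2 = 0.0
    for _ in range(epochs):
        Z1 = X @ W1 + b1                    # pre-activations, (m, hidden)
        A1 = relu(Z1)
        z2 = A1 @ W2 + b2                   # output logit, (m,)
        a2 = sigmoid(z2)
        dz2 = a2 - y                        # output-layer error
        dW2 = A1.T @ dz2 / m
        db2 = dz2.mean()
        dZ1 = np.outer(dz2, W2) * (Z1 > 0)  # ReLU derivative acts as a mask
        dW1 = X.T @ dZ1 / m
        db1 = dZ1.mean(axis=0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    return W1, b1, W2, b2

def predict_hidden(X, params):
    W1, b1, W2, b2 = params
    return (sigmoid(relu(X @ W1 + b1) @ W2 + b2) >= 0.5).astype(int)
```

The `(Z1 > 0)` mask is the piecewise-constant derivative of ReLU, which is the only change the hidden layer introduces into the backward pass relative to the plain logistic regression case.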

About

A first-principles implementation of a basic logistic regression model based on a linear logit and a sigmoid activation function, trained via gradient descent. Its purpose is mainly to build foundational knowledge and document the process.
