LightML.jl


About

LightML.jl is a collection of reimplementations of general machine learning algorithms in Julia.

The purpose of this project is purely self-educational.

Fork

This repository is a fork of memoiry/LightML.jl. The default branch in this fork is master-gjl, which partially supports Julia v1.0: the package can be installed and many examples work. The master branch is kept in sync with the upstream master at https://github.com/memoiry/LightML.jl.

Why?

This project targets people who want to learn the internals of machine learning algorithms or implement them from scratch.

The code is much easier to follow than that of optimized libraries, and easier to experiment with.

All algorithms are implemented in Julia.

See the test function of each implementation for its detailed usage (listed below). Every model is constructed in a similar manner.

Installation

September 25, 2018: you need to install the development versions of Compose and Gadfly for Julia v1.0. See this issue.

julia> pkg"add Compose#master"
julia> pkg"add Gadfly#master"
julia> pkg"add Hexagons"

Make sure you have the correct Python dependency configured. You can use the Conda.jl package to install additional Python packages; after running "import Conda", Conda.PYTHONDIR gives the directory where Python was installed. On GNU/Linux systems, PyCall defaults to the python program (if any) found in your PATH.

The advantage of a Conda-based configuration is particularly compelling if you are installing PyCall in order to use packages like PyPlot.jl or SymPy.jl, as these can then automatically install their Python dependencies.

ENV["PYTHON"]=""
Pkg.add("Conda")
using Conda
Conda.add("python==2.7.13")
Conda.add("matplotlib")
Conda.add("scikit-learn")
Pkg.add("PyCall")
Pkg.build("PyCall")

or you can simply

Pkg.build("LightML")

This is equivalent to the procedure above.

Once every dependency is configured, run the command below to install the package.

Pkg.clone("https://github.com/memoiry/LightML.jl")

Running Implementations

Let's first try the large-scale spectral clustering example.

julia> using LightML
julia> LightML.load_examples()
julia> ?LSC_example
julia> LSC_example()

Figure 1: The smiley, spirals, shapes, and cassini datasets using LSC (large-scale spectral clustering)

Running Demo

julia> using LightML
julia> LightML.load_examples()
julia> LightML.demo()

Figure 2: The Digit Dataset using Demo algorithms

Current Implementations

Supervised Learning:

Unsupervised Learning:

Test Examples available

  • test_ClassificationTree()
  • test_RegressionTree()
  • test_label_propagation()
  • test_LDA()
  • test_naive()
  • test_NeuralNetwork()
  • test_svm()
  • test_kmeans_random()
  • test_PCA()
  • test_Adaboost()
  • test_BoostingTree()
  • test_spec_cluster()
  • test_LogisticRegression()
  • test_LinearRegression()
  • test_kneast_regression()
  • test_kneast_classification()
  • test_LSC()
  • test_GaussianMixture() (Fixing)
  • test_GDA() (Fixing)
  • test_HMM() (Fixing)
  • test_xgboost() (Fixing)

Contribution

Please examine the todo list for contribution details.

Any pull request is welcome.

Selected Examples

LinearRegression

using LightML
test_LinearRegression()

Figure 3: The regression Dataset using LinearRegression
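LightML's own implementation may differ; as a minimal sketch of the underlying idea, ordinary least squares can be solved with the normal equations via Julia's backslash operator (function names here are illustrative, not LightML's API):

```julia
using LinearAlgebra

function fit_linear(X::AbstractMatrix, y::AbstractVector)
    Xb = hcat(X, ones(size(X, 1)))    # append a bias column of ones
    coef = Xb \ y                     # least-squares solution
    return coef[1:end-1], coef[end]   # (weights, bias)
end

predict_linear(X, w, b) = X * w .+ b

X = reshape([1.0, 2.0, 3.0, 4.0], 4, 1)
y = [3.0, 5.0, 7.0, 9.0]              # exactly y = 2x + 1
w, b = fit_linear(X, y)
```

Because the data lie exactly on a line, the recovered weight and bias are 2 and 1.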

Adaboost

test_Adaboost()

Figure 4: The classification Dataset using Adaboost
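For intuition (this is an illustrative sketch, not LightML's implementation), AdaBoost with one-dimensional decision stumps fits the stump with the lowest weighted error each round, then reweights samples to emphasize mistakes:

```julia
# Labels are ±1; a stump is (threshold, polarity, alpha).
function adaboost_fit(x::Vector{Float64}, y::Vector{Int}, rounds::Int)
    n = length(x)
    w = fill(1.0 / n, n)                       # sample weights
    stumps = Tuple{Float64,Int,Float64}[]
    for _ in 1:rounds
        best_err, best_thr, best_pol = Inf, 0.0, 1
        for thr in x, pol in (1, -1)           # exhaustive stump search
            pred = [pol * (xi > thr ? 1 : -1) for xi in x]
            err = sum(w .* (pred .!= y))
            if err < best_err
                best_err, best_thr, best_pol = err, thr, pol
            end
        end
        eps = clamp(best_err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * log((1 - eps) / eps)     # stump vote weight
        pred = [best_pol * (xi > best_thr ? 1 : -1) for xi in x]
        w .*= exp.(-alpha .* y .* pred)        # upweight mistakes
        w ./= sum(w)
        push!(stumps, (best_thr, best_pol, alpha))
    end
    return stumps
end

function adaboost_predict(stumps, xi)
    s = sum(a * p * (xi > t ? 1 : -1) for (t, p, a) in stumps)
    return s >= 0 ? 1 : -1
end

x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [-1, -1, -1, 1, 1, 1]
model = adaboost_fit(x, y, 5)
</antml```

On this separable toy data a single stump already classifies perfectly, so the ensemble does too.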

SVM

test_svm()

Figure 5: The classification Dataset using SVM
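As a rough sketch of what a linear SVM optimizes (not LightML's solver, which may use a different method), one can run subgradient descent on the regularized hinge loss:

```julia
using LinearAlgebra

function svm_fit(X::Matrix{Float64}, y::Vector{Int}; lambda=0.01, eta=0.1, epochs=200)
    n, d = size(X)
    w = zeros(d)
    b = 0.0
    for _ in 1:epochs
        for i in 1:n
            margin = y[i] * (dot(X[i, :], w) + b)
            if margin < 1
                # hinge loss active: step toward the violating sample
                w .+= eta .* (y[i] .* X[i, :] .- lambda .* w)
                b += eta * y[i]
            else
                w .-= eta .* lambda .* w   # only the regularizer pulls
            end
        end
    end
    return w, b
end

svm_predict(X, w, b) = Int.(sign.(X * w .+ b))

X = [1.0 1.0; 2.0 1.5; -1.0 -1.0; -2.0 -1.5]
y = [1, 1, -1, -1]
w, b = svm_fit(X, y)
```

The data are linearly separable, so the learned hyperplane classifies all four points correctly.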

Classification Tree

test_ClassificationTree()

Figure 6: The digit Dataset using Classification Tree

kmeans

test_kmeans_random()

Figure 7: The blobs Dataset using k-means
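The essence of k-means is Lloyd's algorithm: alternate between assigning points to the nearest centroid and recomputing each centroid as the mean of its assigned points. A minimal sketch (with naive initialization, not LightML's version):

```julia
function kmeans_simple(X::Matrix{Float64}, k::Int; iters=20)
    n = size(X, 1)
    centroids = X[1:k, :]                 # naive init: first k points
    assign = zeros(Int, n)
    for _ in 1:iters
        for i in 1:n                      # assignment step
            dists = [sum(abs2, X[i, :] .- centroids[c, :]) for c in 1:k]
            assign[i] = argmin(dists)
        end
        for c in 1:k                      # update step
            members = findall(==(c), assign)
            isempty(members) && continue
            centroids[c, :] = vec(sum(X[members, :], dims=1)) ./ length(members)
        end
    end
    return assign, centroids
end

# Two well-separated blobs; the algorithm recovers them.
X = [0.0 0.0; 0.1 0.0; 0.0 0.1; 5.0 5.0; 5.1 5.0; 5.0 5.1]
assign, C = kmeans_simple(X, 2)
```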

LDA

test_LDA()

Figure 8: The classification Dataset using LDA
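For the two-class case, Fisher's LDA projects onto w = Sw⁻¹(μ₁ − μ₂), the direction that best separates the class means relative to the within-class scatter. An illustrative sketch (not LightML's code):

```julia
using Statistics, LinearAlgebra

function lda_direction(X1::Matrix{Float64}, X2::Matrix{Float64})
    μ1 = vec(mean(X1, dims=1))
    μ2 = vec(mean(X2, dims=1))
    S1 = (X1 .- μ1')' * (X1 .- μ1')   # within-class scatter, class 1
    S2 = (X2 .- μ2')' * (X2 .- μ2')   # within-class scatter, class 2
    return (S1 .+ S2) \ (μ1 .- μ2)    # w = Sw⁻¹ (μ1 − μ2)
end

X1 = [0.0 0.0; 1.0 0.2; 0.2 1.0; 0.8 0.8]   # class 1, near the origin
X2 = [4.0 4.0; 5.0 4.2; 4.2 5.0; 4.8 4.8]   # class 2, near (4.5, 4.5)
w = lda_direction(X1, X2)
μ = vec(mean(vcat(X1, X2), dims=1))         # midpoint threshold
classify(x) = dot(w, x .- μ) > 0 ? 1 : 2
```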

PCA

test_PCA()

Figure 9: The Digit Dataset using PCA
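A compact way to state PCA (sketch only; LightML's implementation may differ) is to take the SVD of the centered data matrix: the right-singular vectors are the principal directions, and projecting onto them gives the scores:

```julia
using Statistics, LinearAlgebra

function pca_fit(X::Matrix{Float64}, ncomp::Int)
    Xc = X .- mean(X, dims=1)      # center each feature
    F = svd(Xc)
    components = F.V[:, 1:ncomp]   # principal directions (columns)
    scores = Xc * components       # data projected onto them
    return components, scores
end

# Points lying almost on the line y = x, so the first principal
# direction should be close to [1, 1] / √2 (up to sign).
X = [1.0 1.1; 2.0 1.9; 3.0 3.2; 4.0 3.8]
comps, scores = pca_fit(X, 1)
```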
