# JLBoostMLJ.jl
The [MLJ.jl](https://github.com/alan-turing-institute/MLJ.jl) interface to [JLBoost.jl](https://github.com/xiaodaigh/JLBoost.jl), a hackable implementation of Gradient Boosting Regression Trees.
## Usage Example
```julia
using RDatasets;
iris = dataset("datasets", "iris");

# Binary target: is the observation a setosa?
iris[!, :is_setosa] = iris.Species .== "setosa";

using MLJ, JLBoostMLJ;

# X gets every column except the target and the raw Species column; y is :is_setosa
X, y = unpack(iris, x -> !(x in [:is_setosa, :Species]), ==(:is_setosa));

using JLBoostMLJ: JLBoostClassifier;
model = JLBoostClassifier()
```
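The fields tuned later in this README (`nrounds`, `max_depth`, `eta`) are ordinary hyperparameters of the model struct and can be inspected or changed directly; a minimal sketch (the default values printed by `@show` depend on the installed version):
```julia
# Inspect the hyperparameters that the tuning section below searches over
@show model.nrounds model.max_depth model.eta

# MLJ models are mutable structs, so a hyperparameter can also be set in place
model.nrounds = 10
```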
### Using MLJ machines
Put the model and data in a machine
```julia
mljmachine = machine(model, X, y)
```
Fit model using machine
```julia
fit!(mljmachine)
```
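Once the machine is fitted, MLJ's generic accessors show what was learned; the `fitted_params` call here is the same one used for feature importance below (the exact contents of both depend on the model):
```julia
report(mljmachine)          # model-specific training report
fitted_params(mljmachine)   # learned parameters; its `.fitresult` field is used below
```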
Predict using machine
```julia
predict(mljmachine, X)
```
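`predict` returns probabilistic predictions (which is why `cross_entropy` is used as the tuning measure below). If hard class predictions or class probabilities are wanted, the usual MLJ helpers apply; a sketch, assuming standard probabilistic-classifier behaviour with the Boolean labels of `y`:
```julia
yhat = predict(mljmachine, X)   # one distribution per row
predict_mode(mljmachine, X)     # most likely class for each row
pdf.(yhat, true)                # probability of the `true` (setosa) class, assuming Boolean labels
```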
Feature importance using machine
```julia
feature_importance(fitted_params(mljmachine).fitresult, X, y)
```
#### Hyperparameter tuning
Data preparation: `y` needs to be converted to a categorical vector so that a probabilistic measure such as `cross_entropy` can be used
```julia
y_cate = categorical(y)
```
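The conversion matters because measures such as `cross_entropy` expect a finite (categorical) target; a quick check uses scientific types. A sketch (the exact scitypes shown depend on the ScientificTypes version):
```julia
scitype(y)        # element scitype Count — a plain Bool vector
scitype(y_cate)   # element scitype Multiclass{2} — a finite target
```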
Set up some hyperparameter ranges
```julia
using JLBoost, JLBoostMLJ, MLJ
jlb = JLBoostClassifier()
r1 = range(jlb, :nrounds,   lower = 1,   upper = 6)
r2 = range(jlb, :max_depth, lower = 1,   upper = 6)
r3 = range(jlb, :eta,       lower = 0.1, upper = 1.0)
```
Set up the self-tuning model and wrap it, together with the data, in a machine
```julia
tm = TunedModel(model = jlb, ranges = [r1, r2, r3], measure = cross_entropy)
m = machine(tm, X, y_cate)
```
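`TunedModel` defaults to a grid search with MLJ's default resampling; both can be made explicit when finer control is wanted. A sketch using MLJ's standard `Grid` and `CV` options (the names `tm2`/`m2` are just for illustration):
```julia
tm2 = TunedModel(model = jlb,
                 ranges = [r1, r2, r3],
                 measure = cross_entropy,
                 tuning = Grid(resolution = 5),  # 5 grid points per numeric range
                 resampling = CV(nfolds = 3))    # 3-fold cross-validation
m2 = machine(tm2, X, y_cate)
```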
Fit it!
```julia
fit!(m)
```
Inspect the tuned parameters
```julia
fitted_params(m).best_model.max_depth
fitted_params(m).best_model.nrounds
fitted_params(m).best_model.eta
```
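After `fit!`, the machine wraps the best model retrained on all of the supplied data, so it can be used directly for prediction, and its report records the search; a sketch using MLJ's generic tuning report:
```julia
predict(m, X)                   # predictions from the best model found by the search
report(m).best_history_entry    # evaluation attained by the best model
```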
### Simple Fitting
Fit the model with `verbosity = 1`
```julia
mljmodel = fit(model, 1, X, y)
```
Predict using the model
```julia
predict(model, mljmodel.fitresult, X)
```
#### Feature importance for simple fitting
One can obtain the feature importance using the `feature_importance` function
```julia
feature_importance(mljmodel.fitresult.treemodel, X, y)
```
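The `treemodel` stored in the fit result is an ordinary JLBoost.jl model, so the fitted trees themselves can also be inspected with JLBoost's own tools; a sketch, assuming JLBoost.jl's `trees` accessor:
```julia
using JLBoost: trees
trees(mljmodel.fitresult.treemodel)   # the vector of fitted boosted trees
```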