In the past two weeks I managed to code the following new surrogates:

- Linear surrogate
- Support vector machine
- Neural network
- Random forest
- Lobachevsky spline
I also had to make sure that these new surrogates worked with the optimization methods I had coded beforehand. It turns out I had been quite sloppy: in the end I had to rework a lot of the surrogates' data structures to make everything compatible.
Now, this seems like a lot of work. It actually was not that bad, because I took advantage of a number of great packages: GLM, Flux, LIBSVM and XGBoost.
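For instance, here is a rough sketch of how a support vector machine surrogate can be built on top of LIBSVM. The wrapper below is only illustrative (the actual type in Surrogates.jl may look different), but `svmtrain`, `svmpredict`, `EpsilonSVR` and `Kernel.RadialBasis` are LIBSVM.jl's own API:

```julia
using LIBSVM

# Training data for a 1D objective (LIBSVM wants one column per sample).
x = collect(range(0.0, 5.0, length = 30))
y = sin.(x)
X = reshape(x, 1, length(x))

# Epsilon-SVR turns the SVM machinery into a regression surrogate.
model = svmtrain(X, y, svmtype = EpsilonSVR, kernel = Kernel.RadialBasis)

# Evaluate the surrogate at three new points.
ypred, _ = svmpredict(model, [1.5 2.5 3.5])
```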
The definition and construction of a linear surrogate is quite easy.
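Here is a minimal sketch of what the 1D version might look like; the field names are illustrative, and the fit itself is just GLM's ordinary least squares via `lm` and `coef`:

```julia
using GLM

# Illustrative 1D linear surrogate backed by GLM's least squares.
mutable struct LinearSurrogate{X,Y,C,L,U}
    x::X
    y::Y
    coeff::C   # fitted coefficient(s)
    lb::L      # lower bound of the domain
    ub::U      # upper bound of the domain
end

function LinearSurrogate(x, y, lb::Number, ub::Number)
    ols = lm(reshape(x, length(x), 1), y)   # fit y ≈ coeff * x
    LinearSurrogate(x, y, coef(ols), lb, ub)
end

# Make the surrogate callable at a new point.
(lin::LinearSurrogate)(val::Number) = first(lin.coeff) * val
```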
The bounds are needed at construction time because the optimization methods need explicit limits on the search space. The N-dimensional case works the same way, because I still take advantage of GLM.
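As a quick usage sketch with the package's own `LinearSurrogate` (same idea as above), assuming the `surrogate_optimize`, `SRBF` and `SobolSample` names from Surrogates.jl, whose exact signatures may have changed since:

```julia
using Surrogates

f = x -> 2x + 1
lb, ub = 0.0, 10.0

x = sample(10, lb, ub, SobolSample())
y = f.(x)
lin = LinearSurrogate(x, y, lb, ub)

# The optimizer only searches inside [lb, ub], which is why the
# bounds must already be stored in the surrogate.
surrogate_optimize(f, SRBF(), lb, ub, lin, SobolSample())
```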
The Lobachevsky spline is super interesting. It is defined in this way:

$$f(\mathbf{x}) = \sum_{i=1}^{N} c_i \prod_{k=1}^{d} \varphi_n(\alpha x_k - \alpha x_{ik})$$

with $\alpha$ and $n$ parameters and $d$ the dimension of the problem. The inner basis $\varphi_n$ is defined in this way:

$$\varphi_n(x) = \frac{\sqrt{3n}}{2^n\,(n-1)!} \sum_{k=0}^{n} (-1)^k \binom{n}{k} \left(\sqrt{3n}\,x + n - 2k\right)_+^{n-1}$$
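In code, the basis is a direct transcription of the formula above (this is just the math written out, not the package's internal implementation):

```julia
# Univariate Lobachevsky basis φ_n, transcribed from the formula above.
function phi(n::Int, x::Real)
    c = sqrt(3n) / (2^n * factorial(n - 1))
    s = sum((-1)^k * binomial(n, k) * max(sqrt(3n) * x + n - 2k, 0)^(n - 1)
            for k in 0:n)
    return c * s
end
```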
By the central limit theorem, the $d$-variate Lobachevsky spline converges to the $d$-variate Gaussian: up to scaling, $\varphi_n$ is the density of a sum of $n$ i.i.d. uniform random variables. Hence Lobachevsky splines asymptotically behave like radial functions, even though they are not radial themselves.
Let's call our objective function $f$. If we are able to express it in the following way:

$$f(\mathbf{x}) \approx \sum_{i=1}^{N} c_i \prod_{k=1}^{d} \varphi_n(\alpha x_k - \alpha x_{ik})$$

then we can approximate $f$ with a Lobachevsky spline and, because there exists a closed form of the integral of each basis term, we get the integral of the approximation for free. Surrogates.jl makes this extremely easy, check it out:
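(The snippet below is a sketch: `sample`, `SobolSample`, `LobachevskySurrogate` and `lobachevsky_integral` are the package's names, though the exact signatures may have evolved since this was written.)

```julia
using Surrogates

# Objective with a known integral, to sanity-check the closed form.
f = x -> 3x^2 + 2x + 1
lb, ub = 0.0, 4.0

x = sample(20, lb, ub, SobolSample())
y = f.(x)

# alpha and n are the spline parameters from the definition above.
loba = LobachevskySurrogate(x, y, lb, ub, alpha = 2.0, n = 6)

# Closed-form integral of the surrogate over [lb, ub];
# the true integral of f over [0, 4] is 84.
lobachevsky_integral(loba, lb, ub)
```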
In these last two weeks I plan on writing docs, examples and tutorials, because a good package is useless if I am the only one who knows how to operate it. I would also love to finish the SOP optimization method, whose PR is still open, and to code up the MARS spline surrogate.
Anyway, I have a lot more ideas for this package, so the work will certainly not end after JSoC. I cannot wait for the last article that will wrap up these amazing three months!