Akshay Jain / Jun 27 2019
Remix of Julia by Nextjournal

JSoC'19: Stochastic Trace Estimation

Hutchinson's estimator

In 1989, Hutchinson came up with a general technique to transform the algebraic problem of computing a trace into the statistical problem of computing the expectation of a quadratic form.

Tr(A) = \mathbb{E}(\textbf{z}^T A\textbf{z})

Here, z is a random vector usually following either a Gaussian or Rademacher distribution, i.e., the z are independent vectors with zero-mean, unit-variance, uncorrelated entries.
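The estimator above can be sketched in a few lines of Julia. This is a minimal illustration of the identity, not the implementation in TraceEstimation.jl; the function name `hutchinson` and the sample count `m` are our own choices.

```julia
using LinearAlgebra, Statistics

# Hutchinson's estimator: average z' * A * z over m random Rademacher vectors z.
# For Rademacher z, E[z' A z] = tr(A) exactly; averaging reduces the variance.
function hutchinson(A, m)
    n = size(A, 1)
    samples = [(z = rand([-1.0, 1.0], n); dot(z, A * z)) for _ in 1:m]
    return mean(samples)
end

A = Symmetric(rand(100, 100)) + 100I   # a well-conditioned SPD test matrix
hutchinson(A, 2000)                    # close to tr(A)
```

More sample vectors give a tighter estimate; the variance of a single sample depends on the off-diagonal mass of A.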

How do we evaluate this quadratic form?

In the general case we have Tr(f(A)) = E(z^T f(A) z). One prominent question that arises from the use of this technique is which method to employ while evaluating f(A)z. For f(A) = A^{-1}, a quick resolution is to use an iterative solver such as CG to evaluate A^{-1}z. We can obtain a good estimate of Tr(A^{-1}) via this procedure in most cases, but if A is so ill-conditioned that even a stable iterative solver leaves a substantial backward error in the evaluation of A^{-1}z, then we will see a significant departure from the true value. While such a scenario does occur in practice, it is fairly uncommon; in the opposite, far more common scenario, where A is not highly ill-conditioned for our solver, we obtain good estimates. It is also possible to estimate these values via polynomial interpolation.
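The CG-based variant described above can be sketched as follows. This is a bare-bones conjugate gradient loop written for illustration (not the solver used inside TraceEstimation.jl), combined with the Hutchinson average; `cg_solve` and `hutchinson_inv` are hypothetical names.

```julia
using LinearAlgebra, Statistics

# Plain conjugate gradient for SPD A: solves A x = b iteratively,
# so we never form A^{-1} explicitly.
function cg_solve(A, b; tol=1e-10, maxiter=1000)
    x = zero(b); r = copy(b); p = copy(r); rs = dot(r, r)
    for _ in 1:maxiter
        Ap = A * p
        alpha = rs / dot(p, Ap)
        x .+= alpha .* p
        r .-= alpha .* Ap
        rs_new = dot(r, r)
        sqrt(rs_new) < tol && break
        p .= r .+ (rs_new / rs) .* p
        rs = rs_new
    end
    return x
end

# tr(A^{-1}) ~ mean of z' * (A^{-1} z), with A^{-1} z evaluated by CG.
function hutchinson_inv(A, m)
    n = size(A, 1)
    total = 0.0
    for _ in 1:m
        z = rand([-1.0, 1.0], n)
        total += dot(z, cg_solve(A, z))
    end
    return total / m
end
```

For a well-conditioned A, CG converges in few iterations and the backward error is negligible; ill-conditioning hurts both the solve and, through it, the trace estimate.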

Chebyshev Polynomial Interpolation

When working with the Stochastic Lanczos Quadrature method, we built a tridiagonal matrix to extract Ritz values (close approximations to the eigenvalues of the original matrix) and used them for trace estimation. Now, we take a different but fundamentally similar approach to the problem by interpolating the given function over the eigenvalues of the matrix via Chebyshev interpolation.

f(x) \approx p_n(x) = \sum_{j=0}^n c_j T_j\left(\frac{2}{b-a}x - \frac{b+a}{b-a}\right)

where p_n(x) is the interpolation polynomial of degree n, c_j is the interpolation coefficient, and T_j is the Chebyshev polynomial.
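For completeness, one standard way to obtain the coefficients c_j is Chebyshev-Gauss quadrature at the Chebyshev nodes (a textbook formula, not spelled out in the original post):

```latex
x_k = \cos\left(\frac{(k + \tfrac{1}{2})\,\pi}{n+1}\right), \quad k = 0, \dots, n

c_j = \frac{2 - \delta_{j0}}{n+1} \sum_{k=0}^{n} f\!\left(\frac{b-a}{2}\,x_k + \frac{b+a}{2}\right) T_j(x_k)
```

The nodes x_k live on [-1, 1], so the argument of f maps them back to the interval [a, b] containing the eigenvalues.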

The Chebyshev polynomials of the first kind are defined recursively as

T_0(x) = 1, \quad T_1(x) = x,
T_{j+1}(x) = 2xT_j(x) - T_{j-1}(x)

where x \in [-1, 1].
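The three-term recurrence above translates directly into code. A small sketch (the function name `chebyshev_polys` is ours):

```julia
# Evaluate T_0(x) .. T_n(x) via the three-term recurrence.
function chebyshev_polys(x, n)
    T = zeros(n + 1)
    T[1] = 1.0                 # T_0(x) = 1
    n >= 1 && (T[2] = x)       # T_1(x) = x
    for j in 2:n
        T[j + 1] = 2x * T[j] - T[j - 1]   # T_{j+1} = 2x T_j - T_{j-1}
    end
    return T
end
```

A quick sanity check is the identity T_j(cos θ) = cos(jθ), which the recurrence reproduces to machine precision.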

Cheby-Hutch Trace Estimator

With the power of Chebyshev interpolation, we can feed the interpolated values of f into our traditional Hutchinson estimator. A high-level description of the algorithm for a symmetric positive definite matrix A is given below:

  • Find λmin and λmax, the extreme eigenvalues of A
  • Calculate the values of the interpolation coefficients c_j
  • Calculate the polynomial p_n(x) to approximate f(x) over the eigenvalues of A
  • Form z^T p_n(A) z and feed it to the Hutchinson estimator to calculate Tr(f(A))
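The steps above can be sketched end-to-end in plain Julia. This is a minimal illustration under our own naming (`cheby_hutch_sketch`, `degree`, `m`), not the TraceEstimation.jl implementation; it assumes degree ≥ 2 and exact extreme eigenvalues, where a production version would estimate them cheaply (e.g. via a few Lanczos steps).

```julia
using LinearAlgebra, Statistics

# Minimal Cheby-Hutch sketch for tr(f(A)), A symmetric positive definite.
function cheby_hutch_sketch(A, f; degree=8, m=50)
    n = size(A, 1)
    a, b = eigmin(A), eigmax(A)                 # step 1: extreme eigenvalues

    # Step 2: coefficients c_j via Chebyshev nodes (T_j(cos θ) = cos(jθ)).
    thetas = [pi * (k + 0.5) / (degree + 1) for k in 0:degree]
    fx = [f((b - a) / 2 * cos(t) + (b + a) / 2) for t in thetas]
    c = [2 / (degree + 1) * sum(fx[k+1] * cos(j * thetas[k+1]) for k in 0:degree)
         for j in 0:degree]
    c[1] /= 2                                   # standard halving of c_0

    # Map the spectrum of A onto [-1, 1]: B = (2A - (b+a)I) / (b-a).
    B = (2 / (b - a)) * A - ((b + a) / (b - a)) * I

    # Steps 3-4: apply p_n(A) to random z via the vector recurrence,
    # then average z' p_n(A) z as in Hutchinson's estimator.
    est = 0.0
    for _ in 1:m
        z = rand([-1.0, 1.0], n)
        w0, w1 = z, B * z                       # T_0(B)z and T_1(B)z
        acc = c[1] .* w0 .+ c[2] .* w1
        for j in 2:degree
            w0, w1 = w1, 2 .* (B * w1) .- w0    # T_{j}(B)z from the recurrence
            acc .+= c[j + 1] .* w1
        end
        est += dot(z, acc)
    end
    return est / m
end
```

Note that only matrix-vector products with A appear, which is what makes the method attractive for large sparse matrices.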

Usage and Comparison

Code Snippet

Add the required modules

using Pkg; Pkg.add(PackageSpec(url="https://github.com/luca-aki/traceEstimation.jl"))
using LinearAlgebra, SparseArrays, Plots, TraceEstimation, BenchmarkTools

Create dummy random matrices, one dense and one sparse

A = rand(5000,5000)
# Make sure it is symmetric positive definite 
A = A + A' + 250I 
while(!isposdef(A))
  A = A + 100I
end
B = sprand(5000,5000,0.001)
B = B + B' + 250I
while(!isposdef(B))
  B = B + 100I
end

Now we benchmark Cheby-Hutch against the default method

@benchmark tr(inv(A))
BenchmarkTools.Trial:
  memory estimate:  193.21 MiB
  allocs estimate:  8
  --------------
  minimum time:     7.014 s (0.01% GC)
  median time:      7.014 s (0.01% GC)
  mean time:        7.014 s (0.01% GC)
  maximum time:     7.014 s (0.01% GC)
  --------------
  samples:          1
  evals/sample:     1
@benchmark chebyhutch(A, 4, 6, fn = inv)
BenchmarkTools.Trial:
  memory estimate:  950.14 KiB
  allocs estimate:  304
  --------------
  minimum time:     1.011 s (0.00% GC)
  median time:      1.031 s (0.00% GC)
  mean time:        1.038 s (0.00% GC)
  maximum time:     1.078 s (0.00% GC)
  --------------
  samples:          5
  evals/sample:     1

For sparse matrices, the performance gain is manyfold.

Bm = Matrix(B)
@benchmark tr(inv(Bm))
BenchmarkTools.Trial:
  memory estimate:  193.21 MiB
  allocs estimate:  8
  --------------
  minimum time:     8.025 s (0.01% GC)
  median time:      8.025 s (0.01% GC)
  mean time:        8.025 s (0.01% GC)
  maximum time:     8.025 s (0.01% GC)
  --------------
  samples:          1
  evals/sample:     1
@benchmark chebyhutch(B, 4, 6, fn = inv)
BenchmarkTools.Trial:
  memory estimate:  2.08 MiB
  allocs estimate:  342
  --------------
  minimum time:     34.848 ms (0.00% GC)
  median time:      40.622 ms (0.00% GC)
  mean time:        41.523 ms (0.40% GC)
  maximum time:     77.777 ms (0.00% GC)
  --------------
  samples:          121
  evals/sample:     1

Because Everyone Loves Graphs!

The following tests were performed on random dense matrices generated inside Julia.

Test details:

  • Default: tr(inv(A))
  • SLQ: slq(A, fn = inv, mtol = 0.005)
  • Cheby-Hutch: chebyhutch(A, 6, 25, fn = inv)

Test details:

  • Default: tr(sqrt(A))
  • SLQ: slq(A, fn = sqrt, mtol = 0.005)
  • Cheby-Hutch: chebyhutch(A, 6, 25, fn = sqrt)

Test details:

  • SLQ: slq(A, fn = inv, mtol = 0.005)
  • Cheby-Hutch: chebyhutch(A, 6, 25, fn = inv)

The following tests were performed on a sparse matrix downloaded from the SuiteSparse Matrix Collection and imported using MatrixMarket.jl.

Test details:

Matrix: Kuu.mtx (7102 x 7102), Condition number = 1.57e+04

  • SLQ: slq(A, mtol = X, eps = X, fn = inv)
  • Cheby-Hutch: chebyhutch(A, 15, X, fn = inv)

Up Next

Continuing the journey with trace estimation methods, I will be working on the Diagonal Approximation method. Along with that, I will also be working on a Sparse Approximate Inverse to serve as a preconditioner for Diagonal Approximation.

Thank you for reading!