`eigvals` doesn't accept a (Matrix, Symmetric) pencil, although `eigen` does #1002

It seems like this might benefit from the Cholesky factorization as well.
Comments
The case of a `(Matrix, Symmetric)` pencil could be handled by methods along these lines:

```julia
function LinearAlgebra.eigvals!(A::StridedMatrix{T}, B::Union{LinearAlgebra.RealHermSymComplexHerm{T},Diagonal{T}}; sortby::Union{Function,Nothing}=nothing) where {T<:Number}
    return _choleigvals!(A, B, sortby)
end

function _choleigvals!(A, B, sortby)
    U = cholesky(B).U
    vals = eigvals!(LinearAlgebra.UtiAUi!(A, U))
    return vals
end
```

Below are the benchmark results on my PC for the two options. First, option number 1, based on the generic function:

```julia
julia> @benchmark eigvals(A,B)
BenchmarkTools.Trial: 10000 samples with 10 evaluations.
Range (min … max): 1.219 μs … 169.958 μs ┊ GC (min … max): 0.00% … 97.47%
Time (median): 1.280 μs ┊ GC (median): 0.00%
Time (mean ± σ): 1.346 μs ± 2.285 μs ┊ GC (mean ± σ): 2.32% ± 1.38%
██▄▂
▄▆████▅▄▃▂▂▂▂▂▂▂▂▂▁▂▂▂▂▂▂▁▂▁▂▂▂▂▂▁▁▁▂▁▂▂▁▂▂▂▂▂▂▂▁▂▂▂▂▂▂▂▂▂▂ ▃
1.22 μs Histogram: frequency by time 2.21 μs <
 Memory estimate: 1.00 KiB, allocs estimate: 9.
```

Next, option number 2, with:

```julia
julia> @benchmark eigvals(A,Symmetric(B))
BenchmarkTools.Trial: 10000 samples with 10 evaluations.
Range (min … max): 1.788 μs … 111.861 μs ┊ GC (min … max): 0.00% … 95.43%
Time (median): 2.003 μs ┊ GC (median): 0.00%
Time (mean ± σ): 2.144 μs ± 2.383 μs ┊ GC (mean ± σ): 2.34% ± 2.13%
▅█▃
▃▆████▆▅▄▄▄▄▅▇█▇██▆▄▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▂▁▂▂▂▂▂▂▂▂▂▂▂ ▃
1.79 μs Histogram: frequency by time 3.37 μs <
 Memory estimate: 2.56 KiB, allocs estimate: 14.
```

It seems like we're better off using the generic function. Note: For completeness, |
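As background on why the Cholesky route above works (a minimal illustrative sketch, not code from the thread; the matrices are made up, and `B` must be positive definite): with `B = U'U`, the pencil `A x = λ B x` is equivalent to the standard problem `(U'⁻¹ A U⁻¹) y = λ y` with `y = U x`, so the reduced eigenvalues can be checked against the generic solver.

```julia
using LinearAlgebra

A = [1.0 2.0; 3.0 4.0]               # general (non-symmetric) square matrix
B = Symmetric([4.0 1.0; 1.0 3.0])    # symmetric positive definite

# Cholesky-based reduction to a standard eigenvalue problem
U = cholesky(B).U                    # B = U'U
M = (U' \ A) / U                     # U'⁻¹ * A * U⁻¹ without forming inverses
vals_chol = eigvals(M)

# Generic generalized eigenvalue path (GGEV-based) for comparison
vals_gen = eigvals(A, Matrix(B))

sort(vals_chol, by = real) ≈ sort(vals_gen, by = real)   # expected to hold up to roundoff
```

The explicit triangular solves here stand in for the in-place `LinearAlgebra.UtiAUi!` used in the methods above.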
@jishnub An example implementation, based on the above discussion, is here: aravindh-krishnamoorthy/julia@efbeb98 Could you please check if it works for you now? |
I find that this trend changes for larger matrices (the break-even is somewhere around 1000x1000):

```julia
julia> using LinearAlgebra, BenchmarkTools
julia> import LinearAlgebra: BlasReal, BlasComplex, HermOrSym
julia> A = Matrix(Tridiagonal(ones(2999), collect(Float64.(1:3000)), ones(2999)));
julia> B = Symmetric(Matrix(Diagonal(Float64[1:3000;])));
julia> function _choleigvals!(A, B, sortby)
           U = cholesky(B).U
           vals = eigvals!(LinearAlgebra.UtiAUi!(A, U))
           return vals
       end
_choleigvals! (generic function with 1 method)
julia> function eigvalsc!(A::StridedMatrix{T}, B::Union{LinearAlgebra.RealHermSymComplexHerm{T},Diagonal{T}}; sortby::Union{Function,Nothing}=nothing) where {T<:Number}
           return _choleigvals!(A, B, sortby)
       end
eigvalsc! (generic function with 1 method)
julia> function eigvalsm!(A::StridedMatrix{T}, B::HermOrSym{T,S}; sortby::Union{Function,Nothing}=nothing) where {T<:BlasReal,S<:StridedMatrix}
           return eigvals!(A, Matrix{T}(B); sortby)
       end
eigvalsm! (generic function with 1 method)
julia> @btime eigvalsm!($(copy(A)), $(copy(B)));
31.037 s (20 allocations: 69.76 MiB)
julia> @btime eigvalsc!($(copy(A)), $(copy(B)));
  16.086 s (23 allocations: 69.63 MiB)
```

This would suggest using the Cholesky approach, as evaluation times are relevant mainly for large matrices. |
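To locate the break-even point on a given machine, a rough scan like the following could be used (illustrative only; it assumes the `eigvalsm!`, `eigvalsc!`, and `_choleigvals!` definitions from the session above are in scope, and the test matrices are of the same family as above):

```julia
using LinearAlgebra, BenchmarkTools

# Rough scan over matrix sizes comparing the GGEV-based path (eigvalsm!) with
# the Cholesky-based path (eigvalsc!). Both functions mutate their arguments,
# so fresh copies are made inside the timed expression (a small overhead).
for n in (100, 500, 1000, 2000)
    A = Matrix(Tridiagonal(ones(n - 1), collect(Float64.(1:n)), ones(n - 1)))
    B = Symmetric(Matrix(Diagonal(Float64[1:n;])))
    tm = @belapsed eigvalsm!(copy($A), copy($B))
    tc = @belapsed eigvalsc!(copy($A), copy($B))
    println("n = $n: GGEV $(round(tm, digits=3)) s, Cholesky $(round(tc, digits=3)) s")
end
```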
Thank you for this info. Indeed, I can reproduce this on my end as well and note that for 1000x1000 matrices the Cholesky-decomposition-based method is better. However, there is still an issue with this approach, which I did not mention last time, and because of it I still prefer to delegate the calls to GGEV3: the Cholesky factorization fails when the `Symmetric` matrix is not positive definite:

```julia
julia> using LinearAlgebra
julia> A = [1 2; 2 1]
2×2 Matrix{Int64}:
1 2
2 1
julia> B = [2 1; 1 -2]
2×2 Matrix{Int64}:
2 1
1 -2
julia> eigen(A, Symmetric(B))
ERROR: PosDefException: matrix is not positive definite; Cholesky factorization failed.
Stacktrace:
 [...]
```

Perhaps this reveals to us that we need a positive-definite matrix class in the stdlib as well, which could really improve several other algorithms, such as inversion. Nevertheless, I have updated the comments in the commit. The relevant commits are as follows: aravindh-krishnamoorthy/julia@efbeb98 + aravindh-krishnamoorthy/julia@86e72bb. I will let this lie around for a bit to gather further comments and opinions... |
It should be possible to check whether the Cholesky decomposition fails, and dispatch to the generic method in that case. |
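A minimal sketch of what that fallback could look like (illustrative only; the name `eigvals_pencil` is made up and stands in for whatever method ends up handling this pencil, while the non-throwing `cholesky(...; check=false)` plus `issuccess` pattern is standard LinearAlgebra API):

```julia
using LinearAlgebra

# Hypothetical helper: try the fast Cholesky-based reduction first, and fall
# back to the generic dense (GGEV-based) path when B is not positive definite.
function eigvals_pencil(A::AbstractMatrix, B::Symmetric)
    F = cholesky(B; check = false)       # does not throw on an indefinite B
    if issuccess(F)
        U = F.U                          # B = U'U
        return eigvals((U' \ A) / U)     # eigenvalues of U'⁻¹ A U⁻¹
    else
        return eigvals(A, Matrix(B))     # generic generalized eigenvalue path
    end
end
```

With the indefinite `B = [2 1; 1 -2]` from the previous comment, a call like `eigvals_pencil(float([1 2; 2 1]), Symmetric(float([2 1; 1 -2])))` should quietly take the generic branch instead of throwing.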
@jishnub Thank you for your reply. But I still feel that attempting a Cholesky factorization on a general symmetric matrix is not ideal. Would this be acceptable? |
The way to check whether a matrix is positive definite in Julia is to perform a Cholesky factorization and see if it fails, so the implementation actually works the other way around. I understand your concerns, but given that a Cholesky decomposition provides a significant boost in a common real-life case, perhaps it's better to attempt it anyway? Are generalized eigenvalue problems with a non-positive-definite `B` common in practice? I wonder what @dkarrasch has to say about this. |
Sure, we can wait for comments from others. But I do not think that attempting a Cholesky decomposition on general symmetric matrices is a good idea. I'd rather have a new positive (semi-)definite (PSD) matrix type for such functions. That way, we could have the performance boost and, at the same time, be consistent. |
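To illustrate the idea (purely hypothetical; neither the type name nor the method below is proposed API anywhere in this thread), a dedicated positive-definite wrapper would let dispatch pick the Cholesky path without ever guessing:

```julia
using LinearAlgebra

# Hypothetical sketch of a positive-definite wrapper type. The constructor
# certifies positive definiteness once, so downstream methods can rely on it.
# Assumes a floating-point Symmetric input.
struct PosDefMatrix{T} <: AbstractMatrix{T}
    data::Matrix{T}
    chol::Cholesky{T,Matrix{T}}
end

function PosDefMatrix(A::Symmetric{<:Real})
    F = cholesky(A)                      # throws PosDefException if A is indefinite
    return PosDefMatrix(Matrix(A), F)
end

Base.size(P::PosDefMatrix) = size(P.data)
Base.getindex(P::PosDefMatrix, i::Int, j::Int) = P.data[i, j]

# Generalized eigenvalues of the pencil (A, B): with B = U'U they are the
# eigenvalues of U'⁻¹ A U⁻¹, and a PosDefMatrix B guarantees that U exists.
function LinearAlgebra.eigvals(A::AbstractMatrix, B::PosDefMatrix)
    U = B.chol.U
    return eigvals((U' \ A) / U)
end
```

A call like `eigvals(A, PosDefMatrix(Symmetric(B)))` then either fails loudly at construction time (if `B` is indefinite) or takes the fast Cholesky path, which is the consistency being argued for here.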
Such types already exist in packages, e.g. https://github.com/JuliaStats/PDMats.jl, so this doesn't need to be added to the stdlib. |
Fixed by JuliaLang/julia#49673. |