
Update sparspak #253

Merged
merged 4 commits into from
Jan 10, 2023

Conversation

@j-fu (Contributor) commented Jan 9, 2023

No description provided.

* Base sparspak interface on sparspaklu
* Introduce tests for MultiFloats and ForwardDiff
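For context, the solve pattern that the interface is now based on looks roughly like this (a sketch, not code from this PR; the matrix and right-hand side are made up):

```julia
using SparseArrays, Sparspak

A = sparse([1.0 2.0; 3.0 4.0])  # any invertible sparse matrix
b = [1.0, 2.0]

lu = sparspaklu(A)  # factor once
x = lu \ b          # reuse the factorization for repeated solves
```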
@j-fu (Contributor, Author) commented Jan 9, 2023

In fact, precompilation does not help much here: there are so many different generic floating-point types (and ForwardDiff generates them en masse with different tags anyway) that a hint in the docs telling users to handle precompilation themselves would probably make sense.
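To illustrate the tag issue (a sketch, not code from this PR): each differentiated function gets its own ForwardDiff tag, and hence its own Dual element type, so a method compiled for one function's Duals never covers the next:

```julia
using ForwardDiff

f(x) = sum(abs2, x)
g(x) = sum(x)

# Tag{F,V} embeds the function type, so every function yields a new tag
# and therefore a distinct Dual type:
DualF = ForwardDiff.Dual{ForwardDiff.Tag{typeof(f), Float64}, Float64, 2}
DualG = ForwardDiff.Dual{ForwardDiff.Tag{typeof(g), Float64}, Float64, 2}

DualF === DualG  # false: a factorization method compiled for DualF
                 # must be recompiled from scratch for DualG
```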

@PetrKryslUCSD commented:
This looks good. Agreed on the precompilation.

@ChrisRackauckas (Member) commented:

Precompiling for Float64, which is by far the most common case, is probably useful though.
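A Float64-only precompile workload could be sketched as follows (hypothetical, using SnoopPrecompile, a common precompilation tool at the time of this PR; not necessarily how LinearSolve.jl actually does it):

```julia
using SnoopPrecompile, SparseArrays, Sparspak

@precompile_setup begin
    # a tiny Float64 problem, just to drive compilation of the common path
    A = sparse([1.0 2.0; 3.0 4.0])
    b = [1.0, 2.0]
    @precompile_all_calls begin
        lu = sparspaklu(A)
        lu \ b
    end
end
```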

Review comment on this diff excerpt:

```julia
Sparspak.Problem.infullrhs!(p, b)
s = Sparspak.SparseSolver.SparseSolver(p)
return s
return sparspaklu(A)
```
Member:

This performs an extra LU factorization, though. Can we avoid this and just build a blank/empty LU object to fill later?

Contributor Author (j-fu):

I see. Assuming I have all the information on A, this is possible, but it is not in the API. I think I could add a kwarg to sparspaklu.

Member:

I think a kwarg would be a nice solution.

@j-fu (Contributor Author) commented Jan 10, 2023:

This will be in PetrKryslUCSD/Sparspak.jl#16. Once this is done, I will add a corresponding commit here.
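The eventual shape of that kwarg might look like the following (hypothetical names; the real signature is whatever lands in PetrKryslUCSD/Sparspak.jl#16):

```julia
# hypothetical: skip the numeric factorization at construction time,
# building only the empty LU object
lu = sparspaklu(A; factorize = false)

# ...and fill in / compute the factorization later, reusing the object
sparspaklu!(lu, A)
```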

@codecov bot commented Jan 10, 2023

Codecov Report

Merging #253 (bae7cc0) into main (e850e1e) will increase coverage by 0.31%.
The diff coverage is 88.88%.

```diff
@@            Coverage Diff             @@
##             main     #253      +/-   ##
==========================================
+ Coverage   65.99%   66.31%   +0.31%
==========================================
  Files          12       12
  Lines         744      754      +10
==========================================
+ Hits          491      500       +9
- Misses        253      254       +1
```

Impacted Files          Coverage Δ
src/factorization.jl    78.60% <87.50%> (-0.98%) ⬇️
src/LinearSolve.jl      95.00% <100.00%> (+20.00%) ⬆️


@ChrisRackauckas (Member) commented:

I'll run the formatter, thanks!

@ChrisRackauckas ChrisRackauckas merged commit c31e460 into SciML:main Jan 10, 2023