
Allow slicing of DenseAxisArray #2524

Merged: odow merged 18 commits into master from od/containers on Mar 22, 2021

Conversation

@odow (Member) commented Mar 22, 2021

DenseAxisArrays with OneTo axes used a dictionary that mapped i => i. This PR replaces that dictionary with a Base.OneTo for improved efficiency. In doing so, I realized that we can now define get for vectors and support proper slicing of DenseAxisArray.

Closes #287
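
For illustration only, here is a minimal sketch of the lookup change described above. The names OldAxisLookup, NewAxisLookup, and index_of are hypothetical and are not the actual DenseAxisArray internals; the point is that for a Base.OneTo axis the key-to-position map is the identity, so the range itself can stand in for the Dict.

# Hypothetical sketch, not the real JuMP internals.

# Before: an identity dictionary i => i was built for OneTo axes.
struct OldAxisLookup
    map::Dict{Int,Int}
end
OldAxisLookup(ax::Base.OneTo) = OldAxisLookup(Dict(i => i for i in ax))
index_of(l::OldAxisLookup, i) = l.map[i]   # O(1) lookup, but allocates the Dict up front

# After: store the Base.OneTo itself; positions are the keys.
struct NewAxisLookup
    ax::Base.OneTo{Int}
end
index_of(l::NewAxisLookup, i) = i in l.ax ? i : throw(KeyError(i))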

@odow added the Category: Containers and Type: Performance labels Mar 22, 2021
@odow changed the title from "Improve performance of DenseAxisArray" to "Allow slicing of DenseAxisArray" Mar 22, 2021
codecov bot commented Mar 22, 2021

Codecov Report

Merging #2524 (b07b264) into master (f6be919) will decrease coverage by 0.18%.
The diff coverage is 100.00%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master    #2524      +/-   ##
==========================================
- Coverage   93.69%   93.51%   -0.19%     
==========================================
  Files          43       45       +2     
  Lines        5360     5473     +113     
==========================================
+ Hits         5022     5118      +96     
- Misses        338      355      +17     
Impacted Files                          Coverage Δ
src/Containers/DenseAxisArray.jl        87.00% <100.00%> (+3.22%) ⬆️
src/_Derivatives/topological_sort.jl    95.12% <0.00%> (-4.88%) ⬇️
src/_Derivatives/coloring.jl            95.31% <0.00%> (-2.50%) ⬇️
src/_Derivatives/sparsity.jl            95.18% <0.00%> (-2.41%) ⬇️
src/_Derivatives/conversion.jl          96.49% <0.00%> (-1.76%) ⬇️
src/constraints.jl                      95.02% <0.00%> (-0.50%) ⬇️
src/nlp.jl                              92.21% <0.00%> (-0.45%) ⬇️
src/JuMP.jl                             82.90% <0.00%> (ø)
src/precompile.jl                       95.00% <0.00%> (ø)
src/feasibility_checker.jl              100.00% <0.00%> (ø)

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update f6be919...b07b264. Read the comment docs.

@odow (Member, Author) commented Mar 22, 2021

Here's a demonstration of @mlubin's example:

julia> using JuMP
[ Info: Precompiling JuMP [4076af6c-e467-56ae-b986-b466b2749572]

julia> model = Model()
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: NO_OPTIMIZER
Solver name: No optimizer attached.

julia> @variable(model, x[0:5])
1-dimensional DenseAxisArray{VariableRef,1,...} with index sets:
    Dimension 1, 0:5
And data, a 6-element Array{VariableRef,1}:
 x[0]
 x[1]
 x[2]
 x[3]
 x[4]
 x[5]

julia> x[0:2]
1-dimensional DenseAxisArray{VariableRef,1,...} with index sets:
    Dimension 1, [0, 1, 2]
And data, a 3-element Array{VariableRef,1}:
 x[0]
 x[1]
 x[2]

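One follow-up observation (my reading of the output above, not a statement from the PR): the slice keeps the original keys as its axis, so the result is indexed by those keys rather than by 1-based positions.

y = x[0:2]   # a new DenseAxisArray whose axis is [0, 1, 2]
y[0]         # assumption: still looked up by the original key 0, i.e. the variable x[0]
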
@odow merged commit 51c2be6 into master Mar 22, 2021
@odow deleted the od/containers branch March 22, 2021 21:32
Labels
Category: Containers (Related to the Containers submodule), Type: Performance
Development

Successfully merging this pull request may close these issues.

Slicing DenseAxisArrays
2 participants