Number of dimensions of a tensor increases when indexing a dimension of length 1 with a one-dimension logical vector #1174
Comments
I think indexing should be done with tensors rather than with other R data types. For example, we can index with a boolean tensor instead of an R logical vector, and the problem does not occur.
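The original code snippets were stripped from this comment; the suggestion can be sketched as follows (a minimal illustration with the torch R package; the tensor values are made up for the example):

```r
library(torch)

x <- torch_tensor(c(5))    # 1d tensor with a single element

# Indexing with an R logical scalar adds a dimension (the reported bug):
y1 <- x[TRUE]              # 2d tensor of shape [1, 1]

# Indexing with a boolean *tensor* keeps the expected shape:
y2 <- x[torch_tensor(TRUE)]  # 1d tensor of shape [1]
```

Since `torch_tensor(TRUE)` is built from a length-1 R vector, it acts as a boolean mask over the first dimension and selects the single element without adding a dimension.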
Thanks for pointing out that the problem does not occur when using a boolean tensor for indexing. After reading your comment, I looked on the package website and in the book, but I did not find any information or guidelines on using booleans for indexing. Using logical expressions is very natural in R. As a user, I think one should be able to use R logical vectors and logical expressions to index torch tensors in the same way as we can already use R numerical vectors. Moreover, using R logical vectors already works, except for the case pointed out in this issue, which does not give the expected result. It seems weird that one should cast every R expression that yields a boolean vector to a tensor just to avoid this specific problem; perhaps there are other issues that I am not aware of? Note that I tried to find out, in the code of the Tensor class, what could explain this behavior.
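To illustrate how natural logical indexing is in R, and how the same expression style already carries over to tensors (a hypothetical example; the values are made up):

```r
library(torch)

v <- c(1, 5, 2)
v[v > 3]        # the usual base-R idiom: keeps the elements greater than 3

x <- torch_tensor(v)
x[x > 3]        # the comparison yields a boolean tensor, and indexing works the same way
```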
I think it is in indexing.R, line 55 in commit 10ec53b.
@mohamed-180, as you suggested, I have converted the R logical vectors into tensors to avoid the issue with indexing. @dfalbel, is this issue a bug, or is there a reason for it? Note that it makes my code messier, because I now have to use tensors not only for the index but also for the R vectors that are being indexed, since indexing an R vector with a tensor gives an error.

More generally, it seems that I keep bumping into small issues that occur because, fundamentally, torch tensors and R vectors are different (see also #1164). So I understand that it is not always obvious how operators should behave in places that mix R vectors and torch tensors. A bit more documentation would be very useful and appreciated, in particular with respect to:

- indexing and slicing when R vectors and torch tensors are mixed;
- possible performance hits when converting R vectors to torch tensors and vice versa;
- implicit conversions linked to the redefinition of standard operators;
- the interface between R vectors and "scalar" torch tensors.
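The workaround described above can be sketched like this (a hypothetical example; the vector values are made up):

```r
library(torch)

v   <- c(10, 20, 30)
idx <- torch_tensor(c(TRUE, FALSE, TRUE))

# v[idx] fails: base R's `[` does not understand a torch tensor index,
# so the data being indexed has to be converted to a tensor as well:
x <- torch_tensor(v)
x[idx]          # 1d tensor containing 10 and 30
```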
This indeed looks like a bug to me. We should treat R scalar logical values the same way as logical vectors.
Here is the issue in a nutshell. A 1d tensor becomes 2d when indexing the first dimension with `TRUE`. The same happens in other cases as well.
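A minimal reproduction of the behavior described above (the tensor value is made up for the example):

```r
library(torch)

x <- torch_tensor(c(5))
length(dim(x))   # 1 — a 1d tensor

y <- x[TRUE]
length(dim(y))   # 2 — the logical index has added a dimension
```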
Indexing with a logical vector does not add a dimension when the indexed dimension is longer than 1.
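For example (a sketch with made-up values), with an indexed dimension of length 3 the result stays 1d:

```r
library(torch)

x <- torch_tensor(c(1, 2, 3))   # indexed dimension has length 3
x[c(TRUE, FALSE, TRUE)]         # 1d tensor of length 2 — no dimension added
```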
though I would perhaps have expected a 0d tensor in this case, for consistency with indexing by a single integer (which drops the dimension).
Using a numerical index with `drop = FALSE` also does not add (or drop) a dimension.
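A sketch of the contrast, assuming the `drop` argument of the tensor `[` method behaves like base R's (the values are made up):

```r
library(torch)

x <- torch_tensor(c(1, 2, 3))
x[1]                 # 0d tensor — the dimension is dropped, as with base-R scalar indexing
x[1, drop = FALSE]   # 1d tensor of shape [1] — the dimension is kept
```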