Support functions in tensor shapes. #2
Hello! There is an active typing-sig mailing list, with a tensor typing meeting that you would probably be interested in. These meetings have discussed things like doing tensor arithmetic, and have produced a PEP-in-progress (646) for variadic generics. This does not actually enable type arithmetic, but does serve to lay the foundation for it. This repo may also be of interest to you. The main contributor to the repo runs the tensor typing meeting and is the author of the PEP.
Hello! Thanks for reaching out. The tensor typing meetings do sound interesting. I've now joined the typing-sig mailing list. On the topic of tensor arithmetic, I remember seeing it discussed somewhere (I forget where exactly) that whenever it makes it in, type arithmetic would likely be restricted to simple operations like +, -, max etc. One of the nice things about operating at runtime is that […]
There are definitely advantages to operating at runtime, e.g.

`cat(a: TensorType["x"], b: TensorType["y"]) -> TensorType["x + y"]`

Given that `torchtyping` operates at runtime, arbitrary Python expressions should be possible. Essentially just call `eval` with the appropriate constants `x`, `y` etc. bound in. Ignore any errors, so that e.g. `func(x: TensorType["x + y"])` remains valid, but if the expression evaluates then compare its result against the value inferred. This should happen as an additional round of checking, at the end of the current `_check_memo` function.