Conditional type inside function resolves differently than at top-level #49852
Additionally, prior to […]:

```ts
type A = Record<string, number>
type B = Record<'a' | 'b', number>
const x: A extends B ? true : false = false; // error, A is assignable to B
```

So […]
This is a particularly unfortunate manifestation of a known issue related to variance measurement. For […]. To make up for this discrepancy we mark type parameters that occur in the key position of mapped types as having "unreliable" variance, which causes us to relate types structurally when they appear unrelated according to variance. The net effect is:

```ts
declare let aa: { [x: string]: number };
declare let bb: { a: number, b: number };

aa = bb; // Ok
bb = aa; // Error

declare let ar: Record<string, number>;
declare let br: Record<'a' | 'b', number>;

ar = br; // Ok (related structurally because of "unreliable" marker)
br = ar; // Ok (permitted by variance, but error structurally)
```

We could fix this by marking type parameters that occur in the key position of mapped types as "unmeasurable", which would cause us to always relate structurally. But (a) this has a negative performance impact, and (b) it would technically be a breaking change, because code undoubtedly exists that depends on the current behavior. I'm pretty sure we've already tried and rejected this remedy.

Returning to the original example in this issue, the global scope […]. The behavior changed in 4.2 because that's when we introduced the ability to re-alias type alias instantiations.
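For reference (not part of the comment above), `Record` is declared in `lib.es5.d.ts` as a mapped type, which is why its key parameter `K` falls under the "key position of mapped types" rule being described:

```ts
// Declaration from lib.es5.d.ts, quoted for reference; redeclaring it in user
// code would collide with the standard library's own declaration.
type Record<K extends keyof any, T> = {
    [P in K]: T;
};
```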
Wow… it feels so wrong that an undocumented internal implementation detail like that should affect assignability. Not that that doesn't happen elsewhere (variance measurement being what it is), but this one is particularly hard to come up with a theoretical justification for without popping the hood of the compiler.
For what it's worth, I wouldn't be caught dead intentionally relying on any behavior that can best be summed up as "[…]".
I mentioned this is already a known issue. For reference, it was first reported in #29698. We even have a test to show the inconsistency, introduced in #30416. That PR first attempted to mark all mapped types as "unmeasurable", but then we backed off, I think because there were too many DT breaks. We may have to try that again. @weswigham Thoughts?
No worries, I understand it's a design limitation; it's not really the unsoundness that bothers me but more so that I'm trying to figure out the rationale for the discrepancy below so I can try to invent a theoretical explanation for it that doesn't involve popping the hood.

edit: I just realized this probably only affects the case where you're creating type aliases for generic instantiations, and things directly typed as […]
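A sketch of the distinction this comment points at, using hypothetical names not taken from the thread: once the expanded, anonymous object types are compared instead of the `Record` alias instantiations, the relation is always checked structurally:

```ts
// Alias instantiations of the Record mapped type may be related by variance:
type A = Record<string, number>;
type B = Record<'a' | 'b', number>;

// Their expanded, anonymous forms are always related structurally:
type A2 = { [x: string]: number };
type B2 = { a: number; b: number };

// false: an index signature doesn't guarantee that 'a' and 'b' actually exist
type T2 = A2 extends B2 ? true : false;
```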
Hi, I asked this question on Stack Overflow, and this issue has been linked. If no fix for this can be found, would it at least be possible to add a warning when TypeScript assumes the variance, since that assumption may be wrong, with instructions on how to explicitly tell TypeScript the variance (with `in`, `out`, or `in out` annotations)? And maybe let users opt out of the variance optimization (perhaps with something like a `novar T` annotation that would tell TypeScript to always compare instances of that generic type structurally)? I don't see how that could be a breaking change.
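For reference, the `in`/`out` annotations mentioned here are existing syntax as of TypeScript 4.7, while `novar` is the commenter's hypothetical proposal. A minimal sketch of the existing annotations:

```ts
// 'out' declares T covariant; 'in' would declare it contravariant.
interface Producer<out T> {
    get(): T;
}

declare let narrow: Producer<string>;
declare let wide: Producer<string | number>;

wide = narrow; // Ok: Producer is covariant in T
narrow = wide; // Error
```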
@ahejlsberg I don't know enough about the variance part of the compiler yet, so maybe this is wrong, but I feel like marking the variance of the key type of mapped types as "unmeasurable" addresses the symptom instead of the cause of the problem. If I understand correctly from your comment, this problem stems from a difference in semantics between the normal properties generated by mapped types, which have "required" semantics, and index signatures, which have "optional" semantics, ultimately breaking referential transparency. In other words, it seems to stem from an unsoundness in the natural expansion of the type of index signatures, i.e. (purely hypothetically):

```ts
type Foo = {
  [k: symbol | 'a']: string
}
```

would currently expand to (or be equivalent to)

```ts
type Foo = {
  [k: symbol]: string
  a: string
}
```

and in this case the two notations are not equivalent. In reality the following type would be equivalent to the first:

```ts
type Foo = {
  [k: symbol]: string
  a?: string
}
```

(In directly specified index signatures like these, this unsoundness is currently avoided by error 1337, which disallows literal types in index signatures.) So I feel like this issue could more cleanly be solved by allowing both required and optional index signatures:

```ts
// equivalent to existing index signatures
type Foo = {
  [k: string]?: unknown
}

// required signature. note that a value for `Foo` can in this case not be provided,
// as it would have to return a value for every possible string index value,
// but that's not necessarily a problem. there are other empty types, e.g. `never`
type Foo = {
  [k: string]: unknown
}

// however it does allow us to provide an index signature that is equivalent to normal properties:
type Foo = {
  [k: 'a' | 'b']: unknown
}
// is equivalent to
type Foo = {
  a: unknown,
  b: unknown
}

// whereas the optional version would be equivalent to optional properties:
type Foo = {
  [k: 'a' | 'b']?: unknown
}
// is equivalent to
type Foo = {
  a?: unknown,
  b?: unknown
}
```

In this case the mapped type can just generate the required version of the index signature if it is mapping without an optional modifier, and the optional version if it is mapping with an optional modifier, and referential transparency is conserved. This would also overlap with […]

Of course we can't just completely change the semantics of the index signature syntax for backwards-compatibility reasons, but this could be avoided by instead letting the syntax default to "optional" and providing a syntax for a "required" signature, e.g.:

```ts
// optional
type Foo = {
  [k: string]: unknown
}

// required
type Foo = {
  [k: string]!: unknown
}
```
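As a concrete illustration of the required-vs-optional discrepancy this proposal targets, here is how the two forms behave in current TypeScript (a hypothetical example, not code from the thread):

```ts
// A mapped type over a union of literal keys produces required properties:
type R = Record<'a' | 'b', number>; // resolves to { a: number; b: number }
const r: R = { a: 1 }; // Error: property 'b' is missing

// An index signature only constrains keys that happen to be present:
const s: { [x: string]: number } = {}; // Ok: no key is required to exist
```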
Fixes #167

Weirdly the `Resolve` helper only resolves types at the top level, not in functions:

```ts
interface Point {
  x: number;
  y: number;
}

type Resolve<Raw> = Raw extends Function ? Raw : { [K in keyof Raw]: Raw[K] };

function foo(pt: Point) {
  let k: keyof Point;
  for (k in pt) {
    // ^? let k: keyof Point
    type SynthK = Resolve<keyof Point>;
    // ^? type SynthK = keyof Point
  }
}

type SynthT2 = Resolve<keyof Point>;
// ^? type SynthT2 = "x" | "y"
```

This might be microsoft/TypeScript#49852
Bug Report

Search Terms

regression conditional type inside function behaves difference

Version & Regression Information

Playground Link
Playground link
Code
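The code sample itself was not captured in this extract. The following is a hypothetical reconstruction, assembled from the snippets quoted in the comments above (the type names `A` and `B` come from those snippets):

```ts
type A = Record<string, number>;
type B = Record<'a' | 'b', number>;

// At the top level the conditional type resolves to false:
const top: A extends B ? true : false = false; // Ok

function f() {
  // Inside a function body it resolves to true instead:
  const local: A extends B ? true : false = false; // Error: false is not assignable to true
}
```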
Actual behavior
The conditional type resolves differently in global vs local scope.
Expected behavior

Both should resolve the same way, preferably `false` in both cases.

Additional Details
This seems to have something to do with the way TypeScript compares aliases; wrapping one of the `Record` types in `Omit<Record<...>, never>` causes the comparison to behave normally. Playground Link
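A sketch of that workaround, reconstructed from the description above rather than taken from the linked playground: `Omit<..., never>` re-maps the type into a fresh anonymous object type, so the comparison no longer goes through the `Record` alias and its variance markers:

```ts
type A = Record<string, number>;
type B = Omit<Record<'a' | 'b', number>, never>; // re-mapped to { a: number; b: number }

// Per the report, both of these now resolve to false:
type TopLevel = A extends B ? true : false;

function f() {
  type Local = A extends B ? true : false;
}
```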