
Literal type not inferred correctly when using generics #10685

Closed
OliverJAsh opened this issue Sep 2, 2016 · 1 comment
Labels: Design Limitation (constraints of the existing architecture prevent this from being fixed)

Comments

@OliverJAsh (Contributor)

TypeScript Version: 2.0.2

type A = 'a';
const z: A[] = ['a']; // fine, as expected

// How come 'a' cannot be inferred as type `A` here?
const x: A[] = Array.from(['a']); // Error: Type 'string' is not assignable to type '"a"'

// My workaround (without creating a variable), using lodash's `_.identity`:
const y: A[] = Array.from([_.identity<A>('a')]);

I think this may be related to #10676

@mhegazy (Contributor)

mhegazy commented Sep 2, 2016

The main problem is that literal types are not inferred unless there is a contextual type, and for generic calls, the return type is not an inference position. So the `A[]` annotation in `const x: A[] = Array.from(['a'])` does not affect how `T` is inferred for the `Array.from` call, and thus `'a'` is always inferred as `string`.
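The two positions can be contrasted in a small sketch. The `fromArray` helper below is hypothetical, standing in for `Array.from`'s shape, and the error described is the TypeScript 2.0 behavior discussed in this issue:

```typescript
type A = 'a';

// Contextual typing applies directly to an array literal, so the
// element 'a' is checked against A and the literal type is kept.
const direct: A[] = ['a']; // OK

// Hypothetical generic helper with the same shape as Array.from.
function fromArray<T>(items: T[]): T[] {
  return items.slice();
}

// The annotation A[] constrains the *result* of the call, not the
// inference of T, so T is inferred as string and the call yields
// string[], which then fails to match A[] (as of TS 2.0):
// const viaGeneric: A[] = fromArray(['a']); // Error

const runtime = fromArray(['a']); // runtime behavior is an identity copy
```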

The rationale here is that literal types were added later on, and inferring them always by default would be a breaking change for code written before TS 1.8. E.g.:

let x = 'a';
x = 'b'; // Error: Type '"b"' is not assignable to type '"a"'.

Now #10676 is trying to address that by inferring literals more often, but it will not address all cases, such as the one above. You can see the description @ahejlsberg provides in the PR:

During type argument inference, literal types are widened to their base primitive type unless the target type parameter has a constraint that includes primitive or literal types.

The rationale here is that a literal type is not very useful by itself if the target of the inference is not a literal type too. E.g. with `new Array('a')`, most likely you do not want an array that can only ever hold `'a'`, but rather a `string` array with one initial value.
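The constraint clause in the rule quoted above can be illustrated with a pair of hypothetical helpers: an unconstrained type parameter widens the literal argument, while a `T extends string` constraint preserves it (a sketch of the inference rule, not TypeScript's actual library declarations):

```typescript
type A = 'a';

// Unconstrained type parameter: a fresh literal argument is widened
// to its base primitive type during type argument inference.
function wrap<T>(value: T): T[] {
  return [value];
}

// The constraint includes a primitive type (string), so per the rule
// quoted above the literal type 'a' is preserved for T.
function wrapLiteral<T extends string>(value: T): T[] {
  return [value];
}

const widened = wrap('a');       // typically inferred as string[]
const narrow = wrapLiteral('a'); // inferred as 'a'[]

const ok: A[] = wrapLiteral('a'); // compiles: T is 'a', assignable to A
// const bad: A[] = wrap('a');    // error: string[] is not assignable to A[]
```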

So to an extent this issue is really a design limitation: if we inferred literal types all the time, it would break existing code in a huge way.

A better workaround for this scenario is to explicitly specify the generic type argument for `Array.from`, e.g.:

const y = Array.from<A>(['a']);

@mhegazy mhegazy added the Design Limitation Constraints of the existing architecture prevent this from being fixed label Sep 2, 2016
@mhegazy mhegazy closed this as completed Sep 20, 2016
@microsoft microsoft locked and limited conversation to collaborators Jun 19, 2018