The problem was that we had some cached lexer state when reading new input during an FSI session. If the lexer threw an error on the very first token of new input, the cached lexer state would not get updated, so we would associate the old lexer state with the new token that caused the error. Now we invalidate that cached state at the beginning of reading new input.

Co-authored-by: Adam Boniecki <[email protected]>
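As a rough illustration (not the actual F# compiler code), the bug and fix can be sketched with a hypothetical REPL-style lexer that carries cached state between inputs. The class and method names here are invented for the sketch; the point is only the `begin_new_input` reset, which mirrors invalidating the cached state before reading new input.

```python
class SessionLexer:
    """Hypothetical lexer that caches state across REPL inputs."""

    def __init__(self):
        self.cached_state = None  # state left over from the previous input

    def begin_new_input(self):
        # The fix: clear the cached state at the beginning of a new input,
        # so an error on the very first token cannot see stale state.
        self.cached_state = None

    def read_token(self, text):
        if not text or not text[0].isalnum():
            # Error on the first token: the reported state is whatever is
            # currently cached. Without the reset above, this would be the
            # state from the *previous* input.
            raise SyntaxError(f"bad token; lexer state = {self.cached_state}")
        self.cached_state = ("after", text)
        return text


lexer = SessionLexer()
lexer.read_token("let")      # first input: caches some lexer state
lexer.begin_new_input()      # new input begins: stale state invalidated
try:
    lexer.read_token("!")    # error on the very first token of new input
except SyntaxError as err:
    print(err)               # state shown is None, not the old state
```

Without the `begin_new_input` call, the `SyntaxError` would carry `("after", "let")`, i.e. state from the previous input would be associated with the failing token.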