Dispatcher#unsafeToFutureCancelable
h/t @TimWSpence for original minimization.
```scala
//> using scala "3.3.1"
//> using dep "org.typelevel::cats-effect::3.5.2"

import cats.*
import cats.syntax.all.*
import cats.effect.*
import cats.effect.std.Dispatcher

import scala.concurrent.duration.*

object Repro extends IOApp.Simple:
  // also sequential
  val run = Dispatcher.parallel[IO].use { dispatcher =>
    IO.fromFuture {
      IO {
        println("starting")
        val (_, cancel) = dispatcher.unsafeToFutureCancelable(IO.never) // broken
        // val (_, cancel) = IO.never.unsafeToFutureCancelable()(runtime) // works
        println("started, now canceling")
        cancel()
      }
    }.flatMap(_ => IO(println("canceled"))).replicateA_(1000)
  }
```
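For contrast, here is a standalone sketch of the variant the commented-out line marks as working: converting the `IO` directly via the `IORuntime`, with no `Dispatcher` involved. The `DirectRepro` name and the blocking `Await` are illustrative, not from the original report:

```scala
import cats.effect.*
import cats.effect.unsafe.implicits.global // provides the implicit IORuntime

import scala.concurrent.Await
import scala.concurrent.duration.Duration

object DirectRepro:
  def main(args: Array[String]): Unit =
    // Convert the IO directly via the runtime; no Dispatcher involved.
    val (_, cancel) = IO.never.unsafeToFutureCancelable()
    // cancel() returns a Future[Unit] that completes once cancelation is done.
    Await.result(cancel(), Duration.Inf)
    println("canceled directly via the IORuntime")
```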
Add repro from typelevel#3898 (commit 729e9bb)
There are (at least) two different problems here. #3900 seems to fix the problem for `Dispatcher.parallel`.
For `Dispatcher.sequential`, the problem seems to be that it doesn't actually fork the submitted action, so it can't really cancel it (https://github.com/typelevel/cats-effect/blob/series/3.x/std/shared/src/main/scala/cats/effect/std/Dispatcher.scala#L214). I'm not sure what to do about that.
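The contrast the comment draws can be sketched with plain fiber operations (a minimal illustration, not `Dispatcher`'s internals): forking an action onto its own fiber yields a per-task handle whose `cancel` can interrupt it, which is exactly what an action run inline on a shared worker fiber lacks.

```scala
import cats.effect.*

object ForkSketch extends IOApp.Simple:
  val run =
    for
      fiber <- IO.never.start // forking yields a per-task Fiber handle
      _     <- fiber.cancel   // ...so this one task can be canceled on its own
      _     <- IO(println("canceled the forked task"))
    yield ()
```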
If I remember correctly, a sequential dispatcher involves at least two fibers. So even though the task is not actually forked, it should be possible to cancel it by simply canceling the worker fiber.
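A minimal sketch of that idea, with hypothetical names (the real sequential `Dispatcher` would also have to restart the worker afterwards, which this sketch omits): the worker fiber runs queued tasks inline, so canceling the worker also cancels whatever task it is currently running.

```scala
import cats.effect.*
import cats.effect.std.Queue

import scala.concurrent.duration.*

object WorkerSketch extends IOApp.Simple:
  val run =
    for
      queue  <- Queue.unbounded[IO, IO[Unit]]
      worker <- queue.take.flatten.foreverM.start // worker runs each task inline
      _      <- queue.offer(IO.never)             // a task with no fiber of its own
      _      <- IO.sleep(100.millis)              // give the worker time to pick it up
      _      <- worker.cancel                     // canceling the worker cancels the inline task too
      _      <- IO(println("worker fiber canceled, inline task with it"))
    yield ()
```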
Merge pull request #3900 from durban/issue3898 (commit 3f0182d)
Fix #3898 for `Dispatcher.parallel` (commit b67eb9f)