Deadlock in 0.15.x between MultipleAssignmentSubscription and CompositeSubscription #577

Closed
mattrjacobs opened this issue Dec 6, 2013 · 5 comments

@mattrjacobs (Contributor)

Here's the relevant thread dump:

"RxComputationThreadPool-8":
at rx.subscriptions.CompositeSubscription.unsubscribe(CompositeSubscription.java:100)
- waiting to lock <0x00007f2a2846be00> (a rx.subscriptions.CompositeSubscription)
at rx.subscriptions.MultipleAssignmentSubscription.unsubscribe(MultipleAssignmentSubscription.java:43)
- locked <0x00007f2a26de2a80> (a rx.subscriptions.MultipleAssignmentSubscription)
at rx.subscriptions.CompositeSubscription.add(CompositeSubscription.java:92)
- locked <0x00007f2a2846be60> (a rx.subscriptions.CompositeSubscription)
at rx.concurrency.ExecutorScheduler$4.run(ExecutorScheduler.java:128)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:724)
"RxComputationThreadPool-7":
at rx.subscriptions.MultipleAssignmentSubscription.unsubscribe(MultipleAssignmentSubscription.java:40)
- waiting to lock <0x00007f2a26de2a80> (a rx.subscriptions.MultipleAssignmentSubscription)
at rx.subscriptions.CompositeSubscription.add(CompositeSubscription.java:92)
- locked <0x00007f2a2846be00> (a rx.subscriptions.CompositeSubscription)
at rx.concurrency.ExecutorScheduler$4.run(ExecutorScheduler.java:128)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)

@mattrjacobs (Contributor, Author)

In English:

Two CompositeSubscriptions (denoted by the last two hex digits of their addresses): 00 and 60
One MultipleAssignmentSubscription: 80

Thread 7:
Composite-00.add (lock held)
MultipleAssignment-80.unsubscribe (waiting for lock)

Thread 8:
Composite-60.add (lock held)
MultipleAssignment-80.unsubscribe (lock held)
Composite-00.unsubscribe (waiting for lock)

So I believe there are two bugs here: a circular reference between subscriptions, and a deadlock when unsubscribes run concurrently across those circularly referenced subscriptions. A minimal sketch of the lock ordering follows.
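
To make the cycle concrete, here is a minimal, self-contained sketch of the same lock ordering using plain Java monitors in place of the real subscription objects. The names `c00` and `ma80` are stand-ins for the two contended addresses in the dump (Composite-60 is omitted since it is not part of the cycle); this is illustrative only, not the 0.15.x source.

```java
public class SubscriptionDeadlockSketch {

    static void pause(long ms) {
        try {
            Thread.sleep(ms);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    public static void main(String[] args) {
        // Stand-ins for CompositeSubscription <...be00> and
        // MultipleAssignmentSubscription <...2a80> from the dump.
        final Object c00 = new Object();
        final Object ma80 = new Object();

        // Thread 7: Composite-00.add holds c00, then
        // MultipleAssignment-80.unsubscribe needs ma80.
        Thread t7 = new Thread(() -> {
            synchronized (c00) {
                pause(100); // give t8 time to take ma80
                synchronized (ma80) {
                    // never reached once t8 holds ma80 and waits for c00
                }
            }
        }, "RxComputationThreadPool-7");

        // Thread 8: MultipleAssignment-80.unsubscribe holds ma80, then
        // Composite-00.unsubscribe needs c00.
        Thread t8 = new Thread(() -> {
            synchronized (ma80) {
                pause(100); // give t7 time to take c00
                synchronized (c00) {
                    // never reached
                }
            }
        }, "RxComputationThreadPool-8");

        t7.start();
        t8.start();
        // Both threads now block forever; a jstack dump shows the same
        // "waiting to lock" pair as the dump above.
    }
}
```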

@mattrjacobs (Contributor, Author)

Also, rolling our codebase back to 0.14.10 did not exhibit the problem (yet). We also believe that Scheduler.schedule(Action1) is the method creating the MultipleAssignmentSubscription (the only other places are OperationRetry and OperationSwitch).

@akarnokd (Member) commented Dec 8, 2013

Generally, unsubscribing should not happen while holding locks, especially inside these subscription instances. I'll take a look at them.
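
One way to honor that rule, sketched below: mutate the subscription's state while holding the monitor, but invoke unsubscribe() only after releasing it, so no foreign monitor is ever acquired while holding this one. The interface and class here (`Subscription`, `SafeMultipleAssignment`) are simplified stand-ins, not the actual fix; see the PR referenced below for what landed.

```java
// Simplified stand-in for rx.Subscription.
interface Subscription {
    void unsubscribe();
}

final class SafeMultipleAssignment implements Subscription {
    private Subscription current;   // guarded by 'this'
    private boolean unsubscribed;   // guarded by 'this'

    public void set(Subscription s) {
        Subscription toRelease;
        synchronized (this) {
            if (unsubscribed) {
                toRelease = s;      // arrived after unsubscription: release it
            } else {
                toRelease = null;
                current = s;        // multiple-assignment semantics:
            }                       // replace without unsubscribing the old one
        }
        if (toRelease != null) {
            toRelease.unsubscribe(); // outside the lock: no nested monitors
        }
    }

    @Override
    public void unsubscribe() {
        Subscription toRelease;
        synchronized (this) {
            if (unsubscribed) {
                return;
            }
            unsubscribed = true;
            toRelease = current;
            current = null;
        }
        if (toRelease != null) {
            toRelease.unsubscribe(); // again, after releasing 'this'
        }
    }
}
```

With this shape, a thread never holds the MultipleAssignmentSubscription monitor while entering a CompositeSubscription monitor, which breaks the cycle shown in the dump.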

@mattrjacobs (Contributor, Author)

Thanks for the really quick work, @akarnokd. Once we've got the next release of RxJava into our codebase, we'll let you know if the issue is resolved.

@benjchristensen (Member)

Should be fixed in #593
