
Static file download hangs when the undertow dependency is present and compression is enabled #12584

Closed
pourfar opened this issue Oct 7, 2020 · 6 comments
Labels
kind/bug (Something isn't working), triage/needs-feedback (We are waiting for feedback)

Comments


pourfar commented Oct 7, 2020

Describe the bug
When we put a static file larger than some threshold into the 'src\main\resources\META-INF\resources' folder, enable compression with 'quarkus.http.enable-compression=true' in application.properties, and then download the file with a web browser or curl, the download hangs in the middle of the transfer. This only happens when the project has a dependency on undertow.
An exception is also thrown on the next request if we resubmit the hanging download.

Expected behavior
We should be able to download the file completely.

Actual behavior
Download hangs in middle of transfer.

To Reproduce

code-with-quarkus.zip

Steps to reproduce the behavior:

  1. Run the attached sample ('mvnw compile quarkus:dev')
  2. Use a browser or curl to download the 'data.txt' file:
    curl http://localhost:8080/data.txt

Configuration


quarkus.http.enable-compression=true
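
For completeness, the expected behaviour can also be captured as a Quarkus test. The following is only a sketch: it assumes the quarkus-junit5 and rest-assured test dependencies, and a 10 MB data.txt (the report only says "larger than a threshold", so the size is a guess). With the bug present the request is expected to hang rather than fail cleanly, so a test-level timeout would be advisable.

    import static io.restassured.RestAssured.given;
    import static org.junit.jupiter.api.Assertions.assertEquals;

    import io.quarkus.test.junit.QuarkusTest;
    import org.junit.jupiter.api.Test;

    @QuarkusTest
    public class StaticFileDownloadTest {

        // Assumed size of META-INF/resources/data.txt; the report does not
        // state the exact threshold, so pick something comfortably large.
        private static final int EXPECTED_SIZE = 10 * 1024 * 1024;

        @Test
        public void downloadsCompleteFile() {
            byte[] body = given()
                    // 'identity' asks for an uncompressed response; this is the
                    // case reported to hang mid-transfer.
                    .header("Accept-Encoding", "identity")
                    .when().get("/data.txt")
                    .then().statusCode(200)
                    .extract().asByteArray();
            assertEquals(EXPECTED_SIZE, body.length);
        }
    }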


Screenshots
2020-10-07 19:49:10,177 INFO  [io.quarkus] (Quarkus Main Thread) code-with-quarkus 1.0.0-SNAPSHOT on JVM (powered by Quarkus 1.8.2.Final) started in 1.212s. Listening on: http://0.0.0.0:8080
2020-10-07 19:49:10,186 INFO  [io.quarkus] (Quarkus Main Thread) Profile dev activated. Live Coding activated.
2020-10-07 19:49:10,187 INFO  [io.quarkus] (Quarkus Main Thread) Installed features: [cdi, resteasy, servlet, undertow-websockets]
2020-10-07 19:49:22,349 WARN  [io.net.cha.ChannelOutboundBuffer] (vert.x-eventloop-thread-14) Failed to mark a promise as success because it has failed already: DefaultChannelPromise@320f4af2(failure: io.netty.handler.codec.EncoderException: io.netty.util.IllegalReferenceCountException: refCnt: 0, decrement: 1), unnotified cause:: io.netty.handler.codec.EncoderException: io.netty.util.IllegalReferenceCountException: refCnt: 0, decrement: 1
        at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:104)
        at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:717)
        at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:709)
        at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:792)
        at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:702)
        at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:110)
        at io.netty.handler.codec.MessageToMessageCodec.write(MessageToMessageCodec.java:116)
        at io.vertx.core.http.impl.HttpChunkContentCompressor.write(HttpChunkContentCompressor.java:38)
        at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:717)
        at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:709)
        at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:792)
        at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:702)
        at io.netty.handler.stream.ChunkedWriteHandler.doFlush(ChunkedWriteHandler.java:300)
        at io.netty.handler.stream.ChunkedWriteHandler.flush(ChunkedWriteHandler.java:132)
        at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:750)
        at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:742)
        at io.netty.channel.AbstractChannelHandlerContext.flush(AbstractChannelHandlerContext.java:728)
        at io.netty.channel.ChannelDuplexHandler.flush(ChannelDuplexHandler.java:127)
        at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:750)
        at io.netty.channel.AbstractChannelHandlerContext.invokeWriteAndFlush(AbstractChannelHandlerContext.java:765)
        at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:790)
        at io.netty.channel.AbstractChannelHandlerContext.writeAndFlush(AbstractChannelHandlerContext.java:758)
        at io.vertx.core.net.impl.ConnectionBase.write(ConnectionBase.java:124)
        at io.vertx.core.net.impl.ConnectionBase.lambda$queueForWrite$2(ConnectionBase.java:215)
        at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:164)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:472)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:500)
        at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
        at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
        at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base/java.lang.Thread.run(Thread.java:832)
Caused by: io.netty.util.IllegalReferenceCountException: refCnt: 0, decrement: 1
        at io.netty.util.internal.ReferenceCountUpdater.toLiveRealRefCnt(ReferenceCountUpdater.java:74)
        at io.netty.util.internal.ReferenceCountUpdater.release(ReferenceCountUpdater.java:138)
        at io.netty.buffer.AbstractReferenceCountedByteBuf.release(AbstractReferenceCountedByteBuf.java:100)
        at io.vertx.core.http.impl.AssembledHttpResponse.release(AssembledHttpResponse.java:159)
        at io.netty.util.ReferenceCountUtil.release(ReferenceCountUtil.java:88)
        at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:91)
        ... 30 more
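
For readers unfamiliar with Netty's reference counting: 'IllegalReferenceCountException: refCnt: 0, decrement: 1' means that release() was called on a buffer whose reference count had already reached zero, i.e. an already-freed buffer was released a second time. A minimal standalone illustration of just those semantics (not the Quarkus code path):

    import io.netty.buffer.ByteBuf;
    import io.netty.buffer.Unpooled;

    public class RefCntDemo {
        public static void main(String[] args) {
            ByteBuf buf = Unpooled.buffer(16); // created with refCnt == 1
            buf.release();                     // refCnt drops to 0; the buffer is freed
            buf.release();                     // throws IllegalReferenceCountException: refCnt: 0, decrement: 1
        }
    }

In the trace above, the failing release is attempted on the AssembledHttpResponse buffer from MessageToMessageEncoder.write, which suggests the response buffer is released twice somewhere on the compression path.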

Environment:

 - Output of `uname -a` or `ver`: 
Microsoft Windows [Version 10.0.19041.508]

 - Output of `java -version`: 
java version "15" 2020-09-15
Java(TM) SE Runtime Environment (build 15+36-1562)
Java HotSpot(TM) 64-Bit Server VM (build 15+36-1562, mixed mode, sharing)

 - GraalVM version (if different from Java): 

 - Quarkus version or git rev: 
1.8.2.Final

 - Build tool (i.e. output of `mvnw --version` or `gradlew --version`):
Apache Maven 3.6.3

pourfar added the kind/bug (Something isn't working) label on Oct 7, 2020

gsmet commented Oct 9, 2020

/cc @stuartwdouglas does it ring a bell?

johnoliver commented

I can also confirm that setting enable-compression causes very odd behaviour. I also see the exception above, but oddly enough it correlates with the request actually succeeding from the client's point of view. With the setting enabled I see the following behaviour:

curl -vvvv 'http://localhost:8080/anaddress' -o output --compressed: this works fine, but the server throws the exception shown in the original bug.

curl -vvvv 'http://localhost:8080/anaddress' -o output: this spins forever, and the output contains only a partial segment of what should be returned: neither the start nor the end of the response but, in my case, a section from the middle. No errors are shown server side.

It seems to me that when enable-compression is set and the client request does not offer any compression in the Accept-Encoding header, the server does something odd.
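
To isolate the Accept-Encoding variable, the two cases can also be driven from plain java.net.http with explicit headers. A sketch, assuming the original reproducer's /data.txt URL; the 10-second timeout is arbitrary, and note that HttpClient does not decompress gzip bodies itself, so the gzip case reports the compressed size:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.time.Duration;

    public class AcceptEncodingProbe {
        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            // "gzip" mirrors curl --compressed; "identity" mirrors curl without it.
            for (String encoding : new String[] { "gzip", "identity" }) {
                HttpRequest request = HttpRequest.newBuilder(URI.create("http://localhost:8080/data.txt"))
                        .header("Accept-Encoding", encoding)
                        // expected to expire in the identity case if the hang occurs
                        .timeout(Duration.ofSeconds(10))
                        .build();
                try {
                    HttpResponse<byte[]> response = client.send(request, HttpResponse.BodyHandlers.ofByteArray());
                    System.out.println(encoding + ": " + response.body().length + " bytes received");
                } catch (Exception e) {
                    System.out.println(encoding + ": request failed: " + e);
                }
            }
        }
    }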


famod commented May 17, 2021

Seems like more or less the same as #14695?


famod commented May 26, 2021

FWIW, I tried both this issue's reproducer and the one from #14695 with 2.0.0.CR1, which contains the upgrade to Netty 4.1.65, and things are still somewhat strange: #14695 (comment)


geoand commented Feb 27, 2024

Is this still an issue?

geoand added the triage/needs-feedback (We are waiting for feedback) label on Feb 27, 2024

geoand commented Mar 14, 2024

Closing for lack of feedback

geoand closed this as not planned (won't fix, can't repro, duplicate, stale) on Mar 14, 2024