OpenJDK java/foreign/TestDowncallScope.java NoSuchMethodError MethodHandle.invokeBasic #16161
@ChengJin01 @tajila fyi |
This doesn't make any sense to me, as it always worked fine until https://openj9-jenkins.osuosl.org/job/Test_openjdk19_j9_sanity.openjdk_x86-64_linux_Nightly/ and there is no recently updated code related to downcalls. I will double-check locally to see how it goes. |
which seems odd to me
which makes me wonder whether any change was merged into the OpenJDK MethodHandle code recently, given that the issue never occurs on other supported platforms. |
FYI: @fengxue-IS |
I am not aware of any changes related to method handle invocation that were merged. |
Launched a Grinder (x10) at https://openj9-jenkins.osuosl.org/job/Grinder/1434/ on other x86_64 machines to see how it goes. |
All tests passed on the Grinder at https://openj9-jenkins.osuosl.org/job/Grinder/1434/tapTestReport/ and no issue was detected, so I suspect the failure was caused by a corrupted nightly build or some platform-dependent issue. |
I have been working on investigating this test failure over the past few days. It can be reproduced in Grinder or locally within 20-30 iterations. However, using … To answer your question, @ChengJin01: there are a few more things I am currently trying, but given the intermittent nature of this failure and the lack of success in generating JIT logs from a failed run, it is difficult to estimate when this can be resolved. |
https://openj9-jenkins.osuosl.org/job/Test_openjdk19_j9_sanity.openjdk_x86-64_linux_Nightly/42
|
@nbhuiyan Do you think this will be resolved within 2 weeks? |
https://openj9-jenkins.osuosl.org/job/Test_openjdk19_j9_sanity.openjdk_x86-64_linux_Nightly/46
|
Unfortunately, that is difficult to say, because we do not know what the problem is and are unable to get the JIT logs necessary to examine what may be going wrong during compilation. I will continue working on this and will provide updates on the progress. |
https://openj9-jenkins.osuosl.org/job/Test_openjdk19_j9_sanity.openjdk_x86-64_linux_Nightly/54/
|
Update: I have managed to obtain a core dump for this failure locally, which I am currently investigating. |
The core dump gave me no new information beyond what I had already found in #14135. Since attempting to obtain JIT logs hides the problem, I experimented with turning off some of the transformations the JIT performs on invokeBasic calls, to rule out the possibility of the JIT transforming them into something that causes incorrect behaviour. There should be no issues with dispatching into either the invokeBasic INL handler or the compiled target method. invokeBasic calls get transformed by the JIT into a conditional branch if the target method cannot be determined at compile time; this allows faster dispatch into the target method if it is compiled, skipping the j2i and i2j transitions. This is done in RecognizedCallTransformer. There is a simpler mechanism in MethodHandleTransformer for when the target method is known, where the call is simply transformed into a direct call to the target method. I experimented with several builds with one or both of these transformations disabled. In all cases I was able to reproduce the failure, which rules out these transformations as the cause. |
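For readers following along, here is a rough, Java-flavored sketch of the guarded dispatch described above. This is not OpenJ9 code: the real transformation operates on the JIT's intermediate representation in RecognizedCallTransformer, and every type and method name below (Invocable, TargetMethod, compiledEntryPoint, interpreterEntry) is hypothetical.

```java
// Hypothetical, Java-flavored sketch of the guarded invokeBasic dispatch described above.
// The real transformation works on JIT IL in RecognizedCallTransformer; all names here
// are illustrative only.
interface Invocable {
    Object call(Object... args) throws Throwable;
}

interface TargetMethod {
    Invocable compiledEntryPoint(); // null if the target has not been JIT-compiled yet
    Invocable interpreterEntry();   // fallback path through the interpreter / INL handler
}

final class InvokeBasicDispatchSketch {
    // If the target is already compiled, branch straight into the compiled body and skip
    // the j2i / i2j transitions; otherwise fall back to the interpreter path.
    static Object dispatch(TargetMethod target, Object... args) throws Throwable {
        Invocable compiled = target.compiledEntryPoint();
        return (compiled != null)
                ? compiled.call(args)
                : target.interpreterEntry().call(args);
    }
}
```

The MethodHandleTransformer case mentioned above is the degenerate form of this: when the target is known at compile time, the guard disappears and the call becomes a direct call.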
The failure seems more easily reproducible with my build that disables the JIT transformations applied to invokeBasic calls. I was able to reproduce the crash today with an inconsequential JIT option on the command line, which means it may be possible to obtain a compilation log of a failed run. While I try to obtain the JIT logs, I am focusing on comparing the differences between the code generated in a failed run (from the core dump) and what is normally generated in a passing run. |
While I have not been able to obtain a JIT log for a failed run yet, here is a summary of some info gathered so far:
|
That might be what was happening in these failing tests. e.g.
For |
Seems like this error happens only when |
Since my last update, the focus has been on investigating the AOT compilation of |
https://openj9-jenkins.osuosl.org/job/Test_openjdk17_j9_sanity.functional_x86-64_linux_OMR_testList_0/253
|
https://openj9-jenkins.osuosl.org/job/Test_openjdk19_j9_sanity.openjdk_x86-64_linux_Nightly/73
|
https://openj9-jenkins.osuosl.org/job/Test_openjdk17_j9_sanity.functional_x86-64_linux_Nightly_testList_1/367 (Dec. 30)
|
@nbhuiyan, the failing test openj9/test/functional/Java17andUp/src_170/org/openj9/test/jep389/downcall/StructTests2.java (line 2166 in c9907a0) might be easier for you to narrow down than the jtreg test suite. |
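For context on what these downcall tests exercise, below is a minimal sketch of a downcall, written against the finalized java.lang.foreign API in recent JDKs; the JDK 17/19 tests discussed in this issue use the earlier incubator/preview versions of the API, whose class and method names differ. The key point is that the Linker hands back an ordinary MethodHandle, and invoking it goes through the usual method handle machinery, which is where invokeBasic enters the picture.

```java
import java.lang.foreign.FunctionDescriptor;
import java.lang.foreign.Linker;
import java.lang.foreign.MemorySegment;
import java.lang.foreign.ValueLayout;
import java.lang.invoke.MethodHandle;

public class DowncallSketch {
    public static void main(String[] args) throws Throwable {
        Linker linker = Linker.nativeLinker();
        // Look up getpid() from the default (libc) lookup -- Linux/glibc assumed.
        MemorySegment getpid = linker.defaultLookup().find("getpid").orElseThrow();
        // int getpid(void): no arguments, returns a C int.
        MethodHandle handle = linker.downcallHandle(
                getpid, FunctionDescriptor.of(ValueLayout.JAVA_INT));
        // The Linker returns a plain MethodHandle; invoking it dispatches through the
        // method handle machinery rather than a direct native call.
        int pid = (int) handle.invokeExact();
        System.out.println("pid = " + pid);
    }
}
```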
Since my last update, I have managed to obtain the AOT compilation logs and core dumps for a few failing runs (where …). The reason why JIT-compiling … Prior to calling … From the same AOT-compiled code, we have both passing cases and failing cases based on which helper is used prior to … |
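As background on why this surfaces as a NoSuchMethodError at all: invokeBasic is, like the public invokeExact/invoke, declared once with an (Object...)Object signature and handled specially by the VM rather than resolved like a normal method, so the class file contains no concrete method matching the exact descriptor a call site records. If such a call were pushed down an ordinary resolution path, there would be nothing for it to find, which would be consistent with the error reported here. A small sketch using the public invokeExact (the internal invokeBasic is not directly callable) illustrates the single underlying declaration:

```java
import java.lang.invoke.MethodHandle;
import java.lang.invoke.MethodHandles;
import java.lang.invoke.MethodType;
import java.lang.reflect.Method;

public class PolymorphicSketch {
    public static void main(String[] args) throws Throwable {
        // Reflection sees only the single (Object[])Object declaration of invokeExact ...
        Method m = MethodHandle.class.getMethod("invokeExact", Object[].class);
        System.out.println(m); // public final native ...invokeExact(java.lang.Object[]) ...

        // ... yet each call site compiles against its own exact descriptor, e.g. (II)I here.
        MethodHandle max = MethodHandles.lookup().findStatic(
                Math.class, "max", MethodType.methodType(int.class, int.class, int.class));
        int r = (int) max.invokeExact(3, 4);
        System.out.println(r); // 4
    }
}
```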
@nbhuiyan The MH INLs ( |
@fengxue-IS I agree that the right approach is to ensure that the right helper is used for resolving the method ref. The question that remains now is why do we end up in the |
https://openj9-jenkins.osuosl.org/job/Test_openjdk17_j9_sanity.openjdk_x86-64_linux_Nightly/388 One example:
|
The resolution helper is selected based on the bytecode / CP ref type. @nbhuiyan, can you generate a core file and check what bytecode instruction is used to do the invocation in the failing path? |
@fengxue-IS, here are the bytecodes of
|
Thanks, the bytecode is using |
JDK17 0.36.0 milestone 2 build (…)
50x grinder jdk_foreign_0 - 25/50 failed |
https://openj9-jenkins.osuosl.org/job/Test_openjdk17_j9_sanity.openjdk_x86-64_linux_Nightly/390 |
https://openj9-jenkins.osuosl.org/job/Test_openjdk19_j9_sanity.openjdk_x86-64_linux_Nightly/90 |
This problem happens because methods that are not yet AOT-compilable (which includes all of the java/lang/invoke/* methods) are considered compilable by the inliner, which then inlines non-AOT-compilable callees, resulting in the strange invokeBasic dispatch attempt via the special-method resolve helper. We either need AOT compilations to handle MH-related INL calls properly, or the inliner to skip callee methods that are not eligible for AOT compilation during an AOT compile. The inliner decides this by using |
Are the INLs the only non-AOT-compilable methods, or do we need to exclude the entire package? |
https://openj9-jenkins.osuosl.org/job/Test_openjdk19_j9_sanity.openjdk_x86-64_linux_Nightly/92 |
The MH INLs such as invokeBasic and linkTo* are never compiled, even by the JIT. For AOT, the entire java/lang/invoke package has to be excluded from compilation. |
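To make the inliner-side option discussed above concrete, here is a hypothetical, Java-flavored sketch of such a guard. The actual OpenJ9 inliner is written in C++ and the predicate it consults is not shown in this thread, so every name below is illustrative only.

```java
// Hypothetical sketch only: OpenJ9's real inliner code and checks differ.
// The idea follows the comments above: during an AOT compilation, do not inline callees
// that AOT itself refuses to compile (here, anything in java/lang/invoke).
interface CalleeInfo {
    String declaringClassName(); // e.g. "java/lang/invoke/MethodHandle"
}

final class AotInlineGuardSketch {
    static boolean mayInline(CalleeInfo callee, boolean isAotCompilation) {
        if (isAotCompilation && callee.declaringClassName().startsWith("java/lang/invoke/")) {
            return false; // excluded from AOT compilation, so do not inline it during AOT either
        }
        return true;
    }
}
```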
There was a failure in https://openj9-jenkins.osuosl.org/job/Test_openjdk17_j9_sanity.openjdk_x86-64_linux_Nightly/394 but it appears to have been testing a JVM built before the fix was merged. |
https://openj9-jenkins.osuosl.org/job/Test_openjdk19_j9_sanity.openjdk_x86-64_linux_Nightly/31
java/foreign/TestDowncallScope.java