
processor.onNewFrame(bitmap, long): JNI DETECTED ERROR IN APPLICATION: GetStringUTFChars received NULL jstring #3452

Closed
gavel94 opened this issue Jun 21, 2022 · 15 comments
Assignees
Labels
legacy:pose Pose Detection related issues platform:android Issues with Android as Platform type:bug Bug in the Source Code of MediaPipe Solution

Comments

@gavel94

gavel94 commented Jun 21, 2022

I'm seeing the following crash after a call to processor.onNewFrame(bitmap, System.currentTimeMillis()) (on the very first frame after app launch):

I integrated pose detection according to the documentation.

1. Initialize
```kotlin
try {
    processor = FrameProcessor(activity, "pose_tracking_gpu.binarypb")

    processor?.addPacketCallback("pose_landmarks", object : PacketCallback {
        override fun process(packet: Packet?) {
            try {
                Log.e("aaaa", "process: ${PacketGetter.getString(packet)}")
            } catch (exception: Exception) {
            }
        }
    })
} catch (e: Exception) {
    e.printStackTrace()
}
```
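
As an aside, the `pose_landmarks` output stream normally carries a landmark proto rather than a string, so `PacketGetter.getString` is unlikely to work on that packet even once frames reach the graph. A minimal sketch of a proto-based callback, assuming the `LandmarkProto` classes referenced later in this thread are on the classpath:

```kotlin
// Sketch only; requires com.google.mediapipe.formats.proto.LandmarkProto.
processor?.addPacketCallback("pose_landmarks", object : PacketCallback {
    override fun process(packet: Packet?) {
        // Parse the packet contents as a NormalizedLandmarkList proto.
        val landmarks = LandmarkProto.NormalizedLandmarkList.parseFrom(
            PacketGetter.getProtoBytes(packet)
        )
        Log.d("Pose", "landmark count: ${landmarks.landmarkCount}")
    }
})
```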

2. Use
The data arrives in the USB camera callback and is converted to a Bitmap via OpenCV.
```kotlin
val cacheBitmap = Bitmap.createBitmap(
    bgMat.cols(),
    bgMat.rows(),
    Bitmap.Config.ARGB_8888
)
Utils.matToBitmap(bgMat, cacheBitmap)
processor?.onNewFrame(cacheBitmap, System.currentTimeMillis())
```
3. Result
The following logs are produced after calling `processor?.onNewFrame(cacheBitmap, System.currentTimeMillis())`:

```
2022-06-21 15:36:22.193 5462-5462/? E/[email protected]: Could not get passthrough implementation for [email protected]::ICameraProvider/legacy/0.
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] JNI DETECTED ERROR IN APPLICATION: GetStringUTFChars received NULL jstring
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] in call to GetStringUTFChars
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] from void com.google.mediapipe.framework.Graph.nativeMovePacketToInputStream(long, java.lang.String, long, long)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] "Thread-17" prio=5 tid=48 Runnable
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] | group="main" sCount=0 dsCount=0 flags=0 obj=0x12f80110 self=0x8cbe8200
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] | sysTid=5460 nice=0 cgrp=default sched=0/0 handle=0x8aea5970
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] | state=R schedstat=( 281723173 123833911 470 ) utm=19 stm=8 core=3 HZ=100
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] | stack=0x8adaa000-0x8adac000 stackSize=1010KB
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] | held mutexes= "mutator lock"(shared held)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #00 pc 002e0cf3 /system/lib/libart.so (art::DumpNativeStack(std::__1::basic_ostream<char, std::__1::char_traits>&, int, BacktraceMap*, char const*, art::ArtMethod*, void*, bool)+134)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #1 pc 00378753 /system/lib/libart.so (art::Thread::DumpStack(std::__1::basic_ostream<char, std::__1::char_traits>&, bool, BacktraceMap*, bool) const+210)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #2 pc 00374d6f /system/lib/libart.so (art::Thread::Dump(std::__1::basic_ostream<char, std::__1::char_traits>&, bool, BacktraceMap*, bool) const+34)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #3 pc 00235b11 /system/lib/libart.so (art::JavaVMExt::JniAbort(char const*, char const*)+720)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #4 pc 00235e77 /system/lib/libart.so (art::JavaVMExt::JniAbortV(char const*, char const*, std::__va_list)+58)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #5 pc 000c4e89 /system/lib/libart.so (art::(anonymous namespace)::ScopedCheck::AbortF(char const*, ...)+48)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #6 pc 000c469f /system/lib/libart.so (art::(anonymous namespace)::ScopedCheck::CheckInstance(art::ScopedObjectAccess&, art::(anonymous namespace)::ScopedCheck::InstanceKind, _jobject*, bool)+394)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #7 pc 000c397b /system/lib/libart.so (art::(anonymous namespace)::ScopedCheck::CheckPossibleHeapValue(art::ScopedObjectAccess&, char, art::(anonymous namespace)::JniValueType)+630)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #8 pc 000c3025 /system/lib/libart.so (art::(anonymous namespace)::ScopedCheck::Check(art::ScopedObjectAccess&, bool, char const*, art::(anonymous namespace)::JniValueType*)+624)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #9 pc 000cae83 /system/lib/libart.so (art::(anonymous namespace)::CheckJNI::GetStringCharsInternal(char const*, _JNIEnv*, _jstring*, unsigned char*, bool, bool)+546)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #10 pc 000bb763 /system/lib/libart.so (art::(anonymous namespace)::CheckJNI::GetStringUTFChars(_JNIEnv*, _jstring*, unsigned char*)+26)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #11 pc 003808b9 /data/app/com.elevatorbus.feco-FTBXUnDMMhaqmOBblDEmJg==/base.apk (offset 5879000) (???)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #12 pc 0007301f /data/app/com.elevatorbus.feco-FTBXUnDMMhaqmOBblDEmJg==/base.apk (offset 5879000) (Java_com_google_mediapipe_framework_Graph_nativeMovePacketToInputStream+16)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #13 pc 0041c279 /system/lib/libart.so (art_quick_generic_jni_trampoline+40)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #14 pc 00417d75 /system/lib/libart.so (art_quick_invoke_stub_internal+68)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #15 pc 003f12e7 /system/lib/libart.so (art_quick_invoke_stub+226)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #16 pc 000a1031 /system/lib/libart.so (art::ArtMethod::Invoke(art::Thread*, unsigned int*, unsigned int, art::JValue*, char const*)+136)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #17 pc 001e8835 /system/lib/libart.so (art::interpreter::ArtInterpreterToCompiledCodeBridge(art::Thread*, art::ArtMethod*, art::ShadowFrame*, unsigned short, art::JValue*)+232)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #18 pc 001e4a2b /system/lib/libart.so (bool art::interpreter::DoCall<true, true>(art::ArtMethod*, art::Thread*, art::ShadowFrame&, art::Instruction const*, unsigned short, art::JValue*)+1338)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #19 pc 001fb7d3 /system/lib/libart.so (_ZN3art11interpreterL8DoInvokeILNS_10InvokeTypeE1ELb1ELb1EEEbPNS_6ThreadERNS_11ShadowFrameEPKNS_11InstructionEtPNS_6JValueE+170)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #20 pc 001f7b43 /system/lib/libart.so (void art::interpreter::ExecuteSwitchImplCpp<true, false>(art::interpreter::SwitchImplContext*)+52034)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #21 pc 0041cc55 /system/lib/libart.so (ExecuteSwitchImplAsm+4)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #22 pc 0017ed94 /data/app/com.elevatorbus.feco-FTBXUnDMMhaqmOBblDEmJg==/base.apk (offset 6891000) (com.google.mediapipe.framework.Graph.addConsumablePacketToInputStream)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #23 pc 001c7e4b /system/lib/libart.so (_ZN3art11interpreterL7ExecuteEPNS_6ThreadERKNS_20CodeItemDataAccessorERNS_11ShadowFrameENS_6JValueEb.llvm.2193211614+290)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #24 pc 001cc757 /system/lib/libart.so (art::interpreter::ArtInterpreterToInterpreterBridge(art::Thread*, art::CodeItemDataAccessor const&, art::ShadowFrame*, art::JValue*)+146)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #25 pc 001e34fb /system/lib/libart.so (bool art::interpreter::DoCall<false, false>(art::ArtMethod*, art::Thread*, art::ShadowFrame&, art::Instruction const*, unsigned short, art::JValue*)+754)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #26 pc 003ebf0f /system/lib/libart.so (MterpInvokeVirtual+442)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #27 pc 0040aa14 /system/lib/libart.so (ExecuteMterpImpl+14228)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #28 pc 0016fc4a /data/app/com.elevatorbus.feco-FTBXUnDMMhaqmOBblDEmJg==/base.apk (offset 6891000) (com.google.mediapipe.components.FrameProcessor.onNewFrame+66)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #29 pc 001c7e89 /system/lib/libart.so (_ZN3art11interpreterL7ExecuteEPNS_6ThreadERKNS_20CodeItemDataAccessorERNS_11ShadowFrameENS_6JValueEb.llvm.2193211614+352)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #30 pc 001cc757 /system/lib/libart.so (art::interpreter::ArtInterpreterToInterpreterBridge(art::Thread*, art::CodeItemDataAccessor const&, art::ShadowFrame*, art::JValue*)+146)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #31 pc 001e34fb /system/lib/libart.so (bool art::interpreter::DoCall<false, false>(art::ArtMethod*, art::Thread*, art::ShadowFrame&, art::Instruction const*, unsigned short, art::JValue*)+754)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #32 pc 003ebf0f /system/lib/libart.so (MterpInvokeVirtual+442)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #33 pc 0040aa14 /system/lib/libart.so (ExecuteMterpImpl+14228)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #34 pc 0000570a /data/app/com.elevatorbus.feco-FTBXUnDMMhaqmOBblDEmJg==/base.apk (offset 404b000) (com.jiangdg.usbcamera.USBCameraFragment.openUsbCamera$lambda-2+634)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #35 pc 001c7e89 /system/lib/libart.so (_ZN3art11interpreterL7ExecuteEPNS_6ThreadERKNS_20CodeItemDataAccessorERNS_11ShadowFrameENS_6JValueEb.llvm.2193211614+352)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #36 pc 001cc757 /system/lib/libart.so (art::interpreter::ArtInterpreterToInterpreterBridge(art::Thread*, art::CodeItemDataAccessor const&, art::ShadowFrame*, art::JValue*)+146)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #37 pc 001e34fb /system/lib/libart.so (bool art::interpreter::DoCall<false, false>(art::ArtMethod*, art::Thread*, art::ShadowFrame&, art::Instruction const*, unsigned short, art::JValue*)+754)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #38 pc 003eceeb /system/lib/libart.so (MterpInvokeStatic+130)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #39 pc 0040ab94 /system/lib/libart.so (ExecuteMterpImpl+14612)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #40 pc 00004f6c /data/app/com.elevatorbus.feco-FTBXUnDMMhaqmOBblDEmJg==/base.apk (offset 404b000) (com.jiangdg.usbcamera.USBCameraFragment.lambda$CZkLI_fuImic5WnGI_UKiHM7mo0)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #41 pc 001c7e89 /system/lib/libart.so (_ZN3art11interpreterL7ExecuteEPNS_6ThreadERKNS_20CodeItemDataAccessorERNS_11ShadowFrameENS_6JValueEb.llvm.2193211614+352)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #42 pc 001cc757 /system/lib/libart.so (art::interpreter::ArtInterpreterToInterpreterBridge(art::Thread*, art::CodeItemDataAccessor const&, art::ShadowFrame*, art::JValue*)+146)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #43 pc 001e34fb /system/lib/libart.so (bool art::interpreter::DoCall<false, false>(art::ArtMethod*, art::Thread*, art::ShadowFrame&, art::Instruction const*, unsigned short, art::JValue*)+754)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #44 pc 003eceeb /system/lib/libart.so (MterpInvokeStatic+130)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #45 pc 0040ab94 /system/lib/libart.so (ExecuteMterpImpl+14612)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #46 pc 00004500 /data/app/com.elevatorbus.feco-FTBXUnDMMhaqmOBblDEmJg==/base.apk (offset 404b000) (com.jiangdg.usbcamera.-$$Lambda$USBCameraFragment$CZkLI_fuImic5WnGI_UKiHM7mo0.onPreviewResult+4)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #47 pc 001c7e89 /system/lib/libart.so (_ZN3art11interpreterL7ExecuteEPNS_6ThreadERKNS_20CodeItemDataAccessorERNS_11ShadowFrameENS_6JValueEb.llvm.2193211614+352)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #48 pc 001cc757 /system/lib/libart.so (art::interpreter::ArtInterpreterToInterpreterBridge(art::Thread*, art::CodeItemDataAccessor const&, art::ShadowFrame*, art::JValue*)+146)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #49 pc 001e34fb /system/lib/libart.so (bool art::interpreter::DoCall<false, false>(art::ArtMethod*, art::Thread*, art::ShadowFrame&, art::Instruction const*, unsigned short, art::JValue*)+754)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #50 pc 003ecadd /system/lib/libart.so (MterpInvokeInterface+1020)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #51 pc 0040ac14 /system/lib/libart.so (ExecuteMterpImpl+14740)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #52 pc 00010aae /data/app/com.elevatorbus.feco-FTBXUnDMMhaqmOBblDEmJg==/base.apk (offset 2f25000) (com.serenegiant.usb.common.AbstractUVCCameraHandler$CameraThread$3.onFrame+30)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #53 pc 001c7e89 /system/lib/libart.so (_ZN3art11interpreterL7ExecuteEPNS_6ThreadERKNS_20CodeItemDataAccessorERNS_11ShadowFrameENS_6JValueEb.llvm.2193211614+352)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #54 pc 001cc6a3 /system/lib/libart.so (art::interpreter::EnterInterpreterFromEntryPoint(art::Thread*, art::CodeItemDataAccessor const&, art::ShadowFrame*)+82)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #55 pc 003df753 /system/lib/libart.so (artQuickToInterpreterBridge+890)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #56 pc 0041c2ff /system/lib/libart.so (art_quick_to_interpreter_bridge+30)
2022-06-21 15:36:22.269 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #57 pc 00417d75 /system/lib/libart.so (art_quick_invoke_stub_internal+68)
2022-06-21 15:36:22.270 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #58 pc 003f12e7 /system/lib/libart.so (art_quick_invoke_stub+226)
2022-06-21 15:36:22.270 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #59 pc 000a1031 /system/lib/libart.so (art::ArtMethod::Invoke(art::Thread*, unsigned int*, unsigned int, art::JValue*, char const*)+136)
2022-06-21 15:36:22.270 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #60 pc 00350a6d /system/lib/libart.so (art::(anonymous namespace)::InvokeWithArgArray(art::ScopedObjectAccessAlreadyRunnable const&, art::ArtMethod*, art::(anonymous namespace)::ArgArray*, art::JValue*, char const*)+52)
2022-06-21 15:36:22.270 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #61 pc 00351a15 /system/lib/libart.so (art::InvokeVirtualOrInterfaceWithVarArgs(art::ScopedObjectAccessAlreadyRunnable const&, _jobject*, _jmethodID*, std::__va_list)+316)
2022-06-21 15:36:22.270 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #62 pc 0027872f /system/lib/libart.so (art::JNI::CallVoidMethodV(_JNIEnv*, _jobject*, _jmethodID*, std::__va_list)+482)
2022-06-21 15:36:22.270 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #63 pc 000c6e91 /system/lib/libart.so (art::(anonymous namespace)::CheckJNI::CallMethodV(char const*, _JNIEnv*, _jobject*, _jclass*, _jmethodID*, std::__va_list, art::Primitive::Type, art::InvokeType)+1148)
2022-06-21 15:36:22.270 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #64 pc 000b8041 /system/lib/libart.so (art::(anonymous namespace)::CheckJNI::CallVoidMethodV(_JNIEnv*, _jobject*, _jmethodID*, std::__va_list)+44)
2022-06-21 15:36:22.270 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #65 pc 0001150c /data/app/com.elevatorbus.feco-FTBXUnDMMhaqmOBblDEmJg==/base.apk (offset 3ea5000) (_JNIEnv::CallVoidMethod(_jobject*, _jmethodID*, ...)+52)
2022-06-21 15:36:22.270 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #66 pc 00011428 /data/app/com.elevatorbus.feco-FTBXUnDMMhaqmOBblDEmJg==/base.apk (offset 3ea5000) (UVCPreview::do_capture_callback(_JNIEnv*, uvc_frame*)+244)
2022-06-21 15:36:22.270 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #67 pc 000110d8 /data/app/com.elevatorbus.feco-FTBXUnDMMhaqmOBblDEmJg==/base.apk (offset 3ea5000) (UVCPreview::do_capture(_JNIEnv*)+144)
2022-06-21 15:36:22.270 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #68 pc 00010b20 /data/app/com.elevatorbus.feco-FTBXUnDMMhaqmOBblDEmJg==/base.apk (offset 3ea5000) (UVCPreview::capture_thread_func(void*)+64)
2022-06-21 15:36:22.270 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #69 pc 00063c15 /system/lib/libc.so (__pthread_start(void*)+22)
2022-06-21 15:36:22.270 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] native: #70 pc 0001e065 /system/lib/libc.so (__start_thread+22)
2022-06-21 15:36:22.270 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] at com.google.mediapipe.framework.Graph.nativeMovePacketToInputStream(Native method)
2022-06-21 15:36:22.270 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] at com.google.mediapipe.framework.Graph.addConsumablePacketToInputStream(Graph.java:395)
2022-06-21 15:36:22.270 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] - locked <0x060d94eb> (a com.google.mediapipe.framework.Graph)
2022-06-21 15:36:22.270 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] at com.google.mediapipe.components.FrameProcessor.onNewFrame(FrameProcessor.java:511)
2022-06-21 15:36:22.270 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] at com.jiangdg.usbcamera.USBCameraFragment.openUsbCamera$lambda-2(USBCameraFragment.kt:176)
2022-06-21 15:36:22.270 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] at com.jiangdg.usbcamera.USBCameraFragment.lambda$CZkLI_fuImic5WnGI_UKiHM7mo0(USBCameraFragment.kt:-1)
2022-06-21 15:36:22.270 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] at com.jiangdg.usbcamera.-$$Lambda$USBCameraFragment$CZkLI_fuImic5WnGI_UKiHM7mo0.onPreviewResult(lambda:-1)
2022-06-21 15:36:22.270 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542] at com.serenegiant.usb.common.AbstractUVCCameraHandler$CameraThread$3.onFrame(AbstractUVCCameraHandler.java:826)
2022-06-21 15:36:22.270 30174-5460/com.elevatorbus.feco A/levatorbus.fec: java_vm_ext.cc:542]
2022-06-21 15:36:22.298 5474-5474/? E/DEBUG: failed to readlink /proc/5468/fd/104: No such file or directory

```

@gavel94 gavel94 added the type:bug Bug in the Source Code of MediaPipe Solution label Jun 21, 2022
@sureshdagooglecom sureshdagooglecom added legacy:pose Pose Detection related issues platform:android Issues with Android as Platform labels Jun 22, 2022
@sureshdagooglecom

Hi @gavel94 ,
Make sure to set the input stream for CPU via frameProcessor.setVideoInputStreamCpu("input_video"), where input_video should be the name of the input stream for your graph. That property isn't passed in the constructor of FrameProcessor; only the one for GPU is.
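
For reference, a minimal sketch of that setup, reusing the calls already shown in this issue (the graph asset and stream names are assumptions that must match your graph definition):

```kotlin
// Minimal sketch: name the CPU input stream before feeding any frames.
processor = FrameProcessor(activity, "pose_tracking_gpu.binarypb")
processor?.setVideoInputStreamCpu("input_video")   // Bitmaps from onNewFrame() go to this stream
processor?.addPacketCallback("pose_landmarks", object : PacketCallback {
    override fun process(packet: Packet?) {
        // handle the landmarks packet here
    }
})
```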

@sureshdagooglecom sureshdagooglecom added the stat:awaiting response Waiting for user response label Jun 23, 2022
@gavel94
Author

gavel94 commented Jun 24, 2022

After I set `frameProcessor.setVideoInputStreamCpu("input_video")`, I got the following log.
What other parameters can be set here?
```
2022-06-24 14:09:21.771 5027-5649/com.elevatorbus.feco E/FrameProcessor: Mediapipe error:
com.google.mediapipe.framework.MediaPipeException: invalid argument: Graph has errors:
Packet type mismatch on calculator outputting to stream "input_video": The Packet stores "mediapipe::ImageFrame", but "mediapipe::GpuBuffer" was requested.
at com.google.mediapipe.framework.Graph.nativeMovePacketToInputStream(Native Method)
at com.google.mediapipe.framework.Graph.addConsumablePacketToInputStream(Graph.java:395)
at com.google.mediapipe.components.FrameProcessor.onNewFrame(FrameProcessor.java:511)
at com.jiangdg.usbcamera.USBCameraFragment.openUsbCamera$lambda-2(USBCameraFragment.kt:176)
at com.jiangdg.usbcamera.USBCameraFragment.lambda$CZkLI_fuImic5WnGI_UKiHM7mo0(Unknown Source:0)
at com.jiangdg.usbcamera.-$$Lambda$USBCameraFragment$CZkLI_fuImic5WnGI_UKiHM7mo0.onPreviewResult(Unknown Source:2)
at com.serenegiant.usb.common.AbstractUVCCameraHandler$CameraThread$3.onFrame(AbstractUVCCameraHandler.java:826)
```
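
For what it's worth, this error suggests that `input_video` in `pose_tracking_gpu.binarypb` is declared as a GPU stream, so Bitmap frames (which arrive as `mediapipe::ImageFrame`) cannot be pushed into it. One hedged workaround is to load a CPU variant of the pose-tracking graph instead; the asset name below is an assumption, and the CPU calculators would have to be packaged into the AAR:

```kotlin
// Hedged sketch: a CPU pose-tracking graph accepts ImageFrame input from Bitmaps.
// "pose_tracking_cpu.binarypb" is a hypothetical asset name used for illustration.
processor = FrameProcessor(activity, "pose_tracking_cpu.binarypb")
processor?.setVideoInputStreamCpu("input_video")
// ...then, per camera frame:
processor?.onNewFrame(cacheBitmap, System.currentTimeMillis())
```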

@google-ml-butler google-ml-butler bot removed the stat:awaiting response Waiting for user response label Jun 24, 2022
@sureshdagooglecom

Hi @gavel94 ,
Could you provide your code changes so that we can investigate this issue further?

@sureshdagooglecom sureshdagooglecom added the stat:awaiting response Waiting for user response label Jun 29, 2022
@gavel94
Author

gavel94 commented Jun 29, 2022

I integrated it according to the documentation; the code above is already all of the MediaPipe-related code I use.
Here is the full code for my module.
“”"
package com.jiangdg.usbcamera

import android.graphics.Bitmap
import com.serenegiant.usb.CameraDialog.CameraDialogParent
import com.serenegiant.usb.widget.CameraViewInterface
import android.hardware.usb.UsbDevice
import android.widget.Toast
import com.jiangdg.usbcamera.UVCCameraHelper.OnMyDevConnectListener
import android.os.Looper
import android.os.Bundle
import android.util.Log
import android.view.*
import android.widget.ArrayAdapter
import android.widget.AdapterView
import android.widget.ListView
import androidx.appcompat.app.AlertDialog
import androidx.fragment.app.Fragment
import com.google.mediapipe.components.FrameProcessor
import com.google.mediapipe.framework.Packet
import com.google.mediapipe.framework.PacketCallback
import com.google.mediapipe.framework.PacketGetter
import com.jiangdg.libusbcamera.R
import com.jiangdg.libusbcamera.databinding.FragmentUsbcameraBinding
import com.jiangdg.usbcamera.utils.FileUtils
import com.serenegiant.usb.USBMonitor
import com.serenegiant.utils.LogUtils
import org.opencv.android.BaseLoaderCallback
import org.opencv.android.LoaderCallbackInterface
import org.opencv.android.OpenCVLoader
import org.opencv.android.Utils
import org.opencv.core.*
import org.opencv.imgproc.Imgproc
import org.opencv.video.BackgroundSubtractorMOG2
import org.opencv.video.Video
import java.util.ArrayList
import java.util.stream.Collectors
import java.util.stream.Stream
import kotlin.math.abs

/**
 * UVCCamera use demo
 */
public class USBCameraFragment public constructor() : Fragment(), CameraDialogParent,
CameraViewInterface.Callback {

private lateinit var mBinding: FragmentUsbcameraBinding
private lateinit var mCameraHelper: UVCCameraHelper
private var mBaseBitmapFlag = false
private var mDialog: AlertDialog? = null
private var isRequest = false
private var isPreview = false

// private val mCameraWidth = 320
// private val mCameraHeight = 240
private val mCameraWidth = 1920
private val mCameraHeight = 1080
private val uSBDevInfo: List<DeviceInfo>?
    private get() {
        if (mCameraHelper == null) return null
        val devInfos: MutableList<DeviceInfo> = ArrayList()
        val list = mCameraHelper!!.usbDeviceList
        for (dev in list) {
            val info = DeviceInfo()
            info.pid = dev.vendorId
            info.vid = dev.productId
            devInfos.add(info)
        }
        return devInfos
    }

private lateinit var standHist: Mat

private var mLoaderCallback: BaseLoaderCallback? = null

private lateinit var backgroundSubtractorMOG2: BackgroundSubtractorMOG2

private var mStudy = true

private var mReady = false

private val mMinPerimeter = 200

private val mMinContourArea = 1000

private var mDetectCallBack: DetectCallBack? = null

private lateinit var bgMat: Mat

private var processor:FrameProcessor? = null

public fun updateStatus(study: Boolean) {
    mStudy = study
}

public fun addDetectCallBack(callBack: DetectCallBack) {
    mDetectCallBack = callBack
}

private fun openUsbCamera() {
    // step.1 initialize UVCCameraHelper
    mBinding.cameraView.setCallback(this)
    mCameraHelper = UVCCameraHelper.getInstance(mCameraWidth, mCameraHeight)
    mCameraHelper.initUSBMonitor(activity, mBinding.cameraView, listener)

    // mCameraHelper.setDefaultFrameFormat(UVCCameraHelper.FRAME_FORMAT_MJPEG)
    mBinding.cameraView.setOnClickListener {
        mStudy = !mStudy
        showResolutionListDialog()
    }
    mCameraHelper.setOnPreviewFrameListener { data ->

        val inputMat = Mat(
            mCameraHeight + mCameraHeight / 2,
            mCameraWidth,
            CvType.CV_8UC1
        )
        inputMat.put(0, 0, data)
        val frameMat = Mat()
        Imgproc.cvtColor(inputMat, frameMat, Imgproc.COLOR_YUV420p2GRAY)
        Log.i("USB", "study = $mStudy ready = $mReady")
        if (mStudy || !mReady) {
            backgroundSubtractorMOG2.apply(frameMat, bgMat)
            if (!mReady){
                val contours = arrayListOf<MatOfPoint>()
                Imgproc.findContours(
                    bgMat,
                    contours,
                    Mat(),
                    Imgproc.RETR_EXTERNAL,
                    Imgproc.CHAIN_APPROX_SIMPLE
                )
                mReady = contours.size == 0
            }
        } else {
            backgroundSubtractorMOG2.apply(frameMat, bgMat, 0.0)
            val contours = arrayListOf<MatOfPoint>()
            Imgproc.findContours(
                bgMat,
                contours,
                Mat(),
                Imgproc.RETR_EXTERNAL,
                Imgproc.CHAIN_APPROX_SIMPLE
            )
            for (contour: MatOfPoint in contours) {
                val matOfPoint2f = MatOfPoint2f()
                contour.convertTo(matOfPoint2f, CvType.CV_32F)
                val perimeter = Imgproc.arcLength(matOfPoint2f, true)
                val contourArea = Imgproc.contourArea(matOfPoint2f, true)
                if (perimeter > mMinPerimeter) {
                    // if (abs(contourArea) > mMinContourArea) {
                    //     Log.i("USB", "perimeter = $perimeter")
                    //     Log.i("USB", "contourArea = $contourArea")
                    // }
                    // mDetectCallBack?.onHitTheTarget(perimeter, abs(contourArea))
                    if (null != mDetectCallBack) {
                        if (mDetectCallBack!!.onHitTheTarget(perimeter, abs(contourArea))) {
                            break
                        }
                    }
                }
            }
        }

        val cacheBitmap = Bitmap.createBitmap(
            bgMat.cols(),
            bgMat.rows(),
            Bitmap.Config.ARGB_8888
        )
        Utils.matToBitmap(bgMat, cacheBitmap)
        processor?.onNewFrame(cacheBitmap,System.currentTimeMillis())
        activity?.runOnUiThread {
            mBinding.ivStand.setImageBitmap(cacheBitmap)
        }

// if (!mBaseBitmapFlag) {
// LogUtils.e("$mCameraWidth $mCameraHeight")
// mBaseBitmapFlag = true
//// val mat = Mat(
//// mCameraHeight + mCameraHeight / 2,
//// mCameraWidth,
//// CvType.CV_8UC1
//// )
//// mat.put(0,0,data)
//
// val cacheBitmap = Bitmap.createBitmap(
// inputMat.cols(),
// inputMat.rows(),
// Bitmap.Config.ARGB_8888
// )
// mat2compare(inputMat, standHist)
// Utils.matToBitmap(inputMat, cacheBitmap)
// activity?.runOnUiThread {
// mBinding.ivStand.setImageBitmap(cacheBitmap)
// }
//
// } else {
//
// val resultMat = Mat()
// mat2compare(inputMat, resultMat)
//
// val correl =
// Imgproc.compareHist(standHist.clone(), resultMat, Imgproc.HISTCMP_CORREL)
// val chisqr =
// Imgproc.compareHist(standHist.clone(), resultMat, Imgproc.HISTCMP_CHISQR)
// val intersect =
// Imgproc.compareHist(standHist.clone(), resultMat, Imgproc.HISTCMP_INTERSECT)
// val bhattacharyya =
// Imgproc.compareHist(standHist.clone(), resultMat, Imgproc.HISTCMP_BHATTACHARYYA)
//
//
//// LogUtils.i("compare info correl = $correl chisqr = $chisqr intersect = $intersect bhattacharyya = $bhattacharyya")
// activity?.runOnUiThread {
// val builder = StringBuilder()
// builder.append("correl = $correl")
// builder.append(System.lineSeparator())
// builder.append("chisqr = $chisqr")
// builder.append(System.lineSeparator())
// builder.append("intersect = $intersect")
// builder.append(System.lineSeparator())
// builder.append("bhattacharyya = $bhattacharyya")
// mBinding.tvCompare.text = builder
// }
// }

    }
    // step.2 register USB event broadcast
    if (mCameraHelper != null) {
        mCameraHelper.registerUSB()
    }
}

fun mat2compare2(inputMat: Mat, resultMat: Mat) {
    // var hvsMat = Mat()
    // grayscale
    // Imgproc.cvtColor(inputMat, hvsMat, Imgproc.COLOR_YUV2GRAY_UYNV)
    // compute the histogram
    Imgproc.calcHist(
        Stream.of(inputMat).collect(Collectors.toList()),
        MatOfInt(0),
        Mat(),
        resultMat,
        MatOfInt(255),
        MatOfFloat(0.0F, 256.0F)
    )
    // normalize the image
    Core.normalize(
        resultMat,
        resultMat,
        1.0,
        resultMat.rows().toDouble(),
        Core.NORM_MINMAX,
        -1,
        Mat()
    )
}

fun mat2compare(mat: Mat, resultMat: Mat) {
    var hvsMat = Mat()
    var bgrMat = Mat()

    // Imgproc.cvtColor(mat, hvsMat, Imgproc.COLOR_YUV420p2GRAY)
    Imgproc.cvtColor(mat, bgrMat, Imgproc.COLOR_YUV420p2BGR)
    Imgproc.cvtColor(bgrMat, hvsMat, Imgproc.COLOR_BGR2HSV)
    // Imgproc.GaussianBlur()
    // compute the histogram
    Imgproc.calcHist(
        Stream.of(hvsMat).collect(Collectors.toList()),
        MatOfInt(0),
        Mat(),
        resultMat,
        MatOfInt(255),
        MatOfFloat(0.0F, 256.0F)
    )
    // normalize the image
    Core.normalize(
        resultMat,
        resultMat,
        0.0,
        resultMat.rows().toDouble(),
        Core.NORM_MINMAX,
        -1,
        Mat()
    )
}

private fun popCheckDevDialog() {
    val infoList = uSBDevInfo
    if (infoList == null || infoList.isEmpty()) {
        Toast.makeText(activity, "Find devices failed.", Toast.LENGTH_SHORT)
            .show()
        return
    }
    val dataList: MutableList<String> = ArrayList()
    for (deviceInfo in infoList) {
        dataList.add("Device:PID_" + deviceInfo.pid + " & " + "VID_" + deviceInfo.vid)
    }
    AlertCustomDialog.createSimpleListDialog(
        activity,
        "Please select USB device",
        dataList
    ) { position -> mCameraHelper.requestPermission(position) }
}

private val listener: OnMyDevConnectListener = object : OnMyDevConnectListener {
    override fun onAttachDev(device: UsbDevice) {
        // request open permission
        if (!isRequest) {
            isRequest = true
            popCheckDevDialog()
        }
    }

    override fun onDettachDev(device: UsbDevice) {
        // close camera
        if (isRequest) {
            isRequest = false
            mCameraHelper!!.closeCamera()
            showShortMsg(device.deviceName + " is out")
        }
    }

    override fun onConnectDev(device: UsbDevice, isConnected: Boolean) {
        if (!isConnected) {
            showShortMsg("fail to connect,please check resolution params")
            isPreview = false
        } else {
            isPreview = true
            showShortMsg("connecting")
            // initialize seekbar
            // need to wait UVCCamera initialize over
            Thread {
                // try {
                //     Thread.sleep(100);
                // } catch (InterruptedException e) {
                //     e.printStackTrace();
                // }
                Looper.prepare()
                if (mCameraHelper != null && mCameraHelper!!.isCameraOpened) {
                }
                Looper.loop()
            }.start()
}
}

    override fun onDisConnectDev(device: UsbDevice) {
        showShortMsg("disconnecting")
    }
}


override fun onCreateView(
    inflater: LayoutInflater,
    container: ViewGroup?,
    savedInstanceState: Bundle?
): View {
    mBinding = FragmentUsbcameraBinding.inflate(layoutInflater)
    initData()
    initView()
    return mBinding.root
}

private fun initData() {
    try {
        processor = FrameProcessor(activity,"pose_tracking_gpu.binarypb")

        processor?.setVideoInputStreamCpu("input_video")

        processor?.addPacketCallback("pose_landmarks",object :PacketCallback{
            override fun process(packet: Packet?) {
                try {
                    Log.e("aaaa", "process:${PacketGetter.getString(packet)} ", )

                } catch (exception: Exception) {
                }
            }

        })
    }catch (e:Exception){
       e.printStackTrace()
    }
}

private fun initView() {}

override fun onStart() {
    super.onStart()
    mLoaderCallback = object : BaseLoaderCallback(requireActivity()) {
        override fun onManagerConnected(status: Int) {
            when (status) {
                SUCCESS -> {
                    LogUtils.i("OpenCV loaded successfully")
                    standHist = Mat()
                    backgroundSubtractorMOG2 =
                        Video.createBackgroundSubtractorMOG2(500, 128.0, true)
                    bgMat = Mat()
                    openUsbCamera()

                }
                else -> {
                    super.onManagerConnected(status)
                }
            }
        }
    }
}

override fun onResume() {
    super.onResume()
    if (!OpenCVLoader.initDebug()) {
        LogUtils.i("Internal OpenCV library not found. Using OpenCV Manager for initialization")
        OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION, requireActivity(), mLoaderCallback)
    } else {
        LogUtils.i("OpenCV library found inside package. Using it!")
        mLoaderCallback?.onManagerConnected(LoaderCallbackInterface.SUCCESS)
    }
}

override fun onStop() {
    super.onStop()
    // step.3 unregister USB event broadcast
    if (mCameraHelper != null) {
        mCameraHelper.unregisterUSB()
    }
}

private fun showResolutionListDialog() {
    val builder = AlertDialog.Builder(requireActivity())
    val rootView =
        LayoutInflater.from(activity).inflate(R.layout.layout_dialog_list, null)
    val listView = rootView.findViewById<View>(R.id.listview_dialog) as ListView
    val adapter = ArrayAdapter(
        requireActivity(),
        android.R.layout.simple_list_item_1,
        resolutionList
    )
    if (adapter != null) {
        listView.adapter = adapter
    }
    listView.onItemClickListener =
        AdapterView.OnItemClickListener { adapterView, view, position, id ->
            if (!mCameraHelper.isCameraOpened) return@OnItemClickListener
            val resolution = adapterView.getItemAtPosition(position) as String
            val tmp = resolution.split("x").toTypedArray()
            if (tmp != null && tmp.size >= 2) {
                val width = Integer.valueOf(tmp[0])
                val height = Integer.valueOf(tmp[1])
                mCameraHelper.updateResolution(width, height)
            }
            mDialog!!.dismiss()
        }
    builder.setView(rootView)
    mDialog = builder.create()
    mDialog!!.show()
}

// example: {640x480,320x240,etc}
private val resolutionList: List<String>
    private get() {
        val list = mCameraHelper.supportedPreviewSizes
        var resolutions = mutableListOf<String>()
        if (list != null && list.size != 0) {
            for (size in list) {
                if (size != null) {
                    resolutions.add(size.width.toString() + "x" + size.height)
                }
            }
        }
        return resolutions
    }

override fun onDestroy() {
    super.onDestroy()
    FileUtils.releaseFile()
    // step.4 release uvc camera resources
    mCameraHelper.release()
}

private fun showShortMsg(msg: String) {
    Toast.makeText(requireActivity(), msg, Toast.LENGTH_SHORT).show()
}

override fun getUSBMonitor(): USBMonitor {
    return mCameraHelper.usbMonitor
}

override fun onDialogResult(canceled: Boolean) {
    if (canceled) {
        showShortMsg("取消操作")
    }
}

val isCameraOpened: Boolean
    get() = mCameraHelper.isCameraOpened

override fun onSurfaceCreated(view: CameraViewInterface, surface: Surface) {
    if (!isPreview && mCameraHelper.isCameraOpened) {
        mCameraHelper.startPreview(mBinding.cameraView)
        isPreview = true
    }
}

override fun onSurfaceChanged(
    view: CameraViewInterface,
    surface: Surface,
    width: Int,
    height: Int
) {
}

override fun onSurfaceDestroy(view: CameraViewInterface, surface: Surface) {
    if (isPreview && mCameraHelper.isCameraOpened) {
        mCameraHelper.stopPreview()
        isPreview = false
    }
}

companion object {
    @JvmStatic
    fun newInstance(bundle: Bundle) = USBCameraFragment().apply { arguments = bundle }
}

init {
    System.loadLibrary("mediapipe_jni")
}

}
"""

@google-ml-butler google-ml-butler bot removed the stat:awaiting response Waiting for user response label Jun 29, 2022
@gavel94
Author

gavel94 commented Jun 29, 2022

Here is my AAR packaging script. I don't know if this helps with the problem. Thank you very much for your help.
```
load("//mediapipe/java/com/google/mediapipe:mediapipe_aar.bzl", "mediapipe_aar")

mediapipe_aar(
    name = "mediapipe_pose_tracking",
    calculators = ["//mediapipe/graphs/pose_tracking:pose_tracking_gpu_deps"],
)
```

@gavel94
Author

gavel94 commented Jul 19, 2022

How do I use this method: `FrameProcessor.onNewFrame(final Bitmap bitmap, long timestamp)`?

Could you expedite handling of this issue?

If this plan doesn't work, I can consider another one.

Thank you very much.

@sureshdagooglecom

sureshdagooglecom commented Aug 5, 2022

Hi @gavel94 ,
I think we need to verify the image input that the client is providing.
Could you check the following while feeding images:
1) Verify that you are using the MediaPipe solution without changes.
2) Verify the type required by processor.onNewFrame().
3) Verify the contents of "bitmap" in the call to processor.onNewFrame (a sketch of such a check is shown below).
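
A hedged sketch of the kind of check meant in 3), using only standard Android APIs (the ARGB_8888 expectation is an assumption based on the Bitmap path FrameProcessor generally uses):

```kotlin
// Log the Bitmap's properties before handing it to onNewFrame().
val ok = cacheBitmap.config == Bitmap.Config.ARGB_8888 &&
    cacheBitmap.width > 0 && cacheBitmap.height > 0 && !cacheBitmap.isRecycled
Log.d("PoseDebug", "bitmap ${cacheBitmap.width}x${cacheBitmap.height} " +
        "config=${cacheBitmap.config} recycled=${cacheBitmap.isRecycled} ok=$ok")
```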

@sureshdagooglecom sureshdagooglecom added the stat:awaiting response Waiting for user response label Aug 5, 2022
@gavel94
Author

gavel94 commented Aug 8, 2022

> Hi @gavel94 , I think we need to verify the image input that the client is providing. Could you check the following while feeding images: 1) Verify that you are using the MediaPipe solution without changes. 2) Verify the type required by processor.onNewFrame(). 3) Verify the contents of "bitmap" in the call to processor.onNewFrame.

During this time, I tried various ways but couldn't achieve my needs.

For now, I prefer to use MediaPipe to solve the pose problem.

I don't understand how to verify the type and contents mentioned above.

The bitmap is converted by OpenCV and is displayed on the screen as a normal preview.

@google-ml-butler google-ml-butler bot removed the stat:awaiting response Waiting for user response label Aug 8, 2022
@schmidt-sebastian schmidt-sebastian self-assigned this Aug 24, 2022
@hadon

hadon commented Sep 7, 2022

I think the error is indicating that the "stream_name" parameter to nativeMovePacketToInputStream() is bad. This might mean that the instance variable "FrameProcessor.videoInputStream" is not properly initialized through FrameProcessor.addVideoStreams().
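
A hedged way to guard against that condition in app code, using only the setter mentioned earlier in this thread (the stream name is an assumption that must match the graph, and `processor` refers to the FrameProcessor shown above):

```kotlin
// Only feed frames once the CPU video input stream has been named; otherwise the
// stream name handed to nativeMovePacketToInputStream can end up null.
private var videoStreamConfigured = false

fun configureProcessor() {
    processor?.setVideoInputStreamCpu("input_video")  // assumed stream name
    videoStreamConfigured = true
}

fun onPreviewBitmap(bitmap: Bitmap) {
    if (videoStreamConfigured) {
        processor?.onNewFrame(bitmap, System.currentTimeMillis())
    }
}
```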

@gavel94
Author

gavel94 commented Sep 9, 2022

> I think the error is indicating that the "stream_name" parameter to nativeMovePacketToInputStream() is bad. This might mean that the instance variable "FrameProcessor.videoInputStream" is not properly initialized through FrameProcessor.addVideoStreams().

I can't find any other parameter settings in the project demo.

@sureshdagooglecom sureshdagooglecom added the stat:awaiting googler Waiting for Google Engineer's Response label Sep 14, 2022
@44438497

The problem is that GPU resources cannot be loaded correctly when using MediaPipe's Java API. The fix is to add the following permissions to the AndroidManifest.xml file:

<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
<uses-feature android:glEsVersion="0x00020000" android:required="true" />

In addition, the following dependencies need to be added to the build.gradle file:

implementation 'com.google.android.gms:play-services-mlkit-face-detection:16.1.1'
implementation 'com.google.android.gms:play-services-mlkit:16.1.1'
implementation 'com.google.protobuf:protobuf-java:3.12.0'
implementation 'com.google.guava:guava:28.2-android'
implementation 'com.google.code.findbugs:jsr305:3.0.2'
implementation 'com.google.android.material:material:1.2.0-alpha05'
implementation 'com.google.android.gms:play-services-vision:20.1.3'

Finally, here is the complete Java code example:

import android.Manifest;
import android.content.pm.PackageManager;
import android.os.Bundle;
import android.util.Log;
import android.widget.TextView;

import androidx.annotation.NonNull;
import androidx.appcompat.app.AppCompatActivity;
import androidx.camera.core.Camera;
import androidx.camera.core.CameraSelector;
import androidx.camera.core.ImageAnalysis;
import androidx.camera.core.ImageAnalysisConfig;
import androidx.camera.core.ImageProxy;
import androidx.camera.lifecycle.ProcessCameraProvider;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;

import com.google.common.util.concurrent.ListenableFuture;
import com.google.mediapipe.components.CameraHelper;
import com.google.mediapipe.components.FrameProcessor;
import com.google.mediapipe.formats.proto.LandmarkProto;
import com.google.mediapipe.framework.AndroidAssetUtil;
import com.google.mediapipe.framework.Graph;
import com.google.mediapipe.framework.GraphService;
import com.google.mediapipe.framework.Packet;
import com.google.mediapipe.framework.PacketGetter;
import com.google.mediapipe.framework.TextureFrame;
import com.google.mediapipe.glutil.EglManager;
import com.google.protobuf.InvalidProtocolBufferException;

import java.nio.ByteBuffer;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Executor;
import java.util.concurrent.Executors;

public class MainActivity extends AppCompatActivity {

    private static final String TAG = "MainActivity";
    private static final int REQUEST_CODE_PERMISSIONS = 10;
    private static final String[] REQUIRED_PERMISSIONS = new String[]{Manifest.permission.CAMERA};

    private ListenableFuture<ProcessCameraProvider> cameraProviderFuture;
    private FrameProcessor frameProcessor;
    private Executor executor = Executors.newSingleThreadExecutor();

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        if (allPermissionsGranted()) {
            startCamera();
        } else {
            ActivityCompat.requestPermissions(this, REQUIRED_PERMISSIONS, REQUEST_CODE_PERMISSIONS);
        }

        // Initialize the FrameProcessor.
        frameProcessor = new FrameProcessor(
                this,
                executor,
                "face_detection_front.tflite",
                "face_detection_front_labelmap.txt",
                4,
                4,
                true);
        frameProcessor.getVideoSurfaceOutput().setFlipY(true);

        // Setup a callback for when new frames are available.
        frameProcessor.setOnWillAddFrameListener((timestamp) -> {
            Log.d(TAG, "onWillAddFrame: " + timestamp);
        });

        // Add a callback to render the face landmarks.
        frameProcessor.addPacketCallback("face_landmarks_with_iris", (packet) -> {
            ByteBuffer landmarksData = PacketGetter.getProto(packet, LandmarkProto.NormalizedLandmarkList.parser()).asReadOnlyByteBuffer();
            try {
                LandmarkProto.NormalizedLandmarkList landmarks = LandmarkProto.NormalizedLandmarkList.parseFrom(landmarksData);
                Log.d(TAG, "face landmarks: " + landmarks);
            } catch (InvalidProtocolBufferException e) {
                Log.e(TAG, "Failed to get face landmarks from packet: " + e);
            }
        });

        // Start the FrameProcessor.
        frameProcessor.start();
    }

    private void startCamera() {
        cameraProviderFuture = ProcessCameraProvider.getInstance(this);
        cameraProviderFuture.addListener(() -> {
            try {
                ProcessCameraProvider cameraProvider = cameraProviderFuture.get();
                bindPreview(cameraProvider);
            } catch (ExecutionException | InterruptedException e) {
                e.printStackTrace();
            }
        }, ContextCompat.getMainExecutor(this));
    }

    private void bindPreview(@NonNull ProcessCameraProvider cameraProvider) {
        ImageAnalysisConfig config = new ImageAnalysisConfig.Builder()
                .setTargetResolution(CameraHelper.computeIdealSize(640, 480))
                .setLensFacing(CameraSelector.LENS_FACING_FRONT)
                .setImageReaderMode(ImageAnalysis.ImageReaderMode.ACQUIRE_LATEST_IMAGE)
                .build();

        ImageAnalysis imageAnalysis = new ImageAnalysis(config);
        imageAnalysis.setAnalyzer(executor, new ImageAnalysis.Analyzer() {
            @Override
            public void analyze(@NonNull ImageProxy image) {
                // Convert the ImageProxy to a TextureFrame.
                TextureFrame textureFrame = new TextureFrame(image.getWidth(), image.getHeight(), TextureFrame.TextureFormat.RGBA);
                ByteBuffer buffer = image.getPlanes()[0].getBuffer();
                textureFrame.setBuffer(buffer);

                // Process the frame with MediaPipe.
                frameProcessor.process(textureFrame);

                // Close the ImageProxy.
                image.close();
            }
        });

        CameraSelector cameraSelector = new CameraSelector.Builder()
                .requireLensFacing(CameraSelector.LENS_FACING_FRONT)
                .build();

        Camera camera = cameraProvider.bindToLifecycle(this, cameraSelector, imageAnalysis);
    }

    private boolean allPermissionsGranted() {
        for (String permission : REQUIRED_PERMISSIONS) {
            if (ContextCompat.checkSelfPermission(this, permission) != PackageManager.PERMISSION_GRANTED) {
                return false;
            }
        }
        return true;
    }

    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        if (requestCode == REQUEST_CODE_PERMISSIONS) {
            if (allPermissionsGranted()) {
                startCamera();
            } else {
                finish();
            }
        }
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        frameProcessor.close();
    }
}

@kuaashish kuaashish assigned kuaashish and unassigned hadon Apr 26, 2023
@kuaashish kuaashish removed the stat:awaiting googler Waiting for Google Engineer's Response label Apr 26, 2023
@kuaashish
Collaborator

Hello @gavel94,
We are upgrading the MediaPipe Legacy Solutions to the new MediaPipe Solutions. However, the libraries, documentation, and source code for all the MediaPipe Legacy Solutions will continue to be available in our GitHub repository and through library distribution services, such as Maven and NPM.

You can continue to use those legacy solutions in your applications if you choose. However, we would ask you to check out the new MediaPipe Solutions, which can help you more easily build and customize ML solutions for your applications. These new solutions will provide a superset of the capabilities available in the legacy solutions. Thank you.

@kuaashish kuaashish added the stat:awaiting response Waiting for user response label Apr 26, 2023
@github-actions

github-actions bot commented May 4, 2023

This issue has been marked stale because it has no recent activity since 7 days. It will be closed if no further activity occurs. Thank you.

@github-actions github-actions bot added the stale label May 4, 2023
@github-actions

This issue was closed due to lack of activity after being marked stale for past 7 days.

@kuaashish kuaashish removed stat:awaiting response Waiting for user response stale labels May 11, 2023