
dport task stack overflow (IDFGH-4586) #6403

Closed
SMinin opened this issue Jan 14, 2021 · 2 comments

Comments

@SMinin

SMinin commented Jan 14, 2021

Environment

  • Development Kit: Custom board
  • Module or chip used: ESP32-WROOM-32D
  • IDF version (run git describe --tags to find it): v4.2
  • Build System: idf.py
  • Compiler version (run xtensa-esp32-elf-gcc --version to find it): xtensa-esp32-elf-gcc (crosstool-NG esp-2020r3) 8.4.0
  • Operating System: Linux
  • Using an IDE?: Yes (Eclipse 2020-12)
  • Power Supply: external 3.3V

Problem Description

On the latest v4.2 release, a stack overflow occurs even before app_main is launched. The problem only disappears when the stack size of the "dport" task (esp-idf/components/esp32/dport_access.c) is increased; increasing the stack size of other tasks does not help. The issue could not be reproduced when running the "blink" example with the default sdkconfig, but it is reproducible with the sdkconfig from our application (attached). The changed stack sizes of other tasks in our sdkconfig probably affect when the stack overflow is detected.

Expected Behavior

Actual Behavior

Steps to reproduce

1. Copy the attached sdkconfig to the "blink" example application folder and remove the ".txt" extension.
2. Compile and run the application on the ESP32.
3. A stack overflow occurs while the main tasks are being initialized.

Code to reproduce this issue

Debug Logs

I (137) cpu_start: Pro cpu start user code
I (148) spi_flash: detected chip: generic
I (148) spi_flash: flash io: dio
I (149) esp_core_dump_uart: Init core dump to UART
I (149) cpu_start: Starting scheduler on PRO CPU.
Guru Meditation Error: Core 0 panic'ed (Unhandled debug exception).
Debug exception reason: Stack canary watchpoint triggered (Tmr Svc)
Core 0 register dump:
PC : 0x4008a4ae PS : 0x00060036 A0 : 0x80081224 A1 : 0x3ffb5570
0x4008a4ae: cpu_hal_set_watchpoint at /home/sergey/esp-idf/components/soc/src/hal/cpu_hal.c:39

A2 : 0x00000001 A3 : 0x3ffb7ca0 A4 : 0x00000020 A5 : 0x00000001
A6 : 0x400d3858 A7 : 0x00000000 A8 : 0x86456fd4 A9 : 0x00000000
0x400d3858: dport_access_init_core at /home/sergey/esp-idf/components/esp32/dport_access.c:149

A10 : 0x00000000 A11 : 0x00000000 A12 : 0x3ffb57b0 A13 : 0x3ffb5780
A14 : 0x00000001 A15 : 0x10000000 SAR : 0x00000000 EXCCAUSE: 0x00000001
EXCVADDR: 0x00000000 LBEG : 0x00000000 LEND : 0x00000000 LCOUNT : 0x00000000

Other items if possible

sdkconfig.txt

@github-actions github-actions bot changed the title dport task stack overflow dport task stack overflow (IDFGH-4586) Jan 14, 2021
@Alvin1Zhang
Collaborator

Thanks for reporting, we will look into it.

@projectgus
Contributor

projectgus commented Feb 5, 2021

Hi @SMinin,

Thanks for the very clear steps to reproduce, and for being patient while someone got back to you.

You're correct that the DPORT task is overflowing its allocated stack. The overflow happens right at the end of the context switch, which is why the line "Stack canary watchpoint triggered (Tmr Svc)" suggests a different task (the timer task) is implicated: the last step of a context switch is to move the stack watchpoint. That step still runs on the previous task's stack, via the call to cpu_hal_set_watchpoint, but after the FreeRTOS scheduler has already switched its state to the new task.

There seem to be two config settings contributing to the stack overflow:

(Setting the above two options, and enabling stack overflow watchpoints, is enough to reproduce here.)

A fix is incoming that increases the default task stack sizes when the "Overall" stack smashing option is set, and we'll also improve the test coverage for stack headroom in these configurations to avoid regressions. The fix will land on master first and then be backported to the release/v4.2 branch for inclusion in a v4.2 bugfix release.

In the meantime, the workaround you have clearly works. If you'd like to save overall stack usage, changing the stack smashing protection setting back to "Strong" should also stop the crash, and it may let you relax some of the other task stack sizes you bumped in your sdkconfig, if you're looking to reclaim RAM and don't need the "Overall" protection setting.
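As a sketch of that workaround in sdkconfig terms (option names as found in ESP-IDF v4.x Kconfig; verify them in `idf.py menuconfig` under Compiler options before relying on this):

```
# Switch stack smashing protection from "Overall" back to "Strong"
# CONFIG_COMPILER_STACK_CHECK_MODE_ALL is not set
CONFIG_COMPILER_STACK_CHECK_MODE_STRONG=y
```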

espressif-bot pushed a commit that referenced this issue Apr 1, 2021
…r is enabled

Fixes issue with DPORT init task, this task uses minimum stack size and may not be
enough if stack smashing detection is set to Overall mode.

Also reworks the way we calculate minimum stack to allow for adding multiple
contributing factors.

Closes #6403
mahavirj pushed a commit to espressif/esp-afr-sdk that referenced this issue Apr 7, 2021
projectgus added a commit that referenced this issue Apr 21, 2021
espressif-bot pushed a commit that referenced this issue Jun 11, 2021