GDB support w/new toolchain and UART driver #5559
It appears that Espressif patched the open source Xtensa GDB port in order to build their old GDB executable and their old gdbstub (basically removing every register in a generic Xtensa and leaving only those present in the chip they synthesized). Their GDBStub also assumed this behavior. Unpatched upstream GNU GDB now expects all the registers in xtensa-config.c to be sent/read on a 'g' command. Change the GDB stub to send "xxxxxxxx"s (legal per the spec) for unimplemented registers. This makes the 'g' response much longer, but its results are cached, and in an interactive debugger it isn't noticeable.
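To illustrate the padding scheme described above, here's a minimal, hypothetical sketch (the `emit_reg` helper and its layout are illustrative, not the stub's actual code): each implemented 32-bit register in the 'g' reply is eight hex characters in little-endian (target) byte order, and each unimplemented register is sent as the literal string "xxxxxxxx", which the remote protocol accepts for unavailable registers.

```c
#include <stdio.h>
#include <string.h>
#include <stdint.h>

/* Hypothetical helper: format one 32-bit register for a 'g' reply.
   Implemented registers become 8 hex chars in little-endian (target)
   byte order; unimplemented ones become the literal "xxxxxxxx". */
static void emit_reg(char *out, uint32_t value, int implemented) {
    if (!implemented) {
        memcpy(out, "xxxxxxxx", 8);
        out[8] = '\0';
        return;
    }
    for (int byte = 0; byte < 4; byte++) {
        /* low-order byte first, two lowercase hex digits per byte */
        sprintf(out + byte * 2, "%02x",
                (unsigned)((value >> (byte * 8)) & 0xffu));
    }
}
```

With this scheme a register holding 0x12345678 is sent as "78563412", and every hole in the generic Xtensa register file just becomes "xxxxxxxx" in the (now longer) reply.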
All functions which are not interrupt or exception called are now in flash. A small IRAM wrapper enables flash when processing main GDB ops by calling Cache_Read_Enable_New() and then jumping to the main flash code. This seems to work for catching exceptions, data and code breaks, and Ctrl-C. The UART ISR handler and exception handler register-saving bits of code in ASM are still in IRAM. GDB IRAM usage is now about 670 bytes.
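A rough host-side sketch of the IRAM trampoline pattern described above. The stand-ins for the ROM call and the flash-resident handler are hypothetical; on the real chip the wrapper carries an IRAM attribute, the handler is ICACHE_FLASH_ATTR, and Cache_Read_Enable_New() actually remaps flash into the address space.

```c
/* Host stand-in for the ROM call; on the ESP8266 this re-enables the
   flash cache mapping so flash-resident code becomes executable. */
static int flash_mapped = 0;
static void Cache_Read_Enable_New(void) { flash_mapped = 1; }

/* Main GDB command processing lives in flash on the device; it is
   only safe to call once the flash cache has been re-enabled. */
static void gdbstub_handle_command_flash(void) {
    /* ...parse and answer GDB packets here... */
}

/* Small wrapper that must stay in IRAM: it is the only GDB code that
   can run straight from an exception/interrupt context. */
static void gdbstub_exception_trampoline(void) {
    Cache_Read_Enable_New();          /* make flash readable again  */
    gdbstub_handle_command_flash();   /* then jump to the flash code */
}
```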
Add some simple GDB documentation to the main tree showing a worked example. Adds the declaration of `void gdbstub_init()` to <GDBStub.h>.
I have several comments and questions, but I'm not sure if they should be addressed to @earlephilhower, @kylefleming, or someone else.
@earlephilhower In #4386 there are some observations, e.g.:
Does it make sense to pursue them in view of the current status of this implementation?
@devyte wrote:
He was talking about the fatal exception handler (postmortem). It's already moved out in 2.5.0-b2
That's IRAM usage, AFAICT from the thread. The movement of things to flash with stubs that call os-enable-flash was already done. We're at ~700B of IRAM and ~1KB of heap (and ~3KB of flash). So it's not something we would want to enable by default, but it's not unreasonable. There might be heap/flash savings possible, but I've not yet looked into any optimizations other than minimizing the IRAM footprint. Since the other review comments look like work, I'll leave them for tomorrow.
Replace GDBstub.h with the version in the internal/ directory, and adjust stub code accordingly. This way, only one copy of a file called "GDBstub.h" will exist. Update the gdbcommands and replace the obsolete ESPRESSIF readme with @kylefleming's version since we're mainly doing serial, not TCP, connected debugging. Bump the library rev. number since this is a pretty big functionality change. Minor documentation tweak.
Remove the refactoring of pin control and other little things not directly related to GDB processing. Should greatly reduce the diff size in uart.c. Should also remove any register value changes (intended or otherwise) introduced in the original PR from @kylefleming. Set the FIFO interrupt to 16 chars when in GDB mode, matching the latest UART configuration for highest speed.
Comments added to uart.c trying to explain (as best as I understand it) the changes done to support GDB and how they interact with standard operation. Fix uart_uninit to stop the ISR and then free appropriately. Fix uart_isr_handle_data (GDB's shim for sending chars to the 8266 app) to do exactly the same thing as the standard UART handler, including setting the overflow flag properly and either discarding or overwriting in that case. Fix serial reception when GDB is enabled by enabling the user recv ISR. Remove commented attributes from gdbstub, leftover from the move to flash. General logic cleanup per comments in the PR.
Ensure we also check the UART flags and set the uart status appropriately when in GDB mode.
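As a rough illustration of the overflow handling mentioned above, here is a hypothetical host-side ring buffer with the two policies (the names, the 8-byte size, and the policy flag are illustrative; the core's actual uart.c buffer differs): on a full RX buffer, either discard the new byte or overwrite the oldest one, and raise an overrun flag either way.

```c
#define RX_BUF_SIZE 8  /* illustrative size; the real buffer is larger */

typedef struct {
    unsigned char buf[RX_BUF_SIZE];
    int head, tail, count;
    int overrun;          /* set whenever data was lost or clobbered  */
    int overwrite_oldest; /* policy: 0 = discard new, 1 = drop oldest */
} rx_ring;

static void rx_push(rx_ring *r, unsigned char c) {
    if (r->count == RX_BUF_SIZE) {
        r->overrun = 1;                        /* overflow either way */
        if (!r->overwrite_oldest)
            return;                            /* discard the new byte */
        r->tail = (r->tail + 1) % RX_BUF_SIZE; /* drop the oldest byte */
        r->count--;
    }
    r->buf[r->head] = c;
    r->head = (r->head + 1) % RX_BUF_SIZE;
    r->count++;
}
```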
Thanks @earlephilhower and @kylefleming
Hi there,
@freeck, thanks for the report! Is there anything needed to make it work with your IDE? I can update the docs easily enough (but run CLI for most things so haven't used Eclipse or an IDE in a long time).
I will adapt my document (the document should reside on this site somewhere) based on the previous version of gdbstub. It is quite easy to configure Eclipse.
Yes, the mem settings are applicable to all boards. While the 512KB flash boards don't have the full 1MB program flash, the address space that doesn't exist should never be accessed by GDB (since your program can never reference it).
A question, perhaps a bit off topic: is there any documentation on how to handle exceptions at the application level (try/catch), with and without using this debugger? With the example provided I observe that the debugger halts at line 4....(-fexceptions). Adding a try/catch statement around line 4 shows the same result.
They're two different things: hardware exceptions vs. C++ exceptions.
You can't catch() the first type; they're below the language level, and the hardware actually jumps to a HW exception vector.
I'm not sure I understand what you're trying to do, but if you don't include GDBStub.h and do the call, then the GDB stub will never be installed and the default exception handler will be called. The simplest thing you could do is use
That implies I have to build a new image and download it to the target...
OK, that's an option! But now: when I disconnect the serial communication line, the gdbserver remains active, doesn't it? Ergo the GDB server will be activated on an exception and the application halts. So how does one detach programmatically in such cases?
Actually, scratch that. If a call to gdbstub_init() appears anywhere in the code, the linker will replace the dummy no-op GDB hooks with the real GDB hooks, even if the call is never executed. So calling gdbstub_init() later on won't affect things; that call only hooks the serial port and (I think) the breakpoint exception. With the way it's architected now, I don't believe there's a clean, sane way to do what you're trying.
OK, no problem, I am happy as it is.
@devyte, what do you think of getting this in 2.5.0? It's self contained and introduces no changes in default operation (you need to mod your code to enable GDB). It's been tested by me and @freeck with different environments (Eclipse and CLI). Even in the worst case GDB would just not work (same as before), so it seems like only upside.
Using Eclipse for debugging, I observe that system messages and output from ets_printf() are sent to the gdb trace window on a single-character basis, each character packed in a debug message, whereas functions like Serial.printf() are buffered and sent on a line basis. The former creates a lot of overhead in GDB client-server traffic and significantly delays the debug process when entering a breakpoint or single stepping.
Haven't seen any problem when running from the CLI (printfs seem to come out at normal speed, but I haven't run it with a stopwatch or anything). Hitting a breakpoint and getting app as well as debug-command output seems instantaneous. Could be some weird Java thing or the way Eclipse is running things (maybe single-insn stepping through all code?).
(See Arduino/libraries/GDBStub/src/internal/gdbstub.c, lines 362 to 370 at a3ed4b4.)
So only if the GDB debugger is connected will it encapsulate chars in a GDB packet. That's just the way it has to be when GDB is connected, since GDB is parsing everything coming over the line and would otherwise interpret raw bytes as garbage and barf. At 115200 on the CLI it's, as close as I can tell, at normal speed. Maybe you're running at a low baud?
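For reference, console output under an attached GDB is wrapped in remote-protocol 'O' (output) packets of the form $O<hex payload>#<checksum>, where the checksum is the payload bytes summed modulo 256 as two hex digits. A hypothetical encoder for a single character (gdb_o_packet is an illustrative name, not the stub's actual function):

```c
#include <stdio.h>

/* Hypothetical: wrap one console byte in a GDB 'O' packet of the form
   $O<two hex digits>#<two-hex-digit checksum>. */
static void gdb_o_packet(char *out, unsigned char c) {
    char payload[4];
    sprintf(payload, "O%02x", c);           /* 'O' + hex of the byte  */
    unsigned sum = 0;
    for (const char *p = payload; *p; p++)  /* checksum of payload    */
        sum = (sum + (unsigned char)*p) & 0xff;
    sprintf(out, "$%s#%02x", payload, sum); /* frame with $...#cksum  */
}
```

So a single 'A' (0x41) becomes the seven-byte packet $O41#b4, which is the per-character framing overhead being discussed.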
My one comment is almost the same as @freeck's: it would be nice to be able to de-init the gdbstub to allow normal operation. I know that calling gdbstub_init() replaces the hooks with real ones, but that shouldn't mean the gdbstub functionality needs to be enabled all the time. In other words, it should be possible to have the gdbstub compiled into the app, then have something like a master switch that enables/disables the capability to attach gdb.
@devyte It is indeed a nice-to-have. So far I am very happy with the debugger as it is.
Please feel free to open an issue for the next release. It's not impossible, only tedious, but it will need new code and testing. You won't get back any IRAM, of course, which was the big reason for not including GDBStub by default.
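A hypothetical sketch of what such a master switch could look like. Note that gdbstub_set_enabled() and gdbstub_rx_hook() are invented names for illustration, not an existing API: the stub stays linked in (so no IRAM is recovered), but its RX hook falls through to normal UART handling when the flag is off.

```c
#include <stdbool.h>

/* Hypothetical master switch: stub stays linked, but attaching is
   gated at runtime. Default on, matching today's behavior. */
static volatile bool gdb_attach_enabled = true;

void gdbstub_set_enabled(bool on) { gdb_attach_enabled = on; }

/* Hypothetical RX hook: return 1 if the byte was consumed by the
   stub, 0 if it should pass through to the application as usual. */
int gdbstub_rx_hook(int c) {
    if (!gdb_attach_enabled)
        return 0;                    /* disabled: always pass through */
    /* Enabled: treat Ctrl-C (0x03) and packet-start '$' as GDB
       traffic in this toy model; the real parser is stateful. */
    return (c == 0x03 || c == '$');
}
```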
Sorry guys, but I wish I had known about this while I had an active pull request of my own touching uart.h/uart.c. I want to discuss whether this truly was the right thing to do. I have concerns over the architecture that moved ownership of the UART ISR out of the uart code into another place, versus having the gdbstub register into the uart driver to take over complete ownership. What you have created is hard to manage in one spot or even enhance (which was what my pull was cleaning up, and it was active at the same time), making my changes and this pretty incompatible.
Observation: my application relies on the SoftwareSerial library (ver. 18-12-2018). I receive 1000-byte messages at a baud rate of 115200; each message ends with a CRC.
Absolutely no idea, sorry. SWSerial doesn't use interrupts at 115.2K; it busy-loops unless the baud is really low, IIRC. I'd suggest pinging the SWSerial author if I were you (we did upgrade to the latest SWSerial repo a month or so back)...
Don't you think there is a strong relation with issue #5513, knowing that SW serial does not use interrupts?
Concerning IRAM usage for gdbstub: from version 2.5.0 on, the I2C library core_esp8266_si2c.c has been extended with slave functionality. For good reason, it was decided to put a lot of its functions in IRAM!
I don't think so. I was mistaken about SWSerial. It transmits in a busy-loop, but Rx is handled right inside a single ISR handler. It doesn't return from the ISR until a full byte is shifted in, so I'm not seeing how anything at all could be affected there no matter what's done in the core.
Open up a new issue on IRAM w/the i2c blowing up, that way it can be followed. 2.5K is a big amount... |
I will.
SWSerial introduces a 10% CPU load in the ISR. In my case I send 1000 chars/second at a baud rate of 115200.
This extends #4386 by manually fixing the merge issues in @kylefleming's original PR, fixing the linker literal assignment errors, and updating the PR to use the open source GDB protocol for Xtensa (as is included in the new toolchain).
It's still a WIP because I've not tried everything out nor written up how to use it, but it can stop a program in mid-run, set a breakpoint, continue, show values, and change them.