
Latest 6.0 SDK/Runtime Docker images segmentation fault on ARM64 #79796

Closed

MarcusWichelmann opened this issue Dec 18, 2022 · 25 comments

@MarcusWichelmann

MarcusWichelmann commented Dec 18, 2022

Description

When running the current mcr.microsoft.com/dotnet/sdk:6.0 image on a Raspberry Pi 3B+ with a 64-bit OS, dotnet --version segfaults.

The 7.0 images work. Older 6.0 images, like 6.0.301 also work.

Running a published application using the latest mcr.microsoft.com/dotnet/aspnet:6.0 image also segfaults. The older mcr.microsoft.com/dotnet/aspnet:6.0.8 runtime image works.

Reproduction Steps

With latest 6.0 image (SDK version 6.0.404):

# docker run -it --rm mcr.microsoft.com/dotnet/sdk:6.0
root@a9d38a1ca2dc:/# dotnet --version
Segmentation fault (core dumped)

An older image works:

# docker run -it --rm mcr.microsoft.com/dotnet/sdk:6.0.301
root@a6d6799cb152:/# dotnet --version
6.0.301

SDK 7.0 images also work:

# docker run -it --rm mcr.microsoft.com/dotnet/sdk:7.0
root@2d60270b7aeb:/# dotnet --version
7.0.101

This should be easily reproducible with any ARM64 machine at hand.

Expected behavior

The dotnet command on the current SDK 6.0 docker images shouldn't segfault.

Actual behavior

The dotnet command on 6.0.404 segfaults.

Regression?

No response

Known Workarounds

I'm currently using the older 6.0.301 image. But that's obviously not a permanent solution.

Configuration

No response

Other information

Please let me know if I should test any other SDK versions or can provide any other helpful output.

@dotnet-issue-labeler

I couldn't figure out the best area label to add to this issue. If you have write-permissions please help me learn by adding exactly one area label.

@ghost ghost added the untriaged New issue has not been triaged by the area owner label Dec 18, 2022
@MarcusWichelmann
Author

6.0.403 also works, so the bug must have been introduced very recently:

# docker run -it --rm mcr.microsoft.com/dotnet/sdk:6.0.403
root@5abe5e17b56f:/# dotnet --version
6.0.403

@MarcusWichelmann MarcusWichelmann changed the title dotnet segmentation faults on ARM64 with 6.0 Docker images SDK 6.0.404 Docker image segmentation faults on ARM64 Dec 18, 2022
@MarcusWichelmann MarcusWichelmann changed the title SDK 6.0.404 Docker image segmentation faults on ARM64 Latest SDK/Runtime Docker images segmentation fault on ARM64 Dec 18, 2022
@MarcusWichelmann MarcusWichelmann changed the title Latest SDK/Runtime Docker images segmentation fault on ARM64 Latest 6.0 SDK/Runtime Docker images segmentation fault on ARM64 Dec 18, 2022
@vcsjones
Member

I cannot reproduce this on an M1 Mac. It may be specific to running on a Raspberry Pi 3B+.

Output
docker run -it --rm mcr.microsoft.com/dotnet/sdk:6.0 dotnet --info

.NET SDK (reflecting any global.json):
Version: 6.0.404
Commit: be4f3ec411

Runtime Environment:
OS Name: debian
OS Version: 11
OS Platform: Linux
RID: debian.11-arm64
Base Path: /usr/share/dotnet/sdk/6.0.404/

global.json file:
Not found

Host:
Version: 6.0.12
Architecture: arm64
Commit: 02e45a4

Does this reproduce for you outside of Docker? Also, are you able to grab a memory dump of the crash?
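
As an aside, the runtime can also be told to write a dump on crash by itself. A minimal sketch, assuming the COMPlus_-prefixed crash-dump variables supported by the 6.0 runtime (newer versions also accept a DOTNET_ prefix), would be:

# export COMPlus_DbgEnableMiniDump=1
# export COMPlus_DbgMiniDumpType=4             # 4 = full dump
# export COMPlus_DbgMiniDumpName=/tmp/coredump # where the dump should be written
# dotnet --version                             # should now leave /tmp/coredump behind on the segfault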

@ghost

ghost commented Jan 4, 2023

Tagging subscribers to this area: @vitek-karas, @agocke, @VSadov
See info in area-owners.md if you want to be subscribed.

Author: MarcusWichelmann
Assignees: -
Labels: area-Host, untriaged

Milestone: -

@draakuns

draakuns commented Jan 8, 2023

Hi,

I reproduced it on a Pine64 A64+. It has an Allwinner CPU, so apart from both being ARM64 it is not in the same CPU family as the Raspberry Pi.

Running in Docker, it segfaults:

# dotnet --info
Segmentation fault (core dumped)

If you direct me, I can try to get a dump. It seems to be related to runtime 6.0.12 / SDK 6.0.404, as 6.0.403 (runtime 6.0.11) works and 7.0 works too.

EDIT: Kernel 5.15.80-sunxi64 (Armbian OS)

Regards,
Draakuns

@elinor-fung
Member

You can run ulimit -c unlimited and then the crashing command to get a core dump.

In your most recent comment, it was dotnet --info that seg faulted. Do you know if dotnet --info (as opposed to dotnet --version) was also seg faulting in your original repro?

Could you also collect a host trace via the COREHOST_TRACE and COREHOST_TRACEFILE environment variables per https://learn.microsoft.com/dotnet/core/tools/dotnet-environment-variables#corehost_trace?
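
Something like this (a sketch, assuming a bash shell inside the container) should collect both artifacts:

# ulimit -c unlimited
# export COREHOST_TRACE=1
# export COREHOST_TRACEFILE=/tmp/host_trace.txt
# dotnet --info   # segfaults; where the core file lands depends on /proc/sys/kernel/core_pattern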

@MichaIng

MichaIng commented Jan 14, 2023

I tested v6.0.13 with dotnet --info on Amlogic S922X (Cortex-A73 and A53), Rockchip RK3588 (Cortex-A76 and A55), RK3568 (Cortex-A55), RK3399 (Cortex-A72 and A53) and RK3328 (Cortex-A53), and it worked on all of them. The OSes were Debian Bullseye and Bookworm.

So the CPU itself is ruled out: the failing cases (Raspberry Pi 3 and PINE A64) are Cortex-A53 as well, and Cortex-A53 boards work here. The distro alone is also ruled out, since there are cases where it does fail on Debian Bullseye, which works here.

So sadly I cannot replicate the issue on the SBCs I have and cannot provide error traces 😞.

@MichaIng

This seems to be solved in SDK v6.0.405 / runtime v6.0.13. Interestingly, I could not trigger a segfault with the dotnet binary from the v6.0.12 runtime, but only with the one from the v6.0.404 SDK, which ships the v6.0.12 runtime as well. I tried to narrow it down:

  • As soon as the v6.0.13 .NET Core runtime is installed into the v6.0.404 SDK, the segfault stops; shown here the other way round, by removing it again:
# ./dotnet --info
.NET SDK (reflecting any global.json):
 Version:   6.0.404
 Commit:    be4f3ec411

Runtime Environment:
 OS Name:     debian
 OS Version:  11
 OS Platform: Linux
 RID:         debian.11-arm64
 Base Path:   /root/dotnet/sdk/6.0.404/

global.json file:
  Not found

Host:
  Version:      6.0.12
  Architecture: arm64
  Commit:       02e45a41b7

.NET SDKs installed:
  6.0.404 [/root/dotnet/sdk]

.NET runtimes installed:
  Microsoft.AspNetCore.App 6.0.12 [/root/dotnet/shared/Microsoft.AspNetCore.App]
  Microsoft.NETCore.App 6.0.12 [/root/dotnet/shared/Microsoft.NETCore.App]
  Microsoft.NETCore.App 6.0.13 [/root/dotnet/shared/Microsoft.NETCore.App]

Download .NET:
  https://aka.ms/dotnet-download

Learn about .NET Runtimes and SDKs:
  https://aka.ms/dotnet/runtimes-sdk-info
# ./dotnet --version
6.0.404
# mv /root/dotnet/shared/Microsoft.NETCore.App/6.0.13 ..
# ./dotnet --info
Segmentation fault
# ./dotnet --version
Segmentation fault

I produced a core dump and can do a host trace. However, is this still required, given that the current version seems to be fixed?

@draakuns

draakuns commented Jan 29, 2023

@elinor-fung
Sorry, I'm quite late. As I understand it, this is fixed in .405... but anyway, if the core dump and host trace are still needed, can you let me know where to upload them?

The dotnet binary segfaults no matter what flags you provide: --info, --version, etc.

Best regards

@elinor-fung
Member

Thanks for all the investigation here.

could not trigger a segfault with the dotnet binary from the v6.0.12 runtime, but only with the one from the v6.0.404 SDK

As soon as the v6.0.13 .NET Core runtime is installed into the v6.0.404 SDK, the segfault stops

This is definitely interesting... dotnet --info does end up getting information from the SDK (if it is installed). Perhaps something in the SDK is exposing a bug that was in v6.0.12 of the runtime. The only change from v6.0.12 to v6.0.13 that looks of interest is a fix for a segfault around tiered compilation. I'd be curious if disabling tiered compilation (DOTNET_TieredCompilation=0) affects the repro here.

I'm glad it is working on the latest versions - I'm going to close this issue based on that.

I would be curious to see the trace/dump. If you are willing to collect/share, you can upload them to this issue - but only if you are okay with sharing, since they will be public - otherwise, feel free to e-mail them to me (address is in my profile).

@ghost ghost removed the untriaged New issue has not been triaged by the area owner label Jan 31, 2023
@MichaIng

MichaIng commented Jan 31, 2023

I'd be curious if disabling tiered compilation (DOTNET_TieredCompilation=0) affects the repro here.

# DOTNET_TieredCompilation=0 ./dotnet --version
Segmentation fault

Or is this not a runtime variable?

EDIT: I tried adding/setting

{
   "runtimeOptions": {
      "configProperties": {
         "System.Runtime.TieredCompilation": false
      }
   }
}

in ./shared/Microsoft.NETCore.App/6.0.12/Microsoft.NETCore.App.runtimeconfig.json (and some other *.runtimeconfig.json files), but it still segfaults.

I would be curious to see the trace/dump. If you are willing to collect/share, you can upload them to this issue - but only if you are okay with sharing, since they will be public - otherwise, feel free to e-mail them to me (address is in my profile).

A pure test system, so this is fine:

@draakuns

@elinor-fung Core dump and host trace from my SBC too; I hope it sheds some light on the issue.

host_trace.txt.gz
core.17.gz

@elinor-fung
Member

It is this no runtime variable?

I think it'd need to be export DOTNET_TieredCompilation=0.

Thanks to both of you for the traces and dumps. I haven't managed to get symbols for some reason, but the crash does look to happen while running the SDK (dotnet.dll).

 # Child-SP          RetAddr               Call Site
00 0000007f`ef50cf70 0000007f`1dff3a5c     0x0000007f`18934424
01 0000007f`ef50cf70 0000007f`1d766ca4     System_Linq+0x53a5c
02 0000007f`ef50cfa0 0000007f`1d766df8     dotnet+0xd6ca4
03 0000007f`ef50d000 0000007f`1d768220     dotnet+0xd6df8
04 0000007f`ef50d040 0000007f`9691f948     dotnet+0xd8220
05 0000007f`ef50d0a0 0000007f`9677c020     libcoreclr+0x482948

@dotnet/dotnet-diag Any ideas why dotnet-symbol doesn't seem to recognize the .NET binaries for either of the dumps?

# dotnet-symbol core -d
Downloading from https://msdl.microsoft.com/download/symbols/
WARNING: Unknown ELF core image 000000556D549000 /root/dotnet/dotnet
WARNING: Unknown ELF core image 0000007F1C9B0000 /root/dotnet/shared/Microsoft.NETCore.App/6.0.12/System.Private.CoreLib.dll
WARNING: Unknown ELF core image 0000007F1D690000 /root/dotnet/sdk/6.0.404/dotnet.dll
WARNING: Unknown ELF core image 0000007F1D950000 /root/dotnet/shared/Microsoft.NETCore.App/6.0.12/System.Diagnostics.Process.dll
WARNING: Unknown ELF core image 0000007F1D9D0000 /root/dotnet/shared/Microsoft.NETCore.App/6.0.12/System.ComponentModel.Primitives.dll

@MichaIng

MichaIng commented Feb 3, 2023

I think it'd need to be export DOTNET_TieredCompilation=0.

I tried that as well, and it doesn't change anything about the segfault.

@mikem8361
Member

mikem8361 commented Feb 3, 2023 via email

@MichaIng

MichaIng commented Feb 3, 2023

dotnet-symbol is broken for arm32 core dumps.

These are both ARM64 core dumps 😉.

@mikem8361
Member

Oops, I'll check whether dotnet-symbol works on arm64 (and arm32 too) 6.0 core dumps.

@mikem8361
Member

The attached core.17 doesn't have any of the modules mapped/added to the dump. It looks like this dump was generated by the system after setting ulimit, but you would also need to set coredump_filter to 0xff to get a full dump: echo 0xff > /proc/self/coredump_filter
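
Combined with the earlier ulimit step, a sketch for collecting a full system-generated dump would be:

# ulimit -c unlimited
# echo 0xff > /proc/self/coredump_filter   # inherited by child processes of this shell
# dotnet --info                            # the core file location is governed by /proc/sys/kernel/core_pattern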

@MichaIng

MichaIng commented Feb 4, 2023

I just followed the instructions given above 😉: #79796 (comment)

New core dump with echo 0xff > /proc/self/coredump_filter executed in addition to ulimit -c unlimited (xz-compressed, since gzipped it is 1 MiB too large to upload here, but named .gz because GitHub does not allow uploading .xz 😄):
core.gz

@draakuns

draakuns commented Feb 4, 2023

Oops, I'll check whether dotnet-symbol works on arm64 (and arm32 too) 6.0 core dumps.

It only fails on ARM64... ARM32 builds at least work flawlessly from an application's POV.

Attaching new core & host_trace, all-in-one.
core.25.gz
host_trace.txt

BR

@mikem8361
Member

Does that dump work with dotnet-symbol? It would be easier for you to try first.

@MichaIng

MichaIng commented Feb 5, 2023

Where do I get dotnet-symbol from?
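
(For reference: it appears to be shipped as a .NET global tool, so presumably something like the following installs it.)

# dotnet tool install --global dotnet-symbol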

@MichaIng

MichaIng commented Feb 5, 2023

Found it. It seems to work, although a few things were not found:

root@NanoPiFire3:~# dotnet-symbol core -d
Downloading from https://msdl.microsoft.com/download/symbols/
Cached: /root/.dotnet/symbolcache/dotnet/elf-buildid-821cc9cc8b1a11a8fba2bc0fa21cb4cec4e482e4/dotnet
Writing: ./dotnet
Cached: /root/.dotnet/symbolcache/_.debug/elf-buildid-sym-821cc9cc8b1a11a8fba2bc0fa21cb4cec4e482e4/_.debug
Writing: ./dotnet.dbg
Cached: /root/.dotnet/symbolcache/system.private.corelib.dll/988911EAb8a200/system.private.corelib.dll
Writing: ./System.Private.CoreLib.dll
Cached: /root/.dotnet/symbolcache/system.private.corelib.pdb/5be6836afd464bd884d6d56cad2a188dFFFFFFFF/system.private.corelib.pdb
Writing: ./System.Private.CoreLib.pdb
ERROR: Not Found: dotnet.dll - 'https://msdl.microsoft.com/download/symbols/dotnet.dll/934DC04A259400/dotnet.dll'
SymbolChecksum: SHA256:c26faece1f87d300c31cb8b9a14499801d0abaae6efa322de8fd8f0a2b99e91a
Testing checksum: SHA256:c26faece1f87d300c31cb8b9a14499801d0abaae6efa322de8fd8f0a2b99e91a
Found checksum match SHA256:c26faece1f87d300c31cb8b9a14499801d0abaae6efa322de8fd8f0a2b99e91a
Cached: /root/.dotnet/symbolcache/dotnet.pdb/ceae6fc2871f40d3831cb8b9a1449980FFFFFFFF/dotnet.pdb
Writing: ./dotnet.pdb
Cached: /root/.dotnet/symbolcache/system.diagnostics.process.dll/9DAF75F476200/system.diagnostics.process.dll
Writing: ./System.Diagnostics.Process.dll
Cached: /root/.dotnet/symbolcache/system.diagnostics.process.pdb/4fbe7c7a113f46789f1584449495f1deFFFFFFFF/system.diagnostics.process.pdb
Writing: ./System.Diagnostics.Process.pdb
Cached: /root/.dotnet/symbolcache/system.componentmodel.primitives.dll/F58689BE42a00/system.componentmodel.primitives.dll
Writing: ./System.ComponentModel.Primitives.dll
Cached: /root/.dotnet/symbolcache/system.componentmodel.primitives.pdb/35f90c9024a741e5b5fbceab9987c032FFFFFFFF/system.componentmodel.primitives.pdb
Writing: ./System.ComponentModel.Primitives.pdb
Cached: /root/.dotnet/symbolcache/microsoft.win32.primitives.dll/AB9414AE33a00/microsoft.win32.primitives.dll
Writing: ./Microsoft.Win32.Primitives.dll
Cached: /root/.dotnet/symbolcache/microsoft.win32.primitives.pdb/7b1b235bb2cd4d3e8c807e199529feb1FFFFFFFF/microsoft.win32.primitives.pdb
Writing: ./Microsoft.Win32.Primitives.pdb
Cached: /root/.dotnet/symbolcache/system.threading.dll/8BAFFA7B33600/system.threading.dll
Writing: ./System.Threading.dll
Cached: /root/.dotnet/symbolcache/system.threading.pdb/d1fa74e8cd6c4cf69611f6fa6f42a78cFFFFFFFF/system.threading.pdb
Writing: ./System.Threading.pdb
Cached: /root/.dotnet/symbolcache/system.memory.dll/D57981265e200/system.memory.dll
Writing: ./System.Memory.dll
Cached: /root/.dotnet/symbolcache/system.memory.pdb/4791093be12048949f4d98604e84165fFFFFFFFF/system.memory.pdb
Writing: ./System.Memory.pdb
ERROR: Not Found: Microsoft.DotNet.Cli.Utils.dll - 'https://msdl.microsoft.com/download/symbols/microsoft.dotnet.cli.utils.dll/E48C460A6a200/microsoft.dotnet.cli.utils.dll'
SymbolChecksum: SHA256:939256a77b4ad5e70ed0fa87ae4a34c6821ee8dc8774f393eb7aa0b5110e70cc
Testing checksum: SHA256:939256a77b4ad5e70ed0fa87ae4a34c6821ee8dc8774f393eb7aa0b5110e70cc
Found checksum match SHA256:939256a77b4ad5e70ed0fa87ae4a34c6821ee8dc8774f393eb7aa0b5110e70cc
Cached: /root/.dotnet/symbolcache/microsoft.dotnet.cli.utils.pdb/a75692934a7b47d58ed0fa87ae4a34c6FFFFFFFF/microsoft.dotnet.cli.utils.pdb
Writing: ./Microsoft.DotNet.Cli.Utils.pdb
Cached: /root/.dotnet/symbolcache/system.text.encoding.codepages.dll/A20BDAE4108800/system.text.encoding.codepages.dll
Writing: ./System.Text.Encoding.CodePages.dll
Cached: /root/.dotnet/symbolcache/system.text.encoding.codepages.pdb/6be87684f6a942e4bdeb220c329b0c18FFFFFFFF/system.text.encoding.codepages.pdb
Writing: ./System.Text.Encoding.CodePages.pdb
Cached: /root/.dotnet/symbolcache/system.collections.dll/FC8C338479400/system.collections.dll
Writing: ./System.Collections.dll
Cached: /root/.dotnet/symbolcache/system.collections.pdb/0ee29025149e4fb299fae344774023dfFFFFFFFF/system.collections.pdb
Writing: ./System.Collections.pdb
ERROR: Not Found: System.CommandLine.dll - 'https://msdl.microsoft.com/download/symbols/system.commandline.dll/D43AC67Fb8e00/system.commandline.dll'
SymbolChecksum: SHA256:ecc744d70d1b0e4d0bfad712a6f91b6d89ada365b698c70ef15f570b2d7a2b92
Testing checksum: SHA256:ecc744d70d1b0e4d0bfad712a6f91b6d89ada365b698c70ef15f570b2d7a2b92
Found checksum match SHA256:ecc744d70d1b0e4d0bfad712a6f91b6d89ada365b698c70ef15f570b2d7a2b92
Cached: /root/.dotnet/symbolcache/system.commandline.pdb/d744c7ec1b0d4d0e8bfad712a6f91b6dFFFFFFFF/system.commandline.pdb
Writing: ./System.CommandLine.pdb
ERROR: Not Found: Microsoft.DotNet.Configurer.dll - 'https://msdl.microsoft.com/download/symbols/microsoft.dotnet.configurer.dll/93970DDA39200/microsoft.dotnet.configurer.dll'
SymbolChecksum: SHA256:69cb78eaed23604ef2b6e9da180644ef3f7667a0b651465ac8e762efef084a4f
Testing checksum: SHA256:69cb78eaed23604ef2b6e9da180644ef3f7667a0b651465ac8e762efef084a4f
Found checksum match SHA256:69cb78eaed23604ef2b6e9da180644ef3f7667a0b651465ac8e762efef084a4f
Cached: /root/.dotnet/symbolcache/microsoft.dotnet.configurer.pdb/ea78cb6923ed4e60b2b6e9da180644efFFFFFFFF/microsoft.dotnet.configurer.pdb
Writing: ./Microsoft.DotNet.Configurer.pdb
ERROR: Not Found: Microsoft.DotNet.InternalAbstractions.dll - 'https://msdl.microsoft.com/download/symbols/microsoft.dotnet.internalabstractions.dll/E602A83036c00/microsoft.dotnet.internalabstractions.dll'
SymbolChecksum: SHA256:a7ee305a8b7b7876e7af6a9340159d97e2db550ab26d6bcf259d9e3dd732aa39
Testing checksum: SHA256:a7ee305a8b7b7876e7af6a9340159d97e2db550ab26d6bcf259d9e3dd732aa39
Found checksum match SHA256:a7ee305a8b7b7876e7af6a9340159d97e2db550ab26d6bcf259d9e3dd732aa39
Cached: /root/.dotnet/symbolcache/microsoft.dotnet.internalabstractions.pdb/5a30eea77b8b4678a7af6a9340159d97FFFFFFFF/microsoft.dotnet.internalabstractions.pdb
Writing: ./Microsoft.DotNet.InternalAbstractions.pdb
Cached: /root/.dotnet/symbolcache/system.componentmodel.dll/94B261A032200/system.componentmodel.dll
Writing: ./System.ComponentModel.dll
Cached: /root/.dotnet/symbolcache/system.componentmodel.pdb/d685a39cca4e4b00af44c026222c6295FFFFFFFF/system.componentmodel.pdb
Writing: ./System.ComponentModel.pdb
Cached: /root/.dotnet/symbolcache/system.collections.concurrent.dll/C9AB477C60400/system.collections.concurrent.dll
Writing: ./System.Collections.Concurrent.dll
Cached: /root/.dotnet/symbolcache/system.collections.concurrent.pdb/1dac7fbf629e4f369998cc6f7b49e389FFFFFFFF/system.collections.concurrent.pdb
Writing: ./System.Collections.Concurrent.pdb
Cached: /root/.dotnet/symbolcache/system.linq.dll/DDDE1C50c4200/system.linq.dll
Writing: ./System.Linq.dll
Cached: /root/.dotnet/symbolcache/system.linq.pdb/222ca97c008e4ffc8d30d3b272d2554dFFFFFFFF/system.linq.pdb
Writing: ./System.Linq.pdb
Cached: /root/.dotnet/symbolcache/netstandard.dll/91757F0F1c000/netstandard.dll
Writing: ./netstandard.dll
ERROR: Not Found: libicui18n.so.67.1 - 'https://msdl.microsoft.com/download/symbols/libicui18n.so.67.1/elf-buildid-e5ccce9dd17d61c5de04a301120546bc04ad7003/libicui18n.so.67.1'
ERROR: Not Found: libicui18n.so.67.1.dbg - 'https://msdl.microsoft.com/download/symbols/_.debug/elf-buildid-sym-e5ccce9dd17d61c5de04a301120546bc04ad7003/_.debug'
ERROR: Not Found: libicudata.so.67.1 - 'https://msdl.microsoft.com/download/symbols/libicudata.so.67.1/elf-buildid-fe67d3742ed9536221811beb4c9a6d3e9731a281/libicudata.so.67.1'
ERROR: Not Found: 67d3742ed9536221811beb4c9a6d3e9731a281.debug - 'https://msdl.microsoft.com/download/symbols/_.debug/elf-buildid-sym-fe67d3742ed9536221811beb4c9a6d3e9731a281/_.debug'
ERROR: Not Found: libicuuc.so.67.1 - 'https://msdl.microsoft.com/download/symbols/libicuuc.so.67.1/elf-buildid-6779b15a0f1553b0fb31fa86c906da6b98ad4366/libicuuc.so.67.1'
ERROR: Not Found: libicuuc.so.67.1.dbg - 'https://msdl.microsoft.com/download/symbols/_.debug/elf-buildid-sym-6779b15a0f1553b0fb31fa86c906da6b98ad4366/_.debug'
Cached: /root/.dotnet/symbolcache/libsystem.native.so/elf-buildid-5fb960cc3cb7b660303b53bdc0e4e7085bb50a55/libsystem.native.so
Writing: ./libSystem.Native.so
Cached: /root/.dotnet/symbolcache/_.debug/elf-buildid-sym-5fb960cc3cb7b660303b53bdc0e4e7085bb50a55/_.debug
Writing: ./libSystem.Native.so.dbg
Cached: /root/.dotnet/symbolcache/system.runtime.dll/E4B413C8e000/system.runtime.dll
Writing: ./System.Runtime.dll
ERROR: Reading PDB records for /root/dotnet/shared/Microsoft.NETCore.App/6.0.12/System.Runtime.dll: Virtual address range is not mapped 0000007FB18B7168 4
Cached: /root/.dotnet/symbolcache/libclrjit.so/elf-buildid-7d3b21c32ffacd082995ce156780ecc45ad8daf0/libclrjit.so
Writing: ./libclrjit.so
Cached: /root/.dotnet/symbolcache/_.debug/elf-buildid-sym-7d3b21c32ffacd082995ce156780ecc45ad8daf0/_.debug
Writing: ./libclrjit.so.dbg
ERROR: Not Found: librt-2.31.so - 'https://msdl.microsoft.com/download/symbols/librt-2.31.so/elf-buildid-774ba134055bb1a66a0d8edc55f1226673c52ac1/librt-2.31.so'
ERROR: Not Found: 4ba134055bb1a66a0d8edc55f1226673c52ac1.debug - 'https://msdl.microsoft.com/download/symbols/_.debug/elf-buildid-sym-774ba134055bb1a66a0d8edc55f1226673c52ac1/_.debug'
Cached: /root/.dotnet/symbolcache/libcoreclr.so/elf-buildid-4bdc74db9e544fcb19be983630a541426b5003ad/libcoreclr.so
Writing: ./libcoreclr.so
Cached: /root/.dotnet/symbolcache/_.debug/elf-buildid-sym-4bdc74db9e544fcb19be983630a541426b5003ad/_.debug
Writing: ./libcoreclr.so.dbg
Cached: /root/.dotnet/symbolcache/libmscordaccore.so/elf-buildid-coreclr-4bdc74db9e544fcb19be983630a541426b5003ad/libmscordaccore.so
Writing: ./libmscordaccore.so
Cached: /root/.dotnet/symbolcache/libmscordbi.so/elf-buildid-coreclr-4bdc74db9e544fcb19be983630a541426b5003ad/libmscordbi.so
Writing: ./libmscordbi.so
Cached: /root/.dotnet/symbolcache/mscordaccore.dll/elf-buildid-coreclr-4bdc74db9e544fcb19be983630a541426b5003ad/mscordaccore.dll
Writing: ./mscordaccore.dll
Cached: /root/.dotnet/symbolcache/mscordbi.dll/elf-buildid-coreclr-4bdc74db9e544fcb19be983630a541426b5003ad/mscordbi.dll
Writing: ./mscordbi.dll
ERROR: Not Found: libsos.so - 'https://msdl.microsoft.com/download/symbols/libsos.so/elf-buildid-coreclr-4bdc74db9e544fcb19be983630a541426b5003ad/libsos.so'
ERROR: Not Found: SOS.NETCore.dll - 'https://msdl.microsoft.com/download/symbols/sos.netcore.dll/elf-buildid-coreclr-4bdc74db9e544fcb19be983630a541426b5003ad/sos.netcore.dll'
Cached: /root/.dotnet/symbolcache/libhostpolicy.so/elf-buildid-0117406b28d6d74689360905cc33d6632ffeb6c9/libhostpolicy.so
Writing: ./libhostpolicy.so
Cached: /root/.dotnet/symbolcache/_.debug/elf-buildid-sym-0117406b28d6d74689360905cc33d6632ffeb6c9/_.debug
Writing: ./libhostpolicy.so.dbg
Cached: /root/.dotnet/symbolcache/libhostfxr.so/elf-buildid-a1efdb27f97e2788652f1ee16fa54eecf540de7a/libhostfxr.so
Writing: ./libhostfxr.so
Cached: /root/.dotnet/symbolcache/_.debug/elf-buildid-sym-a1efdb27f97e2788652f1ee16fa54eecf540de7a/_.debug
Writing: ./libhostfxr.so.dbg
ERROR: Not Found: libc-2.31.so - 'https://msdl.microsoft.com/download/symbols/libc-2.31.so/elf-buildid-7689b99bf17371e7f441bdc217f4a0d17f449dd9/libc-2.31.so'
ERROR: Not Found: 89b99bf17371e7f441bdc217f4a0d17f449dd9.debug - 'https://msdl.microsoft.com/download/symbols/_.debug/elf-buildid-sym-7689b99bf17371e7f441bdc217f4a0d17f449dd9/_.debug'
ERROR: Not Found: libgcc_s.so.1 - 'https://msdl.microsoft.com/download/symbols/libgcc_s.so.1/elf-buildid-7a631ea0164655f48e53fe1b1e0727312e257b39/libgcc_s.so.1'
ERROR: Not Found: 631ea0164655f48e53fe1b1e0727312e257b39.debug - 'https://msdl.microsoft.com/download/symbols/_.debug/elf-buildid-sym-7a631ea0164655f48e53fe1b1e0727312e257b39/_.debug'
ERROR: Not Found: libm-2.31.so - 'https://msdl.microsoft.com/download/symbols/libm-2.31.so/elf-buildid-ae78fc1be92ef39904b5b19fb2e5dcf5292bc879/libm-2.31.so'
ERROR: Not Found: 78fc1be92ef39904b5b19fb2e5dcf5292bc879.debug - 'https://msdl.microsoft.com/download/symbols/_.debug/elf-buildid-sym-ae78fc1be92ef39904b5b19fb2e5dcf5292bc879/_.debug'
ERROR: Not Found: libstdc++.so.6.0.28 - 'https://msdl.microsoft.com/download/symbols/libstdc%2B%2B.so.6.0.28/elf-buildid-3ef32ad83b058f02cf374e5f892f0587814e4297/libstdc%2B%2B.so.6.0.28'
ERROR: Not Found: f32ad83b058f02cf374e5f892f0587814e4297.debug - 'https://msdl.microsoft.com/download/symbols/_.debug/elf-buildid-sym-3ef32ad83b058f02cf374e5f892f0587814e4297/_.debug'
ERROR: Not Found: libdl-2.31.so - 'https://msdl.microsoft.com/download/symbols/libdl-2.31.so/elf-buildid-20b17e8019b2e358363fd2f57808c7d63d3b76fd/libdl-2.31.so'
ERROR: Not Found: b17e8019b2e358363fd2f57808c7d63d3b76fd.debug - 'https://msdl.microsoft.com/download/symbols/_.debug/elf-buildid-sym-20b17e8019b2e358363fd2f57808c7d63d3b76fd/_.debug'
ERROR: Not Found: libpthread-2.31.so - 'https://msdl.microsoft.com/download/symbols/libpthread-2.31.so/elf-buildid-50698f111196e410c367ad6031b3cb34c4dfc1e4/libpthread-2.31.so'
ERROR: Not Found: 698f111196e410c367ad6031b3cb34c4dfc1e4.debug - 'https://msdl.microsoft.com/download/symbols/_.debug/elf-buildid-sym-50698f111196e410c367ad6031b3cb34c4dfc1e4/_.debug'
ELF .gnu_debuglink section in /usr/lib/aarch64-linux-gnu/ld-2.31.so: Virtual address range is not mapped 0000007FB56021B0 4
ERROR: Not Found: ld-2.31.so - 'https://msdl.microsoft.com/download/symbols/ld-2.31.so/elf-buildid-ae4da2e572cddd8f08179a9bcda3dc37a146e2eb/ld-2.31.so'
ERROR: Not Found: ld-2.31.so.dbg - 'https://msdl.microsoft.com/download/symbols/_.debug/elf-buildid-sym-ae4da2e572cddd8f08179a9bcda3dc37a146e2eb/_.debug'
Cached: /root/.dotnet/symbolcache/system.diagnostics.tracing.dll/E7400D8F8000/system.diagnostics.tracing.dll
Writing: ./System.Diagnostics.Tracing.dll

Shall I provide all/any of the generated files?

@mikem8361
Member

dotnet-symbol looks like it is working fine now that you have a full core dump, which is what I wanted to address.

@draakuns

draakuns commented Feb 8, 2023

Does that dump work with dotnet-symbol? It would be easier for you to try first.

I am using the official Microsoft .NET Docker image mcr.microsoft.com/dotnet/sdk:6.0, so I suppose the dotnet symbols should be OK... but I will check whenever I find a time slot.

BR

@ghost ghost locked as resolved and limited conversation to collaborators Mar 11, 2023