
GenevaExporter uses Expression.Lambda from System.Linq.Expressions - AOT size and performance concerns #1814

Closed · eerhardt opened this issue on May 17, 2024 · 4 comments · Fixed by open-telemetry/opentelemetry-dotnet#5642 or #2294
Labels: comp:exporter.geneva (Things related to OpenTelemetry.Exporter.Geneva), enhancement (New feature or request)

eerhardt (Contributor) commented on May 17, 2024

When using the GenevaExporter in a native AOT application, it brings in System.Linq.Expressions, which adds size to the app; moreover, any expression trees are interpreted at runtime, since IL can't be generated under AOT.

See the following:

> @eerhardt GenevaExporter also uses `Expression.Lambda` from [System.Linq.Expressions](https://learn.microsoft.com/en-us/dotnet/api/system.linq.expressions?view=net-8.0). Did we not have a Trim/AOT issue when using that?

```csharp
static ReentrantExportProcessor()
{
    // Look up the internal Batch<T>(T item) constructor via reflection...
    var flags = BindingFlags.Instance | BindingFlags.NonPublic;
    var ctor = typeof(Batch<T>).GetConstructor(flags, null, new Type[] { typeof(T) }, null);

    // ...and build a compiled delegate for it using System.Linq.Expressions.
    var value = Expression.Parameter(typeof(T), null);
    var lambda = Expression.Lambda<Func<T, Batch<T>>>(Expression.New(ctor, value), value);
    CreateBatch = lambda.Compile();
}
```

Originally posted by @utpilla in #1666 (comment)

We should decide how to fix this. Options I see are:

  1. Make the `Batch<T>(T item)` ctor public, and GenevaExporter can just call it.
  2. Use reflection directly when `RuntimeFeature.IsDynamicCodeCompiled` is false (see the sketch after this list).
  3. Address this comment in the code and make this class go away:

```csharp
// This export processor exports without synchronization.
// Once OpenTelemetry .NET officially support this,
// we can get rid of this class.
// This is currently only used in ETW export, where we know
// that the underlying system is safe under concurrent calls.
internal class ReentrantExportProcessor<T> : BaseExportProcessor<T>
```
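
For option 2, here is a minimal sketch of what the static constructor could look like: keep the compiled-lambda fast path on JIT runtimes and fall back to plain reflection invocation under native AOT. This is illustrative only, not a proposed final implementation (`RuntimeFeature` is `System.Runtime.CompilerServices.RuntimeFeature`).

```csharp
static ReentrantExportProcessor()
{
    var flags = BindingFlags.Instance | BindingFlags.NonPublic;
    var ctor = typeof(Batch<T>).GetConstructor(flags, null, new Type[] { typeof(T) }, null);

    if (!RuntimeFeature.IsDynamicCodeCompiled)
    {
        // Native AOT: IL can't be generated, so a compiled lambda would only be
        // interpreted anyway. Plain ConstructorInfo.Invoke avoids using
        // System.Linq.Expressions on this path.
        CreateBatch = item => (Batch<T>)ctor.Invoke(new object[] { item });
    }
    else
    {
        // JIT: keep the existing compiled-lambda fast path.
        var value = Expression.Parameter(typeof(T), null);
        var lambda = Expression.Lambda<Func<T, Batch<T>>>(Expression.New(ctor, value), value);
        CreateBatch = lambda.Compile();
    }
}
```

Whether this guard alone lets the trimmer drop System.Linq.Expressions from the AOT build would still need to be verified.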

cc @utpilla @CodeBlanch

utpilla added the comp:exporter.geneva label on May 17, 2024
utpilla (Contributor) commented on May 17, 2024

Another approach would be to use the public ctor of `Batch<T>` that takes an array `T[]` and a count. We could keep a thread-local array of size 1 and use it every time we need to create a `Batch` (a rough sketch follows).
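
A rough sketch of that idea inside ReentrantExportProcessor<T>, assuming the public `Batch<T>(T[] items, int count)` constructor; the field and method names here are illustrative:

```csharp
// Illustrative only: reuse a one-element, thread-local buffer so that no
// reflection or expression trees are needed to construct the Batch<T>.
[ThreadStatic]
private static T[] singleItemBuffer;

private static Batch<T> CreateBatch(T item)
{
    var buffer = singleItemBuffer ??= new T[1];
    buffer[0] = item;
    return new Batch<T>(buffer, 1); // public Batch(T[] items, int count) ctor
}
```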

However, the best thing would be to just mark the Batch<T>(T item) ctor public.

@CodeBlanch Thoughts?

CodeBlanch (Member) commented:

@eerhardt How urgent is this? If we go the route of making the ctor public in SDK it will take a bit of time to get out into a stable release so GenevaExporter can consume it.

eerhardt (Contributor, Author) commented:

> How urgent is this?

Not very urgent. Just something that would be good to get fixed. Doing whatever is best long-term would be great. I'm not necessarily looking for a short-cut.

> We could have a thread-local array of size 1 and use that every time we need to create a Batch.

I'm not super familiar with the code, so my concern might not be valid, but since `Batch<T>` holds directly onto the passed-in array, we would need to ensure the `Batch<T>` instance is never passed to another thread. If it was, it could contain the wrong data.

Dreamescaper commented on Sep 23, 2024

On .NET 8 it is possible to use `ConstructorInvoker`:

```csharp
var flags = BindingFlags.Instance | BindingFlags.NonPublic;
var ctor = typeof(Batch<T>).GetConstructor(flags, null, new Type[] { typeof(T) }, null);

// ConstructorInvoker (new in .NET 8) caches an efficient invoker for the
// non-public ctor without needing System.Linq.Expressions.
var invoker = ConstructorInvoker.Create(ctor);
CreateBatch = t => (Batch<T>)invoker.Invoke(t);
```
