opentelemetry-dotnet-instrumentation
Method not found when using `opentelemetry-dotnet-instrumentation`
Bug Report
Symptom
Describe the bug
After installing the OpenTelemetry .NET Instrumentation module on a functioning Windows EC2 instance (using the PowerShell module from the latest GitHub release) and restarting IIS (iisreset), our previously working ASP.NET application throws a MethodNotFoundException when handling an API request.
Method not found: 'Void DSI.POS.Application.TicketManagement.TicketCommands.ScheduleTicketCommand..ctor(...)'
The API call completes successfully before installing the OpenTelemetry module. The error only appears after installing the module and restarting IIS.
Expected behavior
The API call should continue to work as it did before instrumentation was installed, without runtime errors like MethodNotFoundException.
Screenshots N/A
Runtime environment (please complete the following information):
- OpenTelemetry Automatic Instrumentation version: v1.11.0
- OS: Windows Server 2025
- .NET version: 4.8
Additional context
The method that fails is a constructor in a custom domain library. This error suggests a mismatch in method signatures or potential conflicts introduced by instrumentation. The method exists and works prior to installing OpenTelemetry.
Reproduce
Steps to reproduce the behavior:
- Launch a Windows Server 2025 EC2 instance and deploy a working ASP.NET Web API application.
- Verify that the API endpoint works by making a successful request.
- Run the following PowerShell to install the OpenTelemetry module:
$ErrorActionPreference = 'Stop'
$ConfirmPreference = 'None'
$headers = @{ 'User-Agent' = 'PackerScript' }
$latestRelease = Invoke-RestMethod -Uri 'https://api.github.com/repos/open-telemetry/opentelemetry-dotnet-instrumentation/releases/latest' -Headers $headers
$asset = $latestRelease.assets | Where-Object { $_.name -like '*.psm1' }
if (-not $asset) { throw "Could not find a suitable OpenTelemetry module in the latest release." }
$module_url = $asset.browser_download_url
$download_path = Join-Path $env:temp $asset.name
Invoke-WebRequest -Uri $module_url -OutFile $download_path -UseBasicParsing
Import-Module $download_path
Install-OpenTelemetryCore
- Run iisreset
- Call the same API endpoint again
- Observe that the application now throws a MethodNotFoundException
I also tried this:
$module_url = "https://github.com/open-telemetry/opentelemetry-dotnet-instrumentation/releases/download/v1.11.0/OpenTelemetry.DotNet.Auto.psm1"
$download_path = Join-Path $env:temp "OpenTelemetry.DotNet.Auto.psm1"
Invoke-WebRequest -Uri $module_url -OutFile $download_path -UseBasicParsing
Import-Module $download_path
# Install binaries (downloads latest to default dir)
Install-OpenTelemetryCore
# Register IIS for current machine (DefaultAppPool and all sites)
Register-OpenTelemetryForIIS
Same result; when I execute it, I get the following:
"Method not found: 'Void DSI.POS.Application.TicketManagement.TicketCommands.ScheduleTicketCommand..ctor(DSI.POS.Application.DTO.Ticket, System.Collections.Generic.List`1<System.DateTime>, Boolean, DSI.Core.ActionDialogList, Boolean, Boolean, Boolean, Boolean, System.Collections.Generic.List`1<System.ValueTuple`3<Int32,System.String,Int32>>, Boolean, Boolean, DSI.POS.Domain.Model.Settings.AllTicketSettings, DSI.POS.Domain.Model.Settings.AllTaxSettings, DSI.POS.Application.DTO.TicketClient, System.Collections.Generic.List`1<DSI.POS.Domain.Model.Tickets.CreatedBy>, System.Collections.Generic.List`1<DSI.POS.Domain.Model.Tickets.Confirmation>, System.Collections.Generic.List`1<DSI.POS.Domain.Model.Tickets.Referral>)'."
Running:
$module_url = "https://github.com/open-telemetry/opentelemetry-dotnet-instrumentation/releases/download/v1.11.0/OpenTelemetry.DotNet.Auto.psm1"
$download_path = Join-Path $env:temp "OpenTelemetry.DotNet.Auto.psm1"
Invoke-WebRequest -Uri $module_url -OutFile $download_path -UseBasicParsing
Import-Module $download_path
# Unregister IIS for the current machine and uninstall the binaries
Unregister-OpenTelemetryForIIS
Uninstall-OpenTelemetryCore
fixes it again
@Shereef could you compare your dependencies with https://github.com/open-telemetry/opentelemetry-dotnet-instrumentation/blob/v1.11.0/src/Directory.Packages.props#L48
This section lists all dependencies that auto instrumentation needs and generates redirections. My best guess currently is that some version in your application is newer and it might downgrade something. The more these versions are in sync, the less chances are that there will be a conflict.
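To make that comparison easier, here is one way to dump the assembly versions an application actually ships, so they can be checked line by line against the versions pinned in Directory.Packages.props. This is just a sketch; the bin path is a hypothetical example and should be adjusted to the site's physical path:

```powershell
# Hypothetical path to the deployed site's bin folder - adjust as needed.
$binPath = 'C:\inetpub\wwwroot\posapi\bin'

Get-ChildItem -Path $binPath -Filter '*.dll' | ForEach-Object {
    try {
        # Read the assembly identity without loading the assembly into this process.
        $name = [System.Reflection.AssemblyName]::GetAssemblyName($_.FullName)
        [PSCustomObject]@{ Name = $name.Name; Version = $name.Version }
    } catch {
        # Skip native/unmanaged DLLs that have no managed assembly manifest.
    }
} | Sort-Object Name | Format-Table -AutoSize
```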
I compared them, and yes, we have many outdated dependencies, but I am wondering how this interaction happens.
So we run Install-OpenTelemetryCore and then it runs within the server.
Where do its dependencies collide with ours?
How is it being injected into our project?
I am asking so that I can understand more and try to fix it another way, rather than going down the rabbit hole of updating so many dependencies.
https://github.com/open-telemetry/opentelemetry-dotnet-instrumentation/blob/main/docs/design.md provides a description about how auto instrumentation works. It is not a separate process that runs on your machine, and it behaves more like a library that is automatically loaded into your application, and it runs within your application. Auto instrumentation is based on the OpenTelemetry SDK, and as a result it needs to ensure that all of the dependencies that are required by the OpenTelemetry SDK are satisfied. It handles some of this in .net framework applications by bringing with it the required dependencies and attempting to use something similar to assembly binding redirects to ensure that the correct version is used in the application.
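For comparison, this is what an explicit binding redirect looks like in a .NET Framework web.config; auto-instrumentation achieves a similar effect at runtime without editing your config. The assembly and version numbers below are purely illustrative, not what auto-instrumentation actually emits:

```xml
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <!-- Illustrative example: redirect all older versions to the one actually deployed. -->
        <assemblyIdentity name="System.Diagnostics.DiagnosticSource"
                          publicKeyToken="cc7b13ffcd2ddd51" culture="neutral" />
        <bindingRedirect oldVersion="0.0.0.0-9.0.0.0" newVersion="9.0.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
```

If the redirect picks a version with a different API surface than the one your code was compiled against, member lookups can fail at runtime even though the assembly itself loads.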
There might also be something in the native log files that can help with better understanding what is happening when that error occurs. You can see https://github.com/open-telemetry/opentelemetry-dotnet-instrumentation/blob/main/docs/troubleshooting.md#enable-detailed-logging for instructions on how to enable more detailed logging for troubleshooting purposes.
Sadly installing OpenTelemetry.AutoInstrumentation didn't help. I installed it on the main project that runs.
Also,
COREHOST_TRACE=1
COREHOST_TRACEFILE=corehost_verbose_tracing.log
do not work on 4.8 but work on .NET 8. I think 4.8 breaks well before it can write to that file.
I also tried manually going through and updating the dependencies to match https://github.com/open-telemetry/opentelemetry-dotnet-instrumentation/blob/v1.11.0/src/Directory.Packages.props#L48, but sadly it looks like months of work, which can't be done.
How do we find out what is breaking it?
Were you able to find any log files generated by auto-instrumentation? Those log files are necessary in order to help us see what auto-instrumentation was doing before the error.
I also think that the next best debugging step will be to try to create a Minimal, Reproducible Example that can be shared in this issue. Even with the log files, we might have a really hard time reproducing this issue.
https://github.com/open-telemetry/opentelemetry-dotnet-instrumentation/blob/main/docs/config.md#additional-settings
You can try to disable OTEL_DOTNET_AUTO_NETFX_REDIRECT_ENABLED; that will reveal all conflicting libraries (sadly, also those which it can otherwise fix successfully).
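For anyone trying this: one way to set the flag machine-wide so IIS worker processes pick it up is sketched below. This assumes an elevated PowerShell session; the docs use true/false values, though 0 appears to work as well:

```powershell
# Disable the .NET Framework assembly redirection feature machine-wide so that
# any remaining version conflicts surface as visible load failures.
[Environment]::SetEnvironmentVariable('OTEL_DOTNET_AUTO_NETFX_REDIRECT_ENABLED', 'false', 'Machine')

# Restart IIS so the worker processes inherit the new environment variable.
iisreset
```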
> Sadly installing OpenTelemetry.AutoInstrumentation didn't help

This package is for .NET (Core) only. .NET Framework (netfx) uses redirections instead.
I am working on this today; I will try again to find the log files and will try OTEL_DOTNET_AUTO_NETFX_REDIRECT_ENABLED=0.
Here are my current relevant env vars:
CORECLR_NEWRELIC_HOME='C:\ProgramData\New Relic\.NET Agent\'
CORECLR_PROFILER='{36032161-FFC0-4B61-B559-F6C5D41BAE5A}'
COREHOST_TRACE=1
COREHOST_TRACEFILE='c:\logs\corehost_verbose_tracing.log'
OTEL_DOTNET_AUTO_INSTALL_DIR='C:\Program Files\OpenTelemetry .NET AutoInstrumentation'
OTEL_DOTNET_AUTO_NETFX_REDIRECT_ENABLED=0
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
OTEL_RESOURCE_ATTRIBUTES=service.namespace=Beauty,deployment.environment=stage,service.instance.id=IP-0A6982C4,service.version=v1.113.0-24
OTEL_SERVICE_NAME=posapicloud
When making an API call it still shows the same message, even after restarting IIS so the new env vars take effect:
{"StatusCode":400,"StatusMessage":"Method not found: 'Void DSI.POS.Application.TicketManagement.TicketCommands.ScheduleTicketCommand..ctor(DSI.POS.Application.DTO.Ticket, System.Collections.Generic.List`1<System.DateTime>, Boolean, DSI.Core.ActionDialogList, Boolean, Boolean, Boolean, Boolean, System.Collections.Generic.List`1<System.ValueTuple`3<Int32,System.String,Int32>>, Boolean, Boolean, DSI.POS.Domain.Model.Settings.AllTicketSettings, DSI.POS.Domain.Model.Settings.AllTaxSettings, DSI.POS.Application.DTO.TicketClient, System.Collections.Generic.List`1<DSI.POS.Domain.Model.Tickets.CreatedBy>, System.Collections.Generic.List`1<DSI.POS.Domain.Model.Tickets.Confirmation>, System.Collections.Generic.List`1<DSI.POS.Domain.Model.Tickets.Referral>)'."}
Also, there are no logs in C:\logs, although the folder exists (it contains other logs).
I have the same config on a .NET 8 app, and there the log file appears at C:\logs\corehost_verbose_tracing.net8.log.
Windows Event Viewer has these two warnings:
Log Name: Application
Source: Microsoft-Windows-AppModel-State
Date: 6/18/2025 11:30:43 AM
Event ID: 24
Task Category: None
Level: Warning
Keywords:
User: IP-0A6982C4\Administrator
Computer: ip-0A6982C4
Description:
Repair of state locations for operation InitializeDataChangedSignaler against package MicrosoftWindows.Client.Core_cw5n1h2txyewy with error -2147024894 returned Error Code: 0
Event Xml:
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
<System>
<Provider Name="Microsoft-Windows-AppModel-State" Guid="{bff15e13-81bf-45ee-8b16-7cfead00da86}" />
<EventID>24</EventID>
<Version>0</Version>
<Level>3</Level>
<Task>0</Task>
<Opcode>0</Opcode>
<Keywords>0x8000000000000000</Keywords>
<TimeCreated SystemTime="2025-06-18T11:30:43.8849182Z" />
<EventRecordID>4325</EventRecordID>
<Correlation />
<Execution ProcessID="1648" ThreadID="940" />
<Channel>Application</Channel>
<Computer>ip-0A6982C4</Computer>
<Security UserID="S-1-5-21-1535322380-4254010328-1966328435-500" />
</System>
<EventData>
<Data Name="Operation">InitializeDataChangedSignaler</Data>
<Data Name="PackageFamily">MicrosoftWindows.Client.Core_cw5n1h2txyewy</Data>
<Data Name="OperationError">-2147024894</Data>
<Data Name="RepairTriggerError">0</Data>
</EventData>
</Event>
Log Name: Application
Source: Microsoft-Windows-AppModel-State
Date: 6/18/2025 11:30:43 AM
Event ID: 23
Task Category: None
Level: Warning
Keywords:
User: IP-0A6982C4\Administrator
Computer: ip-0A6982C4
Description:
Triggered repair of state locations because operation InitializeDataChangedSignaler against package MicrosoftWindows.Client.Core_cw5n1h2txyewy hit error -2147024894.
Event Xml:
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
<System>
<Provider Name="Microsoft-Windows-AppModel-State" Guid="{bff15e13-81bf-45ee-8b16-7cfead00da86}" />
<EventID>23</EventID>
<Version>0</Version>
<Level>3</Level>
<Task>0</Task>
<Opcode>0</Opcode>
<Keywords>0x8000000000000000</Keywords>
<TimeCreated SystemTime="2025-06-18T11:30:43.7348286Z" />
<EventRecordID>4324</EventRecordID>
<Correlation />
<Execution ProcessID="1648" ThreadID="940" />
<Channel>Application</Channel>
<Computer>ip-0A6982C4</Computer>
<Security UserID="S-1-5-21-1535322380-4254010328-1966328435-500" />
</System>
<EventData>
<Data Name="Operation">InitializeDataChangedSignaler</Data>
<Data Name="PackageFamily">MicrosoftWindows.Client.Core_cw5n1h2txyewy</Data>
<Data Name="OperationError">-2147024894</Data>
</EventData>
</Event>
When I restart IIS I see this informational event in Event Viewer:
Log Name: Application
Source: .NET Runtime
Date: 6/18/2025 11:36:14 AM
Event ID: 1022
Task Category: None
Level: Information
Keywords: Classic
User: IIS APPPOOL\DefaultAppPool
Computer: ip-0A6982C4
Description:
.NET Runtime version 4.0.30319.0 - The profiler was loaded successfully. Profiler CLSID: '{918728DD-259F-4A6A-AC2B-B85E1B658318}'. Process ID (decimal): 9024. Message ID: [0x2507].
Event Xml:
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
<System>
<Provider Name=".NET Runtime" />
<EventID Qualifiers="0">1022</EventID>
<Version>0</Version>
<Level>4</Level>
<Task>0</Task>
<Opcode>0</Opcode>
<Keywords>0x80000000000000</Keywords>
<TimeCreated SystemTime="2025-06-18T11:36:14.3274897Z" />
<EventRecordID>4327</EventRecordID>
<Correlation />
<Execution ProcessID="9024" ThreadID="0" />
<Channel>Application</Channel>
<Computer>ip-0A6982C4</Computer>
<Security UserID="S-1-5-82-3006700770-424185619-1745488364-794895919-4004696415" />
</System>
<EventData>
<Data>.NET Runtime version 4.0.30319.0 - The profiler was loaded successfully. Profiler CLSID: '{918728DD-259F-4A6A-AC2B-B85E1B658318}'. Process ID (decimal): 9024. Message ID: [0x2507].</Data>
</EventData>
</Event>
Also, still no logs in C:\logs. Current environment variables:
CORECLR_NEWRELIC_HOME='C:\ProgramData\New Relic\.NET Agent\'
CORECLR_PROFILER='{36032161-FFC0-4B61-B559-F6C5D41BAE5A}'
COREHOST_TRACE=1
COREHOST_TRACEFILE='c:\logs\corehost_verbose_tracing.log'
OTEL_DOTNET_AUTO_INSTALL_DIR='C:\Program Files\OpenTelemetry .NET AutoInstrumentation'
OTEL_DOTNET_AUTO_NETFX_REDIRECT_ENABLED=0
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
OTEL_RESOURCE_ATTRIBUTES=service.namespace=Beauty,deployment.environment=stage,service.instance.id=IP-0A6982C4,service.version=v1.113.0-24
OTEL_SERVICE_NAME=posapicloud
@Shereef the auto-instrumentation logs are not written to C:\logs; look at the default path, or override it using OTEL_DOTNET_AUTO_LOG_DIRECTORY.
https://github.com/open-telemetry/opentelemetry-dotnet-instrumentation/blob/main/docs/config.md#internal-logs
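As a concrete example (the default path below is taken from the troubleshooting docs; verify it on your machine), checking the default log location on Windows and redirecting it might look like:

```powershell
# Default log directory on Windows (per the troubleshooting docs).
Get-ChildItem "$env:ProgramData\OpenTelemetry .NET AutoInstrumentation\logs"

# Or redirect the auto-instrumentation logs to a custom folder, then run iisreset
# so the IIS worker processes pick up the new variable.
[Environment]::SetEnvironmentVariable('OTEL_DOTNET_AUTO_LOG_DIRECTORY', 'C:\logs\otel', 'Machine')
```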
otel-dotnet-auto-5628-LM-W3SVC-1-ROOT-1-133940163781895393-Loader-20250610.log otel-dotnet-auto-912-w3wp-Native.log otel-dotnet-auto-912-LM-W3SVC-1-ROOT-1-133940037810444804-Managed-20250610.log otel-dotnet-auto-912-LM-W3SVC-1-ROOT-1-133940037810444804-Loader-20250610.log otel-dotnet-auto-5656-w3wp-Native.log otel-dotnet-auto-5656-LM-W3SVC-1-ROOT-1-133940034346683622-Managed-20250610.log otel-dotnet-auto-5656-LM-W3SVC-1-ROOT-1-133940034346683622-Loader-20250610.log otel-dotnet-auto-3736-w3wp-Native.log otel-dotnet-auto-3736-LM-W3SVC-1-ROOT-1-133940031092587445-Managed-20250610.log otel-dotnet-auto-3736-LM-W3SVC-1-ROOT-1-133940031092587445-Loader-20250610.log otel-dotnet-auto-5052-LM-W3SVC-1-ROOT-1-133947398650990805-Managed-20250618.log otel-dotnet-auto-5052-w3wp-Native.log otel-dotnet-auto-5052-LM-W3SVC-1-ROOT-1-133947398650990805-Loader-20250618.log otel-dotnet-auto-4964-w3wp-Native.log otel-dotnet-auto-4964-LM-W3SVC-1-ROOT-1-133947397497514473-Managed-20250618.log otel-dotnet-auto-4964-LM-W3SVC-1-ROOT-1-133947397497514473-Loader-20250618.log otel-dotnet-auto-9024-LM-W3SVC-1-ROOT-1-133947201748033550-Managed-20250618.log otel-dotnet-auto-9024-w3wp-Native.log otel-dotnet-auto-9024-LM-W3SVC-1-ROOT-1-133947201748033550-Loader-20250618.log otel-dotnet-auto-4276-LM-W3SVC-1-ROOT-1-133947196853144144-Managed-20250618.log otel-dotnet-auto-4276-w3wp-Native.log otel-dotnet-auto-4276-LM-W3SVC-1-ROOT-1-133947196853144144-Loader-20250618.log otel-dotnet-auto-7416-LM-W3SVC-1-ROOT-1-133947194527910968-Managed-20250618.log otel-dotnet-auto-7416-w3wp-Native.log otel-dotnet-auto-7416-LM-W3SVC-1-ROOT-1-133947194527910968-Loader-20250618.log otel-dotnet-auto-1016-w3wp-Native.log otel-dotnet-auto-1016-LM-W3SVC-1-ROOT-1-133947192396927553-Managed-20250618.log otel-dotnet-auto-1016-LM-W3SVC-1-ROOT-1-133947192396927553-Loader-20250618.log otel-dotnet-auto-6504-LM-W3SVC-1-ROOT-1-133947188799649603-Managed-20250618.log otel-dotnet-auto-6504-w3wp-Native.log 
otel-dotnet-auto-6504-LM-W3SVC-1-ROOT-1-133947188799649603-Loader-20250618.log otel-dotnet-auto-5628-LM-W3SVC-1-ROOT-1-133940163781895393-Managed-20250610.log otel-dotnet-auto-5628-w3wp-Native.log
OpenTelemetry .NET AutoInstrumentation Logs Summary
✅ Initialization & Setup (All Instances)
- CLR Profiler version: 1.11.0 on Windows x64.
- Features:
  - JIT inlining: enabled
  - NGEN: disabled
- Loaded components:
  - OpenTelemetry.AutoInstrumentation.Native.dll
  - Managed Loader and Instrumentation assemblies
- Instrumentation definitions:
  - 35 standard
  - 2 derived
- ReJIT success for methods in System.Web and System.ServiceModel (ASP.NET & WCF)

Applies to PIDs: 3736, 4964, 5052, 5656
🛠️ Assembly Reference Redirects
- Performed hundreds of assembly redirects.
- Common redirected assemblies: System.*, Microsoft.Extensions.*
- Upgrade targets: usually version 9.0.0.2 or newer
- No errors detected during redirects
⚠️ Common Issues in All Logs
1. Failed trace context injection

   Warning: Failed to inject activity context in format: 'TraceContextPropagator', context: 'Invalid context'.

   - Indicates missing or invalid Activity.Current
   - Likely causes:
     - Context not initialized
     - Incorrect middleware ordering
2. Exporter errors: collector unavailable

   Error: Exporter failed send data to collector at http://localhost:4318...
   System.Net.Sockets.SocketException: No connection could be made because the target machine actively refused it 127.0.0.1:4318

   - Repeated failures for both traces and metrics
   - The collector expected at localhost:4318 was not reachable
⏱️ Profiler Exit and Stats
- All profilers exited cleanly
- Total runtimes ranged from ~66s to ~1276s
- Key time consumers: JitCompilationStarted, JitInlining
- Example: Total time: 1276691ms | Callbacks: 531ms [JitCompilationStarted=143ms, JitInlining=33ms, ...]
📅 Summary by PID
| PID | Date | Collector Errors | Context Warnings | Clean Exit |
|---|---|---|---|---|
| 3736 | 2025-06-10 | Yes | Yes | Yes |
| 4964 | 2025-06-18 | Yes | Yes | Yes |
| 5052 | 2025-06-18 | Yes | Yes | Yes |
| 5656 | 2025-06-10 | Yes | Yes | Yes |
🧩 Recommended Actions
- Ensure OTLP collector availability:
  - Either run a collector at localhost:4318
  - Or configure OTEL_EXPORTER_OTLP_ENDPOINT to point to a valid endpoint
- TraceContext injection failures:
  - Confirm Activity.Current is populated during HTTP pipeline execution
  - Add manual instrumentation or adjust middleware order if needed
The above logs are from .NET Framework 4.8, for an app we call posapi.
Below are the .NET 8 logs, for an app we call accountmanagementapi.
Both started from the same code base, but accountmanagementapi was later updated to .NET 8.
otel-dotnet-auto-200-AccountManagementApi-AspNetCoreBootstrapper-20250618.log otel-dotnet-auto-200-AccountManagementApi-Loader-20250618.log otel-dotnet-auto-200-AccountManagementApi-Managed-20250618.log otel-dotnet-auto-200-AccountManagementApi-StartupHook-20250618.log otel-dotnet-auto-200-w3wp-Native.log otel-dotnet-auto-608-AccountManagementApi-AspNetCoreBootstrapper-20250618.log otel-dotnet-auto-608-AccountManagementApi-Loader-20250618.log otel-dotnet-auto-608-AccountManagementApi-Managed-20250618.log otel-dotnet-auto-608-AccountManagementApi-StartupHook-20250618.log otel-dotnet-auto-608-w3wp-Native.log otel-dotnet-auto-1716-AccountManagementApi-AspNetCoreBootstrapper-20250618.log otel-dotnet-auto-1716-AccountManagementApi-Loader-20250618.log otel-dotnet-auto-1716-AccountManagementApi-Managed-20250618.log otel-dotnet-auto-1716-AccountManagementApi-StartupHook-20250618.log otel-dotnet-auto-1716-w3wp-Native.log otel-dotnet-auto-2244-AccountManagementApi-AspNetCoreBootstrapper-20250618.log otel-dotnet-auto-2244-AccountManagementApi-Loader-20250618.log otel-dotnet-auto-2244-AccountManagementApi-Managed-20250618.log otel-dotnet-auto-2244-AccountManagementApi-StartupHook-20250618.log otel-dotnet-auto-2244-w3wp-Native.log otel-dotnet-auto-8912-AccountManagementApi-AspNetCoreBootstrapper-20250618.log otel-dotnet-auto-8912-AccountManagementApi-Loader-20250618.log otel-dotnet-auto-8912-AccountManagementApi-Managed-20250618.log otel-dotnet-auto-8912-AccountManagementApi-StartupHook-20250618.log otel-dotnet-auto-8912-w3wp-Native.log otel-dotnet-auto-5492-AccountManagementApi-AspNetCoreBootstrapper-20250610.log otel-dotnet-auto-5492-AccountManagementApi-Loader-20250610.log otel-dotnet-auto-5492-AccountManagementApi-Managed-20250610.log otel-dotnet-auto-5492-AccountManagementApi-StartupHook-20250610.log otel-dotnet-auto-5492-w3wp-Native.log
Summary of OpenTelemetry .NET Auto-Instrumentation Logs for AccountManagementApi
✅ Instrumentation & Initialization
CLR Profiler Startup (June 10 & 18)
- Profiler version 1.11.0 successfully attached on both instances (w3wp PIDs 5492 and 200).
- JIT inlining: enabled
- NGEN: disabled
- Profiler attached and rewrote P/Invoke methods for native routines like AddInstrumentations, ConfigureContinuousProfiler, etc.
- Integrations initialized: 38 primary + 2 derived

Instrumentation Hooks
- Components like StartupHook, Loader, and AspNetCoreBootstrapper initialized successfully.
- Rule validations passed: MinSupportedFrameworkRule and OpenTelemetrySdkMinimumVersionRule.
- Tracer and meter initialization completed with instrumentation ID FF...2DA.
⚠️ Exporter Failures
Collector Connectivity Issues
- Continuous 503 (Service Unavailable) and SocketException (10061) errors observed.
- Telemetry data (traces, logs, metrics) failed to export to http://localhost:4318.
- Indicates the OpenTelemetry Collector was not reachable or not running.
- Failures occurred repeatedly every minute in both logs.
ℹ️ Lifecycle Events & Termination
Shutdown Events
- Instruments (e.g., http.server.active_requests, aspnetcore.routing.match_attempts) deactivated at shutdown.
- Components like OpenTelemetryLoggerProvider, BatchLogRecordExportProcessor, and LoggerProviderSdk were disposed.
- Profiler threads (ReJIT) exited gracefully with instrumentation stats recorded.

🧩 Profiling Metrics
- Total runtime: ~1.1M ms (~18 min) on June 10, ~1.3M ms (~22 min) on June 18.
- Callback overhead: minimal (<0.1% of total time).
- ReJIT timeouts occurred, potentially delaying instrumentation for initial methods.
🚨 Key Issues & Recommendations
| Issue | Impact | Recommendation |
|---|---|---|
| ❌ Exporter failed to connect to http://localhost:4318 | No telemetry exported | Ensure the OTLP collector is running and accessible on port 4318. Check local firewall or switch to remote collector. |
| ⚠️ ReJIT timeout warning | Some early methods might not be traced | Monitor startup instrumentation, increase ReJIT timeout, or optimize startup path. |
Let me know if you need further help diagnosing the OTLP collector or adjusting OpenTelemetry configurations.
I had ChatGPT read the files and summarize them, but I don't fully trust it, so I also included the files themselves.
I also have the following Grafana Alloy config:
remotecfg {
url = "https://REDACTED.grafana.net"
id = coalesce(constants.hostname, sys.env("COMPUTERNAME"), sys.env("HOSTNAME"), "unknown")
poll_frequency = "60s"
basic_auth {
username = coalesce(sys.env("DSI_GRAFANA_CLOUD_USER"), "REDACTED")
password = coalesce(sys.env("DSI_GRAFANA_TOKEN"), "default_password")
}
}
prometheus.remote_write "metrics_service" {
endpoint {
url = "https://REDACTED.grafana.net/api/prom/push"
basic_auth {
username = coalesce(sys.env("DSI_GRAFANA_PROM_USER"), "REDACTED")
password = coalesce(sys.env("DSI_GRAFANA_TOKEN"), "default_password")
}
}
}
loki.write "grafana_cloud_loki" {
endpoint {
url = "https://REDACTED.grafana.net/loki/api/v1/push"
basic_auth {
username = coalesce(sys.env("DSI_GRAFANA_LOKI_USER"), "REDACTED")
password = coalesce(sys.env("DSI_GRAFANA_TOKEN"), "default_password")
}
}
external_labels = {
org = "DC",
}
}
otelcol.auth.basic "grafana_cloud" {
username = coalesce(sys.env("DSI_GRAFANA_CLOUD_USER"), "REDACTED")
password = coalesce(sys.env("DSI_GRAFANA_TOKEN"), "default_password")
}
otelcol.receiver.otlp "default" {
grpc {}
http {}
output {
metrics = [otelcol.processor.resourcedetection.default.input]
logs = [otelcol.processor.resourcedetection.default.input]
traces = [otelcol.processor.resourcedetection.default.input]
}
}
otelcol.processor.resourcedetection "default" {
detectors = ["env", "system", "ec2"]
system {
hostname_sources = ["os"]
}
output {
metrics = [otelcol.processor.transform.drop_unneeded_resource_attributes.input]
logs = [otelcol.processor.transform.drop_unneeded_resource_attributes.input]
traces = [otelcol.processor.transform.drop_unneeded_resource_attributes.input]
}
}
otelcol.processor.transform "drop_unneeded_resource_attributes" {
error_mode = "ignore"
trace_statements {
context = "resource"
statements = [
"delete_key(resource.attributes, \"k8s.pod.start_time\")",
"delete_key(resource.attributes, \"os.description\")",
"delete_key(resource.attributes, \"os.type\")",
"delete_key(resource.attributes, \"process.command_args\")",
"delete_key(resource.attributes, \"process.executable.path\")",
"delete_key(resource.attributes, \"process.pid\")",
"delete_key(resource.attributes, \"process.runtime.description\")",
"delete_key(resource.attributes, \"process.runtime.name\")",
"delete_key(resource.attributes, \"process.runtime.version\")",
"delete_key(resource.attributes, \"host.name\")",
"delete_key(resource.attributes, \"host.id\")",
"delete_key(resource.attributes, \"cloud.instance.id\")",
]
}
metric_statements {
context = "resource"
statements = [
"delete_key(resource.attributes, \"k8s.pod.start_time\")",
"delete_key(resource.attributes, \"os.description\")",
"delete_key(resource.attributes, \"os.type\")",
"delete_key(resource.attributes, \"process.command_args\")",
"delete_key(resource.attributes, \"process.executable.path\")",
"delete_key(resource.attributes, \"process.pid\")",
"delete_key(resource.attributes, \"process.runtime.description\")",
"delete_key(resource.attributes, \"process.runtime.name\")",
"delete_key(resource.attributes, \"process.runtime.version\")",
"delete_key(resource.attributes, \"host.name\")",
"delete_key(resource.attributes, \"host.id\")",
"delete_key(resource.attributes, \"cloud.instance.id\")",
]
}
log_statements {
context = "resource"
statements = [
"delete_key(resource.attributes, \"k8s.pod.start_time\")",
"delete_key(resource.attributes, \"os.description\")",
"delete_key(resource.attributes, \"os.type\")",
"delete_key(resource.attributes, \"process.command_args\")",
"delete_key(resource.attributes, \"process.executable.path\")",
"delete_key(resource.attributes, \"process.pid\")",
"delete_key(resource.attributes, \"process.runtime.description\")",
"delete_key(resource.attributes, \"process.runtime.name\")",
"delete_key(resource.attributes, \"process.runtime.version\")",
"delete_key(resource.attributes, \"host.name\")",
"delete_key(resource.attributes, \"host.id\")",
"delete_key(resource.attributes, \"cloud.instance.id\")",
]
}
output {
metrics = [otelcol.processor.transform.add_app_info.input]
logs = [otelcol.processor.transform.add_app_info.input]
traces = [
otelcol.processor.transform.add_app_info.input,
otelcol.connector.host_info.default.input,
]
}
}
otelcol.processor.transform "add_app_info" {
error_mode = "ignore"
trace_statements {
context = "resource"
statements = [
"set(resource.attributes[\"service.name\"], resource.attributes[\"DSI_APPLICATION\"])",
"set(resource.attributes[\"service.fullname\"], resource.attributes[\"DSI_FULL_APPLICATION\"])",
"set(resource.attributes[\"service.namespace\"], \"DC\")",
"set(resource.attributes[\"deployment.environment\"], resource.attributes[\"DSI_ENVIRONMENT\"])",
"set(resource.attributes[\"deployment.dynamic_environment\"], resource.attributes[\"DSI_DYNAMIC_ENVIRONMENT\"])",
"set(resource.attributes[\"org\"], \"DC\")",
]
}
metric_statements {
context = "resource"
statements = [
"set(resource.attributes[\"service.name\"], resource.attributes[\"DSI_APPLICATION\"])",
"set(resource.attributes[\"service.fullname\"], resource.attributes[\"DSI_FULL_APPLICATION\"])",
"set(resource.attributes[\"service.namespace\"], \"DC\")",
"set(resource.attributes[\"deployment.environment\"], resource.attributes[\"DSI_ENVIRONMENT\"])",
"set(resource.attributes[\"deployment.dynamic_environment\"], resource.attributes[\"DSI_DYNAMIC_ENVIRONMENT\"])",
"set(resource.attributes[\"org\"], \"DC\")",
]
}
log_statements {
context = "resource"
statements = [
"set(resource.attributes[\"service.name\"], resource.attributes[\"DSI_APPLICATION\"])",
"set(resource.attributes[\"service.fullname\"], resource.attributes[\"DSI_FULL_APPLICATION\"])",
"set(resource.attributes[\"service.namespace\"], \"DC\")",
"set(resource.attributes[\"deployment.environment\"], resource.attributes[\"DSI_ENVIRONMENT\"])",
"set(resource.attributes[\"deployment.dynamic_environment\"], resource.attributes[\"DSI_DYNAMIC_ENVIRONMENT\"])",
"set(resource.attributes[\"org\"], \"DC\")",
]
}
output {
metrics = [otelcol.processor.transform.add_resource_attributes_as_metric_attributes.input]
logs = [otelcol.processor.transform.add_resource_attributes_as_metric_attributes.input]
traces = [otelcol.processor.transform.add_resource_attributes_as_metric_attributes.input]
}
}
otelcol.processor.transform "add_resource_attributes_as_metric_attributes" {
error_mode = "ignore"
metric_statements {
context = "datapoint"
statements = [
"set(resource.attributes[\"deployment.environment\"], resource.attributes[\"deployment.environment\"])",
"set(resource.attributes[\"deployment.dynamic_environment\"], resource.attributes[\"deployment.dynamic_environment\"])",
"set(resource.attributes[\"service.version\"], resource.attributes[\"service.version\"])",
"set(resource.attributes[\"org\"], resource.attributes[\"org\"])",
]
}
output {
metrics = [otelcol.processor.batch.default.input]
}
}
otelcol.connector.host_info "default" {
host_identifiers = ["host.name"]
output {
metrics = [otelcol.processor.batch.default.input]
}
}
otelcol.processor.batch "default" {
output {
metrics = [otelcol.exporter.otlphttp.grafana_cloud.input]
logs = [otelcol.exporter.otlphttp.grafana_cloud.input]
traces = [otelcol.exporter.otlphttp.grafana_cloud.input]
}
}
otelcol.exporter.otlphttp "grafana_cloud" {
client {
endpoint = "https://otlp-gateway-prod-us-west-0.grafana.net/otlp"
auth = otelcol.auth.basic.grafana_cloud.handler
timeout = "30s"
}
}
This config is not reporting any errors at all on either EC2 instance (.NET 4.8 or .NET 8).
I know this is not a Grafana support repo, so I doubt it's relevant, but I wanted you to have all the info.
@RassK has PTO this week. I hope he will be able to help next week.
@Shereef it seems like auto instrumentation is not failing, but your application is. Most probably auto instrumentation is updating a dependency across major versions and likely introducing a breaking change.
Method not found: 'Void DSI.POS.Application.TicketManagement.TicketCommands.ScheduleTicketCommand..ctor(...)'
Could you expand this exception further? Which dependencies align with auto instrumentation and are used in ScheduleTicketCommand?
from https://github.com/open-telemetry/opentelemetry-dotnet-instrumentation/issues/4233#issuecomment-2933430378
{"StatusCode":400,"StatusMessage":"Method not found: 'Void DSI.POS.Application.TicketManagement.TicketCommands.ScheduleTicketCommand..ctor(DSI.POS.Application.DTO.Ticket, System.Collections.Generic.List`1<System.DateTime>, Boolean, DSI.Core.ActionDialogList, Boolean, Boolean, Boolean, Boolean, System.Collections.Generic.List`1<System.ValueTuple`3<Int32,System.String,Int32>>, Boolean, Boolean, DSI.POS.Domain.Model.Settings.AllTicketSettings, DSI.POS.Domain.Model.Settings.AllTaxSettings, DSI.POS.Application.DTO.TicketClient, System.Collections.Generic.List`1<DSI.POS.Domain.Model.Tickets.CreatedBy>, System.Collections.Generic.List`1<DSI.POS.Domain.Model.Tickets.Confirmation>, System.Collections.Generic.List`1<DSI.POS.Domain.Model.Tickets.Referral>)'."}
The only thing mentioned here related to those dependencies is System.ValueTuple.
I did try updating it locally and faced the same issue with telemetry enabled, but no issues without telemetry.
To me it seems like it's missing an inner exception, which might already pinpoint the faulty spot. According to this doc, the missing member data should be present in the exception: https://learn.microsoft.com/en-us/dotnet/api/system.missingmethodexception?view=netframework-4.8
At least there seems to be nothing suspicious in the ctor's signature.
SIG meeting: @Shereef we might need the full exception or a repro to help investigate the issue.
Because of the issues I faced, my bosses decided to put this on hold until further notice. I will try to provide you more info on my own time, but it might be a while.
We'll close this for now, but if you have any updates on your end, we're open to hearing from you.