Flutter testing args not considered for analysis before execution
description
When using the flutter test arguments to point at a local engine (as noted in https://github.com/Dart-Code/Dart-Code/issues/5224), things work as long as the API hasn't changed. Changing the API can lead to situations where the tests cannot execute. For example: I switched a class from X extends Y to X implements Y and added a forwarding method to X, int foo() => _value.foo();.
I was not able to execute the tests: without the declaration, it would complain that foo isn't implemented; with the declaration, it would complain that Y doesn't have a foo method.
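A minimal Dart sketch of that change, using the hypothetical X/Y names from above (in the real scenario the mismatch only appears because the analyzer sees an older Y than the one the tests run against):

```dart
class Y {
  int foo() => 1234; // member that exists only in the local engine build
}

// Switching from `extends` to `implements` means X no longer inherits
// foo; every member must be forwarded explicitly. An analyzer built
// against an engine where Y lacks foo flags this forwarding call, even
// though the engine used at run time has it.
class X implements Y {
  final Y _value;
  X(this._value);

  @override
  int foo() => _value.foo();
}

void main() {
  print(X(Y()).foo()); // prints 1234
}
```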
It seems like VSCode is doing 2 different analyses:
- one before the tests run, which doesn't use the local engine arguments
- one when executing the tests, which does use the local engine arguments
Executing the tests from the command line works, which seems to indicate there is some extra step that VSCode is doing to analyse the code.
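For comparison, a command-line run of the sort described might look like this (paths, build config name, and the test file are illustrative; flutter's --local-engine and --local-engine-src-path flags are the usual way to point at a local build, and newer framework versions may also want --local-engine-host):

```shell
# Run framework tests against a locally built engine.
# Adjust the engine src path and build config for your checkout.
cd packages/flutter
flutter test \
  --local-engine=host_debug_unopt_arm64 \
  --local-engine-src-path=/path/to/engine/src \
  test/cupertino/colors_test.dart
```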
Can you confirm where you're seeing these issues? Are they in the Problems view (eg. from analysis before you start a debug session), or in the Debug Console (happening when trying to run)?
If the former, that's probably no surprise - the test arguments are for test execution and don't affect anything happening in the editor before starting to run.
If the latter, that is odd because the debug adapter is essentially just running flutter run anyway (but it makes me wonder if the flags aren't going to the right place?).
I've not done engine development before so I might need some specific steps to repro to debug. Thanks!
Can you confirm where you're seeing these issues? Are they in the Problems view (eg. from analysis before you start a debug session), or in the Debug Console (happening when trying to run)?
Yep, they show up in different places depending on the situation, but in both cases at the time you invoke the tests. If foo's implementation is missing, the problem shows up in the test results (presumably since the code passes validation against the old API before execution). If foo's implementation is there, the problems show up in the "Problems" pane.
If the former, that's probably no surprise - the test arguments are for test execution and don't affect anything happening in the editor before starting to run.
Yeah, I'd expect to get squiggly lines since the editor doesn't know about the testing flags. The problem is that the detected errors are aborting the test run. I'd still expect to be able to execute / debug the tests.
I've not done engine development before so I might need some specific steps to repro to debug. Thanks!
Will do.
reproduction
1) Setup vscode to use the local engine flags for running tests
2) Edit dart code in the engine
In the Color class add a new method int foo() => 1234;
3) Compile the engine
./flutter/bin/et build -c host_debug_unopt_arm64
4) Edit the framework code
In CupertinoDynamicColor, make it implement Color instead of extending it (I'm going to land this change, so it might already be the case depending on when you read this). Implement all the methods to just forward to _effectiveColor.
5) Try to execute the tests from VSCode
Notice that if CupertinoDynamicColor implements foo, there will be an error at int foo() => _effectiveColor.foo(); before the tests execute, since foo is unknown. If you remove foo, the tests will fail in the console because foo isn't implemented.
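Sketched in code, the edits from steps 2 and 4 look roughly like this (heavily abbreviated; the real classes have many more members, and the resolution logic behind _effectiveColor is elided):

```dart
// Engine side (dart:ui's Color), step 2:
class Color {
  // ...existing members...
  int foo() => 1234; // new member, present only in the local engine
}

// Framework side (CupertinoDynamicColor), step 4:
class CupertinoDynamicColor implements Color {
  CupertinoDynamicColor(this._effectiveColor);
  final Color _effectiveColor;

  // Every Color member is forwarded like this. With the cached SDK's
  // analyzer, foo is unknown on Color, so this line is flagged; without
  // it, the local engine's Color interface is not fully implemented.
  @override
  int foo() => _effectiveColor.foo();
}
```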
The problem is that the detected errors are aborting the test run. I'd still expect to be able to execute / debug the tests.
Can you give a screenshot of exactly what you see? If there are analysis errors you might get a "There are build errors, continue?" dialog, but you can dismiss that and it should still run the debug session. I'm not sure if you then mean something else is failing?
Fixing the analysis errors may be tricky. In the past we allowed specifying a custom Dart SDK for a Flutter project, but many people used it incorrectly and hit issues because the version the IDE was using was not the same one used when they would run the project, so we locked the Dart SDK to the one in bin/cache/dart-sdk.
It might take me some faffing to get set up to build the engine (I'm not sure if I've done it either on my PC or Mac before), so any screenshots of exactly what you're seeing might help understand the issue quicker. Thanks!
Here is the error without foo:
Here is the text editor after having added foo:
Here is the dialog that pops up when trying to execute the tests after having added foo:
Okay, hitting the "Debug Anyway" button does give me the behavior I wanted. I was dismissing that dialog without thinking, since usually continuing isn't the behavior you want. For example, in Visual Studio, if your program fails to build it will ask you if you still want to debug, which is a pointless question there; but in this case, debugging anyway is the right thing to do.
Maybe there's nothing to do here.
hitting the "Debug Anyway" button does give me the behavior I wanted. I was dismissing that dialog without thinking, since usually continuing isn't the behavior you want. For example, in Visual Studio, if your program fails to build it will ask you if you still want to debug.
Yeah, I think in this case this is probably what you want but perhaps the dialog is a little misleading because it sounds like a compile error.
If the dialog was something like
Your project has analysis errors.
[Run Anyway] [Show Errors]
Would it be less confusing? In this case, you have analysis errors because you're using an analysis server that doesn't match the engine you're going to run with.
Of course, the real fix is to use a compatible analyzer. I'm not certain how hard that is, because I don't know whether you're building it when you build your engine. If you are, we could consider adding some additional settings to let you specify your local engine, and then use it both for the language server and to automatically set the run/test arguments. How worthwhile that is may depend on how often this comes up (is everyone working on the engine hitting things like this a lot, or is this a specific class of change that doesn't happen much?).
This issue has been marked stale because it is tagged awaiting-info for 20 days with no activity. Remove the stale label or comment to prevent the issue being closed in 10 days.
I've updated the message to no longer say "Build", and instead just "Errors exist in your project" and I've changed the "Debug Anyway" button to "Run Anyway" (https://github.com/Dart-Code/Dart-Code/issues/5283).
While making this change, I also remembered that you can add promptToRunIfErrors: false to your launch configuration if you always want to just ignore these errors and attempt to run anyway. Again, it's not a complete fix (because really you want an analyzer that matches what you're running), but it might help in the meantime.
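For reference, a sketch of what that might look like in launch.json, combining promptToRunIfErrors with the local-engine tool arguments (the toolArgs field and its values are assumptions based on the setup discussed in https://github.com/Dart-Code/Dart-Code/issues/5224; paths and build config name are illustrative):

```json
{
  "name": "Flutter tests (local engine)",
  "type": "dart",
  "request": "launch",
  "promptToRunIfErrors": false,
  "toolArgs": [
    "--local-engine=host_debug_unopt_arm64",
    "--local-engine-src-path=/path/to/engine/src"
  ]
}
```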
If you think these changes (and that setting) are good enough, let me know and we can close this. Otherwise if this should be a request to make it easier to set a local engine path and have it used for both analysis and running, I might need some additional info (see the comment above).
Thanks, that helps. I think having the full solution would be nice, but I completely understand it's not the highest priority. Feel free to close if you have another way of tracking these sorts of feature requests, or keep this alive but marked as low priority. This doesn't block my work; it's a minor nuisance that affects only a handful of people.
I'm cleaning up the issue tracker and closing things that aren't likely to be addressed in the foreseeable future.
I don't have a good solution here, and I feel like the real fix is for this to be configured in some way such that the analyzer being spawned matches the SDK/engine being used, rather than it being defined only in the flutter test args (because we can't analyze on the assumption of flutter test args if you might also run flutter run, which doesn't have those args).
Hopefully this isn't a big issue, but if it is, I think we should try to find a better way to define these values and ensure the analyzer can use them too.