Refactor how filtering works
Right now, File_Iterator is used to crawl the filesystem for files that match the criteria specified in the (PHPUnit) configuration, resulting in a list (array) of files that are to be included in the code coverage report. This crawling is responsible for a slowdown in the startup of PHPUnit.
It would be better to store the criteria themselves ("all *.php files in directory src") and match filenames against them, rather than against a list generated through crawling.
Also, if you exclude a directory,
```xml
<filter>
  <whitelist>
    <exclude>
      <directory>vendor</directory>
    </exclude>
  </whitelist>
</filter>
```
even if no files within that directory have previously been whitelisted, PHP_CodeCoverage_Filter::removeDirectoryFromWhitelist will collect the entire file list for the directory and attempt to unset the whitelist entry for each file.
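A minimal sketch of the proposed approach (hypothetical class and method names, not the actual PHP_CodeCoverage_Filter API): store the whitelist as directory/suffix rules and record exclusions as rules too, so both adding and removing a directory are O(1) operations that never touch the filesystem, and each filename is matched lazily when coverage data comes in.

```php
<?php
// Hypothetical sketch of rule-based filtering; names are illustrative only.
final class LazyFilter
{
    /** @var array<int, array{directory: string, suffix: string}> */
    private $whitelistRules = [];

    /** @var string[] */
    private $excludedDirectories = [];

    public function addDirectoryToWhitelist($directory, $suffix = '.php')
    {
        // O(1): remember the rule instead of crawling the directory.
        $this->whitelistRules[] = [
            'directory' => rtrim($directory, '/'),
            'suffix'    => $suffix,
        ];
    }

    public function removeDirectoryFromWhitelist($directory)
    {
        // O(1): record the exclusion instead of collecting the file
        // list and unsetting the whitelist entry for every file.
        $this->excludedDirectories[] = rtrim($directory, '/');
    }

    public function isFiltered($filename)
    {
        foreach ($this->excludedDirectories as $directory) {
            if (strpos($filename, $directory . '/') === 0) {
                return true;
            }
        }

        foreach ($this->whitelistRules as $rule) {
            if (strpos($filename, $rule['directory'] . '/') === 0 &&
                substr($filename, -strlen($rule['suffix'])) === $rule['suffix']) {
                return false; // matches a whitelist rule
            }
        }

        return true;
    }
}
```

With this shape, excluding `vendor` is a single array append regardless of how many files the directory contains, and the crawl at startup disappears entirely; the trade-off is that matching moves into `isFiltered`, which is called once per file that actually appears in the coverage data.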
@sebastianbergmann Is your proposal here that we should have a single flat map of files in an array or object instead of a recursively nested directory tree? Would that array or object then be included in the serialized output?
> Is your proposal here that we should have a single flat map of files in an array
That is the status quo and what I meant in https://github.com/sebastianbergmann/php-code-coverage/issues/386#issue-107924189 when I wrote "list (array) of files that are to be included in the code coverage report".