AutoBogus taking too long to generate data
I've come across a performance issue when using AutoFaker to generate test objects for use in unit tests.
I've been able to create a super simplified example of my domain model to demonstrate two things.
- AutoFaker generates way too many levels of data; it shouldn't be generating data beyond a reasonable level of nesting. Ideally the nesting level should be configurable, but at most 2 levels would be more than enough for tests.
- AutoFaker becomes slower as a result and my real-world tests end up taking several minutes to complete rather than a few seconds.
Here is my unit test demonstrating the issue:
[TestMethod]
public void DemonstrateAutoFakerPerformanceIssue()
{
    System.Diagnostics.Stopwatch stopwatch = new System.Diagnostics.Stopwatch();
    stopwatch.Start();

    var generator = new AutoBogus.AutoFaker<Document>();
    var document = generator.Generate();
    document.Indicator.ShouldNotBeNull();
    document.Indicator.Branch.ShouldNotBeNull();
    document.Indicator.Branch.Parent.ShouldBeNull();
    //document.Indicator.Branch.Documents.ShouldBeEmpty(); //Fails. Document should be populated. Indicator should be populated. Branch should have all its properties be default type. Or should be able to configure the generation depth.
    document.Indicator.Branch.Documents.First().Indicator.ShouldBeNull();
    document.Indicator.Branch.Documents.First().Branch.ShouldBeNull();
    document.Branch.ShouldNotBeNull();
    document.Branch.Documents.ShouldNotBeNull();
    document.Branch.Documents.ShouldNotBeEmpty();
    document.Branch.Documents.First().Branch.ShouldBeNull();
    //document.Branch.Documents.First().Indicator.ShouldBeNull(); //Fails.
    document.Branch.Documents.First().Indicator.Branch.ShouldBeNull();

    //Show that we are able to access populated properties that are much too deep.
    var generator2 = new AutoBogus.AutoFaker<Sentence>();
    var sentence = generator2.Generate();
    sentence.Branch.ShouldNotBeNull();
    sentence.Paragraphs.ShouldNotBeNull();
    sentence.Branch.Documents.ShouldNotBeEmpty();
    sentence.Branch.Documents.First().Branch.ShouldBeNull();
    //sentence.Branch.Documents.First().Indicator.ShouldBeNull(); //Fails
    sentence.Branch.Documents.First().Indicator.Branch.ShouldBeNull();
    sentence.Branch.Paragraphs.ShouldNotBeEmpty();
    sentence.Branch.Paragraphs.First().Branch.ShouldBeNull();
    //sentence.Branch.Paragraphs.First().Sections.ShouldBeEmpty(); //Fails
    sentence.Branch.Paragraphs.First().Sections.First().Branch.ShouldBeNull();
    //sentence.Branch.Paragraphs.First().Sections.First().Pages.ShouldBeNull(); //Fails
    //sentence.Branch.Paragraphs.First().Sections.First().Pages.First().Documents.ShouldBeEmpty(); //Fails
    sentence.Branch.Paragraphs.First().Sections.First().Pages.First().Documents.First().Branch.ShouldBeNull();
    //sentence.Branch.Paragraphs.First().Sections.First().Pages.First().Documents.First().Indicator.ShouldBeNull(); //Fails
    sentence.Branch.Paragraphs.First().Sections.First().Pages.First().Documents.First().Indicator.Branch.ShouldBeNull();
    //sentence.Branch.Paragraphs.First().Sections.First().Pages.First().Documents.First().Indicator.Property.ShouldBeNullOrWhiteSpace(); //Fails

    stopwatch.Stop();
    stopwatch.ElapsedMilliseconds.ShouldBeLessThan(500);
}
The following code represents a simplified version of my domain model. Please note that I've tried to make it make sense without giving away our exact domain model; modeling a document so that unique sentences can be applied to other paragraphs is not realistic, it was just the closest analogy I could come up with. We basically have objects whose configuration needs to be tracked in branches (like version control). The links between objects are the foreign key relationships in the database, and our entities are created using Entity Framework.
The things that seem to contribute to the issue are:
- Having Branch reference all objects attached to that branch
- Having all entities share a base type that allows the branch to be specified
- Having entities be able to reference other types of entities
- AutoFaker seems to generate 3 children for every collection. Branch references all entities, and each entity also has a collection to track where else in the hierarchy it has been added, so each additional level in the hierarchy makes AutoFaker take longer to generate.
public class MyBranch
{
    public string Property { get; set; }
    public MyBranch Parent { get; set; }
    public ICollection<MyBranch> Children { get; set; }
    public ICollection<Document> Documents { get; set; }
    public ICollection<Page> Pages { get; set; }
    public ICollection<PageSection> Sections { get; set; }
    public ICollection<Paragraph> Paragraphs { get; set; }
    public ICollection<Sentence> Sentences { get; set; }
}

public class EntityBase
{
    public MyBranch Branch { get; set; }
}

public class Document : EntityBase
{
    public string MyProperty { get; set; }
    public SpecialIndicator Indicator { get; set; }
    public OtherSpecialIndicator OtherIndicator { get; set; }
}

public class Page : EntityBase
{
    public string MyProperty { get; set; }
    public OtherSpecialIndicator Indicator { get; set; }
    public ICollection<Document> Documents { get; set; } //documents where this page appears
}

public class PageSection : EntityBase
{
    public string MyProperty { get; set; }
    public OtherSpecialIndicator Indicator { get; set; }
    public ICollection<Page> Pages { get; set; } //pages where this page section appears
}

public class Paragraph : EntityBase
{
    public string MyProperty { get; set; }
    public OtherSpecialIndicator Indicator { get; set; }
    public ICollection<PageSection> Sections { get; set; } //sections where this paragraph has been used
}

public class Sentence : EntityBase
{
    public string MyProperty { get; set; }
    public ICollection<Paragraph> Paragraphs { get; set; } //paragraphs where this sentence appears
}

public class SpecialIndicator : EntityBase
{
    public string Property { get; set; }
}

//Adding this to other entities results in generation taking additional time
public class OtherSpecialIndicator : EntityBase
{
    public string Property { get; set; }
}
Any thoughts on how this could be fixed? At the moment I am having to add rules so that for most objects Branch will be null or new Branch(), but it would be better if AutoBogus could stop itself from generating so many levels of data.
Hi @jjroth
Apologies for the delayed response. What version of AutoBogus are you using? As of version 2.0.0, support for recursive generation has been in place and is limited to 2 levels (not currently configurable).
I have just added some more unit tests to verify this behaviour based on the following scenarios:
- A child of the same parent type
- A list of children of the same parent type
- A type further down the object graph that has already been generated
If you are on a 2+ version, I'd be interested in understanding why it doesn't stop at the 2 levels.
As for performance, there is a lot of reflection going on under the hood. When I get a chance, I am hoping to review the code base and address any bottlenecks like this.
Nick.
Hi @nickdodd79,
No problem on the delay in responding. I totally understand.
I am using AutoBogus 2.2.1. And yeah, I did see the documentation about it stopping at two levels, but it doesn't seem to be fully working, as you can see from my test above.
Regarding performance, on my machine each test tends to take about 1 second, which really adds up after hundreds of tests. What I don't understand is that one colleague's machine takes 40 seconds or more per test, and our virtual machines are fairly identical.
Are you able to see the performance issue on your side that is caused by excessive data generation?
I have been able to find a workaround of sorts, which is to create a RuleFor that forces any reference object such as MyBranch to be new MyBranch() instead of allowing AutoBogus to generate the object. This is not ideal but does reduce the test time to a few milliseconds.
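For reference, a minimal sketch of that workaround against the simplified model from the first post (the real rules would obviously cover more members):

var documentFaker = new AutoBogus.AutoFaker<Document>()
    // Force the reference members to plain instances so AutoBogus never descends into the branch graph.
    .RuleFor(d => d.Branch, _ => new MyBranch())
    .RuleFor(d => d.Indicator, _ => new SpecialIndicator())
    .RuleFor(d => d.OtherIndicator, _ => new OtherSpecialIndicator());

var document = documentFaker.Generate();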
Hey @jjroth
I will invest some more time trying to replicate your setup. My tests showed it was working as expected so there must be some factor not being considered by them.
As for the performance, I shall also attempt to run testing on some larger data sets to see where the impact and bottlenecks are.
Nick.
I have the same issue: my entities have multiple dependencies and circular dependencies, and generation consumes 100% CPU.
Entities demonstrating the issue:
public class Team : Entity
{
    // Empty constructor for EF
    public Team() { }

    public new string Id { get; protected set; }
    public string CompanyId { get; private set; }
    public virtual Company Company { get; private set; }
    public string AffiliateId { get; private set; }
    public virtual Affiliate Affiliate { get; private set; }
    public string CostCenterId { get; private set; }
    public virtual CostCenter CostCenter { get; private set; }
    public string Name { get; private set; }
    public virtual ICollection<EmployeeTeam> EmployeeTeams { get; private set; }
    public virtual ICollection<Order> Orders { get; private set; }
    public virtual ICollection<Measurement> Measurements { get; private set; }
    public bool IsAdmin { get; private set; }
    public DateTime ChangedAt { get; private set; }
}

public class Company : Entity
{
    // Empty constructor for EF
    protected Company() { }

    public new string Id { get; protected set; }
    public string Name { get; protected set; }
    public virtual ICollection<Affiliate> Affiliates { get; private set; }
    public virtual ICollection<CostCenter> CostCenters { get; private set; }
}
public class Affiliate : Entity
{
    // Empty constructor for EF
    public Affiliate() { }

    public new string Id { get; protected set; }
    public string Name { get; private set; }
    public string CompanyId { get; private set; }
    public virtual Company Company { get; private set; }
    public virtual ICollection<CostCenter> CostCenters { get; private set; }
}

public class CostCenter : Entity
{
    // Empty constructor for EF
    public CostCenter() { }

    public new string Id { get; protected set; }
    public string CompanyId { get; private set; }
    public virtual Company Company { get; private set; }
    public string AffiliateId { get; private set; }
    public virtual Affiliate Affiliate { get; private set; }
    public string CoreBusinessId { get; private set; }
    public virtual CoreBusiness CoreBusiness { get; private set; }
    public string Name { get; private set; }
    public DateTime ChangedAt { get; private set; }
    public bool AuthorizesOrderValueRange { get; private set; }
    public int MinimumOrderAuthorization { get; private set; }
}

public class EmployeeTeam : ValueObject
{
    // Empty constructor for EF
    private EmployeeTeam() { }

    public string TeamId { get; private set; }
    public virtual Team Team { get; private set; }
    public int EmployeeId { get; private set; }
    public virtual Employee Employee { get; private set; }
    public string CompanyId { get; private set; }
    public virtual Company Company { get; private set; }
    public string AffiliateId { get; private set; }
    public virtual Affiliate Affiliate { get; private set; }
    public string CostCenterId { get; private set; }
    public virtual CostCenter CostCenter { get; private set; }
    public string YearMonth { get; private set; }
    public float Factor { get; private set; }
}
public class Order : Entity
{
    // Empty constructor for EF
    public Order() { }

    public string CompanyId { get; private set; }
    public virtual Company Company { get; private set; }
    public string AffiliateId { get; private set; }
    public virtual Affiliate Affiliate { get; private set; }
    public string CostCenterId { get; private set; }
    public virtual CostCenter CostCenter { get; private set; }
    public string TeamId { get; private set; }
    public virtual Team Team { get; private set; }
    public DateTime Date { get; private set; }
    public virtual Authorization Authorization { get; private set; }
    public virtual ICollection<ServiceOrder> ServiceOrders { get; private set; }
    public virtual ICollection<MeasurementService> MeasurementServices { get; private set; }
    public string Assignee { get; private set; }
    public DateTime ChangedAt { get; private set; }
    public DateTime? AppDownloadedAt { get; set; }
    public DateTime? ChangedByManagerAt { get; set; }
    public DateTime? UnblockedByManagerAt { get; set; }
}

public class Measurement : Entity
{
    public Measurement(int id, string companyId, string affiliateId, string costCenterId, int executionTermId, Authorization authorization,
        string teamId, decimal contractAmount, decimal measurementAmount, decimal netAmount, decimal payAmount, bool isAdmin,
        Audit audit, IList<MeasurementService> measurementServices, IList<MeasurementEmployee> measurementEmployees)
    {
        Id = id;
        CompanyId = companyId;
        AffiliateId = affiliateId;
        CostCenterId = costCenterId;
        ExecutionTermId = executionTermId;
        Date = DateTime.Now.Date;
        Audit = audit;
        Authorization = authorization;
        ContractAmount = contractAmount;
        TeamId = teamId;
        MeasurementAmount = measurementAmount;
        NetAmount = netAmount;
        PayAmount = payAmount;
        IsAdmin = isAdmin;
        CalculateFactorSalary = false;
        CalculateFactorProduction = false;
        MeasurementServices = measurementServices;
        MeasurementEmployees = measurementEmployees;
        ChangedAt = DateTime.Now.Date;
    }

    // Empty constructor for EF
    public Measurement() { }

    public string CompanyId { get; private set; }
    public virtual Company Company { get; private set; }
    public string AffiliateId { get; private set; }
    public virtual Affiliate Affiliate { get; private set; }
    public string CostCenterId { get; private set; }
    public virtual CostCenter CostCenter { get; private set; }
    public virtual Authorization Authorization { get; private set; }
    public virtual Audit Audit { get; private set; }
    public int? ExecutionTermId { get; private set; }
    public virtual ExecutionTerm ExecutionTerm { get; private set; }
    public DateTime Date { get; private set; }
    public string TeamId { get; private set; }
    public virtual Team Team { get; private set; }
    public decimal? ContractAmount { get; private set; }
    public float? ReadjustmentIndex { get; private set; }
    public decimal? ReadjustmentAmount { get; private set; }
    public decimal? MeasurementAmount { get; private set; }
    public decimal? WithheldAmount { get; private set; }
    public decimal? InssAmount { get; private set; }
    public decimal? IrrfAmount { get; private set; }
    public decimal? IssAmount { get; private set; }
    public decimal? CofinsAmount { get; private set; }
    public decimal? PisAmount { get; private set; }
    public decimal? CsslAmount { get; private set; }
    public decimal? DiscountAmount { get; private set; }
    public decimal? NetAmount { get; private set; }
    public decimal? PayAmount { get; private set; }
    public bool IsAdmin { get; private set; }
    public DateTime? DateInit { get; private set; }
    public DateTime? DateEnd { get; private set; }
    public decimal? PayrollAmount { get; private set; }
    public string Description { get; private set; }
    public bool CalculateFactorSalary { get; private set; }
    public bool CalculateFactorProduction { get; private set; }
    public virtual ICollection<MeasurementService> MeasurementServices { get; private set; }
    public virtual ICollection<MeasurementEmployee> MeasurementEmployees { get; private set; }
    public DateTime ChangedAt { get; private set; }
}
Hey @afranioce
Thanks for providing another instance of this. I think the amount of reflection going on is quite intense when generating EF entities. I have a plan to introduce a type cache so reflected info doesn't get loaded multiple times.
At the moment I am in the process of adding some generator override mechanisms which I plan on releasing soon. Once that is done, I will be looking at how the bottlenecks can be alleviated.
Nick.
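Purely as an illustration of the idea (this is not the actual AutoBogus internals), a type cache along these lines means the member info for an entity type is only reflected once, however many instances get generated:

using System;
using System.Collections.Concurrent;
using System.Reflection;

internal static class ReflectionCache
{
    // One reflected property set per type, shared across all generations.
    private static readonly ConcurrentDictionary<Type, PropertyInfo[]> Properties =
        new ConcurrentDictionary<Type, PropertyInfo[]>();

    public static PropertyInfo[] GetProperties(Type type) =>
        Properties.GetOrAdd(type, t => t.GetProperties(BindingFlags.Public | BindingFlags.Instance));
}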
Hey Nick,
I don't know if this helps, but I received a PR for some performance improvements to Bogus that had some reflection improvements. These might be the same caching mechanisms you're thinking about, so I'm not sure how helpful they are:
https://github.com/bchavez/Bogus/pull/170
Also, there was a small bug in the original PR, so something to watch out for. Here's the proper fix for the bug:
https://github.com/bchavez/Bogus/commit/2f909cc46feef09d46295abbb14a04478dc6e382
Thanks @bchavez. I will take a look.
I'm having the same issue. Using EF Core on .NET Core 2.2, which produced all the POCOs (around 40 tables with a fair few one-to-many relationships). Generate() never actually returns and uses a fair bit of CPU. Is there a way to perhaps convert the reflection that is happening into the actual Bogus C# code that Generate is trying to run?
I'm also having this issue with .NET Core 2.2, EF Core, and the following dependencies. I also tried with .NET Core 3.0.
- "AutoBogus" Version="2.7.3"
- "FluentAssertions" Version="5.9.0"
- "Microsoft.NET.Test.Sdk" Version="16.4.0"
- "Moq" Version="4.13.1"
- "xunit" Version="2.4.1"
- "xunit.runner.visualstudio" Version="2.4.1">
My EF dependencies
- "Microsoft.EntityFrameworkCore.SqlServer" Version="2.1.11"
- "Microsoft.EntityFrameworkCore.SqlServer.Design" Version="1.1.6"
- "Microsoft.EntityFrameworkCore.Tools" Version="2.1.11"
I have the same issue too, trying to fake EF Core data. But in fact, I do think the Generate will return something. It just takes a huge amount of time to generate data.
Hey all,
Apologies for my delayed response. I have looked into this several times, but haven't got far with it. I have some spare time over the next few weeks, so I plan on dedicating some effort to it. I suspect it is something related specifically to EF and have some theories around why it is slow. Just need to do some exploration.
Thanks for your patience. Nick.
Hi Nick, Any luck with this issue? I'm facing the same problem when generating fake objects.
Hey @srihere17
I have attempted to look into and resolve this several times, but haven't got far with the reproduction. I suspect there is something in the EF setup that is making AutoBogus do something it shouldn't.
As another use case to pin it down, could you post what your code looks like here? I can then run it over the framework to see if it highlights anything.
Thanks, Nick.
I encountered a similar problem while using AutoBogus with autogenerated POCOs. The problem seemed to be the complexity of the navigation property reference tree, which turned out to be surprisingly wide with just 2 levels of depth for my database model. It was fixed by adjusting the AutoFaker configuration (recursion level down to 0-1, and the number of elements generated for collections).
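For anyone else hitting this, a sketch of that kind of configuration against the Document model from earlier in the thread, assuming the WithRecursiveDepth and WithRepeatCount builder methods (the latter controlling how many items AutoBogus puts into each generated collection, three by default):

var faker = new AutoBogus.AutoFaker<Document>()
    .Configure(builder => builder
        .WithRecursiveDepth(1)   // stop recursive types after a single nested level
        .WithRepeatCount(1));    // one item per collection instead of the default three

var document = faker.Generate();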
I do believe the issue has to do with navigation properties, and my POCOs/entities were not autogenerated. It is probably similar to a JSON circular reference issue. If there was a way to set AutoBogus to ignore virtual properties (or default all virtual properties to null), that might resolve the issue. Here's one of my entities as an example:
public partial class InitialReview : BaseEntity, IReviewResponse
{
    [Required]
    public Guid ReviewId { get; set; }

    [Required]
    public Guid QaTemplateQuestionId { get; set; }

    public Guid? InitialResponseId { get; set; }
    public Guid? InitialResponseTypeId { get; set; }
    public bool IsParentAddChildren { get; set; }
    public DateTimeOffset? Updated { get; set; }
    public bool NeedsFurtherReview { get; set; } = false;

    public virtual Review Review { get; set; }
    public virtual QaTemplateQuestion QaTemplateQuestion { get; set; }
    public virtual InitialResponse InitialResponse { get; set; }
    public virtual InitialResponseType InitialResponseType { get; set; }
    public virtual ICollection<ReviewResponseQaDocTag> QaTemplateQuestionQaDocTags { get; set; }
    public virtual AdditionalResponseData AdditionalResponseData { get; set; }
}
It should be rather easy to reproduce. I just had a large number of classes referencing themselves, with a large number of properties and possible circular dependencies.
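Until an "ignore virtual properties" switch exists, the closest approximation I know of is skipping the navigation members explicitly; a sketch against the InitialReview class above, assuming the WithSkip<TType>(memberName) overload of the config builder skips the named member on that type:

var faker = new AutoBogus.AutoFaker<InitialReview>()
    .Configure(builder => builder
        // Skip the virtual navigation properties so AutoBogus never walks into them.
        .WithSkip<InitialReview>(nameof(InitialReview.Review))
        .WithSkip<InitialReview>(nameof(InitialReview.QaTemplateQuestion))
        .WithSkip<InitialReview>(nameof(InitialReview.InitialResponse))
        .WithSkip<InitialReview>(nameof(InitialReview.InitialResponseType))
        .WithSkip<InitialReview>(nameof(InitialReview.QaTemplateQuestionQaDocTags))
        .WithSkip<InitialReview>(nameof(InitialReview.AdditionalResponseData)));

var review = faker.Generate(); // scalar members populated, navigation members left unset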
Adding a +1 for this issue. Using EF Core 5, and without introducing a DbContext, simply having AutoFaker crawl a POCO in my domain with WithRecursiveDepth(1) set takes about 5 seconds. Kind of a bummer too because this looks like an awesome project. 😭
FWIW, it looks like the grief is occurring in PopulateInstance.
OK... apologies for the previous post. It turns out I had a leftover navigation property that was pointing back to my domain model and that was causing the issue.
It turns out that this issue is simply due to traversing navigation properties. I have been able to reproduce this... and this time posted it as working code. 😉
https://github.com/Mike-E-angelo/AutoBogus.Performance
Basically, the following model will do the trick (notice that the project does not have EF installed whatsoever):
public class Parent
{
    public virtual Other? Other { get; set; } = default!;
    public virtual ICollection<Child> Children { get; set; } = default!;
}

public class Child
{
    public virtual Parent Parent { get; set; } = default!;
    public virtual ICollection<Other> Items { get; set; } = default!;
}

public class Other
{
    public virtual Child Child { get; set; } = default!;
    public virtual ICollection<Item> Items { get; set; } = default!;
}

public class Item
{
    public virtual Other Other { get; set; } = default!;
}
Running this on my machine takes about two full seconds. Interestingly enough, commenting out this line reduces that by more than half (about 0.9 seconds total runtime).
Anyway, I wanted to get that out there as more data towards a possible diagnosis.
This issue is rather complex and not really solvable. In my opinion, AutoBogus should scan the data, build a navigation property tree, and check for circular dependencies. If it finds any, it should throw a runtime exception asking the developer to specify a specific action for those properties.
I've run into this same issue. I'm a bit confused about how WithRecursiveDepth actually works. I would have assumed it controlled how far into the tree it went, so a recursive depth of 0 would generate only the primitives at the top level, and a recursive depth of 1 would go into 1 level of complex child types and, for each of those, treat them as depth 0 (i.e. not go in any further). That doesn't seem to be the case though.
The way it seems to work is that it will stop if the number of ancestors matching the type currently being generated is >= depth. I guess that handles recursion of self-related types, whereas what I wanted was tree depth. Tree depth would be more useful, I think.
I've added a TreeDepth option in this PR: https://github.com/nickdodd79/AutoBogus/pull/64 This greatly helps improve performance for our use case (the Xero API).
There's a package here if it helps https://github.com/Ian1971/AutoBogus/packages/637294
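To make the distinction concrete, a rough sketch with illustrative types (WithTreeDepth being the option added in the PR above):

public class Alpha
{
    public Alpha Self { get; set; }   // same type again: this is what recursive depth limits
    public Beta Beta { get; set; }    // a different type: recursive depth never stops this chain
}

public class Beta
{
    public Gamma Gamma { get; set; }
}

public class Gamma
{
    public string Value { get; set; }
}

// Recursive depth only counts how many ancestors share the current type, so Alpha.Self
// stops after one nested Alpha, but Alpha -> Beta -> Gamma is still generated in full
// because each of those types appears only once in the chain.
var byRecursion = new AutoBogus.AutoFaker<Alpha>()
    .Configure(builder => builder.WithRecursiveDepth(1));

// Tree depth caps the overall nesting level regardless of type, which is what cuts off
// wide graphs of many distinct entity types.
var byTreeDepth = new AutoBogus.AutoFaker<Alpha>()
    .Configure(builder => builder.WithTreeDepth(2));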
Hey All,
Thanks to the efforts of @Ian1971, I think we have a performance improvement in v2.13.0. It involves using a WithTreeDepth() config handler.
I did some top-level testing, and a simple model that took 4s to generate now takes 18ms 👍
Nick.
@nickdodd79 Wanted to add my $0.02 here, but possibly from a different angle. I've got a test that's generating a set of Header instances (between 3 and 5 total) and, for each header, a set of Card instances (between 3 and 7 per header). The entities I'm faking are old-growth code, and the majority of the navigation properties on the Header instances are unimportant to the test context. Because of the random nature of the generation code, I've seen generation take between 20 and 45 seconds. A sample timing:
Arrange: Fixture: 00:00:00.6768900
Arrange: AutoFaker: 00:00:34.4636040
Arrange: SetupData: 00:00:00.0090527
Act: 00:00:00.1758669
Assert: 00:00:00.0144105
This is the entire AutoFaker portion of the code, redacted:
stopwatch.Restart();

var headerFaker = new AutoFaker<Header>()
    .Configure(builder =>
        builder
            .WithBinder<NSubstituteBinder>()
            .WithRecursiveDepth(1)
            .WithTreeDepth(4)
    );

var cardFaker = new AutoFaker<Card>()
    .Configure(builder =>
        builder
            .WithBinder<NSubstituteBinder>()
            .WithRecursiveDepth(1)
            .WithTreeDepth(1)
    );

var headers = trxKeys.Select(
        key => headerFaker
            .RuleFor(header => header.Key, key)
            .RuleFor(header => header.Settled, "0")
            .RuleFor(header => header.Voided, "0")
            .RuleFor(header => header.Reversed, "0")
            .RuleFor(header => header.TransactionType, transType)
            .RuleFor(header => header.Date, <a valid DateTime>)
            .RuleFor(header => header.Processor_ID, pid)
            .Generate()
    )
    .ToDictionary(header => header.TRX_HD_Key);

var cards = cardKeys.Select(
        keyPair => cardFaker
            .RuleFor(card => card.HeaderKey, keyPair.trxKey)
            .RuleFor(card => card.Header, headers[keyPair.trxKey])
            .RuleFor(card => card.Key, keyPair.cardKey)
            .RuleFor(card => card.Result, "0")
            .Generate()
    )
    .ToList();

Console.WriteLine($"Arrange: AutoFaker: {stopwatch.Elapsed}");
My questions are:
- Is there a way I can tell AutoFaker to only build out specific navigation property paths? In the Header instances, I need to go several layers deep in one specific navigation tree (Header.Merchant.BusinessInfo.First().Address), hence the WithTreeDepth(4) in the header faker. But there are more than a dozen other navigation properties on Header that are not used, most if not all of which have a very wide navigation property tree of their own.
- Am I using the library efficiently? This is my first time trying it out, so I have no idea whether this is the best way or just something that gets the job done.
Following up:
I was able to get the runtime down from an average of above 2s per transaction to around 300ms per transaction by explicitly setting rules for each of the navigation properties I'm not using (returning default(TType) or new List<TType>() as appropriate). And I realized after I wrote out the RuleFor() calls that I can use WithSkip() in the faker config instead. Sample timings:
Before:
Arrange: Fixture: 00:00:01.0589245
Arrange: AutoFaker: 00:00:33.7371618 for 15 transactions
Arrange: SetupData: 00:00:00.0100100
Act: 00:00:00.2064520
Assert: 00:00:00.0170002
Using RuleFor():
Arrange: Fixture: 00:00:00.8760979
Arrange: AutoFaker: 00:00:06.5224953 for 20 transactions
Arrange: SetupData: 00:00:00.0101066
Act: 00:00:00.1974466
Assert: 00:00:00.0157477
Using WithSkip():
Arrange: Fixture: 00:00:00.6758549
Arrange: AutoFaker: 00:00:05.9639872 for 22 transactions
Arrange: SetupData: 00:00:00.0102199
Act: 00:00:00.2106083
Assert: 00:00:00.0194124
So that basically addresses the concern I raised in my first question, although I'm still curious whether there's a way I'm not aware of to have the setup be opt-in instead of opt-out.
As an aside: from a tiny sampling, using WithSkip() seems to be about 10-15% faster than using an explicit RuleFor(). Is that what you'd expect?
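For anyone wanting to copy the approach, a compressed sketch of the two variants described above; the member and type names here (UnusedNavigation, UnusedChildren, SomeEntity) are stand-ins for the real, redacted ones:

// Variant 1: explicit rules returning defaults for navigation members the test never touches.
var ruleBasedFaker = new AutoFaker<Header>()
    .RuleFor(header => header.UnusedNavigation, _ => default(SomeEntity))
    .RuleFor(header => header.UnusedChildren, _ => new List<SomeEntity>());

// Variant 2: the same effect via WithSkip in the faker configuration
// (in the sampling above this came out roughly 10-15% faster than explicit rules).
var skippingFaker = new AutoFaker<Header>()
    .Configure(builder => builder
        .WithSkip<Header>("UnusedNavigation")
        .WithSkip<Header>("UnusedChildren"));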
I think we could also improve performance in some cases (not sure if this is already possible):
- Mark a property as the parent.
- When generating the data, assign that property the parent's value instead of re-generating it.
Of course, this won't fix all the cases, but it's maybe a better (faster) solution than assigning a tree depth.
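One manual way to approximate that today is to skip the back-reference during generation and wire it up afterwards; a rough sketch using the Parent/Child model posted earlier, assuming AutoFaker<T> exposes Bogus's FinishWith and that WithSkip applies to types generated anywhere in the graph:

var parentFaker = new AutoBogus.AutoFaker<Parent>()
    .Configure(builder => builder
        .WithSkip<Child>(nameof(Child.Parent)))   // don't generate a fresh Parent per child
    .FinishWith((faker, parent) =>
    {
        // Re-point every generated child at the parent it actually came from.
        foreach (var child in parent.Children)
        {
            child.Parent = parent;
        }
    });

var parent = parentFaker.Generate();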
Hi @nickdodd79. It seems that the recursive depth safety mechanism doesn't work on records. Generating an instance of this record as follows:
public record FakeChild(string Name, List<FakeChild>? Children);

var child = new AutoFaker<FakeChild>().Generate();
will throw a StackOverflowException. Is it because the children are being passed as constructor parameters? Is there a workaround for this?
EDIT: I created a separate issue for this. link
Are you using the WithTreeDepth or WithRecursiveDepth config option?
@Ian1971 I tried both, didn't work.
What actually worked was moving the property from the constructor to the record body, as follows:
public record FakeChild(string Name)
{
    public List<FakeChild> Children { get; set; }
}