DataGenerationToolkit
Bulk data for testing orgs with large amounts of data
Orgs that are expected to hold large amounts of data need fairly large data sets for testing. The details of the data do not matter a great deal, but the volume is needed to ensure that triggers, flows, and similar automation have appropriate filters.
As a developer, I want to generate data sets large enough to use most, or all, of the storage in a partial or full sandbox so that I can QA the build against large data volumes.
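As a rough sizing aid for that goal, Salesforce commonly counts a record as about 2 KB of data storage, so the record counts needed to approach a sandbox's limit can be estimated up front. The sketch below is a back-of-the-envelope calculation only; the storage figures and the 2 KB-per-record assumption vary by sandbox type and object, so treat them as placeholders rather than authoritative limits.

```python
# Back-of-the-envelope sizing: roughly how many records it takes to consume a
# given share of a sandbox's data storage, assuming ~2 KB per record.
# The storage sizes below are illustrative assumptions; check the actual
# limits for the target sandbox type.
KB_PER_RECORD = 2
SANDBOX_STORAGE_GB = {
    "partial_copy": 5,   # assumed data storage for a Partial Copy sandbox
    "full": 100,         # placeholder; a Full sandbox mirrors production storage
}

def records_to_fill(storage_gb: float, fill_fraction: float = 0.8) -> int:
    """Approximate record count needed to use `fill_fraction` of `storage_gb`."""
    storage_kb = storage_gb * 1024 * 1024
    return int(storage_kb * fill_fraction / KB_PER_RECORD)

for sandbox, gb in SANDBOX_STORAGE_GB.items():
    print(f"{sandbox}: ~{records_to_fill(gb):,} records to reach 80% of {gb} GB")
```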
This functionality may be handled, at least in part, by Paul's internal code for LDV generation. Our primary use case for the data generation tool is midsize data set generation: large enough that the data set can cover interesting permutations, but not true large data volume (LDV).
Snowfakery, among other tools, can already do this well, but it lacks the examples and documentation that would make generating the data easy, even for an expert.
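For illustration, here is a minimal sketch of driving Snowfakery from Python to produce related Account and Contact records. This is a sketch under assumptions, not a definitive recipe for the toolkit: the recipe shape follows Snowfakery's published recipe format, the fake field names map to the underlying Faker library, and generate_data is Snowfakery's Python embedding entry point; check the current Snowfakery documentation before relying on exact options.

```python
# Minimal sketch: generate related Account/Contact test data with Snowfakery,
# driven from Python rather than the CLI.
from pathlib import Path

from snowfakery import generate_data  # pip install snowfakery

# Recipe: 1,000 Accounts, each with 5 Contacts that reference their Account.
# Object and field names assume standard Salesforce objects.
RECIPE = """\
- object: Account
  count: 1000
  fields:
    Name:
      fake: company
  friends:
    - object: Contact
      count: 5
      fields:
        FirstName:
          fake: first_name
        LastName:
          fake: last_name
        Email:
          fake: email
        AccountId:
          reference: Account
"""

recipe_path = Path("accounts_with_contacts.recipe.yml")
recipe_path.write_text(RECIPE)

# By default Snowfakery emits a listing of the generated rows; the same recipe
# can also be run through the Snowfakery CLI or CumulusCI to load an org.
generate_data(str(recipe_path))
```

Scaling the count values in the recipe is how the same shape stretches from a midsize data set toward sandbox-filling volumes.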
This issue can likely be split into multiple issues, one for each tool that we end up creating docs/samples for.