
Bulk data for testing orgs with large amounts of data

Open acrosman opened this issue 5 years ago • 2 comments

Orgs expected to have large amounts of data need fairly large data sets for testing. The details of the data do not matter a great deal, but the volume is needed to ensure that triggers, flows, and similar automation have appropriate filters.

As a developer, I want to generate data sets large enough to use most or all of the storage in a partial or full sandbox, so I can QA a build against large data volumes.

acrosman avatar Oct 17 '19 20:10 acrosman

This functionality may be handled, at least in part, by Paul's internal code for LDV (large data volume) generation. Our primary use case for the data generation tool is midsize dataset generation: large enough that the dataset can cover interesting permutations, but not true LDV.

allisonletts avatar Oct 18 '19 15:10 allisonletts

Snowfakery and some other tools can do this well right now, but they lack the examples and documentation that would make it easy, even for an expert, to generate the data.
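As a sketch of the kind of example this issue asks for, a minimal Snowfakery recipe along these lines might look like the following. The object and field names (`Account`, `Contact`, `AccountId`) and the counts are illustrative; real volume testing would scale the counts up toward sandbox storage limits:

```yaml
# Illustrative Snowfakery recipe: 100 Accounts, each with 5 related Contacts.
# Increase the counts to generate data approaching sandbox storage capacity.
- object: Account
  count: 100
  fields:
    Name:
      fake: Company
  friends:
    - object: Contact
      count: 5
      fields:
        FirstName:
          fake: FirstName
        LastName:
          fake: LastName
        AccountId:
          reference: Account
```

A recipe like this could be run with the `snowfakery` CLI and the output loaded into a sandbox; documenting a handful of such recipes for common object graphs is the kind of sample material this issue describes as missing.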

This issue can likely be split into multiple issues, one for each tool we end up creating docs and samples for.

acrosman avatar Sep 23 '20 14:09 acrosman