
Consent : sourceAttachment

Open fernanda62 opened this issue 2 years ago • 2 comments

Hello, I am studying the use of the consent manager in FHIR v4.1.0. I would like to know whether there is a maximum allowable length for the "data" field in the "sourceAttachment" section of the Consent resource, and where it is defined. I need to insert a text that could be more than 3000 characters. Can I use the "data" field for this purpose? Thank you, Fernanda

fernanda62 avatar Jul 01 '22 10:07 fernanda62

This element is of type base64Binary, and there is no spec-defined limit for this data type: https://hl7.org/fhir/datatypes.html#base64Binary Correspondingly, there is no enforced limit for this type in our fhir-model classes, so it basically comes down to what the configured persistence layer can support.
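As a rough illustration (plain JDK only, not the server's model classes), a 3000-character text fits comfortably: base64 encoding expands data by about 4/3, so 3000 bytes of UTF-8 text become exactly 4000 base64 characters:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class AttachmentDataSize {
    public static void main(String[] args) {
        // A stand-in for the free text you want to carry in sourceAttachment.data
        String text = "x".repeat(3000);
        byte[] raw = text.getBytes(StandardCharsets.UTF_8);

        // base64Binary values are base64-encoded; expansion is ~4/3 (plus padding)
        String encoded = Base64.getEncoder().encodeToString(raw);
        System.out.println("raw bytes: " + raw.length);          // 3000
        System.out.println("base64 chars: " + encoded.length()); // 4000

        // Round trip: decoding recovers the original text
        String decoded = new String(Base64.getDecoder().decode(encoded),
                StandardCharsets.UTF_8);
        System.out.println(decoded.equals(text)); // true
    }
}
```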

Our default JDBC schema documentation is at https://github.com/LinuxForHealth/FHIR/blob/main/fhir-persistence-schema/docs/SchemaDesign.md but it doesn't clearly list our upper limits, so it's a good question.

By default*, we store the resources as gzip-compressed JSON in a database BLOB column with a max length of 2 GiB. That limit comes from here: https://github.com/LinuxForHealth/FHIR/blob/main/fhir-persistence-schema/src/main/java/com/ibm/fhir/schema/control/FhirResourceTableGroup.java#L357

So there's no configured limit for this particular field, but rather an overall limit on the size of a single resource. Often we'll have limits imposed at the gateway/ingress level as well (e.g. to prevent overly large transaction bundles).

*Note: as of version 4.11.0, we also support "offloading" this blob to Azure Blob Storage.

lmsurpre avatar Jul 01 '22 19:07 lmsurpre

Note that the 2 GiB limit applies to the resource after it is serialized as JSON and compressed.
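To make that concrete, here is a minimal sketch (plain JDK, independent of the server code; the example JSON is hypothetical) of measuring a resource's serialized-then-compressed size, which is the size the 2 GiB BLOB limit applies to:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPOutputStream;

public class CompressedSize {
    // Returns the gzip-compressed length of a serialized resource
    static int compressedLength(String json) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(out)) {
            gz.write(json.getBytes(StandardCharsets.UTF_8));
        }
        return out.size();
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical Consent with a large, highly repetitive sourceAttachment payload
        String json = "{\"resourceType\":\"Consent\",\"sourceAttachment\":{\"data\":\""
                + "QUFB".repeat(100_000) + "\"}}";
        System.out.println("serialized bytes: "
                + json.getBytes(StandardCharsets.UTF_8).length);
        System.out.println("compressed bytes: " + compressedLength(json));
        // Repetitive content compresses far below the 2 GiB column limit
    }
}
```

Real resources compress less dramatically than this repetitive example, but the principle is the same: the limit is checked against the compressed bytes, not the raw JSON.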

With offloading to Azure Blob, the maximum resource size is larger, although in that case the limit applies to the uncompressed size (since we delegate any compression to the Azure Blob service):

    private static final long MAX_BLOB_UPLOAD_BYTES = BlockBlobAsyncClient.MAX_UPLOAD_BLOB_BYTES_LONG;

which is currently:

    public static final long MAX_UPLOAD_BLOB_BYTES_LONG = 5000L * Constants.MB;

Note that if you plan to use very large resources, make sure that your JVM memory limits are configured appropriately.

punktilious avatar Jul 06 '22 13:07 punktilious