MAX_DATA_SIZE is too low for full support of "Structured-appended" feature
The following command returns the error "Input data is too large", but I would have expected the input data to be spread over several QR codes (output-01.png, output-02.png, etc.):
$ qrencode -t PNG -Sv 40 -r input.txt -o output.png
Input data is too large.
According to the manual: "Libqrencode can generate "Structured-appended" symbols that enables to split a large data set into multiple QR codes."
Version tested:
$ qrencode -V
qrencode version 4.1.1
Copyright (C) 2006-2017 Kentaro Fukuchi
@misa-ki, how many symbols do you expect to be generated? If more than 16 symbols are needed, the data is in fact too large.
According to the QR Code Outline specification, at most 16 symbols are allowed in structured append mode.
Accordingly, #define MAX_STRUCTURED_SYMBOLS 16 is set in qrinput.h (v4.1.1).
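For scale, 16 version-40 symbols give the following hard ceilings (per-symbol capacities taken from the QR Code specification for error correction level L; these are my numbers, not values from libqrencode, and the structured-append header costs a few extra bits per symbol, so the usable amounts are slightly lower):

#include <stdio.h>

/* Upper bounds for 16 structured-append symbols of version 40-L, per the
 * QR Code specification, ignoring the per-symbol structured-append header. */
int main(void)
{
    printf("numeric:      %d digits\n", 16 * 7089);  /* 113424 */
    printf("alphanumeric: %d chars\n",  16 * 4296);  /*  68736 */
    printf("8-bit byte:   %d bytes\n",  16 * 2953);  /*  47248 */
    return 0;
}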
@lemniscati I'm expecting fewer than 16 chained QR codes because my input file is not that large:
$ file -i input.txt
input.txt: text/plain; charset=us-ascii
$ cat input.txt | wc -c
15000
$ cat input.txt | sed 's/./\0\n/g' | sort -uV | tr -d $'\n'
.ACDEFILMNPQSUVabcdefghijlmnopqrstuvx ,;
$ qrencode -t PNG -Sv 40 -r input.txt -o output.png
Input data is too large.
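As a back-of-the-envelope check (my own numbers: 2953 bytes per version-40 symbol in 8-bit byte mode at error correction level L, ignoring the structured-append header), 15000 bytes should fit in about 6 symbols, well under the 16-symbol limit:

#include <stdio.h>

int main(void)
{
    const int input_bytes = 15000;  /* size of input.txt */
    const int per_symbol  = 2953;   /* 8-bit byte capacity, version 40-L */
    /* round up: number of version-40 symbols needed for the whole file */
    printf("%d\n", (input_bytes + per_symbol - 1) / per_symbol);  /* 6 */
    return 0;
}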
I've run some tests, and it seems to work only up to 14179 characters (generating 5 QR codes, output-01.png to output-05.png), but I'm not sure why it fails starting from 14180 characters:
$ tr -dc a-z < /dev/urandom | head -c 14179 > input-14179.txt
$ qrencode -t PNG -Sv 40 -r input-14179.txt -o output.png
$ ls output*.png
output-01.png output-02.png output-03.png output-04.png output-05.png
$ tr -dc a-z < /dev/urandom | head -c 14180 > input-14180.txt
$ qrencode -t PNG -Sv 40 -r input-14180.txt -o output.png
Input data is too large.
The error message comes from the readFile() function in qrenc.c (v4.1.1), and this limitation is caused by the definition of MAX_DATA_SIZE (note that 7090 * 2 = 14180, which exactly matches the observed cutoff):
#define MAX_DATA_SIZE (7090 * 2) /* timed by the safty factor 2 */
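The relevant logic is roughly the following (a paraphrased sketch, not the exact qrenc.c source):

#include <stdio.h>
#include <stdlib.h>

#define MAX_DATA_SIZE (7090 * 2)

/* Sketch of the readFile() behavior: the whole input is read into a buffer
 * of MAX_DATA_SIZE bytes, and if the buffer fills up before end-of-file is
 * reached, the input is rejected. */
static unsigned char *read_input(FILE *fp, int *length)
{
    unsigned char *buffer = malloc(MAX_DATA_SIZE + 1);
    size_t ret;

    if(buffer == NULL) exit(EXIT_FAILURE);
    ret = fread(buffer, 1, MAX_DATA_SIZE, fp);
    if(feof(fp) == 0) {
        /* fread() filled the buffer without reaching EOF, so the file is
         * at least MAX_DATA_SIZE (= 14180) bytes long and gets rejected. */
        fprintf(stderr, "Input data is too large.\n");
        exit(EXIT_FAILURE);
    }
    buffer[ret] = '\0';
    *length = (int)ret;
    return buffer;
}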
It seems that MAX_DATA_SIZE in qrenc.c should be larger; maybe #define MAX_DATA_SIZE (7089 * 16 + 1) would suffice, for example.
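To spell out the reasoning behind that value (my reading, not a tested patch): 7089 is the numeric-mode capacity of a single version-40 symbol and 16 is MAX_STRUCTURED_SYMBOLS, so 7089 * 16 digits is the largest input that could ever be encoded, and the extra byte keeps the buffer-full check from rejecting an input of exactly that size:

/* Suggested value for qrenc.c (untested): 7089 * 16 = 113424 digits is the
 * most that 16 version-40 symbols can hold in numeric mode; the +1 means an
 * input of exactly 113424 bytes no longer fills the read buffer and is
 * therefore accepted. */
#define MAX_DATA_SIZE (7089 * 16 + 1)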