WebServiceDataTransfer exportData getBytesWritten() always returns zero
StreamTransferConsumer consumer = new StreamTransferConsumer() {
    @Override
    public void fetchRow(DBCSession session, DBCResultSet resultSet) throws DBCException {
        super.fetchRow(session, resultSet);
        if (fileSizeLimit != null && getBytesWritten() > fileSizeLimit.longValue()) {
            throw new DBQuotaException("Data export quota exceeded", QUOTA_PROP_FILE_LIMIT,
                fileSizeLimit.longValue(), getBytesWritten());
        }
    }
};
getBytesWritten() always returns zero.
Am I right that you are trying to export a big table from CloudBeaver? In that case, please try changing the export limit. If I'm wrong, could you please tell me:
- steps to reproduce the issue, expected and actual results;
- your CloudBeaver version;
- your browser version.
- reproduce the issue: while testing the table data export, I noticed that WebServiceDataTransfer first checks the size of the exported table data through the output stream. However, I found that the reported stream size (bytesWritten) kept returning zero until the data transfer was completed, as shown in the code (a standalone sketch illustrating this behaviour follows after this report):
WebServiceDataTransfer
StreamTransferConsumer consumer = new StreamTransferConsumer() {
    @Override
    public void fetchRow(DBCSession session, DBCResultSet resultSet) throws DBCException {
        super.fetchRow(session, resultSet);
        // bytesWritten always returns zero
        log.info("bytesWritten:" + getBytesWritten());
        if (fileSizeLimit != null && getBytesWritten() > fileSizeLimit.longValue()) {
            throw new DBQuotaException("Data export quota exceeded", QUOTA_PROP_FILE_LIMIT,
                fileSizeLimit.longValue(), getBytesWritten());
        }
    }
};
...
consumer.finishTransfer(monitor, false);
StreamTransferConsumer
protected long getBytesWritten() {
    // statStream.getBytesWritten() always returns zero here
    return statStream == null ? 0 : statStream.getBytesWritten();
}
StatOutputStream
public class StatOutputStream extends OutputStream {

    private final OutputStream stream;
    private long bytesWritten = 0;

    public StatOutputStream(OutputStream stream) {
        this.stream = stream;
    }

    public long getBytesWritten() {
        return bytesWritten;
    }

    @Override
    public void write(int b) throws IOException {
        stream.write(b);
        bytesWritten++;
    }

    @Override
    public void write(@NotNull byte[] b) throws IOException {
        stream.write(b);
        bytesWritten += b.length;
    }

    @Override
    public void write(@NotNull byte[] b, int off, int len) throws IOException {
        stream.write(b, off, len);
        bytesWritten += len;
    }

    @Override
    public void flush() throws IOException {
        stream.flush();
    }

    @Override
    public void close() throws IOException {
        stream.close();
    }
}
- expected: if fileSizeLimit != null && getBytesWritten() > fileSizeLimit.longValue(), the quota check should throw a DBQuotaException:
if (fileSizeLimit != null && getBytesWritten() > fileSizeLimit.longValue()) {
    throw new DBQuotaException("Data export quota exceeded", QUOTA_PROP_FILE_LIMIT,
        fileSizeLimit.longValue(), getBytesWritten());
}
- actual results: bytesWritten always returns zero, so the DBQuotaException is never thrown;
- CloudBeaver version: 21.3.3;
- browser version: 21.3.3.
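For what it's worth, the behaviour described above is consistent with buffering between the exporter and the counting stream: if the rows are written through a buffered writer sitting on top of StatOutputStream, the counter only advances once that buffer is flushed, which may not happen until the transfer finishes. This is an assumption about the cause, not something confirmed from the CloudBeaver sources. The following standalone sketch (plain JDK classes only, not CloudBeaver code) shows a minimal counting stream reporting zero until the buffered layer above it is flushed:

import java.io.BufferedWriter;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.io.OutputStreamWriter;
import java.nio.charset.StandardCharsets;

public class BufferedCountingDemo {

    // Minimal counting stream, same idea as StatOutputStream above.
    static class CountingOutputStream extends OutputStream {
        private final OutputStream delegate;
        private long bytesWritten;

        CountingOutputStream(OutputStream delegate) {
            this.delegate = delegate;
        }

        long getBytesWritten() {
            return bytesWritten;
        }

        @Override
        public void write(int b) throws IOException {
            delegate.write(b);
            bytesWritten++;
        }

        @Override
        public void write(byte[] b, int off, int len) throws IOException {
            delegate.write(b, off, len);
            bytesWritten += len;
        }
    }

    public static void main(String[] args) throws IOException {
        CountingOutputStream counter = new CountingOutputStream(new ByteArrayOutputStream());
        BufferedWriter writer = new BufferedWriter(
            new OutputStreamWriter(counter, StandardCharsets.UTF_8));

        // Simulate writing exported rows: the bytes sit in the writer's buffers.
        for (int row = 0; row < 100; row++) {
            writer.write("row-" + row + ",some,exported,data\n");
        }
        // Still prints 0: nothing has reached the counting stream yet.
        System.out.println("before flush: " + counter.getBytesWritten());

        // Only after a flush (or close) does the counter reflect the real size.
        writer.flush();
        System.out.println("after flush:  " + counter.getBytesWritten());
    }
}

If this is indeed what happens inside the exporter, a quota check that reads getBytesWritten() during fetchRow() would only see real numbers after the buffered layer is flushed, or if the count were taken at the writer level instead of the underlying stream.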
Thank you for the explanation. We are going to investigate and fix the issue.
The bug is fixed and the fix is available in the latest 22.2.0 release.
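As a general note for anyone implementing a similar size quota on top of a counting stream: flushing the buffered layer before consulting the counter lets the check see the real byte count, at the cost of an extra flush per check. The sketch below is only an illustrative pattern with hypothetical names (ExportQuotaWriter, writeRow); it is not the actual CloudBeaver fix.

import java.io.BufferedWriter;
import java.io.IOException;

// Illustrative sketch only; ExportQuotaWriter and its wiring are hypothetical,
// not part of CloudBeaver's API or its actual fix. StatOutputStream is the
// counting stream shown earlier in this thread.
class ExportQuotaWriter {
    private final BufferedWriter out;        // buffered layer the rows go through
    private final StatOutputStream counter;  // counting stream underneath it
    private final long byteLimit;

    ExportQuotaWriter(BufferedWriter out, StatOutputStream counter, long byteLimit) {
        this.out = out;
        this.counter = counter;
        this.byteLimit = byteLimit;
    }

    void writeRow(String row) throws IOException {
        out.write(row);
        out.newLine();
        // Flush so the bytes actually reach the counting stream before the check;
        // without this, the counter can lag by up to a full buffer of data.
        out.flush();
        if (counter.getBytesWritten() > byteLimit) {
            throw new IOException("Data export quota exceeded: "
                + counter.getBytesWritten() + " > " + byteLimit + " bytes");
        }
    }
}

Flushing on every row can be expensive for large exports; flushing every N rows, or tracking the written size at the writer level, trades a little precision for throughput.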