
Problem with inserting large binary data. TinyTds::Error: Write to the server failed

Open DarkSideF opened this issue 8 years ago • 4 comments

Hello!

I have a problem with inserting large binary data (varbinary(max)).

When I try to insert large binary data into a table row, I get the errors TinyTds::Error: Write to the server failed and TinyTds::Error: Read from the server failed, depending on the file size.

Here is sample code. In this example I use repeated hex characters instead of real file content, but I get exactly the same errors with genuine file content passed as 0x#{::File.read(input_file.path)}.

require 'tiny_tds'

def create_binary_ct(filename, content)
  client = TinyTds::Client.new(
    host: 'host', port: 4444, database: 'dbname',
    username: 'sa', password: 'pwd', timeout: 20000
  )
  new_id = SystemHelper.generate_uuid
  sql = <<~SQL
    INSERT INTO [dvsys_binaries] ([ID], [Type], [Data])
    VALUES (
      '#{new_id}',
      '#{/\.[^\.]*$/.match(filename)}',
      #{content}
    )
  SQL
  client.execute(sql).each { |e| puts e }
end


# This example will result in an error `TinyTds::Error: Write to the server failed`
create_binary_ct('example_filename.iso', "0x#{'ac'*99000000}")

# This example will result in an error `TinyTds::Error: Read from the server failed`
create_binary_ct('example_filename.iso', "0x#{'ac'*67000000}")

# This example works fine
create_binary_ct('example_filename.iso', "0x#{'ac'*66000000}")

Initially I hit this problem with ActiveRecord, using the activerecord-sqlserver-adapter gem. Uploading a file smaller than 70 MB works perfectly, but a file around 90 MB raises an error.

module DVCore
  class Binary < ApplicationRecord
    self.table_name = 'dvsys_binaries'
    self.primary_key = 'ID'

    before_create :assign_id

    def self.create_binary(filename, content)
      DVCore::Binary.create!(
        Type: /\.[^\.]*$/.match(filename).to_s,
        Data: content
      )
    end

    protected

    def assign_id
      self.ID ||= SystemHelper.generate_uuid
    end
  end
end

DVCore::Binary.create_binary('example_larger_than_90mb.zip', ::File.read(tempfile.path))

This code results in the following error:

ActiveRecord::StatementInvalid: TinyTds::Error: DBPROCESS is dead or not enabled: IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION
	from /home/darksidef/.rvm/gems/ruby-2.4.0/gems/activerecord-sqlserver-adapter-5.0.5/lib/active_record/connection_adapters/sqlserver/database_statements.rb:278:in `execute'
	from /home/darksidef/.rvm/gems/ruby-2.4.0/gems/activerecord-sqlserver-adapter-5.0.5/lib/active_record/connection_adapters/sqlserver/database_statements.rb:278:in `raw_connection_do'
	from /home/darksidef/.rvm/gems/ruby-2.4.0/gems/activerecord-sqlserver-adapter-5.0.5/lib/active_record/connection_adapters/sqlserver/database_statements.rb:219:in `block in do_execute'
	from /home/darksidef/.rvm/gems/ruby-2.4.0/gems/activerecord-5.0.2/lib/active_record/connection_adapters/abstract_adapter.rb:589:in `block in log'
	from /home/darksidef/.rvm/gems/ruby-2.4.0/gems/activesupport-5.0.2/lib/active_support/notifications/instrumenter.rb:21:in `instrument'
	from /home/darksidef/.rvm/gems/ruby-2.4.0/gems/activerecord-5.0.2/lib/active_record/connection_adapters/abstract_adapter.rb:583:in `log'
	from /home/darksidef/.rvm/gems/ruby-2.4.0/gems/activerecord-sqlserver-adapter-5.0.5/lib/active_record/connection_adapters/sqlserver/database_statements.rb:219:in `do_execute'
	from /home/darksidef/.rvm/gems/ruby-2.4.0/gems/activerecord-sqlserver-adapter-5.0.5/lib/active_record/connection_adapters/sqlserver/database_statements.rb:66:in `exec_rollback_db_transaction'
	from /home/darksidef/.rvm/gems/ruby-2.4.0/gems/activerecord-5.0.2/lib/active_record/connection_adapters/abstract/database_statements.rb:285:in `rollback_db_transaction'
	from /home/darksidef/.rvm/gems/ruby-2.4.0/gems/activerecord-5.0.2/lib/active_record/connection_adapters/abstract/query_cache.rb:17:in `rollback_db_transaction'
	from /home/darksidef/.rvm/gems/ruby-2.4.0/gems/activerecord-5.0.2/lib/active_record/connection_adapters/abstract/transaction.rb:138:in `rollback'
	from /home/darksidef/.rvm/gems/ruby-2.4.0/gems/activerecord-5.0.2/lib/active_record/connection_adapters/abstract/transaction.rb:183:in `rollback_transaction'
	from /home/darksidef/.rvm/gems/ruby-2.4.0/gems/activerecord-5.0.2/lib/active_record/connection_adapters/abstract/transaction.rb:192:in `rescue in within_new_transaction'
	from /home/darksidef/.rvm/gems/ruby-2.4.0/gems/activerecord-5.0.2/lib/active_record/connection_adapters/abstract/transaction.rb:209:in `within_new_transaction'
	from /home/darksidef/.rvm/gems/ruby-2.4.0/gems/activerecord-5.0.2/lib/active_record/connection_adapters/abstract/database_statements.rb:232:in `transaction'
	from /home/darksidef/.rvm/gems/ruby-2.4.0/gems/activerecord-5.0.2/lib/active_record/transactions.rb:211:in `transaction'
	from /home/darksidef/.rvm/gems/ruby-2.4.0/gems/activerecord-5.0.2/lib/active_record/transactions.rb:392:in `with_transaction_returning_status'
	from /home/darksidef/.rvm/gems/ruby-2.4.0/gems/activerecord-5.0.2/lib/active_record/transactions.rb:324:in `save!'
	from /home/darksidef/.rvm/gems/ruby-2.4.0/gems/activerecord-5.0.2/lib/active_record/suppressor.rb:45:in `save!'
	from /home/darksidef/.rvm/gems/ruby-2.4.0/gems/activerecord-5.0.2/lib/active_record/persistence.rb:51:in `create!'
	from /home/darksidef/SuperApp/app/models/dv_core/dv_core/binary.rb:9:in `create_binary'
	from (irb):3

Table definition from schema.rb:

  create_table "dvsys_binaries", primary_key: "ID", id: :uuid, default: -> { "newsequentialid()" }, force: :cascade do |t|
    t.ss_timestamp "FullTextTimeStamp",                                 null: false
    t.nchar        "Type",              limit: 10,                      null: false
    t.binary       "Data",              limit: 2147483647, default: ""
    t.binary       "StreamData",        limit: 2147483647
    t.index ["FullTextTimeStamp"], name: "dvsys_binaries_fulltexttimestamp"
  end

  add_foreign_key "dvsys_files", "dvsys_binaries", column: "BinaryID", primary_key: "ID", name: "dvsys_files_fk_binaryid"

I already tried changing some SQL Server configs, like network packet size, but it didn't help.

The app runs on Ubuntu with Rails 5.0.2, Ruby 2.4.0, and TinyTDS 1.1.0, connected to a SQL Server 2014 database. I tried updating to Rails 5.1 and TinyTDS 1.3.0, but it did not help.

Can you help me figure out how to fix this problem, please?

DarkSideF commented Jun 02 '17 14:06

Great bug report. I can confirm the same behavior on TinyTDS master with the latest FreeTDS, platform Darwin. Here is the code I wrote up in the TinyTDS test suite to play with this. Those binary zips are just some folders of screenshots. Note too how I used .unpack("H*")[0] to format the binary string properly for a SQL Server statement.

@client.execute("EXEC sp_configure 'show advanced options', 1;").do
@client.execute("RECONFIGURE;").do
@client.execute("EXEC sp_configure 'network packet size', 6500;").do
@client.execute("RECONFIGURE;").do
# data-53mb.zip
# data-85mb.zip
# data-117mb.zip
# data-200mb.zip
data = File.read 'test/schema/data-85mb.zip', mode: 'rb:BINARY'
insert = "INSERT INTO [datatypes] ([varbinary_max]) VALUES (0x#{data.unpack("H*")[0]})"
@client.execute(insert).insert

I was able to insert the 53 MB file without issue, but could not do the 85 MB file due to the same error. You can see how I tried changing the network packet size too, to no avail. I referenced these links:

  • https://stackoverflow.com/questions/9201354/how-many-characters-long-may-an-sql-server-sql-statement-be
  • https://docs.microsoft.com/en-us/sql/sql-server/maximum-capacity-specifications-for-sql-server
  • https://docs.microsoft.com/en-us/sql/database-engine/configure-windows/configure-the-network-packet-size-server-configuration-option

Maybe it is time to do a little research in the FreeTDS mailing lists to see if this has been reported?

metaskills commented Jun 09 '17 01:06

I had a similar issue when trying to download large binary data from a DB: the files would get truncated when downloaded. It turned out to be a config issue in FreeTDS; the text size setting was too small, and I ended up setting text size = 4294967295. I would take a look at freetds.conf. I found a page that lists the settings, but found that the defaults may not be correct.
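For reference, a minimal sketch of what that looks like in freetds.conf (placement under [global] is my assumption; the value is the one quoted above):

[global]
  # maximum size, in bytes, of text/image/binary values FreeTDS will return;
  # if this is too small, large reads come back truncated
  text size = 4294967295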

tdurkee commented Jun 09 '17 14:06

I can confirm that SET TEXTSIZE 314572800 (which is 300 MB) does not help here, mainly because TEXTSIZE governs select/return sizes and what we are debugging is the insert. I did some more testing to watch the memory growth across the various steps: reading the binary data, unpacking it, and sending it through to execute. On my machine that story goes like so for a 56 MB file:

  • 16MB - Base
  • 66MB - File Read
  • 279MB - Unpack File IO for SQL Insert
  • 386MB - Execute Insert
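For anyone who wants to reproduce those numbers, a minimal sketch (the rss_mb helper and the file path are hypothetical; RSS is read via ps, so this assumes Linux or macOS):

def rss_mb
  # resident set size of the current process, in MB (`ps` reports KB)
  `ps -o rss= -p #{Process.pid}`.to_i / 1024
end

puts "Base:        #{rss_mb}MB"
data = File.read('test/schema/data-56mb.zip', mode: 'rb:BINARY')
puts "File read:   #{rss_mb}MB"
hex = data.unpack('H*')[0]
puts "Unpacked:    #{rss_mb}MB"
@client.execute("INSERT INTO [datatypes] ([varbinary_max]) VALUES (0x#{hex})").insert
puts "Insert done: #{rss_mb}MB"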

My first guess was to do a little research into both our code and FreeTDS. The two main points would be rb_tinytds_execute and dbcmd. Links here:

  • https://github.com/rails-sqlserver/tiny_tds/blob/master/ext/tiny_tds/client.c#L233
  • https://github.com/FreeTDS/freetds/blob/master/src/dblib/dblib.c#L1356

We are using StringValueCStr to provide dbcmd with the command string. That is then put into the statement buffer, and I suspect something is happening inside FreeTDS. We should ask them whether there are limits in their code. I tried reading through their buffering code, but I'm no expert here; in theory, I do not see any limits on their buffer. So I'm all ears as to what others think.

metaskills commented Jun 18 '17 02:06

Hello! I had a problem with retrieving large binary data from a varbinary(max) column: I didn't receive the full content, and it looked like only the first chunk came back. I read about the TEXTSIZE setting. Then I took a look at the activerecord-sqlserver-adapter gem and found this line: https://github.com/rails-sqlserver/activerecord-sqlserver-adapter/blob/99f1fafeaea2e4982510291025b9b559265534ee/lib/active_record/connection_adapters/sqlserver_adapter.rb#L398 This setting solved my problem with retrieving the binary data.
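For readers hitting the same truncation, a minimal sketch of applying that session setting yourself from Rails (2147483647 is SET TEXTSIZE's documented maximum; not necessarily the exact value the adapter uses):

# raise the per-session cap on returned text/image/varbinary data;
# without it, large varbinary(max) reads can come back truncated
ActiveRecord::Base.connection.execute('SET TEXTSIZE 2147483647')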

mmichaa commented Oct 09 '17 13:10