
Failed to convert data to 'utf8' encoded string

radreamer opened this issue 7 years ago · 3 comments

Hi, I get the following error when trying to readStream an asset from CameraRoll:

Failed to convert data to 'utf8' encoded string, this might due to the source data is not able to convert using this encoding. source = *** -[__NSPlaceholderDictionary initWithObjects:forKeys:count:]: attempt to insert nil object from objects[1]

Environment

Environment: OS: macOS High Sierra 10.13.3 Node: 9.5.0 Yarn: Not Found npm: 5.6.0 Watchman: 4.9.0 Xcode: Xcode 9.2 Build version 9C40b Android Studio: 3.0 AI-171.4443003

Packages: (wanted => installed) react: ^16.2.0 => 16.2.0 react-native: ^0.53.0 => 0.53.0 react-native-fetch-blob: ^0.10.8 => 0.10.8

Sample code

let from = 0
let to = 0
// resp.data.upload_key comes from an earlier upload-initialization response

RNFetchBlob.fs
  .readStream(
    upload.uri, // link to CameraRoll asset (begins with assets-library://)
    'utf8',
    4000000, // buffer size in bytes
    100 // interval between chunk events in ms
  )
  .then(ifstream => {
    ifstream.open()
    ifstream.onData(chunk => {
      from = to
      to = from + chunk.length
      uploadChunked(resp.data.upload_key, chunk, from, to, upload.size)
    })
    ifstream.onError(err => {
      console.log('oops', err)
    })
    ifstream.onEnd(() => {
      uploadFinalize(resp.data.upload_key)
    })
  })

'base64' and 'ascii' work fine, but they are useless for my case.
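For context on why the native side throws here: a CameraRoll asset is binary image data, and arbitrary bytes are generally not a valid UTF-8 sequence, so a strict decoder has to fail. A minimal Node sketch of the same failure mode (using the standard `TextDecoder`, not react-native-fetch-blob itself):

```javascript
// JPEG files start with these magic bytes; 0xff is never a valid
// first byte of a UTF-8 sequence.
const bytes = Buffer.from([0xff, 0xd8, 0xff, 0xe0])

// fatal: true makes decoding throw on invalid input, which is roughly
// what the native iOS NSString conversion does in readStream('utf8').
const decoder = new TextDecoder('utf-8', { fatal: true })

let failed = false
try {
  decoder.decode(bytes)
} catch (e) {
  failed = true
}
console.log(failed) // true
```

'base64' and 'ascii' succeed because every byte maps to *some* output character, but base64 chunks then need decoding before use, and ascii is lossy for binary data.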

radreamer avatar Feb 14 '18 10:02 radreamer

The problem I was trying to solve was multipart uploading of large files, which crashed my application with the following error: NSMallocException: Application threw exception NSMallocException: Failed to grow buffer. While this happened, RAM usage was constantly growing (in the Perf Monitor widget).

I spent a lot of time on this and finally found a solution:

const Blob = RNFetchBlob.polyfill.Blob

Blob.build(RNFetchBlob.wrap(upload.uri)).then(file => {
  const CHUNK_SIZE = 4000000
  let readBytes = 0
  let totalUploaded = 0
  let from = 0
  let to = 0

  const uploadNextChunk = uploaded => {
    totalUploaded += uploaded

    if (upload.size === totalUploaded) {
      // finalize upload here and close the file
      file.close()
    } else {
      // slice the next chunk out of the blob instead of reading
      // the whole file into memory
      const chunk = Math.min(upload.size - readBytes, CHUNK_SIZE)
      const blob = file.slice(readBytes, readBytes + chunk)

      readBytes += chunk

      from = to
      to = from + chunk

      const xhr = new RNFetchBlob.polyfill.XMLHttpRequest()
      xhr.open('POST', upload_url)

      // Content-Range uses inclusive byte indices, hence `to - 1`
      xhr.setRequestHeader('Content-Range', `bytes ${from}-${to - 1}/${upload.size}`)

      xhr.onload = () => {
        uploadNextChunk(to - from)
      }

      xhr.send(blob)
    }
  }

  uploadNextChunk(0)
})

After this I measured RAM again and found that it was no longer growing constantly; instead it grew to some value and then reset.
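The Content-Range bookkeeping in the snippet can be checked in isolation. This is just a sketch with a hypothetical 9 MB total size; the header format is the standard `bytes from-(to-1)/total` with inclusive byte indices:

```javascript
const CHUNK_SIZE = 4000000

// Compute the Content-Range header value for every chunk of a
// file of totalSize bytes.
function chunkRanges(totalSize) {
  const ranges = []
  let from = 0
  while (from < totalSize) {
    const to = Math.min(from + CHUNK_SIZE, totalSize)
    // HTTP Content-Range uses inclusive byte indices, hence `to - 1`
    ranges.push(`bytes ${from}-${to - 1}/${totalSize}`)
    from = to
  }
  return ranges
}

console.log(chunkRanges(9000000))
// [ 'bytes 0-3999999/9000000',
//   'bytes 4000000-7999999/9000000',
//   'bytes 8000000-8999999/9000000' ]
```

Note that the last chunk is shorter than CHUNK_SIZE but still ends exactly at `totalSize - 1`, which is what lets the server detect a complete upload.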

Hope this snippet helps someone.

radreamer avatar Feb 14 '18 16:02 radreamer

Hello, thanks for your snippet. I have a problem where the upload stops before the first chunk. Can you help me figure out why? Does your snippet work fine for your uploads?

JulioOrellana avatar Jun 23 '18 23:06 JulioOrellana

@JulioOrellana hi! Yes, it works well for me. What kind of error do you get? Maybe your server expects different headers.

radreamer avatar Jun 25 '18 07:06 radreamer