WASM: stack overflow when unmarshalling JSON
Environment
TinyGo version: 0.29.0 (with #3890 patch)
Environment: WASM
Description
We've encountered a stack overflow bug specifically when targeting WASM using TinyGo. Interestingly, we were unable to replicate the issue on desktop targets.
Minimal Reproducible Example
package main

import (
    "encoding/json"
    "fmt"
    "syscall/js"
)

var (
    terminateChannel = make(chan struct{})
)

type Person interface {
}

type JsonResponse struct {
    Generated []Person `json:"generated"`
}

func main() {
    mre := js.Global().Get("Object").New()
    js.Global().Set("mre", mre)
    mre.Set("trigger", js.FuncOf(trigger))
    <-terminateChannel
}

func trigger(this js.Value, args []js.Value) interface{} {
    // removing any level from the JSON below solves the stack overflow issue
    data := `{
        "generated": [
            {
                "metadata": {
                    "person_records": {
                        "errors": [
                            {
                                "with": {
                                    "int": "1004"
                                }
                            }
                        ]
                    }
                }
            }
        ]
    }`
    var response JsonResponse
    err := json.Unmarshal([]byte(data), &response)
    if err != nil {
        panic(err)
    }
    fmt.Println(response) // the program prints this and then experiences a stack overflow
    terminateChannel <- struct{}{}
    return nil
}
From JavaScript, it's invoked as:
const tezpay = globalThis.mre
tezpay.trigger()
Steps to reproduce:
- Compile the provided code using tinygo targeting WASM: tinygo clean && tinygo build -o mre.wasm -target wasm -no-debug mre.go
- Include https://github.com/tinygo-org/tinygo/blob/release/targets/wasm_exec.js in your index.html.
- Run the resulting WASM binary:
const wasmBrowserInstantiate = async (wasmModuleUrl, importObject) => {
  let response = undefined;

  if (!importObject) {
    importObject = {
      env: {
        abort: () => console.log("Abort!")
      }
    };
  }

  if (WebAssembly.instantiateStreaming) {
    response = await WebAssembly.instantiateStreaming(
      fetch(wasmModuleUrl),
      importObject
    );
  } else {
    const fetchAndInstantiateTask = async () => {
      const wasmArrayBuffer = await fetch(wasmModuleUrl).then(response =>
        response.arrayBuffer()
      );
      return WebAssembly.instantiate(wasmArrayBuffer, importObject);
    };
    response = await fetchAndInstantiateTask();
  }

  return response;
};
const go = new Go();
const importObject = go.importObject;
// adjust path to built WASM as needed
const wasmModule = await wasmBrowserInstantiate("/wasm/mre.wasm", importObject);
go.run(wasmModule.instance);
globalThis.mre.trigger(); // triggers the stack overflow
Expected Behavior
The Go code should unmarshal the JSON data without any issues. (Standard Go unmarshals it without a stack overflow.)
Current Behavior
Upon invoking the trigger, the program does print the unmarshaled result. However, it subsequently crashes due to a stack overflow.
Notes
Parsing in main won't trigger the stack overflow, so the issue may be related to the channels or the JS interop. A minimal sketch of that "parse in main" variant is shown below.
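For reference, here is a minimal sketch of that variant (same JSON collapsed onto one line, parsed directly on the main goroutine with no channels or JS interop):

package main

import (
    "encoding/json"
    "fmt"
)

type Person interface {
}

type JsonResponse struct {
    Generated []Person `json:"generated"`
}

func main() {
    data := `{"generated":[{"metadata":{"person_records":{"errors":[{"with":{"int":"1004"}}]}}}]}`
    var response JsonResponse
    if err := json.Unmarshal([]byte(data), &response); err != nil {
        panic(err)
    }
    // per the note above, parsing on the main goroutine does not hit the stack overflow
    fmt.Println(response)
}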
Any help on this would be greatly appreciated!
I tweaked the binary to run without JavaScript (but still WASM):
package main

import (
    "encoding/json"
    "fmt"
)

var (
    terminateChannel = make(chan struct{})
)

type Person interface {
}

type JsonResponse struct {
    Generated []Person `json:"generated"`
}

func main() {
    go trigger()
    <-terminateChannel
}

func trigger() interface{} {
    // removing any level from the JSON below solves the stack overflow issue
    data := `{
        "generated": [
            {
                "metadata": {
                    "person_records": {
                        "errors": [
                            {
                                "with": {
                                    "int": "1004"
                                }
                            }
                        ]
                    }
                }
            }
        ]
    }`
    var response JsonResponse
    err := json.Unmarshal([]byte(data), &response)
    if err != nil {
        panic(err)
    }
    fmt.Println(response) // the program prints this and then experiences a stack overflow
    terminateChannel <- struct{}{}
    return nil
}
Then as we can see, it runs fine with go and with tinygo, but tinygo -target=wasi fails. I believe this is due to a stack overflow, and you can see that increasing the stack size solves the problem. This is becoming a common failure case for people wanting to parse JSON.
~/go/src/github.com/dgryski/bug/json2 $ go run main.go
{[map[metadata:map[person_records:map[errors:[map[with:map[int:1004]]]]]]]}
~/go/src/github.com/dgryski/bug/json2 $ tinygo run main.go
{[map[metadata:map[person_records:map[errors:[map[with:map[int:1004]]]]]]]}
~/go/src/github.com/dgryski/bug/json2 $ tinygo run -target=wasi main.go
{[map[metadata:map[person_records:map[errors:[map[with:map[int:1004]]]]]]]}
panic: runtime error: nil pointer dereference
Error: failed to run main module `/var/folders/5b/_hr1d1fd3qn4t9vtfx5p61wm0000gp/T/tinygo1150388317/main`
Caused by:
    0: failed to invoke command default
    1: error while executing at wasm backtrace:
           0: 0x23082 - runtime.abort
                  at /Users/dgryski/go/src/github.com/tinygo-org/tinygo/src/runtime/runtime_tinygowasm.go:70:6
                - runtime.runtimePanicAt
                  at /Users/dgryski/go/src/github.com/tinygo-org/tinygo/src/runtime/panic.go:71:7
           1: 0x741c - runtime.nilPanic
                  at /Users/dgryski/go/src/github.com/tinygo-org/tinygo/src/runtime/panic.go:134:16
           2: 0x61ec7 - main.trigger
                  at /Users/dgryski/go/src/github.com/dgryski/bug/json2/main.go:51:2
           3: 0x24ce7 - <goroutine wrapper>
                  at /Users/dgryski/go/src/github.com/dgryski/bug/json2/main.go:20:2
           4: 0x7a8 - tinygo_launch
                  at /Users/dgryski/go/src/github.com/tinygo-org/tinygo/src/internal/task/task_asyncify_wasm.S:59
           5: 0x242c3 - (*internal/task.Task).Resume
                  at /Users/dgryski/go/src/github.com/tinygo-org/tinygo/src/internal/task/task_asyncify.go:109:17
                - runtime.scheduler
                  at /Users/dgryski/go/src/github.com/tinygo-org/tinygo/src/runtime/scheduler.go:236:11
                - runtime.run
                  at /Users/dgryski/go/src/github.com/tinygo-org/tinygo/src/runtime/scheduler_any.go:28:11
                - _start
                  at /Users/dgryski/go/src/github.com/tinygo-org/tinygo/src/runtime/runtime_wasm_wasi.go:21:5
    2: wasm trap: wasm `unreachable` instruction executed
error: failed to run compiled binary /var/folders/5b/_hr1d1fd3qn4t9vtfx5p61wm0000gp/T/tinygo1150388317/main: exit status 134
~/go/src/github.com/dgryski/bug/json2 $ tinygo run -target=wasi -stack-size=32KB main.go
{[map[metadata:map[person_records:map[errors:[map[with:map[int:1004]]]]]]]}
I suspect this issue is solved for wasi and wasm in the latest dev branch by increasing the default stack size.
Can you retest with the latest dev?
Will do @deadprogram
@deadprogram, I've tested with the patch on the dev branch. I didn't use the latest dev as I had trouble compiling it, but I checked out the commit from that PR - ff32fbbb4ff266f5f29080479a29d941cad2a57e. Unfortunately, 32K doesn't seem sufficient for our production needs. Our WASM works with 64K, but it's hard to say with certainty whether that's adequate for production.
Wouldn't it be more reasonable to opt for a larger default? I've looked into other implementations, and it's common to find values ranging from 0.5M to 5M.
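In the meantime, our workaround is to pass the larger stack explicitly at build time, along the lines of the following sketch (64KB is the value that happened to work for our app, and this assumes the -stack-size flag shown above for wasi behaves the same way for the wasm target):
tinygo clean && tinygo build -o mre.wasm -target wasm -stack-size=64KB -no-debug mre.go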
From the comment by @dkegel-fastly in the Slack channel, it seems like 64K is the way to go.
https://github.com/emscripten-core/emscripten/pull/18191
I will make another PR bumping it up to 64K.
Another data point: our real-world wasi app needs a stack size of 48KB. That's one more bit of evidence that 64KB is a good default stack size for wasi.
WASI default stack size is now 64K. I think we can tag this as "next-release".
Completed with v0.31.0, so now closing. Thank you!