Yao Yue
### Description
Currently there doesn't seem to be a way to add or modify metadata (as a key-value map), either at the file or column level, when exporting a dataframe/series to parquet/arrow...
Add a mock timer for tests that rely on timing, such as the pipe, tcp, and timewheel tests; these occasionally fail due to timing issues.
`tcp_accept` in `ccommon` didn't seem to be handling exceptions correctly. This type of behavior is hard to debug in production and should be captured in unit tests.
https://github.com/twitter/pelikan/pull/236
Subtract 16 bytes (malloc metadata overhead) from the current default size, which is exactly 16 KB.
There are 4 hashtables in Pelikan now, each only very slightly different. In general, if we can create a hashtable template with the following components pluggable, we can serve them...
So we can cap the maximum amount of memory used by each socket.
Investigate a setup where channel type and buffer type are both pluggable: what would the stream setup look like then? Reference: https://github.com/twitter/ccommon/pull/76