Anakin
A high-performance, cross-platform inference engine. Anakin runs on x86 CPUs, ARM, NVIDIA GPUs, AMD GPUs, Bitmain, and Cambricon devices.
Does `entropy_calibratior` provide the same functionality as TensorRT's `Int8EntropyCalibrator2`?
Fixes #548
# Absolute Path Traversal due to incorrect use of `send_file` call A path traversal attack (also known as directory traversal) aims to access files and directories that are stored outside...
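The vulnerability class above can be illustrated independently of any web framework: resolve the user-supplied path against the base directory and reject any result that escapes it. This is a minimal sketch; the helper name `safe_resolve` is illustrative, not part of Anakin or Flask.

```python
from pathlib import Path

def safe_resolve(base_dir: str, user_path: str) -> Path:
    """Resolve user_path under base_dir, rejecting directory traversal.

    A request like "../../etc/passwd" (or an absolute path) resolves
    outside base_dir after normalization and is refused.
    """
    base = Path(base_dir).resolve()
    target = (base / user_path).resolve()
    # Allow base_dir itself or anything strictly beneath it.
    if target != base and base not in target.parents:
        raise ValueError(f"path traversal attempt: {user_path!r}")
    return target
```

Passing the resolved path, rather than the raw request string, to a file-serving call is what closes the hole: `safe_resolve("/srv/static", "css/site.css")` succeeds, while `safe_resolve("/srv/static", "../../etc/passwd")` raises.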
Bumps [protobuf](https://github.com/protocolbuffers/protobuf) from 3.1.0 to 3.15.0. Release notes, sourced from protobuf's releases: Protocol Buffers v3.15.0, Protocol Compiler: optional fields for proto3 are enabled by default, and no longer require the...
As the title says: it is confirmed that the 980 is 10% slower than the 855.
Is there a quantization tool that can convert my pretrained FP32 model to an INT8 model?
Can I get inference speed up with Intel(R) Xeon(R) CPU E5-2650 v4 @ 2.20GHz + INT8?
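For context on the two INT8 questions above, the core idea of post-training quantization can be shown without any framework. This is a generic sketch of symmetric per-tensor INT8 quantization, not Anakin's actual calibration scheme; the function names are illustrative.

```python
def quantize_int8(values):
    """Symmetric per-tensor quantization: map FP32 values into [-127, 127]."""
    max_abs = max(abs(v) for v in values)
    scale = max_abs / 127.0 if max_abs else 1.0
    q = [max(-127, min(127, round(v / scale))) for v in values]
    return q, scale

def dequantize(q, scale):
    """Map INT8 codes back to approximate FP32 values."""
    return [x * scale for x in q]
```

Real tools (e.g. TensorRT's entropy calibrator) choose the scale from the activation distribution rather than the raw max, which is where calibration data comes in; the speedup question then depends on whether the target CPU has INT8-friendly instructions.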
I want to use Anakin's sgemm for matrix multiplication. How fast is it?