
COQUI STT error

AlexNovicov opened this issue

Specs

  • Leon version: 1.0.0-beta.6
  • OS (or browser) version: Ubuntu 22.04
  • Node.js version: v16.15.0
  • Complete "leon check" (or "npm run check") output:

leon@1.0.0-beta.6 check
babel-node scripts/run-check.js

.: CHECKING :.

➡ node --version ✔ v16.15.0

➡ npm --version ✔ 8.5.5

➡ pipenv --version ✔ pipenv, version 2022.4.21

➡ pipenv --where ✔ /home/dean_grey/.leon/bridges/python

➡ pipenv run python --version ✔ Python 3.10.4

➡ pipenv run python bridges/python/main.py scripts/assets/query-object.json ✔ {"package": "leon", "module": "randomnumber", "action": "run", "lang": "en", "input": "Give me a random number", "entities": [], "output": {"type": "end", "codes": ["success"], "speech": 48, "options": {}}}

➡ NLP model state ✔ Found and valid

➡ Amazon Polly TTS ❗ Amazon Polly TTS is not yet configured

➡ Google Cloud TTS/STT ❗ Google Cloud TTS/STT is not yet configured

➡ Watson TTS ❗ Watson TTS is not yet configured

➡ Offline TTS ✔ Found Flite at bin/flite/flite

➡ Watson STT ❗ Watson STT is not yet configured

➡ Offline STT ✔ Found Coqui language model at bin/coqui/huge-vocabulary.scorer

.: REPORT :.

➡ Here is the diagnosis about your current setup
✔ Run
✔ Run modules
✔ Reply you by texting
❗ Amazon Polly text-to-speech
❗ Google Cloud text-to-speech
❗ Watson text-to-speech
✔ Offline text-to-speech
❗ Google Cloud speech-to-text
❗ Watson speech-to-text
✔ Offline speech-to-text

✔ Hooray! Leon can run correctly
➡ If you have some yellow warnings, it is all good. It means some entities are not yet configured

Expected Behavior

Leon should start normally and the offline Coqui STT should initialize without errors.

Actual Behavior

leon start

leon@1.0.0-beta.6 start
cross-env LEON_NODE_ENV=production node ./server/dist/index.js

.: INITIALIZATION :.

✔ The current env is production
✔ The current version is 1.0.0-beta.6
✔ The current language is en-US
✔ The current time zone is Europe/Moscow
✔ Collaborative logger enabled

.: BRAIN :. ✔ New instance

.: NER :. ✔ New instance

.: NLU :. ✔ New instance

.: NLU :.

(node:28569) [FST_MODULE_DEP_FASTIFY-STATIC] FastifyWarning.fastify-static: fastify-static has been deprecated. Use @fastify/[email protected] instead.
(Use node --trace-warnings ... to show where the warning was created)
✔ NLP model loaded

.: INITIALIZATION :. ✔ Server is available at http://localhost:1337

.: REQUESTING :. ➡ GET /

.: REQUESTING :. ➡ GET /assets/index.6bd2b8f1.js

.: REQUESTING :. ➡ GET /assets/vendor.61a2236a.js

.: REQUESTING :. ➡ GET /assets/index.72eb34a6.css

.: REQUESTING :. ➡ GET /api/v1/info

.: GET /INFO :. ✔ Information pulled.

.: REQUESTING :. ➡ GET /assets/mic.d8267246.svg

.: REQUESTING :. ➡ GET /assets/logo.0679f287.svg

.: CLIENT :. ✔ Connected ➡ Type: webapp ➡ Socket id: OvEBynUvxMzfZqAtAAAB

.: ASR :. ✔ New instance

.: STT :. ✔ New instance ➡ Initializing STT...

.: COQUI STT PARSER :.

➡ Loading model from file bin/coqui/model.tflite...
TensorFlow: v2.8.0-8-g06c8fea58fd
Coqui STT: v1.3.0-0-g148fa743
INFO: Created TensorFlow Lite XNNPACK delegate for CPU.
terminate called after throwing an instance of 'lm::FormatLoadException'
  what(): native_client/kenlm/lm/binary_format.cc:230 in void* lm::ngram::BinaryFormat::LoadBinary(std::size_t) threw FormatLoadException because `file_size != util::kBadSize && file_size < total_map'.
Binary file has size 396656640 but the headers say it should be at least 874582710
Error: Command failed with exit code 1: npm start
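Note on the error: the KenLM exception says the scorer file on disk is 396656640 bytes while its own headers declare at least 874582710 bytes, which usually points to an interrupted or partial download rather than an incompatible model. Below is a minimal, hypothetical Python sketch (not part of Leon) to confirm that from the Leon directory; the scorer path is the one reported by "leon check", and the expected size is taken straight from the error message above.

```python
# Minimal sketch: check whether the Coqui scorer on disk is truncated.
# Run from the Leon directory so the relative path matches "leon check".
import os

SCORER_PATH = "bin/coqui/huge-vocabulary.scorer"  # path reported by "leon check"
EXPECTED_MIN_BYTES = 874_582_710                  # minimum size per the KenLM headers

actual = os.path.getsize(SCORER_PATH)
print(f"scorer size on disk: {actual} bytes (headers expect at least {EXPECTED_MIN_BYTES})")
if actual < EXPECTED_MIN_BYTES:
    print("The scorer looks truncated; delete it and download it again.")
else:
    print("The size looks plausible; the file may be corrupted in some other way.")
```

If the check confirms the file is too small, deleting the scorer and re-running whichever setup step originally downloaded it should restore a complete file.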

How Do We Reproduce?

Run "leon start" and open the web app in the browser at http://localhost:1337. The crash happens right after the client connects, when the Coqui STT parser starts loading its model.

AlexNovicov · May 01 '22 09:05

Please try with the latest version to see how it goes; I have the same environment and it works.

louistiti · Sep 10 '22 03:09