Ibrahim Sharaf
Dockerfile
```
## Dockerfile to build DeepQ&A container image
FROM python:3.5.2

## Dependencies
RUN \
    apt-get -qq -y update && apt-get -y install unzip
RUN \
    pip3 install -U nltk...
```
deploy.yml
```
version: '2'
services:
  web:
    image: deepqa:latest
    ports:
      - "8000:8000"
    environment:
      - PYTHONUNBUFFERED=0
    volumes:
      - ${DEEPQA_WORKDIR}/logs:/home/ibrahimsharaf/workspace/DeepQA/chatbot_website/logs
      - ${DEEPQA_WORKDIR}/save:/home/ibrahimsharaf/workspace/DeepQA/save
      - ${DEEPQA_WORKDIR}/data:/home/ibrahimsharaf/workspace/DeepQA/data
    depends_on:
      - redis
  redis:
    image: redis
```
I've run `docker build -t deepqa:latest .` in this directory: `/home/ibrahimsharaf/workspace/DeepQA/`
I did, using the folder path I created with `./data_dirs.sh`
It returns this error when I use deploy.yml as is:
```
web_1 | Training samples not found. Creating dataset...
web_1 | Constructing full dataset...
web_1 | Unhandled exception in thread...
```
@jopasserat can you help?
Hi @sloria, I am willing to help on Arabic (if needed).
@avinashsai could you open a PR including the first two issues, as in this previous PR (https://github.com/ibrahimsharaf/doc2vec/pull/6)?
Hi @dheerajgattupalli, thanks for your collaboration; there's no need to save the train/test data into separate files on disk.
It would take a dataset path, read it into a `pandas` DataFrame, split it into train/test sets using `sklearn`'s `train_test_split` method, use the training data to train `doc2vec` and then the classifier, use...
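A minimal sketch of that pipeline, assuming a gensim `Doc2Vec` model feeding a scikit-learn `LogisticRegression` classifier and a CSV with hypothetical `text` and `label` columns (the actual column names, parameters, and classifier in the repo may differ):

```
import pandas as pd
from gensim.models.doc2vec import Doc2Vec, TaggedDocument
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score


def run_pipeline(dataset_path):
    # Read the dataset into a pandas DataFrame (column names are assumptions).
    df = pd.read_csv(dataset_path)

    # Split into train/test sets in memory, without writing files to disk.
    x_train, x_test, y_train, y_test = train_test_split(
        df["text"], df["label"], test_size=0.2, random_state=42
    )

    # Train doc2vec on the training texts only.
    train_docs = [
        TaggedDocument(words=text.split(), tags=[i])
        for i, text in enumerate(x_train)
    ]
    d2v = Doc2Vec(vector_size=100, min_count=2, epochs=20)
    d2v.build_vocab(train_docs)
    d2v.train(train_docs, total_examples=d2v.corpus_count, epochs=d2v.epochs)

    # Infer document vectors and train the classifier on them.
    train_vecs = [d2v.infer_vector(text.split()) for text in x_train]
    test_vecs = [d2v.infer_vector(text.split()) for text in x_test]
    clf = LogisticRegression(max_iter=1000)
    clf.fit(train_vecs, y_train)

    # Evaluate on the held-out test split.
    print("Test accuracy:", accuracy_score(y_test, clf.predict(test_vecs)))


run_pipeline("dataset.csv")  # hypothetical dataset path
```

Keeping the split in memory like this avoids the intermediate train/test files entirely; only the dataset path is needed as input.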