NeuralTripleTranslation
Getting started
Hello Yue Liu,
I'm interested in trying out your model for a problem I'm trying to solve, where I need to link a natural-language database to a semantic layer using RDF triples. So far I've done the following:
- Downloaded the pretrained embedding vectors (Word2Vec model): https://wikipedia2vec.github.io/wikipedia2vec/pretrained/ (a sketch of how I load these and read my input follows the list below)
- Prepared the input data (train/test) in the following format:
  - keikavus was born in 1021 dbpedia.org/property/birthDate http://dbpedia.org/resource/Keikavus 1021
  - tora torbergsdatter (old norse: þóra þorbergsdóttir), born 1025 dbpedia.org/property/birthDate http://dbpedia.org/resource/Tora_Torbergsdatter 1025
- Ran s2s_preprocess.py to generate the pkl file for training.
- Prepared the support model:
  - Trained the knowledge graph embedding (TransE model)
  - Built the cpp/python libraries (ran make.sh)
  - Ran dbpedia_train_transe.py
- Ran s2s_validate_graph_embredding.py to get *_graph.pkl
- Ran s2s_train_graph.py or s2s_end2end.py
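For reference, this is roughly how I load the pretrained vectors and read one line of my prepared input. The file name, the tab separator, and the column order (sentence, predicate URI, subject URI, object) are assumptions on my side, and I'm using gensim here rather than anything from your repo:

```python
import csv
from gensim.models import KeyedVectors

# Pretrained Wikipedia2Vec vectors in word2vec text format
# (placeholder file name; I use whichever dump I downloaded).
word_vectors = KeyedVectors.load_word2vec_format("enwiki_word2vec_100d.txt", binary=False)

# Each prepared line: sentence, predicate URI, subject URI, object
# (assumed tab-separated).
with open("train.tsv", newline="", encoding="utf-8") as f:
    for sentence, predicate, subject, obj in csv.reader(f, delimiter="\t"):
        tokens = sentence.lower().split()
        # Tokens missing from the pretrained vocabulary would need an <unk> fallback.
        known = [t for t in tokens if t in word_vectors]
        print(subject, predicate, obj, known[:5])
```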
The last step is where I'm having trouble: when running s2s_train_graph.py, line 70 (feed_dict["decoder_target"] = np.array(y, dtype=np.int32)) fails because the lists appended to the variable y still contain string tokens (e.g. ['keikavus', 'was', 'born', 'in', '1021', 1, 0]), which cannot be cast to int32.
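For what it's worth, here is a minimal reproduction of the failure and of what I suspect line 70 expects, namely that every token has already been replaced by its vocabulary index. The vocab dict below is just a placeholder I made up, not your actual dictionary:

```python
import numpy as np

# What I currently see: y mixes word tokens with integer markers, so the cast fails.
y = [['keikavus', 'was', 'born', 'in', '1021', 1, 0]]
# np.array(y, dtype=np.int32)  # raises ValueError: invalid literal for int()

# What I assume the feed expects: tokens already mapped to vocabulary ids.
vocab = {'<pad>': 0, '<eos>': 1, 'keikavus': 2, 'was': 3, 'born': 4, 'in': 5, '1021': 6}
y_ids = [[vocab.get(tok, 0) if isinstance(tok, str) else tok for tok in seq] for seq in y]
decoder_target = np.array(y_ids, dtype=np.int32)
print(decoder_target)  # [[2 3 4 5 6 1 0]]
```

My guess is that my preprocessing did not convert the target tokens into ids, but I can't tell which step should have done that.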
Any idea or suggestion as to where I might have gone wrong?
Best Regards, Axel Bender.