Python 3
So sorry to ask, but is there any chance this will be ported to Python 3? I would like to incorporate this into the Social Media Macroscope (www.socialmediamacroscope.org), an open-source analytics environment for researching social media data built by academia (I am the PI).
There seem to be quite a few issues apart from its incompatibility with Python 3. The data processing was very memory intensive, so @amirmohammadkz made some major improvements. I forked his repo and made it Python 3 compatible: https://github.com/ichenjia/personality-detection
Thanks for using my forked repo. Were you able to run the whole code, especially the Theano model, on Python 3?
Thanks for converting, Jia! How compute-intensive do you think it would be to have the trained model stored and accessible to run, via either an API call or a direct call to the trained model (if that makes sense)? I'm trying to incorporate this into my open-source social media analytics science gateway (www.socialmediamacroscope.org).
Joseph T. Yun https://www.josephtyun.com
Yeah, I got it running with Python 3. The error about `func_name` was fixed by replacing it with `__name__`.
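In case anyone hits the same thing, this is roughly what the change looks like (the function below is just a stand-in, not code from this repo):

```python
# In Python 2 a function exposed its name as func.func_name; that attribute
# is gone in Python 3, where func.__name__ works (and works in Python 2 too).
def example_layer(x):
    return x

# Python 2 only:
# print(example_layer.func_name)

# Python 3 (and Python 2):
print(example_layer.__name__)  # prints "example_layer"
```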
I am using a 1080 Ti GPU and 48 GB of RAM to train the model. It has been 12 hours and it is still running; it looks like it has completed about half of the training. That said, it is getting gradually slower. I assume it would be a lot slower on CPU.
I plan to put the model in a Flask server and call it via API. Not sure what type of server you have. Once the model is trained, prediction should be a lot faster; the data prep may take a little more time as it has to embed the words.
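Roughly the kind of Flask wrapper I have in mind; the model loading and the prediction step are placeholders here, not code from this repo:

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

# The trained model would be loaded once at startup; placeholder for now.
model = None

@app.route("/predict", methods=["POST"])
def predict():
    data = request.get_json(force=True, silent=True) or {}
    text = data.get("text", "")
    # Real code would run the same preprocessing/embedding used at training
    # time and feed the result to the trained model. Placeholder output:
    traits = {"EXT": None, "NEU": None, "AGR": None, "CON": None, "OPN": None}
    return jsonify({"text_length": len(text), "traits": traits})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```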
By the way, did you figure out how to make predictions yet, @amirmohammadkz? The README only has info regarding data prep and training.
Actually, I had some problems running `conv_net_train.py`; it seems that Theano was incompatible with my system spec on Python 3, so I preferred to use the Python 2.7 version.
Hmm, not sure what's happening there. I used the GPU version and it was fine. What was the error?
Not sure if you saw my earlier question: did you manage to get prediction (given a blob of text) working? Do you have a code snippet showing how to do it?
If you apply exactly the configuration I suggested in my README.md, each epoch takes less than 42 seconds, so every trait needs about 5.83 hours (roughly 500 epochs at that rate). My machine was somewhat weaker than yours (in RAM).
Jia, if you put it on a server and allow API access, would you be interested in letting my open-source science gateway connect to it for prediction? I am a research professor at the University of Illinois Urbana-Champaign, and I built a science gateway for social media research so that academic researchers can analyze social media data without needing coding skills.
Joseph T. Yun https://www.josephtyun.com
Hi Joseph,
Yes, we actually do social media analysis for enterprises for a living, so this is something we can definitely help with as long as it doesn't introduce an exorbitant number of API calls. How about I figure out how to do the prediction first and then put it on a server? Once that's done, I can email you an auth key to call the API. Want to PM me your email address so I can keep you updated?
Jia
I do not remember it precisely. I know that Theano has problems with some hardware and operating systems; it is outdated now and will not get any updates.
About the testing process, I am not sure of the details. The Theano testing process was so complicated that we decided to reimplement the model with Keras and used that instead. But you can get the accuracy of the model on your own test data by replacing the original test set with yours.
About Theano: although Yoshua Bengio said he wouldn't keep supporting it, it looks like a Facebook team is taking over the maintenance. I agree with you that in the long run this package should be rewritten in Keras or Torch. Do you have plans to open-source the Keras version?
Thanks!
J
Yes; my colleague, @saminfatehir, was in charge of that. We will provide it for you and @jtyun in the coming days if you need it.
Thank you both! Jia, I have emailed you from my university email.
Fantastic. I look forward to seeing the Keras version and contributing. Thanks a lot.
You're welcome :) Besides, we have implemented another model for this task which outperforms this one in both performance and accuracy on this dataset. If you are interested in its updates, you can send me a message.
Great. Let me do that now.
The Keras version is now available on my forked repo.
What I can do is train it with Keras on our machine and see how it works out. I will also try to tune it with Python 3.
Stay tuned.
Jia
The Keras code is written in Python 3, and a preprocessing Python file is also added for it. So you can enjoy using it without any modification :)
Check this: https://github.com/laifi/Keras-BigFive-personality-traits
Could you upload the modified code?
Hello, could you add the link or send me the updated code with the improvements?
I get `IndexError: list index out of range`.
We will publish it with the paper soon.
Could you please give us more information? Which Python file and which error line?
Configure is not a terminal command. I meant that you must edit the `~/.theanorc` file. You can open it with nano or any other text editor, for example:
nano ~/.theanorc
More information is here: http://deeplearning.net/software/theano/library/config.html
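For example, a minimal `~/.theanorc` could look like the following; these are just common settings, not the exact configuration from my README, so adjust them to your hardware:

```ini
# Minimal example Theano configuration (assumed values, not from the README).
# Use device = cpu if you have no GPU.
[global]
floatX = float32
device = cuda
```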
If it solved your problem, we will be happy if you star our repo :)
I get "no such file or directory". How do I solve this issue?