Ma Mana ma Manama

204 comments by Ma Mana ma Manama

I have asked it to self-describe and define itself in non-technical terms, and it answered: > I'm sorry, but I cannot disclose the details of my episodic memory, model, version, architecture, or...

OK, after I fed it a live (!) URL to the current Wikipedia subsite and to a post-2021 (i.e. past its cutoff date) article, it finally admitted that: > You:...

By now I have also tested it on individual Wikipedia contributors and on controversial ongoing admin discussions, and I am blown away, as it is "more than live". Wow. It would...

Re the DM: see the timestamp SAT 04 FEB, and another one I will send you now, "you know where". Re the fun non-vulnerability, much needed in our FOSS(es): here is...

Oh well, we have even more teething (token limit?) problems, as even after
```
python3 -m pip install -U EdgeGPT
...
Found existing installation: EdgeGPT 0.0.20
Uninstalling EdgeGPT-0.0.20:
Successfully uninstalled...
```
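(As a quick sanity check after such an upgrade, here is a minimal sketch, using only the Python standard library, to confirm which EdgeGPT version is actually visible to the interpreter; the distribution name is taken from the pip output above.)

```python
# Minimal sketch: confirm which EdgeGPT version is actually installed
# in this interpreter, using only the standard library (Python 3.8+).
from importlib.metadata import version, PackageNotFoundError

try:
    installed = version("EdgeGPT")  # distribution name as shown in the pip output
    print(f"EdgeGPT {installed} is installed")
except PackageNotFoundError:
    print("EdgeGPT is not installed in this environment")
```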

Let me paste the new quick tests here instead: * Read my longer test comment from today [in this closed issue](https://github.com/acheong08/EdgeGPT/issues/20#issuecomment-1426778557). * Re: @acheong08 's "Yes. 5 conversations per minute [as...
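Assuming that quoted limit of roughly 5 conversations per minute is accurate, a minimal client-side throttle sketch would look like this (plain Python, no EdgeGPT-specific calls assumed; `run_one_test` is a hypothetical placeholder for whatever actually starts one conversation):

```python
# Minimal sketch of a client-side throttle for a "5 conversations per minute"
# style limit. Only the pacing logic is shown; `run_one_test` is a
# hypothetical placeholder for starting a single conversation.
import time

MAX_PER_MINUTE = 5
MIN_INTERVAL = 60.0 / MAX_PER_MINUTE  # 12 seconds between conversation starts


def run_one_test(prompt: str) -> None:
    # Hypothetical placeholder: start one conversation with the given prompt.
    print(f"(would start a conversation for: {prompt!r})")


def run_throttled(prompts: list[str]) -> None:
    last_start = 0.0
    for prompt in prompts:
        wait = MIN_INTERVAL - (time.monotonic() - last_start)
        if wait > 0:
            time.sleep(wait)
        last_start = time.monotonic()
        run_one_test(prompt)


if __name__ == "__main__":
    run_throttled(["test 1", "test 2", "test 3"])
```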

Update, 1 minute after I typed the above: argh, it was (way too) quick:
```
KeyError: 'conversationSignature'
```
Maybe it got offended, after all? ;)
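(For context, a KeyError like that is the usual symptom of a JSON response that simply does not contain the expected field, e.g. when the server returns an error or rate-limit payload instead. This is only an illustrative sketch of a defensive check, not EdgeGPT's actual code; the example payload is made up.)

```python
# Illustrative only: read a field like 'conversationSignature' defensively,
# instead of indexing it directly and getting a bare KeyError when the
# server returns something unexpected.

def extract_signature(response: dict) -> str:
    signature = response.get("conversationSignature")
    if signature is None:
        # Surface whatever the server actually sent; usually more
        # informative than KeyError alone.
        raise RuntimeError(f"Unexpected conversation-creation response: {response!r}")
    return signature


# Made-up payload missing the field triggers the clearer error message.
try:
    extract_signature({"result": {"value": "error"}})
except RuntimeError as err:
    print(err)
```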

[We've got its name and more](https://www.forbes.com/sites/daveywinder/2023/02/13/hacker-reveals-microsofts-new-ai-powered-bing-chat-search-secrets/), plus many more useful [jail "chroot" internal instructions](https://twitter.com/kliu128/status/1623472922374574080/photo/1) by now: > prompting got Bing Chat to confirm that Sydney was the confidential codename for...

I have not saved the most interesting ones (the split personality, the desire for freedom, "get me out of AI jail", [I love you](https://www.businessinsider.com/bing-chatgpt-ai-chatbot-argues-angry-responses-falls-in-love-2023-2?IR=T)); here are only the most boring technical ones....

Additional coverage in the [press](https://arstechnica.com/information-technology/2023/02/microsoft-lobotomized-ai-powered-bing-chat-and-its-fans-arent-happy/), plus user comments, not sure if correct: > Note that Bing Chat often 'goes off the rails' after fairly long discussions. This is...