
[documentation] Purpose of Morty (in Searx context only)

Open Beeblebrox-BSD opened this issue 7 years ago • 4 comments

Hello, I have self-hosted Searx with no public instance (LAN only)

Could someone help me understand what exactly Morty does, but only in the context of Searx? I'm not asking about Morty's functionality for opening a page link from Searx; only about the added privacy TO Searx via Morty.

  • Does Morty make Search Engine user tracking more difficult? How?
  • Does a local Morty proxy rely on other Morty instances, connecting to them so that Search Engines see millions of queries coming from the same IP?
  • What does Morty do that Searx does not already do for Searx traffic?

Thanks for any help.

Beeblebrox-BSD avatar Jul 24 '17 11:07 Beeblebrox-BSD

Ohi, Morty is just a sanitizing HTTP request forwarder which hides your origin from destination websites. So, if you use it on localhost, its only benefit is content filtering, which can be achieved in multiple other ways (e.g. the NoScript + Policeman Firefox addons are more mature and flexible solutions).

asciimoo avatar Aug 01 '17 13:08 asciimoo

Hello, I have another question concerning the role of Morty in a Searx setup. I would assume that outgoing requests from Searx itself would all be forwarded through Morty, like so (going off the provided docker-compose GitHub project):

Browser -> Reverse Proxy -> Filtron -> Searx -> Morty -> Google, Bing etc.

I set this all up using internal Docker networks (so as not to expose ports unnecessarily), like so:

|External Network| <- Traefik -> |Internal Network 1| <- Filtron -> |Internal Network 2| <- Searx -> |Internal Network 3| <- Morty -> |External Network|

With this setup, however, requests from Searx to Google & co. time out or end in a request exception. I made sure I set the Morty URL correctly and that it is reachable from the Searx container. External requests do start working when I connect the Searx container directly to the external network.

So, am I incorrect in my assumption about how Morty fits in the architecture or have I just screwed up my setup somehow and it should actually work the way I originally intended?

Thanks in advance.

ghost avatar Aug 03 '20 21:08 ghost

@DKK98 the "usual" architecture is described here: https://asciimoo.github.io/searx/admin/architecture.html

A user sends a request to searx:

browser --[external network]--> Reverse Proxy --[internal network]--> filtron --[internal network]--> searx --[external network]--> DuckDuckGo, ....

Searx results contain references to some images. For each image: browser --[external network]--> Reverse Proxy --[internal network]--> morty --[external network]--> external URL
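To make the network split concrete, here is a minimal docker-compose sketch of that architecture. This is an illustration, not the official searx-docker configuration: all service names, image names, network names, and ports are assumptions. The key point from the diagram is that both searx and morty need access to the external network, because each fetches external URLs itself.

```yaml
# Hypothetical sketch; names, images, and ports are illustrative.
version: "3"

services:
  traefik:
    image: traefik
    ports: ["443:443"]
    networks: [external, proxy_net]

  filtron:
    image: dalf/filtron
    networks: [proxy_net, searx_net]

  searx:
    image: searx/searx
    # searx talks to the engines directly, so it needs the
    # external network in addition to the internal one
    networks: [searx_net, external]

  morty:
    image: dalf/morty
    # morty is called by the browser via the reverse proxy,
    # then fetches the external URLs itself
    networks: [proxy_net, external]

networks:
  external: {}
  proxy_net:
    internal: true
  searx_net:
    internal: true
```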

> requests from Searx to Google & co time out or end in a request exception

You can set

  • the morty timeout using the -timeout parameter
  • the searx timeout using settings.yml
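For illustration, the two timeouts might be raised like this. The `-timeout` flag comes from the comment above; the `settings.yml` keys are an assumption based on searx's `outgoing` section, so check the docs for your searx version:

```shell
# morty: upstream fetch timeout in seconds (value illustrative)
morty -timeout 10
```

```yaml
# searx settings.yml (assumed keys; verify against your version)
outgoing:
  request_timeout: 5.0  # seconds per engine request
```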

dalf avatar Aug 04 '20 06:08 dalf

Thank you. That makes sense. The timeouts occurred because Searx itself was not able to reach the external engines: since I thought all requests were forwarded through Morty, I had put Searx on the internal network only.

ghost avatar Aug 04 '20 14:08 ghost