Robert Rothenberg
> robots.txt is a file that should be used by crawlers: tools that discover URLs (see https://developers.google.com/search/docs/advanced/robots/robots_txt or https://datatracker.ietf.org/doc/html/draft-koster-rep)
>
> this tool is meant to be used _after_...
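To illustrate the crawler side of that distinction, here is a minimal sketch of a robots.txt check using the CPAN module WWW::RobotRules; the bot name and URLs are placeholders, not anything from the tool under discussion:

```perl
use strict;
use warnings;

use LWP::Simple qw(get);
use WWW::RobotRules;

# Hypothetical crawler name, for illustration only.
my $rules = WWW::RobotRules->new('ExampleBot/1.0');

# Fetch and parse the site's robots.txt (placeholder host).
my $robots_url = 'https://www.example.com/robots.txt';
my $robots_txt = get($robots_url);
$rules->parse($robots_url, $robots_txt) if defined $robots_txt;

# A crawler consults the rules before fetching a discovered URL;
# a tool that runs _after_ discovery has no reason to.
my $url = 'https://www.example.com/private/page.html';
print $rules->allowed($url) ? "fetch $url\n" : "skip $url (disallowed)\n";
```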
This also closes #14
@kenahoo I'm not sure. I think it should get the OS from the object (if it can) and pass that to URI::file. I also realise that I forgot to add...
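For context, a minimal sketch of what passing the OS through to URI::file could look like; the `$os` value and path here are illustrative assumptions, not the module's actual code:

```perl
use strict;
use warnings;

use URI::file;

# Hypothetical: the OS tag would come from the object, if available.
my $os = 'win32';

# URI::file->new takes an optional OS argument that controls how the
# path syntax is interpreted.
my $uri = URI::file->new('C:\\Temp\\report.txt', $os);
print $uri->as_string, "\n";    # file:///C:/Temp/report.txt

# Converting back to a native path for the same OS.
print $uri->file($os), "\n";    # C:\Temp\report.txt
```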
@kenahoo I've added some tests for Win32 as well as POD.
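A minimal sketch of what such a Win32 round-trip test might look like with Test::More; the specific path and expected strings are assumptions, not the tests actually added:

```perl
use strict;
use warnings;

use Test::More;
use URI::file;

# Round-trip a Windows-style path through a file:// URI.
my $path = 'C:\\Temp\\report.txt';
my $uri  = URI::file->new($path, 'win32');

is($uri->as_string,     'file:///C:/Temp/report.txt', 'path encoded as file URI');
is($uri->file('win32'), $path,                        'URI decodes back to Win32 path');

done_testing;
```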
That makes sense.

On Sat, 25 Jun 2022, 20:34, brian d foy wrote:

> I'm making several release but the only thing that's changing is the generated module....
Do you mean "Tcl" instead of Tk? I think Tk has a further complication in that it will use the system-installed version of some libraries, if there is one.
It looks like this might be more complicated than I thought. Ideally it should be letting HTTP::Message etc. do the decoding.
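For illustration, the kind of delegation meant here, as a sketch: HTTP::Response (a subclass of HTTP::Message) already undoes Content-Encoding and charset transformations via `decoded_content`, so callers should not reimplement that. The URL is a placeholder:

```perl
use strict;
use warnings;

use LWP::UserAgent;

my $ua  = LWP::UserAgent->new;
my $res = $ua->get('https://www.example.com/');    # placeholder URL

if ($res->is_success) {
    # decoded_content (from HTTP::Message) handles Content-Encoding
    # (gzip, deflate, ...) and decodes the charset into Perl characters.
    my $html = $res->decoded_content;
    print length($html), " characters\n";
}
```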
FWIW, a website that I maintain blocks fake user agents, e.g. things that claim to be Googlebot when they are not coming from Google's networks. (The site shows OpenGraph data...
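A minimal sketch of the verification involved, using only core Socket functions; the hostname pattern follows Google's documented reverse-then-forward DNS check, and the IP shown is an illustrative documentation-range address:

```perl
use strict;
use warnings;

use Socket qw(inet_aton inet_ntoa AF_INET);

# True if $ip reverse-resolves to a Google hostname AND that hostname
# forward-resolves back to the same IP (reverse + forward confirmation).
sub is_real_googlebot {
    my ($ip) = @_;

    my $host = gethostbyaddr(inet_aton($ip), AF_INET)
        or return 0;    # no PTR record: treat as fake

    return 0 unless $host =~ /\.(?:googlebot|google)\.com$/;

    my $addr = gethostbyname($host) or return 0;
    return inet_ntoa($addr) eq $ip;
}

# Example: a request claiming a Googlebot User-Agent from this IP.
my $ip = '192.0.2.1';    # documentation-range IP; will fail the check
print is_real_googlebot($ip) ? "genuine\n" : "fake Googlebot\n";
```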
@gshank ?