[Pdweb] puredata.info robots.txt disallows everything
Hans-Christoph Steiner
hans at at.or.at
Mon Jun 28 20:01:54 CEST 2010
On Jun 28, 2010, at 4:38 AM, IOhannes m zmoelnig wrote:
> On 2010-06-15 20:33, Hans-Christoph Steiner wrote:
>>
>> I'd like to change the robots.txt for puredata.info so that it allows
>> everything, and search engines can index it. Are we going by lazy
>> consensus on this?
>>
>
> i would prefer if robots would not constantly pull 4GB of data.
> this is mainly relevant for media data (patches, movies,
> presentations, ...), which is probably not so relevant for
> searchbots either.
>
> so i would consent to a robots.txt that does not allow everything for
> everyone but rather restricts bots to text pages.
>
> fmasdr
> IOhannes
I'm sure the robots themselves are not interested in constantly
pulling down 4GB files; I imagine they try to avoid that.
I'm fine with a text-only robots.txt, but it seems not worth the
effort, and I wouldn't know how to do it. I do know that I have many
gigs of files on my own website, and robots hit it constantly. My
site is hosted on my home internet connection, and I have never
noticed a problem with robots.
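
For what it's worth, a text-only policy could look something like the
sketch below. The paths and extensions are hypothetical placeholders
(I don't know how puredata.info lays out its media), and note that the
`*` wildcard, `$` end-anchor, and `Allow` directive are extensions
honored by the major search engines rather than part of the original
1994 robots.txt convention:

```
# Sketch only: let crawlers index text pages but keep them off large
# media files. Extensions/paths here are placeholders, not actual
# puredata.info layout.
User-agent: *
Disallow: /*.mov$
Disallow: /*.avi$
Disallow: /*.zip$
Disallow: /*.tar.gz$
Allow: /
```

Crawlers that don't understand the wildcard extensions would fall back
to treating unknown lines conservatively, so media directories could
also be listed by plain path prefix for broader compatibility.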
.hc
----------------------------------------------------------------------------
I spent 33 years and four months in active military service and during
that period I spent most of my time as a high class muscle man for Big
Business, for Wall Street and the bankers. - General Smedley Butler