[gopher] Gopher "robots.txt" (was Re: New V-2 WAIS database)
> >New indexing will resume Sunday evening.
> How can I tell it to skip some parts of my server?
> I don't want it to download the whole 19 GB Debian archive, or run the
> NNTP gateway some thousand times.
Good point. I am actually trying to think of something along the lines of HTTP's
robots.txt that can more or less transparently tell V-2 what to stay out of.
Suggestions?
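As a starting point, here is a minimal sketch of the crawler side, assuming the
server answers a bare "robots.txt" selector at its root. That selector name,
the example host, and the "veronica" agent string are illustrative assumptions
for discussion, not anything V-2 does today:

    import socket

    def fetch_selector(host, selector, port=70):
        """Fetch a selector from a Gopher server: send selector + CRLF,
        then read until the server closes the connection (RFC 1436)."""
        with socket.create_connection((host, port), timeout=10) as sock:
            sock.sendall(selector.encode("ascii") + b"\r\n")
            chunks = []
            while True:
                data = sock.recv(4096)
                if not data:
                    break
                chunks.append(data)
        return b"".join(chunks).decode("ascii", errors="replace")

    def disallowed_prefixes(robots_txt, agent="*"):
        """Collect Disallow: selector prefixes applying to the given
        user-agent. Deliberately simplified: each User-agent line starts
        a new group, and '#' comments are stripped."""
        prefixes, applies = [], False
        for line in robots_txt.splitlines():
            line = line.split("#", 1)[0].strip()
            if not line:
                continue
            field, _, value = line.partition(":")
            field, value = field.strip().lower(), value.strip()
            if field == "user-agent":
                applies = value == "*" or value.lower() == agent.lower()
            elif field == "disallow" and applies and value:
                prefixes.append(value)
        return prefixes

    # Hypothetical usage: fetch the exclusion list once per server, then
    # skip any selector that starts with one of the returned prefixes.
    robots = fetch_selector("gopher.example.org", "robots.txt")
    excluded = disallowed_prefixes(robots, agent="veronica")
    print(excluded)

The server side would be even cheaper: the file is an ordinary type 0 text
document, so existing servers would need no changes at all.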
--
----------------------------- personal page: http://www.armory.com/~spectre/ --
Cameron Kaiser, Point Loma Nazarene University * ckaiser@xxxxxxxxxxxxxxxxxxxx
-- Die, v.: To stop sinning suddenly. -- Elbert Hubbard -----------------------
- [gopher] New V-2 WAIS database, Cameron Kaiser, 2001/01/13
- [gopher] Re: New V-2 WAIS database, Marco d'Itri, 2001/01/13
- [gopher] Gopher "robots.txt" (was Re: New V-2 WAIS database), Cameron Kaiser <=
- [gopher] Re: Gopher "robots.txt" (was Re: New V-2 WAIS database), Marco d'Itri, 2001/01/13
- [gopher] Re: Gopher "robots.txt" (was Re: New V-2 WAIS database), Cameron Kaiser, 2001/01/13
- [gopher] Re: Gopher "robots.txt" (was Re: New V-2 WAIS database), David Allen, 2001/01/13
- [gopher] Re: Gopher "robots.txt" (was Re: New V-2 WAIS database), Cameron Kaiser, 2001/01/13
- [gopher] Re: Gopher "robots.txt" (was Re: New V-2 WAIS database), David Allen, 2001/01/14
- [gopher] Re: Gopher "robots.txt", emanuel at heatdeath organisation, 2001/01/14
- [gopher] Re: Gopher "robots.txt", David Allen, 2001/01/14
- [gopher] Re: Gopher "robots.txt", emanuel at heatdeath organisation, 2001/01/14
- [gopher] Re: Gopher "robots.txt", Cameron Kaiser, 2001/01/14
- [gopher] Re: Gopher "robots.txt", David Allen, 2001/01/14