I have made some improvements to my crawler that open up some interesting ideas I have planned for AuraGem Search. For at least 2 years now, my search engine has had a way of detecting which pages can be used as gemsub feeds and which cannot. With slight changes to my crawler, it can now query the db for a list of all URLs that are considered feeds and crawl only the internal links from those pages - that is, only the non-cross-host links of those feed pages. This will let me run a constantly updated feed aggregator on top of my search engine, with no censorship and no requirement to submit a URL.
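The internal-link rule described above amounts to resolving each link against the feed page's URL and keeping only those that stay on the same host. Here is a minimal sketch of that filter in Go; the function name and example URLs are illustrative, not the crawler's actual code.

```go
package main

import (
	"fmt"
	"net/url"
)

// sameHostLinks keeps only links that stay on the same host as the feed
// page, mirroring the "crawl only non-cross-host links" rule. Relative
// links are resolved against the feed page's URL first.
func sameHostLinks(feedURL string, links []string) []string {
	base, err := url.Parse(feedURL)
	if err != nil {
		return nil
	}
	var internal []string
	for _, l := range links {
		u, err := base.Parse(l) // resolves relative links against the feed page
		if err != nil {
			continue
		}
		if u.Host == base.Host {
			internal = append(internal, u.String())
		}
	}
	return internal
}

func main() {
	links := []string{
		"/posts/update.gmi",              // relative: same host, kept
		"gemini://example.com/about.gmi", // absolute, same host: kept
		"gemini://other.example/b.gmi",   // cross-host: dropped
	}
	fmt.Println(sameHostLinks("gemini://example.com/feed.gmi", links))
}
```

Resolving relative links before comparing hosts matters: a bare `/posts/update.gmi` line in a gemtext feed has no host of its own until it is interpreted against the feed page's URL.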
8 months ago · maxheadroom
Note: If you want your pages to not be crawled by my search engine, be sure to use a robots.txt · 8 months ago