To get indexed by search engines you naturally have to let spiders crawl your site, but garbage spiders increase server load and can lead to your content being scraped and republished without permission. This post covers how to set up a robots file in 宝塔 (BT Panel) and how to use Nginx to block garbage crawlers.
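As a minimal sketch of the robots side, a robots.txt in the site root can ask crawlers to stay away. Note this only works for bots that actually honor robots.txt; the bot names below are just examples taken from the block list later in this post:

```
# robots.txt — minimal sketch; only well-behaved crawlers obey these rules.
# Bot names are examples picked from the block list below.
User-agent: SemrushBot
Disallow: /

User-agent: AhrefsBot
Disallow: /

User-agent: MJ12bot
Disallow: /

# Everything else is still allowed to crawl.
User-agent: *
Disallow:
```

Misbehaving bots ignore robots.txt entirely, which is why the Nginx-level blocking below is the part that actually enforces anything.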
Updated 2022-12-11:
Latest list of garbage spiders to block:
(CheckMarkNetwork|Synapse|Nimbostratus-Bot|Dark|scraper|LMAO|Hakai|Gemini|Wappalyzer|masscan|crawler4j|Mappy|Center|eright|aiohttp|MauiBot|Crawler|researchscan|Dispatch|AlphaBot|Census|ips-agent|NetcraftSurveyAgent|ToutiaoSpider|EasyHttp|Iframely|sysscan|fasthttp|muhstik|DeuSu|mstshash|HTTP_Request|ExtLinksBot|package|SafeDNSBot|CPython|SiteExplorer|SSH|MegaIndex|BUbiNG|CCBot|NetTrack|Digincore|aiHitBot|SurdotlyBot|null|SemrushBot|Test|Copied|ltx71|Nmap|DotBot|AdsBot|InetURL|Pcore-HTTP|PocketParser|Wotbox|newspaper|DnyzBot|redback|PiplBot|SMTBot|WinHTTP|Auto Spider 1.0|GrabNet|TurnitinBot|Go-Ahead-Got-It|Download Demon|Go!Zilla|GetWeb!|GetRight|libwww-perl|Cliqzbot|MailChimp|SMTBot|Dataprovider|XoviBot|linkdexbot|SeznamBot|Qwantify|spbot|evc-batch|zgrab|Go-http-client|FeedDemon|JikeSpider|Indy Library|Alexa Toolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|CoolpadWebkit|Java|UniversalFeedParser|ApacheBench|Microsoft URL Control|Swiftbot|ZmEu|jaunty|Python-urllib|lightDeckReports Bot|YYSpider|DigExt|YisouSpider|HttpClient|MJ12bot|EasouSpider|LinkpadBot|Ezooms)
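A common way to apply this list, assuming you can edit the site's Nginx configuration from the panel, is to match the User-Agent header against the alternation above and return 403. This is a sketch, not the exact config from the original post; the file name agent_deny.conf is an assumption, and the regex is abbreviated here, so paste the full list above in its place:

```
# agent_deny.conf — minimal sketch (file name assumed).
# Reject requests whose User-Agent matches any entry in the list above.
# The alternation is abbreviated; substitute the complete list from this post.
if ($http_user_agent ~* "(CheckMarkNetwork|Synapse|SemrushBot|AhrefsBot|MJ12bot|Ezooms)") {
    return 403;
}

# Also reject requests that send an empty User-Agent.
if ($http_user_agent ~ "^$") {
    return 403;
}
```

Because Nginx `if` blocks are only valid inside a server or location context, include this file inside the site's server block (for example with `include agent_deny.conf;`), then reload Nginx for the rules to take effect.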