Kali Linux: Using the skipfish Web Security Scanner

0x00. Introduction to skipfish

skipfish is an open-source web application security assessment tool released by Google.

Key characteristics of skipfish: low CPU usage, fast scanning (it can comfortably handle 2,000 requests per second), and a low false-positive rate.

1x00. Using skipfish

1x01 Help Information

root@kali:~# skipfish --help
skipfish web application scanner - version 2.10b
Usage: skipfish [ options ... ] -W wordlist -o output_dir start_url [ start_url2 ... ]

Authentication and access options:
  -A user:pass      - use specified HTTP authentication credentials
  -F host=IP        - pretend that 'host' resolves to 'IP'
  -C name=val       - append a custom cookie to all requests
  -H name=val       - append a custom HTTP header to all requests
  -b (i|f|p)        - use headers consistent with MSIE / Firefox / iPhone
  -N                - do not accept any new cookies
  --auth-form url   - form authentication URL
  --auth-user user  - form authentication user
  --auth-pass pass  - form authentication password
  --auth-verify-url - URL for in-session detection

Crawl scope options:
  -d max_depth      - maximum crawl tree depth (16)
  -c max_child      - maximum children to index per node (512)
  -x max_desc       - maximum descendants to index per branch (8192)
  -r r_limit        - max total number of requests to send (100000000)
  -p crawl%         - node and link crawl probability (100%)
  -q hex            - repeat probabilistic scan with given seed
  -I string         - only follow URLs matching 'string'
  -X string         - exclude URLs matching 'string'
  -K string         - do not fuzz parameters named 'string'
  -D domain         - crawl cross-site links to another domain
  -B domain         - trust, but do not crawl, another domain
  -Z                - do not descend into 5xx locations
  -O                - do not submit any forms
  -P                - do not parse HTML, etc, to find new links

Reporting options:
  -o dir            - write output to specified directory (required)
  -M                - log warnings about mixed content / non-SSL passwords
  -E                - log all HTTP/1.0 / HTTP/1.1 caching intent mismatches
  -U                - log all external URLs and e-mails seen
  -Q                - completely suppress duplicate nodes in reports
  -u                - be quiet, disable realtime progress stats
  -v                - enable runtime logging (to stderr)

Dictionary management options:
  -W wordlist       - use a specified read-write wordlist (required)
  -S wordlist       - load a supplemental read-only wordlist
  -L                - do not auto-learn new keywords for the site
  -Y                - do not fuzz extensions in directory brute-force
  -R age            - purge words hit more than 'age' scans ago
  -T name=val       - add new form auto-fill rule
  -G max_guess      - maximum number of keyword guesses to keep (256)
  -z sigfile        - load signatures from this file

Performance settings:
  -g max_conn       - max simultaneous TCP connections, global (40)
  -m host_conn      - max simultaneous connections, per target IP (10)
  -f max_fail       - max number of consecutive HTTP errors (100)
  -t req_tmout      - total request response timeout (20 s)
  -w rw_tmout       - individual network I/O timeout (10 s)
  -i idle_tmout     - timeout on idle HTTP connections (10 s)
  -s s_limit        - response size limit (400000 B)
  -e                - do not keep binary responses for reporting

Other settings:
  -l max_req        - max requests per second (0.000000)
  -k duration       - stop scanning after the given duration h:m:s
  --config file     - load the specified configuration file

Send comments and complaints to <heinenn@google.com>.
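As the usage line above shows, -o (output directory) and -W (read-write wordlist) are the two required arguments. Below is a minimal first-run sketch. The target address is a hypothetical lab host, and the dictionary path assumes the location Kali typically uses (/usr/share/skipfish/dictionaries/); adjust both to your environment. The bundled dictionary is copied first because -W expects a wordlist that skipfish may write learned keywords back into.

# Hypothetical lab target and paths; adjust to your own environment.
# The bundled dictionary is copied because -W must be writable.
cp /usr/share/skipfish/dictionaries/minimal.wl /root/minimal.wl
# skipfish creates the report directory given to -o.
skipfish -o /root/sf-report -W /root/minimal.wl http://192.168.56.101/
# When the scan finishes, open /root/sf-report/index.html in a browser to view the report.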

1x02 Common Commands and Options

• skipfish -o test [url]  # "test" is the directory where the scan results are written
• skipfish -o test @url.txt  # read the scan targets from the list file url.txt
• skipfish -o test -S complete.wl -W abc.wl [url]  # -S loads a read-only supplemental wordlist, -W specifies a read-write wordlist (worked example below)
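As a concrete illustration of the second and third forms above, the sketch below scans a list of targets from url.txt, loads the bundled complete.wl as a read-only supplement, and keeps newly learned keywords in a separate writable file. All hostnames and file paths are hypothetical; the dictionary path again assumes Kali's usual /usr/share/skipfish/dictionaries/ location.

# url.txt holds one start URL per line (hypothetical targets).
cat > url.txt << 'EOF'
http://192.168.56.101/
http://192.168.56.102/app/
EOF
# Start with an empty writable wordlist; skipfish stores learned keywords here.
touch learned.wl
skipfish -o test -S /usr/share/skipfish/dictionaries/complete.wl -W learned.wl @url.txt

Keeping the supplied dictionary read-only behind -S and pointing -W at a separate file keeps the original dictionary clean while still letting skipfish accumulate site-specific keywords across scans.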

 -I        only follow URLs containing 'string'
 -X        exclude URLs containing 'string'
 -K        do not fuzz parameters with the given name
 -D        crawl cross-site links into another domain
 -l        maximum number of requests per second
 -m        maximum concurrent connections per target IP
 --config  load the specified configuration file
(A combined example using several of these options follows below.)
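Putting several of the scope and performance options above together, the sketch below restricts the crawl to one path, skips logout links, leaves a session parameter un-fuzzed, follows links into a second domain, and throttles the scan. Every hostname, parameter name, and limit here is illustrative, and the writable wordlist is the one from the previous example.

# Hypothetical target and values; tune them to the engagement's scope.
# -I keeps the crawl inside /shop/, -X skips any URL containing "logout",
# -K leaves the csrf_token parameter un-fuzzed, -D also follows links into
# api.example.com, and -l/-m throttle the scan to 50 req/s and 5 connections per IP.
skipfish -o report2 -W learned.wl -I /shop/ -X logout -K csrf_token \
         -D api.example.com -l 50 -m 5 http://www.example.com/shop/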


Original article: https://www.cnblogs.com/iAmSoScArEd/p/10409288.html
