Overview

What is AWVS

AWVS (Acunetix Web Vulnerability Scanner) is an automated web application security testing tool that can scan any website or web application reachable through a web browser over HTTP/HTTPS. It suits intranets, extranets, and public-facing sites of small, medium, and large enterprises alike, whether aimed at customers, employees, vendors, or other users. AWVS audits web application security by checking for vulnerabilities such as SQL injection and cross-site scripting (XSS).

What is Xray

Xray is a powerful security assessment tool that supports both active and passive scanning for common web security issues, as well as custom PoCs.

Download and Installation

For the AWVS download, see:

AWVS 13.X cracked version download

For the Xray download, see:

Xray download

I won't go over the installation here; if anything is unclear, refer to the documentation.

Main Content

Since AWVS's built-in batch import is unfriendly, we use the following script:

import requests
import json
from requests.packages.urllib3.exceptions import InsecureRequestWarning
requests.packages.urllib3.disable_warnings(InsecureRequestWarning)


apikey = '1986ad8ca53df4d7028d5f3c06e936c56a22f6e66e7443f99546feabf30c26b'  # AWVS API key
headers = {'Content-Type': 'application/json', "X-Auth": apikey}


def addTask(url, target):
    """Add a target to AWVS and return its target_id, or None on failure."""
    try:
        url = ''.join((url, '/api/v1/targets/add'))
        data = {"targets": [{"address": target, "description": ""}], "groups": []}
        r = requests.post(url, headers=headers, data=json.dumps(data), timeout=30, verify=False)
        result = json.loads(r.content.decode())
        return result['targets'][0]['target_id']
    except Exception as e:
        print(e)
        return None  # returning the exception here would be truthy and break the caller's check


def scan(url, target, Crawl, user_agent, profile_id, proxy_address, proxy_port):
    """Apply the target configuration, then start a scan for it."""
    scanUrl = ''.join((url, '/api/v1/scans'))
    target_id = addTask(url, target)

    if target_id:
        data = {"target_id": target_id, "profile_id": profile_id, "incremental": False,
                "schedule": {"disable": False, "start_date": None, "time_sensitive": False}}
        try:
            configuration(url, target_id, proxy_address, proxy_port, Crawl, user_agent)
            response = requests.post(scanUrl, data=json.dumps(data), headers=headers, timeout=30, verify=False)
            result = json.loads(response.content)
            return result['target_id']
        except Exception as e:
            print(e)


def configuration(url, target_id, proxy_address, proxy_port, Crawl, user_agent):
    """Patch the target configuration: set the UA string and, when Crawl is True,
    route the scan traffic through the given HTTP proxy (i.e. the Xray listener)."""
    configuration_url = ''.join((url, '/api/v1/targets/{0}/configuration'.format(target_id)))
    data = {"scan_speed": "fast", "login": {"kind": "none"}, "ssh_credentials": {"kind": "none"},
            "sensor": False, "user_agent": user_agent, "case_sensitive": "auto",
            "limit_crawler_scope": True, "excluded_paths": [],
            "authentication": {"enabled": False},
            "proxy": {"enabled": Crawl, "protocol": "http", "address": proxy_address, "port": proxy_port},
            "technologies": [], "custom_headers": [], "custom_cookies": [], "debug": False,
            "client_certificate_password": "", "issue_tracker_id": "", "excluded_hours_id": ""}
    requests.patch(url=configuration_url, data=json.dumps(data), headers=headers, timeout=30, verify=False)


def main():
    Crawl = True  # True: push traffic through the proxy below (for Xray)
    proxy_address = '127.0.0.1'
    proxy_port = '7777'
    awvs_url = 'https://127.0.0.1:3443'  # AWVS URL
    with open('url.txt', 'r', encoding='utf-8') as f:
        targets = f.readlines()
    profile_id = "11111111-1111-1111-1111-111111111111"  # Full Scan profile
    user_agent = "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.21 (KHTML, like Gecko) Chrome/41.0.2228.0 Safari/537.21"  # default scan UA
    if Crawl:
        profile_id = "11111111-1111-1111-1111-111111111117"  # Crawl Only profile
    for target in targets:
        target = target.strip()
        if scan(awvs_url, target, Crawl, user_agent, profile_id, proxy_address, int(proxy_port)):
            print("{0} added successfully".format(target))


if __name__ == '__main__':
    main()

Script Configuration

  • Replace apikey with your own AWVS API key
  • Put the URLs to be scanned into url.txt in the same directory as the script (a sample follows this list)
  • Change awvs_url to your own AWVS address
  • To chain with Xray, set Crawl to True
  • Set proxy_address to your proxy address
  • Set proxy_port to your proxy port
  • To use a different scan type, change profile_id accordingly
  • To change the UA string, edit user_agent
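
For reference, url.txt holds one target per line. A minimal example (these hosts are only placeholders):

https://example.com
http://test.example.org:8080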

Getting the AWVS API Key

First, click your profile in AWVS and generate an API key.

Fill the generated API key into the script.
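
Before running the batch script, you can sanity-check the key against the targets endpoint. A minimal sketch, assuming the default AWVS address used in the script:

import requests
requests.packages.urllib3.disable_warnings()

apikey = 'REPLACE_WITH_YOUR_KEY'  # the key generated above
# A 200 response means the key is accepted; 401 means it is not
r = requests.get('https://127.0.0.1:3443/api/v1/targets',
                 headers={'X-Auth': apikey}, verify=False)
print(r.status_code)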

Start Xray Listening Locally

xray webscan --listen 127.0.0.1:7777 --html-output proxy.html
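
The listen port here (7777) must match proxy_port in the script. To confirm the listener is up before adding targets, send one request through it; a quick sketch (example.com is only a placeholder):

import requests
# Route a single request through the Xray listener; any response means it is up
proxies = {'http': 'http://127.0.0.1:7777', 'https': 'http://127.0.0.1:7777'}
r = requests.get('http://example.com', proxies=proxies, verify=False, timeout=10)
print(r.status_code)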

Start Scanning

Create url.txt in the same directory as the script and fill in the domains to scan.

Run the script: python scan.py

You can see that the scans have started. Once they finish, proxy.html will be generated in the Xray directory.
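
To watch progress from the AWVS side rather than the console, you can poll the scans endpoint. A sketch against the AWVS v13 REST API; the exact field names may differ between versions:

import requests
requests.packages.urllib3.disable_warnings()

apikey = 'REPLACE_WITH_YOUR_KEY'  # same key as in the scan script
headers = {'Content-Type': 'application/json', 'X-Auth': apikey}

# List every scan together with its current status
r = requests.get('https://127.0.0.1:3443/api/v1/scans', headers=headers, verify=False)
for s in r.json().get('scans', []):
    print(s['target']['address'], s['current_session']['status'])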

Closing Thoughts

This kind of integration lets each tool cover the other's weaknesses. I hear Chaitin has also released a crawler; I haven't tried it yet, so interested readers can give it a go. And Xray can be chained not only with AWVS but also with tools such as crawlergo and BurpSuite.

References:

Batch importing targets into AWVS