WebSurgery: A Tool Suite for Web Application Security Testing

WebSurgery is a suite of tools for security testing of web applications. It was designed for security auditors to help them with the planning and exploitation phases of a web application assessment. It currently includes an efficient, fast and stable Web Crawler, a File/Dir Bruteforcer and a Fuzzer for advanced exploitation of known and unusual vulnerabilities, such as SQL Injection and Cross-Site Scripting (XSS), brute-force attacks against login forms, identification of firewall filtering rules, etc.

WEB Crawler

The WEB Crawler was designed to be fast, accurate, stable and fully parameterizable, and it uses advanced techniques to extract links from JavaScript and HTML tags. It works with configurable timing settings (Timeout, Threading, Max Data Size, Retries) and a number of rule parameters to prevent infinite loops and pointless scanning (Case Sensitive, Dir Depth, Process Above/Below, Submit Forms, Fetch Indexes/Sitemaps, Max Requests per File/Script Parameters). It is also possible to apply custom headers (user agent, cookies etc.) and Include/Exclude Filters. For example, by default the crawler will scan only the initial web service (the URL at the specific port); however, you can change the initial filter “^($protocol)://($hostport)/” to “^(http|https)://([^/]*\.)?test\.com” to cover all sites under a specific domain, using .NET regular expressions (e.g. http://test.com, https://test.com, http://www.test.com, https://something.test.com:9443 etc.). The crawler also has an embedded File/Dir Bruteforcer, which helps to directly brute-force files/dirs in the directories found during crawling.
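
To make the filter example above concrete, here is a minimal Python sketch (purely illustrative, not WebSurgery's code; the URLs and helper name are made up) of how such a widened include filter decides which links the crawler should follow:

    import re

    # Widened include filter: any site under test.com, on any port.
    # (The tool uses .NET regular expressions; Python's re syntax is
    # close enough for this pattern.)
    SCOPE = re.compile(r"^(http|https)://([^/]*\.)?test\.com")

    def in_scope(url):
        """Return True if the crawler should follow this URL."""
        return SCOPE.match(url) is not None

    for url in ["http://test.com/",
                "https://www.test.com/login",
                "https://something.test.com:9443/admin",
                "http://evil.com/test.com/"]:   # wrong host: skipped
        print(url, "->", "crawl" if in_scope(url) else "skip")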

WEB Bruteforcer

The WEB Bruteforcer is a brute forcer for files and directories within the web application, which helps to identify its hidden structure. It is also multi-threaded and fully parameterizable, both for timing settings (Timeout, Threading, Max Data Size, Retries) and for rules (Headers, Base Dir, Brute Force Dirs/Files, Recursive, File Extensions, Send GET/HEAD, Follow Redirects, Process Cookies and List Generator configuration). By default, it will brute-force recursively from the root/base dir, for both files and directories. It sends both HEAD and GET requests as needed (HEAD to identify whether the file/dir exists, and then GET to retrieve the full response).
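
The HEAD-then-GET strategy is easy to picture with a small sketch. The following Python snippet (an illustration of the technique only, assuming the requests library and a made-up target and wordlist, not WebSurgery's implementation) uses a cheap HEAD probe to filter out obvious misses and fetches the full response only for likely hits:

    from concurrent.futures import ThreadPoolExecutor
    import requests

    BASE = "http://test.com"             # hypothetical target
    WORDS = ["admin", "backup", "old"]   # tiny stand-in wordlist
    EXTS = ["", ".php", ".bak"]          # "" also probes directories

    def probe(path):
        url = BASE + "/" + path
        try:
            # Cheap existence check first (no response body)...
            head = requests.head(url, timeout=5, allow_redirects=False)
            if head.status_code == 404:
                return
            # ...then GET only the likely hits for the full response.
            resp = requests.get(url, timeout=5, allow_redirects=False)
            print(head.status_code, url, len(resp.content), "bytes")
        except requests.RequestException:
            pass   # retries/backoff omitted for brevity

    candidates = [w + e for w in WORDS for e in EXTS]
    with ThreadPoolExecutor(max_workers=10) as pool:
        pool.map(probe, candidates)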

WEB Fuzzer

The WEB Fuzzer is a more advanced tool that creates a number of requests based on one initial request. The fuzzer has no limits: it can be used to exploit known vulnerabilities such as (blind) SQL Injection, and in more unusual ways, such as identifying improper input handling and firewall/filtering rules.
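
The core idea, deriving many requests from one initial request, can be sketched in a few lines of Python (again only an illustration with a made-up target and payloads, not the tool's implementation): a placeholder in the template request is replaced by each payload in turn, and differences between the responses are compared:

    import requests

    TEMPLATE = "http://test.com/item.php?id=FUZZ"   # hypothetical initial request
    PAYLOADS = ["1", "1'", "1 OR 1=1", "1 AND SLEEP(5)"]

    for payload in PAYLOADS:
        url = TEMPLATE.replace("FUZZ", payload)
        try:
            resp = requests.get(url, timeout=10)
            # Differences in status code, length or timing across
            # payloads often reveal improper input handling or a
            # filtering/firewall rule.
            print(resp.status_code, len(resp.content),
                  round(resp.elapsed.total_seconds(), 2), repr(payload))
        except requests.RequestException as exc:
            print("error", repr(payload), exc)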

Download

NJ Ouchn

"Passion is needed for any great work, and for the revolution, passion and audacity are required in big doses"