Choosing a DAST Solution: What to Pay Attention To

Most of today's Web applications contain dangerous vulnerabilities, and it is hard to analyze their security without a dynamic scanner. DAST (Dynamic Application Security Testing) tools let you detect and evaluate security problems quickly. Let me tell you what to look for when choosing such a tool.

According to various studies, around 70% of vulnerabilities stem from errors in the code. By exploiting vulnerabilities in your web application code, hackers can distribute malware, launch cryptojacking attacks, redirect users to phishing and other malicious sites, remotely compromise users' devices, or steal personal data with the help of social engineering techniques.

Of course, it is impossible to create perfectly secure software, but it is quite possible to reduce the number of vulnerabilities and raise the overall level of product security. To do this, you can rely on DevSecOps, a process that links development and security, in which software is checked and tested for vulnerabilities at every stage of its creation.

DevSecOps is a broad process that can include numerous information security tools. In this post, I want to talk about DAST and how to choose the right scanner for dynamic application analysis. Together we will figure out which tool characteristics and parameters deserve attention, and what types of products are currently available on the market.

What is DAST, and how does it work?

Dynamic application security testing is a secure development practice in which a deployed, running application is analyzed automatically. The dynamic scanner checks all access points over HTTP, simulates external attacks that exploit common vulnerabilities, and emulates various user actions. The tool determines which APIs the service exposes, sends verification requests, and, where possible, uses incorrect data (quotes, delimiters, special characters, and more).

The dynamic scanner sends and analyzes a large number of requests. Analyzing each request and the response it receives, and comparing them against a regular (baseline) request, allows you to find various security problems.
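
To make the idea concrete, here is a minimal Python sketch of the send-and-compare approach: one well-formed request, one with deliberately broken input, and a crude comparison of the responses. The target URL and the "q" parameter are hypothetical, and a real scanner uses far richer payloads and heuristics.

```python
# A minimal sketch of the "send, compare, and flag" idea behind active checks.
# The target URL and the "q" parameter are hypothetical placeholders.
import requests

TARGET = "http://localhost:8000/search"   # assumed test application

def probe(param_value: str) -> requests.Response:
    """Send one GET request with the given parameter value."""
    return requests.get(TARGET, params={"q": param_value}, timeout=10)

baseline = probe("hello")           # a normal, well-formed request
malformed = probe("hello'\"<>;--")  # quotes, delimiters, special characters

# Compare the two responses: large deviations (server errors, database error
# text, big changes in page size) are hints that something deserves a closer look.
suspicious = (
    malformed.status_code >= 500
    or ("sql" in malformed.text.lower() and "error" in malformed.text.lower())
    or abs(len(malformed.text) - len(baseline.text)) > 1000
)
print("Potential issue detected" if suspicious else "No obvious deviation")
```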

Most scanners have similar functions and modus operandi. Their main components are a crawler and an analyzer.

The crawler traverses every link on every page it can reach, examining the contents of files, clicking buttons, and going through a dictionary of possible page names. This process allows you to estimate the size of the attack surface and the possible attack vectors, taking into account all the ways of interacting with the application.

The analyzer checks the application directly. It can work in passive or active mode. In the first case, the analyzer studies only information that the crawler sends to it. In the second, the analyzer sends requests with incorrect data to the points found by the crawler and to other places that are not currently present on the pages but which can be used in the application. It then infers the presence of a vulnerability based on the server's responses.
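
Below is a toy Python sketch of this crawler/analyzer split: a crawler that follows links it finds in HTML and hands each response to a passive check. The start URL is a placeholder, and real crawlers also execute JavaScript, submit forms, and brute-force likely page names.

```python
# A toy illustration of the crawler/analyzer split; the start URL is hypothetical.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
import requests

class LinkCollector(HTMLParser):
    """Collect href attributes from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url: str, limit: int = 20):
    """Breadth-first crawl that yields (url, response) pairs to the analyzer."""
    host = urlparse(start_url).netloc
    seen, queue = set(), [start_url]
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        resp = requests.get(url, timeout=10)
        yield url, resp
        parser = LinkCollector()
        parser.feed(resp.text)
        for link in parser.links:
            absolute = urljoin(url, link)
            if urlparse(absolute).netloc == host:   # stay on the same host
                queue.append(absolute)

def passive_check(url: str, resp: requests.Response) -> None:
    """Passive analyzer: only inspects what the crawler already fetched."""
    if "X-Content-Type-Options" not in resp.headers:
        print(f"[passive] {url}: missing X-Content-Type-Options header")

for url, resp in crawl("http://localhost:8000/"):
    passive_check(url, resp)
```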

What should you pay attention to when choosing a DAST tool?

  • Scan quality

Scan quality is the ratio of found to missed vulnerabilities. It is impossible to tell right away how well a scanner analyzes an application. To judge this, you should have at least a rough idea of what vulnerabilities may be present and compare your estimate with the scan results. There are several ways to evaluate a tool:

  1. If you have an application and have already checked it for vulnerabilities through a bug bounty program or penetration testing, you can compare those results with the scanner's output.
  2. If there is no application yet, you can use deliberately vulnerable software, which is usually created for training purposes (OWASP Juice Shop is one well-known example). Look for an application that is close to your own development in terms of technology stack.

The number of false positives plays a decisive role when assessing scan quality. Too many false positives clutter the results, and real issues can get lost among them. To determine how well the tool scans, analyze the report, go through the findings, and calculate the number and proportion of false positives.
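
Here is a rough way to turn that comparison into numbers, assuming you already have a list of confirmed vulnerabilities from a pentest or bug bounty. The two sets below are illustrative placeholders.

```python
# Score a scan report against known ground truth (e.g., confirmed pentest findings).
known_vulns = {"SQLi /login", "XSS /search", "IDOR /orders/{id}"}
reported    = {"SQLi /login", "XSS /search", "XSS /contact", "Open redirect /out"}

true_positives  = reported & known_vulns
false_positives = reported - known_vulns
missed          = known_vulns - reported

precision = len(true_positives) / len(reported)     # how "clean" the report is
recall    = len(true_positives) / len(known_vulns)  # how much the scanner found

print(f"False positives: {len(false_positives)} ({1 - precision:.0%} of the report)")
print(f"Missed vulnerabilities: {len(missed)} (recall {recall:.0%})")
```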

  • Crawling

If there is no information about the application and you need to analyze it from scratch, it is important to understand how many paths and transitions the tool can collect; that is, how thorough the crawling will be. To do this, look at the DAST product's settings. Find out whether it can monitor requests from the front end to the back end; parse API definitions such as Swagger (OpenAPI) or WSDL; or find links in HTML and JavaScript. It is also worth studying how the tool obtains information about the application. Before scanning, you can, for example, find out which APIs are used; this helps you understand what the tool needs to perform a full scan of the application. When choosing a scanner, it is helpful to make a list of what each tool can import and see whether it can be built into the development process.
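
As a simple illustration, here is a small Python sketch that reads an OpenAPI document (the file name and base URL are placeholders) and enumerates the documented endpoints so they can be fed to the scanner rather than discovered by crawling alone.

```python
# Seed a scan from an API definition instead of relying on crawling alone.
import json

with open("openapi.json") as f:          # assumed OpenAPI 3.x document
    spec = json.load(f)

base_url = "http://localhost:8000"        # assumed deployment of the service

endpoints = []
for path, methods in spec.get("paths", {}).items():
    for method in methods:
        if method.lower() in {"get", "post", "put", "patch", "delete"}:
            endpoints.append((method.upper(), base_url + path))

# These (method, URL) pairs can be fed to the scanner so it probes every
# documented API route, not just whatever the crawler happens to find.
for method, url in endpoints:
    print(method, url)
```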

  • Scan speed

This parameter is also important, especially if checks are integrated into the development process. Scanning can slow down the process and, as a result, lead to a waste of time and money. Scan speed largely depends on how quickly the application responds to requests, how many simultaneous connections it can handle, and several other factors. Therefore, in order to compare the speeds of different DAST tools, you need to run them with the same software under approximately the same conditions.
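
One way to keep such a comparison fair is a small harness that launches each scanner against the same target from the same machine and records wall-clock time; the command lines below are placeholders for whatever CLI each tool actually exposes.

```python
# Time each scanner against the same target under the same conditions.
# The command lines are hypothetical placeholders.
import subprocess
import time

SCANNERS = {
    "tool_a": ["tool-a-cli", "--target", "http://localhost:8000"],
    "tool_b": ["tool-b-cli", "--url", "http://localhost:8000"],
}

for name, cmd in SCANNERS.items():
    start = time.perf_counter()
    subprocess.run(cmd, check=False)      # same target, same machine, same build
    elapsed = time.perf_counter() - start
    print(f"{name}: {elapsed / 60:.1f} minutes")
```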

  • Advanced settings

Automatic analysis tools must offer detailed settings. These let you drop unnecessary requests and limit the scan area, which improves both the quality and the speed of the analysis. To set up the tool for a specific task, you need access to all of its available options and settings.

There are "smart" scanners that adapt themselves to applications, but such tools still have to be manually configured, since the goals of the checks are different. For example, sometimes you need to scan an application in several ways, starting with a full scan and ending with a superficial analysis; in this case, the manual mode will definitely come in handy.

When choosing a tool, pay attention to the total number of available parameters, as well as to how easy they are to configure. To compare how different tools behave, you can create several scan profiles in each of them: fast and shallow for initial analysis, full and maximum-depth for a thorough one.
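
One way to capture these profiles is as plain data that your launch scripts pass to the scanner; the parameter names below are illustrative, since every tool exposes its own equivalents.

```python
# Hypothetical scan profiles: "fast and shallow" vs. "full and maximum".
SCAN_PROFILES = {
    "quick": {                      # first look at a new build
        "max_crawl_depth": 2,
        "active_checks": ["xss", "sqli"],
        "max_duration_minutes": 15,
    },
    "full": {                       # scheduled, thorough analysis
        "max_crawl_depth": 10,
        "active_checks": "all",
        "max_duration_minutes": 480,
    },
}

def launch_scan(target: str, profile_name: str) -> None:
    """Hypothetical launcher that passes the chosen profile to the scanner."""
    profile = SCAN_PROFILES[profile_name]
    print(f"Scanning {target} with profile '{profile_name}': {profile}")

launch_scan("http://localhost:8000", "quick")
```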

  • Integration

To make dynamic analysis as effective as possible, it is worth integrating this practice into the development process and periodically running the scanner during the build. Form a list of what is used in your CI/CD pipeline in advance, and draw up an approximate plan for launching the tool. This will help you understand how easy it will be to integrate the scanner into the development process, and whether its API is convenient to use.
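
A typical integration point is a small gate script that the pipeline runs after the scan: it reads the scanner's report and fails the build if anything severe was found. The report format below (a JSON list of findings with a severity field) is hypothetical; adapt it to whatever your tool produces.

```python
# CI gate: fail the build when the DAST report contains severe findings.
# The report path and schema are hypothetical placeholders.
import json
import sys

with open("dast-report.json") as f:       # assumed report written by the scanner
    findings = json.load(f)

blocking = [f for f in findings if f.get("severity") in {"High", "Critical"}]

for finding in blocking:
    print(f"[{finding['severity']}] {finding.get('name')} at {finding.get('url')}")

# A non-zero exit code makes the CI job (and therefore the build) fail.
sys.exit(1 if blocking else 0)
```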

  • Technology

When choosing a scanner, you should consider the technologies your company uses in development. To do this, you can analyze your applications and create a list of the technologies, languages, and frameworks they use. The list can get quite long, especially in a large company. Therefore, it makes sense to choose only a few critical parameters as criteria for evaluating scanners:

  1. The number of technologies and frameworks the tool covers.
  2. Its ability to support key technologies the company uses in its critical services.

  • Login sequence recording

Recording the login sequence is extremely important for dynamic scanners, since authentication is required to get into the application. There are many pitfalls in this process, such as hashing the password before sending it or encrypting it with a shared key on the front end. Therefore, you should check in advance whether the tool can cope with such nuances. To do this, select as many different applications as possible and see whether the scanner can get through the login stage in each of them.

It is also worth checking how the tool behaves when it gets logged out. The scanner sends many requests during the analysis, and in response to some of them the server may "throw the user out" of the system. The tool should notice this and log back into the application.
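
The sketch below shows the kind of re-login logic involved, using a plain requests session; the login URL, credentials, and logout heuristic are all placeholders.

```python
# Detect being "thrown out" and log back in; URLs and credentials are hypothetical.
import requests

LOGIN_URL = "http://localhost:8000/login"
CREDS = {"username": "scanner", "password": "secret"}   # dedicated test account

def login(session: requests.Session) -> None:
    """(Re)authenticate; the session keeps the resulting cookies."""
    session.post(LOGIN_URL, data=CREDS, timeout=10)

def logged_out(response: requests.Response) -> bool:
    """Heuristic: a redirect to the login page means the session was dropped."""
    return response.url.endswith("/login") or "Sign in" in response.text

session = requests.Session()
login(session)

for url in ["http://localhost:8000/profile", "http://localhost:8000/orders"]:
    resp = session.get(url, timeout=10)
    if logged_out(resp):
        login(session)                      # re-enter the application and retry
        resp = session.get(url, timeout=10)
    print(url, resp.status_code)
```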

  • Tool updates

Technology is constantly evolving, so when choosing a tool, it is vital to consider how often its updates or new versions of signatures/patterns or analysis rules are released. It is worth studying this information on the product website or requesting it from the vendor. This will show whether the developer is following trends and how up-to-date your database of checks will be.

It is also worth finding out whether you can influence the development of the product and how the vendor handles requests for new features. This will show how quickly the functionality you need may appear in the product and how communication with the vendor is organized around feature updates.

Which tool to choose?

There are plenty of tools on the market, offered under names such as Netsparker, Acunetix, Nessus, Rapid7, AppScan, and others. Let me briefly describe two tools that I use.

Burp Suite Enterprise Edition

This tool was developed by PortSwigger. The product has a full-fledged REST API for interacting with it, managing scans, fetching reports, and much more. The scanning agent is the classic Burp Suite. It is launched in headless mode but has limitations: for example, you can interact with it only through control commands from the head portal, and you cannot load your own plugins. Overall, if the tool is configured correctly, it can deliver excellent results.

OWASP ZAP (Zed Attack Proxy)

This popular tool was created by the OWASP community, so it is completely free. It has SDKs and APIs for different programming languages. You can use the stock OWASP add-ons or write your own plugins. The product has extensions for various CI/CD tools, can be run in different modes, and can be controlled programmatically, so it is easy to insert into your development process. At the same time, the scanner has its drawbacks. Since it is an open-source solution, the quality of its scans is lower than that of enterprise products, and its functionality is not especially extensive or deep, although it can be extended and improved.
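
For example, here is a minimal end-to-end run using the zapv2 Python client, assuming a ZAP daemon is already listening on localhost:8080; the target URL and API key are placeholders. It spiders the application, runs the active scan, and prints the alerts.

```python
# Minimal ZAP run via its Python API: spider, active scan, then collect alerts.
# Assumes a ZAP daemon on localhost:8080; target and API key are placeholders.
import time
from zapv2 import ZAPv2

target = "http://localhost:8000"
zap = ZAPv2(apikey="changeme",
            proxies={"http": "http://localhost:8080",
                     "https": "http://localhost:8080"})

# Crawl: let the spider discover pages and parameters.
scan_id = zap.spider.scan(target)
while int(zap.spider.status(scan_id)) < 100:
    time.sleep(2)

# Attack: run the active scanner against everything the spider found.
scan_id = zap.ascan.scan(target)
while int(zap.ascan.status(scan_id)) < 100:
    time.sleep(5)

# Collect alerts; each alert carries a risk level, name, and URL.
for alert in zap.core.alerts(baseurl=target):
    print(alert["risk"], alert["alert"], alert["url"])
```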

Conclusion

When choosing a dynamic analyzer, you can use the criteria noted above, but they must be applied correctly. Each company is unique, with its own nuances and features, and all of this must be considered together with the selection criteria. It is also good to define your needs in advance and understand what results you want the tool to deliver. Ideally, conduct full-fledged testing of the various options, compare them with each other, and choose the best solution.

 

Alex Vakulov is a cybersecurity researcher with over 20 years of experience in malware analysis and strong malware removal skills.
