ReDoS

A regular expression denial of service (ReDoS) is an algorithmic complexity attack that produces a denial of service by providing a regular expression and/or an input that takes a long time to evaluate. The attack exploits the fact that many regular expression implementations have super-linear worst-case complexity: on certain regex-input pairs, the time taken can grow polynomially or exponentially with the input size. An attacker can thus cause a program to spend substantial time by providing a specially crafted regular expression and/or input, making the program slow down or become unresponsive.

Description
Regular expression ("regex") matching can be done by building a finite-state automaton. A regex can be converted straightforwardly to a nondeterministic finite automaton (NFA), in which for each state and input symbol there may be several possible next states. After building the automaton, several possibilities exist:


 * the engine may convert it to a deterministic finite-state automaton (DFA) and run the input through the result;
 * the engine may try all possible paths one by one until a match is found or every path has been tried and failed ("backtracking");
 * the engine may consider all possible paths through the nondeterministic automaton in parallel;
 * the engine may convert the nondeterministic automaton to a DFA lazily (i.e., on the fly, during the match).

Of the above algorithms, the first two are problematic. The first is problematic because a deterministic automaton could have up to $$2^m$$ states where $$m$$ is the number of states in the nondeterministic automaton; thus, the conversion from NFA to DFA may take exponential time. The second is problematic because a nondeterministic automaton could have an exponential number of paths of length $$n$$, so that walking through an input of length $$n$$ will also take exponential time. The last two algorithms, however, do not exhibit pathological behavior.
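The linear-time behavior of the parallel simulation can be sketched with a tiny hand-built NFA. The encoding below is illustrative, not any particular engine's internals: it models a machine for a pattern of the form (a|a)+! with duplicate transitions standing in for the two identical alternatives. Because the frontier of reachable states is kept as a set, the exponentially many duplicate paths collapse, and each input symbol costs at most work proportional to the number of states.

```python
# Hand-built NFA for the pattern (a|a)+!  (illustrative encoding):
#   state 0 --a--> 1   (two identical transitions, one per alternative of a|a)
#   state 1 --a--> 1   (the repetition, again reachable in two identical ways)
#   state 1 --!--> 2   (accepting state)
NFA = {
    (0, 'a'): [1, 1],
    (1, 'a'): [1, 1],
    (1, '!'): [2],
}
ACCEPT = {2}

def nfa_match(text):
    """Simulate all paths in parallel. The frontier is a set, so the
    exponentially many duplicate paths collapse into one entry each,
    and every input symbol costs at most O(number of states) work."""
    frontier = {0}
    for ch in text:
        frontier = {nxt for st in frontier for nxt in NFA.get((st, ch), [])}
        if not frontier:
            return False          # no path can continue
    return bool(frontier & ACCEPT)
```

Running nfa_match('a' * 10000) completes immediately, even though a backtracking matcher would explore an astronomically large number of paths through the same machine.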

Note that for non-pathological regular expressions, the problematic algorithms are usually fast, and in practice one can expect them to "compile" a regex in $$O(m)$$ time and match it in $$O(n)$$ time. By contrast, simulation of an NFA and lazy computation of the DFA have $$O(m^2 n)$$ worst-case complexity. Regex denial of service occurs when these expectations are applied to regexes or inputs provided by the user, and a malicious regular expression or input triggers the worst-case complexity of the regex matcher.

While regex algorithms can be written in an efficient way, most regex engines in existence extend the regex language with additional constructs, such as backreferences, that cannot always be solved efficiently. Such extended patterns essentially force the implementations in most programming languages to use backtracking.

Exponential backtracking
The most severe type of problem happens with backtracking regular expression matches, where some patterns have a runtime that is exponential in the length of the input string. For strings of $$n$$ characters, the runtime is $$O(2^n)$$. This happens when a regular expression has three properties:


 * the regular expression applies repetition to a subexpression;
 * the subexpression can match the same input in multiple ways, or the subexpression can match an input string which is a prefix of a longer possible match;
 * and after the repeated subexpression, there is an expression that matches something which the subexpression does not match.

The second condition is best explained with two examples:


 * in (a|a)+$, repetition is applied to the subexpression a|a, which can match a in two ways, one on each side of the alternation;
 * in (a+)+$, repetition is applied to the subexpression a+, which can match a, aa, aaa, etc.

In both of these examples we used $ to match the end of the string, satisfying the third condition, but it is also possible to use another character for this. For example, (a|a)+b has the same problematic structure.

All three of the above regular expressions will exhibit exponential runtime when applied to strings of the form $$aa \ldots a!$$. For example, if you try to match them against a string of a few dozen a's followed by a ! on a backtracking expression engine, it will take a very long time to complete, and the runtime will approximately double for each extra a before the !.
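To see where the exponential blow-up comes from without waiting on a real engine, here is a deliberately naive backtracking matcher, specialized to the whole-string pattern (a|a)+! and instrumented to count how many branch attempts it makes. The matcher and its step counter are illustrative, not any engine's actual code.

```python
def match_evil(s):
    """Naive whole-string backtracking matcher for (a|a)+!,
    counting the number of alternatives tried."""
    steps = 0

    def rep(i):
        nonlocal steps
        for _ in range(2):                      # the two identical branches of a|a
            steps += 1
            if i < len(s) and s[i] == 'a':
                # after one more 'a': either stop repeating and expect the final '!'...
                if i + 2 == len(s) and s[i + 1] == '!':
                    return True
                # ...or try another iteration of the repetition
                if rep(i + 1):
                    return True
        return False

    return rep(0), steps
```

On inputs consisting only of a's the match fails, and the step count roughly doubles with each additional a, matching the $$O(2^n)$$ behavior described above.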

It is also possible to have backtracking which is polynomial time $$O(n^x)$$ instead of exponential. This can also cause problems for long enough inputs, though less attention has been paid to this case because malicious input must be much longer to have a significant effect. An example of such a pattern is a*a*b, when the input is an arbitrarily long sequence of a's (with no b).
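The same counting trick illustrates the polynomial case. The sketch below is a naive backtracking matcher specialized to a pattern of the shape a*a*b, anchored at the start of the string: on an input of n a's and no b, it tries every way of splitting the run of a's between the two a*s, so the number of attempts grows quadratically.

```python
def match_poly(s):
    """Naive backtracking matcher for a*a*b anchored at the start of s,
    counting how many (i, j) split points are tried."""
    steps = 0
    n = len(s)

    def run_of_a(p):                 # end of the longest run of 'a' starting at p
        q = p
        while q < n and s[q] == 'a':
            q += 1
        return q

    end1 = run_of_a(0)
    for i in range(end1, -1, -1):        # first a* consumes s[0:i], greedy then backtracking
        end2 = run_of_a(i)
        for j in range(end2, i - 1, -1): # second a* consumes s[i:j]
            steps += 1
            if j < n and s[j] == 'b':
                return True, steps
    return False, steps
```

On an input of n a's, every split is tried before the match fails, giving (n+1)(n+2)/2 attempts — quadratic rather than exponential, but still enough to stall a server on a long input.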

Vulnerable regexes in online repositories
So-called "evil" or vulnerable regexes have been found in online regular expression repositories. Note that it is enough to find a vulnerable subexpression in order to attack the full regex:


 * 1) RegExLib, id=1757 (email validation) - the vulnerable part is a subexpression that nests one repetition inside another
 * 2) OWASP Validation Regex Repository, Java Classname - again, the vulnerable part is a nested repetition

These two examples are also susceptible to this kind of input: a long run of characters accepted by the repeated subexpression, followed by a character that makes the overall match fail.

Attacks
If the regex itself is affected by user input, such as a web service permitting clients to provide a search pattern, then an attacker can inject a malicious regex to consume the server's resources. Therefore, in most cases, regular expression denial of service can be avoided by removing the possibility for the user to execute arbitrary patterns on the server. In this case, web applications and databases are the main vulnerable applications. Alternatively, a malicious page could hang the user's web browser or cause it to use arbitrary amounts of memory.

However, if a vulnerable regex exists on the server-side already, then an attacker may instead be able to provide an input that triggers its worst-case behavior. In this case, e-mail scanners and intrusion detection systems could also be vulnerable.

In the case of a web application, the programmer may use the same regular expression to validate input on both the client and the server side of the system. An attacker could inspect the client code, looking for evil regular expressions, and send crafted input directly to the web server in order to hang it.

Mitigation
ReDoS can be mitigated without changes to the regular expression engine, simply by setting a time limit for the execution of regular expressions when untrusted input is involved.
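A minimal sketch of such a time limit in Python, assuming a Unix platform: CPython cannot interrupt a regex match that is already running in-process, so the match is run in a child process that can be killed if it exceeds the limit.

```python
import re
from multiprocessing import get_context

def _worker(pattern, text, out):
    out.put(bool(re.match(pattern, text)))

def match_with_timeout(pattern, text, timeout=1.0):
    """Run re.match in a child process and kill it after `timeout` seconds.
    Returns True/False if the match completed, or None on timeout."""
    ctx = get_context('fork')   # Unix-only; a fork child needs no re-import
    out = ctx.Queue()
    p = ctx.Process(target=_worker, args=(pattern, text, out))
    p.start()
    p.join(timeout)
    if p.is_alive():
        p.terminate()           # the regex engine cannot be interrupted in-process
        p.join()
        return None
    return out.get()
```

With this wrapper, match_with_timeout(r'(a+)+$', 'a' * 40 + '!', timeout=0.5) returns None instead of hanging, while well-behaved matches return True or False as usual.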

ReDoS can be avoided entirely by using a non-vulnerable regular expression implementation. After Cloudflare's web application firewall (WAF) was brought down by a PCRE ReDoS in 2019, the company rewrote its WAF to use the non-backtracking Rust regex library, which uses an algorithm similar to RE2.

Vulnerable regular expressions can be detected programmatically by a linter. Methods range from pure static analysis to fuzzing. In most cases, the problematic regular expressions can be rewritten as equivalent "non-evil" patterns. For example, (a+)+$ can be rewritten as a+$, which matches the same strings. Possessive matching and atomic grouping, which disable backtracking for parts of the expression, can also be used to "pacify" vulnerable parts.
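As a quick illustration of such a rewrite, the evil pattern (a+)+$ (repetition nested inside repetition) and the safe pattern a+$ accept exactly the same strings, which can be spot-checked on short inputs where the evil pattern is still fast:

```python
import re

EVIL = re.compile(r'(a+)+$')   # nested repetition: exponential backtracking on failure
SAFE = re.compile(r'a+$')      # same language, no nested repetition

# Spot-check that the two patterns agree on short inputs:
samples = ['', 'a', 'aaa', 'aaa!', 'b', 'ab', 'aab']
agree = all(bool(EVIL.match(s)) == bool(SAFE.match(s)) for s in samples)
```

In engines that support them (Python's re module since version 3.11, for example), writing the inner quantifier as possessive, (a++)+$, or wrapping it in an atomic group, likewise removes the pathological backtracking without changing the accepted language.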