Reservoir computers (RCs) are a biologically inspired computational framework for signal processing, typically implemented using recurrent neural networks. Recent work has shown that Boolean networks (BNs) can also be used as reservoirs. We analyze the performance of BN RCs, measuring their flexibility and identifying the factors that determine how effectively they approximate Boolean functions applied in a sliding-window fashion over a binary signal, either non-recursively or recursively. We train and test BN RCs of varying size, signal connectivity, and in-degree to approximate 3-bit, 5-bit, and 3-bit recursive binary functions. We examine how BN RC parameters and a function's average sensitivity, a measure of its smoothness, affect approximation accuracy as well as the spread of accuracies for a single reservoir. We find that approximation accuracy and reservoir flexibility depend strongly on RC parameters. Overall, our results indicate that not all reservoirs are equally flexible, and that RC instantiation and training can be made more efficient by taking this into account. The optimal range of RC parameters opens an avenue for exploring how biological systems might be tuned to balance system constraints with processing capacity.
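
The sliding-window targets and the average-sensitivity measure referenced above can be made concrete with a minimal sketch. The function names, the feedback convention in the recursive case, and the choice of a 3-bit majority function as the example target are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sliding_window_target(signal, func, window=3):
    """Non-recursive case: y[t] = f(x[t-window+1], ..., x[t])."""
    return [func(tuple(signal[t - window + 1:t + 1]))
            for t in range(window - 1, len(signal))]

def recursive_target(signal, func):
    """Recursive case (assumed convention): the previous output is fed
    back as one input, y[t] = f(y[t-1], x[t-1], x[t])."""
    y, out = 0, []  # assumed initial output of 0
    for t in range(1, len(signal)):
        y = func((y, signal[t - 1], signal[t]))
        out.append(y)
    return out

def average_sensitivity(func, n_bits=3):
    """Mean number of input bits whose flip changes the output,
    averaged over all 2^n_bits inputs."""
    total = 0
    for x in range(2 ** n_bits):
        bits = tuple((x >> i) & 1 for i in range(n_bits))
        for i in range(n_bits):
            flipped = tuple(b ^ 1 if j == i else b for j, b in enumerate(bits))
            total += func(bits) != func(flipped)
    return total / 2 ** n_bits

# Example target: 3-bit majority, a relatively smooth Boolean function.
majority = lambda bits: int(sum(bits) >= 2)
x = np.random.randint(0, 2, 50).tolist()
y_nonrec = sliding_window_target(x, majority)
y_rec = recursive_target(x, majority)
print(average_sensitivity(majority))  # 1.5 for 3-bit majority
```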