Somewhere between the effort, expense, and time of Orange Book style evaluation, and the
"apparently OK firewall" sticker there is a happy medium. To be useful, tests have to take into
account the design principles of what they are tested; they cannot be mindlessly automated attack
scripts. To be timely, repeatable, and cost-effective, tests cannot require expensive talent, elaborate
design reviews, and development of customized attack tools. I believe that what is needed is a
process of peer review for testing methodologies, and perhaps some give-and-take within the
vendor community. Vendors need to stop selling firewalls based on smoke and mirrors. Every time a
sales rep tells a potential customer, "Don't buy an XYZ; I hear they got broken into" (yes, they do
that), firewall technology as a whole is called into question. It is penny-wise and pound-foolish to
call the basis of an entire technology into question in order to make a sale, but that is what happens.
Firewall customers need to ask not only whether a firewall provides the functionality they want, but
also how the vendor tested it and, if some other procedure was involved in the testing, what that
procedure was.
Customers need to grow a bit more cynical about testing and test strategies. The time is ripe, right
now. It is ripe for hucksters who want to cash in by "testing firewalls" they don't understand, and it is
ripe for good-faith efforts to establish peer review processes for firewall design and marketing. Let's
be on the alert for both.