The 'Wormability' Factor
Worm prediction formulas remain little more than educated guesswork.
Researchers at Arbor Networks published details about a "Wormability" formula at the recent
RSA Conference. They claim it's possible to determine the likelihood that a vulnerability
will result in a worm by formalizing the factors they believe contribute
to a worm's success. They use this information to focus on a much
smaller number of vulnerabilities: those with a high Wormability factor.
At Cybertrust, we also predict worms, as does practically
every other security company. Like everyone else, we use a formula; although
ours may not look as elaborate as some others, they're all equally fuzzy. There
are so many unquantifiable values in the factors surrounding malware that it's
impossible to consistently and accurately pre-determine a vulnerability's "wormability."
For example, if a malware author has any arrests or convictions, what effect
does this have on would-be worm releasers? What is currently considered "cool"
within whichever of the more than 8,000 hacker groups they belong to? How many contacts does the
malware author have with organized crime, or other criminals with financial
motives? These all rank very high amongst the factors that would need to be considered
for any formulaic prediction to be accurate, and as you can tell, they're not
easy to measure accurately. Consider also that neither the actual number
of Windows 98 boxes still in use nor the number of Windows XP systems that are "fully
patched" is accurately known, and you can begin to see the scope of the problem.
Research can describe, accurately, the possibilities for abuse with a given
vulnerability, assuming enough of the vulnerability's details have been sufficiently
disclosed or discovered to clearly describe the vulnerability in the first place.
Insight and/or an "X-factor" are then added, but both are subjective
and only as valuable as the professionals from whom they come. Often that includes
a discussion of how fast the pixies have been dancing, or what phase the moon is in.
Determining which vulnerabilities require the most attention can be done, certainly,
within a margin of error that makes uptime a reasonable expense. But formulaic
approaches can only be used to compare one vulnerability to others, not predetermine
whether a worm will or won't be written.
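To make that comparison point concrete, here is a toy sketch of what such a relative-ranking formula might look like. The factor names and weights below are invented for illustration; this is not Arbor's or Cybertrust's actual formula, and the point is precisely that the inputs are fuzzy estimates, good only for ranking one vulnerability against another.

```python
# Illustrative only: invented factors and weights, not any vendor's real formula.
# Each factor is a rough 0.0-1.0 estimate; a weighted sum can rank
# vulnerabilities relative to one another, but cannot predict whether
# a worm will actually be written.
WEIGHTS = {
    "remotely_exploitable": 0.35,
    "no_auth_required":     0.25,
    "default_install":      0.20,
    "exploit_published":    0.20,
}

def wormability(factors: dict) -> float:
    """Weighted sum of hedged 0-1 estimates; useful only for ranking."""
    return sum(WEIGHTS[k] * factors.get(k, 0.0) for k in WEIGHTS)

vuln_a = {"remotely_exploitable": 1.0, "no_auth_required": 1.0,
          "default_install": 0.9, "exploit_published": 1.0}
vuln_b = {"remotely_exploitable": 1.0, "no_auth_required": 0.2,
          "default_install": 0.3, "exploit_published": 0.0}

# The scores let you say A deserves attention before B -- nothing more.
print(wormability(vuln_a) > wormability(vuln_b))  # True
```

Even a sketch like this makes the limitation obvious: every number fed in is a guess, so the output is only as good as the guesser.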
Abusing Webalizer
Webalizer, a *nix tool that creates Web statistics from
Web logs, has been getting abused lately. Amongst many other pages, Webalizer
will create a page that lists all of the referrer URLs used to get to
your pages. It does this by reading and interpreting the HTTP-Referrer
value (spelled "Referer" in the actual HTTP header) provided by the visitor
to your site. This HTTP-Referrer can be almost anything you want, so if you
force-feed sites that use Webalizer with your own HTTP-Referrer, it will be
placed as a link on the results page. Then along comes
Google or another search engine that lists pages which are frequently used as
Referrers. Page ranking is, at least in part, based on how many other sites
link to your page. The search engine determines this by crawling links on Web sites and
correlating that information across sites. So if you appear as a referrer on every Web site
that runs Webalizer, you've got a lot of links to your site from
other sites—ergo, you're ranked higher than the next guy.
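The mechanics of the abuse are trivial, because the referrer field is entirely client-supplied. Here's a minimal sketch, using made-up host names, of how a script can claim any "referring" page it likes in a raw HTTP request:

```python
# Minimal sketch of referrer spamming: the Referer header is whatever
# the client says it is. Host names below are placeholders, not real sites.

def forged_request(host: str, path: str, fake_referrer: str) -> str:
    """Build a raw HTTP/1.1 GET whose Referer field is whatever we choose."""
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"Referer: {fake_referrer}\r\n"  # spelled 'Referer' per the HTTP spec
        f"User-Agent: Mozilla/5.0\r\n"   # just as trivially spoofable
        "Connection: close\r\n\r\n"
    )

req = forged_request("victim.example", "/index.html",
                     "http://spammer.example/buy-now")
# Webalizer, parsing the victim's access log, will dutifully list
# http://spammer.example/buy-now as a referring link.
print(req.splitlines()[2])  # Referer: http://spammer.example/buy-now
```

Repeat that request against a few thousand Webalizer-running sites and the forged URL shows up as a "link" everywhere their stats pages are indexed.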
This column was originally published in our weekly Security Watch newsletter.
The security issue here is that links stuffed in the HTTP-Referrer may be a
legitimate business trying to get better exposure in a very aggressive market,
but it's just as likely to be a malicious Web site lying in wait for visitors
to install Spyware/Adware or Trojans. The developers of what
became the World Wide Web didn't foresee how its core components
were going to be used to support commercial activity, so they couldn't have
imagined the abuse those components would support and assist. Data, such as the HTTP-Referrer
value, were deemed valid because they weren't exposed to the casual
viewer. But anything a browser can send, a malware author can send better. Commercial
Web sites are desperate for demographic information, but unfortunately they
can't tell the human from the tool -- and likely never will.
Follow-up: Computer Associates BrightStor Backup
In last Monday's column, I had some negative things to say about Computer
Associates' BrightStor Backup product. I was
contacted by them and had a nice chat with their VP of Data Availability Development.
They wanted to assure me that they take quality assurance very seriously. They
gave me the chance to ask questions, so I asked if they had verified that there
weren't any other hard-coded user ID and password combinations that could be
used as back-doors. They assured me there weren't any in the products they worked
on. That's a relief!
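For readers curious how one might hunt for such back-doors without vendor source access, here's a sketch of the classic first pass: extract printable runs from a binary, `strings`-style, and flag anything that looks credential-like. The sample bytes and the patterns flagged are invented for illustration.

```python
# A rough first-pass hunt for hard-coded credentials: pull printable
# ASCII runs out of binary data (like the Unix `strings` tool) and flag
# ones that look like user ID/password material. Sample data is invented.
import re

def ascii_strings(data: bytes, min_len: int = 4):
    """Yield runs of printable ASCII at least min_len characters long."""
    for match in re.finditer(rb"[\x20-\x7e]{%d,}" % min_len, data):
        yield match.group().decode("ascii")

# Naive keyword heuristic; real audits go much further than this.
SUSPICIOUS = re.compile(r"(passw|admin|backdoor|secret)", re.I)

sample = b"\x00\x01ADMIN_USER\x00\xffPassword123!\x7f\x02ordinary text\x00"
hits = [s for s in ascii_strings(sample) if SUSPICIOUS.search(s)]
print(hits)  # ['ADMIN_USER', 'Password123!']
```

A hit proves nothing by itself, of course; the string still has to be traced to an actual authentication check, which is exactly the kind of verification a vendor's QA team is in the best position to do.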
Russ Cooper is a senior information security analyst with Verizon Business, Inc.
He's also founder and editor of NTBugtraq, www.ntbugtraq.com,
one of the industry's most influential mailing lists dedicated to Microsoft security.
One of the world's most-recognized security experts, he's often quoted by major
media outlets on security issues.