Mailing List Archive

RE: It takes two to tango
On Wed, 31 Jul 2002, Scott, Richard wrote:

[SNIP]

> [RS] Let's assume that contracts and licensing are not devoid of liability.
> Provided that the security vulnerability is reported to the vendor, the
> vendor should immediately verify the claims and inform all its licensed
> clients. In most cases, vulnerabilities can be mitigated by other measures
> which, while less efficient or reducing business functionality, may reduce
> the risk until a patch is available. The business would then decide whether
> the risk is acceptable to continue operating, or would defer the risk by
> either reducing functionality (stopping services, etc.) or stopping
> completely until a patch arrives (in the event the IDS picked something
> up). Just because a vulnerability is detected in a service one is using
> does not necessarily mean my server has to be taken offline. However, I
> would expect a patch if I intend to use that feature in the future.
>
> In such cases, businesses are fully aware of the risks of doing business,
> can apply some rough quantitative measure of risk, and understand the risk
> model. If the client was not notified after the vulnerability was published
> (not the exploit), businesses affected by the security hole could sue the
> vendor. The vendor may have chosen not to inform its clients of the
> potential security problem, and thus did not perform its due diligence.
>
> I believe this would be a better model for controlling and enabling full
> disclosure. The vulnerability owner would notify a vendor and, following
> the guidelines, allow 30 days for client notification (assume 30; it could
> be any stated period). The vendor must notify clients so they can take
> precautionary action. If the vendor refuses to notify clients, and clients
> discover additional risk and/or potential damage, litigation can be a
> consequence. [Seems very similar to other product warranties et al.?]
>
> <snip>
> IMHO, vendors SHOULD be responsible for security holes. However,
> before that can be done there needs to be some kind of law put in
> place to protect the researchers who find the holes. Doesn't need to
> be much, just a blanket law that if the researcher has taken
> reasonable steps to alert the vendor, they cannot be held liable for
> the consequences of releasing the advisory. If that doesn't happen,
> things are going to get messy.
> </snip>
>
> [RS] I must admit that the legal system in this country is not proactive;
> it is very reactive and heavily fraught with strange laws. The introduction
> of laws and regulations to prevent reverse engineering is just a step
> toward removing full disclosure. The onus should be placed back onto
> liability and insurance. Preventing discovery is not the answer. If full
> disclosure were covered by some government classification requiring
> adequate and official steps, liability would rest on both sides of the
> vulnerability: the author would be required to follow the steps, informing
> the vendor and then releasing an advisory and potentially the exploit,
> while the vendor would be required to notify licensees/clients prior to
> the advisory and then follow up with a patch.
>
> Secondly, just because one person has discovered the flaw does not mean
> others do not know about it. Hence, it is vital that vendors treat
> advisories as high-priority issues and assume that potential criminals
> could use those vulnerabilities.
>
> It does not seem much of a stretch for the Office of Homeland Security to
> regard commerce systems as "infrastructure" and hence bind researchers and
> vendors to an agreement. The only sticky part is if a vendor fails to take
> note and the advisory and exploits are released. In such a case, the
> department could be involved in high-level cases, i.e. those with
> large-scale potential.
>
> This is just a sketch and there are numerous possible obstacles, but it
> certainly beats the current rogue view of many members who regard FD as a
> terrible thing.
>

Of course, the same should apply to companies that expose their customers
to potential information leakage, as with the recent BestBuy wireless cash
transaction exposures, yes? After all, it's not just the products vendors
release that are at issue in information security, but also how those
products are deployed and used. Granted, the vendors are extremely lacking
in providing adequate documentation on how to properly secure their
wireless toys and trinkets, and should be taken to task for their failings
there. But a company like BestBuy, with INFORMATION SECURITY staff, should
know more than the average home user about how to secure delicate and
potentially exploitable information such as credit card numbers. Exposure
of private information is not all HIPAA-related, for sure. Doesn't the
exposure of some client information (e.g., Ford's recent client-information
fiasco) make the companies obtaining, using, and improperly storing it also
contributors to the extreme fraud rates that are such a major issue in the
credit industry?


Thanks,

Ron DuFresne
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
"Cutting the space budget really restores my faith in humanity. It
eliminates dreams, goals, and ideals and lets us get straight to the
business of hate, debauchery, and self-annihilation." -- Johnny Hart
***testing, only testing, and damn good at it too!***

OK, so you're a Ph.D. Just don't touch anything.
RE: It takes two to tango [ In reply to ]
[SNIP]
> If the client was not notified after the vulnerability was published (not
> the exploit), businesses affected by the security hole could sue the
> vendor. The vendor may have chosen not to inform its clients of the
> potential security problem, and thus did not perform its due diligence.
[SNIP]

I think you've hit on a key point here. Think of all the product
recalls that happen outside the IT world. A case in point was a baby
stroller I purchased a few years ago. These strollers could fold up and
trap a child if they were hit in a certain way. Once it made the news, the
manufacturer issued a fix (some plastic parts to strengthen the latch), and
the news story also carried contact information on how to get the pieces to
fix the stroller.

It would be nice to think that this company did this out of concern for
children, but I'm somewhat cynical: I think the execs of this company
looked closely at the potential liability they faced and compared it with
the cost of producing and shipping these plastic pieces. At the end of the
day, the cost of fixing the problem was less than the projected liability.
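The cost comparison sketched above can be made concrete with a toy expected-value calculation. All of the figures below are invented for illustration; none of them come from the actual recall:

```python
# Toy expected-value comparison behind a recall decision.
# Every number here is a hypothetical illustration, not real recall data.

units_in_field = 200_000      # units sold and still in use (assumed)
p_incident = 0.0005           # assumed probability of an injury claim per unit
cost_per_claim = 500_000      # assumed average settlement or judgment

fix_cost_per_unit = 4         # plastic latch parts plus shipping (assumed)
recall_cost = units_in_field * fix_cost_per_unit

expected_liability = units_in_field * p_incident * cost_per_claim

print(f"Recall cost:        ${recall_cost:,}")
print(f"Expected liability: ${expected_liability:,.0f}")
# Under these assumptions the recall costs $800,000 against an expected
# liability of $50,000,000, so issuing the fix is the cheaper choice.
```

With numbers anywhere in this ballpark the recall is obviously the rational move, which is the cynical reading of the exec's decision above.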

Unfortunately, in software we have a different situation. End User License
Agreements are so incredibly broad that they seem to protect the software
'manufacturer' from any potential liability. The end result: it's cheaper,
easier, and better for the bottom line to cover up a defect or ignore its
existence.

But due diligence. That's an interesting point. I wonder whether failure to
exercise due diligence could be used to strip the software manufacturer of
the blanket indemnity clauses in its End User License Agreement. If it
could be proven that Microsoft had not exercised due diligence (not to say
they haven't; just an example) in protecting users of Outlook from worms,
could Microsoft be held liable for the cost of cleaning up the next "Love
Letter" worm outbreak?

You have made a very interesting point with regard to due diligence; I
wonder if it can be used.

O'Neil.

This message expresses only my personal opinion and does not necessarily
represent the official opinion of my employer.
Re: RE: It takes two to tango [ In reply to ]

Let's stop gossiping and do something about it. Let us create a war chest and raise $100 million, or $1 billion. Everyone chip in: customers bitten by bugs created by these vendors, security people, and companies alike.

Create a war chest and drag a vendor into court by the ear and test all of this. Sue them! Create some new law; set some precedent. A war chest of $1 billion set aside solely to litigate against one vendor until the courts decide. Keep donating to the war chest so that it never runs out. We'll see who gets tired first.

They cannot be allowed to hide behind their EULAs forever. Let us test this once and for all.

I pledge $10,000 right now!

[SNIP]
> If the client was not notified after the vulnerability was published (not
> the exploit), businesses affected by the security hole could sue the
> vendor. The vendor may have chosen not to inform its clients of the
> potential security problem, and thus did not perform its due diligence.
[SNIP]



RE: RE: It takes two to tango [ In reply to ]
OK, I volunteer to keep the war chest. I accept PayPal.


-----Original Message-----
From: choose.a.username@hushmail.com
[mailto:choose.a.username@hushmail.com]
Sent: Thursday, August 01, 2002 11:00 AM
To: bugtraq@securityfocus.com;
vuln-dev@securityfocus.com;
full-disclosure@lists.netsys.com
Subject: Re: [Full-Disclosure] RE: It takes two to
tango




[SNIP]





RE: It takes two to tango [ In reply to ]
Commercial software entities, especially the larger ones, charge significant sums of money for
their products. In turn, they spend money on developers, testers, marketers, lawyers, and
insurance. They market their products as beneficial and, often, as secure. The source code is not
freely available, nor is the consumer, for the most part, allowed to dig into what is provided
(e.g., under the EULA or the DMCA), so the consumer depends on the word of the software vendor. (I
am not arguing open vs. closed source, just citing facts.) Additionally, unless some workaround is
available, the consumer must rely on the software vendor for fixes and patches. After charging
money and restricting how thoroughly the consumer can examine or fix the products, the vendor then
disclaims all responsibility for them. (This seems flawed to me. Some of the responsibility should
rest with the vendor.)

The real question: what is the least-cost solution to extremely buggy software? I think it lies
with the commercial software entities, to the extent that they should have strong processes in
place to prevent, discover, and fix problems with their code. It is simpler and far less costly for
the vendor to put methodologies in place during development and testing to prevent, discover, and
fix problems with the software than it is for consumers to be hit with the consequences of those
problems in software they widely deploy. I understand that code would still have bugs, but that is
where proof of the strong methodologies employed (i.e., non-negligent behavior) and insurance would
come into play.

So, if all fault continues to rest with consumers, what correction might happen? Consumers could
start looking for companies with a different EULA, a strong track record, and demonstrated
development and testing practices. Insurance companies might begin offering consumers insurance
against shoddy software, and with that, insurers would charge lower rates for products with a
demonstrated record. At some point, this could lead to strong competition and stronger development
and testing practices at software companies.

And if some (certainly not all) fault rested with the commercial software industry, what
correction would happen? Well, companies would strengthen their development and testing practices
until they reached the appropriate cost-risk level. Part of the determination of this level would
be the base rates charged by insurance companies.
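The cost-risk level described here can be sketched numerically: total cost to the vendor is testing spend, plus expected losses from shipped defects, plus an insurance premium tied to those residual losses, and the "appropriate" level is the spend that minimizes the total. The functions and figures below are entirely hypothetical assumptions, not a model from the thread:

```python
# Toy cost-risk trade-off: pick the testing spend that minimizes
# total cost = testing spend + expected defect losses + insurance premium.
# All functions and figures are hypothetical illustrations.

def expected_defect_loss(testing_spend: float) -> float:
    """Assumed: losses shrink as testing grows, with diminishing returns."""
    return 10_000_000 / (1 + testing_spend / 250_000)

def insurance_premium(testing_spend: float) -> float:
    """Assumed: insurers price the premium off residual expected loss."""
    return 0.10 * expected_defect_loss(testing_spend)

def total_cost(testing_spend: float) -> float:
    return (testing_spend
            + expected_defect_loss(testing_spend)
            + insurance_premium(testing_spend))

# Scan candidate spend levels and pick the cheapest overall.
candidates = range(0, 5_000_001, 50_000)
best = min(candidates, key=total_cost)
print(f"Cost-minimizing testing spend: ${best:,}")
```

Under these assumptions, extra testing pays for itself up to a point and is wasted beyond it, which is exactly the cost-risk balance the paragraph describes; the insurer's base rate shifts where that point falls.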

To me, the latter makes the most sense. But I am no economist, and I have performed no studies.

As it stands now, the software industry alone has made this decision. Perhaps a lawsuit
challenging the EULA would spark the necessary examination of that decision.

-Andrew