On 29/06/2021 14:16, Bernhard Reiter wrote:
>
> It seems possible to get the abuse-potential of synchronising keyservers under
> control. (While the abuse-potential of central and validating keyservers is
> also there, but in a different area.)
> I've outlined the concepts how this could be done here:
> Preserving non-central and privacy with a "permission recording keyserver"
> [Reiter 2019-07 a]
I think you meant to link to the previous message... ;-) https://lists.gnupg.org/pipermail/gnupg-devel/2019-July/034388.html
The "permission-recording keyserver" as described here requires the
various keyserver operators to trust each other to validate these
permissions correctly. Technically, this could be done if the validating
keyserver signs the user IDs that it has personally checked, and each of
its peers verifies this third-party sig against their own trusted
keyservers list. But how this interplays with the sync process,
particularly with the subjectivity of trust relationships, gets tricky.
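To make the peer-side check concrete, here is a rough sketch (all names are hypothetical, not Hockeypuck or Hagrid APIs) of how a syncing peer might filter incoming user IDs against its own trusted-keyservers list:

```python
from dataclasses import dataclass, field

@dataclass
class UserID:
    uid: str
    # Fingerprints of validating keyservers that have signed this user ID
    # after personally checking it.
    validation_sigs: set = field(default_factory=set)

def accept_during_sync(user_id, trusted_keyservers):
    """A peer accepts an incoming user ID only if it carries a validation
    signature from a keyserver on that peer's own trusted list. Because
    each operator maintains a different list, two peers can legitimately
    disagree about the same user ID -- the subjectivity mentioned above."""
    return bool(user_id.validation_sigs & trusted_keyservers)
```

The interesting (and unsolved) part is what the sync protocol does when `accept_during_sync` disagrees between peers, since the reconciliation sets then diverge by design.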
The use of arbitrary data in IDs is problematic; even IDs that validate
as "correct" email addresses can still be abused. I don't think we need
to solve all of these issues straight away, but some of the low-hanging
fruit (e.g. user attribute packets containing abusive images) could be
eliminated very quickly if we collectively agreed.
Deletion due to either RTBF or abusive behaviour is possible through
blacklisting, as implemented now in Hockeypuck. This has knock-on
effects on sync (again due to subjectivity) but they are manageable for now.
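A minimal sketch of why the blacklist touches sync (illustrative only; Hockeypuck's actual implementation differs): the blacklisted fingerprints must be filtered both on ingest and from the set the server advertises during reconciliation, otherwise a deleted key bounces straight back from any peer with a different blacklist.

```python
def filter_blacklist(keys, blacklist):
    """Drop blacklisted fingerprints from a key set. Applying this to both
    the stored set and the reconciliation set keeps an RTBF-deleted key
    from being re-imported from a peer that has not blacklisted it."""
    return {fp: key for fp, key in keys.items() if fp not in blacklist}
```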
I think the best way to approach user IDs is to think of keystores
(including keyservers, but also WKD, DANE etc) as performing two
separate functions: discovery and refresh. Discovery is a
human-to-machine operation that finds key material and certifications
based on the user ID, and refresh is a purely machine (and preferably
automated) operation that finds key material and certifications based on
the fingerprint.
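To make the distinction concrete, here is a toy keystore (hypothetical names, not any existing API) in which the two functions are deliberately asymmetric: discovery maps a user ID to key material, while refresh returns updated material for a known fingerprint without exposing the full user ID list.

```python
class Keystore:
    def __init__(self):
        # fingerprint -> {"material": ..., "uids": set of user IDs}
        self._keys = {}
        self._uid_index = {}  # user ID -> fingerprint

    def insert(self, fingerprint, material, uids):
        self._keys[fingerprint] = {"material": material, "uids": set(uids)}
        for uid in uids:
            self._uid_index[uid] = fingerprint

    def discover(self, user_id):
        """Human-to-machine: look up by user ID; return the key material
        and only the user ID that was actually asked about."""
        fp = self._uid_index.get(user_id)
        if fp is None:
            return None
        return {"fingerprint": fp,
                "material": self._keys[fp]["material"],
                "uids": {user_id}}

    def refresh(self, fingerprint):
        """Machine-to-machine: look up by fingerprint; return updated key
        material but no user IDs at all -- no reasonable scenario needs
        IDs the caller does not already know."""
        entry = self._keys.get(fingerprint)
        if entry is None:
            return None
        return {"fingerprint": fingerprint, "material": entry["material"]}
```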
The main reason we are having GDPR problems now is that the original
design of the PGP public key did not properly separate these functions.
At no point in any reasonable scenario do we need to find all user IDs
of a key based solely on knowing the fingerprint, nor do we ever need to
find a third-party sig made by a key that we do not already have or
trust. But the design encourages (and in some cases forces) us to bundle
all of this data together, effectively as a blob.
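One way to picture those smaller atoms (a sketch under assumed packet groupings, not real OpenPGP parsing): split a bundled certificate into independently distributable units, then select per context instead of always emitting the whole blob.

```python
def atoms(cert):
    """Split a bundled 'public key' into distributable atoms: the primary
    key with its self-signatures, one atom per user ID binding, and one
    per third-party certification."""
    yield ("primary", cert["primary"])
    for uid in cert["uids"]:
        yield ("uid", uid)
    for sig in cert["third_party_sigs"]:
        yield ("3p-sig", sig)

def select_for_refresh(cert):
    """A refresh response needs only the primary-key atom (new subkeys,
    revocations and expiry updates live there) -- no user IDs at all."""
    return [a for a in atoms(cert) if a[0] == "primary"]
```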
We need to unpick this, and work with much smaller atoms of key
material, bigger than packets but smaller than entire "public keys"
(such unfortunate terminology that we're now stuck with), so we can be
more selective about what we distribute in a given context.

> Preserving third party signatures distribution [Reiter 2019-07 b]
I think the third-party sig issues raised in this post are best tackled
with attestations, as discussed already. The trick is to get the
end-user workflow cleaned up and into as many clients as possible.

> Overall there has not been enough effort going into preserving
> the decentral network of public keyservers.
I agree. I think Hockeypuck and Hagrid are approaching the same problem
from opposite ends - the optimum is somewhere in the middle, if we can
find it.

>> Signature attestations will help tackle many of the abuse (and
>> functional limitation) issues, if we can get them standardised
> That is a possible puzzle piece of the solution.
> (While I am not sure if it depends on standardisation.)
Wide adoption is the goal, and while I think standardisation can help
with that, it is not strictly necessary, of course.