Mailing List Archive

En-/Decryption speed for large files (GnuPG and Gpg4win)
Hello,

I was testing encryption and decryption with "pure" GnuPG and with
Gpg4win to compare their speed. What I also wanted to find out is how
long it takes to en-/decrypt larger files.

Some details of the environment for the test:
* Windows 10
* Gpg4win 4.0.3
* CPU: Intel i5-6500 @ 3.20 GHz
* RAM: 16 GB
* Storage: SSD

To test the speed I created files of different sizes with a Python
script, using os.urandom() to fill the files. GnuPG was run in
PowerShell and I measured the time with the "Measure-Command" cmdlet.
To measure the time that Gpg4win needed I used a stopwatch.
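
A minimal sketch of such a generator script could look like this (the
chunk size, file name and target size here are just examples, not
necessarily what I used):

    import os

    CHUNK = 64 * 1024 * 1024  # write in 64 MiB chunks to keep memory usage low

    def create_random_file(path, size_bytes):
        # Fill the file with incompressible random data from os.urandom().
        written = 0
        with open(path, "wb") as f:
            while written < size_bytes:
                n = min(CHUNK, size_bytes - written)
                f.write(os.urandom(n))
                written += n

    create_random_file("test_file", 1 * 1024**3)  # e.g. one 1 GB test file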

The first part of the test was the encryption of the files. To test
GnuPG I used the command "gpg -r test --encrypt ./test_file". To encrypt
with Gpg4win I used the "Encrypt" entry of the GpgEX context menu in the
file explorer.
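
In PowerShell the measurement then looked roughly like this (the same
pattern works for the decryption command further below):

    Measure-Command { gpg -r test --encrypt ./test_file }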

Results of encryption:

Size  | GnuPG          | Gpg4win
------+----------------+----------------
1 GB  | 38 sec.        | 1 min. 8 sec.
1 GB  | 37 sec.        | 1 min. 7 sec.
2 GB  | 1 min. 14 sec. | 2 min. 15 sec.
2 GB  | 1 min. 14 sec. | 2 min. 14 sec.
5 GB  | 3 min. 10 sec. | 6 min. 10 sec.
5 GB  | 3 min. 6 sec.  | 5 min. 34 sec.
10 GB | 6 min. 28 sec. | 11 min. 21 sec.
10 GB | 6 min. 21 sec. | 11 min. 6 sec.


To decrypt the files I used the "Decrypt" entry of the GpgEX context
menu in the file explorer for Gpg4win; for GnuPG I used the command
"gpg --output test_file --decrypt test_file.gpg".

Results of decryption:

Size  | GnuPG         | Gpg4win
------+---------------+----------------
1 GB  | 3 sec.        | 36 sec.
1 GB  | 3 sec.        | 34 sec.
2 GB  | 10 sec.       | 1 min. 13 sec.
2 GB  | 7 sec.        | 1 min. 12 sec.
5 GB  | 22 sec.       | 3 min. 1 sec.
5 GB  | 19 sec.       | 3 min. 2 sec.
10 GB | 1 min. 3 sec. | 5 min. 52 sec.
10 GB | 1 min. 7 sec. | 6 min. 5 sec.


One insight of this test is that Gpg4win needs around twice as long for
encryption. For decryption the difference is much bigger.

When I was testing the decryption I also tried "gpg --decrypt
test_file.gpg" (without output file) with the 10 GB file and it took 8
minutes and 47 seconds. I was wondering why it took longer when GnuPG
didn't need to create an output file.

Has anyone of you also tried to en-/decrypt larger files? Maybe even
files that are larger than 1 TB? It would be really nice to know how
long GnuPG and Gpg4win are busy with such large files.

With regards,
Christoph


--
Christoph Klassen | https://www.intevation.de/
Intevation GmbH, Neuer Graben 17, 49074 Osnabrück | AG Osnabrück, HRB 18998
Geschäftsführer: Frank Koormann, Bernhard Reiter
RE: En-/Decryption speed for large files (GnuPG and Gpg4win) [ In reply to ]
On Sunday, January 15, 2023 5:52 PM, Christoph Klassen wrote:
> When I was testing the decryption I also tried "gpg --decrypt
> test_file.gpg" (without output file) with the 10 GB file and it took 8
> minutes and 47 seconds. I was wondering why it took longer when GnuPG
> didn't need to create an output file.

As far as I know, outputting text to the screen (like printf) is a very
time-consuming operation; it blocks until all printing is complete.

"gpg --decrypt test_file.gpg" without an output file will print all the
decrypted contents on the screen, which may be the reason why it takes
so long.
Re: En-/Decryption speed for large files (GnuPG and Gpg4win) [ In reply to ]
On Sun, 15 Jan 2023 10:52, Christoph Klassen said:

> When I was testing the decryption I also tried "gpg --decrypt
> test_file.gpg" (without output file) with the 10 GB file and it took 8
> minutes and 47 seconds. I was wondering why it took longer when GnuPG
> didn't need to create an output file.

Because you sent the output to the console. This is of course slow.


BTW, do not use Gpg4win 4.0.3 - it has a known vulnerability. Use
Gpg4win 4.1.0. This will also change the numbers because we improved
some things in gpg.


Salam-Shalom,

Werner


--
The pioneers of a warless world are the youth that
refuse military service. - A. Einstein
Re: En-/Decryption speed for large files (GnuPG and Gpg4win) [ In reply to ]
On 2023-01-15 at 23:14 +0800, Ming Kuang via Gnupg-users wrote:
> On Sunday, January 15, 2023 5:52 PM, Christoph Klassen wrote:
> > When I was testing the decryption I also tried "gpg --decrypt
> > test_file.gpg" (without output file) with the 10 GB file and it took 8
> > minutes and 47 seconds. I was wondering why it took longer when GnuPG
> > didn't need to create an output file.
>
> As far as I know, outputting text to the screen (like printf) is a very
> time-consuming operation; it blocks until all printing is complete.
>
> "gpg --decrypt test_file.gpg" without an output file will print all the
> decrypted contents on the screen, which may be the reason why it takes
> so long.

Generally speaking, I wouldn't consider printing to the screen "very
expensive" (i.e. print if you need to), but if you need to output a lot
of text, the other side (the terminal) will need to process and draw it
on the screen (think of it as a pipe), which will be slow with lots of
text or extremely long lines. Moreover, on Windows the output will also
be processed to convert LF into CRLF and then moved into the terminal
subsystem.

For any test like this where you are not going to process the output
(e.g. to compare it) I would recommend writing into the null device
(/dev/null in *nix, nul in Windows).
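
For example, something along these lines (file name as in the earlier
posts; on Windows use "nul" instead of /dev/null):

    gpg --output /dev/null --decrypt test_file.gpg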

Also, when measuring encryption, make sure it is not trying to use
compression (based on the preferences of your test key). The time spent
by the compressor on your incompressible files would just be an
unneeded source of variation.
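
If I remember correctly, you can check what your test key advertises
via the interactive edit menu, e.g.:

    gpg --edit-key test
    gpg> showpref

The "showpref" output should list the compression algorithms among the
key's preferences.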

Regards



RE: En-/Decryption speed for large files (GnuPG and Gpg4win) [ In reply to ]
On Monday, January 16, 2023 9:02 AM, ángel wrote:
> On 2023-01-15 at 23:14 +0800, Ming Kuang via Gnupg-users wrote:
> > On Sunday, January 15, 2023 5:52 PM, Christoph Klassen wrote:
> > > When I was testing the decryption I also tried "gpg --decrypt
> > > test_file.gpg" (without output file) with the 10 GB file and it took 8
> > > minutes and 47 seconds. I was wondering why it took longer when GnuPG
> > > didn't need to create an output file.
> >
> > As far as I know, outputting text to the screen (like printf) is a very
> > time-consuming operation; it blocks until all printing is complete.
> >
> > "gpg --decrypt test_file.gpg" without an output file will print all the
> > decrypted contents on the screen, which may be the reason why it takes
> > so long.
>
> Generally speaking, I wouldn't consider printing to the screen "very
> expensive" (i.e. print if you need to), but if you need to output a lot
> of text, the other side (the terminal) will need to process and draw it
> on the screen (think of it as a pipe), which will be slow with lots of
> text or extremely long lines. Moreover, on Windows the output will also
> be processed to convert LF into CRLF and then moved into the terminal
> subsystem.

You are right, my reply might be a bit misleading. What really takes
time is drawing the content on the terminal; if the application tries
to print but you don't let it reach the screen (e.g. by redirecting the
output to a file or /dev/null), the time consumption will not be a
problem.
Re: En-/Decryption speed for large files (GnuPG and Gpg4win) [ In reply to ]
Thanks for your replies!

On 15.01.23 16:14, Ming Kuang wrote:
> "gpg --decrypt test_file.gpg" without an output file will print all the
> decrypted contents on the screen, which may be the reason why it takes
> so long.
For some reason, in that test gpg didn't output anything, or at least
PowerShell didn't show anything.


On 15.01.23 17:03, Werner Koch wrote:
> BTW, do not use Gpg4win 4.0.3 - it has a known vulnerability. Use
> Gpg4win 4.1.0. This will also change the numbers because we improved
> some things in gpg.
Don't worry, the system is mostly offline ;-) When I give it access to
the internet again, I will update Gpg4win. Anyway, great to hear that
the current version is faster than 4.0.3.


On 16.01.23 02:01, Ángel wrote:
> For any test like this where you are not going to process the output
> (e.g. to compare it) I would recommend writing into the null device
> (/dev/null in *nix, nul in Windows).
I created an output file because I wanted to see how GnuPG would behave
in a real scenario :)

> Also, when measuring encryption, make sure it is not trying to use
> compression (based on the preferences of your test key). The time spent
> by the compressor on your incompressible files would just be an
> unneeded source of variation.
Thanks for the hint! I will try it with compression disabled.


With regards,
Christoph

--
Christoph Klassen | https://intevation.de
Intevation GmbH, Osnabrück, DE; Amtsgericht Osnabrück, HRB 18998
Geschäftsführer: Frank Koormann, Bernhard Reiter
Re: En-/Decryption speed for large files (GnuPG and Gpg4win) [ In reply to ]
On Mon, 16 Jan 2023 16:47, Christoph Klassen said:

> For some reason in that test gpg didn't output anything or at least
> the PowerShell didn't show anything.

PowerShell and stdout and stderr are a bit problematic. I can't
remember the details, so I usually stick to cmd.exe or run tools
directly via ssh from a Unix shell.


Shalom-Salam,

Werner

--
The pioneers of a warless world are the youth that
refuse military service. - A. Einstein
Re: En-/Decryption speed for large files (GnuPG and Gpg4win) [ In reply to ]
Hi,

On Sunday 15 January 2023 10:52:23 CET Christoph Klassen wrote:
> When I was testing the decryption I also tried "gpg --decrypt
> test_file.gpg" (without output file) with the 10 GB file and it took 8
> minutes and 47 seconds. I was wondering why it took longer when GnuPG
> didn't need to create an output file.

Yes, that is expected. Gpg encryption and decryption with AES should be
mostly I/O bound, since with AES-NI instructions the cipher is really
fast on the CPU. So not writing the output to disk will result in faster
operations, and one of the biggest differences you get is when you
encrypt/decrypt on a faster disk.


Another big difference you will see in the performance of GnuPG is if
you use -z 0, which disables compression. Currently, GnuPG on the
command line disables compression when the input file name already
looks like a compressed format. We want to improve that, especially
since Kleopatra hands over the file name in a way that is not used in
that compression decision; e.g. adding media data formats there might
already help in a lot of use cases. For incompressible input, like
random data, this makes the largest difference. You can put
"compress-level 0" into your gpg.conf to cause Kleopatra to also use
that.
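
Concretely, that is either (with the recipient and file name from your
test):

    gpg -z 0 -r test --encrypt ./test_file

or this line in gpg.conf, which Kleopatra then picks up as well:

    compress-level 0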

That issue is https://dev.gnupg.org/T6332. If you could do a run of
your tests and comment in that issue with the results, that would be
helpful.


It does not surprise me that Kleopatra is much slower. Due to our
architecture, Kleopatra passes data through GPGME directly to GnuPG.
This results in additional overhead but gives us more flexibility
regarding what kind of data we encrypt/decrypt, e.g. a mail or
something that is not even written to the file system.

For some parts we want to change that. Most notably, Ingo is currently
working on Gpgtar. Gpgtar can nowadays directly encrypt/decrypt, so
there is no need to pipe the input/output of GnuPG to or from Gpgtar.
Using Gpgtar directly should help a lot when working with larger
archives: https://dev.gnupg.org/T5478

We also already increased the buffer size in GPGME to reduce the number
of callbacks we do internally, but there can be more optimization
there. Currently our recommendation for large data is to use the
command line directly, which will always be fastest as there is no
overhead.

> Did someone of you also try to en-/decrypt larger files? Maybe even
> files that are larger than 1 TB? It would be really nice to know how
> long GnuPG and Gpg4win are busy with such large files.

I think my largest tests were around 40 GB, but I don't have the
numbers anymore; the testing I did there was mostly because there were
reports that Kleopatra crashes on such large files.


Maybe you can open a ticket for this, with a reference to
https://dev.gnupg.org/T5478, about performance problems when
decrypting/encrypting large files (in contrast to archives).


Best Regards,
Andre

P.S. We are currently also looking at the startup / initial keycache
building time of Kleopatra. This might also be interesting for those
looking at Kleopatra performance. https://dev.gnupg.org/T6259

--
GnuPG.com - a brand of g10 Code, the GnuPG experts.

g10 Code GmbH, Erkrath/Germany, AG Wuppertal HRB14459
GF Werner Koch, USt-Id DE215605608, www.g10code.com.

GnuPG e.V., Rochusstr. 44, D-40479 Düsseldorf. VR 11482 Düsseldorf
Vorstand: W.Koch, B.Reiter, A.Heinecke Mail: board@gnupg.org
Finanzamt D-Altstadt, St-Nr: 103/5923/1779. Tel: +49-211-28010702
Re: En-/Decryption speed for large files (GnuPG and Gpg4win) [ In reply to ]
Thanks a lot for your reply Andre!

On 17.01.23 13:08, Andre Heinecke wrote:
> Another big difference you will see in the performance of GnuPG is if
> you use -z 0, which disables compression.
I tried that with the 10 GB file and, indeed, it was much faster. The
encryption took only 51 seconds (with compression it was 6 min. 21 sec.).

--
Christoph Klassen | https://intevation.de
Intevation GmbH, Osnabrück, DE; Amtsgericht Osnabrück, HRB 18998
Geschäftsführer: Frank Koormann, Bernhard Reiter
Re: En-/Decryption speed for large files (GnuPG and Gpg4win) [ In reply to ]
Can someone please remove my email address from this group! This has nothing to do with me!

Sent from Yahoo Mail for iPhone


Re: En-/Decryption speed for large files (GnuPG and Gpg4win) [ In reply to ]
On Wed, 18 Jan 2023 01:57:59 +0000 (UTC)
Shannon Mess via Gnupg-users <gnupg-users@gnupg.org> wrote:

> Can someone please remove my email address from this group! This has
> nothing to do with me!

Send an email to gnupg-users-request@gnupg.org?subject=unsubscribe if
you're not interested in emails from this mailing list.

--
Current PGP KeyID: 11ADE4393600C1BDFFCBC0A598DE15942B08CA00

https://blueselene.com/pgp-archive/11ADE4393600C1BDFFCBC0A598DE15942B08CA00/key.pub

For up-to-date information on my crypto keys, see
https://blueselene.com/crypto.html