Mailing List Archive

Limiting amount of memory a program can use.
Howdy,

As most of you know, I got much faster internet and I use torrent
software quite a lot.  I was using Ktorrent and it was OK, but it was
slow.  I started using Qbittorrent and like it better in a way, but it
has its own speed issues, and both affect my desktop response.  I did
some googling and used top to figure out that they are using a LOT of
memory for cache.  At times, Qbittorrent uses well over half my memory
just for cache.  It also gets to the point where it is using swap, even
though I have swappiness set to 1, which basically means use swap only
to prevent a crash from out-of-memory problems.  We all know how slow
swap use can make things.  As it is, I reduced the number of active
files, which is not something I want to do.  If I receive, I like to
send as well; after all, someone sent to me.  What I would like to do
is limit the amount of memory torrent software can use.  I don't know
exactly how to do that, though.  It's not something I've ever done.
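
(For anyone wondering, that swappiness setting is just the usual sysctl
knob - roughly this, with the file name below only being an example:)

# for the running system
sysctl -w vm.swappiness=1
# to keep it across reboots
echo "vm.swappiness = 1" >> /etc/sysctl.d/99-swappiness.conf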

Is this something I do on the command line, or a setting in some file
somewhere?  I don't even know where to start on this.  By the way, I'm
maxed out at 32GB of memory for this mobo, so adding memory isn't an
option.  Is there even a mobo that has a 64GB option??? :/

Dale

:-)  :-) 
Re: Limiting amount of memory a program can use. [ In reply to ]
On Sunday, 28 August 2022 13:24:46 BST Dale wrote:

> Is this something I do on the command line or a setting in some file
> somewhere? I don't even know where to start on this. By the way, I'm
> maxed out at 32GBs of memory for this mobo. So adding memory isn't an
> option. Is there even a mobo that has a 64GB option??? :/

This may not help, but it may set you on the right track: man ulimit.
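
A minimal sketch, assuming bash and that you launch the client from the
same shell (the limit applies to that shell and everything it starts):

# cap virtual memory at 4 GiB (ulimit -v takes KiB)
ulimit -v 4194304
qbittorrent &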

:)

--
Regards,
Peter.
Re: Limiting amount of memory a program can use. [ In reply to ]
On Sunday, 28 August 2022 13:56:09 BST Peter Humphrey wrote:
> On Sunday, 28 August 2022 13:24:46 BST Dale wrote:
> > Is this something I do on the command line or a setting in some file
> > somewhere? I don't even know where to start on this. By the way, I'm
> > maxed out at 32GBs of memory for this mobo. So adding memory isn't an
> > option. Is there even a mobo that has a 64GB option??? :/
>
> This may not help, but it may set you on the right track: man ulimit.
>
> :)

I'm not sure how you can set a limit for a single application and any of its
child processes with ulimit. However, Control Groups (cgroups) can do this
easily, as long as they have been enabled in the kernel:

https://wiki.gentoo.org/wiki/OpenRC/CGroups
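
Roughly, with the unified (v2) hierarchy this can also be done by hand -
a sketch only, assuming a hybrid OpenRC layout with cgroup2 mounted at
/sys/fs/cgroup/unified and the memory controller free to be used there:

# create a group and allow it to use the memory controller
mkdir /sys/fs/cgroup/unified/torrent
echo +memory > /sys/fs/cgroup/unified/cgroup.subtree_control
# hard cap of 4 GiB (in bytes); memory.high is a softer alternative
echo 4294967296 > /sys/fs/cgroup/unified/torrent/memory.max
# move the running client into the group (one PID per write)
echo $(pidof qbittorrent) > /sys/fs/cgroup/unified/torrent/cgroup.procs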

Also, from what I recall, at least some torrent applications can limit the
number of upload connections and/or the upload throughput - but I haven't
used any for years now.
Re: Limiting amount of memory a program can use. [ In reply to ]
On Sunday, 28 August 2022 14:03:00 BST you wrote:
> On Sunday, 28 August 2022 13:56:09 BST Peter Humphrey wrote:
> > On Sunday, 28 August 2022 13:24:46 BST Dale wrote:
> > > Is this something I do on the command line or a setting in some file
> > > somewhere? I don't even know where to start on this. By the way, I'm
> > > maxed out at 32GBs of memory for this mobo. So adding memory isn't an
> > > option. Is there even a mobo that has a 64GB option??? :/
> >
> > This may not help, but it may set you on the right track: man ulimit.
> >
> > :)
>
> I'm not sure how you can set a limit for a single application and any of its
> child processes with ulimit. However, Control Groups (cgroups) can do this
> easily as long as it has been included in the kernel:
>
> https://wiki.gentoo.org/wiki/OpenRC/CGroups
>
> Also, from what I recall at least some torrent applications can limit the
> amount of uploading connections and/or throughput - but I haven't used any
> for years now.

There is also prlimit, which may be more appropriate for a single command?
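
For instance (just a sketch; the address-space limit is given in bytes):

# start the client with a 4 GiB address-space cap
prlimit --as=4294967296 qbittorrent
# or clamp a process that is already running
prlimit --pid $(pidof qbittorrent) --as=4294967296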
Re: Limiting amount of memory a program can use. [ In reply to ]
On Sun, Aug 28, 2022 at 8:24 AM Dale <rdalek1967@gmail.com> wrote:
>
> What I would like to do is limit the amount of memory torrent
> software can use.

While ulimit/cgroups/etc will definitely do the job, they're probably
not the solution you want. Those will cause memory allocation to
fail, and I'm guessing at that point your torrent software will just
die.

I'd see if you can do something within the settings of the program to
limit its memory use, and then use a resource limit at the OS level as
a failsafe, so that a memory leak doesn't eat up all your memory.

Otherwise your next email will be asking how to automatically restart
a dead service. Systemd has support for that built in, and there are
also options for non-systemd, but you're going to have constant
restarts, and it might not even run for much time at all depending on
how bad the problem is. It is always best to tame memory use within
the application itself.
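
As an illustration of that failsafe-plus-restart idea on a systemd box
(a sketch only - the unit name, limit and paths are made up, and none of
this applies directly to an OpenRC setup):

# ~/.config/systemd/user/qbittorrent.service (hypothetical)
[Unit]
Description=qBittorrent with a memory cap and auto-restart

[Service]
ExecStart=/usr/bin/qbittorrent
# cgroup hard limit; past this the service gets OOM-killed
MemoryMax=4G
# bring it back automatically if that happens
Restart=on-failure
RestartSec=30

[Install]
WantedBy=default.target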

Something I wish Linux supported is discardable memory, for
caches/etc. A program should be able to allocate memory while passing
a hint to the kernel saying that the memory is discardable. If the
kernel is under memory pressure it can then just deallocate the memory
and have some way to notify the process that the memory is no longer
allocated. That might optionally support giving warning first, or
it might be some kind of new trappable exception for segfaults to
discarded memory. (Since access to memory doesn't involve system
calls, it might be hard to have more graceful error handling. I guess
an option would be to first tell the kernel to lock the memory before
accessing it, then release the lock, so that the memory isn't
discarded after checking that it is safe.)

--
Rich
Re: Limiting amount of memory a program can use. [ In reply to ]
On Sun, Aug 28, 2022 at 5:25 AM Dale <rdalek1967@gmail.com> wrote:
<SNIP> Is there even a mobo that has a 64GB option??? :/
>
> Dale
>
> :-) :-)

My ROG Strix 570 MB handles up to 128GB. The board and that much memory are
pretty expensive, though.

As to your question, I suspect sticking the app in a VM would more
completely limit its memory usage.

HTH,
Mark
Re: Limiting amount of memory a program can use. [ In reply to ]
On 28/08/2022 15:21, Rich Freeman wrote:
> Something I wish linux supported was discardable memory, for
> caches/etc. A program should be able to allocate memory while passing
> a hint to the kernel saying that the memory is discardable.

Linux DOES have that ...

I'm not sure how to use it, but you can pass a flag to open() which
says "do not cache this file". The obvious use case is something like
cp, but there are plenty of others. Or there are applications which
cache stuff in the app, so they don't want to waste OS cache as well -
databases are a prime example.

Apparently not only does memory use drop sharply when this is actively
used, but because the OS does not have to manage the cache, speed is
actually noticeably improved. Given the obsession kernel devs have with
not wasting memory and time, this is likely to be actively maintained
and supported. We just need to find out how to access this from user
space.
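
From the shell you can at least see the idea in action with dd's direct
I/O flags (purely an illustration, not a tuning recommendation - inside
a program it would be O_DIRECT on open(), or posix_fadvise()):

# copy a file while bypassing the page cache entirely
dd if=bigfile of=bigfile.copy bs=1M iflag=direct oflag=direct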

Cheers,
Wol
Re: Limiting amount of memory a program can use. [ In reply to ]
On 8/28/22 14:24, Dale wrote:
> Howdy,
>
> As most know, I got a much faster internet and I use torrent software,
> quite a lot.  I was using Ktorrent and it was OK but it was slow.  I
> started using Qbittorrent and like it better in a way but it has its own
> speed issues and both affect my desktop response.  I did some googling
> and used top to figure out that they are using a LOT of memory for
> cache.  At times, it uses well over half my memory just for cache.  It
> also gets to a point where it is using swap even tho I have swappiness
> set to 1, basically use swap only to prevent a crash from out of memory
> problems.  We all know how slow swap use can make things.  As it is, I
> reduced the number of active files which is not something I want to do.
> If I receive, I like to send as well.  After all, someone sent to me as
> well.  What I would like to do is limit the amount of memory torrent
> software can use.  I don't know exactly how to do that tho.  It's not
> something I've ever done.
>
> Is this something I do on the command line or a setting in some file
> somewhere?  I don't even know where to start on this.  By the way, I'm
> maxed out at 32GBs of memory for this mobo.  So adding memory isn't an
> option.  Is there even a mobo that has a 64GB option??? :/

Not really an answer to your question, but here I never had
speed/responsiveness/memory issues with transmission (Xfce, 16GB RAM,
50Mbit/s network bandwidth):

[I] net-p2p/transmission

     Available versions:  3.00-r1^t (~)3.00-r4^t **9999*l^t
{appindicator cli gtk lightweight mbedtls nls qt5 systemd test web}
     Installed versions:  3.00-r4^t(01:14:38 PM 05/29/2022)(cli nls
-appindicator -gtk -lightweight -mbedtls -qt5 -systemd -test)
     Homepage:            https://transmissionbt.com/
     Description:         A fast, easy, and free BitTorrent client

I use it without a GUI (-gtk -qt5) because I find the web interface just
fine. Also, being server-based, it runs regardless of who's logged into
the PC, which is a plus here.
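
Roughly, the headless setup is just this (the port and paths are the
usual defaults here, adjust as needed):

# start the daemon; settings end up in
# ~/.config/transmission-daemon/settings.json
transmission-daemon --download-dir /path/to/torrents
# then point a browser at the web UI:
#   http://<serverip>:9091/transmission/web/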

raffaele
Re: Limiting amount of memory a program can use. [ In reply to ]
On Sun, Aug 28, 2022 at 11:32 AM Wols Lists <antlists@youngman.org.uk> wrote:
>
> On 28/08/2022 15:21, Rich Freeman wrote:
> > Something I wish linux supported was discardable memory, for
> > caches/etc. A program should be able to allocate memory while passing
> > a hint to the kernel saying that the memory is discardable.
>
> Linux DOES have that ...
>
> I'm not sure how to use it, but you can pass a flag to open(), which
> says "do not cache this file".

That isn't what I was proposing. That has been around for a while.
However, it only impacts caching/buffering of open files, which is
kernel memory, and not memory held by a process itself.

You wouldn't want to limit kernel file caching of a torrent file,
since those files are pretty ideal candidates for caching (the
client is likely to send data not long after receiving it). There
isn't really any benefit to it in this case either, as the kernel
already automatically drops file cache memory as needed under memory
pressure.

I was referring to application caching, which is internal storage used
by an application that may or may not have anything to do with open
files. A browser cache is a good example of something like this
(though that often extends to disk caching, and those files would also
get cached by the kernel). The kernel can't simply drop this memory
when it needs memory, since there is no way for it to tell which of
the memory an application owns is used for this purpose, and no way
currently to handle the resulting segfaults when the application goes
to read that memory. Reading memory is a CPU instruction, not a system
call, so it would trigger an exception at the CPU level which must be
handled.

I'm sure something could be engineered, but it is more complex and I'm
guessing nobody has seen the need for it yet. Ideally you'd also have
ways for the OS to hint how much memory applications ought to consume
in this way. After all, you might have two applications running at
the same time which both want to use lots of extra RAM for caching if
it is available, but neither has any way of knowing that this
situation exists. It is undesirable if all the memory goes to the
first application that asks, and also undesirable if, after everything
is done grabbing memory, a bunch of it sits around doing nothing.

TL;DR: cache is not necessarily the same as kernel file cache.

--
Rich
Re: Limiting amount of memory a program can use. [ In reply to ]
ralfconn wrote:
> On 8/28/22 14:24, Dale wrote:
>> Howdy,
>>
>> As most know, I got a much faster internet and I use torrent software,
>> quite a lot.  I was using Ktorrent and it was OK but it was slow.  I
>> started using Qbittorrent and like it better in a way but it has its own
>> speed issues and both affect my desktop response.  I did some googling
>> and used top to figure out that they are using a LOT of memory for
>> cache.  At times, it uses well over half my memory just for cache.  It
>> also gets to a point where it is using swap even tho I have swappiness
>> set to 1, basically use swap only to prevent a crash from out of memory
>> problems.  We all know how slow swap use can make things.  As it is, I
>> reduced the number of active files which is not something I want to do.
>> If I receive, I like to send as well.  After all, someone sent to me as
>> well.  What I would like to do is limit the amount of memory torrent
>> software can use.  I don't know exactly how to do that tho.  It's not
>> something I've ever done.
>>
>> Is this something I do on the command line or a setting in some file
>> somewhere?  I don't even know where to start on this.  By the way, I'm
>> maxed out at 32GBs of memory for this mobo.  So adding memory isn't an
>> option.  Is there even a mobo that has a 64GB option??? :/
>
> Not really an answer to your question but here I never had
> speed/responsiveness/memory issues with transmission (Xfce, 16Gb RAM,
> 50Mbit/s network bandwidth ):
>
> [I] net-p2p/transmission
>
>      Available versions:  3.00-r1^t (~)3.00-r4^t **9999*l^t
> {appindicator cli gtk lightweight mbedtls nls qt5 systemd test web}
>      Installed versions:  3.00-r4^t(01:14:38 PM 05/29/2022)(cli nls
> -appindicator -gtk -lightweight -mbedtls -qt5 -systemd -test)
>      Homepage:            https://transmissionbt.com/
>      Description:         A fast, easy, and free BitTorrent client
>
> I use it without GUI (-gtk -qt5) because I find the web interface just
> fine. Also, being server-based it runs regardless of who's logged into
> the PC, which is a plus here.
>
> raffaele


I may look into that, just to see if it is even better than Ktorrent
and Qbittorrent.  Never know.  I may like it like Mikey.  lol  It's an
old TV commercial.

I'm not sure what happened, but it started acting better.  One, I did
remove a lot of items that were just sitting there.  Then I removed some
that had a high share ratio; I did a lot of sharing.  There was also an
update to QBT, but at first it didn't help any.  It was after a few
restarts that it seemed to improve.  I did notice that at one point it
was uploading around 60 or 70MB/sec and desktop response slowed some,
but not as bad as before.  I guess I need to limit what it uploads a
bit.  I know how to do that.

So, I think the update improved things but took a few restarts to kick
in or something.  Removing some unused files helped too.  I wonder if I
could run a torrent client on a Raspberry Pi?  :-)  lol

By the way, I notice this in df and mount:


root@fireball / # df -h | grep group
cgroup_root                 10M     0   10M   0% /sys/fs/cgroup
root@fireball / # mount | grep group
cgroup_root on /sys/fs/cgroup type tmpfs
(rw,nosuid,nodev,noexec,relatime,size=10240k,mode=755)
openrc on /sys/fs/cgroup/openrc type cgroup
(rw,nosuid,nodev,noexec,relatime,release_agent=/lib/rc/sh/cgroup-release-agent.sh,name=openrc)
none on /sys/fs/cgroup/unified type cgroup2
(rw,nosuid,nodev,noexec,relatime,nsdelegate)
gvfsd-fuse on /run/user/1000/gvfs type fuse.gvfsd-fuse
(rw,nosuid,nodev,relatime,user_id=1000,group_id=100)
portal on /run/user/1000/doc type fuse.portal
(rw,nosuid,nodev,relatime,user_id=1000,group_id=100)
root@fireball / #

I guess cgroups is enabled, but I never touched any of it.  I've only
read a few threads on here about it.
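
I guess I can check what that unified mount actually offers with this
(just reading a file, it changes nothing):

cat /sys/fs/cgroup/unified/cgroup.controllers

If "memory" shows up in the list, the memory.max knob mentioned earlier
in the thread should be there to use.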

Thanks to all.

Dale

:-)  :-) 
Re: Limiting amount of memory a program can use. [ In reply to ]
How many torrents are you seeding, for a point of comparison?

I use deluge (headless) on my home server, seeding anywhere from 500-
1000 individual torrents, representing ~5TB of total data, and the
process is cruising along at just over 1GB of memory used.

Maybe qbittorrent just doesn't scale well? I feel like I'm on the
exceedingly-high end of the spectrum with usage of a torrent program,
but I don't know what everyone else is doing.
Re: Limiting amount of memory a program can use. [ In reply to ]
--"Fascism begins the moment a ruling class, fearing the people may use their political democracy to gain economic democracy, begins to destroy political democracy in order to retain its power of exploitation and special privilege." Tommy Douglas




Aug 29, 2022, 10:07 by matt@connell.tech:

> How many torrents are you seeding, for a point of comparison?
>
> I use deluge (headless) on my home server, seeding anywhere from 500-
> 1000 individual torrents, representing ~5TB of total data, and the
> process is cruising along at just over 1GB of memory used.
>
> Maybe qbittorrent just doesn't scale well? I feel like I'm on the
> exceedingly-high end of the spectrum with usage of a torrent program,
> but I don't know what everyone else is doing.
>
Faster hard drives or a striped RAID array would help.  Also check out the
"Advanced" section of the preferences in Qbit.  You can control memory use
somewhat from there, and definitely turn off multiple connections from the
same host.
Re: Limiting amount of memory a program can use. [ In reply to ]
Matt Connell wrote:
> How many torrents are you seeding, for a point of comparison?
>
> I use deluge (headless) on my home server, seeding anywhere from 500-
> 1000 individual torrents, representing ~5TB of total data, and the
> process is cruising along at just over 1GB of memory used.
>
> Maybe qbittorrent just doesn't scale well? I feel like I'm on the
> exceedingly-high end of the spectrum with usage of a torrent program,
> but I don't know what everyone else is doing.
>
>


At one point, I had about 400 or so total: some downloading, some
finished and uploading, while some were doing both.  I may have to
switch to something else before it is over.  I may even put it on a
different machine or use something else to put my videos on the TV,
HTPC thingy.  Since I've found things that were hard to find elsewhere,
I want to return the favor.  So, I'd really like to hold onto those
that are hard to find.

After I removed some of the inactive ones, it is a lot better.  I also
limited the upload speed, which helped a lot.  According to atop, when
uploading at higher speeds, the reads were really putting a load on that
set of drives.  Does deluge have a GUI option?  Of course, if I put it
on another machine, I may go headless for it.  That's one reason I'm
asking.  Options.

And as soon as I think it is better, it slows down again and I find this
from top:


27672 dale 20 0 410.7g 4.3g 4.1g S 2.2 13.6 158:26.37
/usr/bin/qbittorrent


It's not nearly as bad as it was when I started this thread but still,
that's a lot of memory.

Dale

:-)  :-) 
Re: Limiting amount of memory a program can use. [ In reply to ]
On Mon, 2022-08-29 at 16:11 -0500, Dale wrote:
> Does deluge have a GUI option?  Of course, if I put it
> on another machine, I may go headless for it.  That's one reason I'm
> asking.  Options.

Yes, deluge has a GUI by default. I just build it with USE="-gtk -
libnotify -sound webinterface" and then access it at
http://<serverip>:port from a desktop PC instead.
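
Roughly (8112 is deluge-web's usual default port):

# start the daemon, then the web UI
deluged
deluge-web
# and browse to http://<serverip>:8112 from the desktop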

If you want more details about my setup I'm happy to provide.

Also, for the record, all of my torrent data is on a *USB external hard
disk*, which is about the slowest possible workable solution, but it
still performs adequately for my needs.