Mailing List Archive

Aggregate file requests [RE: Requests by half-hour]
Thanks for the quick response. I've got another question for you.

Is it possible to combine requests for all files that start with the
same name?

Example:
I have a bunch of files that have this format:
show1YYYYMMDD.mp3
show2YYYYMMDD.mp3
Show3YYYYMMDD.mp3
ForumYYYYMMDD.mp3 etc...

And each one of those examples has five other files with the same first
half of the name (such as show1), but with a different ending date.

Is there a way to combine the statistics for all show1YYYYMMDD.mp3
files, all show2YYYYMMDD.mp3 files, etc., so the aggregate requests for
show1 can be compared to the aggregate requests for show2 and show3,
etc.?

Thanks,
Morgen


-----Original Message-----
From: analog-help-bounces@lists.meer.net
[mailto:analog-help-bounces@lists.meer.net] On Behalf Of Aengus
Sent: Thursday, November 29, 2007 2:50 PM
To: Support for analog web log analyzer
Subject: Re: [analog-help] Requests by half-hour


Morgen Nilsson <mnilsson@kuow.org> wrote:
> Can Analog report a list of when each file (mp3) was
> accessed/requested?
>
>
> I currently have it reporting how many total requests for mp3 files
> per quarter-hour using 'QUARTERREP ON', and it also reports how many
> times each mp3 file was requested using 'REQCOLS NRr' (I think).
>
> I want to be able to tell when each file was downloaded, and a graph
> that marks the number of requests, per file, per half-hour.

http://analog.cx/docs/faq.html#faq128

The short answer is no. You can get a report of how many times any
single .MP3 file was downloaded during set periods (five minutes, quarter
hour, hourly, daily or weekly), but there isn't any way within Analog to
get separate periodic totals for separate files in a single run. You can
either run Analog against each .MP3 file separately, or (possibly more
useful) run Analog once every half hour and import the data into some
other application to display the results.
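
If you take the per-file route, the configuration for one run might look
roughly like the lines below. The path, pattern and output name are only
guesses at your setup, so adjust them to whatever actually appears in your
logfile:

# Illustrative per-file run for the show1 programme (paths are assumptions)
FILEINCLUDE /audio/show1*.mp3
QUARTERREP ON
OUTFILE show1.html

You would then repeat the run with a different FILEINCLUDE pattern and
OUTFILE for show2, show3, and so on.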

Aengus


+------------------------------------------------------------------------
| TO UNSUBSCRIBE from this list:
| http://lists.meer.net/mailman/listinfo/analog-help
|
| Analog Documentation: http://analog.cx/docs/Readme.html
| List archives: http://www.analog.cx/docs/mailing.html#listarchives
| Usenet version: news://news.gmane.org/gmane.comp.web.analog.general
+------------------------------------------------------------------------
Re: Aggregate file requests [RE: Requests by half-hour]
Morgen Nilsson <mnilsson@kuow.org> wrote:
> Thanks for the quick response. I've got another question for you.
>
> Is it possible to combine requests for all files that start with the
> same name?
>
> Example:
> I have a bunch of files that have this format:
> show1YYYYMMDD.mp3
> show2YYYYMMDD.mp3
> Show3YYYYMMDD.mp3
> ForumYYYYMMDD.mp3 etc...
>
> And each one of those examples has five other files with the same first
> half of the name (such as show1), but with a different ending date.
>
> Is there a way to combine the statistics for all show1YYYYMMDD.mp3
> files, all show2YYYYMMDD.mp3 files, etc., so the aggregate requests for
> show1 can be compared to the aggregate requests for show2 and show3,
> etc.?

FILEALIAS show1*.mp3 show1.mp3

http://analog.cx/docs/alias.html
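
Concretely, that means one alias line per programme prefix, along the lines
of the sketch below. The patterns here are assumptions about how the
filenames appear in your logfile (you may need to include the directory
path), so treat this as a starting point rather than a ready-made config:

# Fold every dated file into a single aggregate name (illustrative patterns)
FILEALIAS show1*.mp3 show1.mp3
FILEALIAS show2*.mp3 show2.mp3
FILEALIAS Show3*.mp3 Show3.mp3
FILEALIAS Forum*.mp3 Forum.mp3

The Request Report should then show one combined total for each programme,
which you can compare directly.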

Aengus

+------------------------------------------------------------------------
| TO UNSUBSCRIBE from this list:
| http://lists.meer.net/mailman/listinfo/analog-help
|
| Analog Documentation: http://analog.cx/docs/Readme.html
| List archives: http://www.analog.cx/docs/mailing.html#listarchives
| Usenet version: news://news.gmane.org/gmane.comp.web.analog.general
+------------------------------------------------------------------------