Mailing List Archive

Populating the .xmltv file for my grabber from mythtv database?
I've got the channel lineup I want inside MythTV, but I can't download
programming info because my entire Cable.xmltv file looks like this:

cache=/home/fred/.xmltv/tv_grab_zz_sdjson.cache
channel-id-format=mythtv
previously-shown-format=date
username=redacted
password=redacted
mode=channels
lineup=USA-CA04459-X

Apparently it's supposed to have a list of all the channels I want updates for.
Is there an easy way to populate the file with the channels from the
channel lineup I currently have in my mythconverg? (I'm trying to avoid
adding the additional 800 channels I *don't* want.)

Cheers
Re: Populating the .xmltv file for my grabber from mythtv database? [ In reply to ]
On Mon, 2021-04-05 at 14:44 -0700, Fred Hamilton wrote:
> [...]
> _______________________________________________
> mythtv-users mailing list
> mythtv-users@mythtv.org
> http://lists.mythtv.org/mailman/listinfo/mythtv-users
> http://wiki.mythtv.org/Mailing_List_etiquette
> MythTV Forums: https://forum.mythtv.org

i more or less followed the wiki for
sdjson https://www.mythtv.org/wiki/XMLTV#Setup_with_tv_grab_zz_sdjson
and first got a list of all the channels, then i did a quick search and
replace of all '=' with '!' using sed. then went back in and corrected the
first lines not dealing with channels, and restored the '=' on the ones i
use. still takes a while, but better than having to take out 800 one by
one. in my case i only use like 50 or 60.
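That search-and-replace step can be sketched as a one-liner. This is only a sketch: the `channel=` prefix on the selection lines is an assumption about the tv_grab_zz_sdjson config layout, and it runs here on a throwaway stand-in file rather than the real ~/.xmltv/tv_grab_zz_sdjson.conf.

```shell
# Stand-in for ~/.xmltv/tv_grab_zz_sdjson.conf (hypothetical contents).
CONF=$(mktemp)
printf 'username=me\nmode=channels\nchannel=12345\nchannel=67890\n' > "$CONF"

# Disable every channel line by turning 'channel=' into 'channel!',
# leaving config keys like 'username=' untouched; re-enable keepers by hand.
sed -i.bak 's/^channel=/channel!/' "$CONF"
cat "$CONF"
```

Anchoring on `^channel=` avoids the second pass glen describes (flipping every '=' and then repairing the non-channel header lines).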

i also used this handy util channel
editor https://www.mythtv.org/wiki/Channel_Editor to have a reference
while i edited the conf file for sdjson. there's probably a way to export
from the channel editor (which it supports) and make a script to adjust or
recreate the conf file for sdjson.
Re: Populating the .xmltv file for my grabber from mythtv database? [ In reply to ]
I found the perfect solution - Simon Hobson patiently tried to explain it
to me 2 weeks ago in the "Culling my MythTV channels" thread, but I didn't
understand it at the time. All I needed was his scripts, around the 10th
message in the thread. I adapted those for my directory locations and
grabber, and very quickly had a .xmltv file with only the channels already in
mythtv, and mythfilldatabase is working normally again. Thanks community,
and particularly Simon for those scripts!
Re: Populating the .xmltv file for my grabber from mythtv database? [ In reply to ]
On Mon, Apr 5, 2021 at 3:04 PM glen <glenb@glenb.net> wrote:

> [...]

Thanks - that channel editor looks like it might be a really good way to
maintain the list.
Re: Populating the .xmltv file for my grabber from mythtv database? [ In reply to ]
I am just now attempting the upgrade to version 31, and the cutover from DD
to XMLTV.

I successfully configured the sqlite grabber to pull from SD, and noticed
that it downloaded all of the channels on my SD lineup, even though most
are not selected. I guess this is intended behavior?

Seeing how tedious it is to mark channels as "selected" in the Sqlite
database, I came up with a slightly different way to transfer the channels
I currently care about.

I run this query in my existing mythconverg database:

SELECT CONCAT('update channels set selected = 1 where channum = ', channum, ';')
  AS combined FROM channel WHERE visible = 1;

Redirect this output to a file named selected.sql; this creates the sqlite
UPDATE statements that set the correct channels to "selected" in the new
grabber database, matching what you now have as visible channels in mythtv.

Then, just use the grabber tool to mark all the channels as "not selected"
by default, and then feed the file from above into sqlite as shown
on the xmltv wiki:

sqlite3 $HOME/.xmltv/SchedulesDirect.DB < selected.sql
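The whole transfer can be sketched end-to-end. Everything here is a stand-in: the table and column names (`channels`, `selected`, `channum`) follow this thread, but the demo builds a throwaway SQLite database instead of touching a real SchedulesDirect.DB, and selected.sql is hand-written where you would generate it from mythconverg with the CONCAT query above.

```shell
# Stand-in for $HOME/.xmltv/SchedulesDirect.DB (assumed schema).
DB=$(mktemp -u).db
sqlite3 "$DB" "CREATE TABLE channels(channum TEXT, selected INTEGER);
               INSERT INTO channels VALUES ('702',1),('703',1),('704',1);"

# In real use this file comes from the mythconverg CONCAT query.
printf 'UPDATE channels SET selected = 1 WHERE channum = 703;\n' > selected.sql

# Step 1: deselect everything; step 2: replay the generated statements.
sqlite3 "$DB" "UPDATE channels SET selected = 0;"
sqlite3 "$DB" < selected.sql
sqlite3 "$DB" "SELECT channum FROM channels WHERE selected = 1;"
```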

This seems simpler than some of the other methods I've seen on the wiki and
elsewhere...

Moving on... I need some advice before I cross the Rubicon on this v31 +
xmltv upgrade:

After I do the 31 upgrade, and since I only use an HDHR, I plan to just
blow away my channels and repopulate. I think all the channel ids will
line up the same as before.

Currently, in v30, I run mythfilldatabase as a cron job. Should I continue
to do that, or switch to another method? I've never let mythtv natively
run the mythfilldatabase process. I'm having a hard time following the
intent of the setup-video-sources wiki page on this topic. Advice
appreciated!

Thanks,
Larry






On Mon, Apr 5, 2021 at 7:06 PM Fred Hamilton <fred@yonkitime.com> wrote:

> [...]
Re: Populating the .xmltv file for my grabber from mythtv database? [ In reply to ]
On Sun, 2 May 2021 at 14:55, Larry Kennedy <lunchtimelarry@gmail.com> wrote:

> [...]
I have a combined BE/FE which shuts down when not being used and wakes up
for recordings. I run this script from crondaily and it seems to work out
OK.

#!/usr/bin/bash
/usr/local/bin/mythshutdown --lock

tv_grab_zz_sdjson --days 10 --config-file ~/.xmltv/tv_grab_zz_sdjson.conf \
    --output ~/sd_listing.xml 2>/dev/null

/usr/local/bin/mythfilldatabase --only-update-guide --max-days 10 --file \
    --sourceid 2 --xmlfile ~/sd_listing.xml 2>/dev/null

/usr/local/bin/mythshutdown --unlock

It's not an ideal solution: I have a low-powered ION motherboard
with only 4GB RAM, and this more or less guarantees that MFD will run while
a recording is in progress, when I'd prefer to ensure it ran when the box
wasn't recording. Also, if I ever go more than 10 days without a
scheduled recording the whole thing will come to a stop, because it will run
out of guide data and never wake up to load any more. In practice, neither
of those things causes a problem.


Although SD returns 18 days of guide data I've found that many channels
just have 'boilerplate' programme information after 10 days or so, so I
limit both SD and MFD to 10.
Re: Populating the .xmltv file for my grabber from mythtv database? [ In reply to ]
On Tue, May 4, 2021 at 12:58 PM David Watkins <watkinshome@gmail.com> wrote:

> [...]
>
I attempted an upgrade over the weekend, but ended up rolling it back.
Kudos to the myth devs that created the backup and restore scripts!

I'm still on v30 with Schedules Direct DataDirect and a cron job to run
mythfilldatabase. To get to v31 and the new xmltv grabber, I decided to
do the XMLTV upgrade now, followed by an upgrade to v31, say, a week
later. I chose this path since it appears that v31 would force the same
upgrade of XMLTV, and I wanted to decouple these two things. If I
approached this incorrectly, please let me know.

I ran the xmltv install, then followed the wiki to configure the sqlite
grabber. This all appeared to work, as I was able to see like 1,000 rows
of channel data in the sqlite database. As indicated in this thread, I
created a simple way to set the right channels as "selected".

After that, I went into mythtv-setup's channel editor and deleted all my
channels. I only have one source, which is fed by an HDHR.

Then, I added a new video source that matches the name of the grabber
config, set this up as Multinational, and done.

Then I went into the other setup screen where I map the video source to my
HDHR tuners. I forget the name of that one. Capture cards?

After this, I ran mythfilldatabase. Looking at mythweb, I could see it
populated the channels back into the database, but I am pretty sure it put
in more than the ones I marked as "selected" in the Sqlite database.
Strange. The listings started to appear, but VERY slowly.

Mythfilldatabase ran for 4+ hours, and since I was getting close to the
time I needed to finish or bail out, I killed it. I then tried running the
script that does 3 days at a time, and this ran to full completion right
away, but never fully populated my listings. I still had major gaps in the
listings. Some were there and some never showed up.

Any advice before I give this another go this coming weekend? Is my
strategy flawed? Did I miss a step? How best to run mythfilldatabase so
that it doesn't take all day? Is it the 3 days at a time script?

Larry



Re: Populating the .xmltv file for my grabber from mythtv database? [ In reply to ]
On Tue, 4 May 2021 13:32:55 -0400, you wrote:

>[...]

As I understand it (not being an SD user), the first time you run
mythfilldatabase after installing the new setup, it does take hours to
run. Subsequent updates need to be limited to only a few days to make
the time reasonable. This has been discussed here several times, and
also on the forum, so a search here:

https://lists.archive.carbon60.com/mythtv/users/

and

here:

https://forum.mythtv.org/

should help.

Unfortunately, it seems that Google searches these days are not
finding the mailing list references much, unless you have a very
specific keyword.

Looking on as a disinterested party, I have never understood why the
SD json EPG requires such long times to run. The actual process of
putting the EPG data from my XMLTV generated sources into the database
takes less than a minute. I have two sources, 141 channels and one
week of EPG data.
Re: Populating the .xmltv file for my grabber from mythtv database? [ In reply to ]
On Wed, 5 May 2021 at 05:06, Stephen Worthington <stephen_agent@jsw.gen.nz>
wrote:

> [...]



When I configured xmltv for the Schedules Direct lineup at my location it
picked up about 140 channels, which is consistent with what MythTV picks up
with a channel scan and puts into the database.

In MythTV I've marked all but 51 channels as 'invisible', and I've disabled
those channels in xmltv as well, by replacing the '=' with a '!' in the
relevant line of the xmltv config file.

I have pretty low-powered hardware (Zotac ION / 4GB RAM). Running the
xmltv grab for 10 days takes 2 or 3 minutes to fetch the file. Running
mythfilldatabase on that file takes about 20 minutes the first time (when
the guide database table is empty) and about 5 minutes when it only has to
update the 10th day (assuming that no earlier entries have changed).

So my daily mythfilldatabase run takes about 7 minutes. Reducing the number
of channels from 140 to 51 helped, as did reducing the days from 18 to 10.

I only have one data source. When I had two, I ran xmltv once to grab
the data, but I had to run mythfilldatabase twice (once for each source).
This slowed things down somewhat.

I have to say that working out the xmltv IDs for each channel is not
trivial, and there can be maintenance to do after a channel rescan. I have a
couple of SQL queries and an Excel spreadsheet to help keep track of
things, and I've seen people post similar schemes here and on the wiki. I
believe there's work in hand to preserve the xmltvid, icon name and other
channel parameters across a rescan, so things should get a bit easier
sometime.
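
By way of illustration (the table and column names are MythTV's
mythconverg schema, but the query itself is just a sketch, not the exact
one I use), one handy check after a rescan is to list visible channels
that have lost their xmltvid:

```shell
# List visible channels with no xmltvid set; columns are from the
# mythconverg "channel" table. Run the string via the mysql client.
query="SELECT chanid, channum, name FROM channel WHERE visible = 1 AND (xmltvid IS NULL OR xmltvid = '');"
printf '%s\n' "$query"
# e.g.:  mysql -u mythtv -p mythconverg -e "$query"
```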

HTH

D
Re: Populating the .xmltv file for my grabber from mythtv database? [ In reply to ]
On Wed, May 5, 2021 at 4:19 AM David Watkins <watkinshome@gmail.com> wrote:

> [snip]
>
> So my daily mythfilldatabase run takes about 7 minutes. Reducing the
> number of channels from 140 to 51 helped, as did reducing the days from 18
> to 10.
>
> D
>

I'm attempting the XMLTV upgrade again today. For background, my myth
backend is a virtual machine with 8GB RAM and 4 vCPUs on an i7-9700K host.
I ran the script to optimize the database, just in case.

My number one issue is how long mythfilldatabase takes to run. I've only
got 125 channels in my lineup, with one source. I'm running the script
that chunks the work into 3-day increments, as seen on the
mythfilldatabase wiki.

The first 3-day chunk took 70 minutes. This seems quite long to me, but
at least it seems to be working -- the gaps in my listings are disappearing.

Should I expect this to be the case every day? I'm thinking seven 3-day
chunks of 70 minutes each will take a total of ~8 hours.

Larry




> _______________________________________________
> mythtv-users mailing list
> mythtv-users@mythtv.org
> http://lists.mythtv.org/mailman/listinfo/mythtv-users
> http://wiki.mythtv.org/Mailing_List_etiquette
> MythTV Forums: https://forum.mythtv.org
>
Re: Populating the .xmltv file for my grabber from mythtv database? [ In reply to ]
On Sun, 9 May 2021 at 20:22, Larry Kennedy <lunchtimelarry@gmail.com> wrote:

> [snip]
>
> The first 3-day chunk took 70 minutes. This seems quite long to me, but
> at least it seems to be working -- the gaps in my listings are disappearing.
>
> Should I expect this to be the case every day? I'm thinking seven 3-day
> chunks of 70 minutes each will take a total of ~8 hours.
>
> Larry
>


That seems an extraordinarily long time. As I said, my 10-day chunk of 50
channels takes about 25 minutes on way less hardware.

Can you find out whether it's the xmltv stage or the mythfilldatabase stage
which is taking the time? Maybe in the logs, or run them separately by
hand.
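
One low-tech way to do that by hand (the helper below is just a sketch;
substitute the real grabber and mythfilldatabase invocations for the
placeholder command):

```shell
# Tiny helper: run a command and report on stderr how long it took.
time_stage() {
    label=$1; shift
    start=$(date +%s)
    "$@"; rc=$?
    end=$(date +%s)
    echo "$label took $((end - start))s (exit $rc)" >&2
    return $rc
}

# Placeholder command; in practice something like:
#   time_stage grab tv_grab_zz_sdjson_sqlite --days 3 --config-file ... > listing.xml
#   time_stage mfd  mythfilldatabase --file --sourceid 2 --xmlfile listing.xml
time_stage demo sleep 1
```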

D
Re: Populating the .xmltv file for my grabber from mythtv database? [ In reply to ]
David Watkins wrote:
> I have a combined BE/FE which shuts down when not being used and wakes up
> for recordings. I run this script from crondaily and it seems to work out
> OK.
>
> #!/usr/bin/bash
> /usr/local/bin/mythshutdown --lock
>
> tv_grab_zz_sdjson --days 10 --config-file ~/.xmltv/tv_grab_zz_sdjson.conf
> --output ~/sd_listing.xml 2>/dev/null
>
> /usr/local/bin/mythfilldatabase --only-update-guide --max-days 10 --file
> --sourceid 2 --xmlfile ~/sd_listing.xml 2>/dev/null
>
> /usr/local/bin/mythshutdown --unlock
>
> It's not an ideal solution because I have a low powered ION motherboard
> with only 4GB RAM and this more or less guarantees that MFD will run while
> a recording is in progress when I'd prefer to ensure it ran when the box
> wasn't recording. Also, if I ever go more than 10 days without a
> scheduled recording the whole thing will come to a stop because it will run
> out of guide data and never wake up to load any more. In practice neither
> of those things causes a problem.

For your first not-currently-a-problem: you could switch to
running your grabber/MFD as part of shutdown of the box.

https://opensource.com/life/16/11/running-commands-shutdown-linux

shows methods for doing so with sysvinit and systemd.
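
A minimal sketch of the systemd variant (the unit and script names here
are assumptions): a oneshot service with RemainAfterExit=yes, whose
ExecStop commands run when the unit is stopped, i.e. during shutdown:

```ini
# /etc/systemd/system/epg-at-shutdown.service (hypothetical name)
[Unit]
Description=Grab EPG data during shutdown

[Service]
Type=oneshot
RemainAfterExit=yes
ExecStart=/bin/true
# ExecStop runs when the unit stops, which happens at system shutdown.
ExecStop=/usr/local/bin/update-guide.sh
TimeoutStopSec=30min

[Install]
WantedBy=multi-user.target
```

Enable it once with `systemctl enable epg-at-shutdown.service` and it will
fire on every shutdown thereafter.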

You would probably want to create a lock file and only run if that
file is more than 24 hours old. Remember that ctime is the metadata
change time and mtime is the content modification time -- echoing
something to that file will both create it if necessary and update
its mtime, so that's your best bet.
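
Sketched in shell (the stamp-file path is just an example), the 24-hour
check might look like:

```shell
# Run the update only if the stamp file is missing or its mtime is more
# than 24 hours (1440 minutes) old.
stamp=$(mktemp -u)   # stand-in path; e.g. ~/.mythtv/last-mfd in practice
ran=0
if [ ! -e "$stamp" ] || [ -n "$(find "$stamp" -mmin +1440 2>/dev/null)" ]; then
    # ... grabber + mythfilldatabase would go here ...
    ran=1
    echo "done" > "$stamp"   # creates the file and updates mtime in one step
fi
echo "ran=$ran"
```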

For your second not-currently-a-problem, you would probably have
to parse the current rtcwake value, decide if it's too far into
the future, and if so, save it, set the new value to n<10 days
out, and then on boot pull the old value out of a file and
re-apply it.

-dsr-
Re: Populating the .xmltv file for my grabber from mythtv database? [ In reply to ]
On Mon, May 10, 2021 at 3:10 AM David Watkins <watkinshome@gmail.com> wrote:

>
>
> On Sun, 9 May 2021 at 20:22, Larry Kennedy <lunchtimelarry@gmail.com>
> wrote:
>
>>
>>
>> On Wed, May 5, 2021 at 4:19 AM David Watkins <watkinshome@gmail.com>
>> wrote:
>>
>>>
>>>
>>> On Wed, 5 May 2021 at 05:06, Stephen Worthington <
>>> stephen_agent@jsw.gen.nz> wrote:
>>>
>>>> On Tue, 4 May 2021 13:32:55 -0400, you wrote:
>>>>
>>>> >On Tue, May 4, 2021 at 12:58 PM David Watkins <watkinshome@gmail.com>
>>>> wrote:
>>>> >
>>>> >>
>>>> >>
>>>> >> On Sun, 2 May 2021 at 14:55, Larry Kennedy <lunchtimelarry@gmail.com
>>>> >
>>>> >> wrote:
>>>> >>
>>>> >>> I am just now attempting the upgrade to version 31, and the cutover
>>>> from
>>>> >>> DD to XMLTV.
>>>> >>>
>>>> >>> I successfully configured the sqlite grabber to pull from SD, and
>>>> noticed
>>>> >>> that it downloaded all of the channels on my SD lineup, even though
>>>> most
>>>> >>> are not selected. I guess this is intended behavior?
>>>> >>>
>>>> >>> Seeing how tedious it is to mark channels as "selected" in the
>>>> Sqlite
>>>> >>> database, I came up with a slightly different way to transfer the
>>>> channels
>>>> >>> I currently care about.
>>>> >>>
>>>> >>> I run this query in my existing mythconverg database:
>>>> >>>
>>>> >>> SELECT CONCAT('update channels set selected = 1 where channum = ',
>>>> >>> channum ,";") AS combined FROM channel where visible=1
>>>> >>>
>>>> >>> Pipe this output to a file named selected.sql which creates the
>>>> sqlite
>>>> >>> update statements to set the correct channels to "selected" in the
>>>> new
>>>> >>> grabber database to match what you now have as visible channels in
>>>> mythtv.
>>>> >>>
>>>> >>> Then, just use the grabber tool to make all the channels as "not
>>>> >>> selected" by default, and then pipe the file from above into the
>>>> sqlite
>>>> >>> query as seen on the xmltv wiki:
>>>> >>>
>>>> >>> sqlite3 $HOME/.xmltv/SchedulesDirect.DB < selected.sql
>>>> >>>
>>>> >>> This seems simpler than some of the other methods I've seen on the
>>>> wiki
>>>> >>> and elsewhere...
>>>> >>>
>>>> >>> Moving on....I need some advice before I go past the rubicon on
>>>> this v31
>>>> >>> + xmltv upgrade:
>>>> >>>
>>>> >>> After I do the 31 upgrade, and since I only use an HDHR, I plan to
>>>> just
>>>> >>> blow away my channels and repopulate. I think all the channel ids
>>>> will
>>>> >>> line up the same as before.
>>>> >>>
>>>> >>> Currently, in v30, I run mythfilldatabase as a cron job. Should I
>>>> >>> continue to do that, or switch to another method? I've never let
>>>> mythtv
>>>> >>> natively run the mythfilldatabase process. I'm having a hard time
>>>> >>> following the intent of the setup-video-sources wiki page on this
>>>> topic.
>>>> >>> Advice appreciated!
>>>> >>>
>>>> >>> Thanks,
>>>> >>> Larry
>>>> >>>
>>>> >>>
>>>> >> I have a combined BE/FE which shuts down when not being used and
>>>> wakes up
>>>> >> for recordings. I run this script from crondaily and it seems to
>>>> work out
>>>> >> OK.
>>>> >>
>>>> >> #!/usr/bin/bash
>>>> >> /usr/local/bin/mythshutdown --lock
>>>> >>
>>>> >> tv_grab_zz_sdjson --days 10 --config-file
>>>> ~/.xmltv/tv_grab_zz_sdjson.conf
>>>> >> --output ~/sd_listing.xml 2>/dev/null
>>>> >>
>>>> >> /usr/local/bin/mythfilldatabase --only-update-guide --max-days 10
>>>> --file
>>>> >> --sourceid 2 --xmlfile ~/sd_listing.xml 2>/dev/null
>>>> >>
>>>> >> /usr/local/bin/mythshutdown --unlock
>>>> >>
>>>> >> It's not an ideal solution because I have a low powered ION
>>>> motherboard
>>>> >> with only 4GB RAM and this more or less guarantees that MFD will
>>>> run while
>>>> >> a recording is in progress when I'd prefer to ensure it ran when the
>>>> box
>>>> >> wasn't recording. Also, if I ever go more than 10 days without a
>>>> >> scheduled recording the whole thing will come to a stop because it
>>>> will run
>>>> >> out of guide data and never wake up to load any more. In practice
>>>> neither
>>>> >> of those things causes a problem.
>>>> >>
>>>> >>
>>>> >> Although SD returns 18 days of guide data I've found that many
>>>> channels
>>>> >> just have 'boilerplate' programme information after 10 days or so,
>>>> so I
>>>> >> limit both SD and MFD to 10.
>>>> >>
>>>> >>
>>>> >I attempted an upgrade over the weekend, but ended up rolling it back.
>>>> >Kudos to the myth devs that created the backup and restore scripts!
>>>> >
>>>> >I'm still on v30 with Schedules Direct DataDirect and a cron job to run
>>>> >mythfilldatabase. To get to v31 and the new xmltv grabber, I decided
>>>> to
>>>> >do the XMLTV upgrade now, followed by an upgrade to v31, say, a week
>>>> >later. I chose this path since it appears that v31 would force the
>>>> same
>>>> >upgrade of XMLTV, and I wanted to decouple these two things. If I
>>>> >approached this incorrectly, please let me know.
>>>> >
>>>> >I ran the xmltv install, then followed the wiki to configure the sqlite
>>>> >grabber. This all appeared to work, as I was able to see like 1,000
>>>> rows
>>>> >of channel data in the sqlite database. As indicated in this thread,
>>>> I
>>>> >created a simple way to set the right channels as "selected".
>>>> >
>>>> >After that, I went into mythtv-setup, channel editor, and deleted all
>>>> my
>>>> >channels. I only have one source that's by HDHR.
>>>> >
>>>> >Then, I added a new video source that matches the name of the grabber
>>>> >config, set this up as Multinational, and done.
>>>> >
>>>> >Then I went into the other setup screen where I map the video source
>>>> to my
>>>> >HDHR tuners. Forget the name of that one. Capture cards?
>>>> >
>>>> >After this, I ran mythfilldatabase. Looking at mythweb, I could see it
>>>> >populated the channels back into the database, but I am pretty sure it
>>>> put
>>>> >in more than the ones I marked as "selected" in the Sqlite database
>>>> >Strange. The listing started to appear, but VERY slowly.
>>>> >
>>>> >Mytfilldatabase ran for like 4+ hours, and since I was getting close
>>>> to the
>>>> >time I needed to finish or bail out, I killed it. I then tried
>>>> running the
>>>> >script that does 3 days at a time, and this ran to full completion
>>>> right
>>>> >away, but never fully populated my listings. I still had major gaps
>>>> in the
>>>> >listings. Some were there and some never showed up.
>>>> >
>>>> >Any advice before I give this another go this coming weekend? Is my
>>>> >strategy flawed? Did I miss a step? How best to run mythfilldatabase
>>>> so
>>>> >that it doesn't take all day? Is it the 3 days at a time script?
>>>> >
>>>> >Larry
>>>>
>>>> As I understand it (not being an SD user), the first time you run
>>>> mythfilldatabase after installing the new setup, it does take hours to
>>>> run. Subsequent updates need to be limited to only a few days to make
>>>> the time reasonable. This has been discussed here several times, and
>>>> also on the forum, so a search here:
>>>>
>>>> https://lists.archive.carbon60.com/mythtv/users/
>>>>
>>>> and
>>>>
>>>> here:
>>>>
>>>> https://forum.mythtv.org/
>>>>
>>>> should help.
>>>>
>>>> Unfortunately, it seems that Google searches these days are not
>>>> finding the mailing list references much, unless you have a very
>>>> specific keyword.
>>>>
>>>> Looking on as a disinterested party, I have never understood why the
>>>> SD json EPG requires such long times to run. The actual process of
>>>> putting the EPG data from my XMLTV generated sources into the database
>>>> takes less than a minute. I have two sources, 141 channels and one
>>>> week of EPG data.
>>>
>>>
>>>
>>> When I configured xmltv for the Schedules Direct lineup at my location
>>> it picked up about 140 channels, which is consistent with what MythTV picks
>>> up with a channel scan and puts into the database.
>>>
>>> In MythTV I've marked all but 51 channels as 'invisible' and I've
>>> disabled those channels in xmltv as well, by replacing the '=' with a '!'
>>> in the relative line in the xmltv config file.
>>>
>>> I have pretty low powered hardware (Zotac ION / 4GB RAM). Running the
>>> xmltv grab for 10 days takes 2 or 3 minutes to grab the file. Running
>>> mythfilldatabase on that file takes about 20 minutes the first time (when
>>> the guide database table is empty) and about 5 minutes when it only has to
>>> update the 10th day (assuming that no earlier entries have changed).
>>>
>>> So my daily mythfilldatabase run takes about 7 minutes. Reducing the
>>> number of channels from 140 to 51 helped, as did reducing the days from 18
>>> to 10.
>>>
>>> I only have one data source. When I had two then I ran xmltv once to
>>> grab the data, but I had to run mythfilldatabase twice(once for each
>>> source). This slowed things down somewhat.
>>>
>>> I have to say that working out the xmltv id's for each channel is not
>>> trivial and there can be maintenance to do after a channel rescan. I have a
>>> couple of SQL queries and an Excel spreadsheet to help keep track of
>>> things, and I've seen people post similar schemes here and on the WiKi. I
>>> believe that there's work in hand to preserve xmltvid, icon name and other
>>> channel parameters after a rescan so things should get a bit easier
>>> sometime.
>>>
>>> HTH
>>>
>>> D
>>>
>>
>> I'm attempting the XMLTV upgrade again today. For background, my myth
>> backend is a virtual machine with 8GB RAM and 4 vCPUs on an i7-9700K host.
>> I ran the script to optimize the database, just in case.
>>
>> My number one issue is the length of time it is taking mythfilldatabase
>> to run. I've only got 125 channels in my lineup with one source. I'm
>> running the script that chunks the work into 3-day increments as seen on
>> the mythfilldatabase wiki.
>>
>> The first 3-day chunk took 70 minutes. This seems quite long to me, but
>> at least it seems to be working -- the gaps in my listings are disappearing.
>>
>> Should I expect this to be the case every day? I'm thinking seven 3-day
>> chunks of 70 minutes each will take a total of ~8 hours.
>>
>> Larry
>>
>
>
> That seems an extraordinarily long time. As I said, my 10-day chunk of 50
> channels takes about 25 minutes on way less hardware.
>
> Can you find out whether it's the xmltv stage or the mythfilldatabase
> stage which is taking the time? Maybe in the logs, or run them separately
> by hand.
>
> D
>

Sorry for the repeated posts. From what I can tell, all the time is spent
in the mythfilldatabase stage. When I run the grabber command below in
isolation, it comes back quickly, albeit with a ton of data.

The initial run took well over 12 hours. After that finished, a subsequent
run took 4 hours.

I've noticed with verbose mythfill logging there are a bazillion
queries/inserts on the people, credits, programrating, and programgenre
tables. Stuff that appears to be optional? Not sure what that is all
about or why I really need it. These tables have hundreds of thousands of
rows.
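For what it's worth, a quick way to confirm how big those tables have become. This is a sketch against the stock mythconverg schema (the four table names are the ones mythfilldatabase was hammering in the verbose log), run from the mysql client:

```sql
-- Row counts for the tables mythfilldatabase writes heavily during an import.
SELECT 'people' AS tbl, COUNT(*) AS cnt FROM people
UNION ALL SELECT 'credits', COUNT(*) FROM credits
UNION ALL SELECT 'programrating', COUNT(*) FROM programrating
UNION ALL SELECT 'programgenre', COUNT(*) FROM programgenre;
```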

Here is the script I am running:

#!/bin/bash
SOURCEID=2
grabber="tv_grab_zz_sdjson_sqlite"
rm -f /tmp/tv_grab_off*.xml
for (( offset = 0; offset < 20; offset += 3 )) ; do
    "$grabber" --days 3 --offset $offset \
        --config-file $HOME/.mythtv/xfinity.xmltv > /tmp/tv_grab_off$offset.xml
    mythfilldatabase --file --sourceid $SOURCEID \
        --xmlfile /tmp/tv_grab_off$offset.xml
done
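As a sanity check on the loop bounds: the for-loop in the script generates seven offsets (0, 3, ..., 18), so each full pass runs the grabber and mythfilldatabase seven times. A standalone sketch of just that arithmetic:

```shell
# Enumerate the chunk offsets the script's loop generates:
# seven 3-day chunks starting at days 0, 3, 6, 9, 12, 15, 18.
for (( offset = 0; offset < 20; offset += 3 )); do
    echo "chunk starting at day $offset"
done
```

Note the final chunk starts at day 18 with --days 3, so the pass requests guide data out to about day 21.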

Here is the resulting log that shows it took 4 hours:
https://docs.google.com/document/d/1WcNZLTUkD3VRmAxx7sktNJL0YU0d-ilQRRJEkQEUraM/edit?usp=sharing

Not sure if anything in there hints at the problem. It does look like most
of the chunks took 30 minutes, which is still too long, but one of the
later chunks took 90 minutes.

I think this is clearly related to all the mysql activity. When I monitor
the vm, though, the CPU utilization is low, including the mysql daemon. I
am starting to think this operation is I/O bound, resulting in slow
database operations.
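If it really is I/O bound on lots of tiny transactions, one commonly suggested MySQL/MariaDB mitigation is relaxing InnoDB's flush-on-every-commit behaviour. This is only a sketch and it assumes the mythconverg tables are on InnoDB (older MythTV installs used MyISAM); the trade-off is that up to about a second of committed transactions can be lost if the server crashes:

```ini
[mysqld]
# Flush the InnoDB log to disk once per second instead of at every commit.
innodb_flush_log_at_trx_commit = 2
```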

Here is the file system of the virtual machine as seen in /etc/fstab:

unraid /mnt/unraid 9p trans=virtio,version=9p2000.L,_netdev,rw 0 0


Larry

Re: Populating the .xmltv file for my grabber from mythtv database? [ In reply to ]
>
> Not sure if anything in there hints at the problem. It does look like
> most of the chunks took 30 minutes, which is still too long, but one of the
> later chunks took 90 minutes.
>
> I think this is clearly related to all the mysql activity. When I monitor
> the vm, though, the CPU utilization is low, including the mysql daemon. I
> am starting to think this operation is I/O bound, resulting in slow
> database operations.
>
>
>
Most of the earlier 3-day chunks will involve few database writes because
the guide data should be already there from previous runs. Only the last
chunk will contain new stuff. So I guess that supports the suspicion that
it's mysql updates that are eating up the time.

You could increase the log level to LOG_DEBUG which may give some more
clues.

What about memory utilisation / disk swapping in the VM? Can you monitor
that?
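If extra monitoring tools aren't installed in the VM, a quick Linux-only check can be done straight from /proc. Low SwapFree combined with shrinking MemAvailable while mythfilldatabase runs would point at swapping:

```shell
# Snapshot of memory pressure inside the VM (Linux-specific).
grep -E '^(MemAvailable|SwapTotal|SwapFree):' /proc/meminfo
```

Running it a few times during the mythfilldatabase pass gives a rough trend without needing vmstat or sar.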