TVH backend appliance: Raspberry Pi 3 and HDHomeRun

Added by Alex Commerce almost 8 years ago

UPDATED for 4.2.2

We're cutting the cord here. PlayStation Vue has Big Ten Network in its mid-level package, so my wife, the OSU fan, has signed off on the endeavor. However, PSV only has one local station available in our area, so if I want to keep watching Sunday Night Football with my own instant replay, and keep recording Thomas & Friends and Bob the Builder for my two young sons, I needed a PVR. We already have two Pi3 Kodi boxes (OpenELEC) pulling from a NAS, but those also double as RetroPie machines part of the time thanks to a customized NOOBS multi-boot setup, so they can't serve as TVH backends. What I needed was a dedicated TVH backend. It turns out getting it to work was a lot more painful than I expected, thanks to a good bit of old or bad information floating around various forums. So if someone comes along wanting to do what I did (more recently than two years ago), I figured I'd put it all down here.

Hardware:
Raspberry Pi 3 (for TVH backend)
HDHomeRun Connect (uses DHCP; there's no static IP option - the best you can do is reserve an address for its MAC in your router)
OTA Antenna
1.5 TB External HDD (with dedicated power supply)
2x Kodi/RetroPie Pi3 Appliances (TVH clients)

Note: All the above hardware is on the same subnet (for anyone who overnetworks their house like me), though I suspect only the TVH backend and HDHR need be.

Assumption1: The HDHR is already connected to the network and has been validated with the software from SiliconDust's website.
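
Before going any further, it doesn't hurt to confirm the Pi itself can see the tuner from the command line. If your distribution carries SiliconDust's CLI tool (on Debian-based systems the package name is usually hdhomerun-config-cli, but treat that as an assumption for OSMC), a minimal check looks like this:

sudo apt-get install hdhomerun-config-cli
hdhomerun_config discover
# prints the device ID and IP address of any HDHomeRun found on the local subnet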

1) Install OSMC + TVH.

I know it's a cop-out, but it's the only way I was ever able to get TVH to see the HDHR tuners. I must have reimaged the SD card a dozen times, and I found posts here from people on both Raspbian and Ubuntu hitting the same issue, with others responding that they'd had no issues at all. If you're in the latter group, I tip my hat to you. But after 8 hours of installs, apt-gets, tweaks, and reinstalls... OSMC was my escape hatch.

1a) Run the OSMC installer program on your local PC to prepare the SD card. I didn't use the prepared image, for the record.
1b) Boot the Pi 3 and run through the initial setup (you'll need a TV or monitor). Make sure to enable SSH. Login/password for everything (including TVH) will be osmc/osmc.
1c) Browse to the "App Store" and install Tvheadend.
1d) Verify that TVH is up and running by browsing to the Pi's IP address on port 9981
1e) Verify that the HDHR tuners are present under Configuration -> DVB Inputs
1f) Follow this easy video to scan your channels and get things set up initially (I've seen others link to it on this forum): https://www.youtube.com/watch?v=2Y-E4sQSb94

At this point you can unplug the monitor and move your Raspberry Pi to whatever corner of your house you like; from here on it's a headless appliance.
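
If you want to double-check that it's still reachable once it's tucked away, a quick probe from any other machine on the network works (a sketch; 192.168.1.50 is just a placeholder for your Pi's address):

curl -s -o /dev/null -w "%{http_code}\n" http://192.168.1.50:9981/
# a 200 (or a 3xx redirect) means the Tvheadend web UI is answering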

2) Configure the Episode Guide
This was another source of frustration, but less so than trying to get TVH to detect the tuners. You'll need to SSH into the TVH backend for much of this.

Note1: The link to tv_grab_file from that reddit post (2b) appears to be outdated. Use this instead: https://github.com/Rigolo/tv-grab-file
Note2: You will need to manually edit line 7 of tv_grab_file to point to your generated XML file. It should read "cat /home/osmc/zap2xml/xmltv.xml", or whatever output path you specified in zap2xml.conf.
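
On my install the grabber script lives at /usr/bin/tv_grab_file (you can see that path in the log snippets further down this thread). Assuming the same location, one way to jump straight to the line in question:

sudo nano +7 /usr/bin/tv_grab_file
# line 7 should end up reading:
#   cat /home/osmc/zap2xml/xmltv.xml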

2a) From SSH: sudo apt-get install xmltv-util cron
2b) Follow these steps: https://www.reddit.com/r/raspberry_pi/comments/4th19b/problems_with_getting_tvheadend_to_work_with/d5i7l63/
IMPORTANT: Don't miss the part about "Installing Dependencies". I did. Oops.
2c) Reboot
2d) Under Configuration -> Channel/EPG -> EPG Grabber Modules, make sure module "Internal XMLTV: tv_grab_file..." is "Enabled"
2e) Under Configuration -> Channel/EPG -> EPG Grabber, make sure settings are as needed
2f) Under Configuration -> Channel/EPG -> EPG Grabber Channels, make sure the entries in the "Channels" column have been populated correctly, or no broadcasts will load.
2g) Clicking "Save Configuration" should kick off a load. It will pick up the file that was generated when you ran runzap.sh as part of 2b.
2h) Under Configuration -> Channel/EPG -> Channels, I recommend mapping the channels manually by editing each one and setting a value for "EPG Source". Uncheck "Auto EPG". Perhaps others have had better luck with the automatic mapping, but I didn't.
2i) Check the Electronic Program Guide tab at the very top to see if it is populated. If not, try running the load again by clicking "Re-run Internal EPG Grabbers" from either the "EPG Grabber" or "EPG Grabber Modules" tab. Check the System Log at the bottom of the browser window to make sure it's picking something up.
If the log indicates the grabber is finding channels but no broadcasts, it could be that your channels aren't set under EPG Grabber Channels, or that the wrong channels have been designated for the entries on that screen. This can happen if there are multiple entries for the same channel: if the channel info in the retrieved xmltv.xml changes (e.g. a channel gets a new ID), a new record is created for the same channel. The old record must be removed and the "Channels" property set on the new entries.
2j) Set up a cron task to run runzap.sh at midnight daily: as the osmc user, run crontab -e and add the following line (without the quotes): "0 0 * * * /home/osmc/runzap.sh"
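
If you'd rather not open the editor, appending the entry non-interactively works too (a sketch; run it as the osmc user and adjust the path if your runzap.sh lives elsewhere):

(crontab -l 2>/dev/null; echo "0 0 * * * /home/osmc/runzap.sh") | crontab -
crontab -l
# the second command just confirms the entry was saved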

3) Add External Drive (Optional)
I didn't want to tax my NAS with PVR duties, so I hooked up a 1.5 TB WD external HDD I'd had lying around. I recommend a 7200 rpm drive.

Note1: For sizing purposes, I found that 1 hour of OTA 720p broadcasting takes up roughly 6 GB of space. 1080i shouldn't be much larger, since the higher resolution is interlaced.
Note2: You can't use the /media directory in OSMC. OSMC clears it out after each reboot and uses it for temporarily attached storage; in fact, your USB drive will be auto-mounted there under an alphanumeric directory when you first connect it. If you put the external drive in fstab instead, OSMC will leave it alone.

3a) From SSH, if OSMC has auto-mounted the USB drive under /media, unmount it with the following command: sudo umount /media/<some alphanumeric directory>
3b) Partition the USB drive with fdisk and format it (I'll leave out the details on that - I went with an ext4 filesystem)
3c) sudo mkdir /mnt/usb
3d) Assuming your drive is /dev/sda, edit /etc/fstab and add the following line at the bottom (see the UUID sketch after this list if you'd rather not rely on the device name): /dev/sda1 /mnt/usb ext4 defaults 0 0
3e) sudo mount -a
3f) sudo mkdir /mnt/usb/data
3g) sudo mkdir /mnt/usb/timeshift
3h) sudo chmod 0777 /mnt/usb/data
3i) sudo chmod 0777 /mnt/usb/timeshift
3j) In the TVH web UI, under Configuration -> Recording -> Digital Video Recorder Profiles, set "Recording System Path" (on the right-hand side under Recording File Options) to /mnt/usb/data
3k) In the TVH web UI, under Configuration -> Recording -> Timeshift, set "Storage Path" under Timeshift Options to /mnt/usb/timeshift
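
One caveat on 3d: the /dev/sda name can shift if you ever plug in another USB device. If you'd rather mount by UUID (a sketch, assuming the same ext4 partition), look it up with blkid and use that in fstab instead:

sudo blkid /dev/sda1
# note the UUID="..." value it prints, then use it in /etc/fstab, e.g.:
# UUID=<your-uuid-here> /mnt/usb ext4 defaults 0 0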

4) Connect your client software. I'm using Kodi/LibreELEC so that's what these instructions are geared towards.

Note1: On OpenELEC/LibreELEC 7.x, for some reason, you won't be able to timeshift unless the Guide is populated on the Kodi client. The buffer files will be building up on your TVH backend, but the RW/FF buttons will be grayed out. Once your Episode Guide populates, this functionality opens up. It's a bug in Kodi v16/Jarvis that is supposedly fixed in v17/Krypton.
Note2: I had to restart Kodi to get the Guide to populate.

4a) Install client from Add-Ons -> PVR
4b) Configure the plugin. The selections are all pretty self-explanatory.
4c) Under Settings -> TV, select the "Enable" radio button

That's what worked for me. It's running like a champ now. I hope this helps others. If I think of anything I left out I'll add it, and if someone finds a step I've missed, please let me know.


Replies (25)

RE: TVH backend appliance: Raspberry Pi 3 and HDHomeRun - Added by Josh Crosby over 7 years ago

I can get all the way to running the EPG Grabber, but it outputs this:

2017-02-14 02:22:46.248 /usr/bin/tv_grab_file: grab /usr/bin/tv_grab_file
2017-02-14 02:22:46.254 spawn: Executing "/usr/bin/tv_grab_file"
2017-02-14 02:22:46.269 spawn: cat: /home/osmc/zap2xml/xmltv.xml: No such file or directory
2017-02-14 02:22:46.271 /usr/bin/tv_grab_file: no output detected
2017-02-14 02:22:46.271 /usr/bin/tv_grab_file: grab returned no data

What am I missing here? Is my runzap.sh script not functioning properly?

RE: TVH backend appliance: Raspberry Pi 3 and HDHomeRun - Added by Alex Commerce over 7 years ago

Josh Crosby wrote:

2017-02-14 02:22:46.269 spawn: cat: /home/osmc/zap2xml/xmltv.xml: No such file or directory

That's the important line, I think. Check where the "xmltv.xml" file is being generated; it should be the result of running the runzap.sh script (which calls zap2xml.pl). Check your zap2xml.conf for the line outfile=<something>. That path should match the one on line 7 of tv_grab_file. If it doesn't, change one to match the other (which one you change doesn't really matter unless you're particular about file organization like I am).
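
A quick way to compare the two from SSH, assuming the default osmc paths from the guide:

grep outfile /home/osmc/zap2xml/zap2xml.conf
sed -n '7p' /usr/bin/tv_grab_file
ls -l /home/osmc/zap2xml/xmltv.xml
# the first two should point at the same file, and the third confirms it actually exists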

RE: TVH backend appliance: Raspberry Pi 3 and HDHomeRun - Added by Josh Crosby over 7 years ago

Finally figured that out. Had a stray " at the end of the script that was causing a syntax error. However, now that it's running, it doesn't seem to be pulling in any data. Any ideas?

2017-02-14 13:20:03.550 /usr/bin/tv_grab_file: grab /usr/bin/tv_grab_file
2017-02-14 13:20:03.557 spawn: Executing "/usr/bin/tv_grab_file"
2017-02-14 13:20:03.761 /usr/bin/tv_grab_file: grab took 0 seconds
2017-02-14 13:20:03.806 /usr/bin/tv_grab_file: parse took 0 seconds
2017-02-14 13:20:03.806 /usr/bin/tv_grab_file: channels tot= 28 new= 0 mod= 0
2017-02-14 13:20:03.806 /usr/bin/tv_grab_file: brands tot= 0 new= 0 mod= 0
2017-02-14 13:20:03.806 /usr/bin/tv_grab_file: seasons tot= 0 new= 0 mod= 0
2017-02-14 13:20:03.806 /usr/bin/tv_grab_file: episodes tot= 0 new= 0 mod= 0
2017-02-14 13:20:03.806 /usr/bin/tv_grab_file: broadcasts tot= 0 new= 0 mod= 0

RE: TVH backend appliance: Raspberry Pi 3 and HDHomeRun - Added by Josh Crosby over 7 years ago

Never mind. Realized I just need to manually set EPG Data Source per channel. Thanks for a great guide! This helped a TON!

RE: TVH backend appliance: Raspberry Pi 3 and HDHomeRun - Added by Josh Crosby over 7 years ago

So I have everything set up correctly with tv_grab_file and it works, but it will only pull in data for 7 days from when it first runs. When it runs again it won't find anything new until I delete xmltv.xml and manually run runzap.sh from the terminal; then it pulls in the newer data. Any ideas?

Log after it has run once:

2017-02-26 17:56:17.893 /usr/bin/tv_grab_file: grab /usr/bin/tv_grab_file
2017-02-26 17:56:17.900 spawn: Executing "/usr/bin/tv_grab_file"
2017-02-26 17:56:18.108 /usr/bin/tv_grab_file: grab took 1 seconds
2017-02-26 17:56:18.212 /usr/bin/tv_grab_file: parse took 0 seconds
2017-02-26 17:56:18.212 /usr/bin/tv_grab_file: channels tot= 31 new= 0 mod= 0
2017-02-26 17:56:18.212 /usr/bin/tv_grab_file: brands tot= 0 new= 0 mod= 0
2017-02-26 17:56:18.212 /usr/bin/tv_grab_file: seasons tot= 0 new= 0 mod= 0
2017-02-26 17:56:18.212 /usr/bin/tv_grab_file: episodes tot= 0 new= 0 mod= 0
2017-02-26 17:56:18.212 /usr/bin/tv_grab_file: broadcasts tot= 0 new= 0 mod= 0

Log after I delete xmltv.xml and run runzap.sh:

2017-02-26 18:00:08.377 /usr/bin/tv_grab_file: grab /usr/bin/tv_grab_file
2017-02-26 18:00:08.385 spawn: Executing "/usr/bin/tv_grab_file"
2017-02-26 18:00:08.601 /usr/bin/tv_grab_file: grab took 0 seconds
2017-02-26 18:00:09.520 /usr/bin/tv_grab_file: parse took 0 seconds
2017-02-26 18:00:09.522 /usr/bin/tv_grab_file: channels tot= 31 new= 0 mod= 0
2017-02-26 18:00:09.522 /usr/bin/tv_grab_file: brands tot= 0 new= 0 mod= 0
2017-02-26 18:00:09.523 /usr/bin/tv_grab_file: seasons tot= 5844 new= 1078 mod= 1078
2017-02-26 18:00:09.524 /usr/bin/tv_grab_file: episodes tot= 6127 new= 6044 mod= 6082
2017-02-26 18:00:09.525 /usr/bin/tv_grab_file: broadcasts tot= 6127 new= 6127 mod= 6127

RE: TVH backend appliance: Raspberry Pi 3 and HDHomeRun - Added by Joe User over 7 years ago

Are you executing runzap.sh as root in the terminal?

What are the permissions of the runzap.sh, zap2xml.pl, and xmltv.xml files?

RE: TVH backend appliance: Raspberry Pi 3 and HDHomeRun - Added by Josh Crosby over 7 years ago

I'm just running it as the default osmc user, no sudo/root needed. Here's the permissions for each of those files:

runzap.sh
-rwxr-xr-x 1 osmc root   83
zap2xml.pl
-rwxr-xr-x 1 osmc osmc   50190
xmltv.xml
-rw-r--r-- 1 osmc osmc 2941977

What needs to change? Thanks!

RE: TVH backend appliance: Raspberry Pi 3 and HDHomeRun - Added by Alex Commerce over 7 years ago

-Josh Crosby

Here are some helpful items that will hopefully iron things out for you.

crontab for "osmc" (no root crontab)

0 0 * * * /home/osmc/runzap.sh

Contents of runzap

#!/bin/bash
cd /home/osmc/zap2xml
./zap2xml.pl -C zap2xml.conf -A "new live" -T -q

Contents of zap2xml.conf

user=myemail@mydomain
password=<redacted>
ncsdays=1
cache=/home/osmc/zap2xml/cache
outfile=/home/osmc/zap2xml/xmltv.xml

Top of tv_grab_file

#!/bin/bash
dflag=
vflag=
cflag=
if (( $# < 1 ))
then
cat /home/osmc/zap2xml/xmltv.xml
exit 0
fi

RE: TVH backend appliance: Raspberry Pi 3 and HDHomeRun - Added by Josh Crosby over 7 years ago

I can confirm that all of those are set properly. Here are the permissions of my tv_grab_file in /usr/bin. Are they set properly?

-rwxr-xr-x 1 osmc osmc 1011 Feb 14 02:26 tv_grab_file

Are all the rest of my permissions in the previous post okay?

RE: TVH backend appliance: Raspberry Pi 3 and HDHomeRun - Added by Alex Commerce over 7 years ago

Run the following command as the osmc user and paste the output.

crontab -l

RE: TVH backend appliance: Raspberry Pi 3 and HDHomeRun - Added by Josh Crosby over 7 years ago

Here it is:

# Edit this file to introduce tasks to be run by cron.
# 
# Each task to run has to be defined through a single line
# indicating with different fields when the task will be run
# and what command to run for the task
# 
# To define the time you can provide concrete values for
# minute (m), hour (h), day of month (dom), month (mon),
# and day of week (dow) or use '*' in these fields (for 'any').
# 
# Notice that tasks will be started based on the cron's system
# daemon's notion of time and timezones.
# 
# Output of the crontab jobs (including errors) is sent through
# email to the user the crontab file belongs to (unless redirected).
# 
# For example, you can run a backup of all your user accounts
# at 5 a.m every week with:
# 0 5 * * 1 tar -zcf /var/backups/home.tgz /home/
# 
# For more information see the manual pages of crontab(5) and cron(8)
# 
# m h  dom mon dow   command
0 0 * * * /home/osmc/runzap.sh

RE: TVH backend appliance: Raspberry Pi 3 and HDHomeRun - Added by Alex Commerce over 7 years ago

That looks correct. The next step would be to check the timestamp on the xmltv.xml file. It should reflect having been updated just after midnight when that cron job runs.
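
Something like this from SSH will show it, assuming the default path:

stat -c '%y %n' /home/osmc/zap2xml/xmltv.xml
# if the timestamp isn't shortly after midnight, the cron job isn't actually refreshing the file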

RE: TVH backend appliance: Raspberry Pi 3 and HDHomeRun - Added by Josh Crosby over 7 years ago

Everything seems to be working fine now! Went through and fixed all permissions and I'm not having problems.

By the way, thanks for a killer write-up! This is the only set of instructions I followed that got everything working the way I wanted.

RE: TVH backend appliance: Raspberry Pi 3 and HDHomeRun - Added by Alex Commerce over 7 years ago

Josh Crosby wrote:

Everything seems to be working fine now! Went through and fixed all permissions and I'm not having problems.

By the way, thanks for a killer write-up! This is the only set of instructions I followed that got everything working the way I wanted.

That's pretty much why I wrote it. Nothing else out there fit the bill.

RE: TVH backend appliance: Raspberry Pi 3 and HDHomeRun - Added by Alex Commerce over 7 years ago

Josh Crosby wrote:

Everything seems to be working fine now! Went through and fixed all permissions and I'm not having problems.

By the way, thanks for a killer write-up! This is the only set of instructions I followed that got everything working the way I wanted.

Let me know if you run into the same issues I describe in these 2 posts. I haven't been able to get anyone to comment on them.

https://tvheadend.org/boards/14/topics/25360
https://tvheadend.org/boards/5/topics/25373

RE: TVH backend appliance: Raspberry Pi 3 and HDHomeRun - Added by Alex Commerce almost 7 years ago

So something changed at the start of the year (2018). On January 3rd, my login/password started failing when I tried to pull xmltv.xml, but of course I didn't notice this until this past week (Jan 16th) when my guide ran out. Doh! So, firstly, I needed to get a fresh copy of zap2xml.pl from http://zap2xml.awardspace.info/. That got the file downloading again, but then when I ran my grabber (tv_grab_file), it only found channels and no broadcasts. Thanks to an unrelated post about a similar situation in the UK, I was directed to check the "Channels" column under EPG Grabber Channels. I noticed that there were now 2 entries for each of my local stations, and one of each pair lacked a designation under "Channels". Upon closer inspection I also noticed that the ones without "Channels" set had a different ID than the ones which did, and furthermore, the IDs lacking "Channels" corresponded to the ones found in my newly grabbed xmltv.xml. So I removed the old entries and set the "Channels" entries appropriately. The next time I ran the grabbers, they picked the broadcasts up without issue.

Best guess: as part of whatever changed at the beginning of the year, Zap2it altered their channel IDs, and Tvheadend understandably assumed them to be brand-new channels. I have updated my original steps to correspond not only to the 4.2.2 release, which is what my OSMC is running right now, but also to include this new detail I've learned about channel mapping and what prompted it.
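
If you want to see which channel IDs your freshly grabbed file is actually using, so you can match them against the entries on the EPG Grabber Channels tab, something like this works from SSH (a sketch, assuming the default path and that each channel element sits on its own line in the XML):

grep -o 'channel id="[^"]*"' /home/osmc/zap2xml/xmltv.xml | sort -u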

RE: TVH backend appliance: Raspberry Pi 3 and HDHomeRun - Added by Robert Cameron almost 7 years ago

Alex Commerce wrote:

So something changed at the start of the year (2018). On January 3rd, my login/password started failing when I tried to pull xmltv.xml, but of course I didn't notice this until this past week (Jan 16th) when my guide ran out. Doh! So, firstly, I needed to get a fresh copy of zap2xml.pl from http://zap2xml.awardspace.info/. That got the file downloading again, but then when I ran my grabber (tv_grab_file), it only found channels and no broadcasts. Thanks to an unrelated post about a similar situation in the UK, I was directed to check the "Channels" column under EPG Grabber Channels. I noticed that there were now 2 entries for each of my local stations, and one of each pair lacked a designation under "Channels". Upon closer inspection I also noticed that the ones without "Channels" set had a different ID than the ones which did, and furthermore, the IDs lacking "Channels" corresponded to the ones found in my newly grabbed xmltv.xml. So I removed the old entries and set the "Channels" entries appropriately. The next time I ran the grabbers, they picked the broadcasts up without issue.

Best guess: as part of whatever changed at the beginning of the year, Zap2it altered their channel IDs, and Tvheadend understandably assumed them to be brand-new channels. I have updated my original steps to correspond not only to the 4.2.2 release, which is what my OSMC is running right now, but also to include this new detail I've learned about channel mapping and what prompted it.

Yes, right around the new year, Zap2It changed their website and broke nearly all grabbers using it.

You may want to look into a different grabber, as it seems they are now starting to fight back against scrapers/grabbers. (While it is not quite explicit, the terms of use for Gracenote's data through Zap2It seem to forbid scraping and only permit use through the website.) Schedules Direct offers the same data, and actually more complete data, from the same upstream source (Gracenote) for a paltry $25/yr.

While others may push back against paying for EPG data, $2/mo seems more than reasonable. Plus, their API is quite stable and available through multiple different grabbers/applications.

RE: TVH backend appliance: Raspberry Pi 3 and HDHomeRun - Added by K Shea almost 7 years ago

Alex Commerce: Did you read the release notes for the new version? I THINK that if you had used the -9 option, it might have used the "old" channel IDs, though I am not 100% certain of that. But now that you have gone through and changed them in Tvheadend, you might as well stick with what you have.

Robert Cameron: I believe I may have asked you this before and I don't think you ever answered, but do you receive some sort of benefit (financial or otherwise) for shilling for the pay service? You sure seem anxious to part people from their money. Zap2it has changed their website maybe twice in the time I've been using zap2xml; that hardly seems like a concerted effort to push back against scraping their listings. And even if it gets to the point that you can't use Zap2it any longer, Zap2xml also works with TV Guide, and if that stops working there's always the free guide data transmitted over the air by the TV stations. I push back against paying for EPG data in part because of people like you who try to insinuate that people might be doing something vaguely wrong by using a program like zap2xml, but without telling us the reason why you care so much. I've always despised the "you're a bad person if you don't buy our product" type of sales pitch; it almost guarantees I will never under any circumstances buy that product or service.

In most countries of the world there is no pay service for listings, and just because one happens to exist in the USA doesn't mean people are under any obligation to use it. If you want to use it that's fine, but you really don't need to be preaching to people who don't share your views about the necessity of supporting a for-profit schedule service.

RE: TVH backend appliance: Raspberry Pi 3 and HDHomeRun - Added by Robert Cameron almost 7 years ago

K Shea wrote:

Robert Cameron: I believe I may have asked you this before and I don't think you ever answered, but do you receive some sort of benefit (financial or otherwise) for shilling for the pay service? You sure seem anxious to part people from their money. Zap2it has changed their website maybe twice in the time I've been using zap2xml; that hardly seems like a concerted effort to push back against scraping their listings. And even if it gets to the point that you can't use Zap2it any longer, Zap2xml also works with TV Guide, and if that stops working there's always the free guide data transmitted over the air by the TV stations. I push back against paying for EPG data in part because of people like you who try to insinuate that people might be doing something vaguely wrong by using a program like zap2xml, but without telling us the reason why you care so much. I've always despised the "you're a bad person if you don't buy our product" type of sales pitch; it almost guarantees I will never under any circumstances buy that product or service.

In most countries of the world there is no pay service for listings, and just because one happens to exist in the USA doesn't mean people are under any obligation to use it. If you want to use it that's fine, but you really don't need to be preaching to people who don't share your views about the necessity of supporting a for-profit schedule service.

No, I receive no benefit. I feel that Tvheadend is a great product, and that promoting the use of questionable software that scrapes websites for proprietary data—especially when that data provider offers several channels, including one geared towards free software projects like Tvheadend—is just asking for trouble.

There is a reason that Kodi's forum forbids the discussion of the use of addons that facilitate copyright infringement and illegal streaming. While they haven't considered Zap2It scrapers a banned addon yet, with the way Gracenote has been modifying their services worldwide, I believe it's only inevitable that it will happen, and probably rather soon.

If Zap2It was truly meant to be a free service for your computer applications to obtain guide information, it would have a published API with well-defined endpoints. Instead you have a website that must be scraped and crawled across multiple links in order to get data of any use for a computer application ... doesn't sound too proper to me.

(Also, while those outside the US may be fine, here in the US the case of Aaron Swartz is still fresh in some minds. He did what was technically legal by crawling and scraping information that he had authorized access to, but was using/distributing in a way that skirted published terms, and was charged with multiple hacking felonies and railroaded by the Justice department until he committed suicide.

If you want to pursue the use of legally questionable software for your purposes, go ahead. But you ought to keep off the official sites and forums of software projects where your "data should be free so I'm not going to pay for it" attitude has no chance of affecting others using the software.)

RE: TVH backend appliance: Raspberry Pi 3 and HDHomeRun - Added by Alex Commerce almost 7 years ago

To both parties, let's not turn this into a brawl. I'm glad K Shea brought this up as I'm a big proponent of using Kodi legally and responsibly. In that spirit, I went back to Zap2It's website and looked into the matter. In my opinion, I do not believe using this script is illegal or even unethical. Here are my reasons.

1) I could not find any advertising anywhere on the website, and specifically on the TV listings page. So retrieving the data without a browser does not circumvent a revenue stream.
2) I am able to view full listings for my local area without being logged in. I suspect my zip code is contained in a browser cookie, but the point is that the data is publicly available and not behind a paywall or registration. Registration does allow me to designate my favorites, which drives the script. But the mode of transmission does not alter anything about this arrangement.
3) The script only does what I could do myself, just faster. This is a weak point by itself, but it strengthens the overall argument in conjunction with the other two points.

I tried to find a "Terms of Service" for further research, but had no luck. I did notice that Gracenote, the parent company, is in turn owned by Nielsen. With that in mind, the revenue stream for the site could be subsidized by viewing-pattern data (Nielsen's traditional territory) provided by registered users and anonymous traffic to the site. Retrieving the data using zap2xml does not deny the company either of these data points. True, they have not provided an API themselves, but concluding that the absence of such a route proves, or even indicates, its prohibition commits the logical fallacy of "argument from ignorance" (please note, that was not an insult - that's simply what the fallacy is called). There could be any number of reasons they don't provide one. Perhaps they don't want to spend the time and labor to write it up? Who knows?

Lastly, I want to address the case of Aaron Swartz which you mention. I'm not intimately familiar with the details, but from what I have read, in his case the data in question was retrieved from a closed, private system. That is not the case here (see Item #2 above). Furthermore, the prosecution's case was that beyond accessing the site in the manner he did (placing a laptop and external hard drive in a restricted wiring closet in MIT's basement, and hiding it under a cardboard box), his intent was to disseminate this controlled information across P2P sites without the consent of the controller. None of that is the case here.

I do appreciate the heads-up though. It is always good to consider not just the legality, but also the spirit in which we leverage technology.

RE: TVH backend appliance: Raspberry Pi 3 and HDHomeRun - Added by Robert Cameron almost 7 years ago

Alex Commerce wrote:

To both parties, let's not turn this into a brawl. I'm glad K Shea brought this up as I'm a big proponent of using Kodi legally and responsibly. In that spirit, I went back to Zap2It's website and looked into the matter. In my opinion, I do not believe using this script is illegal or even unethical. Here are my reasons.

1) I could not find any advertising anywhere on the website, and specifically on the TV listings page. So retrieving the data without a browser does not circumvent a revenue stream.
2) I am able to view full listings for my local area without being logged in. I suspect my zip code is contained in a browser cookie, but the point is that the data is publicly available and not behind a paywall or registration. Registration does allow me to designate my favorites, which drives the script. But the mode of transmission does not alter anything about this arrangement.
3) The script only does what I could do myself, just faster. This is a weak point by itself, but it strengthens the overall argument in conjunction with the other two points.

I tried to find a "Terms of Service" for further research, but had no luck. I did notice that Gracenote, the parent company, is in turn owned by Nielsen. With that in mind, the revenue stream for the site could be subsidized by viewing-pattern data (Nielsen's traditional territory) provided by registered users and anonymous traffic to the site. Retrieving the data using zap2xml does not deny the company either of these data points. True, they have not provided an API themselves, but concluding that the absence of such a route proves, or even indicates, its prohibition commits the logical fallacy of "argument from ignorance" (please note, that was not an insult - that's simply what the fallacy is called). There could be any number of reasons they don't provide one. Perhaps they don't want to spend the time and labor to write it up? Who knows?

Lastly, I want to address the case of Aaron Swartz which you mention. I'm not intimately familiar with the details, but from what I have read, in his case the data in question was retrieved from a closed, private system. That is not the case here (see Item #2 above). Furthermore, the prosecution's case was that beyond accessing the site in the manner he did (placing a laptop and external hard drive in a restricted wiring closet in MIT's basement, and hiding it under a cardboard box), his intent was to disseminate this controlled information across P2P sites without the consent of the controller. None of that is the case here.

I do appreciate the heads-up though. It is always good to consider not just the legality, but also the spirit in which we leverage technology.

If you followed the links through to Gracenote's site, they have a developer FAQ related to their video services (which includes TV listings). There it is indicated that registration and an API key is needed to programmatically access their information.

Also, the "updating" of the Zap2It site seems to have removed the terms of use that was present on their site within the last 6 months. Still, I feel that the terms covering all of Gracenote's data and IP equally apply to Zap2It, and given the media/content owning-friendly nature of the current US legal system, I'm sure the courts would agree at present. However, since the TOS are no longer present on the Zap2It site, I can see how the question is now open to a more liberal interpretation.

With regards to Swartz: while it was perhaps a "closed" system, the data he was accessing was technically the property of the People of the US, since they were court records stored in the PACER system. As such, his access to the data was technically within his rights; Gracenote's data is their private property and therefore subject to enforceable protections. Again, your interpretation may vary, but still something to keep in mind when constructing computer systems that access questionably "open" data.

RE: TVH backend appliance: Raspberry Pi 3 and HDHomeRun - Added by K Shea almost 7 years ago

Robert Cameron wrote:

K Shea wrote:

Robert Cameron: I believe I may have asked you this before and I don't think you ever answered, but do you receive some sort of benefit (financial or otherwise) for shilling for the pay service? You sure seem anxious to part people from their money. Zap2it has changed their website maybe twice in the time I've been using zap2xml; that hardly seems like a concerted effort to push back against scraping their listings. And even if it gets to the point that you can't use Zap2it any longer, Zap2xml also works with TV Guide, and if that stops working there's always the free guide data transmitted over the air by the TV stations. I push back against paying for EPG data in part because of people like you who try to insinuate that people might be doing something vaguely wrong by using a program like zap2xml, but without telling us the reason why you care so much. I've always despised the "you're a bad person if you don't buy our product" type of sales pitch; it almost guarantees I will never under any circumstances buy that product or service.

In most countries of the world there is no pay service for listings, and just because one happens to exist in the USA doesn't mean people are under any obligation to use it. If you want to use it that's fine, but you really don't need to be preaching to people who don't share your views about the necessity of supporting a for-profit schedule service.

No, I receive no benefit. I feel that Tvheadend is a great product, and that promoting the use of questionable software that scrapes websites for proprietary data—especially when that data provider offers several channels, including one geared towards free software projects like Tvheadend—is just asking for trouble.

There is a reason that Kodi's forum forbids the discussion of the use of addons that facilitate copyright infringement and illegal streaming. While they haven't considered Zap2It scrapers a banned addon yet, with the way Gracenote has been modifying their services worldwide, I believe it's only inevitable that it will happen, and probably rather soon.

If Zap2It was truly meant to be a free service for your computer applications to obtain guide information, it would have a published API with well-defined endpoints. Instead you have a website that must be scraped and crawled across multiple links in order to get data of any use for a computer application ... doesn't sound too proper to me.

(Also, while those outside the US may be fine, here in the US the case of Aaron Swartz is still fresh in some minds. He did what was technically legal by crawling and scraping information that he had authorized access to, but was using/distributing in a way that skirted published terms, and was charged with multiple hacking felonies and railroaded by the Justice department until he committed suicide.

If you want to pursue the use of legally questionable software for your purposes, go ahead. But you ought to keep off the official sites and forums of software projects where your "data should be free so I'm not going to pay for it" attitude has no chance of affecting others using the software.)

Well, Robert, you started this. If you don't want me to talk about Zap2xml so much, then stop acting like a sales agent for the pay service. I'm not saying you are one, but sometimes you come across that way, and when you come along pontificating with your "maybe somehow you are doing something of questionable legality by using the FREE listings" speech, then certainly others have the right to jump in and offer their interpretation. You are not a representative or agent of any listings service as far as I know, so how would you know if they care if people use their listings? For all you know, they couldn't care less, and they certainly have never given any real indication that they object to such usage (changing the look and feel of their website means nothing here, it's more than likely totally coincidental). Even if there's some kind of boilerplate language in the T&C, which they seem to bury on their site, it's not like they force you to read it before using their site (even if you register for an account). And a lot of times there is language in T&C's that in real life no one pays any attention to, at least not until some extreme issue arises. And even if someday they did decide they don't like this type of usage, I doubt that they are going to jump right to the "let's sue our users" stage - I would expect that at first they would give some kind of notice or warning.

What I don't get is why you seem so intent on starting these discussions. If you think this should all stay under the radar, then stop making a big deal about it, and stop trying to persuade people to use the pay service. Every time you do, you invite someone else to jump in and talk about the free alternative. I doubt very many people share your sense of paranoia that something bad will happen to them (indeed, it would be a public relations nightmare if any of these services tried to sue a user without giving fair warning that they were doing something that the company considers wrong - note I said "the company", not YOU), and just because your sense of morality keeps you from doing anything that might even possibly somehow be the slightest bit questionable doesn't mean that everyone appreciates your preachy posts.

Zap2xml is certainly not the first software to scrape a website for data, and it won't be the last. APIs are a relatively new thing, and a big thing in certain segments of the programmer community, but they are not a one-size-fits-all solution, and the fact that they exist does not mean that scraping will disappear. And again, this is free information; it's not like we are stealing something that the site charges money for, or breaking through someone's paywall. Neither of the supported listings services charges for listings. And you are only getting the same information you'd get by using any standard web browser.

As for Kodi's attitude, I doubt any of this is even on their radar. If they ever do ban mention of this type of software on their forums, I will be shocked, because that would be about the same thing as when Amazon banned Kodi because they thought it was a piracy tool. I realize they are walking a tightrope now because they have been accused of being piracy software so often that they have to avoid any appearance of evil, but keep in mind that the original Zap2xml was never intended to be a Kodi addon, and doesn't interface with Kodi in any way unless some third party creates software that uses it to provide guide data to Kodi, and even then Kodi would be more likely to ban discussion of that particular third party addon if for some reason they found it objectionable. Zap2xml is not a piracy tool (except maybe in your pretzel logic, which to me always seems like a sales pitch for the pay service) because it's not enabling the use of anything that's not freely available. There are many video addons in Kodi (even in the official addon repository) that scrape video sites such as YouTube to allow viewing videos; I'm surprised you aren't over there preaching about the evils of that. But no, you only seem to care about trying to make people pay for TV listings. Why?

And also, keep in mind that the software and services under discussion can also be used to get listings for Canadian channels (and if using TV Guide, also for Mexican channels, and channels in several other countries in the Western hemisphere). So even if you think there is a problem using such software in the U.S., it's still valid to talk about it, since it could be useful in other countries, many of which have no for-profit listings alternative.

Anyway, if you don't like this type of discussion, then stop promoting the pay service and it will probably pretty much disappear. I'm not saying not to ever mention it, but when you start trying to moralize that people should use it and not the free service for reasons that you have mostly conjured up in your head, that's where these types of discussions get started.

RE: TVH backend appliance: Raspberry Pi 3 and HDHomeRun - Added by Alex Commerce almost 7 years ago

Mark Clarkstone wrote:

Keep it civil please folks :)

Yes, please. I put this together to save others in a similar situation a lot of time. I don't want the thread degenerating into a flame war.

RE: TVH backend appliance: Raspberry Pi 3 and HDHomeRun - Added by Hiro Protagonist almost 7 years ago

I don't really know anything about the UK broadcasting system, but isn't there an option for extracting the EPG data via EIT or MHEG that will avoid this whole issue of relying on web access?
