Usenet Cache Fails

The way the cache works for usenet makes it rare that the cache will ever hit, leading to wasted bandwidth and slower downloads. The problem lies in hashing the link as-is, with no sanitization. I’m using

https://inhies.github.io/Newznab-API/functions/#get as a reference, since it is a pretty commonly used format for these links. Everything I’m going to say, especially the suggestions, is only valid if that format is matched. Due to the nature of usenet I’m not sure anything can be done to solve this universally, but starting with the most common cases would help.

A few examples that make caching difficult:
1. Query order. It is legal for the parameters to appear in any order, and any reordering obviously changes the hash. A deterministic order would help (e.g. sort the query parameters before hashing).
2. del may or may not be present. It is likely that del will be set on a cart RSS feed, but since del is unlikely to appear in other requests for the same file, the cache will miss because of it. Stripping del before hashing would solve this.
3. apikey obviously has a value that varies per user. This effectively means that even if two people are using the same indexer to access the same file, they’ll always miss the cache. Stripping apikey before hashing would solve this.
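All three fixes above can be sketched as one normalization pass before hashing. This is a minimal illustration, not TorBox’s actual implementation; the strip list and the use of SHA-256 are assumptions:

```python
import hashlib
from urllib.parse import urlsplit, parse_qsl, urlencode

# Parameters whose values vary per user/request and should not affect
# the cache key (assumption: this list covers the cases described above).
STRIPPED_PARAMS = {"apikey", "del"}

def cache_key(link: str) -> str:
    """Hash a link after normalizing its query string."""
    parts = urlsplit(link)
    # Drop user-specific parameters, then sort for a deterministic order.
    params = sorted(
        (k, v) for k, v in parse_qsl(parts.query)
        if k.lower() not in STRIPPED_PARAMS
    )
    canonical = f"{parts.scheme}://{parts.netloc}{parts.path}?{urlencode(params)}"
    return hashlib.sha256(canonical.encode()).hexdigest()
```

With this, two requests for the same file that differ only in parameter order, apikey value, or the presence of del produce the same key.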

Bringing my suggestion fully together would be something like “if the link matches the standard format of a Newznab t=get URL, then change the hash rules to instead just hash together the host and the id.” I don’t know how popular private usenet indexers are with TorBox users, but this would dramatically increase cache hits for them. It would also let usenet RSS feeds actually cache hit in the Stremio plugin.
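That rule could look something like the sketch below. The function name and the None-fallback contract are hypothetical; the point is just the detection (t=get with an id present) and the host+id key:

```python
import hashlib
from urllib.parse import urlsplit, parse_qsl

def newznab_cache_key(link: str):
    """If the link looks like a Newznab t=get request, key on host + id.

    Returns None when the link does not match, so the caller can fall
    back to whatever hashing rule is used today.
    """
    parts = urlsplit(link)
    params = {k.lower(): v for k, v in parse_qsl(parts.query)}
    if params.get("t") != "get" or "id" not in params:
        return None
    # Host and id fully identify the file on a given indexer, so nothing
    # else (apikey, del, parameter order) can cause a spurious miss.
    return hashlib.sha256(f"{parts.netloc}|{params['id']}".encode()).hexdigest()
```

Non-matching links fall through to the existing behavior, so this only changes hashing for the common Newznab case.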

Status: Completed
Board: 💡 Feature Request
Date: 7 months ago
Author: decoupled456
