Efficient sharing of Usenet connections in sabnzbd

The following script allows sabnzbd to evenly divide the number of connections to a shared Usenet account amongst your friends.

Some Usenet providers allow you to share your account with your friends. You can use the account from multiple IP addresses, but you can’t exceed the maximum number of simultaneous connections. If the maximum number of connections is 8, you and your buddy each get to use 4. If you use sabnzbd, you have to manually set the number of connections in the config. The drawback is that you always use only 50% of the maximum download speed, even if your friend isn’t downloading.

But with the following script, the number of connections is adjusted dynamically for optimal performance. If you’re the only one downloading, you get the full number of connections. When someone else starts downloading, the connections are divided evenly.
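For example, with 8 shared connections a single active downloader gets all 8, two active downloaders get 4 each, and three get floor(8/3) = 2 each.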

How it works

There are two scripts. The first one runs locally: it checks whether your copy of sabnzbd is downloading and adjusts its connection count. The second one acts as a database: a simple PHP script plus a text file that can run on any web server accessible to you and your buddies. The local script reports its own status to the remote database, reads everyone else’s, and changes the number of connections you can use accordingly through the sabnzbd API.
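To give an idea of the exchange (placeholder values, the real ones are generated by the script), each update is a single HTTP request of the form

http://my.website.com/sab.php?meh=<signature>&me=<your identifier>&v=1

where v is 1 while you’re downloading and 0 when you’re idle. The reply is a JSON object with the current status of everyone in the group.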

The local script: check how many connections we can use

Run this script every minute or so through a cron job. If you’re using sabnzbd 0.6.x or higher, you can also run it as a pre-queue script just before a download starts. That way the connections get adjusted right away.

You have to enter your sabnzbd API key, the name of your Usenet server, and a few more details in the first few lines. You can get this info from sabnzbd.ini or through the config in the web interface.
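If you go the sabnzbd.ini route, the API key typically lives in the [misc] section, something like this (the value below is made up):

[misc]
api_key = 1234567890abcdef1234567890abcdef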

#!/usr/bin/python
import sys
import os
import urllib2
import json
import string
import hashlib
import math

# Get the api key from sabnzbd.ini or the web interface
apikey = "1234567890abcdef1234567890abcdef"

# Name of the Usenet server in your config
newsreader = "MyNewsReader"

# Create a unique secret seed. Use the same secret for everyone in your group
# Also change this in the remote PHP script!
secret = "MyRandomStringOfCharacters"

# URL to the remote PHP script that functions as a database
remoteURL = "http://my.website.com/sab.php"

# Maximum number of connections your account allows (shared by the whole group)
maxConnections = 8

# Local URLs for sabnzbd
statusURL = "http://localhost:8080/sabnzbd/api?mode=qstatus&output=json&apikey=" + apikey
configURL = "http://localhost:8080/sabnzbd/api?mode=set_config&section=servers&keyword=" + newsreader + "&apikey=" + apikey

# Ask local sabnzbd if it's busy downloading
def getLocalStatus():
    try:
        response = urllib2.urlopen(statusURL).read()
    except urllib2.URLError as e:
        print "Sabnzbd API error: " + str(e.reason)
        sys.exit(1)

    sabnzbdStatus = json.loads(response)

    if len(sys.argv) > 1 and sys.argv[1] == "force":
        sabnzbdStatus["state"] = "DOWNLOADING"

    return 1 if sabnzbdStatus["state"] == "DOWNLOADING" else 0

# Get remote status and update it with our local status
def checkConnections():
    try:
        response = urllib2.urlopen(getRemoteURL()).read()
    except urllib2.URLError as e:
        print "Couldn't get remote status: " + str(e.reason)
        sys.exit(1)

    statuses = json.loads(response)

    downloaders = 0
    for key, value in statuses.iteritems():
        downloaders += value

    if localStatus == 0:
        # We're not downloading, so set connections to zero
        return 0
    if localStatus == 1 and downloaders == 0:
        # We're the only one downloading
        return maxConnections
    else:
        # Someone else is downloading too, divide up the available connections
        return int(math.floor(maxConnections / downloaders))

def getRemoteURL():
    meHash = hashlib.md5(me + str(localStatus) + secret).hexdigest()
    return remoteURL + "?meh=" + meHash + "&me=" + me + "&v=" + str(localStatus)

# Create unique identifier from the api key and the seed
me = hashlib.md5(apikey + secret).hexdigest()
localStatus = getLocalStatus()

# Check how many connections we need
connections = checkConnections()

# Tell sabnzbd to use the new number of available connections
try:
    response = urllib2.urlopen(configURL + "&connections=" + str(connections)).read()
except urllib2.URLError as e:
    print "Sabnzbd API error: " + e.reason

Save this as sabnzbdconnections.py and run the script once a minute in a cron job:

# Check if we can leech at full speed
*/1 * * * * /usr/local/bin/sabnzbdconnections.py > /dev/null 2>&1

If your version of sabnzbd supports pre-queue scripts, run it with the “force” parameter to claim your connections before the download has technically started:

/usr/local/bin/sabnzbdconnections.py force

This way you don’t have to wait till the cron job runs to get your optimal number of connections.
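You can also test the setup by hand: run /usr/local/bin/sabnzbdconnections.py force from a shell and then check the connections setting of your Usenet server in the sabnzbd web interface. If nobody else in the group is downloading, it should jump to the full maxConnections.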

The remote script: keep track of everyone

This is a little PHP script that simply stores everyone’s sabnzbd status. It does this in a local text file and returns a JSON object. Put it online somewhere and point the remoteURL variable in the Python script above at it.

<?php
// Be warned: there is no purging of inactive downloaders

$secret = "MyRandomStringOfCharacters";
$filename = "sabstatus.txt";

$me = $_GET['me'];
$meHash = $_GET['meh'];
$meStatus = $_GET['v'];

if($meHash != md5($me . $meStatus . $secret)) die();

// Load the current status list (start fresh if the file doesn't exist yet)
$status = array();
if(file_exists($filename)) {
    $status = json_decode(file_get_contents($filename), true);
    if(!is_array($status)) $status = array();
}
$status[$me] = (int)$meStatus;

$status = json_encode($status);

file_put_contents($filename, $status);

die($status);
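After a couple of clients have checked in, sabstatus.txt (and the JSON returned to the Python script) looks something like this. The keys are the MD5 identifiers of the clients, the values are 1 while that client is downloading (example values, obviously):

{"d41d8cd98f00b204e9800998ecf8427e":1,"9e107d9d372bb6826bd81d3542a419d6":0}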

That’s it

Just run the Python script once a minute, and enjoy your equitably distributed number of connections in sabnzbd!
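If you want to verify that the connection count is really being changed, you can query the server settings through the same API (assuming your sabnzbd version supports mode=get_config; host, port, server name, and API key as in the script above):

http://localhost:8080/sabnzbd/api?mode=get_config&section=servers&keyword=MyNewsReader&output=json&apikey=1234567890abcdef1234567890abcdef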
