
drmorley
MVM
join:2000-12-20
Three Lakes, WI

1 recommendation

NewzNab

Okay, so I set up my own usenet indexer for personal use. How do I get it to update automatically on a regular basis? Is there a script I can run to set that up, or a PHP mod I can make?
BoogaBooga
join:2004-06-12
Canada

Member

»newznab.readthedocs.org/ ··· indexing
pavelbure10
join:2011-09-30

pavelbure10 to drmorley

Member

You can just run the 2 scripts automatically by adding them to your crontab.
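For what it's worth, a crontab sketch of that suggestion might look like the following. The install path and schedule are assumptions (adjust them to your own setup), and flock is there so a new run can't start while the previous one is still working:

```shell
# Hypothetical crontab entries -- the /var/www/newznab path is an assumption;
# point these at wherever your update scripts actually live.
# flock -n skips a run if the previous one still holds the lock.
*/15 * * * * flock -n /tmp/nn_binaries.lock php /var/www/newznab/misc/update_scripts/update_binaries.php
*/20 * * * * flock -n /tmp/nn_releases.lock php /var/www/newznab/misc/update_scripts/update_releases.php
```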

You're going to end up using a lot of disk space for anything more than a few groups and 10+ days of retention.
singerie3
anon are muted
join:2008-10-12
Montreal, QC

singerie3 to drmorley

Member

Everybody says it will use a lot of disk space... but what is "a lot" of disk space?

swintec
Premium Member
join:2003-12-19
Alfred, ME

swintec to pavelbure10

Premium Member

said by pavelbure10:

You're going to end up using a lot of disk space for anything more than a few groups and 10+ days of retention.

Keep in mind that header data is removed from the database after a set amount of time. That leaves you with just the NZBs.

Exodus
Your Daddy
Premium Member
join:2001-11-26
Earth

Exodus to pavelbure10

Premium Member

said by pavelbure10:

You can just run the 2 scripts automatically by adding them to your crontab.

You're going to end up using a lot of disk space for anything more than a few groups and 10+ days of retention.

So, I've actually gone through the process of setting up a paid version of newznab. I have 30 days retention on a good three dozen groups. I'm using up 16GB of space, including space for the OS (Linux).

The scripts folder contains sub-folders for the update scripts. You do not cron these scripts, as they execute continuously. Using the screen command, you background these scripts and add them to your startup.

Adding this to your cron will spawn more and more copies of these scripts. Cronning the individual update_binaries and update_releases scripts will cause things to operate out of order. Use the update script that is provided; it cycles and checks things more thoroughly than a novice cron invention will.
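A sketch of the screen approach described above. The script path is an assumption based on a typical newznab layout; substitute whatever update script your install actually provides:

```shell
# Start the provided update loop in a detached screen session named "newznab"
# (-d -m starts it detached, -S names the session).
# The path below is a guess -- adjust it to your install.
screen -dmS newznab /var/www/newznab/misc/update_scripts/update.sh

# Confirm it is running, then reattach to watch its output:
screen -ls
screen -r newznab        # detach again with Ctrl-a d
```

To get it going at boot, the same `screen -dmS` line can go in /etc/rc.local or a `@reboot` crontab entry.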
Exodus

Premium Member

Linux is running in a VM, by the way, not stand-alone. I think it's been given like 1GB of memory and runs fine.

drmorley
MVM
join:2000-12-20
Three Lakes, WI

I've got mine running in a VM with 2048GB of Mem and 100GB of disk space.

Everything is up and running and I haven't automated the update scripts, but when I run them manually now the processes keep getting killed.

php update_binaries.php - this script won't run now and keeps getting killed.

Any ideas? Sorry, I'm a Linux n00b.

Exodus
Your Daddy
Premium Member
join:2001-11-26
Earth

What's the error message?

drmorley
MVM
join:2000-12-20
Three Lakes, WI

It craps out at various points, but here's a typical error message:

Group 2 of 8
Processing alt.binaries.abc
Group alt.binaries.abc has 4,102,880 new parts.
First: 104642385 Last: 2278275563 Local last: 2274172683
Getting 4,102,880 parts (2274172684 to 2278275563) - 0 in queue
Killed

Exodus
Your Daddy
Premium Member
join:2001-11-26
Earth

"Killed" means the process was terminated by the system, not that the script errored out on its own.

What does the output of /var/log/syslog look like?
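If the kernel's out-of-memory killer is what's terminating the script (a common cause of a bare "Killed" when fetching millions of headers in a small VM), the logs will say so. A few diagnostic commands, assuming a Debian/Ubuntu-style log location:

```shell
# Look for OOM-killer activity; the syslog path varies by distro.
grep -iE 'out of memory|killed process' /var/log/syslog
dmesg | grep -i oom

# Note: if PHP had hit its own memory_limit you would see a PHP fatal
# error rather than "Killed". You can still raise the limit for a single
# run like this (the 1024M value is only an example):
php -d memory_limit=1024M update_binaries.php
```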
sandman_1
join:2011-04-23
11111

sandman_1 to drmorley

Member

said by drmorley:

I've got mine running in a VM with 2048GB of Mem and 100GB of disk space.

Holy crap!! 2TB of RAM. That must have cost a pretty penny.

Frink
Professor
Premium Member
join:2000-07-13
Scotch Plains, NJ

Frink to Exodus

Premium Member

said by Exodus:

said by pavelbure10:

You can just run the 2 scripts automatically by adding them to your crontab.

You're going to end up using a lot of disk space for anything more than a few groups and 10+ days of retention.

So, I've actually gone through the process of setting up a paid version of newznab. I have 30 days retention on a good three dozen groups. I'm using up 16GB of space, including space for the OS (Linux).

The scripts folder contains sub-folders for the update scripts. You do not cron these scripts, as they execute continuously. Using the screen command, you background these scripts and add them to your startup.

Adding this to your cron will spawn more and more copies of these scripts. Cronning the individual update_binaries and update_releases scripts will cause things to operate out of order. Use the update script that is provided; it cycles and checks things more thoroughly than a novice cron invention will.

Do you know more about the differences between the paid version and the free regex file that was available? Is it really that much better to run the paid version? Could you provide more detail on the screen syntax, as getting this stuff up and running is my next step? Thanks.

Exodus
Your Daddy
Premium Member
join:2001-11-26
Earth

The paid version introduces several scripts, including threaded scripts. Could you get them on your own? Yeah. The paid version also allows you to index more than one group; I found that little piece out when I was trying to have more than one group listed. This makes the paid version mandatory, with the free version being more of a proof of concept.

The paid version also has an ID that allows your server to phone home to get the latest and greatest regexes that are supplied by the community.

What I could really use right now is a very thorough list of _ALL_ the groups that indexers like dog, nzbmatrix, etc., indexed. That's my issue right now.
67845017 (banned)
join:2000-12-17
Naperville, IL

Member

Shouldn't the group list depend on your preferences? If/when I get one running, I have only a handful that I'm going to do.

Exodus
Your Daddy
Premium Member
join:2001-11-26
Earth

The small group of people that are in on this have a diverse set of preferences.
67845017 (banned)
join:2000-12-17
Naperville, IL

Member

Ahh. That makes sense.

scooby
Premium Member
join:2001-05-01
Schaumburg, IL

I have 2 weeks of backlog on 133 groups and it uses about 6G total between everything newznab-related and my MySQL DB. It's going to take forever to load the full backlog.

drmorley
MVM
join:2000-12-20
Three Lakes, WI

said by scooby:

I have 2 weeks of backlog on 133 groups and it uses about 6G total between everything newznab-related and my MySQL DB. It's going to take forever to load the full backlog.

What method are you using to load the full backlog?
sandman_1
join:2011-04-23
11111

sandman_1 to drmorley

Member

I don't understand the need to have several days' worth of NZBs. If you are using this to get the quickest turnaround on a current release, shouldn't less than a week (like 3 days tops) be more than sufficient? The whole point is to beat DMCA takedowns, right? You could use binsearch or something else to get the older stuff and save disk space.