Versions list for PreNews
-------------------------
0.01* 14/05/2001
Basic test version created, will (fairly quickly) process news, but
only works from a dynamic area at present.
0.02* 28/05/2001
Wow!
Take the following, in BASIC:
BPUT#output%, " "+RIGHT$(" "+STR$(id%), 3)+" ";
IF xpost% THEN BPUT#output%, "X"; ELSE BPUT#output%, " ";
BPUT#output%, subject_name$(offset%);
IF clipped% THEN BPUT#output%, CHR$140 ELSE BPUT#output%, ""
You could write it in C as:
File_printf(output, " %3d ", id);
if (xpost)
  File_WriteByte(output, 'X');
else
  File_WriteByte(output, ' ');
File_printf(output, " %s", subject[offset].name);
if (clipped)
  File_WriteByte(output, 140);
File_WriteByte(output, '\n');
But to be REALLY sexy, you could write:
File_printf(output, " %3d %c %s%c\n", id, (xpost==1)?'X':' ',
subject[offset].name, (clipped==1)?140:32);
You just can't help but LIKE a language that allows you to do those
annoying little things as easily as that. Like I said, wow.
Okay, back to your scheduled programming...
New CLI option "-subjects". This will scan through <PreNews$Source>
and compile a list of subjects in each newsgroup. Nothing else is
processed.
This can be useful if you want, say, to take a really quick browse
and see if anything new that might appeal to you has come up. You can
whizz through the news file and look for potentially interesting
subjects, without having to actually load it into !Edit and read it
(which can be a slow and tedious chore).
This works from file if no Dynamic Areas are available or if you do
not have enough free memory.
The parameter -onlyuseda is ignored.
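In outline, the subject scan boils down to a line-by-line pass over the news file. A minimal sketch in C (illustrative only - the function name and buffer size are invented here, this is not PreNews's actual code):

```c
#include <stdio.h>
#include <string.h>

/* Sketch of a -subjects style pass: copy the text of every
 * "Subject: " header line to 'out' and return how many were found.
 * Illustrative only - not the actual PreNews routine. */
static int scan_subjects(FILE *in, FILE *out)
{
    char line[1024];
    int found = 0;

    while (fgets(line, sizeof(line), in) != NULL)
    {
        if (strncmp(line, "Subject: ", 9) == 0)
        {
            fputs(line + 9, out);   /* subject text, newline included */
            found++;
        }
    }
    return found;
}
```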
Undocumented [*] CLI parameter "-barleyroll" which makes PreNews fly
along with Dynamic Areas.
To give examples:
To do a -subjects on a 2491K news file, with a small status window,
on a RiscPC700 with Simtec IDEFS...
Normal speed - 38 seconds, 65Kb/sec (dynamic area)
Fast speed - 13 seconds, 191Kb/sec (dynamic area)
Barley roll speed - 7 seconds, 355Kb/sec (dynamic area)
Normal speed - 99 seconds, 25Kb/sec (from file)
Fast speed - 31 seconds, 80Kb/sec (from file)
Barley roll speed - 24 seconds, 103Kb/sec (from file)
It'll work best when processing from a dynamic area. If processing
from file, it may put too heavy a load on the system to be useful -
unless you don't plan to use your machine while it runs, in which
case barley roll will at least get it over sooner. :-)
Those of you who understand the way my mind works will know why I have
called this "barley roll".
The rest of you, um, "Don't even go there, girlfriend!"
[*] Aren't you glad you bothered to read this file?
Now writes back the status information, and uses this in its report.
The update routine WILL preserve comments at the top of the file - any
line which is either blank or has a ';' as its first character. When
a line that does not fit those criteria is found, PreNews will
output the updated information.
New backup feature...
If "<INNewsIn$File>" == "<PreNews$Source>" (as is the current setup),
then the Index file will be erased and the news file will be backed
up to the path "<PreNews$Backup>", and the incoming news will be that
file (ie, "<PreNews$Incoming>" will be ignored).
This means after a fetch, you can backup and process your news in one
operation.
If you link to PreNews in !Voyager.Apps.News.!NewsAgent.!Run, then you
can:
Backup your news
Process your news
Debatch your news
Then prepare to read your news
...all with one mouseclick.
The backup is as follows:
<PreNews$Source> points to the 'source' news file. This is possibly
the same as <PreNews$Outgoing>, ie, <INNewsIn$File>, but using a
different variable provides more flexibility.
<PreNews$Backup> points to the base directory. For the purposes of
our example, we shall assume it points to the directory
"IDEFS::Willow.$.newsbackup".
Within this directory will be a subdirectory constructed from a
partial ISO date, in the format YYYY-MM.
Within that subdirectory will be the news file, named as:
DD-HHMMSS
So, say the instant we run PreNews is Tuesday, 22nd May, at half
ten at night. You can expect your news to be backed up as:
IDEFS::Willow.$.newsbackup.2001-05.22-223104
or something along those lines.
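Under the hood, building that leafname is just a date format. A sketch in C (assuming standard strftime; the function name is invented, and gmtime is used here for a deterministic example where a real version would use local time):

```c
#include <stdio.h>
#include <time.h>

/* Sketch: build a RISC OS style backup path of the form
 * <base>.YYYY-MM.DD-HHMMSS ('.' being the directory separator).
 * Uses gmtime for determinism in this example; PreNews itself
 * would want the local clock. Illustrative only. */
static void backup_path(char *buf, size_t size,
                        const char *base, time_t when)
{
    char stamp[24];

    strftime(stamp, sizeof(stamp), "%Y-%m.%d-%H%M%S", gmtime(&when));
    snprintf(buf, size, "%s.%s", base, stamp);
}
```

Fed the base path above and the moment in the example, this yields something along the lines of the path shown.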
You can store up to 77 files per directory, which will be plenty to
hold a month's worth of news if you debatch and read twice a day.
[if you exceed the filer limit, you'll see an error message; oh and
this restriction has been removed for E+ formatted discs under
RISC OS 4.xx]
You can store 77 months' worth of news. That's just under six and a
half years.
There's only one guy I know that has archived that much (of the
group argonet.zfc) and it takes a whole lot of room. To store news
indiscriminately would eat rather large chunks of disc space. Are
you that mad? :-)
Steve Pampling would like a fetcher which is more intelligent in its
operation, without being as horrendous to configure as Newshound.
I cannot speak for the dark distant future, but I can say there is
absolutely NO intention of providing this support in PreNews at this
time. I have other ideas of how to solve that particular problem...
Fixed bug where it'd miss the very last article.
0.03* 06/06/2001
Creates "Index" file to speed up NewsAgent debatching.
Fixed bug where it would miss the very first message in subject
scanning. Also increased sanity trapping in group/subject munger.
Altered code to sort/process subjects and groups in subject scanner.
Will now cope with unusual conditions such as only having one message.
Altered new message detection (in processor and subject scanner) to
look for "#! rnews" as the first thing on a line (rather than as the
only thing on a line!). Now PreNews should work with news transports
that write octet counts into the message header after the "#! rnews"
thing - you know, like Newshound.
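The change amounts to a prefix match rather than a whole-line match - roughly this (invented function name, not the actual code):

```c
#include <string.h>

/* Sketch: a line starts a new message if it *begins* with
 * "#! rnews", so an octet count written after it (as NewsHound
 * does) no longer breaks detection. Illustrative only. */
static int is_message_start(const char *line)
{
    return strncmp(line, "#! rnews", 8) == 0;
}
```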
Benefit : If you're using NewsHound and NewsAgent (!), then PreNews
will now build an Index file to make your news debatching
slightly quicker...
...mmm, ain't exactly a big selling point though, is it? :-)
Output reports tidied up a little.
I wrote an extension to create a second dynamic area, and copy to/from
that, but on my system it barely made any difference (processing at
119K/sec rather than 107K/sec).
I have removed this as it was a rather yucky kludge designed to act as
a 'would this work?' sort of test. If you are using a none-too-nifty
harddisc or interface, and would like this facility put back in, then
email me.
But you'll need to make it convincing as I'm using a standard Samsung
harddisc attached to a Simtec IDE card (which doesn't do DMA), so it
isn't like I have a blindingly fast setup. I rather suspect the speed
is in the drive buffering.
Like I said, you'll need to make it convincing. Don't tell me you're
using the original 240Mb harddiscs hanging off the IDE port on the
A5000. The best we can say about Acorn's attempts at IDE is that it
works, and thanks for putting it on the board. As for speed... Hmmm...
As for sanity, *ADFSBuffers? :-) :-)
Copyright © Y2K1 Richard Murray