Wednesday, 22 February 2006

Blood on the Dancefloor

How come this album sucked when it was released? It has to be the best album going for writing lexical string parsers!

As for the third-year project... work goes on slowly. I can now happily parse a filter string, which I've designed to look a lot like SQL: FROM rssuri,rssuri2 WHERE BODY="regular expression here" AND SUBJECT="another regex" DURATION 30m DISPLAY ON COMPLETION.
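
For my own reference, here's roughly the shape of the thing as a Python-flavoured sketch. It isn't the actual project code, and the function name and the dict it spits out are made up for the illustration, but the grammar it accepts is the one above.

```
import re

# Rough sketch of the filter grammar, pinned down as one big regex:
#   FROM uri[,uri]* WHERE FIELD="regex" [AND|OR FIELD="regex"]*
#   [DURATION 30m] [DISPLAY ...]
# Naive on purpose: it would trip over DURATION turning up inside a quoted regex.

def parse_filter(text):
    m = re.match(
        r'FROM\s+(?P<feeds>\S+)'
        r'\s+WHERE\s+(?P<conds>.*?)'
        r'(?:\s+DURATION\s+(?P<duration>\S+))?'
        r'(?:\s+DISPLAY\s+(?P<display>.+?))?\s*$',
        text, re.IGNORECASE)
    if m is None:
        raise ValueError("filter string didn't match the expected shape")

    # Conditions are FIELD="regex" pairs joined by AND/OR; keep the connective too.
    conds = re.findall(r'(AND|OR)?\s*(\w+)\s*=\s*"([^"]*)"',
                       m.group('conds'), re.IGNORECASE)
    return {
        'feeds': m.group('feeds').split(','),
        'conditions': [(c.upper() or 'AND', f.upper(), rx) for c, f, rx in conds],
        'duration': m.group('duration'),
        'display': m.group('display') or 'ON COMPLETION',
    }

if __name__ == '__main__':
    q = parse_filter('FROM rssuri,rssuri2 WHERE BODY="regular expression here" '
                     'AND SUBJECT="another regex" DURATION 30m DISPLAY ON COMPLETION')
    print(q['feeds'], q['conditions'], q['duration'], q['display'])
```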

I've spent a bit of time umming and ahhing over what my project will actually do. I reckon now that making it a client-server RSS reader is the way to go: it stops it being yet another crappy desktop RSS reader, it gives me a decent reason to make it multi-user... and, not to mention, I reckon it's also the best way to pick up some more marks.

By client-server I mean: my RSS hoojum sits on the server and users connect via sockets (or a web interface, if I'm feeling really frisky) and say: "yo, whassssuuuuuup! I want you to email me all of the articles you can fetch from the BBC, CNN and the DoJ that contain "Michael Jackson" in either the subject or the body. Let me know of any updates every two hours. Thanks a lot man! Tootles!" (FROM bbc.com/rss,cnn.com/rss,doj.gov/rss WHERE BODY="%Michael Jackson%" OR SUBJECT="%Michael Jackson%" DISPLAY EVERY 2h). Then the hoojum parses that lot, sets up some threads to poll the feeds, does some filter magic, and spews out an email every two hours if any documents matched.
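
Roughly what I imagine each of those subscriptions boiling down to on the server side, again as a made-up Python sketch rather than the real thing: one thread per query, polling the feeds, running the regexes over subject/body (ORed here, to match the example), and handing anything that matched off to an email step. It leans on the parse_filter sketch above, and the real version would talk sockets and SMTP rather than print().

```
import re
import threading
import time
import urllib.request
import xml.etree.ElementTree as ET

def fetch_items(feed_url):
    """Grab a feed and pull out (title, description) pairs from the RSS <item>s."""
    with urllib.request.urlopen(feed_url) as resp:
        root = ET.parse(resp).getroot()
    for item in root.iter('item'):
        yield item.findtext('title') or '', item.findtext('description') or ''

def matches(query, title, body):
    """Run the SUBJECT/BODY regexes from a parsed query over one article."""
    for _connective, field, pattern in query['conditions']:
        text = title if field == 'SUBJECT' else body
        if re.search(pattern, text, re.IGNORECASE):
            return True
    return False

def send_digest(address, titles):
    # Placeholder for the email bit; printing is enough for the sketch.
    print('would email', address, 'with', len(titles), 'matching articles')

def poll_worker(query, email_address, interval_seconds):
    """One thread per subscription: poll the feeds, collect matches, send a digest."""
    while True:
        hits = []
        for feed_url in query['feeds']:
            try:
                hits += [t for t, b in fetch_items(feed_url) if matches(query, t, b)]
            except Exception as err:          # a dead feed shouldn't kill the thread
                print('skipping', feed_url, err)
        if hits:
            send_digest(email_address, hits)
        time.sleep(interval_seconds)

# e.g. threading.Thread(target=poll_worker, args=(q, 'me@example.com', 2 * 3600),
#                       daemon=True).start()
```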

What makes the whole thing cool (o_o) is that feed data can be shared between users, so bandwidth isn't wasted and queries placed upon the same data don't have to be duplicated. This is where, in theory, I can make some ace marks. The only flaw is that I have absolutely toss-all idea how to share the data between the matching threads so that the computation actually gets reduced. I guess I will ask Alvaro, my tutor, about this later on today :)
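
To get the idea straight in my own head before I see him, here's one possible shape for the sharing bit (no idea yet whether it's the right one): a single cache of fetched feed items, keyed by URI and guarded by a lock, so the per-user filter threads all read the same copy instead of each hitting the feed themselves. Again, made-up Python rather than anything that exists in the project yet.

```
import threading
import time

class FeedCache:
    """Shared store of fetched feed items, so N queries on the same feed cost
    one fetch rather than N. Just a sketch of one possible approach."""

    def __init__(self, fetch, max_age_seconds=600):
        self._fetch = fetch            # e.g. fetch_items from the sketch above
        self._max_age = max_age_seconds
        self._lock = threading.Lock()
        self._entries = {}             # feed_url -> (fetched_at, list_of_items)

    def items(self, feed_url):
        with self._lock:
            cached = self._entries.get(feed_url)
            if cached and time.time() - cached[0] < self._max_age:
                return cached[1]       # fresh enough: no network hit at all
        # Fetch outside the lock so a slow feed doesn't stall everyone else.
        # Two threads can still race to refresh the same feed; that only costs
        # one redundant fetch, which is probably fine for a first cut.
        items = list(self._fetch(feed_url))
        with self._lock:
            self._entries[feed_url] = (time.time(), items)
        return items
```

The poll threads above would then call cache.items(feed_url) instead of fetching for themselves, which gets the bandwidth saving; sharing the filtering work itself (so two users with the same regex only pay for it once) is the bit I still can't see how to do.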
