Friday, March 27, 2009

So Much for That

A couple of weeks ago I was at a certain computer store and noticed a sale on a GeForce 9400 GT 512 meg video card for $40 (normally $70). While I already have a sufficiently powerful video card, I'd been thinking for a while about picking up a cheap GeForce in case I ever felt inclined to play with CUDA (my current card is an ATI). So, I grabbed one.

After a sufficient amount of procrastination (specifically, two weeks, bringing it to the last day to return it), I figured I should open it and install it to make sure it works. But first I decided to do a bit of research: namely, whether you can have both an ATI and an nVidia card in the same computer at once. Obviously this should work, but with the ATI/nVidia war, driver idiocy, etc., you never know.

Well, it turns out that you can't - at least, not in Vista (I'm using Server 2008, which is based on Vista). Specifically, Vista only allows one WDDM display driver to be loaded at a time; this means that if you have multiple video cards, they all need to use the same driver. While Vista still supports older XP-style XPDM display drivers (and allows multiple XPDM drivers to be loaded at once), it'll cost you - WDDM drivers are required for things like Aero and DirectX 10; worse, you can't mix an XPDM driver with a WDDM one. Windows 7 is rumored to support multiple WDDM display drivers at once.

And that's why I looked into it while it was still returnable, rather than opening the box before doing some basic research.

Thursday, March 26, 2009

Die, .NET. Thanks.

So, I just encountered an (extremely) evil quirk of the .NET platform while tracking down a bug.

Everyone who programs .NET knows that one key difference between structs and classes is that structs are (unless ref is specified) always passed by value, while classes are passed by reference. Apparently that rule is not limited to passing the structs themselves: passing an instance method of a struct as a callback causes a copy of the entire struct to be captured, and the callback is then invoked on that copy, not on your original instance.

Example:
system.FindCollisions(collisionSet.OnPossibleCollision, workingSet);

In this line, FindCollisions receives a delegate bound to a copy of collisionSet. When it later invokes that callback, the callback operates on the copy, not on collisionSet itself.
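Here's a minimal, self-contained sketch of the same copy behavior (Counter and Invoke are hypothetical names, purely for illustration):

    using System;

    // Hypothetical struct with a mutating instance method.
    struct Counter
    {
        public int Count;
        public void Increment() { Count++; }
    }

    class Program
    {
        // Stand-in for a method like FindCollisions that takes a callback.
        static void Invoke(Action callback) { callback(); }

        static void Main()
        {
            Counter counter = new Counter();

            // Creating the delegate boxes a copy of the struct;
            // the callback increments that copy, not 'counter' itself.
            Invoke(counter.Increment);

            Console.WriteLine(counter.Count); // Prints 0, not 1.
        }
    }

The usual workarounds are to make the type a class, or to have the callback return the modified value instead of mutating in place.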

I'm not sure whether this is by design or a bug. While it's consistent with the policy of always passing structs by value, it's so counter-intuitive that I have to wonder whether it's a bug.

Wednesday, March 25, 2009

& More Piracy

I haven't written about it as much on this blog as in other places (instant messages, IRC, various forums, etc.), but I'm a huge opponent of both Digital Rights Management (also known as Content Restriction, Annulment and Protection) and draconian anti-piracy efforts like the RIAA's lawsuit campaign. This position was based on the belief that regardless of the raw numbers (e.g. total P2P downloads of something vs. actual sales), the number of lost sales due to piracy was low enough that it was better to just eat the losses as part of the cost of doing business than to expend the effort, cost, and public goodwill to try to fight piracy (which wouldn't work anyway). Specifically, I was guessing that the actual losses were in the 5-15% range - that is, if there were zero piracy, producers could sell 5-15% more of whatever they make.

However, just now I came across some rather shocking statistics, courtesy of the makers of World of Goo (a game that shipped without DRM). According to their data collection, 82% of copies of the game played were pirated. That's higher than I expected, but it wasn't the shocking part; the shocking part was that the ratio of pirated copies to lost sales is about 1000:1. Crunching the numbers (82 pirated copies for every 18 sold is about 4.6 pirated copies per sale; at 1000 pirated copies per lost sale, that's roughly 0.46 lost sales per 100 actual sales) and rounding up a bit to err on the side of caution, this means that the actual losses to piracy are less than 0.6% of the revenue from legitimate sales. This is between 8 and 25 times lower than I thought it would be.

These numbers absolutely demolish the claims that
  1. Internet piracy seriously hurts publishers (the actual damage is negligibly small), and
  2. DRM does more help than harm by reducing piracy (with losses that low, there isn't any room for DRM to help, yet it clearly does a substantial amount of harm).

Thursday, March 19, 2009

Random Linguistic Fact of the Day

Technically, English (and I think all Germanic languages) doesn't have a future tense. The future is rendered as a mood in English, using a modal (mood auxiliary) verb in the same class as (and mutually exclusive with) "can", "may", "must", "would", etc. The past and present tenses, on the other hand, are true tenses, both in the indicative mood (with no modal verb, or with the dummy auxiliary "do").

The logical basis for this distinction has to do with the concept of realis. Essentially, that means what it looks like: realis moods have to do with 'real' things - things considered certain to have happened or to be happening - while irrealis moods describe things that are not certain for one reason or another. There's a general tendency in language to regard the future as inherently uncertain, and thus to place it in an irrealis mood.

Whether this is a peculiarity of Germanic languages or is universal among Indo-European languages is unclear. Latin has a future tense for the imperative mood (commands - an irrealis mood) as well as for the indicative (events that are certain - a realis mood), but not for the subjunctive or the supine, two other irrealis forms.

For trivia value: Caia does not have tense; aspect and mood are used to imply tense, and if tense must be made absolutely certain, it can be indicated with adverbs. It has three basic moods (more complex moods are specified with helper verbs or particles): indicative, potential, and hypothetical. As in English, the indicative is used for events considered certain, and is used primarily for past and present tense. Potential mood indicates that an event is possible, but not certain; it is used for the future, among other things (although the preferred method of referring to the future is to reduce it to a certain, indicative present expression such as "I intend to go" or "I want to go", which is more precise). The hypothetical refers to events that are known to be false (hence talking about a hypothetical, counter-factual "what if" situation).

Sunday, March 08, 2009

Random Linguistic Fact of the Day

Ever wonder why it's fairly common to create compound nouns from phrases (e.g. bird-watching, card-carrying), but in all these cases the object comes before the verb participle? Based on English word order it should be watching-bird, etc., yet it never is.

This is probably due to the fact that word order in Proto-Indo-European was very different from the word order used in English and most other Indo-European languages today. In particular, instead of the subject-verb-object order typically used today, PIE (along with more recent languages, like Latin) preferred subject-object-verb order. So you might say things like "Avem [bird] spectabam [I watched]" in Latin, which is exactly the order seen in the compounds.

Saturday, January 10, 2009

Here We Go Again!

It's that time again: the beginning of the next anime season - the winter 2009 season, in this case - and things look pretty bleak. THAT Anime Blog put their preview up quite a while ago. I'd been waiting to post until Random Curiosity posted theirs, although then I was too lazy to actually post when they did, a week or so ago :P

So, what's Q looking forward to? Well, not much. I'll probably watch Slayers EVOLUTION-R because I have a couple of friends who absolutely love that franchise. I downloaded the first episode of White Album (though I haven't watched it yet), mainly because of the pretty pictures in Random Curiosity's first-episode review; we'll see how that turns out. Other than that, there are three series that, while I can't say they sound great, I'm at least willing to try for a couple episodes: Chrome Shelled Regios, Black God (listed as "Kurokami the Animation" on Random Curiosity), and Kemono no Souja Erin [The Beast Player Erin] (in no particular order).

Really, the closest thing to something I'm "looking forward to" on that list is Minami-ke Okaeri [Minami family, welcome home]. While I wouldn't name it among my favorites, the Minami-ke franchise was sufficiently amusing to make me watch more of it (although it has a bit more cross-dressing than I'd care for).

I'd say the best stuff ("best" being a relative measure here) is the two-plus-season series continuing into this season: Toradora!, Clannad After Story, Skip-Beat!, Toaru Majutsu no Index [A Certain Magical Index], and Tales of the Abyss, listed vaguely in the order I like them.

However, I do have good news. I recently saw a mini-review on the sidebar of Anime News Network for a series by the name of Higurashi no Naku Koro ni ["when the cicadas cry", though the official translation is "Higurashi - When They Cry"] that kind of intrigued me; a quick look on AniDB revealed that it was fairly well rated (although the art style is hideous). So, I grabbed it. While the first roughly two-thirds of the series was sufficiently amusing to hold my interest, the last third really elevated my opinion of it, and made me recommend it to all my friends. Be warned, though: this series is not for the faint of heart, the faint of stomach, or those who don't like brutal, tragic stories without happy endings.

It's a bit of a peculiar series. It's structured into a number of arcs (six in the first season), falling into two categories: question arcs and answer arcs. Question arcs are horror/thriller-type stories where a lot of weird stuff happens, but much of it is left unexplained. Answer arcs explain parts of the series - why things happened the way they did, and what is behind some of the strange occurrences in the question arcs. Oddly, while some of the arcs are compatible, others appear to take place in different universes, as they contain mutually exclusive events occurring at the same time, to the same characters. The last arc of the first season, however, suggests that there are actually parallel worlds that aren't entirely separate, and there's some crossover between them; the details are left, presumably, to the second season (Higurashi no Naku Koro ni Kai) or the OVA, which I haven't seen yet.

In any case, I've quickly become attached to it (I watched all 26 episodes in 2 days, which is more anime than I usually watch), and I'm a bit depressed at having run out of first-season episodes to watch.

You can get it from a variety of places, at least one of which is downloadable online (*cough* Boxtorrents *cough*); note that the first season has 26 episodes, while the second has 24. I'd definitely recommend watching it. If you want to start with the best part, I'd say watch episodes 5-8 and 16-21 (both contain the same story, but from different perspectives; the latter explains what was going on in the background of the former, and shows why things turned out the way they did). Even if you don't skip ahead, I'd recommend watching those two arcs back-to-back at some point, as they're part of the same story.

Friday, December 19, 2008

Switching Gears

After years of suing thousands of people for allegedly stealing music via the Internet, the recording industry is set to drop its legal assault as it searches for more effective ways to combat online music piracy.
...
Instead, the Recording Industry Association of America said it plans to try an approach that relies on the cooperation of Internet-service providers. The trade group said it has hashed out preliminary agreements with major ISPs under which it will send an email to the provider when it finds a provider's customers making music available online for others to take.
- Wall Street Journal

I definitely did not expect this so soon, as progress in the courts toward stopping the RIAA's legal campaign is proceeding at a crawl, and the RIAA could probably have continued for a while longer before getting hit with serious legal penalties.

Wednesday, December 17, 2008

Graphics Programming Term Project

In case anybody is interested, here is the paper for my term project in graphics class. I was working on an implementation of it, but thanks to various brick-wall problems, that didn't end up getting completed in time (which is why the results section does not discuss the results of an actual implementation), although it did result in me going 38 hours without sleep. It's an interesting method, and at least a couple small parts of it are novel (I'm not aware of them having been proposed before), but it probably isn't practically useful, for the reasons explained in the paper.

Multi-Fragment Deferred Shading Using Index Rendering

& More Echoes

BahamutZero has informed me that Echoes of War is now available on iTunes+ (DRM-free 256 kbps AAC downloads) for $14.85 (the two CDs in Echoes of War are sold separately, totaling that price). I'm not aware of the actual CDs being available anywhere but Eminence, the creators. If you just want the audio files without the shipping cost (my Legendary Edition cost something like $12 in shipping), check out iTunes.

Tuesday, December 16, 2008

Intriguing

Well, school is almost over (finals are this week) and I don't have a job lined up yet, so substantial amounts of amusement will be welcome in the near future (especially given how bleak the anime outlook is this season...). As it turns out, I'm in luck! While banging my head against a wall till I pass out working on a term project, something amusing happened. I don't have time to explain the details now (despite the fact that this is much more interesting than my school project), but here's a short headline of what's up and coming: Q vs. Scam Debt Collection Agency.

Look forward to it!

Sunday, November 23, 2008

& Other Things

I managed to forget something important in my last post, despite the fact that it's closely related to one of the things I did mention: namely, that Echoes of War is out. At least, it shipped to me last week; few others seem to have gotten it yet, and as far as I know it hasn't even hit peer-to-peer networks.

Echoes of War is an orchestral remix/arrangement of music from all three of Blizzard's universes - Warcraft (III, World of Warcraft), Starcraft (I & II), and Diablo (I-III) - by Eminence. It's about one and a half hours of music, with several tracks from each game, each track being a medley of pieces from that game.

While some of them fairly closely follow the original sound, some are arranged in very novel and surprising ways. Two of the best examples are the big-band jazz arrangement of the Starcraft I Terran music, and the crazy symphonic/operatic/Middle Eastern/The Rock arrangement of the Starcraft I Zerg music. (Other samples can be played from the Echoes of War media section.)

How much I like the tracks varies by track. Several of them I really like, although I'm noticeably less fond of the Diablo tracks than the Warcraft and Starcraft ones. In any case, the album is awesome. If you like the music of Warcraft, Starcraft, and/or Diablo, buy it. I just wish the stupid thing were sold by stores that didn't charge $14 for shipping...

Wednesday, November 19, 2008

Various Thingies

First of all, I should mention that my house is fine; the fire didn't get near it. The probability of it getting here was fairly low, but we did a bit of better-safe-than-sorry packing. Last night, while driving home from school, I did drive past an (unrelated) fire that filled the entire intersection with smoke in about a 50-foot radius; I still don't know what was on fire (I couldn't see it), but the smoke was very obvious, and I heard fire trucks going by.

In other news, it's been relatively difficult to collect data on Firefox after reenabling the Feed Sidebar addon. Firefox crashed after three days of logging memory usage, and then a couple days later I needed to restart it because I needed the memory for WoW (Firefox was using about a gig). But the addon definitely seems to be the cause of the memory leak. From the days I did gather data, it looks like it leaks about 40 megs/hour (although that's only over a couple days; the rate might decrease over time).

Finally, I just noticed something that happened last year: the Starcraft soundtrack, not previously available (the compressed audio shipped with the games is 22 kHz ADPCM, which is pretty poor quality), is on iTunes for $10; the other Blizzard OSTs that were included in the collectors' editions of Diablo 2, Warcraft 3, and World of Warcraft are also available there (though unfortunately all of them are single CDs, which means they're incomplete). The music is DRM-free (although I hear they encode personally identifying information into the audio files), 256 kbps AAC (good quality), though you will have to install the Apple iTunes crapware to buy it. I'm told the M4A files should play in any PC audio player that supports AAC (I know they work in WinAmp), though they are not MP3s and will not work on MP3-only players. That's your public service announcement for today.

Saturday, November 15, 2008

Toasty

So, it's a blistering 4% humidity (91 degrees), and southern California is burning once again. As has happened several times before, everything looks golden through the smoke filling the sky, and ash is accumulating on every outdoor surface. People working outside are being told to wear masks to cut down on the amount of smoke they inhale.

Currently several hundred homes have burned down and a few thousand people have been evacuated. The fire isn't expected to get here (it's 10 miles away), but we're doing some preliminary packing in case things go badly and we have to evacuate. It's also possible that damage to distant power lines could cause us to lose power here (in a worst-case scenario), even if we don't have to evacuate.

Wednesday, November 12, 2008

& More Leakage

So, after writing that last post about the audio driver handle leak, I decided to log some data - specifically, the amount of memory Firefox allocates, and the number of handles held by the Symantec Anti-Virus process smc.exe. It's now been about a week since I started gathering data (although unfortunately the power went out in the middle, so I ended up with two smaller replicates).
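For the curious, the logging itself is nothing fancy; here's a minimal C# sketch of the idea (a hypothetical reconstruction, not my actual script), sampling handle counts and private memory into a CSV every 10 minutes via .NET's Process class:

    using System;
    using System.Diagnostics;
    using System.IO;
    using System.Threading;

    class LeakLogger
    {
        static void Main()
        {
            while (true)
            {
                // One CSV row per process of interest per sample.
                foreach (string name in new[] { "smc", "firefox" })
                {
                    foreach (Process p in Process.GetProcessesByName(name))
                    {
                        File.AppendAllText("leak-log.csv", string.Format(
                            "{0:s},{1},{2},{3}\r\n",
                            DateTime.Now, name, p.HandleCount, p.PrivateMemorySize64));
                        p.Dispose();
                    }
                }
                Thread.Sleep(TimeSpan.FromMinutes(10));
            }
        }
    }

Graphing the resulting CSV in a spreadsheet makes leak trends (and their linearity) obvious at a glance.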

The data for smc.exe shows that it begins at approximately 450 handles on startup and acquires an additional 3100 handles per day (although a 'day' here is about 14 hours, as I hibernate my computer at night; that's about 220 handles/hour). This definitely doesn't seem normal, and I'm going to venture a guess that it's a handle leak. I also noted that the increase seems to be linear over the course of the day, so it's unlikely to be related to something like automatic updates.

I already knew that Firefox was hemorrhaging memory. If I recall correctly, the amount of memory allocated by Firefox increased by 200-300 MB per day. This time, I tried using Firefox for several days without two of the three addons I normally use (the third is NoScript, which I didn't want to go without unless I had to). While this test didn't last as long as I'd hoped (thanks to that power outage), after four days Firefox had only increased from 125 MB (when I first started it, with a lot of saved tabs) to 205 MB (now). Previously, I would have predicted it would hit 600-900 megs in four days.

This strongly suggests that one of the two addons is responsible for the massive leakage, although I'll have to watch what happens after I reenable the one most likely to be causing the leak (the other is newly installed, and this problem has been around for longer than that): Feed Sidebar (version 3.1.6). So, we'll see what happens with that. I might have an answer in another 4-7 days.

Tuesday, November 04, 2008

& the Audio Driver Incident

Several months ago, I (finally) upgraded my computer. My old one was a 1.8 GHz Athlon XP (single 32-bit core) with 1.25 gigs of RAM and a GeForce 3; in other words, 2002 or 2003 hardware. My new computer is a 2.4 GHz Core 2 (four 64-bit cores) with 4 gigs of RAM and a Radeon 4850; depending on the benchmark, my new CPU is 10-18x as fast as my old one, counting all 4 cores. After trying various voodoo to get my old XP installation to run on the new computer (despite the fact that it wouldn't have been able to use about a gig of my RAM), I ultimately gave up and installed Windows Server 2008 64-bit. After dealing with a whole bunch of problems getting various things working on 64-bit 2008, things ended up being acceptable, and I've used it ever since.

However, a couple of relatively minor problems have been long-standing, continuing until a few days ago. One was easy to diagnose: Firefox was leaking memory like heck. For every day I left my computer on, Firefox's RAM usage would grow by a couple hundred megs, getting up to a good 2 gigs on occasion (I usually kill it before it gets to that point). While this was certainly an annoyance, it wasn't much of a problem, as I have 4 gigs of memory, and I can simply restart Firefox to reclaim all the leaked memory whenever it gets large enough to become a problem.

The other was much harder to diagnose. Something besides Firefox was leaking memory, and it was not clear what. Total system memory usage would increase over days, and even ignoring Firefox, all 4 gigs of memory would be used up by about two weeks after the last reboot. Unlike with Firefox, there was no apparent culprit - no single process showed a significant accumulation of memory, nor were excess processes being created - leaving 1-2 gigs of memory I couldn't account for. So I went several months without knowing what the problem was, usually handling it by restarting my computer every week or so.

Then one day my dad called me from work to ask why his computer there was sometimes performing poorly. So I had him look through the process list and system statistics for memory leaks, excessive CPU usage, etc. As I don't have the exact terminology on those pages memorized, I also opened up the listing on my computer to be sure I told him to look for the right things.

This brought something very curious to my attention: the total handle count for my computer was over 4 million. This is a VERY large number of handles; normally computers don't have more than 20-50k handles open at a time - two orders of magnitude less than what my computer was experiencing. This was an almost certain indication that something was leaking handles on a massive scale. After adding the handle-count column to the process list, I found that audiodg.exe held some 98% of those handles. Some looking online revealed that that process is a host for audio driver components and DRM. Some further searching for audiodg.exe and handle leaks turned up reverse-engineering by one person showing that this was due to the Sonic Focus audio driver for my Asus motherboard leaking registry handles.
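Incidentally, you don't even need Task Manager for that kind of check; a few lines of C# can rank processes by handle count (a hypothetical sketch, not what I actually used):

    using System;
    using System.Diagnostics;
    using System.Linq;

    class HandleCensus
    {
        static void Main()
        {
            // List the ten processes holding the most handles,
            // to spot likely handle leaks at a glance.
            foreach (Process p in Process.GetProcesses()
                .OrderByDescending(proc => proc.HandleCount)
                .Take(10))
            {
                Console.WriteLine("{0,8} handles  {1}", p.HandleCount, p.ProcessName);
            }
        }
    }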

Fortunately, an updated driver that addresses the issue was available by this time. As my computer was at 96% RAM usage (the worst it's ever been - I usually reboot before it gets to this point), I immediately installed the driver and restarted the audio services (of which audiodg.exe is one). This resulted in a shocking, instant 1.3 gig drop in kernel memory usage, to less than 400 megs total. It's been a day and a half since then, and audiodg.exe is currently using 226 handles, suggesting that the problem is either dead or drastically reduced (it has increased by about 70 handles in those 1.5 days); even if it is still leaking handles, 50 handles a day is a tolerable rate, as that's only on the order of 10 KB/day.

So, this whole thing revealed that Windows is quite robust. Given that most computers never go above 50k handles, I was very surprised that Windows was able to handle 6.6 million handles (the highest I ever saw it get) without falling over and dying (although this wouldn't have been possible on a 32-bit version of Windows, as that 1.7 gigs of kernel memory wouldn't have fit in the 2 gig kernel address space after memory-mapped devices have their space allocated). Traditionally, Unix has had a limit of 1024 file handles per process, though I don't know what's typical these days (I know at least some Unix OSes have that as a configurable option).

After pursuing that problem to its conclusion, I decided to do some more looking for handle leaks in other processes. While the average process uses only 200-500 handles, a number of processes get as high as 2k handles (which is not abnormally high). However, one process - smc.exe, part of Symantec Antivirus - has almost 50k handles allocated, making it a good candidate for a handle leak. Looking at the process in Process Explorer shows that a good 95% of those handles are of the same type - specifically, unnamed event handles - providing further evidence of handle leakage. That's as far as I've gotten; I haven't spent much time investigating the problem or looking for an analysis online (though the brief searches I did turned up nothing related). So, that's work for the future.

Thursday, September 04, 2008

Final Lap

Well, it's currently the second week of the fall university semester. This semester is extra special because it's my last before I graduate, and anything that results in less school is always a good thing. Between my two majors - biology and computer science - I've been in college entirely too long, and I'm hoping to move on to more enjoyable things after Christmas.

This semester, I only need 9 more units to graduate, including two specific classes. First, I need a "modern high-level language" class. This comes in three flavors: Visual Basic (probably .NET), C#, and Java. As I don't see much use for VB, it's a toss-up between C# and Java. I already have a decent amount of experience with C#, which means Java is the course I'd gain the most from (it would add another entry to my resume). Unfortunately, the only Java class this semester is at 8 AM, which is a bit (*cough*) too early for me. Thus C# wins by default. While I probably won't learn a great deal, it has the advantage of requiring less effort, which is also always a good thing.

The other specifically required class is Programming Languages and Translation. This is a recent class that merges two previous ones: one on high-level languages (a survey of a dozen or so languages, and the various ways high-level languages accomplish common tasks) and one on compiler development. I had actually taken the former, but they merged the two before I could take the latter, forcing me to take this new one instead. On the plus side, this means I'll have to put in less effort in this course as well, and I probably won't have to study (in my case that means 'read the textbook and come to class') for much of the first half of the semester.

One of the things we'll be doing in the class over the semester is writing our own compiler. I've already got some ideas for a high-level programming language that closely resembles natural (i.e. spoken) English, intended for use by people who aren't computer science or math people. I ought to discuss some of the ideas for it on this blog; we'll see what my infamous laziness permits.

Unfortunately, the class I really wanted to take this semester isn't being offered - the Game Programming Development Project. It would have been awesome to spend a semester working on E Terra (Gord knows I'm too lazy to work on it when I don't have to) and get three units of credit for it.

So, that left me needing to find another class. This semester is pretty bleak as far as which courses are being offered and when. While there are maybe five other classes I wouldn't have minded taking if nothing else was available, almost none of them are offered this semester (and those that are either have prerequisites I don't have or are at extremely inconvenient times). So I was forced to improvise - by looking at the list of graduate classes. As it turns out, my school allows undergraduates to take graduate classes with permission from the department, although you can get kicked out if there are more graduate students than spots in the class.

One class was at a convenient time, covered something useful to me, and only required courses I'd already taken: Advanced Graphics Programming. Unlike BahamutZero, I can't really say I especially like or get excited by graphics programming, but clearly a thorough knowledge of graphics is a big plus for game programming; as well, I hadn't had any trouble in the undergraduate graphics class, so I can at least get the job done. Unfortunately, the syllabus doesn't look as applicable to game programming as the course description suggested, but hopefully it'll end up being worthwhile (and hopefully graduate-level homework and projects won't be too painful).

One thing that may turn out to be fun is the term project. The teacher hasn't actually given out the assignment yet (which will have a few dozen example topics), but as I understand it, we can do just about anything, as long as it's related to graphics and is sufficiently ambitious for a graduate-level class. When I mentioned all this to BZ (who loves graphics stuff, and would probably take a graduate-level graphics course if he had the chance), he immediately asked if we could do a project together (I'd thought the same thing even before he asked). As it turns out, we can (I talked to the teacher), provided the project is large enough for both of us and our work is sufficiently separated that the teacher can grade my part on its own. So, this could turn out to be fun. I'll probably write about at least the topic (when we come up with one), if not details along the way (and if BZ is working with me, he may post about it on his blog as well).

Also, just to briefly mention a topic I ought to write about in the near future: the first programming assignment in graphics class - a rudimentary ray tracer. This actually isn't very difficult. Writing a simple ray tracer that can render simple things (e.g. plastic-looking spheres) is pretty easy; it's making it fast and photo-realistic that's hard - and neither of those is a requirement for this project. I estimate it'll take two or three days of coding, and we have two weeks to do it (though I have a bunch of relatively easy optional features I want to add, so it will probably take me longer than the others in the class), which isn't bad - not unlike what I'd expect from a graduate-level class.
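To give an idea of why the simple case is easy: the heart of such a ray tracer is just a ray-sphere intersection test - solving one quadratic per sphere. A minimal sketch (my own illustration, not the assignment code):

    using System;

    struct Vec3
    {
        public double X, Y, Z;
        public Vec3(double x, double y, double z) { X = x; Y = y; Z = z; }
        public static Vec3 operator -(Vec3 a, Vec3 b)
        {
            return new Vec3(a.X - b.X, a.Y - b.Y, a.Z - b.Z);
        }
        public static double Dot(Vec3 a, Vec3 b)
        {
            return a.X * b.X + a.Y * b.Y + a.Z * b.Z;
        }
    }

    class RaySphere
    {
        // Distance t along the ray (origin + t*dir, dir normalized) to the
        // nearest intersection with the sphere, or null if the ray misses.
        static double? Intersect(Vec3 origin, Vec3 dir, Vec3 center, double radius)
        {
            Vec3 oc = origin - center;
            // Solve t^2 + 2*b*t + c = 0 (dir is unit length, so a = 1).
            double b = Vec3.Dot(oc, dir);
            double c = Vec3.Dot(oc, oc) - radius * radius;
            double disc = b * b - c;
            if (disc < 0) return null;            // Ray misses the sphere.
            double t = -b - Math.Sqrt(disc);      // Nearer root.
            if (t < 0) t = -b + Math.Sqrt(disc);  // Ray origin inside the sphere.
            return t >= 0 ? (double?)t : null;
        }

        static void Main()
        {
            // Ray from the origin down +Z toward a unit sphere at z = 5; hits at t = 4.
            double? hit = Intersect(new Vec3(0, 0, 0), new Vec3(0, 0, 1),
                                    new Vec3(0, 0, 5), 1.0);
            Console.WriteLine(hit.HasValue ? "hit at t=" + hit.Value : "miss");
        }
    }

From there, shading a plastic-looking sphere is just evaluating something like the Phong model at the hit point; the hard parts (speed, photo-realism) come later.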

Sunday, August 24, 2008

& More MPQDraft

Well, looking at the SourceForge statistics, a few interesting things became apparent. First, there's still a remarkable amount of interest in MPQDraft, even after all these years. MPQDraft, released some 7 years ago, was originally targeted at Starcraft, which was released in 1998 - a decade ago; while Starcraft has remained the primary target of MPQDraft and modders, MPQDraft has also been used to a lesser extent with Diablo II (8 or 9 years old, I think) and Warcraft III (7 years old, I believe). Given that even the most recent of those is a good 6 or 7 years old, it's pretty surprising that MPQDraft is still heavily used today. As the graph indicates, over the last year MPQDraft has been averaging 600 downloads/month and trending upward, with 750 downloads last month.

I'm a bit curious what happened in November to produce such a huge rise in download count - about a 6x increase from September to November. The first thing I thought of was the announcement of Starcraft II; however, that was quickly ruled out, as SC2 was announced 6 months earlier. So I'm really not sure what caused the increase. I can only imagine some very large site related to Blizzard games (Blizzard themselves, or one of the major modding sites) linked to MPQDraft's recent home on SourceForge around that time.

The other surprise is the seeming lack of interest in the source code. According to the SourceForge statistics, there have been fewer than 50 source checkouts to date. I'm curious whether that's related to the fact that I'd only made the source available through Subversion (the version control system I use for MPQDraft). To test that hypothesis, I've posted a package containing all the source on the SourceForge download page. You'll need to take a look at the notes (separate from the ZIP file) for what you need to get the code to build.

Thursday, August 21, 2008

& MPQDraft

Well, it's been a while. After I registered the MPQDraft project on SourceForge last April 1, I procrastinated for so many months that many people suspected it was an April Fools' prank; the only prank intended, however, was the fact that it wasn't a prank. One year later, I finally got around to posting the complete patching code, and I'm only just now getting the last of the GUI code uploaded. That is to say, the code on SourceForge is now complete, and MPQDraft is fully open source.

Relevant links:
The MPQDraft project
Binary download page
Web-based source code browser
Instructions on downloading the source code
The OSS license it's licensed under

Sunday, August 10, 2008

Gah

Random fact of the day: the Microsoft Developer Network library no longer documents which versions of Windows prior to 2000 support a given function. For example, MSDN does not list support for CreateFile in any version of Windows prior to 2000, despite it being in every single 32-bit version of Windows (Windows 95 and NT 3.1 onward).

Wednesday, July 30, 2008

Comcast Revisited

A bit ago I wrote a Mac Chaos-inspired parody of the Comcast board room, explaining how they'd gotten into the P2P bandwidth crisis they're in, and proposing that the true cause of the crisis is that they'd signed up massively too many users for their old infrastructure while steadily increasing the speeds offered to subscribers. A post on P2PNet today lends strong support to that conclusion:
There really is a problem on (at least some) cable upstreams today, based on what I hear from people I respect who have the data. My hope - which won’t be tested until 2009 - is that the DOCSIS 3.0 upstream will resolve most or all of the problems for the next few years. Full DOCSIS 3.0 has a minimum of 120 megabits upstream (shared) among typically 300 homes, something like 400K per subscriber. Current cable modems typically have 8 to 30K per subscriber. This is a huge difference.

While those 'K's don't indicate whether they're kilobits or kilobytes, a bit of quick math tells us they're kilobits. In other words, Comcast currently allocates a minimum of 1 to 4 KBps (8 to 30 Kbps) per subscriber. Meanwhile, IIRC, Comcast sells 384 to 768 Kbps upstream connections. That puts the overselling ratio between 13 and 100 (384 Kbps ÷ 30 Kbps ≈ 13 at best; 768 Kbps ÷ 8 Kbps = 96, roughly 100, at worst).

Another section is also interesting, for comparison with DSL and FIOS:
Verizon, AT&T, and Free.fr are strongly on the record they do not have significant congestion problems. They do not have a shared local loop, and have allocated much more bandwidth per customer. I’ve sat at the network console of a large ISP and seen they had essentially no congestion amongst the millions of subscribers they served. They allocate 250K per subscriber, much more than current cable.

It's not clear which service those figures are for. I believe AT&T DSL doesn't offer more than about 768 Kbps upstream, in which case this would be an overselling ratio of 3 (768 ÷ 250). If it's Verizon FIOS (say at 5 Mbps, their faster upstream speed), that's an overselling ratio of 20 (5000 ÷ 250). Suddenly it seems very unsurprising that Comcast is having problems and AT&T/Verizon are not. It also shows you who's been investing in their network over the last decade and who hasn't.