Category Archives: Benchmarking

Biding Time

Until President Obama waves his hand and magically fixes the economy so I can find a job, I’ve been biding my time with several projects as usual. As mentioned in my post a couple months ago, I expected to complete a major overhaul of the content system by the end of last year and amazingly actually did so. I felt like the content pages were too bulky with the varying number of images and description lengths, and it didn’t look very clean. So I crafted a custom vertical tabview to organize the information into specific tabs for description, images (dynamically loaded with AJAX), changelog, and downloads (also AJAX). The default tab, called “Vitals”, is a combination of the other tabs, showing general information, a shortened description, one image, and the number of total downloads. The succinctness of the vitals tab helps keep the tabview height down and thus all the items on the page look uniform.

Of course, it all looks rather good until you go to test in Internet Explorer. Despite my attempts to keep everything within standards, IE6 still has issues such as flickering tab-button background images and the always enjoyable broken box model. IE7 isn’t without its problems either; the tabviews seem to adversely affect my fixed-positioning hack from my last post.

I’ve also been redoing parts of the site to use more CSS and less inline formatting. Most of the web seems to be in love with CSS to the point of blindly using only CSS, but I tend to be more pragmatic about it. Certainly, CSS is useful for centralizing style information that is used repeatedly or as part of an overarching theme. But the CSS standard is not yet complete enough to handle everything a developer might want to do. I frequently need a property that tells an element to be springy (i.e., to fill the remaining height or width of its parent), but no such property exists in CSS2. A trick I like is to use (100% height) tables to keep something vertically centered on a page, or anchored at the bottom of a window but able to expand. CSS has no way to do these things; its vertical-align property only works on inline elements (and don’t get me started on margin hacks). So I think I’ve made my point: it’s a good start, but it’s not there yet. (Plus: IE6. So even if it were there, we still couldn’t use it.)

But I haven’t just been diddling web development lately; I’ve also been back at VB.Net to release a public beta of my much-slaved-over alarm program, cleverly named “Snake’s Alarm”. Not much has changed since I last worked on it in earnest in August 2007. I finally fixed the instability in the FMOD system when playing two alarm sounds concurrently by simply preventing it from doing so, figuring there wasn’t much use for two overlapping sounds anyway. I have also perfected the snooze feature by adding options to cap the total snooze time allowed and to turn off the monitor while snoozing. There’s still a lot left in the TODO file, but this version is completely functional and reliable.

In hardware news, I recently replaced my Radeon 9600XT with a GeForce 7300GT as a stopgap upgrade until I can finally afford a new system. It was seriously the best AGP nVidia card I could get on Newegg–they’re going like hot-cakes (whatever the hell that means). I had my eye on a 7600GS until it sold out just as I went to buy. Now the 7300GT I got is already sold out too. I wrote a lengthy review of the video card on Newegg about a week before it sold out (although one person did mark it helpful before then), which I’m republishing below.

Pros: I haven’t done a lot of benchmarks, but it looks to be about 60-120% faster than the Radeon 9600XT it replaced, depending on the game or benchmark, of course. I chose to switch to nVidia because this card supposedly runs cooler and draws less power than ATI’s final AGP offerings (and to prevent fanboy-ism). My tests with RivaTuner show the core runs a bit hot at idle (~116°F), but it only creeps up marginally in most games (~140°F). Video stress tests put it at about 166°F. Overall, the 7300GT is only modestly better in most newer games than its predecessor.

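The review quotes its temperatures in Fahrenheit, but GPU temps are more commonly discussed in Celsius. A quick conversion of the figures above (the formula is standard; the temperatures are the ones RivaTuner reported):

```python
# Convert the review's Fahrenheit readings to Celsius for easier comparison
# with the Celsius figures most GPU monitoring tools report.

def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32.0) * 5.0 / 9.0

idle, load, stress = f_to_c(116), f_to_c(140), f_to_c(166)
# Roughly 47°C at idle, 60°C in games, and 74°C under stress tests.
```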

What’s silly is that I’ve mostly been playing Diablo 2 (an eight-year-old game) since getting this new video card. I convinced Kaylen to play it with me, since it would run on just about any computer and she was in exile over winter break. It seems I got her hooked: we played all the way through with my Paladin and her Sorceress. Since that first completion, I’ve been poking around in the game’s data files for any changes I can make to perceived flaws.

My biggest complaint about Diablo 2 has always been that you level too frequently at the beginning and hardly ever later on. I built a huge spreadsheet with player experience, monster level, and level-to-area calculations, trying to come up with the best solution for a balanced and steady leveling system. One of the most telling graphs of this data is at right, showing the percentage increase in experience needed to reach the next level compared to the last. In vanilla Diablo 2, after level 11, the player needs 25% more experience to reach each subsequent level, which lengthens the process significantly as one approaches level 27, where the experience difference levels out at a more respectable 9%. I created a modification to the leveling system that merely smooths out the experience difference from level 5 to 30 and balances the resulting increased difficulty by lowering monster stats according to how far behind in levels the player is.
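The metric from the graph can be sketched in a few lines. This is just an illustration of the calculation, with a made-up experience table that mimics vanilla’s 25% growth; the real numbers live in Diablo 2’s data files:

```python
# Sketch of the leveling metric described above: for each level, the
# percentage increase in experience needed versus the previous level's
# requirement. The experience table here is fabricated for illustration.

def delta_growth(exp_table):
    """Percent growth of each level's experience delta over the last delta."""
    deltas = [b - a for a, b in zip(exp_table, exp_table[1:])]
    return [100.0 * (d2 / d1 - 1.0) for d1, d2 in zip(deltas, deltas[1:])]

# Vanilla-style table: each experience delta is 25% larger than the last.
vanilla = [0.0, 100.0]
for _ in range(6):
    vanilla.append(vanilla[-1] + (vanilla[-1] - vanilla[-2]) * 1.25)

growth = delta_growth(vanilla)
# Every entry is ~25%, matching the vanilla behavior the post describes;
# a smoothing mod would flatten this curve toward a single steady value.
```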

I’m not sure if it’s all as complicated as it sounds, but when I finally release the mod, I’ll be sure to include the spreadsheet for others to marvel at. I’ve also done a number of smaller mods and have already uploaded three such mods as of this post. One fixes the ever-annoying game font where the 5’s look like 6’s–a huge confusion when looking at item stats. More will follow as soon as they’re thoroughly tested in our new Barbarian and Assassin game. 😛

Posted in Benchmarking, Hardware, Modding, Programming, Website

Lessons in Overclocking

As I mentioned in my last post a week ago, I had a hunch that the CPU was the source of all my locking issues in games. Originally, I suspected the clock was too high at ~100 MHz over stock: 2158 MHz (166*13) overclocked to 2262 MHz (174*13). However, while researching Athlon XPs on Wikipedia, I noticed that none of the Thoroughbreds were rated above 1.65 V, yet I had mine set to 1.70 V for years. I can’t recall now why I raised the core voltage by 0.05 V; it could have been that my original overclock demanded it for stability, a perceived need for extra voltage headroom, or just my own ignorance at the time. It seems reasonable that the voltage is at least partly, if not completely, responsible for the instability. Increasing the voltage is often necessary to maintain higher clock rates, but it also adds to the thermal energy the CPU puts out, increasing the likelihood of overheating. I also speculate that increased voltage puts more demand on the capacitors that regulate the CPU’s power, potentially causing over-voltage failure; it just so happens that I noticed a couple of slightly burst capacitors recently (as seen in the photos below), and I wonder if there’s a correlation.
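The voltage/heat reasoning above can be sanity-checked with the usual rule of thumb that dynamic CPU power scales with frequency times voltage squared. The 60 W baseline here is an assumed ballpark for illustration, not a measured figure for this chip:

```python
# Rough check on the claim that extra voltage adds meaningful heat.
# Dynamic power follows P ~ C * V^2 * f, so scale a known (here: assumed)
# power figure to a new voltage/clock operating point.

def dynamic_power(base_watts, base_volts, base_mhz, volts, mhz):
    """Scale a baseline power figure to a different voltage and clock."""
    return base_watts * (volts / base_volts) ** 2 * (mhz / base_mhz)

stock = dynamic_power(60.0, 1.65, 2158, 1.65, 2158)  # 60 W by definition
mine  = dynamic_power(60.0, 1.65, 2158, 1.70, 2262)  # 1.70 V @ 2262 MHz
extra_pct = 100.0 * (mine / stock - 1.0)
# The overvolted, overclocked setting puts out roughly 11% more heat
# than stock -- a small voltage bump is not free.
```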

Since I still couldn’t be sure whether the problem was the core voltage, I planned a battery of tests for the day after Christmas to find a suitable CPU clock. I’d recently realized that my chipset and memory were both rated for a 200 MHz front-side bus, but I had been keeping it close to the CPU’s default of 166 MHz. Luckily, I got my 2700+ a couple of months before AMD started locking the multiplier by default; therefore, I would be able to drop the multiplier and raise the FSB, roughly maintaining the CPU clock while increasing the memory bandwidth. I was hoping this strategy would let me lower the CPU clock as much as necessary and make up the performance with increased bandwidth. The only unknown factor was possible increased CPU latency due to the lower multiplier.

I spent five or more hours running through my test battery, which consisted of Sandra 2005 CPU Arithmetic, CPU Multimedia, and Memory Bandwidth tests; some or all of 3DMark 2005’s game and CPU tests; and the Half-Life 2 benchmark (which apparently is only available through Counter-Strike: Source now). 3DMark’s GPU tests were understandably unhelpful, except later on when I discovered that the Firefly test was extremely memory-intensive. As I tried to find the maximum stable FSB clock, this test proved most helpful. Half-Life 2 showed little responsiveness to the increased memory bandwidth and was only slightly more affected by the CPU clock, even when the video card wasn’t the bottleneck. Sandra’s tests were the most consistently telling of raw performance. As expected, memory bandwidth scaled quite linearly with increased FSB clock. And although the multimedia benchmark was mostly useless, the arithmetic benchmark showed slight performance degradation due to a lower CPU multiplier; but in the end, it was much more affected by CPU clock speed.
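The linear bandwidth scaling Sandra showed is exactly what the theoretical numbers predict: DDR transfers twice per clock over a 64-bit bus, doubled again in dual-channel mode. These are theoretical peaks for illustration, not Sandra scores:

```python
# Back-of-the-envelope peak DDR bandwidth, to show why memory bandwidth
# scales linearly with FSB clock. Theoretical maximums, not benchmark results.

def ddr_bandwidth_mb_s(fsb_mhz, channels=2, bus_bytes=8):
    """Peak DDR bandwidth in MB/s: clock * 2 transfers/clock * bus width * channels."""
    return fsb_mhz * 2 * bus_bytes * channels

at_166 = ddr_bandwidth_mb_s(166)  # DDR333 dual-channel: 5312 MB/s peak
at_200 = ddr_bandwidth_mb_s(200)  # DDR400 dual-channel: 6400 MB/s peak
# Raising the FSB from 166 to 200 MHz buys a ~20% bandwidth increase.
```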

The benchmarking sequence I took was to decrease the multiplier by 0.5 each time and then bring the FSB up until the CPU clock was about 2200 MHz. However, when I reached 11*200, I noticed that HL2 and 3DMark were crashing strangely. I lowered the FSB until I brought these crashes under control (~190 MHz). However, at 11.5*190, the CPU clock was too low, so I raised the multiplier to 12 and set the FSB to 186 MHz for a comfortable 2232 MHz. This has proved very stable over the last week, so my next move is to bump the FSB up to 188, which would bring the CPU clock to nearly what it has been for the last few years and ultimately show that the core voltage was the problem all along. All these tests were run at the stock core voltage of 1.65 V.

The only problem with using a multiplier of 12 instead of 13 is that most programs identify my CPU as a 2400+ instead of a 2700+. But really, it performs more like a 3200+.

I’ve been enjoying my newfound stability by playing through Half-Life 2 Episodes 1 and 2 over the last week. Yes, Episode 2 was so good that I wanted to play it again. It was much easier to appreciate the game without random locking and with commentaries turned on. I was surprised how often Valve mentioned changing the game in response to the actions of playtesters. That seems like something more developers should pay attention to.

Posted in Benchmarking, Hardware, Troubleshooting

Ramalicious, bitches!

Yesterday, I received my Christmas gift to myself: a stick of (get ready for some specs, bitches) 1GB OCZ DDR400 (PC3200) CL2-3-2-5 Platinum Edition RAM with a copper heat spreader. Of course, I already had 2 sticks of 256MB GEiL DDR400 CL2-3-3-6 Ultra RAM in a dual-channel configuration on my Asus A7N8X Deluxe v2.0 mobo. I was skeptical that three sticks of differing RAM would work together in my mobo. The usual rule is that it’s not a good idea to fill all your RAM banks, even with high-quality memory; such setups can cause the system to fail to POST, or to crash or lock in memory-intensive apps. So I popped the new 1GB stick into the free slot and left the case in a work position in the likely case I needed to swap some sticks around. However, it POSTed fine, and I proceeded to set the optimal timings of 2-3-3-6, which was as low as the GEiL sticks could handle. It booted into Windows fine and then handled 6 hours of Dungeon Siege multiplayer hosting like a champ. Later on, still dazzled that all three sticks were working together, I booted into Memtest86+ and ran most of the tests. It handled those flawlessly as well.

I’m not sure if this is true or not, but on POST, it says the two GEiL sticks are still running in dual-channel. Whatever the case, it still performs on par with nVidia 333 MHz dual-channel systems in Sandra memory benchmarks. So there were no performance improvements in benchmarks, but you can really see the difference in gaming. Now I can max out texture settings in games with no memory hitching. Games can be minimized without memory swapping and exit instantly, and I can leave any number of explorer or browser windows open while playing. It’s nice. I give this OCZ stick of RAM a big thumbs up.
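Those CL2-3-2-5 and 2-3-3-6 timing figures are counts of memory-bus clock cycles, so their real-time cost depends on the bus clock. A quick conversion at DDR400’s 200 MHz bus clock:

```python
# Convert memory timings given in bus-clock cycles to nanoseconds.
# DDR400 runs a 200 MHz bus clock (data transfers at 400 MT/s).

def timing_ns(cycles, bus_mhz):
    """Latency of a timing parameter, in nanoseconds, at a given bus clock."""
    return cycles / bus_mhz * 1000.0

cas_ns  = timing_ns(2, 200.0)  # CL2 at 200 MHz: 10 ns to first data
tras_ns = timing_ns(6, 200.0)  # the GEiL sticks' tRAS of 6 cycles: 30 ns
```

This is why running the faster OCZ stick at the GEiL sticks’ looser 2-3-3-6 timings costs a few nanoseconds per access: the slowest stick sets the pace.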

I also got a Saitek Eclipse keyboard for Christmas, but I really could have used some of the other items on my list more (like a UPS, an X-Fi sound card, etc.). It’s good for gaming in the dark, but that’s about it; I wouldn’t recommend it just for the badass factor of it glowing blue. The glow actually makes it harder to read in decent or better lighting. I also don’t think it’s angled enough, which makes it harder to read in light or dark, because yes…I hunt and peck when I type. But I may prop it up more to fix the angle. Sometimes the wrist rest rattles, too. The keys feel nice, though; they’re about 33% softer than the ones on my previous MS keyboard. So, I’d only recommend it if you like to use your computer in the dark. Just don’t tell me what you’re doing in the dark if it’s not gaming. 😕

Also, Betty Crocker Warm Delights are the shiznit.

Posted in Benchmarking, Hardware

When Hard Drives Go Shitty: An XP Story

Around Saturday night, I was playing the Sims (don’t laugh! this game is crack) and copying music from my Seagate slave to the Sims’ stations directory on the Maxtor master. Then I noticed an odd stutter to the music in SCMPX, but I thought it was only because I had the game open at the time. Still, it was odd, as SCMPX always runs with an above-normal process priority (as dictated by my 1337 ShellMPX app). So, I go back to the Sims and it locks cold almost instantly. I restart XP, and then the Sims has no sound whatsoever. And whenever I open an app on the Maxtor, SCMPX stutters–and it almost never even skips, let alone stutters like that. 😕 By watching Task Manager, I could see 100% CPU usage going down whenever there was disk I/O occurring, yet no program, not even explorer.exe, was using much CPU at all. On Monday, I started assuming something was seriously fucked up, so I installed HD Tach, a great HDD benchmark that you can pick up here, and ran it on both drives. Here’s what I came up with:

These results were very disturbing, as you can probably tell. :( They pointed to the possibility of a total hard drive fault, given that the Seagate was unaffected while the Maxtor was. To try to rule that out, I did a complete scan of the Maxtor, updated my 4in1 drivers (which are now in executable form and called “Hyperion”…go figure), updated my SBLive driver package (which now comes with a hardware EQ and compressor…sweeeet), and ran a factory recertification test. I figured that if a driver had gotten corrupted, this would set it straight. But it was still pulling 100% CPU and only 4MB/s in HD Tach.

So, I slept on it and then decided at school to rule out a hardware fault by transplanting the Maxtor into my sister’s new rig. I did, and luckily it worked fine. When I went to put the Maxtor back in my rig, I reversed its bay position with the Seagate so I wouldn’t have to use my funky IDE cable voodoo anymore.

Then, as a final hope before having to reinstall XP, I scoured the XP newsgroups looking for someone with a similar problem. After about an hour, I came across a thread where the dude talked about having slow HDD performance, and the answer posted was to reinstall the Primary IDE Channel. So, I Win+Break to Device Manager, look at the Primary IDE Channel, and sure enough, device 0 was listed as PIO Mode 1 (which tops out at 5MB/s and uses the CPU instead of the chipset to handle transfers) instead of DMA Mode 3 (theoretical 66MB/s max). So I killed the Primary IDE Channel, restart (taking 1min 9sec), install it again, restart (taking 52sec), and everything is fine and dandy:
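To put the PIO fallback in perspective, here is a quick sketch of what those transfer rates mean for an everyday copy. The file size and the DMA-mode rate are assumed round numbers, not measurements from this machine:

```python
# How long the same copy takes at the ~4 MB/s HD Tach measured in PIO
# versus an assumed sustained DMA-mode rate. Illustrative figures only.

def copy_seconds(size_mb, rate_mb_s):
    """Time in seconds to move size_mb of data at a sustained rate."""
    return size_mb / rate_mb_s

album = 700.0                     # a CD's worth of MP3s, in MB (assumed)
slow = copy_seconds(album, 4.0)   # PIO Mode 1, as measured by HD Tach
fast = copy_seconds(album, 40.0)  # assumed realistic DMA-mode rate
# 175 s versus 17.5 s -- a 10x slowdown, with the CPU doing all the work.
```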

This compressor sounds really cool on rock songs, but is a little too problematic for most NIN songs, except for The Becoming. The more disturbing The Becoming, the better 😉 .

Posted in Benchmarking, Hardware, Troubleshooting