Category Archives: Programming

Steam Widget in the Wild!

Steam Widget on WordPress.org

As I said I would last weekend, I got my new Steam Widget up on WordPress.org.  I still want to add some things, like stats links and a currently-in-game indicator, but I think it’s a good first release.

Also, I forgot to mention this in my last post, but it’s still pretty cool: a German guy did a drum remix using my remastered versions of the Deus Ex UNATCO and Area 51 themes and posted them on YouTube a few months ago.  Check it out.

Posted in Media, Programming | Tagged , , , | Leave a comment

Cursor Lock and Steam Widgets

I just did a quick update to Cursor Lock to fix a painfully obvious bug that I somehow overlooked.  Thanks to the person that googled “cursor lock strict mode does not create shortcut” today! 😉

Steam Widget

Also, I noticed you noticed my Steam widget.  I just added it over the weekend and people are already clicking through on the game links.  I based the widget on my code for Sitewide Recent Images (more about that on my work blog), so it supports the same caching and template options, which the other two Steam widgets on WordPress.org lack.  I hope to release it over the weekend.

Posted in Programming, Website | Tagged , , | Leave a comment

We have explosive.

(i.e. Lightboxes)

I’ve been really into web development lately, especially anything requiring a lot of JavaScript control of the DOM and CSS control over layout and style. I’ve spent a solid week (at least) working on adding lightboxes to replace some of my lighter scripts: image viewing and downloads. I’ve been pretty fascinated by lightboxes ever since I saw them on addons.mozilla.org. The idea to actually employ one on this site didn’t come to me until I was doing the remake of the content system and thought it would be awesome to combine a lightbox with the ability to preview the contents of archives (rar and zip files).

Although there are tens if not hundreds of lightbox implementations, I felt like creating my own to avoid all the code bloat and because I’m that much of a control freak. The first hurdle was figuring out how the hell these other lightboxes could be triggered on the click of a link without navigating to the link URL immediately after the script finishes. Googling didn’t turn up any leads, but I eventually found out the answer by just reading the code comments of another lightbox. It’s simple and makes sense but isn’t obvious: the onclick function must return false. It kinda works like a message chain in Windows.
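
Here’s a minimal sketch of that trick; the element id and the openLightbox() helper are hypothetical stand-ins, not the site’s actual code:

//attach to an existing link; returning false cancels the default navigation,
//so the lightbox opens instead of the browser following the href
var link = document.getElementById('screenshot-link'); //hypothetical link id
link.onclick = function() {
    openLightbox(this.href); //hand the link's URL to the (hypothetical) lightbox opener
    return false;            //the crucial part: stop the click from navigating away
};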

From there, I just kept chugging out JavaScript. On DOM readiness, the script attaches onclick handlers that trigger the lightbox to any existing links to image or download pages. On click, the script makes the appropriate HTTP request (AJAX) for the content to fill the lightbox. When the response arrives, it puts the HTML into a lightbox container that automatically enlarges to fit the content. Meanwhile, the script fades in the obligatory black overlay; I chose to fade it in not only because it looks awesome but also because it helps your eyes adjust to the change in light. It’s all a delicate ballet of scripting, but surprisingly IE handles it quite well with only minimal hassles (e.g. filter: alpha(opacity=#); instead of opacity: #.#; for the overlay opacity in CSS).
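
Continuing the hypothetical openLightbox() from the sketch above, the overall flow might look roughly like this (the element ids, fade timing, and bare XMLHttpRequest are assumptions, not the actual script):

//rough sketch of the lightbox flow: fade in the overlay, fetch the content, fill the box
function openLightbox(url) {
    var overlay = document.getElementById('overlay'); //the black overlay div (hypothetical id)
    var box = document.getElementById('lightbox');    //the lightbox container (hypothetical id)

    //fade the overlay in gradually; easier on the eyes than an instant change in light
    overlay.style.display = 'block';
    var opacity = 0;
    var timer = setInterval(function() {
        opacity += 0.1;
        overlay.style.opacity = opacity; //standards browsers
        overlay.style.filter = 'alpha(opacity=' + Math.round(opacity * 100) + ')'; //IE
        if (opacity >= 0.8) clearInterval(timer);
    }, 50);

    //request the content and drop the returned HTML into the lightbox container
    var xhr = new XMLHttpRequest();
    xhr.open('GET', url, true);
    xhr.onreadystatechange = function() {
        if (xhr.readyState == 4 && xhr.status == 200) {
            box.innerHTML = xhr.responseText; //the container then enlarges to fit the content
            box.style.display = 'block';
        }
    };
    xhr.send(null);
}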

It didn’t stop with lightboxes, though. I’ve also wanted for a while to have a display of the most recently uploaded images on the front page but was underwhelmed by the prospect of cramming only a few images on there. Then I got to thinking about how I might make a scrolling marquee for the images and realized it wasn’t too hard to code. You simply need an inner container for the images with position: relative and an outer container with overflow-x: hidden, and then, on the scroll buttons’ click events, you move the inner container’s style.left property by the negative of the amount to scroll. Then, obviously, you have to do some code to detect the beginning and end of the marquee to keep it within bounds, among other things. Quite snazzy.
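
A minimal sketch of the marquee idea, with hypothetical ids and a fixed scroll step (assuming CSS along the lines of #viewport { overflow-x: hidden; } and #strip { position: relative; }):

//move the inner image strip left or right and clamp it to the marquee bounds
function scrollMarquee(step) {
    var viewport = document.getElementById('viewport'); //outer container, overflow-x: hidden
    var strip = document.getElementById('strip');       //inner container holding the images
    var left = parseInt(strip.style.left || '0', 10) - step; //more negative = scrolled further

    //keep the strip within bounds: never past the start, never past the end
    var minLeft = -(strip.scrollWidth - viewport.clientWidth);
    if (left > 0) left = 0;
    if (left < minLeft) left = minLeft;

    strip.style.left = left + 'px';
}
//wired to the scroll buttons, e.g. onclick="scrollMarquee(150)" and onclick="scrollMarquee(-150)"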

I also made another slight change to the front page (beyond adding the latest blogs). I was a little displeased by my method of finding the most popular content, given that it merely sorts the database by total downloads. Thus, it’s not at all responsive to changing trends. For example, if file A gets 1000 downloads over five years but not much recently, and file B has only 300 downloads over a couple months but gets several hits a day, then file B is obviously more popular than file A. The best solution for calculating what’s more popular would be to log all the hits for a day for every file and calculate popularity trends often, but that’s a logistical nightmare and too much hassle for this small site. However, I came up with a simple solution that requires only one new field in the database (last_dl_count) and a monthly cron job to run UPDATE `content` SET `last_dl_count`=`dl_count`;. Finally, I switched the front page popular query to the following:

SELECT `id`, `title`, `sshots`, `dl_count`, (`dl_count`-`last_dl_count`) as `delta_count` FROM `content` WHERE `type` = '$type' ORDER BY `delta_count` DESC, `dl_count` DESC LIMIT 2;

It works fairly well, except at the beginning of the next period after the update cron runs. Since all the deltas are 0, you get only the all-time popular again until someone downloads something.

Well, I think that’s enough web developer theory for now. However, I’d like to point out three academic columns I recently added. There’s one from Fortran Programming class with all my source code and most of my documentation. The second is on the Parallel Programming with PVM project I did last year, including a Flash slideshow (first mentioned here), presentation notes, and source code. The last is a paper I wrote on the Aspects of Overpopulation, a subject that greatly concerns me; too bad the class it was for was completely worthless.

Posted in Academics, Programming, Website | Tagged , , , | Leave a comment

Biding Time

Until President Obama waves his hand and magically fixes the economy so I can find a job, I’ve been biding my time with several projects as usual. As mentioned in my post a couple months ago, I expected to complete a major overhaul of the content system by the end of last year and amazingly actually did so. I felt like the content pages were too bulky with the varying number of images and description lengths, and it didn’t look very clean. So I crafted a custom vertical tabview to organize the information into specific tabs for description, images (dynamically loaded with AJAX), changelog, and downloads (also AJAX). The default tab, called “Vitals”, is a combination of the other tabs, showing general information, a shortened description, one image, and the number of total downloads. The succinctness of the vitals tab helps keep the tabview height down and thus all the items on the page look uniform.
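
A stripped-down sketch of how such a tab switch might work; the tab names match the post, but the ids and the loadTabContent() helper are hypothetical, not the actual site code:

//show the requested tab panel, hide the rest, and lazily AJAX-load the heavy ones
function showTab(tabId) {
    var tabs = ['vitals', 'description', 'images', 'changelog', 'downloads'];
    for (var i = 0; i < tabs.length; i++) {
        var panel = document.getElementById('tab-' + tabs[i]);
        panel.style.display = (tabs[i] == tabId) ? 'block' : 'none';
    }
    //images and downloads are only fetched the first time their tab is opened
    var active = document.getElementById('tab-' + tabId);
    if ((tabId == 'images' || tabId == 'downloads') && active.innerHTML == '') {
        loadTabContent(tabId); //hypothetical AJAX loader that fills the panel
    }
}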

Of course, it all works rather well until you go to test in Internet Explorer. Despite my attempts to keep everything within standards, IE6 still has issues such as flickering tab-button background images and the always enjoyable broken box model. However, IE7 isn’t without its problems either, and the tabviews seem to adversely affect my fixed-positioning hack from last post.
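
For what it’s worth, one widely known workaround for IE6’s flickering background images is to tell it to cache them instead of re-requesting on every state change; just an aside, not necessarily the fix used here:

//widely known IE6-only workaround: cache background images to stop hover flicker
try {
    document.execCommand("BackgroundImageCache", false, true);
} catch (e) {
    //not IE, or the command isn't supported; nothing to do
}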

I’ve also been redoing parts of the site to use more CSS and less inline formatting. Most of the web seems to be in love with CSS to the point of blindly using only CSS, but I tend to be more pragmatic about it. Certainly, CSS is useful for centralizing style information that is used repeatedly or as part of an overarching theme. But the CSS standard is not quite complete enough to handle everything a developer might want to do. I frequently need a property that tells an element to be springy (i.e. to fill up the remaining height or width of its parent), but there exists no such property in CSS2. A trick I like to do with (100% height) tables is to use them to keep something vertically centered in a page, or at the bottom of a window but able to expand. CSS has no way to do these things; its vertical-align property only works on inline elements (and don’t get me started on margin hacks). So I think I’ve made my point: it’s a good start, but it’s not there yet. (Plus: IE6. So even if it were there, we still couldn’t use it.)

But I haven’t just been diddling with web development lately; I’ve also been back at VB.Net to release a public beta of my much-slaved-over alarm program, cleverly named “Snake’s Alarm”. Not much has changed since I last worked on it in earnest in August 2007. I finally fixed the instability in the FMOD system when playing two alarm sounds concurrently by simply preventing it from doing so, figuring there wasn’t much use for two overlapping alarm sounds anyway. I have also perfected the snooze feature by adding options to control the maximum amount of snooze time allowed and to turn off the monitor when snoozing. There’s still a lot left in the TODO file, but this version is completely functional and reliable.

In hardware news, I recently replaced my Radeon 9600XT with a GeForce 7300GT as a stopgap upgrade until I can finally afford a new system. It was seriously the best AGP nVidia card I could get on Newegg; they’re going like hot-cakes (whatever the hell that means). I had my eye on a 7600GS until it sold out right when I went to buy. Now the 7300GT that I got is already sold out too. I wrote a lengthy review of the video card on Newegg about a week before it sold out (though only one person had labeled it as helpful by then), which I’m going to republish below.

Pros: I haven’t done a lot of benchmarks, but it looks to be about 60-120% faster than the Radeon 9600XT it replaced, depending on the game or benchmark of course. I chose to switch to nVidia because this card supposedly runs cooler and draws less power than ATI’s final AGP offerings (and to prevent fanboy-ism). My tests with RivaTuner show the core runs a bit hot at idle (~116°F), but it only creeps up marginally in most games (~140°F). Video stress tests put it at about 166°F. Overall, the 7300GT’s performance is only somewhat noticeably better than its predecessor’s in most newer games.

Read More…

What’s silly is that I’ve mostly been playing Diablo 2 (an eight-year-old game) since getting this new video card. I convinced Kaylen to play it with me, since it would run on just about any computer and she was in exile over winter break. It seems I got her hooked, because we played all the way through with my Paladin and her Sorceress. Since that first completion, I’ve been poking around in the game’s data files for any changes I can make to perceived flaws.

My biggest complaint about Diablo 2 has always been that you level too frequently at the beginning and hardly ever later on. I did a huge spreadsheet with player experience, monster level, and level-to-area calculations trying to come up with the best solution for a balanced and steady leveling system. One of the most telling graphs of this data is at right, showing the percentage increase in experience needed to get to the next level compared to the last level. In vanilla Diablo 2, after level 11, the player needs 25% more experience to get to each subsequent level, which can lengthen the process significantly as one approaches level 27, where the experience difference levels out at a more respectable 9%. I created a modification to the leveling system that merely smooths out the experience difference from level 5 to 30 and balances the resultant increased difficulty by lowering monster stats according to how far behind in levels the player is.

I’m not sure if it’s all as complicated as it sounds, but when I finally release the mod, I’ll be sure to include the spreadsheet for others to marvel at. I’ve also done a number of smaller mods and have already uploaded three such mods as of this post. One fixes the ever-annoying game font where the 5’s look like 6’s–a huge confusion when looking at item stats. More will follow as soon as they’re thoroughly tested in our new Barbarian and Assassin game. 😛

Posted in Benchmarking, Hardware, Modding, Programming, Website | Tagged , , | Leave a comment

More Fun Times with Browsers

Unfortunately, the browser share statistics I gave last time were incorrect. Although I did my best to try to remove erroneous hits from web crawlers, site grabbers, and the like (which don’t constitute valid visitor hits), I overlooked a growing trend in hits from (what I suspect are) botnets trying to exploit URL parameters that may be used to pass in filenames to a script that subsequently includes that file (i.e. executes the file). Of course, none of my scripts have that glaring vulnerability in them, but the zombies try it anyways, creating bogus hits. The reason I missed these hits earlier was that their volume per distinct IP was low enough such that I passed over deleting them on initial inspection of the log data. Of course, crawlers like Google will often have thousands of hits per IP, which is simple enough to identify and remove.

However, swarms of zombie computers weren’t the only reason my data was off. Microsoft’s search engine also uses subversive techniques to get in under the radar. From a huge pool of IPs under 65.55., they continually run bogus searches on sites like mine using a User Agent that doesn’t identify itself as a bot but rather looks like a normal (of course Windows) user. I would just block MS’s range altogether, but I suspect that would just anger their search engine. They may have reasonable motives for doing these search checks (e.g. to make sure sites display the same content to the crawler that they do to a visitor), but it’s an annoying system nonetheless.

Anyways, I present to you the revised graphs from the cleaner data below. Thankfully, it looks like users are moving towards IE7 and away from IE6 more than I previously reckoned. Also, Firefox seems to be gaining on both flavors of IE more than expected. Meanwhile, Safari and Opera are not to be counted out. However, Netscape is still finding a new definition of pain and suffering as it is slowly digested over a thousand years.

To be sure this new data was reasonably accurate, I found a neat site that analyzes internet market shares at major websites. Their data does concur with mine; however, due to my enthusiast-oriented content, I get a lot more Firefox visitors than mainstream sites do. This is perfectly acceptable, though. 😉

Actually, the reason I was looking at the log data again to begin with was that I got interested in operating system shares after reading a revisit of Vista’s suckiness in MaxPC. Thankfully, I’m not the only one holding out on upgrading my OS until something significantly better comes along. Unfortunately, a lot of people are having Vista shoved down their throats just because they want a new computer. …It’s been a while since Microsoft did something that didn’t piss me off.

Even though Internet Explorer 7’s adoption is on the rise, web developers can’t rejoice yet, as IE7 is still a faulty product in its own right. Since I’ve been testing my new layout in IE7, I’ve noticed a strange scrolling bug more and more. It is characterized by an incomplete scrollbar and frequently stuck mouse wheel scrolling. It wasn’t until I tested my new version of the Deus Ex guide (a very long page) that I realized how much of an issue this bug was. Googling around, I could find barely a mention of the bug on all the intertubes. However, one bizarrely formatted page on a list of IE7 bugs (#41: Infinite loop related to overflow and position: fixed) did outline the problem quite well. Unfortunately, there’s no simple fix beyond scrapping the whole fixed-positioning layout. Reportedly, the only way to remedy the situation is to scroll to the bottom of the page, which could take a poor visitor several minutes on my DX guide.

However, after thinking about it for a little while, I realized there might be a way to force the page to scroll to the bottom and then back up in JavaScript on page load. I recently used a scrollIntoView call in my new content system overhaul that did just that. It worked brilliantly. And with a little creative use of the DOM, the script can be dropped into any page with a fixed-positioned content frame to overcome this bug automatically. Unfortunately, the mouse wheel scrolling issue remains; can’t fix everything with the DOM.

//trigger this function with body onload and onresize events
 function IE7ScrollBarHack() {
     //get the main content frame from id
     var mainFrame = document.getElementById('main');
     var ie7hackanchor;

     //check if our span has already been created
     if ((ie7hackanchor = document.getElementById('hackanchor')) == null) {
         //create a span and append it to the content frame
         ie7hackanchor = document.createElement("span");
         ie7hackanchor.setAttribute("id", "hackanchor");
         mainFrame.appendChild(ie7hackanchor);
     }

     //scroll down to that span and then return
     var prevScroll = mainFrame.scrollTop;
     if (ie7hackanchor != null) {
         ie7hackanchor.scrollIntoView(false);
         mainFrame.scrollTop = prevScroll;
     }
 }
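
For completeness, the wiring described in the first comment can also be done in script rather than in the body tag (this bit isn’t from the original post):

//equivalent wiring in script, if you'd rather not touch the body tag
window.onload = IE7ScrollBarHack;
window.onresize = IE7ScrollBarHack;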

Of course, it’s probably best if you only output this function to IE7 visitors, using whatever server-side scripting language you have. I usually use something like this (in PHP):

$IsIE7 = preg_match('/^Mozilla\/\d+\.0 \(compatible; MSIE 7\.0/i', $HTTP_USER_AGENT);
Posted in Programming, Website | Tagged , | Leave a comment

SnakeByte Studios Endorses Obama-Biden

The election will surely be a landslide for the Democrats with this crucial endorsement!

But seriously, I’ve been behind Obama since the primaries. My decision is based on his keen intellect, technological prowess, and inspirational charisma (and, you know, the fact that I’m a Democrat). I often wonder how anyone in my field can even support McCain, who can barely use a computer, over Barack Obama, who embodies the young, ambitious, and intellectual spirit of most programmers.

I normally wouldn’t discuss politics on my website and have continually shaken off the urge to do so recently, but the fast yet seemingly endless approach of the presidential election has gotten me so excited that I’ve turned into a political junkie over the past few months. Every morning, I scour the web for hot political news; every evening, I’m fixed to MSNBC’s line-up; and every waking hour, I’m watching for new poll data. It’s gotten so bad that I’m starting to write programs and spreadsheets with election data.

Friday, I randomly decided to start crunching numbers on the differences in spread between the 2004 election and the 2008 election predictions. I grabbed the predictions with a Regex on Pollster.com‘s fine poll trend data and the results from the Federal Election Commission for 2004 and mashed them together in Excel to get the spreads. Then, I got the crazy idea of seeing the data visually. So from there, I made a VB.Net script to take the data and decide what color to make a state (with the typical Blue equals more Democratic and Red equals more Republican) by adjusting hue, saturation, and luminance. Unfortunately, the linear hue equation made the map mostly purple, which didn’t convey the key point that the country is turning Democratic this cycle. So I switched to a logarithmic hue equation that keeps the purples towards the very center. For anyone interested, here’s the code I used to find the colors. (Where “x” is positive for more Democratic and negative for more Republican.)

Dim MaxDemHue As Integer = 147 'hue when input is MaxShift in the democratic direction
Dim MaxRepHue As Integer = 255 'hue when input is MaxShift in the republican direction
Dim MaxSat As Integer = 255 'saturation when input is MaxShift
Dim MinSat As Integer = 160 'saturation when input is 0
Dim MaxLum As Integer = 92 'luminance when input is MaxShift
Dim MinLum As Integer = 207 'luminance when input is 0
Dim CenterHue As Integer = ((MaxRepHue - MaxDemHue) / 2) + MaxDemHue 'hue when input is 0
Dim MaxShift As Integer = 40 'the max spread in either direction

Dim HueShift As Double = ((MaxShift * 0.262) * Math.Log(Math.Abs(x))) + 3.9138
If HueShift < 0 Then HueShift = 0 'possible that it could go negative with log
hsl.H = CenterHue - (Math.Sign(x) * HueShift)
hsl.S = ((Math.Abs(x) / MaxShift) * (MaxSat - MinSat)) + MinSat
hsl.L = ((Math.Abs(x) / MaxShift) * (MaxLum - MinLum)) + MinLum

Initially, I manually put the state colors into an image. This proved to be a bit time-consuming, so I moved to a vector art format called SVG, which, because it’s merely XML, can be changed programmatically. With each state shape having a unique ID, I can just XPath to it in code and change the color with a Regex. Now for the results.
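
The real thing was done in VB.Net with XPath and a Regex; purely as an illustration of the same id-based recoloring, a rough browser-JavaScript equivalent might look like this (the svgSource string, the state id, and the color are placeholders):

//parse the SVG (which is just XML), find a state shape by its unique id, and recolor it
var parser = new DOMParser();
var svgDoc = parser.parseFromString(svgSource, 'image/svg+xml'); //svgSource: the SVG file as a string
var state = svgDoc.querySelector('[id="OH"]');                   //e.g. the Ohio shape
state.setAttribute('fill', 'rgb(120, 120, 220)');                //color computed from the spread
var recoloredSvg = new XMLSerializer().serializeToString(svgDoc);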

Looking at the map, it seems quite clear that indeed the majority of the country is trending Democratic this year. Most of the states are 5 to 10% more for Obama than they were for Kerry. There are some notable exceptions, though. The northeastern states are surprisingly not much more Democratic (or even more Republican in the case of Massachusetts and Rhode Island); I’m guessing this is because they already turned out very strongly for Kerry last election. Similar to Mass. is Arizona, where McCain has the home-team advantage. That leaves Tennessee and Arkansas as the only other states not trending more Democratic, which may be explained by racial factors and population demographics.

Just for the hell of it, I also plugged the raw prediction data into my script to produce a map with a bit more information than the standard electoral ones you see floating around.

I’ve worked on the new version of Cursor Lock recently and a release is imminent (hope I didn’t say that last time I mentioned Cursor Lock). A new installer is already made and all the known bugs are fixed, so there’s just the matter of documentation left. If I can peel myself away from the polls, I’ll hopefully have the new release done before November 4th.

Posted in Programming | Tagged , , , | Leave a comment