Category Archives: Website

We have explosive.

(i.e. Lightboxes)

I’ve been really into web development lately, especially anything requiring a lot of JavaScript control of the DOM and CSS control over layout and style. I’ve spent a solid week (at least) working on adding lightboxes to replace some of my lighter scripts: image viewing and downloads. I’ve been pretty fascinated by lightboxes ever since I saw them on addons.mozilla.org. The idea to actually employ one on this site didn’t come to me until I was doing the remake of the content system and thought it would be awesome to combine a lightbox with the ability to preview the contents of archives (rar and zip files).

Although there are tens if not hundreds of lightbox implementations, I felt like creating my own to avoid all the code bloat and because I’m that much of a control freak. The first hurdle was figuring out how the hell these other lightboxes could be triggered on the click of a link without navigating to the link URL immediately after the script finishes. Googling didn’t turn up any leads, but I eventually found out the answer by just reading the code comments of another lightbox. It’s simple and makes sense but isn’t obvious: the onclick function must return false. It kinda works like a message chain in Windows.
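In code, the trick boils down to something like this (a minimal sketch; the link lookup and the openLightbox function are placeholders rather than my actual script):

//grab some existing link and hijack its click
var someLink = document.getElementById('example-link');  //placeholder id
someLink.onclick = function () {
    openLightbox(this.href);  //hypothetical function that builds the lightbox
    return false;             //cancel the default navigation to the href
};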

From there, I just kept chugging out JavaScript. On DOM readiness, the script attaches onclick handlers to any existing links to image or download pages so they trigger the lightbox. On click, the script makes the appropriate HTTP request (AJAX) for the content to fill the lightbox. When the response arrives, it puts the HTML into a lightbox container that automatically enlarges to fit the content. Meanwhile, the script fades in the obligatory black overlay; I chose to fade it in not only because it looks awesome but also because it helps your eyes adjust to the change in light. It's all a delicate ballet of scripting, but surprisingly IE handles it quite well with only minimal hassles (e.g. filter: alpha(opacity=#); instead of opacity: #.#; for the overlay opacity in CSS).
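Stripped way down, the openLightbox placeholder from the earlier snippet might look roughly like this (still a sketch: the element ids, timer values, and final overlay opacity are invented for illustration):

//rough outline of the click-to-lightbox flow; ids and helper details are made up
function openLightbox(url) {
    var overlay = document.getElementById('overlay');
    var box = document.getElementById('lightbox');

    //fetch the lightbox content with a plain XMLHttpRequest (ActiveX fallback for old IE)
    var xhr = window.XMLHttpRequest ? new XMLHttpRequest()
                                    : new ActiveXObject('Microsoft.XMLHTTP');
    xhr.onreadystatechange = function () {
        if (xhr.readyState == 4 && xhr.status == 200) {
            box.innerHTML = xhr.responseText;  //container grows to fit content
            box.style.display = 'block';
        }
    };
    xhr.open('GET', url, true);
    xhr.send(null);

    //meanwhile, fade in the black overlay by stepping its opacity
    overlay.style.display = 'block';
    var opacity = 0;
    var timer = setInterval(function () {
        opacity += 0.1;
        overlay.style.opacity = opacity;  //standards browsers
        overlay.style.filter = 'alpha(opacity=' + Math.round(opacity * 100) + ')';  //IE
        if (opacity >= 0.7) clearInterval(timer);
    }, 50);
}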

It didn't stop with lightboxes, though. I've also been wanting for a while to have a display of the most recently uploaded images on the front page but was underwhelmed by the prospect of cramming only a few images on there. Then I got to thinking about how I might make a scrolling marquee for the images and realized it wasn't too hard to code. You simply need an inner container for the images with position: relative and an outer container with overflow-x: hidden, and on button click events you move the inner container's style.left property by the negative of the amount to scroll. Then, obviously, you have to add some code to detect the beginning and end of the marquee to keep it within bounds, among other things. Quite snazzy.
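The core of it is just a handful of lines, something like this (again a sketch; the ids and step size are placeholders):

//the outer div has overflow-x: hidden, the inner div has position: relative
function scrollMarquee(amount) {
    var inner = document.getElementById('marquee-inner');  //placeholder ids
    var outer = document.getElementById('marquee-outer');

    //current offset (style.left is a string like "-120px")
    var left = parseInt(inner.style.left || '0', 10) - amount;

    //clamp so the strip can't scroll past its beginning or end
    var minLeft = outer.offsetWidth - inner.offsetWidth;  //a negative number
    if (left > 0) left = 0;
    if (left < minLeft) left = minLeft;

    inner.style.left = left + 'px';
}

The right and left buttons then just call scrollMarquee(100) and scrollMarquee(-100), or whatever step size looks right.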

I also made another slight change to the front page (beyond adding the latest blogs). I was a little displeased with my method of finding the most popular content, given that it merely sorts the database by total downloads. Thus, it's not at all responsive to changing trends. For example, if file A gets 1000 downloads over five years but not much recently, and file B has only 300 downloads over a couple months but gets several hits a day, then file B is obviously more popular than file A. The best way to calculate what's popular would be to log every file's hits each day and recalculate popularity trends often, but that's a logistical nightmare and too much hassle for this small site. However, I came up with a simple solution that requires only one new field in the database (last_dl_count) and a monthly cron job to run UPDATE `content` SET `last_dl_count`=`dl_count`;. Finally, I switched the front page popular query to the following:

SELECT `id`, `title`, `sshots`, `dl_count`, (`dl_count`-`last_dl_count`) as `delta_count` FROM `content` WHERE `type` = '$type' ORDER BY `delta_count` DESC, `dl_count` DESC LIMIT 2;

It works fairly well, except at the beginning of the next period after the update cron runs. Since all the deltas are 0, you get the all-time most popular items again until someone downloads something.

Well, I think that's enough web developer theory for now. However, I'd like to point out three academic columns I recently added. There's one from my Fortran Programming class with all my source code and most of my documentation. The second is on the Parallel Programming with PVM project I did last year, including a Flash slideshow (first mentioned here), presentation notes, and source code. The last is a paper I wrote on the Aspects of Overpopulation, a subject that greatly concerns me; too bad the class it was for was completely worthless.

Posted in Academics, Programming, Website

Biding Time

Until President Obama waves his hand and magically fixes the economy so I can find a job, I’ve been biding my time with several projects as usual. As mentioned in my post a couple months ago, I expected to complete a major overhaul of the content system by the end of last year and amazingly actually did so. I felt like the content pages were too bulky with the varying number of images and description lengths, and it didn’t look very clean. So I crafted a custom vertical tabview to organize the information into specific tabs for description, images (dynamically loaded with AJAX), changelog, and downloads (also AJAX). The default tab, called “Vitals”, is a combination of the other tabs, showing general information, a shortened description, one image, and the number of total downloads. The succinctness of the vitals tab helps keep the tabview height down and thus all the items on the page look uniform.
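The tab switching itself is nothing exotic; conceptually it goes something like this (the ids and the ajaxLoad helper are illustrative stand-ins, not the actual markup or code):

//show one tab pane and hide the rest; lazily fetch the AJAX-backed tabs
function showTab(name) {
    var tabs = ['vitals', 'description', 'images', 'changelog', 'downloads'];
    for (var i = 0; i < tabs.length; i++) {
        var pane = document.getElementById('tab-' + tabs[i]);  //placeholder ids
        pane.style.display = (tabs[i] == name) ? 'block' : 'none';
    }

    //images and downloads are only fetched the first time they're opened
    var active = document.getElementById('tab-' + name);
    if ((name == 'images' || name == 'downloads') && !active.innerHTML) {
        ajaxLoad('content.php?tab=' + name, active);  //hypothetical AJAX helper
    }
}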

Of course, it all looks rather nice until you go to test in Internet Explorer. Despite my attempts to keep everything within standards, IE6 still has issues such as flickering tab button background images and the always enjoyable broken box model. However, IE7 isn't without its problems either, and the tabviews seem to adversely affect my fixed positioning hack from the last post.

I've also been redoing parts of the site to use more CSS and less inline formatting. Most of the web seems to be in love with CSS to the point of blindly using only CSS, but I tend to be more pragmatic about it. Certainly, CSS is useful for centralizing style information that is used repeatedly or as part of an overarching theme. But the CSS standard is not quite complete enough to handle everything a developer might want to do. I frequently need a property that tells an element to be springy (i.e. to fill up the remaining height or width of its parent), but no such property exists in CSS2. A trick I like to do with (100% height) tables is to use them to keep something vertically centered on a page, or pinned to the bottom of a window while still able to expand. CSS has no way to do these things; its vertical-align property only works on inline elements (and don't get me started on margin hacks). So I think I've made my point: it's a good start, but it's not there yet. (Plus: IE6. So even if it were there, we still couldn't use it.)

But I haven't just been diddling with web development lately; I've also been back in VB.Net to release a public beta of my much-slaved-over alarm program, cleverly named "Snake's Alarm". Not much has changed since I last worked on it in earnest in August 2007. I finally fixed the instability in the FMOD system when playing two alarm sounds concurrently by simply preventing it from doing so, figuring there wasn't much use for two overlapping alarms. I also perfected the snooze feature by adding options to control the maximum amount of snooze time allowed and to turn off the monitor while snoozing. There's still a lot left in the TODO file, but this version is completely functional and reliable.

In hardware news, I recently replaced my Radeon 9600XT with a GeForce 7300GT as a stopgap upgrade until I can finally afford a new system. It was seriously the best AGP nVidia card I could get on Newegg; they're going like hot-cakes (whatever the hell that means). I had my eye on a 7600GS until it sold out right when I went to buy. Now the 7300GT that I got is already sold out too. I wrote a lengthy Newegg review for the video card about a week before it sold out (though only one person had marked it helpful by then), which I'm going to republish below.

Pros: I haven't done a lot of benchmarks, but it looks to be about 60-120% faster than the Radeon 9600XT it replaced, depending on the game or benchmark of course. I chose to switch to nVidia because this card supposedly runs cooler and draws less power than ATI's final AGP offerings (and to prevent fanboy-ism). My tests with RivaTuner show the core runs a bit hot at idle (~116°F) but only creeps up marginally in most games (~140°F). Video stress tests put it at about 166°F. Overall, the 7300GT's improvement over its predecessor is only somewhat noticeable in most newer games.

Read More…

What's silly is that I've mostly been playing Diablo 2 (an eight-year-old game) since getting this new video card. I convinced Kaylen to play it with me, seeing as it would run on just about any computer and she was in exile over winter break. It seems I got her hooked, since we played all the way through with my Paladin and her Sorceress. Since that first completion, I've been poking around in the game's data files for any changes I can make to perceived flaws.

My biggest complaint about Diablo 2 has always been that you level too frequently at the beginning and hardly ever later on. I did a huge spreadsheet with player experience, monster level, and level-to-area calculations trying to come up with the best solution for a balanced and steady leveling system. One of the most telling graphs of this data is at right, showing the percentage increase in experience needed to get to the next level compared to the last level. In vanilla Diablo 2, after level 11, the player needs 25% more experience to get to each subsequent level, which can lengthen the process significantly as one approaches level 27, where the experience difference levels out at a more respectable 9%. I created a modification to the leveling system that merely smooths out the experience difference from level 5 to 30 and balances the resultant increased difficulty by lowering monster stats according to how far behind in levels the player is.
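As a toy illustration of the smoothing (with completely made-up numbers; the real values live in the spreadsheet), the idea is to slide the per-level increase gradually instead of letting it jump:

//sketch only: exp[lvl] is the experience needed to go from lvl-1 to lvl,
//and the per-level increase is interpolated evenly from level 5 to 30
function buildSmoothedExpTable(expToLevel5, startPct, endPct) {
    var exp = [];
    exp[5] = expToLevel5;
    for (var lvl = 6; lvl <= 30; lvl++) {
        var t = (lvl - 5) / 25;                        //0 at level 5, 1 at level 30
        var pct = startPct + (endPct - startPct) * t;  //sliding percentage increase
        exp[lvl] = Math.round(exp[lvl - 1] * (1 + pct));
    }
    return exp;
}

//e.g. buildSmoothedExpTable(500, 0.20, 0.09) - all three arguments are invented here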

I'm not sure if it's all as complicated as it sounds, but when I finally release the mod, I'll be sure to include the spreadsheet for others to marvel at. I've also done a number of smaller mods and have already uploaded three of them as of this post. One fixes the ever-annoying game font where the 5's look like 6's, which is hugely confusing when reading item stats. More will follow as soon as they're thoroughly tested in our new Barbarian and Assassin game. 😛

Posted in Benchmarking, Hardware, Modding, Programming, Website

More Fun Times with Browsers

Unfortunately, the browser share statistics I gave last time were incorrect. Although I did my best to remove erroneous hits from web crawlers, site grabbers, and the like (which don't constitute valid visitor hits), I overlooked a growing trend in hits from (what I suspect are) botnets trying to exploit URL parameters that may be used to pass filenames to a script that subsequently includes (i.e. executes) that file. Of course, none of my scripts have that glaring vulnerability, but the zombies try it anyways, creating bogus hits. The reason I missed these hits earlier was that their volume per distinct IP was low enough that I passed over them on my initial inspection of the log data. Crawlers like Google, by contrast, will often have thousands of hits per IP, which are simple enough to identify and remove.

However, swarms of zombie computers weren't the only reason my data was off. Microsoft's search engine also uses subversive techniques to get in under the radar. From a huge pool of IPs under 65.55., they continually run bogus searches on sites like mine using a User Agent that doesn't identify itself as a bot but instead looks like a normal (Windows, of course) user. I would block MS's range altogether, but I suspect that would just anger their search engine. They may have reasonable motives for these search checks (e.g. making sure sites display the same content to the crawler that they show a visitor), but it's an annoying system nonetheless.

Anyways, I present to you the revised graphs from the cleaner data below. Thankfully, it looks like users are moving towards IE7 and away from IE6 more than I previously reckoned. Also, Firefox seems to be gaining on both flavors of IE more than expected. Meanwhile, Safari and Opera are not to be counted out. However, Netscape is still finding a new definition of pain and suffering as it is slowly digested over a thousand years.

To be sure this new data was reasonably accurate, I found a neat site that analyzes internet market share at major websites. Their data does concur with mine; however, due to my enthusiast-oriented content, I get a lot more Firefox visitors than mainstream sites do. This is perfectly acceptable, though. 😉

Actually, the reason I was looking at the log data again in the first place was that I got interested in operating system shares after reading a revisit of Vista's suckiness in MaxPC. Thankfully, I'm not the only one holding out on upgrading my OS until something significantly better comes along. Unfortunately, a lot of people are having Vista shoved down their throats just because they want a new computer. …It's been a while since Microsoft did anything that didn't piss me off.

Even though Internet Explorer 7's adoption is on the rise, web developers can't rejoice yet, as IE7 is still a faulty product in its own right. Since I've been testing my new layout in IE7, I've increasingly noticed a strange scrolling bug, characterized by an incomplete scrollbar and frequently stuck mouse wheel scrolling. It wasn't until I tested the new version of my Deus Ex guide (a very long page) that I realized how much of an issue this bug was. Googling around, I could find barely a mention of it on all the intertubes. However, one bizarrely formatted list of IE7 bugs did outline the problem quite well (#41: Infinite loop related to overflow and position: fixed). Unfortunately, there's no simple fix short of scrapping the whole fixed positioning layout. Reportedly, the only way to remedy the situation is to scroll to the bottom of the page, which could take a poor visitor several minutes on my DX guide.

After thinking about it for a little while, though, I figured there might be a way to force the page to scroll to the bottom and then back up in JavaScript on page load. I had recently used the scrollIntoView function in my content system overhaul to do just that. It worked brilliantly. And with a little creative use of the DOM, the script can be dropped into any page with a fixed-positioned content frame and overcome this bug automatically. Unfortunately, the mouse wheel scrolling issue remains; can't fix everything with the DOM.

//trigger this function with body onload and onresize events
function IE7ScrollBarHack() {
    //get the main content frame by id
    var mainFrame = document.getElementById('main');
    var ie7hackanchor;

    //check if our span has already been created
    if ((ie7hackanchor = document.getElementById('hackanchor')) == null) {
        //create a span and append it to the end of the content frame
        ie7hackanchor = document.createElement("span");
        ie7hackanchor.setAttribute("id", "hackanchor");
        mainFrame.appendChild(ie7hackanchor);
    }

    //scroll down to that span, then restore the previous scroll position
    var prevScroll = mainFrame.scrollTop;
    if (ie7hackanchor != null) {
        ie7hackanchor.scrollIntoView(false);
        mainFrame.scrollTop = prevScroll;
    }
}

Of course, it's probably best to only emit this function for IE7, using whatever server-side scripting language you have. I usually use something like this (in PHP):

$IsIE7 = preg_match('/^Mozilla\/\d+\.0 \(compatible; MSIE 7\.0/i', $HTTP_USER_AGENT);
Posted in Programming, Website

Snake crunches more numbers

(Any excuse to make a spreadsheet.)

Since the election is over (as predicted, Obama landslide–yes, I am a happy snake) and I’m now lacking my daily dose of poll data, I’ve turned to crunching other data to support my statistical addiction. Obviously, the first data I looked at was the final election results so I could update my “blue shift” map. Naturally, this led to an examination of the differences between my previous map (from the last blog post), which used predictions based on polls, and the final results. The map this data produced basically only shows where polls were over- and understating support for a candidate. While I was at it (and because it was trivial to drop new data into my SVG generator program), I generated maps for the 2004 and 2008 final presidential election results. All four maps follow–click for the full image, obviously.

The aspect that jumps out the most in the first image (Blue Shift) is how Arkansas is the only state shifting decidedly more Republican; Louisiana and some other adjacent states are also slightly more red or neutral. As I alluded to last time, it appears that the demographics in these states are such that: A. there was profound white prejudice/misinformation against Obama and B. there wasn’t enough black/minority support to overcome this.

The poll performance map is even more interesting. Immediately, North Dakota and Wyoming stand out as performing way better for McCain than expected–I’m going to attribute these two to the limited number of polls. It’s also likely that Rhode Island, Vermont, and Hawaii fall into that same category except in the Democratic direction. Arkansas, as previously discussed, also under-performed prominently for Obama. But I think the biggest conclusion we can draw from this map can be seen if we look at the bigger picture. The Southeast and Central states are all mostly under-performers for Obama, while the western states over-performed (in general). Perhaps, when they get in the polling booth, rural westerners are generally more open-minded and less bigoted (and thus more accepting of a black president) than rural southerners? It could also say something about the turnout regionally. More information is required before we can make a decisive conclusion, but it’s interesting nonetheless.

It occurs to me that the maps aren’t particularly useful for seeing specific results. I considered rectifying that at one point but decided that the purpose of the maps was mainly to compare states to each other, which the maps do quite well. However, if you want to see the data I used, I uploaded the spreadsheet here (Excel 2002 format).

More recently, I’ve been studying the visitor demographics for my site–pondering my precarious IE support and wishing IE6 would just go away already. Though sadly, the data suggests that IE6 usage will persist. It’s like the herpes of web development. Anyways, I collected the data using this site’s raw logs, counting only distinct IPs for a particular browser (i.e. multiple hits are filtered out). Then, I made some pretty charts in Excel to show off the results. Nothing is really surprising here, but… it’s interesting nonetheless. 😉 Charts below.

As a status update, I’ve been working on my Deus Ex guide again and… yes… it’s almost complete 😮 . And I’ve only had to play DX what must be 12 times over 7 years to write it! I’m not complaining, though. Really, it feels like the definitive guide to Deus Ex on the internet and I’m quite proud. Look for an updated GameFAQs version soon.

Also, I’m working on a massive overhaul of the SnakeByte content system which will feature tabviews (emulated with clever HTML, DOM, JavaScript, and CSS, of course), more images, changelog support, and even some AJAX. It’s about 75% complete as of this post, with expected completion before the end of the year.

Posted in Website

Quirk This!

It's doubtful any previous visitor would notice the recent complete overhaul of this site's layout, and that's kind of the point. For years (even back to the dawn of S&L), I've been using a framed layout that, while seamless, still required separate pages and an index frame to set up. Since switching to PHP in 2006, I've used a script on every page that checks the HTTP referrer to ensure that a page goes into frames if it was referred from an outside site (or no site). I also had a backup JavaScript that would check whether the top-level document location was the same as the current document location, and if so, put the page into frames. These framing hacks work pretty well for the most part, but they can present some occasional odd bugs and intrinsic hurdles, such as infinite framing and query passing.
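The backup script amounted to only a couple of lines, roughly like this (the frameset URL and query parameter are illustrative, not my actual index):

//if this document is the top-level window (i.e. not already inside the
//frameset), bounce to the frameset and tell it which page to load
if (top.location == self.location) {
    top.location.href = 'index.php?page=' + escape(self.location.pathname);
}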

A month or so ago, I decided to try out an alternate layout using CSS fixed positioning that would nonetheless look and function the same as the frames, but without all the hacks. Creating the fixed layout was rather straightforward and effective in Firefox. The only initial issue I had with it was the reloading of the navigation and Flash logo on every page, which adds load time and some choppiness as the logo animation restarts. Unfortunately, this factor was one of the key reasons I had stuck with frames for so long. Many would even question my continued use of a fixed layout at all, but my own statistics have shown that visitors are far more likely to click a link in a navigation that's persistently in view. Even if they're just looking around briefly, chances are something will catch their eye, which is better than them just leaving altogether.

I expected the worst as I went to test the new CSS layout in IE6, and I wasn't disappointed: the page layout was completely dysfunctional. IE6 simply has no support for fixed positioning. However, through a number of hacks, I was able to get something that acts almost the same. There are actually two stages of hacks for IE6. In the first stage, absolute positioning (which works fine in IE6) is substituted for fixed positioning and a floating nav is enabled. Once the page finishes loading, the height and overflow of the <body> and <html> are set to 100% and hidden. Then the main frame's width and height are set to exactly the available window dimensions, causing the overflow to require scrolling. If this all sounds rather tedious and precarious, that's because it is; a rough sketch of the second stage follows below. I'm hoping IE7+ adoption soon reaches the point where I can drop the crappy IE6 hacks. However, the fact that many people still run XP (not so bad) and that XP comes pre-installed with IE6 (goddamn you, Microsoft) means IE6 will likely require support for several more years.
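Here's roughly what that second stage boils down to, run from the window's onload and onresize events (a sketch; the 'main' id and the exact property shuffling are simplified from what I actually do):

//IE6-only: lock the document so only the main content frame scrolls
function ie6FixedHack() {
    document.documentElement.style.height = '100%';
    document.documentElement.style.overflow = 'hidden';
    document.body.style.height = '100%';
    document.body.style.overflow = 'hidden';

    //size the main frame to the visible window area so its own overflow
    //produces the scrollbar (assumes the frame already has overflow: auto)
    var main = document.getElementById('main');  //placeholder id
    main.style.width = document.documentElement.clientWidth + 'px';
    main.style.height = document.documentElement.clientHeight + 'px';
}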

I thought the nightmare was over, but then I went to test IE7. IE7 supposedly supports fixed positioning BUT only in standards-compliant mode, something I’d been putting off accepting for years since Microsoft’s own standards were the dominant force. I knew I was opening Pandora’s Box by switching on standards-compliance, but it was the only way to get IE7 to not look as bad as IE6 (without the hacks). Luckily, Firefox’s only major adverse reaction was to dimensions given without a unit (i.e. 0 instead of 0px). IE had a bizarre issue with centering table contents if the table was centered, fixed with table {text-align: left;}. All browsers had a problem with <dd> tags that I’d used forever to do quick paragraph indents. I’m still unsure how I want to resolve this other than just removing the tags. I could do a special class with indents, but what a hassle (plus W3C insists that block formatting is better anyways). Overall, my impression of standards mode is that it’s more quirky than quirks mode.

It’s unlikely I’ll switch back to frames, though. The industry is moving more and more towards CSS and the support for fixed positioning will only get better. Also, there are some neat tricks you can do when you have script access to the entire page. You can see one of these tricks when viewing any large image. If an image is larger than the main frame, it can be maximized to the whole window without reloading the page. (Plus, the maximizing is animated thanks to the YUI library I’ve been playing with.) Also, I’ve been noticing some vastly increased Google hits since making the switch. For the Kangaroo paper alone, it went from 0-2 hits per day to 10-40 hits per day. Unfortunately, half of those hits are from weirdos looking for kangaroo sex. 😕

RIP Frames
2000-2008

Posted in Website

The Monolithic Procrastination Post

Yes, I’m aware that I’m a bastard for not posting for a whole two months despite not having any work obligations. I was honestly going to post last month following the release of the new Cursor Lock version, but got sidetracked with other projects. The story of my life really–project ADD.

Cursor Lock 2.5 alpha

However, for anyone yearning for a new version of Cursor Lock, I can assure you that I will finish it soon, as it's practically complete already (as the screenshot at right shows). I've already completed the majority of the testing and just have a few more issues to resolve and documentation to update. I've also found a game that puts the new features to use: DX-Ball 2. The new window locking mode works perfectly in the game's windowed mode, keeping the mouse from drifting out of the window and killing the paddle's responsiveness.

Another project that I wanted to post about alongside Cursor Lock last month is my newest Age of Empires 3 mod, Banner Army Reforms. It’s a relatively simple tweak that makes a big difference to the manageability of the Chinese civilization. It allows the player to train units individually instead of in unique groupings (Banner Armies). While a key facet of the Chinese, Banner Armies were just a strategic annoyance to me. Too bad the Chinese still suck.

Last post, I mentioned a PHP script that used PEAR's Text_Highlighter package but which I had some concerns about and thus wasn't ready to take live. Since then, I've been racking my brain trying to come up with a way to securely show external source code files (parsed with Text_Highlighter) embedded inside a formatted page. My second implementation idea was to use my inflatable wrapper technique, which places a formatting script inside the target source code file; however, that would force me to give the source code files .php extensions. I also considered a database of IDs and associated code files, which would be secure but also a hassle, and it would obfuscate the underlying source code files.

After much googling and frustration, I finally found a way to make certain file extensions be passed to a handler script. This obscure Apache manual page shows how to add an action to extension handlers. If the target of the action is a script, it will receive the originally called file as some sort of CGI parameter; in PHP, it’s placed in the ENV variable $PHP_SELF (don’t use $PATH_INFO, it can be spoofed).

Once I had all that snazzy handler business figured out and implemented in the script, I noticed a glaring oversight in the Text_Highlighter package. Since it places all the source code in a <pre> tag, no word-wrapping is done. Quite frankly, I never care enough to bother with manually line-breaking code and just let it run off as far as I need. On a webpage, however, horizontal scrolling is a cardinal sin. So, I spent more time than I'd like to admit rewriting the output module of Text_Highlighter to put the source code in a table, which lets it wrap effectively while preserving line numbering and indentation. I actually ended up having to put the indentation whitespace in a cell by itself. The script still needs a bit more polish, but it can be seen in action on the files in this parallel program directory.

Those source code files are actually part of a column I've been working on lately to show off the parallel program I wrote for my final college project. Besides the beautified code, the column is also to include a web-based version of the PowerPoint slideshow I gave during my results presentation. I had originally made a simple PHP script that took a page number as a parameter and showed the appropriate slide image with all the formatting and such. I had two problems with this, however: there was no way to copy a slide's text, and it was a bit underwhelming in this Web 2.0-hyped world. So, on a whim, I decided to have a go at making a Flash movie with all the typical play controls as well as Fullscreen and Copy Text buttons, which could load an external SWF file with one slide per frame. Of course, it has all sorts of delicious alpha effects, too. The parallel presentation slideshow isn't quite live yet, but you can see the new Flash in action on the lovable old Tony the Worm column. It still needs a smidgen of work, though, such as tooltips. I find Flash a decidedly quirky and frustrating format; finishing one button is enough for celebration.

Posted in Modding, Programming, Website