Tag Archives: internet

More Fun Times with Browsers

Unfortunately, the browser share statistics I gave last time were incorrect. Although I did my best to remove erroneous hits from web crawlers, site grabbers, and the like (which don’t constitute valid visitor hits), I overlooked a growing trend of hits from (what I suspect are) botnets trying to exploit URL parameters that may be used to pass a filename to a script that subsequently includes that file (i.e. executes it). Of course, none of my scripts have that glaring vulnerability in them, but the zombies try it anyways, creating bogus hits. The reason I missed these hits earlier was that their volume per distinct IP was low enough that I passed over them on initial inspection of the log data. Crawlers like Google, by contrast, will often have thousands of hits per IP, which is simple enough to identify and remove.
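For the curious, a hypothetical filter for that kind of bogus hit might look like this: it flags logged requests whose query parameters try to pass in a remote URL (a classic remote-file-inclusion probe). The function name and log handling are my own inventions, and a real filter would also need to handle URL-encoded payloads:

```javascript
// Flag a logged request URL that looks like a remote-file-inclusion
// probe: some parameter's value is itself an http(s) URL.
// (Hypothetical helper; real probes may also URL-encode the payload.)
function isInclusionProbe(requestUrl) {
    var query = requestUrl.split('?')[1] || '';
    // any parameter value that starts with http:// or https:// is suspicious
    return /(^|&)[^=]+=https?:\/\//i.test(query);
}
```

So isInclusionProbe('/index.php?page=http://evil.example/shell.txt') flags the hit, while ordinary queries pass through untouched.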

However, swarms of zombie computers weren’t the only reason my data was off. Microsoft’s search engine also uses subversive techniques to get in under the radar. From a huge pool of IPs under 65.55., they continually run bogus searches on sites like mine using a User Agent that doesn’t identify itself as a bot but rather looks like a normal (of course Windows) user. I would just block MS’s range altogether, but I suspect that would just anger their search engine. They may have reasonable motives for doing these search checks (e.g. to make sure sites display the same content to the crawler that they do to a visitor), but it’s an annoying system nonetheless.

Anyways, I present to you the revised graphs from the cleaner data below. Thankfully, it looks like users are moving towards IE7 and away from IE6 more than I previously reckoned. Also, Firefox seems to be gaining on both flavors of IE more than expected. Meanwhile, Safari and Opera are not to be counted out. However, Netscape is still finding a new definition of pain and suffering as it is slowly digested over a thousand years.

To be sure this new data was reasonably accurate, I found a neat site that analyzes internet market shares at major websites. Their data does concur with mine; however, due to my enthusiast-oriented content, I get a lot more Firefox visitors than mainstream sites do. This is perfectly acceptable, though. 😉

Actually, the reason I was looking at the log data again to begin with was because I got interested in Operating System shares after reading a revisit on the suckiness of Vista in MaxPC. Thankfully, I’m not the only one that’s holding out on upgrading my OS until something significantly better comes along. Unfortunately, a lot of people are having Vista shoved down their throats just because they want a new computer. …It’s been a while since everything Microsoft did didn’t piss me off.

Even though Internet Explorer 7’s adoption is on the rise, web developers can’t rejoice yet, as IE7 is still a faulty product in its own right. Since I’ve been testing my new layout in IE7, I’ve noticed more and more a strange scrolling bug. It is characterized by an incomplete scrollbar and frequently stuck mouse wheel scrolling. It wasn’t until I tested my new version of the Deus Ex guide (a very long page) that I realized how much of an issue this bug was. Googling around, I could find barely a mention of the bug on all the intertubes. However, one bizarrely made page (#41: Infinite loop related to overflow and position: fixed) did have the problem outlined quite well on a list of IE7 bugs. Unfortunately, there’s no simple fix beyond scrapping the whole fixed-positioning layout. Reportedly, the only way to remedy the situation was to scroll to the bottom of the page, which could take a poor visitor several minutes on my DX guide.

However, after thinking about it for a little while, I figured there might be a way to force the page to scroll to the bottom and then back up in JavaScript on page load. I recently used a scrollIntoView call in my new content system overhaul that did just that. It worked brilliantly. And with a little creative use of the DOM, the script can be dropped into any page with a fixed-positioned content frame and overcome this bug automatically. Unfortunately, the mouse wheel scrolling issue remains–can’t fix everything with the DOM.

//trigger this function with body onload and onresize events
function IE7ScrollBarHack() {
    //get the main content frame by its id
    var mainFrame = document.getElementById('main');
    var ie7hackanchor;

    //check if our anchor span has already been created
    if ((ie7hackanchor = document.getElementById('hackanchor')) == null) {
        //create a span and append it to the end of the content frame
        ie7hackanchor = document.createElement("span");
        ie7hackanchor.setAttribute("id", "hackanchor");
        mainFrame.appendChild(ie7hackanchor);
    }

    //scroll down to that span and then restore the original position,
    //which forces IE7 to recalculate the scrollbar
    var prevScroll = mainFrame.scrollTop;
    if (ie7hackanchor != null) {
        ie7hackanchor.scrollIntoView();
        mainFrame.scrollTop = prevScroll;
    }
}
Of course, it’s probably best if you only print this function for IE7 using whatever server scripting language you have. I usually use something like this (in PHP):

$IsIE7 = preg_match('/^Mozilla\/\d+\.0 \(compatible; MSIE 7\.0/i', $_SERVER['HTTP_USER_AGENT']);
Posted in Programming, Website | Tagged , | Leave a comment

Quirk This!

It’s doubtful any previous visitor would notice the recent complete overhaul of this site’s layout, and that’s kind of the point. For years (even back to the dawn of S&L), I’ve been using a framed layout that, while seamless, still required separate pages and an index frame to set up. Since switching to PHP in 2006, I’ve used a script on every page that checks the HTTP referrer to ensure that a page goes into frames if referred from an outside site (or no site at all). I also had a backup JavaScript that would check whether the top-level document location was the same as the current document location; if so, it put the page into frames. These framing hacks work pretty well for the most part, but can present some occasional odd bugs and intrinsic hurdles, such as infinite framing and query passing.
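That backup check reduces to a couple of lines of logic. Here’s a sketch with the decision split into pure functions so it can be exercised outside a browser; 'index.php' and the 'page' query parameter are placeholder names, not my actual ones:

```javascript
// Decide whether a reload into the frameset is needed: if the current
// document IS the top-level document, it escaped the frameset.
function needsFraming(topHref, selfHref) {
    return topHref == selfHref;
}

// Build the frameset URL that restores the current page in the content
// frame ('index.php' and 'page' are assumed names).
function framesetUrl(pagePath) {
    return 'index.php?page=' + encodeURIComponent(pagePath);
}

// In the browser, this would run on load:
// if (needsFraming(top.location.href, self.location.href))
//     top.location.href = framesetUrl(self.location.pathname);
```

Splitting the comparison from the redirect keeps the fragile part (touching top.location) in one obvious place.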

A month or so ago, I decided to try out an alternate layout using CSS fixed positioning that would nonetheless look and function the same as the frames but without all the hacks. Creating the fixed layout was rather straightforward and effective in Firefox. The only initial issue I had with it was the reloading of the navigation and Flash logo on every page, which adds load time and some choppiness as the logo animation restarts. Unfortunately, this factor was one of the key reasons I had stuck with frames for so long. Many would even question my continued use of a fixed layout, but my own statistics have shown that the likelihood of visitors clicking a link in a navigation that’s persistently in view is greatly increased. Even if they’re just looking around briefly, chances are something will catch their eye, which is better than them just leaving altogether.

I expected the worst as I went to test the new CSS layout in IE6. I wasn’t disappointed: the page layout was completely dysfunctional. IE6 simply has no support for fixed positioning. However, through a number of hacks, I was able to get something that acts almost the same. There are actually two stages of hacks for IE6. In the first stage, absolute positioning (which works fine in IE6) is substituted for fixed positioning and a floating nav is enabled. Once the page finishes loading, the height and overflow of the <body> and <html> are set to 100% and hidden. Then the main frame’s width and height are set to exactly the available window dimensions, causing the overflow to require scrolling. If this all sounds rather tedious and precarious, that’s because it is. I’m hoping that the adoption of IE7+ gets to a point soon that I can drop the crappy IE6 hacks. However, the fact that many still run XP (not so bad) and XP comes pre-installed with IE6 (goddamn you, Microsoft) means that IE6 will likely require support for several more years.
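The second stage of that IE6 hack boils down to a few style assignments on load and resize. Here’s a rough sketch; the document is passed in as a parameter (so the logic can be exercised without a browser), and the 'main' id is just my layout’s name for the content frame:

```javascript
// Emulate position: fixed in IE6: lock <html>/<body> so only the
// content frame scrolls, then size the frame to the viewport.
// ('doc' is passed in so this can run against a stub; 'main' is an
// assumed element id.)
function ie6FixedEmulation(doc) {
    var mainFrame = doc.getElementById('main');
    var root = doc.documentElement;

    // lock the document itself so it never shows its own scrollbar
    root.style.height = '100%';
    root.style.overflow = 'hidden';
    doc.body.style.height = '100%';
    doc.body.style.overflow = 'hidden';

    // size the content frame to exactly the available window
    // dimensions, forcing the page's overflow into the frame's
    // own scrollbar
    mainFrame.style.width = root.clientWidth + 'px';
    mainFrame.style.height = root.clientHeight + 'px';
    mainFrame.style.overflow = 'auto';
    return mainFrame.style;
}
// in the page: <body onload="ie6FixedEmulation(document)"
//                    onresize="ie6FixedEmulation(document)">
```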

I thought the nightmare was over, but then I went to test IE7. IE7 supposedly supports fixed positioning BUT only in standards-compliant mode, something I’d been putting off accepting for years since Microsoft’s own standards were the dominant force. I knew I was opening Pandora’s Box by switching on standards-compliance, but it was the only way to get IE7 to not look as bad as IE6 (without the hacks). Luckily, Firefox’s only major adverse reaction was to dimensions given without a unit (i.e. 0 instead of 0px). IE had a bizarre issue with centering table contents if the table was centered, fixed with table {text-align: left;}. All browsers had a problem with <dd> tags that I’d used forever to do quick paragraph indents. I’m still unsure how I want to resolve this other than just removing the tags. I could do a special class with indents, but what a hassle (plus W3C insists that block formatting is better anyways). Overall, my impression of standards mode is that it’s more quirky than quirks mode.
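For reference, here’s the table fix in CSS, along with the kind of indent class I’m considering as a <dd> replacement (the class name is hypothetical):

```css
/* stop IE7 standards mode from centering a centered table's contents */
table { text-align: left; }

/* a possible replacement for <dd>-based paragraph indents */
.indent { margin-left: 2.5em; }
```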

It’s unlikely I’ll switch back to frames, though. The industry is moving more and more towards CSS and the support for fixed positioning will only get better. Also, there are some neat tricks you can do when you have script access to the entire page. You can see one of these tricks when viewing any large image. If an image is larger than the main frame, it can be maximized to the whole window without reloading the page. (Plus, the maximizing is animated thanks to the YUI library I’ve been playing with.) Also, I’ve been noticing some vastly increased Google hits since making the switch. For the Kangaroo paper alone, it went from 0-2 hits per day to 10-40 hits per day. Unfortunately, half of those hits are from weirdos looking for kangaroo sex. 😕

RIP Frames

Posted in Website | Tagged | Leave a comment

Gaming, Photo Album, and Search Engines

I’ve been secretly working on a photo browser script for this site from scratch. I’ve completed two of the three parts to this project: augmentation of viewimage.php with extended image information (file details, dimensions, EXIF, hits, keywords), and an updater script that creates thumbnails and links images in the photo album directory with an image database, which holds some of the extended information. The only part left is the actual image browser frontend, which I expect to finish within the next week. Woot. 8)

Since my last post, I’ve been keeping an eye on how the search engines have been crawling, indexing, and caching my site. Google and Yahoo! seem to be getting the idea now–slowly phasing out nonexistent pages and indexing existing pages, eventually with a correct cache (although the caches just send you back to my site). I’ve begun doubting my usage of frames. In the near future, I may start examining DHTML and other alternatives. At least the search engines are cooperating now.

As for my recent gaming trends, I’ve been mostly playing Company of Heroes lately. I’ve pretty much given up on NWN2 near the beginning of Act 3. My Ranger 15/Rogue 1/Shadow Thief 2 character isn’t all that interesting and the story has been way too convoluted. But as for COH, I finished the campaign last week. Then a couple days ago, I discovered the goodness of skirmishes. My favorite tactic is to use a camouflaged sniper to direct artillery fire and then overwhelm the enemy with armor superiority.

Since Kaylen doesn’t like the wargames, we played a few crazy sessions of Super Mario Bros. 3 on Snes9x this weekend. To alleviate some of the tedium, I whipped up some memory cheats for infinite lives: addresses 7E0736 and 7E0737 set to 99 (63h) for Mario and Luigi respectively (All-Stars version). Also over the weekend, I discovered a user mod that I had been hoping would be made: Classic Doom for Doom 3. The levels are designed really well–true to the original layouts with upgraded art and decor to up the realism. However, some of the continuity-breaking attributes of Doom 3 persist: more agile/tougher monsters, weapon effectiveness, and monster teleports (I know Doom had these, but they were kinda scarce comparatively). Still, it’s a lot of fun romping around these new renditions of a classic game.

Posted in Gaming, Website | Tagged , , , | Leave a comment

More site enhancements and setbacks

For the last few days, I’ve been working on some upgrades for this site. After deciding that my site wasn’t getting enough traffic, I looked into what Google had crawled on my site and found that a lot of it was old crap that doesn’t even exist anymore, or that the cached copies were of my index frame rather than the actual page. I thought Google was smarter than this, but I guess not. So I’ve been taking some steps to allow search engines to better comb my existing content and remove non-existing pages.

To get search engines to remove old pages, you have to let them see a 404 Not Found HTTP header, which I wasn’t doing because all my pages try to go into the frameset and thus return 200 OK. To tell search engines which pages you do want crawled regularly, you need a sitemap. Google’s “Webmaster Tools” system is a really useful web developer interface for their engine that lets you do things like specify sitemaps. I submitted my already-working RSS feed and then created a more complete sitemap. It only contains pages that I want indexed (mostly PHP pages with specific queries), so there are no security worries (as a directory-recursion sitemap might have). I’m hoping these steps will at least get Google to send more hits to my site.
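For anyone unfamiliar, an entry in the sitemap protocol that Google consumes looks roughly like this (the domain and page here are placeholders, not my real URLs):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/viewarticle.php?id=42</loc>
    <lastmod>2007-06-01</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```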

While I was at it, I also made a simple system that logs 404 pages into a database that I can go through later and map to correct URLs. For moved pages, the server returns a 301 Moved Permanently header so that search engines won’t think they can still link to the old location, while users get to the content without any additional loading. It also allows me to tell users and search engines whether a page is gone permanently or just not there yet.
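The decision logic is just a table lookup. Here’s a hypothetical sketch of it; the table shape and function name are my own, not the actual implementation:

```javascript
// Given a missing path and a table of known moves ({oldPath: newPath}),
// decide what the server should send back. (Hypothetical sketch.)
function resolveMissingPage(path, movedTable) {
    if (movedTable.hasOwnProperty(path)) {
        // a known new home: permanent redirect so search engines
        // update their index and users land on the content directly
        return { status: 301, location: movedTable[path] };
    }
    // genuinely gone (or not there yet): let crawlers see the 404
    return { status: 404, location: null };
}
```

The 404 branch doubles as the logging hook: anything that falls through gets written to the database for later review.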

As for setbacks, the other day I was working on my TI-83 RPG and things started getting weird on the map display. Enemies had turned into exit doors and all sorts of funky business. So I go to see what’s wrong and the calc turns off. When I turned it back on, you guessed it…”RAM Cleared”. It’s a good thing I hadn’t been doing a lot of coding on it lately. I was mostly figuring out how I wanted the dice rolls to work with skills, damage, armor, attack, and defense; I had only begun to integrate the dice rolls into the main game that day. So I guess I’ll be recoding it, but this time right into the game. Luckily, I just released beta 2 a couple weeks ago.

And don’t forget to check your ceiling.

Posted in Website | Tagged , | Leave a comment

Shameless THG Plug

I just read a Column at Tom’s Hardware Guide that affirmed my loyalty to that wonderful hardware site. I used to respect [H]ardOCP a good deal, but in the last year, Kyle has become a right lame-ass bastard. I was especially displeased by one of the last articles I read at his site, Matrox Parhelia Testing. In it, he basically whines about not getting to preview Matrox’s new Parhelia card. What respectable site would do that? Kyle Bennett is a lame-ass.

Anyways, the Column I was talking about at THG can be found here. It’s even written by the man himself, Tom Pabst. Lately, Tom hasn’t done many articles, but his excellent staff (such as Frank Volkel) puts out more than enough brilliant content. Who would ever need another hardware site? 😉

Posted in Hardware | Tagged | Leave a comment

May 26, 2001 #2

A LOT of things have happened with me in the past 3 weeks. So I’ll try to keep this as condensed as possible.

The first topic is: I AM THE MASTER OF THE 56K!!! How many of you with 56k modems on normal ISPs out there are frustrated because you get disconnected so often? Well this was not a problem for me: I decided to go for broke and leave my PC on, connected, for as long as possible. I used my trusty and incredibly stable SupraSonic II modem, the local ISP, and of course stable Wad for this test. I saved screenshots of my connection window every so often and chatted about it on AIM. Well, below this is the final proof. I had to shut down my PC after AIM crashed (and Explorer had done about 5 fake-restarts and was having severe window button blips)… after being connected nonstop, at 48kbps, for 8.9 days.

Loogie says: i swear to god, the people at hovac died

Next, I finally beat Diablo II in Normal mode, as a Level 27 Amazon. When we have our 2001 Game Awards Column, D2 will most likely be getting very high accolades. I did take a number of screenshots while playing, but I don’t feel like compiling them onto this news entry… maybe another day.

Weird Shit is the next topic. Please gaze over the following 3 screenshots from GameSpy and AIM that I found really weird. To save space on this entry, I put the descriptions as mouseovers on the images.

I went to Snake’s yesterday for about 5 hours for pics for my next Column… all about computer mice. We used his (his dad’s.. shh) digital camera and took about 14 pics of anything I owned even remotely related to a mouse, and there’re still more pics to come. The column will be done within 2 weeks. We also shot BBs at a can and a milk jug {insert own comment here} for a bit. Fun.

My additions/revisions to the site lately have been totally redoing the POTM and fixing parts of my Staff Page. I have also lately been busy updating a couple of Angelfire sites for people I know.

There’s probably more that I’ve done, but I can’t think of it right now…

Posted in Hardware | Tagged , | Leave a comment