MSHP 1994

This spring Microsoft’s Chris Balt asked Paravel to recreate the first version of the Microsoft homepage, originally built in 1994. We initially thought about ‘modernizing’ the page with media queries and other current HTML & CSS, but realized there was far greater value in being faithful to the original, so we were.

[Screenshot: the recreated 1994 Microsoft homepage]

I remember when Benson Chan told us that we’d be working with Microsoft to design the 17th version of their homepage. It set a tone of reverence and respect for all the hard work that came before. But all that hard work—where was it now? I couldn’t help but think, “I wish I could see the other sixteen versions. I wish they were all still live.”

Picking apart the code would be invaluable, and accessing a README file that outlined the problems of the day and how they were solved would be even more so. Similar to the way NASA hosts mission transcripts online, what if we launched sites with README files and pledged to keep them on the web permanently so that future generations of web builders could study them? Take dconstruct.org as an example. It’s archived annually back to 2005. Greg Storey takes this idea further, and I like it:

We need a museum! An institution that can help preserve first-hand accounts of how things were done, what went down in the past. The working files, important emails, formative essays, and forgotten blog posts. We need to preserve the story of how web design began and how it has evolved to today.

I’d love to see such an institution exist. It would certainly need participation from individuals as well as larger organizations. It’d need to be impartial, well-funded, and staffed full-time. I’d kickstart something like this. I’d pay dues or make annual donations to help because I believe that as an industry we could do a better job of documenting our history and progress. Dave Rupert brings up some questions about this institution in his post:

I don’t know what it looks like, who builds it, who curates it, how “archeologists” submit fossils to it, but I’d love to see something like that exist.

Of course I don’t have all the answers, but I do have some thoughts…

  • The Internet Archive is wonderful, but we should all take responsibility for archiving our own work. If we think it’s important (or want it to be), we should be putting time, effort, and money into preserving it.
  • Context is important. Screenshots are nowhere near as helpful as live code and the working files Greg talks about.
  • Blog posts are nice, but there’s no formal connection between a post and the site it documents. Maybe every yoursite.com gets paired with a yoursite.com/readme. Maybe there’s a better, more formal way.
  • If someone (or some organization) pledges to keep a site online and to archive past versions, maybe there should be a way to indicate that. Maybe it’s a footer link, the README file, or a submission to a directory.
  • If there is such a directory, that could be a great first step—a virtual museum that lives as a directory of sites that pledge to preserve versions online.
  • There’s nothing to say that we can’t also have an actual brick & mortar museum. Maybe it’s set in one location, or maybe it’s a mobile exhibition. After all, our work is dramatically informed by how users access the web at particular points in time. Capturing sites and the devices they were experienced on would help archive the full experience.

The original version of the Microsoft homepage may still be sitting on a hard drive deep within Redmond headquarters, but I think our recreation will serve as a great stand-in until it is unearthed. I applaud Microsoft for making this a priority and hope that other organizations that were instrumental in shaping the web follow suit. There is a lot of value in knowing how the web has evolved. As scrappy as the web was (and still may be), there’s a wealth of wisdom and ingenuity that has formed the foundation we build upon. We’d be irresponsible not to seek to preserve and understand it, and now is the time to do it. Sites like SpaceJam shouldn’t be the exception; they should be the rule.

Update: 8/11/14

Thanks to Dan Schlitzkus for his work on the Star Map graphic!

M-Dot or RWD

M Dot or RWD. Which is Faster? outlines some great research conducted by Doug Sillars on whether m-dot or responsive sites are faster. Redirects slow m-dot sites from the very start, even though those sites can turn out to be lighter in requests and KB weight (potentially due to incomplete experiences, I’d argue).

But what is really interesting is the idea that RWD sites are VERY competitive on Visually Complete and SpeedIndex scores. The median values are within 5% for both metrics. Even though it appears that RWD is faster, there is enough fluctuation in the data that we should probably call it a dead heat.

Be sure to read the full post. It’s well worth your time.

TBT: Flash Edition

I was cleaning out an old hard drive and found this ridiculous little 8+ year-old Flash (SWF) gem hanging around. Its issues are too numerous to count, but I think I was having a good time when I made it.

Optimizing Images

Friends don’t let friends load 300 KB PNG images on their websites, no matter how cool the transparent photo of a happy customer waving over a gradient background looks. That said, when raster graphics are required, one should always optimize them. My favorite tools for this are ImageOptim and ImageAlpha. The more I use them, the more they inform how I design.

[Screenshot: the Apothecary CSS Zen Garden theme]

While working on this page design for my Apothecary CSS Zen Garden theme, I knew I would be exporting a handful of PNGs with alpha transparency. Page weight could easily have surpassed the 1 MB mark because of that, but I knew that if the texture graphics were only 2 or 4 colors, ImageAlpha could get me massive savings.

[Screenshot: ImageAlpha showing the file-size savings for textured-border.png]

Note the bottom-left of the above screenshot. I reduced textured-border.png by 85% in file size (34,760 bytes to 5,383 bytes). I consider that free money. Whatever you do, use apps like ImageOptim and ImageAlpha on everything, and donate because, holy shit, they’re free.
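
If you’re curious what that kind of palette reduction looks like in practice, here’s a minimal sketch of the same idea using Python’s Pillow library rather than the ImageAlpha workflow described above; the output filename and the 4-color palette size are illustrative assumptions, not part of the original process.

```python
# Illustrative sketch only (not the workflow above): quantize an
# alpha-transparent PNG down to a tiny palette, which is the same basic
# trick behind ImageAlpha-style savings.
import os
from PIL import Image

src = Image.open("textured-border.png").convert("RGBA")

# FASTOCTREE is a Pillow quantization method that accepts RGBA input,
# so transparency survives the trip down to a 4-color palette.
small = src.quantize(colors=4, method=Image.Quantize.FASTOCTREE)
small.save("textured-border-small.png", optimize=True)  # hypothetical output name

print(os.path.getsize("textured-border.png"), "bytes ->",
      os.path.getsize("textured-border-small.png"), "bytes")
```

Dropping from truecolor RGBA to a handful of palette entries is where the bulk of the savings comes from; a lossless pass in a tool like ImageOptim can usually shave off a little more after that.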