Web Techniques Magazine
April 1999
Volume 4, Issue 4

Webagra-vation

To invent, you need a good imagination and a pile of junk. -- Thomas Edison

At times we all get caught up in the promises and expectations of the next great product. Whether it improves our sex life or adds zing to our Web site, we all too often find ourselves buying into that magic pill, HTML editor, or asset-management software that will do it all, and more! Other times we have to step back, reexamine our roots, and take stock of our assets, procedures, and basic premises in order to exercise, elucidate, or enumerate the underlying techniques of the trade, be it tackling in football or table layout in HTML. That is the essence of Back to Basics.

Sometimes when we Web professionals study old Web sites, we wonder whether we can tackle the problems at all. Faced with the programmer's equivalent of spaghetti code -- the broken, relative, and missing links; the inconsistent treatment of graphic elements and tables; the include files that aren't (because someone changed servers); and the navigational aids that lead nowhere (because the department maintaining those pages no longer exists) -- we may want to throw it all away and start over. At times like these, we may find ourselves on a frantic search for a good Perl script, or some other way to strip it all back down to its basic form.

My favorite is lynx -dump my.html > raw-html. If you try this, lynx writes formatted output to STDOUT, removes tables, replaces images with [INLINE] or the contents of their ALT attributes, and annotates your links with numeric references. Furthermore, with version 2.4 or later, all of your links are nicely listed and numbered at the bottom of the page. If you combine some of lynx's other options, such as -crawl, with a little Perl, you can pour your site into a database or restructure it in any number of clever ways. This may be appropriate, considering that lynx serves as a de facto reference point for accessibility. A major problem, however, is that dumping formatted output in this fashion loses metadata, comments, and other structural information. So unless your ALT attributes redundantly describe their images, visual meaning blurs into textual meaning, and clarity suffers.
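For the Perl angle, here is a minimal sketch (mine, not a published script) that shells out to lynx -dump and harvests the numbered reference list that versions 2.4 and later append to the output. The URL is a placeholder, and the final loop simply prints the links; in practice you might load them into a database or diff them against your server's file tree.

    #!/usr/bin/perl -w
    # Sketch: run lynx -dump on a page and pull out the numbered
    # reference list that lynx 2.4 and later append at the bottom.
    use strict;

    my $url = shift || 'http://www.example.com/my.html';   # placeholder URL

    open(LYNX, "lynx -dump $url |") or die "can't run lynx: $!";
    my @lines = <LYNX>;
    close(LYNX);

    # Everything after the "References" heading is the link list,
    # one numbered entry per line, e.g. "   1. http://..."
    my %links;
    my $in_refs = 0;
    foreach my $line (@lines) {
        $in_refs = 1 if $line =~ /^References\s*$/;
        next unless $in_refs;
        $links{$1} = $2 if $line =~ /^\s*(\d+)\.\s+(\S+)/;
    }

    # From here you might insert %links into a database, rewrite
    # pages, or check each URL against your file tree.
    foreach my $num (sort { $a <=> $b } keys %links) {
        print "$num\t$links{$num}\n";
    }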

In this issue we revisit a few technologies we take for granted.

Daniel Menasce and Virgilio Almeida take us through the basic steps of planning for capacity. What the article understates, however, is how to predict the qualitative aspects of capacity demand. When popular e-trading sites report growth rates of up to 20 percent a day, we may ask: how can we prepare at all? We may not be able to, but if we learn to develop basic quantitative strategies, we can at least have a plan of attack.
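A back-of-the-envelope sketch of what that growth figure implies (my own arithmetic, not a calculation from the article):

    #!/usr/bin/perl -w
    # Compound a hypothetical 20-percent daily growth rate to see
    # how much headroom a week, then a month, of it would demand.
    use strict;

    my $daily_growth = 0.20;    # assumed rate, per the figure quoted above
    foreach my $days (7, 30) {
        my $factor = (1 + $daily_growth) ** $days;
        printf "after %2d days: about %.0f times today's load\n", $days, $factor;
    }

A week at that pace roughly quadruples the load, and a month multiplies it by a couple of hundred, which is why the quantitative groundwork matters.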

Jon Udell points out an almost obvious fact about one of the oldest technologies for collaborating on the Net: if we don't maintain a complete news feed, Usenet becomes manageable, and there are many free tools we can put to use in building collaborative environments.

Terry Sullivan presents an eloquent argument for something that we all know to be true: simple is good, but simpler is better. However, we also know that in Web design we must strike a balance among many factors. In contrasting Terry's arguments with the style and approach that Roger Black brings to the Web from his solid foundation in print publishing, we can truly appreciate the challenges of the medium.

So, should we compromise aesthetic quality and appealing visual design for simplicity? No.

Back to basics should not be read as back to the drawing board. It should be an exercise that helps us find new ways of meeting our challenges, and -- in the case of the Web -- discover new ways of integrating form with function.


Copyright © Web Techniques. All rights reserved.