http://www.ozvalveamps.org/webtools.html | Created: 19/04/07 | Last update: 16:10 12/04/2012
If you are thinking of putting up a web page, this may help.
OzValveAmps was built on several different machines over time, all pretty limited by today's standards, but with a few exceptions building a web site isn't very demanding of computer technology, speed or storage. You can build a website with nothing more than Windows NotePad, but the following tools were found to be robust and to take a lot of the drudge out of assembling pages.
Development environment: all second-hand 486/586 machines (mostly free ;), 100/133MHz (slow), averaging only 32Meg of RAM (smallish), Win95 or Win98 (old, lacking modern features like USB).
A small web page only requires a few megs of disk space, but this site occupies about 150 megs on the home computer for about 50 megs on-line at the server.
I picked up a basic but very cheap second-hand digital camera with RS-232 serial output, and a scanner in perfect working order at a Fete for $5. The only Twain driver I could find is in German, but I don't mention the war and we get on just fine.
The message is that, providing it is working properly, an old redundant computer makes a perfectly capable platform on which to build and maintain a web site. Speed and storage capacity are helpful, but stability is all-important. You can't get creative on a flaky machine.
A second-hand machine generally needs a good physical clean, then a clear-out of redundant files and software, a good Registry scrubbing, and finally a hard-disk de-fragmentation. Then I test it for a couple of weeks using Flight Simulator.
This machine is called the “Host” and is where you assemble and maintain your web site, generally your home computer.
Web site technology can be used off-line to organise information on your home computer, to distribute small sites on floppy disk, or up to quite large ones on CD-ROM.
To publish on the net you require “Server” space. In fact most ISP accounts come bundled with some web server space that few subscribers ever use - you are most likely already paying for 10 megabytes of server space you are not using.
Once your site is working properly on your host, you then upload the files to your web server.
If you are thinking of setting up a web page, I strongly suggest spending some time reading the articles at Web page design flaws - a number of links to good advice and bad examples. Keep it simple and don't try to do too much. Creeping complexity is easy; it's maintaining simplicity that's difficult.
Choose a topic you know a lot about. Search a lot to see what is already there. Notice that this site has a very narrow interest, Australian stage amplifiers up to the transistor era, yet currently spans 150 pages totalling around 70 megabytes on-line.
Text is very economical and you might get War and Peace in 10 megs, but images, sound and video files gobble up space. However there are free file archiving services if you are not too worried about privacy and speed.
Visitors are impressed by content, not a chorus line of animated gee-gaws and programming prowess. The more “Advanced” the site, the fewer people get to see it.
Text
NotePad will do for personal pages and small sites, but there is a much more helpful tool for editing HTML scripts.
NoteTab Light http://www.notetab.com/ - Free (no catch) and very functional text editor, with macros to suit most programming languages and a user-supported library. Two known minor bugs: it sometimes tells you a file is write-protected when it isn't (telling it to save again fixes this; no data loss), and the vertical scroll bar sometimes goes into auto-repeat - very minor.
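To give an idea of how little is needed, here is a minimal sketch of a complete page you could type into NotePad or NoteTab and save as index.html (the title and text are placeholders only):

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
   "http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
  <!-- the head holds information about the page -->
  <meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
  <title>My First Page</title>
</head>
<body>
  <!-- everything visible in the browser goes between the body tags -->
  <h1>My First Page</h1>
  <p>Hello, world.</p>
</body>
</html>

Open the file in your browser to check how it looks, then run it through the validator mentioned below before uploading.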
Image processing
IrfanView - Free (no catch) image handling tool. Powerful, easy to use, and reasonably quick even on old iron. Handles all graphics standards, movies, and sound files. Can be set to leave selected file types to other programmes to handle. Very solid. Download IrfanView now from Tucows.com
Server upload and manage
You need a tool to upload (or FTP) your site files from your local computer to your web server space, and manage your on-line directory/folder spaces.
FTPExpress 1.0 b010 - Free (no catch); manages FTP to your server and works much like Windows File Explorer. Bugs: it sometimes gets in a tangle connecting to a second server in the same session (closing and restarting the programme cures it), and if you drag and drop a folder full of files it creates the folder on your server but may only copy some of the contained files.
Update April 2012. I have moved to FTP Commander, which is also freeware (no catch), available from www.internet-soft.com. It supports a wide range of languages, and has a cute mouse-with-eyepatch logo.
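For the record, uploading can also be done without any GUI tool at all: the command-line ftp client included with Windows will do the job in a pinch. A rough sketch only - the server name, login and folder names here are placeholders, yours will be whatever your ISP tells you:

ftp ftp.example.com     (log in with the user name and password from your ISP)
binary                  (binary mode so images don't get mangled)
cd public_html          (change to your web space folder - the name varies by ISP)
lcd C:\mysite           (point at the local folder holding the finished site)
put index.html          (send a single file)
prompt                  (turn off the file-by-file question that mput asks)
mput *.gif              (send every matching file)
bye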
For testing your site
Validate your web pages at home or on your server using the W3C validation service page.
http://validator.w3.org/
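The usual trick is to wrap the W3C badge image in a link back to the validator, with “uri=referer” telling it to check whichever page the click came from. A sketch along these lines (the exact icon name depends on the doctype you validate against):

<p>
  <a href="http://validator.w3.org/check?uri=referer">
    <img src="http://www.w3.org/Icons/valid-html401"
         alt="Valid HTML 4.01!" height="31" width="88">
  </a>
</p>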
That's what the little W3C button on each AVA page does. It's what I actually use to check that each page on the site is valid after upload, and you can click it and check if I'm cheating. ;)

The W3C dead link checker checks all the links on your page or site and provides a (big) report of the actual status of all your links. This is handy for spotting images and pages that are not correctly linked within the site, but its real usefulness is selectively scanning links pages such as this one to check the activity of all the external links (but validate first). This page takes about 15 minutes to check and report.
It also provides other info and functions:
- Dead Links FAQ
- Reciprocal Link Checker at recip-links.com
- HTTP headers inspector at server-whois.com
META tags on a web page are really important to get right, particularly if you are trying to be helpful to Search Engine robots such as Google or Yahoo! looking to include your pages in their index. Simply getting them right is sufficient to get you listed.
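As an illustration only (the wording here is invented, not copied from any AVA page), the head-section tags most worth getting right are the title, plus the description and keywords metas:

<title>Brand X Amplifiers - history and schematics</title>
<meta name="description" content="History, circuits and repair notes for Brand X valve guitar amplifiers.">
<meta name="keywords" content="valve amplifier, guitar amp, schematic, repair, Brand X">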
The site at http://www.webtoolcentral.com has several useful META tag checking functions, browser-friendly colours, and a robots.txt file generator, but is dogged by ads.
Robots.txt is an optional file you can place in your server root directory to instruct Search Engine “web-crawlers” or robots how to deal with your site. If you don't have a robots.txt file then the default is to follow all links and index everything, so if you want to limit access to folders or pages you need to specify these in a robots.txt file.
AVA's robots.txt reads:

# Robots.txt file created by http://www.webtoolcentral.com
# For domain: http://ozvalveamps.org
# All robots will spider the domain
User-agent: *
Disallow: /cgi-bin/
Disallow: /contents.htm

Lines starting with “#” are comments.
- Allow all user agents (robots)
- Disallow the cgi-bin folder (traditional, mine's empty)
- Disallow the contents page (a recursive waste of their time and my bandwidth)

Metas also allow you to control the indexing and/or link-following of pages individually. For this page the significant Metas are:
<meta name="robots" content="INDEX,NOFOLLOW">
<meta name="Revisit-After" content="30 Days">

Index the page, revisit every 30 days, but don't follow any of the links. For most pages it's simply “all” - do both.
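Written out, that everything-allowed form is just:

<meta name="robots" content="all">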
Note that not all Robots (aka Web Crawlers) honor robots.txt.
You can (and many people do) invent your own Metas, but only a very small number of NAME and CONTENT values are actually recognised by other programmes.
Other helpful sites are http://tool.motoricerca.info/, an Italian webmasters resource site, and http://www.searchengineworld.com/ which contains a robot checker.
Some sites offer you a one-stop submission service to a long list of search engines; these services are mostly worthless. You can submit your site address to Google yourself, and once they list it the others will quickly pick it up. Another way to get listed is to get somebody with a kindred site that is already listed to make a link to your new website when you are really ready to go “hot” - after testing and proofreading.