Add Search Engine Optimization to your Web Site
Workbook & Plans for Search Engine Optimization
Search tip about Text on the Page
Text on the page: When you write the text for your pages, try to use your topical keywords throughout the entire page. Ideally about 7% of the text will use these keywords: if you have 400 words on the page, you should strive for roughly 28 keyword occurrences. More than this is considered spamming the search engine, and less is considered irrelevant or, more properly, less relevant. A density of around 7% tells the search engine what the topic of the page is. Obviously these terms must match the meta tags, the title, and the keyword usage in the heading tag structure. You write for the reader as well as for the search engine.
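As a sketch of the point above — keywords matching the title, the headings, and the body text — here is a hypothetical page fragment targeting the made-up phrase "organic coffee" (the phrase and all content are placeholders, not from any real site):

```html
<!-- Hypothetical example: the target phrase "organic coffee" appears
     in the title, the heading tags, and the body copy. -->
<title>Organic Coffee Beans - Fresh Roasted Organic Coffee</title>
...
<h1>Organic Coffee, Roasted Weekly</h1>
<h2>Why Our Organic Coffee Tastes Better</h2>
<p>Our organic coffee is grown without pesticides and roasted in
   small batches, so every bag of organic coffee arrives fresh.</p>
```

The same phrase threads through title, h1, h2, and paragraph text, which is what gives the page a consistent topical signal.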
Search tip about Sitemaps
Sitemaps: There are two kinds of sitemaps: one is for the search engines, and the other is for your visitors to navigate your site. The one for search engines is built in XML, and the one for your visitors is normal (X)HTML. Once you have more than about 5 pages on your site, you should probably consider adding the sitemap navigation for visitors, which normally appears in the footer area of every page of your website. No matter how many pages your website has, though, you should have an XML sitemap, which is then submitted to all the major search engines. One important thing about the XML sitemap is that you can state how often you update your site, in periods of hours, days, weeks, and so on. If you have a CMS site such as Drupal or WordPress, you will need a sitemap plugin for the content management system (CMS) you are using, with its parameters set appropriately.
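For reference, a minimal XML sitemap following the sitemaps.org protocol looks like this — the URLs, dates, and frequency values below are placeholders, and `changefreq` is where you state how often a page is updated:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2011-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
    <lastmod>2011-05-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

Valid `changefreq` values in the protocol are always, hourly, daily, weekly, monthly, yearly, and never; a CMS sitemap plugin generates entries like these for you.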
Search tip about Construction of the Page
Most web designers will build your entire site without thinking about your visibility and findability on the web. They will 'old school' the site, and even use tables (the structures with rows and cells) to add 'presentation value' to it. It is easier to build a site with tables at first, but managing the site then becomes difficult. In any case, some designers will put in hundreds (OK, dozens) of meta keywords and call that 'search', as if the work is now done and finished. Not so! What that has actually done is dilute the keywords (or descriptions), so that the GoogleBot now expects all of those references on the page to verify the relevancy factor. So in truth it is better to have just a few keywords and descriptions for the page, and follow through in the text on the page with a high degree of relevancy. With dozens of keywords there is very little likelihood of the page being highly relevant for each term. I've seen inexplicable exceptions, but this is the general rule...
Search tip about Video
When you create a video, don't assume that uploading it to YouTube and embedding it will make the video go viral and bring you massive amounts of traffic. You probably should create a version for YouTube, but put a link to your site there rather than embedding the YouTube player on your web page. What you should do is re-encode the raw video file, reformat it for your own site, put a player on your server, and serve the video from there. That way you won't be losing traffic to YouTube. You will see more traffic to your site and less bounce if you do that. Since spiders can't read video per se, there is no penalty for having the duplicate content on the web.
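One way to serve a re-encoded video from your own server, assuming your visitors' browsers support HTML5 video, is the `video` element — the file paths and dimensions below are placeholders:

```html
<!-- Self-hosted player: files live on your own server, not YouTube.
     Offering both MP4 and WebM sources covers more browsers. -->
<video width="640" height="360" controls>
  <source src="/media/intro.mp4" type="video/mp4">
  <source src="/media/intro.webm" type="video/webm">
  Your browser does not support HTML5 video.
</video>
```

Older browsers may need a Flash-based player as a fallback; either way, the viewer stays on your page instead of being sent to YouTube.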
Search tip about Meta
When you create the meta tags for your web pages, be sure to use a unique title tag for each page. The title tag is probably the single most important thing you can do to tell the search engine what your web page document is about. Find rich keywords and phrases for your titles, not just your company name. Don't use "Welcome" or "Home" as the title tag; that wastes precious vocabulary for your website. Use the terms and phrases that you know people use when searching for your website.
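To sketch the difference, here is a hypothetical page head done poorly and then done with a search-friendly, unique title and description (all names and phrases are invented for illustration):

```html
<!-- Weak: tells the search engine nothing about the page. -->
<head>
  <title>Welcome</title>
</head>

<!-- Better: a unique title built from phrases searchers actually use,
     plus a short, specific meta description. -->
<head>
  <title>Beagle Rescue &amp; Adoption in Sonoma County</title>
  <meta name="description"
        content="Adopt a rescued beagle in Sonoma County. Meet our
                 dogs, read adoption requirements, and apply online.">
</head>
```

Each page of the site would get its own title and description written the same way, never a shared "Welcome" or "Home".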
Edward worked at Lawrence Berkeley Laboratory in the hills of Berkeley, CA in the '70s, where the current internet backbone was being developed. He consulted for scientists and programmed in Fortran, COBOL, Pascal, C, and the Oracle and Informix 4GLs. He worked corporate in the city as a freelancer for 12 years. Higher Source Web Sites started developing web sites in 2001, while Edward was in the real estate business. He started blogging and developing sites for his colleagues and his company, then for other real estate companies. His background in business programming, database administration, and software development made staying within the soft technologies more appealing than other choices. He lives in Santa Rosa, CA with his wife Pam, runs a beagle rescue with her, teaches chess, and loves music and cinema.