SEO “best practices” Standards for Optimization Techniques
The final “best practices” standard will have three sections: “appropriate” Technique, Marketing Practices, and Business Practices. Although we would like these to be industry standards, they are not. These are the guidelines and practices the Organization expects for inclusion in the directory, Organization Membership, and “best practices” Certification.
Search engines determine what is “inappropriate”; all we are doing is interpreting the “intent” of their guidelines. The engines are not in total agreement among themselves, which further confuses what is acceptable and “appropriate”.
Manipulation using HTML Elements:
The techniques in this first group are all well known, and each has been designated “inappropriate” in the *on site* content guidelines of at least one search engine. Suggested references for this topic are Alan Perkins’ paper The Classification of Search Engine Spam and the Google SEO guidelines:
- Invisible text and links: Already specifically mentioned in Google content guidelines
- Use of non-compliant HTML to manipulate relevancy: multiple title elements and other techniques that are not HTML-standards compliant, used specifically to raise relevancy. The first two grey-area items would be real candidates for this area as well. An example of non-compliant HTML is a title that does not reflect the content of the page. A table of standard HTML elements is available here; the links lead to information on proper implementation of each element.
- Use of CSS (cascading style sheets) to manipulate relevancy: using hidden elements (layer or span elements, etc.) that cannot be seen unless code is executed to reveal them. To our knowledge this activity has not been addressed in any SE content guidelines; however, it is generally regarded as “inappropriate” by many firms and consultants.
- Comments: comments help maintain the code in an HTML document. They should not be used to raise relevancy or to manipulate SE descriptions. This was previously addressed *on site* in the Excite content guidelines.
- Invisible form elements: used to hold keyword values. This is not a well-known technique, but form elements can be used this way. It is not known to be mentioned specifically in any SE content guidelines or “unwritten policy”.
- Keyword stuffing or stacking using any HTML element: image alt attributes are often used in this manner. This is a highly subjective area in which only the search engines can make the final determination.
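To make the techniques above concrete, the fragment below combines several of them in one hypothetical page (all names and values are invented for illustration):

```html
<!-- Hypothetical page illustrating the "inappropriate" techniques above -->
<html>
<head>
  <title>Cheap Widgets</title>
  <title>Discount Widgets</title> <!-- multiple title elements: non-compliant HTML -->
</head>
<body bgcolor="#ffffff">
  <!-- widgets widgets cheap widgets discount widgets --> <!-- comment stuffing -->
  <font color="#ffffff">widgets widgets widgets</font> <!-- invisible text: white on white -->
  <span style="display: none">cheap widgets discount widgets</span> <!-- CSS-hidden element -->
  <input type="hidden" name="keywords" value="widgets, cheap widgets"> <!-- invisible form element -->
  <img src="logo.gif" alt="widgets widgets widgets cheap discount widgets"> <!-- stuffed alt attribute -->
</body>
</html>
```

Any one of these on its own may be enough to trigger a penalty under the guidelines cited above.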
Submission of orphaned or “Doorway Pages”: Most doorway and cloaked pages are orphaned (not linked to by “real” content). Because they must be submitted directly to the engine, they clog the indexing queues at most major search engines. Note that a large number of pages in an index is not necessarily a sign of spam.
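A typical orphaned doorway page (hypothetical markup) is a keyword-targeted page, never linked from the site itself, that immediately redirects human visitors to the “real” content:

```html
<!-- Hypothetical doorway page: orphaned and submitted directly to the engine -->
<html>
<head>
  <title>cheap widgets discount widgets</title>
  <!-- zero-second refresh sends human visitors on to the real home page -->
  <meta http-equiv="refresh" content="0;url=http://www.example.com/">
</head>
<body>cheap widgets discount widgets widget sale</body>
</html>
```

Because no “real” page links to it, the engine can only reach such a page through direct submission.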
Machine Generated Code:
This part of the draft was heavily influenced by the paper Disa Johnson wrote.
Used to produce keyword specific pages often optimized for a specific engine:
- Often used by cloaking companies and referred to as “proprietary software”
- An option in WebPosition which generates “doorway pages”
- Often those using these methods are using other grey area or non-compliant techniques to induce indexing.
- Cloaking or IP delivery is possibly allowed for sites which must determine user location and deliver differing content for legal reasons, e.g. the pharmaceutical industry.
- Cloaking is tolerated to some degree by Inktomi, but only if its content guidelines are followed closely. We recommend first considering the Index Connect Inclusion program, which provides a solution similar to cloaking minus the “possibility” of spam often associated with cloaking firms.
- Machine-generated code can also serve usability purposes on a website. One common use is a component that tests for browser versions and settings, then generates code to remove or change features that would otherwise not give the functionality a user should expect. This is acceptable provided the changes affect only functionality, not text or other relevancy-raising HTML elements. It remains extremely subjective, since intent is hard to ascertain.
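As a sketch of the acceptable usability case described above (the script logic and file names are invented), the generated code changes behaviour only, never visible text:

```html
<!-- Hypothetical example: browser detection that adjusts functionality only -->
<html>
<body>
  <script type="text/javascript">
    // Older browsers without DOM support get a plain link instead of a dynamic menu
    if (document.getElementById) {
      document.write('<div id="menu"><!-- dynamic menu built here --></div>');
    } else {
      document.write('<a href="sitemap.html">Site map</a>');
    }
  </script>
</body>
</html>
```

No text or relevancy-raising elements differ between the two branches, which is what keeps this on the acceptable side of the line.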
Needless Submission: One myth that just won’t go away is that re-submission of a site at predetermined intervals provides a relevancy boost. In the past this **did** have some merit.
Presently it is not true, and this sort of submission could lead to your domain being flagged for spam submission. Re-submission should only be done if:
- The index has dropped the page (check that content guidelines were not the reason)
- The content has been edited
- The engine could find the page on its own through links (i.e., it is not orphaned)
Members will take all precautions to ensure content is not duplicated on different domains for the purpose of inflating relevancy. Crosslinking between mirrored content is contrary to many SE content guidelines, and the appearance of both domains in any search engine's results is likewise non-compliant. Use a robots.txt file to ensure mirrored content is not duplicated in results.
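For example, placing the following robots.txt at the root of the mirror domain (a hypothetical setup) keeps the duplicated content out of every compliant engine's index:

```
# robots.txt on the mirror domain only; the primary domain remains crawlable
User-agent: *
Disallow: /
```

The primary domain keeps a normal robots.txt, so only one copy of the content ever appears in results.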