Search Engine Optimization Techniques

by Grant Communications LLC on September 15, 2009

“This issue seems to be coming up more frequently. So I wanted to ask what types of SEO firms are out there, and why?” ~ Bruce Robertson

Hi Bruce,

I'll document the reality of what is out there today. Use this as needed; some folks really need an education, even if it covers more than they strictly need to know.

Organic Definitions

Current Position

The web is flooded with billions of pages in over 100 languages, with inconsistent correct language usage per character set, inconsistent bandwidth for the technologies used (rich media, Java, JavaScript), and now mobile media entering the picture. Education levels, competitiveness and the relative newness of the technologies all create havoc for the search engines in accurately representing, on any consistent basis, the relative importance of a given page or site on the Internet.

As the engines’ primary reason for existence is the [somewhat] accurate representation of the world’s knowledge in a correctly ordered index, it is easy to see where each engine’s definitions, and interpretations of those definitions, of what constitutes proper organic optimization end and where artificially and negatively influencing the engines begins.

Organic Search Engine Optimization - Any activity, in off-page or on-page code or otherwise, that will affect the scoring algorithm that creates web content’s natural positioning on a given search engine, based on the measured relevancy of a page or domain.

From Wikipedia - "Search engine optimization (SEO) is the process of improving the volume or quality of traffic to a web site from search engines via "natural" ("organic" or "algorithmic") search results. Typically, the earlier a site appears in the search results list, the more visitors it will receive from the search engine. SEO may target different kinds of search, including image search, local search, and industry-specific vertical search engines."

As an Internet marketing strategy, SEO considers how search engines work and what people search for. Optimizing a website primarily involves editing its content and HTML coding to both increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines.

The acronym "SEO" can also refer to "search engine optimizers," a term adopted by an industry of consultants who carry out optimization projects on behalf of clients, and by employees who perform SEO services in-house. Search engine optimizers may offer SEO as a stand-alone service or as a part of a broader marketing campaign. Because effective SEO may require changes to the HTML source code of a site, SEO tactics may be incorporated into web site development and design. The term "search engine friendly" may be used to describe web site designs, menus, content management systems and shopping carts that are easy to optimize.

Another class of techniques, known as black hat SEO or spamdexing, uses methods such as link farms and keyword stuffing that degrade both the relevance of search results and the user experience of search engines. Search engines look for sites that employ these techniques in order to remove them from their indices.

This is a tough one. We know that the engines prefer text links over images with JavaScript in displaying mouse-hover effects on a navigation system. By using a CSS-driven navigation system, would the site then be targeted as unfairly influencing an engine’s decision to crawl a site over the invisible JavaScript version?

Based on the work produced by SEMPA members, it appears that CSS is allowed as a white hat technique. But, as another associate commented, “Building a good website is presenting an unfair advantage over the vast majority of garbage out there, so just building a good web site could be considered as a form of cheating”.
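To make the distinction concrete, here is a minimal sketch of such a CSS-driven navigation system (class names and URLs are invented for illustration): the links remain plain, crawlable text, and the hover effect is handled entirely in the stylesheet, with no JavaScript or image swaps.

  <style type="text/css">
    /* Hover effect lives in CSS; the engine still sees ordinary text links */
    ul.nav a { color: #036; text-decoration: none; }
    ul.nav a:hover { color: #fff; background-color: #036; }
  </style>
  <ul class="nav">
    <li><a href="/web-design.html">Web Design</a></li>
    <li><a href="/seo.html">Search Engine Optimization</a></li>
  </ul>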

 

Current Organic Search Engine Levels

 

White Hat, Lily White - This version of site optimization is the most preferred, and mandated in becoming a member of SEMPA (Search Engine Marketing Professionals Association).

The first core code used by engine robots is <meta name="revisit-after" content="15 days" />. This is considered to be “gray-to-black hat” by this organization, and cannot appear on any site to be considered for membership; just this one line disqualifies a site. This level is considered “lily white”: ADA-compliant code, text and meta content unique and professionally crafted. No deviation from the hierarchical DTDs offered by the W3C; H1 always precedes H2, which always precedes H3, etc. No htaccess 301/302 redirects.
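As a sketch of that heading discipline (the outline itself is invented for illustration), the document descends strictly from H1 through H3, never skipping or reversing a level:

  <h1>Search Engine Optimization</h1>
  <h2>Organic Techniques</h2>
  <h3>On-page code</h3>
  <h3>Off-page links</h3>
  <h2>Paid Placement</h2>
  <!-- Skipping a level, such as an H3 directly under the H1, would break the hierarchy -->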

In examining their client lists and reading the forums and past customer feedback, one quickly realizes SEMPA can afford to stand their ground.

Annual fees to access their membership as a client start at $100,000. So the hours and average rate allow for dedicated R&D departments, high wages to employ doctorates or steal staff from the engines, and extremely high-profile links equivalent to our sitemap-g program (yes, even they rely on this tactic). And if you look carefully at some of the SEMPA client sites NOT linked from SEMPA member portfolio pages, you may see "gray hat" old code for a particularly tough industry and low client budget…

 

White Hat, Real World - This more realistic approach to coding assumes that nothing is being done to artificially influence or affect search engine robots. All code is clean, and is reviewed and aligned at time of launch.

Common-practice strict DTDs from the W3C’s HTML 4.0 implementation are used. Transitional can be used with extraordinary documentation, as Bobby-compliant (US ADA) visibility is also required. No links are or may be paid for, other than industry-known search engine PPC programs.
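For reference, these are the W3C's Strict and Transitional document type declarations (shown here in their HTML 4.01 revision); a "real world" white hat site leads with Strict, falling back to Transitional only with the documentation noted above:

  <!-- Strict: the default at this level -->
  <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
    "http://www.w3.org/TR/html4/strict.dtd">

  <!-- Transitional: allowed only with extraordinary documentation -->
  <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
    "http://www.w3.org/TR/html4/loose.dtd">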

Even PPC to a primary domain is questionable, due to aggregator PPC link campaigns like go.com and CNET. This is where a secondary educational site, blog or Social 2.0 page may be built for PPC, which then has corresponding product offerings on the educational page. It is an extraordinarily expensive way to develop sites, and very few sites on the web comply with this standard. It is not financially compatible with standard corporate budgets, and is typically seen in state and federal websites targeting consumers (tourism sites).

 

Gray Hat, Light [90%-99%] - Transitional HTML code is allowed, as the expansion of user experience is greatly enhanced. JavaScript, rich media, Meta Robots, Dublin Core and other meta-level code is now allowed.
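A sketch of the meta-level code this tier opens up (the content values are invented for illustration): a Meta Robots directive alongside a few Dublin Core elements.

  <meta name="robots" content="index, follow" />
  <!-- Dublin Core metadata -->
  <meta name="DC.title" content="Search Engine Optimization Techniques" />
  <meta name="DC.creator" content="Grant Communications LLC" />
  <meta name="DC.language" content="en" />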

This code remains in place on some client sites. A cross-domain links page is allowed, but only if strict commonality can be proven and the link structure does not solicit new links on a paid-for basis via outbound email or PPC.

Low-value mutual links between domains are accepted as part of the process to create the inclusive ‘web’. No hidden text that influences an engine is allowed, using CSS or other code, unless directly related to a user’s navigational ease (Staples’ product description pages or our own Home page are two accepted practices). No key phrase stuffing, no duplicate titles sitewide, no title stuffing; ALT tags used on all images, unique for each image, must be documented.
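A brief sketch of the ALT rule in practice (file names and descriptions invented): each image carries its own descriptive ALT text rather than a repeated key phrase.

  <img src="/images/logo.gif" alt="Grant Communications LLC logo" />
  <img src="/images/team.jpg" alt="Design team reviewing a site mockup" />
  <!-- Not acceptable: the same key phrase repeated in every ALT attribute -->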

The above two types of sites do gain consideration by SEMPA and, I believe, will emerge within SEMPA into a pure unified ‘white hat’ category. All three levels reflect that the sites are or have been built by professional or academically educated coders who know how to code and code properly, link correctly, and build sites for ease of crawlability and for content being added to the respective indexes. No ‘get rich quick’ code can be seen, other than an occasional snippet from what may have been an older developer or grandfathered code on an old domain.

Now, on to the more 'risky' organic techniques.

These can be used by your competition to potentially get your domain 'banned' from one or more engines. So interview carefully, choose your code carefully. In this environment, you don't 'have a big stick', to loosely paraphrase Teddy Roosevelt.

 

Gray Hat, Medium - This is where the vast majority of web properties exist, based on the site builder’s available knowledge and resources.

Most people don’t study what they are and are not allowed to do, so these sites generally appear with one or more areas of code that are highly suspect. The beginnings of intentional cheating are seen: the mark of a too-aggressive approach without enough financial resources (which equal SEO hours) allocated, or of using vendors that are unaware of the risks (another indication of limited resources).

Based on the sheer number of these sites that otherwise represent honest industries, companies, organizations or individuals, I believe requests to remove these sites from the indices will mostly go ignored. However, other punitive steps may be taken if the offense is seen as blatantly intentional; keyword stuffing would fall into this category. Six months in the sandbox, followed by automatically re-scoring the domain to a PR0 with delayed re-indexing, may and probably does occur.

The offending code is examined and modifications made to the algorithm, to identify and reduce page ranks on other pages and domains using similar metrics.

Links from these domains are reduced or eliminated in value. Some unrelated inbound links, which may or may not be paid, may also be found by the robots. If found to be paid and from dark neighborhoods, additional relevancy penalties would be applied. Site code may be inconsistent hierarchically, with DTDs ignored. CSS may be inconsistent or missing entirely, which may reduce penalties; the person building the site was obviously unaware of ‘best practice’, so no unduly harsh penalties would be assessed.

 

Gray Hat, Dark - These sites are ‘pushing the envelope’ at more than one element of the code, and usually heavily at the meta-level elements. This is being done intentionally, but typically the infractions remain at the code level.

If reported, or found and identified as egregious by a robot through the algorithm, these domains are subject to a long sandbox hibernation or potential permanent exclusion from the index.

Examples include multiple iterations of the same phrases replicated identically at all levels of the metas, three or more iterations per phrase block.

Key phrase stuffing appears at this level and at other element levels: the bottom of visible text blocks, ALT stuffing, link title stuffing and cross-link phrase stuffing. Dozens to hundreds of duplicate domains carry identical landing page content with no attempt at individualizing, with automatic redirects or meta refreshes to other visible domains. Paid-for inbound links sit on free-for-all Denver link farms, where no commonality can be traced by a robot.
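To show what this element-level stuffing looks like in code (phrases invented for illustration), one phrase is replicated verbatim across the title, metas and ALT attributes:

  <title>Web Design | Web Design | Cheap Web Design</title>
  <meta name="keywords" content="web design, web design, web design, cheap web design" />
  <meta name="description" content="Web design, web design and more web design." />
  <img src="spacer.gif" alt="web design web design cheap web design" />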

 

Black Hat - Any vestiges of flying below the radar are removed. These sites are egregious in their lack of content value to the web user, who is the search engine’s customer.

This is where the greatest resources are applied by the engines to counter bad code, and where the greatest resources are being applied by those industries benefiting the most from manipulation.

During the mid-to-late 1990s, we studied this code on the best-performing sites to determine what elements may and should not be used, as well as the weighting and prominence of phrases. The software available on a subscription basis surpassed this need back in 1999/2000. The industries noted for using the most aggressive techniques focus on base human needs at the bottom of Maslow’s Hierarchy and are thus in extreme demand. This creates the competitive environment that fosters a cheating mentality; money is the only goal.

This is where black text on a black background is first seen. Meta refresh, first used to forward users to an updated page on a corporate site, is now used to force users to an external domain. All the tactics found in Dark Gray Hat appear here. Automatic content generators flood dynamically generated pages onto automatically created domains and sub-domains, all pointing to yet another forwarding domain. “Hide behind the complexity of hundreds of concurrent strategies” is the site owner’s complete thought and reasoning. Link structures include highly-ranked legitimate sites, or hijack external PR pages for backlinks to bad domains, to their benefit.
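A sketch of the meta refresh abuse described above (domains invented): the same tag that legitimately forwarded visitors to an updated page on the same site is repointed at an external domain and fired instantly.

  <!-- Original corporate use: forward to the updated page after 5 seconds -->
  <meta http-equiv="refresh" content="5; url=/new-page.html" />

  <!-- Black hat use: bounce the visitor to an unrelated domain immediately -->
  <meta http-equiv="refresh" content="0; url=http://example.com/" />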

The tactics become increasingly sophisticated and automated, with doctorate-level R&D specialists guiding the ongoing development and viral population.

 


Search Engine Optimization Techniques, SEO News and Articles Archives

Click here for reprints and timely articles on search engines, optimization, marketing, and advertising ideas.

 
