Does eXpertWeb use SEO technologies and methodologies such as Cloaking, Doorway, Shadow, or Envelope pages to optimize sites?
Many SEO firms believe that it does not matter whether you use Cloaking, Doorway, Shadow, or Envelope pages to optimize your site, as long as the product meets SEO standards. But we understand that things are changing, and that a trick allowed today may be blacklisted tomorrow. eXpertWeb has always focused on “honest page” optimization rather than wasting time on techniques that will soon need to be abandoned.
What guidelines does eXpertWeb follow to submit web pages to search engines and directories?
General guidelines that eXpertWeb follows when submitting pages to search engines and directories (these may vary from search engine to search engine):
- Keywords should be relevant, applicable, and clearly associated with page body content.
- Keywords should be used as allowed and accepted by the search engines (placement, color, etc.)
- Keywords should not be utilized too many times on a page (frequency, density, distribution, etc.)
- Redirection technology (if used) should facilitate and improve the user experience. But we prefer not to use this since we understand that this is almost always considered a trick and is frequently a cause for removal from an index.
- Pages should not be submitted to the search engines too frequently.
NOTE: To be fair, each search engine must support at least the Robots Exclusion Standard. This is not always the case, but it should be.
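As an illustration of what respecting the Robots Exclusion Standard looks like in practice, the sketch below (the URL and the robots.txt rules are invented for this example, not taken from any real site) uses Python's standard-library robots.txt parser to decide whether a well-behaved crawler may fetch a URL:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt body; parsed directly so the sketch needs no network.
robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A well-behaved crawler calls can_fetch() before requesting any URL.
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("*", "https://example.com/public/page.html"))   # True
```

A search engine that honors the standard performs exactly this check before indexing a page; one that does not simply never consults robots.txt.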
Additional guidelines followed by eXpertWeb for a search engine or directory may further involve revising or adding content in order to improve the user experience.
eXpertWeb will always refer to the individual search engine submission page for specific rules and guidelines.
What are the main page-centric processes used to optimize pages, and which of them are actually followed by eXpertWeb?
There are four main page-centric processes used by SEO firms:
- Edit Client site pages
The revisions made to a client site’s pages so that they may rank higher in the search engine. This is “honest” SEO work and involves editing real “honest” web site pages. This is the “bread-and-butter” of legitimate SEO firms and is the clear winner when it comes to obtaining meaningful and long-lasting rankings.
- Man Made Pages
Commonly a “doorway-like” technology (Shadow Page) that is keyword intensive and that, if visited, should present an “honest” site page. This is a labor-intensive process: a copy of a real “honest” page is made, then that copy is altered to emphasize keywords found on the “honest” page (the page presented). In some implementations, this page loads the page to be presented into a frameset; others use redirection.
- Machine Made Pages
Commonly an “envelope-like” page (Envelope Page) where the content of the page is derived from other site content based upon keywords and compiled by a software tool. If visited, this page should present an “honest” site page. In some implementations, this page loads the page to be presented into a frame set, and some use redirection to transfer to the “honest” page. Also, some implementations generate pages using gibberish or templates that are easily detected by the search engines. This type of tool could literally generate thousands of additional pages in minutes.
- IP / User-Agent Delivery
This is normally associated with sites doing IP and USER-AGENT serving, where the Internet server presents a page that varies based upon the visitor's characteristics. This technology is commonly used to present differing content to each search engine or browser, so a search engine seldom sees the same content that is presented to a browser.
eXpertWeb focuses only on “honest page” optimization. It edits only the main pages of the site, which are then submitted to the search engines.
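One way to see why IP/User-Agent serving is detectable: fetch the same URL while presenting different User-Agent strings and compare the two bodies. The sketch below (the helper names and sample HTML are our own illustration, not an eXpertWeb or search-engine tool) fingerprints the visible text of each response and flags a material difference:

```python
import hashlib
import re

def page_fingerprint(html: str) -> str:
    """Reduce an HTML body to a fingerprint of its visible text so that
    cosmetic differences (whitespace, case) don't trigger false alarms."""
    text = re.sub(r"<[^>]+>", " ", html)           # strip tags (rough sketch)
    text = re.sub(r"\s+", " ", text).strip().lower()
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def looks_cloaked(body_as_browser: str, body_as_spider: str) -> bool:
    """True if the same URL returned materially different content under a
    browser User-Agent than under a search-engine User-Agent."""
    return page_fingerprint(body_as_browser) != page_fingerprint(body_as_spider)

# Hypothetical responses for one URL fetched under two User-Agent headers:
honest = "<html><body><h1>Craftsman wrenches</h1></body></html>"
cloaked = "<html><body><h1>Keyword keyword keyword</h1></body></html>"

print(looks_cloaked(honest, honest))   # False: same content for both visitors
print(looks_cloaked(honest, cloaked))  # True: differing content was served
```

A real check would perform the two fetches over HTTP; the comparison logic is the same.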
What is eXpertWeb Editing Focus / Methodology for the web pages?
- Links – the use of links to encourage spiders to locate content within the web site, and to support “popularity” algorithms.
- Content – the inclusion of, or focus on, words, phrases, and themes associated with search engine query strings.
What are the SEO malpractices that eXpertWeb will never be a party to?
Transparent, hidden, misleading, and inconspicuous links — the use of a fully transparent image for a link; the use of hidden links (possibly in DIV/LAYER elements); any link attached to a graphic without words or symbols that can be interpreted as even remotely representing the effect of following the link; or inconspicuous links such as 1×1 pixel graphics or links placed on punctuation (<a href=link> </a><a href=real-link>real words</a><a href=link>.</a>). All of these are “spam” and a cause for removal from a search engine index.
“Machine generated” pages – pages created by a software tool based upon specified parameters, with the intent of directing traffic to other pages within a site, or to other sites. These are harmful unless properly used, although under certain circumstances such tools can be legitimate. As an example, consider a database-driven TOOL-PRODUCTS site whose pages are generated from database parameters, with one generated page per tool (“craftsman wrenches”, say). When a generated page is ranked and clicked, it places the surfer directly onto the craftsman wrenches page instead of the home page of the TOOL-PRODUCTS site, improving rather than hurting the user experience (and certainly improving sales rates); that would be a beneficial search engine entry. But it would not be okay to have a craftsman wrenches page that lands the surfer on a porn site.
Cloaking – this is a deceptive process in all circumstances unless there is no difference (deletion, formatting, or insertion) between the content delivered to the visitor and the content delivered to the search engine spiders. Where the stated objective of the tool (filtering by IP number) is to deliver differing content based upon visitor/search-engine identification, the implementation of cloaking technology is considered BAD. Although not all engines can detect cloaked sites, and some may choose to allow them, cloaked sites are bad in most cases. Google has stated that it has a tool that can detect such pages and is removing cloaked sites from its index where deception is involved. It would be easy to give a search engine a “craftsman wrenches” page yet give the browser a porn page. Even the advocates of this process agree that cloaking presents multiple versions of content based upon visitor identification, so there is no single (one-and-only-one) real “honest” page. Spam is an even broader topic, ranging from white-on-white text to overloading the web with free pages and sites developed solely to provide volumes of links to a site to boost its popularity.
Redirection – as discussed previously.
IP Switching – a “bait and switch” strategy (see the craftsman wrenches / porn page example above).
Other practices we avoid:
- Writing text or creating links that can be seen by search engines but not by visitors to your site.
- Participating in link exchanges for the sole purpose of increasing your ranking in search engines.
- Sending automated queries to search engines in an attempt to monitor your site's ranking.
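The inconspicuous-link patterns described earlier (empty anchors, links on punctuation, 1×1-pixel image links) can be detected mechanically, which is one reason they get sites removed from indexes. A minimal sketch using Python's standard html.parser (the class name and flagging rules are our own illustration):

```python
from html.parser import HTMLParser

class SuspiciousLinkFinder(HTMLParser):
    """Flags links whose anchor text is empty or punctuation-only, and links
    wrapping 1x1-pixel images -- the spam patterns described above."""
    def __init__(self):
        super().__init__()
        self.in_link = False
        self.link_text = ""
        self.link_href = ""
        self.has_tiny_image = False
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a":
            self.in_link = True
            self.link_text = ""
            self.link_href = attrs.get("href", "")
            self.has_tiny_image = False
        elif tag == "img" and self.in_link:
            if attrs.get("width") == "1" and attrs.get("height") == "1":
                self.has_tiny_image = True

    def handle_data(self, data):
        if self.in_link:
            self.link_text += data

    def handle_endtag(self, tag):
        if tag == "a" and self.in_link:
            text = self.link_text.strip()
            # Flag if the anchor holds a 1x1 image or no alphanumeric text.
            if self.has_tiny_image or not any(c.isalnum() for c in text):
                self.flagged.append(self.link_href)
            self.in_link = False

finder = SuspiciousLinkFinder()
finder.feed('<a href=link> </a><a href=real-link>real words</a>'
            '<a href=link>.</a><a href=pix><img width=1 height=1></a>')
print(finder.flagged)  # ['link', 'link', 'pix']
```

Note that the legitimate link (“real words”) passes untouched; only the disguised ones are reported.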
What is the checklist followed by eXpertWeb before submitting web pages to search engines?
- Check for broken links off the home page.
- Check for under construction signs.
- Check for inappropriate content.
- Browser Check.
- Spell-check throughout the web site.
- Check loading speed.
- Validate HTML code.
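The first checklist item, finding broken links off the home page, can be sketched as follows. The `fetch_status` function is injected so the example stays self-contained; in a real check it would issue an HTTP HEAD request for each URL (all names and URLs here are illustrative):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects every href on a page so each can be checked for breakage."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                # Resolve relative links against the page's base URL.
                self.links.append(urljoin(self.base_url, href))

def broken_links(home_html, base_url, fetch_status):
    """Return the links whose HTTP status indicates an error (>= 400)."""
    extractor = LinkExtractor(base_url)
    extractor.feed(home_html)
    return [url for url in extractor.links if fetch_status(url) >= 400]

# Usage with a stubbed status lookup (no network in this sketch):
statuses = {"https://example.com/about.html": 200,
            "https://example.com/old-page.html": 404}
html = '<a href="/about.html">About</a><a href="/old-page.html">Old</a>'
print(broken_links(html, "https://example.com/", statuses.get))
# ['https://example.com/old-page.html']
```

The same extract-then-probe loop generalizes to the rest of the checklist (e.g. timing the fetch for loading speed).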
Which web sites are not optimized or submitted by eXpertWeb?
We do not optimize or submit mirror sites, or any web site containing or linking to illegal content. Sites consisting of nothing but links to affiliate programs will also not be accepted.
How does eXpertWeb follow up on the submission, acceptance, or rejection of web pages?
It is important to follow up on web site submissions. Most web sites show up in a search engine about three weeks after the date of submission, and if the site is accepted it may take up to four months for the listing to filter out to the other search engines it feeds. If your site has not been accepted, eXpertWeb will recheck the site and resubmit it to the search engines where it was not accepted, at no extra cost.
What’s eXpertWeb approach to submitting non-root pages?
Some sites may qualify for one or more additional page listings. eXpertWeb is careful here, because attempts to spam will usually result in deletion of URLs. Non-root pages are submitted only when a site has clear lines of separate content; usually, a site needs deep content to qualify.