Spamdexing

Search engines use a variety of algorithms to determine relevancy ranking. Some of these include determining whether the search term appears in the body text or URL of a web page.
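
For illustration, a minimal sketch of two such signals might look like the following; the function naive_relevance_signals and its boolean output are hypothetical, not any real engine's code:

```python
# Toy relevancy signals: does the query term appear in the page body
# or in the URL? Real engines combine hundreds of undisclosed signals.
from urllib.parse import urlparse

def naive_relevance_signals(query: str, url: str, body_text: str) -> dict:
    """Return simple boolean signals a toy ranker might combine."""
    term = query.lower()
    parsed = urlparse(url)
    return {
        "term_in_body": term in body_text.lower(),
        "term_in_url": term in (parsed.netloc + parsed.path).lower(),
    }

print(naive_relevance_signals(
    "spamdexing",
    "https://example.com/spamdexing-guide",
    "Spamdexing is the manipulation of search engine indexes.",
))
# {'term_in_body': True, 'term_in_url': True}
```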

Search-engine operators can also quickly block entire websites that use spamdexing from their results listings, perhaps in response to user complaints of false matches.[3]

The earliest known reference[2] to the term spamdexing is by Eric Convey in his article "Porn sneaks way back on Web", The Boston Herald, May 22, 1996, in which he wrote: "The problem arises when site operators load their Web pages with hundreds of extraneous terms so search engines will list them among legitimate addresses. The process is called 'spamdexing,' a combination of spamming—the Internet term for sending users unsolicited information—and 'indexing.'"[2]

Keyword stuffing had been used in the past to obtain top search engine rankings and visibility for particular phrases.

Search engines now employ themed, related keyword techniques to interpret the intent of the content on a page.

Keyword stuffing is a search engine optimization (SEO) technique in which keywords are loaded into a web page's meta tags, visible content, or backlink anchor text in an attempt to gain an unfair rank advantage in search engines.

Keyword stuffing may lead to a website being temporarily or permanently banned or penalized by major search engines.[citation needed]

Many major search engines have implemented algorithms that recognize keyword stuffing and reduce or eliminate any unfair search advantage the tactic may have been intended to gain; they often also penalize, demote, or remove from their indexes websites that engage in it.[11]
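
As an illustration of the idea (not any engine's actual method), a crude detector might flag pages where a single token dominates the word count; the 15% threshold and 20-word minimum below are arbitrary assumptions:

```python
# Hypothetical keyword-stuffing heuristic: flag a page when one token
# accounts for an implausibly large share of its words.
import re
from collections import Counter

def looks_stuffed(body_text: str, threshold: float = 0.15) -> bool:
    """Flag a page when a single token dominates its word count."""
    words = re.findall(r"[a-z]+", body_text.lower())
    if len(words) < 20:  # too little text to judge
        return False
    _, top_count = Counter(words).most_common(1)[0]
    return top_count / len(words) > threshold

stuffed_page = "cheap shoes " * 30 + "buy the best cheap shoes online"
normal_page = ("This article explains how search engines rank pages using many "
               "different signals drawn from text links and user behavior over time")
print(looks_stuffed(stuffed_page))  # True
print(looks_stuffed(normal_page))   # False
```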

Headlines on online news sites are increasingly packed with just the search-friendly keywords that identify the story.

Traditional reporters and editors frown on the practice, but it is effective in optimizing news stories for search.

In September 2009, Google announced that it does not use the keywords meta tag when ranking web search results.

Article spinning involves rewriting existing articles so they appear original; this process is undertaken by hired writers[citation needed] or automated using a thesaurus database or an artificial neural network.
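
A toy sketch of thesaurus-based spinning, with an invented three-entry SYNONYMS table standing in for a real thesaurus database, might look like:

```python
# Toy article spinner: swap known words for randomly chosen synonyms
# to produce superficially distinct variants of the same text.
import random

SYNONYMS = {
    "cheap": ["inexpensive", "affordable", "budget"],
    "buy": ["purchase", "acquire", "order"],
    "fast": ["quick", "rapid", "speedy"],
}

def spin(text: str, seed: int = 0) -> str:
    """Produce a 'spun' variant by substituting random synonyms."""
    rng = random.Random(seed)
    return " ".join(
        rng.choice(SYNONYMS[w]) if w in SYNONYMS else w
        for w in text.split()
    )

print(spin("buy cheap shoes with fast shipping", seed=1))
print(spin("buy cheap shoes with fast shipping", seed=2))
```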

Similarly to article spinning, some sites use machine translation to render their content in several languages, with no human editing; the resulting texts are unintelligible but nonetheless continue to be indexed by search engines, thereby attracting traffic.[17]

A link farm is a tightly knit network of websites that link to one another for the sole purpose of exploiting search engine ranking algorithms. The use of link farms greatly declined after the launch of Google's first Panda update in February 2011, which introduced significant improvements in its spam-detection algorithm.

The crackdown on link farms was made famous by Google's Matt Cutts, who publicly declared "war" against this form of link spam.

Guest books, forums, blogs, and any other sites that accept visitors' comments are particular targets and are often victims of drive-by spamming, in which automated software creates nonsense posts containing links that are usually irrelevant and unwanted.

This is problematic because automated agents can be written to randomly select a user-edited web page, such as a Wikipedia article, and add spam links.

Because of the large amount of spam posted to user-editable web pages, Google proposed the "nofollow" attribute (rel="nofollow"), which can be added to links.

This ensures that spam links posted on user-editable websites do not raise the target sites' rankings with search engines.[citation needed]
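
A minimal sketch of how a site accepting user comments might apply nofollow, assuming a naive regex rewrite (a production system would use a real HTML parser and avoid duplicating an existing rel attribute):

```python
# Neutralize spam links in user-submitted HTML: every anchor tag gets
# rel="nofollow", so the link passes no ranking credit to its target.
import re

def add_nofollow(html: str) -> str:
    """Insert rel="nofollow" into every anchor tag."""
    return re.sub(r"<a\s", '<a rel="nofollow" ', html, flags=re.IGNORECASE)

comment = 'Great post! <a href="https://spam.example/pills">cheap pills</a>'
print(add_nofollow(comment))
# Great post! <a rel="nofollow" href="https://spam.example/pills">cheap pills</a>
```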

Mirror sites host multiple websites with conceptually similar content under different URLs.
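
One common way to detect such near-duplicates, sketched here as an illustration rather than any engine's documented method, is to compare word-shingle sets with Jaccard similarity; the 3-word shingles and 0.8 threshold are arbitrary choices:

```python
# Near-duplicate check: pages whose word 3-gram ("shingle") sets
# overlap heavily are likely mirrors of each other.
def shingles(text: str, k: int = 3) -> set:
    """Return the set of word k-grams of a page's text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if (a or b) else 1.0

page_a = "buy cheap replica watches online with free worldwide shipping and fast delivery today"
page_b = "buy cheap replica watches online with free worldwide shipping and fast delivery now"
similarity = jaccard(shingles(page_a), shingles(page_b))
print(f"{similarity:.2f}", similarity > 0.8)  # 0.83 True
```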

Possible solutions for overcoming search-redirection poisoning that redirects users to illegal internet pharmacies include notifying the operators of vulnerable legitimate domains.

Further, manual evaluation of SERPs, previously published link-based and content-based algorithms, and tailor-made automatic detection and classification engines can all be used as benchmarks for effectively identifying pharma scam campaigns.
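
As a toy illustration of a content-based detector in this spirit (the term list, weights, and 0.5 redirect bonus are invented for this sketch; published systems use trained models on real corpora):

```python
# Crude content-based pharma-scam score in [0, 1]: combine indicator
# term hits with an off-site redirect signal.
PHARMA_TERMS = {"viagra", "cialis", "pharmacy", "pills", "no prescription"}

def pharma_spam_score(page_text: str, redirects_offsite: bool) -> float:
    """Score a page by pharma indicator terms plus a redirect signal."""
    text = page_text.lower()
    hits = sum(term in text for term in PHARMA_TERMS)
    score = hits / len(PHARMA_TERMS)
    if redirects_offsite:
        score += 0.5  # redirection to another domain is a strong signal
    return min(1.0, score)

print(pharma_spam_score("Cheap pills online, no prescription needed!", True))
```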