Search engines constantly strive to improve their
performance by providing the best possible results. While "best" is
subjective, the engines have a very good idea of the kinds of pages and
sites that satisfy their searchers. Generally, these sites have several
traits in common:
- Easy to use, navigate, and understand
- Provide direct, actionable information relevant to the query
- Professionally designed and accessible to modern browsers
- Deliver high-quality, legitimate, credible content
On Search Engine Rankings
There are a limited number of variables that search engines can take into account directly, including keywords, links, and site structure. However, through linking patterns, user engagement metrics, and machine learning, the engines draw a considerable number of inferences about a given site. Usability and user experience are "second order" influences on search engine ranking success. They provide an indirect but measurable benefit to a site's external popularity, which the engines can then interpret as a signal of higher quality. This is sometimes called the "no one likes to link to a crummy site" phenomenon.
Crafting a thoughtful, empathetic user experience can ensure that
your site is perceived positively by those who visit, encouraging
sharing, bookmarking, return visits and links - signals that trickle
down to the search engines and contribute to high rankings.
Signals of Quality Content
1. Engagement Metrics
When a search engine delivers a page of results to you, it can measure its success by observing how you engage with those results. If you click the first link, then immediately hit the "back" button to try the second link, this indicates that you were not satisfied with the first result. Since the beginning, search engines have sought the "long click" - where users click a result without immediately returning to the search page to try again. Taken in aggregate over millions and millions of queries a day, the engines build up a good pool of data to judge the quality of their results.

2. Machine Learning
In 2011 Google introduced the Panda Update to its ranking algorithm, significantly changing the way it judged websites for quality. Google started by using human evaluators to manually rate thousands of sites, searching for "low quality" content. It then incorporated machine learning to mimic those human evaluators. Once its computers could accurately predict what the humans would judge a low-quality site, the algorithm was introduced across millions of sites spanning the Internet. The end result was a seismic shift that rearranged over 20% of all of Google's search results.

3. Linking Patterns
The engines discovered early on that the link structure of the web could serve as a proxy for votes and popularity - higher quality sites and information earned more links than their less useful, lower quality peers. Today, link analysis algorithms have advanced considerably, but these principles hold true.
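The "links as votes" idea described here is the intuition behind link analysis algorithms such as PageRank. As a rough illustration (the four-page graph, page names, and damping factor below are hypothetical, not taken from the text), a minimal power-iteration sketch might look like:

```python
# Minimal PageRank-style sketch: links act as "votes" whose weight
# flows to the pages they point at. Graph and damping are illustrative.
links = {
    "home":  ["about", "blog"],
    "about": ["home"],
    "blog":  ["home", "about", "guide"],
    "guide": ["blog"],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal scores
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)  # each outlink gets an equal share
            for target in outlinks:
                new[target] += damping * share
        rank = new
    return rank

ranks = pagerank(links)
# Pages earning more (and better-sourced) links accumulate higher scores.
print(sorted(ranks.items(), key=lambda kv: -kv[1]))
```

Real link analysis is far more sophisticated (spam detection, topical weighting, anchor text), but the core principle - more links from better pages means a higher score - is the same.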
All of that positive attention and excitement around the content offered
by the new site translates into a machine-parseable (and
algorithmically valuable) collection of links. The timing, source,
anchor text, and number of links to the new site are all factored into
its potential performance (i.e., ranking) for relevant queries at the
engines.
For Search Engine Success
Developing "great content" may be the most repeated suggestion in the SEO world. Yet, despite its clichéd status, appealing, useful content is critical to search engine optimization. Every search performed at the engines comes with an intent - to find, learn, solve, buy, fix, treat, or understand. Search engines place web pages in their results in order to satisfy that intent in the best possible way, and crafting the most fulfilling, thorough content that addresses a searcher's needs provides an excellent chance to earn top rankings.

Search intent comes in a variety of flavors...
Transactional Searches
Identifying a local business, making a purchase online and completing a task.
Transactional searches don't necessarily involve a
credit card or wire transfer. Signing up for a free trial account at
Cook's Illustrated, creating a Gmail account, or finding the best local
Mexican cuisine (in Seattle it's Carta de Oaxaca) are all transactional
queries.
Navigational Searches
Visiting a pre-determined destination and sourcing the “correct” website URL.
Navigational searches are performed with the intent
of surfing directly to a specific website. In some cases, the user may
not know the exact URL, and the search engine serves as the "White
Pages", passing along the (hopefully) correct location.
Informational Searches
Researching non-transactional information, getting quick answers and ego-searching.
Informational searches involve a huge range of
queries from finding out the local weather, getting a map and
directions, to finding the name of Tony Stark's military buddy from the
Iron Man movie or checking on just how long that trip to Mars really
takes. The common thread here is that the searches are primarily
non-commercial and non-transaction-oriented in nature; the information
itself is the goal, and no interaction beyond clicking and reading is
required.
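The three intent flavors above can be sketched as a naive keyword-heuristic classifier. To be clear, this is a toy illustration of the taxonomy - the keyword lists and rules are invented assumptions, not how real engines detect intent:

```python
# Toy intent classifier for the three flavors described above.
# Keyword lists are illustrative assumptions, not real ranking signals.
TRANSACTIONAL = {"buy", "purchase", "order", "signup", "coupon", "trial"}
NAVIGATIONAL = {"login", "homepage", "official", "www"}

def classify_intent(query):
    words = query.lower().split()
    if any(w in TRANSACTIONAL for w in words):
        return "transactional"
    # Single-word brand-style queries often aim at a specific site.
    if any(w in NAVIGATIONAL for w in words) or len(words) == 1:
        return "navigational"
    return "informational"  # default: the information itself is the goal

print(classify_intent("buy hiking boots"))            # transactional
print(classify_intent("facebook login"))              # navigational
print(classify_intent("how long is a trip to mars"))  # informational
```

Real engines infer intent from far richer signals (query logs, click behavior, location), but the bucketing logic mirrors the three categories described above.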