Word up: get your head around the optimization lexicon with our brilliant SEO glossary 2017

It’s like diving into a pool of alphabet soup in a wildlife refuge. SEO has spawned a mind-bending argot full of delicious words and phrases such as ‘crawler’ and ‘link farm’. We offer you this bowl of key terminology, so try not to slurp. Here, for you, is our SEO glossary 2017.

Rather than put the terms in alphabetical order, I’ve grouped them into three subsets. This list is by no means exhaustive, and I’m likely to add more terms as they gain currency, but it’s a reasonable starting point if you’re an SEO neophyte and you don’t want to keep asking ‘What’s that?’ in marketing strategy meetings. If you reckon other terms need adding, please let us know.

1. The basics

These core terms are the ones you’re most likely to come across when you first dip your toe into the SEO universe.

What is SEO?

SEO
Search engine optimization: using creative, technical, and analytic means to raise your ranking in search engines such as Google and Bing.

SERPs
Search engine results pages: the web pages that come up when you run a search in Google or any other search engine.

B2B
While we’re on the acronyms, here’s one that means ‘business-to-business’. You use it when your marketing efforts are directed at other businesses, rather than your ordinary Joes.

B2C
You guessed it: this acronym means ‘business-to-consumer’. If your business or operation markets directly to potential customers who are members of the public, then you’re in the B2C business. Now wash your hands!

Crawler
If you think this sounds kinda creepy, well, it is. A crawler is an algorithm or computer program (also known as a ‘bot’ or ‘spider’) that automatically scans a website’s content, following the links from page to page. Google’s crawler is called ‘Googlebot’, and it’s constantly at work, scanning pages and updating Google’s servers on what it finds. A crawler doesn’t rank your pages itself, but what it reports back (your content, structure, and links) feeds the index that the ranking algorithms work from, so a clean, thoroughly crawlable site gives you a better shot at a higher rank.
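
If you like to see the idea in code, here’s a toy Python sketch. The ‘site’ below is just a made-up dictionary of pages and the pages they link to; a real crawler such as Googlebot fetches pages over HTTP, but the follow-the-links logic is the same in spirit.

from collections import deque

# A made-up, in-memory 'website': each page lists the pages it links to.
# (Purely illustrative -- a real crawler fetches and parses live pages.)
site = {
    "/": ["/about", "/blog"],
    "/about": ["/"],
    "/blog": ["/blog/seo-glossary-2017", "/"],
    "/blog/seo-glossary-2017": ["/about"],
}

def crawl(start="/"):
    """Follow links breadth-first, visiting each page exactly once."""
    seen, queue = set(), deque([start])
    while queue:
        page = queue.popleft()
        if page in seen:
            continue
        seen.add(page)
        print("crawled:", page)
        queue.extend(site.get(page, []))
    return seen

crawl()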

Spider
An alternative term for ‘crawler’.

Link
If you don’t know what a ‘link’ is, you should consider returning to the cave from which you’ve just exited, blinking your eyes against the bright light of the world. Links (HTML links, or hypertext markup language links, to give them their proper name) send you to another page of the internet. They’re often in differently colored text to ensure that even Donald Trump recognizes them. Links are one of the currencies of SEO. Everyone is telling you to ‘get backlinks from authority domains’. That means ‘blue chip’ backlinks from sites with real clout, such as a news organization or newspaper. (‘My New Zealand Sheep Blog’, no; ‘New York Times’, yes please!) See the section on ‘backlinks’ below for more information.

Authority site/authority domain
Authority sites sit at the top of the SEO pecking order. They get a lot of incoming links, or ‘backlinks’ from other trusted sites. Authority sites gain that authority because they are trusted by others. They’ll eschew fakery and provide well-sourced information that is up-to-date. An example of an authority site would be The Guardian newspaper, or Wikipedia. These sites appear at or near the top of the rankings a good deal of the time.

Blog
If you have a business and you’re not adding to the enormous word cloud that humanity has belched out since the printing press was invented, you’re not doing it right. Just imagine that everyone in the world thinks you’re as fascinating as your mum does, and start writing!
In fact, blogs can indeed be useful. Just take a look at this one. Useful and funny, in a teenaged mutant ninja turtle kinda way. Actually, www.seorw.com does exist to provide high-quality, actionable information for readers who are thinking about setting themselves up in the web marketing, site design, or search engine optimization business. We’re also all about getting your website shipshape and Bristol fashion so you can rank higher than your less-smart competitors.

White hat
The ‘good guys’ in the world of SEO. White hat operators follow the ‘rules’ (i.e., the ones set by Google). They use legitimate methods (creative, technical, analytic) to optimize your website.

These days, white hats are pretty much the only game in town, because Google and other engines have deployed a range of algorithm updates that reward white hat SEO and severely penalize black hat techniques.

Black hat

A black hat. Our SEO glossary 2017 will explain….

If your SEO provider promises the Earth, tries to sell you a gross of Viagra, and has a Russian accent, it’s just possible they’re a ‘black hat’ SEO operator. The Web has followed the trajectory of the Wild West, starting off as a lawless, freewheeling, amoral zone in which you could get away with the SEO equivalent of murder. Techniques such as ‘keyword stuffing’ could, in the Gold Rush days, get you higher rankings.

As in the movie High Noon, enter Gary Cooper (in the multi-armed form of Google’s team of algorithm code writers). One by one, they shot down Frank Miller’s gang of black hat SEO techniques, and since 2011 have been heavily penalizing sites that deploy them (sites that do can expect to be banished from the rankings, sometimes for years). These days, the Web, or at least those parts of it occupied by search engines, is better policed. To stretch that metaphor a little further, in Webworld, ladies with parasols can now stroll past saloons without fear of being accosted by drunken cowboys.

Check out High Noon at https://dvd.netflix.com/Movie/High-Noon/589258; it’s a fantastic movie. I’ll let you know when it’s available on demand to stream.

Log file analysis
Spock/Data might use this term. ‘Captain, I’m going to conduct a log file analysis on the activity at the surface.’ ‘Do it, Spocky’/’Make it so’. It sounds deliciously technical, and it is. However, it’s a useful way of checking how user-friendly your site is, to search engines as well as visitors. The log file is basically an output file from your web server (you may have your own, like Hillary Clinton did, or you may use a local or national server that is not in your basement surrounded by FBI agents but is instead located elsewhere).

Log files tell you the ‘hits’ on your pages, including times, dates, and the IP addresses of those ‘requesting’ information (i.e., opening pages or elements such as pictures) from your site.
SEO wonks and site webmasters review this information regularly because by ‘reverse-engineering’ it they can find out a range of useful information. They can see if any search engines ‘crawled’ the site, and if they did, what ‘crawl budget’ was spent on particular parts of the site, whether there are any areas of ‘crawl deficiency’, and whether there were any problems accessing any page or element on the site.

It’s all useful information that allows webmasters to tweak and improve the sites under their aegis, and also allows SEO providers to identify areas for improvement in client sites. (The object being to make sites as smooth and ‘crawlable’ as possible.)
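
To make that concrete, here’s a rough Python sketch of the sort of thing an SEO wonk might do with a log file. The two log lines are invented, and a real analysis would read thousands of lines from disk, but the idea is the same: spot the Googlebot, tally where the crawl budget went, and flag any error responses.

import re
from collections import Counter

# Two made-up lines in the 'combined' log format most web servers emit.
log_lines = [
    '66.249.66.1 - - [10/Mar/2017:10:12:01 +0000] "GET /blog/seo-glossary-2017 HTTP/1.1" 200 51234 '
    '"-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [10/Mar/2017:10:12:05 +0000] "GET /missing-page HTTP/1.1" 404 512 '
    '"-" "Mozilla/5.0 (Windows NT 10.0)"',
]

line_re = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[^"]*" (?P<status>\d{3}) .*"(?P<agent>[^"]*)"$')

googlebot_hits = Counter()
errors = Counter()
for line in log_lines:
    match = line_re.search(line)
    if not match:
        continue
    if "Googlebot" in match.group("agent"):
        googlebot_hits[match.group("path")] += 1   # where the crawl budget went
    if match.group("status").startswith(("4", "5")):
        errors[match.group("path")] += 1           # pages nobody could fetch

print("Googlebot requests by path:", dict(googlebot_hits))
print("Error responses by path:", dict(errors))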

2. Updates
You’ll already be aware that the internet is in constant flux. The very fact that nearly half the world’s population (around 3.2 billion people in 2015; lots more now for sure) can access the Web makes it a test-bed for every novel idea that a human brain can conjure up. There’s no way that market leaders can sit on their hands in this environment, and search engines such as Google, Bing, and Baidu (in China) work very hard to stay on top of their game. Their code writers are like a bunch of grease-spattered DIY motorheads in the bayou, noses under the hood 24-7, constantly tinkering and tweaking.

Following the updates is the same as the FBI following the Sopranos’ money: it’ll give you a good idea of the lay of the land.

Here, in chronological order, are Google’s most important updates to their search algorithms. You can assume that where Google leads, others follow. All the key updates, and especially Penguin, Panda, and Phantom (Google’s ‘quality update’, on which more below), have been tweaked and refreshed several times over. We’re now on Panda 4.2, Penguin 4.0, and Phantom V, for example. We’re giving you a decade’s worth:

May 2007: Universal search. Google integrates traditional results with news, video, images, and local results.

Feb 2009: Vince. Big brands get a boost in rankings. I guess someone was watching the Entourage box set at the time.

April 2010: Google Places. Closer integration of local search results.

May 2010: May Day. Google cracks down on low-quality pages in long-tail keyword searches (see entry on long-tail searches below).

June 2010: Caffeine. Just like it sounds: Google launches new, turbocharged indexing system, resulting in 50% fresher indexing.

Feb 2011: Panda. Cute and cuddly, right? Not if you were a black hat operator. This update smacked down sites with thin content, those that used content farms (see entry on content farms below), and those with high ad-to-content ratios. Offenders got the old Henry Bolingbroke treatment: banishment from the Google kingdom, often for years at a stretch.

January 2012: Top-heavy. This very user-friendly update penalized sites that stuck a slew of ads ‘above the fold’. Any that did were instantly down-ranked, and the update had a very beneficial effect on site design, forcing sites to be straight-up informative about what they were and ‘shooing’ advertising away from key home page content.

Feb 2012: Venice. A key update that generated more localized results for broad queries (and a short-lived experiment with place names for update nomenclature).

April 2012: Penguin. Google reverts to names from the animal kingdom (or is this a reference to Batman?). The Penguin update penalizes yet more black-hat techniques, combating keyword stuffing and nefarious link schemes, among other SEO dark arts.

May 2013: Phantom. Woo! Woo! Spooky. Actually, this has only been acknowledged latterly by Google, who don’t like the moniker ‘Phantom’ and instead call it a ‘quality update’. ‘Phantom’ was the name given to the update by Glenn Gabe, a blogger at www.searchengineland.com, who explains the update in his posts here: http://searchengineland.com/why-googles-quality-updates-should-be-on-your-algorithmic-radar-part-1-257389. The update significantly altered Google rankings based on an assessment of quality, and is now in its fifth generation. So Google didn’t like Gabe’s name for their update. Well, that’s their bad. One way to avoid bloggers christening your updates is to give them a real name yourself. Perhaps naming the update ‘Koala’ or ‘Gerbil’ would have avoided all the spookiness.

August 2013: In-depth articles. An update favoring long-form, ‘evergreen’ content. Evergreen content is content that stays relevant for website visitors over a lengthy period. This glossary, for example, is only partly evergreen, and we’ll keep updating it as necessary.

August 2013: Hummingbird. Back into the aviary, this time with a major update that introduced semantic searches (see entry below). Hummingbird targeted full-sentence searches and promoted high-quality content. A busy summer for Google wonks. Someone must have been cracking the whip.

July 2014: Pigeon. Still in the aviary, though whose idea it was to name an update after the sky-rats that plague our cities we’ll never know. This update instituted stronger links between Google’s ‘core’ and ‘local’ algorithms, adding a dash of Yellow Pages to the mix.

August 2014: HTTPS/SSL update. A loss of romance in Google’s typology, certainly. After all, they could have called this update ‘Cerberus’ after the guard dog at the gates of the underworld, or something equally resonant. But no. The update gave higher ranking to sites with secure (HTTPS) certification, part of a broader push toward encrypted connections across the Web aimed at tackling uncertified ‘mirror’ sites and other dastardly frauds.

April 2015: Mobile-friendly (AKA Mobilegeddon). A major nod to the huge amount of traffic now moving via mobile phones and tablets, whose configuration differs from PCs and laptops. After this update, mobile search results factored in a site’s ‘mobile-friendliness’ (are the pages set up for easy navigation on your iPhone or Samsung?)

The April 2015 update was dubbed ‘Mobilegeddon’ by Moz and others. In the end, however, it was more of a whimper than a bang. It had minimal impact on rankings. That was two years ago now. These days, anyone who doesn’t consider their site’s UX on mobile is a flat-out idiot. If your webmaster or site constructor doesn’t burble on about mobiles or tablets a flag should be well and truly raised.

Designing and optimizing for mobile is a vital part of website construction these days. If you’re in the process of having a site built by a web designer, raise the issue yourself if he or she doesn’t. And if they tell you it’s not important, show them the door (the exit, rather than the one giving onto the canteen, that is).

May 2015: Phantom 2/the Quality Update. This major content quality update was one of those algorithm tweaks by Google that did shake up a lot of SERPs. CNBC’s brief report focused on the blog-publishing site HubPages, with even the highest-regarded blogs losing traffic (in some cases by around a fifth). However, as with most under-the-hood tinkering engaged in by the Big G, when analysts took a closer look at site traffic they found that no particular sector (such as the blog sector) was penalized any more than any other. Across all classes of websites, there were winners and losers in SERPs. What this implies is either the addition of one or two new quality ranking factors, or a re-weighting of the existing (200+) data points that Google uses to assess quality. Most commentators agree that if you’re considering your site content or creating new content, Google’s original comments on its Panda update still make a pretty good rule of thumb for creating content that search engines will like.

October 2015: RankBrain. Google announced that it had deployed machine-learning technology in its PageRank toolbox. Note that this was merely an announcement: in fact, the tech had been introduced in the spring of that year, and presumably the Google Geeks had been monitoring how well it bedded in before going public with the change. RankBrain didn’t take over the search engine’s core activity, however. Like other updates covered above, it was installed as a kind of ‘plug-in’ to the core Hummingbird algorithm that since 2013 has been ranking pages by the gazillion. Note that this is ‘machine-learning’, not AI, or ‘artificial intelligence’. Machine-learning works like a complex feedback loop, not merely analyzing incoming data but using that data to adjust its own model. RankBrain sure has a lot of heavy lifting to do, and machine-learning and AI research has been a core element of the ‘all that other stuff’ department at Google’s parent company, Alphabet, for a while. The ATOS department, though, remains loss-making, and it’s YouTube ads and mobile search ads that have been generating Alphabet’s Croesus-like wealth for a while now.

It’s just possible that RankBrain could have been behind some of the metrics-malarkey that Phantom 2 generated. It may also have caused the big movements in rankings that were noted in January 2016, and which were blamed on an ‘unknown update’. However, we’ll never know. If there’s one corporate fact that’s carved in stone, it’s that Google’s ranking code is not going to become open-source any time soon.

February 2016: AdWords re-jig. Let’s face it; something was always going to happen with AdWords. In our opinion, what happened was someone with a psych degree told the AdWords department that having paid-for content down the right-hand sidebar wasn’t cutting the mustard. That sidebar was so easily ignorable, and all the ‘real’ search results were still running from number 1 on down. The new format, with four ads at the top of the SERPs, labeled with a teeny square containing the legend ‘AD’, took instant advantage of users’ propensity to simply click the top result. Bingo (or should we say ‘kerching!’): more cash for Alphabet/Google. Since then, YouTube ads and AdWords have been driving Alphabet’s income stream.

May 2016: Phantom 3. I’ve called this ‘event’ Phantom 3 because while Google never confirmed an update at this time, G-geeks noted a huge spike in activity in Hummingbird. No one knows what happened, so here’s my speculation: occasional ‘spikes’ such as these could be attributable to reconfigurations ‘developed’ by RankBrain’s machine-learning. Tech managers at Google could ‘watch’ the methodological recommendations generated by RankBrain, and then simply press ‘enter’ on a day of their choosing. This would then show up as a big expansion in algorithmic activity in Hummingbird as it engaged in resulting recalibrations. Pure speculation, we admit, but no one else has a clue what happened that May, either.

September 2016: Possum (Schrödinger’s update). Awwww. How cute. But in fact the name was coined by local search specialist Phil Rozek. Well, he ought to have known, because Possum was an update specific to local search, the biggest shake-up to local results since Pigeon in 2014. Rozek called it ‘Possum’ because it initially appeared that some location results were absent. Rozek obviously didn’t understand that ‘playing possum’ or ‘lying possum’ actually means being right there but pretending to be dead. The whole point of lying possum is that you’re visible. Doh! That’s why we’ve dubbed this update Schrödinger’s Update, because your business may not be there when you decide to look at it.

Trichosurus vulpecula, or the ‘common brushtail possum’. An ill-informed name for what we’ve called the Schrödinger Update

One thing that Schrödinger’s Update did provide a workaround for was what you might call the ‘city limits’ phenomenon that omitted business listings from maps even if they were a yard or so outside the designated city’s geographical limit. It sort of ‘fuzzified’ the boundaries, and in fact brought in a lot of businesses to Google Maps displays where they hadn’t been before. What ‘disappeared’ were multiple listings (e.g., lawyers or dentists) where several dentists with individual listings might share a practice. (Knowing this might well have an impact on your site SEO if you’re a dentist).

September 2016: Penguin comes in from the cold. This was a very major move for the Penguin spam-fighting ‘plug-in’ to Hummingbird. In fact, it was really an acknowledgement of the rising significance of local and mobile searches. What happened was that Penguin came in from the cold, windswept wastes and was fully integrated into the core Hummingbird algorithm. The integration happened in phases spread out over several subsequent months, and resulted in ‘real-time’ rather than recently cached results.

It’s possible to imagine this move having had an ‘enabling’ effect on Penguin in relation to RankBrain, facilitating the machine-learning analysis of local search results in a somehow more effective way.

One touted result was a ‘gentler, kinder’ Penguin, which devalued bad site links rather than fully penalizing them.

November 2016: Phantom 4: The Reckoning. Fall 2016 was a real welter of activity for Google, but on November 18 a whole bunch of sites saw quite large-scale impacts on their rankings. It was generally agreed that this unannounced update had a major effect on local rankings too. Mobile UX issues multiplied in the wake of Phantom 4, and any mobile site with an ‘interstitial’ or ‘pop-up’ screen that appeared first was heavily affected.

No one really knows the details of this update, but here at seorw.com we reckon that Phantom 4 might have been to mobile what Phantom 3 was to desktop: a ‘plugging in’ of some of what RankBrain had been learning following Penguin’s incorporation into Hummingbird. One notable follow-on was the fact that in January of 2017, pop-ups and interstitials started getting penalized (rather than causing misattributed URL addresses and UX issues on mobile).

February 2017: Hypermania. We’re lumping them together, but a load of updates to the core algorithm resulted in a number of effects. These seemed to include an enhancement to the Googlebot’s ability to discount links and spam (again, perhaps feedback from RankBrain). All we know for sure is that the activity was large-scale. It could have been some core algorithm tweaks in preparation for:

March 2017: Fred. Dubbed ‘Fred’ by Google Geek Gary Illyes, the name was never officially adopted, but we’re out of ‘Possum’, ‘Penguin’, and ‘Hummingbird’ territory. It’s almost as if this update really didn’t want to be noticed at the party, hence the inconspicuous moniker. At any rate, ‘Fred’ was deemed by Search Engine Land to have cleaned the clocks of sites with poor content that focused on revenue rather than UX. We still had the impression that this ‘major’ update was somehow linked to the vast amounts of high-quality, statistically robust data being generated by RankBrain.

3. Others
Here are a slew of other terms you may come across, in no particular order. Some are positively useful, others denote bad-boy techniques you’ll need to avoid at all costs.

Backlink bombing; also known as ‘Googlewashing’ and ‘spamdexing’
I just like these phrases; they show you the Shakespearean inventiveness of the SEO world. Essentially, backlink bombing is another black hat technique, this time aiming to manipulate search rankings by creating lots of backlinks to a particular site. One of the first documented instances came in 1999, when wags created lots of links between the search term ‘more evil than Satan himself’ and the Microsoft webpage. Inserting lots of links to a website you want to enhance in spammed contributions to a page’s comments section still occurs. Here’s a very informative Wikipedia entry that charts the hilarious origins of the technique: https://en.wikipedia.org/wiki/Google_bomb. It also explains ‘spamdexing’ and ‘Googlewashing’.

Backlinks/inbound links
HTML links from other sites to your site. Here’s an ‘outbound’ link from this site to the excellent Jill Whalen: http://www.highrankings.com/jill-whalen. For Jill, that’s now a ‘backlink’ or ‘inbound’ link (I’ll update this post if she ever reciprocates). I’m updating it now: she didn’t, but she sent us a lovely message that she’d retired. Essentially, the more ‘quality’ backlinks you have to your site, the higher your ranking in search engines.

Canonical problems (a.k.a. duplicate content)
Duplicate content is hard to avoid these days, especially where people are using programs such as WordPress. For some crazy reason, search engines are liable to count as ‘doubles’ material that appears on the same site with different URLs, such as ‘www.blah.com’, ‘blah.com’, and ‘www.blah.com/index.htm’.

To avoid the problems (such as downranking) that this can create, web designers should point search engines at the ‘canonical’ (or ‘original’) version of a page, by adding a rel="canonical" link tag to the duplicate versions or by marking those duplicates ‘noindex’. You can also set up 301 redirects from the duplicate URLs to the canonical one.
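
If you want to sanity-check a page yourself, here’s a minimal Python sketch using the BeautifulSoup library; the HTML snippet is invented, and in practice you’d fetch the live page rather than paste it in.

from bs4 import BeautifulSoup  # pip install beautifulsoup4

# Made-up page: www.blah.com/index.htm duplicating the content at www.blah.com/
html = """
<html><head>
  <title>Blah</title>
  <link rel="canonical" href="https://www.blah.com/">
  <meta name="robots" content="noindex, follow">
</head><body>...</body></html>
"""

soup = BeautifulSoup(html, "html.parser")
canonical = soup.find("link", attrs={"rel": "canonical"})
robots = soup.find("meta", attrs={"name": "robots"})

print("canonical URL:", canonical["href"] if canonical else "none declared")
print("robots directive:", robots["content"] if robots else "none declared")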

Doorway page
A low-quality page or site stuffed with keywords and designed only to maximize traffic. Google updates have downranked this black hat technique.

Invisible text
A technique that, frustratingly, can be black hat or white hat. Black hats can ‘hide’ keywords on pages in an attempt to manipulate a site’s search engine ranking. Text can, for example, be hidden by making it transparent, hiding it behind pictures, or setting the font size to zero. This is frowned upon by Google, so if your SEO provider brings it up, run for the hills and don’t look back.

However, invisible text can also be a white hat technique: descriptive tags for elements such as Flash files and pictures help Google by allowing it to search the textual tags relating to non-text content, making for more relevant image search results, for example.
Have a look at what Google’s webmaster guidelines have to say about it here: https://support.google.com/webmasters/answer/66353?hl=en.
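
And here’s a tiny, hypothetical Python audit of the white hat version: checking that your images carry descriptive alt text. The scrap of HTML is made up for illustration.

from bs4 import BeautifulSoup  # pip install beautifulsoup4

# Made-up snippet: one image labelled properly, one left blank.
html = """
<img src="/img/white-hat.jpg" alt="A white fedora on a desk">
<img src="/img/chart.png">
"""

soup = BeautifulSoup(html, "html.parser")
for img in soup.find_all("img"):
    alt = (img.get("alt") or "").strip()
    print(img.get("src"), "->", "OK" if alt else "MISSING alt text")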

Latent semantic indexing
Sounds like something out of psychology, right? LSI leverages research in synonymy and polysemy (the similarities and differences in meaning between words and phrases) and allows Google to interrogate its archive while making allowances for the wooliness and inaccuracy of searchers’ inputted text. LSI works on what the ‘crawler’ brings back: the Googlebot interrogates your site and pulls out the most common or important words and phrases, which are then ranked by frequency and indexed on Google’s servers. When a search matches one of those phrases (or means roughly the same thing), your site pops up in the results.
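
For the curious, here’s a toy Python illustration of the linear algebra behind the ‘latent semantic’ idea, using scikit-learn’s TF-IDF vectorizer and a truncated SVD. It is emphatically not Google’s pipeline, just a three-page demo of comparing a query and pages in a reduced ‘concept’ space.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

# A tiny, made-up 'index' of page snippets.
pages = [
    "emergency plumber for burst pipes and flooding",
    "24 hour plumbing service: leaks, pipes, sump pumps",
    "investing money in a tax-free retirement account",
]

vectorizer = TfidfVectorizer()
tfidf = vectorizer.fit_transform(pages)

# Squash the word space down to a couple of 'latent' dimensions.
lsa = TruncatedSVD(n_components=2, random_state=0)
page_vectors = lsa.fit_transform(tfidf)

# A query that shares concepts (and only one word) with the plumbing pages.
query_vec = lsa.transform(vectorizer.transform(["water everywhere, need a plumber fast"]))
for page, score in zip(pages, cosine_similarity(query_vec, page_vectors)[0]):
    print(f"{score:+.2f}  {page}")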

Link farms
Tsk, tsk. A link farm is an interrelated group or ‘clique’ of websites or webpages that each link to one another, creating a whole set of ‘backlinks’ that may not be relevant or warranted. As with other black hat techniques, link farms aim to maximize traffic or direct you to a particular page (one you don’t want to visit).

Meta description
This is a summary of a web page’s content, up to around 160 characters, used as a ‘tag’ in the page’s HTML. It won’t appear on the page itself, but if it includes your search keywords, it may be displayed as the snippet under your link in the search results. It’s important as an ‘under-the-hood’ technique in SEO. For this page, our number one keyword is ‘seo glossary 2017’.
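
Here’s a quick, hypothetical Python check that a page has a meta description at all and that it fits the roughly 160-character budget. The HTML is made up.

from bs4 import BeautifulSoup  # pip install beautifulsoup4

html = """
<head>
  <meta name="description"
        content="Our SEO glossary 2017: key terms, Google updates, and black hat tricks explained.">
</head>
"""

soup = BeautifulSoup(html, "html.parser")
tag = soup.find("meta", attrs={"name": "description"})

if tag is None:
    print("No meta description found -- write one!")
else:
    text = tag.get("content", "")
    print(f"{len(text)} characters:", text)
    if len(text) > 160:
        print("Too long: it may get truncated in the SERPs.")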

On-page SEO

On-page SEO covers all of the stuff you can fiddle around with on your website to help optimize your SERPs. There’s a stack of twiddling and tweaking you can do with on-page SEO, such as getting your keywords just right. In fact, I’m doing that right now by adding the keyword ‘seo glossary 2017’. There.

That’s not all, though. To get the most juice from the SEO orange, you need to attend to details such as heading designations.

Your on-page SEO should have one aim in mind. No, make that two aims in mind. First, you want to make your pages as spanking as they can be for your readers.

Second, you want your website to be the best ever tour guide for the bots that Google and other search engines use. Like those English Heritage volunteer guides who escort you around old priories in the U.K., and who just know everything there is to know about the place.

On-page optimization generates easy access and clear signals for bots. There’s a lot to it (such as structuring your URLs and ensuring that your images are well-labelled), but get this aspect of SEO right and you’ll see positive results in the rankings.
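
As a small example of that twiddling and tweaking, here’s a hypothetical Python check of a page’s heading structure: one h1, and no skipped heading levels. The HTML is invented and contains one deliberate slip.

from bs4 import BeautifulSoup  # pip install beautifulsoup4

html = """
<h1>SEO glossary 2017</h1>
<h2>The basics</h2>
<h4>Crawler</h4>
<h2>Updates</h2>
"""

soup = BeautifulSoup(html, "html.parser")
headings = [(int(h.name[1]), h.get_text(strip=True))
            for h in soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"])]

h1_count = sum(1 for level, _ in headings if level == 1)
print("h1 tags found:", h1_count, "(one per page is the usual advice)")

previous_level = 0
for level, text in headings:
    if previous_level and level > previous_level + 1:
        print(f"Skipped a level: h{previous_level} -> h{level} at '{text}'")
    previous_level = level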

Off-page SEO

No, you can’t shut down your computer to do this, so take the cursor off there! Off-page SEO is the work you can do to get your website noticed elsewhere. Anything you do that raises your domain authority, or creates social media ‘buzz’ that links back to your site, is covered in off-page SEO. So, guest-blogging, generating likes and links from social media sites, and public relations work, are all included in off-page SEO. If your site has content so goddamned good that others sit up, take notice of it, and link to it, you’re boosting your off-page SEO.

PageRank

The geek who started it all: Google co-founder Larry Page

Named for Google co-founder Larry Page (not after any old Web ‘page’), PageRank was the original Google algorithm used to rank websites and pages according to their relevance to searches. It’s now used as a shorthand term for where your site or page sits in search engine rankings, whether in Google or elsewhere.
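
For the terminally curious, here’s a toy Python version of the original idea: rank flows around a link graph, with a ‘damping factor’ standing in for the random surfer who occasionally jumps to a fresh page. The three-page graph is made up, and Google’s production systems are, of course, vastly more elaborate.

# A made-up link graph: which pages link to which.
links = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}

def pagerank(links, damping=0.85, iterations=50):
    """Power-iteration sketch of the classic PageRank calculation."""
    pages = list(links)
    rank = {page: 1 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1 - damping) / len(pages) for page in pages}
        for page, outgoing in links.items():
            share = rank[page] / len(outgoing) if outgoing else 0
            for target in outgoing:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

for page, score in sorted(pagerank(links).items(), key=lambda item: -item[1]):
    print(f"{page}: {score:.3f}")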

Paid links
Just what they sound like: links that are paid for. Essentially, advertisements. These are links on your page or site that are paid for (sometimes per click) and which direct visitors from your site to a third party.

Reciprocal linking
Again, just what it sounds like: your webmaster scratches my back, and mine scratches yours. Hey presto! Higher rankings.

Semantic search
Introduced with Google’s Hummingbird update in 2013, semantic search aims to parse underlying meaning, whether in a search term or on websites in general (when they’re being ‘crawled’ and indexed). Algorithms examine elements such as context to refine searches. For example, semantic search helps out if you enter ‘I need a plumber right now!’ into Google. With the local-search algorithms in the mix too, a search like this will generate more local results, helping you find someone more quickly to stop the flooding when your sump pump cashes in its chips.

Short-tail/long-tail keywords
This terminology refers to statistical analysis. I’m kidding, you can wake up now. It does, though. Short-tail keywords are usually just a single word and result in huge numbers of results. Enter the word ‘money’ into Google and you’ll see what I mean (2,800,000,000 results).
Long-tail keywords, on the other hand, can be whole sentences and prompt fewer results in Google. They’re more granular and usually more useful to you. Rather than using single-word terms such as ‘money’, long-tail keywords give Google more to go on. Try searching this: ‘investing money in a tax-free Roth IRA’ (2,360,000 results).
