Call it SEO “jargon” or “lingo”, but more often than not, industries such as ours create more acronyms than the alphabet can handle. Whether you run a small business and do your own SEO or work in an agency, once in a while, as SEOs, we come across a word, term or acronym that leaves us befuddled. Not only that, but more often than not we nod along silently when it’s mentioned, only to spend the night frantically trawling the internet for its meaning.
It’s for this reason we decided to create the most comprehensive and detailed SEO and marketing glossary there is online. Now admittedly, not all of these are strictly SEO-related terms, but they have been included because SEO overlaps with so many other disciplines, and we come across them more and more frequently.
We hope you enjoy this SEO glossary and find the information contained within helpful. If there are any terms you think we missed that should be included, please let us know.
You can use the above menu to navigate to each section of our SEO glossary, or hit CTRL+F to search for a word or term.
.htaccess is short for hypertext access and is the default name given to the Apache web server configuration file. It starts with a dot because it is a hidden file within the directory it configures. It lets webmasters control how their website is accessed and gives them the ability to customise and alter the configuration of their Apache web server to add or remove functionality. It allows for things such as redirection of URLs, hotlink protection, password protection of files and directories, denial of users by IP and many other useful abilities.
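As a brief illustration, a minimal .htaccess file might look like this (the URL and IP address are hypothetical examples):

```apache
# Permanently (301) redirect an old URL to a new one
Redirect 301 /old-page/ https://www.example.co.uk/new-page/

# Deny access from a specific IP address (Apache 2.4 syntax)
<RequireAll>
    Require all granted
    Require not ip 203.0.113.42
</RequireAll>
```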
Also referred to as split testing, this is a testing technique commonly used in online marketing. It’s a randomized experiment with two variants of the same item tested against a single variable. Versions A and B are identical except for one variation. Item A (the control, or original version) is compared to item B (the treatment, or varied version) and tested against user responses and behaviours, establishing which of the two performs better.
Above the Fold
This term refers to the top section of a website that is visible in your web browser without scrolling. This section should hold the most important information or headlines about your website.
This is used to describe a hyperlink that shows the full URL of the linking domain. These URLs contain all the information needed to locate the resource on a network or local host and how it should be accessed. It’s common for many SEOs to use relative URLs, which only show the target path of the directories and not the server. However, this should be avoided as relative URLs can be hijacked for nefarious purposes.
Example of an absolute URL
<a href="https://www.equillmedia.co.uk/resources/SEO-Glossary">Our SEO Glossary</a>
Example of a relative URL
<a href="/resources/SEO-Glossary">Our SEO Glossary</a>
This is Google’s own contextual Advertising network. Currently this is the most popular and simplest way of advertising for most users.
An affiliate is a person who is affiliated or associated with another company or brand.
This is when an affiliate of a company sells that company’s products online for a fee or a given percentage. A very popular form of generating revenue online.
The a href is the <a> tag that defines a hyperlink within a web page, linking to another page on the website or the wider web. The a stands for the anchor position of the link, and href is the hypertext reference that points to the URL the link targets.
This is a subsidiary company owned by Amazon that tracks rankings for websites online. However, their data is known to have weaknesses and flaws, as it only takes information from those using their own Alexa toolbar.
Often dubbed ominously as “The Algo” this is the piece of computer software designed and developed by Google. It is used to evaluate and determine where a website should be ranked within its search results for any given keyword or search query a user inputs.
This is a negative impact in rankings that is placed upon your website or web page by a search engine’s algorithm, such as Google Penguin. Search engines apply these types of penalty when users have been engaging in activity that breaks the webmaster guidelines set out by a search engine. The effect is more often than not an immediate drop in keyword rankings, traffic and search visibility, which can be crippling for businesses that rely on e-commerce as a way to make their money. These can also be manually applied by Google’s web spam team; see also Manual Penalty.
Search engines like Google review, revise and update their core algorithms constantly throughout the year in an ever-ongoing effort to serve the best, most relevant results. However, every once in a while, Google or another search engine will make a major change to its algorithm, and will usually make a public notification that such an update will occur or has occurred.
This is short for alternative text and relates to the alternative text you can add to your images to help screen readers describe images to those with eyesight problems. This is also a legal requirement in making websites accessible to all.
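For example, alt text is added via the alt attribute of the image tag (the filename and description here are illustrative):

```html
<img src="/images/red-running-shoes.jpg" alt="A pair of red running shoes on a white background">
```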
This is the acronym for the Accelerated Mobile Pages project, an open source initiative created to allow publishers to create and publish more mobile-friendly content. This is achieved by making web pages load almost instantly on mobile devices.
These are programmes, or sets of programmes, used to analyse and measure data about websites. The most popular of course being Google Analytics.
This is the text used when creating your hyperlinks. The more inbound links that share the same anchor text, the more likely you are to rank for that keyword or phrase. However, over-optimisation needs to be avoided or you could face a penalty.
Here’s how it looks.
<a href="https://www.equillmedia.co.uk">This is your anchor text</a>
Anchor Text Ratio
This refers to your anchor text and the ratio of variations of your keywords. It is strongly advised that you keep your ratios of keywords equally mixed to avoid a penalty for over optimisation.
This is short for application programming interface. It is used to communicate and specify instructions on how certain types of software should interact with each other.
This is the application programming interface key given to an authorised user of the API and is primarily used to authenticate and validate their permission to use the API.
This is when articles are created with the purpose of posting on 3rd party article sites such as Ezine Articles, HubPages and Squidoo, often with links back to your website. These types of websites were penalised and deemed low quality, resulting in low-quality links.
This refers to someone who tries to push an agenda whilst posing as an impartial party. An example would be joining a forum or Facebook group with the intention of recruiting people from there, or, nowadays, more likely joining SEO groups on Facebook with the intention of poaching clients.
This is the amount of trust you are given by the search engines for certain keyword searches or queries.
These are simply a reference to backlinks gained to your website from a high authority website. Sites that would fit these criteria would be the NY Times, the BBC or Forbes, or any other authority on a given subject matter.
An example of an authority website is one where it covers a subject in depth and in concise detail, it will also have many backlinks from other topically relevant websites. This often leads to authority websites ranking higher in the results.
This refers to the state or fact that a person has created a book, article, blog post or any other written or creative work.
Simply means business to business and refers to what type of customer a given business serves.
This is the opposite of B2B and refers to business to consumer, meaning the end user, usually the public.
This is a link to a web page or website from another. It is often seen as a vote of confidence or trust by Google and other search engines as to a website’s relevance; essentially, the more votes you have, the higher your site will rank.
This is the process of reviewing your own and your competitors’ backlinks to analyse and assess their quantity and quality, and how you can improve on them.
This is a piece of software either desktop or browser based that determines the overall number of links pointing back to your website. The number and quality play a major role in your ranking positions.
Not just amazing on sandwiches! This refers to the emails people legitimately want and subscribe to but don’t have the time to open and read straight away, often obscuring email statistics.
First popularized by Will Scott, this simply describes the process of attaching your website to existing, already high-ranking websites and then promoting them to dominate the search engine results. Also known as Parasite SEO in the Black Hat world.
Below the Fold
This is the opposite of above the fold and is when the most important content is often below the top section of the web page, so scrolling is needed to bring it into view.
This refers to those who perform SEO aggressively and using tactics that violate and go against Google’s search guidelines.
These are usually a collection of services, products or websites that are deemed unsuitable and therefore excluded by some services. Most commonly blocked are things such as email addresses, IP addresses and websites.
This is a simple combination of the words web and log, giving us a web log, or blog for short. This log acts just like a captain’s log, keeping a chronological record of all the articles and content you post on your website.
Blog Comment Spamming
This is the process of constantly commenting on other websites and blogs in order to gain a backlink to your own website. This is a black hat tactic and can be done manually or using software.
A blogger is someone who owns a blog and has a large, dedicated following on both their blog and social media. More often than not they are also able to make a living off their blog.
Bot, Robot, Crawler
This is a programme that search engines use to automatically find new websites and web pages and add them to their index. These can also be used by Black Hat SEOs and spammers to scrape content for their websites.
Refers to the percentage of people who followed a link to your website or web page and then exited without any further interaction with your website. This can often indicate the performance of your website and its relevance in the search results it shows up for.
These are the keywords or phrases that make use of your brand or company’s name within them. For instance, “Amazon web services” would be classed as a branded keyword because, although it mentions web services, it also mentions the actual company or brand name, Amazon, in the search query.
These are backlinks that use anchor text reflecting your brand or company name rather than the keyword you wish to rank for. The main reason for using them is to build up your brand’s visibility online, but they are also useful for avoiding penalties from algorithms such as Google’s Penguin, which looks to penalise those manipulating backlink anchor text ratios.
The reason for this was that many SEOs were using exact or partial match domains containing their keyword and would then also create backlinks using the same or very similar keywords in the anchor text, which was just asking to be penalised by the search engines for over optimisation.
So rather than creating a backlink using “best men’s watches” as the anchor text, you would instead use “My Company or Brand Name”, “My Company or Brand Name.co.uk” or even “My Website Name”, helping to avoid such penalties by diversifying your anchor text ratio.
Example of a branded anchor text link.
<a href="https://www.equillmedia.co.uk/">Equill Media Ltd</a>
This term is used to describe any mention or reference of your company or brand anywhere in the online world. Often used as an outreach tactic, SEOs will utilise these opportunities by asking the website where the mention appears to update it with a link back to the brand’s website (also known as link reclamation).
A form of navigation that helps show a user where they currently are in your website’s page hierarchy. Breadcrumbs are often listed as links at the top of the pages you visit and help users navigate more efficiently.
An example of our breadcrumbs would look like this www.equillmedia.co.uk/resources/this page/
This is when someone has created an a href link to your website but has misspelled the URL for the html reference. Clicking on the link will result in the user receiving a 404 error.
Broken Link Building
This is a White Hat SEO technique employed to take advantage of another website’s broken links by offering a suitable replacement from your own website.
A cache can be either software or hardware used to store data about your website, allowing your content to be served faster to the end user and reducing loading times. There are many types of caches; the most popular are browser caches.
Call to Action (CTA)
A call to action is a message designed to influence the user to perform a certain action on your website such as a purchase or form sign up. They can be made from text, images or a combination of both.
When using a content management system, content can often be displayed via different URLs, resulting in duplicate content across your website, which can have a detrimental effect on your SEO efforts.
Setting a canonical URL tells the search engine which version you wish to be displayed and given prominence over the others in the search results. These issues can also be addressed by using a noindex tag on non-canonical copies of the page, 301 redirects, or a robots.txt disallow directive.
And here is what the canonical link looks like.
<link rel="canonical" href="https://www.equillmedia.co.uk/canonical-url-you-want-indexed/">
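For comparison, the noindex alternative mentioned above is a meta tag placed in the head of each non-canonical copy of the page:

```html
<meta name="robots" content="noindex">
```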
This is the acronym for, wait for it, “Completely Automated Public Turing test to tell Computers and Humans Apart”. Phew, what a mouthful, eh? This is a challenge–response test designed to determine whether a user is indeed human or in fact a computer.
The very earliest forms of CAPTCHA made use of text that was illegible to computers and required users to type this text into a field to confirm they were indeed human. Over the years, as computers became better at solving CAPTCHAs, more complicated forms were introduced. These made use of images, or of areas around points having to be drawn by the user, in order to confirm they were human.
Used to describe a catch-all email account. The “catch all” refers to a mailbox that will catch any and all emails addressed to a domain, even if the specific address doesn’t exist on the mail server. They are particularly useful as they can be used to avoid losing any emails because of a misspelling.
They have also recently been used by automated link building tools such as GSA and RankerX to sign up to web 2.0 websites using fake email aliases, avoiding the detection that comes with using more public email services such as Hotmail or Gmail.
Often used alongside NAP, a citation is any reference to your website as a business entity from another website online. These are more often than not directory websites like Yell, Yelp, Foursquare and others.
This is the illegal practice of fraudulently trying to increase revenue from paid advertising networks. Often individuals will use software or people to repeatedly click on pay per click advertisements to increase the returns.
Click Through Rate (CTR)
This is the ratio of users who clicked a link or advertisement to the total number of users who saw it, expressed as a percentage. For example, if your link was shown 1,000 times and 10 users clicked it, your CTR would be 10 ÷ 1,000 = 1%.
Another Black Hat tactic likely to get you penalised, and similar to creating doorway pages, this is when the content served to bots or crawlers is different from that shown to users when they click and visit the website.
Similar to Cloaking this tactic is when the content of a website is switched over once a website has reached the desired rankings. This is a black hat tactic and most certainly will lead to being penalised by the search engines.
Another spammy, frowned-upon tactic, comment spamming is when an individual, either manually or using software, constantly comments on other blogs to place links back to their own website.
This is the process of identifying and evaluating who your competitors are and what strategies and tactics they are using, their strengths and weaknesses in relation to your own products or service offerings.
This is used to reduce the file size of data, usually on web servers. With a reduced file size, data can be loaded much faster, which is especially valuable for mobile users. Minification removes any unnecessary spaces from your code and puts it all on one line so it can be read faster. Load times are also a ranking factor, as mobile device usage has increased year on year, along with the introduction of accelerated mobile pages.
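As a simple sketch, here is the same CSS rule before and after minification (the class name is illustrative):

```css
/* Before minification */
.site-header {
    color: #333333;
    margin-top: 10px;
}

/* After minification: comments and whitespace removed, values shortened */
.site-header{color:#333;margin-top:10px}
```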
This is the part of a web page usually in the form of text and images that is intended to serve value and be of interest to the user. This doesn’t include parts such as boilerplate repetition, navigation bars and advertisements.
This is the art of promoting your website or blog’s content in order to help it reach a wider audience, often using social media and outreach to other bloggers in similar fields. Content marketing has steadily grown into its own specialisation of marketing.
Content Delivery Network (CDN)
A content delivery network is a series of web servers throughout the world, all connected to each other. They store the data and information of your website as a cache on the servers, allowing your content to be served much faster to users from the nearest CDN server.
This is when desirable content that a user wants is hidden behind content locks requiring the user to perform certain actions such as filling out a form or signing up with their email before the content is then accessible.
Content Management System (CMS)
A content management system is a piece of web-based software that allows webmasters and website owners to edit content on their website without requiring any coding skills. Some of the most common content management systems include WordPress, Joomla and Drupal.
Content scraping is the process of collecting large amounts of content from across the web. This is a black hat tactic used to gather content quickly and use it to rank the scraper’s own websites, in the hope they outrank you with it, causing you duplication problems.
This is when a webmaster fills a website with low quality content in order to game a search engine’s ranking algorithm; more often than not it is poorly spun, scraped content from other websites.
This is a form of targeted advertising and means that the advertisement has been placed on a relevant web page and is related to the content it is placed with.
This is a hyperlink that has been created using anchor text that is relevant to the content it is placed within. These are one of the most often sought out forms of links for SEO.
A conversion is a goal, usually achieved by getting someone to perform a desired action or respond to a call to action. This could be getting someone to make a purchase or getting them to sign up to your email list or newsletter.
This is the percentage of users who performed your desired action or goal, such as purchasing a product or filling out a form. Also see conversion.
Conversion Rate Optimisation (CRO)
This is the practice or act of optimising a website or web page to help increase its user to customer conversion rate or any other desired user action. This is done by testing alternative versions of page layout, content, text or images using split testing like A/B testing. By performing conversion rate optimisation companies can significantly increase their online leads, sales and profitability. Also, similarly to content marketing conversion rate optimisation has also grown into its own specialist field.
Also commonly called sales or e-commerce funnels, a conversion funnel is an e-commerce marketing phrase used to describe the series of steps, or “journey”, a consumer takes through online advertising. It usually starts with clicking a link or advertisement, continues with being directed to an e-commerce website, and ends with finally making a purchase.
Cookie / Cookies
And we don’t mean the sweet kind either. These are files that store specific information about a user’s browsing activity, including the websites they visit and any affiliate links they may click.
This law has caused much controversy in the SEO community and many still haven’t implemented it in the hope it is eventually abolished. However, at the time of writing it is still a legal requirement and one you should follow, as failure to do so could result in a €50,000 fine.
This is the assignable legal right given to the creator of a work, for a number of years, to print, publish or re-create it. Because of this it is extremely important that you seek out the original creator or publisher of a work to be sure you have permission to use it. Failing to do so could result in legal action being taken against you.
This is when someone has essentially copied someone else’s work without prior authorisation from the creator; this is copyright infringement. It is becoming a problem across the web as more people disregard the law at their own peril. This is the legal term given for such a violation and it can result in legal action being taken against you.
Cost Per Acquisition (CPA)
This can mean either cost per acquisition or cost per action and refers to the cost advertisers pay publishers for each lead gained or action taken.
Cost Per Click (CPC)
Short for cost per click, this refers to the price you will have to pay for each click in one of your pay per click campaigns. This is an important metric when it comes to paid search, as it helps determine how much is being spent on advertising campaigns and helps determine a campaign’s return on investment and overall success.
Cost Per Mille (CPM)
This is the abbreviation for cost per mille. Mille (M) is the Latin word for a thousand, so this is also known as cost per thousand. It means how much it will cost you to have your banner advert, or other form of ad, served one thousand times on a given web page. For example, at a £5 CPM, 10,000 impressions would cost £50.
Country Code Top Level Domain (CC TLD)
Country code top level domains are top level domains reserved for countries. They can be found after the dot at the end of a domain and are used to help search engines understand the geographical location of the website itself. This is important because they are taken into account for local SEO when ranking, and a country specific domain will (all things being equal) outrank a more generic one, such as a .com or .net, on its own country’s version of a search engine. Here are three examples of ccTLDs you will see around the world wide web today.
www.example.co.uk The ccTLD represents the United Kingdom.
www.example.de This ccTLD represents Germany.
www.example.fr And this ccTLD is for France.
This is the total number of pages that a search engine will crawl on any given website. Not to be mistaken with crawl depth, below.
This is the extent to which a search engine will crawl and index the web pages on any given website at a time.
Called so because they crawl the world wide web a crawler also known as a spider or bot is an automatic programme that systematically trawls the world wide web via hyperlinks to read and understand a website’s content to be indexed for the search results. The most important and famous of course being the Googlebot.
This is the process of collecting and gathering large amounts of information around a certain topic and then organizing and presenting that content in a meaningful and informative way. Those who take part in this process and contribute content are called “curators” and will gather, arrange and sort content out based on the topic chosen.
This is a web server exclusively used or owned by only one user. Other types of server include virtual private servers (VPS), managed hosting and shared hosting.
A deep link, or deep linking, is when you create hyperlinks to the inner pages of your website rather than just linking to your homepage. The depth of your linking can be determined by the number of slashes in the URL path.
Deep Link Ratio
This is simply the percentage of hyperlinks that you have pointing to your deeper pages that are not your homepage. This is advisable if you want to rank those pages above your homepage for certain keywords or phrases.
Used to describe a website that has been removed from a search engine’s index. This can occur by accidentally setting the wrong robots.txt directive, which can prevent search engines from crawling and indexing your website. However, it can also be a more sinister sign that the search engine itself has removed your website for any number of reasons relating to breaching its search guidelines.
Some of the following can result in being de-indexed:
- Keyword Stuffing
- Spammy Structured Markup
- Thin or Poor-quality Content
- Spammy Backlinks
A directory is a website comprising directory pages used to list information about various businesses or organisations, either alphabetically or thematically.
The directory page is the webpage on the directory website where the information about a business or organisation is listed. It usually contains a small summary of what the company does, and its contact details including phone number, email and web address.
Disavow or Link Disavowing
Disavowing is letting a search engine know that you deny any responsibility for a hyperlink or set of hyperlinks pointing to your website. This allows a webmaster to let Google know they do not want those links included in the ranking algorithm, which is especially useful for low quality, spammy or suspicious links in your backlink profile. This is done by listing all the offending URLs in a text file and uploading it to Google via their disavow tool.
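The disavow file itself is a plain text file, with one URL or domain per line and comments marked by a hash (the domains below are hypothetical examples):

```text
# Individual spammy pages
http://spam-site-example.com/bad-page.html
# Disavow an entire domain
domain:another-spam-example.com
```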
This is short for Domain Name System and is the naming system used for computers; it is primarily used to connect a domain name with an IP address, which represents the server the website is running on. This is a core part of how the internet works as a whole. The internet is made up of two main systems: one is the domain name hierarchy, controlled by the Domain Name System (DNS), and the second is the Internet Protocol addresses, also known as IPs.
Do Follow / No Follow
All hyperlinks are do follow by default unless set otherwise in your CMS or web server. Do follow links pass along what’s known as “link juice” (think of this as virtual word of mouth). This means that search engine bots can crawl those links and use them in their ranking factors for that page. A no follow link has an HTML attribute added to the rel part of the hyperlink, essentially telling the search engine bot that it should not use the link in its ranking factors nor pass along any link juice.
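Here’s how the two look in HTML (the URL is illustrative):

```html
<!-- A do follow link (the default) -->
<a href="https://www.example.co.uk/">Example link</a>

<!-- A no follow link: rel="nofollow" tells bots not to pass link juice -->
<a href="https://www.example.co.uk/" rel="nofollow">Example link</a>
```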
A domain name is the name of the website that you type into the browser address bar to go to a specific website. Domain names always end with an extension such as .com, .co.uk, .net or .org. All websites use IP addresses, for example 22.214.171.124, which can be difficult to remember, so domain names were created to overcome this problem. The domain name points to the IP address the website is hosted on.
Domain Name Age
This refers to when the domain name was first registered with ICANN and is measured in years. A domain’s age is considered an important ranking factor for SEO. This information can be found using a Whois lookup.
Doorway / Doorway Page
Doorway pages are similar to cloaking; they are pages specifically created to attract traffic and clicks, then redirect the user (not bots, which is why the problem isn’t spotted by them) to another website, usually employing cloaking.
Drip Feed/ Drip Feeding
Drip feeding is the process of creating a specific number of backlinks, or indexing links using an indexing service, over a specific time period. This is done to help make their acquisition look more natural and also to help maintain link velocity.
One of the most misunderstood and biggest problems on the web today is duplicate content. Duplicate content is any content which is similar or identical to other pieces on the web, repeated across more than one page or website. Duplicate content can also be found on your own site, usually in the form of design layouts and boilerplate text repeated across pages.
Duplicate content doesn’t have to be exact, word for word. Also, contrary to popular belief, you do not receive a Google penalty for duplicate content; rather, Google trusts your website a lot less and you will receive less link juice, which can result in lower rankings for your website and pages, creating the illusion of a penalty being placed on the website.
Dwell time is a term used for the amount of time a user spends on any given webpage. This is considered an important ranking factor and part of SEO as a higher dwell time signifies the content was relevant, informative or useful to the user.
This refers to content that changes depending on the user’s browser behaviour or interactions. For instance, static content on a page will never change and will always serve the same content; dynamic content, however, is generated on the fly. A great example of a website serving dynamic content is Facebook, where each and every user sees different content depending on their preferences, likes and interests.
This refers to the practice of serving different content to different users based on their location or device. It is used to help serve a more mobile-optimised version of content to those using mobile devices, and likewise for desktop. Another, more favoured method is using responsive web design.
Dynamic URLs are the result of user searches on database driven websites. This is caused by websites that use categories and tags for their blog, or by e-commerce websites where a product may be listed in multiple categories; the result is that the product can be found via multiple different URLs from different searches, with duplicate content issues being the end result.
This acronym stands for Expertise, Authoritativeness and Trustworthiness, and these are used as signals by Google to determine your website or web page’s quality in relation to how well it will rank. This is based on the following fundamental principles.
Expertise – If your site proclaims to be an expert on a certain subject or field, then it’s vital you have writing and content that reflects your expertise in that subject, regardless of whether it is a website about mechanical engineering, fashion or entertainment.
Any sort of advice you give on your website, be it an FAQ page, tutorial or how-to, should be written and explained by experts in those fields so that the source of information is correct and can be trusted. For instance, medical advice needs to come from those actually qualified to give it.
Authority – This represents a website’s authority on a subject and goes hand in hand with expertise. As mentioned earlier, producing great, relevant, expert content will over time help to establish your website as an authority in its field. You can also help build a website’s authority on-site by displaying who the website is owned and operated by, who the content is produced by, how good and trustworthy the content itself is, and other factors such as about pages displaying company addresses, relevant credentials or experience.
Trustworthiness – For this you need to show users that your website or webpage a user is on can be trust. This is particularly important for E-commerce websites where you are asking for personal payment details when they make purchases, they need to know their information can be trusted with you and safe guarded. This can also be achieved with other factors as mentioned above but also by ensuring you take security seriously installing SSL certificates etc.
This describes websites that are solely devoted to selling retail products online to consumers, and they can be aimed at businesses or the general public.
Editorial links are those that are gained naturally, without coercion, from the editor of another website. The most popular types of these links are those from news outlets such as the Huffington Post and Forbes, down to smaller, more niche websites.
Encrypted searches are those done over a secure connection, which Google then encrypts. The problem for SEOs with encrypted searches is that, although in Google Analytics you'll be able to see where the user came from, you won't be able to see what their search query or keyword was, resulting in more "(not provided)" data.
Also known as White Hat SEO, this refers to doing search engine optimisation according to Google's guidelines. This means doing things correctly without employing Black Hat SEO tactics.
Exact Match Domains (EMD)
An exact match domain is one that makes use of an exact keyword or search phrase instead of a brand name; more often than not they will contain a product or location too. Prime examples would be runningshoes.com or londonflorists.co.uk. Exact match domains used to rank well, and despite Google's efforts to stop that in 2012, many still continue to rank for their keyword.
Exit pages are those web pages where the user exits your website. Having this type of information is particularly useful, as it can help you optimise those pages to increase your retention rates.
Also called backlinks, an external link is any hyperlink placed on your website that points out to another website. The same is true when another website links out to your website in the same manner.
A feed is when information such as news articles or blog posts is delivered to users via programmes, websites and applications such as news or content aggregators. An example of this would be a site like Paper.li.
This is short for Free For All and describes a website that literally allows anybody and everybody to leave a backlink on it. Although it may seem wise to include a link to your website on such a page, it can be detrimental: these pages offer no value to real users, pass on very little link juice and are often filled with low-quality, spammy websites. You tend to see a lot of these as guest books where everyone just spams a link.
File Transfer Protocol (FTP)
The file transfer protocol is used when you transfer files between your web server and your computer. Once a user has created an account, they can then use free programmes such as FileZilla to connect to the server and upload or download files.
Frame or Iframe
A frame is when content is displayed within a set frame size, hence its name. This content can be separate from the rest of the page and is often a page displayed from another website. The particular problem with frames is that, because the content is read as separate files, it can sometimes cause indexation problems, as the content is broken up into separate files when indexed.
Refers to continually adding new, fresh content that engages users, either in the form of new articles and blog posts or by updating existing ones. Google likes fresh content being published on websites, and by regularly publishing new content you encourage Google to re-crawl your website, which can improve your rankings once the new content is indexed.
This is another term used to describe a Doorway page.
We love this one and first heard it mentioned by Dawn Anderson of Move It Marketing at Brighton SEO. Generational cruft refers to inherited, badly designed and over-complicated code and software. This rings true in the sense that SEOs often have to work with poorly coded websites that may be using outdated technology and load slowly as a result, whether due to poor web design or simply poorly set-up redirects from a previous SEO provider.
Google AdSense is a programme owned by Google that allows publishers (or website owners) who have signed up to the Google network of content partners to display adverts on their website in return for revenue. The adverts themselves are administered and sorted by Google. Publishers displaying the adverts on their websites are then paid on a CPC or CPM basis.
This is Google's own advertising service that allows businesses to display paid advertisements on Google and across its advertising network. It enables businesses to create adverts based on their chosen keywords, helping to place an ad in the most relevant search results. Businesses can also set maximum budgets and only pay when a user clicks on an advert.
Google Analytics (GA)
Probably the most widely used analytics package, this is Google's own free analytics software that allows webmasters to collect information and data about their website's users. The information collected includes things like traffic volumes, traffic sources, conversion rates, devices used, bounce rates and a lot more. This data is vital for SEOs and can provide extremely useful insights.
Google bombing is the practice of influencing the search engines to show results totally unexpected to the user. This is achieved by creating numerous backlinks to the intended site using anchor text the user is most likely to use when searching. It's a form of online pranking mostly used for laughs; however, it can also be used to influence and share social or political ideologies.
The most famous Google bomb was when users searched "more evil than Satan" and the search returned Microsoft; another well-known example was when users searched for "miserable failure" and George Bush's biography page was the returned result.
Google Bot / Google Spider
This is Google’s own preferred web crawler, which crawls the web indexing billions of different websites and their webpages for inclusion in its search results.
This actually refers to spamming the internet with links to a website in another person's name, usually with malicious intent, in order to lower the target website's rankings or to damage its online reputation. Google bowling goes hand in hand with negative SEO.
This is the shifting of website positions and rankings within the SERPs. It is often caused by updates or changes to the search algorithms, as a result of which rankings fluctuate over a period of time. This phenomenon is often the bane of many SEO consultants the world over.
Google Disavow Tool
Quite simply, this is the tool used to upload your disavow file of links you want Google to ignore when calculating your website's rankings. The list should include all of the low-quality and spammy websites you do not wish to be associated with; this can help you avoid any potential future Google penalties.
Introduced in 2013, this was a new algorithm Google implemented in order to improve its search results based on understanding "conversational search", allowing it to better understand the context in which a search keyword or phrase is being used rather than just picking out matching keywords. The concept is also known as semantic search. This is done by understanding synonyms and the meaning behind how words are used in conversation in order to better index and improve its results.
Google Keyword Planner
This useful free tool by Google, although basic, allows you to search for specific keywords to find the best and most suitable ones to use in your pay-per-click campaigns. Although initially designed for PPC, it can also be very useful for SEO, as the keywords targeted in PPC are usually those you would also want to target with SEO over a longer time period, which in turn helps reduce your ad spend over time.
Google Manual Penalty
When a webmaster or SEO (accidentally or not) breaks Google's guidelines, Google may lower the site's rankings for certain keywords and search queries as punishment for disobeying its rules; this is a Google penalty. Penalties can be triggered both automatically by the algorithm and manually by a member of Google's web spam team. Google can also remove a website from its index entirely if the offence is severe enough to warrant it, in which case you will need to seek out a professional Google penalty recovery service.
This is a web-based Google service which provides users with detailed information about geographical regions, countries or famous landmarks. It also plays an important role in local SEO and localised searches, due to the fact that its information and maps are often shown in the search results for various keywords or phrases.
Google My Business
This is a Google-provided platform that combines a number of Google's services under the control of one dashboard. One of its major uses is the ability to add, edit and manage entries listed on the Google Maps platform itself, which plays a pivotal role in local SEO, as these listings are more often than not also displayed in the search results.
This was a series of updates to Google's existing algorithm designed to evaluate a website's content quality. It helped combat what Google deemed to be websites offering low-quality or low-value content, very little content (also known as "thin content") and ad-heavy websites. It was named after Navneet Panda, the engineer who designed the technology behind the algorithm.
This was another series of updates to Google's main algorithm, designed to evaluate whether a website has been employing unethical or unnatural backlinks. This includes things such as low-quality links, spammy links, over-optimised anchor text ratios, etc.
This refers to an update which Google introduced into its local ranking algorithm. Launched in 2014, it was designed to serve more useful and relevant search results to users in local areas, who may be searching on mobile hand-held devices such as smartphones and tablets.
Google Search Console (GSC)
Formerly known as Google Webmaster Tools, Search Console is another free programme offered by Google that allows webmasters to check their indexing and optimise their website's visibility.
These are the large links displayed underneath a website's domain in the search results when a user searches using a branded keyword. They are designed to help users navigate to the information they want much more easily and quickly. Sitelinks are only shown automatically by the algorithm, and only for websites where Google thinks they will be useful.
If a website's internal linking doesn't allow Google to find good sitelinks, or they are irrelevant to a user's query, Google will not show them. Although there is no sure-fire way to acquire sitelinks, Google states that good internal linking practice, making use of informative anchor and alt text, and implementing schema markup can all help increase your website's chance of being shown with sitelinks.
Google Universal Search
This is a type of search where Google blends the results of a search using multiple verticals, such as Google News, Google Images or Google Scholar, and can also include video. This form of search was introduced in 2007 and has been further refined since.
Grey Hat SEO
This refers to those who use a combination of both Black Hat and White Hat SEO tactics. Black Hat SEO violates Google's guidelines and may lead to a penalty or deindexation; White Hat SEO is the alternative of following Google's guidelines and best practices. Grey Hat SEO thus lies in between the two.
This is a strategy employed by bloggers and SEOs as a means of building brand awareness and increasing traffic to your website by utilising the audience of the website receiving your guest post. The problem that arose out of guest blogging was that, because posts include an author biography with links back to your website, many people began abusing the method solely with the intention of receiving a backlink in order to game the search results.
Heading Tags (H1, H2, H3)
Headings are essentially HTML tags that are used to style headlines within a web page. They are also significant because they contribute to your website's rankings by reinforcing to search engines the topic of a web page, and they are typically similar to the page's title tag. There should only ever be one H1 tag used on a page; however, you can use multiple H2 tags, with H3 subheadings placed beneath those.
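As a minimal sketch of that structure (the headings themselves are hypothetical), a well-organised page might look like this:

```html
<h1>Men's Running Shoes</h1>    <!-- one H1 only, matching the page topic -->
<h2>Road Running Shoes</h2>     <!-- section heading -->
<h3>Cushioned Trainers</h3>     <!-- subheading within that section -->
<h2>Trail Running Shoes</h2>
<h3>Waterproof Trainers</h3>
```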
Hidden content is an old black hat SEO method where the coded HTML content of a page differs from that visible to the user and is intended for search engines only, in order to influence their rankings. This was commonly done by placing keyword-stuffed text onto a page in the same colour as the page's background, so it was visible only to search engine crawlers and not to the human eye.
Host / Hosting
Host or hosting refers to the companies that provide web hosting services. These usually run large data servers that are permanently switched on, so websites can be accessed 24 hours a day, 7 days a week.
This is short for hypertext reference and is an attribute of the HTML anchor tag, which is used to create links on a web page. An anchor tag contains two parts: the href value, which is the URL the link points to, and the clickable text, which is called the anchor text.
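For example, in this hypothetical link the href value is the destination URL and "Read our SEO guide" is the anchor text:

```html
<a href="https://www.example.com/seo-guide">Read our SEO guide</a>
```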
The hreflang is another HTML attribute; it allows website owners to specify the language in which a web page is written, and also lets them indicate to search engines any geographical targeting that applies to the page. This is especially important if your website's content is available in other countries and languages, as you can mark up each page to correspond with each country's language variant.
The link below would specify to Google that our website is meant for those who speak and read English in Great Britain, as opposed to other countries where English is spoken as a first language.
<link rel="alternate" href="https://www.equillmedia.co.uk/en" hreflang="en-gb" />
The second link shows how we would mark up a page if it was being served in German.
<link rel="alternate" href="https://www.equillmedia.co.uk/en" hreflang="de" />
This is short for HTTP Strict Transport Security and is a web security mechanism developed to help prevent websites being subjected to cyber attacks such as protocol downgrade attacks and cookie hijacking attempts. It tells the web browser that it must connect to the website using only HTTPS and never plain HTTP.
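The policy itself is delivered as an HTTP response header; a typical configuration (the one-year max-age value here is just a common choice, not a requirement) looks like:

```
Strict-Transport-Security: max-age=31536000; includeSubDomains
```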
A HTML sitemap is a specific page on the front end of a website that displays a hierarchical list of the website's pages. These are used so that a website's users can easily navigate throughout the website, which is particularly useful if users are struggling to find information using the in-site search function or via the site navigation. Sitemaps can also be created in XML format to aid search engines crawling your website in the same way.
HTTP (Hypertext Transfer Protocol)
This is the underlying protocol used by the entire world wide web. It controls how hypertext information is transferred and received across the internet. It also tells web browsers and servers how to behave in response to different commands.
HTTP Status Code
These are server response codes that indicate whether a server request was successful or not. Depending on the response to the request, an HTTP status code will be returned. The codes are grouped into classes: 1xx informational, 2xx success, 3xx redirects, 4xx client errors and 5xx server errors. There are many individual codes within each range; however, these 5 classes cover them all. You can find a full list of the most commonly encountered HTTP status codes online.
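Because the class is simply the first digit of the code, mapping any status code to its class is trivial. A minimal sketch in Python (the `status_class` helper is hypothetical, not part of any library):

```python
def status_class(code):
    """Return the class name for an HTTP status code based on its first digit."""
    classes = {
        1: "informational",
        2: "success",
        3: "redirection",
        4: "client error",
        5: "server error",
    }
    return classes.get(code // 100, "unknown")

# A few familiar codes and their classes
print(status_class(200))  # success
print(status_class(301))  # redirection
print(status_class(404))  # client error
print(status_class(503))  # server error
```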
This stands for Hypertext Transfer Protocol Secure and is the same as HTTP except that it makes use of a layer of security known as Secure Sockets Layer, or SSL for short. This means that all information transmitted between a web browser and the web server is encrypted and secure from hackers and anyone trying to intercept the data that is being sent.
Hub / Web Hub
These are industry-expert websites that provide in-depth knowledge within their given industries, provide meeting places for buyers and sellers, and allow members to exchange ideas, concepts and industry-related news.
There are two processes for optimising images. The first is to provide a suitably relevant title, alt text and URL location for the uploaded image, using the keywords the page the image sits on is targeting. This helps Google better understand what your images are about and also helps them show in Google's search results. The second method is just as important as the first and involves reducing the image's overall file size.
This is important because website loading times are a ranking factor, and un-optimised images are usually the main culprit. Reducing file size can easily be done using free image compression websites. Another thing to note is that it is also wise to use images at the size you wish them to be displayed at, rather than using CSS or HTML to resize them, as this adds extra unnecessary code to your website, which is counterproductive because it increases load times.
Image sitemaps, similar to HTML sitemaps, list all of your website's images in a logical order. This is done so that Google can crawl these pages and read the metadata present within them, allowing it to better understand what your images are about.
An impression refers to the event of a user viewing your web page or online advertisement one time, without clicking. So, if 1,000 different people each saw your page once, that would be 1,000 impressions.
Also called an inlink or incoming link, this refers to an incoming hyperlink to a web page that has been placed on another website. It is seen as a sign of trust, or vote of confidence, by Google and other search engines and is a main ranking factor in their algorithms.
Incognito Tabs / Private Windows in Browsers
These are special browsing tabs that can be opened within a web browser and that do not store any cookies or information. As such, when performing searches using an incognito or private tab, any previous searches or websites you have visited won't affect your search results; however, location can still influence them, especially when using mobile devices.
This is a huge database of websites, and all of their web pages and content, that is stored by search engines and used to serve up results for user search queries.
These are the individual web pages of a website that have been crawled by search engine crawlers for inclusion in their index. If a web page hasn't been indexed, it cannot show up in the search results.
Also known as information architecture, this is a universal design principle that applies to the arrangement, structuring and prioritising of information on web pages in such a way that it helps to reveal an order, or "hierarchy", of importance.
This can be achieved through the use of written text, images, graphics and video. The trick is not what medium is used to present the information but how it is structured in a way that conveys the hierarchy of importance you wish to display.
An internal link is a hyperlink placed on a web page that points to another web page on the same website or domain.
This is used to describe ads that appear on websites in the form of pop-up modals or windows while browsing a website or waiting for its pages to load. These can be harmful to a website's rankings, especially on mobile, as they are often intrusive and hinder users from viewing the content they came to view.
Inverse Document Frequency (IDF)
Inverse document frequency is a numerical statistic used in information retrieval to measure how much information a word provides, based on how common or rare the word is across a collection of documents. It is usually applied alongside term frequency as a way to further refine the results of information retrieval queries. It works by helping to filter out words that are too common in a query, such as "the", "a" and "and", to help narrow down search results.
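The intuition can be sketched in a few lines of Python. The formula used here, log(N / df), is the textbook definition, where N is the total number of documents and df is the number of documents containing the term; the toy documents are, of course, made up:

```python
import math

def idf(term, documents):
    """Inverse document frequency: log(total docs / docs containing the term)."""
    df = sum(1 for doc in documents if term in doc.lower().split())
    return math.log(len(documents) / df) if df else 0.0

docs = ["the cat sat", "the dog ran", "the fish swam"]
# "the" appears in every document, so it carries no information
print(idf("the", docs))  # 0.0
# "cat" appears in only one document, so it is far more discriminating
print(idf("cat", docs))  # log(3) ≈ 1.0986
```

This is exactly why common words contribute little to a query: their IDF score drives their weight towards zero.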
ISP (Internet Service Provider)
This is the company that provides the internet service you use. If you use the internet for spammy purposes, your IP can eventually be blacklisted, and once this happens you may be prevented from accessing certain websites.
This is the desired word or phrase you wish to be found for when a user searches for it within a search engine.
This is when the same keyword or key phrase has been excessively used across lots of pages on a website. The problem with this practice is that, when it comes to showing relevant results for a search query, the search engine will not know which page you want to rank for that given keyword. It confuses not only search engines but users alike.
Keyword density refers to the percentage of times a chosen keyword or phrase is repeated throughout a web page. There are no rules on what the optimum percentage is; however, not using a keyword enough may not give the search engines a clear indication of what your page is about, while overusing a keyword or key phrase could result in a penalty for over-optimising the page.
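The calculation itself is straightforward: occurrences of the keyword divided by total words, expressed as a percentage. A minimal sketch (the `keyword_density` helper and the sample text are hypothetical):

```python
def keyword_density(text, keyword):
    """Percentage of words on the page that match the given keyword."""
    words = text.lower().split()
    if not words:
        return 0.0
    return 100 * words.count(keyword.lower()) / len(words)

page = "running shoes for road running and trail running"
print(keyword_density(page, "running"))  # 3 of 8 words -> 37.5
```

A real page would score far lower; a figure like 37.5% would look heavily over-optimised.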
This refers to the method of determining the right keywords to target for your given web pages, and it is one of the most important things to consider when it comes to SEO. When you select your keywords, you should choose ones that people actually search for and that are of relatively low to medium competition, so that you can realistically rank for them, and much sooner than for broader keywords.
The reason for this is that if you choose terms that are too broad, you will face too much competition and will never be able to compete with more established brands. Another downfall of incorrect keyword research is selecting keywords that no one ever uses during searches, which is a common practice of shady SEO companies.
This is the spammy practice of over-optimising a web page for a keyword or key phrase by continually repeating it throughout the page, even if it makes no sense when read aloud by a human. This usually leads to a penalty for over-optimisation.
This is short for key performance indicator. KPIs are values used to measure the performance of a business in meeting its desired business goals. They are used in all forms of marketing and help determine the effectiveness of a campaign, or alert you when your plan may be going awry; this allows issues to be addressed before they become worse or budgets are overspent.
This is the web page on which a user lands once they have clicked a link to your website, usually from the search results. Landing pages used in online marketing, especially pay-per-click, are usually separate from your main website, with no navigation to them. This is so that users are easily funnelled towards your desired goal and can only reach the page via the specified PPC advert and link, making its performance easier to measure.
There are two types of landing pages: click-through and lead generation. Click-through landing pages are designed to make a user click through to another page; these are usually used in e-commerce to describe a product in more detail before passing the user on to a shopping cart. Lead generation landing pages are specifically designed to capture a user's data, such as their name and email address; they usually offer something in return for filling out the form, such as a free download or product.
This is the element on a web page that a user can click on to make the web browser take them to another part of the page, a different page within the website, or another website altogether.
This is a term used to describe those internet users who are happy to share and link to practically anything. These are the kinds of people who will share any random post or news item and help create incoming backlinks to your website. They are also the people most likely to fall for link bait articles, or even to share fake news.
This is an in-depth process of analysing a website's link profile. It allows you to evaluate the quality of incoming links, the types of anchor text used and the quality of the websites linking back. This is done to ensure a website maintains a healthy backlink profile and avoids any potential penalties from search engines. It also allows you to reverse engineer a competitor's backlink profile in order to exploit any opportunities you may have missed in your own link building efforts.
Also known as click bait, this refers to content that has been specifically created with the intention of getting as much attention as possible and gaining as many links as it can. It's usually used on social media, where it can reach the masses and be shared. The most common types of link bait articles use headlines based on news, humour or controversy.
This describes the process of creating and building incoming links to a website with the purpose of helping to increase its rankings. There are numerous link building strategies that can be employed: blog comments, guest blogging, article and content marketing, blogger outreach, infographic submissions and social bookmarking are all common types.
This is a type of link building where websites mutually exchange links with each other, also known as reciprocal linking. It's a common practice on most low-quality and spammed website directories and should be avoided.
A link farm is a group of websites or web pages that specifically link to each other in order to help manipulate the search results by helping them all rank. These were used back in the late 90s and were mostly spammed by webmasters until Google took action to lower their ranking potential. These days they are rarely seen.
Link Juice (Trust, Authority, PageRank)
This is the name given to the link weighting algorithm used to determine the value of each link, as some links are more valuable to have than others. For instance, dofollow links pass on link juice, as opposed to nofollow links. However, it's important to remember that a healthy backlink profile will contain a mixture of both types.
Link Partner (Link Exchange, Reciprocal Linking)
This is when two websites decide to link to each other for mutual benefit; however, because of the mutual arrangement, these types of links were devalued by search engines. This method was used a lot by low-quality directories in order to gain a listing.
This is used to measure the value of a website by the total number, and more importantly the quality, of the backlinks pointing to it.
This term is used to describe the distribution and quality of a website's backlinks. A healthy backlink profile will consist of many high-quality links from authoritative websites and will contain both dofollow and nofollow links.
Search engines use complex algorithms to examine all incoming and outgoing links and apply their own secret formula to determine the quality of a link. However, it's usually a combination of the website's trustworthiness, PageRank, quality and relevance that is taken into account, along with other unknown factors.
This refers to a link building strategy where you find online mentions of your company name or brand and work out whether those mentions can be utilised to gain links. It can also be used to find links pointing to old pages that may no longer exist, and to have them pointed at a more relevant page instead.
This is a black hat SEO practice where links are created with the intention of gaming the search engines by manipulating the flow of PageRank through links. This allows the user to control the flow and decide which pages should receive the most PageRank.
This term refers to low-quality, irrelevant links, often in the form of irrelevant comment links, usually created using automated software with the intention of getting as many links as possible in the shortest amount of time.
This one is pretty self-explanatory and refers to the speed and frequency at which a website gains links over time. Link velocity can be natural or unnatural, which can be determined by peaks in the number of links gained over a certain period.
A natural link velocity will often grow at a steady, incremental pace. However, this does not account for virality, in which case velocity would be sky high and, although unnatural looking, completely genuine.
Local Rank Tracker
This is simply a piece of software that will track and monitor a specific number of chosen keywords locally in the search engines.
Local SEO is the process of optimising a website to become more visible for local search queries. This is important because rankings and positions for keywords can vary, often drastically, depending on the user's location or city. More often than not, results will even change depending on where in a city you are located at the time of the search.
Local Search Query
This is when a user enters a keyword or search term alongside a specific location. This helps users narrow the search down to be more specific to their location. Examples would be something like "cinemas near Manchester" or "auto repairs New York".
Long Tail Keyword
This term refers to longer and much more specific search queries entered by users. These are usually much less competitive to target, as they usually have lower search volumes; however, because they are more specific and targeted, they usually convert much better than broader terms. An example of a long tail keyword would be something like "men's size 12 brown shoes", where a broad keyword example would be "men's shoes".
Low Quality Links
This is used to describe links that search engines don't deem valuable. These are usually of a very low quality and standard, and are usually found in the form of spam comments, links from websites that link to almost any other site, or link farms.
LSI (Latent Semantic Indexing)
Now, this term gets thrown about a lot, especially by new SEOs, and it needs to stop. Latent semantic indexing, or LSI for short, is an indexing and information retrieval method that uses a mathematical technique known as singular value decomposition, and it was originally designed for processing information within defined sets of documents.
It is named latent semantic because of its ability to understand the semantic relationships between words in a body of text. "Latent" means hidden, the assumption being that there is a hidden similarity in the texts and words used throughout the collection of documents that have undergone latent semantic analysis, or LSA for short.
The "indexing" clearly refers to the indexing of the documents themselves, and here is where the big problem with applying LSI to the world wide web and search in general becomes clear. Every time a new document, i.e. a web page, is published onto the web, all the web pages on the web would then need to be instantly re-indexed to update the index and find the similarities in the documents again. This alone would cost Google so much in server processing that it would be a prohibitively expensive process, not to mention there are already much better ways of retrieving information.
So, there you go: LSI is nonsense and you can stop using the term now; instead I'd like you to say semantic keywords or synonyms. I'd also like to say thank you to a few SEOs who I'm lucky enough to call friends for helping to clear up what I now dub "LSI Gate": thanks go to Bill Slawski of SEO by the Sea, the Black Knight himself Ammon Johns, and David Harry of Verve Developments.
This is essentially a set of defined data which describes basic information about a web page or piece of content. Metadata can come in many forms. A great way to think of it is like an executive summary, which gives a brief description of what is contained within a report.
The meta description is an HTML meta tag attribute used to give search engines a description of what your page contains. It has long been said that the meta description doesn’t have any major SEO benefits, however, it does play a huge role in your CTR, as the more enticing your meta description the more likely someone is to click on your link.
Here’s how it looks as HTML in the head section of your web page.
<meta name="description" content="This is your meta description that will show up in the search engine results page.">
And here is how it appears in the search results.
Also another HTML meta tag, this was used to indicate to search engines which keyword or phrase your website should be found for. However, because of wide scale abuse, and also the fact that search engines became better at understanding what pages were about without help, it was declared obsolete.
Meta tags are HTML attributes that are used in the head section of an HTML or XHTML document and help provide information about the relevant web page. The most important of these is the meta title tag, which should be optimised for your keyword and describe your page correctly. Other examples are the meta description, which helps with CTR, the now obsolete meta keywords tag, and the noindex and nofollow values, which are used to instruct Google not to index a page or follow its links.
These are measurable performance standards, and they are crucial in any digital marketing campaign, not only for accountability but also to quantitatively measure a campaign’s success.
Made for AdWords, this refers to any site that is specifically designed to serve AdWords ads as the majority, if not all, of its content.
Minifying or Minification (HTML, CSS, JavaScript)
This is the process of removing unnecessary characters such as whitespace, comments and line breaks from HTML, CSS and JavaScript files without changing how they work, reducing file sizes and helping pages load faster.
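As a rough illustration of what minification does, here is a minimal sketch in Python (the function name and regexes are our own simplified illustration, not a production-safe minifier) that strips comments and collapses whitespace in a CSS string:

```python
import re

def minify_css(css: str) -> str:
    """Naively minify a CSS string: strip comments and squeeze whitespace."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.DOTALL)  # remove /* ... */ comments
    css = re.sub(r"\s+", " ", css)                        # collapse runs of whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)          # drop spaces around punctuation
    return css.strip()

original = """
/* main heading */
h1 {
    color : red ;
    margin : 0 ;
}
"""
print(minify_css(original))  # h1{color:red;margin:0;}
```

Real-world minifiers (and the plugins mentioned below) handle many more edge cases, but the principle is the same: the output is byte-for-byte smaller while rendering identically.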
This is when a website is fully replicated and copied exactly as is on a new domain. Often the new domain will have a slight variation in its URL or, more commonly now, will use the exact same name with a different top-level domain (TLD). This is common amongst illegal file sharing and movie streaming sites; when one site is shut down or is having issues they will replicate the same site under a new TLD to keep the site running for users.
In April of 2015 Google introduced a new algorithm update that would give priority in its search results to websites that displayed and functioned well on mobile devices. The term was widely adopted due to fears it would cause a huge shake-up in the rankings.
This describes how well a website or web page functions on mobile devices, such as how fast pages load and whether the user can click buttons without hindrance. This is often achieved using responsive frameworks, so a web page can adjust to fit the user’s mobile device.
Mobile First Indexing
More than ever people search using mobile devices, and in 2016 Google announced that it would start to use the mobile version of a website to evaluate its relevance to users searching on mobile. One of the main problems faced when this was announced was that content accessible on a desktop version may not have been on the mobile version, for various reasons such as improving UX or load times. The concern was that Google could potentially penalise sites in the mobile index for showing different sets of content to desktop and mobile users.
Also known as monetization, this is the process of making money from a website. This can be done in a multitude of ways; the most common is by using AdSense, where you are paid for every click a user makes on the ads displayed on your website. Another popular form in the SEO world is using sites to promote and sell affiliate products. See also Affiliate Marketing.
This is a ranking metric used by a popular SEO tool (Moz) to calculate and determine a web page’s importance. One major thing worth noting here is that this is, as you guessed, a third-party metric created for their SEO tool set, which is why it became widely used. Here’s the thing though: it isn’t used at all by Google to determine a site’s relevance or importance. As such you can start to drop it from your SEO vocab, along with the dreadfully misunderstood and overused term LSI.
Multivariate testing uses more than one variable at a time. It may also use more than two versions of a page to simultaneously test these multiple variations, unlike A/B testing.
Also used to describe the Domain Name System. See also DNS.
This one stands for Name, Address and Phone number and is used by search engines like Google to help determine a business’s physical location. This is extremely important for local SEO, helping to determine the relevance of a website when users search locally.
Natural Language Processing (NLP)
This is a process that helps computers understand human language. It is used to help search engines better understand the intent and meaning of a search query and thus serve better, more relevant pages as a result. Since the addition of the Knowledge Graph, Hummingbird and, more recently, RankBrain, search engines like Google have vastly improved their understanding of human language.
Natural Links or Organic Links
These types of links are the ones Google prefers, and are also the hardest to acquire. Natural links are those that are gained naturally without any interaction or influence from the website’s owner. They should not be confused with links gained via outreach, as those have been gained via influence with the intent of gaining a link.
This refers to the practice of using Black Hat SEO tactics to maliciously sabotage another website with the sole aim of getting it penalised by a search engine. This is done with the belief that lowering a competitor’s ranking by means of a penalty will help gain an advantage in the search results.
A term used to describe a smaller subset of a larger marketplace on which a product or service is the focus.
This term refers to a link building tactic where a webmaster is paid to place a link on an already published web page or post; they will add your link and anchor text and then ask for the page to be re-crawled so the new link is indexed.
These are websites that are built with a sole focus on one niche and are usually small in size. Because of their small size they are usually part of a small network of sites and are commonly monetized through AdSense.
This is a value that can be assigned to the rel attribute of an HTML link. These are also called nofollow links because they do not pass link juice. These types of links are often seen as worthless by many new SEOs, however they do have their value and also make for a more natural looking link profile, as no website would ever have only followed links.
Here is an example of a Nofollow link
<a href="https://www.equillmedia.co.uk/" rel="nofollow">This would be a Nofollow link</a>
The noindex directive is a value in a meta tag that can be added to the HTML head section of a web page to ask search engines not to include that page in their index and ultimately the search results. This is particularly useful to prevent Google or other search engines indexing checkout pages, customer login pages etc.
Back in roughly 2013 Google made a very important change to their analytics data. This was a major pain for SEOs, as Google essentially stopped providing keyword data for anyone reaching your website and web pages via a secure search. The end result is that it became difficult for SEOs to find out what users on secure connections were searching for.
Offsite Optimization (Off Page SEO)
This refers to SEO activities that are performed off your website to help improve its search engine rankings. This includes things such as building or earning links, creating industry relevant citations, PR work and social media marketing efforts.
Onsite Optimization (On Page SEO)
This is the second aspect of SEO and refers to all activities performed on your website or web pages to help increase its search engine rankings. This includes things such as page titles and descriptions, setting up XML sitemaps, creating internal links, posting great content, adding schema markup and ensuring your website is crawlable and able to be indexed by the search engines.
Online Reputation Management
Is the name given to the process of monitoring, managing and influencing search engine result pages (SERPs) or mentions of a company, brand or person in the online world. This is most often achieved by addressing any negative sentiment online and reacting positively to negative customer reviews, producing press releases and other PR and social media work.
Organic Search Results
Once a user submits a search query the results displayed are actually of two types: organic and paid. These are displayed as paginated lists, and the organic (or natural) search results, as they are also referred to, are the ones listed below the paid advertisements. The positioning of each listing is determined by the search engine’s algorithm.
Is all the traffic that was directed to your website or web pages by entering a search query in the search engine itself and then clicking upon your link as it was displayed in the search engine results page.
Outbound Link (Outgoing Link)
Outbound links are hyperlinks that are created on a website and link out to a different website that you do not own or have control over. They are also often referred to as external links.
Meaning “To reach out” this is also a term used to describe the off-site SEO activity used to help build links to a web page by reaching out to other website owners within a similar niche or industry, usually with the intention of sharing or promoting content relevant to their audience in exchange for a backlink.
This refers to the scoring system developed by Moz, designed to predict how well a web page will rank in the search engine result pages or SERPs. It uses a scoring system of 1–100 and draws data from Moz’s own Mozscape web index. The important thing to note here, again, is that this is a third-party metric and is not used by Google at all to determine a website’s ranking. Once you understand this, you realise it’s yet another irrelevant term SEOs need to stop using, along with MozRank and LSI.
Page Loading Speed (Page Speed)
This is the amount of time it takes to load a web page. Page load times are now a ranking factor and having pages load as fast as possible is a must; failure to do so could affect your page’s rankings, more so with the onset of the mobile first index.
Typically, the things that can affect your page’s load time vary, but the usual culprits could be any or all of the following:
Huge File Sizes: More often than not it’s image files that have been uploaded at their maximum size and highest quality, which massively increases their file size and thus a page’s load time. These can be optimised using 3rd party plugins or web-based services such as CompressJpeg.com
Code Bloat: Typically found in poorly coded themes and plugins, this unnecessary code, HTML comments and poor formatting can seriously increase your page’s load times. One way to fix this is to minify your HTML and CSS code. See also Minify/Minification
Slow Server Times: Usually this is down to using cheap low-quality shared hosting, the result of which can be very slow loading times. Upgrading your hosting to a more reliable provider can help to improve your load times.
Others to consider: Expiry headers, not leveraging browser caching, and render-blocking JavaScript can also hinder your page load times and should be fixed where possible.
Is a metric used to measure how many different pages a user visited on your website in one session.
PageRank was the earliest algorithm developed by Google’s founders Larry Page and Sergey Brin. It was designed to evaluate a web page’s authority, trust and relevance by the quality and number of links it received. This in turn helped score pages on a scale of 0–10, with 0 being the lowest and 10 the highest. Despite many myths, PageRank is still used as a core part of Google’s algorithm, although Google has far progressed from using PageRank alone to evaluate a web page, which is why the term is rarely used these days, but still often enough to be explained here.
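To give a feel for the idea, here is a minimal sketch of the classic PageRank calculation (a simplified textbook power iteration on a tiny made-up link graph, nothing like Google’s production system):

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal rank everywhere
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:  # dangling page: spread its rank across all pages
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:  # each link passes an equal share of the page's rank
                for target in outgoing:
                    new_rank[target] += damping * rank[page] / len(outgoing)
        rank = new_rank
    return rank

# Hypothetical three-page site: "home" receives the most links
web = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
ranks = pagerank(web)
print(max(ranks, key=ranks.get))  # home
```

The key intuition survives in modern search: a link is a vote, and votes from pages that themselves have many votes count for more.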
These are links that are purchased either through a 3rd party service or through a website owner themselves and usually come in the form of sponsored posts. Because of the abuse of such practices, Google strongly advises that these posts state they are sponsored and apply the nofollow attribute to the hyperlinks to prevent them from passing on link juice. Failure to follow these guidelines may result in a penalty.
Paid search is a term used to describe the Pay Per Click advertising section that is displayed in the search engine result pages, above the organic results. See also PPC
Partial Match Domain (PMD)
Partial match domains are domain names that contain part of a keyword or phrase that you want search engines to rank you for. Unlike an Exact Match Domain which contains all of the keyword.
Pay for Inclusion (PFI)
This is a method whereby websites that pass certain editorial quality guidelines and a review can pay a fee to be included and gain extra exposure. This is a very common practice amongst web directories, where users can pay a fee to be listed, and local chamber of commerce sites, where a fee is required to join.
Pay Per Action (PPA)
Pay Per Action is similar to PPC below, however with PPA ad publishers are only paid when the user clicking the ad goes on to convert.
Pay Per Click (PPC)
This acronym stands for Pay Per Click advertising, also known as Paid Search. It is an advertising model where publishers pay ad agencies a fee whenever a user clicks one of their ads displayed on the agency’s advertising networks. AdWords is probably the most well-known PPC advertising platform.
This term refers to the content that is protected and only viewable by paying a fee, this is usually in the form of an ongoing subscription and has become more widespread amongst large publishers and some newspapers.
Also known as buyer personas, these are fictional characters created to represent different types of potential buyers of your product or service. Personas help to identify similar patterns of behaviour and characteristics that can then be used to help create products, services or creative that matches those segments of your audience sharing the traits of your created persona.
Over the years search engines have vastly improved and this term refers to the aspect of them personalizing their results based on a user’s preferences. The search engines use data such as any previous searches or ads clicked, location of the user when making the search and also social data to help serve more personalised relevant search results.
Hypertext Preprocessor is a server-side scripting language developed for web development. It was produced by Rasmus Lerdorf in 1994. As well as being widely used in web development and web applications, it is also used as a general-purpose programming language, seeing itself reinvented with many iterations over the years, the latest being 7.0, released in 2015.
Also sometimes called foundational links, these are essentially any links created that do not use any keyword or key phrase anchor text. Instead these links will usually be naked URLs, branded keywords or even miscellaneous words such as “Click here” or “Learn more”, and are typically created to help establish a new website’s online presence in its early stages. They are also often used to help diversify a website’s backlink profile in an effort to avoid a Penguin penalty due to over-optimised keyword anchor text.
In web development a plugin or plug-in is a module that adds a specific feature or extra functionality to a website or web page. Software that allows for plugins also becomes more customizable as a result. The most common types of plugins are those used in content management systems like WordPress, Joomla or Drupal to add extra functionality to the web page or website.
Often confused with bounce rate, this is when a user performs a search, clicks on a result and then immediately clicks back to the results before clicking on another listing. This is particularly bad for any website as it indicates to Google a poor user experience: the searcher didn’t find the right answer on your website, so immediately looked for a better alternative. High levels of pogo sticking could hurt your rankings and also highlight problems with your web pages’ overall content quality.
Also known as a media release, news release or press statements these are usually written informational pieces delivered to media outlets and newspapers to announce something newsworthy either about a company, brand, product or famous person.
This is simply a reference to web pages that sell products. These are often templated pages that usually all look exactly the same. The important thing to remember is that a product page is also the last page a person sees before committing to a purchase, so great care must be taken to create a good user experience and ensure they convert well.
A portal, or sometimes web portal, is a website that offers a single point of access to a wide array of services from numerous sources, such as email, news, forums and search engines, allowing all the information to be displayed in a uniform way. The MSN homepage is a prime example of a portal.
Private Blog Network (PBN)
Sometimes also called personal blog networks, a PBN is a network of websites all created to give the illusion of being real traffic generating websites, usually run by a single SEO or agency. They work by helping to manipulate the rankings of a target website by creating exact match and keyword rich links on these sites back to the target site.
They are a source of contention in the SEO world; Google takes a zero-tolerance approach to PBNs and is constantly giving out penalties to websites that make use of them.
Usually this would imply trade secrets a company would like to keep confidential. However, in the SEO world it is usually a sales term, and a BIG RED warning sign, where a provider implies they have a secret method to help secure rankings, which is just bullshit as we all know. See also Secret Sauce and Snake Oil.
This is an important one for those who specialise in local SEO, because Google uses the proximity of a user making a search to determine the most suitable results to display. It’s for this reason that search results on mobile devices can vary as you move around a city, as the changing proximity is also taken into account.
A term relating to measurement, qualitative means to measure something by its quality as opposed to its quantity. These usually involve assessment through non-numerical means and are more commonly based on human judgement.
This is used in reference to measuring a backlink by its quality rather than quantity. This is important because in the old days of SEO you could easily rank a website by throwing a mass of links at a web page and see almost instant results; however, Google and other search engines have greatly improved their algorithms to place more emphasis on websites that have better quality backlinks and penalise those that don’t (think Penguin). This now makes it possible for a website with a small but good quality backlink profile to outrank one that just goes for a mass of poor quality links.
These are metrics that can be used to measure the quality of something. For instance, in search engine optimisation it’s common for inexperienced SEOs to judge a website’s overall SEO success by its keyword rankings; however, a more qualitative metric would be to look at the website’s traffic to see if the SEO campaign has raised the website’s total organic traffic over that period. It’s important to note that qualitative metrics are subjective in nature and will vary depending on what is being measured.
Your quality score is a metric used by Google to determine an ad’s ranking and how much you pay per click in an AdWords campaign. The score is made by assessing the relevancy of your ad’s content, landing pages, previous ad campaigns and click through rates, all of which help to build up your quality score.
Another term used in measurement, this one means to measure something by its quantity rather than its quality. These assessments are made using numerical or statistical data to determine a conclusion. The opposite of qualitative.
Queries are usually questions seeking information or seeking to remove doubt around a given question. These are what users type into the search box on Google’s homepage to find those answers. See also Search Query.
Query Deserves Diversity (QDD)
This is part of Google’s ranking algorithm and is used to help display a diverse range of results for a user’s search query that may have numerous meanings. For example, when a user performs a search for the term “Jaguar”, Google will try to understand their intent, but it also understands that they could be searching for either the car manufacturer or the animal, and as a consequence will display a diverse set of results showing both the car and the animal. Sometimes as a result Google will display a disambiguation box suggesting other results based on the query.
Query Deserves Freshness (QDF)
This is one of the signals used by Google’s algorithms and means that some search queries deserve up-to-date search results. It is usually based upon news pieces or unfolding stories where searchers will expect to see the latest and freshest results as a story progresses and updates are published.
When users perform a search they sometimes are unhappy with the results given and as such will refine their search query until they are satisfied with the results. Sometimes search engines will also suggest other variations of your search query to help refine the results. As such query refinement can be both a manual and automated process.
On the internet the query string is the part of the uniform resource locator (URL) that is not part of the hierarchy of the domain and is used to send information to and from a database. Query strings are typically added to the end of a URL by a web browser when a request is made. In the example below, everything after the question mark is the query string part of the URL.
https://www.example.com/search?q=seo+glossary&page=2
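As a quick illustration, Python’s standard library can split a URL into its parts and decode the query string into key–value pairs (the example URL here is hypothetical):

```python
from urllib.parse import urlparse, parse_qs

url = "https://www.example.com/search?q=seo+glossary&page=2"
parsed = urlparse(url)
print(parsed.query)  # q=seo+glossary&page=2

# parse_qs decodes the string into a dict of parameter names to value lists,
# turning "+" back into a space along the way
params = parse_qs(parsed.query)
print(params["q"])     # ['seo glossary']
print(params["page"])  # ['2']
```

This is exactly the structure a server-side script reads to know what the user searched for and which page of results to return.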
This is the Google algorithm that was confirmed in 2015. It was developed to make use of artificial intelligence to help Google process and improve its search results. It was also announced by Google that RankBrain is the third most important ranking factor after content and links. What sets RankBrain apart is its use of machine learning, which helps it make decisions based on past experience. If it comes across words it doesn’t recognise, it can guess at their meaning using similar words. This makes it much more likely to return a relevant result for a never-before-used search term.
These represent the factors used to determine a website’s ranking. Google’s algorithm is rumoured to use over 200 ranking factors when evaluating and deciding where to rank a website. These include things such as the quality of the content on the site or page, the quality and quantity of backlinks, the authority of the website, social signals and many others that are used to determine how well a website should rank, if at all.
This is used to refer to software or programmes that are designed to monitor a website’s keyword rankings for any given keyword or phrase.
Short for Resource Description Framework in Attributes, these are a set of attribute-level extensions for HTML and XHTML document types, used to embed rich metadata within the documents themselves.
Reciprocal links or linking is when two websites agree to mutually link to each other. This is generally thought of as being OK if there are legitimate reasons for them to do so, such as if they share audiences or offer complementary services that go hand in hand with each other.
The problem with reciprocal links was that they too were abused: sites were built with the sole purpose of exchanging links in return for, you guessed it, more links. This in turn led to the devaluing of these types of links, and the practice has since died off over time.
This is a submission-based appeal to search engines such as Google and Bing to have them lift a manual penalty. These manual penalties are the result of failing to comply with either Google Webmaster Guidelines or Bing Webmaster Guidelines and spamming the search results.
Redirect or Redirection
The instance of sending users and search engines to a different URL than the one initially requested.
This is the process of sending a user or search engine to a different URL than the one that was requested. A redirect works when a user clicks on a link or URL and immediately lands on a different one. There are numerous reasons to use a redirect and they are typically done when content has been removed or has changed URL.
Types of redirect include:
- 301 Redirect
- 302 Found
- 307 Temporary Redirect
- Meta Refresh Tag
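As a toy sketch of how redirects chain together (the URL table here is hypothetical; in reality the status code and new location are sent by the web server, and a 301 signals a permanent move while a 302/307 signals a temporary one):

```python
# Hypothetical redirect table: old URL -> (HTTP status code, new URL)
redirects = {
    "/old-blog": (301, "/blog"),        # permanent move
    "/blog": (301, "/resources/blog"),  # moved again later, creating a chain
    "/summer-sale": (302, "/offers"),   # temporary redirect
}

def resolve(url, max_hops=10):
    """Follow a chain of redirects to the final URL, as a browser would."""
    hops = 0
    while url in redirects and hops < max_hops:
        status, url = redirects[url]
        hops += 1
    return url

print(resolve("/old-blog"))     # /resources/blog
print(resolve("/summer-sale"))  # /offers
```

Note the `max_hops` guard: browsers and crawlers give up after a handful of hops, which is why long redirect chains are best collapsed into a single 301 to the final destination.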
This is the domain from which a backlink came. This is important in SEO as it’s vital that the majority of your backlinks do not come from a limited number of domains. Usually the higher the number of referring domains in a backlink profile the better; however, this does not take into account the quality of those referring domains, merely the total number.
Regional Long Tail (RLT)
Regional long tail keywords are those that contain a city, region or suburb within the phrase itself. These are typically related to local SEO, where users want to find specific services or products near where they are currently located. As an example, “Best Pizzas in San Diego” would be a regional long tail keyword.
These are links that do not show the full reference to the URL they are linking to but just use a shortened version of the directory path. This works because the browser already knows which server and domain the current page is on and therefore does not require the absolute or full path.
Example of a relative URL
<a href="/resources/SEO-Glossary">Our SEO Glossary</a>
Example Absolute URL
<a href="https://www.equillmedia.co.uk/resources/SEO-Glossary">Our SEO Glossary</a>
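To see how a browser resolves the two kinds of URL, Python’s `urljoin` follows the same rules (the base page here is a hypothetical example):

```python
from urllib.parse import urljoin

base = "https://www.equillmedia.co.uk/resources/tools"

# A root-relative link resolves against the current domain
print(urljoin(base, "/resources/SEO-Glossary"))
# https://www.equillmedia.co.uk/resources/SEO-Glossary

# A full absolute URL is used as-is
print(urljoin(base, "https://www.example.com/page"))
# https://www.example.com/page

# Beware: a "www." address with no scheme is treated as a RELATIVE path,
# producing a broken nested URL
print(urljoin(base, "www.equillmedia.co.uk/page"))
```

This last case is a common mistake: always include the `https://` scheme when you intend a link to be absolute.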
This is a term used to measure the quality of something by how connected or appropriate it is. It is more commonly associated in search with content and how relevant it is to a user’s search query; however, it can also be applied to other things such as the relevancy of a backlink or anchor text. Google has placed great emphasis on having backlinks that come from relevant sources. So, if you have a website about cooking and food, it would be inappropriate or irrelevant to have backlinks from a travel website.
That being said, it is possible to find related topics within such a website: if the travel site had a piece about food from around the world and we had a recipe for one of those foods listed, then it would make sense to have a link from that page.
This is a term used in web design and essentially means that a website will adapt fluidly to match the size of the display it is being viewed on. Built with CSS, responsive web design’s biggest advantage is the ability to display websites as intended on both desktop and mobile displays, as opposed to having to create separate desktop and mobile websites, which was the common go-to solution.
Return on Investment (ROI)
This is a marketing term used to work out the total return after the initial investment and is used to determine the profitability of an investment. This is important in SEO because the work costs a fee, so any company paying for those services needs to make sure that it is profitable for them and justifies that cost.
As an example, if a client hired you for SEO services costing £1,000 a month for a total of 6 months, this would be £6,000 in total. If over that same period the campaign generated £12,000 in revenue, that would give a net profit of £6,000 and a return on investment, or ROI, of 100%. The formula is below.
ROI = (Total return − Cost of investment) / Cost of investment × 100
This can be worked out as follows: (12000 − 6000) / 6000 × 100 = 100%
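A quick way to check the maths, using the conventional definition of ROI as net profit divided by cost:

```python
def roi(revenue, cost):
    """Return on investment as a percentage: net profit over cost."""
    return (revenue - cost) / cost * 100

# £6,000 spent on SEO, £12,000 generated over the same period
print(roi(12000, 6000))  # 100.0
```

An ROI of 0% means the campaign only broke even, and a negative value means it cost more than it returned.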
These are types of microdata markup that use the Schema.org vocabulary. They are used to better describe your page’s content to the search engines and also to help make your web page more appealing in the search results by displaying things like ratings, price ranges etc.
Also known as the robots exclusion standard or robots exclusion protocol, this is a web standard used to communicate with website robots and crawlers. It can be used to tell crawlers which folders and pages of a website should be crawled and indexed. The robots.txt is a text file situated in the root folder of a domain. Here is what a robots.txt directive looks like.
User-agent: *
Disallow: /login-page
The * corresponds to all search engine crawlers and bots that are programmed to obey the robots.txt file, and the Disallow line tells the search engine robot not to crawl our login page so that it doesn’t waste our crawl budget, allowing us to have it crawl only the important pages we want to rank.
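Well-behaved crawlers check these rules before fetching a page; Python ships a parser for the protocol that you can test against the directives above (the example.com URLs are placeholders):

```python
from urllib import robotparser

# Parse the same two directives shown above, without fetching anything
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /login-page",
])

# The login page is blocked for all crawlers, everything else is allowed
print(rp.can_fetch("*", "https://www.example.com/login-page"))  # False
print(rp.can_fetch("*", "https://www.example.com/blog"))        # True
```

Note that robots.txt is advisory only: compliant crawlers like Googlebot obey it, but it is not an access control mechanism, which is why sensitive pages should also use noindex or authentication.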
The root domain is the highest hierarchical level in the domain name system and doesn’t have a formal name; it is essentially an empty string. On the internet it can be assumed that fully qualified domain names end with a full stop representing the root. It comes after the TLD, or top-level domain, part that sits at the end of a domain. Here is an example.
In “www.google.com.” the root domain would be the final dot after com, with .com being the top-level domain part of the domain name hierarchy.
RSS Feed (Rich Site Summary)
Also called Really Simple Syndication, this is a method of syndicating information and content to a feed reader, allowing users to subscribe to blogs and sites so they can consume all the content in one place. Feeds regularly update with content and notify you when they find new stories and topics of interest.
This term is used to describe an alleged “sandbox” that Google puts new websites in to prevent them from ranking well on search terms until a given amount of time has passed. There is much debate and speculation even today surrounding the Google Sandbox, and its existence is not universally accepted amongst SEOs.
This is a set of semantic vocabulary tags, or schema types, developed to support structured data on the web. Schema can be used alongside microdata formats such as RDFa and JSON-LD to mark up your HTML pages and improve how search engines understand and present those pages in the search engine results page.
Scrape or Scraping
This involves using automated programmes to copy or extract information and other data from other websites. More often than not it is used to copy another website’s content, which is then plagiarised and used elsewhere, against Google’s search guidelines.
This is actually a computer science term and refers to a mathematical procedure developed to solve search problems, mainly in the retrieval of information stored in databases. When used in SEO it refers to Google’s search algorithm, which has been vastly improved over the years to become very complex, incorporating artificial intelligence and machine learning. It is used to determine a website’s ranking within the search engine result pages.
A search engine is a programme or piece of software that is used to search for information within a set of documents. Search engines use a robot or crawler to crawl the pages of a website before indexing them in a database. It's from this indexed database that information is retrieved when a user enters a search query or term.
Search Engine Spam
This is used to describe pages that are created to intentionally manipulate the search engine results. These are often low-quality or thin content pages used to flood the search results with irrelevant pages.
Also called search parameters, these are characters or strings of characters used to refine and narrow down a search query. Search operators are particularly useful for SEOs, as they can be used to find backlink opportunities or sites that accept guest posts.
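As an illustration, a few widely used Google search operators are sketched below (the domains and phrases are placeholders, not real prospecting targets):

```text
site:example.com "write for us"    restrict results to one domain, find guest post pages
intitle:"resources" travel         pages with "resources" in the title tag
inurl:links "submit a site"        pages with "links" in the URL, link page opportunities
"example brand" -site:example.com  brand mentions anywhere except your own site
```

Operators can be combined freely, so a single query can narrow millions of results down to a handful of genuinely useful prospects.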
Search queries are the keywords or phrases used in a search engine to find information. See also Query.
Search traffic is used to describe the number of visitors you receive to your website or web pages from the search engine results pages. This is often measured using website analytics programmes such as Google Analytics, which can help segment your traffic to show the number of visitors to each page or section of your website.
This term refers to some sort of secret ingredient and is often used to imply that an SEO service provider has a secret method to improve your website's rankings. In reality it is often private blog network links pointed towards websites that link to yours, or the use of automated link building programmes. See also Proprietary Method and Snake Oil.
Short for search engine marketing, this is an online form of marketing that describes any actions associated with helping a website increase its visibility in the search engines. This includes all aspects of search, both organic search (SEO) and paid advertising such as Pay Per Click.
This stands for search engine optimisation and is essentially the process of helping a website rank higher in the search engines and thus increase its overall traffic and visibility online. Ranking is especially important as users tend not to go past the first few pages of results, so ranking high in a search engine is vital to increasing traffic. Search engine optimisation involves many technical aspects, employing both on-site and off-site SEO techniques to improve a website's rankings.
This is short for search engine results page and is the search engine page that shows the results of a user’s query in the form of a paginated list of results.
This is a term used to describe the fluctuations that happen within the search engine results pages as a result of the ever-evolving updates to Google's search algorithm. However, it should be noted that this can also be caused by large influxes of searches around a recent event or news piece.
Also called web server logs, these are a collection of files automatically created by a web server to maintain a history of its requests. These files typically contain information such as the connecting IP address, time and date, HTTP request type, the size and type of page requested and many other useful bits of information. They are great to use in SEO because they can help you identify problems with your server and how it is handling requests.
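As a rough sketch of how SEOs work with these files, the snippet below parses one line in Apache's standard "combined" log format; the IP, path and user agent string shown are invented for illustration:

```python
import re

# Field layout for the Apache "combined" log format: IP, identity, user,
# timestamp, request line, status code, response size, referrer, user agent.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<size>\d+|-) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

# A hypothetical log line showing a Googlebot request
line = ('66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] '
        '"GET /blog/seo-glossary HTTP/1.1" 200 5120 '
        '"-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')

match = LOG_PATTERN.match(line)
if match:
    entry = match.groupdict()
    # e.g. spot which pages search engine bots request, and the status returned
    print(entry['ip'], entry['path'], entry['status'], 'Googlebot' in entry['agent'])
```

Run over a full log file, the same pattern lets you count crawler visits per URL or surface pages returning error codes to bots.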
This is a hosting package where multiple websites all reside on one shared server, usually with the same IP address. This can be problematic in SEO: if one site using the IP is penalised by Google, the penalty may affect the other sites using that same IP. For this reason, if you cannot afford to pay for a dedicated server or hosting, it is worth purchasing a dedicated IP address for your website to overcome the shared IP problem.
This is used to describe how we structure a website in a way that not only meets our business goals but also presents a great user experience, so that users are able to find information on products and services more easily. This is achieved by ensuring users can access information in as few clicks as possible. Site structure goes beyond just naming conventions and also covers things such as navigational layouts, internal linking and sitemaps.
The term sitemap can be used to describe both HTML and XML sitemaps. HTML sitemaps are used to help users navigate a website's pages, whereas XML sitemaps are used to help search engine bots crawl and index a website's pages.
This refers to SEO service providers who claim to be able to guarantee rankings in the search engines. It's often used alongside terms such as proprietary method or secret sauce to help sell their services and imply they can magically improve your website's rankings no matter what.
This refers to an old link building tactic where webmasters would make use of websites such as Delicious, Diigo, StumbleUpon and many others that let you bookmark a website's URL on a profile or public page. Because these pages were set to public, they could be crawled by search engine bots and count towards your backlink profile. Today this is considered a low-quality, spammy link building tactic that should not be used.
This is often used to describe technologies, usually in the form of websites, that allow users to upload their own content as blogs, pages, images and other media, letting users network according to similar aspirations, interests and tastes. Social media has evolved over the years, with many large networks and communities being created, such as Facebook, Twitter, LinkedIn, Instagram and Pinterest, each with its own unique community.
Social Media Marketing (SMM)
Social media marketing is used to describe all marketing and promotional efforts for companies and brands that use social media channels such as Facebook, Twitter, LinkedIn and Instagram to increase their sales and traffic.
Social Media Poisoning (SMP)
Social media poisoning is a black hat technique used to create spam that gives the perception that a competitor is the spammer. This is primarily done with the intention of causing negative sentiment online towards the competitor's business. Not only is it immoral but also illegal.
A fake online persona, often created to hide a real person's identity behind one or more false identities. Sock puppets are often used as brand advocates to help promote positive sentiment online around a company, brand or person.
This is used to describe any web page created with the intent of improving rankings without providing any value to the user. Often these web pages link to other low-quality websites in an effort to manipulate rankings and give the appearance of relevance, but they provide little value and typically rely on repetitive keyword stuffing in meta tags and content copy.
Spam Ad Page
These are pages created solely with the intent of serving advertisements using ad networks such as Google AdSense. The pages themselves are filled with content automatically generated or scraped from other sources, while the majority of the page serves ads.
These are automated programmes used to harvest information and data to assist with the sending of spam. These types of programmes can scrape websites for contact information, such as email addresses, that is then used to send automated spam emails without permission.
Another form of spam, this is the process of spamming search engines with the intention of manipulating a website's rankings. This can be done using various methods, including keyword stuffing, building masses of low-quality links, or serving different content to users and search engines (cloaking).
Used to describe any SEO who takes part in any form of spamming to pursue a goal.
Spammy blogs that offer no value to users at all and are usually filled with content that has been scraped or copied from other websites.
Spider (Bot, Crawler)
An automated programme used by search engines to systematically crawl the web to index web pages. See also Crawler.
These are traps made for search engine spiders, bots and crawlers that exploit the way crawlers follow links, using automatically generated links to confuse and trap the spider in an endless loop. They are used by webmasters to prevent spam bots from harvesting information that could later be used for spamming purposes.
These are usually highly visually appealing pages that appear before a user can move on to the main content of a website. Although they usually look great, they can often be a pain to navigate and provide very little user or SEO value, so they should be avoided.
A black hat tactic also known as article spinning, this is the process of scraping content from another website and then running it through automated programmes that reassemble the sentences and paragraphs in an effort to produce unique, readable content and avoid it being detected as plagiarised or duplicated by search engines. However, these programmes usually fail to make the content readable and produce very poor-quality results.
Short for Secure Sockets Layer, this is the standard web security protocol used to establish secure, encrypted connections between servers and web browsers. It is used to help prevent cybercrime by stopping criminals from being able to read the information sent between the browser and server.
Not only is it wise to use SSL for e-commerce websites where users enter sensitive credit card information, but in 2017 Google announced it would warn web users about websites that were not secure with a warning label in the Chrome browser. This move alone prompted the majority of website owners to move over to SSL.
Static content is content that never changes or needs to be modified or processed. The content served is exactly the same as it was stored, and the server delivers the same file to every user who requests it on the same static URL.
Also referred to as static pages, static URLs are uniform resource locators that serve the page and content exactly as it was developed, unless changes are hard coded into the page. This is as opposed to dynamic URLs, which are generated from a database-driven website. See also Dynamic URL.
Stickiness is used to refer to the number of minutes per month a user spends on your website. It is also used to refer to anything that helps increase the percentage of return visitors to your website; the higher the percentage, the stickier the page is.
Structured data is data that has been organised and formatted into a repository or database. The data is standardised and resides within fixed fields that can be requested alongside other data fields. In search engine optimisation, structured data is used to mark up pages with more detailed data about the information contained within a document or web page.
This is done using formats such as microdata and JSON-LD to create various data mark-ups, like reviews, authors, recipes and articles, to help increase a page's visibility or click-through rate. You can also check that your structured data markup is valid and correct using Google's own structured data testing tool.
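As a sketch, a simple JSON-LD mark-up for a product with review ratings might look like the fragment below, placed inside a script tag of type "application/ld+json" (the product name and rating values here are invented for illustration; the @type and property names come from the schema.org vocabulary):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "89"
  }
}
```

Markup like this is what allows search engines to show star ratings and other rich results alongside a listing.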
Also known as site submission or search engine submission, this is used to describe the old practice of having to submit your website to search engines to notify them to crawl and index it. This is no longer necessary; these days there is absolutely no need to submit your website, as search engines crawl the web every day and will index pages naturally.
However, there may still be a legitimate reason to ask a search engine to recrawl and index a page, for instance where the page was previously set to “noindex” when the crawler should have been allowed to index it, or where content has recently been published or updated and you want a search engine to crawl it as soon as possible. For these cases you can use Google Search Console to request that a page be crawled and indexed.
These are domains that are part of a larger domain. For example, red.mywebsite.com and blue.mywebsite.com are subdomains of the domain mywebsite.com, which is itself a subdomain of the .com top-level domain or TLD.
In search, the supplemental index refers to Google's secondary index, made up of web results that are considered less important or relevant. Pages are judged by the quality of their content, the number of links pointing towards them and many other factors.
Should Google's algorithm determine that your pages are not of sufficient quality, it may put them in the supplemental index. If your website's pages have ended up in the supplemental index, it may be time to review those pages and try to improve them to get them back into the main index.
This is used to describe a sudden drop in website rankings. A website's rankings may drop or tank for any number of reasons; usually it can be narrowed down to low-quality content, bad backlinks or a combination of both, as well as accidentally setting pages to noindex. In any case, a thorough SEO audit would be required to identify the root cause.
These are the keywords or phrases you intend to optimise your web pages for and rank well on the search engine result pages.
Term frequency is a measure of how often a keyword or phrase appears in a document. It is used as a weighting system in information retrieval and by search engines. However, because it focuses only on the frequency of a term in a document and not the importance or rarity of the word, it is often used alongside Inverse Document Frequency, or IDF, to help further refine the results given.
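A minimal sketch of the idea, using a toy three-document corpus (the documents and term choices are invented for illustration; real retrieval systems use far more refined weighting schemes):

```python
import math

# Toy corpus of three tokenised "documents"
docs = [
    "seo helps pages rank in search engines".split(),
    "search engines crawl and index pages".split(),
    "pages about seo and search".split(),
]

def tf(term, doc):
    # Term frequency: raw count normalised by document length
    return doc.count(term) / len(doc)

def idf(term, corpus):
    # Inverse document frequency: rarer terms across the corpus score higher
    containing = sum(1 for doc in corpus if term in doc)
    return math.log(len(corpus) / containing) if containing else 0.0

def tf_idf(term, doc, corpus):
    return tf(term, doc) * idf(term, corpus)

# "crawl" appears in only one document, so it carries more weight there;
# "pages" appears in every document, so its idf (and tf-idf) is zero.
print(round(tf_idf("crawl", docs[1], docs), 4))
print(round(tf_idf("pages", docs[0], docs), 4))
```

The key behaviour to notice is that a term appearing in every document contributes nothing to distinguishing documents, which is exactly why TF is paired with IDF.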
Time on Page
This is the length of time a user remains on a page of your website before clicking away. The higher the time on page, the more relevant and better quality the page is deemed to be.
Time to First Byte (TTFB)
This is used as a measure of the responsiveness or speed of a server or network. Time to first byte is the length of time between a user making a request with a web browser and the first byte of information being received. This is important to SEO because the speed and load times of a website affect its user experience, can result in high bounce rates and could harm your mobile rankings.
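As a rough sketch of what "time to first byte" means, the snippet below spins up a throwaway local HTTP server and times the gap between sending a request and receiving the first response byte. In practice you would measure your live site instead, for example with browser developer tools:

```python
import http.server
import socket
import threading
import time

# Throwaway local server on a random free port, purely for demonstration
server = http.server.HTTPServer(("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
host, port = server.server_address

start = time.perf_counter()
sock = socket.create_connection((host, port))
sock.sendall(b"GET / HTTP/1.1\r\nHost: localhost\r\nConnection: close\r\n\r\n")
first_byte = sock.recv(1)          # blocks until the first response byte arrives
ttfb = time.perf_counter() - start
sock.close()
server.shutdown()

print(f"TTFB: {ttfb * 1000:.2f} ms")
```

Against a local server the figure will be tiny; for a real site, DNS lookup, network latency and server processing all add to the measurement.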
The title tag is an HTML element used to specify the title of a web page. A page title should describe what is contained within the page in an accurate and brief way. It is also used by web browsers in the tab section of the browser. Title tags are important because they are also used by search engines in their search engine results pages.
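A minimal example of where the title tag sits in a page (the title text here is a placeholder):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Shown in the browser tab, and typically used as the clickable
       headline for this page in search engine results -->
  <title>SEO Glossary | Example Site</title>
</head>
<body>...</body>
</html>
```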
This term refers to the perceived expertise and authority of a website on a given topic or broad set of ideas, as opposed to a single term or idea. This is important because, instead of just building a web page to rank for given terms, webmasters and SEOs should build up topical authority by creating content hubs that cover the topic and its sub-branches in detail and inform readers.
For instance, many subjects can be covered and linked together under an overarching theme or topic. If you have a blog about travel, you could also write about related topics such as hotels and places to stay, or even the best ways to travel around those locations. This in turn helps to build up your topical authority with search engines.
Top Level Domain
The top-level domain is the second-highest level of the domain hierarchy system after the root domain. It is the end part of a full domain and goes after the dot. Top-level domains can be categorised as either generic TLDs, such as .com for commercial or .edu for educational, or country code TLDs, which identify specific countries, such as .co.uk for the United Kingdom or .de for Germany.
This is used to describe a button or any other clickable element of a web page on mobile displays. Google places great emphasis on creating a great mobile experience and uses a page's mobile friendliness as a ranking factor on mobile.
Careful consideration should be taken when creating web pages for mobile display: if touch elements are too close together or too small, users may click them accidentally or even struggle to click them at all, causing a poor user experience.
These are automatic notifications sent to a website whenever another website links back to it, rather like a link notification. Because trackbacks are automated, they are prone to being used for spam, which is why most webmasters now turn them off.
Trust Flow is a metric created and used by the Majestic SEO toolset. It is used to measure the trustworthiness of a web page by scoring it on a scale of 0 to 100, calculated from the number and quality of backlinks from other trusted websites. Again, the important thing to note here is that it's a third-party metric, not one used by Google, so use it with a degree of caution.
First developed by researchers Zoltan Gyongyi and Hector Garcia-Molina of Stanford University and Jan Pedersen of Yahoo, TrustRank is a link analysis technique used in an effort to combat web spam.
Uniform Resource Locator (URL)
Often shortened to URL and also referred to as a web address, the uniform resource locator is used to locate web documents and files that are hosted on web servers connected to the internet. In SEO it is considered wise to optimise URLs for the keywords you wish the page to rank for in the SERPs.
Sometimes also called blended or enhanced search, this term refers to the integration of rich media into some search results. Sometimes Google will combine media from different verticals and show videos, images and maps either above or amongst the results.
This is a term used in web analytics and refers to a distinct person or visitor (usually identified by their IP address) who requests and visits a web page at least once during the reporting period. Each visitor is counted only once, regardless of how many times they visit your web pages.
This is used to describe any links that have been created unnaturally with the sole purpose of manipulating the search engine results. This is deemed a spammy link building practice and is against Google's guidelines; if detected, your website may be penalised.
Unnatural Link Profile
When analysing a website's backlink profile, if a large number of the links pointing to it appear to have been created unnaturally, this is considered an unnatural backlink profile. As mentioned above, any website deemed to have an unnatural backlink profile made up of artificial and superfluous links is at high risk of being penalised, so this should be avoided.
In web design, web usability, or usability as it is more commonly referred to, is how easy a website is for its users to navigate and use its functions. The main principles behind usability are to present information in a clear way and make it easily accessible to users.
User agents are types of software used to access the world wide web. In the web and HTTP protocols, the user agent tells the web server the type of operating system and browser a person is using to make requests online. The most common form of user agent is the web browser, but the term also covers search engine bots and crawlers.
User Centric Design
Used to describe a website that has been designed and developed with the intended user actions or process given careful thought and planned out accordingly. Typically, this will involve research into how the intended audience will use the website, how well the site functions and if it meets user needs. All of this is taken into account at each step of design of the website.
User Generated Content (UGC)
Used to describe any content created by the users of the platform it resides on. This is normally in the shape of online forums, where all the content is generated by those using them. This can be a powerful tool in SEO, as those who take part in creating content are usually very willing to help promote and share it, having helped create it; a tactic often used in link building roundups.
User Experience (UX)
This is used to refer to how easy a website is to use by its intended user base and their overall experience in using it. Creating the most relevant content for users, making use of easily readable fonts, using great quality images, attention-grabbing headlines and content copy, and relevant calls to action, as well as creating a uniform experience across all devices, all go towards creating a great user experience.
User experience is important in SEO, as many search engines, including Google and Bing, place great emphasis on delivering the best results to their users by serving them the best quality and most relevant information. If your website's bounce rate is high, this can indicate to search engines either that users weren't satisfied with the results, or that they were put off by the website's user experience. Both count towards an unsatisfactory experience for the search user and can negatively impact your SEO efforts.
User Interface (UI)
This is a term used to describe the front end of a website or application: the sections that users interact with to make the application or website function. A prime example of a simple user interface would be a website's main navigation menu.
This is the acronym for unique value proposition, a marketing factor used to differentiate a product or service from those of its competitors. It can be thought of as a “we have what they don't” statement and is typically used to highlight the things that make one service or product better or more suitable for a situation than another.
Vary: User-Agent HTTP Header
This is an HTTP response header, used when a website serves both desktop and mobile versions of the same page from the same URL. When a person's desktop or mobile browser (the user agent) connects to a web server, it identifies itself in the User-Agent request header; the server uses this to decide which version of the page to serve, and the Vary: User-Agent response header tells caches and search engines that the content returned depends on the requesting user agent.
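A sketch of a hypothetical request/response exchange (the host and user agent string are placeholders) showing where the header appears:

```http
GET /products HTTP/1.1
Host: www.example.com
User-Agent: Mozilla/5.0 (iPhone; CPU iPhone OS 16_0 like Mac OS X) ...

HTTP/1.1 200 OK
Content-Type: text/html
Vary: User-Agent
```

The Vary line tells any intermediate cache not to serve the mobile HTML it stored for this URL to a desktop browser, and vice versa.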
This is when content that is created or marketed to an audience becomes extremely popular in a very short amount of time; often a piece can become “viral” in just a few hours. The term is borrowed from medicine, as viruses spread very quickly amongst hosts. Usually viral marketing makes use of popular social networking sites to enable large audiences to be reached in a very short amount of time.
Virtual Private Network (VPN)
A virtual private network, or VPN for short, is a private encrypted network that extends across a public one. It allows users to connect and send and receive data as though they were part of the private network, encrypting data, obfuscating searches and keeping activity private. VPN usage has increased in recent times with growing fears around cybercrime and privacy.
Virtual Private Server (VPS)
Virtual private servers are remote computers that can be accessed via a private connection and come preloaded with an operating system; they essentially give the user a computer that runs 24 hours a day. These are particularly useful for running software that may be too taxing to run on your own PC, or for tasks that take a long time to compute and run.
Short for video blog or video log, vlogs are a form of blog where the primary medium is video. YouTube is probably the most well-known vlogging platform, where all the content is video.
This is the World Wide Web Consortium, the main internet standards organisation on the web. Made up of member organisations, its own staff and members of the public, its main aim is to develop and provide a set of universal standards for the world wide web. It is led by the inventor of the web, Sir Tim Berners-Lee, and member organisations include the Mozilla Foundation, Apple Inc., Opera Software, Google Inc. and many others.
Used to describe a set or group of pages that are interlinked with each other but not with any other pages on the website. Despite not being linked to from the rest of the website, they may still be indexed if included in the XML sitemap.
Also known as the participative or social web, this is used to describe any website whose primary source of content is its users (user generated content) and which can be used by the majority of the population with ease. This includes social media sites like Facebook and Twitter, but also sites like Reddit or Medium, where users can post their own articles and content.
Web directories are websites that list other businesses and their websites in organised hierarchical lists, usually alphabetical or by industry. In the past, directories were abused by SEOs to make mass links easily, but over time search engines have adjusted their algorithms to give these types of links less weight when ranking the websites listed. However, web directories are still relevant for many industries, and it makes sense to use them as long as they are appropriate for your website's niche; they also act as a form of online citation.
This is used to describe any search engine optimisation practice that strictly adheres to search engine guidelines, such as Google's own Webmaster Guidelines or Bing's Webmaster Guidelines. The opposite of Black Hat SEO.
It should be noted that these guidelines are not set in stone, so techniques that are acceptable today may be classed as deceptive tomorrow.
These are pieces of written content that go into depth to fully inform and educate the reader on a given subject or topic, and they are usually written by experts in those fields. They combine research and expert knowledge to help create debate around problems or solutions.
Whois
Every individual, business or organisation that registers a new domain with a domain registrar must supply their details, so that it is known who owns the new domain. All this information is sent to a central database where people can look up the details of the owner. This data is referred to as “Whois data”, coined from looking up “who is” the domain owner, which gives us the name Whois.
Small applications used in web design with limited functionality that can be executed on the web page itself. They can be used to display information from other parts of the website, such as blog post archives or social buttons.
A wiki is a website that uses wiki software, known as a wiki engine, to allow its users to collaborate on creating, modifying and editing content on the website. Unlike blogs, wikis typically don't have a common structure and are instead structured depending on their subject or topic needs. The most widely known wiki is Wikipedia, the online collaborative encyclopaedia managed and run by its own editors.
In web technology, a wildcard certificate is a form of Secure Sockets Layer (SSL) certificate used to extend encryption to multiple subdomains of a domain. Using wildcard SSL certificates has its own set of pros and cons: using one certificate to cover all subdomains may save money, but any issues with the certificate, or its revocation, also extend to the subdomains it covers, leaving your domain and subdomains unprotected.
This is the acronym for “What You See Is What You Get” editor, which allows content and web pages to be edited in a form that resembles the appearance of the finished item. The most popular forms of WYSIWYG editors are those used by hosted website builders such as Wix or Weebly, where the page as visually designed is how the finished product will look.
This stands for Extensible Hypertext Markup Language, a branch of the XML markup languages created to be readable by both humans and machines. It was created to extend the HTML vocabulary and increase its ability to be used with other data sets. Like HTML, XHTML is easily understood and read by search engines.
Short for Extensible Markup Language sitemap, and similar to image and HTML sitemaps, an XML sitemap is used to list all the pages of a website in a hierarchical list. Unlike its HTML counterpart, an XML sitemap is made solely to communicate with search engine bots and crawlers, so they know which pages you wish to be crawled and indexed.
It also contains metadata that tells the search engine when a page was created, when it was last modified and how often it changes. This helps bots crawl the site without wasting crawl budget.
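As a sketch, a minimal XML sitemap for a hypothetical site (the URLs and dates are placeholders; the urlset namespace is the standard sitemaps.org one) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2023-10-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/blog/</loc>
    <lastmod>2023-09-15</lastmod>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```

The file is usually placed at the site root and referenced from robots.txt or submitted through Google Search Console.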
This is the name of what used to be the main search engine in the early days of search, before Google established its market dominance. Funnily enough, despite being a search engine, Yahoo never actually did most of its own web crawling. It relied on Inktomi, and later on Google, to provide its search results, before buying Inktomi in 2002 and returning to its own search technology. It wasn't until 2009 that Yahoo signed a deal with Microsoft, who would then provide Yahoo users with search results from Bing.
This is the name of the most popular search engine in Russia, accounting for more than half (around 52%) of all searches within Russia.
An acronym for “Your Money or Your Life”, this is used to describe any website whose content can affect someone's health, happiness or finances. This includes sites that offer advice or updates on any of the following topics:
- Medical pages: Information and advice on diagnosing and treating diseases.
- Financial Pages: Stocks and bond trading or mortgage, investment and retirement advice.
- Parenting Pages: Mummy advice blogs that offer parenting advice.
- Legal Pages: Those offering legal advice around divorce, child custody, will creation and citizenship
YouTube is a website built on user generated content, with video as its primary medium. Launched in 2005, it allows its users to search for, create, upload and share videos, rate and watch them, and subscribe to channels for future videos. Because of YouTube's rapid rise in popularity, it was bought by Google in 2006 for $1.65 billion. It now runs as a subsidiary of the company and is the second most popular website in the world, as well as being the second most popular search engine after Google.
Zero Results Page
This refers to a test Google performed within its search results. Because some search queries have definitive answers, Google created and tested what came to be called the zero results page. These results would bring back a knowledge graph answer and display no other organic results. This was true for queries such as “what is the time in London”, where, irrespective of the index, there was only one real answer: whatever the time was when the search was made. However, in March 2018 Google confirmed it had suspended the zero results page test, and the same query will now display both an answer box and organic results.
This is short for Zero Moment of Truth, a term coined by Google in its 2011 eBook “ZMOT”. It refers to the point in the consumer purchase cycle between a person seeing an advert or product and the moment they actually decide to purchase. This is important in SEO, as millions of people search for product information before buying, and it's this moment, between seeing something they want, researching reviews, and deciding to buy, that Google calls the Zero Moment of Truth.