INTRO TO SEO IN 20 MINUTES
JOE SINKWITZ
FOUNDER & CEO, INTELLIFLUENCE
WHO AM I?
• Involved in search industry since 1997
• Founder & CEO of Intellifluence
• Have managed thousands of clients
• I’ve been fortunate enough to play in all facets of online marketing, providing a broad perspective
WHAT IS THE DESCRIPTION OF THIS TALK?
The stated description of this talk is: “Learn everything from keyword research to formatting content for featured snippets so that you can learn how to optimize your website, blog, and even your video streams on YouTube.”
I’ll try to save you some time with simple recommendations on how to best accomplish this, but will focus more on what I believe is important to understand at an SEO 101 level.
Styled after Johnny Carson’s rules for things to avoid for a happy life.
WHAT IS YOUR TAKEAWAY?
SEO can be perceived as overly complex: it is possible to deep dive into various areas in ways that require a significant amount of research and experience to fully appreciate. However, I am a firm believer in focusing on a few core elements to maximize your efforts in the event that you aren’t looking to make a career out of SEO and instead want to use it as a tool when running your business.
SO LET’S TALK NEGATIVE SEO
• I bucket SEO into three areas: links, content, and user signals.
• Positive SEO under this broader view would be any tactic performed with the intent to positively impact rankings for a URL, and possibly its host domain, by manipulating a variable within the links, content, or user signals buckets.
• As an inverse, negative SEO would be any tactic performed with the intent to negatively impact rankings for a URL, and possibly its host domain, by manipulating a variable within the links, content, or user signals buckets.
WHAT ARE THE THREE BUCKETS?
Remember: Links, Content, and User Signals.
WHAT MIGHT YOU NEED?
1. A browser with access to Google and Bing [content]
2. Access to your raw weblogs [content & user signals]
3. Google Analytics [content & user signals]
4. Google Search Console [content, links, & user signals]
5. Bing Webmaster Tools [content, links, & user signals]
6. Ahrefs [links]
7. Sitebulb [content & user signals]
8. Copyscape [content]
DISCLAIMER ON TOOLS
• You can use alternative tools when evaluating links, content, and an approximation of user signals; this is just a snapshot of the publicly available tools I would reach for if I were to dive into an audit today.
• Every practitioner is also probably going to have their own homegrown tools, but it doesn't make sense to mention those given their lack of broad availability.
HOW TO BE PROACTIVE AND PREVENT A NEGATIVE SEO CAMPAIGN AGAINST YOU
• Bad news: there is no such thing as being hackproof.
• More bad news: there is also no such thing as being negative SEO-proof.
• All you can reasonably do is make the appropriate efforts to lessen the probability of becoming a victim by reducing attack vectors, so that someone seeking to do harm has to be more sophisticated and put forth more effort than they would against the average website.
CONTENT (AND INFRASTRUCTURE) AS AN ATTACK VECTOR
• Hosting
• CMS Considerations
• Robots.txt
• Scrape scrape scrape
HOSTING
• What can your host do to keep you out of trouble? Quite a bit.
• I debated placing hosting as a user signals vector given how important a proper setup is for uptime and site speed, but there's another critical factor at play with this specific recommendation: reputation.
• If you were to address 100% of the issues in this presentation, yet happen to be on a shared IP with a dozen other domains that are flagged for distributing malware, are blocked by email spam detection services, or are subject to manual link actions from Google, you're in for a bad time.
• Bad neighborhoods are a thing.
• At a minimum you will want a dedicated IP for the domain you care about, but ideally the site should be on its own dedicated server. The other advantage of not sharing a server at your host is that it removes an attack vector: a bad actor can no longer gain access to your hosting setup by compromising a less security-minded domain on the same machine.
CMS CONSIDERATIONS
• Not all content management systems (CMS) are made equal.
• Some will auto-spawn tag pages, archive pages, and separate image pages when you attempt to create a single page; some will automatically include a dofollow commenting section on posts, which screams "spam me!" to all my favorite spam tools.
• Since the majority of the world's websites run on WordPress, it is worth speaking to it with specificity: by default, disable comments, noindex tag pages, noindex author archive pages, and probably noindex category pages. Some will disagree, but in a Google that contains the Panda algorithmic penalty, my focus is on attempting to index high-value pages only, a hurdle tag pages, archives, and category pages rarely clear.
• Also, with certain CMSes it is important to ensure proper canonicalization is used to keep duplicated content from being indexed due to pagination and other query string nonsense.
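Concretely, "noindexing" a page type just means the page emits a robots meta tag in its head, and the pagination/query-string problem is handled with a canonical tag. A minimal sketch of both (the URL is a placeholder; in practice your SEO plugin writes these for you):

<meta name="robots" content="noindex, follow">
<link rel="canonical" href="https://example.com/your-post/">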
ROBOTS.TXT
• Robots.txt manipulation is a double-edged sword, and not just because it is very common to find a mistake that results in an entire domain being deindexed.
• When crawling rules are too tight, it is possible to pretend a bad content result exists at a somewhat expected path and have Google rank that nonexistent page on the basis of a domain's inherent authority and the keywords in the URL slug alone, since Google is prevented from actually crawling the page and therefore has to trust that it "might" exist.
• One of the biggest risk reductions comes in the form of disallowing search pages from being crawled and indexed. Without knowing which CMS you use, here's some generic advice for you to pick and choose from:
Disallow: /search/
Disallow: /*?s=
Disallow: /*?q=
ROBOTS.TXT (CONT.)
• Proper robots.txt setup isn't just for keeping poor-quality pages out of the index. When faced with crawl budgeting, it can also be important to block preview pages and failed-permalink pages so Google doesn't waste time getting caught in a spider trap. To do that in WordPress is relatively easy:
Disallow: *&preview=
Disallow: *?p=
Disallow: *&p=
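Because a stray robots.txt rule can deindex an entire domain, it's worth sanity-checking rules before shipping them. Python's built-in urllib.robotparser doesn't implement Google's * wildcard extension, so the sketch below hand-rolls an approximation of Google-style matching against the six generic rules above; the test paths are made up, so substitute real URLs from your own site:

import re

# The combined Disallow rules from the two slides above.
DISALLOW_RULES = ["/search/", "/*?s=", "/*?q=", "*&preview=", "*?p=", "*&p="]

def rule_to_regex(rule):
    # Google-style matching: '*' matches any run of characters, and a rule
    # is otherwise a prefix match against the path plus query string.
    return re.compile("^" + re.escape(rule).replace(r"\*", ".*"))

def is_blocked(path_and_query):
    return any(rule_to_regex(r).search(path_and_query) for r in DISALLOW_RULES)

# These should be blocked...
for path in ["/search/widgets", "/?s=widgets", "/page/2?p=9999", "/post?x=1&preview=true"]:
    assert is_blocked(path), path
# ...and these must stay crawlable.
for path in ["/", "/blog/my-post/", "/about"]:
    assert not is_blocked(path), path
print("robots.txt rules behave as expected")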
SCRAPING
• I'm not going to suggest that you take a stance on scraping content as a means to protect yourself.
• You'll need to be proactive in using a content protection service to ensure that your images and writing are not used elsewhere on the web without your authorization.
• While Google is a bit better now at attributing source material, issues still exist with authoritative domains being used as parasitic hosts, where the attacker purposefully and continuously crawls a target domain by sniffing its sitemap, posting the new content on the parasitic host within seconds of it going live on the target site.
• Using a service to find these content thieves is a must. Depending on where the content exists, you may need to issue a takedown request; I've compiled some of the relevant addresses here: http://www.digitalheretix.com/blog/how-to-remove-information-from-google/
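Since these parasitic hosts typically find new content by hammering your sitemap, your raw weblogs (item 2 on the tools list earlier) can surface them. A rough sketch, assuming the common Apache/nginx "combined" log format; adjust the regex and filename for your setup:

import re
from collections import Counter

LOG_LINE = re.compile(r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+)')

sitemap_hits = Counter()
with open("access.log") as fh:
    for line in fh:
        m = LOG_LINE.match(line)
        if m and "sitemap" in m.group("path").lower():
            sitemap_hits[m.group("ip")] += 1

# Googlebot and Bingbot poll sitemaps too, so verify the heavy hitters
# (reverse DNS, known crawler IP ranges) before blocking anything.
for ip, count in sitemap_hits.most_common(20):
    print(f"{count:6d}  {ip}")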
LINKS AS AN ATTACK VECTOR
• Outbound links via UGC
• Outbound links injected
• Inbound links
OUTBOUND LINKS VIA UGC
• As stated in the section on CMSes, I'm not a fan of open comments because they seem to be abused more often than used correctly, but what about other sources of UGC? If you add a community/forum section to your site for members to interact, I recommend doing one of four things:
a. Nofollow all external links
b. Force all external links to redirect through an internal page to strip outbound link equity (a minimal sketch of this option follows below)
c. Noindex all threads
d. Pre-moderate all external links
• Forums/communities that don't follow one of the above approaches likely appear on several Xrumer and GSA lists.
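As an illustration of option (b), here is a minimal sketch of an internal hop page, using Flask purely as an example framework; the /out path, parameter name, and validation are assumptions to adapt. Pair it with "Disallow: /out" in robots.txt so the hop page never passes equity:

from urllib.parse import urlparse

from flask import Flask, abort, redirect, request

app = Flask(__name__)

@app.route("/out")
def outbound():
    # User posts carry links as /out?url=https://... instead of the raw URL.
    target = request.args.get("url", "")
    parsed = urlparse(target)
    # Only bounce to real http(s) URLs; reject everything else. In production
    # you'd also allowlist or sign targets to avoid running an open redirect.
    if parsed.scheme not in ("http", "https") or not parsed.netloc:
        abort(400)
    # A 302 plus a robots.txt disallow on /out keeps outbound link equity
    # from flowing through the hop page.
    return redirect(target, code=302)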
OUTBOUND LINKS INJECTED
• This is a trickier issue to be proactive about, because by definition you're being reactive. However, monitoring Google Search Console for outbound links and scoring those links with a risk assessment tool of your choice, such as Link Research Tools, is going to be the best option here.
• Another method involves a consistent crawling script with multiple user agents (Google and not Google) to determine if any links or content exist that should not; this is essentially reverse engineering cloaking software to decloak injected issues. A sketch of the idea follows below.
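A rough sketch of that decloaking idea: fetch the same URL as a crawler and as a browser, then diff the outbound links. The user-agent strings and target URL are illustrative, and real cloakers often key on IP as well, so a clean diff is not proof of a clean page:

import re

import requests

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def outbound_links(url, user_agent):
    # Fetch with a given UA and pull out every absolute link on the page.
    html = requests.get(url, headers={"User-Agent": user_agent}, timeout=30).text
    return {href for href in re.findall(r'href=["\']([^"\']+)', html)
            if href.startswith("http")}

url = "https://example.com/some-post/"
crawler_only = outbound_links(url, GOOGLEBOT_UA) - outbound_links(url, BROWSER_UA)
for link in sorted(crawler_only):
    print("Only served to the crawler UA:", link)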
INBOUND LINKS
• Inbound links are far more likely to be your problem. There are only a few things you can reasonably do to try and protect yourself:
a. As an overall percentage of your incoming links, you want as many good ones as possible, which means getting links for yourself is part of your proactive strategy. Yes, I know, that's obvious; it's a lot like Google telling you to just make great content. The truth isn't far from that, though: if you are consistently focused on producing the best content assets in your niche, and can do so in a way that answers many of the questions users of sites in your niche might have, you'll consistently earn links. If you have only a few decent links and a bad actor decides to point a few hundred thousand very bad links at you, Google will almost certainly treat you unfavorably. The more uneconomical you can make that attack by increasing your beneficial links, the better.
INBOUND LINKS (CONT.)
b. Watch your anchor text. One easy filter to trip is still the over-optimization of anchor text, so even if you're attracting great links, be sure to do so in a manner that isn't forcing your audience into a narrow set of phrases you wish to rank for. If you do see your anchor text getting too concentrated, be on the lookout for a bad actor trying to slowly trip the filter by ramping up a more nuanced attack.
c. Disavow. I've gone on record that I don't like that disavow even exists, as I feel it is indicative of a guilty-until-proven-innocent environment within Google, but since it does exist, you'll want to proactively disavow based on your risk scoring solution. Remember, it is not just the overseas counterfeit, porn, and gambling emporium links that you'll need to address; also include those that appear to be part of any nuanced attack. (A sketch combining both ideas follows below.)
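To make (b) and (c) concrete, here is a minimal sketch that measures anchor-text concentration from a backlink export and writes a disavow file for domains your risk scoring flagged. The CSV columns (anchor, domain, risk) are assumptions about whatever export and scoring solution you use; the disavow format itself (# comments, then domain: entries or bare URLs, one per line) is Google's documented one:

import csv
from collections import Counter

anchors = Counter()
risky_domains = set()

# Hypothetical export: one row per backlink with anchor, domain, risk columns.
with open("backlinks.csv", newline="") as fh:
    for row in csv.DictReader(fh):
        anchors[row["anchor"].strip().lower()] += 1
        if float(row["risk"]) >= 0.8:  # the threshold is your call
            risky_domains.add(row["domain"].strip().lower())

total = sum(anchors.values())
print("Top anchors (watch for a single money phrase creeping up):")
for anchor, count in anchors.most_common(10):
    print(f"{count / total:6.1%}  {anchor}")

# Google's disavow file format: '#' comments, 'domain:' entries or bare URLs.
with open("disavow.txt", "w") as out:
    out.write("# Generated from risk-scored backlink export\n")
    for domain in sorted(risky_domains):
        out.write(f"domain:{domain}\n")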
USER SIGNALS AS AN ATTACK VECTOR
• There are really only a few factors that can come into play for you to be aware of, and for one of them there isn't much you can do:
• Metrics
• Speed
• Malware
METRICS
• CTR, time on site, and bounce metrics are steadily being folded in as more trusted signals by Google.
• Knowing your baseline stats in Google Search Console and Google Analytics is important here, because it is nominally easy to hire a botnet and a few thousand microworkers to click a result, bounce in 1 second, and have a portion of them file a suggestion that the domain they visited wasn't a quality site.
• All you can really hope to do is notice strange trends and then attempt to compensate; if it is an obvious botnet, block it at the server or CDN level (a rough log-analysis sketch follows below).
• If it is a bunch of incentivized users, however, all you can really do is handle the situation like you would your inbound links: aim to provide a satisfactory experience and acquire traffic that you know will offset the poor metrics.
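Here is a rough sketch of the kind of log triage you can do before reaching for a CDN rule: flag IPs that arrive from a Google SERP at volume yet only ever request a single URL. The combined log format and the single-URL heuristic are assumptions; verify candidates (reverse DNS, known ranges) before blocking:

import re
from collections import defaultdict

LOG_LINE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (?P<path>\S+)[^"]*" '
    r'\d+ \S+ "(?P<referer>[^"]*)"'
)

visits = defaultdict(set)   # ip -> set of paths requested
hits = defaultdict(int)     # ip -> total hits arriving from a Google referer

with open("access.log") as fh:
    for line in fh:
        m = LOG_LINE.match(line)
        if m and "google." in m.group("referer"):
            visits[m.group("ip")].add(m.group("path"))
            hits[m.group("ip")] += 1

# Many SERP arrivals but only ever one URL smells like click-and-bounce bots.
for ip, count in sorted(hits.items(), key=lambda kv: -kv[1])[:20]:
    if len(visits[ip]) == 1:
        print(f"candidate bot: {ip} ({count} SERP hits, single URL)")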
SPEED
• Earlier I alluded to including server setup as a speed consideration for negative SEO; want to prevent a potentially slow site from being used against you? Don't host it on a shaky setup.
• If possible, consider using a CDN to protect yourself somewhat from DDoS attacks, and ensure that your own server environment is up to date to prevent zero-day issues and known attacks such as UDP amplification and Slowloris (RSnake's fault); those are particularly nasty.
• Beyond that, you'll want to close off any way an individual could leech bandwidth from you: block inline linking of your images at the server level, remove unused CMS plugins, and establish proper caching.
MALWARE
• Malware as a user signal? Absolutely, though you could argue this is more of a content issue, and I'll agree with you.
• Nothing contributes to a poor experience quite like getting auto-redirected to a scam site by some injected JavaScript.
• To prevent such situations, in addition to keeping your hosting environment and CMS security updates current, it is healthy to periodically run a malware scanner on your site's server to seek out and remove anything injected. The sooner you can find problems, the better; thankfully, Google is pretty forgiving about addressing known malware compromises, but they don't catch everything, and will fold in poor user data as normal usage when they miss it.
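Beyond off-the-shelf scanners, a cheap homegrown tripwire is to hash every file under your web root against a known-good manifest; injected JavaScript shows up as a changed or brand-new file. A minimal sketch, with placeholder paths:

import hashlib
import json
import os
import sys

WEB_ROOT = "/var/www/html"        # placeholder: your site's document root
MANIFEST = "file_manifest.json"   # known-good hashes, kept off the server

def hash_tree(root):
    # Map each file's relative path to the SHA-256 of its contents.
    digests = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as fh:
                digests[os.path.relpath(path, root)] = hashlib.sha256(fh.read()).hexdigest()
    return digests

current = hash_tree(WEB_ROOT)
if "--baseline" in sys.argv:
    # Run once after a known-clean deploy to record the manifest.
    with open(MANIFEST, "w") as out:
        json.dump(current, out, indent=2)
else:
    with open(MANIFEST) as fh:
        baseline = json.load(fh)
    for path in sorted(set(current) | set(baseline)):
        if current.get(path) != baseline.get(path):
            print("CHANGED, NEW, OR MISSING:", path)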
OK, SO THAT WAS MAYBE MORE ABOUT NEGATIVE SEO THAN AN INTRO TO SEO…
Let’s quickly give you the answers to the topical guide.
• Want to properly format content for featured snippets? Purchase Yoast premium or use Rankmath and follow the plugin steps for simple content optimization. There’s no need to overthink it, as the schema does change over time and you can let the plugin manage that (an illustrative sketch of such markup follows after this list).
• Want to know what content to create in the first place? Perform a gap analysis in Excel by collecting paid search data from SpyFU and link data from Ahrefs. By combining these two sources you can get an understanding of which content is the most valuable and which content is the least link-supported; pick the low-hanging fruit. Coming soon: KWjuicer.com will do all of this research for you, and is owned by CopyPress, so they can also create the content for you.
• Want to optimize your streams on YouTube? Make sure you provide a detailed title and a description with a link to your site. Embed all your videos in your blog posts and get influencers to share both the blog posts and your direct YouTube videos. I own the largest warm contact influencer network in existence to help you. Enjoy a 60-day free trial on me by using the coupon code AFFSUMMIT at Intellifluence.com.
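For a sense of what those plugins manage under the hood: featured-snippet-friendly FAQ content is typically marked up as schema.org JSON-LD along these lines (question and answer text are placeholders), and the point of the plugin is that you never hand-maintain it as the schema evolves:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What are the three SEO buckets?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Links, content, and user signals."
    }
  }]
}
</script>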
Remember, there are very few things you really need to think about in most cases within SEO 101…
• Only create content that uniquely satisfies a user’s query; don’t create content for the sake of doing so. Make it the best possible answer for that query.
• Make sure that content is accessible to search engines and loads as quickly as you can get it to load, with a very clear KPI for what you want the user to do on that page.
• Get traffic-bearing links to that content. Don’t obsess about nofollow; obsess over whether that traffic source might convert on your stated page’s KPI.
For SEO 101, anyone trying to force you into doing something further is probably just trying to sell something.
CREDITS
Icons courtesy of flaticon.com (Freepik, Icon Monk, Icon Pond, mynamepong, pongsakornRed, Roundicons, Smashicons, Nikita Golubev, photo3idea_studio, Good Ware, phatplus, Prosymbols, Gregor Cresnar and Dave Gandy). All product names, logos, and brands are property of their respective owners.