Usability vs SEO: the big misunderstanding

Dilemma

When working in big companies, where everyone has their own speciality, conflicts are bound to happen. Social media, SEO, UX design, development, content management: everyone has their own reasons, and it seems impossible to reconcile every request. The consequence is a less than optimal compromise.

As an SEO, I too often hear statements like “I understand you need this for SEO purposes, but we can’t do it because it would be bad for usability”. A big misunderstanding.

No, it’s not what you are thinking: we are not talking about keyword stuffing all over the page (that stuff stopped working before I started studying SEO in 2004 – 10 years ago…).

And every time I hear that, I can hardly keep calm, because it’s immediately clear to me that the other person is totally missing the point. And I’m not referring just to the specific SEO request; I’m talking about completely missing what Google wants. Because in the end, doing SEO is a lot about understanding what Google wants. And what is it?

Google wants to make its users happy.

Once you understand this, deeply understand this, the question “usability or SEO?” suddenly becomes meaningless. SEO and usability have the same goal!

This very simple truth has a huge consequence: the conflict between the SEO guy and the UX guy becomes a joint effort to find the best way to implement an improvement. No longer a less than optimal compromise, but possibly a new best practice to experiment with.

Unless the SEO guy and/or the UX guy isn’t good at their job, of course.

By the way, there’s an interesting two-year-old article by Jakob Nielsen about SEO and usability, and the few possible conflicts he mentions there aren’t in fact an issue, if the SEO guy really knows his job.

SEO and web usability go hand in hand.

If a solution is not good for both sides, chances are that the solution is not good for either side!

In that case, please reconsider your options.

Remove the favicon in Genesis

StudioPress

StudioPress themes (my favourite WordPress themes: this site uses one of them too) come with a default favicon.

And chances are you want to remove it.

Now, what you find on Google might not work. That’s because the code to remove the favicon changed with the new Genesis 2.

So, here are the solutions for both versions (however, I don’t see why you wouldn’t upgrade Genesis).

With the old Genesis, the code to put in your functions.php is:

/** Remove favicon */
remove_action('genesis_meta', 'genesis_load_favicon');

With the new Genesis, the code is:

/** Remove favicon */
remove_action('wp_head', 'genesis_load_favicon');

Add the appropriate line to your (child theme) functions.php and check the source code: you shouldn't see the favicon there anymore (the browser might still show it for a while, because it's cached).
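
By the way, if the reason you remove the default favicon is to serve your own, here is a minimal sketch (it assumes you have uploaded a favicon.ico to the root of your site; the function name is just an example):

/** Sketch: point browsers to your own favicon in the site root */
add_action( 'wp_head', 'my_custom_favicon' );
function my_custom_favicon() {
echo '<link rel="shortcut icon" href="' . esc_url( home_url( '/favicon.ico' ) ) . '" />' . "\n";
}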

My 21 favourite marketing blogs of 2014

Reading

Here is the list of the 21 marketing blogs I have found most interesting lately.

(The list is strictly in alphabetical order.)

Topics covered are online marketing, SEO, copywriting, content marketing, conversion rate optimization, social media marketing and web analytics.

Digital marketing blog by Adobe

Bryan Eisenberg

Buffer Social

ConversionXL

Copyblogger

The copybot

Hubspot

Kiss Metrics

Majestic Blog

Marketing Land

Marketing Profs

Marketo Blog

Moz Blog

Occam’s Razor by Avinash Kaushik

Search Engine Journal

Search Engine Land

SEMrush blog

Signal vs. Noise

Social Media Examiner

Sprout Social

Unbounce


Update 27/10/2014: I realized I forgot one of my very favourite ones, and updated the post accordingly.

2 books that have made me a better SEO

Like everyone else, I wasn’t born an SEO.

However, when I first started doing SEO in 2004, I already had a solid understanding of how the Internet works. In fact, at that time I was more of a system/network administrator.

I knew many things, but I certainly didn’t know that knowledge would later help me become a better SEO.

I didn’t even know what SEO was, I think.

But still, without realizing it, I was already learning it. The two books are:

  1. TCP/IP Illustrated Vol. 1
  2. DNS&Bind

I know, it seems unnecessary to study the basics of the Internet. However, still today I find myself answering SEO questions using the knowledge I got from these two books.

Also, I’m not saying that without this knowledge you can’t be an SEO.

After all, I’m not your average SEO.

How to make the WordPress login cookie expire later than 2 weeks

By default, WordPress keeps you logged in for 14 days (if you tick the “Remember me” checkbox).

If you want to change this period, you only have to add these lines to the functions.php of your theme:

/** Keep me logged in for 1 year */
add_filter( 'auth_cookie_expiration', 'keep_me_logged_in_for_1_year' );
function keep_me_logged_in_for_1_year( $expirein ) {
return 31556926; // 1 year in seconds
}

In this case, the cookie will expire in 1 year, so I don’t have to log in every two weeks. If you want the cookie to last a different period of time, just change that number accordingly (in seconds).
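
The same filter also receives the user ID and the state of the “Remember me” checkbox, so – as a sketch, with a hypothetical function name – you could extend the cookie only when the box is actually ticked:

/** Sketch: 1-year cookie only when "Remember me" is ticked */
add_filter( 'auth_cookie_expiration', 'keep_me_logged_in_if_remembered', 10, 3 );
function keep_me_logged_in_if_remembered( $expirein, $user_id, $remember ) {
return $remember ? 31556926 : $expirein; // 1 year in seconds, otherwise the default
}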


Change the default sender on WordPress

By default, WordPress sends notification emails from WordPress <wordpress@yourdomain.tld>.

To change this to something nicer, just add a few lines of code to your theme’s functions.php file:

add_filter('wp_mail_from', 'new_mail_from');
add_filter('wp_mail_from_name', 'new_mail_from_name');

function new_mail_from($old) {
return 'info@yourdomain.tld';
}
function new_mail_from_name($old) {
return 'Yourdomain';
}

Clearly, remember to change info@yourdomain.tld and Yourdomain.
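
To check that the filters are in place, a quick test is to send yourself an email (the recipient address below is just a placeholder):

/** Sketch: send a test email to verify the new sender */
wp_mail( 'you@example.com', 'Sender test', 'This email should now come from info@yourdomain.tld.' );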


Celebrity Website SEO Audit

Yesterday I was following (“participating” would be too much for someone who doesn’t say a word, right?) the Celebrity Website SEO Audit, a Google Hangout organized by Dejan SEO.

I don’t know if it was the attack on Spamhaus, a too slow internet connection here in Malta, or just that sometimes I’m not very patient, but after about 10–15 minutes I gave up trying to follow the SEO audit.

However, in the few minutes I did follow, I got a bit of what I was actually looking for: basically, how other SEOs conduct a quick SEO audit, what they look for first, and so on. It’s always good to compare your way of doing things with others’.

And what I could see is that there are no big differences. I didn’t expect any, but you never know.

However, to apologize to Dan and all the others, I would like to point out 3 issues I found on LadyGaga.com, the first website audited yesterday. I apologize again if any of these issues was already pointed out yesterday, but as I said, I had 5 minutes of connection problems and then quit, so I might have missed it.

  • The first issue I saw is the link on the logo, which points to /Default.aspx instead of /. There’s not even a rel=”canonical” telling search engines that / and /Default.aspx are the same page. Yes, I know: if you search for the Default.aspx page, you don’t find it on Google, so one might ask why bother. Mmm…
  • Performance can be improved just by adding max-age or Expires headers to a lot of the static content, or by combining CSS and JavaScript files (see the sketch right after this list).
  • All the events could use the Event schema from schema.org.
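
For the caching point, a minimal .htaccess sketch (assuming Apache with mod_expires enabled; the lifetimes are just examples) could look like this:

# Sketch: far-future Expires headers for static assets
<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType image/png "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType text/css "access plus 1 week"
ExpiresByType application/javascript "access plus 1 week"
</IfModule>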

These are my 2 cents. Hopefully next time I will be able to participate properly, also because SEO audits are something I really enjoy doing.

Redirect 301 for everyone but you

Consider this post just a quick note to myself, because I will surely need this again in the future.

Scenario: I have a production server and a development server. As you can easily guess, the production server hosts the live version of the websites, while on the development server I work on new features, bug fixes and new websites.

What I usually do is create a specific subdomain for development. Sometimes I add that hostname as a DNS record, sometimes I just add it to my hosts file. Either way, it’s still possible for others to find your “hidden” development website (for example, Alexa.com displays the subdomains getting traffic, so if you use the Alexa toolbar – not necessarily the official one – you will see your development website listed under your domain statistics on Alexa.com).

The solution to the problem is extremely simple: just redirect the traffic coming from any IP address but yours to the production website. Add these lines to the .htaccess of the development website:

RewriteEngine On
# If it's not your IP address...
RewriteCond %{REMOTE_ADDR} !^1\.2\.3\.4$
# ...redirect everything to the production host
RewriteRule ^(.*)$ http://www.stefanogorgoni.com/$1 [R=301,L]

Of course, change 1.2.3.4 to your IP address (if you have a dynamic IP, don’t forget to update the .htaccess every time the IP changes) and change the destination hostname (the www.stefanogorgoni.com part).
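
If more than one address needs access (say, home and office), you can simply stack conditions; a quick sketch with placeholder IPs:

RewriteEngine On
# Redirect only visitors matching neither of the allowed IPs
RewriteCond %{REMOTE_ADDR} !^1\.2\.3\.4$
RewriteCond %{REMOTE_ADDR} !^5\.6\.7\.8$
RewriteRule ^(.*)$ http://www.stefanogorgoni.com/$1 [R=301,L]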


Link building in the gambling industry

After the Penguin update, the SEO community talked a lot about unnatural links. Most of the time, the links that look unnatural are also purchased links.

This is true because SEOs focused so much on the anchor text of the links they could get from other websites that they forgot one of the great rules of SEO: if you are doing something unnatural, make it look natural.

The panic is at such a level that many SEOs have started contacting link vendors to ask them to take the links down.

Ah, if only all those SEOs hadn’t spent so much time forcing things… But OK, not everybody is meant to be a great SEO…

So, penalization after penalization, people are learning the importance of getting links with the most stupid anchor texts… Better late than never!

But is Google really penalizing every website that buys links? Clearly not! First of all, it would be absolutely impossible for Google to know, for every single link, whether it was purchased or spontaneous. They can only guess.

I work in a weird industry: gambling. Gambling is usually considered a very competitive niche. Yes, it is, but SEO in this industry is also really primitive. So I wouldn’t say it’s that competitive! In the short term, sure it is… But basically every website is violating Google’s guidelines, so it’s reasonable to expect that one day they will simply be wiped out.

If someone today wants to work purely white hat, they won’t rank in the short term, but they might be one of the best websites in the long term.

One aspect of the story is funny, though: many online casinos buy links they don’t even need. They just have a big budget and buy everything they can get.

Is SEO in gambling hard? Well, most of the time an SEO in this industry just contacts (or is contacted by) a vendor, asks for a price, gets a quote for every website (a quote that varies depending on PageRank! In 2012!), and chooses an anchor text and a destination page to link to. Hard work!

This is the very competitive SEO in the gambling industry!

LOL

Even those companies that say “we don’t do that, come on” work this way. An example? This is just a little excerpt from a job description posted by one of the biggest betting companies when it was looking for an SEO Manager more than a year ago:

Would you expect to receive an offer of €10 for a post with two backlinks from the same company? Well, what can I say… they don’t buy links. They beg for them!

But… is there any other way to acquire links for a gambling website? I mean, apart from buying or exchanging links, what else can an SEO working in this industry do? It’s not all our fault. After all, who would naturally link to a gaming website?

50% of webmasters wouldn’t do it at any cost; the other 50% would do it only for money. So, there’s no room for free links in this industry.

Unless you are creative. Unless you do something worth a link. Unless you deserve it.

But even doing so, you are still fighting competitors that spend many thousands of dollars on links. If Google decided (and were actually able) to ban all the websites buying links, today we’d likely see this SERP when searching for online casino:

SERP for the keyword "online casino" if Google banned websites that buy links



The irrational fear of high bounce rate

Watching a Whiteboard Friday video on SEOmoz, 10 Myths That Scare SEOs But Shouldn’t, I got stuck for a bit on the 5th myth listed by Rand, about a high bounce rate.

Because yes, many SEOs are really afraid of that: they are afraid that Google can see that data and use it against their website.

The point is: this makes no sense. Users search for information on a search engine, they click on a result, they find the answer… and they leave. Happy.

What’s wrong with this behavior? Nothing! Why should an SEO be worried about the bounce rate then? No reason.

Rand correctly uses the example of a Q&A website. I’ll go further: if you go to Google, search for something, and leave Google without coming back for more results, doesn’t it mean Google actually provided you with a good result?

OK, technically speaking, searching for a key phrase and clicking on a result doesn’t increase Google’s bounce rate, since you are visiting two pages, not just one. But still, the time spent on Google is short. The shorter, the better.

Maybe you should worry about a short time spent on your website for other reasons. But that data, out of context, means nothing. So, don’t panic.