How to make the WordPress login cookie expire later than 2 weeks

By default, WordPress keeps you logged in for 14 days (if you tick the “Remember me” checkbox).

If you want a different period, you only have to add these lines to your theme’s functions.php:

add_filter( 'auth_cookie_expiration', 'keep_me_logged_in_for_1_year' );
function keep_me_logged_in_for_1_year( $expirein ) {
    return 31556926; // 1 year in seconds
}

In this case, the cookie will expire in 1 year, so I don’t have to log in every two weeks. If you want the cookie to last a different period of time, just change that number (in seconds) accordingly.
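For reference, the usual durations work out like this in plain PHP (the 31556926 used in the filter is a solar year; recent WordPress versions also define convenience constants such as DAY_IN_SECONDS that you could return instead, but check your version before relying on them):

```php
<?php
// Common "Remember me" lifetimes, expressed in seconds
$day   = 60 * 60 * 24;  // 86400
$week  = 7 * $day;      // 604800 (half the two-week WordPress default)
$month = 30 * $day;     // 2592000
$year  = 365 * $day;    // 31536000 (the filter above returns 31556926, a solar year)
echo $year;
```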

 

Change the default sender on WordPress

By default, WordPress sends notification emails from WordPress <wordpress@yourdomain.tld>.

To change this to something nicer, just add a few lines of code to your theme’s functions.php file:

add_filter('wp_mail_from', 'new_mail_from');
add_filter('wp_mail_from_name', 'new_mail_from_name');

function new_mail_from($old) {
    return 'info@yourdomain.tld';
}
function new_mail_from_name($old) {
    return 'Yourdomain';
}

Of course, remember to replace info@yourdomain.tld and Yourdomain with your own values.

 

Redirect 301 for everyone but you

Consider this post just a quick note to myself, because I will surely need this again in the future.

Scenario: I have a production server and a development server. As you can easily guess, the production server hosts the live version of the websites, while on the development server I develop new features, bug fixes, and new websites.

What I usually do is create a specific subdomain for development. Sometimes I add that hostname as a DNS record, sometimes I just add it to my hosts file. Either way, it’s still possible for others to find your “hidden” development website (for example, Alexa.com displays the subdomains getting traffic, so if you use the Alexa toolbar – not necessarily the official one – you will see your development website listed under your domain statistics on Alexa.com).

The solution is extremely simple: redirect traffic coming from any IP address but yours to the production website. Just add these lines to the .htaccess of the development website:

RewriteEngine On
# If it's not your IP address (dots escaped: this is a regular expression)
RewriteCond %{REMOTE_ADDR} !^1\.2\.3\.4$
# Redirect everything to the production host
RewriteRule ^(.*)$ http://www.stefanogorgoni.com/$1 [R=301,L]

Of course, change 1.2.3.4 to your own IP address (if you have a dynamic IP, don’t forget to update the .htaccess every time the IP changes), and the destination hostname (the www.stefanogorgoni.com part).
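If you connect from more than one place, you can chain conditions: every RewriteCond must fail to match before the redirect fires, so traffic from any listed address is let through. The dots are escaped because REMOTE_ADDR is compared against a regular expression; the addresses below are placeholders:

```apache
RewriteEngine On
# Let these (placeholder) addresses through; redirect everyone else
RewriteCond %{REMOTE_ADDR} !^1\.2\.3\.4$
RewriteCond %{REMOTE_ADDR} !^5\.6\.7\.8$
RewriteRule ^(.*)$ http://www.stefanogorgoni.com/$1 [R=301,L]
```

Multiple RewriteCond lines are ANDed by default, which is exactly what we want here.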

 

Get domain from URL

How to get the domain from the URL? It depends!

Lately I’ve spent some time trying to figure out the best way to solve this problem.

Scenario: a website reachable through two different second-level domains (and a bunch of third-level domains). No redirects from one domain to the other, or from the third-level domains to the second-level ones (and this behaviour couldn’t be changed). The two SLDs each have their own virtual host configured in Apache (this detail is very important, as you will see).

Please note: the following possible solutions use PHP, but I guess that, apart from the syntax, the logic would be the same in any other language. I’m not a programmer anyway, so I won’t put much code here (feel free to add yours in the comments, if you want).

One possible solution is to get the server name:

<?php
echo $_SERVER['SERVER_NAME'];
?>

and then take the last two dot-separated labels, starting from the end.

So, if SERVER_NAME is www.mydomain.tld, you would get mydomain.tld, which is the second level domain.

This solution can be good enough if you know in advance you won’t deal with suffixes that themselves contain a dot, like co.uk, com.mt or com.au, just to name a few.

But if the website is accessible through google.com and google.co.uk (the first example coming to my mind, I wonder why), this kind of solution would return google.com and co.uk. Not exactly what you’d want.
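As a quick sketch of the naive approach (the helper name is mine, not from any library):

```php
<?php
// Keep only the last two dot-separated labels of a hostname.
function naive_domain($host) {
    $labels = explode('.', $host);
    return implode('.', array_slice($labels, -2));
}

echo naive_domain('www.mydomain.tld'), "\n";  // mydomain.tld — fine
echo naive_domain('www.google.co.uk'), "\n";  // co.uk — wrong!
```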

A more sophisticated solution would be to check the TLD against a list (there is one here but it’s not complete). If you have a complete list of TLD, you can get the SERVER_NAME, check what TLD is in it, and pick up the part of the hostname before the TLD (plus the TLD itself, of course).
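A minimal sketch of that idea, assuming you have a suffix list at hand (the helper and the three-entry list are mine; a real list has thousands of entries). Suffixes must be tried longest-first, so that co.uk wins over a plain uk:

```php
<?php
// Return "label.suffix" for the longest matching suffix, or null if none matches.
function domain_from_host($host, array $suffixes) {
    // Sort longest first, so 'co.uk' is tried before 'uk'
    usort($suffixes, function ($a, $b) { return strlen($b) - strlen($a); });
    foreach ($suffixes as $suffix) {
        $tail = '.' . $suffix;
        if (substr($host, -strlen($tail)) === $tail) {
            $labels = explode('.', substr($host, 0, -strlen($tail)));
            return end($labels) . $tail;
        }
    }
    return null; // unknown suffix
}

echo domain_from_host('www.google.co.uk', array('com', 'uk', 'co.uk')), "\n"; // google.co.uk
echo domain_from_host('www.google.com', array('com', 'uk', 'co.uk')), "\n";   // google.com
```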

For both the solutions above, you can find a lot of code snippets on Google.

But my favourite solution is the third! In fact, you can set a variable defining the domain in the Apache virtual host (in the two virtual hosts, in my case):

<VirtualHost *:80>
    ServerName www.domain1.tld
    SetEnv MY_DOMAIN domain1.tld
</VirtualHost>

<VirtualHost *:80>
    ServerName www.domain2.tld
    SetEnv MY_DOMAIN domain2.tld
</VirtualHost>

This way it’s Apache defining the exact domain value, and at this point you can read the variable in PHP with a simple

<?php
echo $_SERVER['MY_DOMAIN'];
?>

For the record, I needed the variable to create a cookie valid for the second-level domain and any of its subdomains. So, once the variable was defined in the virtual host, all I had to do was something like this:

$domain = $_SERVER['MY_DOMAIN'];
if (isset($_GET['parameter'])) {
    $variable = htmlentities($_GET['parameter']);
    // One-week cookie, valid for the whole domain and its subdomains
    setcookie("mycookie", $variable, time() + (60*60*24*7), "/", $domain);
}

In case you find yourself in the same situation, I hope this saves you some time.

P.S. As you can see, being able to edit the virtual host is essential for this solution.

WordPress and XML sitemaps plugins

If you have a WordPress installation with the multisite feature enabled, you may have had trouble finding the right plugin to generate an XML sitemap to submit to search engines.

I usually use Google XML Sitemaps, maybe the most used plugin to generate XML sitemaps on WordPress. Unfortunately, this nice plugin doesn’t work on WP Multisite. And it doesn’t generate multiple sitemaps.

If you want an XML sitemap plugin that works on your multisite WordPress, you may want to try WordPress SEO by Yoast, as far as I know the only plugin that does a good job of generating a sitemap on a WP multisite website.

But if you have a huge website, with thousands and thousands of URLs, you may run into another kind of issue. In fact, none of the above-mentioned plugins generates multiple sitemaps (plus the sitemap index, of course) when you have more than 50,000 URLs to list. And by the way, 50,000 is the limit set by the sitemap protocol, but Google doesn’t seem to love sitemaps with more than 10,000 URLs. If you have this issue, you should try Strictly Google Sitemap, a plugin that can generate multiple sitemaps (and with great performance!). The only problem I found using this plugin is that the permalink structure must include some numeric value (%post_id%, for example), or the generated sitemap won’t be correct.
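For the record, splitting the URL list is the trivial part; what the big-site plugins add is a sitemap index file listing the generated parts. A minimal sketch of both steps (hypothetical helper names, made-up URLs):

```php
<?php
// Split a URL list into files of at most 10,000 entries (the protocol
// allows up to 50,000 per file, but smaller files seem to work better).
function sitemap_chunks(array $urls, $per_file = 10000) {
    return array_chunk($urls, $per_file);
}

// The sitemap index then lists one <sitemap> entry per generated file.
function sitemap_index(array $sitemap_urls) {
    $xml = '<?xml version="1.0" encoding="UTF-8"?>' . "\n"
         . '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
    foreach ($sitemap_urls as $url) {
        $xml .= '  <sitemap><loc>' . htmlspecialchars($url) . "</loc></sitemap>\n";
    }
    return $xml . '</sitemapindex>';
}

echo sitemap_index(array('http://yourblog.tld/sitemap-1.xml'));
```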

And what if you have a WP multisite where some websites in the network have more than 50,000 (or just 10,000) URLs? I’m afraid we have to wait: I couldn’t find any plugin that handles it.

Pidgin cannot connect to MSN: the certificate chain presented is invalid

The certificate for omega.contacts.msn.com could not be validated. The certificate chain presented is invalid.

If you get this error when trying to connect to MSN Messenger with Pidgin, this is the easy and quick way to fix the problem: just delete the contacts.msn.com SSL certificate.

rm ~/.purple/certificates/x509/tls_peers/contacts.msn.com

This way, Pidgin will download the SSL certificate again and everything will work again.

Update: check the comments for other possible fixes.

WordPress, Feedburner and sitemaps

UPDATE (19/03/2011): it seems the latest version of the plugin already takes care of Googlebot, so this post should be considered outdated.

If you use Feedburner for your WordPress feed, you probably use the FD Feedburner plugin for WordPress. The plugin is cool because it redirects your users to Feedburner while letting Feedburner itself access your WordPress feed; and it’s really simple to configure.

But if you want to submit your feed to Google Webmaster Tools, Google will be redirected to Feedburner too. While you may expect it to work, in some cases it won’t. If you track clicks on Feedburner, in fact, your feed will have changed links in it: Feedburner rewrites each <link>URL</link> to an internal URL that redirects to your own URL after the tracking.

As a consequence, if you submit your feed as a sitemap on Google Webmaster Tools, Google will show you errors like this:

[Image: Feedburner and Google sitemap]

This happens because the URLs in the Feedburner feed are not on your own domain but on http://feedproxy.google.com/.

To fix this behaviour, the easiest solution is to let Google access your original feed (http://yourblog.tld/feed/) instead of being redirected to Feedburner. This can easily be done with a little change to the plugin.

Edit the plugin (with a text editor accessing the file via FTP, or just from the dashboard -> Plugins -> Editor, selecting the FD Feedburner plugin) and look for this piece of code:

function feedburner_redirect() {
    global $feed, $withcomments, $wp, $wpdb, $wp_version, $wp_db_version;

    // Do nothing if not a feed
    if (!is_feed()) return;

    // Do nothing if feedburner is the user-agent
    if (preg_match('/feedburner/i', $_SERVER['HTTP_USER_AGENT'])) return;

    // Do nothing if not configured
    $options = get_option('fd_feedburner');
    if (!isset($options['feedburner_url'])) $options['feedburner_url'] = null;

Just change the line

if (preg_match('/feedburner/i', $_SERVER['HTTP_USER_AGENT'])) return;

with:

if (preg_match('/(feedburner|google)/i', $_SERVER['HTTP_USER_AGENT'])) return;

and you are done. Google won’t be redirected to your Feedburner feed anymore, and it will use your original feed as a sitemap.

Bigdump

While moving a website to a new hosting provider, I had a problem importing the database: the export was too big (30 MB) compared to the allowed upload size via phpMyAdmin on the new hosting (1 MB, where “M” maybe stands for “miserable”). And of course, no shell access…

So? Fortunately, I found BigDump, a GPL script that imports the exported file into the new database in small, staggered chunks. Excellent!

302 loops with Pligg

Yesterday I spent many hours trying to fix a 302 loop with Pligg (a great piece of open source software to create a Digg-clone site), so I’m going to post the (easy) solution here, in case someone else has the same problem.

After installing Pligg and enabling search-engine-friendly URLs (Pligg calls it URL Method 2), I got a loop of 302 redirects when I tried to visit the users page.

Apache’s error log said “Negotiation: discovered file(s) matching request: /path/to/vhost/domain.tld/htdocs/index.html (None could be negotiated).” That should have told me something, but I wasn’t able to look for the solution in the right place (Google is not always the best place…).

Apache’s access log showed a bunch of 302 redirects. So, the problem was a loop of 302 redirects (and the page never loaded, of course).

Where was the problem? I immediately thought of the .htaccess file, but even after commenting out every single rewrite instruction I still had the problem. Then I looked at Apache’s configuration to recall how the virtual hosts are set up (I use VHCS on the server where I installed Pligg).

At that point, something caught my eye: the vhosts have this line:

Options Indexes Includes FollowSymLinks MultiViews

In the .htaccess file I had this line:

Options +Indexes +FollowSymlinks

So, looking up each option in the Apache documentation, I found out that MultiViews was the option causing my problem (more info here).

And adding -MultiViews to the line above fixed the problem. So, just change

Options +Indexes +FollowSymlinks

to

Options +Indexes +FollowSymlinks -MultiViews

if you have the same 302 redirect loop problem.

I spent many hours figuring this out, so I hope this post saves you some time if you hit the same issue.

Mass mailing software

If you are looking for a versatile mass mailing software, written in PHP and freely provided under the GPL, you should definitely try poMMo.