
The idea of this plugin is to 'catch' bad links before sending the user there, so you don't lose visitors to dead pages. This version also lets you pass in URL=xxx; that URL will then be checked.

Please Note: If using this plugin on more than one site, you need one license per site!



Plugin Description

The idea of this plugin is to 'catch' bad links. It verifies that the link exists before sending the user to that page. If it finds an error, it shows the user an error message on your site instead of sending them to a dead link. It can also email you the link ID if you turn that option on.
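The check itself amounts to requesting the URL and looking at the HTTP status before redirecting. A rough sketch of that idea in Python (the plugin itself is written in Perl; the function name, parameters, and return convention here are hypothetical, not the plugin's actual API):

```python
import urllib.request
import urllib.error

def check_link(url, timeout=10):
    """Return the HTTP status code for url, or None if unreachable.

    Illustrative sketch only; a HEAD request avoids downloading the
    whole page just to learn whether it exists.
    """
    request = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return response.status
    except urllib.error.HTTPError as error:
        return error.code           # e.g. 404 for a missing page
    except (urllib.error.URLError, OSError):
        return None                 # DNS failure, timeout, refused, etc.
```

A `None` result (host unreachable) would be treated the same way as an error status code: show the visitor the error page rather than the dead link.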

The new, more powerful features let you use tags like:

<%if Description contains 'http://'%>
  <%loop URL_Loop%>
    <%SubURL%> - Status: <%Plugins::Error_Jump_Articles::GetURLStatus($ID,$SubURL)%>
  <%endloop%>
<%endif%>

This prints out a list of URLs extracted from the "Description" field (the field is configurable). This is ideal for giving feedback on the last known status of a link before sending the user there. For example:

http://www.ultranerds.co.uk/forum - Status: 200
http://www.domain.com - Status: 200
http://www.a-dead-site.org.com - Status: 404
http://www.google.com/some_sub_page - Status: 200
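The URL extraction behind `URL_Loop` can be approximated with a simple pattern match. A minimal sketch, assuming URLs are plain `http://`/`https://` strings delimited by whitespace or markup (the plugin's actual pattern may differ):

```python
import re

# Match http:// or https:// up to whitespace, quotes, or angle brackets.
URL_PATTERN = re.compile(r"https?://[^\s\"'<>]+")

def extract_urls(text):
    """Return all URLs found in a free-text field such as Description."""
    return URL_PATTERN.findall(text)
```

Each extracted URL would then be checked individually, giving the per-URL status lines shown above.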

Plugin Requirements

Links SQL 3.0+


  • Emails error messages to the link owner (and/or site admin), letting them know there was a problem with their link.
  • HTML email sent to the link owner, so you can make it look pretty :-)
  • All automated... all you need to do is change jump.cgi to safe_jump.cgi in your templates.
  • Checks for 500, 401, 403 and 404 error codes.
  • Now works for additional URLs (i.e. not just the value held in "URL").
  • Better error reporting.
  • Automatically "extracts" URLs from the Description field (or whatever field has been set in the plugin), ready to report back on the link itself or the detailed page.
  • Simple interface to "loop" through the extracted URLs and report back on their last known status.
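The error-code handling above boils down to a small classification step: the listed status codes (and an unreachable host) send the visitor to the error page, while anything else passes them through. A minimal sketch, with hypothetical names:

```python
# Status codes the plugin treats as errors, per the feature list above.
ERROR_CODES = {401, 403, 404, 500}

def is_bad_link(status_code):
    """True if the checked link should trigger the on-site error page.

    status_code is an int HTTP status, or None if the host was
    unreachable (treated as an error here -- an assumption).
    """
    return status_code is None or status_code in ERROR_CODES
```

So a 200 (or a redirect such as 301) lets the visitor through to the target site, while a 404 keeps them on your error page and, optionally, triggers the notification email.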