<?xml version="1.0" encoding="utf-8" standalone="yes" ?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
  <channel>
    <title>Ian Peacock on Ariadne</title>
    <link>http://www.ariadne.ac.uk/authors/ian-peacock/</link>
    <description>Recent content in Ian Peacock on Ariadne</description>
    <generator>Hugo -- gohugo.io</generator>
    <language>en-gb</language>
    <lastBuildDate>Wed, 22 Sep 1999 23:00:00 +0000</lastBuildDate>
    
	<atom:link href="http://www.ariadne.ac.uk/authors/ian-peacock/index.xml" rel="self" type="application/rss+xml" />
    
    
    <item>
      <title>Unix: What Is mod_perl?</title>
      <link>http://www.ariadne.ac.uk/issue/21/unix/</link>
      <pubDate>Wed, 22 Sep 1999 23:00:00 +0000</pubDate>
      
      <guid>http://www.ariadne.ac.uk/issue/21/unix/</guid>
      <description>mod_perl [1] has to be one of the most useful and powerful of the Apache modules. Beneath the inconspicuous name, this module marries two of the most successful and widely acclaimed products of OSS, the Apache Webserver [2] and Perl [3]. The result is a kind of Web developer&#39;s Utopia, with Perl providing easy access to, and control of, the formidable Apache API. Powerful applications can be rapidly created and deployed as solutions to anything from an office Intranet to enterprise-level Web requirements.</description>
    </item>
    
    <item>
      <title>The Unix Column: &#39;Sandboxes Using Chroot&#39;</title>
      <link>http://www.ariadne.ac.uk/issue/20/unix/</link>
      <pubDate>Mon, 21 Jun 1999 23:00:00 +0000</pubDate>
      
      <guid>http://www.ariadne.ac.uk/issue/20/unix/</guid>
      <description>You&#39;ve just obtained a new application that will run networked over the Internet. How do you know it&#39;s secure? How do you know that its code doesn&#39;t contain any oversights that may lead to a system compromise? You probably don&#39;t, especially if it&#39;s a large application.
Unintentional holes may be introduced to applications with as little as a one-line coding oversight, such as copying data between two memory locations without checking the bounds of the data first (such a flaw can be leveraged through a so-called &#39;stack overflow&#39; exploit).</description>
    </item>
    
    <item>
      <title>What Is a URI?</title>
      <link>http://www.ariadne.ac.uk/issue/18/what-is/</link>
      <pubDate>Sat, 19 Dec 1998 00:00:00 +0000</pubDate>
      
      <guid>http://www.ariadne.ac.uk/issue/18/what-is/</guid>
      <description>Users of the Web are familiar with URLs, the Uniform Resource Locators. A URL is a locator for a network accessible resource. Such a locator can be considered an identifier for the resource that it refers to. Depending on the interpretation of identification, various different attributes of a resource could be considered as an identifier for that resource. However, what comprises a functional resource identifier depends upon the context in which that identifier will be used.</description>
    </item>
    
    <item>
      <title>Metadata: BIBLINK.Checksum</title>
      <link>http://www.ariadne.ac.uk/issue/17/biblink/</link>
      <pubDate>Fri, 18 Sep 1998 23:00:00 +0000</pubDate>
      
      <guid>http://www.ariadne.ac.uk/issue/17/biblink/</guid>
      <description>BIBLINK [1] is a project funded within the Telematics for Libraries programme of the European Commission. It is investigating the bi-directional flow of information between publishers and National Bibliographic Agencies (NBAs) and is specifically concerned with information about the publication of electronic resources. Such resources include both on-line publications, Web pages, electronic journals, etc. and electronic publications on physical media such as CD-ROMs.  The project has recently finalised the Functional Specification for the &amp;lsquo;BIBLINK workspace&amp;rsquo; - a shared, virtual workspace for the exchange of metadata between publishers, NBAs and other third parties such as the ISSN International Centre.</description>
    </item>
    
    <item>
      <title>Showing Robots the Door</title>
      <link>http://www.ariadne.ac.uk/issue/15/robots/</link>
      <pubDate>Mon, 18 May 1998 23:00:00 +0000</pubDate>
      
      <guid>http://www.ariadne.ac.uk/issue/15/robots/</guid>
      <description>What is the Robots Exclusion Protocol? The Robots Exclusion Protocol (REP) is a method implemented on web servers to control access to server resources for robots that crawl the web. Ultimately, it is up to the designer or user of robot software to decide whether or not these protocols will be respected. However, the criteria defining an ethical robot include the stipulation that a robot should support REP.
This article refers to the established REP [1] accredited to Martijn Koster [2].</description>
    </item>
    
  </channel>
</rss>