Inclusion Excluded: Code Is a Leonine Contract

Study on the Technical and Legal Value of the Robots.txt Protocol
By Guillaume Sire

The Internet is built on a web of protocols. After explaining why I believe the social sciences should study computer code – the raw material of these protocols – I analyse the controversy that surrounded the robots.txt exclusion protocol, which is designed to let a publisher regulate visits to its documents by search engines' software agents. I explain why only the designer of those agents has the power to give the protocol's commands legal value and a technical function; without the designer's action, the commands remain inoperative. Examining a case in which publishers tried to redefine the syntax of robots.txt without going through the standardization bodies, I show why the relationship between publishers and online intermediaries is characteristic of a leonine consensus favouring the latter.
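To make the object of the controversy concrete, here is a minimal robots.txt file of the kind a publisher places at the root of its site (an illustrative sketch, not an example drawn from the article):

    User-agent: *
    Disallow: /private/

The User-agent line names the crawlers being addressed (here, all of them) and the Disallow line asks their software agents not to visit paths under /private/. Nothing in the file itself can enforce the request: whether these commands have any effect depends entirely on whether the agents' designers choose to honour them, which is precisely the asymmetry the article analyses.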
