Nicholas Bohm
MA (Cantab), Solicitor of the Supreme Court of Judicature in England and Wales,
All information herein is exclusively the author's opinion.
Copyright © 1997 by N. Bohm. This is a Personal Position Paper and was not reviewed by the MCG.


    I make my comments on the basis of 25 years' experience as a practising commercial lawyer in a major City of London firm of solicitors. I include my curriculum vitae as an Appendix to show the range of my relevant experience. It includes major international transactions, work involving computers and intellectual property, and banking and insolvency work. An additional example of relevant experience was a detailed review of the contractual terms in use by a London clearing bank for its electronic banking services.

    In major transactions, and particularly in the case of international ones, lawyers are often required to give formal opinions on the validity of agreements and security documentation. This requires a close analysis of who relies upon whom for what in the transaction, and a detailed articulation of the underlying assumptions. This analysis is rarely undertaken by anyone outside the legal teams, and I believe that this gives practising lawyers with the necessary experience a particularly clear view of the issues relating to authentication which are of central relevance to the consultation paper.


    Just as communication by telex and later fax arrived to supplement communication by ordinary post, so communication by forms of electronic mail is now coming to supplement post and fax (telex use having greatly declined). Each new technology has brought advantages, together with some drawbacks, and commercial law and practice have required adaptation to meet the new circumstances. If a scheme of regulation is to be applied to electronic communications when no similar scheme has been thought necessary for earlier methods of communication, it must be justified on the basis of a commercial analysis of the features that are new to electronic communications.

    The particular advantage of electronic mail is that it conveys text (and spreadsheets and other files) in its original form, so that the recipient can process it directly in his computer system. A document created by the sender in his word processing application can be sent by electronic mail and thereby enable the recipient to make amendments to the identical text in his own word processing application, for example.

    Electronic mail can operate between computers in a network within a building, or between two machines (or two networks) connected by the public telephone network or by dedicated telephone lines. It can also be sent over the growing informal network known as the Internet.

    All transactions conducted remotely are vulnerable to interception and forgery. The post is generally regarded as relatively secure from interception, telephones (and the faxes sent by telephone line) somewhat less so. The Government can always intercept both (within the limits laid down by law). Letters and documents are significantly more difficult to forge than their fax equivalents, which are in essence fairly poor quality photocopies.

    Electronic mail within a private network is probably more difficult to intercept than over the public telephone network, but electronic mail over the public telephone network is as easy or difficult to intercept as fax or voice communications. Electronic mail sent over the Internet is notoriously insecure and vulnerable to interception. It is also highly vulnerable to forgery, since an electronic message has inevitably lost many of the characteristics of letters and even faxes that aid the detection of alterations or wholly bogus imitations.

    The solution to this problem has been found in developments in public key cryptography which took place in the 1970s and 1980s. The result has been the widespread availability of simple and inexpensive means of doing two things:

    (1) encrypting electronic messages with a very high degree of security from decipherment by anyone other than the holder of the private key corresponding to the public key used by the sender, and

    (2) marking them so that the recipient can be certain that they have not been altered in transmission and that they originate from the holder of the private key corresponding to a particular public key.

    A brief description of the system is necessary (in which the details are considerably simplified at some expense in accuracy but not in result):

    (1) A user generates a key pair, one being his public key and freely publishable, the other being his private key and kept strictly secret. The keys are associated mathematically, but the private key cannot practicably be derived from the public key.

    (2) A message encrypted with the public key cannot be decrypted by that key, but only by the corresponding private key.

    (3) A message encrypted by the private key can be decrypted by anyone with the corresponding public key; but the fact that it can be decrypted by that public key proves that it must have originated with the holder of the corresponding private key, as no other key could render it decryptable by that public key.

    (4) X sends B a message encrypted both by B's public key and X's private key. B (and only B) can decrypt it, using B's private key. He finds that X's public key is also required to decrypt it, from which it follows that X's private key must have been used to encrypt it, and that it could not have been altered since X encrypted it.

    (5) The message was secure from interception, and it must have been received from X unaltered.

    A slightly fuller explanation of one implementation of this procedure, taken from the PGP user guide, is set out for interest as an Appendix.

    It should be noted that X never needs to use his own public key. He uses other parties' public keys to encrypt messages to them; and he uses his own private key (a) to authenticate his own messages to others and (b) to decrypt others' messages to him. It is possible for a user to maintain two separate key pairs, reserving one for authenticating messages (changing it rarely) and another for encryption (changing it more often for added security). Users with satisfactorily secure arrangements for the protection of their private keys may not need two key pairs, and can use the same pair for both purposes.
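    The mechanics described in points (1) to (4) can be sketched in a few lines of Python using textbook-sized RSA numbers. This is a toy illustration only: real keys are thousands of bits long, and real systems sign a digest of the message rather than the message itself.

```python
# Toy RSA illustration of the scheme described above.
# The primes are tiny textbook values chosen for readability;
# real systems use keys of 2048 bits or more. Not secure.

def make_keypair(p, q, e):
    """Build an RSA key pair from two primes and a public exponent."""
    n = p * q
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)           # private exponent (Python 3.8+)
    return (e, n), (d, n)         # (public key, private key)

# X and B each generate their own key pair.
x_pub, x_priv = make_keypair(61, 53, 17)
b_pub, b_priv = make_keypair(89, 97, 19)

message = 65                      # a message, encoded as a number < n

# X "signs" with his private key; anyone with X's public key can
# recover the message, proving it originated with X (point 3).
signature = pow(message, x_priv[0], x_priv[1])
assert pow(signature, x_pub[0], x_pub[1]) == message

# X encrypts to B with B's public key; only the corresponding
# private key, held by B alone, can decrypt it (point 2).
ciphertext = pow(message, b_pub[0], b_pub[1])
assert pow(ciphertext, b_priv[0], b_priv[1]) == message

print("signature verifies and ciphertext decrypts correctly")
```

    Note that, exactly as the paragraph above observes, X's own public key is never used by X himself: it is used by others to check his signature, while X uses others' public keys to encrypt messages to them.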

    This leaves two problems, one for the parties to the message and one for the Government.

    The parties' problem is how to know which public key (or which public authentication key) belongs to which person. The problem is easily solved if they know each other and meet and exchange keys before using them, or if they can be introduced by a third party known to both. It is in cases where the parties are unknown to one another, and no introduction is available, and yet certainty of their identity is important, that a system of certifying authorities is required to vouch for the identity of holders of public authentication keys. (It is not necessary for encryption-only public keys to be similarly certified: if encrypted text is sent to the wrong person, i.e. a person who does not hold the corresponding private key, the text will be unreadable.) Certifying authorities are the trusted third parties referred to by the DTI.

    The Government's problem is that encryption could prevent it from reading messages lawfully intercepted for law enforcement purposes, or delay its ability to do so, or compel it to rely on burdensomely expensive methods for obtaining access to such messages, or methods which would tend to reveal its interception.

    To solve the parties' problem, so far as it is one, certifying authorities need do no more than vouch for the fact that a particular public key belongs to a named person and has not been revoked. The certifying authority need know nothing of any party's private key for this purpose. The DTI does not propose that certifying authorities who certify a public key for use for authentication should hold the corresponding private key.
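    The limited role of the certifying authority just described (signing a statement that binds a name to a public key, without ever handling the subject's private key) can be sketched as follows. The toy RSA parameters and the `certify`/`verify` routines are purely illustrative assumptions, not any real authority's procedure.

```python
# Sketch of a certifying authority's role: it signs a digest binding
# a name to a public key, using only its OWN private key. The
# subject's private key is never involved. Toy parameters; not secure.
import hashlib

def make_keypair(p, q, e):
    n = p * q
    d = pow(e, -1, (p - 1) * (q - 1))   # Python 3.8+
    return (e, n), (d, n)

ca_pub, ca_priv = make_keypair(101, 103, 7)

def certify(name, subject_pub):
    """CA signs a short digest of the (name, public key) pairing."""
    digest = hashlib.sha256(f"{name}:{subject_pub}".encode()).digest()
    m = int.from_bytes(digest[:2], "big") % ca_priv[1]  # tiny digest for toy modulus
    return pow(m, ca_priv[0], ca_priv[1])

def verify(name, subject_pub, signature):
    """Anyone holding the CA's public key can check the certificate."""
    digest = hashlib.sha256(f"{name}:{subject_pub}".encode()).digest()
    m = int.from_bytes(digest[:2], "big") % ca_pub[1]
    return pow(signature, ca_pub[0], ca_pub[1]) == m

# The CA sees only the subject's name and PUBLIC key.
alice_pub, _alice_priv = make_keypair(61, 53, 17)
sig = certify("Alice", alice_pub)
print("certificate verifies:", verify("Alice", alice_pub, sig))
```

    The point of the sketch is that nothing in the certification step requires the authority to hold, or even to have seen, the subject's private key.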

    The DTI's proposed solution

    The DTI proposes a legislative framework in which only licensed persons can provide the public with encryption services, including the service of vouching for the identity of the holders of public keys. The DTI would establish and enforce the observance of high standards so that the public could safely rely on the licensed certifying authorities. Within limits, that helps towards solving the parties' problem.

    But as part of the "bargain", the DTI would require all licensed certifying authorities to solve the Government's problem by holding not only the parties' public keys, but also their private keys; and making the private keys available to law enforcement agencies when required by law. Or, as explained in an open letter on 9th May, it would so require where a public key is certified as being for use for encryption but not where it is certified only for use for authentication. (The letter is referred to below as "9/5".)

    This element of the proposal has attracted vehement opposition, and has caused much of the debate to focus on privacy and civil liberty issues, to the detriment of attention to the commercial aspects of the proposals. The primary issue ought to be whether the proposed scheme of regulation is necessary or useful in promoting secure electronic commerce. The DTI may feel that the debate has concentrated on the wrong issue because Internet users are an articulate but unrepresentative minority, and that the public at large will not share the concerns expressed on the privacy issue. But the public at large will only be affected at all by becoming Internet users, and I believe that in doing so the public will tend to absorb the Internet ethos. Just as the tail has wagged the dog in the debate, so it will in the public response to the TTP scheme as implemented: if TTPs are seen as the Government's route into private communications and data stores, this will do great damage to the effectiveness of the scheme, and hinder its primary objective. The bias of the debate, however unwelcome to the DTI, should be seen as a harbinger of trouble ahead.

    The requirement for holding private keys is wholly unnecessary for the solution of the parties' problem; and it confers no material benefit on the parties. (The DTI suggests the existence of some advantages, but those suggestions are criticised below.) It exposes the parties to the risk of accidental or corrupt disclosure or misuse of a private key: this would not only facilitate access to private communications, but enable undetectable forgery in the name of the proprietor of the private key (where one key pair is used for authentication and also for encryption). Collecting together in one system a large number of private keys conferring a power of undetectable forgery is to create a quite extraordinarily attractive target for organised crime, justifying a huge expenditure on attempts at corruption or other methods of unlawful access. Moreover the requirement for law enforcement access to a private key within one hour of demand would make it impossible to protect its security by conventional techniques of making access depend on several different senior individuals having to co-operate for the purpose.

    A forgery using a private key is much more serious for the owner of the private key than (for example) the forgery of a cheque drawn on his bank account. In the case of a forged cheque, the risk falls on the bank: the bank is not entitled to debit my account with a forged cheque (unless I have somehow facilitated the forgery). The position is not the same with a forgery by the use of a private key: the contractual terms on which the banks provide electronic banking invariably make the user responsible for all debits authenticated with his private key, whether authorised by him or not. An irrebuttable presumption of authenticity applies. The result is that the risk falls on the customer instead of the bank. This makes any risk of the compromise of a private key specially serious for its owner.

    It should be noted at this point that the DTI's proposals would not require the licensing of encryption services provided as an integral part of another service. One of the examples offered by the DTI is the authentication of credit cards.

    Do the parties really have a problem?

    In the great majority of transactions neither party requires knowledge of the other's identity. The obvious example is a cash transaction in a shop.

    The problem begins to arise where the parties are distant, as in mail order transactions. (This is not of course a phenomenon of the electronic age: mail order has existed in England since at least the 18th century, and it no doubt expanded greatly in the mid-19th century with the introduction of cheap postal services and wider newspaper advertising.) Either the merchant must trust the unknown customer to pay after receipt of the goods, or the customer must pay in advance and trust the merchant to deliver. Much trade of this kind was done on trust, and still is. Where the risks are too high, one party or the other will try to assess the trustworthiness (and incidentally the authenticity of the identity) of the other by taking up references from third parties known to both, such as bankers or other merchants.

    Note that in these circumstances it is not the identity of the other party that is the matter of primary concern, but the trustworthiness, although often the trustworthiness cannot be ascertained without ascertaining the identity first.

    The problem of trading between distant parties has been addressed by the banking system, originally by the use of bills of exchange and letters of credit (primarily for use between traders rather than by consumers), and in recent times (but well before the Internet era) by the credit card system. It is the credit card system that has brought about the greatest expansion in remote transactions between previously unknown parties.

    The credit card system works because the banks control access to the system by card holding consumers and card accepting merchants, and they use that control to assess the risks to the banks of admitting particular persons. The banks then assure the merchants of payment (taking on themselves risks of cardholder insolvency) and (at least in the UK) assure the cardholder of the reliability of the merchant by incurring a measure of statutory responsibility to the cardholder for the defaults of the merchant.

    Provided that the merchant follows the system's rules in accepting an order by remote means (which entails obtaining full card details, no doubt checking that the card has not been reported lost or stolen, and usually ensuring that goods or services are supplied only to the address recorded as that of the cardholder), the merchant can be indifferent to the identity of the customer. Likewise the customer can be sure that if the banks have admitted a merchant to the system (without which the merchant cannot obtain payment or the customer be charged with that payment), they have checked the existence and status of the merchant and have accepted the statutory responsibility falling on them for the defaults of the merchant. In principle each of the customer and the merchant can trace the other through the banking system if they need direct recourse.

    Although the banks suffer a considerable loss through fraud relating to the card system, it remains profitable for them despite its deficiencies. As the volume of remote trading by electronic mail grows, the banks may take advantage of the increased opportunity for card authentication. They could enable and require their merchants and customers to generate key pairs and to provide the bank with the public keys so as to maintain a database available to customers and merchants. It would be up to the banks to specify the evidence they required in order to admit a customer or a merchant to the system, just as it is now. Some frauds would be unaffected (such as obtaining and using a card without having the means to pay); others would become much more difficult (such as impersonating a cardholder on the basis of the details found in a discarded transaction record).

    Such a system could utilise customers' and merchants' existing keys where they had them. But since there is no universal standard, it seems more likely that the banks would establish their own system under their own management, and the cardholders' and merchants' keys might well be usable only within the system. There seems no reason in principle why key holders should not use the same keys for general authentication or encryption purposes, particularly if the banks' database of public keys were accessible to the public. But it should be noted that such a system would do no more than associate a public key with a card or merchant (and perhaps the name and address of the holder or merchant as known to the bank). There would be no more verification of the content of the database than is required to verify the identity of an individual applying for a card account, which is significant but not very great.

    Payment has been addressed at length because payment is the main concern in transactions between previously unidentified parties. It seems likely that the banks will continue to take the risks, and will develop whatever systems seem to them cost effective to reduce those risks. The DTI does not propose any legislative intervention, and this seems the right approach.

    Where else do previously unknown parties need assurance of one another's identities? This occurs less often than might be supposed. A number of examples may be considered. You do not wish your bank to release confidential information about you to strangers pretending to be you: but the bank and you are not previously unknown, and it is indeed out of your prior relationship that the existence of the confidential information arises. You have ample opportunity to ensure that the bank knows your public key: for example you can hand it personally to a manager who knows you through a history of personal dealings. You may wish to consult a famous physician although you have never met him. It is unlikely that this can be satisfactorily achieved without a visit to his consulting rooms, perhaps after some introduction. It is possible that his presence in the telephone directory and medical register are part of an elaborate fraud, including bogus consulting rooms, but in practical terms this is unlikely. If during a visit you obtain his public key from him personally, and hand him your own, then you can thereafter be sure that future dealings are between the same two individuals.

    Perhaps more plausible cases of previously unknown parties needing assurance of one another's identities can arise where a seller claims to be the owner of property whose title is registered. Examples are registered titles to land or corporate securities. Another class of case is that of subscribers to an electronic commercial service, such as satellite television or electronic share price information, who need to prove their right to receive the service.

    It is surprising that in the UK at least, neither the Land Registry nor a corporation in which you hold shares will necessarily have any prior knowledge of your signature, despite the fact that it is your signature on a form of transfer that is necessary to transfer your title to a buyer. This suggests that remarkably low levels of authentication are quite sufficient to enable substantial transactions to proceed on a large scale without significant practical risk. Nevertheless there might come to be significant benefit from attaching a public key to a land or corporate share title at the time of acquisition, so that the buyer could later use the corresponding private key to authenticate a subsequent dealing. At the time of original purchase, all that the land registry or corporation would be concerned to ensure was that the buyer and the public key were associated: there is no reason why either should go further and seek to verify that the buyer really is known by the name or resides at the address given for registration purposes. Similar procedures might help to reduce the risks of buying a second-hand car in a private sale. In the case of electronic services the customer could use his private key to prove his entitlement, having previously supplied his public key to the service provider at the time of subscription.

    Low levels of authentication are as much a feature of the corporate scene as of the personal. A company is bound by a decision of its board of directors: how can one party to a contract be sure that the other party, a company, is bound by the signature on it? When formal evidence is sought, which is by no means always the case, the usual evidence is an extract from the board minutes certified by the secretary or a director. How are the signatures to be checked? Directors and secretaries must consent to their appointment by signing the notice filed at the Companies Registry to record their appointment, which must also be signed on behalf of the company. It would be possible, but most unusual, to compare the signature on the contract with the consent signature on the notice of the signatory's appointment. It would be even more unusual to check the signature of the notifying officer on the appointment form with that officer's signature on his original appointment. Given the comparative ease with which signatures can be imitated, there is no great assurance against fraud: it would be quite easy for someone to file a notice of their own appointment as a director. Despite this vulnerability, the system in general works satisfactorily.

    The following is a typical provision from a formal legal opinion given by solicitors responsible for a major corporate transaction:

    "We have assumed the genuineness of all signatures and seals on all documents, the completeness and authenticity of all documents submitted to us as originals and the conformity to original documents of all copies submitted to us."

    The terms of such opinions are debated in the course of a transaction, and the effect of such a provision is to draw the attention of the party relying on it to the fact that he bears the risks of impersonation and forgery. It is my invariable experience that by the conclusion of an important transaction the parties have become so satisfied about the identity of those with whom they are dealing that no enquiries are made to verify the assumption set out above.

    In the light of these considerations it can be concluded that it is much less common than is supposed by current conventional wisdom for previously unknown correspondents to need a high level of certainty about the true identity of the person with whom they are corresponding. While it is no doubt true that it is easier to impersonate someone by electronic mail than by other forms of communication, because it conveys fewer personal characteristics, this is a relatively unimportant practical impediment to electronic commerce. The need for a general system of licensed certifying authorities is therefore open to doubt. It is much more likely that in a limited number of cases, such as credit cards, electronic services and perhaps some registered titles, improved means of authentication will be available as an integral part of the operation of the particular system involved.

    It might be thought that this would lead to undesirable fragmentation, with users requiring a cumbersome plethora of electronic keys. But it must be remembered that only users of computers will be able to use electronic mail; and in practice computers would be able to manage the keys as an integral part of operating the different electronic services for which they were required.

    How good can general purpose authentication be?

    How will a certifying authority decide that an individual seeking authentication for his public key is who he claims to be? Verisign, Inc., a company based in California seeking to establish itself as a certifying authority, offers individuals three classes of certificate. To obtain the highest, class 3, an individual must provide certain personal information (name, address, telephone numbers, etc) and attend before a notary public with three forms of identification (such as a passport, a driving licence, etc). For most purposes this is reasonable evidence of identity, but it is clearly vulnerable to a moderately determined effort to have a false identity authenticated. False passports are obtainable by criminals, as are most other usual forms of identification. On the basis of such an authentication it is no doubt possible to conclude that the unknown but authenticated individual is either who he claims or else a criminal, but this provides only limited comfort.

    It should also be borne in mind that an individual is free to adopt any name he chooses, and to do so without any formality prescribed by law. In these circumstances there can be no difference between an individual's "true" or "real" name and the name by which he happens to be known to those with whom he deals in the ordinary course of life. This is not a concept which can easily be accommodated by rigid notions of authenticating the identity of an individual by standard general purpose procedures.

    It would be possible for a certifying authority to seek references and investigate them, to visit an applicant at his home and place of work, make enquiries of neighbours, and so forth. Such investigations might succeed in establishing a very high level of reliability, so that one could conclude that the individual is either who he claims or else a state supported impersonator (such as a beneficiary of a witness protection programme or the agent of a foreign power). But investigations of this kind are hardly practicable on a wide scale, would be very expensive, and would take too long. Not enough applicants would be able to pass or willing to pay.

    The charges of a certifying authority would be based not only on the costs of certification but would also have to include an element of premium to cover the authority's exposure to liability to the public for errors in its authentications. If this liability were to be excessively limited, or dependent on the need to prove fault, this would detract from the value of the authentications. Assumption of a generous measure of strict liability would lead to higher charges. (The DTI paper is silent on the issue of responsibility for authentications. In a paper based on principles of consumer protection, this is a striking omission.)

    It is of course true that every individual has a number of unique characteristics, such as DNA profile and fingerprints. If these were routinely identified at birth and recorded in a machine searchable database, it might be possible to provide a reliable permanent connection between an individual and an identifier which could be used for authentication. Such a scheme is unlikely to be feasible in practice, particularly on an international basis, and would in any event depend on government involvement which would render it objectionable to many.

    How is the real need for authentication to be met?

    The primary need for authentication arises from the need for remote payment, and this is best met by gradual improvements in the existing means offered by the banking system. These improvements, which it is in the interests of the banks to provide, may well utilise modern techniques of public key cryptography. The DTI does not propose that this should be covered by its proposed legislative framework.

    Apart from payment methods, the need for authentication can best be met in electronic communications in the same ways that have always applied: by a gradual development of trust on the basis of a course of dealings. Public key cryptography can help to ensure that present communications are from the same source as previous ones, without any need for external certification. Personal introductions from known sources can be evaluated much more satisfactorily than certificates from a certifying bureaucracy. The need for instant certainty about the identity of unknown individuals has been greatly exaggerated.

    The development of trust over a period of time has the great advantage that it can derive from a number of independent sources of confidence that a person is who and what he claims. If I have three independent sources who verify that someone is who and what he claims to be, this greatly increases my confidence in the conclusion. Even if there is a 20% risk of error in the case of each source, the risk that all three are in error is (20% x 20% x 20%) = 0.8%, a much lower risk. In the case of certifying authorities the same consideration applies: if I am given three certificates from independent authorities, the risks of error are much reduced. This is a strong argument for fostering numerous independent certifying authorities.
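    The arithmetic above generalises: on the stated assumption that the sources err independently, the combined risk that all of them are wrong is simply the product of the individual risks.

```python
# Combined risk that ALL of several sources are in error, on the
# paragraph's assumption that the sources err independently.
from math import prod

def combined_error_risk(risks):
    """Probability that every source errs, given per-source error risks."""
    return prod(risks)

# Three independent sources, each with a 20% risk of error:
print(f"{combined_error_risk([0.2, 0.2, 0.2]):.1%}")  # prints 0.8%
```

    The independence assumption is doing all the work here: three certificates derived from a single common check would reduce the risk not at all.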

    A problem which cannot be avoided by a multiplicity of certifying authorities arises if there is a hierarchy of authorities and cross-certifications. If a single authentication key authenticates numerous certifying authorities and is compromised, so that the genuineness of their certificates themselves becomes unreliable, all the benefits of the independence are lost. This risk can only be reduced by combining the "web of trust" approach of PGP with the use of certifying authorities, so that a key is authenticated both by certifying authorities and private individuals who are not part of any hierarchy and not exposed to a common risk of key compromise at a higher level.

    In general it makes no more sense to impose an onerous licensing scheme on providers of encryption services like public key management than it would to impose a similar scheme on the publishers of directories of fax telephone numbers. In both cases incorrect entries are a nuisance and probably lead to communications being addressed to inappropriate recipients, sometimes with resulting loss: this does not begin to justify the imposition of a scheme which is wholly disproportionate to the new risks of electronic commerce properly analysed. The proposed scheme is liable to hinder rather than promote the development of a diversity of authentication sources which would provide a much more reliable basis for electronic commerce.

    How is the Government's problem to be solved?

    The Government's problem is that strong cryptography has escaped from Pandora's box, and there are no means of putting it back. Anyone with access to the World Wide Web can obtain simple free (or inexpensive) software for providing strong encryption of electronic mail (and indeed voice telephony).

    One of the best known strong cryptography programs is PGP. It uses well known published methods of public key cryptography, and is widely available on the Internet both in free versions suitable for individual use and in commercial versions suitable for corporate use. It is the subject of numerous reference sites and discussion newsgroups on the World Wide Web. It is easy for anyone with an ordinary home computer to use PGP for encrypting electronic mail or documents.

    PGP is supported by a number of "key servers" in different parts of the world. PGP users can publish their public keys by adding them to a server (each server updates the others regularly), and published keys can be found and downloaded from the server for use with PGP. Not all users publish their key on the servers, and some keys held on them are out of date, but the volume of their use gives some indication of the extent of PGP use.

    At the end of February 1996 there were nearly 22,000 public keys in the server collection. In April 1997 there were nearly 44,000. It is impossible to produce an accurate breakdown by country, but the United States no doubt accounts for by far the largest number of these (as it does for almost all aspects of the use of the Internet). The UK probably accounts for about 2,000 of the 44,000. Germany probably has significantly more users than the UK, and there are substantial numbers of users in the Netherlands. There are significant numbers in other European countries. The numbers seem to be growing rapidly.

    On the subject of what activities are in fact to be covered by the licensing scheme (and thus criminalised if undertaken by someone without a licence), it is said in 9/5:

    "Of all the issues you raise this has perhaps caused the most concern to readers of the Consultation Document. So let me confirm the position we take. We do not have any intention to impose licensing conditions on individuals who through whatever action, certify the public keys of another individual through the basis of personal knowledge or friendship. It would, however, be possible (and this is why the Paper is perhaps ambiguous on this point) for an individual (as opposed to an organisation) to carry out a certification process as a service using knowledge (to guarantee authenticity) gained by conventional means (such as passports etc). It would also be possible for an individual to systematically certify the keys of a large number of individuals through "hearsay" knowledge. Such arrangements, we believe, should be subject to licensing."

    This still leaves the boundary hopelessly diffuse. If I meet someone and he tells me who he is and I certify his key, is that on the basis of personal knowledge or hearsay (the latter a word better kept for use in its technical sense)? If he shows me his passport, does that make it better or worse? Must I have met him twice before my knowledge is "personal" instead of "hearsay"? Or must I have met one of his friends, or if more than one how many, for there to be personal knowledge? The concepts of knowledge and identity are very difficult, and no attempt has been made by the DTI to define or explain their use in the present context: in a consultation over the creation of new criminal offences, this is quite inadequate.

    New criminal sanctions should not be imposed except to prevent clearly demonstrated social evils. There are at present no grounds for imposing criminal sanctions on unlicensed key certification activities. There is no evidence that any harm would be caused by them, or that electronic commerce depends on public confidence in key certification services, or that public confidence in key certification depends on a licensing scheme. It is possible that the risks of systemic failure inherent in a hierarchical system of certification would damage public confidence. Action at this stage is inevitably premature. This is to some extent demonstrated by the fairly confused and sometimes immature character of debate on the consultation paper.

    Applying the licensing scheme to overseas activities also presents troublesome aspects. The DTI says:

    "The offering of encryption services to the UK public (for example via the Internet) by an unlicensed TTP outside of the UK will also be prohibited. For this purpose, it may be necessary to place restrictions on the advertising and marketing of such services to the public."

    The PGP key servers offer an example of the impracticality of this proposal: they are accessible from anywhere in the world, and information about their availability is found at numerous sites on the Internet whose location may be very difficult to determine. Traditional concepts of advertising and marketing as activities with a fixed location cannot survive in the context of the Internet, and legislation founded on those traditional concepts will be unworkable. Nor can it be desirable to criminalise the provision of services overseas by foreign nationals merely because UK residents can obtain the benefit of those services. The same considerations apply to services which, unlike the PGP key servers, are intended to be covered by the scheme.

    Whether strong cryptography is really one of the woes of the world remains debatable. Despite extensive public controversy on the subject for some years, primarily in the United States, there seems to be no evidence of the use of strong cryptography as an impediment to law enforcement. None is cited in the DTI's consultation paper.

    Unsophisticated terrorists and other criminals no doubt send incautious letters or faxes to one another and speak carelessly on the telephone. Criminals who conspire by electronic mail seem likely to be among the more sophisticated. If they nevertheless fail to take advantage of encryption, there is no problem. If they see the virtues of encryption, what will they do?

    The DTI's answer to this question is: "Criminals will often make use of whatever technology is conveniently available to them. We expect TTPs to have a major role in conveying secure electronic communications, especially where a payment for legitimate services is involved." (This sits oddly with the following answer given by the DTI to a different question: "It is important to be clear that it is not envisaged that the encrypted communication would be routed via the TTP" (emphasis added). The reference to payments is also odd in the light of the exemption proposed by the DTI for cryptographic services provided as an integral part of secure payment systems, mentioned in the next paragraph.) It nevertheless appears to be the Government's belief that criminals will use the services of a certifying authority because they are conveniently available. In 9/5, however, the contrary seems to be acknowledged.

    It is very much open to speculation whether the services of certifying authorities will in fact be conveniently available: in order to offer useful levels of authentication, they may have to be expensive and difficult to satisfy. The need to make use of means of secure payment may lead criminals to use the services of the banking system, which may well involve authentication by the use of cryptography, but the DTI does not propose that banks will be licensed if they are providing cryptographic services as an integral part of secure payment systems. Banks will therefore not be obliged to hold their customers' private keys, and the use of such systems by criminals will not therefore expose their communications to decryption.

    What will certainly be conveniently available to criminals, as to everyone else, is the powerful and free encryption software already available now. If criminals are sophisticated enough to want to use encryption, they will have a choice of using this software, or obtaining the authentication services of a certifying authority. With all respect to the DTI, it is wholly implausible to suppose that criminals will be so impressed by the need to authenticate their identity (presumably to the unknown other criminals with whom they will enter into correspondence) that they will choose to use the services of a certifying authority with whom they must deposit their private keys. Given the evidential requirements that certifying authorities must impose if their services are to be of any value to the public, it is impossible that criminals will simply fall into using their services somehow unawares.

    Two other advantages to law enforcement should be considered. The first is that even though criminal conspiracies will be conducted without the use of deposited private keys, criminals may, in carrying out the conspiracy, use electronic mail to communicate with honest users who themselves use deposited key pairs. It is difficult to assess the plausibility of this possibility; but in such an event the honest user is probably the victim or an unwitting participant, and will assist the law enforcement authorities by decrypting any encrypted material. Reliance on deposited private keys is irrelevant.

    Another possibility, which has been the subject of a conjecture published in an Internet newsgroup, is that private key deposit serves purely to distinguish honest from dishonest users, thus enabling the law enforcement authorities to identify their proper targets and employ against them sophisticated but expensive intelligence gathering techniques which are effective for their purpose but impracticable against the mass of everyday users.

    The remote and conjectural possibilities described in preceding paragraphs do not begin to justify an expectation that users will accept the risks of accidental or corrupt misuse of their private keys. When considering this risk, it must be noted that although the DTI proposes that certifying authorities should be liable for improper disclosure of private keys without the need for fault to be proved against them, it proposes that liability should be limited. It follows that liability may be in an amount insufficient to compensate the key owner for what might be very large losses from undetectable forgeries (if a key pair is certified both for authentication and encryption).

    In 9/5 it is stated as follows:

    [If a TTP] "certifies a public key for signature it would not require the deposit of the associated private key; while if it certified a key for confidentiality then it would. It follows, I believe, that if a user then used a "signature" key for confidentiality it would be doing so outside of the terms of the certificate. It would therefore be unwise, in normal circumstances, for the user to do this as it would mislead his potential counterparts."

    If I have a public key certified for signature, I do not use it at all (see paragraph 10, and the Appendix). If someone else uses it to encrypt a message to me, there is no discernible sense in which I or he can be misled by this. I appear to have no incentive at all to have my key certified for confidentiality, especially if the consequence is that I must deposit my private key.

    It may be said that this is an insecure procedure, because signature keys should be kept unchanged for long periods and encryption keys frequently changed. Not all users will find this precaution necessary (it depends on their confidence in the security of their private key), but if they do want to take this precaution they still have no incentive to deposit their encryption keys. All they need do is deposit a public key to have it certified for authentication, and use that certified key to certify their own successive encryption keys. They have all the benefits of the certification system (such as they may be), and the ability to make frequent changes to their encryption key pair, without any private key deposit. In this case it cannot even be said that they are operating outside the terms of their certificate.
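    In outline, the arrangement described above can be sketched as follows. This is an illustrative toy only: the RSA numbers are textbook values (p = 61, q = 53), the function names are my own invention, and no real product works with keys this small.

```python
import hashlib

# Sketch of the arrangement described above: a long-term key pair,
# certified once for authentication, is used to certify the owner's
# own short-lived encryption keys. No private key is ever deposited.
# Textbook RSA numbers (p=61, q=53) stand in for real keys.

n, e, d = 3233, 17, 2753        # long-term certified signature key


def certify(enc_public_key: bytes) -> int:
    # sign a digest of the fresh encryption key with the long-term
    # secret exponent d (a real system signs the full digest)
    h = int.from_bytes(hashlib.sha256(enc_public_key).digest(), "big") % n
    return pow(h, d, n)


def check(enc_public_key: bytes, cert: int) -> bool:
    # anyone holding the certified public exponent e can verify
    h = int.from_bytes(hashlib.sha256(enc_public_key).digest(), "big") % n
    return pow(cert, e, n) == h


# a fresh encryption key can be issued at any time
fresh_key = b"-----ENCRYPTION KEY, MARCH 1997-----"
cert = certify(fresh_key)
assert check(fresh_key, cert)
```

The point of the sketch is only that frequent key changes and certification can coexist without any deposit of a private key.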

    Other claimed advantages of using certifying authorities

    The DTI makes the following points by way of example: "... an employee who has encrypted files may resign without leaving information concerning the private key, or the death of an individual may require a Solicitor to have access to their encrypted information. In all of these circumstances [certifying authorities] can enable legitimate access to the keys to unlock such information."

    This tends to confuse transmitted messages (where the authentication of the identities of the parties to the transmission may be material) with stored material (where the issue does not arise). No employer should provide employees with methods of encryption which render either communications or stored material inaccessible to the employer, and existing encryption products provide means for enabling this precaution to be taken. Private keys should of course have security copies kept, like other data, and the need to do this is part of normal data security. The use of licensed certifying authorities for this purpose is unnecessary. It would be a very unfortunate consequence of the proposed scheme if it resulted in providers of backup and data security services requiring to be licensed because (quite possibly unknown to them) the data in question happened to include private encryption keys.

    Whether individuals choose to ensure that their legal personal representatives have access to their encrypted records after their death is entirely up to them. If I choose to destroy my records before my death I am free to do so; and if I prefer to keep them in a form in which they will be inaccessible after my death the position is the same. If I wish to do so, I can ensure that a key is available. The use of licensed certifying authorities for this purpose is unnecessary.

    The DTI suggests that the use of certifying authorities will lead to increased interoperability between different encryption products, with resulting simplifications in communications between users of different products. So it might, but as the benefits of interoperability become apparent in the marketplace, market forces can be expected to have the same result. The history of open systems is a clear example. Government encouragement of interoperability and promotion of the setting of standards would certainly be valuable, but the DTI offers no reason for supposing that the use of certifying authorities is necessary or even beneficial for this purpose, and no evidence whatever that a scheme of criminal sanctions is needed to preserve the exclusivity of licensed certifying authorities.

    Legal professional privilege

    Solicitors are making increased use of electronic mail, both between firms and in communications with other professionals and with clients. The ability of solicitors to send draft documents and revisions in electronic form is of increasing value to their clients. The content of such communications is often the subject of legal professional privilege belonging to the client, under which the client is entitled to preserve the secrecy of that content. In both litigation and commercial matters, the UK or foreign governments may have interests adverse to those of the clients in question.

    Solicitors should not transmit confidential material over the Internet without encrypting it. If they have satisfactory means of encrypting it without making the relevant private key available to a certifying authority, then they should not choose less secure means without their client's consent.

    Solicitors are officers of the court. If obliged to do so by law, they would of course decrypt encrypted material for the benefit of law enforcement authorities.

    Notaries and public bodies

    The Government (through the Foreign and Commonwealth Office) already accepts responsibility for authenticating the seals and signatures of United Kingdom notaries. The Government should extend this responsibility to authenticating the public keys of UK notaries, thereby substantially enhancing their ability to facilitate international trade by authenticating transactions electronically. There would be no justification for requiring the deposit of notaries' private keys, and any such requirement could be damaging to confidence in the digital signatures used by notaries.

    The Government should likewise authenticate public keys for Government Departments and for public bodies such as HM Land Registry, to enable them to authenticate electronic communications and electronically issued official documents. Until interoperability of authentication technologies has been achieved in practice, public keys should be available in formats compatible with the main systems in current use, including PGP. Such measures would promote public confidence in electronic transactions and authentication procedures, and would be a valuable practical demonstration of the Government's commitment to improving the UK infrastructure for commerce and trade.

    Validity of digital signatures

    Digital signatures are much more secure against forgery than ones made by hand. Among other advantages, a digital signature prevents subsequent alteration of the signed document. But in considering the advantages it should always be remembered that binding contracts can be made by word of mouth and successfully enforced despite the evidence for them being far more ephemeral even than electronic mail.

    At common law a signature is no more than a mark made by the person signing, recognised by him as authenticating the document signed. The law has adapted readily to the formation of agreements by telegram or telex, or the signature of a form of proxy by fax (see for example Inland Revenue Commissioners v Conbeer [1996] BCC 189). There is no central register of the signatures of individuals (or of the seals of companies) to enable the public to check the validity of the signatures of unknown persons, and no suggestion was ever made when telegrams or telexes began to be used in commerce that any special measures of this kind were needed. There is no reason why the law should not adapt equally readily to accept the digital signature, and without requiring any complex legislative scheme backed by criminal sanctions. In cases of dispute there might have to be resort to expert evidence, as there is in forgery cases. There is no need for general legislative provision, but legislative acknowledgement of the possibility of digital signatures would facilitate their acceptance (by analogy with legislative acknowledgement that expressions such as "record" or "document" include things in electronic form).

    Where statutory requirements apply, as for example in the case of wills and contracts for the sale of interests in land, specific statutory provision would be necessary to adapt the requirement. Policy considerations will vary: a change to the law on the execution of wills might be better postponed, but there seems no reason to hesitate similarly about land contracts. It is for consideration whether a digital signature by a corporation should be treated (perhaps at the option of the corporation) as analogous to sealing, or whether individual digital signatures by director and secretary should be required.

    There seems no basis for introducing presumptions to alter the ordinary laws of evidence relating to the proving of a signature. Where the evidence of a third party is relevant it should be admitted for what it may be worth, whether or not the third party is a participant in a licensing scheme (although that fact may itself be relevant to the weight to be given to the evidence in question). Widespread deposit of private keys could discourage reliance on digital signatures by leaving the depositors to bear the risk of non-repudiable forgeries. Although the DTI does not propose any requirement for the deposit of private keys corresponding to public keys certified for authentication, a key pair can in fact be used for encryption as well as authentication, and there has been widespread misunderstanding of the proposals on this point: the result is that any private key deposit is liable to lead to suspicion and lack of confidence.

    Conclusions


    The legislative scheme proposed is fundamentally misconceived. The arguments presented in its favour have greatly exaggerated not only the need for it, but also its utility either to the public and the business community or to law enforcement authorities. Controversy about law enforcement access and the impracticability of the scheme will in fact hinder the development in the marketplace of widely accepted interoperable standards and the extension of security in electronic communications. In exposing private encryption keys to unnecessary risks it poses an unacceptable threat to privacy (and in some cases a risk of electronic fraud) without adequate countervailing advantages. It creates new and unnecessary criminal offences without adequate justification. The whole scheme should be withdrawn entirely.

    Even if the scheme is so weakened in its requirements for private key deposit as to present little threat to privacy, it is objectionable as laying the foundations for the introduction of stronger requirements at a later stage once the obvious weaknesses emerge.

    I respond to the DTI questions as follows:

    Paragraph 50 - Whether the suggested scope of an exclusion from licensing for intra-company TTPs is appropriate in this context.

    As the licensing scheme is inappropriate, no comment can be offered on the exclusions.

    Paragraph 54 - Whether, in the short term, it would be sufficient for business to rely on agreements under contract regarding the integrity of documents and identification of signatures; or whether it would be helpful for legislation to introduce some form of rebuttable presumption for the recognition of signed electronic documents.

    Presumptions are unnecessary, but legislative recognition that the concept of signature, like those of document or record, extends to things in electronic form would be helpful.

    Paragraph 60 - The appropriateness of the proposed arrangements for the licensing and regulation of TTPs.

    The proposed arrangements are wholly inappropriate.

    Paragraph 65 - Where views are sought on the proposed conditions.

    The conditions are inappropriate because they form part of an inappropriate scheme.

    Paragraph 70 - What, if any, specific exemptions for particular organisations offering encryption services would be appropriate depending on the nature of services offered?

    There should be no licensing scheme and no need for exemptions.

    Paragraph 71 - Whether it is thought desirable to licence the provision of encryption services to businesses and citizens wholly outside the UK?

    No; for the reasons given above, such licensing would be both objectionable in principle and unworkable in practice.


    Paragraph 81 - Should secure electronic methods for the delivery of electronic warrants by the central repository and the subsequent delivery of keys by the TTP be introduced?

    All handling of private keys should be done by the most secure possible means.

    Paragraph 82 - Does the legislation specifically need to refer to other forms of legal access including a civil court order for access to cryptographic keys used to protect information relating to civil matters such as bankruptcy?

    Where a court has an existing power to order disclosure of records, it undoubtedly has the power to order them to be disclosed in intelligible form. No case has been made for an additional specific power relating to cryptographic keys.

    Paragraph 84 - Should deliberate (and perhaps wilfully negligent) disclosure of a client's private encryption key be a specific criminal offence, or would existing civil and criminal sanctions suffice?

    No case has been made for any new offence.

    Paragraph 89 - Whether the principle of strict liability (as described) is appropriate in these circumstances?

    Yes; but the point is irrelevant because schemes involving the deposit of private keys are not acceptable.

    It is notable, in a paper basing itself on consumer protection, that no similar principle is proposed in relation to errors by a certifying authority in issuing an incorrect certificate. However, since certifying authorities would in practice have to publish their terms of business, the extent of their acceptance of liability would be part of the ordinary law of contract, and there is no reason for imposing any special regime for this purpose.

    Paragraph 91 - Whether, in principle, an independent appeals body (such as a Tribunal, separate from that referred to below) should be created?

    No, because the licensing scheme is inappropriate.

    Paragraph 93 - Whether the proposed duties of an independent Tribunal are appropriate.

    No, for the same reason.

    Appendix: Curriculum Vitae
    Solicitor (England & Wales)

    Partner, Norton Rose (1975 to 1994)


    Salkyns, Great Canfield, Takeley, Bishop's Stortford CM22 6SX
    Telephone (01279) 870285 Mobile 0860 636749 Fax (01279) 870215 Internet

    Age 53 in July 1996

    Education Leighton Park School, Reading; St John's College, Cambridge.

    Qualifications MA (Cantab); Solicitor (admitted 1968).

    Career as a solicitor

    1966 to 1970 Trainee and then assistant with Gregory, Rowcliffe & Co

    1970 to 1972 Assistant and then partner in Edward Moeran & Partners

    1972 to 1994 Norton Rose: 1972 assistant; 1975 partner; 1987 technology partner.

    Overview of experience

    Commercial and corporate lawyer with over 20 years' experience of working at the most senior level for UK and overseas clients involved in commerce, industry, shipping, finance and the public sector, dealing with commercial, intellectual property, constitutional, competition, employment, corporate, insolvency, tax and other legal issues arising out of trading structures, acquisitions and disposals of businesses, corporate reorganisations, research and development contracts, computer system contracts, joint ventures and other transactions and problems. Experience of computer system contracts enriched by experience of the implementation and use of computer systems in a major law firm's information technology projects.

    Examples of experience

  1. Reconstitution of the Crown Agents as a statutory corporation.

Involvement with the management by an interdepartmental committee of the necessary government legislation, with its drafting by Parliamentary counsel, and with its implementation by the sponsoring department.

  2. Conversion of a government research laboratory into a commercial organisation and transfer of complex intellectual property arrangements.

  3. Dealing with sensitive issues arising under the EEC and ECSC Treaties in the context of a large flotation.

  4. Acting for public sector bodies in a joint venture for radioactive waste disposal to set up a suitable corporate vehicle, taking into account special issues concerning nuclear materials and public sector financial requirements.

  5. A joint venture between a public sector body owning computer software, a venture capital fund and a software marketing organisation.

  6. Contractual structures for the management and implementation of complex computer system contracts involving project management, software development, and the supply, installation, commissioning, testing, development and maintenance of hardware and systems.

  7. Jointly with a senior accountant, investigating share dealings in Aldershot Football Club Limited, examining witnesses and reporting to the Secretary of State (report published by HMSO 1987).


Extract from the PGP user guide: how it works

First, some elementary terminology. Suppose I want to send you a message, but I don't want anyone but you to be able to read it. I can "encrypt", or "encipher" the message, which means I scramble it up in a hopelessly complicated way, rendering it unreadable to anyone except you, the intended recipient of the message. I supply a cryptographic "key" to encrypt the message, and you have to use the same key to decipher or "decrypt" it. At least that's how it works in conventional "single-key" cryptosystems.

In conventional cryptosystems, such as the US Federal Data Encryption Standard (DES), a single key is used for both encryption and decryption. This means that a key must be initially transmitted via secure channels so that both parties can know it before encrypted messages can be sent over insecure channels. This may be inconvenient. If you have a secure channel for exchanging keys, then why do you need cryptography in the first place?
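The single-key arrangement described can be sketched as follows. The toy cipher below (a keystream XOR) is purely illustrative and is not DES or any other real cipher; it shows only that one shared key must do both jobs, which is what creates the key-distribution problem.

```python
import hashlib
from itertools import count

# Toy single-key ("conventional") cipher: a keystream derived from a
# shared secret key is XORed with the message. Both parties must know
# the same key in advance. Illustrative only; not for real secrecy.


def keystream(key: bytes, length: int) -> bytes:
    out = b""
    for block in count():
        out += hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        if len(out) >= length:
            return out[:length]


def crypt(key: bytes, data: bytes) -> bytes:
    # XOR is its own inverse, so the same function both encrypts
    # and decrypts, given the same key
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))


shared_key = b"agreed over a secure channel"
ciphertext = crypt(shared_key, b"meet at noon")
assert crypt(shared_key, ciphertext) == b"meet at noon"
assert crypt(b"wrong key", ciphertext) != b"meet at noon"
```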

In public key cryptosystems, everyone has two related complementary keys, a publicly revealed key and a secret key. Each key unlocks the code that the other key makes. Knowing the public key does not help you deduce the corresponding secret key. The public key can be published and widely disseminated across a communications network. This protocol provides privacy without the need for the same kind of secure channels that a conventional cryptosystem requires. Anyone can use a recipient's public key to encrypt a message to that person, and that recipient uses her own corresponding secret key to decrypt that message. No one but the recipient can decrypt it, because no one else has access to that secret key. Not even the person who encrypted the message can decrypt it.
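The complementary-key idea can be illustrated with textbook RSA numbers (here p = 61 and q = 53; real keys are vastly larger, and this sketch omits everything a real implementation needs beyond the bare arithmetic):

```python
# Toy illustration of the public-key idea. The two exponents are
# complementary: what one locks, only the other unlocks.

n = 61 * 53            # modulus, part of both keys
e = 17                 # public exponent: the "publicly revealed key"
d = 2753               # secret exponent: the "secret key"


def encrypt(m: int, public=(e, n)) -> int:
    exp, mod = public
    return pow(m, exp, mod)   # anyone can do this with the public key


def decrypt(c: int, secret=(d, n)) -> int:
    exp, mod = secret
    return pow(c, exp, mod)   # only the holder of d can reverse it


message = 42
ciphertext = encrypt(message)
assert ciphertext != message          # unreadable without the secret key
assert decrypt(ciphertext) == message
```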

Message authentication is also provided. The sender's own secret key can be used to encrypt a message, thereby "signing" it. This creates a digital signature of a message, which the recipient (or anyone else) can check by using the sender's public key to decrypt it. This proves that the sender was the true originator of the message, and that the message has not been subsequently altered by anyone else, because the sender alone possesses the secret key that made that signature. Forgery of a signed message is infeasible, and the sender cannot later disavow his signature.
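A signature on this model can be sketched with the same toy numbers: "signing" is exponentiation with the secret key, and checking is exponentiation with the public key. The digest step below uses MD5 purely for illustration, and reducing the digest into the toy modulus is not something a real system would do.

```python
import hashlib

# Toy digital signature with textbook RSA numbers (p=61, q=53).
# Only the holder of d can produce the signature; anyone holding
# e can check it.

n, e, d = 3233, 17, 2753


def digest(message: bytes) -> int:
    # shrink a real hash into the toy modulus (illustrative only)
    return int.from_bytes(hashlib.md5(message).digest(), "big") % n


def sign(message: bytes) -> int:
    return pow(digest(message), d, n)       # needs the secret key


def verify(message: bytes, signature: int) -> bool:
    return pow(signature, e, n) == digest(message)  # public key only


msg = b"I agree to the terms."
sig = sign(msg)
assert verify(msg, sig)
# altering the message changes the digest, so verification fails
```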

These two processes can be combined to provide both privacy and authentication by first signing a message with your own secret key, then encrypting the signed message with the recipient's public key. The recipient reverses these steps by first decrypting the message with her own secret key, then checking the enclosed signature with your public key. These steps are done automatically by the recipient's software.

Because the public key encryption algorithm is much slower than conventional single-key encryption, encryption is better accomplished by using a high-quality fast conventional single-key encryption algorithm to encipher the message. This original unenciphered message is called "plaintext". In a process invisible to the user, a temporary random key, created just for this one "session", is used to conventionally encipher the plaintext file. Then the recipient's public key is used to encipher this temporary random conventional key. This public-key-enciphered conventional "session" key is sent along with the enciphered text (called "ciphertext") to the recipient. The recipient uses her own secret key to recover this temporary session key, and then uses that key to run the fast conventional single-key algorithm to decipher the large ciphertext message.

Public keys are kept in individual "key certificates" that include the key owner's user ID (which is that person's name), a timestamp of when the key pair was generated, and the actual key material. Public key certificates contain the public key material, while secret key certificates contain the secret key material. Each secret key is also encrypted with its own password, in case it gets stolen. A key file, or "key ring", contains one or more of these key certificates. Public key rings contain public key certificates, and secret key rings contain secret key certificates.
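The hybrid process described can be sketched as follows (toy RSA numbers and a toy keystream cipher; the session key is shrunk to a single byte purely so that it fits the toy modulus):

```python
import hashlib
import secrets

# Sketch of the hybrid scheme described: a random session key
# enciphers the plaintext with a fast single-key cipher (toy XOR
# keystream here), and only that small session key is enciphered
# with the slow public-key algorithm (textbook RSA numbers).

n, e, d = 3233, 17, 2753                      # recipient's key pair


def stream(key: bytes, length: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]


def xor(data: bytes, key: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, stream(key, len(data))))


# sender's side: conventional encryption, then wrap the session key
session_key = secrets.token_bytes(1)          # toy size: fits under n
ciphertext = xor(b"the plaintext message", session_key)
wrapped_key = pow(int.from_bytes(session_key, "big"), e, n)

# recipient's side: recover the session key, then the message
recovered = pow(wrapped_key, d, n).to_bytes(1, "big")
assert xor(ciphertext, recovered) == b"the plaintext message"
```

Only the small session key ever passes through the slow public-key step, which is the efficiency point the passage makes.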
The keys are also internally referenced by a "key ID", which is an "abbreviation" of the public key (the least significant 64 bits of the large public key). When this key ID is displayed, only the lower 32 bits are shown for further brevity. While many keys may share the same user ID, for all practical purposes no two keys share the same key ID.
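The key ID arithmetic described is simple bit masking; the key value below is made up for illustration:

```python
# Key ID as described: the least significant 64 bits of the public
# key material, of which only the lower 32 bits are displayed.

public_key_material = 0x123456789ABCDEF01122334455667788  # invented

key_id = public_key_material & 0xFFFFFFFFFFFFFFFF   # low 64 bits
displayed = key_id & 0xFFFFFFFF                     # low 32 bits shown

assert key_id == 0x1122334455667788
assert f"{displayed:08X}" == "55667788"
```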

PGP uses "message digests" to form signatures. A message digest is a 128-bit cryptographically strong one-way hash function of the message. It is somewhat analogous to a "checksum" or CRC error checking code, in that it compactly "represents" the message and is used to detect changes in the message. Unlike a CRC, however, it is computationally infeasible for an attacker to devise a substitute message that would produce an identical message digest. The message digest gets encrypted by the secret key to form a signature. Documents are signed by prefixing them with signature certificates, which contain the key ID of the key that was used to sign it, a secret-key-signed message digest of the document, and a timestamp of when the signature was made. The key ID is used by the receiver to look up the sender's public key to check the signature. The receiver's software automatically looks up the sender's public key and user ID in the receiver's public key ring. Encrypted files are prefixed by the key ID of the public key used to encrypt them. The receiver uses this key ID message prefix to look up the secret key needed to decrypt the message. The receiver's software automatically looks up the necessary secret decryption key in the receiver's secret key ring.
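The digest properties described can be checked directly (PGP of this era used the 128-bit MD5 function; the sample messages are invented):

```python
import hashlib

# A 128-bit message digest as described. Any change to the message
# changes the digest, which is how alteration is detected.

digest = hashlib.md5(b"Pay the bearer ten pounds").digest()
assert len(digest) * 8 == 128                 # a 128-bit digest

altered = hashlib.md5(b"Pay the bearer ten million pounds").digest()
assert altered != digest                      # the change is detected
```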

These two types of key rings are the principal method of storing and managing public and secret keys. Rather than keep individual keys in separate key files, they are collected in key rings to facilitate the automatic lookup of keys either by key ID or by user ID. Each user keeps his own pair of key rings. An individual public key is temporarily kept in a separate file long enough to send to your friend who will then add it to her key ring.
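On this description a key ring is no more than a collection of certificates searchable by key ID or by user ID; a minimal sketch, with all field values invented:

```python
# Sketch of a public key ring: key certificates with the fields the
# passage describes, looked up automatically by key ID or user ID.

public_ring = [
    {"user_id": "Alice Example <alice@example.org>",
     "key_id": 0x1122334455667788,
     "timestamp": "1996-02-01",
     "key_material": 0xA1B2C3},
    {"user_id": "Bob Example <bob@example.org>",
     "key_id": 0x99AABBCCDDEEFF00,
     "timestamp": "1996-03-15",
     "key_material": 0xD4E5F6},
]


def by_key_id(ring, key_id):
    return next(c for c in ring if c["key_id"] == key_id)


def by_user_id(ring, name):
    return next(c for c in ring if name in c["user_id"])


assert by_key_id(public_ring, 0x1122334455667788)["user_id"].startswith("Alice")
assert by_user_id(public_ring, "Bob")["key_id"] == 0x99AABBCCDDEEFF00
```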