The advent of digital technology is both a revolution and an evolution. The possibilities inherent in the merger of the most advanced technologies from three different industries (cable, computer and telephone) are revolutionary: movies and home shopping on demand in the home, and financial and management services in the office, are already under development, e.g. by Time Warner in Orlando, Florida. If to these information systems we add the impact of multi-media CDs and of the Internet and its successor networks, the sheer scale of deliverability is revolutionary.
At the same time, this revolution in deliverability does not mean that we must start from scratch with revolutionary laws. Our present structure needs to be evolved to absorb these new means of delivery and new works in ways which encourage them on the traditional bases of reward. Copyright is not some insurmountable hurdle in the way of the exploitation of information. It encourages both the creation and the flow of information, and it is also the trading system for the works that are created. The challenge is both to secure reward for use and to ensure that securing that reward is, for users, as fast, simple and painless as possible.
And the copyright system can be evolved. It has absorbed previous new technologies, e.g. photography, sound recording, film, sound broadcasting and television (audio-visual media and products altogether) and, most recently, computer programs.
Publishers need now to consider the ways in which the copyright system might be refined and evolved to give proper place to their interests as prime investors of organisational and creative skills, no less than of finance, in new products and services whose markets will, in due course, be the world.
The agenda for this refinement and evolution is set out under seven Sections.
The Application of Authors' Exclusive Rights
It is important to re-assert in the digital environment the central role for publishers of the traditional exclusive rights of reproduction and of distribution/publication which copyright law grants to authors. If the author is not empowered appropriately by copyright law for the digital era, then he/she cannot license his/her publisher appropriately, and the publisher's house is built not on rock but on shifting sand. Security of investment and security of trading are the publisher's prime concerns. The acts of scanning and of placing a digitised form of a work into an electronic system, no less than those of downloading the work within and replicating it out of the system, must be defined as acts of copying, that is, of reproduction. It is the need for librarians and user groups to seek rights-owners' authority to scan and to store works which alone preserves the present negotiating position of the rights-holders in some countries.
The exclusive right to reproduce is, perhaps, the foremost right to re-emphasise. The framework is Art. 9(1) of the Berne Convention, as elaborated by Art. 4 of the European Union's Software Directive, which defined the right so as to include downloading and transmission. It will be vital that the final text of the proposed Protocol to the Berne Convention recognises this scope of the right in treaty language.
At the 5th Session on the Berne Protocol, held in Geneva in September 1995, there was unanimous approval for drafting into the Protocol a distribution right on an exclusive territorial basis. Many governments, however, wished to confine the right to the distribution of physical products. Others, mainly non-governmental organisations (NGOs), argued in favour of the position taken in the US White Paper's recommendations: that copyright laws should be amended to recognise expressly that copies of works can be distributed to the public by transmission, and that such transmissions fall within the exclusive distribution right of the copyright owner.
The report Highways to Change from the Copyright Convergence Group in Australia does recommend transmission as a separate right, which should cover 'the transmission of copyright material in intangible form to the public by any means or combination of means which is capable of being made perceivable or used by a receiving device'. On balance, copyright advisers prefer to absorb rather than to innovate, so that publishers in many countries will prefer to argue, as the US White Paper argues, for acts of transmission to be absorbed in the traditional rights of reproduction and distribution/publication.
Whether a separate right of display is needed is a subject of current attention. The right exists in the USA: the House Report on the 1976 Copyright Act stated that display would include the transmission of an image by electronic or other means, and the showing of an image on a cathode ray tube or similar viewing apparatus connected with any sort of information storage and retrieval system. If in its national law a state regards display as an act of reproduction, it may rest content. If it does not, then it might consider a separate right of display (although there are difficulties over the meaning of 'public' in the usual formula of 'public display' in relation to display on private PCs).
A generalised right of communication to the public (a right often placed in the context of broadcasting) was, at the 5th Berne Protocol Session, considered appropriate by those governments which resisted placing digital transmission within the scope of a distribution right. The danger for publishers is that the private/public distinction inherent in such a right is inappropriate to the digital environment. The distinction would allow, as an exception to control of the right, single copying by home computer users. Since, however, acts of single copying of the same work may be done by thousands and thousands of home computer users, a leakage could look remarkably like a haemorrhage, certainly to the rights-owners. The concept of 'public' will need very careful rethinking.
Any treaty language for a Berne Protocol would also need to recast Art. 11 of Berne to include all categories of works. The USA delegation to the 6th Berne Protocol Session, held in Geneva in February 1996, offered the following wording-
Communication to the Public and Public Performance:
(1) Authors and their successors in interest shall have the exclusive right to authorise:
(b) any communication to the public of their works or the performance thereof.
The issue of exhaustion was extensively debated at the 5th Berne Protocol Session. Delegates agreed that a distribution right in respect of physical products would be subject to national or regional (e.g. EU) exhaustion, but a view was strongly expressed by many governments (with the honourable exceptions of the USA, the Russian Federation, France and Germany) in favour of international exhaustion over a right of importation. Virtually all the NGOs mounted an equally strong resistance, pointing out that an importation right 'promotes incentives for investment', as the US delegation had put the matter, while parallel importation, the inevitable result of international exhaustion, does precisely the opposite. This issue will continue to be on the international agenda if the placing of an explicit distribution right into the text of Berne finds favour at forthcoming Protocol sessions. There was general support for the contention that no exhaustion of any kind could be allowed in the case of on-line transmissions, a stance supported in the EU's Common Position (see TWO below) at Recitals 33 and 43.
Following the 5th Berne Protocol Session, Dr. Mihaly Ficsor, the Assistant Director General of WIPO, reviewed approaches to the legal regulation of digital transmissions and introduced an 'umbrella solution' which would acknowledge the general consensus that the acts involved in a transmission/delivery in a digital network should be covered by exclusive rights. He quoted the USA Delegation at the 5th Berne Protocol Session, supported by the UK Delegation: 'It is not the legal characterisation which is truly important but rather that the acts involved be covered by appropriate exclusive rights.'
In a masterly presentation to the WIPO World Forum on the Protection of Intellectual Creations in the Information Society, held in Naples in October 1995, Dr. Ficsor proceeded-
Thus, following the idea of the Delegation of the United States of America, a solution may be to describe the acts to be covered in a neutral way, not including any specific legal characterisation, and to leave such characterisation and, consequently, the choice of the right or rights to be applied, to national legislation.
This could be done, for example, in the following way (this is not a proposal, just an outline of one of the possible solutions; and it only relates to the Berne Protocol, but might also be adapted to the "New Instrument"):
(a) to provide that authors of literary and artistic works shall have the exclusive right of authorising the making of their works available in an electronic or similar network, either by wire or by wireless means, to the public (i) to perceive, on a screen and/or through a loudspeaker, or in any other way, the signs, sounds and/or images in which the work is expressed, and/or (ii) to obtain a copy or copies of the work by any means and in any form, including the storage of the work, even temporarily, in an electronic or similar storage device;
(b) to provide that, in the application of this provision, a work shall be considered to be made available to the public irrespective of whether the members of the public may have access to the work in the same place or in separate places and at the same time or at different times (this, however, might also be included in a more general provision defining the concept "to the public");
(c) to provide that it shall be a matter for legislation in the countries party to the Protocol to permit the making available of the works to the public as described under point (a), above, in certain special cases, provided that such an act does not conflict with a normal exploitation of the work and does not unreasonably prejudice the legitimate interests of the author (this again may be included in a provision of broader coverage);
(d) to provide that it shall also be a matter for legislation in the countries party to the Protocol to implement the above-outlined provisions by applying an exclusive right or exclusive rights of authorisation to be granted under the Protocol and/or the Berne Convention, or by applying a specific exclusive right to cover the acts or some of the acts described under point (a), or by applying a combination of all these rights (it is to be noted, that, for being able to offer all these options, certain "gaps" in the Convention should be eliminated; for example, a general distribution right should be introduced);
(e) to provide that, where a country party to the Protocol, under the previously mentioned provision, applies the right of distribution provided for in the Protocol and in Articles 14(1)(i) and 14bis(1) of the Berne Convention, the act of making a work available to the public as described in point (a), above, shall not be covered by any exhaustion of right that may be provided for in the Protocol or in national legislation;
(f) to include a safeguard clause to the effect that none of the provisions mentioned above shall be interpreted as affecting any obligation under the Protocol and the Berne Convention to grant protection to the rights of authors in their literary and artistic works.
The final item, (f), in Dr. Ficsor's umbrella solution is a reminder that authors and publishers are concerned about unauthorised manipulation of the material which they provide: authors are concerned that their moral rights of paternity and of integrity should be maintained, and publishers are concerned that the authenticity of their publications should not be damaged.
Whether authorised manipulation can be achieved through harmonisation of the law and practice of waiver is an issue which now needs examination. Some precedents are emerging. A general requirement may be made in an author-publisher contract that the publisher must, in his own edition of the author's work, recognise the author's basic rights of paternity and integrity, and pass on that requirement of recognition to sub-licensees. That general requirement may then be qualified by an undertaking by the author to waive the integrity right where such a waiver is an essential condition of the exercise of a particular right, e.g. an electronic-medium right.
The Protection of Publishers' Investment
The publisher is, at the international level, in a legal vacuum. He is neither an author in the terms of the Berne Union, nor a producer in the terms of the Rome Convention, nor indeed a producer for the purposes of that part of the EU's Rental and Lending Directive which deals with certain neighbouring rights. And yet the publisher is a major investor, in both creative and financial terms, in intellectual property, and will be especially vulnerable in his investment in electronic products and services. Because of this paradox, the International Publishers Copyright Council (IPCC) has for some time been developing a draft International Convention for the Protection of Publishers of Electronic Editions. During virtually the same period the Commission of the European Union (EU) has been developing a draft Directive for the Protection of Databases. This draft Directive contains an innovative right, on behalf of the maker of a database, against unauthorised extraction of the contents of the database, known widely as the sui generis right. Since this EU sui generis right could turn out to be a publisher's right, it is of great importance to the publishing community world-wide. The Federation of European Publishers (FEP) has made constant representations since 1992 to DGXV, and the arguments have had some effect on the final Common Position achieved in July 1995 and largely confirmed in the final text of the Directive.
THE KEY ELEMENTS OF THE COMMON POSITION
(i) A database is defined as a collection of works, data or other independent materials arranged in a systematic or methodical way and capable of being individually accessed by electronic or other means.
(a) The intention of 'collection' is to place the nature of a database firmly in the tradition of Art. 2(5) of the Berne Convention as affected by Art. 10 of TRIPS. That is, the items collected no longer need to be copyright works: this 'Berne plus' approach is clearly essential for copyright protection of most databases, since they will usually consist of both copyright and non-copyright materials. It is suggested, however, that the criterion of originality will make copyright protection thin.
(b) The intention of 'individually accessed' is to exclude collective works, such as films. This feature suggests that while some multi-media works will have features that are individually accessible, others, as the sophistication of creative direction increases over time, will not. This draft Directive is not a multi-media directive. (But see Section Four below.)
(c) For publishers, the extension of this new regime to print-on-paper databases ('other means') will need careful thought as implementation into national laws approaches. Is a library, one might ask, a database?
(ii) (a) The text adopts the wording of the Software Directive on originality. Art. 3.1 reads: 'databases which, by reason of the selection or arrangement of their contents, constitute the author's own intellectual creation shall be protected as such by copyright.'
(b) Art. 3.1 is to be read together with Recital 15: 'whereas such protection should cover the structure of the database.' The essential point is that the copyright protection offered by the text is only for intellectual creation and only for the expression of the database which is protectable by copyright. Some databases will pass the test of originality: many may not. It is very likely that the maker of a database will bypass the uncertainties of copyright protection and seek protection under the new sui generis regime (see (iii) below), which protects the contents of the database which the maker has put together, regardless of whether the structure qualifies for copyright protection or not.
(iii) The Directive confirms the creation of an entirely new right, outside copyright itself but neighbouring to a neighbouring right (if one may put it that way). The text requires member states, under Art. 7, to 'provide for a right for the maker of a database which shows that there has been qualitatively and/or quantitatively a substantial investment in either the obtaining, verification or presentation of the contents, to prevent acts of extraction and/or re-utilisation of the whole or a substantial part, evaluated qualitatively and/or quantitatively, of the contents of that database.'
As the Directive comes before member states for implementation, the following points may need further representation.
(i) Exhaustion needs representation on two fronts-
(a) Implementation of the statements at Recitals 33 and 43 that, in effect, exhaustion of the copyright right of distribution does not operate in the case of on-line databases, nor does exhaustion of the sui generis right to prohibit re-utilisation operate in the case of on-line databases.
(b) Clarification of 'first sale' at Art. 7(2)(b), which does exhaust the right to control resale. Many databases, however, are offered not for sale but as part of a subscription service, which is not a sale; software, likewise, is typically licensed, not sold.
(ii) The wording of Art. 7.1 introduces a quasi-exemption: the taking of 'insubstantial parts' is allowed. Even though Art. 7.5 forbids repeated extraction and re-utilisation of insubstantial parts which would conflict with Art. 9(2) of Berne, this provision could well undermine investment by database makers, who invest both in the collection of materials and in the accessing software. The ability of a user to extract and re-utilise small nuggets of information under an 'insubstantial parts' doctrine could deny the database maker rewards for use, and thus deny him a proper return on his investment. (See also the Wojcik quotation at Section Three below.)
(iii) The Directive has left many issues of exemption as options for member states. Given the traditional vigour of publisher/librarian exchanges in most member states, it is quite possible that the Directive will end up, on this critical issue, less rather than more harmonised.
(iv) The text chooses a basis of reciprocity rather than national treatment. Under reciprocity, the contents of a database made in Europe containing important European data will be protected by the sui generis right, whereas the contents of a database made in the USA containing exactly the same important European data will not. If, of course, the USA were to move to an equivalent of the sui generis right, the problem would largely disappear. The influential Information Industry Association (IIA) of America has come out in favour of such an equivalent.
The draft Directive's new sui generis regime will certainly set the pace not only in Europe but at the international level also. It affects the interests of publishers/producers much more than those of authors. It is a major building block for the infrastructure of the information society. It is to be noted, finally, that the sui generis right is already being considered at the Berne Protocol sessions. The Conclusions of the 5th Session stated: 'consideration of supplementary sui generis protection of databases continues, without prejudice to copyright protection and without yet determining the form of a possible treaty'.
And at the 6th Session the Delegate of the European Commission tabled a first 'international' draft for a sui generis right. The debate which followed showed much understanding of the need for the right, and support for further study, leading possibly to an international instrument. Many delegates foresaw that the right, being essentially in favour of investors and their investments in databases, would not fit easily into a Berne Protocol in favour of authors, so that a separate instrument may be appropriate. Indeed, the Geneva Phonograms Convention of 1971, a specialist anti-piracy treaty in favour of phonogram producers, was cited by several delegates as a precedent. Work at the international level is thus well launched.
Limiting Exceptions to Rights
In the broadest terms, the prime need is not to allow, as exceptions to copyright or neighbouring rights (including the sui generis right examined in TWO above), acts which are acts of primary exploitation. The Dutch copyright authority, Dr. Bernt Hugenholtz, puts the point well:
'Any future system of copyright limitations, either on the national or the European level, must remain within the limits set by article 9 (2) of the Berne Convention.
Article 9 (2) does not permit limitations that would conflict with the "normal exploitation" of protected works. Even though it is far from clear what a "normal exploitation" will be in the digital environment, it cannot be contested that systematic acts of primary exploitation cannot be exempted.
Here lies, perhaps, the essence of the copyright problems of electronic document delivery. Document supply services provided by libraries and intermediaries have become so sophisticated and successful that they are now provided in direct competition with database publishing and other commercial services offered by publishers and vendors. In doing so, libraries and intermediaries have, perhaps unwittingly, become primary exploiters, thereby perhaps overstepping the borderlines of article 9 (2) BC.'
It is a telling comment on the move from inter-library lending to document delivery by the library community.
Interlibrary Lending as Document Delivery
The potential erosion of publishers' rights is well expressed by the Association of American Publishers (AAP) Statement on Document Delivery of April 1994-
'The situation today is that the advent of post-1976 technologies -- fax machines, computer networks, low-priced scanners and CD-ROMs -- has facilitated the interlibrary delivery of photocopies of articles and chapter-length excerpts from books and generated services focused on this conduct. In addition, publishers are exploiting the market for single copies either directly or through one or several document delivery services or by participating in the Copyright Clearance Center. Some libraries and similar institutions provide copies of articles but do not pay royalties to publishers because the copying is, according to them, "interlibrary loan and permissible under the CONTU (National Commission on New Technological Uses of Copyrighted Works) guidelines". However, as stated above, this copying is far beyond that permitted under CONTU and, therefore, may be done only with permission of copyright holders.
Additionally, librarians have vigorously promoted resource sharing, which amounts to nothing less than the Senate's third example of forbidden co-ordinated subscription buying to "save money" by filling patron needs from source libraries. This copying and document delivery far exceeds the scope of interlibrary "lending" contemplated by Congress or CONTU in permitting interlibrary loan arrangements under the proviso. When the push for resource sharing and the development of consortia are combined with the new networks and even the existing inter-university network (the Internet), one sees a formula for the erosion of publishing revenues in this country, from: (1) lost book and journal subscription sales, (2) lost royalty income from licensing, and (3) lost new product opportunities. The revenue base that now supports publishing relies on multiple opportunities to exploit a product, such as income from sales, subscriptions, and licenses. Interlibrary copying without permission and other non-authorised document delivery denies the copyright owner its rights under the law.'
Art. 5(2) of the EU's draft Directive on the Legal Protection of Databases provided, in its draft of November 1993, that the incorporation of abstracts into a database should not require the authorisation of the owners of rights in the abstract. Section 60 of the UK's Copyright, Designs and Patents Act 1988 made the copying of abstracts in learned journals covering scientific and technical subjects free unless a licensing scheme were put in place. The provision in the EU's draft Directive has been vigorously and successfully opposed by the Federation of European Publishers (FEP), and debate continues in the UK over the British legislation. There is a growing sense among Scientific, Technical and Medical (STM) publishers that the use by secondary publishers of abstracts prepared either by publishers themselves or by the authors of the articles concerned should be paid for.
Exceptions to the Unauthorised Extraction Right
Barry Wojcik, on behalf of the European Association of Direct Marketing (FEDIM), has made the necessary point well:
'The substantial investment in coding a paper-based, inflexible collection of literature or scientific writings or other reference material to create a collection in the form of an electronic database results in enormous gains in research productivity: research which previously took years or weeks can be done in days or minutes. The purpose of the research will often be to find only one or a few "insubstantial part(s)" in a mass of materials. This is made possible only by significant investment in the productivity tool which is the database containing the materials. An "insubstantial" parts exception to the right to prevent unauthorised extraction and re-utilisation could therefore seriously undermine the object of protecting database investment.'
Publishers should emphasise to their governments that the nature of what they provide is changing, and that this change must be recognised in a re-assessment of exceptions to copyright and neighbouring rights.
It is likely, finally, that useful information and some guidelines will emerge from the American NII Fair Use Conference.
Multi-Media Products
The advent of multi-media products, often in the form of CDs, poses some difficult copyright questions for publishers, especially-
(a) what is the work to be protected?
(b) who is the author of such a work and the owner of the rights?
There are no clear answers to these questions, but they now need to be addressed urgently, as we move towards the next millennium and multi-media becomes the order of the day rather than the latest technological innovation. Without a clear legal framework in which to develop the technologies to their full potential, there is a real danger that the necessary investment may not be forthcoming and that the public will ultimately be denied the benefits this revolution could bring. This perspective must be made clear to governments by the publishing community.
Copyright law recognises certain categories of works and provides protection for all of them, but the protection is not identical for each form. As things currently stand, it may be that there is no protection for the multi-media product per se. This critical point is well supported by the UK's Multimedia Industry Advisory Group Report (1995)-
'A pressing question here is whether a format right for a CD-ROM (or equivalent multi-media product) would be beneficial. Today, a producer of a CD-ROM must spend an enormous amount of time and energy clearing all the constituent IPR rights that go into a CD-ROM, but - unlike the producer of a film or a television programme - he has no specific format right on which he can subsequently rely for his own product's protection. Such a right would vindicate producers' efforts, and might encourage and facilitate the development of the CD-ROM business.'
If multi-media products are protected by the accumulation of all rights in the product, this might provide a sufficient, if cumbersome, weapon against piracy, since all the component parts - computer program, audio-visual work, text, sound recording - will be protected under copyright and neighbouring rights legislation in all major markets, and the reproduction right is the basic right which is invariably granted even if other rights, such as a rental right, are not.
This is, however, not a sound basis for exploitation. If the multi-media product is only protected as the sum of its many component parts, variations in rights, such as public communication rights, and the different exceptions applicable to the various elements forming the multi-media product, may make the management of rights in the multi-media product impossibly complex.
A multi-media product is not a literary, artistic, musical or dramatic work. Is it an audio-visual work? Is it a database? The EU Directive on Databases uses the term 'collection of data' to describe what it is protecting. That is simply not appropriate to the sophisticated combination of words, sounds and pictures (sometimes still, sometimes moving) which fuse together into a new medium. However, some kind of consensus is urgently needed, and while a multi-media product is much more than a database, it can at a lowest-common-denominator level be classified as one; that indeed seems to be the emerging consensus.
Upstream and Downstream Licensing
The imperative need here is to convince governments that voluntary negotiation will resolve the licensing issues 'upstream' (to use Clive Bradley's useful metaphor) between authors and publishers and then 'downstream' between publishers and users.
From Literary Author to Publisher (Upstream Licensing)
There is an understandable hesitation by authors to license to publishers the exploitation of electronic rights. They have little confidence either that publishers themselves understand what acts of exploitation are subsumed under the phrase 'electronic rights' or that (even if they do) publishers have the expertise to exploit those acts in a rapidly evolving market place.
Between a grant of copyright itself in all forms and editions from author to publisher, and a grant from author to publisher of a first option on specific electronic rights, there is fertile ground for negotiation between the representatives of authors and publishers. The Association of Authors' Agents in the UK has suggested that, as regards the publishing of general-interest trade books, all negotiations should take account of essentials which they express in the following questions-
It is very likely that these kinds of questions will, indeed, be in the minds of book publishers as they negotiate directly with users. Each publisher will take his/her own decision on what breadth of grant of electronic rights he/she wishes to acquire from the author in order to ask such questions of users. The point to be made to governments is that the established channels between authors and publishers are perfectly capable of resolving these 'upstream' issues.
It is a curious paradox that the countries deemed to be least author-minded, the common law Anglo-American countries, have the best possible protection of authors of general literature in their tradition of literary agents. These long-established figures are central in the publishing trades of the USA and the UK, but largely peripheral in droit d'auteur countries!
From Academic/Professional Author to Publisher ('Upstream Licensing')
It may be convenient to consider here the relationships 'upstream' between academic and professional authors (not predominantly represented by literary agents) and their publishers in the digital world.
There is an argument which states that the traditional role of the learned-journal publisher will become redundant in the digital world. Instead of paying out (in advance) high subscription fees for print-on-paper journals, much of whose contents are not read by the user academic, that person will be able to access directly, at low cost, individual articles of interest to him or her on open networks, without any costly publisher/middleman intervention. The argument in favour of a direct author-user line should not be discounted, not least by publishers, who may themselves 'downstream', as we shall see later, be attracted by a similar direct publisher-user line. In an article in The Bookseller in October 1994, however, Professor Bernard Donovan, the distinguished Secretary of the Association of Learned and Professional Society Publishers in the UK, offered some important countervailing arguments-
'Although the authors of papers in learned journals are not paid, much screening of the papers goes on before publication. Very few scientific papers are ever published without substantial editorial amendment.
In my experience as a journal editor, the arrival of a near perfect paper was a rare event indeed. Likewise, Stevan Harnad, of "subversive proposal" fame, has written that "in over 15 years of editing Brain and Behavioural Sciences and five years of editing Psycoloquy, I have never once encountered a paper where the author's final draft could be published verbatim!" (Harnad, Paying for the Pipe......)
Customarily, each paper submitted to a journal of standing is studied by at least two referees, and an editor, before any decision is taken over publication. Even then, papers are usually sent back to the authors for revision and improvement, after which they are again refereed before the final checking and preparation for press begins.'
(2) Managing the System:
'All too often electronic information systems are regarded as having elastic, ever-expanding memories. But memory costs money, and the maintenance of large memory stores is expensive. Further, a backup system is needed in case the original store or compilation becomes faulty, damaged or destroyed. That doubles the amount of storage needed and complicates management. The storage and transmission of pictures, let alone colour images, vastly increases the demand for wide bandwidth and consumes massive amounts of memory.
Another premise is that some preferable cost-free management structure will be devised to run the system and deal with such matters as controlling the input and looking after necessary housekeeping. But how would this function? Would we have a separate system for each cluster of journals, or would we have a national all-embracing system?
Who would manage the system? Who would decide what material, or information, is to be added to the store? Or, if the gate is to be left wide open, who is to label, classify and index the information - and to guard the store? How will we keep track of the information becoming available? On two occasions recently I have downloaded files from the Princeton University file-server on to my computer, but I cannot be certain that this material will still be available in six months' time, let alone six years.'
'How are those electronic marks, the dots and dashes, the zeros and ones, to be preserved for posterity? What form will the archive take? What medium will be used for storage, for, like the traces on magnetic tape, the coating on CDs seems to have a finite life? Must plans be made to copy the archive on to a new stock or storage medium every 10 or 20 years? Who will pay for this somewhat unproductive exercise?
This is not a minor issue, for without such an archive there is no assurance that your bright ideas and superb research results will be kept for future generations. I fully expect my research findings in the field of endocrinology and reproductive physiology, printed on old-fashioned paper in standard journals, to remain available for coming generations, but the same cannot be said for material distributed in the current electronic journals. Already, valuable information published in sadly short-lived electronic journals has vanished.'
The more academic and professional authors insist that their interests are in academic and/or professional career advancement and not in earning (or not earning) modest fees as copyright authors, the more they must address the issues raised by Professor Donovan. The last issue, archiving, is rapidly becoming an international issue. In the UK, the British Library has already discussed with the Publishers Association the terms for both archiving and access to such archives of works presented in electronic form in order to put forward a Bill to Parliament in respect of national deposit regulation. In the European Union, a Workshop in November 1995 was held on 'Issues in the Field of National Deposit Collections of Electronic Publications', attended by no fewer than fourteen Chief Librarians of national libraries throughout Europe, and in the USA a Draft Report of a Task Force on Archiving of Digital Information was published by the Commission on Preservation and Access. And, already in 1993, the relevant interests had concluded a set of CD Mandatory Deposit Agreements, which establish some very useful contractual precedents.
From Publisher to User ('Downstream Licensing')
Here, perhaps more than in 'upstream' licensing, publishers may have to convince governments that voluntary licensing can and will work, and that forms of compulsory licensing are not required, not to say inconsistent with Berne Art 9(2). The assumption of many commentators (e.g. at the WIPO Louvre Symposium in 1994) that collective licensing is inevitable and indeed the only solution veers rather alarmingly for publishers in the direction, even if unspoken, of compulsory licensing. It is therefore very important that publishers persuade governments that direct publisher-user licensing is realistic, that licensing via the increasing range of information brokers is also realistic, and that licensing via collective administration is also realistic. Direct publisher-user licensing has taken on fresh life recently in the form of contracts for electronic site licences, between, for example, publishers and librarians and between publishers and universities.
The term 'collective administration' is used here in distinction to 'collective licensing'. The value to users of collective licensing in print-on-paper reprography - standard fees, set by the licensing body, which can be broadly calculated and budgeted in advance - is, to the rights-holders, its main defect in electronic transfer. Publishers now look to Reproduction Rights Organisations (the RROs) not for collective licensing of secondary rights at fees set by the licensing body but for collective administration of what in the electronic context they see as, often, quasi-primary rights at fees set by the publishers themselves. Recent pioneering work has been undertaken by the Copyright Clearance Center (CCC) in the USA through its Rightsholder Electronic Access Agreement (a form of site licensing), and by the Copyright Licensing Agency (CLA) in the UK through its Rapid Clearance System, widely known as CLARCS (a form of one-stop shopping).
Clive Bradley, CBE, the Chief Executive of the Publishers Association in the UK, has recently suggested some thoughts for alternative models of copyright management:
'1. Umbrella licences. A range of standardised voluntary licences which different copyright holders can include in their own conditions of supply, to which they can add their own individual conditions on such matters as payment, territorial rights, etc., but which otherwise enable the licensee (user) to know that conditions of use generally must be in accordance with the licence, may be appropriate in cases of supply to libraries, document supply services, reproduction of abstracts, and the like.
2. Site licences. Licences permitting use of works within an agreed site, e.g. a school or school authority, a college, or a training institution, may be appropriate where the user wishes to make many different uses of a work, e.g. photocopying, inclusion in course packs, loading onto a local area network, in return for a fee which seeks to reflect the volume and value of these uses. Such licences require i) systems of measurement of use, and ii) adequate controls on downloading or providing access to users outside the site.
3. Systems operated by consortia of publishers. Given the dual requirements of i) individual control, and ii) ease of access by users, it may be appropriate for groups of publishers to develop schemes which reflect the needs of their sector, without impeding competition between them.
4. Individual licensing. Still likely to be practised by most publishers for the most significant acts of republishing and sub-licensing, and for setting basic restrictions when releasing their works onto the market place.
It should be noted i) that such systems are not designed to reduce the role and earning capacity of authors, but to enable authors to share in the full commercial rewards resulting from successful publication; and ii) that such systems need not cut out the role of collecting societies, in that agencies for the operation of such schemes will almost certainly be required, but that such agencies must be truly agencies, and not mechanisms for collectivising the highly individualistic and entrepreneurial market which is publishing.'
Whatever variations of these initiatives publishing industries adopt according to the differing legal cultures of the various states in which they trade, it is now essential that governments are persuaded that 'downstream' voluntary licensing can be exercised in the real world.
The Answer To The Machine Is In The Machine
'The question surrounding the electronic use of copyright materials is not so much, 'How shall we prevent access and use?' as 'How shall we monitor access and use?' Generally speaking, intellectual property is made available to the public so that it can be used, and mechanisms which simply prevent use eventually defeat the very reason for which the material was created at all. After all, to publish is to make something available to the public. The real issue is to link identifying, monitoring, control and reward. The ideal is a system which can undertake several different tasks, preferably all at the same time. A system must be able to identify copyright materials, to track usage, to verify users, and to record usage and appropriate compensation. In addition, the system should provide security for the integrity of the copyrighted material (freedom from tampering) and some level of confidentiality or privacy for the user. It might also provide the user with a price list showing various costs for different uses and individual materials along the model of a retail establishment. '
The above opening paragraph from Chapter Six of The Publisher in the Electronic World (the International Publishers Copyright Council, IPCC, Turin, May 1994) sets the scene for much active research in the field known broadly as Electronic Copyright Management Systems (ECMS). There is intensive work being invested by an impressively large number of researchers and institutions mainly in the USA (not least by the Association of American Publishers through its Enabling Technologies Project, whose Final Report was published in Summer 1995), but also in the EU and in Japan. The Association of Scientific Technical and Medical Publishers (STM) recently and very helpfully made available an inventory of such initiatives, prepared by Douglas Armati. No fewer than 20 American, 6 European and 5 Japanese projects are in hand for the management of literary works.
The anxieties of STM publishers, faced with potential mass piracy on digital networks and highways, will, it is suggested, incline them to seek first some security in closed circuit systems, for which there is likely shortly to be the relevant 'architecture'. Publishers will not wish to make the intellectual property of themselves and their authors available to open access until 'identifying, monitoring, control and compensation' models are much nearer secure reality.
One example of a closed circuit system is being pioneered and tested in the United Kingdom by a consortium of users, publishers and technologists under the name INFOBIKE (so called because the basic system architecture resembles a stylised bicycle - one wheel being Bibliographic Databases, and the other wheel being Document Servers). The project's mission statement is "to make available, and prove, in a real environment, an 'electronic document' finding, ordering, browsing and delivery service". The consortium members include Blackwell Science and Academic Press, who provide the learned journals contents for the project; ICL, who will develop the document server system; the Universities of Kent, Keele & Staffordshire; and the Consortium of Academic Libraries in Manchester (CALIM): last and perhaps most important are the participation and the project leadership of the Bath University Information and Data Services (BIDS). INFOBIKE has sought and received funding from a Group on Information Technology set up to implement the well-known Follett Report (hence the funding group's name, FIGIT).
The overall objective is to provide for services (to quote the consortium's proposal to FIGIT) 'which will allow users to have browsing and reading access to a large range of journals in electronic form, for which their institutions have paid licence fees. This (the Bibliographic Database wheel) will be backed up by an electronic document delivery facility (the Document Servers wheel), charged on a usage basis. Such services must be established taking account of the legitimate interests of users, libraries, publishers and authors. One of the factors which is critical to the introduction of such a service will be pricing models that are acceptable to all the parties.'
Similar pilot schemes are in hand in the Netherlands. Publishers may well in the next few years become reasonably comfortable with this kind of closed circuit system, which has features of both electronic subscription fees and 'pay as you use' transactional fees. The IPCC (International Publishers Copyright Council) has been specifically charged with the task of monitoring such systems as they develop.
The longer term future, however, must involve the publishers in open access systems. The first and foremost solution, on which all other solutions depend, will then be to find an identifying system. The World Intellectual Property Organisation (WIPO) recently called together a Working Group, including lawyers, software experts and standards officers, to examine this key issue. Its full title is 'Consultation Forum for NGOs on the Protection and Management of Copyright and Neighbouring Rights in Digital Systems'. One critical choice will be between 'intelligent' and 'dumb' identifiers. An intelligent identifier would encompass all the information relevant to identifying the copyright work, its various rights-holders, the terms on which the work can be licensed for various uses, etc. A dumb identifier would simply identify the work, and refer to a repository of relevant further information. One advantage of dumb rather than intelligent identifiers is that information which changes (just as each year works pass out of copyright into the public domain or, indeed, have their copyright extended or revived) can be kept securely up to date in the repository.
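The practical difference between the two kinds of identifier can be illustrated with a minimal sketch (in Python, with entirely invented identifiers and names): a dumb identifier is no more than a key, and every changeable fact about the work lives in a repository that can be corrected or updated without touching the copies of the work already in circulation.

```python
# Sketch: a 'dumb' identifier carries no rights information itself;
# all mutable facts live in a central repository that can be updated
# (e.g. when a work passes into the public domain).
# All identifiers and names here are illustrative, not from any real registry.

class RightsRepository:
    """Maps bare work identifiers to their current rights records."""

    def __init__(self):
        self._records = {}

    def register(self, work_id, rights_holder, in_copyright=True):
        self._records[work_id] = {
            "rights_holder": rights_holder,
            "in_copyright": in_copyright,
        }

    def update(self, work_id, **changes):
        # Changing facts (expiry, revival, transfer of rights) touch only
        # the repository, never the identifier embedded in copies of the work.
        self._records[work_id].update(changes)

    def lookup(self, work_id):
        return self._records[work_id]


repo = RightsRepository()
repo.register("WORK-0001", rights_holder="Example House")
repo.update("WORK-0001", in_copyright=False)   # work falls into the public domain
print(repo.lookup("WORK-0001")["in_copyright"])  # False
```

An intelligent identifier, by contrast, would have to embed the rights record in the identifier itself, so that every circulating copy goes stale the moment the facts change.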
A dumb identifier strongly suggests a role for collecting societies, which are well used to handling and updating work-specific information. Development of the Copyright Licensing Agency's Rapid Clearance System (CLARCS) in the UK, which calls up from its database work-specific information for licensing purposes, is, for example, likely to lean towards such a 'repository' function.
It is rather ironic that the book trade, often thought of as a 'steam-age' trade, actually pioneered unique identifiers - first with ISBNs, then with ISSNs, over 25 years ago, and now with journal article identifiers (SICI). Whether and how that system can be adapted must currently be an open question. It must, certainly, continue in existence for trading purposes even if universally compatible codes or one universal code for all categories of copyright works become realistic. Exciting work is now in hand by CISAC in creating a Common Information System (CIS) which may become capable of application to all categories of copyright works.
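The ISBN system already embodies a simple self-checking design: the tenth character of an ISBN-10 is a check digit computed from the first nine, so a mis-keyed identifier is usually detected before any database lookup. A sketch of the standard modulus-11 calculation:

```python
def isbn10_check_digit(first_nine: str) -> str:
    """Compute the ISBN-10 check digit for a nine-digit prefix.

    Weighted sum: the i-th digit (1-based, from the left) is multiplied
    by (11 - i); the check digit makes the total divisible by 11.
    A remainder of 10 is written as the letter 'X'.
    """
    total = sum(int(d) * (10 - i) for i, d in enumerate(first_nine))
    check = (11 - total % 11) % 11
    return "X" if check == 10 else str(check)


def is_valid_isbn10(isbn: str) -> bool:
    """Validate a full 10-character ISBN (digits, final character may be 'X')."""
    digits = [10 if c == "X" else int(c) for c in isbn]
    return sum(d * (10 - i) for i, d in enumerate(digits)) % 11 == 0


print(isbn10_check_digit("030640615"), is_valid_isbn10("0306406152"))  # 2 True
```

The same principle - a cheap arithmetic self-check built into the code itself - is available to any successor identifier scheme.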
The latest available Plan from CISAC sets out the following Principles-
'The CIS program rests on four underlying "principles of copyright information management for the protection of copyright within the international network in the digital age. These will be incorporated into the Principles and Standard Protocol" to which all participants in the CIS plan will in due course be expected to subscribe. The four principles are:
Application of the Principles
Any organisation wishing to play a full part in the Common Information System must respect these four principles. The net result of the four principles should be that anyone can identify the current rights owner of a copyright work in a given territory for the purposes of licensing, reporting and royalty payment.
In applying these principles, especially the principle of access, a number of other considerations must also be respected:
* The provisions of data protection legislation
* The internal security of each organisation's computer systems
* The confidentiality of the specific terms and provisions of copyright agreements
* The costs to the data provider, which may be met by service charges where appropriate.'
It is likely that an international project will soon emerge to co-ordinate, in particular, the American and European initiatives. One critical choice in this ongoing exploration will be the level of work which it is sought to identify. Technical experts state that a sentence of a book or learned journal article, or a bar of music, can be identified. It may be that identification needs to start only at the threshold beyond which intellectual property value is measured by fees.
Allied to that issue is one raised recently by Daniel Gervais, the Assistant General Secretary of CISAC, which he described as 'the problem of the smallest protected unit'.
'The question of the smallest work concerns the definition of the notion of work and, more precisely, how small can a creation be and still embody a sufficiently high degree of originality ....... in the literary field, a single word is not protected, but there are cases where a single sentence has been said to be protected. There is no universal answer to the question.' Daniel Gervais then quotes from a presentation by Dr Thomas Dreier of the Max-Planck Institute at the WIPO Harvard Symposium in Spring 1993: 'if digital technology and networking thus have a tendency to replace the "author" with mere contributors, the dissolution of what constitutes a "work" as described above seems to work quite to the contrary, i.e., in favour of the contributor's status as author. The reason for this is the fact that, if single parts of the entirety that was traditionally considered a work -- eventually even any combination of data to which a meaning is attached -- are regarded as independent "works", it would consequently be possible for independent "authorship" to attach to any of these minimal combinations.'
In Europe, the EU's CITED project (Copyright in Transmitted Electronic Data) was concluded in 1994 with meetings held throughout Europe to describe the model which the CITED team developed with funding from the ESPRIT research programme.
In an article in New Society in February 1995 Andy Lawrence gave a short and very clear overview of the CITED model-
'These proposals went further than merely specifying ways of encrypting information so that only authorised key holders can grab an electronic document from the Internet and convert it into a usable form. CITED tackled the tricky problem of what happens after the material has been decoded. If a computer file containing music, or the page of a journal, is sitting on the hard disc of a computer that is hooked up to a local area network, with basic encryption systems there is nothing to stop the authorised user from redistributing, or even printing, those files.
The model that the pan-European team has come up with is built around a tamper-proof software module which acts rather like indestructible tachometers installed on long-distance coaches and lorries, recording everything that happens to the copyrighted or commercially valuable material. As with the Cerberus approach, the basic idea is that the valuable material is linked to a specific piece of software. This software is required to gain access to the material, and it can only be converted into its usable form by someone in possession of the right key or password.
The difference with the CITED approach is that, when the authorised user requests a piece of software or some pages of a report or journal, he or she will have to key in a password. From then on, each time a program is run or a print of a page is made, the associated software module sends a message back to the secure database stored on the computer. The database can then track every activity carried out by the organisation's software modules, so providing an audit trail which shows whether pages are being printed or copied electronically. Eventually, it may be possible to forward this information to rights societies to help them determine how much artists, authors and publishers should be paid.'
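The audit-trail idea at the heart of the CITED model - a software module that records each use before it proceeds, for later reporting to rights societies - can be sketched in outline as follows (the class, event and identifier names are invented for illustration, not taken from the CITED specification):

```python
# Minimal sketch of a usage-monitoring module: every use of protected
# material is logged, and the log can be summarised per work, e.g. as a
# basis for royalty reporting. All names here are illustrative.

import datetime


class UsageMonitor:
    def __init__(self):
        self.audit_trail = []

    def record(self, user, work_id, action):
        """Log one use of a work before the use proceeds."""
        self.audit_trail.append({
            "when": datetime.datetime.now().isoformat(),
            "user": user,
            "work": work_id,
            "action": action,   # e.g. "view", "print", "copy"
        })

    def report(self, work_id):
        """Summarise recorded uses of one work, e.g. for royalty purposes."""
        counts = {}
        for event in self.audit_trail:
            if event["work"] == work_id:
                counts[event["action"]] = counts.get(event["action"], 0) + 1
        return counts


monitor = UsageMonitor()
monitor.record("user-17", "JRNL-42", "view")
monitor.record("user-17", "JRNL-42", "print")
print(monitor.report("JRNL-42"))  # {'view': 1, 'print': 1}
```

The hard part, of course, is not the logging but making the module tamper-proof - the 'indestructible tachometer' of Lawrence's analogy.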
CITED is now being applied in various successor projects, e.g. COPICAT, funded by the EU. Particular attention should be paid to IMPRIMATUR, a new and very ambitious EU-funded project whose overall aim is to provide a 'new settlement' for the digital era.
These indications of current work suggest that sooner or later the answer to the machine will indeed be found in the machine. Two consequent issues will then loom large. The first is whether the information society can wait for a further 20 years while owners of competing patented systems slug it out in the marketplace. Technical compatibility, possibly through the International Standards Organisation (ISO), must come soon onto the agenda.
The second consequent issue is the need for controlling legislation to reinforce technical protection. A draft Article is offered here:
Contracting Parties shall by civil and criminal measures prohibit
the effect of which is
There is precedent for such a provision in Art. 7 (I) (c) of the EU's Directive on the Legal Protection of Computer Programs. Further support is provided by specific statutory language in Appendix 2 of the USA White Paper on Circumvention of Copyright Protection Systems and Integrity of Copyright Management Information. It is, however, recognised that there will need to be provision for acceptable exemptions. Two examples come to mind: one for certain products of the consumer electronics industry; and one for the law-enforcement agencies.
The International Perspective
There is an increasing mismatch between the traditional concept of the nation-state as the engine of economic and social management, including the management of intellectual property, and the reality of the rise of transnational activity. The Internet is a vastly challenging transnational phenomenon, operating in over 75 countries, reaching over 25m people who access the Internet via over 2m PCs, with no concern for 'nation state' interests whatever. Nor will other information delivery systems have any such concern, e.g. university distance learning networks. And we see now very large firms operating across the world, sometimes with a purely technical relationship, merely that of registration as a business activity, with the traditional nation state.
We approach, therefore, in the words of Eric Hobsbawm in his masterly 'Short History of the 20th Century', 'a state of economic activities for which state territories and state frontiers are not the basic framework but merely complicating factors'.
Yet our copyright system, as the central pillar of national treatment in the Berne Convention makes clear, is founded on the nation state.
Can publishers therefore maintain the territoriality of copyright in an environment of unstoppable transnational/transborder flow of information over the digital highways?
One way forward is being pioneered by the music business in an alliance of tracking systems and collective administration. Godfrey Rust, the Database Controller of the Mechanical Copyright Protection Society (MCPS), wrote up for Copyright World his presentation to the 1994 CISAC Congress in Washington and offered the following instructive example of how the development of codes can lock in to the development of collective administration. (Note the plural 'codes': the ISBN code identifies only the book which contains the authors' work or works and the publisher of the book; it does not identify the work itself or the works themselves.)
'A French songwriter writes a song which, some months later, is recorded in the USA. Afterwards the recording is used in Australia.
The composer already has a unique CAE (Compositeur, Auteur, Editeur) number, which identifies him as author and owner. His new song is given an ISWC, an International Standard Work Code, by his publisher or, if he has none, by his society.
In the USA, when applying for its licence, the record company tells the Fox Agency of the recordings international standard recording code, the ISRC. The agency identifies the song through the French database, and links the recording code to the work code.
In Australia the recording is played somewhere, perhaps on a smart-card music-on-demand system on a superhighway. The ISRC is automatically tracked and reported to the Australian performing rights society, APRA. Through the network, thanks to the earlier work of Fox and SACEM, APRA automatically retrieves the work code and the CAE number. In due time they attach the appropriate payment, which goes through the French society to the composer's account'.
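The chain of lookups in this example - recording code to work code to owner number to account - can be sketched in a few lines (all registry contents below are invented for illustration, not real codes):

```python
# Sketch of the identifier chain in Rust's example: an ISRC (recording)
# resolves to an ISWC (work), which resolves to the CAE number of the
# rights owner, to whom the payment is routed.
# Every code and amount here is invented, not drawn from any real registry.

isrc_to_iswc = {"US-XXX-96-00001": "T-000000001-0"}   # recording -> work
iswc_to_cae = {"T-000000001-0": "CAE-123456"}          # work -> author/owner
accounts = {"CAE-123456": 0.0}                          # owner -> balance


def route_royalty(isrc: str, fee: float) -> str:
    """Follow the identifier chain and credit the rights owner's account."""
    iswc = isrc_to_iswc[isrc]   # e.g. the Fox Agency / society database lookup
    cae = iswc_to_cae[iswc]     # e.g. the CISAC CAE register lookup
    accounts[cae] += fee
    return cae


owner = route_royalty("US-XXX-96-00001", 0.05)
print(owner, accounts[owner])  # CAE-123456 0.05
```

The point of the sketch is that each society needs to hold only one link of the chain; the network of registries supplies the rest.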
Such a practical alliance of identifier codes with collective administration may become one central strategy for maintaining reward for uses of copyright works in the digital environments, as envisaged in the pioneering Common Information System whose basic principles are set out at Section Six above.
It is a cliché to observe that the advent of digital technology is both a challenge and an opportunity. But clichés become clichés because they are true, and the overall aim of this 7-point agenda is to arm publishers with some of the arguments which will enable them to meet the challenge posed by digital technology to the copyright system, the bedrock on which publishing houses are built, so that as publishers they can maximise both rewards for their authors and themselves and access for user communities of all kinds in the new digital world.
Last updated: April 03 1996. Copyright ICSU Press and individual authors. All rights reserved.