Can a Computer be an Author? Copyright
Aspects of Artificial Intelligence
By TIMOTHY L. BUTLER*
I
Introduction
Few people would doubt the significance of the computer in
modern United States society, but most would doubt a com-
puter "authored" the above quoted excerpt. Perhaps trivial
and somewhat nonsensical, it nonetheless evidences a signifi-
cant trend in computer science towards more powerful, crea-
tive and autonomous computer programs. In the computer
science field of artificial intelligence (AI), these developments
in computer programming pose unique problems in copyright
protection of computer software. Computer software capable
of automatic programming, inductive analysis and knowledge-
based problem solving will soon challenge the legal concepts of
authorship and originality central to the common law and stat-
utory basis of copyright. As the threshold between man and
machine narrows, courts will have to determine the legal sta-
tus of the apparently creative work product of a machine,
which, if produced by a human, would be afforded copyright
protection.
* Member, Third Year Class; B.A., University of California, Irvine, 1977. A ver-
sion of this note has been entered in the Nathan Burkan Memorial Competition.
1. RACTER, Soft Ions, OMNI, April 1981, at 96, 97.
II
Artificial Intelligence
The ultimate goals of AI researchers can roughly be divided
into two non-exclusive parts.2 One group of AI investigators
uses computers in attempts to simulate, and thereby under-
stand, human behavior.3 The other group of researchers uses
AI principles to make computers perform tasks unsuited to
human capabilities, thus extending the power of man in his en-
vironment.4 Both groups use AI machines5 as tools to achieve
their respective goals.
Although the field of AI has been filled with attempts to cre-
ate machines which perform human tasks with the speed and
efficiency of man, AI machines are still "remote from achieving
a level of intelligence comparable in complexity to human
thought."6 In early contemplation of machine intelligence,
2. Problem-Oriented Languages
The task of problem specification when using code generators is simplified
by using problem-oriented languages. "A problem-oriented language allows one
to state and to solve a whole family of problems. . . . The user just has to
state his problem in a descriptive manner . . . ."30 ALICE was the first
language of this kind. ALICE has been applied to real-life problems involving
operations research and has been more successful than traditional programs.31
When using ALICE, a purely descriptive statement of the problem is entered
using mathematical sets and symbols. No algorithm32 is supplied by the
user.33 The computer systematically introduces heuristics to derive
implications and make appropriate choices.34 The problem solution is
obtained, and the coded algorithm used is available for future use.
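The note does not reproduce ALICE's notation, but the division of labor it describes can be sketched in a modern general-purpose language. The following Python fragment is purely illustrative and assumes nothing about ALICE itself: the user's entire contribution is a descriptive statement of the problem (a domain of values and a list of conditions), and the search for a satisfying assignment is left to the machine.

    # A rough, hypothetical sketch (not ALICE syntax) of a problem-oriented
    # language's division of labor: the user states only the "what" -- a domain
    # and the conditions a solution must satisfy -- and never supplies an
    # algorithm. The brute-force search below stands in for the heuristics a
    # system like ALICE would introduce on its own.
    from itertools import permutations

    variables = ["A", "B", "C", "D"]
    domain = [1, 2, 3, 4]
    conditions = [
        lambda v: v["A"] < v["B"],       # A must precede B
        lambda v: v["C"] != 2,           # C may not receive the value 2
        lambda v: v["A"] + v["D"] == 5,  # A and D together must total 5
    ]

    def solve(variables, domain, conditions):
        """Try assignments systematically; return the first one satisfying
        every condition in the descriptive statement."""
        for values in permutations(domain, len(variables)):
            assignment = dict(zip(variables, values))
            if all(cond(assignment) for cond in conditions):
                return assignment
        return None

    print(solve(variables, domain, conditions))
    # {'A': 1, 'B': 2, 'C': 3, 'D': 4}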
3. Knowledge-Based Systems
Code generators and problem-oriented languages like AL-
ICE evidence only two approaches towards the AI goal of auto-
mating the programming process (Type I AI programs). A
third major attempt has focused on the knowledge-based ap-
proach to automatic programming.3 5 One such project was PE-
COS, where the researchers summed up their method and
rationale as follows:
[H]uman programmers seem to know a lot about program-
ming. This suggests a way to try to build automatic program-
ming systems: encode this knowledge in some machine
readable form. In order to test the validity of this approach,
knowledge about elementary symbolic programming has been
codified into a set of about four hundred detailed rules, and a
system, called PECOS, has been built for applying these rules to the task of
implementing abstract algorithms.36
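Because PECOS's four hundred rules are not reproduced in the article, the following Python sketch is only a toy analogy of the approach the quoted passage describes: programming knowledge is written down as explicit rules, and an abstract algorithm is implemented by applying whichever rule matches each abstract step. The rule entries and the sample abstract algorithm are invented for illustration.

    # Toy illustration of a knowledge-based (rule-applying) code generator.
    # The rule base maps abstract operations to concrete realizations; the
    # rules below are invented and are not PECOS's actual rule set.
    rules = {
        "add element to collection": "{coll}.append({elem})",
        "test membership":           "{elem} in {coll}",
        "enumerate collection":      "for {elem} in {coll}:",
    }

    def implement(abstract_steps):
        """Apply the rule base to each step of an abstract algorithm,
        producing concrete code fragments."""
        concrete = []
        for step in abstract_steps:
            operation = step["operation"]
            if operation not in rules:
                raise ValueError(f"no rule encodes knowledge about: {operation}")
            concrete.append(rules[operation].format(**step["args"]))
        return concrete

    abstract_algorithm = [
        {"operation": "add element to collection",
         "args": {"coll": "seen", "elem": "item"}},
        {"operation": "test membership",
         "args": {"coll": "seen", "elem": "item"}},
    ]

    for line in implement(abstract_algorithm):
        print(line)
    # seen.append(item)
    # item in seen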
B. Simulation of Intelligence
Although devising and writing computer programs is also
traditionally a human endeavor now sometimes accomplished
wholly or partially by a computer, RACTER is a computer pro-
gram that simulates the written output of a human author
(Type II AI program).40 Using a vocabulary stored in its mem-
ory, RACTER applies grammatical rules to construct semi-co-
herent stories in English. The author of the program stated
that "once the story-writing program is set into motion the out-
put is not only novel and a priori unknowable but also cohe-
sive and apparently thoughtful. It is crazy thinking, I grant
you, but thinking conducted in perfect English."41 RACTER
picks nouns and verbs, adjectives and adverbs, at random from
lists that [the author] supplies. Then it tinkers with them and
strings them together according to the rules of grammar.
RACTER conjugates verbs, keeps track of singular and plural
nouns and of male and female characters, and chooses verbs
and pronouns to fit.42
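RACTER's source code and word lists are not set out in the article, but the mechanism the quoted description attributes to it can be approximated briefly. The Python sketch below is a hypothetical illustration only: a human-supplied vocabulary, random selection, and fixed grammatical rules handling conjugation, number and pronoun agreement.

    # Minimal sketch of random word selection constrained by grammar rules.
    # The vocabulary and the single sentence pattern are invented; they are
    # not RACTER's.
    import random

    # Author-supplied word lists (the human contribution).
    nouns = [("poet", "singular", "he"), ("electrons", "plural", "they"),
             ("machine", "singular", "it")]
    verbs = ["dream", "murmur", "calculate"]
    adjectives = ["furious", "tranquil", "metallic"]

    def conjugate(verb, number):
        """Third-person present tense: add -s only for a singular subject."""
        return verb + "s" if number == "singular" else verb

    def sentence():
        noun, number, pronoun = random.choice(nouns)
        verb = random.choice(verbs)
        adjective = random.choice(adjectives)
        subject = "The " + noun if number == "singular" else noun.capitalize()
        # String the random choices together under fixed grammatical rules,
        # keeping number and pronoun consistent with the chosen subject.
        return (f"{subject} {conjugate(verb, number)} while "
                f"{pronoun} {conjugate('consider', number)} the {adjective} sky.")

    for _ in range(3):
        print(sentence())
    # e.g. "The machine dreams while it considers the metallic sky."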
The source program 43 and the vocabulary are supplied by the
original author of RACTER. The products of the machine,
simulating the output of a human author, are those of the com-
puter and its software. As programs like RACTER become
more sophisticated, the style and coherency of the resulting
stories should improve. The human will merely be required to
push a button to execute the program. After the human starts
the program, the results are not foreseeable or predetermined;
the story has possibly not been "conceived" in the sense of
original authorship required by the copyright laws.44 Under
40. RACTER, supra note 1, at 97-98.
41. Id.
42. Id. at 98.
43. A source program is the coded instruction set usually written in a high level
procedural language (FORTRAN, COBOL, etc.) which has been input by the user and
represents the human-readable version of the compiled object module.
44. See notes 127, 136 and accompanying text, infra.
III
The Modern Digital Computer and Computer
Programs
A. Historical Development
The history of the digital computer has been characterized
by a decrease in physical size, a reduction in cost per calcula-
tion, an increase in commercial applications and the creation
of a mature computer industry and associated support technol-
ogies. Although Charles Babbage first formulated the concept
of an "Analytical Engine" capable of automatically performing
numerous mathematical functions in 1833,45 the first electronic
digital computer with a stored program capability was not op-
erational until over a century later when the EDSAC (Electronic
Delay Storage Automatic Calculator) was constructed at
Cambridge University in 1949.46 Since the 1940's, computer
technology has passed through several generations. These
generations are distinguishable by the technology used, the
programming languages employed and by type of function or
application sought or achieved.47
45. S. MANDELL, COMPUTERS AND DATA PROCESSING CONCEPTS AND APPLICATIONS
22 (1979).
46. Id. at 27.
47. The first generation, (1951-1959), was characterized by vacuum tube compo-
nents. Data was input by paper tape or punched cards. The languages available to the
programmer were restricted to actual binary code and assembly-level symbolic lan-
guages. Mainly used for military and scientific applications, these computers were
also sometimes employed in the business environment to expedite payroll and billing
tasks. Though much faster than the earlier electromechanical computers, these ma-
chines generated large amounts of heat and experienced reliability problems because
of vacuum tube failure.
During the second generation of computer technology, (1959-64), vacuum tubes were
replaced by transistors, magnetic cores replaced magnetic drums and assembly and
machine level languages were supplemented by higher-level, procedural languages
like FORTRAN. Data was input from or stored on magnetic tapes, paper cards,
punched cards or from the keyboard. These machines were less expensive, smaller
and more reliable than the previous generation. This generation of computers was
primarily built for general business use.
The third generation, (1965-1977), was characterized by use of integrated circuits,
sophisticated operating systems, magnetic disk storage and the introduction of a pleth-
ora of general and specific programming languages. Though "old" from a technological
viewpoint, many of these machines are still in use. They are very fast in calculation,
low in cost, small in size and extremely reliable. This generation has formed the data
processing infrastructure which currently supports U.S. business, education, govern-
ment and defense. See S. MANDELL, supra note 45, at 19-39. See generally W. RODGERS,
THINK: A BIOGRAPHY OF THE WATSONS AND IBM (1974).
The fourth generation extends from the late 1970's into the present. This generation
of computer technology is characterized by very large scale integration (VLSI), in-
creased data storage capacity, a marked reduction in price and an increase in reliabil-
ity and applications possibilities. This technology is used in very powerful and fast
computers like the Cray I, which is capable of close to 100 million calculations per
second. The fourth generation has also spawned the "personal" computer. Based on
microprocessor technology, these machines are small and inexpensive, and are prolif-
erating rapidly into the world of the small businessman, professional, scientist, educa-
tor and hobbyist. See Toong and Gupta, Personal Computers, 247 SCI. AM. 87 (Dec.
1982), and Matisoo, The Superconducting Computer, SCI. AM., May 1980, at 50, 65. See
generally SCI. AM., MICROELECTRONICS (1977) (collection of 11 articles published in Sci-
entific American covering development, technology and applications of microelectron-
ics) and Hogan, From Zero to a Billion in Five Years, INFOWORLD, Aug. 31, 1981, at 6.
The Japanese feel we're well on our way towards a fifth generation of computer tech-
nology. "The previous four generations of computers ... were categorized according
to their elements: the first used the vacuum tube, the second the transistor, the third
the integrated circuit, or semiconductor, and the fourth the very large scale integrated
circuit, or VLSI.... [The Japanese] expect the fifth generation to use, in addition to
VLSIs, circuits that work on different physical principles from today's semiconductors.
These powerful new circuits, however, are far from the only way in which fifth genera-
tion computers will be revolutionary." Lehner, Japan Starting 10-Year Effort to Create
Exotic Computer, Wall St. J., Sept. 25, 1981, at 29, col. 1. See also Uttal, Here Comes
Computer Inc., FORTUNE, Oct. 4, 1982, at 82.
48. See note 47, supra.
49. See Toong and Gupta, supra note 47, at 100-104. The 42 members of the Com-
puter and Business Equipment Manufacturer's Association in 1976 predicted an ex-
penditure of 17 billion dollars on software alone during 1976-79. Gemignani, Legal
Protection for Computer Software: The View from '79, 7 RUTGERS J. COMPUTERS TECH.
& L. 269, 274 (1979). An estimated seven million Americans spent their work days in
front of video display terminals in 1981. Markoff & Freiberger, VDT's Can Cause Stress
and Other Health Hazards in the Office, INFOWORLD, Oct. 26, 1981, at 21. "When elec-
tronic computers became commercial, just after World War II, forecasters predicted
that no more than a dozen machines ever would be needed. IBM even decided not to
sell computers because the market seemed so small. Yet this year, for the first time,
computers on the planet will outnumber people. According to market researchers
Dataquest, Inc. and International Data Corp., by the end of 1982, more than five billion
computers of all sizes, from microprocessors to mainframes, will be in use." Shaffer,
Computing Industry is Finding That It's Vulnerable to Slump, Wall St. J., April 16,
1982, at 27, col. 1. "So many tiny computers are working under the hood of today's
automobile to increase mileage and lower pollution that a single car maker, General
Motors, actually manufactures more computers every year, albeit smaller ones, than
B. Computer Programs
A modern digital computer functions as a vast array of electrical
switches, whose values ("on" or "off") can be manipulated to represent
and modify information.51 This manipulation of information is controlled
within the machine by a set of instructions known as a "program."52
The "program" most people are familiar with is the actual "coded"
instruction set entered at the computer keyboard (or on punched cards or
paper tape, etc.).53 The program is a representation54 of an algorithm
prepared initially as a flow chart or step-wise procedure55 written in
standard English to solve a particular problem or to perform a specific
task.56
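By way of illustration (the example is invented, not drawn from any source cited here), the same trivial algorithm can be stated first as a step-wise procedure in standard English and then as the coded instruction set a computer executes:

    # Step-wise procedure, in standard English, for averaging exam scores:
    #   1. Start with a running total of zero.
    #   2. Add each score to the running total.
    #   3. Divide the total by the number of scores.
    #   4. Report the result.
    def average(scores):
        total = 0                      # step 1
        for score in scores:           # step 2
            total += score
        return total / len(scores)     # step 3

    print(average([88, 92, 75]))       # step 4 -> 85.0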
about a certain result." See Gemignani, supra note 49, at 280-281, for further discussion
of definitions of "computer programs." See also Nycum, Legal Protection for Computer
Programs, 1 COMPUTER/LAW J. 1, 11-12 (1978).
57. See generally S. MANDELL, supra note 45, at 53-54.
58. "Compilers are necessarily dependent upon the language being processed;
however, the following steps are usually involved:
1. The source program is read and analyzed on a statement by statement
basis.
2. A lexical analysis routine scans each source statement and identifies re-
served words, variables, operator symbols, constants, etc.
3. A syntactical analysis routine identifies the type of statements and veri-
fies that the structure of that statement is admissible.
4. Tables and lists of symbols, expressions, and statements are maintained so
that an inter-statement analysis can be made.
5. Analysis is made of logical flow of the source program and a global error
analysis is made.
6. Machine language instructions, in an intermediate symbolic form internal
to the compiler are generated and optimization is performed, as required.
7. An object module is generated from the intermediate language and a pro-
gram listing is produced.
H. KATZAN, supra note 51, at 22-23. After the compilation stage, the output module
contains not only the translated code but also various system routines, or instructions
invoking these routines, to assist in execution and use of system resources.
Some compilers "optimize" the module by modifying it in even more drastic fashion to
allow for more efficient execution. See id. at 24-36 for an introduction to loading, link-
ing and editing.
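The first three steps Katzan lists can be made concrete with a deliberately tiny sketch. The miniature source "language" below (single-line LET assignments) and the Python code are invented for illustration and do not correspond to any real compiler:

    # Toy front end: read a source program statement by statement (step 1),
    # lexically scan each statement into reserved words, variables, operators
    # and constants (step 2), and make a simple syntactic check (step 3).
    import re

    RESERVED = {"LET"}
    TOKEN = re.compile(r"\s*(?:(\d+)|([A-Za-z]+)|([=+\-*/]))")

    def lex(statement):
        """Step 2: break one source statement into classified tokens."""
        tokens, pos = [], 0
        while pos < len(statement):
            match = TOKEN.match(statement, pos)
            if not match:
                raise SyntaxError(f"unrecognized character at column {pos}")
            number, word, operator = match.groups()
            if number:
                tokens.append(("CONSTANT", number))
            elif word:
                tokens.append(("RESERVED" if word in RESERVED else "VARIABLE", word))
            else:
                tokens.append(("OPERATOR", operator))
            pos = match.end()
        return tokens

    def admissible(tokens):
        """Step 3: verify the form LET <variable> = <expression>."""
        return (len(tokens) >= 4 and tokens[0] == ("RESERVED", "LET")
                and tokens[1][0] == "VARIABLE" and tokens[2] == ("OPERATOR", "="))

    source_program = ["LET x = 2 + 2", "LET y = x * 3"]
    for statement in source_program:                 # step 1
        print(statement, "->", "admissible" if admissible(lex(statement)) else "error")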
59. But this correspondence is not necessarily in a 1:1 ratio.
IV
Artificial Intelligence and Federal
Copyright Law: 1982
The computer industry has sought to protect its ideas, tech-
niques, designs and products from misappropriation, infringe-
ment and unauthorized use. The industry has traditionally
relied upon the use of trademarks, copyrights, trade secrets,
patents and licensing agreements to protect its property inter-
ests. 64 However, the recent growth of the market in video-
games, home or "personal" computers and small business sys-
tems has created problems in protecting and enforcing legal in-
terests in computer software.65 Low cost, ease of copying,
60. See note 24, supra.
61. See note 30, supra.
62. See note 35, supra.
63. See note 32, supra.
64. See generally Nycum, supra note 56, at 1-81; Gemignani, supra note 49, at 269-
313; Comment, Copyright Protection for Computer Programs, 47 TENN. L. REV. 787
(1980); Schmidt, Legal Proprietary Interests in Computer Programs: The American Ex-
perience, 21 JURIMETRICS J. 345 (1981).
65. "The case of [MICROCHESS] shows how severely amateur copying can dam-
age software sales. Before the International PET Users' Group published a method of
copying [MICROCHESS], the game program had sold more than 100,000 copies. After
publication of the copy method, sales dried up. By contrast, the semiprofessional pro-
gram [WORDCRAFT] enjoyed a dramatic increase in sales when the protection rou-
tine known as the 'Dongle' was incorporated." Hayman, Software Protection in the
United Kingdom, BYTE, Oct. 1981, at 132. See generally Freiberger, Software Piracy,
INFOWORLD, Mar. 22, 1982, at 31.
66. See generally D. REMER, LEGAL CARE FOR YOUR SOFTWARE (1982).
67. See, e.g., Teiser, Locksmith Version 4.0from Omega Micro-Ware, INFOWORLD,
Feb. 1, 1982, at 22 (review of a popular program used to defeat copy-protection
schemes). One commentator highlighted the problems of copy-protection:
"[A] mateurs confront software publishers with a dilemma: if publishers take no steps
to protect their programs, making a copy becomes the easiest thing in the world. On
the other hand, if publishers use protection routines, making a copy is for many ama-
teurs the most enjoyable thing in the world. Unlike semiprofessional users of
software, amateurs have both the time and the enthusiasm needed to defeat protective
measures." Hayman, supra note 65, at 132. Peter Laurie, editor of PracticalComput-
ing, confirmed this view by saying "Any intelligent tpenager will make it [overcoming
copy-protection measures] his first task of the day." Id. See generally Freiberger,
Software Piracy, INFOWORLD, Mar. 22, 1982, at 31.
68. See text section IV and accompanying notes, infra.
69. A program is usually sold imbedded on a "floppy disk" (a magnetic recording
medium about the size of a 45 rpm record) enclosed within some form of packaging.
The licensing agreement, warranty, etc. is often sealed within the package and is un-
available for review prior to purchase of the program. The user is supposed to fill out
the enclosed forms, mail them to the manufacturer and then abide by the terms stated
therein. See generally D. REMER, supra note 66.
70. Id.
71. A single-disk user is in great need of a backup copy, especially where his data
and program are still on the same disk. Some users take this privilege too far, causing
problems for all users. See generally Freiberger, Software Piracy, INFOWORLD, Mar. 22,
1982, at 31. One commentator has said, "amateur piracy will have five consequences
for the average software buyer. It will reduce the range of software available, raise
prices, and make companies reluctant to invest in software development. ...
[P]iracy also leads to lack of support and maintenance, and discourages development
of software by cottage industries which cannot afford to go to court to protect their
interests." Hayman, supra note 65, at 133.
72. "[Djramatic change in the law and the growing trend toward mass-marketed
programs mean that copyright is likely to be increasingly important in protecting com-
puter programs, particularly those of small entrepreneurs who create their works for
individual consumers and who can neither afford nor properly use other forms of pro-
tection." FINAL REPORT OF THE NATIONAL COMMISSION ON NEW TECHNOLOGICAL USES
OF COPYRIGHTED WORKS, July 31, 1978, at 15. [hereinafter CONTU]. See also Remer,
Legal expert on software theft: the piranhas versus true pirates, INFOWORLD, Mar. 22,
1982, at 40; Lawlor, A Proposal for Strong Protection for Computer Programs Under the
Copyright Law, 20 JURIMETRICS J. 18 (1979) (broad support within business commu-
nity for copyright protection for computer software); Freiberger, Sony Case Scares
Micro Makers, INFOWORLD, Nov. 16, 1981, at 1 (discusses Universal Studios v. Sony
Corp. of America, 659 F.2d 963 (9th Cir. 1981) as it affects possible copyright infringe-
ment by makers of anti-copy protection devices specifically designed and marketed to
defeat copyright owner's attempts to protect source code from unauthorized copying).
73. CONTU, supra note 72, at 11.
74. Id.
75. 17 U.S.C. app. §§ 101-810 (1976 & Supp. IV 1980).
76. CONTU, supra note 72, at 1.
77. Boorstyn & Fliesler, Copyrights, Computers, and Confusion, 56 CAL. ST. B.J.,
148 (1981).
78. Id. See also Wehringer, Copyright in Brief, PRAC. LAw., July 15, 1979, at 77
(brief discussion of filing procedures for copyright protection).
derlying computer program was not copyrighted. But see In re Certain Coin-Operated
Audio-Visual Games and Components Thereof, 537 PAT. TRADEMARK & COPYRIGHT J.
(BNA) No. 337-TA-87, at A-5 (July 16, 1981) where the International Trade Commission
distinguished between the "attract mode" and the "play mode" when assessing
copyrightability of commercial audiovisual displays. Midway had asked the ITC to ex-
clude importation of video games appearing to infringe on copyrights it held on its
Galaxian video game. The Commission held the attract mode was copyrightable be-
cause it always replayed the same identical sequence. Similarly, it held the first few
seconds of the play mode also copyrightable, because this sequence is also repeated
identically in every game. However, the Commission refused to decide whether per-
formance of the game could possibly infringe upon a valid copyright of the play mode:
First, for statistical reasons, it is virtually impossible for a performance of
Galaxian ever to duplicate that performance fixed in the video tape. If we
were to hold that such performances could infringe a copyright in the play
mode, we might be protecting the game itself or its mode of play, items which
are specifically not subject to copyright protection. Second, each performance
of the Galaxian play mode depends, in part, on the player. It is therefore pos-
sible that the player may be considered a "coauthor" of each performance of
the play mode. Our research has indicated no legislative history or case law
on whether coauthored works of this sort are subject of copyright, and we de-
cline to rule on this issue. Third, in view of the remedy we are granting in this
investigation, a ruling on either copyrightability or infringement of the play
mode is unnecessary.
Id. at A-6.
96. Boorstyn, supra note 77, at 148.
97. 17 U.S.C. app. § 102(a) (1976).
106. Synercom Technology, Inc. v. University Computing Co., 199 U.S.P.Q. (BNA)
537 (N.D. Tex. 1978).
107. Comment, Copyright Protection for Computer Programs, 47 TENN. L. REV. 787,
788 (1980).
108. CONTU, supra note 72, at 19 (emphasis omitted).
109. Id. See also notes 114, 115 and accompanying text, infra.
110. CONTU, supra note 72, at 19.
111. The programmer who uses the software has no control over the exact code or
story that is produced by the AI software. The programmer who created the AI
software itself might have an idea of what the program will do when faced with a given
set of data, but, as in RACTER, if the program randomly selects nouns and verbs then
places them in grammatically correct relationships consistent with the rules given it
by the original programmer, even he would be unable to foresee or influence the con-
tent of the story output.
112. The federal copyright laws require a work to be that of an author and that it
evidence a non-trivial amount of "intellectual labor." See notes 116-118 and accompa-
nying text, infra.
113. See generally Milde, supra note 8.
114. See notes 116-118, 127 and accompanying text, infra. See also Apple Computer,
Inc. v. Franklin Computer Corp., 545 F. Supp. 812 (E.D. Penn. 1982) (slip opinion, July
30, 1982) (in denying copyrightability of a series of programs stored in ROM on a chip,
the court discussed the authorship requirements of copyright law). But see Williams
Electronics, Inc. v. Artic International, Inc., 685 F.2d 870 (3rd Cir. 1982) (slip opinion,
August 2, 1982) (on similar facts, court held programs stored in ROM to be works of
authorship, fixed in tangible medium of expression, etc., meeting all requirements of
copyright law). See also Midway Mfg. Co. v. Artic International, - F. Supp. - (E.D. Ill.
1982) (slip opinion, March 10, 1982) (programs stored in ROM copyrightable).
115. Apple Computer, Inc. v. Franklin Computer Corp., 545 F. Supp. at 825. "If the
concept of 'language' means anything, it means an ability to create human interaction.
It is the fixed expression of this that the copyright law protects, and only this. To go
beyond the bounds of this protection would be ultimately to provide copyright protec-
tion to the programs created by a computer to run other computers. With that, we step
into the world of Gulliver . . . ."
are represented within the computer as their electronic equivalents. The author then
executes certain word-processor editing programs to manipulate this data as desired
and then directs the computer to print the final product. Here, the ideas concerning
the story have originated in the mind of the author. This set of ideas is expressed
through the display on the video screen, within the computer memory and in the final
output version of the story. The word-processor has functioned merely as a tool to
help the author express his ideas. Next, consider a participant in a sophisticated inter-
active fantasy game played on a computer modified to print out a narrative summary of
the game as it is being played. Here, the player/author initiates the game by starting
the program. His idea is to play the game. He may have ideas regarding strategy and
select the level of complexity at which he desires to play, but he will not be able to
determine the specifics of his "plot" with any degree of certainty. As the player selects
the general traits of his characters (certain powers, skills, etc.), the computer program
fills in the other character attributes using preprogrammed (random) values unknown
to the player. The actual course of events and the outcome of the game are determined
jointly through the interaction between the values input at the keyboard by the user
and those supplied by the computer as a programmed, sometimes random, response.
Although the software-generated portion of the game dialogue is apparently idea ex-
pressive, the ideas expressed by the software are not those of the human player. The
flow and amount of the author's originality has been intermixed with the product of
the machine. Finally, if one were to replace the human game participant with a com-
puter program which randomly selected allowable answers to the game software que-
ries, the amount of human input attributable to the game player would be reduced to
nil because the human would merely initiate the game dialogue-producing program,
yet would not contribute at all to its idea-expressive content. See generally Word
Processing, INFOWORLD, Jan. 11, 1982, at 17 (special section on word-processing technol-
ogy) and Supergames, INFOWORLD, Apr. 12, 1982, at 14 (special section on computer
video-games).
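The interactive-game scenario just described can also be sketched, purely hypothetically, in a few lines of Python: the player contributes only a handful of general choices, the program supplies random character attributes and plot turns, and the printed narrative is determined jointly by the two. No actual game named in this note is represented.

    # Hypothetical sketch of a game whose narrative output mixes human
    # choices with preprogrammed random values.
    import random

    def play(player_name, chosen_skill, complexity):
        # Attributes the player does not control (the machine's contribution).
        temperament = random.choice(["rash", "patient", "cunning"])
        strength = random.randint(1, 10) * complexity
        events = ["meets a ferryman", "finds a sealed door", "loses the map"]
        narrative = [f"{player_name}, a {temperament} wanderer skilled in "
                     f"{chosen_skill}, sets out."]
        for _ in range(complexity):
            event = random.choice(events)          # machine-selected plot turn
            outcome = "prevails" if strength > 5 else "retreats"
            narrative.append(f"The wanderer {event} and {outcome}.")
        return "\n".join(narrative)

    # The human contribution is limited to these general choices.
    print(play("Aldric", "map-reading", complexity=2))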
137. Mazer v. Stein, 347 U.S. at 217.
138. Presently, machines are not deemed able to formulate ideas capable of being
expressed in a copyrightable manner and cannot be said to be authors of original ex-
pressions within the meaning of the Act. Allowing copyright protection for expres-
sions not evidencing an underlying human-originated idea would raise significant
problems. See Apple Computer, Inc. v. Franklin Computer Corp. 545 F. Supp. at 824-
825.
V
Authorship, Originality and Artificial Intelligence:
Coping with the Man/Machine Threshold
We have seen that developments in AI challenge the applica-
tion of present copyright laws to computer generated software
and literature. Both the "original expression by an author" re-
quirement and the goal of affording copyright protection to ex-
pressions of ideas rest on a fundamental assumption that
"authors" are human beings. 43 As AI progresses, this assump-
tion will be tested in a legal sense as valuable and apparently
139. One might argue the ideas "expressed" in the story or program produced by an
AI machine are "derived" from the ideas of the original human programmer or the AI
software. This approach misconstrues the definition of a "derived work" as used in the
Act. See note 173 and accompanying text, infra. Also, one might question how far
removed the human can be from the process of expression and still maintain a claim
for authorship. For example, if a programmer designs an AI machine which produces
other AI machines which produce other AI machines, etc.-each of which randomly
produces stories, can the ideas of the original software programmer be effectively
linked or traced to the nth story written? What about a program which summarizes
news stories as they appear on the UPI wire service? See Waltz, Sci. Am. Oct. 1982, at
118, 132.
140. See note 136, supra.
141. An author is "one that originates or gives existence," WEBSTER'S NEW COL-
LEGIATE DICTIONARY 75 (1981). This colloquial meaning is given more definite meaning
under copyright law. See note 127, supra. Also, CONTU's refusal to even speculate
upon the copyright issues presented by computer-authored literary works seems to
indicate an explicit presumption that the Act is meant to deal with human (or at least
non-computer) authors. See CONTU, supra note 72, at 44.
142. Language understanding and synthesis is an area of great interest to AI re-
searchers. See Waltz, Sci. Am. Oct. 1982, at 118, 132.
143. See note 138, supra.
155. 35 U.S.C. § 101 (1970) provides: "Whoever invents or discovers any new and
useful process, machine, manufacture, or composition of matter, or any new and useful
improvement thereof, may obtain a patent therefor . . . ." The word "process" means
"process, art or method, and includes a new use of a known process, machine, manu-
facture, composition of matter, or material." 35 U.S.C. § 100(b) (1975). See also Nycum,
supra note 56, at 4-5 and Schmidt, supra note 64, at 355-57.
156. A patent is valid for a period of seventeen years from date of issuance. The
holder has the right to "exclude others from making, using, or selling the invention
throughout the United States .... " 35 U.S.C. § 154 (1970).
157. For instance, in copyright multiple expressions of a similar idea are permitted,
but in patent multiple use of the idea itself is prohibited without authorization from
the patent holder.
158. The costs of litigation in patent enforcement can be prohibitive. In 1967, a
study found costs of litigation for patent enforcement/infringement to range from
$25,000 to over $1,000,000. See Harris & Chuppe, Cost of Enforcement of Industrial Prop-
erty Rights, 14 IDEA 77, 81 (conf. issue, 1970). See also Nycum, supra note 56, at 72-73.
159. To do otherwise would distort the meaning of "idea" under patent law. See
note 160, infra.
164. Patent registration and approval are time-consuming, costly mechanisms when
compared with the immediate contract arrived at through use of trade secret or licens-
ing agreements made with employees or co-authors.
165. A trade secret is "any formula, pattern, device or compilation which is used in
one's business, and which gives him an opportunity to obtain an advantage over com-
petitors who do not know or use it." RESTATEMENT OF TORTS § 757 comment b (1939).
See generally Raysman, Protection of Proprietary Software in the Computer Industry:
Trade Secrets as an Effective Method, 18 JURIMETRICS J. 335 (1978) and Schmidt, supra
note 64, at 386-99.
166. Exceptions to subject matter not valid as objects of services contracted for in-
clude "illegal bargains." See generally 1 A. CORBIN, CORBIN ON CONTRACTS §§ 1373-78
(17th printing 1975).
167. 17 U.S.C. app. § 301 effectively preempts state and local governments from ex-
ercising jurisdiction over copyright in works "fixed" after January, 1978. At present,
states may exercise authority over works fixed prior to Jan. 1978, "unfixed" works and
certain other narrow areas not yet covered by federal statute. See generally N. BOOR-
STYN, supra note 84, at 10-20. At present, there is no uniform, national statutory basis
for enforcing rights granted under trade secrecy and licensing agreements.
168. If one grants copyright to "expressions" not purported to be the result of un-
derlying intellectual labor, the concept of authorship becomes meaningless. Natural
phenomena as well as animals would thereby become likely candidates for authorship
if, by accident, an "expression" is produced. (e.g., a cat walks on an ink pad and then
onto a piece of paper, leaving an "expressive pattern" in his trail. One cannot distin-
guish this expression from one intentionally done by a human for some unknown rea-
son. If one doesn't look to the expression of an underlying idea or some evidence of
intellectual effort having been expended as the basis for copyright, then the accidental
markings of the cat would be equated with the works of famous artists: both would be
copyrightable or neither would be. Critics of modern art might cheer this interpreta-
tion but the damage done to the goals and the availability of copyright is significant).
See Apple Computer, Inc. v. Franklin Computer Corp., 545 F. Supp. 812 (E.D. Penn.
1982) and Alfred Bell & Co. v. Catalda Fine Arts, Inc., 191 F.2d 99 (2d Cir. 1951) (inadvertent expressions can be
"adopted" by a human and thus meet authorship requirements under copyright).
169. In other words, the "legal person" status of a corporation reflects a need for
society to recognize and facilitate a method for efficiently organizing human physical
and intellectual effort to achieve maximum return on monetary and human capital in-
vestment. See P. DRUCKER, CONCEPT OF THE CORPORATION 30-45 (rev. ed. 9th printing
1975).
170. Id. at 174-215.
171. One of the fundamental goals of copyright protection is to give the author in-
centive to express his ideas. However, if the "author" is an AI machine, the availability
of copyright protection would be of no consequence because the AI machine, being
non-sentient, would be unable to perceive the "incentive" or reap the economic bene-
fits of such protection. Thus, one must turn towards finding authorship in humans,
who are capable of recognizing and responding to such incentives, thereby promoting
broad policy goals of economic development and dissemination of valuable ideas.
172. See note 168, supra.
173. 17 U.S.C. app. § 101 defines a "joint work" as work prepared by two or more
authors "with the intention that their contributions be merged into inseparable or in-
terdependent parts of a unitary whole." It is doubtful that a computer can have "inten-
tions" in a legal sense. See generally A. CORBIN, supra note 166, at §§ 2, 3, 9, 15. But,
one commentator argues for this approach when he states "[P]resumably both the
original programmer for the machine and the person whose problem the machine is
designed to handle are both contributing to the formulation of the final problem. But,
since neither of these human beings need have full knowledge of the steps the com-
puter will perform to create a program, can it not be said that the machine is also at
least co-author of the program?" Milde, supra note 8, at 395.
174. See generally A. CORBIN, supra note 166, at §§ 3, 9, 622-26, 923.
175. See notes 72, 127, supra.
188. The products of the AI software would not be in existence but for the prior
original expression of a human programmer. This expression is embodied in the AI
software. Because the original programmer can own the copyright to such software if
he meets the requirements of the Act, one could argue that ownership of the copyright
to the products of this software should be traced back to this original programmer
because the products are "derived" from his copyrighted work. See note 183 and ac-
companying text, supra.
189. WEBSTER'S NEW COLLEGIATE DICTIONARY 303 (1981) speaks of "source" or "ori-
gin" when attempting to define the meaning of "derived." The Act defines a derivative
work as a "work based upon one or more pre-existing works, such as a translation,
musical arrangement, dramatization, fictionalization, motion picture version, sound re-
cording, art reproduction, abridgement, condensation, or any other form in which a
work may be recast, transformed, or adapted." 17 U.S.C. app. § 101 (1976).
190. For instance, in Type I AI programs, the code generator source program re-
mains unaffected by the creation of the product program. Similarly, in Type II situa-
tions, the output story bears no resemblance as a whole to the source code of RACTER
or to the contents of the various vocabulary data files used by RACTER to write the
story. Thus, in a copyright sense, the products of Type I and Type II AI programs are not
"derived" from the underlying AI program because they bear no resemblance to it.
The AI program creates a product distinct from its own expression format.
191. See note 190, supra.
192. Id.
193. See notes 189 and 190, supra.
194. Id.
195. Courts often have to determine which contending party owns how much, if
any, of a given copyright. The question is usually one for the fact-finder to determine
the quantity and quality of the contribution of each "author" to the given work. The
court will also look at all contracts, agreements, customs and other facts and circum-
stances when determining the distribution of "ownership" of the copyright. Under this
proposed solution, the "authorship" requirement under the Act would be presumed
met in Type I and Type II AI product situations whenever human contribution is found
to be trivial. The court would then look to the other requirements under the Act to
allow or disallow copyright in a given case. If copyright were found to exist, the court
would be able to "assign" the rights of ownership to the humans who, given all the
facts, are most deserving.
196. See notes 114, 115, 127, 136 and accompanying text, supra.
197. Id.
198. See generally L. PATTERSON, supra note 148.
199. Id.
VI
Solution: A Proposed Interpretation of Section
102 of the Copyright Act
To assist the courts in creating a presumption of a fictional
human author whenever a product of Type I or Type II AI pro-
grams evidences insufficient human contribution to satisfy
copyright requirements of "originality" and "authorship," sec-
tion 102 of the Act200 should be interpreted by the court as if it
incorporates the following passage:
In determining the copyrightability of expressions wholly or
partly produced by computer software and which are appar-
ently thoughtful or indistinguishable from those produced by a
human author, human authorship will be presumed and "au-
thorship" and "originality" requirements of this Act will be
deemed satisfied.201
This interpretation would enable courts to presume authorship
in AI product situations and allow time for analyzing the other
requirements of the Act to determine copyright availability in
a given situation. If all copyright requirements were met, then
the policy goals of copyright would adequately be served be-
cause protection would be granted. The court would have to
determine the correct apportionment of copyright ownership
rights among the owner of the underlying AI software, the
owner of the machine and the problem-specifier.202 Assessing
the economic interests of human parties, the court would be on
familiar ground. The traditional integrity of the idea/expres-
sion dichotomy would be maintained in copyright law, and the
human focus of jurisprudence would remain inviolate.
200. See, e.g., discussion in Apple Computer, Inc. v. Franklin Computer Corp., supra
note 114 (the court developed a "Seldon-Taylor" doctrine of copyright wherein expres-
sions to be given protection must have an underlying explanatory purpose or be at-
tempting to communicate in some way. Finding the expressions fixed in ROM to be, at
most, communication from a man to a machine, the court found programs stored in
ROM to be non-copyrightable expressions). See also Alfred Bell & Co. v. Catalda Fine
Arts, Inc., 191 F.2d 99 (2d Cir. 1951) (concept of inadvertent expressions (i.e. natural
effects of lightning strike caught in nature photograph) which are not planned for nor
created by the human, yet are copyrightable if the human "adopts" such expressions
as his own. Note, however, that in adopting the expression as his own work, the
human puts forth the minimal degree of "intellectual effort" needed to satisfy the Act).
201. See notes 195 and 200, supra. This "interpretation" could also be statutorily
enacted.
202. See note 195, supra.
VII
Conclusion
Throughout its history, federal copyright law has flexed
under the pressures of technological advancement. Develop-
ments in artificial intelligence will soon place even more ten-
sion on the scope and application of its traditional
requirements. Code generators, problem solving languages
and story-writing programs all act as harbingers of more pow-
erful, creative and economically important innovations soon to
come. Although dealing with many issues germane to the ad-
vent of computer technology, CONTU did not adequately ad-
dress the looming onslaught of AI based products and
creations. Finding that extension of copyright protection to AI
software work products is advantageous for the same reasons
copyright is extended to normal, human produced literary
works, this note has analyzed the threat AI developments pose
to copyright concepts of authorship and originality and has dis-
cussed several alternative solutions to afford copyright protec-
tion. The solution least destructive to traditional copyright
concepts presumes a fictional human author and then "as-
signs" copyright to the creator of the AI software, the problem-
specifier or the computer owner, as the court sees fit.
It should be emphasized that the solution proposed in this
note is a stop-gap measure. As AI develops more fully, the
ability to sidestep the tremendous impact such advances will
have on the legal system will diminish significantly. Man will
confront "thinking machines," and the impact will be unprece-
dented. The technical feasibility of artificial intelligence has
been shown at a relatively primitive level, but "[o]nce artificial
intelligences start getting smart, they're going to be very smart
very fast. What's taken humans and their society tens of
thousands of years is going to be a matter of hours with artifi-
cial intelligences."203 We must prepare for "the possibility that
we may have to undergo still another redefinition of ourselves
as a species, another Copernican revolution that will move us
further yet from the center of the universe."204