All posts by Matthew Sag

About Matthew Sag

Technology enthusiast, law professor, copyright and internet law specialist.

HathiTrust Wins on Fair Use, and just about everything else

Landmark Fair Use Win

Yesterday, District Judge Harold Baer, Jr., handed down his decision in Authors Guild v. HathiTrust, a case that spins out of the long-running Google Books dispute. The decision is a landmark win for the HathiTrust, the University defendants, people with print-disabilities, Google, the Digital Humanities and, I would argue, for humanity in general.

Essential Background

The HathiTrust is a digital repository of millions of scanned university library books that became available to various universities through the Google Books project. About three-quarters of the books are still in copyright. In 2011 HathiTrust announced plans to embark on an innovative orphan works program (OWP), but dropped (or at least shelved) the plan soon after in light of criticism of its implementation. Spurred into action by the OWP, in September 2011 the Authors Guild filed a copyright lawsuit against HathiTrust, five universities, and multiple university officials.

The Authors Guild suit alleged that library digitization for any purpose amounts to copyright infringement. The purposes specifically under attack in this case were (i) preservation; (ii) enabling non-expressive uses such as word searches; and (iii) facilitating access for persons who are blind or visually impaired.

There is a key fact in this case that media reports will probably get wrong. This is not about scanning books to make extra copies for the public at large. As the Court explained, “No actual text from the book is revealed except to print-disabled library patrons at [University of Michigan].” Authors Guild v. HathiTrust, p 16. This case was about library digitization for three specific purposes: preservation, disabled access, and non-expressive uses such as text searching and computational analysis.

The Score Card

Here is a quick and dirty summary of the key copyright issues:

  • Digitization to provide access for the print-disabled held to be transformative use and, on balance, fair use.
  • Digitization to provide access for print-disabled students held to be (i) an obligation of universities under the ADA, (ii) fair use under section 107 of the Copyright Act and (iii) enabled by section 121 of the Copyright Act.
  • Section 108 of the Copyright Act was held to expand the rights of libraries, not to limit the scope of their fair use rights in any way, shape or form. Given that the text says “Nothing in this section . . . in any way affects the right of fair use as provided by section 107,” any ruling to the contrary would have been pretty shocking.
  • Digitization to create a search index held to be a transformative use, and, on balance, fair use.
  • Alleged security risks created by library digitization — dismissed as speculative and unproven. The judge noted the strong evidence to the contrary. It is still an open question whether the risk of a subsequent illegal act by a third party could ever render an initially lawful copy not fair use. The whole notion strikes me as rather odd.
  • The market effect of library digitization — the court found there was none to speak of in this case. The court rejected the CCC’s magic toll-booth arguments — i.e., there were some wild assertions about future licensing revenue that the court rejected as “conjecture”.
  • The court also notes that a copyright holder cannot preempt a transformative market merely by offering to license it.
  • The market effect of enabling print-disabled access to library books — the court found there was no market for this under-served group, nor was one likely to develop.

Did the Authors Guild win anything?
Not really, but two issues could have been even worse.

  • The court held that the issue of the Orphan Works Program was not ripe for adjudication. This was inevitable in my opinion, but the judge could have added unfavorable dicta indicating that the AG had no case here either. Wisely, the judge said only what needed to be said.
  • On the issue of library digitization for the purpose of preservation, the court found that the argument that preservation on its own is transformative “is not strong.”

The Digital Humanities

The court appeared to accept the arguments in the Digital Humanities amicus brief, written by Matthew Jockers, Jason Schultz and myself with the assistance of many others. The brief extended arguments I made in Orphan Works as Grist for the Data Mill, 27 Berkeley Technology Law Journal (forthcoming), and Copyright and Copy-Reliant Technology, 103 Northwestern University Law Review 1607–1682 (2009).

Following Second Circuit precedent, the court explained that

“a transformative use may be one that actually changes the original work. However, a transformative use can also be one that serves an entirely different purpose.”

The court concluded that

“The use to which the works in the HDL are put is transformative because the copies serve an entirely different purpose than the original works: the purpose is superior search capabilities rather than actual access to copyrighted material. The search capabilities of the HDL have already given rise to new methods of academic inquiry such as text mining.”

The court even cites an illustration from our brief!

“Mass digitization allows new areas of non-expressive computational and statistical research, … One example of text mining is research that compares the frequency with which authors used “is” to refer to the United States rather than “are” over time. See Digital Humanities Amicus Br. 7 (“[I]t was only in the latter half of the Nineteenth Century that the conception of the United States as a single, indivisible entity was reflected in the way a majority of writers referred to the nation.”).”

Google Ngram Visualization Comparing Frequency of “The United States is” to “The United States are”

You can reconstruct the figure on Google Ngram yourself!
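The underlying computation is simple enough to sketch. Here is a minimal, purely illustrative Python example of the kind of non-expressive phrase-frequency analysis the brief describes; the text snippets below are invented stand-ins for slices of a digitized corpus, not real data.

```python
import re

def phrase_frequency(text: str, phrase: str) -> int:
    """Count non-overlapping, case-insensitive occurrences of a phrase."""
    return len(re.findall(re.escape(phrase), text, flags=re.IGNORECASE))

# Hypothetical snippets standing in for two time-slices of a corpus.
corpus_1850 = (
    "The United States are bound by treaty. "
    "The United States are a confederation of sovereign states."
)
corpus_1950 = (
    "The United States is a global power. "
    "The United States is committed to the alliance. "
    "The United States are, in older usage, plural."
)

for label, text in [("c. 1850", corpus_1850), ("c. 1950", corpus_1950)]:
    singular = phrase_frequency(text, "the United States is")
    plural = phrase_frequency(text, "the United States are")
    share = singular / (singular + plural)
    print(f"{label}: share of singular 'is' = {share:.2f}")
```

A real study would run the same counts over millions of scanned volumes bucketed by publication year — note that the analysis never exposes any expressive text to a reader, which is the point of the non-expressive use argument.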

The court also cites our brief for the proposition that the use of metadata and text mining “could actually enhance the market for the underlying work, by causing researchers to revisit the original work and reexamine it in more detail.”

Non-expressive use is fair use

The court did exactly what the amicus briefs urged it to do. As Matthew Jockers, Jason Schultz and I argued in Nature last week (Digital Archives: Don’t Let Copyright Block Data Mining, 490 Nature 29-30 (October 4, 2012)):

“It is time for the US courts to recognize explicitly that, in the digital age, copying books for non-expressive purposes is not infringement.”

Courts have already applied this logic in internet search engine cases and in a case involving plagiarism detection software. As we hoped, Judge Baer’s ruling demonstrates that digitization for text mining and other forms of computational analysis is, unequivocally, fair use.

“Plaintiffs assert that the decisions in Perfect 10 and Arriba Soft are distinguishable because in those cases the works were already available on the internet, … I fail to see why that is a difference that makes a difference.”

This was not a close case

“Although I recognize that the facts here may on some levels be without precedent, I am convinced that they fall safely within the protection of fair use such that there is no genuine issue of material fact. I cannot imagine a definition of fair use that would not encompass the transformative uses made by Defendants’ MDP and would require that I terminate this invaluable contribution to the progress of science and cultivation of the arts that at the same time effectuates the ideals espoused by the ADA.”

 

A significant win for the National Federation of the Blind

My focus in this case has always been on the technological side; that is where my academic interest lies. However, the most important issue in this case is not search engines, the digital humanities or non-expressive use; it is reading, humanity and expressive use. I am, of course, referring to those aspects of the decision relating to fair use and persons with disabilities.

“[m]aking a copy of a copyrighted work for the convenience of a blind person is expressly identified by the House Committee Report as an example of a fair use, with no suggestion that anything more than a purpose to entertain or to inform need motivate the copying.”

As Kenny Crews summarizes:

“The opinion provides a strong opinion about fair use as applied to serving persons with disabilities, especially when an educational institution is mandated to serve needs under the Americans With Disabilities Act.  The court goes further and resolves a long-time quandary that arose under Section 121 of the Copyright Act.  That statute permits an “authorized entity” to make formats of certain works available to persons who are visually impaired.  An “authorized entity” is one that has a “primary mission” to serve those needs.  Libraries and universities have many functions, so is that service a “primary mission”?  The court said yes.”

 


Google Book Search: Digital Humanities still needs answers

Google has settled with the publishers, but not the Authors Guild. This is good news for the Digital Humanities because it means that we may still get a substantive ruling on the big fair use question underlying the entire litigation.

Human life is short, none of us can hope to read more than a smattering of the literary record, but fortunately massive digitization efforts like those undertaken by Google allow scholars to apply large-N computerized methods to millions of works. Computational and statistical analysis of literature will be a big part of humanities research for years to come. However, legal actions like those of the Authors Guild could bar scholars from studying as much as two-thirds of the literary record.

In a comment published in Nature today [paywall] [Nature Vol. 490, pages 29–30 (04 October 2012) doi:10.1038/490029a], Matthew Jockers (an English professor), Jason Schultz (a law professor) and I (also a law professor) explain why the Association for Computers and the Humanities and a large group of scholars chose to file an amicus curiae brief on behalf of the digital humanities in the Authors Guild v. Google and Authors Guild v. HathiTrust cases.

In the brief we explain why U.S. courts should recognize that copying books for non-expressive purposes is not infringement.

My view is that the settlement between Google and the publishers makes such a ruling more likely because it provides further evidence that the ability to make non-expressive uses of copyrighted books works hand in hand with the commercialization of expressive uses, which is what copyright law is all about.

For more on this topic, see http://matthewsag.com/projects/google-book-copyright-the-digital-humanities/

 

 

Google Books, Settled and Unsettled

According to Reuters, Google and the Association of American Publishers (AAP) have reached a settlement in the long-running Google Book Search Litigation. Details remain sketchy. The settlement does not affect Google’s current litigation with the Authors Guild.

Nature has just published a comment piece by Matthew Jockers, Jason Schultz and myself explaining why humanities scholars filed amicus briefs in the Authors Guild v. Google and Authors Guild v. HathiTrust lawsuits. These suits are still very much alive and it is not clear that the Authors Guild has the same incentives to settle as the AAP did.

Additional Links:

  • Joint Press Release
  • Techdirt comment that this is exactly what Google offered 7 years ago. “Basically, this settlement is AAP admitting that the entire lawsuit was a waste of time and money.”
  • James Grimmelman’s summary. “the settlement does not change the situation on the ground in any significant way”
  • Andrew Albanese of Publishers Weekly quotes AAP president Tom Allen saying “[we worked] out an arrangement that doesn’t resolve the legal issues. We agree to disagree on those, but as a practical matter, it does resolve our differences with Google.”

 

The origins of fair use

I have just added a page to this website devoted to the history of fair use. As I note in my article The Pre-History of Fair Use, 76 Brooklyn Law Review 1371-1412 (2011), fair use does not begin with early American cases such as Folsom v. Marsh in 1841, as many accounts assume. The fair use doctrine began over a century earlier, when English courts were considering issues of republishing and abridgment — the remix culture of the 1700s.

My main points are

  • Copyright has always involved some balancing between authors’ rights and users’ rights. Fair use is part of the legal tradition of every country that traces its copyright law back to the Statute of Anne.
  • Fair use did not take away from authors’ rights; it made it possible for the courts to take a purposive reading of the copyright act that actually expanded authors’ rights.

Global Research Network on Copyright Flexibilities in National Legal Reform Meeting in DC

I am in DC today at the Global Research Network on Copyright Flexibilities in National Legal Reform Meeting.

Copyright reform is under active discussion at the national level in numerous countries. The goal of the Global Research Network on Copyright Flexibilities in National Legal Reform is to produce draft language for a flexible limitation and exception that could be included in national legislation. We expect to offer this language, which may include more than one model provision, to legislators and civil society advocates in countries contemplating copyright reform. Additionally, we aim to develop an online “tool kit” to assist these deliberations.

Brands, Competition, and the Law

On October 19, the Institute for Consumer Antitrust Studies is co-hosting a conference on Brands, Competition, and the Law along with University College London.  This is the follow-up to a very successful program on the same theme in London in December 2011.  A book with selected papers and comments from these conferences will be forthcoming.

We have assembled an all-star lineup of economists, marketing and branding professionals, and antitrust and IP lawyers and professors to try to reach a common understanding of the meaning and impact of brands in the marketplace and the appropriate legal regime. The full details and registration information for the conference are available at http://www.luc.edu/law/academics/special/center/antitrust/brands_competition_law.html.

The speakers include: Deven Desai, Kirsten Edwards-Warren, Phil Evans, Warren Grimes, Greg Gundlach, James Langenfeld, Ioannis Lianos, Deborah Majoras, Mark McKenna, John D. Mittelstaedt, John Noble, Barak Orbach, Joan Phillips, Matthew Sag, Eliot Schreiber, and Spencer Weber Waller.

Our Robot Overlords

There is a great story today on io9.com illustrating just why automatic copyright filtering can never be a complete solution to online copyright issues. In short,

Dumb robots, programmed to kill any broadcast containing copyrighted material, had destroyed the only live broadcast of the Hugo Awards.

Apparently, a licensed clip from Dr. Who (which would have been fair use even if it had not been licensed) triggered the filtering software and exterminated the webcast. Companies like Ustream are of course free to implement whatever dumb software they like, but if filtering becomes the norm we will all be subject to prior restraint by mindless automatons. I, for one, do not welcome our new robot overlords.

Australia, Copyright and the Digital Economy

The Australian Law Reform Commission has just published a thought-provoking issues paper on Copyright and the Digital Economy, ALRC Issues Paper 42, August 2012.

The ALRC has been charged with considering whether existing exceptions under Australian law are appropriate and whether further exceptions should:

  • recognize fair use of copyright material;
  • allow transformative, innovative and collaborative use of copyright materials to create and deliver new products and services of public benefit; and
  • allow appropriate access, use, interaction and production of copyright material online for social, private or domestic purposes.

This is not the first time that Australia has considered adopting a more open-ended approach to copyright limitations and exceptions. The 2005 “Fair Use Review” by the Attorney-General’s Department also looked at the appropriateness of introducing a general fair use exception. That review led to some piecemeal reforms, but left Australia with its complicated labyrinth of exceptions keyed to particular uses of particular types of works under particular circumstances, all subject to a balancing test not unlike the U.S. fair use doctrine.

The new ALRC report makes some interesting observations about how things have changed since the last time this issue was considered.

At pages 78–79, the ALRC notes:

“There has been a noticeable degree of change with respect to technology and social uses of it, even since the Fair Use Review. In its preliminary discussions with some stakeholders and others with an interest in copyright, the ALRC heard that there may now be more of an appetite for a broad, flexible exception to copyright—perhaps based on US-style fair use—than in late 2006.

In January 2008, Barton Beebe’s empirical study of US fair use case law through to the year 2005 was published. (B Beebe, ‘An Empirical Study of US Copyright Fair Use Opinions, 1978–2005’ (2008) 156 University of Pennsylvania Law Review 549). He argued that the results ‘show that much of our conventional wisdom about that case law is mistaken’.

In 2009, [Pamela] Samuelson published her ‘qualitative assessment’ of the fair use case law, which was built upon Beebe’s study (P Samuelson, ‘Unbundling Fair Uses’ (2009) 77 Fordham Law Review 2537). Samuelson has argued that ‘fair use is both more coherent and more predictable than many commentators have perceived once one recognizes that fair use cases tend to fall into common patterns’.

Earlier in 2012, Matthew Sag published his work that built upon these two studies (M Sag, ‘Predicting Fair Use’ (2012) 73 Ohio State Law Journal 47).  He went further than Samuelson and ‘assesse[d] the predictability of fair use in terms of case facts which exist prior to any judicial determination’. He argued that his work demonstrates that “the uncertainty critique is somewhat overblown: an empirical analysis of the case law shows that, while there are many shades of gray in fair use litigation, there are also consistent patterns that can assist individuals, businesses, and lawyers in assessing the merits of particular claims to fair use protection.”

In my view, if Australian companies are going to have a fair chance to compete in the global digital economy, Australia needs to adopt a more flexible approach to copyright exceptions and limitations. When new issues arise in the United States, they are dealt with by the courts. Litigation is far from perfect, but it beats waiting around for a slow-moving, special-interest-beholden, narrowly focused legislative process. More often than not,  Australian copyright law tends to echo the results of United States cases, just with a significant delay that limits the innovation opportunities of Australian companies.

Without a fair use doctrine, Australian innovators need to wait for permission regardless of how fair their intended use might be. In contrast, their American counterparts can back their own judgement as to fair use, and ultimately, if necessary, defend that judgment in court.

Online submissions to the ALRC can be made here. A final Report will then be delivered by 30 November 2013.

more coverage of digital humanities amicus

I just read James Grimmelmann’s amusing and insightful post, Google Books: Even Friends of the Court Have Enemies. He concludes that “The opposition, overall, is a litigation tactic for the sake of tactics; I don’t see how it helps the plaintiffs either substantively or strategically.”

I won’t comment every time James says something worthwhile about the Google Books litigation; it happens far too often.