The Next Wave: Liberation Technology

In everything from course management to big enterprise
systems, universities must choose between monopolies and
the open approach

BY JOHN M. UNSWORTH

http://chronicle.com/prm/weekly/v50/i21/21b01601.htm

The Chronicle of Higher Education

If the nineties were the e-decade (e-commerce, e-business,
e-publishing, eBay, E*Trade, etc.), the aughties are the
o-decade (open source, open systems, open standards, open
access, open archives, open everything). This trend, now
unfolding with special force in higher education, reasserts
an ideology, a meme, that has a continuous tradition
traceable all the way back to the beginning of networked
computing (in fact, as far back as Thomas Jefferson’s famous
defense of the principle that "ideas should freely spread
from one to another over the globe"). Call this meme
Liberation Technology. It has recently been adopted by some
venerable institutions — not only by some of the great
public and private universities, but also by major private
foundations — and it means business.

Since the beginning of Internet time and before, Liberation
Technology has been intertwined with and opposed to another
ideology. Call it Command and Control. You see Command and
Control at work in the military roots of the Internet, in the
Recording Industry Association of America’s prosecution of
file-sharing college students, and in Microsoft’s doubly
possessive and oddly revealing slogan ("your potential, our
passion"). Liberation Technology wants to keep information
free; Command and Control wants to make the Internet safe for
private property.

To be sure, not all proprietary operations oppose open
inquiry, but the key to the business success of open-source
products like Linux is that they allow people to make money
by selling them, without allowing the seller exclusive
control. Especially with information goods, the notion of
nonexclusive commercial rights is key.

In the Early Days of the Web, Public Good vs. Property Rights

By the early 1990s, the Internet was expanding rapidly, going
from one thousand hosts in 1984 to one million in 1992, and
new, more sophisticated applications were appearing, like
Gopher (1991) and the World Wide Web (the first Web server in
the United States was set up in 1991, with Mosaic, the first
graphical Web browser for personal computers, coming along in
1993). Throughout the 1990s, university faculty members and
students outside of computer science were gradually becoming
aware of the existence of the Internet, largely because of
the Web; so was the rest of the world, for the same reason.

In retrospect, it’s difficult to comprehend the rapidity with
which the Web went from an obscure science experiment to a
fact of daily life, but it took only about three years. By
late 1994, the World Wide Web Consortium was founded to take
over managing Web protocols and their development and to
ensure that the Web would remain a nonproprietary public
good. In 1996, the consortium presented the first draft of
XML (Extensible Markup Language, the encoding format that is
now used for exchanging text and many other kinds of data on
the Web); XML 1.0 became an official W3C recommendation in
1998.
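
To make the exchange concrete, here is a minimal sketch, in
Python, of one system writing a record as XML and another
reading it back. The element names are invented for
illustration and belong to no particular standard.

    import xml.etree.ElementTree as ET

    # Build an XML document describing a course record.
    # (The element names here are hypothetical.)
    course = ET.Element("course", id="hum101")
    ET.SubElement(course, "title").text = "Introduction to the Humanities"
    ET.SubElement(course, "instructor").text = "J. Doe"

    # Serialize to text: this is what travels between systems.
    payload = ET.tostring(course, encoding="unicode")

    # Any other XML-aware application can parse the same text.
    parsed = ET.fromstring(payload)
    print(parsed.find("title").text)  # Introduction to the Humanities

Because the two sides agree only on the format, not on each
other’s internals, either side can be replaced without
breaking the exchange.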

In distinct contrast to that ethos, with its focus on the
public good, an aggressive campaign began in the late 1990s
to expand the property rights of "content providers," in
legislation like the Digital Millennium Copyright Act and the
Sonny Bono Copyright Term Extension Act (both passed in 1998)
and in case law arising out of the Recording Industry
Association of America’s suit against Napster in 2000. Mixed
in there was the Microsoft antitrust case, initiated in 1998
under the Clinton administration, first decided against
Microsoft, overturned on appeal, and eventually settled,
quite favorably for Microsoft, by the Bush administration in
2001.

Against that backdrop, during the 1990s all over the United
States universities became big IT consumers, not just in
computer science or in the sciences, but increasingly in all
disciplines, on every part of campus, for all kinds of
services. As they came to rely more, and more broadly, on
networked information in teaching, research, and
administration, universities turned away from the strategy of
meeting their own specialized needs with homegrown software
and began to license more commercial products.

They also began to be seen, for the first time, as a
profitable market for commercial IT products and services.
WebCT and Blackboard, for example, both appeared on the scene
in 1997 and over the next few years they signed up hundreds
of university clients for "e-learning" systems to put courses
online, do grading online, accept homework assignments
online, etc. On the administrative side, beginning in the
mid-1990s, enterprise-resource-planning (ERP) systems from
vendors like PeopleSoft and Oracle — for managing payroll,
student records, human resources, purchasing, etc. — began
to find a market in universities, partly built on the fear
that Y2K would wreak havoc on older, usually homegrown,
systems that had hitherto been performing those functions,
often successfully, often for years.

Universities also got caught up in the Internet bubble —
that combination of greed, optimism, and willful ignorance of
history that led us to believe that information technology
would create a permanent bull market. In the heady days at
the turn of the millennium, Columbia University, to take only
one of many possible examples, plowed millions into launching
Fathom, a for-profit online content provider for e-learning,
confident that such a foray into the commercial sector would
turn a handsome profit for the stakeholders, which included
not just Columbia, but the London School of Economics and
Political Science, the New York Public Library, the
University of Chicago, the University of Michigan, and
others.

Sometime in 2000, though, the pendulum started swinging the
other way, beginning, perhaps, in reaction to failures such
as Fathom’s. In his annual report for 2000-2001, the
president of the Massachusetts Institute of Technology,
Charles M. Vest, succinctly articulated a return to the
original ideology of the Net when he announced MIT’s
OpenCourseWare (OCW) project to make primary materials of its
courses available online — free. As he noted, "inherent to
the Internet and the Web is a force for openness and
opportunity that should be the bedrock of its use by
universities."

Vest’s report is not the source of the trend that is now
unfolding, but it is certainly a document that crystallizes a
historical moment. It is significant for another reason, too:
It is emblematic of what’s changed in this iteration of
Liberation Technology.

Course Management, Portals, and Enterprise Systems

This time around, the ideas are being advanced not by ragtag
communitarians, but by major institutions, with substantial
backing not just from MIT, but from a number of other
universities as well, and not just from universities, but
from corporations, foundations, and government agencies at
home and abroad.

In MIT’s case, support comes from the institution itself and
also from two major private foundations, the William and
Flora Hewlett Foundation and the Andrew W. Mellon Foundation.
On a first visit, the MIT site for OCW looks a little longer
on structure than substance. If you dip at random into
courses, you may see mostly syllabi, perhaps some exercises,
and a list of assigned readings, but not the readings
themselves (leading you to wonder how the effort is going to
provide new educational opportunities in the developing
world, as claimed). But on further investigation, you’ll find
that some courses have the complete text of every lecture (in
PDF), and others have full-length videos of every lecture (at
three different resolutions for slow, medium, and fast
connections). At that point, MIT’s claim to be the first
open-source university begins to seem more plausible.

MIT can’t give away the readings in its courses — in most
cases, textbooks and articles that come from commercial
publishers — but it can give away the intellectual property
created by its own faculty members, and that’s what it’s
doing. As with the open-source-software movement from which
it drew inspiration, it permits the reuse, modification, and
redistribution of content. Unlike open-source software,
however, it prohibits doing any of those things for
commercial purposes.

That distinction is important, and it is key to understanding
the doctrinal differences among open-source sects. Beginning
in the early ’80s, the innovation of the free-software
movement (the direct ancestor of open source) was to argue
that users should have the freedom to modify source code, but
could sell the results, as long as the source code for the
modified version remained available for modification. Those
terms are codified in the GNU General Public License.

Since then, other variants of open-source licensing have
emerged. MIT’s materials in OCW are covered by a different,
newer license, developed by the Creative Commons project,
an effort led by Lawrence Lessig, who set up Stanford
University’s Center for Internet and Society, with support
from Hewlett, Stanford and Harvard Universities’ law schools,
and others (including the philanthropic group Center for the
Public Domain). The Creative Commons license allows copying
and redistribution, but also allows the content creator a set
of options with respect to attribution, commercial use, and
modification of the work. The Creative Commons license is
inspired by GNU, but also informed by a somewhat broader
perspective, in that it is intended to cover creative work
other than software.

Though legal variants of open-source licenses do exist, at a
technical level, open systems require that everyone who
designs or modifies the systems does so under the same set of
rules. In the case of online courseware, content, and tools,
the IMS Global Learning Consortium is providing some
important common ground on which to coordinate a very broad
range of specifications. One of the partners in that effort
is another "open" entity, called the Open Knowledge
Initiative, or OKI. That effort, financed by the Mellon
foundation, is based at MIT with Stanford as a principal
partner and supported by a number of major universities. It
describes itself as "an open and extensible architecture that
specifies how the components of an educational software
environment communicate with each other and with other
enterprise systems." The goal is to liberate universities
from having to choose a single software solution for managing
online instruction and/or online components of classroom
instruction. The result would be greater portability of
content, greater flexibility in choosing and assembling
elements of a learning- management system, and a shift in the
balance of power between the client (the university) and the
software vendor, in favor of the client.

Universities — or open-source developers at large — could
choose to produce and share their own modules for things like
calendars, gradebooks, etc. Commercial vendors could also
continue to build and sell proprietary solutions that adhered
to the architectural specification (and that, therefore,
allowed users to unplug some of the vendor’s modules and plug
in some of their own, or some from another vendor). That
speaks directly to the practice of monopolistic "bundling"
that was at the heart of the antitrust case against
Microsoft.
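
As a rough sketch of what coding to such a specification
looks like (illustrative Python, not OKI’s actual interface
definitions), a course-management system written against a
shared gradebook interface can accept a homegrown module or a
vendor’s interchangeably:

    from abc import ABC, abstractmethod

    class Gradebook(ABC):
        """A minimal, hypothetical interface of the kind an
        architectural specification might define."""

        @abstractmethod
        def record_grade(self, student, item, score):
            ...

        @abstractmethod
        def grades_for(self, student):
            ...

    class HomegrownGradebook(Gradebook):
        """One interchangeable implementation; a vendor could
        ship another behind the same interface."""

        def __init__(self):
            self._book = {}

        def record_grade(self, student, item, score):
            self._book.setdefault(student, {})[item] = score

        def grades_for(self, student):
            return dict(self._book.get(student, {}))

    def post_homework(gradebook, student, score):
        # Written against the interface, not the implementation,
        # so swapping modules requires no change here.
        gradebook.record_grade(student, "homework-1", score)

The point is the seam: as long as both modules honor the
specification, the university, not the vendor, decides which
one to plug in.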

As with any standard, success will depend on whether both
vendors and users buy into it. That is not yet a certainty
with OKI, but in May 2002 Blackboard announced its intention
to adopt the OKI architecture. In October 2002 OKI announced
that it had joined in an informal consortium with other
"leading organizations developing specifications for e-
learning technology in higher education ... to coordinate
strategy and conduct common activities."

While the OKI project aims at specifying an architecture for
online learning systems, and MIT’s OCW is focused on content
for such systems, another open-source effort, the Sakai
Project, focuses on educational software tools. According to
the Sakai Web site, the project hopes to "demonstrate the
compelling economics of ’software code mobility’ for higher
education, and it will provide a clear road map for others to
become part of an open-source community." Sakai is a
collaboration among Indiana University, MIT, Michigan, and
Stanford, which will begin using its tools in 2004.

Another partner in Sakai is the open-source project uPortal.
A number of other universities (in the United States and
abroad) and for-profit companies (Sun Microsystems, SCT,
Interactive Business Solutions) are involved in developing
uPortal. Once again, the Mellon foundation is helping to
support the project.

Portals can do more than integrate news and weather, or
library and course information. They can also integrate the
administrative-computing functions of the university, such as
student records, payroll and human resources, and purchasing.
Interestingly, but perhaps not surprisingly, one of the
corporate sponsors of uPortal is SCT, a company whose
interests could be threatened, or at least significantly
reoriented, if uPortal achieves the success for which it
seems destined. SCT provides a "solution" called Banner, one
of those enterprise-resource-planning products mentioned
above.

Over the past few years, universities have spent hundreds of
millions of dollars to acquire, customize, and make the
transition to such systems, often with very mixed results.
The university that now employs me, and the one I worked at
last, are both in the throes of such a transition, probably
too far in to get out, but probably wishing they could.

Admittedly, it’s a huge undertaking to retool an entire
university’s administrative-computing infrastructure and
workflow, and it requires long-range planning and
commitments. An institution makes those plans and commitments
based on the best choices available at the time: Several
years ago, when decisions were being made at the Universities
of Illinois and Virginia, there were no plausible open-
source/open-standards ERP alternatives, so the universities
bought into monolithic proprietary systems. Now alternatives
are beginning to come into view. It will be years before the
current generation of university ERP adopters can switch to
open-source alternatives, but their experience will certainly
help to make the case for such alternatives as they emerge.

Toward a New Model of Scholarly Communication

There are a number of other pressing IT challenges facing
higher education, and at or near the top of the list are
digital libraries (or, more generally, data repositories).
Those could include data held in an institution’s library
(licensed or locally produced scholarly information), data
held outside the library (by an office of management
information, for example), and/or data published by a
university press.

The case for institutional repositories is laid out
convincingly in an article by Clifford A. Lynch, executive
director of the Coalition for Networked Information,
published in the February 2003 newsletter of the Association
of Research Libraries. Lynch argues that "an institutional
repository is a recognition that the intellectual life and
scholarship of our universities will increasingly be
represented, documented, and shared in digital form, and that
a primary responsibility of our universities is to exercise
stewardship over these riches: both to make them available
and to preserve them."

There are a number of noteworthy "open" initiatives in this
area as well, backed by familiar institutions and financial
supporters. Four very different, possibly complementary,
open-source frameworks for institutional repositories and/or
digital libraries are MIT’s DSpace (supported by Hewlett-
Packard), the Cornell/Virginia Fedora Project (supported by
the Mellon foundation), EPrints (supported by the National
Science Foundation and Britain’s Joint Information Systems
Committee), and Greenstone (produced by the University of
Waikato, in New Zealand, and developed and distributed in
cooperation with Unesco and Human Info NGO).

Beyond the individual repository, there is the problem of
federated collections: how to search across repositories, a
long-held dream in digital libraries. The Open Archives
Initiative (OAI) is a project aimed at achieving that goal,
by developing and maintaining standards to facilitate sharing
information. Currently, there are 134 registered OAI
repositories, and you can see a nice working example of
sample searches across many of them on the Web site for the
Perseus Digital Library at Tufts University.
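
The protocol behind this sharing, OAI-PMH, is deliberately
simple: a harvester issues ordinary HTTP requests naming a
"verb" and reads back XML. Here is a minimal sketch in
Python; the repository address is hypothetical, while the
verb and namespace come from the protocol itself.

    from urllib.request import urlopen
    from urllib.parse import urlencode
    import xml.etree.ElementTree as ET

    # Hypothetical base URL; any OAI-compliant repository works.
    BASE_URL = "http://repository.example.edu/oai"

    # An OAI-PMH request is plain HTTP: a verb plus a metadata format.
    params = urlencode({"verb": "ListRecords", "metadataPrefix": "oai_dc"})
    with urlopen(BASE_URL + "?" + params) as response:
        tree = ET.parse(response)

    # Record metadata comes back as Dublin Core elements.
    DC = "{http://purl.org/dc/elements/1.1/}"
    for title in tree.iter(DC + "title"):
        print(title.text)

A federated search service is, at bottom, just a harvester
that polls many repositories this way and indexes the results
in one place.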

The EPrints software mentioned above is the self-archiving
component of a larger project on open access, supported by
the Soros Foundation and marching under the banner of the
Budapest Open Access Initiative, whose purpose is "to make
research articles in all academic fields freely available on
the Internet" — either by institutional self-archiving of
articles that also appear in for-fee journals, or by authors
publishing in open-access (free) journals.

In the American Scientist Open Access Forum (moderated by
Southampton University’s Stevan Harnad), there is a lively,
long-running, and unresolved debate on what open access
means. That debate has been attracting considerable attention
around the world, both within and beyond the academy.

The efforts to promote open access to scholarly research, to
build interoperable digital libraries, and to create
institutional repositories coincide with the broadening
university revolt against the monopolistic bundling strategy
of Elsevier, in which university libraries are required to
subscribe to packages of titles and are locked into multiyear
subscriptions. Faculty members and libraries at Cornell
University, Harvard, North Carolina State University, the
University of California system, and the University of North
Carolina at Chapel Hill have all rejected those tactics in
the last year.

University-press publishers have a golden opportunity here to
distinguish themselves from commercial publishers and join
with libraries and scholars to create a new model of
scholarly communication. To seize the opportunity, though,
university presses will require more capital, cooperation,
and creativity than they seem to be able to muster.

The Battle for the Desktop

Journals, repositories, portals, and ERP systems are the
macro end of IT in higher education; at the micro end is the
individual user’s desktop environment. The desktop has been
Microsoft territory for years, but open-source projects are
cropping up here as well. In September 2003, 25 universities
joined with Mellon to provide funds for Chandler, an open-
source alternative to Microsoft’s Outlook. Chandler is (or
will be) a desktop application for Linux, Mac OS X, and
Windows, combining e-mail, calendars, address books, instant
messaging, and file sharing. It’s being produced by Mitch
Kapor’s Open Source Applications Foundation, and it has two
subtypes: a personal version called Canoga, due out in the
fall of 2004, and a version called Westwood that is
specifically aimed at higher education, due out in the fall
of 2005.

What Chandler brings into focus is the battle for the desktop
between Microsoft and the open-source community. Microsoft
has already seen a serious challenge to its server market
from Linux, but it still has a lock on the desktop, in spite
of a much-improved Macintosh operating system and the
persistence of efforts like OpenOffice, which provide an
open-source alternative to Microsoft’s word-processing,
spreadsheet, and presentation software. Kapor estimates that
it will be 2007 before Linux makes significant inroads here.
Still, Microsoft is clearly already worried about its
dominance, as one can see from a series of leaked Microsoft
memos on how to combat Linux (the so-called Halloween
documents), available in annotated form on the Open Source
Initiative’s Web site.

More immediately, there are some noteworthy open-source
developments in the collaborative creation of content. One is
a courseware project from Rice University called Connexions,
which converts "raw knowledge" into self-contained modules
of information and places them in a commons, to be used,
reused, updated, and adapted. It is designed to highlight the
nonlinear "connexions" among concepts both within the same
course and, more important, across courses and disciplines.
It is open source and based on open standards (XML), and has
support from the Hewlett foundation.

Another, simpler and more general-purpose collaborative tool
that’s become quite popular in the last couple of years is
Wiki, a Web-based platform for collaboration that comes in a
variety of open-source incarnations. Perhaps the most robust
and widely used is TWiki. Using any Web browser, you can
directly edit any Wiki page, add links automatically, group
pages, search pages, attach files, track revisions, control
access at the individual or group level, and so on. TWiki,
which is just one type of Wiki, has hundreds, probably
thousands, of installations, not only in higher education,
but in corporate intranets at places like Disney, British
Telecom, Motorola, SAP, and others.
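
The conventions underlying wikis are strikingly lightweight.
As a toy example (a simplified sketch in Python, not TWiki’s
actual code), the classic rule that turns CamelCase
"WikiWords" into links automatically amounts to a single
pattern:

    import re

    # Classic wikis link any CamelCase "WikiWord" automatically.
    WIKIWORD = re.compile(r"\b([A-Z][a-z]+(?:[A-Z][a-z]+)+)\b")

    def autolink(text):
        return WIKIWORD.sub(r'<a href="/wiki/\1">\1</a>', text)

    print(autolink("See ProjectPlan and MeetingNotes for details."))
    # See <a href="/wiki/ProjectPlan">ProjectPlan</a> and ...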

Combined with something like LionShare, Wikis could provide a
powerful tool for collaboration in academe, one that could
change teaching, project management, the work of professional
societies, and many other activities. LionShare (another
Mellon-financed project) is essentially peer-to-peer
networking with authentication. Peer-to-peer networking is
the technology underlying demonized post-Napster software
like KaZaA, but it also has less well-known applications in
things like videoconferencing. LionShare’s addition of
authentication makes it legitimate for a broad range of
applications in institutional settings.

Choice and Compatibility With Commercial Software

The university-based open-source projects described here have
in common two key characteristics: unbundling and
interoperability. Those strategies are inherent to open-
source software development, but have also proved compatible
with commercial software development. They are hostile only
to monopolistic practices.

Unbundling and interoperability are important because they
provide choice and flexibility. Instead of being locked into
a single application or suite of applications from a single
vendor, you can choose to mix different applications to
achieve the best performance for your particular purposes, at
the best price. For the end user, that means that you can use
a word processor from one place, a collaboration tool from
another, an e-mail client and an address book from somewhere
else, and exchange data among all of them using open
standards to which all adhere.

At the other end of the spectrum, in administrative computing
or digital libraries, it means that you can use a database
engine from one vendor, a portal kit from someone else, a
Wiki for managing projects and discussions. When something
better comes along for one of those functions, you can swap
out that piece, rather than waiting until the whole system is
intolerably outdated, and then undergoing vast,
enterprisewide transition from one monolithic system to
another.

On a broader level, what’s noteworthy in the various threads
of the trend assembled here is the concerted effort of a
handful of private foundations, working with public (and some
private) universities, to promote self-determination in
higher education’s use and development of information
technology. Most of the examples I’ve cited have been
supported by two foundations, Hewlett and Mellon. Both
foundations give to things other than higher education and,
within higher education, both give to things other than IT
projects. Yet they clearly are having substantial impact on
the information infrastructure of the 21st-century
university, and the projects they are helping get under way
will liberate it from Information Property monopolies and IT
monocultures. They’ve achieved those results by emphasizing
long-term sustainability of projects and by adopting and
promoting the open-source ethos of shared goals, shared work,
and shared results.

Open-source methodology has already spread well beyond
software development: In the world at large, the Human Genome
Project is a famous example. Over the coming decade we’re
certain to see this new mode of production locked in mortal
combat with older methods and the legal and ideological
commitments that they entail. It will be interesting to see
whether, at this critical juncture, the university comes down
on the side of freely shared ideas.

With a little help from its friends, it just might.

[John M. Unsworth is dean of the Graduate School of Library
and Information Science at the University of Illinois at
Urbana-Champaign. He is departing president of the
Association for Computers and the Humanities and is chairman
of the American Council of Learned Societies’ 2004 Commission
on Cyberinfrastructure for the Humanities and Social
Sciences.]