20 June 1995
The Changing Workplace and the Nature of the Record
Richard E. Barry[1]
This paper was originally delivered as an on-line multimedia computer presentation at the ACA 1995 meeting in Regina, Saskatchewan. Audio and other media effects used in the original version are not included in this document, highlighting one of the complications of multimedia recordkeeping.
Unpublished paper written in preparation for presentation made at ACA Conference, Regina, Canada, June 16, 1995
I. Introduction
For a number of macro- and micro-level reasons, mainly of a global political and economic nature, private and public sector organizations have been transforming themselves, or risking obsolescence or extinction if they do not. Accompanying these developments are significant changes in work patterns and in the ways that information, documents and records are created and used[2]. Workplace changes show no signs of letting up; they appear to have become a way of life, at least for the foreseeable future, as a condition of survival in post-Cold War economic times.
Accordingly, archives and records management (ARM)[3]
and information management and technology (IM&T)[4]
programs and professionals will increasingly be influenced by workplace changes
over which they will have little control but with which they will have to carry
out their mandates. Work pattern changes frequently involve innovative new uses of technology -- and in some cases result from, rather than in, such uses. Because they very often bring with them changes
in the ways that documents are created or represented, ARM and IM&T
organizations will have to join forces to keep pace with and adapt to these
changes.
In
this paper, I will attempt to develop these ideas by drawing from some of my
own experiences in the planning and implementation of IM&T projects, related business systems analysis projects
and, more recently, ARM experiences as well as on my observations of what is
going on in the field of information technology and other related research
areas. I will summarize those experiences chronologically to highlight some of
the ways in which the workplace has been transformed in terms of work patterns,
technology, interests and analytical tools during the past three decades,
drawing lessons along the way, most of them with the clearer vision of
hindsight and, I hope, not too much euphoric recall. Rather than simply leaving it there with another list of
unanswered questions and unrequited concerns, I will take the plunge and offer
both some conclusions for ARM and some
suggestions about what we might do about it all. These are not offered in any prescriptive way but rather as a way
of provoking reactions, debate and better suggestions that, hopefully, will
help all of us to move more rapidly out of the debating mode and into the
“doing” mode. Before getting into the
individual projects, however, I will begin by stating “for the record” my point
of departure with respect to the meaning of ‘recordness’ and make some
observations about technology in a broader sense.
An
important part of the discussion in this paper has to do with the rapidly
emerging area of multimedia documents and records. What used to be strictly in the domain of children’s games is now
beginning to enter the board room.[5] While preparing this paper for presentation
at the ACA 1995 meeting in Regina, I found it difficult to make the necessary
points regarding multimedia without actually demonstrating such forms of
documents and records. That could not be done by simply reading this paper and,
in any case, it was too long to present that way. To facilitate making several
points concerning the impact of different presentation and storage media on ARM
practices, I presented a considerably abbreviated form of this paper as an on-line
computer presentation where the various representation media could be
demonstrated. Now, retrospectively, I
have attempted to deal with the challenge of discussing multimedia forms of
records, at least in part, by incorporating in this paper a few graphics taken
from the multimedia presentation.
II. “Recordworthiness” and “Recordness”
There are others in the room far better qualified than
I am to speak to the subject of the nature of records, including the next
speaker, Trevor Livelton. Moreover, as
will be noted later, in my opinion the nature of records has not yet changed
because of the introduction of IM&T.
However, as my task is to reflect on the impact of IM&T on ARM, I
feel that I should begin by indicating my understanding of the term “record”
and some prejudices I harbor about
that. I will refer to more than one
definition, in part to signal some of my own views early on in this
presentation. I first drew from a
traditional ARM definition given in the UN ACCIS report of 1990 which cited an
earlier definition of the International Council on Archives.[6]
In this definition, the word record means:
Any recorded
information, regardless of form or medium created, received and maintained by
an agency, institution, organization or individual in pursuance of its legal
obligations or in the transaction of business.[7]
What
I now find missing from that definition
is the concept of use of records --
the notion that records are kept because there is a presumption of future use.
This made me realize that the ICA/ACCIS definition is incomplete, if it hasn’t
since been updated. I then turned to the Australian literature for another
definition that was basically the same, but which added the phrase:
and
subsequently kept as evidence of such activity though incorporation into the recordkeeping system of the organisation
or person[8].
We owe a vote of thanks to Clive Smith and Glenda
Acland for the improved Australian definition which, by drawing from the
Jenkensonian definition, has the concept of subsequent usage of records (as
evidence), if not explicit, at least implicit.
I would like to suggest a further enhancement that would make usage both
more explicit and broader, perhaps
characterized by a further variation, such as:
and kept to
be used for purposes of operational continuity, evidence, accountability,
institutional memory, historical legacy and research.
Part of this may be semantic -- the idea of use was undoubtedly intended to be implied in both definitions. Part
of it may be a question of image. That
is not a trivial matter for us to be concerned with, however. In this day and age of reengineering and
reinventing government -- often code words for downsizing -- people who are
seen simply to be keeping things with no explicit
concept of serving fairly specific client communities are people who
probably should be dusting off their CVs.
In the electronic environment, I believe that records should be captured to the extent possible at the time of creation, and subsequently only when necessary to reflect later actions.
Thus, in the electronic age, I submit that it is more appropriate to
drop the word “subsequent”. Records are
kept to be used and we should
emphasize that -- used immediately and subsequently for the reasons
listed. We might argue that in archival
theory all of the reasons for usage noted above are subsumed in one word: evidence. Yet, it may be important to
highlight and differentiate the uses of records because they reach out and
speak to different client communities who should be advocating the case for
ARM. Archivists and ARM programs need
all the allies and advocates we can get these days. People who make significant use of records probably would become
advocates if they felt well served in their own work by IRM programs. I question how successful we have been in
culturing advocates outside of the ARM community with such terms as records or evidence (instead of, for example, mission-critical information), recordkeeping
(instead of, say, information management),
repositories (instead of information stores), central registry (instead of information intermediary or information service unit), etc. How we define things for our own purposes is
one thing. How we project what we do to
the client world perhaps should be another[9].
Archival
science and diplomatics help enrich our understanding of recordness with
properties[10] of impartiality, naturalness,
interrelatedness, authenticity and uniqueness.
Archival records also have content,
form and medium[11]. And they have context, which is a particularly important characteristic in
electronic systems where part of the context may be supplied by the system
software or hardware. The word ‘medium’
as typically used by authors in the field of ARM and diplomatics implies
physical or storage medium. Now we
begin to get into the area of how the use of words between disciplines can get
us into trouble. Particularly in the
electronic environment, the term “medium” should not ordinarily be used without
a modifier since it may be used to describe quite different characteristics: perceptual media (sound, sight), presentation
media (speech, paper, microfilm reader, video display terminal), recording media (analog, digital), processing
media (text, data, image, sound, video), or storage media (paper, microform, floppy disc, videotape, hard
drive, CD-ROM, DAT, etc.).
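The need for a modifier can be made concrete. The sketch below (purely illustrative; the class and example values are my own, following the taxonomy above) shows how a single record gives five different answers to the question "what is its medium?":

```python
from dataclasses import dataclass

@dataclass
class MediaProfile:
    """The five distinct senses of 'medium' for a single record,
    following the taxonomy in the text. Illustrative only."""
    perceptual: str    # how it is perceived: "sound", "sight"
    presentation: str  # how it is delivered: "paper", "video display terminal"
    recording: str     # how it is encoded: "analog", "digital"
    processing: str    # what kind of content: "text", "image", "video"
    storage: str       # where it physically resides: "hard drive", "CD-ROM"

# One record, five different answers to "what is its medium?"
memo = MediaProfile(
    perceptual="sight",
    presentation="video display terminal",
    recording="digital",
    processing="text",
    storage="hard drive",
)
print(memo.storage)    # "hard drive"
print(memo.recording)  # "digital"
```

Asking only "what is the medium of this memo?" is ambiguous; each field above is a legitimate answer, which is why the disciplines talk past one another.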
As
one of my observations in this paper is that there is a need to bring the ARM
and IM&T communities much closer together, I think it is important to
mention these distinctions because I notice that quite different priorities are
placed on these characteristics between ARM and IM&T professionals and even
within those groups, between archivists and records managers and between
information managers and information technology professionals. This is, in part,
why IT specialists and electronic document management systems traditionally
give little attention to the issues of information value, appraisal,
disposition management and long-term preservation; or to linking digital and
paper-based or microform-based information and systems.
III. Let’s Stop Whining About Technology
In
our understandable desire to focus on issues and remedies, we tend in many of
our discussions of electronic records to lament the scourge of technology. I do
it myself. Look at the problems that
digital technology presents for records acquisition, appraisal, preservation,
access and security. We focus much less
on how technology might help us to overcome some equally or even more
intractable problems with paper-based records management systems with respect
to -- yes -- acquisition, appraisal, preservation, access and security.
To illustrate, we commonly use
the issue of how easy it is to change electronic documents in an undetectable
way as an example of the kinds of intractable problems we face with electronic
records in comparison to paper records.
Not long ago, I had occasion to print and sign a letter to an airline company to make a claim for lost luggage. While the original file copy was saved in electronic form, another paper copy of the letter had to be made to include in a letter to American Express, where I had baggage insurance to cover the difference between any loss and what the airline would pay. A copy of the signed letter was made on a personal copier machine (no high-tech office machinery). During the distraction of a phone call, I got the original signed paper version and the copy mixed up. After spending about 10 minutes holding the two up to the light and trying to detect which was the original, I finally said to myself: What am I doing wasting my time here trying to detect what is for all intents and purposes an undetectable difference? I don’t know who got the “original”.
This experience with a $650
copier made me realize that with even inexpensive modern copying technology, it
would be very easy to change a paper document -- particularly for the author
with access to the original word processor, fonts and printer -- and to make a
copy that would stand up well as the “original”, at least without the benefits
of very expensive forensic lab tests.
It is noticeably easier to do this today than it was 5-10 years ago,
possibly when some of our notions about this subject were formed. Today, a well-designed electronic records
system could make it easier, not necessarily more difficult, to prevent or
detect document tampering. A relatively
new technology called electronic time-stamping or digital time-stamping,[12]
offers an excellent way in which to prevent or detect changes to digital
documents.
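The idea underlying such time-stamping schemes can be illustrated with a cryptographic digest. The sketch below is a minimal illustration, not the cited time-stamping protocol itself: it uses a modern hash function and invented message text, and omits the trusted time value a real service would bind to the digest.

```python
import hashlib

def fingerprint(document: bytes) -> str:
    """Return a cryptographic digest of a document's exact contents."""
    return hashlib.sha256(document).hexdigest()

# At capture time, record the digest alongside the document.
original = b"I agree to your proposal of 12 June."
stored_digest = fingerprint(original)

# Later: any alteration, however small, changes the digest.
tampered = b"I agree to your proposal of 13 June."
print(fingerprint(tampered) == stored_digest)  # False -> tampering detected
print(fingerprint(original) == stored_digest)  # True  -> document unchanged
```

Because the digest is infeasible to forge, a one-character change to the "original" is detectable without any forensic laboratory.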
It can also be much more feasible
in electronic than paper-based systems to maintain duplicate stores of the same
information in different physical locations and even under different
administrative control. While this
solution to detecting changes in electronic documents might not always be
feasible for the entire corpus of an organization’s electronic records, it
might very well be feasible for 5-10% of those records, the typical amount
representing archival documents or even a lower percentage of records
considered most important. Or it might
be feasible in larger quantities for records that are considered to be at great
risk of manipulation by interested parties but needed only for a few years --
say the output of a computer-based procurement applications system. Thus it is possible to use a design strategy to achieve the objective
of ensuring genuine records and of discovering attempts to alter or delete
electronic records that is superior to what can be done in the paper
environment.
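A minimal sketch of the duplicate-store strategy, with hypothetical record identifiers and contents of my own invention: each site computes digests over its own copy of the corpus, and records whose digests disagree across sites are flagged for investigation.

```python
import hashlib

def digest_store(store: dict) -> dict:
    """Digest every record in a store, keyed by record identifier."""
    return {rid: hashlib.sha256(body).hexdigest() for rid, body in store.items()}

# Two copies of the same small corpus, held in different locations
# and under different administrative control (identifiers are invented).
site_a = {
    "PO-1993-041": b"Purchase order: 40 desks",
    "PO-1993-042": b"Purchase order: 12 chairs",
}
site_b = {
    "PO-1993-041": b"Purchase order: 40 desks",
    "PO-1993-042": b"Purchase order: 120 chairs",  # altered at one site
}

# Records whose copies disagree are candidates for investigation.
a, b = digest_store(site_a), digest_store(site_b)
suspect = [rid for rid in a if a[rid] != b.get(rid)]
print(suspect)  # ['PO-1993-042']
```

An alteration made at one site, by one interested party, surfaces as a digest mismatch against the independently held copy.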
Does this mean that technology
does not bring with it a great many adverse or potentially adverse
effects? No, it does not. Apart from the many well known archives and
records management problems, there are many serious human issues that
technology has given rise to -- but for which the solutions are not likely to
be found in technological innovations -- at both the individual level (e.g.,
hardware ergonomics,[13]
human interface with software systems,[14]
potential isolation of people from the “real world” [15],
etc.) and at the societal level (underemployment and unemployment, protection
of personal privacy, access to public information, creation of a class -- even
whole developing countries -- of information-deprived “info-nots”, etc.). What I do mean is well expressed by the internationally acclaimed Renaissance scholar Walter Ong, S.J., who observed in 1968 in his book, Knowledge and the Future of Man:
It is not
the inhuman effects of technological living -- our being “dominated” by
machines, whatever that may mean -- but the human effects that pose our
problem. The science that underlies
technological living has given a new shape to the contents of the human
mind....Opposition between technology and the humanities is more imaginary than
real. The printing press, a
technological device, was developed largely under Renaissance humanist
auspices, and the use of computers for textual study and other humanistic
purposes is already becoming commonplace.[16]
Archivists
and records managers love the written word.
We would not be in the businesses we are in if that were not the
case. As we wrestle with ever changing
advancements in information technology, it might help all of us to pause and
think about writing as a technology. As Ong later put it in Orality & Literacy: The Technologizing
of the Word:
Because we have by today so deeply interiorized writing, made it so much
a part of ourselves...we find it difficult to consider writing to be a
technology as we commonly assume printing and the computer to be. Yet writing (and especially alphabetic
writing) is a technology...in a way the most drastic of the three
technologies. It initiated what print
and computers only continue, the reduction of dynamic sound to quiescent space,
the separation of the word from the living present.[17]
In
their days, wood, wax and parchment technologies were very popular and well
accepted for the conduct of business in the middle ages -- wood for tally stick
accounting records, wax for drafting other documents and parchment for the
final version of those documents. Can
we not imagine the headaches we would endure trying to store organizational accounting records on tally sticks or carrying out document version control with drafts written on wax tablets? Could CD-ROM possibly be worse than that?
But wait! It would be more than a
preservation problem. In the case of
wax tablets, it would also be a capture problem because, as with some modern
forms of electronic records, people didn’t keep the drafts. They reused the wax for the next document
much as today we use floppy disks, C-drives, PCMCIA cards and videotapes.
Writing is indeed a technology, and one that has made use of many technologies.[18] It is not, however, the only one. There are also speech, other sounds[19]
and video. Shouldn’t we be prepared to
deal with whatever technologies are deemed effective or necessary by those most
competent to create records of the actions for which they are responsible?
That many people hate to contemplate a change in writing, which has become the favored technology for producing records, is also not a new phenomenon. The subject has always been contentious and a certain topic to raise emotions. In the fourth century BC, Plato ridiculed writing in any form as inhuman. Trithemius, the Renaissance humanist, lamented the emergence of what today we might refer to as the ‘parchmentless abbey’ much as today some people lament the specter of the paperless office. He said:
“The printed
word is on paper...The most you can expect of a book of paper to survive is two
hundred years. Yet, there are many who
think they can entrust their words to paper.
Only time will tell.”[20]
The
fifteenth century promoter of the printing of the Latin classics, Squarciafico,
wrote that the “abundance of books makes men less studious”. Drawing from Lowry[21],
Ong carries the thought through to its modern day conclusion:
[the printed
word] destroys memory and enfeebles the mind by relieving it of too much work
...downgrading the wise man and wise woman in favor of the pocket compendium.[22]
Now,
in a different way, the ‘printed’ word is under attack again, this time for
largely economic reasons. Technical and
professional journals and even newspapers are concerned about the significant
threat posed by the increasing popularity of electronic publishing.[23] While many people seriously doubt that this trend should or will cause the demise of printed journals or newspapers -- perhaps with the exception of some smaller journals that rely heavily on advertising receipts to sustain themselves -- a number of publishers nonetheless see it as something that will reduce their revenues and cause them to consider ways of becoming more competitive in the future.
What
will be next? Can we imagine our
successors struggling to protect the paperless office from the onslaught of the
next redefining technology? It could
happen, and sooner than we may think.
In November 1994, Leonard M. Adleman proposed a totally new approach to
large scale computations in the form of a primitive DNA computer -- a
biological computer -- to solve computer science problems.[24] More recently, Eric B. Baum suggests the use
of this model for purposes much nearer and dearer to the hearts and minds of
the people in this room. He theorizes
the use of Adleman’s model:
to produce
an associative, or content addressable, memory of immense capabilities...one
where a stored word may be retrieved from sufficient, partial, knowledge of its
content, rather than needing to know a specific address as in standard computer
memories. Content addressable memories
are useful in a number of computer contexts and are widely thought to be an
important component of human intelligence.[25]
CA memories may also be very useful in addressing some
of the most taxing problems with electronic records -- navigation through
extremely large document stores consisting of
many, many terabytes, perhaps the magnitude of an entire nation’s
archive.[26] Some combination of “hardware” architecture,
such as suggested by the DNA computer, and navigation software[27]
will be required.
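The retrieval-by-partial-content idea can be illustrated in miniature. The sketch below is a toy model with invented document texts; a DNA-based or other content-addressable architecture would realize the same behavior at vastly greater scale.

```python
# A toy content-addressable store: items are retrieved by a fragment
# of their content rather than by an address. Texts are invented
# purely for illustration.
memory = [
    "minutes of the board meeting on pension reform",
    "claim for luggage lost on flight to Regina",
    "contract for printing services, fiscal 1994",
]

def recall(partial: set) -> list:
    """Return every stored item whose content contains all the given words."""
    return [item for item in memory if partial <= set(item.split())]

# The claim letter is found from partial knowledge of its content,
# without knowing any storage address.
print(recall({"luggage", "lost"}))
```

Contrast this with a conventional memory, where the claim letter could only be fetched by knowing in advance exactly where it was stored.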
We
are comfortable and have learned to deal with the written word, but this does
not mean that we should favor it over other technologies for creating and
recording information. That is a matter
more properly made (and that in all likelihood will be made) by document
creators, not by the ARM or IM&T community. The emergence of multi-media (systems that manage separate stores for documents in various presentation and storage media), mixed-media (systems that manage documents containing more than one presentation medium), hypertext (non-linear text) and hypermedia (the same as hypertext but not limited to the textual medium) records in the form of ‘composite content objects’[28] (a.k.a. complex documents) will be a test of that proposition.
Finally,
we should recognize that technical issues, albeit the easier to raise and
discuss, are not the biggest issues. It
is the social issues (privacy, preservation of democratic institutions, etc.),
organizational issues (information ownership, service delivery organization,
policy, etc.) and individual issues (underemployment, unemployment, ergonomic
problems, etc.) arising out of the use of modern information technology that
are the truly big issues. If we want to
debate the pros and cons of technology, those are the issues needing most
attention.
IV. Thinking
“Out of the Box” About Electronic Records
One of the hallmarks of current business systems analysis[29] and reengineering methodologies is the use of so-called “out-of-box” thinking. The term describes the contrast between thinking while so immersed in our problem world (the box) that we limit all possible futures to the norms, constraints and tools we have in the box, and out-of-box thinking, in which we visualize the box as though detached from it, as we might be in a helicopter looking down on the box, trying to see the surrounding landscape and previously unconsidered possibilities. In the former case, we might look at a barrier and consider how to climb over it. In the latter, we might smash through the wall or simply walk around it -- define it out of the problem. I find it helpful to apply out-of-box thinking
to electronic records issues, e.g., to think about them in terms other than the
usual paper-based paradigm. Let us
explore some points about recordness using voice technology to help avoid
getting trapped in the paper records box.
We know that an unrecorded telephone conversation in
which I signal my agreement with your earlier proposal is not a record because,
even if it met all other requirements of recordness, it remains information
that is not recorded. We have chosen
traditionally not to record telephone conversations for a number of reasons --
mainly because it is seen as a practice which would cross a strong cultural
line of personal privacy and because of our preference for the written word. If
something that takes place over the phone is of such importance as to be
regarded as “recordworthy”, we expect one or both parties to the conversation
to record the results in written form, e.g., a memo to files. Despite our almost universal distaste for
answering machines and voice mail (or vmail), we put up with them because we
dislike the alternative even more if it means that we won’t get the service we
are after or the return call we anxiously await.
Now
suppose that I fail to reach you when I call, and instead leave a message on
your voicemail (vmail) system confirming my agreement with your proposal. Now this is a recorded voice document. It would
become a voice record if other
requirements of recordness are subsequently met, i.e., if you forward the voice
document to others for action, as you might an email message, or otherwise use
it as part of the business process of which it is a residue, and its context is
established through linkage to the action.
This requires that the vmail system be functionally able to facilitate
this kind of linkage. Today’s vmail systems lack this kind of functionality
even more than electronic document management systems do.
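The kind of linkage described above might be modeled as follows. This is a hypothetical sketch, not any existing vmail system's interface; the class, file path and action identifier are invented for illustration. The point is that the voice document's status as a record turns on whether context has been established.

```python
from dataclasses import dataclass, field

@dataclass
class VoiceDocument:
    """A captured voice message. It becomes a record only once it is
    linked into the business process of which it is a residue."""
    sender: str
    recipient: str
    audio_file: str                         # storage location of the recording
    linked_actions: list = field(default_factory=list)

    def link_to_action(self, action_id: str) -> None:
        """Establish context by tying the message to a business action."""
        self.linked_actions.append(action_id)

    @property
    def is_record(self) -> bool:
        return bool(self.linked_actions)

msg = VoiceDocument("Barry", "You", "vmail/19950616-0915.wav")
print(msg.is_record)   # False: a recorded voice document, not yet a record
msg.link_to_action("proposal-acceptance-042")  # hypothetical action identifier
print(msg.is_record)   # True: context established through linkage
```

A vmail system with this functionality would let the recipient forward or file the message against an action, just as one might an email message.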
Like
most other information technologies, voice technology is moving beyond the
state of simply providing a means for improving individual productivity and
into the arena of organizational productivity and client services. The example given above is isolated and
trivial and presented only to make a point that recordness may not be changing
but the way in which acts are being recorded is. However, when organizations begin to use similar technology to
perform business operations, then archivists and records managers do have to
think at least twice. To illustrate,
one very large insurance company, USAA, routinely records telephone calls from
its policy holders because it does its business with customers directly by
phone without sales brokers in the middle.
As part of a business process re-design a few years ago, this company
now permits its customers to make claims through a human interview process over
an 800 telephone line. Conversations
are recorded with the knowledge of the caller.
They are linked to image and data files maintained on the customer for
use as inputs to a work assignment and workflow system that parcels out claims
and other requests for timely action and for later reference. They are voice records.
This
practice is becoming more common as organizations turn more and more to the use
of telephone services to provide more direct “just-in-time” services to their
clients and less expensive means for delivering services, especially when those
services are information based. Among
the more common examples are the use of telephone touch-tone commands to
transfer funds from one’s savings account to one’s checking account without ever leaving home,
or to charge a theater ticket over the phone for later pickup at the theater
Will Call desk, and the telephone Help lines that are operated by most
information technology hardware and software firms. In the not-too-distant future, we will be able to get money from our banks without leaving home. An electronic box, perhaps the size of a cable box, will allow us to insert a credit card with an embedded computer chip, download an electronic funds transfer to the card, and subsequently use the card at home or in stores to purchase goods and services, with the card debited accordingly. It is not unusual
these days to hear a recorded message when dialing into one of these services
that says words to the effect: “This
call is being monitored for training and quality control purposes.” This could mean that calls are monitored in
real time and are not recorded at all, in which case they would not qualify as
records; or that they are randomly or always recorded for later review, in
which case they could be records. They could be disposed of immediately after
such reviews or they could be filed for later use as evidence, should they
become linked to some service-delivery or product liability action. In short, voice mail messages may very well
qualify as records.
With
speech generation and speech recognition capabilities now available in PC
consumer products, the distinctions between the written and the spoken word
will soon become blurred. Speech
generation systems simply create computer-generated words and sentences,
converting one digital form to another -- text to sound. With software that was preloaded on the
notebook computer I am presently using, I could demonstrate this capability
with this paragraph in a few different ways.
It is not possible to do so through the medium of print; however, I will
do my best to describe what happens.
One easy way, using the particular software that I have on my system,
would be to highlight this paragraph and simply do a COPY command off the Edit
Menu. Then by simply clicking the right
mouse key over the active speech generation software icon, we would hear this
paragraph converted to sound. Pauses
would happen at the end of sentences and you would find it amazingly
understandable -- lacking in emotional intonations to be sure, but quite
understandable. Of course there are
shortcomings. For example, if the
paragraph included a phone number, say my fax number -- (703) 241-7968, it
would read that as: SEVEN HUNDRED AND
THREE...TWO HUNDRED AND FORTY ONE MINUS 7 THOUSAND, NINE HUNDRED AND SIXTY
EIGHT. But keep in mind that this
technology in the hands of ordinary consumers 10 years ago would have been
considered little short of a miracle.
Even the fact that we are amused by or ridicule such seemingly obvious
shortcomings is itself evidence of how much we have come to expect from
information technology. Then, what is
so newsworthy about this technology?
After all, it has been around for a long time in such common
applications as those used by telephone Information Operators when they say
“Please hold for your number.” What is
newsworthy is that this capability is now available in PC consumer software
products and is included among the pre-loaded software packages provided on
some multimedia computers. This should
be taken as a wake-up call for ARM professionals to anticipate the interchange
of email and vmail, once the email vendors catch on to the potential for their
own products, probably before the end of this decade -- not a great deal of
time to prepare for.
Speech
recognition is a wholly different thing.
It provides the opposite functionality and is vastly more complicated
than voice generation, as it must ‘read’ widely varying voice sound signals
created by individuals with different speech patterns and different accents --
even within the same vocal box when a person has the common cold -- and convert
them to the written word form or to other outputs such as computer
commands. Unlike voice generation, it
has taken literally decades to bring this technology to fruition, mainly
through a few leading computational linguistics laboratories, and mainly using
simultaneous dictation applications.
This very same technology, were it to be applied to the field of automatic document classification, could break some great barriers for the field of archives and records management. Like speech generation, speech recognition
has appeared on the PC consumer market only in the past few months. For example, the simplest (and cheapest) of
these systems (maybe $100) allows the user to easily navigate around a graphic
user interface (or GUI) by speaking into a microphone incorporated in, or
attached to, a PC (as all multimedia PCs now have) with a string of commands --
words -- such as: “Open Microsoft Windows,...Open Microsoft Word for
Windows,...Open file name: “A C A PAPER PERIOD DOC”. In so doing users can change their GUI to a VUI or voice user
interface. Other such systems, much more sophisticated and costly (say $1000), may be used -- though not yet elegantly -- for the same purposes and more: to dictate a memorandum or report without using a keyboard. To be sure, vocabularies are
limited to a few thousand words that must be spoken by the system owner and
dictation must be discontinuous and therefore slow. They also have difficulty distinguishing the various legitimate
spellings of words with similar sounds such as the words “to”, “too” and
“two”. However, the individual can
build up his or her own dictionary by ‘teaching’ the system new words. For the hunt-and-peck typist, or the person
physically unable to type or with a cultural objection to typing, it may sell
some PCs to people who wouldn’t otherwise use them. More importantly, this is a rapidly emerging technology that has
really turned the corner. Perhaps ironically, speech recognition may turn out to be a great friend to those in
the archives community and historians who have fought to preserve records
through the special medium of oral history.
Consider the enormous possibilities of being able to do ‘full-text’ word
searches on large voice record stores.
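To make that possibility concrete, here is a minimal sketch of 'full-text' searching over voice records once a speech recognition system has transcribed them. The record identifiers and transcript wording are invented for illustration; no real transcription engine is involved.

```python
# Sketch: 'full-text' search over voice records that a speech recognition
# system has already transcribed. Record IDs and transcripts are invented.

def build_index(transcripts):
    """Map each word to the set of record IDs whose transcript contains it."""
    index = {}
    for record_id, text in transcripts.items():
        for word in text.lower().split():
            index.setdefault(word.strip(".,"), set()).add(record_id)
    return index

def search(index, *words):
    """Return IDs of records whose transcripts contain all the given words."""
    results = None
    for word in words:
        ids = index.get(word.lower(), set())
        results = ids if results is None else results & ids
    return results or set()

transcripts = {
    "oh-1994-017": "We discussed the move of the records office to Regina.",
    "oh-1994-018": "The budget for the new archives building was approved.",
    "oh-1995-002": "Regina staff raised concerns about the archives budget.",
}
index = build_index(transcripts)
print(sorted(search(index, "archives", "budget")))  # prints ['oh-1994-018', 'oh-1995-002']
```

Once the transcripts exist, the oral-history store becomes searchable exactly like any other text base; the open appraisal question is whether the transcript or the voice original is the record.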
I have been following speech recognition research for about 15 years in the artificial intelligence and natural language processing research and development communities. Every year, I
have heard the same thing. The response
usually went something like this:
“We’re about three years away from the necessary breakthroughs needed to
develop a product.” Many people didn’t
believe it would ever happen. It was
too difficult to achieve. Those who did thought it would be possible to deliver such products only on large
mainframes, including massively parallel platforms, because that was the level
of power needed for complicated artificial intelligence applications during the
research stage. Moreover, research in
this area preceded the advent of personal computers. This is no longer the case, however, and, except for some special kinds of intelligence applications, most vendors involved in this field are
going after the PC market -- now the most common instrument for document and
record creation. Moreover, current
research is underway to overcome the limitations noted above. To avoid getting
blindsided, we should anticipate wide availability of a superior continuous
speech recognition technology well before the New Millennium. If you are operating in a field where there
is a fairly limited or specialized vocabulary, such as medicine, it is already
here. Many physicians are already using
speech recognition systems to dictate patient reports. What will be kept for the long term? The secondary text version created from the
voice dictation, or the original voice records themselves? As with speech generation, the paper version
of this presentation does not permit demonstration of this exciting and
innovative technology. Seeing and
hearing the use of these technologies for business purposes in the PowerPoint™[30]
version of this presentation brings the point home in a way that is not
possible to do on paper. Alas, that is
at the heart of the multimedia records issue.
The point that email and vmail may be interchanged needs to be made more generally about information and records management in the digital environment. This is perhaps better
described using the ‘slide build’ from the PowerPoint™ version of this
paper. Through the use of image
scanning technology, paper documents
may be converted into digital form.
Paper Text --> Digital Image
[scanning systems: paper storage --> digital storage]
Once the paper document is in digital form, it may be
further converted from an image base to a character base, significantly
facilitating information retrieval.
Digital Image --> Digital Text
[OCR systems: key-word --> full-text retrieval]
Once in character form, it is possible to convert the original paper document into voice form through speech generation technology:
Digital Text --> Digital Voice
[speech generation systems: email --> vmail]
A similar route may be taken
going in the opposite direction beginning with the use of speech recognition
technology to digitize a human voice document or record:
Human Voice --> Digital Text
[speech recognition systems: vmail --> email; also voice record creation]
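The conversion chains above compose: a document can travel from paper to voice, or from voice to text, by chaining the conversions. A minimal sketch follows, with the conversion systems reduced to stand-in labels (no real scanning, OCR or speech engines involved):

```python
# Sketch: the media-conversion chains as a routing problem. The "systems"
# are stand-in labels, not real scanning/OCR/speech engines.

CONVERSIONS = {
    ("paper text", "digital image"): "scanning system",
    ("digital image", "digital text"): "OCR system",
    ("digital text", "digital voice"): "speech generation system",
    ("human voice", "digital text"): "speech recognition system",
}

def route(source, target):
    """Return the chain of systems converting source medium to target, or None."""
    if source == target:
        return []
    for (src, dst), system in CONVERSIONS.items():
        if src == source:
            rest = route(dst, target)
            if rest is not None:
                return [system] + rest
    return None

# Paper reaches voice form via scan -> OCR -> speech generation:
print(route("paper text", "digital voice"))
```

A vmail message likewise reaches email through the speech recognition step, which is precisely the reversible switch discussed below.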
What
this means is that, at least to a certain extent when the developers of GUIs
catch up with the document creation and conversion technology, people will be
able to choose their preferred way of operating by simply throwing a readily
reversible switch or clicking an icon, or speaking a voice command, to have
their email converted to vmail or vice versa.
Obviously this could be quite important to the person who is hearing or
sight challenged. But it could be very
convenient also for any individual traveling without a computer who wishes to
be able to pick up email by dialing into the vmail system back home. This poses an interesting question. Let us say that we decide as an organization
that voice mail records will not be retained out of consideration of privacy;
but that email records will be. But now
if we have some users converting vmail to email, we wind up retaining some but
not all records for reasons other than their relationship to a business
activity. This is one reason why we
should not make appraisal decisions on the basis of technology but rather on
the basis of the acts or business processes that create these records. Reflecting on Charles Dollar’s Macerata
paper,[31]
Luciana Duranti writes:
[I]ntense
preoccupation to demonstrate the common nature of all records, regardless of
physical form, and the need for management decisions (appraisal included) that
treat all the records of one creator as an integrated whole, is certainly
shared by many, but it is by no means the consequence of a generally accepted
idea. The qualifier that Charles Dollar
added to his statement that appraisal should be based on the functions or
competences generating the records, rather than on the technical applications
from which they resulted, that is, that this might not be so with electronic
records having no paper analog, still stands unchallenged and is therefore
quite disturbing.[32]
If
we agree with the proposition that acts, not technology, should be the
governing factor in appraising records, it is difficult to justify ignoring the
appraisal of vmail records despite privacy considerations, even though
retaining them in organizational multimedia record systems would be disturbing
to many, including this author. The
key here is the same as for email: to
have an organizational policy on the subject that is clearly understood by all
employees.
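The principle that appraisal should follow the business activity rather than the medium can be sketched as a simple rule table; the activities and retention periods below are invented examples, not a recommended schedule:

```python
# Sketch: appraisal keyed to the business activity that created the record,
# not to its medium. Activities and retention periods are invented examples.

RETENTION_BY_ACTIVITY = {
    "contract negotiation": "retain 7 years",
    "routine scheduling": "destroy after 90 days",
}

def appraise(record):
    """Retention follows the business activity, whatever the medium."""
    return RETENTION_BY_ACTIVITY.get(record["activity"], "refer to archivist")

email_record = {"medium": "email", "activity": "contract negotiation"}
vmail_record = {"medium": "vmail", "activity": "contract negotiation"}

# The same act yields the same decision whether it left email or vmail:
print(appraise(email_record), "==", appraise(vmail_record))
```

Under such a rule, converting vmail to email (or vice versa) cannot change a record's fate, which closes the loophole described above.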
In
the discussion of speech recognition above, it was noted that it is a rapidly
emerging technology. This is a point
that needs to be made more generally about information technology. It is a very fast moving field in which
there is enormous competition. Thus,
while we should of course recognize the limitations of IT where they exist, we
should also realize that it is risky to base our future plans on the existence
of the same limitations three to five years hence -- the typical lead time for
introduction of an electronic document/records management system. As noted earlier, one of the major reasons
that archivists expressed a few years ago for not maintaining records in
electronic form was that electronic records could not be trusted to maintain
their genuineness -- i.e., that one could not be assured that, over time, the
record had not been tampered with or altered in some way. Research was carried out in the early 1990s
that, although it was done for intellectual property rights purposes rather
than for electronic records management more generally, addressed the document
‘genuineness’ issue that so concerns archivists.[33] Another common concern of archivists is how
email can be used as records in the absence of signatures. Considerable effort in the research community to tackle this problem has resulted in both products and legislative changes to provide for certification of email messages.[34]
V. Technology
and the Transformation of the Workplace
We often observe that information technology is too often procured without an intended result or transformation in mind other than in some general terms of improved productivity, which organizational analysts and industry economists have found difficult to demonstrate scientifically. It is fashionable to state, as I have myself on many other
occasions and as I do again here today, that desired organizational end results
should govern organizational aims and objectives that, in turn, should be the
basis for the creation of business processes, which should drive information
and information management needs (including records) and architecture and this,
finally, should drive information technology decisions. We say that it should never be the other way
around. Despite my own theoretical
convictions and incantations to this effect, my observations and experience
tell me that there must be a more complicated arithmetic at work -- that there
is a greater symbiosis between work and technology than I am always ready to
admit. People sometimes do it in reverse, and sometimes with good results. Sometimes creative people can see in a
new technology the potential for changing the way that some business process is
currently carried out. The invention of
the PC followed this route. Most people in the workplace didn't cry out for that invention; however, many immediately saw
ways in which it could improve their work patterns. Emerging multimedia technology may very well prove to be the
latest example of this phenomenon. When
things are not done according to some prearranged analysis and plan, it is not
always easy to discern or say which it was that led to workplace transformation
-- a problem in search of a solution or a solution in search of a problem. Perhaps that is not the important
distinction to be made. Perhaps it is
more realistic to see work and technology as parts of a continuous feedback loop
where work needs spawn technological requirements that may be only partly
satisfied by technological innovation that is then reacted to in the workplace
and refined in later innovations; and sometimes technology results in
unexpected or unintended innovations in work patterns and the cycle begins
again.
To
the extent that technological innovation has an impact on document creation and
use, these cycles carry with them significant implications for records
management. The emergence of more
complex document forms illustrates this point:
Simple documents are typically text-only documents such as telephone logs, call
slips, calendars, text-only letters, etc.
Compound documents are documents containing graphics (e.g., a
logo or signature) or data (e.g., a statistical table). Complex
documents are documents containing multimedia objects (e.g., sound,
animation, video). Traditional records
systems are based upon a simple document architecture in paper and/or microform
storage media. They have been stretched
to accommodate some compound documents, though often only by printing out such
documents and filing them with simple documents in paper or microform storage
systems. The emergence of complex
documents will not be so easily accommodated by traditional records systems
because multimedia documents cannot always be fully represented in paper
form. Unfortunately, the printed form
of this paper (as distinct from its multimedia form in a PowerPoint™
presentation) does not permit illustrating this point.
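The simple/compound/complex taxonomy above can be expressed as a classification rule; the media names in this sketch are illustrative stand-ins:

```python
# Sketch: the simple/compound/complex document taxonomy as a rule.
# Media names are illustrative stand-ins for a document's content types.

MULTIMEDIA = {"sound", "animation", "video"}
GRAPHIC_OR_DATA = {"graphic", "data table"}

def classify(media):
    """Classify a document by the kinds of content it contains."""
    kinds = set(media)
    if kinds & MULTIMEDIA:
        return "complex"
    if kinds & GRAPHIC_OR_DATA:
        return "compound"
    return "simple"

print(classify(["text"]))                      # prints simple
print(classify(["text", "graphic"]))           # prints compound
print(classify(["text", "graphic", "video"]))  # prints complex
```

The rule also shows why traditional systems break at the last step: only the first two classes survive being printed to paper.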
Simple Documents
Figure 1 is an example of a simple document that will be followed
through with more complex examples using the same case to illustrate why more
complex forms of documents are beginning to be seen as providing a competitive
advantage, either in terms of time, productivity, ease of understanding or
attractiveness to the reader. The case
is a person who is sending another person a video clip for a TV commercial spot
that promotes travel to Alaska. The
recipient’s comments are requested urgently because time is money in the
advertising business. It is sent at
considerable expense, relative to regular mail services, by overnight courier with a request that the recipient immediately provide comments and a voice-over to use with the spot. We can imagine
that this exercise would take about 3-4 days to complete.
Figure 1: Simple Document: text only
Compound Documents
Figure
2 illustrates a compound document
simply because it has a graphic logo on the letterhead -- something that
archivists may find important, like signatures, to capture -- a reason why
archivists typically favor image rather than ASCII representations of documents
for electronic archives -- but it adds little if any substance or content to
the document, beyond perhaps a bit of context. More interesting examples of compound
documents that would add more by way of content and substance might be ones
containing a statistical table or picture.
In the case outlined above, there would be no difference in the manner
of delivery or handling of the task.
The introduction of the graphic figures in this paper makes it a compound document.
Figure 2: Compound
Document: text and graphic
Complex Documents
Complex documents include, in addition to text, embedded objects that are linked from the document to some other independent computer file where the object is separately maintained and may be separately updated (automatically updating, in so doing, what is contained in the document).
Such objects may be attached and mailed electronically with the
document. Figure 2 could also be a complex document if, for example, the logo became an active icon
that, when selected, would activate the embedded object. Objects so linked are said to be hyperlinked and may be hypertext[35],
hypersound, hypervideo, etc. A document
(or composite content object) with
more than one such representation medium may also be called hypermedia. If the logo in Figure 2 were
hyperlinked to the Alaska promotional video of the case in question, it would
be illustrative of a composite content object containing both textual and video
objects. The version of this paper that
exists on my hard drive is a complex document because the graphics are hyperlinked to the graphics that exist
in the PowerPoint™ version of this presentation. In its paper medium version, however, it is not so linked and is
therefore a compound document.
Figure 3: Composite content
object (CCO) containing text and embedded video/sound objects
Figure 3 is such an example. (The Asymetrix icon calls up a video
capture system and is a trademark of Asymetrix, Corp. The microphone icon calls up the Microsoft Windows Recorder
system.) Taking the example to the next step, the document creator sends this document by electronic mail to the
recipient with the attached linked object, in this case an Asymetrix “*.avi” or
video-type file or video clip. The instructions indicate to the recipient that
by ‘clicking’ on the video object the recipient may view the embedded video
clip. The recipient receives the email
request within minutes of its being sent and, after reviewing the video clip,
attaches a voice annotation comment on the document and returns it to the
sender. The whole transaction could be
concluded in an hour or so.
In Figure 3 we see that the email message has been received and marked up with a return arrow to the sender and dated 10/24. Now the CCO contains an embedded sound object (the microphone
icon), which if clicked on will reveal a voice commentary on the video object
(the Asymetrix icon). Rather than days
having passed sending these communications back and forth in the mail, the
whole transaction takes place in a matter of hours electronically. This kind of document may be stored in a
normal digital storage medium, e.g., hard disk, but may be perceived in all of
its presentation media (text, video and sound) only if stored in a multimedia
information store and retrieved through a multimedia workstation. Until the latter conditions are met, the
document would not qualify as a record, even if all other conditions of
recordness were met, because it could not otherwise be perceived by human senses.
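A composite content object of the kind just described can be sketched as a text body plus separately maintained, hyperlinked objects; the class names and file paths below are hypothetical:

```python
# Sketch: a composite content object (CCO) -- a text body plus hyperlinked
# objects maintained in separate files. Names and paths are hypothetical.

class LinkedObject:
    def __init__(self, media_type, path):
        self.media_type = media_type   # e.g. "video", "sound"
        self.path = path               # separately maintained file

class CompositeContentObject:
    def __init__(self, text):
        self.text = text
        self.objects = []              # hyperlinked objects, in order

    def embed(self, obj):
        self.objects.append(obj)

    def media(self):
        """Every presentation medium a workstation must render for this record."""
        return ["text"] + [o.media_type for o in self.objects]

cco = CompositeContentObject("Please review the attached Alaska spot.")
cco.embed(LinkedObject("video", "alaska_spot.avi"))
cco.embed(LinkedObject("sound", "voice_comment.wav"))
print(cco.media())  # prints ['text', 'video', 'sound']
```

The `media()` list makes the recordkeeping obligation explicit: a store that cannot render every listed medium cannot present the whole record.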
Imagine other examples of this
kind of business use of multimedia, e.g., in the form of a university capital
budget request that includes an opening “paragraph” in the form of a videoclip
of a proposed new research center or
student center. Think of the personnel
file with the employee’s picture that has embedded in it a videoclip of the
person giving a brief sketch of his or her professional background and current
professional interests. Or the police report of a highway arrest, which combines a written report with the picture of the person arrested, hyperlinked to a videoclip taken of the arrest
from the officer’s dashboard video camera.
Many police departments already have computer terminals in their police
cars where data is obtained from
central data stores and where textual reports
are submitted by the policeman in the field.
Recently, some have installed video
cameras. How long do we expect it to be before these record-capturing media are combined into multimedia, hypermedia documents or composite content objects? Not very long, I
submit. Or consider the typical
“back-to-office” report in most development assistance organizations. Traditionally such reports are fairly highly
structured with statistical tables showing the financial status of the project
and, if applicable, progress on the civil works schedule. How much more informative and meaningful
such a report could be if it included a photograph of the developing country
project manager or of a national archives building under construction. Beyond that, how much more informative it
would be if, when one ‘clicked’ on the project manager’s picture, it revealed a
video clip interview with the PM on the status of the project. One could imagine many such
applications. In the German Government,
one need not imagine. One of the most
advanced operational multimedia systems anywhere is one presently being used by
the German Government as part of the transition of the capital from Bonn to Berlin,
during which time portions of the Government are in both cities.[36] As more real people with real business
applications discover the powerful capability and multiplier effect of mixing
the written form of language with other forms of information, it will not take
long for hypermedia documents to become commonplace.
VI. Lessons Learned from
Past Experience
In
the following sections, I will attempt to illustrate some of these points with
personal experiences and observations. They are not presented in isolation from the experiences of others, nor as a way of ascribing greater importance to them than they deserve. They simply reflect the kinds of things that were going on in many quarters of North America and
elsewhere that, together, have had some impact on how information technology is
used, how work is done and what it might mean for ARM.
1960s
Early
Command and Control System Project
Information
technology in the 1960s was largely confined to centralized, mainframe computer
systems with data processing applications, as distinct from the text-based
office systems of more recent years. Most of these applications were
transaction-oriented and mostly in the financial sector -- accounting systems,
payroll systems, etc. There was a
certain likeness between the status of management information systems in the
60s and electronic records systems in the early 90s. That is, the subjects were very much topics of discussion and
debate at professional conferences, but there was little by way of implemented, operational systems. It was my good
fortune, as a young naval aviator in 1960, to be assigned to a newly created
“command, control and communications” group (or C3 as it was known,
and subsequently C3I when “intelligence” was added to the function)
in the Office of the Chief of Naval Operations in the Pentagon under the
leadership of then-Commander Arthur K. Bennett, USN and to become involved in
some of the first computer-based command and control systems that were military
versions of management information systems, and more.
Prior
to 1958, the Chief of Naval Operations exercised command and control of naval
forces the world over. With the Defense
Reorganization Act of 1958, however, operational command and control of all
military forces was transferred from the service chiefs, under their individual
civilian cabinet secretaries, to the Joint Chiefs of Staff reporting directly
to the newly established Secretary of Defense.
Because
of the new law, the Joint Chiefs faced the enormous problem of aggregating and
digesting the status of all U.S. Armed Forces daily and set out to develop a
computer-based operational readiness system for the Joint Chiefs of Staff.[37]
It involved the creation of a reporting system consisting of about 30 highly
structured reporting formats. These messages were transmitted daily to the
Pentagon shortly after midnight Washington time so that they could be processed
and summarized in a computer printout and used for preparation of the Joint
Chiefs’ regular morning operational briefing.
No such summary of all forces with such up-to-date information had been
possible before that time.
The punched paper tapes created by the teletype machines, containing the digital representation of the messages, were fed into paper tape readers that read the data into a large IBM 7090 -- what was at that time regarded as a
scientific computer because of its computational power. It had all of 64K memory, a room full of
tape drives and three shifts of about eight enlisted men and officers to
maintain it. Remarkably, the punched-paper tape reader was connected directly between the teletype machine and the 7090, and messages were processed into the computer automatically as they arrived. The information was then interpreted by software programs coded in machine language. It was an early generation precursor to the
modern data base management system. It
was my first involvement in the design and implementation of computer-based
information technology. The Joint OPerational Reporting System
(JOPREP), consisting of about 30 separate structured reports, was fully
digitized so that the system for reporting the conventional and nuclear
readiness of U. S. Armed Forces world-wide was fully automated. The result was
that, at 5 a.m. each morning, a highly classified stack of computer pages
representing the state of U.S. Armed Forces individually and in the aggregate
was in the hands of Pentagon briefing officers for use in morning
operational/intelligence briefings of the Joint Chiefs in the National Military
Command Center on the second floor of the Pentagon.
Lessons Drawn
This project was an early lesson in how macro-level
forces (in this case legislation in the form of the Defense Reorganization Act
of 1958) can have enormous impact on local work needs and patterns, technology
and organizational behavior that changed the ways in which information and
records were created and used. In a
similar manner, the demise of the Cold War has created global and national
economic trends that are reverberating down to the smallest public and private
sector organizations today in the form of competitive pressures, reinvention of
government, etc. These forces are
driving the rethinking of work patterns and technologies in ways that have a
direct bearing on recordkeeping practices.
The experience reinforced the idea that if one focuses
on the operational needs first, rather than the technology, results are more
likely to be successful and enduring. The JOPREP system has undergone many
evolutionary changes since it was developed in the 60s to reflect changing operational
needs and technological improvements; but it is basically still in place
today. We didn’t know them as business
processes then and had no business systems analysis tool other than what was
provided by old fashioned systems analysis, but it was the defining experience
for me in the use of information management and technology tools to address
business needs and was instrumental in my decision to leave the service and
make a career in this field.
It was an early example of ARM and IM&T
organizations missing each other’s boat.
Paper printouts were regarded as the residue of the
system. Computer tapes were kept purely
for backup reasons, not because anyone thought of them as “the record”. As such, they probably would not have been
useful as an information store for selectively retrieving or presenting the
component messages or summary reports. No military archivist, records manager
or historian showed up, and we didn’t know enough to ask. That is not to say
that there was no records management system in the Joint Staff. There were, for example, military historians
whose job it was to reconstruct crisis situations retrospectively to learn
lessons for the future -- a classical example of where records are needed for
institutional memory purposes. It was simply a case in which the two
communities of interest didn’t conceive that they had any common interests.
Early “Roomware”[38]
Project
Shortly
after implementation of the JOPREP project, I had the good fortune to be
involved (albeit as probably the most junior officer on the team) in the design
of a new National Military Command and Control Center or NMCC. This was a room that was to replace the old
fashioned “War Room”. It was one of the
earliest attempts (the earliest for
me) to integrate physical facilities with information management and
technology. The task was to design a
suite of rooms for the processing and presentation of high-level information
with facilities for instantaneous world-wide communications at each seat around
a large conference table in the central room where the Joint Chiefs met, as
well as in adjacent rooms designed for the use of ad hoc battle teams convened
to provide staff support to specific crisis situations. Just as the military was a leader in the
development of information systems, including some of the earliest text-based
systems that were developed for the processing of textual information in
support of intelligence operations, it also played a very early leadership role
in the development of decision-support rooms.
Lessons Drawn
This
project also opened up the thinking for many of us to look beyond the more
traditional forms of data processing and presentation (mainly in the form of
massive computer printouts) and to linking information processing with
real-time communications -- the two principal elements that subsequently became
integrated in what we now call office technology.
Unlike today’s “roomware”[39] systems, the advanced version of this kind of technology, the NMCC did not automatically record decisions taken.
It is very easy to lose the record (or never make it)
of ad hoc situations that are not a part of the daily routine of any particular office, such as military crises or common business project-oriented activities. Yet, the associated actions may be among the more important
records. With current and foreseeable
trends in organizational design being toward flatter, less structured
organizations, this is likely to be a problem that modern organizations will
have to deal with more often.
Facility managers can be great allies of ARM programs
when they see the potential of electronic records for reducing office space
pressures and costs that in large urban centers can easily run $30-50 per square foot per annum.
Coordination of Information Sciences Research and
Development Projects
Following
a pause of a few years to return to an operational squadron and carry out
postgraduate studies, I had the opportunity in the late 60s to return to the
IM&T field in a very different capacity as Executive Secretary to an interagency
information sciences technology group[40]
under the President’s Science Adviser’s Committee on Scientific and Technical
Information (COSATI) whose purpose was to coordinate information sciences and
technology research work among the agencies of the Federal Government. Here I
first learned about some extremely interesting research that was going on in
such fields as computational linguistics, large text-based systems and
geographical information systems (GIS).
Some of this was still highly classified at the time. GIS systems are now figuring significantly
in modern electronic records.
Presently, they constitute a relatively small portion of electronic
records, because they are used principally in specialized applications dealing
with mapping and building drawings.
Nonetheless, even those applications often create important
organizational records that ARM professionals have to deal with, usually
through the use of very expensive and space consuming map filing cabinets. The storage of such records can be much more
economically achieved electronically than in paper where the records are
usually oversized, of different sizes and are sometimes on chemically treated
paper. In the future, we may anticipate
that the use of GIS technology will not be limited to mapping and building
maintenance applications. Information
scientists are using this technology to help tackle difficult information
retrieval problems, especially as a retrieval interface for large data bases,
e.g., any data base that has geographical components such as medical research
data, real estate data bases, etc. Even
applications that have no geographic components, such as very large directory
systems, are quite amenable to representation through the use of ‘information
trees’ and ‘information maps’.[41] GIS applications can, on the one hand, be
used in ARM electronic directory applications.
On the other hand, GIS represents one of the most complex forms of
electronic records that modern archivists must learn to deal with, or to get
others to deal with.
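The 'information tree' retrieval interface mentioned above can be sketched as a navigable hierarchy; the categories and record series names below are invented for illustration:

```python
# Sketch: an 'information tree' as a retrieval interface -- the user descends
# a hierarchy of categories to reach record series. All names are invented.

TREE = {
    "medical research": {
        "clinical trials": ["trial consent records", "trial results"],
        "lab notebooks": ["notebook series 1988-1992"],
    },
    "real estate": {
        "leases": ["headquarters lease file"],
    },
}

def browse(tree, *path):
    """Descend the tree along the given category path."""
    node = tree
    for step in path:
        node = node[step]
    return node

print(browse(TREE, "medical research", "clinical trials"))
```

A map-style GIS interface would present the same hierarchy spatially, but the navigation logic is the same descent through nested categories.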
Lessons Drawn
From
this experience, I learned for the first time the great importance of the
balance between research, development and implementation projects, including basic research with no
specific application in mind at the start, development projects aimed at a
particular problem set, or the functional needs of a particular user group, and
good implementation projects to make the desired results happen. The Chairperson of the group, its members and
invited guests from the research community were extremely knowledgeable in the
field of information sciences and provided a most challenging intellectual
experience. It also gave all of us the
opportunity to see some state-of-the-art implementations. I learned the importance of networking with people in the research community, keeping up with their literature and bringing them into operational settings and discussions when, in later
management positions, I was involved with various information systems and
services. I learned that researchers were just as eager to have contacts in the real systems world, to hear about current business needs and related gaps in the technology, as I was to know what their research might have to offer for the problem sets in which I was working.
Early Centralized Office System Project
The
COSATI assignment led to another toward the end of the 60s, now as a civil
servant back in the Pentagon, managing the information science and technology
division in the Office of the Secretary of the Navy. There I had my first experience with a rudimentary office
technology system -- conceiving and implementing a centralized office services
system that provided dictation and word processing services for Secretariat
staff. This was a ‘Rube Goldberg’ attempt if ever I saw one to put together technologies that weren’t designed to work together to perform typical office services. It consisted of two secretaries located in a
room with two single-tape IBM Magnetic Tape Selectric Typewriter (MTST)
machines (at the cost of $7,500 each), one $10,000 two-tape MTST, a telephone
line and an answering machine. Secretariat staff would call the service from
their office telephones and would dictate the desired document over the
phone. A secretary in the office
services unit would transcribe the voice tape into its first typed form using
one of the single-tape MTST machines. The document would be sent by special
messenger to the author, usually the same day for markup and return to the
office services unit for further update.
The original MTST tape would be mounted on the two-tape MTST
machine. The second tape drive would be
used for merging the original tape information with the revisions and for
creation of the new version for printout and return to the author.
Lessons Drawn
No
thought was given to keeping the electronic versions of the dictated documents
beyond the time necessary to finalize those documents unless the author had
reason to believe that it would be necessary to use large portions of the
document in future documents. Any preservation in electronic form was for operational, not records management, purposes; those were seen to be entirely the purview of the principals creating the documents. The paper
output of the service, not any electronic representation, was regarded as the
record. Once again, the basic
recordness of the paper residue of the actions that precipitated the documents
was largely unchanged. Because the paper version was the record, neither content, context nor form was particularly affected. Only the process of document preparation and its related service levels had changed.
1970s
Private Sector Executive Involvement Project
For
me, the end of the 60s and beginning of the 70s was marked by several projects
in the private sector that involved interactions with executives on the design
of computer-based management information systems, including organizing special
seminars for managers in information management, interviewing managers and
marketing systems to them, writing MIS proposals, etc.
Lessons Drawn
Senior
managers typically view information systems projects as computer projects or
information technology projects, not as projects that support strategic aims
and not as projects that manage valuable organizational assets in the form of
information. This is changing nowadays
because of the threat of competition and the greater computer literacy of
people today than a generation ago.
Even today, however, managers do not equate records with information. Whereas they are eager to talk about the strategic value of information, they will refer anyone wishing to talk about records to their secretaries. It
is therefore important for ARM professionals to project their offerings and
needs in ways that executives will relate to in their own terms and language.
Distributed Office System Participative Design Project
Word
processing was first invented as a mainframe computer application in the 1950s
by one of the great information scientists of all time, Douglas Engelbart. A
few years ago, I had the pleasure of meeting with Doug and having a
demonstration of that early system and of the first (now ubiquitous) computer
mouse (albeit today’s versions look quite different from Engelbart’s
hand-fashioned mouse) that he invented in 1964[42]
to be used with a mainframe computer terminal.
However, just as the mouse didn’t really gain wide usage until the
latter part of the 1980s after something called Microsoft Windows was invented,
word processing didn’t begin to catch on until the 1970s when dedicated word processing
equipment made it possible to create documents locally rather than on
mainframes and when equipment prices came down significantly at the same time
that secretarial costs were on the rise.
The type of office service described in the Pentagon experience was one
of the precursors of what became commonplace during the decade of the 1970s --
the so-called “word-processing pool” in which the whole organization was served
in its typing needs by a single central pool of secretarial staff working in what
would today be regarded as an electronic sweat shop. This became feasible as word processors became cheaper and more functional.
In
1979, while serving in an operational position of an international financial
institution, I led a study of a regional vice presidential office consisting of
several departments that serviced all projects in about 20 Eastern African
countries. Since it was clear that
document creation and preparation were tasks that were subject to differing
individual work preferences and behavior patterns, we undertook this study in a
participative manner involving managers, economic and financial analysts,
sector specialists and support staff of various disciplines and areas of
responsibility[43]. The conclusion was that office technology
should be fully decentralized to the
unit level. Within a couple of years, the other regional offices followed suit
and the centralized WP unit was disbanded and its resources were distributed to
help seed an enterprise-wide decentralized office support system. For a short period, control of document preparation reverted to the unit-level support staff and the management of
records improved at that level.
Lessons Drawn
Among
the noteworthy observations on this project are that:
Innovations may seem trivial or even absurd when they
are first revealed. Sometimes they are;
but other times they simply reflect that their creator is years ahead of his or
her time and that they will become widely used when the time is right. It may be difficult for archivists always to sort out correctly which is which. The
main lesson for the author was simply not to dismiss possible futures and to
give thought as to how one might accommodate them if they appear on the scene.
The study was an early example of a participative
design project in which considerable importance was placed on identifying and
involving all key stakeholders as members of the team.
The notion that there might be stakeholders external
to the regional office was not readily accepted because staff did not see the
relevance of changes in their own work patterns to the work of other external
offices.[44] It didn’t
occur to anyone that the records management division and archivist might also
be stakeholders.
The study was very team oriented. On the other hand, it was a bottom-up
approach to how work was done locally and therefore did not address the
underlying business processes that produced the reports and other records. Thus
even with the team orientation, the focus was more on individual productivity tools in a local organizational setting,
rather than group productivity in an
enterprise setting.
Individuals bring not only their own national and
ethnic cultures to a project team, but the culture and value system of the
organization in which they work. If
that culture places a value on the importance of information and information
sharing, it can make the archivist’s job much easier. If it treats information simply as a power lever, and not as an
organizational asset, the archivist’s job will be much more difficult, and
conceivably threatened by political pressures to dispose of records in
inappropriate ways; and it will be necessary to consider strategies for
bringing about changes at the cultural level.
1980s
No
one was prepared for what would happen at the beginning of the decade of the
80s when IBM began marketing what it called a “personal computer” or “PC”. It was reportedly called that because it was
seen as something that could be entertaining or useful in the home, not as
something with any particular business application[45].
Nor did most business organizations invest much in that technology, beyond
supplying a few technology-oriented industrial-strength analysts with such
tools, until about 1983/4. However, it
took little time for analysts and authors to see how much better they could
perform their jobs when they had control of the tools of document creation in
their own hands. This realization fueled an even greater decentralization of office technology than had been dreamed of even following the 1979 study.
Decentralization of Information Processing Power
Probably
the single force exerting the greatest influence on the workplace of the early
and mid-eighties was rapid diffusion in the control of information technology,
information technology tools and all aspects of document creation. In my own organization, even before the
introduction of PCs, a handful of daring economists and other operational
officers requested word processors of their own and offered to make sacrifices
elsewhere in their budgets to pay for the equipment. This quickly caught on.
Whereas, in 1977, on average there was one terminal (pre-PC) for every
100 staff members, by 1980 it had changed to 1:25, and by 1983 the ratio had
become 1:4. By 1987, the ratio was roughly 1:1, where it stabilized until the generation of PCs that followed the
IBM AT created a backlog of obsolescent PCs with little street value, and the organization began to redeploy the older technology for use in the homes of staff members, largely to give them after-hours access to email.
These
events of the 1980s noted above mirror what has happened in many other North
American organizations, with variances in the timing of full
decentralization. It is what has
happened or is happening now in other countries. What it meant was that support staff -- the original gatekeepers
of office filing -- were increasingly dealt out of the document creation phase
and the maintenance of local and organizational files, except for what was
given to them in hard copy by their supervisors. As document creators were often analysts in their professional
work, many began to make use of spreadsheet systems wherein they could define
formulas for specified cells, fill in the data cells and let the program work
out the arithmetic. It was easier than
some of the more complicated earlier mainframe based systems, and was
immediately available on one’s own PC.
Like WP, it had the great merit that it was easy to change the data and
update the tables. Where reports involved integrating statistical tables into document texts, secretaries typically remained involved in document creation, performing the task with the more traditional, literal “cut-and-paste” operations of earlier years. Following the advent of Microsoft Windows, however, the invention
of systems with “object linking and embedding” (OLE) made it possible for
authors to import tables directly into their text documents and more recently
to create them without appearing to even leave the document. This could be done nicely by the author with
automatic changes and pagination -- without anyone else’s help. The 80s was the
decade in which the author became master of his or her creations, making it
also the decade in which the quality and completeness of records and
recordkeeping systems began to slip.
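The cell-and-formula model described above can be illustrated with a minimal sketch (written here in Python purely as a modern illustration; the cell names and figures are invented, not drawn from the paper):

```python
# Minimal sketch of the spreadsheet model described above: named cells
# hold either data values or formulas over other cells, and the program
# "works out the arithmetic" on demand.

def evaluate(cells, name):
    """Resolve a cell to a number, recursively evaluating formula cells."""
    value = cells[name]
    if callable(value):                      # a formula cell
        return value(lambda n: evaluate(cells, n))
    return value                             # a plain data cell

# Data cells plus one formula cell, as an analyst might define them.
sheet = {
    "loans_1978": 120.0,
    "loans_1979": 150.0,
    "growth": lambda get: (get("loans_1979") - get("loans_1978"))
                          / get("loans_1978"),
}

print(evaluate(sheet, "growth"))   # 0.25
```

Changing a data cell and re-evaluating reproduces the property the author notes: it is easy to change the data and update the tables.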
Individual, Workgroup and Organizational Productivity
As
noted earlier, in a broader way, it was a decade marked by tools for improving
individual productivity -- word processors and “dumb terminals” that gave way to “smart personal computers”,
word processing systems, spreadsheet systems, individual organizers, etc. But it was also the beginning of a shift
toward concerns about workgroup productivity with the introduction of email (mostly
mainframe based systems) and local area networks where the focus was on
providing technological infrastructure for information sharing at the level of
a small organizational unit. This
experience and technology provided the operational and research base from which
enterprise-wide networks and wide area networks of the 90s emerged -- the
essential underpinning for enterprise electronic document/records management
systems and workflow systems and what is emerging as the period of
organizational productivity tools.
Business Systems Analysis and Information Management
Projects
The
80s marked some other sea changes in orientation that would become much more
pronounced in the 1990s in the form of increasing use of business systems
analysis and information engineering tools.
Business systems analysis (BSA) involves identifying broad
organizational goals and supporting business areas and processes, business
process definition and decomposition, and the development of improved processes
and information architectures. (A process is “a set of activities that, taken together, produce a result of value to a customer -- developing a new product, for example.”[46]) It helps
to rationally link all these things and to drive systems development of
supporting information technology architectures.
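As a rough sketch of the hierarchy such decomposition produces (the business area, process and activity names below are hypothetical, not drawn from any actual BSA exercise):

```python
# Sketch: a business area decomposed into processes and activities --
# the kind of hierarchy a business systems analysis (BSA) exercise yields.

business_area = {
    "Lending": {
        "Appraise project": ["Review proposal", "Conduct field mission"],
        "Negotiate loan":   ["Draft agreement", "Obtain board approval"],
    }
}

def leaf_activities(area):
    """Flatten the hierarchy into (process, activity) pairs."""
    return [(process, activity)
            for processes in area.values()
            for process, activities in processes.items()
            for activity in activities]

for process, activity in leaf_activities(business_area):
    print(f"{process} -> {activity}")
```

The flattened pairs are the level at which supporting information architectures, and eventually record series, can be linked back to the processes they serve.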
Since we hear a good deal these days, including from
this author, about the desirability of linking records to business processes
(BPs) and conducting appraisal at the BP level, it is important to recognize
the complexity and commitment involved in undertaking such a program. It is not something to be undertaken lightly
or without senior level air cover. It
typically involves project sponsorship by the senior executive who is the
common manager for all managers of offices that are stakeholders in the
business areas to be considered, the commitment of several managers and senior
professionals on the project task force, some on a full-time basis for several
months, and in-depth interviews of all senior managers and other staff key to
the BPs in question. During the 80s and early 90s, the author had the good fortune to lead or be otherwise involved in three BSA projects at various organizational levels. The
methodologies used were precursors to the business process reengineering (BPR)
methodologies and computer-based tools widely used today.[47]
Information
management skills come in at the end-game of these exercises using various
information engineering[48]
and data administration tools to establish how information resources can be
organized in such a manner as to promote optimal information sharing and
usage. It involves such things as
designing and implementing enterprise information directories that rationalize
and make it easy for users to discover, access and use divergent multi-media
information stores. The design of a corporate filing scheme is one of the oldest forms of information management. In a subsequent case, the
author managed a project that used the business process model created from the
above exercises and linked record series to those processes. It demonstrated that records could be
organized according to BP and to a related provenance database. More recently, the Dutch Government’s “Revolution in Records” project sponsored research, led by Tora Bikson, which concluded that electronic information is operationally and strategically important to government agencies but lacks policy definition. It offered suggestions for filling that void. Furthermore, it concluded that
the development of tools to provide for technological support for records
management in diverse mission environments should be undertaken on a priority
basis.[49] It also stated:
Finally,
context-relevant constructs and methods for the management of electronic
records and archives need to be developed.
Reliance on paper-based procedures, plus the assumption that electronic
records material has a print equivalent that can be managed according to
traditional rules has probably delayed progress toward the articulation of new
approaches that better suit today’s interactive information environment. [50]
Lessons Drawn
BSA offers excellent approaches
to business modeling that, in turn, can provide an excellent basis for
developing information architectures and information directories (including paper
and electronic records holdings). However, it is a very complex methodology
that involves considerable investment of senior management commitment and time
and is not well understood by many people, including most ARM practitioners. As valuable as BSA can be, it also has its
shortcomings that are seldom mentioned with the positives. It is a complex approach which might not be
easily carried out by many smaller organizations with limited human and
financial resources. BPR specialists
can be extremely expensive (several thousand dollars per day), are often
dogmatic and wedded to their own approach and often take considerable time to
orient to the local context. (To be
sure, part of the value of BPR consultants is that it is the local context that
too often gets in the way of identifying improved ways of doing things.) The BPR approach also very often gives only
cosmetic attention to the human factor; e.g., it does not ordinarily make use
of social models of the workplace to accommodate both process improvement and
human adaptation, and it often relies upon identifying ‘better’ ways of
carrying out business processes through BPR teams that consist in part of
people who are afraid to come forward with suggestions that they feel may place
their own jobs at risk.
I believe that the admittedly
attractive and desirable notion of records appraisal at the business process
level is better understood in theory than it is in terms of practical
application. In my discussions with ARM practitioners around the world, I
notice what appears to be a lack of appreciation of the complexities of
implementing the theory, and a certain polarization of belief systems regarding
electronic document/records management.
The debate is sometimes framed as a choice between top-down and bottom-up approaches.
Hopefully the NHPRC funded Pittsburgh project and the Dutch Revolution in
Records project will shed a good deal more light on this subject.
The top-down or business process
(BP) approach is broadly as outlined above.
The bottom-up or docu-centric
approach (e.g., as suggested in some of the diplomatics literature) begins at
the other end of the spectrum with the document. Simply stated, this approach says that every document, especially
every record, is associated with a business transaction and has a document
profile which contains essential contextual data about the document and is
managed as part of its parent record series.
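A document profile of the kind this approach posits might be sketched as follows (the field names and values are illustrative only, not a published standard):

```python
# Sketch of a docu-centric "document profile": contextual metadata
# attached to each document and managed with its parent record series.

from dataclasses import dataclass

@dataclass
class DocumentProfile:
    doc_id: str
    title: str
    author: str
    date_created: str
    business_transaction: str   # the transaction that produced the document
    record_series: str          # parent series under which it is managed

profile = DocumentProfile(
    doc_id="1995-0042",
    title="Appraisal report, Project X",
    author="R. Barry",
    date_created="1995-06-16",
    business_transaction="Project appraisal",
    record_series="Operational project files",
)

print(profile.record_series)
```

Because the profile carries both the originating transaction and the parent series, the document stays appraisable and retrievable in context even when it leaves the system that created it.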
Somewhere in between, we have computer “application systems” that support a specific business process or sub-process, especially where the application carries out specific, usually repetitive, tasks. They are usually
highly structured, transaction based systems. They are especially common in the
finance business area, such as the payroll, cash management and accounts
receivable applications systems. As
such they operate above the record series level but below the major business
process level. Most such applications
systems are amenable to system-level appraisal and automated disposition
management. This is not the case for many business processes, since they may involve both structured and highly unstructured documents from sources that are not easily disposition-managed, such as email systems, most of which are not oriented toward any specific business process or application. Moreover, even when operating at the BP
level, it is necessary somehow to populate a BP with all of the relevant
documents or composite objects as discussed earlier, which means that documents
must be identified as to what BP they support and be marked accordingly.
Thus,
while stressing one or the other
approach may seem to make good sense in theory, in the practical world of
systems development and implementation, it is likely that some combination of
both approaches will be necessary, and it will vary from case to case, BP to
BP. I doubt that theorists in
electronic records management see themselves as at opposite ends of the
spectrum. Nonetheless, I find that many
ARM practitioners are interpreting the literature to mean that one or the other
of these approaches must survive over the other. Just as linking records to BP or provenance are not mutually
exclusive, the choices among BP-level, applications-systems-level and
document-level records management are
neither universally applicable nor mutually exclusive. A benefit of computer-based solutions is
that you can have it both ways.
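The point that you can have it both ways can be made concrete with a small sketch: if each document carries both a business-process link and a record-series link, either view can be derived from the same store (all identifiers below are invented):

```python
# Sketch: each document carries both a business-process (BP) link and a
# record-series link, so one store supports the top-down (by BP) and
# bottom-up (by series) views at once -- "both ways".

from collections import defaultdict

documents = [
    {"id": "D1", "bp": "Loan negotiation", "series": "Legal agreements"},
    {"id": "D2", "bp": "Loan negotiation", "series": "Correspondence"},
    {"id": "D3", "bp": "Project appraisal", "series": "Correspondence"},
]

def index_by(docs, key):
    """Build a view of document ids grouped by the given link."""
    view = defaultdict(list)
    for d in docs:
        view[d[key]].append(d["id"])
    return dict(view)

by_bp = index_by(documents, "bp")          # top-down view
by_series = index_by(documents, "series")  # bottom-up view
print(by_bp["Loan negotiation"])           # ['D1', 'D2']
```

Neither link need win out over the other: appraisal rules can be attached at the BP level while individual profiles are still managed within their series.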
These
experiences suggest to me that, since the use of technology is largely driven
by changes in business processes and work patterns, archivists need to give
increasing attention to business systems planning and analysis and end-user
work habits. As part of our involvement
in such activities, we might also do a little “image enhancement” of our own --
to change the image of the archivist from one purely as the “keeper” of records
to one also as the “purveyor” of documents, as noted in my opening comments on
the definition of records. The purveyor of documents can also be and be seen to be a keeper of documents;
but the reverse is not always as easily seen to be the case. This may be more a
question of perception than of reality. To her great credit, the national archivist of the U. K., though known as the Keeper of Public Records, is also very much a purveyor of information. In these hard times, we have to address the perceptions of the profession as well as its realities. Perhaps we should reinstate the idea of the
“Remembrancer”, established in
sixteenth century London to help preserve the city’s institutional memory.[51] This point is likely to be dismissed as unimportant if the evidentiary value of information is stressed to the point that records creators think ARM professionals are interested only in the evidentiary (which most users will read as ‘legal’) aspects of records and
thus are largely irrelevant to their own interests. I understand the logic and importance of stressing evidence[52],
but caution that it not be overdone and that, as noted earlier, we also ally
archives with the use of records for purposes of operational continuity,
information management and institutional memory.
I
have also heard it said, in advancement of the focus on evidence, that nobody
really cares any more about institutional memory, other than historians. In my consulting work, I have not found this
to be the case. Much of my practice
involves dealing with executives and it is not uncommon that they raise
concerns about erosion in their own organizations’ institutional memory, and
without prompting on my part. I predict
that this concern will grow as we see more downsizing activity, and more use of
consultants in place of regular staff.
To illustrate, one financial institution with which I had a recent engagement to examine its information management and records management programs had, in the course of approximately one year, replaced its CEO and seen the positions of general counsel and financial vice president vacated. Thus, in a very short period, it had lost a
substantial portion of the senior, experienced knowledge of corporate
operations. Upon learning about this, I
made inquiries to the personnel office and discovered that 40% of its
investment officers (the mid-level front line deal makers for investment
banking institutions) had been in the organization for less than three
years. Remaining executives and investment officers readily volunteered, in the course of interviews, the importance they placed on providing some document/records system to help them rebuild the institutional memory of the organization. This was
an organization that was at the same time contemplating budget and space
cutbacks for the records management function.
They are now reevaluating the potential of the ARM program for maintaining institutional memory, and the priorities they assign to that program.
The Turn of
the Decade of the 90s
Little
attention was paid to email and facsimile by archivists and records managers at
the beginning for a number of reasons.
Firstly, low budgets and limited organizational clout resulted in ARM programs being denied PCs, fax machines and email accounts. Secondly, ARM professionals were slow to
appreciate the potential for using computer-based systems for managing paper
and microform records, slow to make their own needs known to IM&T managers,
and slow to gain senior management understanding of the role of ARM in carrying
out core business processes throughout their organizations. For their part, IM&T managers did little
to try to understand the related systems requirement or to represent those
needs to the system development community and thus the market was largely
ignored. There were exceptions, of course, such as MINISIS and MICRO-ISIS, but
they were few and largely home grown rather than constituting an important
market segment where most of the broad-based systems development work takes
place. Thirdly, the view was strongly held by many that preservation is a major issue for records that use something other than a paper or microform storage medium, and this continues to
be a concern of many professionals. In
addition, it was widely believed that most of the traffic going over email
systems was junk mail, mainly in the form of lunch date exchanges and
substitutions for “telephone tag”; and junk mail was not worthy of serious
records management attention.[53]
Integrating IM&T and ARM, Document Management and
Records Management
In
1986, I became chief of a central division responsible for office systems services in an organization of about 8,000 employees. This included information technology standards, central
operations for mini-computer systems, and support services for distributed PCs,
printers, local area networks, email, word processing, spreadsheets, etc. --
most information technology except mainframes, telephone systems and long line
communications. It was a year of mainly
infrastructure building. In the same
year, the archivist and records management division were moved from the
administrative department to the central department responsible for information
management and technology, as a sister division to my own. A year later, as part of a major corporate
restructuring, downsizing and consolidation of management portfolios, most of
the office technology support services and staff were decentralized throughout
the user community, standards concerns were shifted to a policy group and my
division was eliminated. I became chief of information services, which picked
up the functions of the erstwhile records management division.
I
was shocked to discover the paucity of information technology tools assigned
within the ARM group and the generally lower educational levels of the
staff. Whereas, using an oversimplified
measure, the ratio of PCs to staff was nearly 1:1 elsewhere in the
organization, it was about 1:25 in the ARM group. I came to the early recognition that I had entirely ignored this
function in my previous position, to its considerable detriment. I also realized how little I or anyone else
in the IM&T organization understood about imaging or character-based technologies
that, with appropriate network facilities, would form the underpinning of
document management systems. Like
myself, virtually all of my colleagues had been raised on mainframe
applications. Many in the IM&T
profession, by no means limited to my organization, were just beginning to
accept the notion that maybe PCs were here to stay.
We
did several things to try to lift ourselves up by the bootstraps and to gain
first hand experience in the ARM group with text-based systems that would
prepare us to deal more effectively with electronic records issues and
opportunities, including:
Lessons Drawn
It
takes a multi-pronged attack to make up for lost ground in the ARM group:
obtaining its own IM&T human and technological
resources that could be trained in and become totally dedicated to ARM concerns
and functions; a number of national and
state or regional archives organizations are now doing this, most recently the
U. K. Public Record Office.[55]
cross training the ARM and IM&T staff; and
beginning to get training and hands-on experience with
the application of the technology to ARM functions and a more mature
understanding of ARM functional requirements.
These
experiences resulted in my involvement in a leadership role in a major
inter-disciplinary study of electronic records by some 30 organizations of the
United Nations. Because of my interests
in both ARM and IM&T, I was seen as someone who might be least offensive to
either group (or at least equally suspect by both). It was a rich learning
experience and an opportunity to work with many people in these fields
throughout the UN community and with excellent consultants such as David Bearman,
Tora Bikson and Charles Dollar and ARM practitioners such as Gertrude Long and
Alf Erlandsson and many other members of the Technical Panel. For many of the
ARM participants, it was their first experience with PCs and email, their first
personal encounter with an electronic record.
For many of the IM&T participants, it was their first conscious
engagement with a record of any kind.
The result of our effort was the publication of the UN report on Managing Electronic Records: Issues and Guidelines,[56] and additional reports of the follow-on
group.
Early 1990s
EDMS Projects
Though
I cannot escape thinking about them in drawing lessons and thinking about
things we can do, I will not go into detail on more recent projects, some of
which are reported elsewhere[57]
and others of which are still underway.
These include: evaluation of a prototype workplace that integrated
office building design[58],
office landscaping and furniture, information management and technology, and
records management; working with interdisciplinary IM&T/ARM teams in the
development of functional requirements for enterprise electronic document and
records management systems[59];
designing conferences to bring senior executives, IM&T and ARM managers
together to confront emerging electronic document and records policies, issues
and opportunities; working on an approach for transferring personal electronic
mail files to archives; and helping establish an IM&T function in a
national archives organization[60].
The
central lesson for me in all of this has been to understand the essential
importance of joining forces between ARM and IM&T professionals, and of
course the user communities managing the organizational business processes that
produce the records. It is my
conviction that none of these groups can resolve electronic records issues by
themselves. I learned the hard way,
however, that achieving this alliance is something more easily said than
done. I have found that what seems to work best where an organization is just beginning its efforts to increase collaboration between ARM and IM&T groups is a non-threatening workshop in which representatives from both communities participate together.
Archivists
and records managers, reading about litigation over the U. S. Presidential
papers in the famous White House email trials in the last days of the Bush
Administration also saw living examples of what some had feared for many years
-- that email could constitute documents of considerable evidentiary or
historical value. The lack of a well-considered National Archives and Records Administration policy on electronic mail, and the politicization of the National Archives in bending to pressure from the Bush Administration, led to
national headlines that read: “Neglect
at the Archives,”[61] and a black eye for the ARM profession.
Where is IM&T Heading?
Without
going into a great deal on emerging information technology, it may be
worthwhile to mention a few defining technologies. Of particular interest to ARM professionals are developments in
personal scanners, PC chips and server chips.
Full-page
personal image scanners have become common on the market for as little as about
$1000 (half that if you are willing to forego color), making it possible to
improve the readability of documents through the incorporation of graphics,
pictures, newspaper clippings, etc., and therefore their attractiveness and
likelihood of being read. For the most
part, the use of imaging technology has been largely limited to centrally managed,
fairly specialized, organizational applications. The use of personal scanners, because of cost, also has been
typically limited to such applications as newsletters, even though small,
hand-held, partial-page scanners under $300 have been on the market for
years. With the increasing availability
of full-page scanners that are relatively inexpensive, organizations will be
more likely to provide such resources, at first on a shared basis much as they
did initially with personal computers, to units throughout the organization.
The recent advent of machines that package copying, facsimile and image
scanning capabilities into the same box will further advance the move toward
decentralized imaging services. This
will also remove one of the main natural barriers against moving toward a
less-paperful, if not paperless, office.
In the past, parallel paper systems were necessary even when electronic
systems were embraced for maintaining externally generated documents. If the price is right, there will be less reason to maintain parallel paper systems, except for legacy records systems, and less reason to maintain paper copies of current documents except for short-term convenience purposes.
The
introduction of desktop systems using the new Pentium, 586 and (in April) DX4
100 megahertz desktop computer chip brings a level of computing power not
dreamt of by most people five years ago.
At the same time, delivery commenced on 6-7 pound notebook computers
that package this chip and deliver 75 megahertz (three to four times the speed
of what was generally available just a few years ago) and with 500 megabytes (one-half gigabyte, or several file-cabinet equivalents) of storage. Not long after the ink was dry on the first draft of this paper, Intel announced a 90 MHz Pentium chip for notebooks, and IBM announced a Pentium overdrive and an 810 MB removable hard disk for its 755 Thinkpad™ Notebook series, as well as a CD-ROM drive that can be swapped into the floppy disk drive of the same machine. Other manufacturers have followed and will follow in kind. This puts extraordinary computing power in
the hands of people on airplanes, in hotels, and at home -- of such a magnitude
as to make it practical to seriously consider the use of a single computer for
office, home and travel use. It also
fuels the engine of location independent information processing that is needed
to permit location independent work.
Newer P6 chips are of particular interest to ARM professionals. They are not for the end-user PC, but rather
for much more powerful servers that will provide enormously efficient task
sharing in client server architectures, a sine
qua non for advanced electronic
document and records management systems.
When ARM professionals see enterprise networks implemented in their
organizations, as distinct from independent LANs, they will know that the
planning time for electronic document and records systems has just become zero.
In the past, my work has
involved attempting to project future workplace scenarios, projecting 10 years
ahead for purposes of early identification of potential workplace changes, and
25 years ahead for purposes of office building strategies. That was difficult enough to do for an
organization with which I was very familiar.
I will not, in this paper, attempt anything as ambitious for a
comprehensive view of workplace futures more generally, although elements of
those futures may be drawn from many of the lessons that are reported in this
paper and the suggested things that we might consider doing in the paragraphs
below. However, I will associate myself
with some of the projections made by one of the leaders in the field of
information management, Paul Strassmann. Ten years ago -- about the same time I was writing an internal paper for my own organization on “A Scenario for the Workplace in 1995” -- Strassmann predicted, in an excellent chapter “The
Paperless Office” (that I cannot do proper justice to in a short quote), his
view of what the beginning of the Third Millennium will be like.
There will be a lot of paper in use in
the year 2000. There will be more of
it, per capita, than at present because there will be so many more originals
from which copies can be made. The
information workforce will be more than twice the present size...The quality of
electronic printing -- incorporating
color, graphic designs, and pictures -- will make this means of communication
attractive to use. The “intelligence” of printing and composing machines will be of a sufficiently high order to cope with the enormous variety of electronic forms in which originals will be represented. All of this assumes that the present
sociopolitical hurdles preventing the exchange of electronically communicated
text will be resolved through international standards...we should expect to see
the same progress...which now permits home-to-home dialing around the globe.
Paper will not be used for archival
storage of routine business records.
Optical recording... provides a much better means for the filing of
information. Paper will be used for
reading, due to its greater human
compatibility....VDUs will not replace reading. They will deal with the logic of information search, with
composition of text, and with terse, highly structured messages....The
“paperless office” will not be one of the outcomes of office automation. Large amounts of paper will continue to be
used, even though paper’s archival role will diminish.[62]
Had the likely impact of
multimedia, including the inability to fully represent multimedia documents in
paper form, been as obvious 10 years ago as it is today, Strassmann might have
altered some of his predictions. I take
this not as an indictment of Strassmann’s forecasting skills, which I salute,
but more as simply an example of how much IT changes in a decade and how risky
forecasting really is. In a project in
which I was involved in 1990, helping to integrate technology and human factors
into the design of a large office building, I concluded that it was hopeless to
try to crystal-ball what information technology was going to be prevalent
during the projected 50 year life of that building. In the end, my recommendation to the architectural design project
team was that the building specifically not
be designed to a specific technology, but rather that it be designed as an adaptive building -- design it in a
manner such that, even if it adds a bit more to the ‘first costs’ of construction, it will be less costly to alter in the future and will probably more than recapture any incremental first costs.
As it turned out, because this kind of debate took place before the
architects did their construction drawings, the additional costs for making the
building more adaptable were negligible.
Interestingly, the archivist faces much the same problem, trying to
figure out what it will be like 50 years hence in terms of information technology. Again, my only advice would be to assume
that there will be many changes in technology over that period. We had better design our information
architectures, enterprise networks and electronic document and records
management systems to recognize and facilitate future change, e.g., through the
use of such strategies as open systems architectures, object oriented systems,
portable document formats, application-independent multimedia data bases, etc.
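The design strategies listed above can be illustrated with a small sketch. The code below is a minimal, hypothetical example (the record identifiers, field names and `ArchivalRecord` class are my own invention, not drawn from any actual system): a record carries its content in a declared, portable format alongside descriptive metadata and a content digest, so that a future system can interpret and verify it without depending on the application that created it.

```python
import hashlib
import json
from dataclasses import dataclass, field

# A minimal sketch of application-independent record storage. All names here
# are illustrative assumptions, not an actual archival system's schema.
@dataclass
class ArchivalRecord:
    record_id: str
    content: bytes          # the document itself, kept in a portable format
    media_type: str         # a declared format identifier, e.g. "text/plain"
    metadata: dict = field(default_factory=dict)

    def manifest(self) -> dict:
        """Self-describing, application-independent view of the record."""
        return {
            "record_id": self.record_id,
            "media_type": self.media_type,
            # A digest lets a future custodian verify the content is unchanged.
            "content_digest": hashlib.sha256(self.content).hexdigest(),
            "metadata": self.metadata,
        }

rec = ArchivalRecord(
    record_id="1995-0042",
    content=b"Minutes of the June meeting.",
    media_type="text/plain",
    metadata={"creator": "Records Unit", "created": "1995-06-16"},
)
print(json.dumps(rec.manifest(), indent=2))
```

The point of the sketch is the separation of concerns: the manifest is plain data that any later generation of software can read, which is the property that open formats and application-independent databases are meant to preserve.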
VII. Conclusions
Some
of the main conclusions emerging from this collection of personal experiences
and other observations are that:
In
thinking about approaches to electronic records management, especially at the
national levels, we should be careful to take account of differences in
national heritage and culture and not simply be swept up by what is regarded
as the way to success somewhere else. Made-in-America solutions should not be
rejected out of hand because they might not have all been invented here any
more than made-in-Canada solutions should be rejected in the U. S.; neither
should other people’s solutions be adopted here if they cannot be effectively
made to be Canadian. The variances in the traditional European,
U. S. and Canadian approaches to appraisal and the Canadian ‘total archives’
approach as contrasted with the ‘public archives’ approach of most other
countries are illustrative of this point.[64] The use of IT is even more subject to
cultural and human factors. For example, as was noted in a recent Toronto Globe & Mail article:
“The second annual Gallup survey
indicates that 69.9% of Canadians have heard about the info-highway but 61.8%
fear it represents a threat to Canada's cultural identity and say they want the
federal government to assume responsibility for protecting that identity. 3.5-million Canadians, or 11.9% of the
population, have used the Internet.”[65]
It
would be a mistake not to realize that ARM functions themselves are very
vulnerable in the current budgetary climate, and that they could become the
subject of outsourcing. Will the
private sector want archivists it doesn’t already have for other reasons? If it does, will archivists find themselves
in the untenable position of trying to preserve records of continuing public
value when doing so might jeopardize their jobs?
It
might be comforting to conclude that to prepare for the next millennium,
archivists must simply become conversant with modern information technology --
comforting but unwise. That will be a
necessary starting point. However, the
more complex demands will be to properly understand the objects of automation
through business systems analysis.
VIII. What
Can We Do?
Having the business insights,
communications skills and courage to become aware of and alert their senior
managers to proposals for organizational, technological or procedural changes
in the name of business process reengineering or innovation that have important
business continuity, evidentiary or social dimensions will be an even more
important aspect of being an archivist than has been the case in the past. Becoming skilled in business systems analysis
and business process innovation, and in evaluating the implications of change
on work patterns and social issues, and being able to articulate related
archives and records management considerations will make it more
likely that archivists will be invited to the table when such discussions take
place, thus enabling them to ensure that the important values of archives and
records management that are worthy of keeping are indeed kept. Some of the actions suggested below have
already been implemented in some organizations. More particularly we should consider:
As Professional Associations:
As National Archives Organizations:
As Operating Organizations:
As ARM Business Units:
As Designers and Teachers of Archival
Study Programs:
As Individuals:
Summary
If the distinguishing feature of
information management and technology in the decade of the 80s was chiefly one
of innovating and exploiting individual productivity improvements through the
use of technology mainly in the form of individual utility tools, the
distinguishing feature of the first half of the 90s has been innovations aimed
at work-group productivity: e.g.,
email, group authoring tools and “roomware”.
Imagine for the second half of this decade and the first decade of the
new millennium:
[1] Rick Barry is an author who consults and carries out workshops on information management and technology and electronic records management issues, strategies and policies. He has been a keynote speaker at numerous conferences on electronic records internationally; email: rickbarry [at] aol [dot] com
[2] For an elaboration of this subject, see the videotape: Electronic Records in the New Millennium: Managing Documents for Business and Government, University College London, 1995, written and directed by R. E. Barry, the accompanying teaching and discussion guide, “Managing Documents for Business and Government” by R. E. Barry and Anne Thurston and paper: “Electronic Objects circa 2001: Problems or Opportunities?...Yes”, by R. E. Barry.
[3] The term “ARM” is used in this paper as a generic abbreviation for organizations and programs carrying out life cycle administration of records.
[4] The term “IM&T” is used in this paper as a generic abbreviation for organizations and programs carrying out information management and information technology tasks and services. Information management (IM) refers to the exercise of intellectual control over corporate information assets to aid in their easy discovery and use, usually carried out by people using information engineering and possibly business systems analysis tools. Information technology (IT) refers to the hardware, software and communications infrastructure, standards and human technical support necessary for the effective use of information.
[5] Interval Research Corporation in Palo Alto, CA, (one of the co-founders of which is also, with Bill Gates, a co-founder of Microsoft Corporation) is one of the most advanced information science research centers in the world. The disciplinary mix of the group is very interesting, combining information scientists with physicists with an array of other disciplines. One person in the group comes from the academic field of drama, and spent much of her prior professional career in designing the ‘look’ of children’s electronic games.
[6] Dictionary of Archival Terminology, ICA Handbook Series Volume 3, Munich, K.G. Saur, 1984, as cited in Management of Electronic Records: Issues and Guidelines, a report of the Advisory Committee for the Co-ordination of Information Systems (ACCIS) Technical Panel on Electronic Records, Chaired by R. E. Barry; United Nations Sales No. GV.E.89.0.15, New York and Geneva, 1990. Whenever the term “business” is used, unless specified differently, its meaning is intended to be inclusive of both the private and public sector organizations.
[7] Dictionary of Archival Terminology, ICA Handbook Series Volume 3, Munich, K.G. Saur, 1994, as cited in Management of Electronic Records: Issues and Guidelines, Advisory Committee for the Co-ordination of Information Systems (ACCIS), United Nations Sales No. GV.E.89.0.15, New York and Geneva, 1990. Whenever the term “business” is used, unless specified differently, its meaning is intended to be inclusive of both the private and public sector organizations.
[8] Keeping Archives, Judith Ellis, Ed., Thorpe, a part of Reed Reference Publishing, Port Melbourne, Australia, 1993, p.477.
[9] The argument I am advancing here is for promoting the use of the information that is maintained in ARM systems. It is separate and independent from the issue of whether documents should be appraised on the basis of the context of documentation as it relates to the actions it reflects versus on the potential future uses of records by historians or others based on the current (possibly short lived) views of the user community. For an excellent discussion of this latter issue, see Jean-Pierre Wallot’s “Free Trade in Archival Ideas: The Canadian Perspective on North American Archival Development” in American Archivist, Vol. 57, Spring 1994, pp. 380-399.
[10] For a further explanation of the distinctions among these properties see Luciana Duranti’s six-part series on “Diplomatics: New Uses for an Old Science”, in Archivaria, (Numbers 28, Summer 1989 through 33, Winter 1991-92) and her unpublished paper, “Authenticity and Reliability: The Concepts and Their Applications”, presented on September 8, 1994 at the Society of American Archivists 58th Annual Meeting in Indianapolis, Indiana.
[11] Duranti, Luciana, “Diplomatics: New Uses for an Old Science”, Archivaria, 28 (Summer 1989), p.16, and subsequent paper by the same title (Part V), Archivaria 32 (Summer 1991), pp. 6-7.
[12] See references to articles by Stuart Haber and Scott Stornetta and by Cipra below.
[13] Lacey, Julia S. How to Survive Your Computer Workstation, CRT Services, Inc., Kerrville, Texas, 1-800 256-4379.
[14]See the videotape Organization overviews and Role Management: Inspiration for future desktop environments, by Catherine Plaisant and Ben Shneiderman; produced by the Human Computer Interaction Laboratory (HCIL), Center for Automation Research, University of Maryland, (301) 405-2768, 1993. A video summary appears in ACM CHI ‘95 Companion, (Denver, Colorado, May 7-11, 1995), pp. 419-420. CHI 95 Technical Program Video available through ACM. In paper form, see the paper by the same authors and title in Proceedings of the 4th Workshop on Enabling Technologies: Infrastructure for Collaborative Enterprises, April, 1995, CAR-TR-771, CS-TR-4373.
[15] Stoll, Clifford, Silicon Snake Oil: Second Thoughts on the Information Highway, Doubleday, N.Y., 1995, pp. 46-59.
[16] Ong, Walter J., S. J., “Knowledge in Time” in Knowledge and the Future of Man, Walter J. Ong, S. J., Ed., Holt, Rinehart and Winston, New York, 1968, pp. 11-13.
[17] Ong, Walter J., Orality & Literacy: The Technologizing of the Word, Routledge Publishers, London and N. Y., 1993, p. 82.
[18] Clanchy, M. T., From Memory to the Written Word, Blackwell Publishers, Oxford, U. K. and Cambridge, U. S. A., 1993; see especially, “The Technology of Writing” pp. 114-144.
[19] One of the author’ clients has a rich collection of analog tapes of radio conversations with missions in the field during disaster operations. It is more than the spoken words of people reporting from the field that give evidence to their actions. Other ambient sounds provide a record of the events in ways that could not nearly as faithfully be represented by second-hand human accounts.
[20] Courtesy of Charles Dollar, University of British Columbia.
[21] Lowry, Martin, The World of Aldus Manutius: Business and Scholarship in Renaissance Venice, Ithaca, N.Y., Cornell University Press, 1979, pp. 29-32.
[22] Ong, Walter J., Ibid., p. 80.
[23] See two recent EDUPAGE clippings: Newspapers Face Stiff Competition In Online Classifieds: The newspaper industry relies heavily on the $12.5 billion generated through its classified ads last year -- and is finding itself challenged by online upstarts such as Electric Classifieds, Inc. which offers a classified service on the Web. Unlike traditional publishing companies, which have millions invested in physical plant, fleets of trucks, and tons of newsprint, electronic publishers can set up shop for next to nothing. To combat this growing gang of competing Davids, the newspaper Goliaths are launching their own online efforts, but they may be overlooking the obvious, according to the editor of an electronic journal on online media: "Newspapers have a tremendous advantage, if they don't blow it, and that's the infrastructure to take the ads, run the ads and bill for the ads." (Forbes 7/3/95 p.80); and: SCIENTISTS LEAD THE WAY IN ONLINE PUBLISHING: Scientists who used to rely on print journals for research sharing and peer review increasingly are turning to the Net, and the $4-billion technical publications industry is worried. The venerable New England Journal of Medicine is sticking to its guns -- an editorial to be published June 22 says it plans to "apply the same rules to Internet that apply to publishing anywhere else." In other words, if the article's appeared on the Internet, it won't be considered for publication. But other journals are looking at the numbers and deciding they can't afford to be left out. "Costs are up, postage is up, and ad revenues are down," says the American Medical Association's president for publishing and multimedia. "You can't grow enough new revenue sources. We've got to look at electronics as the future." Some scientists worry that bypassing the rigorous vetting process used by the journals will result in "low credibility, instant regurgitation."
But others contend the peer review process enabled by electronic publishing can be just as thorough, and far more efficient. "We've only begun to scratch the surface of how much more effectively we can communicate," says the editor of Science. (Business Week 6/26/95 p.44). Source: EDUPAGE 6/22/95. EDUPAGE is a free electronic news clipping service covering the IM&T field. To subscribe, send a message to: listproc@educom.edu and in the body of the message type: subscribe edupage and your name.
[24] Adleman, L. M., Science, Vol. 266, 11 Nov. 1994, p. 1021.
[25] Baum, Eric, “Building an Associative Memory Vastly Larger than the Brain,” Science, Vol. 268, 28 April 1995, p. 583.
[26] This technology is not likely to be suitable for quick turnaround information queries, but could be very useful for complex research searches over massive information stores where it will not be crucial if it takes a few days to receive the search results. It has the potential of dealing with large and complex data searches slowly that would not likely be possible to perform today in any length of time.
[27] Considerable research and development work related to this subject is being carried out under the Text REtrieval Conference (TREC). See, for example, “Information Retrieval Systems for Large Document Collections”, by Alistair Moffat and Justin Zobel (University of Melbourne, Australia) in Overview of the Third Text REtrieval Conference (TREC-3), Donna K. Harman, Ed., U. S. Department of Commerce, Technology Administration, National Institute of Standards and Technology (NIST), NIST Special Publication 500-225, April 1995, pp. 85-94.
[28] The term “composite content object” or CCO is a term used by Jon Stewart, a consultant to NIST in an unpublished paper on document architecture in which the document is characterized as the integration of it component content objects (CO) of text, sound, video, etc.
[29] “Business systems analysis” (BSA) refers to the use of information engineering tools to model organizations for purposes of developing information and information technology architectures and for carrying out business process engineering with the aim of improving organizational performance against business aims. A “business process” is a set of activities that, taken together, produce a result of value to a customer or carry out all or part of a specific business aim.
[30] “PowerPoint™” is a trademark of the Microsoft Corporation.
[31] Dollar, Charles M. Archival Theory and Information Technologies, Macerata; Universita degli Studi de Macerata, 1992.
[32] Duranti, Luciana in paper to be published in the forthcoming issue of Archivaria: “The Thinking on Appraisal of Electronic Records: Its Evolution, Focuses, and Future Directions”.
[33] Haber, Stuart and W. Scott Stornetta, “How to Time-Stamp a Digital Document,” in Journal of Cryptology, Vol. 3, 1991, International Association for Cryptologic Research, pp. 99-111. For a lay explanation, see “Electronic Time-Stamping: The Notary Public Goes Digital” by Barry Cipra in Science, the journal of the American Association for the Advancement of Science, Vol. 261, 9 July 1993, pp. 162-3.
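The linking idea behind hash-based time-stamping can be conveyed in a toy sketch. The code below is a deliberately simplified illustration of the general principle, not Haber and Stornetta's actual scheme: each time-stamp certificate incorporates the digest of the previous one, so back-dating or altering any document would require recomputing every later link in the chain.

```python
import hashlib

# Toy hash-linked time-stamping (an illustrative simplification, not the
# published protocol): each certificate digest binds the previous digest,
# the document, and its timestamp together.
def link(prev_digest: str, doc: bytes, timestamp: str) -> str:
    h = hashlib.sha256()
    h.update(prev_digest.encode())
    h.update(doc)
    h.update(timestamp.encode())
    return h.hexdigest()

chain = ["0" * 64]  # arbitrary genesis digest
for ts, doc in [("1991-01-01", b"draft A"), ("1991-01-02", b"draft B")]:
    chain.append(link(chain[-1], doc, ts))

# Substituting a forged document at an earlier point yields a different
# digest, which would invalidate every certificate issued after it.
tampered = link(chain[0], b"forged draft", "1991-01-01")
print(tampered != chain[1])  # True
```

This is the property that makes such certificates evidentially useful: trust rests on the structure of the chain rather than on any single custodian's say-so.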
[34] See EDUPAGE clippings:
POST OFFICE TEAMS UP ON
CRYPTOGRAPHY: The U.S. Postal Service
is working with software firm Premenos Corp. to develop public key cryptography for verifying the identity of e-mail
senders and the integrity of electronically transmitted documents. Meanwhile, Verisign Inc. is developing a
rival system that not only checks the digital signature, but also scrambles the
content of the message. A number of companies, including Visa
International, Mitsubishi and Ameritech, have invested in Verisign's
technology, and Apple Computer and Netscape Communications have signed on as
customers. (Wall Street Journal 6/22/95
B7) Source:
EDUPAGE: 6/22/95; and
DIGITAL SIGNATURES GAIN LEGITIMACY: A law recently passed in Utah recognizes digital signatures as legally binding, and legislators in California and Washington are considering following suit. The Utah law is based on public key encryption, where companies and individuals register their public keys with a certification authority, which then uses them to decode messages created with private keys, verifying the senders' identities. Computer security companies, banks and the U.S. Postal Service all are expected to offer certification services. (Information Week 5/8/95 p.24) EDUPAGE is a free electronic news clipping service covering the IM&T field. To subscribe, send a message to: listproc@educom.edu and in the body of the message type: subscribe edupage and your name.
[35] The terms hypersound, hypervideo, etc., are simply extensions of the terms hypertext and hypermedia, the coining of which is attributed to Ted Nelson. On April 17, 1991, I visited Nelson to learn
more about the elusive Xanadu system that he had created and, with a team of
followers, had been working on its development for years. We talked about these words and his vision
of a world of Xanadu document
kiosks. As is his custom, he recorded
and kindly sent me a transcript of our conversation. On the subject of hypertext, he put it this way:
I coined the terms hypertext and hypermedia...in ‘65...Again, these terms weren’t there, but we would have non-sequential documents and many data types. A server would be holding them in some authenticatable form...And then there would be a network of these servers and the user would send for arbitrary fragments defined by these. You would send for whole documents because you might not necessarily want the whole document. ...And the whole idea -- what hypertext really becomes is simply the idea of connected data. Because it means that you can leave things and follow connections....And so a data service, a data storage, engine and service protocol that will provide -- that will allow us to cross between documents across different kinds of bridges of linkage and what we call transclusion. So the two primatives (sic.)...are linkage and transclusion...Linkage means any connection between something on the left and something on the right. So this can be a piece of text, this can be a section of data and this can be an illustration...we’ll have a registry, so you can register a new link any time you’d like....Transclusion means virtual inclusion by reference. So that your document can virtually include a piece of that, a piece of that, a piece of that and a piece of that without copying. And that has become -- that has immense powers. And so it becomes a fundamental mechanism....In other words, the document is entitled to see something that it can link to and then transclude it. That means that you now have a new principle basis for all data. Or what I like to call a landscape of data and the applications just become windows or lenses which manipulate the data and work at it. And look at it....Hypermedia is ways of looking at this. And the style of interaction which you want is simply chosen by the user.
[36] POLIKOM - management, a 10-minute videotape (available in English and German) on the use of multimedia systems in the conduct of German Government business, 1992, Institute for Applied Information Technology, GMD, Schlos Birlinghoven, Postfach 13 16, D-5205 Sankt Augustin 1, Germany, 1992. For further information contact Uta Pankoke-Babatz, email: <pankoke@gmd.dbp.de>.
[37] The author led the project and the technical work was led by Mr. Herbert Goertzel of the G. C. Dewey Company of New York.
[38] “Roomware” refers to systems that involve the integrated design of hardware, software and physical facilities. Teleconferencing rooms are rudimentary examples of roomware. More advanced examples, often generically referred to as “Arizona Rooms”, named after the university where the technology was created, include a dozen or so workstations, usually in a U shape, each of which has a low profile PC, a large screen projection capability, and software that facilitates presentation of design options, issues, questions or other agenda items and permits anonymous messages to be sent by the participants and projected in real time. The software facilitates grouping of responses, polling, etc. A manager or facilitator leads the discussion. “Electronic flip charts” are automatically created that could be used as a record of the meeting.
[39] Very recently, in response to a court challenge, it has been revealed that the Joint Chiefs do not keep records of such actions beyond their initial usage.
[40] This COSATI Panel was under
the leadership of Dr. Ruth Davis, then Director of the National Library of Medicine which
housed one of the first and most successful large text-based systems, MEDLARS.
[41] For an excellent review of some current research projects that use graphical approaches to information management, see the following video documents all contained in the video tape HCIL Open House ‘93, Catherine Plaisant, Ed., Video by John Reesch: “Dynamaps: dynamic queries on a health statistics atlas,” by Catherine Plaisant and Vinit Jain; “Hierarchical visualization with tree maps: making sense of pro basketball data,” by Dave Turo; “Tree VIZ™: file directory browsing,” by Brian Johnson; produced by HCIL, 1993 produced by the Human Computer Interaction Laboratory, Center for Automation Research, University of Maryland, (301) 405-2768, 1993.
[42] Negroponte, Nicholas, Being Digital, Alfred Knopf, N.Y., 1995,
pp. 130-131.
[43] Reported in: “Democratic Automation,” by Jonathan Schlefer in Technology Review, July 1983; "Staff Participation in Office Systems Design: Two Case Studies at the World Bank," by R. E. Barry in Office Automation, Jekyll or Hyde?, Working Women Education Fund, Cleveland OH, 1983; and in The Silicon Jungle, by David H. Rothman, Ballantine Books, N.Y., 1985, pp. 132-134.
[44] This is not an uncommon
situation today in many organizations where information technology decisions
have been decentralized to end-user units, making it extremely difficult in the
absence of insightful and strong top management to properly design enterprise document/records
management systems that are interoperable across organizational units and
different generations of technology.
[45] This is similar to what has happened 10 years later with multimedia technologies which gained first consumer application in the form of children’s games such as PAC MAN. Except for marketing functions, only in the past year or two have businesses begun to take multi-media seriously as an opportunity for improved business communications.
[46] Hammer, Michael and James
Champy, Reengineering the Corporation: A
manifesto for business revolution, Harper Business, NY, 1993, p.3.
[47] For detailed discussions of business process reengineering and suggested methodologies, see Hammer and Champy; Daniel Morris and Joel Brandon, Re-engineering Your Business, McGraw-Hill, N.Y., 1993; James Donovan, Business Re-engineering with Information Technology, PTR Prentice Hall, N.J., 1994.
[48] For a detailed discussion of
information engineering, including methodologies, see James Martin’s Information Engineering, Book 1,
Introduction, Prentice Hall, N.J., 1989, and Book II, Planning and Analysis, and Book III, Design and Construction.
For an excellent guide to the application of information engineering
tools for public sector financial management, see Information Systems Strategies for Public Financial Management by
Hywel M. Davies, Ali Hashim and Eduardo Talero, World Bank Discussion Paper #
193, 1993. For an example of the use
of one specific tool, state transition analysis, as it might be applied to the management
of electronic records, see R. E. Barry, “Electronic Document and Records
Management Systems: Toward a Methodology for Requirements Definition” in Information Management & Technology, the journal of Cimtech and UKAIIM, Herts, U.
K., Vol. 27, No. 6, November 1994, pp. 251-56.
[49] An extensive training package was subsequently developed by the Dutch Government, “Analysis of Business Processes and Records Management Requirements,” outlined in an unpublished manuscript by Jan Acterbergh, Ministry of the Interior of the Netherlands, Division for Coordination of Documentary Information, The Hague, 1994, Tel: as above; Fax: +31 70 302 7600.
[50] Reported in Preserving the Present: toward viable electronic records, by T. K. Bikson and E. J. Frinking, European-American Center for Policy Analysis (RAND), Delft, The Netherlands, Sdu Publishers, The Hague, 1993, pp. 16-17. For further information, contact: Administrative Coordination and Information Systems Department, Ministry of Home Affairs, Postbus 20011, 2500 EA Den Haag, The Netherlands, Tel: +31 70 302 7677; Fax: +31 70 3106937; Email X.400 address: <c=nl/a=400net/p=idn/o=min biza/s=biza ibi>.
[51] Cain, Piers, “Robert Smith and the Reform of the Archives of the City of London, 1580-1623”, London Journal, Vol. 13, No. 6, 1987-88, University College London.
[52] Bearman, David, Electronic Evidence: Strategies for managing records in contemporary organizations, Archives and Museum Informatics, Pittsburgh, PA, 1994, p. 2.
[53] This argument was a strong element of President Bush's unsuccessful defense against litigation to prevent destruction of White House email records in the last week of the outgoing Administration's existence.
[54] Regrettably, this turned out not to be a remedy that could be counted on as most operational people chose not to designate most messages as “official”, even when they were of considerable substance.
[55] See unpublished paper by R. E. Barry, "The Case for a Senior Information Management and Technology Position in Government Archives and Records Management Offices for Electronic Records," August 8, 1994, reported on at the annual meeting of the Society of American Archivists in Indianapolis, Indiana, September 1994.
[56] Managing Electronic Records: Issues and Guidelines, United Nations Administrative Committee for Coordination of Information Systems (ACCIS), New York and Geneva, 1990.
[57] Barry, Richard E., “Addressing Electronic Records Management in the World Bank” in Electronic Records Management Program Strategies, Margaret Hedstrom, Ed., Archives and Museum Informatics Technical Report No. 18, Pittsburgh, PA, 1993, pp. 19-29.
[58] Reported in "Real Problems, Real Solutions" by Bronwyn Fryer in PC World, September 1993, p. 35.
[59] Barry, R. E., "Electronic Document and Records Management Systems: Toward a Methodology for Requirements Definition," Information Management & Technology, the journal of Cimtech and UKAIIM, Herts, U.K., Vol. 27, No. 6, November 1994, pp. 251-56.
[60] Barry, R. E., unpublished paper, "The Case for a Senior Information Management and Technology Position/Function in Government Archives and Records Management Offices for Electronic Records," August 8, 1994 (Rev.), presented at the September 1994 meeting of the Society of American Archivists in Indianapolis, Indiana.
[61] "Neglect of the Archives," Washington Post, May 8, 1995, p. A20.
[62] Strassmann, Paul A., Information Payoff: The Transformation of Work in the Electronic Age, The Free Press, A Division of Macmillan, N.Y., 1985, pp. 176-7.
[63] See, for example, Michael Hess, "An Incrementally Extensible Document Retrieval System Based on Linguistic and Logical Principles," in the Proceedings of the 15th Annual International ACM SIGIR Conference on R & D in Information Retrieval, Copenhagen, June 1992, p. 111 and associated bibliography.
[64] See Jean-Pierre Wallot’s “Free Trade in Archival Ideas: The Canadian Perspective on North American Archival Development” in American Archivist, Vol. 57, Spring 1994, pp. 380-399.
[65] Toronto Globe & Mail, April 20, 1995, p. B1. Source: EDUPAGE (4/20/95); to subscribe, send a message to <listproc@educom.edu>; in the body of the message type "subscribe edupage" followed by your name.
[66] See Overview of the Third Text REtrieval Conference (TREC-3), Donna K. Harman, Ed., U.S. Department of Commerce, Technology Administration, National Institute of Standards and Technology (NIST), NIST Special Publication 500-225, April 1995.