Background history of library automation

Although the automated circulation systems were primitive by modern standards, they were a cost-effective solution that allowed a library to provide a better service to its clients. Meanwhile, librarians were embarking upon another venture, which proved to be a pivotal point in the history of library automation. One of the most important functions in a library is cataloging and classifying individual items.

Creating such a bibliographic record is time consuming because it requires professional librarians to apply the Anglo-American Cataloging Rules to each item. To curtail costs and raise productivity, librarians and library technicians have long copied the cataloging information for individual documents from the Library of Congress and other institutions. Early on, a few libraries formed informal networks to exchange their printed book catalogs and so decrease the amount of original cataloging that their staffs had to perform.

Toward the end of the 1960s, the Library of Congress took a leading role in using computer technology to establish a project for exchanging cataloging information. Under the leadership of Henriette Avram, MARC (Machine-Readable Cataloging) was developed as a protocol for storing bibliographic information in a standard format to facilitate the exchange of cataloging records.
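The essence of the MARC structure described above can be sketched in a few lines: each record is a set of variable fields identified by three-digit tags, most of which are subdivided into coded subfields. The sketch below is a simplified model for illustration, not a real MARC serializer; the tags used (100, 245, 260) do follow the published bibliographic format, but the record content is invented.

```python
# Minimal model of a MARC-style record: variable fields keyed by
# three-digit tags, each holding coded subfields.

def make_field(tag, subfields, indicators="  "):
    """Represent one variable field as a plain dictionary."""
    return {"tag": tag, "indicators": indicators, "subfields": subfields}

record = [
    make_field("100", {"a": "Avram, Henriette D."}),            # main entry: personal name
    make_field("245", {"a": "MARC, its history and implications"}),  # title statement
    make_field("260", {"b": "Library of Congress", "c": "1975"}),    # publication information
]

def get_subfield(record, tag, code):
    """Return the first matching subfield value, or None if absent."""
    for field in record:
        if field["tag"] == tag and code in field["subfields"]:
            return field["subfields"][code]
    return None

print(get_subfield(record, "245", "a"))  # prints the title
```

Because every field is addressed by tag and subfield code rather than by position, any system that understands the tagging scheme can exchange and reuse the record, which is precisely what made shared cataloging workable.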

It also facilitated the production of other products, such as catalog cards and microfiche. The Library of Congress experiment was so successful that many other countries embraced the MARC record structure with minor local variations, and eventually it was adopted by the International Organization for Standardization (ISO). In the early 1970s, another important development shaped the future of library automation: Lockheed Missiles and Space Company introduced Dialog, an online information service that provided access to a variety of bibliographic databases.

Many large academic libraries began to offer specialized services for scientists who required literature searches for their projects.

As the demand for online access to information grew, so did the number and the size of databases; the number of available databases multiplied, and the number of records they contained grew from 52 million to many times that figure. The technological progress of the last half of the 1970s and the early 1980s led to the introduction of turnkey systems in libraries. Computer hardware and software were combined to provide libraries with an exclusive, dedicated system to automate circulation operations.

Turnkey systems usually consisted of a minicomputer, dumb terminals (i.e., terminals with no processing power of their own), and software supplied by the vendor. By the end of the 1970s, many libraries had automated some of their functions, mainly circulation, cataloging, and, to a lesser extent, reference services. Shared cataloging, though relatively expensive, enabled many large libraries to begin transferring cataloging and classification duties from professional librarians to library technicians.

Libraries also began to convert their old card catalogs into machine-readable records. Many large-scale retrospective conversion (RECON) projects, while costly, were under way or had been completed by the mid-1980s. These first automated library systems required a different type of bibliographic record for each function (e.g., one record for circulation and a separate one for cataloging).

Technological advances and market demands required the vendors of library automation systems to develop a new generation of powerful integrated systems. These systems were designed to use a single bibliographic record for all the library functions.

A single MARC record allows every item to be tracked from the moment it is chosen for acquisition by a library to the time it is available on the shelf for the user. At each subsystem, the MARC record is enhanced and augmented with the additional information that becomes available about the item (e.g., order, invoice, and circulation data). Libraries that have completed their RECON projects can transfer their large bibliographic databases to the new integrated systems and automate all their operations. Acquisitions and serials management were the last modules to be incorporated into the integrated systems.
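The single-record life cycle described above can be sketched as a record that each subsystem enriches in turn. The stage names and data elements below are illustrative assumptions, not features of any particular vendor's integrated system.

```python
# One bibliographic record, progressively enriched as it moves
# through the subsystems of an integrated library system.

record = {"title": "Introduction to Cataloging", "status": "selected"}

def acquire(record, order_no, price):
    """Acquisitions subsystem adds order data."""
    record.update(order_no=order_no, price=price, status="on order")
    return record

def catalog(record, call_number):
    """Cataloging subsystem adds classification data."""
    record.update(call_number=call_number, status="cataloged")
    return record

def shelve(record, barcode):
    """Circulation subsystem adds the item barcode."""
    record.update(barcode=barcode, status="available")
    return record

acquire(record, order_no="PO-1042", price=39.95)
catalog(record, call_number="Z693 .I58")
shelve(record, barcode="31234000123456")
print(record["status"])  # prints "available"
```

The design point is that no subsystem creates a second copy of the record; each one annotates the same record, which is what distinguished integrated systems from the earlier one-record-per-function designs.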

Procurement of library materials involves complex functions such as online ordering, invoicing, accounting, and claims for unfulfilled orders. When a book is considered for selection, the automated system allows library staff to enter minimal bibliographic information about it. The incomplete record is then augmented with new information to form a full MARC record as soon as the item is acquired and received by the library. Some bibliographic utilities offer libraries time-sharing access to their large databases for acquisition purposes. Among these is the Research Libraries Information Network (RLIN), which supports a number of functions such as preordering, standing orders, and in-process information.

Serials management is one of the most complex operations in an online environment. Tracking the publication patterns of individual journals, automatically claiming late arrivals or missing issues, and maintaining binding information are a few examples of the activities performed by the automated serials management subsystem. Perhaps the greatest effect that automation had in the 1980s, and certainly the most visible, was the introduction of online public-access catalogs (OPACs).
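Automatic claiming, mentioned above, reduces to a simple idea: predict when each issue should arrive from the journal's publication pattern, then flag expected issues that are overdue. The sketch below is a toy model; the 30-day interval, the grace period, and the dates are all invented for illustration.

```python
# Toy model of automatic claiming in a serials subsystem.
from datetime import date, timedelta

def expected_issue_dates(first_issue, interval_days, count):
    """Predict arrival dates for a regularly issued journal."""
    return [first_issue + timedelta(days=interval_days * i) for i in range(count)]

def issues_to_claim(expected, received, today, grace_days=14):
    """Expected issues past their grace period that never arrived."""
    cutoff = today - timedelta(days=grace_days)
    return [d for d in expected if d <= cutoff and d not in received]

expected = expected_issue_dates(date(1989, 1, 1), 30, 6)   # roughly monthly
received = set(expected[:3])                               # first three issues arrived
late = issues_to_claim(expected, received, today=date(1989, 6, 1))
print(len(late))  # prints 2
```

Real serials modules must also cope with irregular and combined issues, which is a large part of why the text calls this one of the most complex operations to automate.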

The new online catalogs quickly gained wide acceptance among the public, who preferred them to traditional card catalogs. The first-generation online catalog was simply an extension of the card catalog and had limited capabilities. These OPACs provided users with only a few access points (i.e., author, title, and subject). Despite the limitations of the early OPACs, library patrons preferred them to the card catalog, since these online systems provided information about circulation status (e.g., whether an item was checked out or on hold).
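The two defining traits of those first-generation OPACs, a fixed set of access points and circulation status shown with each hit, can be modeled in a few lines. The catalog records below are invented for illustration.

```python
# Model of a first-generation OPAC: searches are limited to a few
# access points, and each hit reports its circulation status.

catalog = [
    {"author": "smith, john", "title": "library systems",
     "subject": "automation", "checked_out": True},
    {"author": "jones, mary", "title": "online catalogs",
     "subject": "automation", "checked_out": False},
]

ACCESS_POINTS = ("author", "title", "subject")

def opac_search(field, term):
    """Search one access point; any other field is rejected."""
    if field not in ACCESS_POINTS:
        raise ValueError("searchable access points: " + ", ".join(ACCESS_POINTS))
    return [r for r in catalog if term in r[field]]

for hit in opac_search("subject", "automation"):
    status = "checked out" if hit["checked_out"] else "on shelf"
    print(hit["title"], "-", status)
```

Keyword and Boolean searching across arbitrary fields came only with later generations of OPACs; the hard-coded `ACCESS_POINTS` tuple stands in for that early restriction.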

Libraries were among the first organizations to adopt CD-ROM technology, since librarians and information professionals realized the potential of the CD-ROM as a storage medium for vast amounts of information. This technology was used to provide access to a variety of bibliographic records, including MARC cataloging information.

Many libraries used CD-ROMs to supplement or even replace the online utilities as a cost-saving measure.

Meanwhile, ISO developed the Open Systems Interconnection (OSI) reference model, which consists of protocols for a layered communication system and simplifies the movement of data between various computers. ISO also developed another set of protocols, referred to as the Search and Retrieve Service Definition and Protocol Specification, to facilitate the search and retrieval of information. This protocol was later modified and became known as the Z39.50 standard. This elaborate scheme, in conjunction with an abstract database model, was developed to accommodate the differences among server databases.

The client may specify the record structure and the data elements to be retrieved, as well as the preferred record syntax (e.g., MARC). The server should be able to provide access and resource control for the client. The standard also has provisions for security passwords, charging and billing, scanning terms in lists and indexes within the browsing facility, and sorting. Several other standards that facilitated the management and communication of digital information were proposed and drafted by the end of the 1980s. Along with the information retrieval standards, new MARC communication standards were introduced.
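The client-side choices just listed can be pictured as the parameters of a single search request. The sketch below is only a hedged illustration of that idea: the dictionary keys echo parameter names from the Z39.50 search service (database names, preferred record syntax, element set), but the real protocol encodes these as ASN.1/BER structures sent over a session, not as Python dictionaries, and the query string format here is invented.

```python
# Illustrative model of the parameters a Z39.50-style client sends
# in a search request.

def build_search_request(database, query, record_syntax="USMARC",
                         element_set="F", authentication=None):
    request = {
        "operation": "searchRequest",
        "databaseNames": [database],
        "query": query,
        "preferredRecordSyntax": record_syntax,  # how records are encoded
        "elementSetName": element_set,           # "F" = full record, "B" = brief
    }
    if authentication:                           # hook for access/resource control
        request["idAuthentication"] = authentication
    return request

req = build_search_request("BOOKS", 'title="library automation"',
                           element_set="B")
print(req["elementSetName"])  # prints "B"
```

The point of the element set and record syntax parameters is that the client, not the server, decides how much of each record comes back and in what encoding, which is what let very different catalog systems interoperate.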

In addition, the Unicode project responded to the lack of a consistent international character set and led to a set of standards for encoding multilingual text.
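The problem Unicode solved can be shown in miniature: one coded character set covers many scripts, and a standard encoding form such as UTF-8 carries them all without loss. Before Unicode, each script in the string below would typically have required its own incompatible character set.

```python
# One character set, many scripts, one standard encoding form.

title = "Bibliothèque / 図書館 / Библиотека"

utf8_bytes = title.encode("utf-8")       # a single encoding handles all scripts
round_trip = utf8_bytes.decode("utf-8")  # and decodes without loss

print(round_trip == title)               # prints True
print(len(title), len(utf8_bytes))       # code points vs. encoded bytes
```

For bibliographic data this mattered enormously: a single catalog record could finally carry a transliterated heading and the original-script form side by side.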

Standard Generalized Markup Language (SGML), an ANSI initiative, was designed as a way to separate content from style and as a means of marking up any type of text so that it can be handled and managed effectively by any type of computer. SGML identifies and names digital information to be used in a variety of products and services, such as indexing, typesetting, hypertext manipulation, and CD-ROM distribution. Although computer networks were first developed in the 1960s and the first e-mail was sent in the early 1970s, it was not until the late 1980s that computer communication systems were widely used in libraries.
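SGML's core idea, separating content from presentation so the same text can feed different products, can be demonstrated in miniature. The tiny tag set and the naive extraction below are invented for illustration; they are not a real SGML DTD or parser.

```python
# Content marked up once, rendered two different ways.
import re

doc = "<article><title>History of MARC</title><author>H. Avram</author></article>"

def element(name, text):
    """Extract the content of the first <name> element (naive, flat markup only)."""
    match = re.search(r"<{0}>(.*?)</{0}>".format(name), text)
    return match.group(1) if match else None

# Two "products" derived from the same marked-up content:
index_entry = element("title", doc).lower()                          # for an index
citation = "{}. {}.".format(element("author", doc), element("title", doc))  # for display
print(citation)
```

Because the markup names what each piece of text *is* rather than how it should look, the indexing product and the citation product can each apply their own styling, which is exactly the separation the paragraph above describes.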

File Transfer Protocol (FTP) was used to transfer large data files, and e-mail was used for fast and efficient interlibrary loans. Telnet, however, had the greatest effect on information services, by allowing users remote access to libraries. Researchers no longer had to rely on librarians to find information in distant libraries, or travel to many locations to search library catalogs.

As telecommunication technology progressed at a rapid rate, so did computer hardware and software technologies.

The introduction of graphical user interfaces (GUIs), particularly the Windows operating system, had a profound effect on library automation. Librarians began hastily to write requests for proposals (RFPs) and seek funding to upgrade their outdated automated systems. Users were the real beneficiaries of the new systems, since they no longer needed to learn and memorize long commands to search, retrieve, and display the desired bibliographic records. Throughout the 1990s, the pace of development in libraries matched the changes fueled by the introduction of the World Wide Web.

Many library automation vendors adopted the Z39.50 standard in their systems. While web-based systems were being developed, the Library of Congress, OCLC, and other organizations sought new methods for enhancing the contents of millions of cataloging records in their databases. The MARC format was augmented with a new field in its record structure to reflect the availability of information resources on the web. The strategy was enthusiastically welcomed and supported by existing users.

When we looked again at the library automation landscape, little had changed in terms of competitive library automation systems since we first launched our new software.

The majority of systems that we compete with are populated with add-on functions built onto an existing code base. The development effort is solely about giving users fixes and small enhancements to existing programs.

Essentially, these systems are living off the back of old, legacy code. The result is that, rather than helping staff perform tasks more efficiently, the software confuses users and is difficult to learn. Both our front-line team, who implement and support the application, and the development team felt that we could do more and do better. Client feedback suggested that the pressure to control content and costs with fewer staff was not abating.

We try to listen and speak to clients every day. Significantly, our very own Travelling Librarian, Graham Partridge, saw first-hand the need for greater streamlining of workflows. Above all, the software needed to be simpler. Librarians and archivists clearly tell us that they need to find new ways to process more data.

We made the decision to build completely new software from the ground up, not reusing any of the programs that we had built up over eight years. It was the only sensible way to achieve our ambitions: to introduce greater efficiencies and modern ways of processing data. The result: friendlier, more usable software, future-proofed for the years ahead. The philosophy behind our software is very important to us and our clients.

There are two strands. First, we release code to clients as soon as it is developed and tested, so that productivity gains can be realized immediately. Second, once a feature is in the system it remains in place, and an upgrade does not affect the configuration settings that each client has applied. It was necessary, therefore, to combine the new designs and functionality with the existing system programs.

This meant grafting old and new together until we had replaced all areas of the system. This made the transition easier for clients and put the new, really good stuff in their hands sooner, instead of the old way of waiting years to bring out a completely new product and having users migrate to it. This decision has meant we can bring much-needed modernisation to corporate librarians sooner. The development process has involved existing users at every stage. Agile development is a methodology that allows us to test our assumptions.

This sometimes takes longer to deliver, but the results are impressive and garner greater user acceptance. Our development team is very excited: they are using the latest technology, which all programmers like to work with, and we can implement ideas we have had our eye on for years but could never deliver in the old software. The speed of development and testing using new program development tools is a major factor.