The eBook is 40 (1971-2011)

With computer technology spreading outside North America, the accented characters of several European languages and characters of some other languages were taken into account from 1986 onwards with 8-bit variants of ASCII, also called extended ASCII, that provided sets of 256 characters. But problems were not over until the publication of Unicode in January 1991 as a new universal encoding system. Unicode provided "a unique number for every character, no matter what the platform, no matter what the program, no matter what the language", and could handle 65,000 characters or ideograms.

***

With the internet spreading worldwide, ASCII and extended ASCII were no longer enough; all languages needed to be taken into account, hence Unicode, whose first version was published in January 1991.

Used since the beginning of computing, ASCII (American Standard Code for Information Interchange) is a 7-bit coded character set for information interchange in English (written with the Latin alphabet). It was first published in 1963 by the American Standards Association (ASA), the predecessor of ANSI (American National Standards Institute). The 7-bit plain ASCII, also called Plain Vanilla ASCII, is a set of 128 characters, 95 of which are printable unaccented characters (A-Z, a-z, numbers, punctuation and basic symbols), the ones available on the American / English keyboard.
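To make these numbers concrete, a few lines of Python (chosen here purely for illustration, not something referenced by the book) can enumerate the set:

```python
# Illustrative sketch: the 7-bit ASCII range and its printable subset.
ascii_codes = range(128)                      # 7 bits -> 2**7 = 128 code points
printable = [chr(c) for c in range(32, 127)]  # space (32) through tilde (126)

print(len(ascii_codes))         # 128 characters in plain ASCII
print(len(printable))           # 95 printable unaccented characters
print("".join(printable[62:]))  # ^_`abcdefghijklmnopqrstuvwxyz{|}~
```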

With computer technology spreading outside North America, the accented characters of several European languages and characters of some other languages were taken into account from 1986 onwards with 8-bit variants of ASCII, also called extended ASCII, that provided sets of 256 characters.
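One common example of such a variant is ISO 8859-1 (Latin-1); the text does not single out any particular variant, so Latin-1 is used below only as an illustration. A short Python sketch shows accented Western European characters still fitting into a single byte each:

```python
# Illustrative sketch: 8-bit "extended ASCII" shown with ISO 8859-1 (Latin-1).
# The upper half (code points 128-255) carries accented letters, but each 8-bit
# variant can only serve one group of languages at a time.
text = "café à l'école"
encoded = text.encode("latin-1")   # one byte per character, accents included
print(len(text), len(encoded))     # 14 14 -> still a single byte per character
print(hex(encoded[3]))             # 0xe9 -> 'é' sits above the 7-bit ASCII range
```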

Brian King, director of the WorldWide Language Institute (WWLI), explained in September 1998: “Computer technology has traditionally been the sole domain of a 'techie' elite, fluent in both complex programming languages and in English — the universal language of science and technology. Computers were never designed to handle writing systems that couldn't be translated into ASCII. There wasn't much room for anything other than the 26 letters of the English alphabet in a coding system that originally couldn't even recognize acute accents and umlauts — not to mention non-alphabetic systems like Chinese. But tradition has been turned upside down. Technology has been popularized. (…)

An extension of (local) popularization is the export of information technology around the world. Popularization has now occurred on a global scale and English is no longer necessarily the lingua franca of the user. Perhaps there is no true lingua franca, but only the individual languages of the users. One thing is certain — it is no longer necessary to understand English to use a computer, nor is it necessary to have a degree in computer science. A pull from non-English-speaking computer users and a push from technology companies competing for global markets has made localization a fast growing area in software and hardware development. This development has not been as fast as it could have been. The first step was for ASCII to become extended ASCII. This meant that computers could begin to start recognizing the accents and symbols used in variants of the English alphabet — mostly used by European languages. But only one language could be displayed on a page at a time. (…)

The most recent development [in 1998] is Unicode. Although still evolving and only just being incorporated into the latest software, this new coding system translates each character into 16 bits. Whereas 8-bit extended ASCII could only handle a maximum of 256 characters, Unicode can handle over 65,000 unique characters and therefore potentially accommodate all of the world's writing systems on the computer. So now the tools are more or less in place. They are still not perfect, but at last we can surf the web in Chinese, Japanese, Korean, and numerous other languages that don't use the Western alphabet. As the internet spreads to parts of the world where English is rarely used — such as China, for example, it is natural that Chinese, and not English, will be the preferred choice for interacting with it. For the majority of the users in China, their mother tongue will be the only choice."

First published in January 1991, Unicode "provides a unique number for every character, no matter what the platform, no matter what the program, no matter what the language" (excerpt from the website). Originally designed as a double-byte, platform-independent encoding, Unicode provides a basis for the processing, storage and interchange of text data in any language. It is maintained by the Unicode Consortium, with its variants UTF-8, UTF-16 and UTF-32 (UTF: Unicode Transformation Format), and is a component of the specifications of the World Wide Web Consortium (W3C). Unicode has replaced ASCII for text files on Windows platforms since 1998, and surpassed ASCII on the internet in December 2007.
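In practical terms, the "unique number" is the character's code point, and UTF-8, UTF-16 and UTF-32 are simply different ways of serializing that number into bytes. A brief Python sketch (the sample characters are arbitrary) makes the distinction visible:

```python
# Illustrative sketch: one code point per character, three encoding forms.
for ch in ["A", "é", "中"]:
    print(
        ch,
        hex(ord(ch)),            # the unique code point, e.g. 0x4e2d for 中
        ch.encode("utf-8"),      # 1 to 3 bytes for these characters
        ch.encode("utf-16-be"),  # 2 bytes each here
        ch.encode("utf-32-be"),  # always 4 bytes
    )
```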

1992 > HOMES FOR ELECTRONIC TEXTS

[Summary] The first homes for electronic texts were the Etext Archives, founded in 1992 by Paul Southworth, and the E-Zine-List, founded in 1993 by John Labovitz, among others. The first electronic texts were mostly political. They were followed by electronic zines that also covered cultural topics and were not targeted at a mass audience, at least during the first years. The Etext Archives, hosted on the website of the University of Michigan, were "home to electronic texts of all kinds, from the sacred to the profane, and from the political to the personal", without judging their content. The E-Zine-List was a directory of e-zines around the world, accessible via FTP, gopher, email, the web and other services. The list was updated monthly, with 3,045 zines listed in November 1998. John Labovitz wrote on the list's website: "Now the e-zine world is different. (…) Even the term 'e-zine' has been co-opted by the commercial world, and has come to mean nearly any type of publication distributed electronically. Yet there is still the original, independent fringe, who continue to publish from their heart, or push the boundaries of what we call a 'zine'."

***

The first homes for electronic texts were the Etext Archives, founded in 1992 by Paul Southworth, and the E-Zine-List, founded in 1993 by John Labovitz, among others.

The first electronic texts were mostly political. They were followed by electronic zines, which also covered cultural topics.

What exactly is a zine? John Labovitz explained on his website: "For those of you not acquainted with the zine world, 'zine' is short for either 'fanzine' or 'magazine', depending on your point of view. Zines are generally produced by one person or a small group of people, done often for fun or personal reasons, and tend to be irreverent, bizarre, and/or esoteric. Zines are not 'mainstream' publications — they generally do not contain advertisements (except, sometimes, advertisements for other zines), are not targeted towards a mass audience, and are generally not produced to make a profit. An 'e-zine' is a zine that is distributed partially or solely on electronic networks like the internet."

# The Etext Archives

The Etext Archives were founded in 1992 by Paul Southworth, and hosted on the website of the University of Michigan. They were "home to electronic texts of all kinds, from the sacred to the profane, and from the political to the personal", without judging their content.

There were six sections in 1998: (a) "E-zines": electronic periodicals from the professional to the personal; (b) "Politics": political zines, essays, and home pages of political groups; (c) "Fiction": publications of amateur authors; (d) "Religion": mainstream and off-beat religious texts; (e) "Poetry": an eclectic mix of mostly amateur poetry; and (f) "Quartz": the archive formerly hosted at quartz.rutgers.edu.

As recalled on the website the same year: "The web was just a glimmer [in 1992], gopher was the new hot technology, and FTP was
