Web 1.0: How Did We Get Here? Part 6
This is the sixth in a series on the commercial history of the internet. If you're lost, don't worry, you can read Part 1 here.
From the early 70s to the late 80s
By the mid-80s there were a lot of networks, and a lot (not by 2022 standards, but by 1980s standards) of data being shared over them. Yet there was no way to organize that data for sharing. With so much information moving so quickly, one key element had been overlooked: usability.
The initial versions of the internet, packet-switching data networks, delivered data in exactly the form it was entered into the sending computer. The receiving computer could receive the data, but there was no guarantee the human on the receiving end could make use of it in any meaningful way. It was time for user interfaces to be designed.
This was not by any measure the first time the problem had been considered, but as the then-internet grew in the mid-1980s, so did the scale of the problem and the urgency of solving it. Since the mid-1960s, in fact, different groups had been proposing models for a hypertext-based user interface.
There were research projects like HES (the Hypertext Editing System), begun in 1967 by Andries van Dam, the Dutch-American computer scientist and now long-time professor at Brown University.
The project was also worked on by Ted Nelson, the philosopher and sociologist who coined the terms hypertext and hypermedia in 1963 and published them in 1965. HES organized information into two large groups: links and branching text. The branching text could be automatically arranged into a menu, much like a restaurant's menu, and a piece of data within an area could be given an assigned name, called a label, and later recalled on screen by that name, the way a dish can be found by its section and name, like a chocolate milkshake in the ice cream section.
HES pioneered many concepts that would become foundational to hypertext, such as text formatting and printing from the computer monitors of the day. It required millions of dollars' worth of equipment, primarily from IBM, to function at the time, which limited its use; even so, NASA used it for documentation on the Apollo space program.
Then there was IBM’s GML (Generalized Markup Language). Developed by a small but industrious team at IBM headed by Charles Goldfarb, Edward Mosher, and Raymond Lorie (whose initials make up “GML”), the language was built for IBM’s text-formatting program, SCRIPT. GML was a set of macros, short for macro instructions: rules or patterns in computing that specify how to map an input into a corresponding output. The goal was to use these macros to implement intent-based markup tags (keywords attached to a piece of information that describe what it is, such as a heading or a list item, rather than how it should look) on top of SCRIPT.
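To make that concrete, here is a small fragment marked up in the style of the GML starter set; the exact tag names varied by implementation, so treat this as an illustrative sketch rather than a verbatim historical sample:

```text
:h1.Ice Cream
:p.Our most popular section of the menu.
:ul.
:li.Chocolate milkshake
:li.Vanilla sundae
:eul.
```

Each tag says what the text is (a heading, a paragraph, a list item) and leaves it to the formatter to decide how that should look, an idea that would later carry forward into SGML and, eventually, HTML.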
There was also Ted Nelson’s famous Project Xanadu. Arguably the first hypertext project, it was announced in 1960, but as Gary Wolf would write in the June 1995 edition of Wired magazine, in an article titled “The Curse of Xanadu”, Project Xanadu is “the longest running vaporware story of the computer industry”.
You see, vaporware is the name given to projects that are announced but never see the light of day. It wouldn’t be until 1998 that an incomplete version would be released, and not until 2014 that a “working deliverable” called Open Xanadu would be available. This would not stop the administrators of the project, however, from declaring Project Xanadu superior to the World Wide Web, by then already in international use, with a mission statement reading “Today’s popular software simulates paper. The World Wide Web (another imitation of paper) trivializes our original hypertext model with one-way ever-breaking links and no management of versions or content”.
It must be noted, however, that the World Wide Web existed, while Project Xanadu, until recently as of 2022, arguably did not. Much has been said about perfecting software before releasing it, with many viewpoints held, but in this case, and this is a concept we’ll revisit in this book, done was better than perfect.
Finally, there is Douglas Engelbart, the American engineer and computer scientist, whose project was titled the oN-Line System (NLS). The NLS was developed in the 1960s at the Augmentation Research Center (ARC) of the Stanford Research Institute (SRI), and quite frankly changed the game when it came to human-computer interaction.
The oN-Line System pioneered, and was the first to make practical use of, a remarkable set of concepts: hypertext links; the computer mouse for moving a cursor around a monitor; raster-scan video monitors (computer displays using the rectangular pattern of image capture and reconstruction used by televisions); information organized by relevance (a precursor to Google’s ranking algorithm); the GUI, affectionately pronounced “goo-ey” (the graphical user interface we rely on so heavily in 2022, which allows for screen windowing, that is, interacting with a computer by running different software programs in different on-screen windows); and presentation programs (an early ancestor of Microsoft’s PowerPoint).
The modern computer, its software, and the internet’s user interface simply would not have taken the shape they did, or become as user-friendly and accessible as they are today, without Engelbart and ARC’s contributions.
If you enjoyed this article, you may want to follow us to learn more and join the movement for a content consumption experience without commercial interruption at Life App.