My Cyberspace: Part Two

June 9, 2018

As you can see from part one, each modern computer is itself a vast terrain for exploration. Every application is a maze of screens and dialogues and options, providing tools which offer an incalculable variety of possible workflows. Operating systems themselves are highly configurable and have deep levels of access and abstraction, running from the high-level user interface down into the internals upon which it depends. All the files and resources which come with software can be pulled apart and opened, modded and configured. A modern computer is an entire cyberspace in and of itself.

The Web

But, of course, once the internet gets thrown into the mix, computers often seem to flatten into mere machines which run your web browser. It wasn’t always this way, but it started in 1993 when the world wide web stole the whole show. All the old empires of controlled online spaces were toppled by its out-of-control, spontaneous growth, and to a great many users the browser has become synonymous with the internet (if not computing itself!). In the mid-90s websites and homepages became the trendiest new thing and everyone just had to have one. People bought computers and started learning the mark-up language used to create hypertext documents – or paid someone who did.

This was a step backward, technologically speaking: HyperCard from Apple, for instance, offered a much better model for the creation of hypermedia in the age of graphical user interfaces, with elements dragged and dropped in a manner closer to slideshow presentation software. But when it comes to fads and what gets popular, technical merits are never the deciding factor, and so 25 years later we’re still stuck with HTML, JavaScript, and web browsers instead of what might have been. Perhaps the upshot of having a web which requires engagement with computer languages, instead of seamless graphical metaphors and icons, is the constant reminder that computers are still telegram-processing machines.

In the 90s and early 2000s the hypertext of the world wide web was a lot closer to text, and much less hyper. What made it interactive were the hyperlinks, which let you turn any part of a page into a way to access another page. Initially web servers were just dumb file hosts, containing the .html and .jpg files in directories: surfing the web was like following a path in a DOS file system, except with a domain name instead of a drive letter. The first major innovation was dynamically-generated hypertext, allowing for pages that were created by the server on-the-fly when they were requested by a client web browser. The server software could do things like create a page representing a conversation, pulling records from a database which contained usernames and posts.
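To make the idea concrete, here is a minimal sketch (in Python, with an invented “posts” table) of the difference: a dumb file host just hands back a file from disk, while a dynamic server assembles the page from database records at the moment it is requested. This is only an illustration of the technique, not any particular forum software of the era.

```python
import sqlite3
from html import escape

# Minimal sketch of dynamically-generated hypertext: the page is not a file
# sitting on disk, but is assembled from database records when requested.
# The "posts" table and its columns are invented for this illustration.

def render_thread(db_path: str, thread_id: int) -> str:
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT username, body FROM posts WHERE thread_id = ? ORDER BY posted_at",
        (thread_id,),
    ).fetchall()
    conn.close()

    # Build the HTML on the fly, one paragraph per post.
    items = "\n".join(
        f"<p><b>{escape(user)}</b>: {escape(body)}</p>" for user, body in rows
    )
    return f"<html><body><h1>Thread {thread_id}</h1>\n{items}\n</body></html>"

# A dumb file host, by contrast, would only ever do something like:
#     open("thread42.html").read()
```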

In this way the web started to be used for things that local software had usually been used for. Instead of opening the email program on your computer, you could read your email from your webmail account at Hotmail. Your “inbox” was not a static HTML file, but was generated live as soon as you clicked the bookmark to represent the state of your email account. It takes a lot more processing power on the server-side to build the website on-demand like this, but nowadays nearly every website works this way. The days of a web server as a dumb file host serving up plain text and picture files are long gone. This is how, in the past 20 years, we’ve moved gradually back toward the terminal/mainframe or client/server model of computing. Instead of the screen having all your apps in windows, now your browser is a screen with your apps in its tabs.

Dynamic hypertext allowed websites to have guest books and web forums, and news sites to have comment sections. Many sites popped up which existed merely to link to other pages and host discussion about them. Sometimes those sites became so popular that they’d crash the servers of any site submitted to them, or sometimes the community would have a sizable contingent of nasty people who would raid those sites en masse and flood the forums with spam and troll posts. Newspapers and magazines would complain when news aggregators like Fark.com linked directly to their stories instead of their home page. What’s a newspaper without a front page, or a magazine without a cover story? Visitors would only see a single article and never end up browsing through the whole online issue, instead just going back to the news aggregator to comment there.

There were many other ways to explore the web and find out about new sites. There were curated, phonebook-like directories, like the paper yellow pages I had or the much more sensible digital ones like Yahoo! There were web rings, where many sites related to a similar theme would link to each other in an ordered chain, each letting you go forward or back until you looped around. The problem with web rings was that if one site went offline the chain would be broken at that spot, and the sites linking to and from that page would need to get in contact and manually route around the damage by linking to each other. Still, the image that web rings bring to mind, that of a group of people standing in a circle holding hands, really captures the spirit of human connection and community online.
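The mechanics of a web ring are simple enough to sketch: each member site only knows its neighbours in the loop, so a single dead site breaks the chain until the sites on either side of the gap re-link. A toy model in Python, with made-up site names:

```python
# Toy model of a web ring: sites form a loop, and "next"/"previous" links
# simply step around it. Site names are invented for the example.
ring = ["cats.example", "kittens.example", "lions.example", "tigers.example"]

def next_site(current: str) -> str:
    i = ring.index(current)
    return ring[(i + 1) % len(ring)]   # wrap around at the end of the loop

print(next_site("tigers.example"))     # -> cats.example (the ring closes)

# If a member goes offline, its neighbours must manually route around the gap:
ring.remove("lions.example")
print(next_site("kittens.example"))    # now points at tigers.example
```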

Search engines came up as enough processing power became available to handle the server-side demands of their services. A search engine crawls through the web (using software imaginatively called web spiders), traversing page after page and following all the links while recording what it finds in a database. Then users can query that database and look through dynamically-generated result pages. There were so many new websites and pages being created all the time that nobody could, nor still can, record or navigate or order them all. On the internet, the territory is the only true map and all maps leave out most of the picture.
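At its core a crawler is just a queue of URLs and a record of where each page links; everything a real search engine adds (ranking, politeness, scale) sits on top of that. A minimal sketch using only Python’s standard library, with a placeholder starting URL:

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

# Minimal web-spider sketch: follow links page after page, recording which
# pages link where. Real crawlers add politeness, ranking, and huge scale.

class LinkParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url: str, limit: int = 50) -> dict:
    index = {}                      # page -> list of outgoing links
    queue = deque([start_url])
    while queue and len(index) < limit:
        url = queue.popleft()
        if url in index:
            continue
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except (OSError, ValueError):
            continue                # dead or unfetchable link: skip it
        parser = LinkParser()
        parser.feed(html)
        index[url] = [urljoin(url, link) for link in parser.links]
        queue.extend(index[url])
    return index

# index = crawl("http://example.com/")   # the starting URL is just an example
```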

This is why I’m sure that everyone has their own cyberspace, and why I can only tell the history of mine. A history of the internet is impossible because everyone does their own thing, and in their lifetime will only get a tiny glimpse of all the information that is out there. It reminds me of that old advertisement for broadband internet: a man is rapidly clicking through websites which are loading instantly, faster and faster, until suddenly the screen goes black and an error message pops up reading, “I’m sorry. You have reached the end of the internet. Please use the Back button on your web browser to return to a page you have already visited.” In theory, the man would only have to wait a day or two to find another couple minutes of brand-new web to click through before hitting the wall again. At least, if all the new websites made in that time had managed to be linked to by existing ones, and crawled over by web spiders to be indexed or directory-listed or added to a web ring. Of course, since the web is (as its very name implies) not linear, and web rings didn’t really last, there is no real order to it all and never will be. Maybe it could all be organized eventually if humanity one day overcame all semantic differences and finally got on the same page regarding the meaning of commonly used words… just kidding.

Growing Up Before Social Media

When we finally got dial-up internet in my house I was in Grade 7. Since it would tie up the phone line, I got into the habit of going to bed at around eight o’clock in the evening, wearing earplugs to fall asleep while my family were still up and making noise. I’d wake up at 3:30 or four in the morning and surf the internet until 8 when I’d have to get ready for school. This was my basic schedule until we finally got DSL and I could use the internet whenever I wanted, worry-free.

In the eighties and early nineties, before the world wide web, most users of the internet had been college and university students. Each September a new class of students would use all the various pre-web networking programs and start messing around, spamming everyone, writing emails IN ALL CAPS and generally behaving badly. Eventually they’d learn all the rules and proper behaviour after the first month or so and the internet would return to general civility. But with the deregulation and commercialization of the internet in the early 90s, followed by the proliferation of the World Wide Web, people started pouring onto the internet in droves. It was called the Eternal September, and ever since the internet has had a constant churn of newcomers with no way to uniformly absorb the oncoming masses or discipline them into common civility and protocol. Internet culture was out of control, and words like netizenship and cyber-etiquette became old-fashioned and basically unknown.

Instead of the dignified place of discourse befitting a military or academic communication platform, the internet became a sprawling wasteland of ever-growing unconnected communities. It was full of information and drama, facts and cults, a place to get lost, to find other voices, to build a place far away from reality. In the time after the Dot-com bubble burst at the turn of the millennium, but before social media exploded in 2008, it was a wild west frontier that few businesses had any business in beyond owning a website and an email address. Before social media companies decided to just start selling everyone’s private lives, it was commonly held that the internet was useless for making money. It is in this short period, less than a decade long, that teenage me grew up in cyberspace.

Compared to one’s experience of immediate social reality in “meatspace”, the sorts of people and voices one encounters in cyberspace are completely different. The proportions are inverted – well-adjusted people do not spend all day online, and the alienated and lonely people whom one seldom encounters in active social engagement do. That’s not to paint everyone online with a broad brush – I’m painting a picture of what is represented at the extremes. Pejoratively put, the internet pre-social media was split between people who “have a life” and people who need to “get a life”, and the latter were far more invested in time, effort, breadth, and resolve. The internet (which is now called “the deep web” in contrast to social media) may have lots of people in it, but it belongs to the weirdos. Nowadays the 4chan dichotomy of Normies vs. NEETs (not in employment, education, or training; i.e. lots of free time) seems to be gaining wider usage (amongst normies, of course) as a way to express this strong polarization of popularity, acceptability, belonging, and commitment among web users. The internet is where the antisocial, the alienated, the disabled, and the sorts of marginal people who lack public awareness campaigns fill their human need for socializing.

Web Communities

Broadly speaking, web communities exist either as walled gardens of relatively normal people who come together around a single specialist or niche topic or project, which can justly expunge all off-topic discussion, or as general hang-out places and haunts which attract the full range of membership, from the civilized to the free-for-all, and must manage the variety and diversity of opinions, modes of discourse, and content accordingly. It is the latter type which ends up causing the most trouble.

If a web community is going to last and function and remain civil, it has to take lots of steps to brow-beat new users into behaving. The fight against the latent chaos of Eternal September is fought differently by different sites. In 1999 the tech website Slashdot.org pioneered many of those civilizing systems, most notably a well-thought-out karma system. The culture of Slashdot tends to venerate posters with low user IDs, which denote that they joined the site early, before it became popular. Such long-time members are inherently stewards of the community, representing the original culture of the site which newer members are implicitly becoming a part of. Valuable contributions to the site earn users entry into a lottery system which grants winners temporary moderation powers, meaning that the job of policing the site is randomly distributed amongst responsible posters. Posters thus deputized into “mod duty” must refrain from contributing to the threads in which they opt to apply or remove karma from posts, lest they tilt the discussions. Anonymity is permitted on Slashdot but is heavily discouraged as cowardly.
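I’m not describing Slashdot’s actual code here, only the shape of the idea: pick a random handful of members in good standing, hand each a few expiring moderation points, and bar them from moderating discussions they take part in. A sketch, with invented field names:

```python
import random

# Sketch of a Slashdot-style moderation lottery: the shape of the idea,
# not Slashdot's actual implementation. All field names are invented.

users = [
    {"name": "alice", "karma": 40, "days_since_joining": 900},
    {"name": "bob",   "karma": 5,  "days_since_joining": 30},
    {"name": "carol", "karma": 25, "days_since_joining": 400},
]

def pick_moderators(users, how_many=2, min_karma=10, min_age_days=90):
    # Only long-standing members with positive contributions are eligible.
    eligible = [u for u in users
                if u["karma"] >= min_karma and u["days_since_joining"] >= min_age_days]
    winners = random.sample(eligible, min(how_many, len(eligible)))
    for u in winners:
        u["mod_points"] = 5        # temporary points that expire after a few days
    return winners

def can_moderate(user, thread_participants, post_author):
    # Deputized users may not moderate threads they have posted in,
    # and may never moderate their own posts.
    return (user.get("mod_points", 0) > 0
            and user["name"] not in thread_participants
            and user["name"] != post_author)
```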

On 4chan anonymity is celebrated as a way of discouraging narcissism, self-importance, attention whoring, and individuality in general. Instead new people are often compelled to “lurk more”, meaning read more and post less until they learn to post in the style of the anonymous group-think, which ends up reading like one schizophrenic mind arguing with itself in an ever-changing, orderless flow. Unwanted posts simply go unrewarded with responses, leading them to slip away the fastest off the front page and out of attention, replaced by the constant bumping of those which do gain the community’s interest. The no-rules, anything-goes style of 4chan was born from a rebellion against, and secession from, its older sibling and total opposite: the Something Awful forums.

Discussion on Something Awful is quite formal (at least, for a web forum), reputation is paramount, and everything is categorized, rated, and archived. Proper grammar and punctuation are mandated in most of the sub-boards. Rule breaking gets you warnings and probations, and can get you banned, meaning you have to repurchase your ten-dollar membership. Permabans are doled out to people so toxic that the site doesn’t even want their money in exchange for putting up with them. If you don’t like someone, you can spend money to change their avatar or graffiti all over their online identity, which they must then spend money to undo. The long-form discussions, with threads which last weeks, months, and years (often necessitating occasional summary posts, or tables of contents edited into their start), demand much attention and involvement. This often leads to wonderful pay-offs in creativity and discussion, and a strong familiarity with the individual contributors who earn their mindshare in the community.

It takes a lot of work and structure to stay on top of the process of socializing new users into conforming with a site’s pre-existing culture, maintaining whatever it is that keeps it the place which the established user-base call home. Yet cultures still drift and old users eventually always end up complaining about how a site is no longer what it used to be. Web communities have golden ages and they have dark ages — they have times of peace and times of war. They are composed of individuals who are not alike and whose contrast provides the content of the site: the discussions and debates and drama. And these different individuals are all alike in sharing the same implicit or explicit culture and values of discourse and civility which keeps that community functioning.

Most civil web forums have rules against name-calling or direct insults aimed at people instead of engaging the substance of their arguments. Oftentimes discussion of particular topics is banned because it just can’t be had civilly. This often isn’t seen as censorship because many sites have demilitarized boards or other places where users can engage in unmitigated free speech — those places are just usually overrun with shit-talking, bitter arguments on contentious topics, gross pictures, and all the other stuff you want to keep quarantined. The maintenance of healthy internet cultures requires constant upkeep and adjustment to rules, board structure, and the needs of the community. Sometimes out-of-control, unmoderated boards get shut down when their drama spills over and creates a disruption or liability for everyone else. It’s a big internet — there is always someplace else to go.

Citing Sources and Debunkings

One of the features of hypertext is the ability to link to citations, and so arguments are often expected to be bolstered by references to scientific data. The internet has communities dedicated to many topics, and people in the sciences enjoy discussing their fields of interest and educating people about them. Growing up online, for me, has meant lots of exposure to discussions and debates and the ability to investigate all sides of an issue and their evidence. Picking up what something is or how it works by reading or watching those who understand it use it and wield it (or correct those who use it improperly) is an excellent way to learn. What is being learned is encountered within its proper context — in precisely the situations where it is relevant, instead of abstractly. It’s very instructive to visit the enclaves of two competing schools of thought and see how each identifies in opposition to the other and what sort of beefs they have.

Besides just using a search engine, asking questions in a web community of experts or aficionados is the best way to get information online, although sometimes other posters are finicky or scarce. There is no faster way to get an answer to a question on the internet than to propose a deliberately wrong answer to it as a matter of fact. Experts will race to correct you, throwing their knowledge and sources in your face much quicker than had you merely asked the question directly.

My cyberspace was disproportionately rational, skeptical, and atheist compared to the media at large, and as a young teenager I loved reading all of the debunkings produced by the communities I frequented. I was a big fan of the Bad Astronomy website, run by astronomer Phil Plait, which addressed many misconceptions and false claims about space and science. In one of his more famous projects he collected all of the supposed photographic and scientific evidence supporting the theory that the moon landings were a hoax and debunked it piece by piece. I was soon into all the other various communities committed to debunking unscientific beliefs, or “woo”, under the banner of enlightened skepticism.

It’s a delightful intellectual indulgence to read bad theories being torn apart with science lessons and rational thinking. The emotional satisfaction of knocking the losing side down a peg is the same as one gets from celebrity tabloid magazines, only the reader can justify it to themselves as being educational. The internet has certainly raised an entire generation of precocious teenagers into skeptical, rational, atheist adults by filling this market demand for combative entertainment. The result today is that everything is up for debate, and clickbait titles abound promising to show ideas being “destroyed” or “wrecked”, as though science and learning were a tournament in some bloody spectator sport. Two memes enter; one meme leaves.

To be a debunking, the argument being addressed must be meticulously torn apart and demolished. It takes a lot of work, requiring expertise and methodology and lots of citations. The whole point of a debunking is to be final; to be more than just a counter-argument but an absolute rout. While many topics are not so cut-and-dried as to allow a concrete right and wrong opinion, such as philosophical questions regarding consciousness, you can rest assured that all of the low-hanging fruit has been picked clean. Every poor argument ever made against the theory of evolution has been pulverized in internet court so many times that when I encounter one in the wild I have to check my shock. How could anyone still believe that? Have they been living under a rock (or have I)? By consequence, the opinion columns which newspapers and new media outfits try to pass off as debunkings are a disgrace to the name.

Talking like an Expert

The thing about internet discussions is that they take place between individuals and small groups of individuals, drawn from millions who are brought together in nearly infinite combinations. Thus the same argument occurs again and again and again with different people. If none of the participants experience the deja vu of having already seen this argument play out, then perhaps something novel will happen. But usually whoever has already seen this argument, and thus already knows who wins or loses, has the upper hand. They can go for points on style, deciding how they will deploy the winning argument — perhaps as a humble conveyer of knowledge, or as a sick burn. Or they can anticipate counter-arguments and guide their interlocutor into a trap, or know when to tip the board, move the goalposts, and avoid getting caught.

Eventually, having lurked on the internet long enough, anyone can pick up expert knowledge and develop an authoritative voice on subjects in which they have no formal training. I’m sure it’s like those doctors with fake degrees: they are passable, and people seldom test them or question the depth of their knowledge. It’s a sort of imposter intellectualism, since one is basically just reciting lines and talking points as though playing a role, instead of having formulated one’s opinions from education in the concepts discussed. It’s more about acting knowledgeable and sounding smart than having knowledge and being smart. On the internet nearly anyone can act like an expert on anything, and it’s hard to know what’s wrong with that when they actually are correct. And one can also play devil’s advocate for many sides, having already seen the talking points come up so often. What innovations do occur are not so often balanced syntheses which progress the arguments in good will, but semantic shifts which lead to reinterpretations of the arguments into strawman positions and specious ad-hominem attacks against the people who wield them.

Creative Communities

For all these reasons, the healthiest parts of the internet are the ones where people are not merely talking or discussing or reviewing or critiquing or commenting on events or art or ideas. They are the communities where people are creating things; where the discussions involve the planning and deliberation and construction of steps to achieve a goal which everyone present desires and is working toward. In these environments people have more to contribute than just their opinions, or facts uttered in order to prove a point. There is something relevant to be discussed for its merits. Of course, there are always fans or viewers or users or future owners who may be hard to please, but creators know that what they accomplish will speak for itself and its value will be self-evident to those who appreciate it. I’ve been part of fan-film communities, art communities, and Free Software communities as a cheerleader, fund-raiser, bug tester, and sometimes contributor. All of these communities have milestones and goals and setbacks and accomplishments — in short, there is a story to the common discussion. It’s not the story of a flamewar, or of some attention-grabber causing drama — it’s the rising action of a journey in which all involved are invested, have stakes, and share a common bond. If you’re going to spend a lot of time on the internet, then be sure to be part of something creative, above and beyond mere discussion forums and comment sections. Of all the rewards it might reap, a renewed faith in cyberspace as a tool for bringing people together in common humanity is the most valuable.


My Cyberspace: Part One

June 2, 2018

Everyone has their own cyberspace: this is the story of mine.

My family first got a computer when I was about 6 years old. It was a Compaq Contura 4/25c laptop, a 486 computer running DOS with Windows 3.1. There were two games, both for DOS. One was a Berenstain Bears colouring book; the other was an educational game involving several minigames with a frog and lilypads. I spent a lot of time exploring every nook and cranny of that computer. I discovered that typing “help” at the DOS prompt provided a list of commands to try out. I found the “dosshell”, which I thought was edgy because it had the word “hell” in its name. I liked watching the Surface Scan in ScanDisk. I couldn’t figure out the difference between edit.com and qbasic.exe beyond a few extra menu entries. I increased our drive space with DoubleSpace and then messed it all up copying a bunch of stuff to the host drive. In Windows I remember finding there were “hidden” applications I could reach by inserting Objects into Microsoft Word documents, such as a drawing program that had no entry in the Program Manager. I remember doing ugly floodfills on, and saving the changes to, several of the bitmap wallpapers that came with Windows, ruining them for future use. Once, in Microsoft AntiVirus, I decided to print off the list of viruses in the database, going through half the stack of paper on our dot-matrix printer. I probably wasted a ton of ink on the ribbon and several dozen sheets in a big chain which stretched across the room.

My most ambitious project on that machine was an attempt to type up the script to The Wizard of Oz. The film’s mythos enthralled me; the notion of watching a monochrome film which suddenly, through technological miracles, became fully colour was astounding. I fantasized about being a film-goer in the 1930s, experiencing this transition unaware and marveling at the beauty of it. This exercise in transcription was an effort which lasted at least a week, a few hours each day. I was probably 8. I’d hit play on the VCR for a few seconds, pause the tape, and type up the dialogue. I got as far as the part where Dorothy had run away and was about to meet the traveling psychic who later appears in her dream as the Wizard. Considering I probably didn’t type very quickly, that was quite an accomplishment, although I remember the dawning realization of how long it would take me to finish the full film and my agonizing decision to pack it in.

My elementary school had Macintosh computers. We would play KidPix and MathBlaster, and were learning how to create documents in ClarisWorks. I would spend time in Cross Country Canada, but never really figured out what commodities were, or what the point of the game was, until much later, so it never really made sense to me. I remember that if you stopped the truck when a hitchhiker was on the screen trying to bum a ride, you’d pick them up, and the odds were good that they’d kill you for it. Horrifying!

Shortly after moving into a new house, we got an actual desktop computer around 1996. But it was still a 486. At least there were many more games on the hard drive, provided by the fellow who sold it to us: Commander Keen 4 and 5, the Castle Wolfenstein demo, and a similar 3D-type horror game that started off in a graveyard, involving ghosts and whatnot. That game was too frightening for me at the time. But in retrospect I was the sort of kid who was hypersensitive and adhered to every rule I knew of: I didn’t watch PG-13 movies until I was 13. It took very little to scare me because I just avoided frightening things altogether. I remember being terrified deep in a dungeon in Castle of the Winds, saving a new game every few steps, breathing heavily with each tile my icon (literally) moved, lest I encounter a goblin or troll. I’d spend much more time playing Wheel of Fortune.

This 486 desktop computer had a CD-ROM drive but for some reason it wouldn’t work with the IDE slot; I’m not sure why. But the computer did run Windows 95 although it only really had DOS games on it. My house was only just down the street from the Public Library so I would often book time there and surf the internet several hours each week. I had a directory of websites for kids, some yellow-paged sort of phonebook typical of the 90s when publishers thought they could sell such a thing. I’d go through it and find that half of the sites listed were already dead, although many more weren’t and had lots of neat stuff. There was a giant caveat emptor that even though all the websites listed were parent-approved, all the sites which were linked to couldn’t be trusted. I stayed safe.

At school I was given time to use the computers in the library on my own thanks to my individual education plan. I studied HyperCard and made a big, sprawling project exploiting every feature of the program I could find a use for. I also got some instruction in LOGO, but that was only for a few days, so it never quite sank in.

Around grade 7, in the year 2000, my friend Kevin invited me over to help him make a website. It was some free Angelfire site and we learned, on the fly, what GIF and JPG files were, and how to write HTML. We had a fun evening and got a lot done: namely, getting a page up with some text and an animation of a stick man in a safety vest wielding a shovel inside a yellowish-orange warning sign, underneath which read “Under Construction”.

Shortly afterwards we got a newer computer, a Pentium II, which had Windows 98 and a software modem. At the time there were a few ISPs offering free dial-up in exchange for a permanent advertisement banner on your desktop. The first we used was called FreeWeb, which only lasted a few months before we switched to Juno. Finally, I had the internet at home! Around the same time, I got a 386 desktop computer in my bedroom running DOS and Windows 3.1. I would spend many hours searching the internet for “abandonware” to run on my bedroom computer, meticulously splitting files and copying them onto floppy disks for transfer. I found my first piece of GPL software, Calmira, which was an alternative shell for Windows 3.1 that gave it a Windows 95-style taskbar and desktop. I thought it was the coolest thing ever. The WebRing which Calmira was a part of was a treasure trove of free goodies for my bedroom computer, and I customized my desktop to the extreme. You could hack system libraries to do things like replace the minimize/maximize buttons. I drew a little Windows flag, a pixel-perfect copy of the one in Windows CE, and made that the new Control Box in the upper left-hand corner of my windows; the only problem was that when you clicked it the colours inverted. I would try different icon schemes, create themes, etc. Reading the changelog for Calmira was exciting, and I loved the idea that anyone could contribute changes, create forks, etc. I was learning how Free Software development worked: a community of enthusiasts and hobbyists building what they wanted. I learned their names, many details of their lives, and came to appreciate what they were building together. Even if it was for a long-obsolete operating system which practically nobody used any more.

I was basically the computer kid. I would make money around town setting up and fixing computers for people quite regularly. I’d advise everyone I met that free trial CDs of America Online were great for getting up-to-date versions of Internet Explorer 6 instead of downloading it the long way over dial-up. When my friend Nathan at school told me about Napster and WinAmp I was excited as hell to go home and get them. I bugged my dad for a CD burner and we drove all the way to Toronto to buy one for about $250. The advertisement said it was a Yamaha drive, but later we realized that it was some off-brand which merely used a Yamaha chipset. What a ripoff. I sold a few music CDs to kids in grade 8 until Shawn complained that the songs sounded terrible and he wanted his money back. I hadn’t quite figured out what bitrate was, and had been going for the smallest possible file sizes to speed up my download time. I might have noticed the problem myself had I listened to the files before burning them, but I didn’t like Linkin Park.

Pokemon was a huge craze at my school (so was Crazy Bones, but whatever) and I didn’t have a GameBoy. I had a Sega GameGear which ate batteries like crazy, and I didn’t want to bring it to school in case I lost it or broke it. So I’d spend recess watching my friend Troy play Pokemon Red, and then Yellow. When I discovered emulation I was ecstatic. I copied a DOS-based GameBoy emulator onto my bedroom computer and got rather far into Pokemon Blue. However, playing at home by myself wasn’t that fun; I’d have rather been on the playground where I could trade or battle using the link cable. I played a few Sierra and LucasArts adventure games, played the Interplay Star Trek TOS games, and mostly stopped playing video games on that machine after that.

I would, however, occasionally try to play StarCraft with my friends online, but my latency always got me kicked from multiplayer games. I didn’t get into StarCraft very much; instead I played Civilization II after finding it for 10 dollars on a CD spinner at Canadian Tire in Barrie.

Highschool was when things got fun. Windows 98 machines with high-speed internet abounded! I took business computing in Grade 9 with Mr. Buffone and had a blast. The first thing we did was learn touch typing; was it a gift or a curse? I have a hard time saying today. We also watched Pirates of Silicon Valley, which became my favourite movie of all time. Finally I had a story, an origin, for this world of computers which I had been growing up in like the first explorer on a new continent. It was real; it was human. Wozniak and Jobs; Gates and Allen and Ballmer. We also watched Robert Cringely’s Triumph of the Nerds, which had one scene that, above all the rest, had a deep emotional impact on me. It was Dan Bricklin explaining how his accountant friend broke down upon seeing his “Magic Blackboard” software VisiCalc, saying “That’s what I do all week! I could do it in a few hours!” The immense societal changes brought on by the introduction of computers were brought into stark relief in that moment. All the talk of how computers were changing things was finally contextualized into a narrative about the world I could understand. Just like how Calmira was developed by regular people scratching an itch, creating what they wanted, computers were made in a similar sort of way: only those people all got rich from doing it.

Soon the 386 in my room was replaced with something faster: an AMD Athlon K7. In Grade 10 I took Communications Technology, where finally the doors to media production were blown open. We did graphic design with CorelDRAW! and 3D animation with Cinema 4D. I burned Cinema 4D straight out of the Program Files directory onto a blank CD and copied the registry keys necessary to get it to run. Soon I was working on my next grand project: an accurate model of the Starship Enterprise. Starting with the Franz Joseph designs, I quickly learned of their inadequacy, and after going through several different blueprints I finally anointed the Sinclair schematics as my definitive guide. I scoured the net for photographs of the original 11-foot filming model, many taken at its location in the gift shop at the Smithsonian, many more during its several restoration projects where drastic experiments in hull texture were undertaken. It took about 4 months to complete, and I rendered a single, horribly-overlit flyby before dropping my hard drive on the floor one day and losing it all. I cried for a whole day; it was a painful lesson in why making backups is important.

MSN Messenger was the de facto communications tool at my high school. Everyone had an account tied to their Hotmail email address. This was part of Microsoft’s Passport scheme, which was intended to be a single, centralized sign-on for many web services; the goal which has today been achieved by Facebook and Twitter. I was a part of several web communities, most of which were full of adults with great senses of humour. So usually on MSN I was actually quite witty and got plenty of LOLs from people who I wasn’t really friends with at all in school. It was confusing to me at the time. Why am I funny online, but not in person? On MSN your screen name could be pretty long, and so everyone used it as their online status. You would change your screen name to say what you were up to, where you were, or just to have some song lyrics or something funny. People would change their names several times a day. It was clean, elegant, and unimposing social networking. I seldom used my screen name directly in that way, instead opting to run a third-party MSN client that would update my screen name to show what song I was listening to. Another benefit was that there were no advertisements; however, I missed out on playing some of the minigames that came with the official client, or using the voice chat. Oh well, at least it was Free Software.

Free Software was something I was getting into. I had started reading Slashdot.org daily and was learning about GNU/Linux, the GPL, and Richard Stallman. Free software isn’t just free as in beer (a comparison I didn’t quite get), it’s free as in speech. That means every user has access to and control over the source code, and communities of volunteers and users are free to improve it, change it, and share it like they own it. That’s what Calmira had been! A metaphor by Neal Stephenson was often employed to explain it: GNU/Linux is like hippies selling free tanks on the side of the road across the street from the more respectable car lots run by Apple and Microsoft. No matter how often the hippies yelled “free tanks! rugged and durable!” they couldn’t get anybody to take one, since Microsoft and Apple were known to be the “safe” choices.

Some programmers saw the stipulations in the GNU General Public License as a burden which removed their freedom to copy and paste the code from GNU software into their own work. Doing that would instantly require them to re-license the source code of their entire work under the GPL and give it to whomever they distributed the software to. The GPL is a legal hazard to commercial software development firms, since any lazy programmer who copied the readily-available source from GPL software instantly contaminated their products with a legal obligation to give it all away for free or get sued! Tesla was only recently forced to release the source to the software running their cars for having made this very blunder. This understandably does not seem like “freedom” to developers and software vendors. However, this stipulation ensured that end-users of GPL software would always enjoy full access to the source code of their own computer, in effect keeping computing free and open and thwarting any Orwellian 1984 scenario brought about by means of proprietary software. If Apple, Microsoft, or any other major bedrock computer company ever went totalitarian, there would exist an auditable, trustworthy operating system and software base upon which freedom-loving people and societies could fall back, all thanks to the foresight of Richard Stallman. He had, in fact, preemptively saved the world from digital tyranny with a software license!

The radical nature of this heavily political act — utilizing the machinery of copyright law in a manner so subversive to its intended purpose — was awe-inspiring. Imagine: a software license which didn’t stop you from copying a work, but instead forced you to give it away fully to its users and contribute your improvements to all for the greater good. Richard Stallman was a hero, and yet I had never heard his name. Perhaps the nuances of computer development and intellectual property law were too arcane and boring for people to appreciate the historic impact of this champion for freedom against tyranny. Silly ignorant consumers… one day they’d learn the truth, I thought. And I would already be there with a great head start, and a smug “what took you so long?” look on my face!

I didn’t waste much time formatting my hard drive and beginning the long journey of learning computers all over again with GNU/Linux. Of course, since it’s free to copy, there are in fact many different outfits distributing their own flavour of GNU/Linux. I started with Mepis Linux, a Debian derivative, before switching to Mandrake, which was richer in its initial distribution and developed in France. Afterward I bit the bullet and took the plunge into Gentoo, which is a distro the user must build from the ground up from source, compiling all the code for the software themselves. Doing so gives you an education in every level of the operating system, since you’re creating it from scratch. It took me about 3 weeks, since it was summer break of 2004 and I kept having problems with overheating which crashed the compiler. It was so long before I got the graphics stack working that I was forced to take a detour and learn how to surf the internet, play music, and chat on MSN Messenger using the command line, as though I were back in DOS!

I’ve recently read, in full, the Neal Stephenson essay from which the above metaphor involving hippies with tanks was taken. It’s called “In the Beginning… Was the Command Line” and I earnestly recommend it to anybody who wants a comprehensive history of personal computing (sans Stallman and Kildall, alas). Given that old telecommunications equipment known as “teletypes” or “teleprinters” was used as the first textual computer interface (instead of punch cards or switches), Stephenson makes the wonderful creative decision to refer to all textual communication with computers as telegraphy. He also calls text files on computers telegrams. This is something I had thought up on my own before coming across this essay, by serendipitous coincidence, and so I’ll include the metaphor here even though I didn’t know of the essay back in 2004.

Back when I was using MS-DOS on my x86 computers, there was only one terminal; one DOS prompt. You could only run one program at a time, launched from within a directory which was on a drive. The “root” of the command line was the C: drive or the A: drive, denoting whichever storage device you were working with, and from there you’d go into folders. All of your commands are executed from wherever you are on the “path” you’ve traveled into the filesystem. Anything you type happens within this working directory, from which you can back out to the root of the drive.

For instance you can start at the root of your hard drive, at C:\>. By typing `mkdir docs` you can create a directory in the root. You can go into that directory by typing `cd docs`, meaning you’ve gone down a path to end up in C:\DOCS>. If you type `edit readme.txt` then you can launch a text editor to create a telegram called C:\DOCS\README.TXT. Quitting the text editor brings you back to the C:\DOCS> prompt, where you can choose to back out, or copy the file you’ve made somewhere else, etc. I was used to this from my childhood.

Well, while struggling to get Gentoo up and running, I was learning how UNIX (and, by extension, its derivatives like GNU/Linux) was basically a supercharged form of DOS. That’s anachronistic, since the DOS paradigm for microcomputers was developed after UNIX, but that was how I saw it. On UNIX you didn’t have one prompt; you had six. Each was called a TTY, which was short for teletype. By pressing Ctrl+Alt+F1 you can change to TTY1, pressing Ctrl+Alt+F2 gets you to TTY2, and so on. Switching to a new teletype gets you to a login prompt which asks you for your username and password before dropping you into your home directory. It’s just like having six teletypes (which look just like typewriters) on your desk, each of them providing you with a command prompt to the same computer. In fact, back in the day before video displays, that’s exactly how it worked! Each teletype was called a terminal, like the last station on a railway line, because it was at the end of the line coming out from the mainframe server computer running UNIX. In the 60s and 70s the teletypes (and later the keyboards and video screens sitting on desks) were not computers, merely terminals which remotely accessed the real computer elsewhere, which was probably quite large and expensive. This terminal/mainframe model of working on a faraway computer is analogous to what we today call the client/server model, or the pairing of apps and webapps in regards to “cloud” computing. Dilbert’s pointy-haired boss has many semantic marketing tricks to rebrand the old as the new.

Of course, in my bedroom, the six teletype terminals were all virtual within the single computer sitting on my desk. For my purposes, that meant I was running DOS but had six different “windows” to run apps in simultaneously, giving me multitasking! Sweet. This was the future (of DOS), at least while I spent all my time trying to get my video card drivers working so I could begin spending the hours, days, and nights compiling all the graphical software and finally start using my mouse again in a modern graphical user interface. That summer installing Gentoo was a real, practical lesson in operating systems. I had manually partitioned and formatted my hard drive, copied files over, crafted configuration files by hand, got my network card working, downloaded source code, configured each component, compiled it, and rinsed, lathered, repeated with plenty of troubleshooting, all in an environment which I was barely familiar with. In the final stretch I moved my operation into the living room, with my CRT on the coffee table and my open computer case up on the back of the couch, leaning against the grill of the window-mounted air conditioner just to stop the compiler from crashing. Can’t say my family was super thrilled, but I accommodated them the best I could. When it was finally over, and I had built an entire operating system from the ground up, I felt I could accomplish anything.

I had learned an entirely new paradigm for computing. Unlike DOS, in UNIX there is a single root directory; no A: drive or C: drive. Instead the computer has a single root, and the various drives are “mounted” as subdirectories. This crucial difference in the lowest level of the user environment made clear to me the arbitrary, human nature of operating system design as a model or abstraction which is crafted, like art, as something first-and-foremost meaningful for the user to understand. It is not for technical reasons, but for human reasons, that computers are designed with the abstractions that they are. Steve Jobs called the Apple II (emphatically not, as the schmaltzy Michael Fassbender biopic would mislead you to believe, the Macintosh) a bicycle for the mind. Bicycles don’t have seats and handlebars in order to make the gear ratios work better; they have them for the sake of the human interface. Likewise the entire concept of filesystems and directory paths doesn’t exist because it’s technically necessary, but because the artists who thought up ways of interfacing with machines had to subdivide the whole computer into fragmentary parts which the user could learn and work with in their own imagination. This way they could learn which keys to press or buttons to click intuitively, as though navigating through some sort of virtual space. The computer isn’t “in” a folder; you are! By learning DOS, Windows, MacOS, and GNU/Linux I was beginning to separate the specifics of implementation, which exist for the sake of the interface, from the universals inherent across all systems. Hardware is hardware and operating systems must boot precisely in accordance with it, but interface paradigms and software design are for the benefit of the human, not the computer.

I saved up some money to purchase a Palm m515 PDA. It was beautiful: 65,000+ colours on the bright 160×160 resolution display. I could HotSync the calendar, contacts, todo list and notes with the software on my GNU/Linux desktop just by putting it in the charging cradle and pressing a button. I could install tons of applications, and even archive websites onto it for offline reading on the go. I read many books on my Palm m515. I shared them with people, including the Commanding Officer of my Army Cadet corps, using the infrared file transfer (imagine two people, standing around, pointing TV remotes at each other for 5 minutes). I played infrared Battleship across class one time with someone in highschool; we passed turns back and forth from our desks. I even bought a folding keyboard and a pocket-sized, AA-battery-powered, infrared thermal printer, and handed in a few essays in class which I had noisily printed in size-eight font on long strips of receipt paper. My daily schedule was mapped out and I followed it well. I never was, and have never since been, as organized as I was with my trusty Palm PDA.

Around this time I also decided to resurrect my original Compaq Contura 4/25c. My dad had found fresh batteries for it on eBay, meaning that I could get it running on the school bus into Barrie each day. I was in retro-land, reliving my early years in DOS and Windows 3.1. My main uses for the laptop were playing Warcraft (the first one), Lemmings, and Star Wars: TIE Fighter using the clip-on trackball mouse.