Landschlacht, Switzerland, Thursday 12 September 2019
To somewhat contribute to the kitty that is our mutual income I wear two hats:
From Monday to Wednesday I offer my services as an ESL (English as a Second Language) teacher, from Thursday to Sunday I am available to work as a barista at Starbucks.
In my free time I write.
(Only the God of Christianity rests on Sunday.
This Jack is all work and no play.
A very dull boy indeed.)
(I hope one day to wear a third hat – or to exchange one of the other two hats – as a paid writer, but for now this blog is unpaid practice.)
I often wax poetic (i.e. write a lot) about Starbucks, but there are times, like now, when my teaching inspires a post or two.
As an ESL teacher I could be classified as an ESP teacher (English for Specific Purposes, not an educator with extrasensory perception) as I have been required to teach a plethora of topics where the students needed English for particular purposes.
I have taught English for:
- A prison guard who needed English to work in a United Nations prison in Kosovo
- A prospective sailboat buyer
- The insurance industry
- Pilots, cabin crew, aviation technicians and airline passengers
- Legal students
- Cambridge University examination candidates
- Accounting and banking students
- HR personnel and those seeking employment
- Executives and admin staff
- Customer service
- Pharmaceutical students
- Medical staff
- Theatrical staff (backstage and wardrobe)
- Conversation and presentations
- Business students and staff
- Engineering students
Of course, I am not an expert in all of these subjects, so I have had to teach myself beforehand what the students expect to learn, or at least I had to learn enough of their subjects to provide them with the English they needed to function and achieve their desired goals.
Of all these subjects the most challenging for me to teach has been anything remotely technical, for I have never been of this bent.
(Which makes me an embarrassment to the family name, as my father is a retired mechanic and my oldest brother a retired X-ray technician.)
Above: The family tartan
For example, I know nothing about cars and have never driven one.
I own a mobile phone, a laptop, a tablet and a desktop computer, but I use them to far less of their capacity than my colleagues do, and I am easily frustrated by them.
I am more likely to use them to gather dust than to gather information.
I am forever forgetting: passwords, to check my messages, to respond to emails, to save what I have spent many hours labouring on, to update, to reboot, or to do whatever the hell my computer tells me to do.
And that’s the thing.
More and more it seems to me that we don’t control technology.
It controls us.
We have become an impatient species, cursing at unemotional screens that won’t show us what we want to see as quickly as we want to see it.
And our impatience with our incomprehensible technology (most of us use it, but few of us understand it) extends to our relationships with other people.
Above: Alan Cumming as “Boris Grishenko“, James Bond film GoldenEye
“Why can’t people be as fast and efficient as our machines?“, we complain bitterly.
“I sent you a WhatsApp message or an email five minutes ago.
Why haven’t you responded?”
“Why can’t machines be as fast and efficient as our desires?“, people cry.
I marvel at contactless payment and tremble at the dystopian possibilities of a day when cards won’t be needed and our bodies become barcodes.
I hate the increasing dependence on technology and find myself shocked by how pervasive it has become.
I am reminded of E.M. Forster’s The Machine Stops…..
Above: Edward Morgan Forster (1879 – 1970), best known for A Room with a View, Howards End and A Passage to India
“The Machine Stops” is a science fiction short story by E. M. Forster.
After initial publication in The Oxford and Cambridge Review (November 1909), the story was republished in Forster’s The Eternal Moment and Other Stories in 1928.
After being voted one of the best novellas up to 1965, it was included that same year in the populist anthology Modern Short Stories.
In 1973 it was also included in The Science Fiction Hall of Fame, Volume Two.
The story, set in a world where humanity lives underground and relies on a giant machine to provide its needs, predicted technologies such as instant messaging and the Internet.
The story describes a world in which most of the human population has lost the ability to live on the surface of the Earth.
Each individual now lives in isolation below ground in a standard room, with all bodily and spiritual needs met by the omnipotent, global Machine.
Travel is permitted, but is unpopular and rarely necessary.
Communication is made via a kind of instant messaging/video conferencing machine with which people conduct their only activity: the sharing of ideas and what passes for knowledge.
The two main characters, Vashti and her son Kuno, live on opposite sides of the world.
Vashti is content with her life, which, like most inhabitants of the world, she spends producing and endlessly discussing secondhand ‘ideas‘.
Kuno, however, is a sensualist and a rebel.
He persuades a reluctant Vashti to endure the journey (and the resultant unwelcome personal interaction) to his room.
There, he tells her of his disenchantment with the sanitised, mechanical world.
He confides to her that he has visited the surface of the Earth without permission, and that he saw other humans living outside the world of the Machine.
However, the Machine recaptures him and he is threatened with ‘Homelessness‘: expulsion from the underground environment and presumed death.
Vashti, however, dismisses her son’s concerns as dangerous madness and returns to her part of the world.
As time passes, and Vashti continues the routine of her daily life, there are two important developments.
First, the life support apparatus required to visit the outer world is abolished.
Most welcome this development, as they are skeptical and fearful of first-hand experience and of those who desire it.
Secondly, “Technopoly“, a kind of religion, is re-established, in which the Machine is the object of worship.
People forget that humans created the Machine and treat it as a mystical entity whose needs supersede their own.
Those who do not accept the deity of the Machine are viewed as ‘unmechanical‘ and threatened with Homelessness.
The Mending Apparatus—the system charged with repairing defects that appear in the Machine proper—has also failed by this time, but concerns about this are dismissed in the context of the supposed omnipotence of the Machine itself.
During this time, Kuno is transferred to a room near Vashti’s.
He comes to believe that the Machine is breaking down and tells her cryptically “The Machine stops.”
Vashti continues with her life, but eventually defects begin to appear in the Machine.
At first, humans accept the deteriorations as the whim of the Machine, to which they are now wholly subservient, but the situation continues to deteriorate, as the knowledge of how to repair the Machine has been lost.
Finally, the Machine collapses, bringing ‘civilization‘ down with it.
Kuno comes to Vashti’s ruined room.
Before they perish, they realise that humanity and its connection to the natural world are what truly matter, and that it will fall to the surface-dwellers who still exist to rebuild the human race and to prevent the mistake of the Machine from being repeated.
In the preface to his Collected Short Stories (1947), Forster wrote that “‘The Machine Stops‘ is a reaction to one of the earlier heavens of H. G. Wells.”
In The Time Machine, Wells had pictured the childlike Eloi living the life of leisure of Greek gods whilst the working Morlocks lived underground and kept their whole idyllic existence going.
In contrast to Wells’ political commentary, Forster points to the technology itself as the ultimate controlling force.
Isaac Asimov’s second Robot novel, The Naked Sun, takes place on a planet similar to the Earth seen in this story.
On the planet Solaria, human colonists live isolated from one another, viewing each other only through holograms and interacting only with their robot retinues.
After several centuries the humans have become so dependent on this practice that it has become taboo even to be in the presence of another human being.
The song “The Machine Stops” by the band Level 42 not only shares the same title with the story but also has lyrics that echo Kuno’s thoughts.
Both George Lucas’s film THX 1138 (1971) and the original novel version of Logan’s Run (1967) by William F. Nolan and George Clayton Johnson bear multiple similarities to “The Machine Stops“.
Few people see the scenery; heads are bent down over handheld screens.
And I am as guilty as they, for I like to compose my thoughts of the day on Facebook as I travel on the train.
I recall a visit to the Luzern Historical Museum where each visitor is handed a tablet, all the exhibited objects are barcoded, and information is gathered by scanning the barcodes.
Never have I ended a visit to a historical museum so quickly, for there was no sense of history or heritage felt by constantly staring downwards at a tablet screen.
My wife constantly curses me for buying books for an already overflowing library in our cramped apartment.
“Get a Kindle!“, she shouts.
“Read books electronically, at a cheaper price and without all the space taken up.”
I will not.
I am also reminded of George Orwell’s Nineteen Eighty-Four and Ray Bradbury’s Fahrenheit 451, where information is altered to suit the government’s agenda in Orwell’s dystopia and books are burned in Bradbury’s hell.
A book is complete and unalterable once it is published and printed and purchased and collected and pondered and cherished.
An electronic book is never completely immune from alteration by those who desire we read only what they feel we ought to and not have thoughts that cause us to question the life that they control.
All of this comes to mind as I recall an old article I stumbled across in my post-bombing Dresden study about the World Wide Web and he who is called its father.
Now I am not a total Luddite when it comes to the Internet, for it truly has many positive aspects about it.
(The Luddites were a secret oath-based organization of English textile workers in the 19th century, a radical faction which destroyed textile machinery as a form of protest.
The group was protesting against the use of machinery in a “fraudulent and deceitful manner” to get around standard labour practices.
Luddites feared that the time spent learning the skills of their craft would go to waste, as machines would replace their role in the industry.
Over time, however, the term has come to mean one opposed to industrialisation, automation, computerisation, or new technologies in general.
The Luddite movement began in Nottingham in England and culminated in a region-wide rebellion that lasted from 1811 to 1816.
Mill and factory owners took to shooting protesters and eventually the movement was suppressed with legal and military force.)
On the Net you can find the answer to any question (except maybe the meaning of life), shop the globe (caveat emptor), send documents worldwide in a flash, hear new music, dabble in the stock market, visit art galleries, read books electronically, play games, video chat with faraway relatives, make friends with similar interests, catch up on the latest hometown news, make phone calls, grab free software, manage your bank accounts – or just fritter away your free time surfing the almost infinite universe of the Web.
And the Net is not merely a toy, but it is a tool firmly entrenched in the workplace, including at Starbucks.
(Except in my classrooms where I remain an old-fashioned blackboard / whiteboard teacher who has yet to use electronics to teach.)
Millions of companies use the Net to promote their products (annoyingly so), take orders and support their customers (like an assembly line).
More business communication is done by email and instant messaging than via phone calls, faxes (remember those?) and printed correspondence combined.
Personally I prefer face-to-face encounters where we interact with one another as human beings rather than with words or pictures on a screen.
Put simply, the Internet consists of millions of computers connected via cables and radio waves.
At the core of this giant network are a series of computers permanently joined together through high-speed connections.
To connect to the Net you simply connect your computer to any one of these networked computers via an Internet Service Provider (ISP).
The moment you do this, your computer itself becomes part of the Internet.
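The plumbing described above can be sketched in a few lines of Python: one socket plays the part of an already-networked host, another connects to it, and bytes flow between them. This is a toy on the local loopback interface, not a real ISP connection, but the mechanics are the same.

```python
import socket
import threading

# A toy "networked host": listens on the loopback interface.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))        # port 0: let the OS pick a free port
server.listen(1)
host, port = server.getsockname()

def serve():
    conn, _ = server.accept()        # wait for one client to join
    conn.sendall(b"Welcome to the network")
    conn.close()

threading.Thread(target=serve).start()

# Our "computer" joins the network by connecting to the host.
client = socket.create_connection((host, port))
message = client.recv(1024)
client.close()
server.close()
print(message.decode())              # -> Welcome to the network
```

The moment the client connects, it is a node like any other: it can send as well as receive, which is precisely what made the Net decentralized.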
Let the madness of the Matrix begin.
Of course we are assured that the Net is not really about computers and cables and satellites that string everything together.
It is about people and communication and sharing knowledge.
(They chuckle over this in Silicon Valley.)
Above: San Jose, California
It is about people spending more time with technology than with each other, communication that is faster but less dignified or diplomatic, and an overabundance of information from which few can glean fact from fiction.
The theory with all new techno-toys, innovative inventions and dazzling discoveries is that our lives will be improved.
I don’t subscribe to this point of view.
Faster and more does not necessarily mean better.
It just means faster and more.
It was 1957 at the height of the Cold War.
The Soviets had just launched the first Sputnik, thus beating the US into space.
The race was on.
Above: Replica of Sputnik 1
In response the US Department of Defense formed the Advanced Research Projects Agency (ARPA) to bump up its technological prowess.
Twelve years later, ARPA spawned ARPAnet – a project to develop a military research network, or, specifically, the world’s first decentralized computer network.
In those days, no one had PCs.
The computer world was based on mainframe computers and dumb terminals.
These usually involved a gigantic, fragile box in a climate-controlled room, which acted as a hub, with a mass of cables spoking out to keyboard / monitor ensembles.
The concept of independent intelligent processors pooling resources through a network was brave new territory that would require the development of new hardware, software and connectivity methods.
The driving force behind decentralization, ironically, was the bomb-proofing factor.
Nuke a mainframe and the system goes down.
But bombing a network would, at worst, remove only a few nodes.
The remainder could route around it unharmed.
Or so the theory went….
Over the next decade, research agencies and universities flocked to join the network.
US institutions, such as UCLA, MIT, Stanford and Harvard led the way.
In 1973, the network crossed the Atlantic to include University College London and Norway’s NORSAR research station.
Above: Flag of Norway
The 1970s also saw the introduction of electronic mail, FTP (file transfer protocol), Telnet and what would become the Usenet newsgroups.
The early 1980s brought TCP/IP (Transmission Control Protocol / Internet Protocol), the domain name system, Network News Transfer Protocol and the European networks EUnet (European UNIX Network), Minitel (the widely adopted French consumer network) and JANET (Joint Academic Network), as well as the Japanese UNIX network.
ARPAnet evolved to handle the research traffic, while a second network, MILnet, took over US military traffic.
An important development took place in 1986, when the US National Science Foundation established NSFnet by linking five university super-computers at a backbone speed of 56Kbps.
This opened the gateway for external universities to tap into superior processing power and share resources.
Between 1984 and 1988 the number of host computers on the Internet (as it was now called) grew from about 1,000 to over 60,000.
Over the next few years, more and more countries joined the network, spanning the globe from Australia and New Zealand to Iceland, Israel, Brazil, India and Argentina.
It was at this time, too, that Internet Relay Chat (IRC) burst onto the scene by providing an alternative to CNN’s incessant, but censored, Gulf War coverage.
By this stage, the Net had grown far beyond its original charter.
Although ARPA had succeeded in creating the basis for decentralized computing, whether it was an actual military success is debatable.
It might have been bombproof, but it also opened new doors to espionage.
It was never particularly secure and it is suspected that Soviet agents routinely hacked in to forage for research data.
In 1990, ARPAnet folded and NSFnet took over administering the Net.
Above: Flag of the Soviet Union (1922 – 1991)
Global electronic communication was far too useful and versatile to stay confined to academics.
Big business was starting to notice.
The Cold War looked as if it was over and world economies were regaining confidence after the 1987 stock market savaging.
In most places, market trading moved from the pits and blackboards onto computer screens.
The financial sector expected fingertip real-time data and that desire was spreading.
The world was ready for a people’s network.
And, since the Net was already in place, funded by taxpayers, there was really no excuse not to open it to the public.
In 1989, Tim Berners-Lee of CERN, the European particle physics laboratory near Geneva, proposed the basis of the World Wide Web, initially as a means of sharing physics research.
Above: Logo of the Conseil européen pour la Recherche nucléaire (European Organization for Nuclear Research)
By 1985, the global Internet began to proliferate in Europe and the Domain Name System (upon which the Uniform Resource Locator is built) came into being.
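The way a Uniform Resource Locator sits on top of the Domain Name System is easy to see by taking one apart. As a small illustration (using the real historical address of CERN’s first website), Python’s standard urllib does the splitting:

```python
from urllib.parse import urlsplit

# A URL bundles three things: a protocol, a domain name, and a document path.
url = urlsplit("http://info.cern.ch/hypertext/WWW/TheProject.html")

print(url.scheme)    # -> http  (which protocol to speak)
print(url.hostname)  # -> info.cern.ch  (the name DNS resolves to an address)
print(url.path)      # -> /hypertext/WWW/TheProject.html  (which document to fetch)
```

Only the middle piece, the hostname, involves the Domain Name System: DNS turns it into a numeric address, and the rest of the URL tells the machine at that address what to serve.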
In 1988 the first direct IP connection between Europe and North America was made and Berners-Lee began to openly discuss the possibility of a web-like system at CERN.
Above: Sir John Timothy Berners-Lee
While working at CERN, Berners-Lee became frustrated with the inefficiencies and difficulties posed by finding information stored on different computers.
On 12 March 1989, he submitted a memorandum, titled “Information Management: A Proposal” to the management at CERN for a system called “Mesh” that referenced ENQUIRE, a database and software project he had built in 1980, which used the term “web” and described a more elaborate information management system based on links embedded as text:
“Imagine, then, the references in this document all being associated with the network address of the thing to which they referred, so that while reading this document, you could skip to them with a click of the mouse.“
Such a system, he explained, could be referred to using one of the existing meanings of the word hypertext, a term that he says was coined in the 1950s.
There is no reason, the proposal continues, why such hypertext links could not encompass multimedia documents including graphics, speech and video, so that Berners-Lee goes on to use the term hypermedia.
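Those embedded links eventually became HTML anchor tags, each carrying the network address of the thing it refers to, exactly as the proposal imagined. As an illustrative toy (the document snippet below is invented, not CERN’s original hypertext), Python’s standard html.parser can pull the addresses out:

```python
from html.parser import HTMLParser

# A scrap of hypertext: each reference carries the network address (href)
# of the thing it refers to.
DOCUMENT = """
<p>See the <a href="http://info.cern.ch/proposal.html">proposal</a>
and the <a href="http://info.cern.ch/hypertext/WWW/TheProject.html">project page</a>.</p>
"""

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":  # an anchor: a clickable reference to another document
            self.links.extend(value for name, value in attrs if name == "href")

parser = LinkExtractor()
parser.feed(DOCUMENT)
print(parser.links)
```

A browser does essentially this, then fetches whichever address the reader clicks; that is the whole "web" in miniature.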
In a list of 80 cultural moments that shaped the world, chosen by a panel of 25 eminent scientists, academics, writers, and world leaders, the invention of the World Wide Web was ranked number one, with the entry stating:
“The fastest growing communications medium of all time, the Internet has changed the shape of modern life forever.
We can connect with each other instantly, all over the world“.
With help from his colleague and fellow hypertext enthusiast Robert Cailliau he published a more formal proposal on 12 November 1990 to build a “Hypertext project” called “WorldWideWeb” (one word) as a “web” of “hypertext documents” to be viewed by “browsers” using a client–server architecture.
Above: Robert Cailliau
At this point HTML and HTTP had already been in development for about two months and the first Web server was about a month from completing its first successful test.
This proposal estimated that a read-only web would be developed within three months and that it would take six months to achieve “the creation of new links and new material by readers, [so that] authorship becomes universal” as well as “the automatic notification of a reader when new material of interest to him/her has become available“.
While the read-only goal was met, accessible authorship of web content took longer to mature, with the wiki concept, WebDAV, blogs, Web 2.0 and RSS/Atom.
His goal was a seamless network in which data from any source could be accessed in a simple, consistent way with one program, on any type of computer.
Above: This NeXT computer used by Berners-Lee at CERN became the first Web server.
The Web did this, encompassing all existing infosystems, such as FTP, Gopher and Usenet, without alteration.
It remains an unqualified success.
As the number of Internet hosts exceeded one million, the Internet Society was formed to brainstorm protocols and attempt to coordinate and direct the Net’s escalating expansion.
(The Internet Society (ISOC) is an American nonprofit organization founded in 1992 to provide leadership in Internet-related standards, education, access, and policy.
Its mission is “to promote the open development, evolution and use of the Internet for the benefit of all people throughout the world“.
The Internet Society has its global headquarters in Reston, Virginia, United States (near Washington, D.C.), a major office in Geneva, Switzerland, and regional bureaus in Brussels, Singapore and Montevideo.
It has a global membership base of more than 100,000 organizational and individual members.)
Mosaic – the first graphical Web browser – was released and declared to be the “killer application of the 1990s”.
Mosaic made navigating the Internet as simple as pointing and clicking and took away the need to know UNIX.
The Web’s traffic increased 25-fold in the year up to June 1994.
Above: Mosaic logo, National Center for Supercomputing Applications (NCSA)
Domain names for commercial organizations (.com) began to outnumber those of educational institutions (.edu).
As the Web grew, so too did the global village.
The media began to notice, slowly realizing that the Internet was something that went way beyond propeller heads and students.
Almost every country in the world had joined the Net.
Even the White House was online…..
It was an auspicious start.
When Tim Berners-Lee, then an awkward 34-year-old coder from South London, wrote a document outlining plans for the World Wide Web and handed it to his boss three decades ago, it was eventually returned with three words pencilled in the corner:
“Vague but exciting.”
Today, Sir Tim, who has a tendency to talk in rapid bursts and skip over words, recalls how “nobody really did anything” after he wrote his proposal in March 1989.
It wasn’t until 18 months later that his boss at CERN agreed that Tim could work on his idea as a hobby, or sort of “play project” as he puts it.
The rest, as they say, is history and Sir Tim has never looked back.
The softly spoken computer scientist could not have imagined that this side project would later pave the way for colossal companies like Facebook, Amazon, Netflix and Google.
It would connect billions globally, spawn the creation of entire new industries and serve as a growing bank for humanity’s knowledge.
But it would also give rise to fake news, political manipulation and Internet addiction on a terrifying scale….
As word of a captive market got around, entrepreneurial brains went into overdrive.
Canter & Siegel, an Arizona law firm, notoriously “spammed” Usenet with advertisements for the US Green Card Lottery.
Although the Net was tentatively open for business, crossposting advertisements to every newsgroup was considered bad form.
Such was the ensuing backlash that C & S had no chance of filtering out genuine responses from the server-breaking level of hate mail they received.
A precedent had been set for how not to do business on the Net.
Pizza Hut, by contrast, showed how to do it subtly by setting up a trial service on the Web.
Although it generated wads of positive publicity, it too was doomed by impracticalities.
Nevertheless, the ball began to roll….
“For me, I watched the Web grow from something where people had very Utopian dreams about it, to something where I had to point out to them that this is just a reflection of humanity,“ says Sir Tim, now 63, with a quick shrug of resignation.
The animated software engineer stepped off stage at the Open Data Institute, the organization he founded to promote the use of open data.
Sir Tim is eager to explain how the Web, just like humanity, encompasses the good, the bad and the ugly.
Sadly, in an intensifying battle between these different forces online, it is the latter two which have emerged as the clear winners in recent months…..
As individuals arrived to stake out Web territory, businesses followed.
Most had no idea what to do once they got their brand online.
Too many arrived with a bang, only to peter out in a perpetuity of “under construction” signs.
Soon business cards not only sported email addresses, but Web addresses as well.
And, rather than send a CV and stiff letter, job aspirants could now send a brief email accompanied with a “see my webpage” for further details.
The Internet moved out of the realm of luxury into an elite necessity, verging toward a commodity.
Some early business sites gathered such a following that by 1995 they were able to charge high rates for advertising banners.
Above: Spanish language Wikipedia web banner
In late 1995 the backlash against Internet freedom moved into full flight.
The expression “found on the Internet” became the news tag of the minute, depicting the Net as the source of everything evil – from bomb recipes to child pornography.
While editors and commentators, often with little direct experience of the Net, urged that children be protected, the Net’s own media and opinion shakers pushed the freedom of speech wheelbarrow, claiming that the very foundations of democracy were at stake.
Above: The Parthenon, Athens, Greece
At first politicians didn’t take much notice.
Few could even grasp the concept of what the Net was about, let alone figure out a way to regulate its activities.
The first, and easiest, target was porn, resulting in raids on hundreds of private bulletin boards (BBSs) worldwide and a few much-publicized convictions for child porn.
BBSs were sitting ducks, being mostly self-contained and run by someone who could take the rap.
Net activists, however, feared that the primary objective was to send a ripple of fear through a Net community that believed it was bigger than the law and to soften the public to the notion that the Internet, as it stood, posed a threat to national wellbeing.
In December 1995, at the request of German authorities, CompuServe cut its newsfeed to exclude the bulk of newsgroups carrying sexual material.
But the groups cut were not just pornographers.
Some were dedicated to gay and abortion issues.
This brought to light the difficulty in drawing the lines of obscenity and the problems of publishing across foreign boundaries.
Next came the US Communications Decency Act, proposed legislation to forbid the online publication of “obscene” material.
The Act was poorly conceived, however, and, following opposition from a very broad range of groups (including such mainstream bodies as the American Libraries Association), it was overturned, the decision later being upheld in the Supreme Court.
Outside the US, more authorities reacted.
In France, three ISP Chiefs were temporarily jailed for supplying obscene newsgroups.
Above: Flag of France
Meanwhile in Australia police prosecuted several users for downloading child porn.
New South Wales (NSW) courts introduced legislation banning obscene material with such loose wording that the Internet itself could be deemed illegal – if the law is ever tested.
Above: Flag of the Commonwealth of Australia
In Britain, the police tried a “voluntary” approach in mid-1996, identifying newsgroups that carried pornography beyond the pale and requesting that providers remove them from their feed.
Most complied, but there was unease within the Internet industry that this was the wrong approach.
The same groups would migrate elsewhere and the root of the problem would remain.
The debate was, and is, about far more than porn.
For Net fundamentalists, the issue is about holding ground against any compromises in liberty and retaining the global village as a political force.
One that is potentially capable of bringing down governments and large corporations.
Indeed, they argue that these battles over publishing freedom have shown governments to be out of touch with both technology and the social undercurrent, and that, in the long run, the balance of power will shift towards the people, towards a new democracy…..
When it comes to bad, cyberhacks have eroded people’s trust in online security and left millions exposed to fraud.
The growing use of smartphones caused a generation of children to become increasingly addicted, anxious and lonely.
Meanwhile, governments around the world stepped up efforts to monitor citizens through online tools that are often invasive and discriminatory…..
Another slow news day story of the 1990s depicted hackers ruling networks, stealing money and creating havoc.
Great reading, but the reality was less alarming.
Although the US Department of Defense reported hundreds of thousands of network break-ins, they claimed it was more annoying than damaging, while in the commercial world little went astray except the odd credit card file.
(Bear in mind that every time you hand your credit card to a shop assistant they get the same information.)
In fact, by and large, for an online population greater than the combined size of Calcutta, London, Moscow, New York and Tokyo, there were surprisingly few noteworthy crimes.
Yet the perception remained that the Net was too unsafe for the exchange of sensitive information, such as payment details.
Libertarians raged at the US government’s refusal to lift export bans on crack-proof encryption algorithms.
But cryptography, the science of message coding, has traditionally been classified as a weapon and thus export of encryption falls under the Arms Control acts.
Above: Germany’s Enigma machine (1922 – 1945)
Encryption requires a special key to open the contents of a message and often another public key to code the message.
These keys can be generated for regular use by individuals or, in the case of Web transactions, simply for one session upon agreement between the server and the client.
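That per-session agreement between server and client can be illustrated with a toy Diffie-Hellman exchange. The numbers below are deliberately tiny and insecure (real systems use enormous parameters), but the principle is the same: each side publishes one value, and both independently derive the same secret key without ever transmitting it.

```python
# Toy Diffie-Hellman key agreement (insecure demo numbers).
p, g = 23, 5                 # public modulus and base, agreed in the open

server_private = 6           # each side keeps its own number secret
client_private = 15

# Each side publishes g^private mod p -- safe to send over the wire.
server_public = pow(g, server_private, p)
client_public = pow(g, client_private, p)

# Both sides raise the other's public value to their own private one...
server_key = pow(client_public, server_private, p)
client_key = pow(server_public, client_private, p)

# ...and arrive at the same session key, which never crossed the wire.
print(server_key == client_key)   # -> True
```

An eavesdropper sees only p, g and the two public values; recovering the shared key from those is the hard problem that real cryptography is built on.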
Several governments proposed to employ official authorities to keep a register of all secret keys and surrender them upon warrant – an unpopular proposal, to put it mildly, among a Net community who regard invasion of privacy as an issue equal in importance to censorship, and government monitors as instruments of repression.
However, authorities were so used to being able to tap phones, intercept mail and install listening devices to aid investigations, they did not relish giving up that freedom either.
Government officials made a lot of noise about needing to monitor data to protect national security, though their true motives probably involve monitoring internal insurgence and catching tax cheats.
Stuff they are not really supposed to do but we put up with it anyway because if we are law-abiding it is mostly in our best interests.
The implications of such obstinacy went far beyond personal privacy.
Businesses awaited browsers that could talk to commercial servers using totally snooper-proof encryption.
Strong encryption technology had already been built into browsers, but it was illegal to export them from the US.
At any rate, the law was finally relaxed in mid-2000…..
Then, in March 2018, things got ugly.
The Cambridge Analytica scandal showed how far political campaigners were prepared to go when it came to collecting user data to manipulate behaviour.
“I think what happened with Cambridge Analytica, it was the tipping point for people on the street,” says Sir Tim.
“Before you would not have written about these things in the Telegraph, and now you do because people realize that there are big systems and processes at work that they are contributing to by giving their data, and those systems are being used to manipulate them.”
Born in Richmond, young Tim showed a natural ability with computers.
By his own admission, he was hopeless at sports, instead preferring to spend his days trainspotting and tinkering with model railways.
After attending Emanuel, a private school in Battersea, the web pioneer studied physics at Oxford.
It was here that he met his first wife and built his own computer using an M6800 processor and scrap parts from an old television.
But it wasn’t until he joined the European Organization for Nuclear Research (CERN) after graduating that his life really began to change.
His work led him to create the blueprint for the World Wide Web – a system designed to share data between nuclear scientists.
On 30 April 1993, Tim’s browser was placed in the public domain, available for anyone to use for free.
It quickly took on a life of its own.
The “Father of the Web” tries to pinpoint the moment he realized his creation, which he has never profited from, had turned against him.
“I don’t think there was one particular place, but I remember there was the Twitter bombing of the Martha Coakley election, where people demonstrated how that election had been largely manipulated.
She was attacked using social media.”
Above: Logo for Twitter
Tim was referring to the 2010 Twitter bot attack on Attorney General Martha Coakley, who was running for the US Senate.
Within the space of 138 minutes, nine Twitter accounts spread misinformation designed to make users believe that Coakley hated Catholics.
Above: Martha Coakley
Today, the incident is regarded as the first known bot attack used to influence politics.
Sir Tim is clearly frustrated with the lack of progress from social media companies.
“I think when you look at all of the issues of fake news, I think we do need to move quickly, because you never know how the next election is going to be.
It is important to try to make sure that the social networks and all the systems we have out there are ones that have been engineered as much as possible to produce constructive debate and to produce systems where people can be held accountable for things that they say that aren’t true.”
But Sir Tim is, at heart, an optimist, and claims that Facebook has shown signs it is changing its ways.
That could be something to do with the fact that Facebook’s CEO Mark Zuckerberg signed up to Sir Tim’s Magna Carta for the Web – a contract that requires tech companies to respect data privacy and “support the best in humanity”.
It remains to be seen, however, how far the deal is upheld…
Above: Facebook CEO Mark Zuckerberg
Sir Tim is keen to emphasize that data is not a force for evil.
Used in the right way, it can have massive benefits.
He admits he is somewhat of a data addict himself:
“I do like tracking myself, I do like to track where I have been, what I have been up to, who I have been talking to.”
Like a parent who won’t give up on his child, Sir Tim appears hopeful the worst days of the Web are over.
“I think the winds have changed.
We will get past this time.”
Now that the people’s network has the globe in a stranglehold, you might assume that the wired revolution is as good as over.
Perhaps it is, but for those in the dark the reality is less comforting.
A passable knowledge of the Internet was enough to land you a job in 1997.
Today, in many fields, it is a basic prerequisite.
It is a case of get online or get left behind.
Still, like it or not, the Net is the closest thing yet to an all-encompassing snapshot of the human race.
Never before have our words and actions been so immediately accountable in front of such a far-reaching audience.
If we are scammed, we can instantly warn others.
If we believe there is a government cover-up, we can expose it through the Net.
If we want to be heard, no matter what it is we have to say, we can tell it to the Net.
And, in the same way, if we need to know more, or we need to find numbers, we can turn to it for help.
The problem with such rapid improvement is that our expectations grow to meet it.
But the Net is still only in its infancy.
So be patient.
Enjoy it for what it is today.
It is amazing it works at all.
Sources: Wikipedia / Google / Peter Buckley and Duncan Clark, The Rough Guide to the Internet / Ellie Zolfagharifard, “Tim Berners-Lee, the ‘Father of the Web’, talks about his plan to fight fake news”, World and Press, 2 February 2019