I’m in favor of Article 13

There’s a very neat trick when it comes to obscuring discourse: conflating unrelated-yet-confusingly-similar issues to make them seem bigger than they really are, and to steal support from otherwise-legitimate causes.

This is an ongoing example (link):


Hillary Clinton is facing backlash for arguing that European leaders should try to assuage the concerns of a growing right-wing populism across the continent by refusing to offer “refuge and support” to migrants.


… politicians expressed shock and concern with Clinton’s comments, which some said appeared to contradict her 2016 campaign position on welcoming immigrants and refugees.

Eliza Relman, Business Insider US

Immigrants and refugees.

There’s a world of difference between the two. Personally, I’m in favor of more open borders and a greater flow of immigrants: people who explicitly and voluntarily decide how and where they want to live, and are prepared to put in the work to contribute positively and assimilate into the culture are all right in my book.

Refugees are very different. They’re not moving by choice, but by necessity. Being coerced out of the land they chose to live in means they’ll hang on to their culture and traditions (as they should, by their own volition).

When this issue is reported though, immigrants and refugees are routinely conflated, to the point where they’re treated as synonyms for each other – when they clearly are not. So now, even though I completely support as much voluntary, legal immigration as countries can bear, I’m also expected to support the unmitigated flow of refugees into systems that cannot integrate them at all.

I feel the exact same thing happening with this Article 13 issue.

At its core, the EU leans in favor of human rights. The regulations handed down are more often for the protection of citizens than not – GDPR being a stellar recent example.

Article 13 protects rightsholders by preventing unauthorized use of their work. The initial draft of the bill proposed some truly terrible mitigations (requiring automated content filtering on all uploads to catch violations), but the final bill has watered that down quite a lot.

When Article 13 is reported on though, it’s usually with a message like this:


the EU’s new copyright directive have stoked fears that memes will effectively be banned


platforms will have to pay a fee to share a link to a news article and have to start filtering and removing memes.


they will arbitrarily remove content based on their terms and conditions. As a result, many creators will see their content get blocked


Only platforms with deep pockets will be able to comply with the Article 13 requirements

It’s all horseshit, reasoned from a faulty premise that legitimizes theft under the banner of “user-generated content”. The internet that anti-Article 13 activists are fighting to protect was largely built on wide-scale infringement, with the inability to enforce existing laws taken as tacit permission to break them all.

But every wild west is eventually tamed, and the internet is long overdue for this. The truth (especially in Facebook and YouTube’s case) is that copyright-infringed content has been the biggest driver of their success. While they’ll fight Article 13 and sell it as the platforms “standing up for the creators” (and we’ll come back to “creators”), in reality they desperately need the freely-generated content to keep flowing – that’s all that keeps eyeballs on the site, and ad dollars flowing.

As platforms, between DMCA, Fair Use and Safe Harbor, they effectively have a license to print money (or in this case, monetizable attention). They can provide platforms that permit millions of people to violate copyright, then simply take their time to remove infringing content, while never having to compensate the victim.

While they claim to be acting in the best interests of “creators”, they’ve managed to come up with a very narrow, self-serving definition of “creator”: anyone who uploads anything. Truly independent creators are suffering the most under the current regime, as Kurzgesagt has illustrated beautifully.

It’s that “immigrants and refugees” trickery all over again: conflating the independent artists who put their backs into creating original content, with the vampires who cut and re-share it without attribution (or fair compensation) to build their own profiles. To the platforms, these are both considered “creators”, which is why this statement from YouTube’s CEO should come as no surprise:


Article 13 as written threatens to shut down the ability of millions of people — from creators like you to everyday users — to upload content to platforms like YouTube. And it threatens to block users in the EU from viewing content that is already live on the channels of creators everywhere. This includes YouTube’s incredible video library of educational content, such as language classes, physics tutorials and other how-to’s.

Susan Wojcicki

(I wonder if Susan’s “educational content” includes these horror shows aimed at young children.)

There’s a whole lot of very subtle trickery in that paragraph. For one, and this is probably the most important point in the whole debacle:

People are rightsholders too.

Most of the criticism about Article 13 sets up this dystopian scenario where a few large companies (Disney, FOX, etc) will end up being the only ones who can publish anything, since Article 13 protects copyright and copyright is evil.

Except, it’s not. In most common-law countries, copyright is actually very simple: You make it, you own it. And if you own it, you should have some say in how it gets used – including permitting people to use your stuff for free, which is what Creative Commons is all about.

Copyright is only evil in a world where you can’t create new things, and the reality here is that a lot of this outrage is coming from people who have built businesses, careers and social standing off the work of others. They’re the ones with the most to lose if laws like Article 13 pass, which is why “copyright” is routinely cast as this benefit that only applies to large companies with expensive lawyers. 

Copyright is a thing we’ve had since the Berne Convention of 1886, adopted by pretty much every country on Earth.

You could go (right now) and outline a story about a high school for wizards. Apply some creativity, take on a new angle, mix in your own experiences, draw from a large array of influences and produce something unique – and by default, you’ll have the copyright on it.

That’s creation. That’s what authorship is supposed to look like. Taking a three-second clip from a movie and dubbing a different voice over it is, at best, imitation.

But it’s that imitation that’s now being heralded as “creation”, defended by companies that desperately need large volumes of content to monetize but cannot (or will not) invest in producing it themselves.

Of course, there are more arguments against Article 13, for instance:

Only large companies will be able to afford compliance! Only big platforms like Facebook and YouTube could possibly do this!

Garbage. Setting up your own website comes with a cost of $free, and you have full control over what goes up on there. The only reason these large companies are the “only ones who can afford compliance” is that their business model depends on large-scale, unmonitored, unchecked user-generated content that can be monetized – with bonus points for presenting all of that as a defense of free speech.

This will kill creativity! Nobody will be able to make anything new! Copyrights prevent people from experimenting!

More garbage. The thing about copyright (other than it being a basic human right, globally enforced and freely available) is that the rightsholder can do whatever they want with the rights, including making it available for adaptation.

It’s as if everyone’s taken crazy pills and forgotten that CC-BY-SA exists.

Even in a world where that experimentation/remixing/adaptation is universally good for business, rightsholders (everyone from Disney to neighbor Dorothy) should have some say over how their work is used. If they decide to prevent remixes, that’s their business. Everyone that takes a more relaxed approach will benefit, and the free market will sort itself out.

Ultimately, that’s why I’m in favor of legislation that tries to protect rightsholders from unauthorized use of their work, while still giving them the option of making their work available for adaptation and re-use: Because people are rightsholders too.

Definitely not missing the bird

Next week Tuesday (27 November) will mark the 1-month point since I decided to completely ditch Twitter from my daily routine. On Tuesday itself I’m likely to be very busy, but I wanted to take a moment to write some feedback on this experiment.

I can confidently say I don’t miss it one bit. The trade-off just wasn’t worth it: For every connection that was worthwhile, there were at least 20x more knee-jerk reactions and vapid hot takes, and 100x the volume of neurotic, click-bait psychic garbage swirling around a monetized trash compactor explicitly designed to make people feel angry. It felt like mental chemotherapy except it was also giving me cancer.

Twitter had basically become an abusive relationship. For the slim chance that something nice might happen, I was hanging on through torrents of garbage that made me jealous, depressed, despondent (especially the South African political feeds!), and ultimately self-destructive.

Granted, there are much better ways to use Twitter, and there are definite upsides to staying informed. Keeping abreast of major news stories is the easiest form of social currency there is, and it’s a great way to stay updated on interesting projects and useful tools. I was using it badly – as an actual social network.


Even the rage engagement is not real rage. It’s a faux rage. No one writes a snarky reply to @realdonaldtrump because they’re really engaging with The Donald. No, they’re posting to their mirror engagement crew, who they know is also in a rage engagement with @realdonaldtrump. It’s not even virtue signaling. It’s pure entertainment. It’s a simulation where they can “engage” with the President of the United States in the company of their supportive mirror engagement crew. Plus dopamine!

Ben Hunt, A Game Of You

Many months ago an old friend of mine asked me about my daily Twitter habit. He also has a Twitter account, but does maybe one post a month related to projects he works on. I explained that Twitter was becoming the national conversation platform – it was the place that journalists, newsmakers, political parties and influencers converged, and being part of that was probably a smarter move.

It turns out I was wrong, for the reasons so neatly encapsulated in A Game Of You. While there was real engagement (and real connections formed), so much more of the engagement on social media is effectively a simulation: Yes, you’re technically “engaging” with the accounts of notable people and organizations, but in reality your voice isn’t actually heard – and every post is just increasing the risk of damage.

It’s also increasingly performative. The US dealt with this years ago, and South Africa is just catching up to it now. Smart, unethical people have realized that the more outrageous shit you post, the further it circulates, the more it boosts your profile, and the greater opportunities it opens up for you. Yes, it comes at the cost of poisoning the well for everyone, but that’s a small price to pay for quasi-celebrity status.

Then there’s the short-term thinking effect, which has probably been the biggest lesson for me over the last month. Twitter (or any fast-paced information environment) doesn’t so much inform you as constantly trick you into thinking you’re learning – but really it’s just reinforcing the stuff you already believe.

Since quitting I haven’t technically had more time to think, but I’ve been able to think through ideas that take a lot longer. Among them, I’ve been developing a sort of mini-philosophy for how I see the world (more a statement of principles and arguments at this point), and it was only after reasoning through lots of examples that I came up with better explanations for things. Hot-take shitposting in humanity’s garbage compactor would have been of little help there.

(Core to that is measuring rules and behavior on a voluntary-coercive axis and balancing it for the maximum liberty of the individual, but that’s better suited for another post.)

P.S.

Then there’s The Noscript Show! Every week, I co-host an hour-long livestream on which we just decompress and unpack the stuff going on in the world. With every successive episode we’ve improved our production quality, while sticking to the principle that we script absolutely nothing on the show – and it’s a lot of fun to do.

Getting started with Mastodon!

Mastodon Setup

Howdy, stranger! This document is the other half of this video, in which I set up a single-server instance of Mastodon. This was assembled on 9 April 2017, and there’s a good chance that some of the specifics here will change over time. I’ll keep an updated version up on wogan.blog.

(What is Mastodon? I’ll do another post on that sometime!)

If you’d like, you can download a plain HTML, styled HTML or PDF version of this post instead – it might make copying some of the code easier.

UPDATE 17 April 2017: Mastodon has reached v1.2, and now requires Ruby 2.4.1. The post has been updated with new commands required as of today, and an upgrade guide is below.

0. Pre-Prerequisites

At a bare minimum, you’re going to need:

  • A domain name, with the ability to add an A record yourself
  • A free mailgun.com account, with the account verified and your sandbox enabled to send to you
  • A 1GB RAM machine with decent network access. This document uses a DigitalOcean VM.

This setup procedure skips a few things that you may want to do on a “productionized” or “community” instance of Mastodon, such as configuring S3 file storage, or using a non-sandbox email send account. You may also want a beefier machine than just 1GB RAM.

For reference, the OS version in use is Ubuntu 16.04.2 LTS and all the commands are being run from the root user unless explicitly specified.

1. Getting started!

The first few steps:

  • Create the VM
  • Point your domain to it immediately, by setting the A record to the public IP (see the example record after this list)
  • Log into the VM
  • Set your root password
  • Create a new Mastodon user: adduser mastodon
  • Update the apt cache: apt-get update
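
For the DNS step above, the A record you create at your registrar will look something like this in zone-file terms (203.0.113.10 is just a placeholder; use your VM’s actual public IPv4 address):

example.com.    300    IN    A    203.0.113.10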

2. Install Prerequisites

Now we’ll grab all the prerequisite software packages in one go:

# apt-get install imagemagick ffmpeg libpq-dev libxml2-dev libxslt1-dev nodejs file git curl redis-server redis-tools postgresql postgresql-contrib autoconf bison build-essential libssl-dev libyaml-dev libreadline6-dev zlib1g-dev libncurses5-dev libffi-dev libgdbm3 libgdbm-dev git-core letsencrypt nginx

That’ll take a little while to run. When it’s done, you’ll need Node (version 4) and yarn:

# curl -sL https://deb.nodesource.com/setup_4.x | bash -
# apt-get install nodejs
# npm install -g yarn

You’ll also want to be sure that redis is running, so do:

# service redis-server start
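
If you want to confirm Redis is actually responding before moving on, redis-tools (installed above) includes redis-cli, which should answer a ping with PONG:

# redis-cli ping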

3. Configure Database

With Postgres installed, you need to create a new user. Drop into the postgres user and create a mastodon account:

# su - postgres
$ psql
> CREATE USER mastodon CREATEDB;
> \q
$ exit

Later on we’ll configure mastodon to use that.
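
If you’d like to double-check that the role was created (optional), you can list the Postgres roles; the mastodon entry should show the Create DB attribute:

# su - postgres
$ psql -c '\du'
$ exit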

4. Generate SSL certificate

Before configuring nginx, we can generate the files we’ll need to support SSL. First, kill nginx:

# service nginx stop

Now proceed through the LetsEncrypt process:

  • Run letsencrypt certonly
  • Enter your email address
  • Read and acknowledge the terms
  • Enter the domain name you chose

If the domain name has propagated (which is why it’s important to do this early), LetsEncrypt will find your server and issue the certificate in one go. If this step fails, you may need to wait a while longer for your domain to propagate so that LetsEncrypt can see it.
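
If you’d rather skip the interactive prompts, the same thing can usually be done in one line with the standard letsencrypt flags (substitute your own domain and email address; nginx is already stopped, so standalone mode can bind port 80):

# letsencrypt certonly --standalone -d example.com --email you@example.com --agree-tos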

5. Configure nginx

With the SSL cert done, time to configure nginx!

# cd /etc/nginx/sites-available
# nano mastodon

Simply substitute your domain name where it says example.com in this snippet (lines 9, 15, 23, 24), then paste the entire thing into the file and save it.

map $http_upgrade $connection_upgrade {
  default upgrade;
  ''      close;
}

server {
  listen 80;
  listen [::]:80;
  server_name example.com;
  return 301 https://$host$request_uri;
}

server {
  listen 443 ssl;
  server_name example.com;

  ssl_protocols TLSv1.2;
  ssl_ciphers EECDH+AESGCM:EECDH+AES;
  ssl_ecdh_curve prime256v1;
  ssl_prefer_server_ciphers on;
  ssl_session_cache shared:SSL:10m;

  ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
  ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;

  keepalive_timeout    70;
  sendfile             on;
  client_max_body_size 0;
  gzip off;

  root /home/mastodon/live/public;

  add_header Strict-Transport-Security "max-age=31536000; includeSubDomains";

  location / {
    try_files $uri @proxy;
  }

  location @proxy {
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto https;

    proxy_pass_header Server;

    proxy_pass http://localhost:3000;
    proxy_buffering off;
    proxy_redirect off;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection $connection_upgrade;

    tcp_nodelay on;
  }

  location /api/v1/streaming {
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto https;

    proxy_pass http://localhost:4000;
    proxy_buffering off;
    proxy_redirect off;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection $connection_upgrade;

    tcp_nodelay on;
  }

  error_page 500 501 502 503 504 /500.html;
}

Once you’ve saved and closed the file, enable it by creating a symlink:

# ln -s /etc/nginx/sites-available/mastodon /etc/nginx/sites-enabled/mastodon

Then test that the file is OK by running nginx -t. If it reports any errors, you’ll want to fix them before moving on. If the file comes back OK, fire it up!

# service nginx start

Open a browser tab and navigate to your domain. You should get a 502 Bad Gateway error, served over your LetsEncrypt certificate. If not, go back and make sure you’ve followed every preceding step correctly.
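
You can also check this from another terminal with curl; since the proxy target isn’t running yet, the response headers should show the 502 even though the TLS handshake succeeds (substitute your own domain):

# curl -I https://example.com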

6. Configure Systemd

Mastodon consists of 3 services (web, sidekiq and streaming), and we need to create config files for each. You can use the code straight from this page, as-is.

# cd /etc/systemd/system/

The first file is called mastodon-web.service and consists of the following:

[Unit]
Description=mastodon-web
After=network.target

[Service]
Type=simple
User=mastodon
WorkingDirectory=/home/mastodon/live
Environment="RAILS_ENV=production"
Environment="PORT=3000"
ExecStart=/home/mastodon/.rbenv/shims/bundle exec puma -C config/puma.rb
TimeoutSec=15
Restart=always

[Install]
WantedBy=multi-user.target

The next file is called mastodon-sidekiq.service and consists of the following:

[Unit]
Description=mastodon-sidekiq
After=network.target

[Service]
Type=simple
User=mastodon
WorkingDirectory=/home/mastodon/live
Environment="RAILS_ENV=production"
Environment="DB_POOL=5"
ExecStart=/home/mastodon/.rbenv/shims/bundle exec sidekiq -c 5 -q default -q mailers -q pull -q push
TimeoutSec=15
Restart=always

[Install]
WantedBy=multi-user.target

The final file is called mastodon-streaming.service and consists of the following:

[Unit]
Description=mastodon-streaming
After=network.target

[Service]
Type=simple
User=mastodon
WorkingDirectory=/home/mastodon/live
Environment="NODE_ENV=production"
Environment="PORT=4000"
ExecStart=/usr/bin/npm run start
TimeoutSec=15
Restart=always

[Install]
WantedBy=multi-user.target

Once all those are saved, we’ve done all we can with the root user for now.

7. Switch to the Mastodon user

If you haven’t yet logged into the server as mastodon, do so now in a second SSH window. We’re going to set up ruby and pull down the actual Mastodon code here.

8. Install rbenv, rbenv-build and Ruby

As the mastodon user, clone the rbenv repo into your home folder:

$ git clone https://github.com/rbenv/rbenv.git ~/.rbenv

When that’s done, add the bin folder to your PATH:

$ echo 'export PATH="$HOME/.rbenv/bin:$PATH"' >> ~/.bash_profile

Then add the init script to your profile:

$ echo 'eval "$(rbenv init -)"' >> ~/.bash_profile

That line is valid for the OS we’re on (Ubuntu 16.04 LTS) but it may differ slightly for you. You can run ~/.rbenv/bin/rbenv init to check what line you need to use.

Once you’ve saved that, log out of the mastodon user, then log back in to complete the rest of this section.

Install the ruby-build plugin like so:

$ git clone https://github.com/rbenv/ruby-build.git ~/.rbenv/plugins/ruby-build

Then install Ruby v2.4.1 proper:

$ rbenv install 2.4.1

This could take up to 15 minutes to run!

When it’s done, change to your home folder and clone the Mastodon source:

$ cd ~
$ git clone https://github.com/tootsuite/mastodon.git live
$ cd live

Next up, dependencies! Always more dependencies – we’ll install bundler, then use that to install everything else:

$ gem install bundler
$ bundle install --deployment --without development test
$ yarn install

If all of those succeeded, we’re ready to configure!

9. Configure Mastodon

Before diving into the configuration file, generate 3 secret strings by running this command 3 times:

$ bundle exec rake secret

Copy those out to a text file – you’ll paste them back in later. Create the config file by copying the template, then editing it with nano:

$ cp .env.production.sample .env.production
$ nano .env.production

Inside this file we’re going to make several quick changes, starting with the Redis and database settings:

REDIS_HOST=localhost
DB_HOST=/var/run/postgresql
DB_USER=mastodon
DB_NAME=mastodon_production

To enable federation, you need to set your domain name here:

LOCAL_DOMAIN=example.com

Then, for these 3, paste in each key you generated earlier:

PAPERCLIP_SECRET=
SECRET_KEY_BASE=
OTP_SECRET=

Finally, configure your SMTP details:

SMTP_LOGIN= (your Mailgun SMTP login)
SMTP_PASSWORD= (your Mailgun SMTP password)
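
Depending on the version of the sample file, there may be a few related SMTP settings worth checking too. The values below are typical Mailgun defaults; adjust them to match your own account and domain:

SMTP_SERVER=smtp.mailgun.org
SMTP_PORT=587
SMTP_FROM_ADDRESS=notifications@example.com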

Save and close the file.

10. Run installer

If you’ve done everything correctly, this command will create and initialize the database:

$ RAILS_ENV=production bundle exec rails db:setup

If that passes successfully (it’ll echo out every command it runs), you can then precompile the site assets, which may take a few minutes:

$ RAILS_ENV=production bundle exec rails assets:precompile

At this point, we’re almost ready to go!

11. Configure cronjob

This is technically optional, but highly recommended to keep your instance in good order. As the mastodon user, start by determining where your bundle command lives:

$ which bundle

Wherever $bundle appears below, substitute the path that command prints. Now, edit your own crontab:

$ crontab -e

Select nano (2) if you’re prompted. As of version 1.2 (17 April 2017) you only need one daily task in your crontab:

5 0 * * * RAILS_ENV=production $bundle exec rake mastodon:daily
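
For example, if which bundle printed /home/mastodon/.rbenv/shims/bundle (which it typically will with the rbenv setup from step 8), the finished entry would be:

5 0 * * * RAILS_ENV=production /home/mastodon/.rbenv/shims/bundle exec rake mastodon:daily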

Save and close the crontab.

12. Log out and return to root

We’re done with the mastodon account. Log out and return to your root shell.

13. Start Mastodon

The moment of truth! Enable the Mastodon services (so that they start on boot):

# systemctl enable /etc/systemd/system/mastodon-*.service

Then fire up Mastodon itself:

# systemctl start mastodon-web.service mastodon-sidekiq.service mastodon-streaming.service

Open up a browser tab on your domain. Mastodon can take up to 30 seconds to warm up, so if you see an error page, don’t fret. Only fret if it’s there for longer than a minute – that requires troubleshooting, which is outside the scope of this document.

You should eventually get a signup page. Congratulations! Register an account for yourself, receive the confirmation email, and activate it. This should enable you (the first user) as an administrator.
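
If your version doesn’t promote the first account automatically, there’s a rake task for granting admin rights; run it as the mastodon user from the live directory, substituting your own username (and check the task name against your Mastodon version):

$ RAILS_ENV=production bundle exec rails mastodon:make_admin USERNAME=yourusername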

14. Securing Mastodon

This is by no means a comprehensive guide to server security, but there are two quick things you can change while the root shell is open. Start by editing the passwd file:

# nano /etc/passwd

Find the mastodon entry (it’ll be near the bottom) and replace /bin/bash with /usr/sbin/nologin. Save and quit. This will prevent anyone from logging in as the mastodon user.
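
If you’d rather not edit /etc/passwd by hand, usermod achieves the same thing in one command:

# usermod -s /usr/sbin/nologin mastodon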

Next, configure ufw. First check if it’s disabled:

# ufw status

It should be inactive, since this is a brand new VM. Configure it to allow SSH (port 22) and HTTPS (port 443), then turn it on (you may want to allow HTTP on port 80 as well, since the nginx config above listens there to redirect visitors to HTTPS):

# ufw allow 22
# ufw allow 443
# ufw enable
? y

That will prevent any connection attempts on other ports.

15. Enjoy!

If you enjoyed this guide, I’d appreciate a follow! You can find me by searching wogan@wogan.im in your Mastodon web UI. Give me a shout if you were able to get an instance set up with these instructions, or if you ran into any problems.

16. Upgrade to v1.2 (17 April 2017)

If you’ve installed Mastodon according to these instructions, you’ll need to do a few things to upgrade to the latest version.

Start by logging into your instance as the root user, then re-enabling your mastodon user shell (in step 14, change the mastodon user’s shell back to /bin/bash). We’ll use it in a bit to perform the upgrades themselves.

When that’s done, stop the Mastodon services like so:

# systemctl stop mastodon-*

That will shut down all the Mastodon services. In a new window, log into your mastodon user and install Ruby 2.4.1, the new preferred version:

$ cd live
$ rbenv install 2.4.1
$ gem install bundler --no-ri --no-rdoc

This will install the latest Ruby, and the version-appropriate bundler. Now pull down the latest source code:

$ git pull

There are a couple of one-time commands to run – in order, they are to install new dependencies, run database migrations, do a one-time avatar migration, and recompile the frontend.

$ bundle install
$ yarn install
$ RAILS_ENV=production bundle exec rails db:migrate
$ RAILS_ENV=production rake mastodon:maintenance:add_static_avatars
$ RAILS_ENV=production bundle exec rails assets:precompile

When this is all done, make sure your crontab has been updated to use the new mastodon:daily command. Refer to step 11 above for details.

Finally, the teardown – log out of the mastodon user, and switch back to your root connection. Set the mastodon user’s shell back to /usr/sbin/nologin (step 14) and restart the Mastodon services:

# systemctl start mastodon-web.service
# systemctl start mastodon-sidekiq.service
# systemctl start mastodon-streaming.service

Give it a few seconds to warm up, and check that they’re running with:

# systemctl status mastodon-web.service

If you get a green dot with “running”, you’re good to go!

Sources

A lot of this guide was sourced from the official Production guide on the Mastodon Github page. I reordered it into a logical sequence after running through it a few times.

This post was updated for v1.2 (and v1.1.2) upgrade notes on 17 April 2017.

Fare thee well, Twitter

I feel honored, in a very weird way, to have witnessed the birth, growth, stumbling, and inevitable death of a global phenomenon. I might have said that about Gangnam Style, but somehow that bloody video is still attracting views.

I’m referring to Twitter, and I have such fond memories of Twitter. They launched in 2006, but it was only in 2008 that I became aware of it. My first impression of Twitter was that it was a faster, lighter, and somewhat less sophisticated version of Facebook, which I had first joined in 2007.

Within a South African context, I was on Twitter before it was cool. It’s hard to believe in 2016, when basically every talk show and marketing campaign is using hashtags, and questionable social movements are using the platform to rally and organize social change – but there was a time when Twitter in SA was small. Really, really small.

Small enough that in May 2008, a few of us started assembling a manually-curated directory of South African Twitter users. This was back when Twitter didn’t have a search feature (remember how that used to be a whole separate site?), and geolocation wasn’t even remotely an option.

The Internet Archive will forever immortalize this entry on a free PBWorks wiki (used to be Peanut Butter Wiki, by the way), before the spam came along.

(Screenshot: the satwitter directory on its free PBWorks wiki, via the Internet Archive)

Three months later, the list had grown to a mere 150, and I think we got to about 300 before Twitter started gaining some real traction here in SA.

By September 2008, Twitter had introduced us all to the concept of the Fail Whale, and outages were about as common as uptime. I quite liked Twitter, though, and they had a REST API. So between that API, a profile search feature, and a few cron jobs, I built a South African shadow site called TwitterSA.com.

All it did, really, was continuously search for users that had South Africa (or major cities) as their location in their profiles, automatically follow them, and cache all their tweets in its own database. I recall eventually adding a signup feature, and started scraping the tweets for popular links and hashtags that were being shared.

Yes, it was basically a nascent social media listening platform. If I knew how popular those would have become in the years to follow, I definitely would have held on to it a lot more strongly than I did.

It had a brief but glorious reign. At one point it held the definitive directory of South African Twitter users, and even, somehow, got myself and the designer featured in an ITWeb column.

That lasted until about July 2009, when Twitter started sending out Cease & Desists to all domains that included “Twitter” in the name. We received it, discussed it for about two minutes, and decided to shut down the site. Twitter was only a few months away from launching their Geolocation API, which would have made the service redundant anyway.

Since then, I’ve been perpetually on-and-off Twitter, simultaneously inspired and frustrated by it. I never could figure out the best way to make use of it, and towards the end of 2015 I started making the mistake of trying to have rational debates through the platform.

Turns out, 140 characters at a time is not a good medium for debate, and the average participant who attempts it is not really great at communicating complex, cogent arguments – myself included. Retweets make it far too easy to take things out of context and magnify them to a hostile audience, and it’s way, way too easy to get mobbed off the service.

Which is why I quit in February 2016.

https://twitter.com/WoganMay/status/700466035749556224

I left it connected to my WordPress.com account, on the off chance that some of my audience would appreciate the stuff I write here – but on the whole, I disengaged completely. Frustrating limitations, hostile audience, apathetic moderation – was there any other outcome?

And it turned out, I wasn’t the only one starting to feel this way. I dropped off Twitter just as the stock coasted to its lowest level since its IPO:

(Screenshot: $TWTR stock chart at its lowest level since the IPO)

As it turns out, it’s really hard to monetize such a shitty place. And I choose that word with the utmost care:

  • The highest engagement comes from social media addicts and internet trolls
  • The tools basically streamline the hate mob and harassment process
  • The ad formats infringe directly on the stream experience
  • Twitter’s API policies turned exclusive and hostile, alienating developers
  • Nearly every algorithm adjustment upset users, who wanted reverse-chronological feeds entirely of their own curation
  • Twitter was routinely in the news for all the wrong reasons – usually because of someone famous (or just an inconsequential profile) saying something stupid that blew up

I suppose the smart Twitter investors started cottoning on to the fact that something was wrong back in April 2015, when they started redefining (almost every quarter) the metrics they used for measuring their own success. The less-smart investors would have started getting alarmed in August 2015, when the SEC started asking pointed questions about how Twitter was running their business.

For me, though, the moment of clarity came just last month, in August 2016, with a long-form Buzzfeed article entitled Inside Twitter’s 10-year failure to stop harassment. It’s a fascinating read, and in a few months, will serve as the pre-post-mortem of why Twitter collapsed the way it did.

So as of today, Twitter is on the chopping block. And the one piece of data that is really telling? Twitter’s stock price, the moment news broke of a potential sale:

(Screenshot: $TWTR stock price spiking on news of a potential sale)

That’s the last heartbeat of a company destined for flat-lining: When their stock gains instant value only because there might be a quick return. It’s a bit of a “vultures are circling” situation, with some of the braver ones picking at the still-lumbering victim.

At this point investor sentiment is basically clear – Twitter has to go. I can’t imagine there’ll be a last-second magic trick that restores Twitter’s credibility and independence. If that were possible, it would have emerged at some point in the last five years.

So what’s next? One thing to bear in mind is that no social network, post-acquisition, has actually survived in a form that in any way resembled the original. Social networks are tricky things, and new owners typically want a good financial return on their investment.

Whoever buys Twitter is getting a mixed bag of:

  • Some very good distributed message processing tech – global scale, realtime delivery
  • An executive team void of any real direction
  • A disillusioned workforce, whose attempts to improve the platform have met with repeated failure
  • A social hot potato, in that Twitter is more or less the new 4chan, and was forced to create a Safety Council in reaction to “extreme” speech
  • A political hot potato, in that anyone with clout will want the service sanitized to remove harsh messaging about them.
  • A financial hot potato, with declining ad revenues across the board
  • A brand that created two new entries in the Oxford Dictionary: Twitter and Tweet
  • A grab-bag of acquisitions: Vine, Periscope, some adtech and design startups
  • And a domain name that, if you think about it, is kinda dumb – birds can’t use smartphones

About the only thing that makes sense, acquisition-wise, would be to turn Twitter into a one-way content provider: letting brands and verified celebrities use it as a platform to push out messaging, while severely limiting user interaction – like how Hollywood works right now, basically.

A dumb content pipe with no controversies and news blowups is preferable, commercially speaking, to a public square for free speech and open debate. It’s a lot easier to monetize a captive and engaged audience, and if there’s one thing that news outlets in particular have realized in the last year, it’s that you don’t need a public feedback facility to enable that.

I might even start using Twitter again, should it turn into something that has a net positive effect on my mood. A safe content delivery pipeline, pushed to the top of your phone, with granular interest tracking, personalized content and real-time feedback? Marketer’s wet dream.

But for now, enjoy the long shadows cast by the setting sun that is Twitter.

Swinging a double-edged sword

It’s been an interesting few weeks in terms of freedom of speech, and what that means on the internet. On 31 August, YouTube made a small but significant change to its advertising policy – it set new guidelines for monetizable content, and included rules specifically against offensive content.

In a lot of contexts, that’s pretty justifiable: it shouldn’t ever be the case that a system is put in place that rewards hateful and destructive speech. Under the previous system, ad revenue correlated pretty much directly with viewership, and controversy is a constant driver of viewership.

For example, it would be possible to create a YouTube channel that featured nothing but trolling and baiting other people, and not only would you get a response to that, you’d actually be rewarded for your efforts with ad revenue. That cannot possibly be a reasonable thing to reward – it directly fans the flames that make YouTube an unpleasant place to be.

Trouble is, “offensive” is an extremely subjective and rapidly-moving target – nowhere more so than in the US, where college campus politics and ludicrous entitlement are driving a new generation of offenderati. A recent, glowing example of this was a Lyft passenger who berated the driver for being racist, simply because the driver had a Hawaiian bobblehead accessory in his car:

http://www.dailymail.co.uk/news/article-3763921/Woman-berates-Lyft-driver-racist-Hawaiian-bobblehead-doll-dash.html

That’s one element to consider. The second is how quickly instances like that are used to fuel internet mobs – groups of people with internet access and nothing better to do. They’ll gang up, post harassing messages on social media, try unearthing private information about their target, and generally try making their lives miserable.

If that Lyft driver in the article above was on social media, he would have been targeted with death threats, his information would have been made public, and he’d probably be barraged with phone calls and texts for being “racist” – none of it his fault, and all of it perpetrated by what I can only imagine is a mentally unstable woman.

That behavior – distasteful as it is – is a fantastic driver of “engagement” on these platforms. It drives eyeballs to videos, it drives comments (albeit bad), and it drives uploads – mainly rebuttals and rants. And YouTube is, in my opinion, justified in trying to protect themselves from that. Not only because advertisers will be insisting on it, but also just because it’s the decent, human thing to do.

But because everyone’s so easily offended, and because people can be rallied up into attack mobs, there’s a very real downside: The systems by which YouTube enforces this policy are largely automated, and there’s basically no preemptive defense. If you say the wrong thing on YouTube, or you upset a group of people (maybe no more than 50 people, even), they can rally against you and flag all your videos for inappropriate content. And after a certain point, the system simply starts demonetizing your videos.

That’s a bit of a slap in the face to the people that work on producing great content for YouTube. It takes a lot of effort to build a channel and an audience, and the ad revenue from that was what made it viable for a lot of content creators to keep doing that. There’s a large overlap between the people that are passionate about creating YouTube content, and the people that believe strongly in the views they’re sharing.

And since we’re now living in a world where simply existing is offensive to some people, it’ll become harder and harder for those content creators to justify spending so much time and effort producing content for YouTube (which already takes 45% of all ad revenue), when it’s so easy for someone to get a hate mob together and torpedo your earnings.

YouTube is by far the largest platform for community-driven speech in the world, and they’ve swung a bit of a double-edged sword here. They’ll likely end up with a profitable network, but at the cost of burning down a vital square for public debate.

Meaning that the final holdout just bit the dust. Facebook has been tailoring their algorithms for years, designed to put profitable content in front of you. Twitter’s losing the battle for their independence, and will have to start making larger compromises pretty soon if they want to remain relevant. And now YouTube has thrown their independent content producers to the wolves.

Kinda makes you long for MySpace a bit.