Category Archives: Social Media

Getting started with Mastodon!

Mastodon Setup

Howdy, stranger! This document is the other half of this video, in which I set up a single-server instance of Mastodon. This was assembled on 9 April 2017, and there’s a good chance that some of the specifics here will change over time. I’ll keep an updated version up on wogan.blog.

(What is Mastodon? I’ll do another post on that sometime!)

If you’d like, you can download a plain HTML, styled HTML or PDF version of this post instead – it might make copying some of the code easier.

UPDATE 17 April 2017: Mastodon has reached v1.2, and now requires Ruby 2.4.1. The post has been updated with new commands required as of today, and an upgrade guide is below.

0. Pre-Prerequisites

At a bare minimum, you’re going to need:

  • A domain name, with the ability to add an A record yourself
  • A free mailgun.com account, with the account verified and your sandbox authorized to send to your own email address
  • A 1GB RAM machine with decent network access. This document uses a DigitalOcean VM.

This setup procedure skips a few things that you may want to do on a “productionized” or “community” instance of Mastodon, such as configuring S3 file storage, or using a non-sandbox email send account. You may also want a beefier machine than just 1GB RAM.

For reference, the OS version in use is Ubuntu 16.04.2 LTS and all the commands are being run from the root user unless explicitly specified.

1. Getting started!

The first few steps, with the shell commands collected just after this list:

  • Create the VM
  • Point your domain to it immediately, by setting the A record to the public IP
  • Log into the VM
  • Set your root password
  • Create a new Mastodon user: adduser mastodon
  • Update the apt cache: apt-get update
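
For reference, the shell portions of that list, run as root on the new VM, are roughly:

# passwd
# adduser mastodon
# apt-get update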

2. Install Prerequisites

Now we’ll grab all the prerequisite software packages in one go:

# apt-get install imagemagick ffmpeg libpq-dev libxml2-dev libxslt1-dev nodejs file git curl redis-server redis-tools postgresql postgresql-contrib autoconf bison build-essential libssl-dev libyaml-dev libreadline6-dev zlib1g-dev libncurses5-dev libffi-dev libgdbm3 libgdbm-dev git-core letsencrypt nginx

That’ll take a little while to run. When it’s done, you’ll need Node (version 4) and yarn:

# curl -sL https://deb.nodesource.com/setup_4.x | bash -
# apt-get install nodejs
# npm install -g yarn

You’ll also want to be sure that redis is running, so do:

# service redis-server start
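
If you want to double-check that it’s actually up, redis-cli (installed along with redis-tools above) should answer with PONG:

# redis-cli ping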

3. Configure Database

With Postgres installed, you need to create a new user. Drop into the postgres user and create a mastodon account:

# su - postgres
$ psql
> CREATE USER mastodon CREATEDB;
> \q
$ exit

Later on we’ll configure mastodon to use that.

4. Generate SSL certificate

Before configuring nginx, we can generate the files we’ll need to support SSL. First, kill nginx:

# service nginx stop

Now proceed through the LetsEncrypt process:

  • Run letsencrypt certonly
  • Enter your email address
  • Read and acknowledge the terms
  • Enter the domain name you chose

If the domain name has propagated (which is why it’s important to do this early), LetsEncrypt will find your server and issue the certificate in one go. If this step fails, you may need to wait a while longer for your domain to propagate so that LetsEncrypt can see it.
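
If you’d rather skip the interactive prompts, the same request can be made in one go – a rough equivalent, assuming example.com is your domain, you@example.com is your address, and nginx is still stopped:

# letsencrypt certonly --standalone -d example.com --email you@example.com --agree-tos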

5. Configure nginx

With the SSL cert done, time to configure nginx!

# cd /etc/nginx/sites-available
# nano mastodon

Simply substitute your domain name wherever it says example.com in this snippet (the two server_name lines and the two ssl_certificate paths), then paste the entire thing into the file and save it.

map $http_upgrade $connection_upgrade {
  default upgrade;
  ''      close;
}

server {
  listen 80;
  listen [::]:80;
  server_name example.com;
  return 301 https://$host$request_uri;
}

server {
  listen 443 ssl;
  server_name example.com;

  ssl_protocols TLSv1.2;
  ssl_ciphers EECDH+AESGCM:EECDH+AES;
  ssl_ecdh_curve prime256v1;
  ssl_prefer_server_ciphers on;
  ssl_session_cache shared:SSL:10m;

  ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
  ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;

  keepalive_timeout    70;
  sendfile             on;
  client_max_body_size 0;
  gzip off;

  root /home/mastodon/live/public;

  add_header Strict-Transport-Security "max-age=31536000; includeSubDomains";

  location / {
    try_files $uri @proxy;
  }

  location @proxy {
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto https;

    proxy_pass_header Server;

    proxy_pass http://localhost:3000;
    proxy_buffering off;
    proxy_redirect off;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection $connection_upgrade;

    tcp_nodelay on;
  }

  location /api/v1/streaming {
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto https;

    proxy_pass http://localhost:4000;
    proxy_buffering off;
    proxy_redirect off;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection $connection_upgrade;

    tcp_nodelay on;
  }

  error_page 500 501 502 503 504 /500.html;
}

Once you’ve saved and closed the file, enable it by creating a symlink:

# ln -s /etc/nginx/sites-available/mastodon /etc/nginx/sites-enabled/mastodon

Then test that the file is OK by running nginx -t. If it reports any errors, you’ll want to fix them before moving on. If the file comes back OK, fire it up!

# service nginx start

Open a browser tab and navigate to your domain. You should get a 502 Bad Gateway error, secured with your LetsEncrypt cert. If not, go back and make sure you’ve followed every preceding step correctly.
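
You can run the same check from a shell with curl, substituting your own domain for example.com – until the Mastodon services are running, the response should be that 502:

# curl -I https://example.com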

6. Configure Systemd

Mastodon consists of 3 services (web, sidekiq and streaming), and we need to create config files for each. You can use the code straight from this page, as-is.

# cd /etc/systemd/system/

The first file is called mastodon-web.service and consists of the following:

[Unit]
Description=mastodon-web
After=network.target

[Service]
Type=simple
User=mastodon
WorkingDirectory=/home/mastodon/live
Environment="RAILS_ENV=production"
Environment="PORT=3000"
ExecStart=/home/mastodon/.rbenv/shims/bundle exec puma -C config/puma.rb
TimeoutSec=15
Restart=always

[Install]
WantedBy=multi-user.target

The next file is called mastodon-sidekiq.service and consists of the following:

[Unit]
Description=mastodon-sidekiq
After=network.target

[Service]
Type=simple
User=mastodon
WorkingDirectory=/home/mastodon/live
Environment="RAILS_ENV=production"
Environment="DB_POOL=5"
ExecStart=/home/mastodon/.rbenv/shims/bundle exec sidekiq -c 5 -q default -q mailers -q pull -q push
TimeoutSec=15
Restart=always

[Install]
WantedBy=multi-user.target

The final file is called mastodon-streaming.service and consists of the following:

[Unit]
Description=mastodon-streaming
After=network.target

[Service]
Type=simple
User=mastodon
WorkingDirectory=/home/mastodon/live
Environment="NODE_ENV=production"
Environment="PORT=4000"
ExecStart=/usr/bin/npm run start
TimeoutSec=15
Restart=always

[Install]
WantedBy=multi-user.target

Once all those are saved, we’ve done all we can with the root user for now.
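
If systemd doesn’t seem to pick up the new unit files later on, reloading its configuration (as root) is a safe first step:

# systemctl daemon-reload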

7. Switch to the Mastodon user

If you haven’t yet logged into the server as mastodon, do so now in a second SSH window. We’re going to set up ruby and pull down the actual Mastodon code here.
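
If you’d rather not open a second window, you can also switch users from your root shell (and exit back to root when this part is done):

# su - mastodon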

8. Install rbenv, ruby-build and Ruby

As the mastodon user, clone the rbenv repo into your home folder:

$ git clone https://github.com/rbenv/rbenv.git ~/.rbenv

When that’s done, add the bin folder to your PATH:

$ echo 'export PATH="$HOME/.rbenv/bin:$PATH"' >> ~/.bash_profile

Then add the init script to your profile:

$ echo 'eval "$(rbenv init -)"' >> ~/.bash_profile

That line is valid for the OS we’re on (Ubuntu 16.04 LTS) but it may differ slightly for you. You can run ~/.rbenv/bin/rbenv init to check what line you need to use.

Once you’ve saved that, log out of the mastodon user, then log back in to complete the rest of this section.

Install the ruby-build plugin like so:

$ git clone https://github.com/rbenv/ruby-build.git ~/.rbenv/plugins/ruby-build

Then install Ruby v2.4.1 proper:

$ rbenv install 2.4.1

This could take up to 15 minutes to run!
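
Once it finishes, you can confirm the new version is available:

$ rbenv versions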

When it’s done, change to your home folder and clone the Mastodon source:

$ cd ~
$ git clone https://github.com/tootsuite/mastodon.git live
$ cd live

Next up, dependencies! Always more dependencies – we’ll install bundler, then use that to install everything else:

$ gem install bundler
$ bundle install --deployment --without development test
$ yarn install

If all of those succeeded, we’re ready to configure!

9. Configure Mastodon

Before diving into the configuration file, generate 3 secret strings by running this command 3 times:

$ bundle exec rake secret

Copy those out to a text file – you’ll paste them back in later. Create the config file by copying the template, then editing it with nano:

$ cp .env.production.sample .env.production
$ nano .env.production

Inside this file we’re going to make several quick changes.

REDIS_HOST=localhost
DB_HOST=/var/run/postgresql
DB_USER=mastodon
DB_NAME=mastodon_production

To enable federation, you need to set your domain name here:

LOCAL_DOMAIN=example.com

Then, for these 3, paste in each key you generated earlier:

PAPERCLIP_SECRET=
SECRET_KEY_BASE=
OTP_SECRET=

Finally, configure your SMTP details:

SMTP_LOGIN= (whatever your mailgun is)
SMTP_PASSWORD= (whatever your mailgun is)

Save and close the file.

10. Run installer

If you’ve done everything correctly, this command will install the database:

$ RAILS_ENV=production bundle exec rails db:setup

If that passes successfully (it’ll echo out every command it runs), you can then precompile the site assets, which may take a few minutes:

$ RAILS_ENV=production bundle exec rails assets:precompile

At this point, we’re almost ready to go!

11. Configure cronjob

This is technically optional, but highly recommended to keep your instance in good order. As the mastodon user, start by determining where your bundle command lives:

$ which bundle

Note the path it prints – you’ll substitute it for $bundle in the crontab entry below. Now, edit your own crontab:

$ crontab -e

Select nano (2) if you’re prompted. As of version 1.2 (17 April 2017) you only need one daily task in your crontab:

5 0 * * * RAILS_ENV=production $bundle exec rake mastodon:daily

Save and close the crontab.
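
For reference, with the rbenv layout used in the systemd units above, the finished entry would look something like this:

5 0 * * * RAILS_ENV=production /home/mastodon/.rbenv/shims/bundle exec rake mastodon:daily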

12. Log out and return to root

We’re done with the mastodon account. Log out and return to your root shell.

13. Start Mastodon

The moment of truth! Enable the Mastodon services (so that they start on boot):

# systemctl enable /etc/systemd/system/mastodon-*.service

Then fire up Mastodon itself:

# systemctl start mastodon-web.service mastodon-sidekiq.service mastodon-streaming.service

Open up a browser tab on your domain. Mastodon can take up to 30 seconds to warm up, so if you see an error page, don’t fret. Only fret if it’s there for longer than a minute – that requires troubleshooting, which is outside the scope of this document.
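
If you do end up needing to dig in, the systemd journal for each service is a reasonable first stop – for example, as root:

# journalctl -u mastodon-web.service -f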

You should eventually get a signup page. Congratulations! Register an account for yourself, receive the confirmation email, and activate it. This should enable you (the first user) as an administrator.

14. Securing Mastodon

This is by no means a comprehensive guide to server security, but there are two quick things you can change while the root shell is open. Start by editing the passwd file:

# nano /etc/passwd

Find the mastodon entry (it’ll be near the bottom) and replace /bin/bash with /usr/sbin/nologin. Save and quit. This will prevent anyone from logging in as the mastodon user.
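
If you’d rather not edit /etc/passwd by hand, usermod makes the same change:

# usermod -s /usr/sbin/nologin mastodon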

Next, configure ufw. First check if it’s disabled:

# ufw status

It should be off, since this is a brand new VM. Configure it to allow SSH (port 22) and HTTPS (port 443), then turn it on:

# ufw allow 22
# ufw allow 443
# ufw enable
? y

That will prevent any connection attempts on other ports.

15. Enjoy!

If you enjoyed this guide, I’d appreciate a follow! You can find me by searching wogan@wogan.im in your Mastodon web UI. Give me a shout if you were able to get an instance set up with these instructions, or if you ran into any problems.

16. Upgrade to v1.2 (17 April 2017)

If you’ve installed Mastodon according to these instructions, you’ll need to do a few things to upgrade to the latest version.

Start by logging into your instance as the root user, then re-enabling your mastodon user shell (in step 14, change the mastodon user’s shell back to /bin/bash). We’ll use it in a bit to perform the upgrades themselves.

When that’s done, stop the Mastodon services like so:

# systemctl stop mastodon-*

That will shut down all the Mastodon services. In a new window, log into your mastodon user and install Ruby 2.4.1, the new preferred version:

$ cd live
$ rbenv install 2.4.1
$ gem install bundler --no-ri --no-rdoc

This will install the latest Ruby, and the version-appropriate bundler. Now pull down the latest source code:

$ git pull

There are a couple of one-time commands to run – in order, they are to install new dependencies, run database migrations, do a one-time avatar migration, and recompile the frontend.

$ bundle install
$ yarn install
$ RAILS_ENV=production bundle exec rails db:migrate
$ RAILS_ENV=production rake mastodon:maintenance:add_static_avatars
$ RAILS_ENV=production bundle exec rails assets:precompile

When this is all done, make sure your crontab has been updated to use the new mastodon:daily command. Refer to step 11 above for details.

Finally, the teardown – log out of the mastodon user, and switch back to your root connection. Set the mastodon user’s shell back to /usr/sbin/nologin (step 14) and restart the Mastodon services:

# systemctl start mastodon-web.service
# systemctl start mastodon-sidekiq.service
# systemctl start mastodon-streaming.service

Give it a few seconds to warm up, and check that they’re running with:

# systemctl status mastodon-web.service

If you get a green dot with “running”, you’re good to go!

Sources

A lot of this guide was sourced from the official Production guide on the Mastodon GitHub page. I reordered it into a logical sequence after running through it a few times.

This post was updated for v1.2 (and v1.1.2) upgrade notes on 17 April 2017.

Fare thee well, Twitter

I feel honored, in a very weird way, to have witnessed the birth, growth, stumbling, and inevitable death of a global phenomenon. I might have said that about Gangnam Style, but somehow that bloody video is still attracting views.

I’m referring to Twitter, and I have such fond memories of Twitter. They launched in 2006, but it was only in 2008 that I became aware of it. My first impression of Twitter was that it was a faster, lighter, and somewhat less sophisticated version of Facebook, which I had first joined in 2007.

Within a South African context, I was on Twitter before it was cool. It’s hard to believe in 2016, where basically every talk show and marketing campaign is using hashtags, and questionable social movements are using the platform to rally and organize social change – but there was a time when Twitter in SA was small. Really, really small.

Small enough that in May 2008, a few of us started assembling a manually-curated directory of South African Twitter users. This was back when Twitter didn’t have a search feature (remember how that used to be a whole separate site?), and geolocation wasn’t even remotely an option.

The Internet Archive will forever immortalize this entry on a free PBWorks wiki (used to be Peanut Butter Wiki, by the way), before the spam came along.

[Screenshot: the satwitter PBWorks wiki front page]

Three months later, the list had grown to a mere 150, and I think we got to about 300 before Twitter started gaining some real traction here in SA.

By September 2008, Twitter had introduced us all to the concept of the Fail Whale, and outages were about as common as uptime. I quite liked Twitter, though, and they had a REST API – so between that API, a profile search feature, and a few cron jobs, I built a South African shadow site called TwitterSA.com.

All it did, really, was continuously search for users that had South Africa (or major cities) as their location in their profiles, automatically follow them, and cache all their tweets in its own database. I recall eventually adding a signup feature, and started scraping the tweets for popular links and hashtags that were being shared.

Yes, it was basically a nascent social media listening platform. If I knew how popular those would have become in the years to follow, I definitely would have held on to it a lot more strongly than I did.

It had a brief but glorious reign. At one point it held the definitive directory of South African Twitter users, and somehow even got me and the designer featured in an ITWeb column.

That lasted until about July 2009, when Twitter started sending out Cease & Desists to all domains that included “Twitter” in the name. We received it, discussed it for about two minutes, and decided to shut down the site. Twitter was only a few months away from launching their Geolocation API, which would have made the service redundant anyway.

Since then, I’ve been perpetually on-and-off Twitter, simultaneously inspired and frustrated by it. I never could figure out the best way to make use of it, and towards the end of 2015 I started making the mistake of trying to have rational debates through the platform.

Turns out, 140 characters at a time is not a good medium for debate, and the average participant who attempts it is not really great at communicating complex, cogent arguments – myself included. Retweets make it far too easy to take things out of context and magnify them to a hostile audience, and it’s way, way too easy to get mobbed off the service.

Which is why I quit in February 2016.

https://twitter.com/WoganMay/status/700466035749556224

I left it connected to my WordPress.com account, on the off chance that some of my audience would appreciate the stuff I write here – but on the whole, I disengaged completely. Frustrating limitations, hostile audience, apathetic moderation – was there any other outcome?

And it turned out, I wasn’t the only one starting to feel this way. I dropped off Twitter just as the stock coasted to its lowest level since its IPO:

[Screenshot: $TWTR stock price chart, at its lowest level since the IPO]

As it turns out, it’s really hard to monetize such a shitty place. And I choose that word with the utmost care:

  • The highest engagement comes from social media addicts and internet trolls
  • The tools basically streamline the hate mob and harassment process
  • The ad formats infringe directly on the stream experience
  • Twitter’s API policies turned exclusive and hostile, alienating developers
  • Nearly every algorithm adjustment upset users, who wanted reverse-chronological feeds entirely of their own curation
  • Twitter was routinely in the news for all the wrong reasons – usually because of someone famous (or just an inconsequential profile) saying something stupid that blew up

I suppose the smart Twitter investors started cottoning on to the fact that something was wrong back in April 2015, when they started redefining (almost every quarter) the metrics they used for measuring their own success. The less-smart investors would have started getting alarmed in August 2015, when the SEC started asking pointed questions about how Twitter was running their business.

For me, though, the moment of clarity came just last month, in August 2016, with a long-form Buzzfeed article entitled Inside Twitter’s 10-year failure to stop harassment. It’s a fascinating read, and in a few months, will serve as the pre-post-mortem of why Twitter collapsed the way it did.

So as of today, Twitter is on the chopping block. And the one piece of data that is really telling? Twitter’s stock price, the moment news broke of a potential sale:

[Screenshot: $TWTR stock price jumping on news of a potential sale]

That’s the last heartbeat of a company destined for flat-lining: When their stock gains instant value only because there might be a quick return. It’s a bit of a “vultures are circling” situation, with some of the braver ones picking at the still-lumbering victim.

At this point investor sentiment is basically clear – Twitter has to go. I can’t imagine there’ll be a last-second magic trick that restores Twitter’s credibility and independence. If that were possible, it would have emerged at some point in the last five years.

So what’s next? One thing to bear in mind is that no social network, post-acquisition, has actually survived in a form that in any way resembled the original. Social networks are tricky things, and new owners typically want a good financial return on their investment.

Whoever buys Twitter is getting a mixed bag of:

  • Some very good distributed message processing tech – global scale, realtime delivery
  • An executive team void of any real direction
  • A disillusioned workforce, whose attempts to improve the platform have met with repeated failure
  • A social hot potato, in that Twitter is more or less the new 4chan, and was forced to create a Safety Council in reaction to “extreme” speech
  • A political hot potato, in that anyone with clout will want the service sanitized to remove harsh messaging about them.
  • A financial hot potato, with declining ad revenues across the board
  • A brand that created two new entries in the Oxford Dictionary: Twitter and Tweet
  • A grab-bag of acquisitions: Vine, Periscope, some adtech and design startups
  • And a domain name that, if you think about it, is kinda dumb – birds can’t use smartphones

About the only thing that makes sense, acquisition-wise, would be to turn Twitter into a one-way content provider: letting brands and verified celebrities use it as a platform to push out messaging, while severely limiting user interaction – like how Hollywood works right now, basically.

A dumb content pipe with no controversies and news blowups is preferable, commercially speaking, to a public square for free speech and open debate. It’s a lot easier to monetize a captive and engaged audience, and if there’s one thing that news outlets in particular have realized in the last year, it’s that you don’t need a public feedback facility to enable that.

I might even start using Twitter again, should it turn into something that has a net positive effect on my mood. A safe content delivery pipeline, pushed to the top of your phone, with granular interest tracking, personalized content and real-time feedback? Marketer’s wet dream.

But for now, enjoy the long shadows cast by the setting sun that is Twitter.

Swinging a double-edged sword

It’s been an interesting few weeks in terms of freedom of speech, and what that means on the internet. On 31 August, YouTube made a small but significant change to its advertising policy – it set new guidelines for monetizable content, and included rules specifically against offensive content.

In a lot of contexts, that’s pretty justifiable: it shouldn’t ever be the case that a system is put in place that rewards hateful and destructive speech. Under the previous system, ad revenue correlated pretty much directly with viewership, and controversy is a constant driver of viewership.

For example, it would be possible to create a YouTube channel that featured nothing but trolling and baiting other people, and not only would you get a response to that, you’d actually be rewarded for your efforts with ad revenue. That cannot possibly be a reasonable thing to reward – it directly fans the flames that make YouTube an unpleasant place to be.

Trouble is, “offensive” is an extremely subjective and rapidly-moving target – nowhere more so than in the US, with college campus politics and ludicrous entitlement driving a new generation of offenderati. A recent, glowing example of this was a Lyft passenger who berated the driver for being racist, simply because the driver had a Hawaiian bobblehead accessory in his car:

http://www.dailymail.co.uk/news/article-3763921/Woman-berates-Lyft-driver-racist-Hawaiian-bobblehead-doll-dash.html

That’s one element to consider. The second is how quickly instances like that are used to fuel internet mobs – groups of people with internet access and nothing better to do. They’ll gang up, post harassing messages on social media, try unearthing private information about their target, and generally try making their lives miserable.

If that Lyft driver in the article above were on social media, he would have been targeted with death threats, his information would have been made public, and he’d probably have been barraged with phone calls and texts for being “racist” – none of it his fault, and all of it perpetrated by what I can only imagine is a mentally unstable woman.

That behavior – distasteful as it is – is a fantastic driver of “engagement” on these platforms. It drives eyeballs to videos, it drives comments (albeit bad), and it drives uploads – mainly rebuttals and rants. And YouTube is, in my opinion, justified in trying to protect themselves from that. Not only because advertisers will be insisting on it, but also just because it’s the decent, human thing to do.

But because everyone’s so easily offended, and because people can be rallied up into attack mobs, there’s a very real downside: The systems by which YouTube enforces this policy are largely automated, and there’s basically no preemptive defense. If you say the wrong thing on YouTube, or you upset a group of people (maybe no more than 50 people, even), they can rally against you and flag all your videos for inappropriate content. And after a certain point, the system simply starts demonetizing your videos.

That’s a bit of a slap in the face to the people that work on producing great content for YouTube. It takes a lot of effort to build a channel and an audience, and the ad revenue from that was what made it viable for a lot of content creators to keep doing that. There’s a large overlap between the people that are passionate about creating YouTube content, and the people that believe strongly in the views they’re sharing.

And since we’re now living in a world where simply existing is offensive to some people, it’ll become harder and harder for those content creators to justify spending so much time and effort producing content for YouTube (which already takes 45% of all ad revenue), when it’s so easy for someone to get a hate mob together and torpedo your earnings.

YouTube is by far the largest platform for community-driven speech in the world, and they’ve swung a bit of a double-edged sword here. They’ll likely end up with a profitable network, but at the cost of burning down a vital square for public debate.

Meaning that the final holdout just bit the dust. Facebook has spent years tailoring their algorithms to put profitable content in front of you. Twitter’s losing the battle for their independence, and will have to start making larger compromises pretty soon if they want to remain relevant. And now YouTube has thrown their independent content producers to the wolves.

Kinda makes you long for MySpace a bit.