Monthly Archives: January 2017

Write500 – One Month In

I promised myself I’d write a post at the end of January, reflecting on the progress and lessons learned from Write500 so far. It’s been a busier month than I was expecting!

My goals

At the start of the month I set a few goals for myself:

  • I’d write to the same prompts that users get every day
  • I’d broadcast 1 image per day on social media to promote Write500
  • I’d work towards a web app that could go into beta by the end of the month

As it turns out, doing all of that consistently, on top of a regular 9-5, is a bit more of a challenge than I initially thought!

Some numbers

Before I get into the full run-down, here’s the part most people are likely to care about: the stats!

I started Write500 as a free mailing list, launched on 1 January of this year, and promoted in a few places (notably, Instagram). All the campaigns have now expired, and as we approach the end of the month, here are the highlights for that list’s performance:

Screen Shot 2017-01-28 at 3.57.28 PM.png

Of the 709 users currently subscribed to the free daily prompt list:

  • 683 (96.2%) are still active, and receive daily emails
  • 26 (3.7%) have opted out
  • 144 (20.3%) open their prompt emails every day

Since migrating the user database to the new list system on 17 Jan (more details on that below), I’ve had an additional 23 users join the list – without spending any money, and barely any effort, on marketing.

Screen Shot 2017-01-28 at 4.19.54 PM.png

My traffic curve looks exactly like you’d expect – lots of interest while the Instagram campaign was running, and very little once it ended. Interestingly, the campaign only officially concluded in the early hours of 13 January, after interest in the site had already started slowing down.

Filtering for all traffic starting 14 January, this is what my Acquisition Overview looks like:

Screen Shot 2017-01-28 at 4.22.08 PM.png

It’s encouraging to see Referral and Organic Search showing up in Channels. I’ve still got a ton of work to do, in terms of content marketing – a full site+blog overhaul, to start.

Finally, and probably my favorite part:

Screen Shot 2017-01-28 at 4.24.42 PM.png

Write500 is now ranking #1 for its own keyword on Google, Bing and Yahoo (hence DuckDuckGo putting it at #1 too). Which, to be fair, wasn’t much of a challenge – there is only 1 other domain (write500.com) that could be said to be competing for that keyword.

I do feel that this bodes well for offline and word-of-mouth marketing, though!

Actual achievements

Goal #1: Writing

In terms of the writing, I started out pretty strong – I produced 12249 words between 1 Jan and 12 Jan. There were a few blog posts in there, but most of the words came from the daily prompts.

I started losing traction on this around 13 January, which is no coincidence – that’s when my mental bandwidth for this project started to narrow, with dayjob-related work ramping up. I had to start making compromises, and figured that it would be better to prioritize the app itself.

I’m still pretty happy with what I achieved here, though – 12k words in January is more than I wrote across most of 2016. And I now know I’m capable of this; I just need to find the mental space again.

Goal #2: Image Publishing

When I started working on the prompt database for Write500, my first step was to collect 366 inspirational, motivational and literary quotes. They were originally intended to decorate the daily emails, giving subscribers something to read besides the prompt and the instructions themselves.

When I spun up the Write500 Instagram account, I realized I could find a secondary use for that content – publishing each quote as an image. So I created a batch of 25 images – here’s a sample:

I used one of those as the creative on my Instagram ad, and it performed really well, so I decided to keep publishing those images every day. Even after the campaign ended, those images kept getting likes (as many as 80 per image).

However, this too suffered from a lack of time. I published every image on schedule, from 1 Jan to 18 Jan, until I missed my first day. I then managed to publish daily until the 21st, at which point I hit a bit of a wall.

I needed to make time to create more images in order to continue publishing daily, but at this point there wasn’t an immediate return on it. Even though the images are well-received on Instagram, I got very few signups that way. The only benefit to continuing might have been to keep building up my profile, which felt a bit like putting the cart before the horse: There’s still no app I can send people to.

I will eventually pick this up again, and produce the remaining 340-or-so images. That’ll be a marketing exercise though, and right now, the app is more important.

Goal #3: A Write500 Beta

I might actually make this one! Right now, it’s Saturday afternoon. I’m taking a break from a few hours of dev to write this post, and I’m intending to continue when it’s published. There’s only one core feature left to add, a few style cleanups, and then I’ll be able to start inviting beta users.

I had to jump through quite a few hoops for this one. When I started Write500, my plan was to build one app that allowed both free and paid signups, and I’d port over the users from the free mailing list into it.

When I actually tried building that out though, I started running into issues. Small ones, sure, but I could already see the technical debt forming – I’d have to make lots of exclusions and compromises to have both personas co-exist in one database.

So instead, I took a few days to break out the free mailing list feature into its own project. It now lives at https://list.write500.net as a standalone automated mailer, and exists independently from the main project.

I might even polish that up and release it as a standalone product someday. It just needs a nicer content editor and subscriber management features, but is otherwise a fairly capable bulk mailer (including a bespoke interaction tracker), using the very-reasonably-priced Amazon SES as a backend. Who can argue with $0.10 per thousand mails, and not having to send your subscriber or interaction data to a third party?

With the List system out of the way, I started building the app itself. I didn’t have much of an architecture or feature layout in mind when I started – I just knew the broad strokes of what I wanted to accomplish. Which, to date, has been the following:

  • Subscription integration with Paypal (the only global payment provider that I can use right now), using their PHP SDK to create and execute Billing Agreements.
  • An MVP for the main interaction loop (receive prompt, write words, get statistics), in a basic but functional Bootstrap-based UI
  • A Programs feature, that delivers the prompts on a schedule – this replaces the mailing list in many ways.

And I do have a few screenshots to share! Click to enlarge.

Up Next?

So right now, Write500 addresses the write-in-isolation use case: the writers who prefer to work alone, and only need to see their statistics and streaks building up over time for motivation.

Of course, there are other types of writers out there. I’m in a slightly different cohort myself: I prefer writing as part of a small group, sharing notes and progress as I go. I find that being surrounded by other writers helps the motivation somewhat.

Logically, Write500 needs a Groups feature of some kind, but I’ve given it a lot more thought, and I’m going to try something kinda new: Tribes.

About Tribes

One of the problems I have with normal groups (the sort you have on Facebook, for instance), is that as the size of the group goes up, the quality of the interactions goes down.

Not all interactions, of course – users still post useful things, and engage in useful ways. But when a group gets too big, it loses its initial sense of closeness and community. More members eventually means more rules, more moderation, and inevitably, users going quiet, leaving the group, or splintering off to form their own.

For Write500, I need a Groups feature that encourages everyone in the group to engage on a regular (daily, preferably) basis, while still feeling that they’re getting good personal engagement with other users. Doing this in a classic open-ended group would be difficult. For one, that volume of interaction would be deafening: the signal-to-noise ratio would go way down.

And there’s also the fact that different personas want different things out of their groups. Some prefer lively debate, some prefer terse updates, some prefer checking in multiple times per day, some prefer checking in once a week.

So with Tribes, I’m going to create a groups feature that has the following characteristics:

  • Max of 10 users per Tribe (to start), and Tribes need to split in order to grow.
  • Users join on a time-limited tryout (or need to be invited), and every other member of the Tribe has to explicitly (and anonymously) vote to include them permanently
  • Notifications and events from Tribes will have a dedicated section in Write500 (Tribe Newsfeed), and be the most visible form of notification available
  • Users can only belong to one Tribe at a time

Completely antithetical to standard community growth tactics? You bet!
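To make the admission rule above concrete, here’s a quick sketch in Python – purely illustrative, not actual Write500 code, and all the names are made up:

```python
# Hypothetical model of Tribe admission: a trialist only becomes a
# permanent member if every existing member explicitly votes yes,
# and the Tribe still has room.

MAX_TRIBE_SIZE = 10

def admit(member_votes, tribe_size):
    """member_votes: one boolean per existing Tribe member."""
    if tribe_size >= MAX_TRIBE_SIZE:
        return False  # Tribe is full; it has to split to grow
    # Every member must have voted, and every vote must be yes
    return len(member_votes) == tribe_size and all(member_votes)

admit([True, True, True, True], 4)   # unanimous -> admitted
admit([True, True, False, True], 4)  # one holdout blocks it
```

The unanimity requirement is the point: it keeps a Tribe small and deliberate, by design.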

With Tribes, I specifically want groups of writers to form strong relationships with other writers in their genre/pace/orbit, and feel at ease about sharing more about their work than they’d regularly share on an open group.

I also want to make sure that being in a Tribe is a rewarding experience, and that other members of the Tribe pay attention to what you have to say – not always a given in a group that can grow uncontrollably.

When it comes to the actual writing, there’d be integration there too – the ability to broadcast a completed prompt to your Tribe, letting them review and comment on your work. Which feels like a better solution to me than just broadcasting your work in a public square and hoping to catch people’s attention.

Of course, this entire experiment could fail, and I may just end up going back to standard groups! I’m optimistic, though, that a format like this would create an environment that some users would find useful.

Shoutout: StartupStudyGroup

If you read this far down, well done! You’re clearly someone who’s interested in the details of how new products and services come about – so you really should check out startupstudygroup.com!

I joined the SSG Slack group earlier this year, and the community there has provided me with valuable insights and encouragement so far. If you join the SSG Slack, come say hello in #write500!

And now, back to the grindstone.

Laravel 5.3 + Cloud9

I use Cloud9 as my primary IDE – it handles all my PHP projects, my Domo app projects, and when I start seriously developing Ionic-based mobile apps, I’ll find a way to use Cloud9 for that too.

I’ve found it particularly handy for Laravel development: A lot of the prerequisite software provided by Homestead comes pre-installed on the Ubuntu VM that Cloud9 provides, and with a few small config tweaks, it’s possible to get up and running very quickly.

1. VM Setup

One of my cardinal rules: All your code should exist in a repository. Especially with web development, there’s basically zero excuse for keeping substantial amounts of code locally.

My first step is to always create a blank repository, and use that as the clone URL for a new Cloud9 VM.

2017-01-20 05_48_50-Create a New Workspace.png

The Blank Ubuntu VM is good enough for our purposes – there’s no benefit to selecting the PHP VM, since we’ll be installing the latest PHP in any case.

2. Services Setup

Chances are, whatever you’re building in Laravel will need the MySQL service to be running, and in most cases, you’ll probably want Redis too. Cloud9 has made it a little bit more difficult to automate service startup (presumably to minimize unnecessary resource utilization), but there’s a way around that.

If you’re using a premium workspace, Cloud9 will keep it “hot” between sessions – everything you start will continue to run even after closing the IDE window itself. But that will expire after a while, and it can be a hassle to keep restarting your services.

So, the first change we’ll make to our new VM is editing the ~/.profile file, and adding two new lines to the bottom:

$ cd ~/
$ echo 'sudo service mysql start > /dev/null 2>&1' >> .profile
$ echo 'sudo service redis-server start > /dev/null 2>&1' >> .profile

That will start the services (and skip startup if they’re already running), won’t show any console output, and will run whenever a new terminal window is created. So now, every time you open up a new terminal window, you’re guaranteed to have those services running:

2017-01-20 06_11_31-cloud9-demo - Cloud9.png

It’s not as clean as having the services start automatically with the server, but in my opinion, this is a small trade-off to make.

3. PHP 7 Setup

If you’re deploying code to a new server on DO, or AWS, or Rackspace, or most other places, chances are it’s running PHP 7. By default, Cloud9 does not include PHP 7 (for reasons which escape me), so we need to take a quick detour into apt-get land.

There are two ways to set up PHP here – apt, or phpbrew. For a while I was using phpbrew, which builds PHP from source. It’s a decent option, but using ondrej/php is faster.

$ sudo add-apt-repository ppa:ondrej/php

Then you’ll need to sudo apt-get update. When that’s done, you should have access to the php7 packages:

2017-01-20 06_18_23-Groove Music.png

I’m using php 7.1 here, because that’s what my DigitalOcean VM is running. Laravel will need that, plus a few other modules:

$ sudo apt-get install php7.1 php7.1-mysql php7.1-curl \
  php7.1-xml php7.1-mbstring

That shouldn’t take more than a minute. And now we’re good to go:

2017-01-20 06_21_26-cloud9-demo - Cloud9.png

4. Laravel Project Setup

There are quite a few ways to get Laravel installed. It has a CLI-type thing that lets you do laravel new, you can clone it directly from github, or you can use Composer.

I tend to favor Composer, since it’s a once-off command for this workspace – there’s no benefit to installing more tools when we’re only going to create one Laravel project in this VM. And since we’ve already linked our workspace folder to our own project repository, cloning Laravel’s repository directly would make a mess of git.

So my favorite – use composer to set up Laravel in a temporary directory, then move it all across:

$ cd ~/
$ composer create-project laravel/laravel tmp --prefer-dist
$ mv tmp/* workspace/
$ mv tmp/.* workspace/
$ rmdir tmp

Why two move commands? The first catches all the regular files, but Laravel ships a few dotfiles (like .env.example and .gitignore) that the * wildcard doesn’t match. Sure, you could enable bash’s dotglob option so that it does, but we’re only doing this move once for the workspace. (The second command will complain about . and .. – that’s harmless.)

Just one more setup step – the database. Start by adjusting the .env file to set the username to root, and a blank password:

DB_CONNECTION=mysql
DB_HOST=127.0.0.1
DB_PORT=3306
DB_DATABASE=homestead
DB_USERNAME=root
DB_PASSWORD=

Then we’ll create that database in one go:

$ mysql -uroot -e "CREATE DATABASE homestead;"

You can now run the initial migrations if you want:

$ php artisan migrate

5. Node Setup

If you’re planning on using Elixir to compile frontend assets, you’ll probably want to upgrade to the latest version of Node. Cloud9 makes this easy:

$ nvm install 7.2
$ nvm alias default 7.2

All node commands will now use the 7.2 binary. You can now install the necessary components in your workspace folder:

$ npm install

6. Running Laravel

Cloud9 features a reverse proxy – anything that binds to port 8080 in the VM will be accessible via a URL unique to your workspace. It also has a dedicated “runner” feature, which we’ll configure like so:

2017-01-20 06_33_15-cloud9-demo - Cloud9.png

The name can be anything, but the run command will be:

php artisan serve --host=0.0.0.0 --port=8080

Cloud9 does make mention of $HOST and $PORT variables, but they’re always the same values anyway. Fill that in and click the Run button on the left:

2017-01-20 06_33_53-cloud9-demo - Cloud9.png

Cloud9 should open up an in-IDE browser window that points to your site. You can click the little pop-out button on the right of the “address bar” to bring it up in a new browser tab.

2017-01-20 06_36_01-cloud9-demo - Cloud9.png

That domain (c9users.io) will check for cloud9 authentication before proxying to your VM – in short, that URL will only work for you, and anyone else on cloud9 that you’ve shared the workspace with. Those settings are managed via the Share button on the top right:

2017-01-20 06_37_40-cloud9-demo - Cloud9.png

From that popup, you can invite new users to the workspace (cloud9 does issue free accounts), or you can just make the application URL public:

2017-01-20 06_38_34-cloud9-demo - Cloud9.png

You can now share that URL with anyone.

7. Get coding!

You’ve now got just about everything you need for proper Laravel development! It’s usually at this point that I take the extra steps of configuring Amazon S3 as a cloud disk, pushing the blank Laravel project to Bitbucket, and deploying the repo as a site on Forge.

Write500: My first 500 users.

After blogging about the start of Write500, I told myself I’d write a follow-up when I reached 500 subscribers. That happened a lot faster than I expected it would.

Yes, I’m at 500 opt-ins – barely 10 days into this.

In this post – expect stats, charts, breakdowns, and guesses as to what sort of marketing worked for me, what did not, and where I plan on spending my time next.


Building a Domo dashboard for Write500

This post is part of a series of posts I’m doing about Domo (the Business Cloud) and how I’m using the free tier to analyze metrics around write500.net. Click here for the full listing.

There’s a surprising amount of things to keep an eye on when it comes to a project like Write500.

Right now, only the free tier of Write500 exists – users get sent to a subscribe page, they opt-in, and receive daily writing prompts over email. It won’t always be so simple: I’ve got plans for a Plus (paid) version, and mobile apps to extend Write500’s reach.

So even this simple, free version, in terms of the architecture, already looks like this:

write500-arch

Every box in there has something that needs to be monitored in some way. Some of the monitoring is historical (looking back at performance over time), some of it is by exception (being alerted when something goes wrong).

That’s already a lot of potential metrics, from a lot of different sources. Individually, they aren’t particularly hard to get – it’s deciding which ones to get, how often, and what you’ll do with the information, that matters.

And that’s the product side of the equation. Not only am I building the product, I’m the one-person marketing team behind it.

write500-marketing

Each of those boxes, too, can have a whole universe of metrics in them – social media in particular, which has a penchant for inventing crazy new metrics every few months.

Where to start?

If it seems overwhelming – it is. There’s a lot of information out there, not all of it useful. It’s way too easy to get hung up on the wrong metrics, and miss out on the ones that matter.

So the best place to start, in my experience, is not with the data. I’ve seen too many BI projects start out with whatever the team could get from their databases, and in the end deliver dashboards that looked identical to the reports they already had.

What we’re after here is actionable, meaningful insight. So without further ado:

1. The Question

A much better place to start, is the question.

If I could answer one question about your business, what would it be?

For Write500 I have a lot of questions, but in this post we’ll deal with the one that’s most important to me:

Are my subscribers getting value from Write500?

I built Write500 because I wanted to be nudged to write 500 words per day. So far, as the user of my own product, I’ve managed to achieve that – but my experience doesn’t matter nearly as much as the experience of my subscribers. If they’re not getting the value they need, then I’m either aiming at the wrong audience, or there’s a problem with my product.

Now, that question could be answered anecdotally. I could point to a few places I’ve seen bloggers start their challenge, and take that as a sign that users are getting value. I’ve seen a lot of businesses do this – make gut judgements based on limited information.

I have learned, however, not to trust my gut. Let’s quantify this instead.

Getting metrics about my subscribers is easy – I can pull the subscribers table and chart the growth. Getting metrics about the value is a little bit harder – right now, I’m not equipped for that.

Ideally, the value I want subscribers to get out of my product is producing 500 words per day – and the only way to track that metric is to have them plug their 500 words into a system where I can measure their wordcount. That’s a feature planned for the next version of Write500, but right now, I don’t have that data.

The best I can do, with the information I have, is determine whether or not users are opening the emails I’m sending on a regular basis. That’s not the best metric for whether or not they’re getting value from Write500, but I’ll operate on the assumption that consuming the prompts is at least partially valuable.

Here, already, one of the biggest problems in BI becomes visible: The gap between what you need to know about your business, and the data you have on hand to answer it. So, so many BI vendors and consultants sell dashboards that chart what you have, but don’t give a second thought to what you need.

2. The Data

So based on my question, what data can I acquire to answer it?

Write500 has a bespoke tracking pixel. Every prompt that gets sent is uniquely identifiable, and includes embedded tracking that fires (email client willing) when the email is opened. This data is sent straight back to my server, instead of going through a third-party analytics tool.
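For the curious, the logic behind that pixel is simple enough to sketch. This is an illustrative Python version – the real implementation is PHP, and the structure and names here are assumptions, not Write500’s actual code:

```python
# Sketch of open tracking: each send has a GUID embedded in the pixel URL,
# and the pixel request records a server-side opened_at timestamp the
# first time it fires.
from datetime import datetime, timezone

sends = {
    "guid-1": {"sent_at": datetime(2017, 1, 7, 6, 0, tzinfo=timezone.utc),
               "opened_at": None},
}

def record_open(guid):
    send = sends.get(guid)
    if send and send["opened_at"] is None:
        # Only the first open is recorded; later pixel hits are ignored
        send["opened_at"] = datetime.now(timezone.utc)
    # ...and the endpoint responds with a 1x1 transparent GIF either way
```

The important property is that the open event lands in my own database, keyed by the send GUID, rather than in a third-party tool.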

sendopened

The raw data, as you can see in the screenshot, includes the GUID for each send, and the server-side timestamp for when the email was opened. I also have the timestamp for when the email was sent. With this data, I can answer two sub-questions:

  1. Are subscribers opening the emails they’re receiving?
  2. Do subscribers open the email on the same day they receive it?

Together, the answers to those questions would at least indicate whether or not the emails themselves are being utilized (and whether my subscribers are getting value). So looking at the data I have available, this is what I can get to help answer those questions:

Per subscriber: (dimension)

  • The number of prompts they have been sent (metric)
  • The number of prompts they have opened (metric)
  • The open rate (calculated)

Per day: (dimension)

  • The total number of prompts sent (metric)
  • The number opened within 24 hours (metric)
  • The number opened within 48 hours (metric)
  • The number opened within 72 hours (metric)
  • The number opened after 72 hours (metric)
  • The number not opened (metric)
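Deriving those latency buckets from the raw sent/opened timestamps is straightforward. Here’s a sketch in Python – the real generator is an Artisan command, so treat this as pseudocode for the logic:

```python
# Classify each send into one of the per-day buckets listed above,
# based on how long after sending the open was recorded (if at all).
from datetime import datetime, timedelta

def bucket(sent_at, opened_at):
    if opened_at is None:
        return "not opened"
    delta = opened_at - sent_at
    if delta <= timedelta(hours=24):
        return "within 24 hours"
    if delta <= timedelta(hours=48):
        return "within 48 hours"
    if delta <= timedelta(hours=72):
        return "within 72 hours"
    return "after 72 hours"
```

Counting sends per bucket, grouped by send date, gives exactly the per-day metrics in the list.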

Dimensions and Metrics

I should explain what these are before continuing, and I’m doing this in the context of Domo’s capabilities. Within any tabular data set, you’ll have two types of fields:

  • Numbers
  • Everything that’s not numbers

Metrics are numbers you can measure, and Dimensions are the categories you group/slice/breakdown those metrics in. So:

  • New Subscribers per day
  • Recurring Visitors by mobile device
  • Purchases over $50 by persona

In the samples above, Day is a dimension (because it’s not a number). But Subscriber is also a dimension, even though it’s technically a numeric ID – it’s a number that makes no sense to measure. This is the exception to the rule.

It can’t be summed, averaged, min’d or max’d and produce any meaningful result: what’s the point of knowing the average subscriber ID in your database? The count of subscribers is more useful, and in that case, the count becomes the metric.
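A tiny worked example of the split, with made-up data: the day is the dimension you group by, and the subscriber count is the metric you aggregate.

```python
# Dimension vs metric in miniature: group rows by a non-numeric field
# (the dimension), and sum a measurable number (the metric) within
# each group. The rows here are invented for illustration.
from collections import defaultdict

rows = [
    {"day": "2017-01-05", "new_subscribers": 40},
    {"day": "2017-01-05", "new_subscribers": 12},
    {"day": "2017-01-06", "new_subscribers": 31},
]

totals = defaultdict(int)
for row in rows:
    totals[row["day"]] += row["new_subscribers"]  # metric, summed per dimension

# totals -> {"2017-01-05": 52, "2017-01-06": 31}
```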

3. The Acquisition

So now we know what question we’re answering, and the data that we’re going to use to answer it. This is where we start getting into the implementation.

The first step, obviously, is to acquire the data. In a previous post, I dealt with this issue for Write500, eventually settling on an SFTP upload for my subscribers dataset. Since then, I’ve added a better stats system to the app.

So for Write500 specifically, I now have this happening:

Write500 Data - New (1).png

I have a series of Artisan commands, each of which generates a single CSV file. Those commands are scheduled using the cron system on Forge, and when they run, the results are uploaded to a dedicated S3 bucket. Domo, in turn, pulls that information into its Vault:

domodata

Now that I have the data coming in, it’s time to build some charts!

4. Visualization

Visualization is a problem domain all its own, and has way too much science behind it to cram into a 3000-word blog post. However, we’re doing visualization for BI, which is very different to visualization for analysts, engineers and scientists.

BI visualization has to meet several fundamental criteria (at least, the type of BI I do):

  • Self-contained – the chart should explain what it is without requiring an appendix. Any chart that has a “How to read this chart” next to it, is a bad chart.
  • Key call-out – Whatever the purpose of the chart is, it should be obvious at a glance, preferably from across the room.
  • Honest – Too many “analysts” will fiddle with colors, min/max axes dimensions, sorting and scale to try to massage the truth behind the numbers. Don’t.
  • Actionable Next Step – In charts that are meant to drive decision-making, make the decision obvious.

In my experience, this tends to mean the following:

  • You’re going to re-use a lot of the same 3-4 chart types (line, bar, area, donut)
  • You’re going to create lots of smaller charts, instead of consolidated/complex ones
  • Colors are really important
  • Short titles are really important

I will eventually build the charts that answer my key question, but let’s start with some simple ones.

2017-01-07-14_50_30-write500-domo

First, we start with a blank canvas.

Domo includes the ability to group charts into Collections, which make it simpler to manage complicated dashboards. In this case, all the charts that pertain to my subscribers will go in one collection – other collections will include marketing and product data.

Note: All the charts here are built on real data, as of today.

0. Warming up

Easiest thing to chart – one metric vs a date.

2017-01-07-14_54_01-write500-domo

Here’s an example of what I think is a good chart:

  • The title and subtitle make it clear what I’m looking at,
  • The summary number makes the current status clear,
  • The bar chart shows the growth nicely, and
  • Using Domo’s linear regression Projection feature, I can see that I should end today with over 100 new subscribers

So overall – I’m getting new subscribers, and it’s visually obvious from the chart that the velocity is increasing.

But now let’s answer our first sub-question:

Are subscribers opening the emails they’re receiving?

Now we get to define a custom metric! I’m going to call it “Engaged”:

Engaged: A user that has no more than 1 unopened prompt email

The logic behind this: Users receive one email per day, and there will be a gap between receiving an email and opening it. However, for as long as that gap exists, the email will show as “1 unopened” in my data, which is not fair – the user just hasn’t had a chance to open the most recent email yet.

“2 unopened” means they’ve neglected at least one email, so I’ll consider them less engaged. Yes, this is a high standard.

How do we calculate this metric? Here’s the Beast Mode formula I’m using, directly in the card.

calc1

This gets us the total users that have between 0 and 1 unopened emails, as a proportion of the total number of users in the data.
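Since the formula itself lives in the screenshot, here’s roughly the same calculation as a Python sketch – the field names are assumptions, and the real version is a Beast Mode formula in Domo:

```python
# Engaged rate: the share of subscribers with at most 1 unopened
# prompt email (sent minus opened).
def engaged_rate(subscribers):
    engaged = sum(1 for s in subscribers
                  if s["sent"] - s["opened"] <= 1)
    return engaged / len(subscribers)

engaged_rate([
    {"sent": 7, "opened": 7},  # fully caught up  -> engaged
    {"sent": 7, "opened": 6},  # one pending      -> still engaged
    {"sent": 7, "opened": 2},  # neglecting       -> not engaged
])  # -> 2/3
```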

That can be represented in a single large number card, like so:

2017-01-07-15_19_06-write500-domo

That’s a pretty impressive result! Almost 70% of my subscribers are opening their emails on a regular basis, and it could be even higher: remember that I cannot track opens from email clients that block the call to my tracking pixel.

I also want to get a sense of the proportions, though – so we’ll take that summary metric and use it as the summary number on a pie chart. I’ll group my subscribers like so, using another Beast Mode formula.

bm2.png

Honestly, these sorts of CASE statements represent 90% of what I use Beast Mode for in Domo – creating custom categories.
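The CASE statement in that screenshot buckets subscribers by their open behavior. My actual labels aren’t visible here, so this Python sketch uses stand-in categories to show the shape of the logic:

```python
# Stand-in version of the CASE-statement categorization: bucket each
# subscriber by how they engage with their prompt emails. Labels and
# thresholds are illustrative, not the exact ones from my card.
def category(sent, opened):
    if opened == 0:
        return "Never opened"       # neglecting their prompts completely
    if sent - opened <= 1:
        return "Engaged"            # at most one prompt pending
    return "Lagging"                # opening some, but falling behind
```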

We’ll apply that formula to the data, and spin up a pie chart. Here’s what we get:

2017-01-07 15_24_03-Write500 - Domo.png

So of all my subscribers, 60 of them are neglecting their prompts completely. This is also useful information – I know who those subscribers are, so I can segment them out and try different email subject lines, to see if I can maybe get them more engaged and bolster the overall rate.

There’s one more chart we can create – we can compare our engagement rate to an industry open rate. Using this source data, I’ll take Hobbies as my benchmark.

2017-01-07 15_28_21-Write500 - Domo.png

Not bad at all! But then, that’s to be expected – my emails are not marketing, they’re more transactional in nature, and transactional emails have much higher open rates.

So let’s go back to our question:

Are subscribers opening the emails they’re receiving?

Resoundingly, yes: Close to 70% of my subscribers are opening their emails.

On to the next question!

Do subscribers open the email on the same day they receive it?

The answer to our previous question mostly covers this one too – if Unopened is between zero and one, then yes, most subscribers open the email the same day they receive it. However, that’s an aggregate view, and I want something more specific about how long it takes subscribers to engage with their emails.

So for this, we’re looking at the time difference between the email being sent, and opened. I’ve already done the heavy lifting of categorizing the data inside my data-providing Artisan commands, so the data I’m visualizing is already close to useful:

2017-01-07-15_46_51-write500-_-open-rate-domo

Our last charts looked at Subscriber Engagement, which is a higher-level metric than whether each individual email was opened or not. For these charts, we’re taking a different angle and looking at the actual emails. We’ll get similar rates, but not identical ones – that’s to be expected.

2017-01-07-16_59_10-write500-domo


Of all the emails that do get opened, practically all of them get opened on the same day (within 24 hours of being sent). And it looks like, overall, about 40% of the emails sent by the system have not been opened yet. Or, at least, they haven’t fired the tracking pixel yet.

The thing about tracking email open rates in particular, though: it’s not naturally a trendable metric.

For example: You can send 100 emails on Monday, then calculate the open rate every day of the week, and get a different (higher) number every day.

Why? Because subscribers take time to open their emails (if at all). So looking at that chart above, right now, 40% of the emails I’ve sent are unopened. Those emails could have been sent today, or a week ago.
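To make that concrete, here’s a toy example: 100 emails go out on Monday (day 0), and the very same batch produces a different open rate at every snapshot, because opens keep trickling in.

```python
# Recomputing a fixed batch's open rate at successive snapshots.
# open_days holds the day-offset at which each opened email was opened;
# the numbers are invented for illustration.
def open_rate(open_days, snapshot_day, sent=100):
    opened_so_far = sum(1 for d in open_days if d <= snapshot_day)
    return opened_so_far / sent

opens = [0] * 50 + [1] * 10 + [3] * 5  # 65 of the 100 eventually opened

open_rate(opens, 0)  # 0.5 on Monday
open_rate(opens, 1)  # 0.6 on Tuesday
open_rate(opens, 3)  # 0.65 by Thursday
```

Same emails, three different “open rates” – which is why a single open-rate number only makes sense with a timestamp attached.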

Right now, I care about whether or not subscribers open on the same day – and according to this chart, nearly all opened emails are opened on the same day. Emails opened at +1 Day and up are completely dwarfed. This is what happens when we remove Unopened emails from the chart:

2017-01-07-17_00_34-write500-domo

So that’s my question answered!

5. Taking Action

I’ve got my questions, they’re being answered by data, and now I need to think about the next step in this: What actions can I take, based on this data, to drive my business goals?

Right now, my goals are two-fold:

  1. Drive more subscribers to Write500
  2. Improve the email open rate

For (1), what I really need to look at is my subscriber acquisition machine – the marketing I’m doing, which is outside the scope of this post. I do plan on writing another one dealing specifically with marketing metrics though.

For (2), I now have a well-defined group of subscribers that are either not opening their emails, or not opening them daily – meaning they’re not writing every day. I can segment those users out and try a few things to remedy that.

I could also simply reach out to the subscribers that haven’t opened any emails yet. There might be a good reason why those emails aren’t being opened:

  • maybe they’re getting caught in spam,
  • maybe the user hasn’t been at their mailbox in the last week, or
  • maybe they no longer want to receive them, but haven’t bothered opting out.

My action would be to find out what it is, and remedy it where possible.

6. (Finally) Measuring Results

The final piece of the puzzle, and the one question that every BI dashboard should answer:

Are our actions having the intended effect?

Since the insights and actions are data-driven, the results should be too. For every action that you take to impact your metrics, you need to think about how you’re going to measure the results in a quantifiable way.

Let’s take the Open Rate chart above. When I start taking action to improve it, I expect to see that, over time, the overall percentage of Unopened emails goes down.

Since it’s an email open rate though, I can only get snapshots at regular intervals. The data generator I wrote does just that – it gathers point-in-time statistics, and appends them to the dataset. Right now, it’s running once per day, and it’s grouping the sends by the date they went out.
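As a rough sketch of that append-only pattern (the column names and numbers here are invented, not my real stats), each daily run adds one snapshot row per send date to a growing file:

```shell
#!/bin/sh
# Sketch: append point-in-time open-rate snapshots to a dataset.
# Each daily run adds one row per send date; re-counting later runs
# captures late opens. All figures below are made up.
STATS=open_stats.csv

# Write the header only on the first run.
[ -f "$STATS" ] || echo 'snapshot_date,send_date,sent,opened' > "$STATS"

# One row per send date, as counted at snapshot time.
echo '2017-01-07,2017-01-06,100,58' >> "$STATS"
echo '2017-01-07,2017-01-07,100,31' >> "$STATS"

cat "$STATS"
```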

So putting that into a 100% Stacked Bar Chart, I get this:

2017-01-07 17_02_06-Write500 - Domo.png

Every day I’ll get another bar on this chart, and the earlier bars will be recalculated as emails sent on previous days get opened – which, as we know from the data above, they will be.

However, over time, that red portion should get smaller. If I wanted to create very precise tracking on this, I’d create a recursive dataflow to record the Unopened rate in a dedicated dataset – but for right now, I think I have enough BI going on!


Conclusion

I hope this post has been informative!

I’m planning on doing a deeper dive post into Write500’s metrics at the end of the month – that post will deal more with my expectations vs the reality, the charts I ended up using on a daily basis, what decisions I took, and how I measured those results.

2017-01-07 17_17_53-Write500 _ IG_ write500 - Domo.png

The dashboard I’m sticking with for now.

If you want to be notified when I write more of this stuff, check the Subscribe widget in my sidebar, top right. Or you can follow me on Twitter.

Use Amazon S3 with Laravel 5

Laravel’s Filesystem component makes it very easy to work with cloud storage drivers, and the documentation does an excellent job of covering how the Storage facade works – so I won’t repeat that here.

Instead, here’s the specifics on getting Laravel configured to use S3 as a cloud disk. These instructions are valid as of 4 January 2017.

The AWS Setup

On the AWS side, you need a few things:

  • An S3 bucket
  • An IAM user
  • An IAM policy attached to that user to let it use the bucket
  • The AWS Key and Secret belonging to the IAM user

Step 1: The S3 Bucket

Assuming you don’t already have one, of course.

This is the easiest part – log into AWS, navigate to S3, and create a bucket with any given name. For this example, I’m using write500-backups (mainly because I just migrated the automated backups for write500.net to S3):

2017-01-04 00_33_39-S3 Management Console.png

1. Easy button to find

Then:

2017-01-04 01_41_14-S3 Management Console.png

2. Select your region – with care

US Standard is otherwise known as North Virginia, and us-east-1. You can choose any region, but then you’ll need to use the corresponding region ID in the config file. Amazon keeps a list of region names here.

If you’re using this as a cloud disk for your app, it would make sense to locate the bucket as close as physically possible to your main servers – there are transfer time and latency benefits. In this case, I’m selecting Ireland because I like potatoes.

Step 2: The IAM User

Navigate to IAM and create a new user. AWS has updated this process recently, so it now has more of a wizard look and feel.

2017-01-04 00_37_17-IAM Management Console.png

1. Add a new user from the Users tab

2017-01-04 00_37_33-IAM Management Console.png

2. Make sure Programmatic access is ticked, so the system generates a Key and Secret

Step 3: The IAM Policy

The wizard will now show the Permissions page. AWS offers a few template policies we’ll completely ignore, since they grant far too much access. We need our user to only be able to access the specific bucket we created.

Instead, we’ll opt to attach existing policies:

2017-01-04 01_45_53-IAM Management Console.png

3. This one

And then create a new policy:

2017-01-04 01_46_01-IAM Management Console.png

4. And then this one

This will pop out a new tab. On that screen, select “Create Your Own Policy”.

  • Policy name: Something unique
  • Policy description: Something descriptive
  • Policy document: Click here for the sample

Paste that gist into the Policy Document section, taking care that there are no blank spaces preceding the {. Replace “bucket-name” with your actual bucket name, then save:

2017-01-04 01_50_40-IAM Management Console.png

If only insurance were this easy
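The linked gist isn’t reproduced here, but a minimal policy of this shape – scoped to a single bucket – generally looks like the following. This is a generic sketch, not necessarily identical to the sample; note that listing needs the bucket ARN while object operations need the `/*` object ARN:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": ["arn:aws:s3:::bucket-name"]
    },
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
      "Resource": ["arn:aws:s3:::bucket-name/*"]
    }
  ]
}
```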

Go back to the IAM wizard screen and click Refresh – you should see your brand new policy appear at the top of the list.

2017-01-04 00_40_36-IAM Management Console.png

Perfect!

Tick the box, then click Next: Review, and then Create user. It’ll give you the Access key ID and Secret like so:

2017-01-04 00_40_50-IAM Management Console.png

3. When you complete the wizard, you’ll get these.

The Access Key ID (key) and Secret access key (secret) will be plugged into the config file.

Step 4: Configure Laravel

You’ll want to edit the filesystem details in config/filesystems.php

Near the bottom you should see the s3 block. It gets filled in like so:

fixed.png

Remember to set the correct region for your bucket

And done! The filesystem is now configured. If you’re making your app portable, it would be smart to use env() calls with defaults instead, but I’ll leave you to figure that one out 🙂
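For reference, here’s roughly what that block looks like in text form – the env() variable names, credentials, and region are placeholders of my own choosing, not necessarily what’s in the screenshot:

```php
// config/filesystems.php -- the 's3' entry under 'disks'.
// Key and secret come from the IAM user created earlier;
// region must match the bucket's region (Ireland = eu-west-1).
's3' => [
    'driver' => 's3',
    'key'    => env('AWS_KEY', ''),
    'secret' => env('AWS_SECRET', ''),
    'region' => env('AWS_REGION', 'eu-west-1'),
    'bucket' => env('AWS_BUCKET', 'write500-backups'),
],
```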

Step 5: Test

The simplest way to test this is to drop into a tinker session and try working with the s3 disk.
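A session along these lines (the filename and contents are arbitrary) exercises the disk end to end:

```php
// Inside `php artisan tinker` -- write, read, and clean up on the s3 disk.
Storage::disk('s3')->put('tinker-test.txt', 'Hello from tinker!');
Storage::disk('s3')->exists('tinker-test.txt'); // true if the upload worked
Storage::disk('s3')->get('tinker-test.txt');    // returns the file contents
Storage::disk('s3')->delete('tinker-test.txt'); // tidy up afterwards
```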

2017-01-04 01_57_16-forge@write500_ ~_write500.net.png

And you should see the corresponding file in the S3 interface itself:

2017-01-04 01_57_43-S3 Management Console.png

Step 6: parrot.gif

Now that you have cloud storage configured, you should use it!

First (and this will take minimal effort), you should set up the excellent spatie/laravel-backup package. It can do file and database backups, health monitoring, and alerts, and can be easily scheduled. Total win.

You can also have Laravel do all its storage there. Just change the default disk:

2017-01-04 02_01_34-write500 - Cloud9.png

config/filesystems.php
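In text form, that change is just the top-level default key – shown here with a hypothetical env() fallback; the variable name is my own choice:

```php
// config/filesystems.php -- make s3 the default disk, so calls like
// Storage::put() go to the cloud without naming a disk explicitly.
'default' => env('FILESYSTEM_DRIVER', 's3'),
```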

This has the benefit of ensuring that even if your server crashes/dies horribly, nothing gets lost. You can also have multiple server instances all talking to the same s3 disk.

In my case, I’m using S3 as the storage for regular backups from write500. I’ll also use the same connection and attempt to publish my internal statistics dumps as CSV directly to S3 – meaning I can pull the data in via Domo’s Amazon S3 connector. I can then undo the SFTP setup I created previously, further securing my server.

Pulling arbitrary data into Domo

This post is part of a series of posts I’m doing about Domo (the Business Cloud) and how I’m using the free tier to analyze metrics around write500.net. Click here for the full listing.

There are almost as many ways to integrate data as there are types of data and source systems that hold it – there’s no way to cover them all in the space of one blog post.

In terms of Domo, though, all data is tabular – Domo doesn’t deal with binaries, log streams, delimited text files or complex JSON documents. Just plain old CSVs, which makes sense if you’re doing BI for business – most of the things that matter to decision-makers can be represented in table form.

So long as you can get your data into CSV format, there are several ways to get it into Domo. In this case, I want to pull anonymous user data from write500 – just enough to chart user growth.

I’m also working within the limitations of what the free tier of Domo can do. The Enterprise tier is capable of more advanced acquisition, including basic JSON and CSV API endpoints.

In this example, I do have the benefit of being both the developer of the product, and the business user consuming the dashboards – so I can write any solution I need. Based on what the free tier is capable of, I have the following options:

write500-data-page-1

That’s a lot of options – and at some point, I’ve used every one.

From top to bottom:

1. Cloud Storage
I could write a script on the server to package the data in a format that can be uploaded to cloud storage – so a zipped file pushed to Box, OneDrive or Dropbox for Business. Domo then has cloud connectors that will let me pull that down as usable data.

2. JSON API
I could expose the data through a simple JSON API, and use Google Sheets as an intermediary. It’s possible to use the built-in scripting language and some scheduling to populate a Google Sheet, which is then trivial to import into Domo. It’s not very scalable though.

3. Export to CSV
I could write a script to dump out my required data as a CSV file, then pull that over SFTP. This is easier to set up, but still requires some scripting work. I can then pull the resulting data with the CSV/SFTP connector.

4. Direct Database Access
I could use the MySQL connector to hit the write500 database directly. I’d have to open firewall ports, add users, and do a bunch of other setup first. I’m not in favor of doing that much work right now, though.

So we’ll go with 3 – I’ll write a script to produce the exact CSV file I need, then set up the SFTP connector to fetch it.

Deciding what to fetch

There’s something of an art to deciding what metrics to actually pull – you first need to decide what’s actually useful to you, in terms of answering your business questions. In this example, I know I simply need a dataset that looks like this:

User ID | Date Registered     | Is Subscribed
1       | 2017-01-01 05:30:00 | 1

How you generate that CSV is up to you – in my case, I wrote a basic Bash script to do that:

bashscript

Ok, so “basic” might be the wrong word to use – but all that’s really doing is generating a CSV file, with the right header line and correctly-formatted data.
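The original script isn’t reproduced here, but the general shape is something like this sketch – the query, column names, and paths are all stand-ins, and the database output is faked with printf so the sketch runs standalone:

```shell
#!/bin/sh
# Sketch: build users.csv with a header line plus comma-separated rows.
# In the real script, the rows would come from something like:
#   mysql -B -N -e "SELECT id, created_at, is_subscribed FROM users" write500
# (mysql -B emits tab-separated output, which we translate to commas).
OUT=users.csv

echo 'User ID,Date Registered,Is Subscribed' > "$OUT"

# Stand-in for the tab-separated database output:
printf '1\t2017-01-01 05:30:00\t1\n2\t2017-01-01 08:12:44\t1\n' \
  | tr '\t' ',' >> "$OUT"

cat "$OUT"
```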

That script is set up to run every 30 minutes, and it will keep a fresh copy of users.csv in a dedicated user’s home directory.

Now to import that! Domo has a set of connectors for working with files:

2017-01-02-22_43_45-domo

1. That (+) icon is everywhere.

connect

2. We’re looking for this one.

On the next screen, typical SFTP setup stuff – host, username and password. If you run a tightly-secured system, you might also need to whitelist an IP range so that Domo can reach your server.

2017-01-02-22_46_37-domo

3. Just needs a valid SSH (/SFTP) user.

2017-01-02-22_48_00-domo

4. Voila! Our file.

2017-01-02-22_49_31-domo

5. I want frequent updates!

2017-01-02-22_50_22-domo

6. Name, Describe, and Save

Whew!

So what we just did: we set up the SFTP connector to fetch that CSV file once every hour. Every time it does, it will overwrite the existing dataset with new data – that’s what the Replace method in the Scheduling tab means.

Finally, named and described. It’s helpful to prefix all your datasets with a short project name – it makes them easier to find later.

In no time at all, we’ve got our data!

2017-01-02-22_52_29-write500-_-users-domo

Sweet, sweet data.

From here, the next step is to visualize it. That’s a whole topic all on its own (so I won’t go into detail here), but here’s a dead-simple line chart built from that data, to show the trending over time:

2017-01-02-22_54_50-overview-domo

54 opted-in users from a total of 80 or so

So that’s a basic rundown of a CSV connector, end-to-end. This example does lean towards the “more complex” side of data acquisition when it comes to Domo. Luckily, most of the high-value stuff exists in systems that Domo already has great connectors for.

If you want to be notified when I write more of this stuff, check the Subscribe widget in my sidebar, top right. Or you can follow me on Twitter.

Using Domo – First Steps

This post is part of a series of posts I’m doing about Domo (the Business Cloud) and how I’m using the free tier to analyze metrics around write500.net. Click here for the full listing.

I’m a stats nerd. I love numbers, graphs, and digging into them to find meaning. So, when the opportunity came up at my employer (Acceleration) to partner with Domo and deliver cloud-based BI to customers, it was a natural fit. I’ve spent a bunch of time in both the product and the ecosystem, and have recently started using it to run analytics for my own projects.

A few months ago, Domo made free instances available – anyone can now sign up and get a feature-limited account. I did exactly that, and I’m hoping to write a few posts about how to use various parts of it to do really cool stuff.

What is Domo?

Domo is a cloud-based BI tool that’s aimed at enterprises. Their pricing reflects that, with the entry-level accounts sitting at $175/user/month. Which is not cheap. I haven’t checked comprehensively (there are lots of older-school solutions like SAS and Tableau out there) but it would not surprise me to learn that Domo is the highest-priced BI tool on the market.

The one thing that Domo does really well, compared to other solutions, is data acquisition. The cloud is a very messy place – some vendors have APIs, some don’t, and every API is different. It’s a far cry from the relatively straightforward world of ODBC, where vendors all adhered to the same standard for moving data around.

For some enterprises, the one-click nature of getting data in is the single biggest selling point. The transformation, charting, reporting, social features, alerts, and mobile app are all nice – but the fact that a non-technical user can plug in a few usernames and passwords and get a dashboard? That’s the winning stuff.

Domo makes it very straightforward to access common sources (popular cloud tools, social networks, cloud productivity, etc), and equally straightforward to push data in via a REST-based API. I even built a PHP library for that (shameless plug alert).

So for today, I’m gonna get started by getting a Domo instance up and running, and pulling in Google Analytics data to power a simple dashboard.

This is by far the easiest thing I’ve ever done in this tool – the cool stuff comes later!

Getting Started

The signup form is here, and having just tried it, I got my instance invite about a minute later. Of course, the first thing you should do is company setup – just to put your name and face on it!

2017-01-02-20_17_41-domo

Handy little app launcher – everything’s here

2017-01-02-20_17_53-company-settings-domo

Company Settings > Company Overview – just the basics!

There’s also a profile setup, but I’m skipping that here – I’m the only person using this instance for now.

Next up, I’ll use the Domo Appstore to deploy a pre-built Google Analytics dashboard. The Free tier of Domo includes 4 apps, so this is made very straightforward:

2017-01-02-20_18_04-company-settings-domo

1. Launcher > Appstore

2017-01-02-20_18_20-appstore-domo

2. Pretty hard to miss!

2017-01-02-20_19_45-google-analytics-quickstart-appstore-domo

3. Yes please, we’ll try it!

2017-01-02-20_19_58-google-analytics-quickstart-appstore-domo

4. Naming it after the product I’m tracking

From there, Domo builds out a page with the name from step 4, and fills it with a bunch of charts all powered by sample data. This gives you a good idea of what sort of insights you’re going to get – but I’ll skip straight to the relevant part:

2017-01-02-20_20_52-write500-net-domo

5. On the top right, Connect Your Data

2017-01-02-20_20_58-write500-net-domo

6. Connect accounts! Future installs will skip this step.

2017-01-02-20_21_11-request-for-permission

7. Domo helpfully uses OAuth for this part

2017-01-02-20_21_26-write500-net-domo

8. My account is now linked, and can be selected for future app installs (or new datasets).

2017-01-02-20_21_49-write500-net-domo

9. Pick the view to get data for – just one for now.

2017-01-02-20_26_05-google-analytics-quickstart-domo

10. And away we go!

And there we go. I’ve clicked my way to a dashboard.

2017-01-02-20_23_36-write500-net-domo

11. Charts! Really quite simple.

It takes a few minutes to do the initial pull (longer if there’s a lot of data), but eventually it’ll all populate. The datasets that power the dashboard will automatically refresh once per day. The next best thing to do is get the mobile app set up, so you can reach those dashboards on the go.

For now, I’m going to be putting a dashboard together to track the growth of write500.net – it’s a good sample to use, seeing as it’s a source I have full control of, and I am actually going to try identifying trends I can capitalize on to drive user growth.

If you want to be notified when I write those upcoming posts, use that little Subscribe widget, top right of the sidebar.