
WordPress + CloudFront

It took a bit of wrangling, but I was able to get up and running with Amazon’s CloudFront today. It’s pretty sweet.

Why Bother?

My first goal was to enable TLS for this site. Let’s Encrypt is a great way to go, but there’s some setup involved, and even though the certificates themselves are free, there are still costs in time and upkeep.

CloudFront also supports HTTP/2, and is super fast and very configurable.

Requests Map

I’ve long loved not having www tacked onto the front of my URL. I get that it’s the standard; it just still feels a bit goofy. In this CDN world, though, there’s not much way around it.

As it stands now, WordPress still runs on the box behind http://rlaskey.org. Requests for rlaskey.org, though, get redirected to https://www.rlaskey.org: www is now a CNAME for CloudFront, which is set up to pull everything from http://origin.rlaskey.org. Requests for origin.rlaskey.org do not redirect.


To test all of this out, curl -I <URL> is your best friend. It is worth experimenting a lot to make sure everything is going to the right place before flipping any switches. Look for a Location header, which will tell you if you have redirects firing. Otherwise, requests can get cached by the browser fairly easily, so it’s a mess to figure out what’s actually active or not.
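
For example, this is roughly the shape of what you want to see (output trimmed to the interesting headers; your exact status codes may differ):

curl -I http://rlaskey.org/
HTTP/1.1 301 Moved Permanently
Location: https://www.rlaskey.org/

curl -I http://origin.rlaskey.org/
HTTP/1.1 200 OK

A redirect plus a Location header on the bare domain, and no Location header at all on origin, means the map above is wired up the way CloudFront expects.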

No Plugins: Just Config

I don’t particularly want to be editing my site through a CDN, so I connect to my instance directly, via a self-signed SSL certificate. With that, though, there are some tricks. Here’s where I landed, by editing my wp-config.php:

if (isset($_SERVER['SERVER_PORT']) && $_SERVER['SERVER_PORT'] === '443') {
    // direct connection to the instance over the self-signed certificate
    define('WP_HOME', 'https://rlaskey.org/words');
    define('WP_SITEURL', WP_HOME);
} else {
    // plain-HTTP request coming in from CloudFront: pretend we're on TLS anyway
    $_SERVER['HTTPS'] = 'on';
    define('WP_HOME', 'https://www.rlaskey.org/words');
    define('WP_SITEURL', WP_HOME);
}

The magic here is that these defines override your general home and site URL settings, no matter what is stored in the database: connect to the admin interface directly over HTTPS and you get the bare domain, while every other request gets the www / CloudFront version.


The silly part in the above snippet is where I set $_SERVER['HTTPS'] to be on exactly when it’s off. That sadly was not a typo. Why do we need it? Well, try it out, before hooking up your CDN: despite having an https URL for the home and site URL, links will point to an http version instead when you’re in your origin, which is what the CDN will ingest. The net result is that your CSS and everything else will point to http versions of the world, which nobody wants. Tell WordPress that you’re on TLS, though, and the links will line up.

Tweak CloudFront Settings

With the above setup, and some patience, you can walk through the CloudFront setup as their docs instruct. You can request a free SSL certificate through their Certificate Manager. You’ll also then need to list your domain under Alternate Domain Names, or you’ll get some weird failures.

Under Behaviors, you’ll need to allow POST, and make sure you enable forwarding of Query Strings; otherwise Infinite Scroll and Search, respectively, will break.
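
If you’d rather double-check those two settings from the command line, they show up in the distribution config that aws cloudfront get-distribution-config returns. The fragment below is only a sketch of the relevant cache behavior fields in the classic, pre-cache-policy shape, not a complete config, so compare it against whatever your console shows:

"AllowedMethods": {
    "Quantity": 7,
    "Items": ["GET", "HEAD", "OPTIONS", "PUT", "POST", "PATCH", "DELETE"]
},
"ForwardedValues": {
    "QueryString": true
}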

At least until you are comfortable, bring the Max and Default TTLs way, way down. WordPress won’t send Cache-Control headers for its PHP responses, meaning they get the Default TTL, which starts off at 24 hours. You can invalidate everything, but that stops being free after a point.
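
If you want WordPress to take charge of its own lifetimes instead, one option is to emit a Cache-Control header yourself. This is only a sketch, not pulled from my actual config; pick whatever max-age you trust:

// drop this in a tiny plugin or your theme's functions.php
add_action('send_headers', function () {
    if (!is_admin()) {
        // let CloudFront hold rendered pages for five minutes
        header('Cache-Control: public, max-age=300');
    }
});

With something like that in place the Default TTL matters a lot less, and aws cloudfront create-invalidation can still flush anything that slips through.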

Finally, your DNS will need to change, such that www or whatever you pick then heads to the CloudFront Domain Name.

Home Free

CloudFront can take a little while as you tweak everything. Saving the settings doesn’t mean they’re instantly live, so check General: Distribution Status. If you don’t see “Deployed”, keep waiting. DNS is also slower than dirt, generally, so any updates may take longer than you expect.

This path is also a complete one, in that PHP and everything else gets served through the CDN. I personally like that, though it might not be for everyone. Unless you set Cache-Control headers on your PHP responses, your site updates will lag by up to whatever you have as the Default TTL.

Feedback, PLS

Hopefully this all helps some people, though I do understand that even all of this is far from complete. If anyone has questions, please write a comment; I’m happy to dig in a little deeper, or to write more about any particular aspect.

Accidentally typed “gut” instead of “git”. Felt right, though.

Dynamic, Responsive Embeds w/ CSS

This past winter, I was introduced to a concept which was actually developed back in 2009. Namely, if you have video content on your website, and you want a responsive design, there’s a very clever trick which requires a wrapper div and a bit of CSS.

Just now, I was able to set this up for my own site, with an extra twist: for each iframe embed I have on my page, I can loop through and replace it with a wrapped version. I can also then find the original aspect ratios, if available, and make sure that persists, regardless of the width of the page / elements. That’s enough words, so here’s the code:
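
The Gist itself isn’t pasted into this post, but a minimal sketch of the idea looks like the following; the responsive-embed class name and the 16:9 fallback are just placeholders, and the real thing has a few more niceties:

/* CSS: the wrapper carries the aspect ratio via padding-bottom, the iframe fills it */
.responsive-embed { position: relative; height: 0; overflow: hidden; }
.responsive-embed iframe { position: absolute; top: 0; left: 0; width: 100%; height: 100%; }

// JS: wrap each iframe and copy its aspect ratio onto the wrapper
function wrapEmbeds() {
    var frames = document.querySelectorAll('iframe');
    for (var i = 0; i < frames.length; i += 1) {
        var frame = frames[i];
        // skip anything already wrapped (matters when this runs more than once)
        if (frame.parentNode.className.indexOf('responsive-embed') !== -1) { continue; }

        var width = parseInt(frame.getAttribute('width'), 10);
        var height = parseInt(frame.getAttribute('height'), 10);
        // fall back to 16:9 when the embed doesn't advertise its size
        var ratio = (width > 0 && height > 0) ? height / width : 9 / 16;

        var wrapper = document.createElement('div');
        wrapper.className = 'responsive-embed';
        wrapper.style.paddingBottom = (ratio * 100) + '%';

        frame.parentNode.insertBefore(wrapper, frame);
        wrapper.appendChild(frame);
    }
}
wrapEmbeds();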

The new thing here is that we can set the paddingBottom dynamically, via JS, with a fallback if needed, and the other CSS rules take care of the rest. If we wanted to depend on jQuery, this code could be even smaller, yet even as it is, we have a pretty compact solution.


Since I’m running WordPress here, I was also able to integrate this easily with the Jetpack Infinite Scroll module, by adding a callback on post-load. This means that after any new posts have been added, I wrap any new content there as well.
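
That hook-up is tiny. Jetpack triggers a post-load event on document.body after it appends new posts, so with the wrapping logic in a function like the wrapEmbeds sketch above, it’s roughly:

// re-wrap whenever Infinite Scroll pulls in more posts
jQuery(document.body).on('post-load', wrapEmbeds);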

I should also point out that it would be more efficient to have these wrappers in the HTML itself, rather than asking JS to make replacements on its own. The main reason I’m not doing that here is because WordPress graciously supports oEmbed, which means the HTML embed code is generated dynamically.

This way there’s a lot less for me to maintain. I can plop the URL into my post and forget about the rest. I can also clear out the cached versions, and on the next page view it’ll fetch that embed code again.

Seeking Feedback

I’m pretty happy with where this is at, and I’d like you to try it out. Fork what’s there and add or integrate whatever you’d like. If you have any questions, you can write me, post on the GitHub Gist, or comment here. Thanks!

grep for naming conventions

I’ve used grep for years now, but in the past few days I found a nice combination of it and a few other command-line utilities to audit how I’m naming methods and variables in the code I write.

-oh # that’s how it works

-o: a shortcut for --only-matching, this option won’t show the whole line, just the part that matches your search criteria
-h: a shortcut for --no-filename, this will hide the first part of the normal output, so you can look for trends across files more easily

Put these two together, and you can then pipe the output over many files into sort and uniq -c to get an alphabetized, counted list of every instance of the pattern you want to find. You can also then pipe that all into a final sort -n which will sort by the count, such that the biggest fish fall to the bottom.

Use case: camelCase vs. underscores

Putting this all together, you can get a general command something like this:

grep -oh "$PATTERN" $(find . -name "*${EXT}") \
	| sort | uniq -c | less

To find underscore_separated_words, use PATTERN='\w\+_\w\+'; for camelCase, let PATTERN='[[:lower:]]\+[[:upper:]]\w\+'.

Finally, set EXT='.php' to look over just your PHP files, or change it to any extension of the types of files you want to inspect. You can also use grep’s -r flag with a target directory if you want to recurse through all files, rather than limiting yourself to files with a particular extension.
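
Putting the flags, a pattern, and the final count-sort together, a camelCase audit of a PHP tree might look like this, using -r with --include in place of the find round-trip:

EXT='.php'
PATTERN='[[:lower:]]\+[[:upper:]]\w\+'
grep -ohr --include="*${EXT}" "$PATTERN" . \
	| sort | uniq -c | sort -n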

less: folding/chopping

I’ve been using less for years now, though today I learned a helpful new trick: you can toggle a command-line option from inside less by typing - and then the option. So, with -S, you can toggle “chopping” long lines on or off, the alternative being line folding, also known as line wrapping.

Further, any other command-line option can be changed this way. There’s not an awful lot here that warrants frequent changing, though -N does toggle line numbers, which can be handy. -G toggles search highlighting, and -i and -I toggle case sensitivity in searches; the latter extends even to patterns that contain uppercase letters.
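
As a tiny cheat sheet, with the file name standing in for whatever you happen to be reading: open the file as usual, then type the toggles at less’s own prompt rather than at the shell:

less -S some-huge.log

-S    toggle chopping vs. folding of long lines
-N    toggle line numbers
-G    toggle search highlighting
-i    toggle case-insensitive searches (reverts if the pattern has uppercase)
-I    toggle case-insensitive searches, even with uppercase in the pattern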

Child Context

so this fills in the gap of a missing child

Fragment of a commit message I just wrote. Sounds pretty dark out of context, which perhaps is a natural consequence of using the word “children” in entirely virtual spaces.

On a related note, “master” and “slave” are also pretty common computing paradigms. Might be about time to change all of that.

Death and Windows

Got into work today, and my computer was acting a bit funny. It took my password, then stared at me blankly. Flipping around to the console, I saw notes to the effect that while life is all well and good, my root partition was mounted read-only and I would not be able to do the great things I had wanted to do.

Digging a bit deeper, I found the error messages indicating the filesystem was not in a good place, which is why it had been remounted read-only, writing disabled.

Anyway, my primary hard drive was hosed. I was able to reboot, and an fsck got the system to a place where I could at least do enough to start putting my plan into motion.

Hello, Windows

I’ve been running Ubuntu for a couple of years now, and generally it’s been great. It’s free, it’s fast, and the support for both hardware and software is rather top notch. The hard drive failure actually had nothing to do with the operating system, though I did need to start from scratch.

Armed with a license for Windows 8.1, I took a dive back into the Microsoft world. I’ve been using it at home for more than a year now, though this step has further sealed it: Linux for servers, but not for my desktop.

Windows 8 is definitely a bit odd. I like it a lot, though. The paradigms are different, but they make sense to me. I understand they’re mashing together interfaces for a phone, a tablet, and a desktop. It takes a bit of training, but going back to Windows 7 really doesn’t feel quite as nice.

Smaller bites

Luckily, I’ve been rather good at keeping backups. I’ve had two spare drives, and had access to a third. Most of my actual work was contained in a Virtual Machine, too, which I’ve intentionally kept rather minimal.

After looking everything over, BitTorrent Sync got the bulk of my bytes back in order, giving me more than two extra copies of everything I care about, one close enough that a local wired connection quickly sorted out tens of gigabytes of restoring.

On a failing drive, it’s tempting to try and save every bit, though I’ve often found that getting what you need and getting out grants rather more success. Drives like to spiral when they’re already down, and what looks fine may not be hours later.

So, with the bulk of the transfer abstracted, I was able to concentrate on my 8GB virtual machine image. Trying to copy it out to one backup drive revealed that even this secondary was rather on its last legs, which at that point just fucking figured. It was a Monday, after all.

I got it, though, and thanked myself for paring down to 8GB rather than 16 or 32. Sure, space is cheap, but in these situations, with larger images, every byte counts towards a potential problem in re-assembling a consistent whole.

Git was definitely another big part of the success: everything that really mattered was not only on my own machine in two or three places, but on other machines throughout the network, all with the latest updates. If for no other reason than that, learning Git is worth its weight in gold.


There’s a part of me that regrets moving away from Linux this time around. Windows isn’t as open, or as free. It’s a funny landscape, though. Ubuntu seems to make the most suitable environment, yet it still gets heaps of criticism for not quite adhering to the Linux Way.

Microsoft, on the other hand, has been trying, but is largely perceived as the uncool, legacy monster. Hell, even desktop computers in general don’t have the hype or attraction that they did a few years ago.

For me, open source lives better in a slightly more adaptive landscape. It’s great for servers. It’s incredible for software development of most any kind.

In the end, I’m just glad it all mostly seems to work.