Ever since I saw Michael Teeuw’s Magic Mirror project, I’ve wanted to make my own.

The idea is simple: place a monitor behind a one-way mirror and write software to drive a personal Heads-Up Display. Maybe place it in your bathroom to view the weather and your schedule while getting ready in the morning. Maybe place it in your foyer to read and share notifications when getting home.

To get started, I wanted to keep the budget and hardware to a minimum. I might eventually go all-out and embed a full monitor and PC, but I preferred to start small.

First Attempt: Mirrored Window Film

Besides the monitor and PC, the biggest cost for this project comes from the one-way mirror itself. When first looking in 2014, I couldn’t find custom-sized one-way mirrors online. The best options I found were >$250 each.

Naively, I figured that I could simulate a mirror by using a mirrored window film. You know, the cling-wrap some people use for privacy on their first-floor windows. So I ordered the window film and application spray, found a squeegee, and went to work.

It went horribly.

First, it was difficult to cut straight lines to get it to the right size. Worse, window films are hard to apply perfectly on large surfaces. They work fine in small window panes, but the larger the application, the more bubbles and creases tend to appear. If you mis-apply and have to lift a wet section, it can fold or stick to itself. After that, it never looks good again.

Toni and I eventually got it applied, put it in the frame, and tried putting a screen behind it. It looked terrible: cheap, too opaque, and just overall distorted. Oh well, that idea’s down the drain.

Current Attempt: One-Way Acrylic

I was fortunate to see this post in my Reddit stream a few weeks ago. User “olipayne” pointed out the precisely cut, custom-order acrylic one-way mirrors from TAP Plastics. A little pricey at ~$150, but worth trying in my opinion.

I picked out my picture frame, measured it twice, and ordered.  Within a week, the most important part of my magic mirror arrived and fit the frame perfectly.

An important decision to get started with my mirror was to skip a full monitor. Very few monitors are flat enough to fit behind a frame and even fewer have the HDMI ports mounted in the right orientation. I wasn’t ready to try taking a monitor apart. And even if I could get it to fit, I didn’t want to have a PC dedicated to this display.

(Un)fortunately, I had an unused Surface RT that I could dedicate to the project. On a side note, it’s amazing how quickly the RT became obsolete. While a full-size display would be ideal, I figured I could fit some pertinent KPIs in the tablet-sized corner of the mirror. I’d turn it on when needed and unplug it, letting it hibernate while we’re away.

I wanted to keep the frame’s backing, so I had to cut a Surface-sized opening for it. Those boards are surprisingly thick! Thankfully, my Dremel tool made short work of it.

Before I knew it, I had the mirror in the frame and the Surface fit perfectly. Turn on the screen, put it on the wall, and… Whoa, the mirror looks warped. Is this a circus mirror?

While I’m very happy with the one-way acrylic from TAP Plastics, it did arrive slightly curved. And it turns out that the mirror and frame backing aren’t as thick as most framed art, so there was plenty of wiggle room allowing the acrylic to flex. We placed cardboard between the backing and the mirror (cut from the shipping materials), which made the whole display sturdier, but it still flexed outward.

That flexing has led to an unappealing convex bubble in the middle, distorting reflections outward from the center. That’s as far as I’ve gotten from a hardware standpoint.

How’s It Look?

Could be much better, but you can see the concept here.

The photo above was taken mid-day, when the glare makes white text hard to read. You can notice the distortion right away. We had friends over for a barbecue Saturday and everyone quipped about the “funhouse mirror”.

Here’s a zoomed-in glimpse of the Surface display at the bottom corner. As you can see, the display blends in fairly well. It’s pretty rare that the background glow is noticeable. While a bit small, the Surface RT works great as a backing display.

The Software

I haven’t spent much time on the dashboard UI for the display. I set up IE 11 on the Surface to launch in full-screen by default and set its homepage to my custom ASP.NET MVC site. For weather, I make a JavaScript call to OpenWeatherMap’s REST API. Music is pulled through a .NET SOAP wrapper for Sonos I wrote a few years ago. That portion is creaky and due for an update.
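
The weather call is simple enough to sketch. Here’s roughly the shape of it (the API key, city, and element ID below are placeholders, not my actual values):

```javascript
// Sketch of the dashboard's weather call. "YOUR_API_KEY" and the
// "weather" element ID are hypothetical placeholders.

// Build the OpenWeatherMap request URL for a city.
// "units=imperial" makes the API return Fahrenheit instead of Kelvin.
function weatherUrl(city, apiKey) {
  return "http://api.openweathermap.org/data/2.5/weather" +
    "?q=" + encodeURIComponent(city) +
    "&units=imperial&appid=" + apiKey;
}

// Reduce the API's JSON response to the two fields the mirror shows.
function summarize(response) {
  return {
    temp: Math.round(response.main.temp),
    conditions: response.weather[0].main
  };
}

// In the page, something along these lines:
// $.getJSON(weatherUrl("Chicago", "YOUR_API_KEY"), function (data) {
//   var w = summarize(data);
//   document.getElementById("weather").textContent =
//     w.temp + "\u00B0 " + w.conditions;
// });
```

Keeping the URL-building and response-parsing in small functions makes it easy to swap providers later if OpenWeatherMap doesn’t work out.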

Next Steps

If this is going to stick around, we’ll need to eliminate the mirror bowing. I’m planning to take out the cardboard layer and try putting the frame’s glass pane back against the acrylic, constraining the mirror like most framed artwork. I’m concerned it will still be distorted, but it can’t get much worse.

Frankly, I’m not sure this looks good enough to stay on display in the foyer. Maybe I’ll keep tinkering after it’s moved to the basement, although it’s far less useful there. It doesn’t look great to have a frame with a random wall wart and power cord dangling below.

To make this more practical, the display should only turn on when needed. To accomplish that, I could set up a geofence or beacons so that my phone turns it on or off. The “RT” can’t run universal apps, but other tablets would allow native code to add more control. Maybe I’d set it up so the camera monitors the room and turns the screen on only when it detects motion. Maybe a Leap Motion or Kinect could be configured for “Minority Report”-style swiping control. It’d be pretty fun to swipe left or right to change the song and swipe up or down to adjust volume.

From a software standpoint, I’d like to add my schedule using Exchange Web Services and pull my current Fitbit step count using their API.

Right now, everything’s too tightly coupled. I may start an open source project for personal dashboards and make it easy to enroll personalized data sources using OAuth. We’ll see.


Retiring vBulletin with Static HTML

While migrating my old websites to Azure, I decided to retire my old vBulletin forums. Instead of maintaining an unpatched application with PHP and MySQL dependencies, I preferred to migrate to static HTML files.

To export the site, I relied on the handy GNU “wget” utility. wget is pre-installed on most Linux distros and is available for Windows.

Here’s the command:

wget -r http://path.to.vbulletin/ --convert-links --html-extension --page-requisites --accept '*.html,showthread.php*,forumdisplay.php*,member.php,images*,*.gif,*.png,*.jpg,*.jpeg,*.css,*.js,clientscript*' --reject "*?p=*,*goto=*,*sort=*,*order=*,*daysprune=*"

vBulletin normally has many redundant URLs that lead to the same content. These wget parameters grab only the relevant content (forum indexes, threads, images, stylesheets, and scripts) while ignoring problematic query strings.

The only downside in archiving vBulletin to HTML is that it takes more storage. Instead of relying on efficient MySQL storage and dynamic page generation, we have duplicate markup in each file. For me, that tradeoff is well worth it.

Listing Recently Modified SharePoint Files with PowerShell

When doing test migrations or otherwise coordinating multiple environments, it’s sometimes useful to create a list of SharePoint files modified since a cutoff date.

PowerShell makes it easy:

Add-PSSnapin Microsoft.SharePoint.PowerShell

$dateCutoff = "2015-01-01"
$spQuery = New-Object Microsoft.SharePoint.SPQuery
$spQuery.Query = "<Where>
 <Geq>
  <FieldRef Name='Modified' />
  <Value IncludeTimeValue='TRUE' Type='DateTime'>" + $dateCutoff + "</Value>
 </Geq>
</Where>"
$spQuery.ViewFields = "<FieldRef Name='EncodedAbsUrl' /><FieldRef Name='Modified' />"
$spQuery.ViewFieldsOnly = $true

Get-SPWebApplication | Get-SPSite | Get-SPWeb | ForEach-Object {
    ForEach ($list in $_.Lists) {
        $listItems = $list.GetItems($spQuery)
        $listItems | ForEach-Object { $_['EncodedAbsUrl'] + " (" + $_['Modified'] + ")" }
    }
}

WordPress Multisite, Project Nami, Azure, and CloudFlare

Over the weekend, I migrated all of my WordPress sites to Azure websites.  I had three main goals:

  1. Simplify the underlying technology stack
  2. Centralize site management
  3. Optimize security and performance

Thanks to WordPress Multisite, Project Nami, Azure, and CloudFlare, mission accomplished!

Before
  • IIS with ARR as a reverse proxy
  • WordPress
  • Apache, configured with PHP and mod_rewrite
  • MySQL
  • Patching, backups, and dependency management for all of the above

Now
  • Azure websites
  • WordPress Multisite with Project Nami
  • Azure SQL
  • CloudFlare

About WordPress Multisite

When I started with WordPress, I had to create Apache bindings and individual MySQL databases for each site. Every time I set up a new website, I had to reacquaint myself with the process of installing WordPress. Plugins, themes, etc. had to be managed independently.

WordPress Multisite was introduced in version 3.0, allowing for multiple WordPress sites to be managed within one installation. That means one HTTP(S) binding, one database, and unified upgrades.

I have to say, this works even better than I hoped. It’s not only easier to manage code assets; it also allows you to share posts and media easily across sites in the “network”. I haven’t found any downsides yet.

About Project Nami

Project Nami is the other key that’s made my website management easier. Although I’ve been working with Linux on and off since the ’90s, I spend most of my time in Windows, so LAMP management always took longer than I’d like. And I know that WordPress with PHP and MySQL on Windows is possible, but it’s always seemed like an uphill battle.

Thanks to the generosity of project creators Spencer and Patrick (no surnames — apparently they’re not in it for the fame), Project Nami adapts all of the database logic in WordPress to work with Azure SQL, which is no small feat. That removes all dependencies on MySQL. It’s available as a ZIP archive containing all WordPress files patched to work with Azure SQL. They’re very fast with updates. When WordPress rushed out version 4.2.1 with security updates, Project Nami was updated within a day.

It’s worth noting that there are MySQL hosting options in Azure. The most popular is a third-party offering from ClearDB, which is available as a PaaS linked resource. It’s a decent option, but not as affordable or as well integrated as Azure SQL. For simplicity, I preferred to stay with the Microsoft stack.

About Azure

Azure needs the least introduction. In short, I use Azure Websites, Azure SQL, and Azure Storage (to manage backups).

About CloudFlare

CloudFlare is an amazing content delivery network. By routing your traffic through its network, CloudFlare “supercharges” your site with the following free benefits:

  1. DNS (including CNAME flattening)
  2. Content delivery and caching
  3. Flexible SSL with SNI
  4. Firewall and preprocessing rules
  5. Traffic analytics

Using CloudFlare, I’ve been able to add SSL to each site, use “page rules” to enforce HTTPS, and improve performance through aggressive caching.

Guide: WordPress Multisite on Azure

Here are the main steps I took:

  1. Deploy Azure SQL database
  2. Set up Azure website with PHP enabled
    1. Add Azure SQL database connection string to simplify backups
  3. Install Project Nami to Azure website
  4. Run the famous WordPress 5-minute install
    1. Enable multisite by defining the “WP_ALLOW_MULTISITE” constant in wp-config.php
    2. Set up the seven web.config rewrite rules on this page
  5. Create a network
    1. Install WordPress MU Domain Mapping plugin
  6. Create each site and map domains for each
  7. Export each site’s contents under “Tools” > “Export”, then import using “Tools” > “Import”
    1. You may need to increase the “upload_max_filesize” and “post_max_size” settings using a “.user.ini” file
    2. Even with larger upload limits, you may need to run the import multiple times, especially if you have a lot of media
  8. Test everything on the new site
  9. Move DNS
    1. In CloudFlare, create awverify CNAME records matching the domain
    2. In Azure, add the domains
    3. In CloudFlare, create CNAME records pointing at your *.azurewebsites.net URL
  10. Once everything’s working, back up the website and database through the Azure portal
    1. Optionally, enable daily backups with 30 days retention
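
A note on step 7: PHP’s default 2 MB upload limit is too small for most WordPress export files. A “.user.ini” file in the site root raises the limits; the values below are illustrative, so size them to your largest export:

```ini
; .user.ini in the web root; picked up automatically by PHP
upload_max_filesize = 64M
post_max_size = 64M
```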

Those are the main steps I took. Again, I’m really happy with the results. Each site is now faster, encrypted, more reliable, and easier to manage.

On-prem to cloud

I’ve been long overdue in moving my personal websites to the cloud. Although I’ve been helping clients with public and hybrid cloud deployments for years, it’s been a “cobbler’s kids” situation, leaving me running more than a dozen websites and applications from my home server.

At this point, there’s no compelling reason to maintain everything on-prem. The cloud has made more sense for years and now it’s time to move. Why?

  1. Cost: I had been running a 2U server with decent specs, powering a dozen VMs. The up-front cost was $X,XXX and I’m sure some of the RAID drives will need to be replaced soon. For the price of one SSD, I’ll be able to power all of the same workloads for months. Factoring hardware, software licenses, and electricity, the ROI is immense.
  2. Management: My biggest headaches have been patching, backups, and dependencies.
    1. Patching: Running a WSUS server streamlined most of my upgrade needs on Windows, but I had to maintain a Linux/MySQL VM due to all of our WordPress sites. I’m still pretty clumsy on the Linux side and every “apt-get update” seemed to lead to an extra hour of troubleshooting. I’m thrilled with Azure’s PaaS offerings and the ability to leave patching behind.
    2. Backups: Backing up SOHO VM hosts is still too difficult and expensive in my opinion. Over the years, I tried Unitrends, Veeam, Acronis, and several other disk-based backup programs with minimal luck. They required too much attention to address ephemeral failures and always seemed to lag, even using iSCSI and USB3. Cloud backup solutions were out of grasp at the time, although I’d go that route now based on costs and faster home internet speed.
    3. Dependencies: Running certain workloads requires an entire farm. For years, we’ve run an internal SharePoint installation, necessitating AD and SQL virtual machines. These were more for learning than anything, but they created a rat’s nest of interdependent VMs. There’s no reason to go that route instead of Office 365 or Azure IaaS at this point.
  3. Availability: We’ve been fortunate to avoid any major outages. There have been a couple days of downtime, which always seemed to line up with vacations. Those were usually tied to VM restarts or IP conflicts. That said, I’ve toured the Microsoft data centers and am very confident they’ll have better uptime than I could manage!
  4. Performance and Scalability: In the off-chance that a website really takes off, scaling is painless and can be automated. I’m already tuning the sites and taking advantage of free services like CloudFlare, but the public cloud options for optimizing performance and scalability are amazing.
  5. “The Wife Factor”: Toni has been very cool about everything and understands my obsession with tech. But at this point, it’s tough to justify the giant server rack and crates of swappable parts. I’m looking forward to a slightly quieter house without the constant whirring of fans and hard disks. We’ll also get to enjoy faster internet speeds without QoS prioritizing our websites.

I’ve already moved the first batch of websites and expect to migrate the rest over the next month.

Technically, I’ll still need a small machine to host my home automation VM. Downsizing from a 2U server to a silent Intel NUC? Not bad.
