May 2015

I recently launched a new intranet for a Fortune 500 client focused on internal communications and marketing awareness.

They use Tracx to mine and aggregate insights from social networks like Facebook and Twitter. Tracx enables multiple views for slicing and dicing tweets and analyzing sentiments of social posts. Imagine being able to query across all of your brands, segmented by country, age group, or gender.

Our goal was pretty simple in this case. We just wanted to display the ten most recent social posts (Facebook or Twitter) attached to a portfolio of nearly 100 brands.

Tracx provides a REST API, but no SDKs or guidance for integrating client-side. Their API seems targeted more toward .NET, Java, and other server-side back ends.

Here’s how I accomplished Tracx integration via JavaScript.

Getting Started: Your Key, Secret, and Query

In order to access Tracx’s OData endpoints, you’ll need an application key and secret. Those are 32-character alphanumeric strings.

You can log into the Tracx API portal at https://api.tra.cx/. From there, find the appropriate query to run. It took a bit of trial and error before I found the following parameters to pass to the “/activity/posts” endpoint:

 map["filters"] = '{"campaignID":XXXX, "topicIDs":[YYYYY], "dataType": 2, "spotlightsMode":[3], "startDate":"' + prevDate + '", "endDate":"' + nextDate + '"}';
 map["limit"] = 10;
 map["offset"] = 0;
 map["orderBy"] = 0;
 map["key"] = ZZZZZZ;

Calculating HMACs and OAuth Signatures

The Tracx API uses hash-based message authentication codes (HMAC) for security. To compute SHA-1 HMACs and help assemble OAuth requests, we can rely on two scripts created by Netflix and hosted on Google Code: oauth.js and sha1.js.

When we assemble the parameters to pass in, we have to order, encode, and sign them in a precise way.
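
For context, here’s roughly what that signing step looks like once those two scripts are loaded. This is a minimal sketch: the endpoint and parameter names match the full listing below, and the key and secret are placeholders.

 // oauth.js canonicalizes (sorts and percent-encodes) the parameters into an
 // OAuth "base string"; sha1.js then signs that string with HMAC-SHA1.
 var message = {
     method: "POST",
     action: "https://api.tra.cx/1/activity/posts",
     parameters: { key: "YOUR_API_KEY", limit: 10, offset: 0, orderBy: 0 }
 };
 var baseString = OAuth.SignatureMethod.getBaseString(message);
 // The trailing "&" appends an empty token secret, per the OAuth 1.0 signing rules.
 var signature = b64_hmac_sha1("YOUR_API_SECRET" + "&", baseString);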

Supporting IE8/9 and Providing a Shared Cache

In our case, we decided that making a web service call directly to Tracx was undesirable for two reasons:

  1. Tracx requires requests to be submitted using the “POST” verb, but older browsers, including IE8 and IE9, only reliably support “GET” requests via XmlHttpRequest.
  2. Tracx service calls have a cost, for both performance and licensing. Instead of having many users make dynamic calls simultaneously, we’d be better off having service calls cached for everybody for a period of time.

In order to accomplish both, we implemented my open-source .NET Proxy solution, hosted in Azure. .NET Proxy supports shared caching, configurable via the web.config. It also supports verb transformation: by passing an “httpmethod” parameter on the “GET” query string with the value “POST”, it proxies its outbound calls as “POST”s. That allows “GET”-only browsers to simulate “POST”s.
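
To make that convention concrete, here’s roughly what a proxied request looks like. This is a sketch: the proxy hostname is a placeholder, and the “httpmethod” and “httppostdata” parameters are the ones .NET Proxy expects, as used in the full listing below.

 // The target URL is the first query-string value; "httpmethod=POST" asks the
 // proxy to forward the call as a POST, and "httppostdata" carries the body.
 var target = "https://api.tra.cx/1/activity/posts";
 var body = "limit=10&offset=0"; // in practice, the signed Tracx parameters
 var proxied = "https://netproxy.mydomain.com/netproxy.aspx?" +
     encodeURIComponent(target) +
     "&httpmethod=POST&httppostdata=" + encodeURIComponent(body);
 $.get(proxied); // a plain GET that IE8/9 can handle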

The source code below assumes .NET Proxy is being used as an intermediary.

Tracx in JavaScript: Source Code

Here’s the complete source code for embedding Tracx tweets and social posts on your website.

Make sure to include the sha1.js and oauth.js dependencies referenced above.

The “LoadSocial” function loads the data asynchronously. Once loaded, the second function, “WriteTracx”, renders the posts into a DIV with ID “TracxContent”.

function LoadSocial() {
    if ($('#TracxContent').length === 0)
        return;

    var tracxApiKey = "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX";
    var tracxApiSecret = "YYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYY";
    var baseUrl = "https://api.tra.cx/1/activity/posts";

    // Build a one-week window of yyyy-mm-dd dates.
    var lastWeek = new Date();
    lastWeek.setDate(lastWeek.getDate() - 7);
    var prevDate = lastWeek.getFullYear() + '-' + Pad(lastWeek.getMonth() + 1, 2) + '-' + Pad(lastWeek.getDate(), 2);
    var tomorrow = new Date();
    tomorrow.setDate(tomorrow.getDate() + 1);
    var nextDate = tomorrow.getFullYear() + '-' + Pad(tomorrow.getMonth() + 1, 2) + '-' + Pad(tomorrow.getDate(), 2);

    var map = {};
    map["filters"] = '{"campaignID":XXXX, "topicIDs":[YYYY], "dataType": 2, "spotlightsMode":[3], "startDate":"' + prevDate + '", "endDate":"' + nextDate + '"}';
    map["limit"] = 10;
    map["offset"] = 0;
    map["orderBy"] = 0;
    map["key"] = tracxApiKey;

    var message = {};
    message["method"] = "POST";
    message["action"] = baseUrl;
    message["parameters"] = map;

    // Canonicalize the parameters into an OAuth base string, then sign it
    // with HMAC-SHA1 (the trailing "&" is the empty token secret).
    var baseString = OAuth.SignatureMethod.getBaseString(message);
    var signature = b64_hmac_sha1(tracxApiSecret + "&", baseString);

    var parameters = "filters=" + encodeURIComponent(map["filters"]) + "&key=" + map["key"] + "&limit=" + map["limit"] + "&offset=" + map["offset"] + "&orderBy=" + map["orderBy"];
    var proxyUrl = "https://netproxy.mydomain.com/netproxy.aspx?" + encodeURIComponent(baseUrl);

    // Send a GET, but tell the proxy to forward it as a POST. Because $.ajax
    // is asynchronous, retry (up to 3 attempts) from the callbacks rather
    // than from a loop, which would fire all the requests at once.
    var attempts = 0;

    function tryLoad() {
        attempts++;
        $.ajax({
            url: proxyUrl + "&httpmethod=POST&httppostdata=" + encodeURIComponent(parameters + "&signature=" + encodeURIComponent(signature) + "="),
            timeout: 60000,
            success: function (e) {
                if (!WriteTracx(e))
                    retryOrFail();
            },
            error: function () {
                retryOrFail();
            }
        });
    }

    function retryOrFail() {
        if (attempts < 3)
            tryLoad();
        else
            $('#TracxContent').html(Terms["unable to load social data"]);
    }

    tryLoad();
}

function WriteTracx(res) {
    // Tracx prefixes its JSON with "while(1);" (an anti-JSON-hijacking
    // guard); treat its absence as a bad response, otherwise strip and parse.
    if (res.indexOf("while(1);") === -1)
        return false;

    var result = jQuery.parseJSON(res.replace("while(1);", "")).data;
    if (result.length === 0)
        return false;

    var socialFeed = "<ol>";
    var monthNames = ["Jan", "Feb", "Mar", "Apr", "May", "Jun", "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"];

    for (var i = 0; i < result.length; i++) {
        var author = result[i].author.name;
        var authorID = result[i].author.id;
        var title = result[i].title;
        var url = result[i].externalUrl;
        var externalID = result[i].externalID;

        // Tracx returns "dd-mm-yyyy | hh:MM TT"; reorder it to "mm-dd-yyyy hh:MM TT".
        var postDate = result[i].publishedDate.replace(' |', '');
        var postDateParts = postDate.split(' ');
        var postDateParts2 = postDateParts[0].split('-');
        postDate = postDateParts2[1] + '-' + postDateParts2[0] + '-' + postDateParts2[2] + ' ' + postDateParts[1] + ' ' + postDateParts[2];

        var testDate = new Date(Date.parse(postDate));
        var postDateObject;
        try {
            postDateObject = parseDate(testDate, "mm-dd-yyyy hh:MM TT");
        }
        catch (e) {
            postDateObject = 'NaN';
        }

        // Fall back to parsing the raw string if the Date object failed.
        if (postDateObject == 'NaN') {
            try {
                postDateObject = parseDate(postDate, "mm-dd-yyyy hh:MM TT");
            } catch (e) {
                postDateObject = 'NaN';
            }
        }
        if (postDateObject == 'NaN')
            postDateObject = testDate; // last resort: use the raw Date

        var postDateFormatted = monthNames[postDateObject.getMonth()] + " " + postDateObject.getDate();

        var network = result[i].network.toLowerCase();
        switch (network) {
            case "twitter":
                // Linkify URLs and hashtags in the tweet text.
                var titleParts = title.split(' ');
                for (var j = 0; j < titleParts.length; j++) {
                    if (titleParts[j].toLowerCase().substr(0, 4) == "http")
                        titleParts[j] = "<a href=\"" + titleParts[j] + "\" target='_blank'>" + titleParts[j] + "</a>";
                    else if (titleParts[j].substr(0, 1) == "#")
                        titleParts[j] = "<a href=\"https://twitter.com/hashtag/" + titleParts[j].substr(1) + "\" target='_blank'>" + titleParts[j] + "</a>";
                }
                title = titleParts.join(' ');

                var socialPost = "\t\t\t<li class='h-entry tweet with-expansion customisable-border' data-tweet-id='" + externalID + "'>\r\n" +
                    "\t\t\t\t<div class='header'>\r\n" +
                    "\t\t\t\t\t\t<div class='h-card p-author'>\r\n" +
                    "\t\t\t\t\t\t\t<a class='u-url profile' href='https://twitter.com/" + author + "' target='_blank'>\r\n" +
                    "\t\t\t\t\t\t\t\t<img class='twitter-pic' src='/images/" + network + "-" + author + ".png'>\r\n" +
                    "\t\t\t\t\t\t\t\t<span class='full-name'>\r\n" +
                    "\t\t\t\t\t\t\t\t\t<span class='p-name customisable-highlight'>" + author + "</span>\r\n" +
                    "\t\t\t\t\t\t\t\t</span>\r\n" +
                    "\t\t\t\t\t\t\t</a>\r\n" +
                    "\t\t\t\t\t\t\t<span class='p-nickname' dir='ltr'>@" + author + "</span>\r\n" +
                    "\t\t\t\t\t\t\t<span class='u-floatLeft'>&nbsp;·&nbsp;</span>\r\n" +
                    "\t\t\t\t\t\t\t<a class='twitter-timestamp' href='" + url + "' target='_blank'><time pubdate='" + postDate + "' class='dt-updated' datetime='" + postDateFormatted + "' title='Time posted: " + postDateFormatted + "'>" + postDateFormatted + "</time></a>\r\n" +
                    "\t\t\t\t\t\t</div>\r\n" +
                    "\t\t\t\t\t</div>\r\n" +
                    "\t\t\t\t\t<div class='twitter-content'>\r\n" +
                    "\t\t\t\t\t\t<p class='twitter-c'>" + title + "</p>\r\n" +
                    "\t\t\t\t\t\t<div class='detail-expander'></div>\r\n" +
                    "\t\t\t\t\t</div>\r\n" +
                    "\t\t\t\t\t<div class='footer customisable-border'>\r\n" +
                    "\t\t\t\t\t<img src='/images/twitter.png'>\r\n" +
                    "\t\t\t\t\t<ul class='tweet-actions' role='menu' aria-label='Tweet actions'>\r\n" +
                    "\t\t\t\t\t\t<li><a href='#' onclick=\"TwitterReply('" + externalID + "','" + authorID + "')\" class='reply-action web-intent' title='Reply'><i class='ic-reply ic-mask'></i><b>" + Terms['Reply'] + "</b></a></li>\r\n" +
                    "\t\t\t\t\t\t<li><a href='#' onclick=\"TwitterRetweet('" + externalID + "')\" class='retweet-action web-intent' title='Retweet'><i class='ic-retweet ic-mask'></i><b>" + Terms['Retweet'] + "</b></a></li>\r\n" +
                    "\t\t\t\t\t\t<li><a href='#' onclick=\"TwitterFavorite('" + externalID + "')\" class='favorite-action web-intent' title='Favorite'><i class='ic-fav ic-mask'></i><b>" + Terms['Favorite'] + "</b></a></li>\r\n" +
                    "\t\t\t\t\t</ul>\r\n" +
                    "\t\t\t\t</div>\r\n" +
                    "\t\t\t</li>";

                socialFeed += socialPost;
                break;

            case "facebookpage":
                // Same linkification, but point hashtags at Facebook.
                var titleParts = title.split(' ');
                for (var j = 0; j < titleParts.length; j++) {
                    if (titleParts[j].toLowerCase().substr(0, 4) == "http")
                        titleParts[j] = "<a href=\"" + titleParts[j] + "\" target='_blank'>" + titleParts[j] + "</a>";
                    else if (titleParts[j].substr(0, 1) == "#")
                        titleParts[j] = "<a href=\"https://www.facebook.com/hashtag/" + titleParts[j].substr(1) + "\" target='_blank'>" + titleParts[j] + "</a>";
                }
                title = titleParts.join(' ');

                // Assumes the page's Facebook vanity name matches the author name.
                var socialPost = "\t\t\t<li class='h-entry tweet with-expansion customisable-border' data-tweet-id='" + externalID + "'>\r\n" +
                    "\t\t\t\t<div class='header'>\r\n" +
                    "\t\t\t\t\t\t<div class='h-card p-author'>\r\n" +
                    "\t\t\t\t\t\t\t<a class='u-url profile' href='https://www.facebook.com/" + author + "' target='_blank'>\r\n" +
                    "\t\t\t\t\t\t\t\t<img class='twitter-pic' src='/images/facebook-" + author + ".png'>\r\n" +
                    "\t\t\t\t\t\t\t\t<span class='full-name'>\r\n" +
                    "\t\t\t\t\t\t\t\t\t<span class='p-name customisable-highlight'>" + author + "</span>\r\n" +
                    "\t\t\t\t\t\t\t\t</span>\r\n" +
                    "\t\t\t\t\t\t\t</a>\r\n" +
                    "\t\t\t\t\t\t\t<span class='p-nickname' dir='ltr'>@" + author + "</span>\r\n" +
                    "\t\t\t\t\t\t\t<span class='u-floatLeft'>&nbsp;·&nbsp;</span>\r\n" +
                    "\t\t\t\t\t\t\t<a class='twitter-timestamp' href='" + url + "' target='_blank'><time pubdate='" + postDate + "' class='dt-updated' datetime='" + postDateFormatted + "' title='Time posted: " + postDateFormatted + "'>" + postDateFormatted + "</time></a>\r\n" +
                    "\t\t\t\t\t\t</div>\r\n" +
                    "\t\t\t\t\t</div>\r\n" +
                    "\t\t\t\t\t<div class='twitter-content'>\r\n" +
                    "\t\t\t\t\t\t<p class='twitter-c'>" + title + "</p>\r\n" +
                    "\t\t\t\t\t\t<div class='detail-expander'></div>\r\n" +
                    "\t\t\t\t\t</div>\r\n" +
                    "\t\t\t\t\t<div class='footer customisable-border'>\r\n" +
                    "\t\t\t\t\t<img src='/images/facebook.png'>\r\n" +
                    "\t\t\t\t\t<ul class='tweet-actions' role='menu' aria-label='Tweet actions'>\r\n" +
                    "\t\t\t\t\t\t<li><a href='#' onclick=\"FacebookShare('" + rfc3986EncodeURIComponent(url) + "', '', '" + rfc3986EncodeURIComponent(url) + "', '" + rfc3986EncodeURIComponent(title) + "')\" class='reply-action web-intent' title='Share'><i class='ic-reply ic-mask'></i><b>" + Terms["Share"] + "</b></a></li>\r\n" +
                    "\t\t\t\t\t</ul>\r\n" +
                    "\t\t\t\t</div>\r\n" +
                    "\t\t\t</li>";

                socialFeed += socialPost;
                break;
        }
    }

    socialFeed += "</ol>";
    $('#TracxContent').html(socialFeed);
    return true;
}
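
A note on dependencies: besides sha1.js and oauth.js, the listing calls a few helpers it doesn’t define. Here are minimal versions of “Pad” and “rfc3986EncodeURIComponent” as I’d sketch them; “parseDate” and the “Terms” localization map come from elsewhere in the site and aren’t shown.

// Zero-pads a number to the given width, e.g. Pad(4, 2) returns "04".
function Pad(num, size) {
    var s = String(num);
    while (s.length < size)
        s = "0" + s;
    return s;
}

// encodeURIComponent leaves !'()* unescaped even though RFC 3986 reserves
// them, so escape those characters manually for the Facebook share URL.
function rfc3986EncodeURIComponent(str) {
    return encodeURIComponent(str).replace(/[!'()*]/g, function (c) {
        return "%" + c.charCodeAt(0).toString(16).toUpperCase();
    });
}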

Ever since I saw Michael Teeuw’s Magic Mirror project, I’ve wanted to make my own.

The idea is simple: place a monitor behind a one-way mirror and write software to drive a personal Heads-Up Display. Maybe place it in your bathroom to view the weather and your schedule while getting ready in the morning. Maybe place it in your foyer to read and share notifications when getting home.

To get started, I wanted to keep the budget and hardware to a minimum. I might eventually go all-out and embed a full monitor and PC, but I preferred to start small.

First Attempt: Mirrored Window Film

Besides the monitor and PC, the biggest cost for this project comes from the one-way mirror itself. When I first looked in 2014, I couldn’t find custom-sized one-way mirrors online. The best options I found were >$250 each.

Naively, I figured that I could simulate a mirror by using a mirrored window film. You know, the cling-wrap some people use for privacy on their first-floor windows. So I ordered the window film and application spray, found a squeegee, and went to work.

It went horribly.

First, it was difficult to cut straight lines to get it to the right size. Worse, window films are hard to apply perfectly on large surfaces. They work fine in small window panes, but the larger the application, the more bubbles and creases tend to appear. If you mis-apply and have to lift a wet section, it can fold or stick to itself. After that, it never looks good again.

Toni and I eventually got it applied, put it in the frame, and tried placing a screen behind it. It looked terrible: cheap, too opaque, and just generally distorted. Oh well, that idea’s down the drain.

Current Attempt: One-Way Acrylic

I was fortunate to see this post in my Reddit stream a few weeks ago. User “olipayne” pointed out the precisely cut, custom-order acrylic one-way mirrors from TAP Plastics. A little pricey at ~$150, but worth trying in my opinion.

I picked out my picture frame, measured it twice, and ordered. Within a week, the most important part of my magic mirror arrived and fit the frame perfectly.

An important early decision was to skip a full monitor. Very few monitors are flat enough to fit behind a frame, and even fewer have their HDMI ports mounted in the right orientation. I wasn’t ready to try taking a monitor apart. And even if I could get one to fit, I didn’t want to dedicate a PC to this display.

(Un)fortunately, I had an unused Surface RT that I could dedicate to the project. On a side note, it’s amazing how quickly the RT became obsolete. While a full-size display would be ideal, I figured I could fit some pertinent KPIs in a tablet-sized corner of the mirror. I’d turn it on when needed and unplug it, letting it hibernate when away.

I wanted to keep the frame’s backing, so I had to cut a Surface-sized opening for it. Those boards are surprisingly thick! Thankfully, my Dremel tool made short work of it.

Before I knew it, I had the mirror in the frame and the Surface fit perfectly. Turn on the screen, put it on the wall, and… Whoa, the mirror looks warped. Is this a circus mirror?

While I’m very happy with the one-way acrylic from TAP Plastics, it did arrive slightly curved. And it turns out the mirror and frame backing aren’t as thick as most framed art, so there was plenty of wiggle room allowing the acrylic to flex. We placed cardboard between the backing and the mirror (cut from the shipping materials), which made the whole display sturdier, but it still flexed outward.

That left an unappealing convex bubble in the middle, distorting reflections from the center outward. That’s as far as I’ve gotten from a hardware standpoint.

How’s It Look?

Could be much better, but you can see the concept here.

[Image: Magic-Mirror-Full]
The photo above was taken mid-day, when the glare makes white text hard to read. You can notice the distortion right away. We had friends over for a barbecue Saturday and everyone quipped about the “funhouse mirror”.

[Image: Magic-Mirror-Corner]
Here’s a zoomed-in glimpse of the Surface display at the bottom corner. As you can see, the display blends in fairly well. It’s pretty rare that the background glow is noticeable. While a bit small, the Surface RT works great as a backing display.

The Software

I haven’t spent much time on the dashboard UI for the display. I set up IE 11 on the Surface to launch full-screen by default and set its homepage to my custom ASP.NET MVC site. For weather, I make a JavaScript call to OpenWeatherMap’s REST API. Music is pulled through a .NET SOAP wrapper for Sonos that I wrote a few years ago. That portion is creaky and due for an update.
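
For reference, the weather lookup is about this simple. This sketch uses OpenWeatherMap’s public current-weather endpoint; the city and element IDs are placeholders standing in for my markup.

// Fetch current conditions and drop them into the dashboard.
$.getJSON("http://api.openweathermap.org/data/2.5/weather", {
    q: "Chicago,US",      // city to display (placeholder)
    units: "imperial",
    appid: "YOUR_API_KEY" // your OpenWeatherMap key
}, function (data) {
    $("#temperature").text(Math.round(data.main.temp) + "°F");
    $("#conditions").text(data.weather[0].description);
});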

Next Steps

If this is going to stick around, we’ll need to eliminate the mirror bowing. I’m planning to take out the cardboard layer and try putting the frame’s glass pane back in front of the acrylic, constraining the mirror the way most framed art is held. I’m concerned it will still be distorted, but it can’t get much worse.

Frankly, I’m not sure this looks good enough to be on display in the foyer. Maybe I’ll continue once it’s moved to the basement, although it’s far less useful there. It doesn’t look great to have a frame with a random wall wart and power cord dangling below it.

To make this more practical, the display should only turn on when needed. To accomplish that, I could set up a geofence or beacons so that my phone turns it on or off. The “RT” can’t run universal apps, but other tablets would allow native code to add more control. Maybe I’d set it up so the camera monitors the room and turns the screen on only during motion. Maybe a Leap Motion or Kinect could be configured for “Minority Report”-style swiping control. It’d be pretty fun to swipe left or right to change the song and swipe up or down to adjust volume.

From a software standpoint, I’d like to add my schedule using Exchange Web Services and pull my current Fitbit step count using their API.

Right now, everything’s too tightly coupled. I may start an open source project for personal dashboards and make it easy to enroll personalized data sources using OAuth. We’ll see.


While migrating my old websites to Azure, I decided to retire my old vBulletin forums. Instead of maintaining an unpatched application with PHP and MySQL dependencies, I preferred to migrate to static HTML files.

To export the site, I relied on the handy GNU “wget” utility. wget is pre-installed on most Linux distros and is available for Windows.

Here’s the command:

wget -r http://path.to.vbulletin/ --convert-links --html-extension --page-requisites --accept '*.html,showthread.php*,forumdisplay.php*,member.php,images*,*.gif,*.png,*.jpg,*.jpeg,*.css,*.js,clientscript*' --reject '*?p=*,*goto=*,*sort=*,*order=*,*daysprune=*'

vBulletin normally has many redundant URLs that lead to the same content. These wget parameters grab only the relevant content (forum indexes, threads, images, stylesheets, and scripts) while ignoring problematic query strings.

The only downside of archiving vBulletin to HTML is that it takes more storage. Instead of relying on efficient MySQL storage and dynamic page generation, we have duplicate markup in each file. For me, that tradeoff is well worth it.

When doing test migrations or otherwise coordinating multiple environments, it’s sometimes useful to create a list of files modified since a cutoff date.

PowerShell makes it easy:

Add-PSSnapin Microsoft.SharePoint.PowerShell

# Only return items modified after the cutoff date.
$dateCutoff = "2015-01-01"
$spQuery = New-Object Microsoft.SharePoint.SPQuery
$spQuery.Query = "<Where>
    <Gt>
        <FieldRef Name='Modified' />
        <Value IncludeTimeValue='TRUE' Type='DateTime'>$dateCutoff</Value>
    </Gt>
</Where>"
# Fetch only each item's URL and modified date.
$spQuery.ViewFields = "<FieldRef Name='EncodedAbsUrl' /><FieldRef Name='Modified' />"
$spQuery.ViewFieldsOnly = $true

# Walk every list in every web and print each match as "url (modified date)".
Get-SPWebApplication | Get-SPSite | Get-SPWeb | ForEach-Object {
    ForEach ($list in $_.Lists) {
        $listItems = $list.GetItems($spQuery)
        $listItems | ForEach-Object { $_['EncodedAbsUrl'] + " (" + $_['Modified'] + ")" }
    }
}

WordPress Multisite, Project Nami, Azure, and CloudFlare

Over the weekend, I migrated all of my WordPress sites to Azure websites. I had three main goals:

  1. Simplify the underlying technology stack
  2. Centralize site management
  3. Optimize security and performance

Thanks to WordPress Multisite, Project Nami, Azure, and CloudFlare, mission accomplished!

Before

  • IIS with ARR as a reverse proxy
  • WordPress
  • Apache, configured with PHP and mod_rewrite
  • MySQL
  • Patching, backups, and dependency management for all of the above

Now

  • Azure websites
  • WordPress Multisite with Project Nami
  • Azure SQL
  • CloudFlare

About WordPress Multisite

When I started with WordPress, I had to create Apache bindings and individual MySQL databases for each site. Every time I set up a new website, I had to reacquaint myself with the process of installing WordPress. Plugins, themes, and the like had to be managed independently.

WordPress Multisite was introduced in version 3.0, allowing for multiple WordPress sites to be managed within one installation. That means one HTTP(S) binding, one database, and unified upgrades.

I have to say, this works even better than I hoped. It’s not only easier to manage code assets; it also lets you share posts and media easily across sites in the “network”. I haven’t found any downsides yet.

About Project Nami

Project Nami is the other key that’s made my website management easier. Although I’ve been working with Linux on and off since the ’90s, I spend most of my time in Windows, so LAMP management always took longer than I’d like. And I know that WordPress with PHP and MySQL on Windows is possible, but it’s always seemed like an uphill battle.

Thanks to the generosity of project creators Spencer and Patrick (no surnames; apparently they’re not in it for the fame), Project Nami adapts all of the database logic in WordPress to work with Azure SQL, which is no small feat. That removes all dependencies on MySQL. It’s available as a ZIP archive containing all WordPress files, patched to work with Azure SQL. They’re also very fast with updates: when WordPress rushed out version 4.2.1 with security fixes, Project Nami was updated within a day.

It’s worth noting that there are MySQL hosting options in Azure. The most popular is a third-party offering from ClearDB, which is available as a PaaS linked resource. It’s a decent option, but not as affordable or as well integrated as Azure SQL. For simplicity, I preferred to stay with the Microsoft stack.

About Azure

Azure needs the least introduction. In short, I use Azure websites, Azure SQL, and Azure storage (to manage backups).

About CloudFlare

CloudFlare is an amazing content delivery network. By routing all of your traffic through its network, CloudFlare “supercharges” your site with the following free benefits:

  1. DNS (including CNAME flattening)
  2. Content delivery and caching
  3. Flexible SSL with SNI
  4. Firewall and preprocessing rules
  5. Traffic analytics

Using CloudFlare, I’ve been able to add SSL to each site, use “page rules” to enforce HTTPS, and improve performance through aggressive caching.

Guide: WordPress Multisite on Azure

Here are the main steps I took:

  1. Deploy Azure SQL database
  2. Set up Azure website with PHP enabled
    1. Add Azure SQL database connection string to simplify backups
  3. Install Project Nami to Azure website
  4. Run the famous WordPress 5-minute install
    1. Enable multisite with the “WP_ALLOW_MULTISITE” constant
    2. Set up the seven web.config rewrite rules on this page
  5. Create a network
    1. Install WordPress MU Domain Mapping plugin
  6. Create each site and map domains for each
  7. Export each site’s contents under “Tools” > “Export”, then import using “Tools” > “Import”
    1. You may need to increase the “upload_max_filesize” and “post_max_size” settings using a “.user.ini” file
    2. Even with larger upload limits, you may need to run the import multiple times, especially if you have a lot of media
  8. Test everything on the new site
  9. Move DNS
    1. In CloudFlare, create awverify CNAME records matching the domain
    2. In Azure, add the domains
    3. In CloudFlare, create CNAME records pointing at your *.azurewebsites.net URL
  10. Once everything’s working, back up the website and database through the Azure portal
    1. Optionally, enable daily backups with 30 days retention

Again, I’m really happy with the results. Each site is now faster, encrypted, more reliable, and easier to manage.