2014 OWS Race Season Recap

The 2014 OWS race season was a big one for me; I had set the goal of swimming the Semana Nautica 10K in July and knew that I needed a lot of miles in the bank! I swam most of the races that I could conveniently get to; here’s a roundup:

Nadadores 2.4 Mile Rough Water Swim – May 17th

Boy, do the fasties show up for races in Orange County; my 1:01 got me last place in my age group!

Salt Creek Beach in Dana Point is a stunning setting for a race. The water was 62 degrees, and the visibility was great. Very well-organized race, with a lot of fast swimmers (winning time was 44 minutes). I highly recommend this early-season race.

Event website: http://www.nadadoresroughwaterswim.org/w/

Garmin: http://connect.garmin.com/modern/activity/501380107

Salt Creek Beach

Santa Monica Pier Paddle – June 7th

A great local race; almost every swimmer friend I know does this. We got lucky with conditions this year. Last year, 12 foot surf wreaked havoc on the women’s wave. No such drama this year.

It’s only a mile. I find that if I try to “sprint” a race like this, I end up swimming slower. Came in at 31:03, but I attribute that time to the silly 200-yard run up the soft sand to the finish.

Event website: http://www.pierpaddle.com/

Garmin: http://connect.garmin.com/modern/activity/515642861



Photo Credit: Jessica Spuehler

Seal Beach Rough Water Swim – June 14th

The first 5K of the season! A single loop around a martini-glass-shaped course in Seal Beach Harbor.

The organizers highly recommend a kayaker escort, and my friend Alicia graciously offered to kayak. Sighting is awful in this race … I can see why they recommend a kayaker. Alicia and I didn’t find each other until about halfway through the race; I guess we should have discussed our plan before the race started!

Did a 1:27, which I was really happy with. Felt great afterwards, so I just jumped in and did the 1 mile race that followed. Definitely doing this next year, but I think I’ll wing it without a kayaker.

Event website: http://www.teamunify.com/TabGeneric.jsp?_tabid_=5400&team=scssbsc

Garmin: http://connect.garmin.com/modern/activity/520639873

Seal Beach

Semana Nautica 5K – July 6th

The final tuneup before the 10K the following week. I love this race, and I love the cold water up in Santa Barbara. The sun was shining very brightly that morning, so the race organizers were nice enough to reverse the direction of the course so that we wouldn’t get blinded.

Told myself to take it easy because I was swimming the 10K a week later, but I felt great and ended up swimming my fastest 5K in 1:22!

Event website: http://semananautica.com/

Garmin: http://connect.garmin.com/modern/activity/536114965


Semana Nautica 10K – July 13th

The big one! Everything had been leading up to this. We were piling on some nutty mileage at our Tower 26 Wednesday beach workouts, so I knew that I had the miles in the bank. Yes, coach Gerry, I love triple circuits! June was a 30+ mile swimming month, and July was on track to hit 35 miles. I was ready.

My good friend Chris Georges offered to kayak for me. Tom joined us for the road trip and helped get the car from the start to the finish – logistics are important.

I opened with what I later saw was my fastest open water mile, even though I felt like I was taking it easy. Chris and I had agreed on doing the first feeding an hour in. I had my Garmin set to auto-lap every mile, so I knew when I was approaching an hour.

The conditions were amazing; we got lucky. 68-ish degree water and a gentle swell. Amazing visibility. Got to see a lot of fish and beautiful kelp forests. Chris kayaked slightly ahead of me to carve out a path through the kelp.

Felt great throughout, skipped the last feeding and finished up in 2:37.

The course came up 0.4 miles short of 6 miles, so I did what any self-respecting swimmer would do and went out and swam another half mile. There was no way in hell I was gonna let that day pass and not actually swim 10K.

Little known fact: I missed the World Cup Final for this race.

Event website: http://semananautica.com/

Garmin: http://connect.garmin.com/modern/activity/541042059



Dwight Crum Pier to Pier – August 3rd

Pier to Pier is a fixture on the Los Angeles swimming calendar and one of the top 10 open water races in the US. This is another race that everyone does; it was great to hang out with friends and later talk about all the great white sharks hanging out along the swim course.

The race is 2 miles from the Hermosa Pier to the Manhattan Beach Pier. The winner this year swam it in 37 minutes – I just can’t fathom how that’s possible.

What I love most about this race is the chaos of the mass start. We practice mass starts every Wednesday; I love them. Nothing like trying to round that first buoy with 500 of your best friends.

Sighting this race is trickier than it should be, so I swim the course a few times beforehand to practice it. Finished in 57 minutes, and didn’t get tumbled by the surf at the end … it’s the little things.

Event website: http://www.surffestival.org/Swim/SwimIndex.html

Garmin: http://connect.garmin.com/modern/activity/556394123


Ocean Park Bandit 5K’ish Swim

Without a doubt, the crown jewel of the Los Angeles open water race calendar!

We’d always joked about organizing a race, and this was it! I joined about 30 people to swim a course from Tower 26, south to Venice and around Tom “The Human Buoy” Hiel, north to Casa del Mar, and back.


So much fun, thinking seriously about turning this into a summer race series next year!

Next Year

I already signed up to do the Bridge to Bridge 10K in San Francisco, and will definitely do the La Jolla Rough Water Swim now that it’s back on. A Catalina relay is also in the works! And, of course, the Ocean Park Bandit Swim!

Using Breeze with IHttpActionResult

I came up with a solution for this as I was writing out my question on stackoverflow.com – I love when that happens!

I’m implementing measures on my controllers to prevent users from accessing information that they don’t have permission to see.

I’m looking into having my controller methods return an IHttpActionResult. Consider this simplified example:

public IHttpActionResult FindById(int id)
{
    // implementation of DoesUserHavePermission not relevant
    var canAccess = DoesUserHavePermission(id);
    if (canAccess)
    {
        return Ok(_uow.Repo.All().Where(r => r.Id == id).FirstOrDefault());
    }
    return NotFound();
}

On the client-side, it would look something like this:

return uow.Repo
    .findById(id)
    .then(function (results) {
        if (results[0] == undefined) {
            // nothing came back – handle the empty result
        } else {
            // do something with results[0]
        }
    })
    .fail(function (e) {
        // log an error
    });

This works great when you have a result, but if the controller returns NotFound, the call to findById actually fails because the response from the controller came back with a 404 status code.

Turns out a super easy way to handle this is to check the value of e.status in the fail handler:

return uow.Repo
    .findById(id)
    .then(function (results) {
        if (results[0] == undefined) {
            // nothing came back – handle the empty result
        } else {
            // do something with results[0]
        }
    })
    .fail(function (e) {
        if (e.status === 404) {
            // the controller returned NotFound – treat it as an empty result
        } else {
            // log an error
        }
    });

FYI, this is a SPA using BreezeJS and DurandalJS.

Barrel-aged Cocktail Experiment!

Amy got me a couple of oak barrels for my birthday from Oak Barrels Ltd so that I could start experimenting with barrel-aged cocktails.

I got 5- and 2-liter barrels; I knew I wanted to do a Manhattan in the bigger barrel, and then maybe age a white whiskey or another cocktail in the small one.

I settled on a Newark in the 2 liter barrel, which calls for Laird’s Applejack, Maraschino liqueur, sweet vermouth, and Fernet Branca. This should age pretty quickly in the smaller barrel – I’ll check it in 6 weeks to see how it’s doing.

I wanted to do a Negroni but read that it’s best to do that after you’ve already aged something else in the barrel.



I’m looking forward to seeing how the Manhattan in the bigger barrel turns out. I used Buffalo Trace (it was on sale) as the base. This sucker should yield about six 750ml bottles of the good stuff, so I need to start collecting some empties.


I’ll report back as I taste these during the aging process. This should be interesting, especially because patience isn’t really one of my best qualities, but I have to wait at least 8 weeks on the big barrel.


Naked domain support using GoDaddy and Azure Websites

Getting a naked domain working properly seems to be one of those secrets of the internet that people in the know protect under threat of violence.

Different DNS providers and hosts handle this differently, but here’s how I got it working with Azure Websites and GoDaddy.

In my case, I wanted to get both http://www.rescuepal.com and http://rescuepal.com to point to my website. Preferably, I also wanted http://rescuepal.com to work without a redirect – I wanted the user to actually see http://rescuepal.com in the address bar.

The Azure management portal pretty much tells you all you need to know when configuring a custom domain for your site. After you enter the domain, it does some validation to make sure the necessary DNS entries are in place – and lets you know if they’re not.



Pretend for a second that I entered www.rescuepal.com (not www.mycustomdomain.com as in the screenshot). The error message tells me that I need a CNAME record awverify.www that points to awverify.rescuepal.azurewebsites.net.

After configuring that in GoDaddy, http://www.rescuepal.com works as a custom domain for http://rescuepal.azurewebsites.net.

To get the naked domain http://rescuepal.com to work as a custom domain, create a CNAME record awverify that points to awverify.rescuepal.azurewebsites.net.

Here you can see the two CNAME records in GoDaddy:


and the custom domains configured in the Azure Management Portal:



Hope this helps.

Enabling Entity Framework SQL Server Compact Edition support on Azure Websites

I’m working on a simple splash screen which collects the user’s email address and saves it in a database. I decided to use a SQL Server Compact Edition database and include it in my MVC4 website’s App_Data folder.

This worked great in my development environment, however, after deploying the site to Azure I would get the following error:

Unable to find the requested .Net Framework Data Provider. It may not be installed.

I figured that Azure didn’t have the necessary dependency available – I had it locally because Visual Studio had most likely already installed it.

Instead of messing manually with bin-deployable assemblies, I went straight to NuGet.

I installed the EntityFramework.SqlServerCompact package which also installed the Microsoft SQL Server Compact Edition package that it depended on.


I then deployed to my site and everything worked like a charm.

Hope this helps.


New Ink

I sat for a couple of tattoo sessions back in December at The Dolorosa Tattoo Company in Studio City with Christina Hock.

I wanted to get a tattoo of Phoebe over some cross-bones, and her name on a collar tag. Christina did a great job coming up with this design. This went on my lower left arm – first time under the elbow.


The next day I went in to get the lily tattoo on my shoulder redone. I’d been wanting to redo it with bolder colors and outlining. Really happy with how it turned out.


Implementing cross-browser CORS support for ASP.NET Web API

Browsers lock down cross-domain calls as a security measure against cross-site request forgery attacks. However, there are times when you legitimately need to make cross-domain requests to get data that your application needs.

CORS (Cross Origin Resource Sharing) defines a mechanism to allow such cross-domain requests to happen successfully between the client and the server.

Modern browsers have gotten significantly better at providing CORS support, but you still have to jump through hoops to make this work consistently across browsers. Approaches like using jQuery’s JSONP or explicitly setting $.support.cors = true have their shortcomings.

This post will cover the things you need to take care of in JavaScript client code and at the API level to properly handle cross-domain calls.

Cross-domain calls in an ASP.NET Web API scenario

Consider this simplified application architecture:

  • An ASP.NET MVC4 web application at http://app.mydomain.com
  • An ASP.NET MVC4 Web API application at http://api.mydomain.com
  • A SQL Server database containing your application’s data

In such an architecture, when a user interacts with your web application, the Web API acts as the middleman, sending data back and forth between the web application and the database.

You will also most likely host your API at a different URL or even a different server to maintain separation. In modern web applications, the application at http://app.mydomain.com (and other clients of your API) will use JavaScript to make calls to your API. These calls are cross-domain calls because the resource being called is outside the domain of the calling page.

Handling cross-domain calls on the client

Needing to support multiple browsers is a fact of life by now for developers, and once again, this is an IE vs. everyone else problem. To be fair, IE 10 finally gets it right. However, we’ll have to support IE 8 and IE 9 for some time to come.

Let’s assume that you’re using jQuery’s $.ajax() – which wraps XMLHttpRequest – to make calls to the API. You’ll quickly notice that this will cause IE to prevent the call from happening and/or display a security warning to the user.

To provide consistent cross-browser support for AJAX calls, you can write a generic JavaScript transport function that uses the XDomainRequest object for IE and encapsulates $.ajax() for all other callers.

Here’s a generic executeRequest function that accepts the following parameters:

  • The URL to call
  • The HTTP verb to use
  • Any data to include in the body of the request
  • A success callback
  • A failure callback
var transport = function () {

    var verb = function () {
        var post = "POST";
        var get = "GET";
        var put = "PUT";
        var del = "DELETE";

        return {
            post: post,
            get: get,
            put: put,
            del: del
        };
    }();

    var executeRequest = function (url, method, data, doneCallback,
        failCallback) {

        var isCrossDomainRequest = url.indexOf('http://') == 0
            || url.indexOf('https://') == 0;

        if (window.XDomainRequest
            && $.browser.msie
            && $.browser.version < 10
            && isCrossDomainRequest) {

            // IE 10 fully supports XMLHttpRequest
            // but still contains XDomainRequest.
            // Include check for IE < 10 to force IE 10 calls
            // to use standard XMLHttpRequest instead.
            // Only use XDomainRequest for cross-domain calls.

            var xdr = new XDomainRequest();
            if (xdr) {
                xdr.open(method, url);
                xdr.onload = function () {
                    var result = $.parseJSON(xdr.responseText);
                    if (result === null
                        || typeof (result) === 'undefined') {
                        // fall back to the raw response text
                        result = xdr.responseText;
                    }
                    if ($.isFunction(doneCallback)) {
                        doneCallback(result);
                    }
                };
                xdr.onerror = function () {
                    if ($.isFunction(failCallback)) {
                        failCallback(xdr);
                    }
                };
                xdr.onprogress = function () { };
                xdr.send(data);
            }
            return xdr;
        } else {
            var request = $.ajax({
                type: method,
                url: url,
                data: data,
                dataType: "JSON",
                contentType: 'application/json'
            }).done(function (result) {
                if ($.isFunction(doneCallback)) {
                    doneCallback(result);
                }
            }).fail(function (jqXhr, textStatus) {
                if ($.isFunction(failCallback)) {
                    failCallback(jqXhr, textStatus);
                }
            });

            return request;
        }
    };

    return {
        verb: verb,
        executeRequest: executeRequest
    };
}();

Note that I’m choosing to let IE 10 use jQuery’s $.ajax(). IE 10 still includes the XDomainRequest object but there’s no point in using it since the browser can now make cross-domain calls without it.

Here’s an example of calling the executeRequest function for a simple GET operation:

transport.executeRequest(
    'http://api.mydomain.com/products', // example endpoint
    transport.verb.get,
    null,
    function (result) {
        // do something with result data
    },
    function (jqXhr, textStatus) {
        // do something with failure data
    });

That’s all you have to do on the client side; let’s move on to implementing cross-domain support for the ASP.NET Web API.

Implementing a delegating handler for cross-domain requests

When a browser makes a cross-domain call, it will add an additional HTTP header to the request called Origin. The value in this header identifies the domain that the request is coming from.

Here’s a screenshot from Fiddler showing the Origin header in the request:


The API should respond to this request with an additional response header called Access-Control-Allow-Origin.

Here’s a screenshot from Fiddler showing the Access-Control-Allow-Origin header in the response.


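The heart of this exchange is simple: the server reads the incoming Origin value and, if it trusts it, echoes it back in Access-Control-Allow-Origin. As a minimal illustration of that decision in plain JavaScript (independent of the Web API code – the function name and whitelist parameter are mine, not any framework’s API):

```javascript
// Sketch of the allow-origin decision: given the request's Origin header
// and a whitelist of trusted origins (or '*'), compute the CORS response
// header. Returning null means "send no header" - the browser will then
// block the cross-domain response.
function allowOriginHeader(requestOrigin, allowedOrigins) {
    if (!requestOrigin) {
        return null; // not a cross-domain request
    }
    if (allowedOrigins === '*') {
        return { 'Access-Control-Allow-Origin': '*' };
    }
    if (allowedOrigins.indexOf(requestOrigin) !== -1) {
        // echo the origin back so the browser accepts the response
        return { 'Access-Control-Allow-Origin': requestOrigin };
    }
    return null; // untrusted origin
}
```

Echoing the origin back (instead of always sending *) keeps the door open for restricting access to known clients later.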
A browser will also send a preflight request to interrogate the API about its capabilities for cross-domain requests. This special type of request uses the OPTIONS HTTP verb.

It’s worth noting that different browsers behave differently as far as OPTIONS is concerned. I observed that Firefox and Chrome sent an OPTIONS request, but IE 9 did not. I haven’t tested with IE 10.

In an OPTIONS request, the client sends a request with the following headers:

  • Access-Control-Request-Method
  • Access-Control-Request-Headers

and the server responds to the request with the following headers:

  • Access-Control-Allow-Methods
  • Access-Control-Allow-Headers
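In other words, the preflight response mirrors what the browser asked for. Here’s that pairing sketched as a pure function in plain JavaScript (the function name is mine, and a real server would also validate the values rather than echo them blindly):

```javascript
// Sketch of the preflight contract: map the Access-Control-Request-*
// headers the browser sent on the OPTIONS call to the Access-Control-
// Allow-* headers the server should answer with. 'headers' is a plain
// object of incoming request headers.
function preflightResponseHeaders(headers) {
    var response = {};
    if (headers['Origin']) {
        response['Access-Control-Allow-Origin'] = headers['Origin'];
    }
    if (headers['Access-Control-Request-Method']) {
        // allow exactly the method the browser asked about
        response['Access-Control-Allow-Methods'] =
            headers['Access-Control-Request-Method'];
    }
    if (headers['Access-Control-Request-Headers']) {
        // allow exactly the headers the browser asked about
        response['Access-Control-Allow-Headers'] =
            headers['Access-Control-Request-Headers'];
    }
    return response;
}
```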

Here’s a screenshot from Fiddler showing the OPTIONS request.


Here’s a screenshot from Fiddler showing the response to the OPTIONS request.


To implement support for this on the API side, you have to write a Delegating Handler. The handler allows you to handle requests to the API and manipulate the response headers before the ASP.NET Web API engine processes the request.

Carlos Figueira does a great job in this MSDN article explaining how to write such a handler to implement CORS support for ASP.NET Web API. This code is based directly on Carlos’ article.

public class CorsHandler : DelegatingHandler
{
    private const string AccessControlAllowHeaders
        = "Access-Control-Allow-Headers";
    private const string AccessControlAllowMethods
        = "Access-Control-Allow-Methods";
    private const string AccessControlAllowOrigin
        = "Access-Control-Allow-Origin";
    private const string AccessControlRequestHeaders
        = "Access-Control-Request-Headers";
    private const string AccessControlRequestMethod
        = "Access-Control-Request-Method";
    private const string Origin = "Origin";

    protected override Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request,
        CancellationToken cancellationToken)
    {
        var isCorsRequest = request.Headers.Contains(Origin);
        var isPreflightRequest = request.Method == HttpMethod.Options;

        if (isCorsRequest)
        {
            if (isPreflightRequest)
            {
                // Answer the preflight ourselves with the allow headers.
                var response = new HttpResponseMessage(HttpStatusCode.OK);
                response.Headers.Add(AccessControlAllowOrigin,
                    request.Headers.GetValues(Origin).First());

                var accessControlRequestMethod = request.Headers.GetValues(
                    AccessControlRequestMethod).FirstOrDefault();
                if (accessControlRequestMethod != null)
                {
                    response.Headers.Add(AccessControlAllowMethods,
                        accessControlRequestMethod);
                }

                var requestedHeaders = String.Join(", ",
                    request.Headers.GetValues(AccessControlRequestHeaders));
                if (!string.IsNullOrEmpty(requestedHeaders))
                {
                    response.Headers.Add(AccessControlAllowHeaders,
                        requestedHeaders);
                }

                var tcs = new TaskCompletionSource<HttpResponseMessage>();
                tcs.SetResult(response);
                return tcs.Task;
            }

            // Non-preflight CORS request: let the pipeline handle it,
            // then add the allow-origin header to the response.
            return base.SendAsync(request, cancellationToken)
                .ContinueWith<HttpResponseMessage>(t =>
                {
                    var resp = t.Result;
                    resp.Headers.Add(AccessControlAllowOrigin,
                        request.Headers.GetValues(Origin).First());
                    return resp;
                });
        }

        return base.SendAsync(request, cancellationToken);
    }
}

Implementing a delegating handler for missing content type header when using XDomainRequest

A side-effect of using the XDomainRequest object for requests coming from IE is that it doesn’t allow you to manipulate the request headers.

This is critical because a request that contains JSON needs its content-type header set to application/json – something you’d normally do yourself when creating an XMLHttpRequest.

The solution to this is to write another Delegating Handler. In this case, the handler inspects the request to see if its content-type header is present. If it is not, it adds it and sets its value to application/json before passing it along to the ASP.NET Web API engine.

Here’s the code for this handler:

public class ContentTypeHandler : DelegatingHandler
{
    protected override Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request,
        CancellationToken cancellationToken)
    {
        if (request.Method == HttpMethod.Post
            && request.Content.Headers.ContentType == null)
        {
            request.Content.Headers.ContentType
                = new MediaTypeHeaderValue("application/json");
        }

        return base.SendAsync(request, cancellationToken);
    }
}

It’s a pity that you have to jump through so many hoops to get true cross-browser CORS support working. I’m hoping this becomes more of a configuration/deployment issue than a development one.

There’s a small shortcoming in the JavaScript executeRequest function that I’d like to call out. The non-XDomainRequest section of the code supports JavaScript promises, allowing you to use jQuery functionality such as $.when() to wait until a set of asynchronous operations have completed. I’m still working on getting this working properly for IE by manually creating the jQuery Deferred object.
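The general shape of that fix is to wrap the callback-style XDomainRequest branch in a deferred that you resolve or reject yourself. Here’s a rough sketch of the idea; I’m using a native Promise as a stand-in (jQuery’s $.Deferred exposes the same resolve/reject shape), and requestFn is a hypothetical placeholder for the real transport, not actual application code:

```javascript
// Sketch: adapt a callback-style request function to a promise so
// callers can compose it with $.when()/Promise.all-style code.
// requestFn(url, doneCallback, failCallback) stands in for the real
// XDomainRequest-based transport.
function promisify(requestFn) {
    return function (url) {
        return new Promise(function (resolve, reject) {
            requestFn(url,
                function (result) { resolve(result); },
                function (error) { reject(error); });
        });
    };
}

// Demo with a fake transport that succeeds immediately:
var fakeRequest = function (url, done, fail) { done({ url: url }); };
var requestAsPromise = promisify(fakeRequest);
```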

Facebook and HTML5

Lots of reaction this week to Zuckerberg’s statement during TechCrunch Disrupt that investing in HTML5 was “one of the biggest mistakes if not the biggest strategic mistake that we made.”

Facebook used HTML5 technology in a really smart way in their iOS app by essentially iframe-ing a version of their mobile website inside an iOS app shell, using something called a UIWebView.

This allowed them to fix bugs, push updates, or even make major UI changes without needing to push out a new version of the app through the App Store. Everything happened on the server side and was available in the app right away.

Dirk de Kok has a fantastic writeup about how this was actually implemented. The problem was that it was horribly slow… Dirk explains why that particular architecture choice caused the Facebook iOS app to perform so badly.

Facebook recently released an all new native iOS application, and guess what? It’s faster. Why? Because it’s a fully native app written in Objective-C.

The promise of HTML5 is very much like what we heard years ago with Java: write once, run everywhere.  Unfortunately, it can also be: write once, suck everywhere.

I completely understand why you would want to abstract out core pieces of an application and reuse it across platforms.  It doesn’t seem like that’s gonna cut it if you want to build a best-in-class application for each platform.  If you want the best performance, you’re gonna have to go native on the platforms that matter most to you.

There’s nothing wrong with HTML5; Facebook was just holding it wrong!


Summer 2012

It’s been a fantastic summer in LA! Today is the last day in Chicago that the beaches and public pools are officially open, but there’s no sign of summer ending in LA anytime soon!

Got a great ocean swim in today at Tower 26 in Santa Monica. June gloom is long gone and the weather was amazing. Towards the mid-point of our swim, we spotted a small pod of dolphins. They got closer to us and were playing around not even 10 feet from us; you could even hear them chattering underwater!


Tower 26 in Santa Monica


Earlier this summer, Kevin Marshall (colleague at Clarity) and I also swam the Alcatraz Sharkfest swim.

What an epic swim!!

800 swimmers pile into two ferries and sail to Alcatraz. Everybody dives into the water and swims towards a start line of kayakers. The ferry horn sounds and you swim like mad.

Super challenging swim because of the currents; you can’t just swim a straight line to the finish, or you’ll get swept out and brought back in on the back of a jet ski.

Swam it in a respectable 34 minutes, but wussed out and wore my wetsuit. San Francisco is cold in June, and it psyched me out!

Alcatraz Sharkfest swim


Ocean Swimming

One of the perks of living here is being able to train in the Pacific Ocean. It’s a little harder to get to living in the Valley, but it’s a great treat for the weekends.

This picture was taken last week in full June Gloom. Hope to catch some more sunny days this summer.