Using Breeze with IHttpActionResult

I came up with a solution for this as I was writing out my question on stackoverflow.com – I love when that happens!

I’m implementing measures on my controllers to prevent users from accessing information that they shouldn’t have permission to see.

I’m looking into having my controller methods return an IHttpActionResult. Consider this simplified example:

[HttpGet]
[BreezeQueryable]
public IHttpActionResult FindById(int id)
{
    // implementation of DoesUserHavePermission not relevant
    var canAccess = DoesUserHavePermission(id); 
    if (canAccess)
        return Ok(_uow.Repo.All().Where(r => r.Id == id).FirstOrDefault());
    else
        return NotFound();
}

On the client-side, it would look something like this:

return uow.Repo
    .findById(id)
    .then(function (results) {
        if (results[0] == undefined) {
            router.navigate(notFoundRoute);
        }
        else {
            myEntity(results[0]);
        }
    })
    .fail(function (e) {
        // log an error
    });

This works great when you have a result, but if the controller returns NotFound, the call to findById actually fails because the controller responded with a 404 status code.

Turns out a super easy way to handle this is to check the value of e.status in the fail handler:

return uow.Repo
    .findById(id)
    .then(function (results) {
        if (results[0] == undefined) {
            router.navigate(notFoundRoute);
        }
        else {
            myEntity(results[0]);
        }
    })
    .fail(function (e) {
        if (e.status === 404) {
            router.navigate(notFoundRoute);
        }
        else {
            // log an error
        }
    });

FYI, this is a SPA using BreezeJS and DurandalJS.

Barrel-aged Cocktail Experiment!

Amy got me a couple of oak barrels for my birthday from Oak Barrels Ltd so that I could start experimenting with barrel-aged cocktails.

I got 5- and 2-liter barrels; I knew I wanted to do a Manhattan in the bigger barrel, and then maybe age a white whiskey or another cocktail in the small one.

I settled on a Newark in the 2 liter barrel, which calls for Laird’s Applejack, Maraschino liqueur, sweet vermouth, and Fernet Branca. This should age pretty quickly in the smaller barrel – I’ll check it in 6 weeks to see how it’s doing.

I wanted to do a Negroni but read that it’s best to do that after you’ve already aged something else in the barrel.

P1020813


I’m looking forward to seeing how the Manhattan in the bigger barrel turns out. I used Buffalo Trace (it was on sale) as the base. This sucker should yield about six 750ml bottles of the good stuff, so I need to start collecting some empties.

P1020818

I’ll report back as I taste these during the aging process. This should be interesting, especially because patience isn’t really one of my best qualities, but I have to wait at least 8 weeks on the big barrel.


Naked domain support using GoDaddy and Azure Websites

Getting a naked domain working properly seems to be one of those secrets of the internet that people in the know protect under threat of violence.

Different DNS providers and hosts handle this differently, but here’s how I got it working with Azure Websites and GoDaddy.

In my case, I wanted to get both http://www.rescuepal.com and http://rescuepal.com to point to my website. Preferably, I also wanted http://rescuepal.com to work without a redirect – I wanted the user to actually see http://rescuepal.com in the address bar.

The Azure management portal pretty much tells you all you need to know when configuring a custom domain for your site. After you enter the domain, it does some validation to make sure the necessary DNS entries are in place – and lets you know if they’re not.

NakedDomain-Azure


Pretend for a second that I entered www.rescuepal.com (not www.mycustomdomain.com as in the screenshot). The error message tells me that I need a CNAME record awverify.www that points to awverify.rescuepal.azurewebsites.net.

After configuring that in GoDaddy, http://www.rescuepal.com works as a custom domain for http://rescuepal.azurewebsites.net.

To get the naked domain http://rescuepal.com to work as a custom domain, create a CNAME record awverify that points to awverify.rescuepal.azurewebsites.net.
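
Zone-file style, the two awverify records described above would look something like this sketch (GoDaddy’s DNS manager presents the same data as Host / Points To pairs; the trailing dots are just zone-file convention):

```
awverify.www  IN  CNAME  awverify.rescuepal.azurewebsites.net.
awverify      IN  CNAME  awverify.rescuepal.azurewebsites.net.
```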

Here you can see the two CNAME records in GoDaddy:

NakedDomain-GoDaddyDNS

and the custom domains configured in the Azure Management Portal:

NakedDomain-AzureWebsitesCustomDomains


Hope this helps.

Enabling Entity Framework SQL Server Compact Edition support on Azure Websites

I’m working on a simple splash screen which collects the user’s email address and saves it in a database. I decided to use a SQL Server Compact Edition database and include it in my MVC4 website’s App_Data folder.

This worked great in my development environment; however, after deploying the site to Azure I would get the following error:

Unable to find the requested .Net Framework Data Provider. It may not be installed.

I figured that Azure didn’t have the necessary dependency available – it worked locally because Visual Studio had most likely already installed it.

Instead of messing manually with bin-deployable assemblies, I went straight to NuGet.

I installed the EntityFramework.SqlServerCompact package which also installed the Microsoft SQL Server Compact Edition package that it depended on.
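
If you ever need to verify the wiring by hand, the package registers the SQL CE provider and connection factory in web.config. Here’s a sketch of the relevant entries (the connection string name and .sdf path are illustrative, not from my actual project):

```xml
<connectionStrings>
  <add name="SplashContext"
       connectionString="Data Source=|DataDirectory|\Splash.sdf"
       providerName="System.Data.SqlServerCe.4.0" />
</connectionStrings>
<entityFramework>
  <defaultConnectionFactory
      type="System.Data.Entity.Infrastructure.SqlCeConnectionFactory, EntityFramework">
    <parameters>
      <parameter value="System.Data.SqlServerCe.4.0" />
    </parameters>
  </defaultConnectionFactory>
</entityFramework>
```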

NugetSQLCompact

I then redeployed the site and everything worked like a charm.

Hope this helps.


New Ink

I sat for a couple of tattoo sessions back in December at The Dolorosa Tattoo Company in Studio City with Christina Hock.

I wanted to get a tattoo of Phoebe over some cross-bones, and her name on a collar tag. Christina did a great job coming up with this design. This went on my lower left arm – first time under the elbow.

Instagram Photo

The next day I went in to get the lily tattoo on my shoulder redone. I’d been wanting to redo it with bolder colors and outlining. Really happy with how it turned out.

Instagram Photo

Implementing cross-browser CORS support for ASP.NET Web API

Browsers lock down cross-domain calls as a security measure against cross-site request forgery attacks; however, there are times when you legitimately need to make cross-domain requests to get data that your application needs.

CORS (Cross Origin Resource Sharing) defines a mechanism to allow such cross-domain requests to happen successfully between the client and the server.

Modern browsers have gotten significantly better at providing CORS support, but you still have to jump through hoops to make this work consistently across browsers. Approaches like using jQuery’s JSONP or explicitly setting $.support.cors = true have their shortcomings.

This post will cover the things you need to take care of in JavaScript client code and at the API level to properly handle cross-domain calls.

Cross-domain calls in an ASP.NET Web API scenario

Consider this simplified application architecture:

  • An ASP.NET MVC4 web application at http://app.mydomain.com
  • An ASP.NET MVC4 Web API application at http://api.mydomain.com
  • A SQL Server database containing your application’s data

In such an architecture, when a user interacts with your web application, the Web API acts as the middleman, sending data back and forth between the web application and the database.

You will also most likely host your API at a different URL or even on a different server to maintain separation. In modern web applications, the application at http://app.mydomain.com (and other clients of your API) will use JavaScript to make calls to your API. These calls are cross-domain calls because the resource being called is outside the domain of the calling page.

Handling cross-domain calls on the client

Needing to support multiple browsers is a fact of life by now for developers, and once again, this is an IE vs. everyone else problem. To be fair, IE 10 finally gets it right. However, we’ll have to support IE 8 and IE 9 for some time to come.

Let’s assume that you’re using jQuery’s $.ajax() - which wraps XMLHttpRequest - to make calls to the API. You’ll quickly notice that this will cause IE to prevent the call from happening and/or display a security warning to the user.

To provide consistent cross-browser support for AJAX calls, you can write a generic JavaScript transport function that uses the XDomainRequest object for IE and encapsulates $.ajax() for all other callers.

Here’s a generic executeRequest function that accepts the following parameters:

  • The URL to call
  • The HTTP verb to use
  • Any data to include in the body of the request
  • A success callback
  • A failure callback
var transport = function () {

    var verb = function () {
        var post = "POST";
        var get = "GET";
        var put = "PUT";
        var del = "DELETE";

        return {
            post: post,
            get: get,
            put: put,
            del: del
        };
    }();

    var executeRequest = function (
        method,
        url,
        data,
        doneCallback,
        failCallback) {

        var isCrossDomainRequest = url.indexOf('http://') == 0
            || url.indexOf('https://') == 0;

        if (window.XDomainRequest
            && $.browser.msie
            && $.browser.version < 10
            && isCrossDomainRequest) {

            // IE 10 fully supports XMLHttpRequest
            // but still contains XDomainRequest.
            // Include check for IE < 10 to force IE 10 calls
            // to use standard XMLHttpRequest instead.
            // Only use XDomainRequest for cross-domain calls.

            var xdr = new XDomainRequest();
            if (xdr) {
                xdr.open(method, url);
                xdr.onload = function () {
                    var result = $.parseJSON(xdr.responseText);
                    if (result === null
                        || typeof (result) === 'undefined') {
                        result = $.parseJSON(
                            data.firstChild.textContent);
                    }

                    if ($.isFunction(doneCallback)) {
                        doneCallback(result);
                    }
                };
                xdr.onerror = function () {
                    if ($.isFunction(failCallback)) {
                        failCallback();
                    }
                };
                xdr.onprogress = function () { };
                xdr.send(data);
            }
            return xdr;
        } else {
            var request = $.ajax({
                type: method,
                url: url,
                data: data,
                dataType: "JSON",
                contentType: 'application/json'
            }).done(function (result) {
                if ($.isFunction(doneCallback)) {
                    doneCallback(result);
                }
            }).fail(function (jqXhr, textStatus) {
                if ($.isFunction(failCallback)) {
                    failCallback(jqXhr, textStatus);
                }
            });

            return request;
        }
    };

    return {
        verb: verb,
        executeRequest: executeRequest
    };
}();

Note that I’m choosing to let IE 10 use jQuery’s $.ajax(). IE 10 still includes the XDomainRequest object but there’s no point in using it since the browser can now make cross-domain calls without it.

Here’s an example of calling the executeRequest function for a simple GET operation:

transport.executeRequest(
    transport.verb.get,
    url,
    null,
    function (result) {
        // do something with result data
    },
    function (jqXhr, textStatus) {
        // do something with failure data
    });

That’s all you have to do on the client side; let’s move on to implementing cross-domain support in the ASP.NET Web API.

Implementing a delegating handler for cross-domain requests

When a browser makes a cross-domain call, it will add an additional HTTP header to the request called Origin. The value in this header identifies the domain that the request is coming from.

Here’s a screenshot from Fiddler showing the Origin header in the request:

CORS-1

The API should respond to this request with an additional response header called Access-Control-Allow-Origin.

Here’s a screenshot from Fiddler showing the Access-Control-Allow-Origin header in the response.

CORS-2
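
Stripped down to the relevant headers, that request/response exchange looks something like this (the API URL and path are hypothetical):

```http
GET http://api.mydomain.com/api/things HTTP/1.1
Origin: http://app.mydomain.com

HTTP/1.1 200 OK
Access-Control-Allow-Origin: http://app.mydomain.com
Content-Type: application/json; charset=utf-8
```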

A browser will also send a preflight request to interrogate the API about its capabilities for cross-domain requests. This special type of request uses the OPTIONS HTTP verb.

It’s worth noting that different browsers behave differently as far as OPTIONS is concerned. I observed that Firefox and Chrome sent an OPTIONS request, but IE 9 did not. I haven’t tested with IE 10.

In an OPTIONS request, the client sends a request with the following headers:

  • Access-Control-Request-Method
  • Access-Control-Request-Headers

and the server responds to the request with the following headers:

  • Access-Control-Allow-Methods
  • Access-Control-Allow-Headers
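
Putting those headers together, a preflight exchange for a hypothetical cross-domain POST might look like this (URLs illustrative):

```http
OPTIONS http://api.mydomain.com/api/things HTTP/1.1
Origin: http://app.mydomain.com
Access-Control-Request-Method: POST
Access-Control-Request-Headers: content-type

HTTP/1.1 200 OK
Access-Control-Allow-Origin: http://app.mydomain.com
Access-Control-Allow-Methods: POST
Access-Control-Allow-Headers: content-type
```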

Here’s a screenshot from Fiddler showing the OPTIONS request.

CORS-3

Here’s a screenshot from Fiddler showing the response to the OPTIONS request.

CORS-4

To implement support for this on the API side, you have to write a Delegating Handler. The handler lets you intercept requests before the ASP.NET Web API engine processes them and manipulate the response headers on the way back out.

Carlos Figueira does a great job in this MSDN article explaining how to write such a handler to implement CORS support for ASP.NET Web API. This code is based directly on Carlos’ article.

public class CorsHandler : DelegatingHandler
{
	private const string AccessControlAllowHeaders 
		= "Access-Control-Allow-Headers";
	private const string AccessControlAllowMethods 
		= "Access-Control-Allow-Methods";
	private const string AccessControlAllowOrigin 
		= "Access-Control-Allow-Origin";
	private const string AccessControlRequestHeaders 
		= "Access-Control-Request-Headers";
	private const string AccessControlRequestMethod 
		= "Access-Control-Request-Method";
	private const string Origin = "Origin";

	protected override Task<HttpResponseMessage> SendAsync(
		HttpRequestMessage request, 
		CancellationToken cancellationToken)
	{
		var isCorsRequest = request.Headers.Contains(Origin);
		var isPreflightRequest = request.Method == HttpMethod.Options;

		if (isCorsRequest)
		{
			if (isPreflightRequest)
			{
				var response = new HttpResponseMessage(HttpStatusCode.OK);

				response.Headers.Add(
					AccessControlAllowOrigin, 
					request.Headers.GetValues(Origin).First());

				var accessControlRequestMethod = request.Headers.GetValues(
					AccessControlRequestMethod).FirstOrDefault();

				if (accessControlRequestMethod != null)
				{
					response.Headers.Add(
						AccessControlAllowMethods, 
						accessControlRequestMethod);
				}

				IEnumerable<string> requestedHeaderValues;
				var requestedHeaders = request.Headers.TryGetValues(
					AccessControlRequestHeaders, out requestedHeaderValues)
					? String.Join(", ", requestedHeaderValues)
					: null;

				if (!string.IsNullOrEmpty(requestedHeaders))
				{
					response.Headers.Add(
						AccessControlAllowHeaders, 
						requestedHeaders);
				}

				var tcs = new TaskCompletionSource<HttpResponseMessage>();
				tcs.SetResult(response);
				return tcs.Task;
			}
			else
			{
				return base.SendAsync(
					request, 
					cancellationToken)
					.ContinueWith<HttpResponseMessage>(t =>
				{
					var resp = t.Result;
					resp.Headers.Add(
						AccessControlAllowOrigin, 
						request.Headers.GetValues(Origin).First());
					return resp;
				});
			}
		}
		else
		{
			return base.SendAsync(request, cancellationToken);
		}
	}
}

Implementing a delegating handler for missing content type header when using XDomainRequest

A side-effect of using the XDomainRequest object for requests coming from IE is that it doesn’t allow you to manipulate the request headers.

This is critical because with XMLHttpRequest you would normally set the content-type header to application/json when your request body contains JSON – with XDomainRequest, you can’t.

The solution to this is to write another Delegating Handler. In this case, the handler inspects the request to see if its content-type header is present. If it is not, it adds it and sets its value to application/json before passing it along to the ASP.NET Web API engine.

Here’s the code for this handler:

public class ContentTypeHandler : DelegatingHandler
{
	protected override Task<HttpResponseMessage> SendAsync(
		HttpRequestMessage request, 
		CancellationToken cancellationToken)
	{
		if (request.Method == HttpMethod.Post 
			&& request.Content.Headers.ContentType == null)
		{
			request.Content.Headers.ContentType 
				= new MediaTypeHeaderValue("application/json");
		}

		return base.SendAsync(request, cancellationToken);
	}
}
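
For completeness, both handlers have to be registered with the Web API pipeline before they’ll run. Here’s a minimal sketch of that registration (assuming the default WebApiConfig template; handlers execute in the order they’re added):

```csharp
public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        // Run the CORS handler first, then the content-type fix-up
        config.MessageHandlers.Add(new CorsHandler());
        config.MessageHandlers.Add(new ContentTypeHandler());

        config.Routes.MapHttpRoute(
            name: "DefaultApi",
            routeTemplate: "api/{controller}/{id}",
            defaults: new { id = RouteParameter.Optional });
    }
}
```
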

Conclusion

It’s a pity that you have to jump through so many hoops to get true cross-browser CORS support working. I’m hoping this becomes more of a configuration/deployment issue than a development one.

There’s a small shortcoming in the JavaScript executeRequest function that I’d like to call out. The non-XDomainRequest branch of the code supports JavaScript promises, allowing you to use jQuery functionality such as $.when() to wait until a set of asynchronous operations has completed. I’m still working on getting this to work properly for IE by manually creating the jQuery Deferred object.
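
One possible direction, sketched here rather than a finished solution: adapt the XDomainRequest callbacks into a promise so IE calls compose the same way as the $.ajax() branch. I’m using a native Promise for brevity; with jQuery you’d create a $.Deferred() and resolve/reject it the same way. The makeXdrPromise name is a hypothetical helper, not part of the transport code above.

```javascript
// Sketch: adapt an XDomainRequest-style callback object into a promise.
// The xdr argument is any object exposing open/send/onload/onerror,
// which makes the shape easy to exercise outside of IE.
function makeXdrPromise(xdr, method, url, data) {
    return new Promise(function (resolve, reject) {
        xdr.onload = function () {
            // XDomainRequest exposes the response body as responseText
            resolve(JSON.parse(xdr.responseText));
        };
        xdr.onerror = function () {
            reject(new Error('XDomainRequest failed'));
        };
        xdr.onprogress = function () { }; // no-op guards against IE aborting
        xdr.open(method, url);
        xdr.send(data);
    });
}
```

The same shape drops into the transport function’s IE branch by returning the promise instead of the raw xdr object.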

Facebook and HTML5

Lots of reaction this week to Zuckerberg’s statement during TechCrunch Disrupt that investing in HTML5 was “one of the biggest mistakes if not the biggest strategic mistake that we made.”

Facebook used HTML5 technology in a really smart way in their iOS app by essentially iframing a version of their mobile website inside an iOS app shell using something called a UIWebView.

This allowed them to fix bugs, push updates, or even make major UI changes without needing to push out a new version of the app through the App Store. Everything happened on the server side and was available in the app right away.

Dirk de Kok has a fantastic writeup about how this was actually implemented. Problem was, it was horribly slow. Dirk explains why that particular architecture choice caused the Facebook iOS app to perform so badly.

Facebook recently released an all new native iOS application, and guess what? It’s faster. Why? Because it’s a fully native app written in Objective-C.

The promise of HTML5 is very much like what we heard years ago with Java: write once, run everywhere.  Unfortunately, it can also be: write once, suck everywhere.

I completely understand why you would want to abstract out core pieces of an application and reuse them across platforms. It doesn’t seem like that’s gonna cut it if you want to build a best-in-class application for each platform. If you want the best performance, you’re gonna have to go native on the platforms that matter most to you.

There’s nothing wrong with HTML5; Facebook was just holding it wrong!


Summer 2012

It’s been a fantastic summer in LA! In Chicago, today is the last day the beaches and public pools are officially open, but there’s no sign of summer ending in LA anytime soon!

Got a great ocean swim in today at Tower 26 in Santa Monica. June gloom is long gone and the weather was amazing. Towards the mid-point of our swim, we spotted a small pod of dolphins. They got closer to us and were playing around not even 10 feet from us; you could even hear them chattering underwater!

Tower 26 in Santa Monica


Earlier this summer, Kevin Marshall (a colleague at Clarity) and I also swam the Alcatraz Sharkfest swim.

What an epic swim!!

800 swimmers pile into two ferries and sail to Alcatraz. Everybody dives into the water and swims towards a start line of kayakers. The ferry horn sounds and you swim like mad.

Super challenging swim because of the currents; you can’t just swim a straight line to the finish – you’ll get swept out and brought back in on the back of a jet ski.

Swam it in a respectable 34 minutes, but wussed out and wore my wetsuit. San Francisco is cold in June, and it psyched me out!

Alcatraz Sharkfest swim

Ocean Swimming

One of the perks of living here is being able to train in the Pacific Ocean. It’s a little harder to get to when you live in the Valley, but it’s a great treat for the weekends.

This picture was taken last week in full June Gloom. Hope to catch some more sunny days this summer.

Now available! Programming Microsoft’s Clouds: Windows Azure and Office 365

I wrote a couple of chapters for Programming Microsoft’s Clouds: Windows Azure and Office 365 about Exchange Online and Lync Online in Office 365. The book is available for purchase now on Amazon in paperback and Kindle formats.

Microsoft has obviously made a huge push into the cloud. Most of our projects today have a component running in the cloud, whether it’s a database running in SQL Azure, a service running in a Windows Azure Worker Role, or sometimes the whole shebang running in Windows Azure.

The introduction of Office 365 also opens up a lot of possibilities for companies wanting to use and pay for products such as SharePoint, Exchange, or Lync in a subscription model. For a small business or even an enterprise, those subscription rates are extremely compelling.

In the current incarnation of Office 365, the online versions of SharePoint and Lync offer up a portion of the development experience and capabilities available in their on-premises brethren.

Expect the developer story here to improve in upcoming versions of Office 365; until then, this book provides some great guidance for dealing with those challenges and building real-world applications on top of these technologies.