Archive for Computer Stuff

Using Breeze with IHttpActionResult

I came up with a solution for this as I was writing out my question on stackoverflow.com – I love when that happens!

I’m implementing measures on my controllers to prevent users from accessing information they don’t have permission to see.

I’m looking into having my controller methods return an IHttpActionResult. Consider this simplified example:

[HttpGet]
[BreezeQueryable]
public IHttpActionResult FindById(int id)
{
    // implementation of DoesUserHavePermission not relevant
    var canAccess = DoesUserHavePermission(id); 
    if (canAccess)
        return Ok(_uow.Repo.All().Where(r => r.Id == id).FirstOrDefault());
    else
        return NotFound();
}

On the client-side, it would look something like this:

return uow.Repo
    .findById(id)
    .then(function (results) {
        if (results[0] == undefined) {
            router.navigate(notFoundRoute);
        }
        else {
            myEntity(results[0]);
        }
    })
    .fail(function (e) {
        // log an error
    });

This works great when you have a result, but if the controller returns NotFound, the call to findById actually fails because the response from the controller came back with a 404 status code.

Turns out a super easy way to handle this is to check the value of e.status in the fail handler:

return uow.Repo
    .findById(id)
    .then(function (results) {
        if (results[0] == undefined) {
            router.navigate(notFoundRoute);
        }
        else {
            myEntity(results[0]);
        }
    })
    .fail(function (e) {
        if (e.status === 404) {
            router.navigate(notFoundRoute);
        }
        else {
            // log an error
        }
    });

FYI, this is a SPA built with BreezeJS and DurandalJS.

Naked domain support using GoDaddy and Azure Websites

Getting a naked domain working properly seems to be one of those secrets of the internet that people in the know protect under threat of violence.

Different DNS providers and hosts handle this differently, but here’s how I got it working with Azure Websites and GoDaddy.

In my case, I wanted to get both http://www.rescuepal.com and http://rescuepal.com to point to my website. Preferably, I also wanted http://rescuepal.com to work without a redirect – I wanted the user to actually see http://rescuepal.com in the address bar.

The Azure management portal pretty much tells you all you need to know when configuring a custom domain for your site. After you enter the domain, it does some validation to make sure the necessary DNS entries are in place and lets you know if they’re not.

[Screenshot: the Azure Management Portal’s custom domain validation error]

Pretend for a second that I entered www.rescuepal.com (not www.mycustomdomain.com as in the screenshot). The error message tells me that I need a CNAME record awverify.www that points to awverify.rescuepal.azurewebsites.net.

After configuring that in GoDaddy, http://www.rescuepal.com works as a custom domain for http://rescuepal.azurewebsites.net.

To get the naked domain http://rescuepal.com to work as a custom domain, create a CNAME record awverify that points to awverify.rescuepal.azurewebsites.net.
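Reconstructing the DNS entries described above (names and targets as I understand them; verify against your own zone, since GoDaddy’s UI and Azure’s requirements change over time):

```
Type     Name           Points to
CNAME    www            rescuepal.azurewebsites.net
CNAME    awverify.www   awverify.rescuepal.azurewebsites.net
CNAME    awverify       awverify.rescuepal.azurewebsites.net
```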

Here you can see the two CNAME records in GoDaddy:

[Screenshot: the two CNAME records in GoDaddy’s DNS manager]

and the custom domains configured in the Azure Management Portal:

[Screenshot: the custom domains configured in the Azure Management Portal]

Hope this helps.

Enabling Entity Framework SQL Server Compact Edition support on Azure Websites

I’m working on a simple splash screen which collects the user’s email address and saves it in a database. I decided to use a SQL Server Compact Edition database and include it in my MVC4 website’s App_Data folder.

This worked great in my development environment, however, after deploying the site to Azure I would get the following error:

Unable to find the requested .Net Framework Data Provider. It may not be installed.

I figured that Azure didn’t have the necessary dependency available – and I had it locally because I was running Visual Studio which had most likely already installed the necessary dependency.

Instead of messing manually with bin-deployable assemblies, I went straight to NuGet.

I installed the EntityFramework.SqlServerCompact package which also installed the Microsoft SQL Server Compact Edition package that it depended on.
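If you prefer the Package Manager Console, the equivalent command (package ID as of the time of writing) is:

```
PM> Install-Package EntityFramework.SqlServerCompact
```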

[Screenshot: the EntityFramework.SqlServerCompact package in the NuGet package manager]

I then deployed to my site and everything worked like a charm.

Hope this helps.


Implementing cross-browser CORS support for ASP.NET Web API

Browsers lock down cross-domain calls as a security measure against cross-site request forgery attacks; however, there are times when you legitimately need to make cross-domain requests to get data that your application needs.

CORS (Cross Origin Resource Sharing) defines a mechanism to allow such cross-domain requests to happen successfully between the client and the server.

Modern browsers have gotten significantly better at providing CORS support, but you still have to jump through hoops to make this work consistently across browsers. Approaches like using jQuery’s JSONP or explicitly setting $.support.cors = true have their shortcomings.

This post will cover the things you need to take care of in JavaScript client code and at the API level to properly handle cross-domain calls.

Cross-domain calls in an ASP.NET Web API scenario

Consider this simplified application architecture:

  • An ASP.NET MVC4 web application at http://app.mydomain.com
  • An ASP.NET MVC4 Web API application at http://api.mydomain.com
  • A SQL Server database containing your application’s data

In such an architecture, when a user interacts with your web application, the Web API acts as the middleman, sending data back and forth between the web application and the database.

You will also most likely host your API at a different URL or even a different server to maintain separation. In modern web applications, the application at http://app.mydomain.com (and other clients of your API) will use JavaScript to make calls to your API. These calls are cross-domain calls because the resource being called is outside the domain of the calling page.

Handling cross-domain calls on the client

Needing to support multiple browsers is a fact of life by now for developers, and once again, this is an IE vs. everyone else problem. To be fair, IE 10 finally gets it right. However, we have to support IE 8 and IE 9 for some time to come.

Let’s assume that you’re using jQuery’s $.ajax() – which wraps XMLHttpRequest – to make calls to the API. You’ll quickly notice that this will cause IE to prevent the call from happening and/or display a security warning to the user.

To provide consistent cross-browser support for AJAX calls, you can write a generic JavaScript transport function that uses the XDomainRequest object for IE and encapsulates $.ajax() for all other callers.

Here’s a generic executeRequest function that accepts the following parameters:

  • The HTTP verb to use
  • The URL to call
  • Any data to include in the body of the request
  • A success callback
  • A failure callback

var transport = function () {

    var verb = function () {
        var post = "POST";
        var get = "GET";
        var put = "PUT";
        var del = "DELETE";

        return {
            post: post,
            get: get,
            put: put,
            del: del
        };
    }();

    var executeRequest = function (method, url, data, doneCallback, failCallback) {

        var isCrossDomainRequest = url.indexOf('http://') === 0
            || url.indexOf('https://') === 0;

        if (window.XDomainRequest
            && $.browser.msie
            && $.browser.version < 10
            && isCrossDomainRequest) {

            // IE 10 fully supports XMLHttpRequest but still contains
            // XDomainRequest. The IE < 10 check forces IE 10 calls to use
            // the standard XMLHttpRequest instead.
            // Only use XDomainRequest for cross-domain calls.

            var xdr = new XDomainRequest();
            if (xdr) {
                xdr.open(method, url);
                xdr.onload = function () {
                    var result = $.parseJSON(xdr.responseText);
                    if (result === null || typeof result === 'undefined') {
                        result = $.parseJSON(data.firstChild.textContent);
                    }

                    if ($.isFunction(doneCallback)) {
                        doneCallback(result);
                    }
                };
                xdr.onerror = function() {
                    if ($.isFunction(failCallback)) {
                        failCallback();
                    }
                };
                xdr.onprogress = function() {};
                xdr.send(data);
            }
            return xdr;
        } else {
            var request = $.ajax({
                type: method,
                url: url,
                data: data,
                dataType: "json",
                contentType: 'application/json'
            }).done(function (result) {
                if($.isFunction(doneCallback)) {
                    doneCallback(result);
                }
            }).fail(function (jqXhr, textStatus) {
                if($.isFunction(failCallback)) {
                    failCallback(jqXhr, textStatus);
                }
            });

            return request;
        }
    };

    return {
        verb: verb,
        executeRequest: executeRequest
    };
}();

Note that I’m choosing to let IE 10 use jQuery’s $.ajax(). IE 10 still includes the XDomainRequest object but there’s no point in using it since the browser can now make cross-domain calls without it.

Here’s an example of calling the executeRequest function for a simple GET operation:

transport.executeRequest(
    transport.verb.get,
    url,
    null,
    function (result) {
        // do something with result data
    },
    function (jqXhr, textStatus) {
        // do something with failure data
    });

That’s all you have to do on the client side. Let’s move on to implementing cross-domain support for the ASP.NET Web API.

Implementing a delegating handler for cross-domain requests

When a browser makes a cross-domain call, it will add an additional HTTP header to the request called Origin. The value in this header identifies the domain that the request is coming from.

Here’s a screenshot from Fiddler showing the Origin header in the request:

[Screenshot: Fiddler showing the Origin header in the request]

The API should respond to this request with an additional response header called Access-Control-Allow-Origin.

Here’s a screenshot from Fiddler showing the Access-Control-Allow-Origin header in the response.

[Screenshot: Fiddler showing the Access-Control-Allow-Origin header in the response]

A browser will also send a preflight request to interrogate the API about its capabilities for cross-domain requests. This special type of request uses the OPTIONS HTTP verb.

It’s worth noting that different browsers behave differently where OPTIONS is concerned. I observed that Firefox and Chrome sent an OPTIONS request, but IE 9 did not. I haven’t tested with IE 10.

In an OPTIONS request, the client sends a request with the following headers:

  • Access-Control-Request-Method
  • Access-Control-Request-Headers

and the server responds to the request with the following headers:

  • Access-Control-Allow-Methods
  • Access-Control-Allow-Headers

Here’s a screenshot from Fiddler showing the OPTIONS request.

[Screenshot: Fiddler showing the OPTIONS request]

Here’s a screenshot from Fiddler showing the response to the OPTIONS request.

[Screenshot: Fiddler showing the response to the OPTIONS request]
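Putting those headers together, a representative preflight exchange against the earlier example domains would look something like this (URL and header values illustrative):

```
OPTIONS http://api.mydomain.com/api/values HTTP/1.1
Origin: http://app.mydomain.com
Access-Control-Request-Method: POST
Access-Control-Request-Headers: content-type

HTTP/1.1 200 OK
Access-Control-Allow-Origin: http://app.mydomain.com
Access-Control-Allow-Methods: POST
Access-Control-Allow-Headers: content-type
```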

To implement support for this on the API side, you have to write a Delegating Handler. The handler allows you to handle requests to the API and manipulate the response headers before the ASP.NET Web API engine processes the request.

Carlos Figueira does a great job in this MSDN article explaining how to write such a handler to implement CORS support for ASP.NET Web API. This code is based directly on Carlos’ article.

public class CorsHandler : DelegatingHandler
{
	private const string AccessControlAllowHeaders 
		= "Access-Control-Allow-Headers";
	private const string AccessControlAllowMethods 
		= "Access-Control-Allow-Methods";
	private const string AccessControlAllowOrigin 
		= "Access-Control-Allow-Origin";
	private const string AccessControlRequestHeaders 
		= "Access-Control-Request-Headers";
	private const string AccessControlRequestMethod 
		= "Access-Control-Request-Method";
	private const string Origin = "Origin";

	protected override Task<HttpResponseMessage> SendAsync(
		HttpRequestMessage request, 
		CancellationToken cancellationToken)
	{
		var isCorsRequest = request.Headers.Contains(Origin);
		var isPreflightRequest = request.Method == HttpMethod.Options;

		if (isCorsRequest)
		{
			if (isPreflightRequest)
			{
				var response = new HttpResponseMessage(HttpStatusCode.OK);

				response.Headers.Add(
					AccessControlAllowOrigin, 
					request.Headers.GetValues(Origin).First());

				var accessControlRequestMethod = request.Headers.GetValues(
					AccessControlRequestMethod).FirstOrDefault();

				if (accessControlRequestMethod != null)
				{
					response.Headers.Add(
						AccessControlAllowMethods, 
						accessControlRequestMethod);
				}

				IEnumerable<string> headerValues;
				var requestedHeaders = request.Headers.TryGetValues(
					AccessControlRequestHeaders, out headerValues)
						? String.Join(", ", headerValues)
						: null;

				if (!string.IsNullOrEmpty(requestedHeaders))
				{
					response.Headers.Add(
						AccessControlAllowHeaders, 
						requestedHeaders);
				}

				var tcs = new TaskCompletionSource<HttpResponseMessage>();
				tcs.SetResult(response);
				return tcs.Task;
			}
			else
			{
				return base.SendAsync(
					request, 
					cancellationToken)
					.ContinueWith<HttpResponseMessage>(t =>
				{
					var resp = t.Result;
					resp.Headers.Add(
						AccessControlAllowOrigin, 
						request.Headers.GetValues(Origin).First());
					return resp;
				});
			}
		}
		else
		{
			return base.SendAsync(request, cancellationToken);
		}
	}
}

Implementing a delegating handler for a missing content-type header when using XDomainRequest

A side-effect of using the XDomainRequest object for requests coming from IE is that it doesn’t allow you to manipulate the request headers.

This is critical because a request whose body contains JSON should carry a content-type header of application/json, which you would normally set on an XMLHttpRequest but cannot set on an XDomainRequest.

The solution to this is to write another Delegating Handler. In this case, the handler inspects the request to see if its content-type header is present. If it is not, it adds it and sets its value to application/json before passing it along to the ASP.NET Web API engine.

Here’s the code for this handler:

public class ContentTypeHandler : DelegatingHandler
{
	protected override Task<HttpResponseMessage> SendAsync(
		HttpRequestMessage request, 
		CancellationToken cancellationToken)
	{
		if (request.Method == HttpMethod.Post 
			&& request.Content.Headers.ContentType == null)
		{
			request.Content.Headers.ContentType 
				= new MediaTypeHeaderValue("application/json");
		}

		return base.SendAsync(request, cancellationToken);
	}
}

Conclusion

It’s a pity that you have to jump through so many hoops to get true cross-browser CORS support working. I’m hoping this becomes more of a configuration/deployment issue than a development one.

There’s a small shortcoming in the JavaScript executeRequest function that I’d like to call out. The non-XDomainRequest section of the code supports JavaScript promises, allowing you to use jQuery functionality such as $.when() to wait until a set of asynchronous operations have completed. I’m still working on getting this working properly for IE by manually creating the jQuery Deferred object.
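One way to sketch the idea (using a native Promise here for brevity; with jQuery you’d use a Deferred, as noted above): adapt the (doneCallback, failCallback) pair into a promise so the XDomainRequest path can participate in $.when()-style composition. `makeRequestPromise` is a hypothetical helper name, not part of the transport code in this post.

```javascript
// Wrap a callback-style request function in a promise. The transport's
// executeRequest invokes doneCallback on success and failCallback on
// failure, which maps directly onto resolve/reject.
function makeRequestPromise(executeRequest, method, url, data) {
    return new Promise(function (resolve, reject) {
        executeRequest(method, url, data, resolve, reject);
    });
}
```

The jQuery equivalent would create a $.Deferred() inside the XDomainRequest branch, resolve it from xdr.onload, reject it from xdr.onerror, and return deferred.promise().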

Facebook and HTML5

Lots of reaction this week to Zuckerberg’s statement during TechCrunch Disrupt that investing in HTML5 was “one of the biggest mistakes if not the biggest strategic mistake that we made.”

Facebook used HTML5 technology in a really smart way in their iOS app by essentially iframe’ing a version of their mobile website inside an iOS app shell using something called a UIWebView.

This allowed them to fix bugs, push updates, or even make major UI changes without needing to push out a new version of the app through the App Store. Everything happened on the server side and was available in the app right away.

Dirk de Kok has a fantastic writeup about how this was actually implemented. Problem was, it was horribly slow. Dirk explains why that particular architecture choice caused the Facebook iOS app to perform so badly.

Facebook recently released an all new native iOS application, and guess what? It’s faster. Why? Because it’s a fully native app written in Objective-C.

The promise of HTML5 is very much like what we heard years ago with Java: write once, run everywhere.  Unfortunately, it can also be: write once, suck everywhere.

I completely understand why you would want to abstract out core pieces of an application and reuse them across platforms. It doesn’t seem like that’s gonna cut it if you want to build a best-in-class application for each platform. If you want the best performance, you’re gonna have to go native on the platforms that matter most to you.

There’s nothing wrong with HTML5; Facebook was just holding it wrong!


SharePoint – Configuring log4net in a SharePoint Application

Cross-posted from my Clarity blog

Configuring log4net in a SharePoint or ASP.NET application is pretty straightforward, but no matter how many times I do it, I always forget something small and waste time troubleshooting why logging isn’t working.

Here’s a guide to configuring logging using log4net in a SharePoint application. This post will cover:

  • Creating and deploying a log4net configuration file
  • Making the necessary web.config changes to support logging

log4net deployment is documented extensively, so I’ll keep this at a high level.

Creating and Deploying a log4net Configuration File

It’s always best to put your log4net configuration in a dedicated config file; this way you keep your SharePoint web.config file clean and don’t have to deal with the SPWebConfigModification class.

The part I always forget is that when you have the log4net configuration in a separate file, you need to tell your assembly the name of that configuration file. You do that by adding the following attribute to the AssemblyInfo.cs file of the SharePoint project or class library that contains your logger.

According to the documentation from Apache, this attribute is read and processed the first time you invoke log4net: when your logger calls LogManager.GetLogger, log4net looks for this attribute and reads the configuration information.

[assembly: log4net.Config.XmlConfigurator(
 ConfigFile = "log4Net.config", Watch = true)]

You can examine properties of the logger such as IsDebugEnabled and IsErrorEnabled to check that everything is working properly.

You can then use a SharePoint Timer job to deploy the log4net.config file to the root of your SharePoint web application.

In a Web Application scoped feature, you can run a SharePoint Timer job in the FeatureActivated handler:

var log4NetDeployJob = new log4NetConfigDeploymentJob(
    _jobName, webApplication);
var schedule = new SPOneTimeSchedule(DateTime.Now);
log4NetDeployJob.Schedule = schedule;
log4NetDeployJob.Update();
webApplication.JobDefinitions.Add(log4NetDeployJob);
webApplication.Update();
log4NetDeployJob.RunNow();

The Execute method of the Timer Job extracts the log4net.config file from where it was deployed with your feature and copies it to the virtual directory:

public override void Execute(Guid targetInstanceId)
{
    var webApp = this.Parent as SPWebApplication;
    foreach (KeyValuePair<SPUrlZone, SPIisSettings> setting in webApp.IisSettings)
    {
        var deployTo = setting.Value.Path.FullName + @"\log4Net.config";
        var filePath = SPUtility.GetGenericSetupPath(
            @"TEMPLATE\FEATURES\log4Net\Configuration\log4Net.config");
        File.Copy(filePath, deployTo, true);
    }

    base.Execute(targetInstanceId);
}

SPUtility.GetGenericSetupPath resolves a path relative to the 14 hive, so “TEMPLATE\FEATURES\log4Net\Configuration\log4Net.config” yields the full path to the log4Net.config file inside the feature.

Web.config Modifications

You now need to make some changes to the web application’s web.config file.

Add log4Net configuration section in configuration/configSections:

<section name="log4net"
  type="log4net.Config.Log4NetConfigurationSectionHandler, Log4net"/>

Add the following before the end of the configuration section:

<log4net configSource="log4Net.config" />
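For orientation, here’s roughly where the two pieces end up in web.config (trimmed to just the log4net-related parts; your actual file will contain much more):

```
<configuration>
  <configSections>
    <!-- existing SharePoint section declarations... -->
    <section name="log4net"
      type="log4net.Config.Log4NetConfigurationSectionHandler, Log4net"/>
  </configSections>

  <!-- ...the rest of the web.config... -->

  <log4net configSource="log4Net.config" />
</configuration>
```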

That’s it! Assuming you created a logger class in your application, you can start logging away. Check out the documentation from Apache for information about how to write a logger.

SharePoint – Timing of Master Page JavaScript

Our base SharePoint 2010 Publishing master page calls a JavaScript function that does things such as:

  • Initialize jQuery plugins
  • Configure some top-navigation tweaks
  • Miscellaneous items such as setting the date in the copyright footer
  • Initialize Google Analytics

We call this function right before the end of the head section of the master page. The contents of the JavaScript function are enclosed in a jQuery $(document).ready block, meaning that the function will run after the DOM is ready.

A drawback to this approach is that JavaScript that manipulates elements on the screen only executes after the DOM is ready. This isn’t a great user experience because the user can see the “jumpiness” on screen as the script executes.

I ran this by our resident web ninja Jacob Gable and he suggested breaking up the JavaScript function into smaller chunks and calling those chunks directly after the HTML content that they affect; for example:

<div id="footer">
    <span id="footer-year"></span>&nbsp;All rights reserved.
    <script type="text/javascript">
        SetCopyrightDate();
    </script>
</div>

where SetCopyrightDate uses jQuery to set the date to the current year:

function SetCopyrightDate() 
{
    var today = new Date();
    var yr = today.getFullYear();
    $("#footer-year").text("© " + yr);
}

This is pretty obvious in retrospect: your JavaScript can be sprinkled throughout the page as needed. People are often scared away from this because they want a super-clean master page.

However, in this case, pulling out some JavaScript functionality into separate functions and calling them from the master page does the trick.

The remaining JavaScript can stay in a global function and run in a $(document).ready block; for us, all this does now is initialize jQuery plugins we’re using throughout the site.