Lately I’ve been throwing around the idea of writing an app for my wedding, and started thinking of what I’d want to put in it. One feature that I thought could be fun would be to let guests upload photos directly from the app and display them both to other guests and on our site in real time. In this post I’ll go over how I was able to quickly put together a simple prototype for how this could work using SignalR, Azure Storage, and WebAPI.
First things first: all of the code for this sample is available on GitHub.
Here’s a short video showing the gallery in action:
The first upload is being done in a script called from LINQPad, and is basically the same code you’ll see later that is used from the iOS app.
To The Code!
Enough of that, let’s look at some code. This sample includes several projects:
- Gallery: MVC/WebAPI/SignalR project for viewing and uploading photos
- GalleryApp.Core: Shared core logic for apps to let them upload photos and listen for new uploads
- GalleryApp.Core.iOS: A file-linked version of GalleryApp.Core compiled for iOS
- GalleryApp-iOS: An iOS app that lets you upload photos and see what others upload
Let’s start with the Gallery project. In the sample project I have it set up to use the local Azure Storage emulator that makes it easy to test, but you can swap this out with a real connection string in Web.config if you want.
First we need an API endpoint for uploading new photos to our storage container:
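A minimal sketch of what that endpoint could look like. The controller and hub names (`PhotoController`, `PhotoHub`), the `photoAdded` client method, and the provider's constructor are assumptions for illustration; the actual code is in the GitHub repo.

```csharp
using System.IO;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.Http;
using Microsoft.AspNet.SignalR;

public class PhotoController : ApiController
{
    public async Task<HttpResponseMessage> Put()
    {
        // Only multipart form data makes sense for file uploads
        if (!Request.Content.IsMimeMultipartContent())
            return Request.CreateResponse(HttpStatusCode.UnsupportedMediaType);

        // Pipe the uploaded files through the custom provider (defined next),
        // which pushes them to Azure Storage
        var provider = new PhotoUploadStreamProvider(Path.GetTempPath());
        await Request.Content.ReadAsMultipartAsync(provider);

        // Broadcast the new photo URLs to everyone connected to the hub
        var hub = GlobalHost.ConnectionManager.GetHubContext<PhotoHub>();
        foreach (var url in provider.PhotoUrls)
            hub.Clients.All.photoAdded(url);

        return Request.CreateResponse(HttpStatusCode.Created);
    }
}
```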
When a PUT request gets routed to this controller, it pulls the files out of the form data and pipes them through a custom provider, defined next, that uploads them to Azure Storage. Once the upload completes, the controller broadcasts a message on a SignalR hub so that anybody listening will know about the new photo.
The custom storage provider used there looks like this:
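A sketch of that provider, following the approach from Yao Huang Lin's post mentioned below. The connection string name and the `photos` container name are assumptions here; swap in whatever your Web.config uses.

```csharp
using System.Collections.Generic;
using System.Configuration;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;

public class PhotoUploadStreamProvider : MultipartFormDataStreamProvider
{
    // Public URLs of the uploaded photos, for broadcasting after the upload
    public IList<string> PhotoUrls { get; private set; }

    public PhotoUploadStreamProvider(string rootPath) : base(rootPath)
    {
        PhotoUrls = new List<string>();
    }

    public override async Task ExecutePostProcessingAsync()
    {
        var account = CloudStorageAccount.Parse(
            ConfigurationManager.ConnectionStrings["StorageConnection"].ConnectionString);
        var container = account.CreateCloudBlobClient().GetContainerReference("photos");

        foreach (var fileData in FileData)
        {
            // Each file was saved locally by the base provider; push it to blob storage
            var fileName = fileData.Headers.ContentDisposition.FileName.Trim('"');
            var blob = container.GetBlockBlobReference(fileName);

            using (var stream = File.OpenRead(fileData.LocalFileName))
                await blob.UploadFromStreamAsync(stream);

            // Clean up the local copy and remember the public URL
            File.Delete(fileData.LocalFileName);
            PhotoUrls.Add(blob.Uri.ToString());
        }
    }
}
```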
Each file in the request is saved down to a local file, uploaded to Azure Storage, and then the local copy is deleted. This code is based on a helpful blog post I found by Yao Huang Lin. After each file is uploaded, its public URL is added to a collection so that it can be broadcast out.
The other piece we have referenced here that hasn’t been defined yet is the SignalR hub:
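Since all broadcasting happens from the controller, the hub can be an empty class (the `PhotoHub` name is assumed to match the controller sketch above):

```csharp
using Microsoft.AspNet.SignalR;

// No server-side methods needed; the controller broadcasts
// through GetHubContext<PhotoHub>() directly
public class PhotoHub : Hub
{
}
```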
Since PhotoController is manually broadcasting its message to clients, the hub itself only needs to exist, so we don’t have to define anything extra here.
Now that the API side is defined, it would be nice to have a web interface that displays the gallery as well, so let’s define a view for that:
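A rough version of that view. The script paths, hub name, and `photoAdded` method are assumptions matching the sketches above; SignalR camel-cases the hub name on the client side.

```html
<ul id="photos"></ul>

<script src="~/Scripts/jquery.signalR-2.0.0.min.js"></script>
<script src="~/signalr/hubs"></script>
<script>
    var hub = $.connection.photoHub;

    // When the server broadcasts a new photo, prepend it to the list
    hub.client.photoAdded = function (url) {
        $("#photos").prepend(
            $("<li>").append($("<img>").attr("src", url)));
    };

    $.connection.hub.start();
</script>
```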
As you can see, there’s not much going on here. In markup we just have a list for photos to get added to. When the page loads, it will connect to the SignalR hub and listen for messages about new photos added to the gallery. When new photos are received they are added to the list and displayed. Simple!
In a real application you’d definitely want to add in security to prevent anyone from being able to use your storage container, but I left that out of this sample to keep things simple. You might also want to do things like resizing, restricting file formats, etc.
Let’s take a look at the shared component that apps use to hook into the gallery. This component contains just two classes. The PhotoUploader class, as the name implies, takes a byte array representing an image along with its file extension, assigns it a unique filename, and sends that to the API we defined earlier.
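A sketch of what PhotoUploader could look like, built on HttpClient as mentioned below. The form field name and the Guid-based filename scheme are assumptions:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

public class PhotoUploader
{
    readonly string apiUrl;

    public PhotoUploader(string apiUrl)
    {
        this.apiUrl = apiUrl;
    }

    public async Task UploadPhoto(byte[] image, string fileExtension)
    {
        var fileContent = new ByteArrayContent(image);
        fileContent.Headers.ContentDisposition = new ContentDispositionHeaderValue("form-data")
        {
            Name = "\"file\"",
            // Assign a unique name so uploads never collide
            FileName = "\"" + Guid.NewGuid() + fileExtension + "\""
        };

        var content = new MultipartFormDataContent { fileContent };

        using (var client = new HttpClient())
            await client.PutAsync(apiUrl, content);
    }
}
```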
Next up is the PhotoListener class, which connects to the SignalR hub and listens for new photos. When a new photo is received, it raises an event that the app can respond to.
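A sketch of PhotoListener using the SignalR .NET client; the hub name and `photoAdded` method are assumed to match the server sketches above:

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.AspNet.SignalR.Client;

public class PhotoListener
{
    // Raised with the photo's public URL whenever the hub broadcasts one
    public event Action<string> PhotoReceived;

    readonly string hubUrl;

    public PhotoListener(string hubUrl)
    {
        this.hubUrl = hubUrl;
    }

    public async Task Connect()
    {
        var connection = new HubConnection(hubUrl);
        var proxy = connection.CreateHubProxy("PhotoHub");

        proxy.On<string>("photoAdded", url =>
        {
            var handler = PhotoReceived;
            if (handler != null)
                handler(url);
        });

        await connection.Start();
    }
}
```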
That’s it! The GalleryApp.Core.iOS project simply links these files into a Xamarin.iOS class library so they can be used from an app. These classes make use of some newer features like async/await and HttpClient, which are currently available in the alpha and beta channels of Xamarin.
Finally, let’s look at the iOS app. First we’ll look at the code, then discuss what’s going on in there:
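A condensed sketch of that view controller using MonoTouch.Dialog. The class name, server URL, and the `UploadPhoto` helper (shown afterwards) are placeholders:

```csharp
using System.Net.Http;
using MonoTouch.Dialog;
using MonoTouch.Foundation;
using MonoTouch.UIKit;

public class GalleryViewController : DialogViewController
{
    readonly Section photosSection = new Section();

    public GalleryViewController() : base(new RootElement("Gallery"), true)
    {
        Root.Add(photosSection);
    }

    public override async void ViewDidLoad()
    {
        base.ViewDidLoad();

        // Add button in the nav bar kicks off a photo upload
        NavigationItem.RightBarButtonItem = new UIBarButtonItem(
            UIBarButtonSystemItem.Add, (sender, e) => UploadPhoto());

        // Listen for photos uploaded by anyone, and show them as they arrive
        var listener = new PhotoListener("http://gallery.example.com");
        listener.PhotoReceived += async url =>
        {
            var bytes = await new HttpClient().GetByteArrayAsync(url);
            var image = UIImage.LoadFromData(NSData.FromArray(bytes));

            InvokeOnMainThread(() =>
                photosSection.Add(new ImageStringElement(url, image)));
        };

        await listener.Connect();
    }
}
```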
To simplify the UI creation, I’m making use of MonoTouch.Dialog, which ships with Xamarin.iOS and makes it really easy to create UI elements in code. When the view loads, we set the right bar button to an Add symbol, and attach a click handler to it that starts the process of uploading a photo. It then starts listening for new photos, and will add new ImageStringElement objects to the UI when any are received.
To upload a photo we just use the built-in iOS image picker, so we don’t have to write any of that code ourselves. Once a photo is chosen, it is converted to a JPEG, then to a byte array, and uploaded via the shared uploader code from earlier.
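That flow might look something like this; the endpoint URL is a placeholder, and `PhotoUploader` is the shared class sketched earlier:

```csharp
using MonoTouch.UIKit;

void UploadPhoto()
{
    var picker = new UIImagePickerController();

    picker.FinishedPickingMedia += async (sender, e) =>
    {
        picker.DismissViewController(true, null);

        // Convert the chosen photo to a JPEG byte array
        var image = (UIImage)e.Info[UIImagePickerController.OriginalImage];
        var bytes = image.AsJPEG().ToArray();

        // Upload it through the shared uploader from GalleryApp.Core
        var uploader = new PhotoUploader("http://gallery.example.com/api/photo");
        await uploader.UploadPhoto(bytes, ".jpg");
    };

    PresentViewController(picker, true, null);
}
```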
In this post we went over how you can combine the powers of Xamarin, SignalR, Azure, and WebAPI to create an easy real-time photo gallery with support for multiple platforms. The best part is that it didn’t require very much code to do, which is a real testament to the power of these technologies. You could also swap out Azure and WebAPI for other technologies that you may prefer, such as Amazon, Nancy, ServiceStack, etc. I was personally curious to try out Azure and WebAPI, which is why I chose to go that route, and both worked quite well in this prototype.