Helping The Open Medicine Project through Microsoft 4Afrika

Yesterday I got the confirmation email that I will be going to South Africa to help The Open Medicine Project (TOMPSA) with their truly innovative guide apps for medical professionals in developing countries. And I will be flying out on Friday! The gig is organized by the Microsoft 4Afrika program, which is run and funded by Microsoft to support the development of the African continent. I signed up through one of its projects, MySkills4Afrika, and I will be going down there to help out with TOMPSA’s move onto the Windows platform.

Microsoft 4Afrika

The Open Medicine Project has developed five different apps that solve problems that often occur in the world of medicine in developing countries. You can find more information about their apps on their website. One of the founders, Mohammed Dalwai, gave a TEDx talk in Cape Town explaining his story and how he co-founded The Open Medicine Project to provide solutions to simple but important problems in medicine. Check out the YouTube video below about his story in Pakistan and the mobile triage app.

I’m extremely excited to be going. I can’t wait to meet the people working at TOMPSA and hopefully be able to help them move further forward on their journey.

Moving from MSDN to

After having had my blog on MSDN for a while, I decided to move back to my own domain. I will continue to run my blog here from now on, and this will be the first post I write here.

Almost four months ago I changed to a new role internally at Microsoft, namely as a Tech Evangelist. Now that I have gotten a little more into the role, I have decided to take up my blog actively again (after a period with just a few posts now and again). I plan to write a lot more about the things I encounter in my job, but also outside of work. Since some of the things I plan on writing about won’t strictly be related to my job at Microsoft, I decided to move the blog over to my own domain. I feel that gives me a little more freedom and flexibility.


Parrot AR.Drone 2.0 controlled by Kinect for Windows

At a recent hackathon at an internal Microsoft event, I got to work with the Parrot AR.Drone 2.0 together with an amazing team of fellow tech evangelists at Microsoft. We had 8 hours to put together a fun project with some cool gadgets that were at our disposal for the day. The team and I ended up building a Kinect controller for the drone, essentially allowing you to control the drone using specific gestures in front of a Kinect for Windows sensor.

Check it out below and please leave comments if you’ve got feedback!
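If you are curious about how something like this can be wired up, here is a minimal sketch of the gesture-to-command mapping idea. To keep it self-contained I have left out the actual Kinect SDK and AR.Drone client wiring; the IDroneClient interface and the joint-position parameters are placeholders I made up for illustration, not the code we wrote at the hackathon.

```csharp
using System;

// Stand-in for whatever AR.Drone client library you use -- not an actual drone API.
public interface IDroneClient
{
    void TakeOff();
    void Land();
    void Move(float roll, float pitch, float verticalSpeed); // values in -1..1
}

public class GestureDroneController
{
    private readonly IDroneClient _drone;

    public GestureDroneController(IDroneClient drone)
    {
        _drone = drone;
    }

    // Call this for every skeleton frame with the tracked joint positions
    // (Kinect skeleton space: metres, relative to the sensor).
    public void OnSkeletonFrame(
        float leftHandX, float leftHandY,
        float rightHandX, float rightHandY,
        float headY, float hipY)
    {
        bool bothHandsAboveHead = leftHandY > headY && rightHandY > headY;
        bool bothHandsBelowHips = leftHandY < hipY && rightHandY < hipY;

        if (bothHandsAboveHead)
        {
            _drone.TakeOff();   // raise both hands to take off
        }
        else if (bothHandsBelowHips)
        {
            _drone.Land();      // drop both hands to land
        }
        else
        {
            // Steer with the hands: the average horizontal hand position controls
            // roll (left/right), and the height difference between the hands
            // controls pitch (forward/backward).
            float roll = Clamp((leftHandX + rightHandX) / 2f);
            float pitch = Clamp(rightHandY - leftHandY);
            _drone.Move(roll, pitch, 0f);
        }
    }

    private static float Clamp(float value)
    {
        return Math.Max(-1f, Math.Min(1f, value));
    }
}
```

In a real setup the joint positions come from the Kinect skeleton stream, and the commands are sent to the drone over its network protocol through an AR.Drone client library.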

Tutorial: How to Train a Neural Network with Azure Machine Learning

I recently stumbled upon James McCaffrey’s session on Neural Networks from Build 2014, and I thought it was cool how he had implemented the back-propagation algorithm in C# for training neural networks. If you haven’t seen James’ session yet, I strongly recommend that you do so. It’s a great session that presents the idea of neural networks really well and demonstrates how you can implement and train your own using .NET and Visual Studio.

Immediately after watching the session I opened up Visual Studio to try and do what James had just done. Rather than the Iris data set, I used the Wisconsin Breast Cancer data set, and after just a bit of trial and error I had successfully trained my own neural network with an accuracy of about 97-98%. I was pretty thrilled! I refactored some of James’ original code, so if you want to try and replicate the example from his Build session, you can go ahead and take a look at the refactored solution, which I uploaded to GitHub. You should be able to just clone the repository and build the code yourself. Try it out:
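If you just want the gist before digging into the repository, here is a heavily simplified sketch of what back-propagation for a single-hidden-layer network looks like. This is my own minimal illustration, not James’ code and not the exact code in the repo: sigmoid activations, squared error and plain stochastic gradient descent, with no momentum or weight decay.

```csharp
using System;

// Minimal single-hidden-layer neural network trained with back-propagation.
public class SimpleNeuralNetwork
{
    private readonly int _numInput, _numHidden, _numOutput;
    private readonly double[,] _ihWeights; // input -> hidden
    private readonly double[,] _hoWeights; // hidden -> output
    private readonly double[] _hBiases, _oBiases;
    private readonly double[] _hOutputs;
    private readonly Random _rnd = new Random(0);

    public SimpleNeuralNetwork(int numInput, int numHidden, int numOutput)
    {
        _numInput = numInput; _numHidden = numHidden; _numOutput = numOutput;
        _ihWeights = new double[numInput, numHidden];
        _hoWeights = new double[numHidden, numOutput];
        _hBiases = new double[numHidden];
        _oBiases = new double[numOutput];
        _hOutputs = new double[numHidden];

        // Small random initial weights (biases start at zero).
        for (int i = 0; i < numInput; i++)
            for (int j = 0; j < numHidden; j++)
                _ihWeights[i, j] = (_rnd.NextDouble() - 0.5) * 0.1;
        for (int j = 0; j < numHidden; j++)
            for (int k = 0; k < numOutput; k++)
                _hoWeights[j, k] = (_rnd.NextDouble() - 0.5) * 0.1;
    }

    private static double Sigmoid(double x)
    {
        return 1.0 / (1.0 + Math.Exp(-x));
    }

    // Forward pass: input -> hidden -> output, sigmoid at every node.
    public double[] ComputeOutputs(double[] inputs)
    {
        var outputs = new double[_numOutput];
        for (int j = 0; j < _numHidden; j++)
        {
            double sum = _hBiases[j];
            for (int i = 0; i < _numInput; i++)
                sum += inputs[i] * _ihWeights[i, j];
            _hOutputs[j] = Sigmoid(sum);
        }
        for (int k = 0; k < _numOutput; k++)
        {
            double sum = _oBiases[k];
            for (int j = 0; j < _numHidden; j++)
                sum += _hOutputs[j] * _hoWeights[j, k];
            outputs[k] = Sigmoid(sum);
        }
        return outputs;
    }

    // One stochastic gradient descent step on a single training example.
    public void Train(double[] inputs, double[] targets, double learningRate)
    {
        double[] outputs = ComputeOutputs(inputs);

        // Output-layer error signal: (target - output) * sigmoid'(output).
        var oGrads = new double[_numOutput];
        for (int k = 0; k < _numOutput; k++)
            oGrads[k] = (targets[k] - outputs[k]) * outputs[k] * (1 - outputs[k]);

        // Hidden-layer error signal, back-propagated from the output layer.
        var hGrads = new double[_numHidden];
        for (int j = 0; j < _numHidden; j++)
        {
            double sum = 0.0;
            for (int k = 0; k < _numOutput; k++)
                sum += oGrads[k] * _hoWeights[j, k];
            hGrads[j] = sum * _hOutputs[j] * (1 - _hOutputs[j]);
        }

        // Update hidden -> output weights and biases.
        for (int j = 0; j < _numHidden; j++)
            for (int k = 0; k < _numOutput; k++)
                _hoWeights[j, k] += learningRate * oGrads[k] * _hOutputs[j];
        for (int k = 0; k < _numOutput; k++)
            _oBiases[k] += learningRate * oGrads[k];

        // Update input -> hidden weights and biases.
        for (int i = 0; i < _numInput; i++)
            for (int j = 0; j < _numHidden; j++)
                _ihWeights[i, j] += learningRate * hGrads[j] * inputs[i];
        for (int j = 0; j < _numHidden; j++)
            _hBiases[j] += learningRate * hGrads[j];
    }
}
```

Training then just means looping over the (normalized) training examples for a number of epochs, calling Train on each one, and measuring accuracy on a held-out test set afterwards.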

After completing the example with the Breast Cancer data set by coding it myself, I thought of using Azure Machine Learning to do the same job. Azure ML is a new service in Azure, and it offers an incredibly powerful set of tools for machine learning. It’s still in preview, but the product is already so mature that you can go ahead and start working with it today. Since I already had my Breast Cancer data set, I wanted to see how fast I could train the same network using Azure ML instead of custom code in C#. And, oh boy, it was easy – and fast! I have recorded a tutorial on how to set up the experiment in Azure ML Studio, so if you want to do the same, you can see how it’s done. Check out the video here:


Here is a list of resources I refer to in the video:

As always: Have fun playing around with Azure ML! If you have any questions please don’t hesitate to write me a comment below or send me an email. I will be happy to help! 🙂
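One follow-up step that isn’t covered in the video: once the experiment is trained, ML Studio lets you publish it as a web service so you can score new samples from your own code. The sketch below shows roughly what that call looks like from C#. The endpoint URL, API key and column names are placeholders (copy the real ones from your web service dashboard), and the request body follows the general shape of the sample request ML Studio generates for classic experiments, so adjust it to match your own input schema.

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class ScoreBreastCancerSample
{
    static async Task Main()
    {
        // Placeholders -- copy the real values from the web service dashboard in ML Studio.
        const string apiKey = "<your API key>";
        const string endpoint = "<your request/response endpoint URL>";

        // Classic ML Studio services expect a JSON body listing the input columns and rows.
        // The column names here are just for illustration; use the ones from your data set.
        const string requestBody = @"{
          ""Inputs"": {
            ""input1"": {
              ""ColumnNames"": [ ""ClumpThickness"", ""CellSizeUniformity"", ""BareNuclei"" ],
              ""Values"": [ [ ""5"", ""1"", ""1"" ] ]
            }
          },
          ""GlobalParameters"": {}
        }";

        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", apiKey);

            var content = new StringContent(requestBody, Encoding.UTF8, "application/json");
            HttpResponseMessage response = await client.PostAsync(endpoint, content);

            // The response contains the scored label and probabilities as JSON.
            Console.WriteLine(await response.Content.ReadAsStringAsync());
        }
    }
}
```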

KiteBuddy: Integrating Cortana Voice Commands into your Windows Phone 8.1 app

When the Windows Phone 8.1 SDK was released, it came with a bunch of interesting new things you can do as an app developer. One of these is the new API for integrating your app into the speech recognition experience in Windows Phone 8.1.

To get an introduction to how this works, I recommend watching session 2-530 from this year’s BUILD conference. Avery Bishop, Rob Chambers and Monica Smith did a great job presenting the new capabilities, demonstrating the actual functionality as well as showing how easy it is to set up from a developer’s point of view. Check out the session on Channel 9 here:

There are also a few articles on MSDN that serve as good resources for learning more about how it works. You should check out these three:

In this blog post, I will show how I integrated Voice Commands into my app, KiteBuddy. I will demonstrate how it works from a user perspective as well as the actual Voice Command Definition XML file behind it. Shoot me a comment below the article if you have any questions about how this works. I will be happy to help out! 🙂
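Before we get to the XML itself, one practical detail: the Voice Command Definition file has to be registered with the speech system once, typically when the app starts. Here is a rough sketch of what that looks like, assuming a Windows Phone 8.1 WinRT (universal) project and an example file name of VoiceCommands.xml; Silverlight apps use the older Windows.Phone.Speech.VoiceCommands API instead.

```csharp
using System;
using Windows.ApplicationModel;
using Windows.Media.SpeechRecognition;
using Windows.Storage;

public static class VoiceCommandSetup
{
    // Call this once during app startup, e.g. from OnLaunched.
    public static async void RegisterVoiceCommands()
    {
        try
        {
            // "VoiceCommands.xml" is an example name -- the VCD file just has to be
            // part of the app package (Build Action = Content).
            StorageFile vcdFile = await Package.Current.InstalledLocation
                .GetFileAsync("VoiceCommands.xml");

            await VoiceCommandManager.InstallCommandSetsFromStorageFileAsync(vcdFile);
        }
        catch (Exception ex)
        {
            // Registration can fail e.g. if the XML doesn't validate against the schema.
            System.Diagnostics.Debug.WriteLine("Installing voice commands failed: " + ex);
        }
    }
}
```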

Voice Command Definition XML

See the XML of the Voice Command Definition installed with the app here:
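To give you an idea of the format, a minimal VCD looks roughly like the example below. The command set, prefix, phrases and navigation target here are purely illustrative and not the actual contents of KiteBuddy’s file.

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Illustrative example only, not the actual file shipped with KiteBuddy. -->
<VoiceCommands xmlns="http://schemas.microsoft.com/voicecommands/1.1">
  <CommandSet xml:lang="en-us" Name="englishCommands">
    <!-- The prefix the user says to address the app: "KiteBuddy, ..." -->
    <CommandPrefix>KiteBuddy</CommandPrefix>
    <Example>Show conditions for my favourite spot</Example>

    <Command Name="showSpot">
      <Example>Show conditions for Amager Strand</Example>
      <ListenFor>Show [the] conditions for {spot}</ListenFor>
      <Feedback>Looking up conditions for {spot}...</Feedback>
      <Navigate Target="SpotPage.xaml" />
    </Command>

    <PhraseList Label="spot">
      <Item>Amager Strand</Item>
      <Item>Klitmøller</Item>
    </PhraseList>
  </CommandSet>
</VoiceCommands>
```

In short: CommandPrefix is what the user says to address the app, each ListenFor defines a phrase pattern (square brackets mark optional words, curly braces reference a PhraseList), Feedback is what the speech UI says back, and Navigate carries the target string the app receives when the command fires.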