Microsoft held its Connect 2017 event last week, and there were lots of great announcements to get excited about. To see the entire keynote, go to https://www.microsoft.com/en-us/connectevent/default.aspx. Announcements ranged from AI to DevOps to iOS and Android development, and the key theme for Microsoft is Any Developer, Any App, Any Platform.
As a developer you spend a lot of your time in an IDE (Integrated Development Environment), whether it’s Xcode, Android Studio, Visual Studio, Visual Studio Code or anything else out there. For myself, my time is spent in Visual Studio, Visual Studio for Mac and Visual Studio Code, which runs on Mac and Windows. The goal for Visual Studio is to provide best-in-class tools for every developer.
The next few sections go over some of the announcement highlights and the technology we can use to build great solutions for our customers at RedBit. If you want to skip to specific sections in the keynote, I have included the approximate start time for each feature.
One of the new features that I think is fantastic for Visual Studio is the new “Live Share” feature. The demo showed a NodeJS app built and deployed using Docker, developed on a Mac with a Linux backend … like I said, any developer, any platform, any app!
Think of this as screen sharing and ‘pair programming’ on steroids. Imagine a pair programming session with developers at remote locations where, from within the IDE of their choice (VS Code or Visual Studio), both devs can see exactly the same breakpoints and live debug variables instead of resorting to ‘console.log’ debugging for a NodeJS web app.
I’ve been a mobile dev for a long time and have been using Xamarin for a while, so when I see new features I always get excited. There are a lot of features announced for mobile devs, and you can see the full list by Joseph Hill on the Xamarin Blog. There are two main things described below, but to see the keynote section done by James Montemagno, skip to 25:18 to watch the Xamarin updates.
I’ve been using Live Player in the preview with some production Xamarin apps and some proof of concepts. What it does is really accelerate the time to build out user interfaces using XAML. The great thing is it runs on an actual device and is not emulated. So as you type, it will compile, push to the device, and you have a ‘remote view’ of the device on your desktop. It’s pretty much the same as ‘live updating’ when building out web apps, but this is native code running on a device, not in a browser.
Xamarin.Forms 2.5 is finally released; again, I used it in preview for some POCs but am happy it’s finally out. The feature I look forward to is Layout Compression. On mobile, every second counts, especially when delays in mobile apps cause more stress than horror movies! Layout compression helps optimize user interface rendering speed by removing unnecessary layout containers from the native view hierarchy. Just imagine the stress you are reducing by using Layout Compression!
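As a minimal sketch of what this looks like in Xamarin.Forms XAML, marking a layout as ‘headless’ with the `CompressedLayout.IsHeadless` attached property removes it from the native view hierarchy while its children still render. Note it should only be applied to layouts that have no background color, gestures, or transforms of their own:

```xml
<!-- This StackLayout only arranges its children, so it can be
     flattened out of the native visual tree to speed up rendering. -->
<StackLayout CompressedLayout.IsHeadless="true">
    <Label Text="Welcome" />
    <Label Text="Every second counts!" />
</StackLayout>
```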
Visual Studio App Center is the combination of HockeyApp and Xamarin Insights, and it’s now generally available. To view this part of the keynote, start at 39:25.
With App Center you can test, deploy, and monitor your applications. You can set up a CI environment, automate UI tests in the cloud on real devices, set up continuous delivery to testers, capture crash reports, and gather real-time analytics.
One interesting feature is being able to segment your users into particular groups and send out push notifications to them. To see an example of push notifications in the keynote, go to 46:27. Here a push notification is sent to users based in the US for the Thanksgiving holiday. A great way to use data to keep users engaged in your apps!
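As a rough sketch of what a segmented push could look like against App Center’s REST API — the endpoint shape, field names, campaign name, and the “US Users” audience below are my assumptions for illustration, not something shown in the keynote:

```python
import json

# Hypothetical App Center push endpoint; {owner} and {app} are placeholders.
APP_CENTER_URL = "https://api.appcenter.ms/v0.1/apps/{owner}/{app}/push/notifications"

def build_push(title, body, audience):
    """Build the JSON body for a push notification targeted at one audience segment."""
    return {
        "notification_content": {
            "name": "holiday-campaign",  # hypothetical campaign name
            "title": title,
            "body": body,
        },
        "notification_target": {
            "type": "audiences",
            "audiences": [audience],  # the segment defined in App Center Analytics
        },
    }

payload = build_push("Happy Thanksgiving!", "Enjoy the holiday.", "US Users")
print(json.dumps(payload, indent=2))
```

You would POST this body to the notifications endpoint with your API token in an `X-API-Token` header; the audience itself is defined ahead of time from analytics properties (country, app version, and so on).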
Start watching from 1:03:35 to see Donovan Brown show the different features. Basically, this is how you want to manage your software projects from development to production, gathering feedback from users and iterating on your software. If you are manually FTPing code to servers or using ZIP files as your source control strategy (I have seen this), then watch this section of the keynote. This is how RedBit runs all our customer projects and our own software products.
Lots of new things are coming with AI and machine learning, and you can view the second part of the keynote here. The message I get is that Microsoft is trying to bring AI and ML to all developers, as usually you need a PhD or Master’s to build and train models. I don’t have a PhD (or a CS degree), but being able to train a machine to learn what an apple is (skip to 6:38 to view) by looking at a picture is amazing!
Then taking that model, exporting it to Core ML (see how to export at around 14:03) and running it on an iPhone using Xamarin to identify an apple without a connection to Azure is incredible (see around 17:16)! The amount of power we have in our pockets is amazing!
I would say do yourself a favor and just explore this, bring it into your software projects, start small, then explore how you can further customize and take advantage of AI and ML. I wrote a ‘skunk works’ project to do handwriting recognition from an image captured in a Xamarin.Forms app: the image was sent to a .NET Core API, which saved it in Azure Blob Storage, sent it to the Azure Cognitive Services Vision API, stored the results in SQL Azure, and sent back the recognized text.
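As a hedged sketch of the recognition step, here is roughly what the request to the 2017-era handwriting recognition endpoint looks like; the region, API version, and key below are placeholders to check against the current Vision API docs. The service replies with 202 Accepted and an `Operation-Location` header that you poll for the recognized text:

```python
import urllib.request

# Assumed 2017-era Vision API endpoint; region ("westus") and API
# version ("v1.0") are placeholders -- verify against current docs.
ENDPOINT = ("https://westus.api.cognitive.microsoft.com"
            "/vision/v1.0/recognizeText?handwriting=true")

def build_request(image_bytes, subscription_key):
    """Build the POST request that submits raw image bytes for handwriting OCR."""
    return urllib.request.Request(
        ENDPOINT,
        data=image_bytes,  # the captured image, straight from blob storage
        headers={
            "Ocp-Apim-Subscription-Key": subscription_key,
            "Content-Type": "application/octet-stream",
        },
        method="POST",
    )

# Placeholder bytes and key for illustration; urlopen(req) would submit the job.
req = build_request(b"\x89PNG...", "YOUR_KEY")
print(req.full_url)
```

In my project the equivalent call lived in the .NET Core API, but the shape is the same: submit the bytes, then poll the operation URL until the recognized text comes back.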
So in the end, lots of great announcements in the keynote. Now to bring some of these things into our customers’ software projects and our own products!