Improvements to the build system

The build system for all (preview) target images has just been updated to reduce complexity and aid interoperability. These changes are particularly relevant if you're interested in local builds and debugging of the interpreter.

Until this change, the following applied:

  1. The CI/CD pipeline (based on Azure Pipelines) was self-contained and completely autonomous.
  2. Local builds relied on developers following a recipe to download and install all the required tools. ESP32 developers were luckier than the rest, as there were some PowerShell scripts that downloaded and installed some of the required tools.

The above led to situations where developers struggled to follow the setup and configuration instructions, and became frustrated trying to reconcile the scripts used by Azure Pipelines with the manual install steps. Not to mention the build instructions and guides in the documentation continually falling behind.

Considering that we want to keep using the powerful Azure Pipelines (which is able to consume PowerShell scripts), the path seemed quite obvious: a common base of PowerShell scripts, consumed either by Azure Pipelines or locally by developers as needed.

Of course, there will still be variations in all of this, mainly because the Azure Pipelines build agent already includes some pre-installed tools, and the download locations differ between the agent and a local machine, where a developer has full control over where the tools should be installed. All of this can be factored into the PowerShell scripts and the various Pipelines YAML documents.
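To illustrate the idea, here is a sketch of how one shared script can serve both worlds. The pipeline step below is hypothetical (the actual pipeline definitions live in the repos); the script name and arguments are taken from the local example further down.

```yaml
# Sketch: the same PowerShell script is consumed by Azure Pipelines...
steps:
- task: PowerShell@2
  displayName: 'Install nanoFramework build tools'
  inputs:
    filePath: 'install-nf-tools.ps1'
    arguments: '-TargetSeries ESP32 -Path $(Agent.ToolsDirectory)'
# ...and can also be run directly by a developer from a local shell:
#   .\install-nf-tools.ps1 -TargetSeries ESP32 -Path 'Z:\my-nftools'
```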

So what has changed?

Some wrapper scripts have been added to deal with differences in the required tools for the various target platforms.

New scripts were added to install ALL the required tools. And this time it's true (except for Visual Studio Code itself).

Also, a new script was added to wrap the calls to CMake providing a true “fire-and-forget” experience for those who prefer to build locally without having to worry about anything at all.

On top of this, taking the build system a couple of notches up, there are now launch and build configurations per target, living in each target folder. To make this really useful and developer friendly, a new script is also available to fully set up VS Code. It sets all of the required entries in settings.json and composes the launch configurations and CMake variants by grabbing the individual configurations from the various targets. And I really do mean ALL of it: even the tool paths are updated to match the installs from the previous step.
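To give an idea of what the setup script produces, here is a purely illustrative fragment of a generated settings.json; the keys and paths below are assumptions for the sake of example, not the exact entries the script writes.

```json
// Illustrative fragment of a generated .vscode/settings.json
// (VS Code accepts comments in this file)
{
  "cmake.generator": "Ninja",
  "cortex-debug.armToolchainPath": "Z:/my-nftools/gcc-arm-none-eabi/bin"
}
```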

The build guides have been updated to reflect and describe all of this, but I'll summarize it here to give a concise overview.

For example, to install all the required tools to build an ESP32 image in a specific location:

.\install-nf-tools.ps1 -TargetSeries ESP32 -Path 'Z:\my-nftools'

To build an image for the STM32 F429 Discovery board:

.\build.ps1 -Target ST_STM32F429I_DISCOVERY

To set up Visual Studio Code, run the dedicated setup script; the exact command is described in the updated build guides.

Hopefully this is an important step forward as far as the setup stage is concerned. So, there are no more excuses not to set up a local build system, and even debug it.

What are you waiting for? Go get it and have fun with .NET nanoFramework! 😉

New preview feeds for nanoFramework!

Until now nanoFramework has been using MyGet as its NuGet feed provider to host preview and test packages.

MyGet is a decent provider but only offers 100MB of storage on their free tier. They do offer 2GB to OSS projects (which is great), and we applied for it, but it turns out that setting up the account for the OSS offer is an absolute pain! After more than two weeks and dozens of emails exchanged with their support, we decided to just move.

We already run most of our infrastructure on Azure DevOps, and support for public feeds in Azure Artifacts was recently announced at Microsoft Build 2019. It supports NuGet packages and the offer includes 2GB of storage. Considering all this, we've decided to move everything there.

How does this impact you as a C# developer consuming NuGet packages? Not much! It's only a matter of editing the NuGet package source in Visual Studio to point to the new feed, and you're done.
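For example, the equivalent change in a nuget.config file could look like the sketch below; the source key is illustrative and the URL is a placeholder for the actual feed URL.

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- placeholder: substitute the actual Azure Artifacts feed URL -->
    <add key="nanoframework-preview" value="https://example.com/placeholder/nuget/v3/index.json" />
  </packageSources>
</configuration>
```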

The URL for the new feed is:

This impacts the VS extension preview versions too. We are now publishing those on the Open VSIX Gallery. Likewise, if you are eager to get the latest and greatest and you're already subscribed to the preview feed, you have to update the URL to:

For a detailed description on how to setup all this in Visual Studio, please read the earlier blog post here.

Have fun with nanoFramework!

SonarCloud is on nanoFramework repos!

Code quality is something that is high on our priority list at nanoFramework.

Being on the list is not of much use on its own, though: quality must be measurable and comparable against an accepted standard.

For this reason, since last week we’ve been busy adding the awesome SonarCloud tool to all our class libraries repositories on GitHub.

SonarCloud is an analysis tool that checks code for issues and defects that aren't caught by the compiler but are problems nonetheless: code that hides unintended behaviour, a potential security breach, a violation of .NET principles and good practices, missing unit test coverage, and many others. These are grouped into bugs, vulnerabilities, code smells, coverage and duplications. A global classification of the project is then computed from these into a score.

The SonarCloud folks are supporting Open Source projects by offering their tool free of charge. Nice!

Its setup is straightforward and takes only a few minutes per project. SonarCloud provides an Azure Pipelines task, which is great for nanoFramework, as it allows us to analyse PRs as they are submitted. The same goes for commits, which show the various scores and metrics right after being built.
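Wiring the SonarCloud task into an Azure Pipelines job looks roughly like this; the service connection name, organization and project key below are placeholders, not the values used by nanoFramework.

```yaml
steps:
- task: SonarCloudPrepare@1
  inputs:
    SonarCloud: 'sonarcloud-connection'   # placeholder service connection
    organization: 'my-org'
    scannerMode: 'MSBuild'
    projectKey: 'my-project-key'
# ... the usual build steps go here ...
- task: SonarCloudAnalyze@1
- task: SonarCloudPublish@1
  inputs:
    pollingTimeoutSec: '300'
```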

This has already pointed out a few code smells and bugs that were lurking in our code.

All in all, we are in very good shape! Just check it out! Most of our repos have their quality gate showing a green “pass” and score “A” in most of the categories. We do have some Cs and Ds, mostly because of code complexity, which is something that can be improved, but nothing serious.

One area where we are not performing that well (and, to be completely honest, are not performing at all) is code coverage. Our class libraries are still lacking unit tests, mostly because we haven't yet worked out a satisfactory way of running tests and collecting results when they require deployment to real hardware. But we'll get there (feel free to contribute)!

All systems green! (again)

After the release of v1.0 we turned a page, quite literally where our GitHub repositories' history is concerned: a release means tagging a point in the repository commit history. And right after that, all hell broke loose in our versioning system!

Continuous deployment and continuous delivery are great, but we must make sure that all of it actually works, especially all by itself, without requiring constant babysitting and tweaking. That was not what was happening here…

To give you some insight: we are using AppVeyor as the build platform (a big thanks to them for making it available to Open Source projects such as nanoFramework) and relied (until last week) on GitVersion to figure out the next version to use when running a build. Out of nowhere, GitVersion went nuts and suddenly wasn't able to compute the next version on some PRs and, especially, on git tags! After an insane number of hours spent trying to get to the root cause (including contacting AppVeyor support, asking for help on the project page – help that never came – and looking at other projects using it), we decided that the best course of action was to switch to a better alternative. Switching tools carries a degree of uncertainty and the need to change things that were working, so it's not something to be taken lightly… 😕

Surprisingly, a quick search didn't reveal many options, at least none that looked credible and reliable. One option popped up though: Nerdbank.GitVersioning. Digging a bit deeper, it seems this is the tool that Microsoft has been using lately on some of their major repositories, MSBuild included. So, if it's good enough to take care of MSBuild, it should suit our modest requirements… 😎
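With Nerdbank.GitVersioning, the version is driven by a version.json file at the repository root. A minimal sketch follows; the version number and ref spec are illustrative, not the ones used by nanoFramework.

```json
{
  "version": "1.0-preview.{height}",
  "publicReleaseRefSpec": [
    "^refs/tags/v\\d+\\.\\d+"
  ]
}
```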

In case you're wondering: between nf-interpreter, the debugger and the class libraries, there are now 27 repositories. That's a lot to maintain!! 😯

Moving forward and upward, climbing the learning curve that comes along with each new tool!

After making the required adjustments and tweaks to YAML, PowerShell, config files and VS projects, we (finally) have all systems green again.

Now, back to coding which is a lot more fun!! 😉