CI/CD with TwinCAT - part four
We are at the final article of this series of continous integration and delivery with TwinCAT. We’ve accomplished setting up & configuring the automation server, created a TwinCAT test library project and wrote a small batch script that will be launched every time a new push has been done to the GIT repository for the library project. Now it’s time for us to write the program that will do the actual static code analysis.
With the TwinCAT 3 integrated development environment being integrated into Visual Studio, Visual Studio will be our playground. To let us access the functionality of Visual Studio programmatically, Microsoft has created something called the Development Tools Environment (DTE). Many of the things that we take for granted in TwinCAT, such as build/rebuild/clean, are natural parts of Visual Studio and can be accessed through the DTE. The DTE does however not give access to the things that are specific to TwinCAT, which is quite a large amount of functionality! Just think about it: everything related to task creation, PLC projects, I/O configuration, TwinCAT library referencing and much more. To access TwinCAT-specific functionality we need to use the TwinCAT automation interface.
I’m not going to go into all the details of the TwinCAT automation interface, as it would most likely need (and deserve!) its own series of articles if we were to cover it all. What is important to know is that the TwinCAT automation interface is a necessary part of the toolchain in a CI/CD workflow. Now, while the TwinCAT automation interface is extensive and gives a lot of possibilities, the issue is that it doesn’t give you access to all TwinCAT-specific functionality, only a subset of it. I’m sure Beckhoff had good reasons for their selection of features, and from my perspective the selection (as of 4022.22) seems more oriented toward configuration of a complete solution. The problem we’re facing is that there is nothing in the TwinCAT automation interface for static code analysis. I’ve spoken to Beckhoff and they will investigate the possibility of adding static code analysis to the automation interface, though I have no idea when (or even if) it will be included. But should that stop us? Of course not. In the best of worlds, this is the list of items (requirements) that I would want to be able to handle through the automation interface:
- Define the rules for the static code analysis, including which ones should generate warnings and which should generate errors
- Load a pre-defined set of rules to apply for a project
- Run the static code analysis check (without executing a build)
- Do all of the above on a project that has never had any static code analysis configured before
The reason I put that last point there is basically that we don’t want any of our users/developers to have to care about setting all of this up. This should only be set up at one location (the TwinCAT build server), and these rules and settings should be the same for all TwinCAT executables and libraries within a specific project or subproject. Define once, run everywhere.
If Beckhoff ever decides to include static code analysis in the automation interface, we can quite easily extend our program with this functionality. But what are we supposed to do now, when we can’t do the above? Quite a lot actually, and the things we cannot do through the automation interface we will have to do manually. Once the rules for the static code analysis are defined and set, they are still executed whenever we do a build. And executing a build is something we can do through the DTE, as this is part of the standard functionality in Visual Studio. The result of the static code analysis is printed in the standard Visual Studio error and warning list. We get a great deal of other output as well (everything related to doing a build, such as build info/warnings/errors), but that can be filtered out. Let me explain with an example. All the static code analysis rules have a number allocated to them, which is a reference to the description in the manual giving the rationale for the rule. For instance, the rule for gaps in structures is described like this in the manual for TE1200:
This is a check to make sure that we don’t have any gaps in structures for the selected target CPU architecture (which can be avoided by filling the gaps – read more here and here on the subject). You see that the rule has the number SA0016? If we set this rule to generate warnings in TE1200, and then create an example structure ST_SA0016 that violates this rule and do a build, we’ll get:
Because all rules start with the string “SA”, we can filter them out from the rest of the output in the Visual Studio error list. But surely there must be some drawbacks with this solution compared to if Beckhoff had included the static code analysis directly in the TwinCAT automation interface? The big drawback is that we need to apply the static code analysis settings to all our projects. It is possible to load a file with pre-defined rules, but there are two big problems with that. First, we need to have a file that we’ll have to distribute to each and every owner of a TwinCAT project. Now imagine that we want to make a change to the static code analysis rules. This could potentially mean that we would have to manually update all those projects, which is a solution that doesn’t scale well. Second, we need to save these settings in the project itself. It would make much more sense to (as an option) have the possibility to load the rules externally, so that they can be defined outside of the project and in one place instead of potentially hundreds of places.
As long as this is not included in the TwinCAT automation interface, the static code analysis is far from optimal, especially in larger projects. Loading the rules manually is a good first step though, but I really do hope Beckhoff invests the time and effort to include it in the automation interface in such a way that it conforms to the requirements mentioned above. Also, enabling the static analysis rules in your project requires the TE1200 license on a USB stick (or license terminal), which means that any developer who wants to enable these rules will at some point need the USB stick (unless it is managed and controlled at a central location). Enabling the license on the USB stick is far from optimal (at least in the current latest release, TwinCAT 3.1.4022.22). Beckhoff requires that you enable the TE1200 license by scanning the I/O of your developer machine, and then activate the license for TwinCAT to recognize it. A more natural way would be for TwinCAT to recognize the USB stick and activate the stored licenses automatically. For now, we need to make sure that all our TwinCAT projects have the static code analysis rules defined. Because it’s our virtual machine that will do the actual static code analysis, we need to forward the USB stick to the VirtualBox machine.
With all that being said, now that we know some of the limitations it’s about time we start doing some coding. There are several language bindings to access the TwinCAT automation interface, such as:
- The .NET languages (C# or Visual Basic)
- Microsoft Powershell
- C++
Personally I prefer C++, but because most of the examples in the documentation are in C#, I’ll stick to that. Start Visual Studio and create a Visual C# Console App, and name it something of your choice (in my example, TcStaticAnalysisLoader). As this program will be executed from Jenkins we don’t need any fancy graphical user interface; a simple console application will do.
What we need to do next is to add the libraries that give us access to the DTE. In the solution explorer, go to References, right-click and click Add Reference. Click Assemblies -> Extensions and select EnvDTE and EnvDTE80.
Now we need to do the same thing for the TwinCAT automation interface. In the same window click on COM -> Type Libraries. Select the latest version of the Beckhoff TwinCAT XAE Base X.Y Type library (version 3.2 in the example below).
Now that we have all the references, we can write our program. First we need to parse the arguments that were provided to this program from the Windows batch file in the previous article. Instead of writing our own parser we can use the NDesk.Options option-parser library for C#. Add a reference to it and add options for the Visual Studio solution and TwinCAT project file paths.
OptionSet options = new OptionSet()
    .Add("v=|VisualStudioSolutionFilePath=", v => VisualStudioSolutionFilePath = v)
    .Add("t=|TwinCATProjectFilePath=", t => TwinCATProjectFilePath = t)
    .Add("?|h|help", h => showHelp = h != null);
I also added an option to print a help text. This is not strictly necessary as long as the program is used in the context of Jenkins, but it is handy if it is used as a stand-alone tool. Add a small helper method to print the help text. Everything that we print to the console (using Console.WriteLine()) is visible in Jenkins after the job has run, which is useful for analysis.
static void DisplayHelp(OptionSet p) {
    Console.WriteLine("Usage: TcStaticAnalysisLoader [OPTIONS]");
    Console.WriteLine("Loads the TwinCAT static code analysis loader program with the selected visual studio solution and TwinCAT project.");
    Console.WriteLine("Example: TcStaticAnalysisLoader -v \"C:\\Jenkins\\workspace\\TcProject\\TcProject.sln\" -t \"C:\\Jenkins\\workspace\\TcProject\\PlcProject1\\PlcProj.tsproj\"");
    Console.WriteLine();
    Console.WriteLine("Options:");
    p.WriteOptionDescriptions(Console.Out);
}
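For completeness, here is a minimal sketch of how the option parsing and the help text might be tied together in Main. The static fields, the Parse call and the argument check are my own assumptions and not copied from the original program (and a using directive for NDesk.Options is assumed); the return constants are described further down in this article.

static string VisualStudioSolutionFilePath = null;
static string TwinCATProjectFilePath = null;
static bool showHelp = false;

static int Main(string[] args) {
    OptionSet options = new OptionSet()
        .Add("v=|VisualStudioSolutionFilePath=", v => VisualStudioSolutionFilePath = v)
        .Add("t=|TwinCATProjectFilePath=", t => TwinCATProjectFilePath = t)
        .Add("?|h|help", h => showHelp = h != null);
    options.Parse(args);

    /* Show the help text if requested or if any of the mandatory paths is missing */
    if (showHelp || VisualStudioSolutionFilePath == null || TwinCATProjectFilePath == null) {
        DisplayHelp(options);
        return Constants.RETURN_ERROR;
    }

    /* ... continue with the version parsing, DTE creation and build described below ... */
    return Constants.RETURN_SUCCESSFULL;
}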
Running this will give us a nice output that clearly describes how to use this program:
Next we need to figure out which version of Visual Studio and which version of the TwinCAT XAE was used for this TwinCAT project. As the .sln file is a human-readable file, we can parse the Visual Studio version from it. The .tsproj file is an XML file, so the same goes there. If we open the .sln file for a TwinCAT library project solution we’ll get:
For every Visual Studio release, Microsoft has a number representing the version, which for a few of the releases is as follows:
- Visual Studio 2010: version 10
- Visual Studio 2012: version 11
- Visual Studio 2013: version 12
- Visual Studio 2015: version 14
- Visual Studio 2017: version 15
I just noticed that version 13 is left out – I wonder whether Microsoft had any good reason for that? For the example above, we can see that the TwinCAT project was made using Visual Studio 2015. Now, if we open the .tsproj file we get:
So here we can see which version of the XAE was used to create the project. To parse the version of Visual Studio we use a StreamReader.
/* Find visual studio version */
string vsVersion = "";
string line;
bool foundVsVersionLine = false;
System.IO.StreamReader file = new System.IO.StreamReader(@VisualStudioSolutionFilePath);
while ((line = file.ReadLine()) != null) {
    if (line.StartsWith("VisualStudioVersion")) {
        // The line looks like "VisualStudioVersion = 14.0.25420.1"; grab everything after "= "
        string version = line.Substring(line.LastIndexOf('=') + 2);
        Console.WriteLine(version);
        string[] numbers = version.Split('.');
        string major = numbers[0];
        string minor = numbers[1];
        bool isNumericMajor = int.TryParse(major, out int n);
        bool isNumericMinor = int.TryParse(minor, out int n2);
        if (isNumericMajor && isNumericMinor) {
            Console.WriteLine("Major: " + major);
            Console.WriteLine("Minor: " + minor);
            vsVersion = major + "." + minor;
            foundVsVersionLine = true;
        }
        break;
    }
}
file.Close();
if (!foundVsVersionLine) {
    Console.WriteLine("Did not find Visual studio version in Visual studio solution file");
    return Constants.RETURN_ERROR;
}
We’ll do the same thing for the TwinCAT version in the .tsproj file, which leaves us with both the Visual Studio version and the TwinCAT version in two strings.
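For the .tsproj file, a sketch along the following lines can be used. Note that the attribute name "TcVersion" on the project’s root element is an assumption on my part; verify it against your own .tsproj file (it is the attribute holding the version shown above).

/* Find TwinCAT version */
string tcVersion = "";
var tcProjectFile = new System.Xml.XmlDocument();
tcProjectFile.Load(TwinCATProjectFilePath);
/* Assumption: the version is stored in a "TcVersion" attribute on the root element */
var tcVersionAttribute = tcProjectFile.DocumentElement.Attributes["TcVersion"];
if (tcVersionAttribute != null) {
    tcVersion = tcVersionAttribute.Value;
    Console.WriteLine("In TwinCAT project file, found version " + tcVersion);
} else {
    Console.WriteLine("Did not find TwinCAT version in TwinCAT project file");
    return Constants.RETURN_ERROR;
}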
The next step is to check that the TwinCAT version that has been used is at minimum 3.1.4022.0, as it’s from this version that static code analysis using TE1200 is supported. We’ll create a constant MIN_TC_VERSION_FOR_SC_ANALYSIS and compare against it. If the loaded project’s version is older than 4022.0, we return the value 1, indicating a failure.
var versionMin = new Version(Constants.MIN_TC_VERSION_FOR_SC_ANALYSIS);
var versionDetected = new Version(tcVersion);
var compareResult = versionDetected.CompareTo(versionMin);
if (compareResult < 0) {
    Console.WriteLine("The detected TwinCAT version in the project does not support TE1200 static code analysis");
    Console.WriteLine("The minimum version that supports TE1200 is " + Constants.MIN_TC_VERSION_FOR_SC_ANALYSIS);
    return Constants.RETURN_ERROR;
}
Now some of you might ask, “Why do we need this check of the VS/TwinCAT versions?”. This is actually a very good question. When doing the static code analysis, the question we should ask is whether we always want to use the same version of TwinCAT (for instance, the latest and greatest) or whether we should do the analysis using the TwinCAT version that was used when developing the TwinCAT project in question. The answer to this question is most likely project dependent. One thing that is good to consider in any case is the fact that Beckhoff is still making bug fixes to TE1200; running the static code analysis on the same project with TwinCAT 3.1.4022.4 and 3.1.4022.22 will give you very different results. For the sake of getting some experience with the TwinCAT automation interface, in this example we will load the TwinCAT version that was used when creating the project, which means we do the static code analysis in the same environment as the project was made in. Another thing to note is that you wouldn’t necessarily have to do this information gathering. You could for instance always have “Pin version” enabled for all the TwinCAT projects, though you would then need to make sure that all projects do so. Is this something we want to enforce on all the developers? As you see, once we start to dig into this, we start to ask some important questions. Always having more questions than answers is what I love about software engineering. There is an infinite amount of things to discover and learn.
To make sure that the DTE uses the right Visual Studio environment, we create an instance of the DTE using the correct version of Visual Studio.
string VisualStudioProgId = "VisualStudio.DTE." + vsVersion;
Type type = System.Type.GetTypeFromProgID(VisualStudioProgId);
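/* Note (my own addition, not part of the original listing): GetTypeFromProgID returns
   null if this particular Visual Studio version is not registered on the machine, so a
   guard here gives a clearer error message than the exception CreateInstance would throw. */
if (type == null) {
    Console.WriteLine("Visual Studio version " + vsVersion + " does not seem to be installed on this machine");
    return Constants.RETURN_ERROR;
}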
EnvDTE80.DTE2 dte = (EnvDTE80.DTE2)System.Activator.CreateInstance(type);
While we’re at it, we also have the option to not show Visual Studio while the DTE is running. Because this is going to run as an automated program launched from Jenkins, it makes sense to suppress any visuals (i.e. the Visual Studio user interface), which also means that the static code analysis will run faster.
dte.SuppressUI = true;
dte.MainWindow.Visible = false;
Next we open the Visual Studio solution:
EnvDTE.Solution visualStudioSolution = dte.Solution;
visualStudioSolution.Open(VisualStudioSolutionFilePath);
EnvDTE.Project pro = visualStudioSolution.Projects.Item(1);
To load the correct version of TwinCAT we can use the (fairly new) TwinCAT remote manager. This is clearly visible and easy to use in Visual Studio:
Just select the correct version of TwinCAT prior to loading the project and we’re all set. But now we want our C# program to do this, which means we need to ask whether this is included in the TwinCAT automation interface. Looking through Beckhoff’s API documentation for the automation interface I find nothing. Looking through some of the C# examples, however, I find:
This seems to be exactly what we are looking for. Beckhoff seem to have forgotten to add the above to the API documentation, but looking into the code of the actual library that we have referenced in our C# program, it’s clearly there.
Aren’t we lucky? Let’s use the remote manager defined in the TwinCAT automation interface:
ITcRemoteManager remoteManager = dte.GetObject("TcRemoteManager");
remoteManager.Version = tcVersion;
Now we’ve got everything set up and ready! Next we’ll initiate a Clean and a Build:
visualStudioSolution.SolutionBuild.Clean(true);
visualStudioSolution.SolutionBuild.Build(true);
This will clean and build the project, after which the results are available in the Visual Studio error list. The error list is accessible through the DTE, so let’s collect the errors and print how many of them we have.
ErrorItems errors = dte.ToolWindows.ErrorList.ErrorItems;
Console.WriteLine("Errors count: " + errors.Count);
Next we will iterate through all the errors and filter them according to my earlier description. In this way we will only get the errors that are produced by the static code analysis tool after the build, and not any other errors if there happen to be some.
for (int i = 1; i <= errors.Count; i++) {
    ErrorItem item = errors.Item(i);
    if (item.Description.StartsWith("SA")) {
        Console.WriteLine("Description: " + item.Description);
        Console.WriteLine("ErrorLevel: " + item.ErrorLevel);
        Console.WriteLine("Filename: " + item.FileName);
    }
}
We’re basically just checking whether the error description starts with the string “SA”, and if so we print it. The item.ErrorLevel is an enumeration which can return either vsBuildErrorLevelLow, vsBuildErrorLevelMedium or vsBuildErrorLevelHigh. These translate to the readable values “Message”, “Warning” and “Error” respectively, which is what you see in Visual Studio.
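If you want the console output in Jenkins to show those readable names instead of the raw enumeration values, a small helper along these lines can do the translation. This helper is my own addition and not part of the original program.

static string ErrorLevelToString(EnvDTE80.vsBuildErrorLevel errorLevel) {
    switch (errorLevel) {
        case EnvDTE80.vsBuildErrorLevel.vsBuildErrorLevelLow:
            return "Message";
        case EnvDTE80.vsBuildErrorLevel.vsBuildErrorLevelMedium:
            return "Warning";
        case EnvDTE80.vsBuildErrorLevel.vsBuildErrorLevelHigh:
            return "Error";
        default:
            return errorLevel.ToString();
    }
}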
Going back to something I discussed earlier, remember that for every type of check that the TE1200 tool can do, you can specify whether it:
- Should be ignored
- Should be marked as warning
- Should be marked as error
One of the questions we need to ask ourselves is whether we consider the complete Jenkins job to have failed only if the static code analysis produced errors, or whether the job should also be considered failed if the static code analysis detects warnings. In Jenkins, a finished job can be marked not only as “successful/passed” or “failed”, but also as “unstable”. A nice solution for our particular problem would be to implement the following set of requirements:
- If the static code analysis reports no warnings and no errors, mark the build as successful
- If the static code analysis reports no errors, but at least one warning, mark the build as unstable
- If the static code analysis reports at least one error, mark the build as failed
Each of these outcomes is displayed in Jenkins with a different colour, so just by looking at the latest builds you can quickly see what result a static code analysis has had. While trying to solve this problem, I found this report in the Jenkins reporting tool. This report is a great example of what is so great about doing anything outside of ordinary structured text/PLC programming: for every problem you encounter, someone else has already encountered it and posted about it somewhere on the Internet. Reading through the report and the results, we can conclude that the team behind Jenkins has included functionality for marking a job not only as successful or failed, but also as unstable. Let’s go back to the configuration of our job in Jenkins, and click Advanced… for the build step:
Clicking this, we can define which exit code (ERRORLEVEL) should mark the build as unstable. I set this to the number 2.
Let’s rewrite some of that code we just wrote:
int tcStaticAnalysisWarnings = 0;
int tcStaticAnalysisErrors = 0;
for (int i = 1; i <= errors.Count; i++) {
    ErrorItem item = errors.Item(i);
    if (item.Description.StartsWith("SA") && (item.ErrorLevel != vsBuildErrorLevel.vsBuildErrorLevelLow)) {
        Console.WriteLine("Description: " + item.Description);
        Console.WriteLine("ErrorLevel: " + item.ErrorLevel);
        Console.WriteLine("Filename: " + item.FileName);
        if (item.ErrorLevel == vsBuildErrorLevel.vsBuildErrorLevelMedium)
            tcStaticAnalysisWarnings++;
        else if (item.ErrorLevel == vsBuildErrorLevel.vsBuildErrorLevelHigh)
            tcStaticAnalysisErrors++;
    }
}
The final thing we need to do is to report the result back to Jenkins. If we have any errors, we report the job as a failure. If we only have warnings (but no errors), we report the job as unstable. If we have neither warnings nor errors, we report the job as successful.
if (tcStaticAnalysisErrors > 0)
    return Constants.RETURN_ERROR;
else if (tcStaticAnalysisWarnings > 0)
    return Constants.RETURN_UNSTABLE;
else
    return Constants.RETURN_SUCCESSFULL;
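For reference, here is a sketch of what the Constants class used throughout the program could look like. The error value (1) and the minimum TwinCAT version come from earlier in this article, and the unstable value (2) matches the number configured in Jenkins above; the success value (0) is an assumption based on the usual console-application convention.

static class Constants {
    /* Minimum TwinCAT version that supports TE1200 static code analysis */
    public const string MIN_TC_VERSION_FOR_SC_ANALYSIS = "3.1.4022.0";

    /* Exit codes interpreted by the Jenkins "Execute Windows batch command" build step */
    public const int RETURN_SUCCESSFULL = 0; /* No warnings and no errors */
    public const int RETURN_ERROR = 1;       /* At least one static analysis error */
    public const int RETURN_UNSTABLE = 2;    /* Warnings only, matches the "2" configured in Jenkins */
}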
And that’s basically it! Now we need to open our TwinCAT 3 test library, define the static code analysis rules and test them out on our toolchain. As I’ve explained in a previous post about TE1200, each rule can be set to generate either nothing (rule ignored), a warning or an error. I’ll leave all the rules at the default. I think we are ready to do a test run! Let’s just make a minor change in the TwinCAT library and push the change to the Git server. Going to the job in Jenkins we can see that Jenkins has started. Clicking on “Console output” for the build shows us something strange.
C:\Jenkins\workspace\TwinCATStaticCodeAnalysis>LaunchPLCStaticAnalysis.bat
VISUAL_STUDIO_SOLUTION_PATH found!
The filepath to the visual studio solution file is: "C:\Jenkins\workspace\TwinCATStaticCodeAnalysis\CodeAnalysisDemoTcLibrary.sln"
TWINCAT_PROJECT_PATH found!
The filepath to the TwinCAT project file is: "C:\Jenkins\workspace\TwinCATStaticCodeAnalysis\CodeAnalysisDemoTcLibrary\CodeAnalysisDemoTcLibrary.tsproj"
TcStaticAnalysisLoader.exe : argument 1: C:\Jenkins\workspace\TwinCATStaticCodeAnalysis\CodeAnalysisDemoTcLibrary.sln
TcStaticAnalysisLoader.exe : argument 2: C:\Jenkins\workspace\TwinCATStaticCodeAnalysis\CodeAnalysisDemoTcLibrary\CodeAnalysisDemoTcLibrary.tsproj
In Visual Studio solution file, found visual studio version 14.0.25420.1
In TwinCAT project file, found version 3.1.4022.22
Unhandled Exception: System.IO.FileNotFoundException: The specified module could not be found. (Exception from HRESULT: 0x8007007E)
at EnvDTE80.DTE2.GetObject(String Name)
at AllTwinCAT.TcStaticAnalysisLoader.Program.Main(String[] args)
Exit code is -532462766
Build step 'Execute Windows batch command' marked build as failure
Finished: FAILURE
It looks like our Windows batch script has launched successfully, and our C# program partially works, as it has found the versions of both Visual Studio and TwinCAT used in this project. But I don’t like the line “Unhandled Exception: System.IO.FileNotFoundException” – there is definitely something wrong. Let’s run the exact same batch file in a command prompt (CMD) as a local user on the virtual machine.
It works… what? Why does it work? From this point I wasted two weekends (!) trying to understand why the script works in CMD but not in Jenkins, and when I finally did find the error I was so disappointed in myself for not figuring it out earlier. I’ll show you what you need to do so that you don’t have to waste as much time as I did, as this will haunt you whatever you try to automate using Jenkins and Visual Studio/DTE/the TwinCAT automation interface. Let’s fast forward many, many hours of frustration and swearing. When doing the build from Jenkins and monitoring the Windows task manager, I noticed that a Visual Studio process was spawned, which is expected. What was strange was that the owning user of the process was SYSTEM, whilst for many other processes the owning user was the local user TwinCAT-SCA.
Now I felt I was on to something. How can we verify that this is indeed the problem? One way would be to run CMD as the SYSTEM user instead of the local user. But how do we start CMD as the SYSTEM user? I found Microsoft PsTools, which allows you to do exactly that. Starting CMD as the SYSTEM user and launching the Windows batch file gave me exactly the same result as when running it from Jenkins. EUREKA! The only thing left to figure out is how to start Jenkins as the local user (TwinCAT-SCA). Jenkins is started as a service, and to change which user it runs as we need to go to Services (in Windows 10, press Start and enter Services). Select the Jenkins service, right-click on it and select Properties. Click Log On and select This account. Select your local (administrator) account, enter the password and restart the service.
Once this is done, go to the Jenkins job and click Build now. Go to the console output.
It looks exactly like when we launched the script from the CMD as the TwinCAT-SCA user. I’ve recorded a video where I demonstrate the usage of this live, so that you can see it in action.
In Jenkins, a failed, successful and unstable build is represented by a red, blue and yellow circle/ball respectively. If we commit & push the same TwinCAT project to the Git server, but adjust the code a little each time to get different results from the static code analysis, we can see this in Jenkins.
Discussions
I want to finish this series of articles with some discussion. As I wrote in the first article of this series, the example of static code analysis is just one of the many parts of the software development process that could be automated.
There are some other issues with not having the static code analysis included in the TwinCAT automation interface as well. For example, when the build is executed and the static code analysis is launched, it’s only the “standard” analysis and not the [Check all objects] version that is executed.
The difference is the same as between doing a standard build and doing a “Check all objects” in TwinCAT. When doing the static code analysis with the [Check all objects] option, TwinCAT will also check all POUs that are not referenced/used in any program. This is especially important when you are developing libraries, as you might not even have a program/task that references the POUs; instead they are referenced in other projects, where you have the executable binaries.
In this example we had a Windows batch file saved and version controlled together with the library. In a production system I would put this batch file outside of the specific PLC projects and use one file for all projects. It could also be discussed why every developer needs to push the code to the central repository for the static code analysis to happen. The drawback is quite obvious: it would simply be nice to get the results much earlier than that. In this case the ambition was to have the rules in one common central place, but because the static code analysis is not (yet) included in the automation interface, this is not possible. Another advantage of having it in one central place is that you only need to buy one single USB license; otherwise every developer would need a license for the TE1200 tool.
Some of you might have noticed that all of this is very Windows-heavy. For someone that is used to working in the Linux world it might feel like there are many unnecessary limitations. Although this is TwinCAT (The Windows Control and Automation Technology), Beckhoff have put in some limitations that could be relaxed. For instance, it would be beneficial if most of this were accessible through a CLI, instead of everything going through Visual Studio. When my Linux-hacking colleagues are showing off their impressive automation scripts and highly developed automated software toolchains, my left eye always looks at this with a little jealousy, at the same time as my right eye is in the PLC world with a small tear running down my cheek. But then I need to remind myself that these two worlds have completely different backgrounds and history. They were two entirely different worlds, but thanks to things like the TwinCAT automation interface they have started to converge in certain areas, and I want them to converge even more. With the problems I’ve had to get this to work, I am again reminded of the fact that doing scripting/automation on Windows/Visual Studio is harder than doing it in any environment on Linux. Also, doing CI/CD in “traditional” (non-PLC) software engineering is common, and the amount of documentation/Stack Overflow threads/wikis is overwhelming. Modern software practices in automation are more the exception than the rule, which is why it’s even more important to push them forward. It’s important to never give up, as it’s very rewarding once it works.
Everything that is currently in the TwinCAT automation interface is just a subset of all the functionality in Visual Studio that is specific to TwinCAT (i.e. added by Beckhoff). Anything that isn’t included in the TwinCAT automation interface doesn’t scale well, which includes the static code analysis. For certain parts that are not included in the automation interface there are workarounds (i.e. hacks), for others you can cut some corners to get most of the functionality, but the preferred option is always to have it done properly via the automation interface. Beckhoff will of course not add anything that no one has asked for, but if the market/customers request this functionality, the chances of Beckhoff including it will most likely increase.
I hope you’ve enjoyed this series about continuous integration and continuous delivery with TwinCAT, and that I have inspired at least one person to use it as a starting point for a project. I’d love to get feedback from any readers. Maybe you have developed and deployed modern software practices at your current project/work? Which parts of the software development process could be automated?
All source code for this series is available on GitHub.
Artwork attribution:
Cloud icon by Smashicons from www.flaticon.com