The key to creating better camera analytics applications faster using Python and ACAP
Prototyping and debugging on an edge device can be challenging and drive up the cost of application development.

If you are reading this, then you have probably already discovered the importance of rapid prototyping and fast development cycles. For anyone working with one-off implementations and analytics applications, it is obvious that a framework which decreases the time to a finished product is directly coupled to bottom-line profits. In the life cycle of a large product, the initial development phase might be a reasonably small part of the total cost, but that does not mean rapid development is not crucial. A product is rarely static after deployment: new features are requested, new integrations are proposed, and higher security standards and better performance are expected. All these aspects make the development environment and processes a core asset on the way to success. Getting your product to the market fast can be the difference between a seized opportunity and an expensive artifact at the wrong time or place. A shorter cycle from idea to proof of concept and MVP reduces financial risk by testing feasibility and finding problems as cheaply as possible. Iterating, creating and testing multiple versions will lead to a better product in the end.

With this insight, the million-dollar question for every company in the software industry is how to create a framework and processes that enable fast development and rapid prototyping and are easy to get started with. In this article I will summarize my experience from many years of developing ACAPs at Axis Communications. I will give hands-on tips on the low-hanging fruit, on the motto that solving 20% of the issues will gain you 80% of the results.

In particular, I will answer four questions:

  1. How does the choice of programming language impact the end-to-end development cycle?
  2. How can MicroPython be used to build flexible applications deployed on Axis cameras?
  3. How can Python be used to quickly create powerful applications for PoCs and customer pilots?
  4. What does the future of ACAP development look like, and how can you start harvesting its benefits already today?

In the beginning of an analytics product's life there exists a problem and an idea. Often the definition of the problem starts with data collection: collecting pairs of situation representations and expected actions or outputs. Following modern best practices, adding data version control is the next step. Just as version-controlled code tracks the transformation of the code base and exposes the source and cause of a change in behavior, version-controlled data helps identify the data points causing a change in application behavior. There exists a plethora of tools for data version control and experiment tracking, and a common theme is that many expose a Python API for integration. Some commonly used examples are DVC, which has a Python API, Pachyderm, which has a Python client, and Polyaxon, which provides a Python library. These integrations are valuable for custom visualizations, data integration and experiment automation. The real value lies in knowing what data you have and how the data affects your algorithms. This insight should be used to direct continued data collection efforts and to set expectations. As progress is relative to the bar, solutions need to be evaluated against the use cases continuously to know whether a product has a chance on the market or is a waste of time and resources. Having programmatic access to the holistic state, and being able to automate this insight generation, enables agility as well as long-term strategies. At some point you might want to incorporate a complete MLOps workflow, automating data collection, training and validation, to deal with a disruptive or drifting environment. Programmatic access to the data, the experiments and the pipelines is the foundation of MLOps processes.
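
As an illustration of what this programmatic access can look like, here is a minimal sketch using the DVC Python API; the repository URL, file path and revision tag are hypothetical placeholders.

# Minimal sketch: read a version-controlled data file with the DVC Python API.
# The repository URL, path and revision below are hypothetical.
import dvc.api

# Open a tracked file at a pinned data version (a git tag or commit)
with dvc.api.open(
    "data/annotations.json",
    repo="https://example.com/our-analytics-repo.git",
    rev="v1.2",
) as f:
    annotations = f.read()

# Resolve the storage URL of a tracked artifact, e.g. for bulk download
url = dvc.api.get_url(
    "data/train-images",
    repo="https://example.com/our-analytics-repo.git",
    rev="v1.2",
)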

With the first data in place, the next step is to design and prototype the core algorithms. This starts with brainstorming and rapid experimentation in a high-level language such as Python. At this point, using Python notebooks or Python with a scientific editor such as Spyder is common practice. The choice of language is important not only for the number of lines of code needed to solve a problem but also for reusing as much pre-written code as possible, such as open source libraries and frameworks. Python has a vast ecosystem: frameworks for deep learning such as PyTorch and TensorFlow, libraries for computer vision such as OpenCV and scikit-image, and libraries for image and video I/O, decoding, encoding and visualization such as Pillow, imageio and matplotlib. All this makes Python the language of choice for many of the successful companies we see today.
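
To give a feeling for how little ceremony this kind of experimentation requires, a first host-side experiment can be as small as the sketch below; the clip name and threshold are placeholders.

import cv2
import matplotlib.pyplot as plt

# Grab one frame from a recorded test clip (placeholder file name)
cap = cv2.VideoCapture("recording.mp4")
ok, frame = cap.read()
assert ok, "could not read a frame"

# A toy "algorithm": gray-scale conversion and a fixed threshold
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
_, mask = cv2.threshold(gray, 127, 255, cv2.THRESH_BINARY)

# Visualize input and output side by side
plt.subplot(1, 2, 1); plt.imshow(frame[..., ::-1]); plt.title("input")
plt.subplot(1, 2, 2); plt.imshow(mask, cmap="gray"); plt.title("mask")
plt.show()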

When the first ideation is finished, it is time to create a proof of concept to validate the assumptions in a real environment. Following the process to get here, there is already a lot of Python code written, both for data integration and transformation and, as a result of the rapid prototyping, for analytics. Being able to reuse this code saves a lot of time and mitigates the risk of introducing new bugs while rewriting the code in a second language. Furthermore, it becomes easier to recruit the right people, since AI researchers and engineers are in most cases much more comfortable working with Python than C or C++. Enabling more cross-functional team compositions allows the use of vertical responsibilities, leading to goal-oriented prioritization and less friction between teams.

For all the reasons above, it should be no surprise that I am in favor of writing embedded applications in Python. It does not, however, come without costs or challenges that need to be overcome. In the rest of the article I will introduce three different methods to get Python and Python applications running on an Axis camera. I will also show a set of tools and tricks that enable more fluent development of Python applications against a target device. After finishing the article you will know the pros and cons of:

  • Writing an embedded application in MicroPython
  • Running a standard Python interpreter in any Axis camera
  • Using the ACAP 4 release to run Docker containers with Python on Axis cameras

MicroPython

Almost two years ago I showed you how to cross compile MicroPython and run it on an Axis camera. You will find all the code and tooling in my GitHub repository camera-analytics. Writing an application in MicroPython results in a production-grade deployment unit. The implementation is optimized for flash and memory footprint as well as runtime efficiency on limited systems. There are multiple levels at which bottlenecks can be optimized, from decorators that compile standard Python-compliant code to native code, to more aggressive automatic compilation and the ability to write your own c-modules, which can be imported just like any other Python library. The MicroPython project linked above includes all the standard libraries, the ulab library (a NumPy look-alike) and a JPEG decoding library, and consumes only a few hundred kilobytes deployed to an Axis camera.

root@axis-accc********:/var/volatile/tmp# ls -lah micropython
-rwxr-xr-x 1 root root 353.1K Sep 29 19:06 micropython

root@axis-accc********:/var/volatile/tmp# ldd micropython
ldd:    libjpeg.so.8 => /tmp/libs/libjpeg.so.8 (0x76e4b000)
        libpthread.so.0 => /usr/lib/libpthread.so.0 (0x76e26000)
        libffi.so.6 => /usr/lib/libffi.so.6 (0x76e10000)
        libdl.so.2 => /usr/lib/libdl.so.2 (0x76dfd000)
        libm.so.6 => /usr/lib/libm.so.6 (0x76d96000)
        libc.so.6 => /usr/lib/libc.so.6 (0x76ca4000)
        /lib/ld-linux-armhf.so.3 => /usr/lib/ld-linux-armhf.so.3 (0x76eec000)
        libgcc_s.so.1 => /usr/lib/libgcc_s.so.1 (0x76c7b000)

root@axis-accc********:/var/volatile/tmp# ls -lah libs/libjpeg.so.8
-rw-r--r-- 1 root root 158.0K Sep 29 19:06 libs/libjpeg.so.8

While MicroPython is used in many products in production, the other reason for me to call it a production language is that it does not fulfill all the requirements of rapid prototyping. Using MicroPython on the target device allows partial reuse of the code you have already written in Python for the host experiments. While the normal Python syntax and expressions are the same in MicroPython, the libraries often are not. Simple libraries implemented in pure Python will probably work in MicroPython too, but most libraries for data science and algorithms are accelerated using, for example, external C or C++ extensions. Some libraries have MicroPython counterparts, like NumPy, which can be replaced with ulab.numpy. This is, however, not a complete alternative, and a lot of the functionality is missing or comes with other constraints.
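
One way to keep code portable between the two interpreters is an import-time fallback, sketched below under the assumption that you stay within the subset of NumPy that ulab implements.

# Fall back between ulab (MicroPython) and NumPy (CPython) at import time
try:
    from ulab import numpy as np  # running under MicroPython with ulab
except ImportError:
    import numpy as np            # running under standard CPython

def frame_mean(pixels):
    # Works in both interpreters as long as only the common subset is used
    return np.mean(np.array(pixels, dtype=np.uint8))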

The other aspect of using Python for target prototyping is the familiarity from host prototyping, and that the needed skill set does not create a divide between embedded developers and data analytics developers. This does not hold true for MicroPython, since the learning path is, albeit relatively shallow, very long. Any Python developer can quite easily port a simple Python application to MicroPython and make some optimizations to the runtime performance. However, for large and complex analytics applications a wide skill set is needed to make them run in MicroPython. You will need a good high-level understanding of the MicroPython implementation to use the code in an efficient manner. You will need to know the limitations when emitting native code or using viper code to further speed up critical sections. Sooner or later you will need to compile acceleration modules for the core algorithms using either native machine code in .mpy files or c-modules. This requires an even deeper understanding of the underlying MicroPython implementation, along with skills in compilation and linking. You need to understand the implications of facts such as that your code is not linked against the global symbol table, preventing the use of symbols outside your module, or that the .data section is not supported, requiring implementations to be adapted to use, for example, variables in the .bss section instead. Writing a c-module and linking it to the MicroPython binary during the build process puts fewer limitations on the C code, but instead comes with complications in the build and deployment process.
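
As a taste of the first steps on that learning path, the sketch below shows the code emitters applied to a hypothetical hot loop. @micropython.native keeps standard Python semantics, while @micropython.viper trades them for speed with machine-level types such as the ptr8 raw pointer.

import micropython

# Compiled to native machine code, but still standard Python semantics
@micropython.native
def count_above(buf, threshold):
    count = 0
    for value in buf:
        if value > threshold:
            count += 1
    return count

# Viper code: faster still, but with restricted typing (note the ptr8
# raw pointer and the machine int types in the signature)
@micropython.viper
def count_above_viper(buf: ptr8, length: int, threshold: int) -> int:
    count = 0
    for i in range(length):
        if buf[i] > threshold:
            count += 1
    return count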

If you are ready to deal with these kinds of considerations and invest some time learning the language and its best practices, then MicroPython can be a viable option. My experience today is that MicroPython is mostly used for simpler IoT devices and controller boards. The selection of MicroPython libraries for image or video processing and deep learning is sparse and immature. An example of how to get started writing your own c-modules for these kinds of tasks can be seen in my ujpeg library on GitHub.

Quick and easy Python deployment to any ARMv7 Axis camera

Going back to the core incentives of this article, what we want is to run any standard Python code in the camera, using the normal Python libraries for analytics, image processing and deep learning. We also want the possibility to use standard package managers such as apt-get and pip to install the libraries we need. Many developer teams have dismissed this as impossible or impractical. It is true that getting a production-ready system running a Python application in a camera with no extra hardware can be hard and involve a lot of time-consuming optimizations and adaptations. For rapid prototyping, however, it is a different story. By using some tricks and accepting some limitations, it can be really easy to get started with rapid prototyping in Python on a standard Axis camera.

In my GitHub project python-app-on-axis-camera you will find a simple, easy-to-get-started example of a development environment for building Python applications and cross compiling them to run on an Axis camera. To make the process as simple as possible, we make use of three tricks:

  • QEMU emulation of target environments to reduce the need for cross compilation
  • Docker containers to facilitate the use of package managers
  • A network share to easily deploy large amounts of data

Arguably, one of the largest obstacles to getting started with Python on a target platform is the cross compilation needed for both the Python language itself and the required libraries. To minimize development time we can build on the effort of others. Python has official distributions compiled for the ARMv7 architecture which can be installed with an apt-get command. Python's package manager, pip, supports binary distribution of prebuilt modules for multiple architectures, so many of the libraries we need are already cross compiled for ARMv7. The least-effort approach is to utilize QEMU, a virtualization environment, to run a Docker container from an ARMv7 Ubuntu image. A simple installation step of QEMU allows us to run target binaries on the host system, so we can run package managers as if they were running on the target system. These are used to install the files and dependencies into the Docker container, from which they can later be copied to the target platform. By utilizing Docker multi-stage builds we can use several different Docker images as the basis for our build environment. This allows us to use the emulated ARMv7 containers to install binary dependencies, while using the official Axis SDK images as a base for the cross compilation environment. The cross compilation environment is needed when binary distributions are not an option, such as when we want to access specific Axis camera hardware using the Axis SDK C libraries or when we want to compile our own proprietary algorithms into a C extension module.
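
As a sketch of what the emulated installation step can look like (image tags and package selection are illustrative, and the multiarch/qemu-user-static image is one common way to register QEMU's binfmt handlers):

# Register QEMU's binfmt handlers so ARMv7 binaries can run on the x86 host
docker run --rm --privileged multiarch/qemu-user-static --reset -p yes

# Run an emulated ARMv7 Ubuntu container and use apt-get/pip as if on the
# target, then copy the installed files to a folder mounted from the host
docker run --rm -v `pwd`/rootfs:/out arm32v7/ubuntu:bionic bash -c '\
    apt-get update && \
    apt-get install -y python3-minimal python3-pip && \
    pip3 install numpy && \
    cp -r /usr/lib/python3* /usr/local/lib/python3* /out/'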

The second issue we are facing is the limited flash storage on an edge device like an Axis camera. Although the Python binary itself is only a few megabytes, a Python distribution together with analytics libraries, such as the one installed in the referenced project, can reach a size well above 100 MB despite being based on the python3-minimal package.

root@axis-accc********:/# ls -lah <...>/python
  -rwxrwx---    1 storage  storage     3.1M Jan 15  2000 <...>/python

root@axis-accc********:/# du -h <...>/generated/usr/ | \
                          grep -E "^[0-9][0-9][0-9\.]+M"
  10.5M   <...>/generated/usr/lib/python3/dist-packages
  10.5M   <...>/generated/usr/lib/python3
  60.0M   <...>/generated/usr/lib/python3.6/config-3.6m-arm-linux-gnueabihf
  80.1M   <...>/generated/usr/lib/python3.6
  90.6M   <...>/generated/usr/lib
  14.6M   <...>/generated/usr/local/lib/python3.6/dist-packages/numpy/core
  38.0M   <...>/generated/usr/local/lib/python3.6/dist-packages/numpy
  44.7M   <...>/generated/usr/local/lib/python3.6/dist-packages
  44.7M   <...>/generated/usr/local/lib/python3.6
  44.7M   <...>/generated/usr/local/lib
  44.7M   <...>/generated/usr/local
  135.3M  <...>/generated/usr/

root@axis-accc********:/# ldd <...>/generated/python
  ldd:    libc.so.6 => /usr/lib/libc.so.6 (0x76ec2000)
          libpthread.so.0 => /usr/lib/libpthread.so.0 (0x76e9d000)
          libdl.so.2 => /usr/lib/libdl.so.2 (0x76e8a000)
          libutil.so.1 => /usr/lib/libutil.so.1 (0x76e77000)
          libexpat.so.1 => /usr/lib/libexpat.so.1 (0x76e4e000)
          libz.so.1 => /usr/lib/libz.so.1 (0x76e2e000)
          libm.so.6 => /usr/lib/libm.so.6 (0x76dc7000)
          /lib/ld-linux-armhf.so.3 => /usr/lib/ld-linux-armhf.so.3 (0x76fb4000)
          libgcc_s.so.1 => /usr/lib/libgcc_s.so.1 (0x76d9e000)

The trick we can make use of here is that the RAM is typically much larger than the flash storage. This means that storing the application persistently is the only issue, not keeping it loaded in RAM while it is running. Investigating a low-end companion line camera, we can see that the persistent flash storage where an installed ACAP would be kept is only 29 MB, while the RAM is above 200 MB:

root@axis-accc********:~# df -h | grep flash
/dev/ubi0_20             33.6M      2.8M     29.0M   9% /mnt/flash

root@axis-accc********:~# cat /proc/meminfo | grep MemTotal
MemTotal:         222904 kB

In a prototyping environment we can get around this problem by using an SD card or a network storage location. Axis cameras support this out of the box, and most of them have a slot for a micro SD card.

The drawback of using network storage or an SD card is the low read speed. When the application starts for the first time, it needs to be read into memory, and this can take several seconds when the Python binary and its dependencies are located on an SD card or network storage. In the referenced project, the application took around 5 seconds to start from an SD card and around 10 seconds from network storage:

$ time make run-app DEPLOY_TO=sd_card
[...]
0.02s user 0.01s system 0% cpu 5.380 total

$ time make run-app DEPLOY_TO=nfs
[...]
0.02s user 0.01s system 0% cpu 10.747 total        

In a prototype or a proof of concept it should be no significant issue to complement the camera with an SD card or a network storage solution like a NAS (Network Attached Storage), but in the end product, considerations such as complexity and reliability should be factored in. All in all, the referenced project shows that using Python in the camera is viable for rapid prototyping and can be set up without much effort. This should be one of the tools considered when designing the workflow for a new application or product. Next, we will look at some tips and tricks to further improve development speed.

Tips and tricks for rapid prototyping with an Axis camera

A common inconvenience I see among developers working with ACAPs is the constant copying of the binary and the application data between the developer computer and the target camera. I have incorporated multiple tricks into my workflows to reduce these efforts. First of all, instead of building an installable application package (an .eap file) every time you want to test a code change, you can just copy the affected file and overwrite the one on the camera. Instead of restarting the application in the camera GUI, you can run the executable directly using ssh. To use ssh you need to enable it in the plain config of your camera:

With ssh enabled you can use scp to copy files and ssh to execute them on the camera:

$ scp -r data/ cam:/usr/local/packages
file_b    100%    0     0.0KB/s   00:00
file_a    100%    0     0.0KB/s   00:00

$ ssh cam ls -lah /usr/local/packages/
drwxrwxr-x    4 root     root          288 Oct  4 16:57 .
drwxr-xr-x    3 root     root          376 Oct  4 16:32 ..
drwxr-xr-x    2 root     root          288 Oct  4 16:57 data
drwxrwxr-x    5 sdk      admin         840 Sep 11 06:12 vmd/

An even more convenient setup is a shared folder between the developer computer and the camera. We can configure this using the network storage feature. By installing a Samba server on the developer computer, we can have an application build folder mounted directly into the camera, removing the need to copy files at all. If you are using Ubuntu or Debian, follow the guide on the Ubuntu website to install the Samba server, then configure it to share your application folder:

# Add to /etc/samba/smb.conf
[sambashare]
        comment = Samba share in camera application dir
        path = /home/daniel/src/acap/build
        read only = no
        browsable = yes

Once mounted from the camera GUI, you can access your application from the camera. You can either start the application using ssh, or go one step further and build a tiny installable wrapper ACAP that detects when the binary on the share is overwritten and reloads it to restart the application, as sketched below.
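
Here is a minimal sketch of such a wrapper, polling the binary's modification time and restarting the application when it changes; the mount path is a hypothetical example.

import os
import subprocess
import time

APP = "/var/spool/storage/NetworkShare/build/my_app"  # hypothetical mount path

proc = None
last_mtime = 0.0

while True:
    try:
        mtime = os.path.getmtime(APP)
    except OSError:
        mtime = 0.0  # share not mounted yet, or file is being rewritten

    if mtime and mtime != last_mtime:
        if proc is not None:
            proc.terminate()  # stop the old instance
            proc.wait()
        proc = subprocess.Popen([APP])  # start the newly deployed binary
        last_mtime = mtime

    time.sleep(1.0)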

If you would rather keep the executables on the SD card, you will notice that it is mounted with the noexec option, preventing execution of any files stored on it. In a development environment we can simply change this by ssh:ing into the camera and typing:

root@axis-accc********:~# mount -o remount,exec /var/spool/storage/SD_DISK        

Testing on host

Despite all these improvements, prototyping and testing on the camera is still slower than testing on host. That is why both of the projects referenced in this article allow building and executing code on the developer computer. This is critical for rapid prototyping, interactive debugging and automated tests integrated in a CI/CD workflow. In the MicroPython project this was made possible by building MicroPython for the host architecture as well as for the target, and adding a make target to easily run the host version in a Docker container. In the other referenced project it is easier, since Python can simply be installed on the developer computer.

Unit tests should be designed to be fast and easy to run, so that the preferred way of working is to create a new unit test before adding a new feature. This improves quality and iteration speed, since a lot of the issues are found before even deploying to the camera. In unit tests, the hardware dependencies and architecture-specific features are mocked out to allow the tests to execute on the developer computer or in a CI/CD pipeline. It is also recommended to create Python stubs and mock libraries for functionality that is frequently used and needed to run the application; for example, the camera capture interface could be replaced by a host version where a video file is parsed and exposed in the same format as the native interface. This allows for simple (and reproducible) prototyping on the developer machine and makes it possible to run the full application on host as well.
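
As an example, a file-backed stand-in for the camera capture interface can be as simple as the sketch below, which mimics the read() interface of cv2.VideoCapture so that the application code stays identical on host and target; the clip name is a placeholder.

import cv2

class FileCapture:
    """Host-side stand-in for the camera capture interface."""

    def __init__(self, path, loop=True):
        self._cap = cv2.VideoCapture(path)
        self._loop = loop  # restart the clip to emulate an endless stream

    def read(self):
        ok, frame = self._cap.read()
        if not ok and self._loop:
            # Rewind to the first frame and try again
            self._cap.set(cv2.CAP_PROP_POS_FRAMES, 0)
            ok, frame = self._cap.read()
        return ok, frame

# In the application: cap = FileCapture("recorded_scene.mp4") on host,
# cap = cv2.VideoCapture(1) on the target camera.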

Installing dependencies on the camera using package managers

In a prototyping workflow, one is used to working with a lot of tools that are not available in the camera from the start. A very usable prototyping workflow is to use an emulated Docker container to install the dependencies with a package manager such as apt-get or pip. The binaries and any needed dependencies can then simply be copied to either the tmpfs partition in the camera or to the flash. As an example, when benchmarking a prototype it is useful to measure the execution time of a binary with the Linux time command. This is not installed in the camera firmware, but getting it running for prototyping is easy. The following assumes that Docker and QEMU are installed and configured.

# Install 'time' in a docker container and copy to mounted folder
docker run --rm -it -v `pwd`:/out arm32v7/ubuntu:bionic \
       bash -c 'apt-get update && apt-get install -y time && cp `which time` /out'

# Copy the 'time' binary to the camera
$ scp time cam:/tmp
   time    100%   14KB   3.9MB/s   00:00

# Time the execution of 'sleep 1' in the camera
root@axis-accc********:~# /tmp/time sleep 1
   0.00user 0.00system 0:01.00elapsed 0%CPU
   (0avgtext+0avgdata 896maxresident)k
   0inputs+0outputs (0major+55minor)pagefaults 0swaps
        

As you can see from these instructions, a lot of what I have learned over the years working with ACAP applications relates to the process of building an environment and deploying it to the camera. Using the processes above, Docker can greatly improve the build environment and allows us to reuse the effort of others. The deployment is however still an ad-hoc process of copying and file transfer, albeit a little bit easier thanks to the automation using network shares and ssh. If you have worked in the cloud-native industry, you have probably gotten used to deploying applications as multiple microservices, each running in a Docker container. Is this the future for Axis cameras too?

The future of ACAP: containers, Python and open APIs

A week ago Axis announced something great, something that will transform the way we think of edge application development. Axis introduced the 4th generation of the AXIS Camera Application Platform, which allows you to run Docker containers directly in the camera, lets you write Python applications out of the box and introduces familiar open source libraries such as OpenCV to capture video and images from the camera. This release enables cloud-native workflows where multiple Docker containers can be built and easily deployed to the camera. It increases the isolation of application code, which can be executed in sandboxed containers. It allows developers and research engineers to use the patterns, tools and workflows that they are familiar with. You will find Docker images to use as a base on the Axis Docker Hub, such as the Axis Computer Vision SDK image, which contains a distribution of Python compiled for the camera together with libraries such as NumPy and SciPy, an OpenCV build with camera capture ability, and a TensorFlow Serving API integrating with the deep learning accelerators found in some Axis cameras.

Prototyping a computer vision application running inside an Axis camera is now as easy as:

# cv_app.py

import cv2


def do_analytics(frame):
    # Placeholder for the actual analytics, e.g. running inference
    print("Got a frame with shape", frame.shape)


cap = cv2.VideoCapture(1)  # Select the camera capture interface
cap.set(cv2.CAP_PROP_FPS, 25)  # Select the frame rate
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 480)  # Select width and height
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 320)
cap.set(cv2.CAP_PROP_FOURCC, cv2.VideoWriter_fourcc(*"RGB3"))  # Set pixel format

while True:
    ok, frame = cap.read()
    if not ok:  # Skip failed frame grabs
        continue
    do_analytics(frame)

The build process consists of a simple Dockerfile, and deployment is simply a matter of pushing the image to a repository and pulling it from the camera.

FROM axisecp/acap-computer-vision-sdk:1.0-armv7h AS cv-sdk
FROM arm32v7/ubuntu:20.04
COPY --from=cv-sdk /axis/python /
COPY --from=cv-sdk /axis/python-numpy /
COPY --from=cv-sdk /axis/opencv /
COPY --from=cv-sdk /axis/openblas /

WORKDIR /app
COPY app/* /app/
CMD ["python3", "cv_app.py"]        

You will find a large set of examples on the Axis GitHub page. Does this announcement render the earlier referenced projects useless? Not really. Using ACAP 4 and Docker containers running in the camera is definitely the future; as of now it is, however, not supported in all Axis cameras. Using MicroPython results in small binaries that are easy to ship to a production environment with no need for extra hardware such as SD cards. The prototyping process described in the second referenced project could be further streamlined using the ACAP 4 Docker images and precompiled binaries, even when deployed using the ACAP 3 SDK or copied file by file. This gives a prototyping environment that shifts towards ACAP 4 the more mature and supported it gets, and where all developers will feel comfortable, regardless of whether their background is in cloud-native development, ML research or embedded development. If this feels like the future to you too, then stick around for further updates and more examples.

Nomenclature dictionary

cross compilation: Compiling code for one architecture using a compiler running on a second architecture. The compiled executables cannot run natively on the machine that produced them.

host: The developer machine used to develop and cross compile the application.

target: The hardware that the application is supposed to run on, e.g. the camera.

ACAP: AXIS Camera Application Platform is a framework and toolset to build, install and run applications in Axis network cameras.

C extension: In Python (CPython), a library written in an external compiled language such as C is called a C extension. Unlike regular Python code, which is architecture independent, libraries implementing C extensions need to be cross compiled.

native code: In MicroPython, native code is code emitted from a compilation which is directly executable by the underlying machine. This differs from compiled byte code, which is executed by the virtual machine in the MicroPython implementation.

c-module: In MicroPython, a library written in C (or any other compiled language) which is linked to the core Python binary at build time is called a c-module.

Daniel Falk

CTO/Founder · Custom EDGE apps for network cameras · AI/ML/Computer Vision · MLOps · Edge Analytics · Entrepreneur · Writing code, motivating developers and uncovering valuable insights
