What's up, home? Part 7
Janne Pikkarainen
SRE Operations Lead / Site Reliability Engineer, GIAC GCIH, Zabbix 4.0 Certified Specialist at Forcepoint, blogger at the Zabbix blog
By day, I am a monitoring technical lead in a global cyber security company. By night, I monitor my home with Zabbix and Grafana in very creative ways. But what does Zabbix have to do with the Blender 3D software or virtual reality? Read on.
Full stack monitoring is an old concept — in the IT world, it means your service is monitored all the way from the physical level (data center environmental status like temperature or smoke detection, power, network connectivity, hardware status…) through operating system status to your application status, enriched with all kinds of data such as application logs or end-to-end testing performance. Zabbix has very mature support for that, but how about… full house monitoring in 3D and, possibly, in virtual reality?
Slow down, what are you talking about?
The catacombs of my heart do have a place for 3D modelling. I am not a talented 3D artist, not by a long shot, but I have flirted with 3D apps since the Amiga 500 and its Real 3D 1.4, then later, on the Amiga 1200, with a legally purchased Tornado 3D and a not-so-legally downloaded LightWave. With Linux — so after 1999 for me — I used POV-Ray about 20 years ago, and since Blender went open source a long time ago, I have tried it out every now and then.
So, in theory, I can do 3D. In practice, it's the "Hmm, I wonder what happens if I press this button" approach I use.
Not so slow, get to the point please
Okay. There are several reasons why I am doing this whole home monitoring thing.
2D or not 2D, that is the question
For traditional IT monitoring, a 2D interface and 2D alerts are OK, except perhaps for physical rack location visualization, where it definitely helps if a sysadmin can easily locate a malfunctioning server from a picture.
For Real World monitoring, it is a different story. I'm sure an electrician would appreciate it if the alert contained pictures or animations visualising the exact location of whatever is broken. The same goes for plumbers, guards, and whoever else needs to get something fixed in huge buildings, fast.
Let's get to it
Now that you know my motivation, let's finally get started!
In my case, taking Zabbix from 2D to 3D meant just a handful of easy steps:
1. Model the home in Sweet Home 3D and export it to .obj format.
2. Import the model into Blender and label the interesting objects to match the names in Zabbix.
3. Make Zabbix call Blender to re-render the scene whenever an alert comes in.
4. Embed the resulting PNG or X3D scene in Zabbix or Grafana.
Home sweet home
Sweet Home 3D is a relatively easy-to-use home modelling application. It's free and already contains a generous bunch of furniture, and for a small sum you get access to many, many more items.
After a few moments I had my home modelled in Sweet Home 3D.
Next, I exported the model to .obj format, which Blender recognizes.
Will it blend?
In Blender, I created a new scene, removed the meme-worthy default cube, and imported the Sweet Home 3D model to Blender.
Oh wow, it worked! Next, I needed to label the interesting items, such as our living room TV, to match the names in Zabbix.
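The matching can be as simple as a naming convention between Zabbix hosts and Blender object names. A minimal sketch in Python — the host names, object names, and helper below are my own illustration, not the exact script from this setup:

```python
from typing import Optional

# Hypothetical mapping: Zabbix host name -> Blender object name.
# Assumes each interesting Blender object was labeled after the
# Zabbix host it represents.
ZABBIX_TO_BLENDER = {
    "livingroom-tv": "TV",
    "kitchen-fridge": "Fridge",
}

def blender_object_for(zabbix_host: str) -> Optional[str]:
    """Return the Blender object name for a Zabbix host, or None if unmapped."""
    return ZABBIX_TO_BLENDER.get(zabbix_host)
```

With a dictionary like this, the alert-handling script can look up which object in the scene should be highlighted for a given trigger.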
You modelled your home. Great! But does this Zabbix → Blender integration work?
Yes it does. Here is my first “let’s throw in some random objects into a Blender scene and try to manipulate it from Zabbix” attempt before any Sweet Home 3D business.
Fancy? No. Meaningful? Yes. There's a lot going on here.
My Zabbix now consults Blender for every trigger of severity Average or higher, and I can also run the rendering manually any time I want.
First, here’s the manual refresh.
Next, here is the trigger:
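On the Zabbix side, one way to wire this up is an alert script or remote command that launches Blender headless and passes the alert details along. A sketch, with all paths, file names, and arguments assumed rather than taken from my actual setup:

```python
import subprocess

def build_blender_command(scene="home.blend", script="render_alert.py",
                          host="livingroom-tv", severity="Average"):
    """Build a headless Blender invocation.

    Anything after '--' is ignored by Blender itself and forwarded to the
    Python script as its own arguments.
    """
    return [
        "blender", "--background", scene,
        "--python", script,
        "--", host, severity,
    ]

def render_alert(host: str, severity: str) -> None:
    # Called from the Zabbix alert script / remote command (sketch).
    subprocess.run(build_blender_command(host=host, severity=severity),
                   check=True)
```

The `--background` flag keeps Blender from opening its UI, so this can run on a server with no display attached.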
Static image result
Here is a static PNG image rendered by Blender's Eevee rendering engine. Like gaming engines, Eevee cuts some corners when it comes to accuracy, but with a powerful GPU it can do wonders in real time, or at least near-real time.
The "I am not a 3D artist" part will hit you hard now. Cover your eyes, this will hurt. Here's the Eevee rendering result.
That green color? No, our home is not like that. I just tried to make this thing look more futuristic, perhaps Matrix-like… but now it looks like… well… like I used a 3D program. The red Rudolph-the-Red-Nosed-Reindeer nose-like thing? I imagined it as a neatly glowing red sphere, with the TV glowing along with it, indicating an alert for our TV. A fail on the visuals, but at least the alert logic works! And don't ask why the TV looks so strange.
But you get the point. Imagine if a warehouse/factory/whatever monitoring center saw something like this in its alerts. No more cryptic "Power socket S1F1A255DU not working" alerts; instead, the alert would pinpoint the problem visually.
There was supposed to be earth-shattering VR! Where's the VR?
Mark Zuckerberg, be very afraid for your Metaverse, as Zabbixverse will rule the world. Among many other formats, Blender can export its scenes to X3D. It's one of the virtual-world formats our web browsers support, and it is dead simple to embed inside Zabbix/Grafana. Blender supports WebGL too, but getting X3D to run only needed the <x3d> tag, so for my experiment it was super easy.
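The embed really needs little more than the tag itself. A minimal sketch — the file name is assumed, and x3dom is one commonly used browser-side X3D runtime, not necessarily the one used here:

```html
<!-- Minimal X3D embed for a Zabbix/Grafana panel (sketch). -->
<script src="https://www.x3dom.org/download/x3dom.js"></script>
<link rel="stylesheet" href="https://www.x3dom.org/download/x3dom.css">

<x3d width="800px" height="600px">
  <scene>
    <!-- home.x3d: the scene exported from Blender -->
    <inline url="home.x3d"></inline>
  </scene>
</x3d>
```

The viewer gives you mouse-driven walk/examine navigation for free, which is what you see in the video below.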
The video looks crappy because I have not done any texture/light work yet, but the concept works! In the video, it is me controlling the movement.
In my understanding, X3D/WebGL supports VR headsets too, so in theory you could observe the status of whatever physical facility you monitor through your VR headset.
Of course, this works in Grafana, too.
How much does this cost to implement?
It's free! I mean, Zabbix is free, Python is free, and Blender is free and open source. If you have 3D blueprints of your facility in a format Blender can read — it supports plenty — you're all set! Have an engineer or two or ten do the 3D scene labeling work, and pretty soon you will find yourself doing your monitoring in a 3D world.
What are the limitations?
New and resolved alerts are not propagated to the scene in real time. For PNG files that does not matter much, as those are static and Zabbix can update them as often as needed, but for the interactive X3D files it's a shame that, for now, the scene is only updated when you refresh the page or Zabbix does it for you. I need to find out whether I can update X3D properties in real time instead of forcing a page load.
Coming up next week: monitoring Philips OneBlade
Next week I will show you how I monitor a Philips OneBlade shaver for its estimated runtime left. The device does not have any IoT functionality, so how do I monitor it? Tune in to this blog next week at the same Zabbix time.
I have been working at Forcepoint since 2014 and never get bored of inventing new ways to visualise data.