Modular Levee Assessment System
move with the flow


Yeah, it took some time to come up with that amazing name ;-) but I think it covers what I am about to tell you here.

As you might know, I have spent a lot of time assessing levee safety. I've been doing this for at least 20 years now (luckily it's not the only work I do or I might get bored!), so you might say I know what I am doing, or at least I hope I do. Things are quite hectic at the moment; there are so many changes that it's almost impossible to anticipate the future of levee assessment, but I'll still have a go at it.

How not to do it

It's always easy to write semi-intelligent things with hindsight, so here goes: the software for levee assessments in The Netherlands, as it stands now, is not future proof. Not at all. Before I step on someone's toes (which is also easily done) I have to admit that the choices had to be made at a time when things were changing fast, probably with some people at key positions who lacked the required knowledge. But anyway, choices were made and some of them were not the best ones.

My main concern is that we were, and still are, building API-less standalone software with a user interface that people need to install on their computer. That's fine for a detailed analysis but it's a no-go for automated processes. It wouldn't be too bad if it were one programme, but there are loads of different tools that all import some kind of data and export another. Try to automate such a process! The other annoying thing is that we chose formats like XML. I mean, the whole (web)world is using JSON, why use XML...

The good

Fortunately not all is bad in levee land. There are some pretty amazing things going on as well. First of all, the developers of geotechnical software are writing APIs. Look at Bentley's Plaxis, it has one of the best Python interfaces. Deltares has recently released a DStability version (2019) with a nice file format (zipped JSONs, hooray!!) and as far as I know software like SVSlope (Bentley) seems to have some kind of API as well (looking forward to trying my code on your software, guys!).
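Just to show how pleasant that is to work with: opening such a zipped-JSON file takes only a few lines of standard Python. A minimal sketch (the filename is made up, and I'm not showing the actual internal structure of the format):

import json
import zipfile

# a DStability file is simply a zip archive full of JSON files
with zipfile.ZipFile("my_calculation.stix") as zf:
    for name in zf.namelist():
        data = json.loads(zf.read(name))
        print(name, type(data).__name__)  # peek at what's inside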

If you're not into programming as much as I am, here's the deal with an API. Take a levee calculation in DStability 2019, done the default way. You open up the software, define a cross-section, soil layers, soil parameters, hydraulic assumptions etc. etc., and in the end you press the calculate button and write down the safety factor.

If the software has an API, this whole process can be done by writing code. You use the API to define the cross-section, soil layers, soil parameters, hydraulic assumptions etc. etc., and in the end your script calls the calculate option and extracts the safety factor from the result.
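In code, that workflow looks something like this. A minimal sketch: the class and method names are invented for illustration, they are not the actual API of DStability or any other package.

# hypothetical API, for illustration only
model = SlopeModel()
model.set_crosssection([(0.0, 0.0), (25.0, 5.0), (40.0, 5.0), (60.0, 0.0)])
model.add_soillayer("clay", top=5.0, bottom=-2.0, cohesion=10.0, phi=22.5)
model.add_soillayer("sand", top=-2.0, bottom=-10.0, cohesion=0.0, phi=32.5)
model.set_phreatic_line([(0.0, 4.0), (60.0, -0.5)])

result = model.calculate()
print(result.safety_factor)  # no buttons pressed, nothing written down by hand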

Writing this code is not trivial... well, it is if you don't care about reusability, but if you do (and in most cases you should!) it takes time. The math is still easy though. From my experience it takes about 8 hours to create a model by hand (including the time to look up the data, which is always a pain). So let's look at a typical levee, say 5 kilometers of it. We want to check the inward stability every 100m, so that's 50 * 8 = 400 hours of modelling (ok, it will probably be a little less once you get the hang of it). Now, if you could spend those same 400 hours (that's 10 weeks) automating the process instead, the code would pay for itself on the first levee and be available for the next levee section as well! So it's easy: automation is definitely worthwhile.
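Or the same back-of-the-envelope calculation in code:

levee_length = 5000      # m
spacing = 100            # m, one cross-section every 100m
hours_per_model = 8      # manual modelling time per cross-section

sections = levee_length // spacing   # 50 cross-sections
print(sections * hours_per_model)    # 400 hours, roughly 10 working weeks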

Modular... modular, modular

Ok, so we know that we should automate repetitive tasks like levee assessments. But how? This is where things also seem to go wrong in current LeveeLand. Do not, and I repeat, do not try to build an all-in-one software package... it won't work. It's not flexible enough and you will soon bang your head against the wall. An all-in-one package for levee assessment will be a monster of code, probably managed by one company (vendor lock-in), and every change will be very costly and difficult to implement.

Also, please don't ever try to make one piece of software do too many things (like being a GIS system as well as a reporting tool as well as a calculation tool). Avoid vendor lock-in at all costs. Some users might prefer Plaxis, others might prefer DStability or Slope/W or whatever. You will need a system that is as independent of your data and software choices as possible. Oh, and in an ideal world, it should be open source.

That's where modularity comes in. Write software in small pieces, each solely responsible for one task and one task only. This is very important. Let's look at an example.

The story of the CPT conversion

One of the data sources for levee assessments is CPT data. Looking at all the cone resistance and friction numbers is nice, but it doesn't give you a soil layer configuration. We need a way to convert CPTs to soil layers. GEF files are easy to read (last complaint in this article... why do we want to switch to XML in The Netherlands?? please!?). So where was I... oh yes, conversion from CPT to soil layers can be done in multiple ways. Robertson is one; in The Netherlands we also have NEN5104 (if I am right), and these days there are also ways to use machine learning for soil recognition. Already 3 choices...
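To back up the claim that GEF files are easy to read, here is a simplified reader. It's a sketch: real GEF files carry column metadata, units, separator definitions and more, all of which I'm ignoring here, and the filename is made up.

def read_gef_data(path):
    """Return the measurement rows of a GEF file, header stripped."""
    rows = []
    in_header = True
    with open(path) as f:
        for line in f:
            if in_header:
                if line.startswith("#EOH"):  # end-of-header marker
                    in_header = False
                continue
            values = []
            for field in line.replace(";", " ").split():
                try:
                    values.append(float(field))
                except ValueError:
                    pass  # skip record separators and the like
            if values:
                rows.append(values)
    return rows

rows = read_gef_data("cpt_000123.gef")
print(len(rows), "measurement rows")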

If you write modular code you could write 3 modules: a RobertsonModule, a NENModule and a MachineLearningModule. It's clear which module is responsible for which type of action, and if you ever have a problem you adjust the code for that one module, or if you find a new way to interpret CPTs you write a new module.
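To make that concrete, here is a minimal sketch of such interchangeable modules (the one-line classification rules are made up for illustration; they are not the real Robertson chart or NEN5104 correlation):

from abc import ABC, abstractmethod

class CPTInterpretationModule(ABC):
    """Base class: every interpreter turns CPT readings into soil types."""

    @abstractmethod
    def classify(self, qc, fs):
        """Return a soil type for one cone resistance / friction pair."""

class RobertsonModule(CPTInterpretationModule):
    def classify(self, qc, fs):
        friction_ratio = 100.0 * fs / qc  # placeholder rule, not the real chart
        return "clay" if friction_ratio > 3.0 else "sand"

class NENModule(CPTInterpretationModule):
    def classify(self, qc, fs):
        return "peat" if qc < 0.5 else "sand"  # placeholder rule

# swapping interpretation methods is now a one line change
interpreter = RobertsonModule()
print(interpreter.classify(qc=1.2, fs=0.06))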

Chaining

Interpreting CPTs is just one step on the way to a complete calculation. There are many, many more steps needed. For the MLAS system that I have developed I have loads and loads of modules. They all have their own responsibility: create a cross-section, add the ditch data, add the water level, find the soil layers etc. etc. But how do I get from no data to all the data that's needed to generate a calculation file?

I use chaining for this. It's a kind of data flow where you start with an empty object called the master object. The master object passes through many modules, each containing an algorithm that adds specific data. The master object is JSON, so it's easy to pass it through webservices. In between modules, data from previous steps is used to generate new data, and in the end the master object contains all the data needed to convert it into input for the calculation software.

The beautiful thing about this (sorry for being so lyrical about my own solution) is that it's very, very easy to exchange modules (want to use Robertson instead of NEN5104? just make sure the master object passes through the Robertson module instead of the NEN5104 one) and that the master object is input not only for DStability but also for SVSlope or Plaxis! You only have to write the code that converts the master object to the software you want. Ok, it's a little more complicated than just writing it down in a LinkedIn article, but with the APIs from the software packages it's much more convenient to do, and once you're done it's there forever.
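As a minimal, generic sketch of that chaining pattern (simplified, with hypothetical module names):

import json

def crosssection_module(master):
    master["crosssection"] = [(0.0, 0.0), (30.0, 5.0), (60.0, 0.0)]  # dummy data
    return master

def phreatic_line_module(master):
    # uses data from a previous step to generate new data
    x_start = master["crosssection"][0][0]
    x_end = master["crosssection"][-1][0]
    master["phreatic_line"] = [(x_start, 4.0), (x_end, -0.5)]
    return master

# the chain: an empty master object flows through the modules
master = {}
for module in [crosssection_module, phreatic_line_module]:
    master = module(master)

print(json.dumps(master))  # JSON, so easy to pass through webservices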

Let's have a short look at the (stripped down) code for a levee assessment that's implemented right now.

# create the empty master object; settings, surfacedata, waterdata and
# slotendata are assumed to be loaded earlier in the full script
ci = CalcInput()

# add a crosssection
crsg = CrosssectionGenerator(settings, ci, surfacedata, waterdata)
ci = crsg.execute()

# add the ditch
dtg = DitchGenerator(settings, ci, slotendata)
ci = dtg.execute()

# add the algorithm to find the crest / polder transition
xpg = XPolderGenerator(settings, ci)
ci = xpg.execute()

# add the trafficload
tlg = TrafficLoadGenerator(settings, ci)
ci = tlg.execute()

# add the phreatic line
phg = PhreaticLineGenerator(settings, ci)
ci = phg.execute()

# add probabilistic subsoil
rrd = SubsoilsRRDGenerator(settings, ci)
ci = rrd.execute()

# add deterministic subsoil
ssg = Subsoil2DGenerator(settings, ci)
ci = ssg.execute()

# add the bishop parameters 
pro_big = BishopAnalysisGenerator(settings, ci)
ci = pro_big.execute()

# add the spencer parameters
pro_spe = SpencerAnalysisGenerator(settings, ci)
ci = pro_spe.execute()

# create calculation input for DStability 2019, one file per geometry
# (pro_stix_dir and fname are defined elsewhere in the full script)
for i in range(len(ci.geometries)):
    ds = DStability(ci)
    ds.write(pro_stix_dir, fname, use_geometry=i)

I hope you see the modular way of thinking. The master object (CalcInput) is passed through the different modules, each adding a little data, in the end leading to an object that can be used to, in this case, create a DStability calculation.

Note that there are nice and effective tools for this kind of chaining / data flow, like Apache NiFi. It's also very cost effective to turn the modules into serverless functions on GCP, AWS, Azure or any other cloud provider.
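For example, one module could be deployed as an HTTP-triggered function along these lines (a sketch using Google's functions-framework; the ditch logic is just a placeholder):

import functions_framework

@functions_framework.http
def ditch_module(request):
    """Receive the master object as JSON, add ditch data, return it."""
    master = request.get_json()
    master["ditch"] = {"offset": 12.0, "depth": 1.5}  # placeholder values
    return master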

The bad

So, what's the catch? Well, actually I wouldn't really know. Ok, you will need some Python programmers (it's definitely not a beginner's thing, unless you only use, and don't maintain, the modules), and you will need an environment that can run Python code. You will also always need to adjust the data connections, because no two companies have exactly the same data management (file paths, database types, cloud provider). So yeah, it's no free lunch and it will take effort to implement.

Utopia

Let's say I am in Utopia; then this is the best MLAS I can imagine (although in Utopia I would probably have better things to do ;-).

All calculation software has great APIs and all modules are open source. Every company / engineer using the modules behaves exactly as the open source licenses tell them to. One big happy family of geotechnical engineers expanding the library up to a point where it is usable throughout the world.

The software is used in developed as well as less developed countries to avoid flooding, each in their own way based on the availability of data and resources.

Thanks to all the automated calculations, we are able to generate loads of training data and start developing even better algorithms based on machine learning and artificial intelligence.

Now, that would be my geotechnical Utopia.

Closing remarks

I'm not sure what the point of writing this down is, sometimes I just don't know, but I felt the urge to share this and who knows what it leads to! If you feel I stepped on your toes and you're a developer, please know that I understand where it's coming from. If you are a one man team, like I have been for this project, it's easy to make your own choices and not care about the rest of the world.

Feel free to leave any remarks on my ideas and way of working. I am always open to learning and enhancing my code!

Regards and thanks for reading,

Rob van Putten

Comments

You can call it MoLAsSys, but maybe it's too fast for that :) The only disadvantage I see in general with this (or earlier) automation efforts in levee assessment is that when it becomes very easy to change 'modules', it becomes more likely that people will do so, even without understanding the consequences. At least before (before WTI and probabilistic flood risk assessments) the use of old-fashioned methods was often corrected for in the acceptable safety factor. Changing your CPT interpretation from NEN to Robertson might feel like a no-brainer, for example, if you look at it in isolation. But it's part of a larger method. You could of course do it before as well, but because it was all manual you would carefully discuss it with the client; redoing everything manually was way too expensive. This culture was ingrained in engineering firms (at least at GeoDelft). One can support the other. But your software output needs a definite way of checking all model steps against some approved value, to ensure no little module was introduced or modified, or the results should not be accepted in official reports. This problem has already been dealt with elsewhere, but it is not trivial. Am I being too careful here?

Berk Demir

Senior Tunnel Engineer at COWI

It was inspiring to follow the development of this project over time (from the first soil classification trials to here), so I am glad you shared.
