Digitisation of historical architectural elements
The Greypixel
‘Compelling visualizations born from unique ideas and collaboration’
Introduction
The large-scale reconstruction of the Hungarian National Opera House gave lighting designers a once-in-a-lifetime opportunity to recreate the original, period flair with contemporary lighting solutions. A traditional, coarse architectural CAD model was insufficient to plan and execute this properly. They needed a 3D model that faithfully reproduced the geometry, intricate shapes, and material properties of the space and its elements, and that was also designed for accessibility and ease of use.
How to acquire such data?
Our approach was simple: laser-survey the spaces in question (the Auditorium and the Main Staircase) so that the larger architectural structures (walls, beams, slabs, and so on) could be modelled quickly and accurately, and, in parallel, capture high-resolution 3D scans of the ornaments, statues, and other small details separately. This produced optimized, instantiable digital copies that could then be positioned precisely using the laser-scan data, much as the masters of previous centuries worked from templates and moulds.
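Once an ornament exists as a single reusable asset, each of its occurrences in the hall reduces to a rigid placement derived from the laser survey. A minimal sketch of that idea, with purely illustrative function names and pose values (the source does not describe their actual placement tooling):

```python
import numpy as np

def placement_matrix(position, yaw_deg, scale=1.0):
    """Build a 4x4 homogeneous transform that places one instance of a
    scanned ornament: uniform scale, rotation about the vertical axis,
    then translation to its surveyed position."""
    a = np.radians(yaw_deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([
        [c * scale, -s * scale, 0.0,   position[0]],
        [s * scale,  c * scale, 0.0,   position[1]],
        [0.0,        0.0,       scale, position[2]],
        [0.0,        0.0,       0.0,   1.0],
    ])

def instance_points(points, matrix):
    """Apply a placement matrix to an ornament's vertices (N x 3)."""
    homo = np.hstack([points, np.ones((len(points), 1))])
    return (homo @ matrix.T)[:, :3]

# One ornament master, two surveyed placements (values are made up).
master = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
left = instance_points(master, placement_matrix((10.0, 2.0, 5.0), 0.0))
right = instance_points(master, placement_matrix((20.0, 2.0, 5.0), 180.0))
```

Because every copy references the same master mesh, a correction to the scan propagates to all placements at once, which is the main payoff of scanning each template only once.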
Technical challenges
It might seem obvious to use a precision handheld scanner to digitize the roughly 250 individual decorative elements of the space, as that would cover the majority of the data collection. However, it turned out not to be that simple: in our tests with scanners based on several different technologies, structured-light devices such as the Artec Eva did not prove to be an effective solution.
The situation was complicated further by the metallic reflections of the gold-plated surfaces, a well-known foe of scanners of almost every type, which frequently confused them.
As a result, we chose a photogrammetric approach and built a cross-polarized scanning rig:
Without cross-polarization, the results would have resembled those of the other scanners we tested: riddled with noise and reconstruction errors, to the point of being useless, because of the constantly changing specular reflections on the metallic surfaces. Cross-polarization eliminated the specular reflection, leaving only diffused light, which allowed the photogrammetry software to produce a more accurate result that required less post-processing. Later, we simplified the setup and collected data with two single-strobe rigs in parallel, significantly reducing the weight of each rig and increasing data collection efficiency.
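The principle behind the rig can be shown numerically. A polarizer on the strobe and a crossed polarizer on the lens block polarization-preserving (specular) reflections, so the cross-polarized image approximates the diffuse component alone; subtracting it from a parallel-polarized capture isolates the specular part. A toy sketch of that separation (the image values are invented):

```python
import numpy as np

def separate_specular(parallel: np.ndarray, cross: np.ndarray):
    """Split surface appearance into diffuse and specular components.

    With the camera polarizer crossed against the strobe polarizer,
    specular reflections are blocked, so the cross-polarized image is
    an estimate of the diffuse component. Subtracting it from the
    parallel-polarized capture isolates the specular component.
    """
    diffuse = cross.astype(np.float64)
    specular = np.clip(parallel.astype(np.float64) - diffuse, 0.0, None)
    return diffuse, specular

# Toy 2x2 "images": a gilded highlight in the top-left pixel only.
parallel = np.array([[0.95, 0.40], [0.35, 0.30]])
cross = np.array([[0.30, 0.38], [0.34, 0.30]])
diffuse, specular = separate_specular(parallel, cross)
```

Feeding only the diffuse signal to photogrammetry is what makes the highlight-free, view-independent surface visible to the matching algorithms.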
Data collection and processing
Because data collection produced hundreds, sometimes thousands, of raw images per architectural element, we had to build an elaborate database to keep track of everything, and everything had to be processed as quickly as possible. We also needed to make the most of the renovation window, from the completion of the scaffolding until we lost free access to the decorations, while quickly handling data collection errors and conditions that were hard to assess on-site. As a result, we created a pipeline of controlled steps, with raw images as input and optimized, clean models as output.
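The source does not describe the actual database schema; as a hypothetical sketch, a tracking record per element with enforced pipeline stages might look like this (all names and the example element ID are invented):

```python
from dataclasses import dataclass, field
from enum import Enum

class Stage(Enum):
    """Controlled pipeline steps, from raw images to clean models."""
    CAPTURED = 1
    RECONSTRUCTED = 2   # dense point cloud + textured mesh
    RETOPOLOGIZED = 3
    BAKED = 4
    DELIVERED = 5

@dataclass
class ElementRecord:
    """One tracked architectural element in the scan database."""
    element_id: str
    raw_image_count: int
    stage: Stage = Stage.CAPTURED
    issues: list[str] = field(default_factory=list)

    def advance(self, target: Stage) -> None:
        # Enforce that no controlled step is skipped.
        if target.value != self.stage.value + 1:
            raise ValueError(f"cannot jump from {self.stage.name} to {target.name}")
        self.stage = target

    def flag(self, issue: str) -> None:
        """Record a capture problem (e.g. missing coverage) for recapture."""
        self.issues.append(issue)

rec = ElementRecord("auditorium/tier2/console-017", raw_image_count=412)
rec.advance(Stage.RECONSTRUCTED)
rec.flag("insufficient coverage on left volute")
```

Enforcing the step order makes on-site errors surface early, while the scaffolding is still up and a recapture is still possible.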
The raw images are used to generate a dense point cloud, which is then used to generate a textured 3D mesh. Thanks to the cross-polarized capture technique, most raw meshes required only minor noise filtering and error correction. They were then retopologized using sophisticated algorithms, and the highly optimized, low-poly models were finished with some additional manual modelling.
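The source does not name the retopology algorithms used. As a toy stand-in, uniform vertex clustering illustrates the core idea of reducing triangle count while preserving overall shape: vertices falling in the same grid cell are merged, and faces that collapse are dropped.

```python
import numpy as np

def cluster_decimate(vertices, faces, cell_size):
    """Simplify a mesh by uniform vertex clustering: merge all vertices
    sharing a grid cell into their mean, and drop faces that no longer
    have three distinct vertices."""
    cells = np.floor(np.asarray(vertices) / cell_size).astype(int)
    uniq, inverse = np.unique(cells, axis=0, return_inverse=True)
    inverse = inverse.ravel()
    # New vertex positions: mean of each cluster's members.
    new_verts = np.zeros((len(uniq), 3))
    np.add.at(new_verts, inverse, vertices)
    counts = np.bincount(inverse, minlength=len(uniq)).astype(float)
    new_verts /= counts[:, None]
    # Remap faces, discarding degenerate (collapsed) triangles.
    new_faces = []
    for f in faces:
        a, b, c = inverse[f[0]], inverse[f[1]], inverse[f[2]]
        if a != b and b != c and a != c:
            new_faces.append((a, b, c))
    return new_verts, new_faces

# Toy mesh: six vertices, three triangles; two near-duplicate pairs merge.
verts = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [1.0, 0.0, 0.0],
                  [1.1, 0.0, 0.0], [0.0, 1.0, 0.0], [1.0, 1.0, 0.0]])
faces = [(0, 2, 4), (1, 3, 5), (0, 1, 2)]
new_verts, new_faces = cluster_decimate(verts, faces, cell_size=0.5)
```

Production retopology is far more shape-aware than this, but the input/output contract is the same: a dense scanned mesh in, a lighter mesh with equivalent silhouette out.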
After manually creating an efficiently utilized UV layout for each model, the geometric and material details of the cleaned high-poly models were baked back into the low-poly ones. Most models got by with 4K textures, but some needed 8K to capture all of the detail. We also fine-tuned the textures by hand where necessary, for example to assign different material properties to different parts.
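The 4K-versus-8K decision can be framed as simple texel-density arithmetic. A hedged sketch, with an invented ornament size and an assumed UV packing efficiency (the source gives neither):

```python
def texels_per_mm2(texture_res: int, surface_area_mm2: float,
                   uv_utilization: float = 0.8) -> float:
    """Texels available per square millimetre of scanned surface,
    discounted by how efficiently the UV layout packs the texture."""
    usable = texture_res * texture_res * uv_utilization
    return usable / surface_area_mm2

# Illustrative ornament covering 0.5 m^2 (500,000 mm^2).
density_4k = texels_per_mm2(4096, 500_000.0)
density_8k = texels_per_mm2(8192, 500_000.0)
```

An 8K map quadruples the texel budget, i.e. doubles the linear resolution, which is why only the most finely detailed pieces would warrant the fourfold memory cost.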
Anyone creating assets for modern AAA game titles should be quite familiar with the entire process. However, due to the significant additional work involved, these methods are rarely used in architectural visualization.
All of this extra effort was critical here and paid off handsomely. It allowed lighting designers to move around the virtual space fluidly, see the model respond immediately to their changes, and work with the resulting material in real time. Such a responsive dataset enabled quick design iterations and allowed more options to be considered before any critical decision.
We were able to create digitized models of two architectural spaces that can be handled easily on a standard computer, making the engineers' work easier. The resulting visual plans and virtual tours, which presented iterations of each lighting concept as a faithful replica of the planned reality, played a significant role in the decision-making process.
Follow us if you’d like to learn more about visualisation solutions!