Adventures in Photogrammetry.

Well, I had wanted to experiment with this for a while and, at last, was finally able to make time for it. I had looked at several other people’s projects, and the results were a mixed bag with varying degrees of success, so I went into the whole thing with a heavy dose of scepticism. Surely having software reconstruct a model from a series of staged photographs couldn’t produce anything worthwhile or usable. Could it? Well, there’s only one way to find out for sure.

First off, I needed a subject. That was simple enough, as I eventually settled on the uprooted tree trunk in my back garden that my cats happily use as a scratching post. My extensive research (a five-minute search on Google) turned up two main best practices for photographing the subject: shoot on an overcast day, so that no hard shadows are cast to interfere with the reconstruction, and shoot with manual exposure, since images taken at differing exposure settings will also throw the reconstruction off. Pretty simple so far, right? It is also generally acknowledged that shooting with a mobile device is a terrible idea and will yield poor results. Unfortunately, my mobile is the only camera I own, so awful or not, it was just going to have to do. And so I began the process of shooting the trunk. In total, I took about 160 pictures.
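
As an aside, that “same exposure everywhere” rule is easy to sanity-check from the EXIF data before feeding anything to the reconstruction software. Here’s a minimal Python sketch, assuming Pillow is installed and the photos sit in a trunk_photos folder (both assumptions, not part of my actual workflow):

```python
# Minimal sketch: check that every photo in a folder was shot with the same
# exposure settings before handing the set to the reconstruction software.
# Assumes Pillow is installed and the images sit in ./trunk_photos as JPEGs.
from pathlib import Path
from PIL import Image
from PIL.ExifTags import TAGS

EXIF_IFD_POINTER = 0x8769          # sub-IFD that holds ExposureTime, FNumber, ISO
WANTED = {"ExposureTime", "FNumber", "ISOSpeedRatings"}

def exposure_settings(path):
    """Return the exposure-related EXIF tags for one image as a sorted tuple."""
    exif = Image.open(path).getexif().get_ifd(EXIF_IFD_POINTER)
    named = {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}
    return tuple(sorted((k, str(v)) for k, v in named.items() if k in WANTED))

photos = sorted(Path("trunk_photos").glob("*.jpg"))
unique = {exposure_settings(p) for p in photos}
print(f"{len(photos)} photos, {len(unique)} distinct exposure combination(s)")
```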

Now that I had my images, all that remained was to decide which program to use for the reconstruction. There were several options available, both free and paid, each with various pros and cons. Eventually, though, I chose Meshroom, as it was freely available and looked the easiest to use. I fed it all my photos and went to make a drink while the software performed its magic and “did its thing”. To say I was blown away by the results would be an understatement. I had in no way anticipated the outcome being this good.
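
For what it’s worth, you don’t have to use the GUI for this step. Meshroom also ships a command-line entry point (meshroom_batch in recent releases; older builds called it meshroom_photogrammetry), so a scripted run could look roughly like this sketch, with placeholder folder names:

```python
# Hedged sketch: drive Meshroom headlessly instead of through the GUI.
# The executable name and flags depend on the installed version; the paths
# below are placeholders, not the folders used for this article.
import subprocess
from pathlib import Path

images = Path("trunk_photos")          # folder with the ~160 source photos
output = Path("trunk_reconstruction")  # where the textured mesh will land
output.mkdir(exist_ok=True)

subprocess.run(
    ["meshroom_batch", "--input", str(images), "--output", str(output)],
    check=True,  # raise if the reconstruction fails partway through
)
```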

The whole thing looked absolutely amazing! Unfortunately, though, the operation wasn’t without its caveats. At just over 25 million polygons, the mesh was so dense that the model wasn’t exactly in a state that could be described as usable.

The model required some serious decimation and retopology. Thankfully, Meshroom came to the rescue and let me do just that. However, reducing the mesh to just 3,000 polygons meant a severe loss of detail.
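
Meshroom handled the decimation for me, but if you’d rather script that step yourself, the same kind of quadric decimation can be sketched in Python with Open3D. The file names and the 3,000-triangle target below are purely illustrative, not the exact settings I used:

```python
# Hedged sketch: reduce a dense photogrammetry mesh with quadric decimation
# using Open3D. File names and the triangle target are assumptions.
import open3d as o3d

high = o3d.io.read_triangle_mesh("trunk_high.obj")
print(f"input: {len(high.triangles)} triangles")

low = high.simplify_quadric_decimation(target_number_of_triangles=3000)
low.remove_degenerate_triangles()   # clean up any collapsed faces
low.compute_vertex_normals()        # fresh normals for the simplified surface

o3d.io.write_triangle_mesh("trunk_low.obj", low)
print(f"output: {len(low.triangles)} triangles")
```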

Bringing back that detail was the next important step, which meant baking it from the high-poly mesh onto the low-poly mesh.
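
For the curious, a selected-to-active bake like this can also be scripted. Here’s a minimal sketch using Blender’s Python API, assuming high- and low-poly objects named trunk_high and trunk_low and a low-poly material that already has an image texture node picked as the bake target (I’m not claiming this is exactly how I did it):

```python
# Minimal sketch of a high-to-low normal bake via Blender's Python API.
# Assumes a scene containing objects named "trunk_high" and "trunk_low",
# where the low-poly object already has a material with an image texture
# node selected as the bake target. Names and values are assumptions.
import bpy

high = bpy.data.objects["trunk_high"]
low = bpy.data.objects["trunk_low"]

# The bake flows from the selected objects onto the active one, so select
# both meshes and make the low-poly one active.
bpy.ops.object.select_all(action='DESELECT')
high.select_set(True)
low.select_set(True)
bpy.context.view_layer.objects.active = low

scene = bpy.context.scene
scene.render.engine = 'CYCLES'                  # baking requires Cycles
scene.render.bake.use_selected_to_active = True
scene.render.bake.cage_extrusion = 0.02         # small ray offset; tune per model

bpy.ops.object.bake(type='NORMAL')
```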

I wasn’t entirely convinced it would bring back enough detail to make the whole exercise worthwhile, but the only way to know was to put together a quick scene and do a test render. I opened up 3ds Max, imported the model and set up the textures, added a dome light and a couple of fill lights, and then set up V-Ray.

And the final result? Well, not too shabby for a first attempt, and I’m quite pleased with how it turned out. Anyway, let me know what you all think.
