During SIGGRAPH 2020, Adobe presented its R&D advances, and what I find interesting is that almost all of them have applications in 3D graphics, so I decided to share a little bit about them.
Adobe 3D & Immersive 2020
As you know, Adobe now owns Substance (I don't know much about the current status of Substance because I am still using an old version), and one of their research projects seems to fit the Substance workflow nicely.
Certain high-end 3D games have used "photogrammetry" to texture their environments (a method where artists take many high-resolution photos of a surface from different angles and combine them into a final texture that looks extremely realistic). One of the things Adobe is developing is a system where you can do something similar with a single photo. The software uses a combination of algorithms and AI to create a PBR material based on a single photo, and the best part is you can even use a photo taken on your mobile phone (as long as your phone takes good pictures, of course).
While this could be used for anything 3D, I see this as an interesting application for video games and real-time architectural presentations, where you can increase the visual fidelity of a scene using real-life photos as a starting point for your materials.
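Adobe hasn't published the details of this system, but one classical building block of such image-to-material pipelines is easy to sketch: deriving a tangent-space normal map from a single image by treating its brightness as a height field. The function below is my own minimal illustration in plain Python, not Adobe's method; the name and parameters are made up for the example.

```python
import math

def normal_map_from_height(height, strength=1.0):
    """Derive a tangent-space normal map from a 2D height field.

    `height` is a list of rows of floats in [0, 1]; returns a same-sized
    grid of (nx, ny, nz) unit vectors. Uses central differences with
    edge clamping; `strength` scales how pronounced the bumps appear.
    """
    h, w = len(height), len(height[0])
    normals = []
    for y in range(h):
        row = []
        for x in range(w):
            # Central differences, clamped at the borders.
            dx = height[y][min(x + 1, w - 1)] - height[y][max(x - 1, 0)]
            dy = height[min(y + 1, h - 1)][x] - height[max(y - 1, 0)][x]
            # The surface gradient gives the tangent-space normal.
            nx, ny, nz = -dx * strength, -dy * strength, 1.0
            length = math.sqrt(nx * nx + ny * ny + nz * nz)
            row.append((nx / length, ny / length, nz / length))
        normals.append(row)
    return normals

# A flat region yields a straight-up normal; slopes tilt it.
flat = [[0.5] * 4 for _ in range(4)]
print(normal_map_from_height(flat)[1][1])
```

A real pipeline would of course produce the full PBR set (albedo, roughness, metalness, and so on) and use learned models rather than simple differences, but the normal map step shows the general idea of recovering surface detail from a single image.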
Another interesting application is a deformation simulator. According to the presentation, you can get very realistic simulations for soft bodies while preventing interpenetration. I see potential here for creating more realistic cloth, rubber objects, and pretty much any other kind of soft body.
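The presentation didn't go into the math, but to give an idea of what a soft-body solver does at its core, here is a minimal mass-spring sketch: semi-implicit Euler on a 1D chain of unit masses. The floor clamp is a deliberately crude stand-in for the real non-penetration constraints Adobe's method handles; all names and values here are illustrative, not from the presentation.

```python
def step_chain(positions, velocities, dt=0.01, stiffness=50.0,
               rest_length=1.0, damping=0.98, floor=0.0):
    """Advance a 1D chain of unit masses linked by springs one time step.

    Semi-implicit Euler: spring forces update velocities, then velocities
    update positions. The floor clamp is a very crude contact model.
    """
    n = len(positions)
    forces = [0.0] * n
    for i in range(n - 1):
        # Hooke's law between neighbouring masses.
        stretch = (positions[i + 1] - positions[i]) - rest_length
        f = stiffness * stretch
        forces[i] += f
        forces[i + 1] -= f
    for i in range(n):
        velocities[i] = (velocities[i] + forces[i] * dt) * damping
        positions[i] += velocities[i] * dt
        if positions[i] < floor:          # crude non-penetration handling
            positions[i] = floor
            velocities[i] = 0.0
    return positions, velocities

# A stretched two-mass spring relaxes toward its rest length.
pos, vel = [0.0, 2.0], [0.0, 0.0]
for _ in range(2000):
    pos, vel = step_chain(pos, vel)
print(pos[1] - pos[0])
```

Production solvers use far more robust integration and collision handling, but the loop above is the basic shape of the problem: accumulate internal forces, integrate, then enforce constraints.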
Adobe has also been working on a remeshing application. In case you don't know, remeshing is a process where a 3D model's polygon count is reduced while trying to preserve the shape of the object. How much detail is retained depends on how much reduction is performed: sometimes you will still keep some of the finer details, while other times you will only get the base shape. How far you want to reduce the polygon count depends on the application (for example, pre-rendered images don't need as much reduction as real-time applications like games or visualizations).
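Adobe's remesher is certainly more sophisticated, but the simplest form of polygon reduction, vertex clustering, fits in a few lines and shows why detail disappears as reduction increases: vertices snapped to the same grid cell merge into one, and triangles that collapse in the process are dropped. This sketch is my own illustration, not Adobe's algorithm.

```python
def decimate_by_clustering(vertices, triangles, cell=1.0):
    """Reduce a triangle mesh by snapping vertices to a coarse grid.

    Vertices falling in the same grid cell merge into one (their average);
    triangles whose corners collapse into fewer than three distinct
    vertices degenerate and are dropped. Coarser `cell` => fewer polygons.
    """
    cluster = {}           # grid cell -> list of original vertex indices
    for i, (x, y, z) in enumerate(vertices):
        key = (int(x // cell), int(y // cell), int(z // cell))
        cluster.setdefault(key, []).append(i)

    new_vertices, remap = [], {}
    for members in cluster.values():
        # Representative vertex: the average of the cell's members.
        avg = tuple(sum(vertices[i][k] for i in members) / len(members)
                    for k in range(3))
        for i in members:
            remap[i] = len(new_vertices)
        new_vertices.append(avg)

    new_triangles = []
    for a, b, c in triangles:
        ra, rb, rc = remap[a], remap[b], remap[c]
        if ra != rb and rb != rc and ra != rc:   # keep non-degenerate faces
            new_triangles.append((ra, rb, rc))
    return new_vertices, new_triangles

# Two vertices closer than one cell merge, collapsing one triangle.
verts = [(0, 0, 0), (0.1, 0, 0), (2.0, 0, 0), (0, 2.0, 0)]
tris = [(0, 1, 2), (0, 2, 3)]
print(decimate_by_clustering(verts, tris, cell=1.0))
```

This is the bluntest possible approach; quality remeshers instead rank edge collapses by how much error each one introduces, which is why they can keep fine detail at moderate reduction levels and only fall back to the base shape at aggressive ones.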
Adobe's rendering engine
The last thing I want to cover is a 3D rendering engine Adobe is developing. When Adobe acquired Allegorithmic, I found that a surprising move, since Adobe wasn't really into 3D graphics (and still isn't, even if Photoshop has some 3D functionality). However, since they own not only Substance but also photography and video software, I think a 3D rendering engine could be interesting in the long run, provided they make it compatible with their existing applications (meaning the renderer could export layers directly into Photoshop or After Effects for further editing and compositing with live footage or photos).
While SIGGRAPH is now over, we are lucky that we can still watch most sessions on demand, which means I can cover more SIGGRAPH material in the coming days and weeks.