Full-color 3D printing via AI-accelerated appearance prediction
Charles University’s Computer Graphics Group (CGG) has published an article presenting a new method for improving color 3D printing. The technique produces printer output that matches the user’s input appearance far more accurately than current commercial software. It relies on machine learning to improve the runtime and practicality of a previously published algorithm [3,5] by the same authors.
This work was a collaboration with researchers from MPI Saarbrücken, USI Lugano, the Keldysh Institute in Moscow, IST Austria, and University College London (UCL), carried out within the EU Innovative Training Network (ITN) project DISTRO. The work was presented at the virtual Eurographics 2021 conference, and the article appears in the May issue of the open-access journal Computer Graphics Forum.
In full-color 3D printing, the printhead does not melt plastic; instead, it jets tiny droplets of liquid resin and instantly cures them with UV light. As with any additive manufacturing technique, this process is repeated layer by layer to build up a three-dimensional object (see figure).
Today, these material-jetting 3D printers are mainly used in industrial applications such as prototyping, preservation of cultural heritage, and medical prostheses. Recently, Mixed Dimensions began offering a printing service for custom game figurines, and the animation studio LAIKA uses this 3D printing technology to animate the facial expressions of characters in its stop-motion films.
As in 2D printing, a wide range of color tones is produced by placing multiple base materials (CMYK + white) next to each other. The materials are semi-transparent resins that allow light to travel slightly below the surface. This translucency enables precise subtractive mixing of color shades, controlled by the ratios of the absorbing base materials.
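To illustrate the principle, subtractive mixing in translucent resins can be approximated with a simple Beer–Lambert absorption model. The sketch below uses hypothetical per-channel absorption coefficients (real printer materials are measured, see [2]); it illustrates the idea only and is not the authors' material model.

```python
import numpy as np

# Hypothetical per-channel (RGB) absorption coefficients, in 1/mm,
# for the five base resins -- illustrative values, not measured data.
ABSORPTION = {
    "C": np.array([1.50, 0.20, 0.05]),  # cyan absorbs mostly red
    "M": np.array([0.10, 1.40, 0.20]),  # magenta absorbs mostly green
    "Y": np.array([0.05, 0.10, 1.30]),  # yellow absorbs mostly blue
    "K": np.array([1.50, 1.50, 1.50]),  # black absorbs all channels
    "W": np.array([0.02, 0.02, 0.02]),  # white is nearly non-absorbing
}

def subtractive_mix(ratios, depth_mm=1.0):
    """Beer-Lambert transmittance of light travelling `depth_mm`
    through a mixture of base resins given as {material: ratio}."""
    k = sum(r * ABSORPTION[m] for m, r in ratios.items())
    return np.exp(-k * depth_mm)  # RGB transmittance in [0, 1]
```

For example, a cyan-dominated mix transmits green and blue well but strongly attenuates red, which is exactly how stacked translucent droplets mix color subtractively.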
Despite this flexibility in color reproduction, the same light bleeding imposes a physical limit on fine-detail reproduction: texture details become blurry, and hard edges lose contrast because light scatters laterally below the surface. Worse, the blur is three-dimensional, so in thin parts of an object it even affects the colors on the opposite side.
Previous work by the team [5,3] has shown that sharpness and contrast can be recovered by carefully optimizing material placement. Using a virtual simulation of the printed appearance, one can iteratively search for the arrangement of materials that most faithfully reproduces the input object.
In this article, the authors present a new technique for this virtual simulation that is up to 300 times faster than previous methods while requiring only a single GPU instead of a full compute cluster. Trained on millions of examples, a neural network effectively predicts how light scatters below the surface and how a given surface point is influenced by the materials around it.
Thanks to the new simulation, the preparation time for a colored 3D model drops from tens of hours to a few minutes, making the technique practical to use. With such an iterative pipeline, today’s 3D printer hardware can achieve a more faithful surface appearance than traditional print-preparation software allows.
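The iterative pipeline described above can be sketched as a simple feedback loop: predict the appearance of the current material arrangement, compare it to the target, and nudge the materials to reduce the error. In the sketch below, `predict_appearance` is a stand-in for the learned scattering predictor (any callable mapping material values to predicted colors), and the update rule is a deliberately simplified heuristic, not the authors' optimizer.

```python
import numpy as np

def optimize_materials(target, predict_appearance, n_iters=100, step=0.2):
    """Iteratively adjust material values (in [0, 1]) so that the
    predicted printed appearance matches the target texture."""
    materials = np.full_like(target, 0.5)      # neutral initial guess
    for _ in range(n_iters):
        predicted = predict_appearance(materials)
        error = predicted - target             # per-pixel color error
        # Push each material value against its own prediction error.
        materials = np.clip(materials - step * error, 0.0, 1.0)
    return materials
```

With a blur-like predictor, such a loop converges to a pre-sharpened material layout whose blurred (i.e., printed) appearance matches the target; since the predictor is evaluated many times per model, this is exactly why a fast appearance prediction makes the whole pipeline practical.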
The authors’ paper page provides links to the open-source implementation, including the training dataset and pre-trained networks, a 30-second teaser video, and an 18-minute conference presentation.
[1] T. Rittig et al., “Neural acceleration of scattering-aware color 3D printing,” Computer Graphics Forum, 2021, doi: 10.1111/cgf.142626.
[2] O. Elek et al., “Robust and practical measurement of volume transport parameters in solid photo-polymer materials for 3D printing,” Opt. Express, vol. 29, no. 5, pp. 7568–7588, Mar. 2021, doi: 10.1364/OE.406095.
[3] D. Sumin et al., “Geometry-aware scattering compensation for 3D printing,” ACM Trans. Graph., vol. 38, no. 4, pp. 111:1–111:14, Jul. 2019, doi: 10.1145/3306346.3322992.
[4] S. Ritter, formnext AM Field Guide: Discover the World of Additive Manufacturing: A Practical Guide to the Exciting World of Generative Manufacturing. 2019.
[5] O. Elek et al., “Scattering-aware texture reproduction for 3D printing,” ACM Trans. Graph., vol. 36, no. 6, pp. 241:1–241:15, Nov. 2017, doi: 10.1145/3130800.3130890.