Research Article | Applied Sciences and Engineering

Artificial intelligence for art investigation: Meeting the challenge of separating x-ray images of the Ghent Altarpiece

Science Advances, 30 Aug 2019: Vol. 5, No. 8, eaaw7416
DOI: 10.1126/sciadv.aaw7416

Figures

  • Fig. 1 The Ghent Altarpiece:

    closed (left, shown after conservation) and open (right, shown before conservation; for these panels, conservation is ongoing and images after conservation are not available yet). The bottom left panel of the open left wing has been missing since its theft almost a century ago. [Images in this figure, and for details in further figures, are used with permission of the copyright holder, Saint-Bavo’s Cathedral (Art in Flanders; www.lukasweb.be). Photo credits: D. Provost (closed Ghent Altarpiece) and H. Maertens (open Ghent Altarpiece).]

  • Fig. 2 The two double-sided panels from the Ghent Altarpiece.

    Interior view of shutters (left, before conservation), exterior view of shutters (center, after conservation), and corresponding x-ray images for each panel (right, acquired before conservation), which include the combined contributions from both sides of each panel. Note that because x-ray imaging captures both sides by penetrating the panel, whereas the panel has to be turned over to photograph its back side in visible light, each x-ray image superposes the mirror image of one side and the nonmirrored image of the other side. [X-ray images in this figure, and for details in further figures, are used with permission of the copyright holder, Saint-Bavo’s Cathedral (Art in Flanders; www.lukasweb.be). Photo credit: KIK-IRPA.]
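
The superposition described in the caption of Fig. 2 can be pictured, under a simplified additive model, as the sum of one side's x-ray contribution and the horizontally flipped contribution of the other side. The NumPy sketch below only illustrates this geometry; the array names and the purely additive assumption are ours, not taken from the paper.

```python
import numpy as np

# Hypothetical per-side x-ray contributions (grayscale, same shape),
# as if each paint layer of the double-sided panel could be
# radiographed in isolation. The values are placeholders.
xray_interior = np.random.rand(512, 512)   # e.g., Adam/Eve side
xray_exterior = np.random.rand(512, 512)   # e.g., drapery side

# The radiograph penetrates the panel, so the far side appears
# mirrored: flip it left-to-right before summing.
# Assumption: a purely additive mixing model, for illustration only.
mixed_xray = xray_interior + np.fliplr(xray_exterior)

# Likewise, the visible-light photograph of the exterior side (taken
# after turning the panel over) must be mirrored before comparing it
# with the mixed x-ray, as done in Figs. 3 and 4.
exterior_rgb_mirrored = np.fliplr(np.random.rand(512, 512, 3))
```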

  • Fig. 3 Results of the proposed algorithm applied to a detail from the Adam panel.

    Column (A): input data; columns (B) to (D): results from the first (B), second (C), and combined (D) approaches, where the combined approach takes the best of the first two. Top row: Interior (Adam) side RGB input (before conservation) and the various reconstructed x-ray images. Second row: Exterior (drapery) side RGB input (image mirrored for easier comparison with the x-ray images) and the reconstructed x-ray images. Third row: Original mixed x-ray input image (left) and mixtures of the reconstructed x-ray images in rows 1 and 2. Bottom row: Visualization of the error map for each approach.
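
The third and bottom rows of Figs. 3 and 4 can be read as a consistency check: the two reconstructed x-ray images are recombined and compared with the original mixed x-ray. The sketch below assumes the error map is the absolute residual of that comparison; the caption does not state the exact definition, so treat this as an illustration only.

```python
import numpy as np

def remix_and_error(mixed_xray, xray_side1_hat, xray_side2_hat):
    """Recombine two reconstructed x-ray images (third-row style) and
    compute a residual against the original mixed x-ray (bottom-row
    style). The absolute-residual definition is an assumption made for
    illustration; it is not specified in the caption."""
    remix = xray_side1_hat + xray_side2_hat
    error_map = np.abs(mixed_xray - remix)
    return remix, error_map
```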

  • Fig. 4 Results of the proposed algorithm applied to a detail from the Eve panel.

    Column (A): input data; columns (B) to (D): results from the first (B), second (C), and combined (D) approaches, where the combined approach takes the best of the first two. Top row: Interior (Eve) side RGB input (before conservation) and the various reconstructed x-ray images. Second row: Exterior (drapery) side RGB input (image mirrored for easier comparison with the x-ray images) and the reconstructed x-ray images. Third row: Original mixed x-ray input image (left) and mixtures of the reconstructed x-ray images in rows 1 and 2. Bottom row: Visualization of the error map for each approach.

  • Fig. 5 Comparison between the results generated with the proposed new algorithm and the preceding state-of-the-art results for details of the Adam and Eve panels.

    Adam panel (top): (A) the mixed x-ray; (B) the RGB images from each side of the panel (before conservation) corresponding to the x-ray detail (i.e., the algorithm inputs); (C) reconstructed x-ray images produced by the proposed algorithm; (D) reconstructed x-ray images produced in (25); and (E) reconstructed x-ray images produced by coupled dictionary learning (30). Eve panel (bottom): (F) the mixed x-ray; (G) the RGB images from each side of the panel (before conservation) corresponding to the x-ray detail (i.e., the algorithm inputs); (H) reconstructed x-ray images produced by the proposed algorithm; and (I) reconstructed x-ray images produced by coupled dictionary learning (30). All of the grayscale images presented here have undergone histogram stretching to provide a common ground for the comparison.
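
The histogram stretching mentioned in the caption of Fig. 5 is, in its common form, a linear rescaling of grayscale intensities to a fixed range. The sketch below uses a percentile-based min-max stretch; the exact procedure and parameters used by the authors are not specified, so this is only an assumed illustration.

```python
import numpy as np

def histogram_stretch(img, low_pct=1.0, high_pct=99.0):
    """Linearly map the chosen low/high percentiles of a grayscale
    image to [0, 1] and clip the rest (a common contrast-stretching
    recipe; the percentile defaults here are illustrative)."""
    lo, hi = np.percentile(img, [low_pct, high_pct])
    stretched = (img.astype(np.float64) - lo) / max(hi - lo, 1e-12)
    return np.clip(stretched, 0.0, 1.0)
```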

  • Fig. 6 A diagram of the neural network architecture.
