The section Quality of X-Ray Film explains the film quality parameters. The standards for digital technology with CR or DDA also contain requirements for the maximum unsharpness and the contrast resolution, which in this sense should also apply to digitized films. Fig. 1 shows the image quality parameters as defined by ASTM E94. Image unsharpness (radiographic definition) is determined by the inherent unsharpness of the film, including screens, and by the geometrical unsharpness resulting from the arrangement of tube, test object, and film. Contrast, on the other hand, is determined by the test object and the film. The image quality is overlaid by film granularity, which introduces noise into the image; small details can be lost in this noise. With optimal exposure, the best films achieve an unsharpness of a few µm and a light transmission of 1 : 1 000 000.
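The geometrical unsharpness mentioned above depends on the exposure geometry. A minimal sketch of the commonly used relation Ug = f · b / a (focal-spot size f, source-to-object distance a, object-to-film distance b) could look as follows; the function name and parameter choices are illustrative, not taken from the text:

```python
def geometric_unsharpness(focal_spot_mm: float,
                          source_to_object_mm: float,
                          object_to_film_mm: float) -> float:
    """Geometric unsharpness Ug = f * b / a, in mm.

    f = focal-spot size, a = source-to-object distance,
    b = object-to-film distance. All distances in mm.
    """
    return focal_spot_mm * object_to_film_mm / source_to_object_mm


# Example: 2 mm focal spot, 1000 mm source-to-object distance,
# 50 mm object-to-film distance -> Ug = 0.1 mm
ug = geometric_unsharpness(2.0, 1000.0, 50.0)
```

Reducing the object-to-film distance or increasing the source-to-object distance reduces Ug, which is why the geometric arrangement is part of the quality chain in Fig. 1.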
The implementation of these quality parameters in digital technology is shown in the lower part of Fig. 1, and it applies similarly to film digitizers. Since the exposure arrangement has already been completed with the film, only the boxes outlined in red are relevant. To preserve all the information on the film in the digital image, these parameters must at least meet the film requirements. The spatial resolution is determined by the pixel size of the optical image sensor and by the lens. The contrast resolution is limited by the digitization depth and by the noise of the scanner. Radiographically, the optical film density is proportional to the X-ray dose.
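The requirement that the digitizer at least match the film can be expressed as a simple check. The sketch below is an assumption-laden illustration (the function name, the half-unsharpness sampling criterion, and the counts-per-density factor are not from the text): pixel size should be small enough to sample the film unsharpness, and the digitization depth must cover the required density range.

```python
def scanner_preserves_film(pixel_size_um: float,
                           film_unsharpness_um: float,
                           bit_depth: int,
                           max_density: float,
                           counts_per_density: float = 10_000) -> bool:
    """Rough check that a film digitizer keeps the film's information.

    - Sampling: pixel size <= half the film unsharpness (illustrative
      criterion, analogous to Nyquist sampling).
    - Depth: the gray-value range 2**bit_depth - 1 must cover the
      maximum density at the assumed scale (e.g. 10 000 counts per
      density unit).
    """
    resolution_ok = pixel_size_um <= film_unsharpness_um / 2
    depth_ok = (2 ** bit_depth - 1) >= counts_per_density * max_density
    return resolution_ok and depth_ok


# Example: 16-bit scanner with 25 µm pixels, film unsharpness 50 µm,
# densities up to D = 4.5 -> requirements met
ok = scanner_preserves_film(25.0, 50.0, 16, 4.5)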
Correct interpretation of the digital film scan requires that the digital gray value Gv be proportional to the optical density D of the film, e.g. Gv = 10 000 · D. With 16-bit resolution, a density range of 0 < D < 6.5 can be covered, which is sufficient in practice. The calibrated data are displayed as a positive image. For a film-like negative display, the digital data must be inverted by the software; however, the calibrated gray values themselves should not be changed.
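The calibration and the view-level inversion described above can be sketched as follows, assuming the Gv = 10 000 · D scale from the text and 16-bit unsigned gray values; the function names are illustrative. Note that the inversion returns a copy for display and leaves the calibrated data untouched:

```python
import numpy as np


def density_to_gray(density: np.ndarray) -> np.ndarray:
    """Map optical density D to calibrated gray values Gv = 10000 * D.

    With uint16, densities up to about D = 6.5535 fit into the
    0..65535 range.
    """
    return np.clip(np.round(density * 10_000), 0, 65_535).astype(np.uint16)


def negative_view(gray: np.ndarray) -> np.ndarray:
    """Inverted copy for film-like negative display.

    The calibrated array `gray` itself is not modified.
    """
    return (65_535 - gray).astype(np.uint16)


d = np.array([0.0, 2.5, 6.5])
gv = density_to_gray(d)     # calibrated positive image: [0, 25000, 65000]
neg = negative_view(gv)     # display copy: [65535, 40535, 535]
```

Keeping the calibrated values separate from the display transform means any later measurement (e.g. reading a density from a gray value) stays valid regardless of how the image is shown.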