Factors affecting image quality

A number of factors, both inherent in the equipment design and external to it, affect image quality. The following are important examples:


Fill factor


For flat-panel detectors, a proportion of each detector element is occupied by the read-out circuitry and is therefore insensitive to the incoming light photons or electrons. This leads to the concept of the fill factor (see equation below), which is the ratio of the sensitive area of the pixel to the area of the detector element itself:

    Fill factor = sensitive area of pixel / area of detector element

Any improvement in resolution requires a reduced pixel pitch. The fill factor therefore decreases as resolution improves, because the read-out electronics take up a larger proportion of the smaller detector element, reducing the detector sensitivity.
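As a simple illustration, consider a square detector element whose read-out electronics occupy a fixed border around the sensitive region. The dimensions below are assumed, purely illustrative values, but they show why the fill factor falls as the pixel pitch is reduced:

def fill_factor(pixel_pitch_um: float, electronics_border_um: float) -> float:
    """Ratio of the sensitive area to the total area of the detector element."""
    sensitive_side = pixel_pitch_um - 2 * electronics_border_um
    if sensitive_side <= 0:
        return 0.0
    return sensitive_side ** 2 / pixel_pitch_um ** 2

# The same electronics border consumes a larger fraction of a smaller pixel,
# so improving resolution (smaller pitch) lowers the fill factor.
for pitch_um in (200, 150, 100):
    print(f"{pitch_um} um pitch: fill factor = {fill_factor(pitch_um, 10):.2f}")

With an assumed 10 um border this gives fill factors of roughly 0.81, 0.75 and 0.64, so each step towards finer resolution costs sensitivity.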

Tiling


A tiled array consists of a number of detectors abutted together to sample the whole image. However, there may be small areas of the read-out devices that are not sensitive, caused by the gaps between the detectors (typically about 100 μm wide). Image processing may be used to compensate for the missing data, although this can introduce stitching artefacts.
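A minimal sketch of the kind of compensation involved is given below: linear interpolation across a known dead gap between two tiles. This is an illustrative assumption, not any particular manufacturer's algorithm.

import numpy as np

def stitch_gap(image: np.ndarray, gap: slice) -> np.ndarray:
    """Fill the insensitive columns between two tiles by interpolating
    between the last good column on each side of the gap."""
    out = image.astype(float)
    left = out[:, gap.start - 1]            # last good column of tile 1
    right = out[:, gap.stop]                # first good column of tile 2
    n = gap.stop - gap.start
    for i in range(n):
        w = (i + 1) / (n + 1)               # linear weighting across the gap
        out[:, gap.start + i] = (1 - w) * left + w * right
    return out

# Example: a smooth ramp with a 2-column dead gap (read out as 0).
img = np.tile(np.linspace(100, 200, 8), (4, 1))
img[:, 3:5] = 0
print(stitch_gap(img, slice(3, 5)).round(1))

Interpolation of this kind restores a plausible signal across the gap, but because the true data were never recorded, fine detail lying over the join can be lost or distorted, which is the origin of stitching artefacts.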


Grids


Stationary grids with low strip densities can cause interference patterns in the image, called Moiré patterns, when the grid line frequency is close to the sampling frequency of the detector. This can be avoided by using moving grids or high-density grids of over 60 lines/cm. When using CR, ideally the grid lines should also be perpendicular to the scan lines in the reader.
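The sketch below (with an assumed sampling pitch and grid frequency) illustrates the aliasing behind a Moiré pattern: a stationary grid whose line frequency lies close to the detector sampling frequency leaves a coarse beat pattern in the sampled image, whereas a moving grid averages its own lines out during the exposure.

import numpy as np

pixel_pitch = 0.2                      # mm, i.e. 5 samples/mm (assumed CR pitch)
x = np.arange(0, 40, pixel_pitch)      # sample positions across 40 mm of image

grid_freq = 4.5                        # lines/mm (45 lines/cm), close to the
                                       # 5 samples/mm sampling frequency
stationary = 1 + 0.2 * np.cos(2 * np.pi * grid_freq * x)

# The fine grid lines alias to a |5.0 - 4.5| = 0.5 cycles/mm beat, i.e. a
# coarse 2 mm Moiré band, even though the lines themselves cannot be resolved.
print(f"stationary grid: residual pattern amplitude = "
      f"{stationary.max() - stationary.min():.2f}")

# A moving grid blurs its own line pattern during the exposure, so each pixel
# records only the mean transmission and no pattern survives.
moving = np.full_like(x, 1.0)          # the cosine term averages to zero
print(f"moving grid:     residual pattern amplitude = "
      f"{moving.max() - moving.min():.2f}")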

Radiation exposure (image optimization)


Image quality is related to the radiation exposure received by the detector. Although a relatively low exposure will result in a noisy image, it may still contain sufficient information to be diagnostically acceptable. A high exposure will result in improved image quality, since quantum noise is reduced. However, the image-quality improvement is not linear: it will eventually level off as the quantum noise becomes less dominant, and will decrease once the plate becomes overexposed. Ideally, a system should be set up to obtain adequate image quality for the lowest possible dose (optimization).
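A simplified model of this relationship is sketched below, assuming Poisson quantum noise plus a fixed level of system noise; the numbers are illustrative assumptions, not measured values.

import math

SYSTEM_NOISE = 5.0                      # assumed fixed electronic/structural noise

def snr(photons_per_pixel: float) -> float:
    """Signal-to-noise ratio with Poisson quantum noise and fixed system noise."""
    quantum_noise = math.sqrt(photons_per_pixel)   # Poisson statistics
    total_noise = math.hypot(quantum_noise, SYSTEM_NOISE)
    return photons_per_pixel / total_noise

# At low exposures the fixed system noise makes the image noticeably noisy;
# once quantum noise dominates, doubling the exposure improves the SNR by only
# about the square root of two, so the gain per unit dose diminishes.
for photons in (100, 200, 400, 800, 1600):
    print(f"{photons:5d} photons/pixel -> SNR = {snr(photons):.1f}")

In practice the curve also turns downwards once the receptor is overexposed, which is why the exposure is optimized rather than simply increased.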



Automatic exposure control response


An automatic exposure control (AEC) for a film/screen system is set up by ensuring that the correct optical density is achieved across a range of kilovoltages. This method is not practical for digital imaging, as the image will be displayed according to preset parameters, irrespective of the exposure used. The AEC will need to be set up in collaboration with the radiology and medical physics departments and the supplier. The level of exposure must be optimized for the selected examination and the receptor dose measured.

One other consideration is that sometimes, when a film/screen system is replaced by a CR system, the AEC is kept at the same settings for simplicity. This may not be the optimal working level, because the sensitivity and energy response of the digital system are different from those of the film/screen system it replaces. A DDR system can use the detector itself as an AEC, although currently most use a conventional AEC chamber system.