Radiographic density (also called optical density, photographic density, or film density) is a measure of the degree of film darkening: the degree of blackness present at any given point on the film. Technically it should be called "transmitted density" when associated with transparent-base film, since it is a measure of the light transmitted through the film. Radiographic density is the base-10 logarithm of the ratio of two measurements: the intensity of light incident on the film (I0) divided by the intensity of light transmitted through the film (It). This ratio, I0/It, is the inverse of the transmittance.
- D = log10 [I0/It]
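The definition above can be sketched directly in code. This is a minimal illustration, not part of the source material; the function names are chosen here for clarity.

```python
import math

def optical_density(i0: float, it: float) -> float:
    """Optical (radiographic) density: log10 of incident over transmitted light."""
    return math.log10(i0 / it)

def transmittance(i0: float, it: float) -> float:
    """Transmittance: the fraction of incident light that passes through the film."""
    return it / i0

# A film that transmits 1% of the incident light has a density of 2.0:
print(optical_density(100.0, 1.0))   # 2.0
print(transmittance(100.0, 1.0))     # 0.01
```

Note that density and transmittance are linked: D = log10(1/T) = -log10(T), which is what "the inverse of transmittance" refers to.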
The film periphery (the area away from the patient, exposed to the raw beam) should be dark black. If the technician can see their fingers through the film, it is not black (dense) enough. Some common causes are:
- mAs too low (underexposure)
- Inadequate development (underdevelopment)
Conversely, the film should not be completely opaque. If the technician cannot see their fingers through the film at all, it is too black (extremely dense). Some common causes are:
- mAs too high (overexposure)
- Excessive development (overdevelopment)
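The two checks above amount to classifying a density reading as too light, too dark, or acceptable. A minimal sketch follows; the threshold values are illustrative assumptions, not standard limits from the source.

```python
def assess_density(d: float, low: float = 0.5, high: float = 2.5) -> str:
    """Classify a measured film density (thresholds are illustrative only)."""
    if d < low:
        return "too light: check for low mAs (underexposure) or underdevelopment"
    if d > high:
        return "too dark: check for high mAs (overexposure) or overdevelopment"
    return "within the useful density range"

print(assess_density(0.3))  # too light
print(assess_density(1.5))  # acceptable
print(assess_density(3.0))  # too dark
```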
Film density is measured with a densitometer. A densitometer simply has a photoelectric sensor that measures the amount of light transmitted through a piece of film. The film is placed between the light source and the sensor, and the instrument produces a density reading.
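The densitometer described above can be modeled as a light source of known intensity, a sensor, and the log-ratio calculation from the definition of density. This is a hypothetical sketch for illustration only, not an interface to any real instrument.

```python
import math

class Densitometer:
    """Sketch of a transmission densitometer: a light source of known
    intensity and a photoelectric sensor on the other side of the film."""

    def __init__(self, incident_intensity: float):
        self.i0 = incident_intensity  # intensity of the instrument's light source

    def read(self, transmitted_intensity: float) -> float:
        # Density = log10(I0 / It), as defined above.
        return math.log10(self.i0 / transmitted_intensity)

meter = Densitometer(incident_intensity=1000.0)
# The sensor detects 10 units of light through the film: density 2.0.
print(round(meter.read(10.0), 2))  # 2.0
```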