American Registry of Radiologic Technologists (ARRT) Practice Exam


Prepare for the American Registry of Radiologic Technologists (ARRT) exam with engaging quizzes and detailed explanations. Boost your readiness with multiple-choice questions designed to build your knowledge and confidence.

Each practice test/flash card set has 50 randomly selected questions from a bank of over 500. You'll get a new set of questions each time!



What happens to radiographic density when the distance from the x-ray source is reduced?

  1. Increases

  2. Decreases

  3. Remains constant

  4. Fluctuates

The correct answer is: Increases

When the distance from the x-ray source is reduced, radiographic density increases. This follows from the inverse square law: the intensity of radiation from a point source is inversely proportional to the square of the distance from that source (I ∝ 1/d²). When the distance decreases, more x-ray photons reach the image receptor, producing a higher exposure and therefore greater radiographic density. As the source moves closer to the detector, the same x-ray output is concentrated over a smaller area, so the intensity rises and the image on the film or digital receptor grows darker.

The other options fail for corresponding reasons. If the distance were increased, intensity would fall and density would decrease. Keeping the distance constant implies a steady state of exposure, so density would not change. Fluctuations can occur under varying exposure conditions, but in a controlled scenario with consistent tube output, density consistently increases as the distance is reduced.
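The relationship is easy to check with a quick calculation. Below is a minimal Python sketch (the function name and example distances are illustrative, not from the exam) applying the inverse square law, I_new/I_old = (d_old/d_new)², to show how receptor exposure scales when the source-to-image distance (SID) changes:

```python
def relative_intensity(d_old: float, d_new: float) -> float:
    """Factor by which x-ray intensity at the receptor changes when the
    source-to-image distance moves from d_old to d_new (same units)."""
    # Inverse square law: intensity is proportional to 1 / distance**2,
    # so the ratio of new to old intensity is (d_old / d_new) ** 2.
    return (d_old / d_new) ** 2

# Halving the SID from 100 cm to 50 cm quadruples the exposure,
# so radiographic density increases.
print(relative_intensity(100, 50))   # 4.0

# Doubling the SID from 100 cm to 200 cm quarters the exposure,
# so density decreases.
print(relative_intensity(100, 200))  # 0.25
```

Running the sketch confirms the answer: any reduction in distance yields a ratio greater than 1, i.e. more exposure and greater density.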