The first X-ray tube was the Crookes tube, a partially evacuated glass bulb containing two electrodes, named after its designer, the British chemist and physicist Sir William Crookes. When an electric current passes through such a tube, the residual gas is ionized and positive ions, striking the cathode, eject electrons from it. These electrons, in the form of a beam of cathode rays, bombard the glass walls of the tube and produce X rays. Such tubes produce only soft X rays of low energy.
An early improvement in the X-ray tube was the introduction of a curved cathode to focus the beam of electrons on a heavy-metal target, called the anticathode, or anode. This type generates harder rays, of shorter wavelength and greater energy, than those produced by the original Crookes tube, but its operation is erratic because X-ray production depends on the gas pressure within the tube.
The next great improvement was made in 1913 by the American physicist William David Coolidge (1873-1975). The Coolidge tube is highly evacuated and contains a heated filament and a target. It is essentially a thermionic vacuum tube: the cathode emits electrons because it is heated by an auxiliary current, not because it is struck by ions as in the earlier types of tube. The electrons emitted from the heated cathode are accelerated by a high voltage applied across the tube. As the voltage is increased, the minimum wavelength of the radiation decreases.
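The inverse relation between accelerating voltage and minimum wavelength is known as the Duane-Hunt law: an electron accelerated through a potential V carries energy eV, and the shortest wavelength it can radiate is lambda_min = hc / (eV). A minimal sketch in Python (the function name and rounded constant values are illustrative, not from the original text):

```python
# Duane-Hunt law: shortest X-ray wavelength from a tube at a given voltage.
# lambda_min = h * c / (e * V)
PLANCK = 6.626e-34           # Planck constant, J*s
LIGHT_SPEED = 2.998e8        # speed of light, m/s
ELECTRON_CHARGE = 1.602e-19  # elementary charge, C

def min_wavelength_nm(tube_voltage_volts: float) -> float:
    """Minimum wavelength (in nm) of X rays from a tube at the given voltage."""
    wavelength_m = PLANCK * LIGHT_SPEED / (ELECTRON_CHARGE * tube_voltage_volts)
    return wavelength_m * 1e9  # convert metres to nanometres

# Doubling the voltage halves the minimum wavelength:
# a 50 kV tube gives lambda_min of roughly 0.025 nm, a 100 kV tube roughly 0.012 nm.
```

Because the relation is a simple inverse proportionality, raising the tube voltage is the direct way to obtain the "harder" (shorter-wavelength, more penetrating) rays described above.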
Most of the X-ray tubes in present-day use are modified Coolidge tubes. The larger and more powerful tubes have water-cooled anticathodes to prevent melting under the impact of the electron bombardment. The widely used shockproof tube is a modification of the Coolidge tube with improved insulation of the envelope (by oil) and grounded power cables. Such devices as the betatron are used to produce extremely hard X rays, of shorter wavelength than the gamma rays emitted by naturally radioactive elements.