https://www.x-ray-forum.net/

Short description of Image Intensifier Technology
https://www.x-ray-forum.net/./22161106nx65772/radioscopy-with-ii-f13/short-description-of-image-intensifier-technology-t36.html

Author:  Klaus [ 04.11.2019, 20:58 ]
Post subject:  Short description of Image Intensifier Technology

The image intensifier (II) accelerates electrons in a vacuum tube to increase their energy. As a result, each light quantum from the input screen can yield a thousand times higher brightness at the output screen.

There are several stages before the image appears on the output screen. In the first stage the X-ray photon is converted by a scintillator screen into light photons. A photocathode converts these light photons into electrons. Due to the high voltage (the acceleration voltage is 25 kV), the electrons are accelerated toward the output screen; two electron lenses are used to focus the beam. The output screen converts the electrons back into visible light, which can be captured with a camera.

The high brightness made it possible, already in the 1960s, to use TV cameras to capture X-ray images and monitors to display them. Compared to film - where you have to wait some time until the developed film is in front of you - you get a live image at once. This advantage opened the door to low-cost NDT inspection with X-ray technique; mainly the automotive industry benefited from the low cost and could use aluminum parts - which are known for flaws such as shrinkage or porosity - to reduce the weight of the cars. Additionally, the option to move the part during inspection allows the inspector to get a feeling for the position of the flaw in the part.
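To give a rough feeling for the numbers, here is a minimal sketch of the usual brightness-gain estimate: total gain as the product of a flux gain (from the electron acceleration) and a minification gain (the input-to-output screen area ratio). The concrete values below (230 mm input screen, 25 mm output screen, flux gain of about 50) are illustrative assumptions, not the specification of a particular tube.

[code]
# Rough estimate of image-intensifier brightness gain.
# Assumed model: total gain = flux gain (from electron acceleration)
#                           * minification gain (input area / output area).
# All numbers below are illustrative, not data for a specific tube.

def brightness_gain(d_input_mm: float, d_output_mm: float, flux_gain: float) -> float:
    """Return the approximate overall brightness gain of an image intensifier."""
    minification_gain = (d_input_mm / d_output_mm) ** 2  # area ratio of the two screens
    return flux_gain * minification_gain

if __name__ == "__main__":
    # Hypothetical example: 230 mm input screen, 25 mm output screen,
    # flux gain of ~50 from the acceleration voltage.
    gain = brightness_gain(230.0, 25.0, 50.0)
    print(f"Estimated brightness gain: {gain:.0f}x")  # on the order of a few thousand
[/code]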
In the beginning the scanning beam of the camera controlled the beam of the TV monitor via the video signal, which is standardized in the US (60 Hz) and in Europe (50 Hz), based on the mains frequency.

As the bandwidth of the signal was not high enough, the "trick" with half images (interlacing) was invented, where only every second line is transferred and two half images are combined by the human eye into a full image (creating some flicker). After the first generation of (expensive and large) electron tube cameras, CCD cameras soon entered the market, making image intensifier based detectors ("Realtime-Detector") affordable for many more users.
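As a small illustration of how two half images (fields) make up one full frame, here is a minimal sketch that weaves an odd and an even field back together. The array shapes and the NumPy-based approach are assumptions made only for this example, not part of any particular TV or frame-grabber standard.

[code]
import numpy as np

def weave_fields(even_field: np.ndarray, odd_field: np.ndarray) -> np.ndarray:
    """Combine two interlaced half images (fields) into one full frame.

    even_field holds lines 0, 2, 4, ... of the frame,
    odd_field  holds lines 1, 3, 5, ...
    Both fields must have the same shape (lines_per_field, width).
    """
    lines_per_field, width = even_field.shape
    frame = np.empty((2 * lines_per_field, width), dtype=even_field.dtype)
    frame[0::2, :] = even_field  # every second line from the even field
    frame[1::2, :] = odd_field   # the lines in between from the odd field
    return frame

# Hypothetical example: a 288-line field pair giving a 576-line frame (PAL-like geometry).
even = np.random.randint(0, 256, (288, 720), dtype=np.uint8)
odd = np.random.randint(0, 256, (288, 720), dtype=np.uint8)
full_frame = weave_fields(even, odd)
print(full_frame.shape)  # (576, 720)
[/code]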

The CCD cameras support the same line-wise readout as the TV monitors, and this technique dominated X-ray imaging systems for more than 15 years. Now the pictures could also be stored on video tape for documentation - still analogue. A lot of standards were created for radioscopy (ASTM: E1000 guide, E1255 practice, E1416 practice for welds, E1734 practice for castings, E1411 practice for qualification, E1453 storage of media that contains radioscopic data, E1475 data fields for computerized transfer; EN: 13068 with three parts). Later the standards were based on digital technique, which arrived with fast frame grabbers that could grab the single images and apply digital processing for image improvement, such as frame accumulation (to reduce the noise) or filtering.
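As a small sketch of the frame accumulation mentioned above, the following averages N successive frames, which reduces uncorrelated noise by roughly a factor of sqrt(N). The synthetic scene and the Gaussian noise model are assumptions made only for this example.

[code]
import numpy as np

def accumulate_frames(frames: list[np.ndarray]) -> np.ndarray:
    """Average a list of noisy frames to reduce uncorrelated noise (~ sqrt(N))."""
    stack = np.stack(frames).astype(np.float64)
    return stack.mean(axis=0)

# Hypothetical example: a constant scene with additive Gaussian noise.
rng = np.random.default_rng(0)
scene = np.full((480, 640), 100.0)  # assumed "true" image
frames = [scene + rng.normal(0.0, 10.0, scene.shape) for _ in range(16)]

single_noise = np.std(frames[0] - scene)
averaged_noise = np.std(accumulate_frames(frames) - scene)
print(f"noise of a single frame:   {single_noise:.2f}")
print(f"noise after 16-frame mean: {averaged_noise:.2f}")  # roughly 4x lower
[/code]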

The realtime imaging systems, together with computer hardware, also made the first automatic defect recognition (ADR) systems possible, where no inspector looks at the images anymore.
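Purely as an illustration of the idea behind ADR, here is a minimal sketch that flags flaw candidates by comparing the image against a smoothed background estimate and thresholding the difference. The filter size, threshold, and minimum blob size are arbitrary assumptions; real ADR systems are considerably more involved.

[code]
import numpy as np
from scipy import ndimage

def find_flaw_candidates(image: np.ndarray,
                         background_size: int = 31,
                         threshold: float = 15.0,
                         min_pixels: int = 20):
    """Very simplified ADR step: flag regions that deviate from the local background.

    Returns (label, pixel_count) pairs for connected regions whose grey value
    differs from a smoothed background estimate by more than `threshold`.
    """
    background = ndimage.uniform_filter(image.astype(np.float64), size=background_size)
    deviation = np.abs(image - background)
    candidate_mask = deviation > threshold       # pixels that stand out locally
    labels, num = ndimage.label(candidate_mask)  # group them into connected blobs
    sizes = ndimage.sum(candidate_mask, labels, index=range(1, num + 1))
    return [(lbl, int(size)) for lbl, size in enumerate(sizes, start=1) if size >= min_pixels]

# Hypothetical example: a flat casting image with one simulated indication (e.g. porosity).
rng = np.random.default_rng(1)
image = 120.0 + rng.normal(0.0, 3.0, (256, 256))
image[100:110, 150:160] += 40.0                  # simulated flaw indication
print(find_flaw_candidates(image))               # e.g. [(1, 100)]
[/code]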
