Color TV: How Does It Work?
Peter Goldmark, the head of the CBS lab and one of the inventors of color television, noted that audiences at medical conventions responded strongly to the images produced by his system. Similar claims about the power and impact of the electronic color image carried over into its use in commercial broadcasting.

Commercial color television systems were not approved by the FCC until the early 1950s, after consumers had already started purchasing black-and-white sets. Of the three television networks in the U.S., full conversion to color was not complete until the late 1960s.

But during that extended period of conversion and dissemination, network executives, publicists, advertising companies, inventors, and television manufacturers worked assiduously to promote color technology by reinforcing some of the same notions of its perceptual, aesthetic, and emotional functions that medical TV pioneers had noted. They were trying to convince consumers that the liveness and immediacy of television, combined with the unique visual properties of electronic color, would provide them with an expansive and revelatory view on the world that they had never experienced before.

These beliefs then slipped into the descriptions of color television by commentators, critics, and journalists, further influencing the way that viewers made sense of their color viewing experience. The resulting report, which NBC used to get sponsors on board with color, argued that color television gave viewers a reduced sense of psychological distance, while also increasing levels of emotional involvement, empathy, creativity, comprehension, sociality, and immediacy. Ultimately, the ability to evoke strong feeling and capture attention was seen as a boon to sponsors willing to invest in color programming and commercials.

To stop subsequent lines from skewing from side to side through timing drift, there is a synchronization pulse at the start of each scan line. This reminds the receiver exactly when each line needs to start. This synchronization pulse lasts about 4.7 µs. Just before this pulse is an even shorter Front Porch gap of about 1.5 µs, and just after it is a Back Porch gap of about 4.7 µs.

As we will see later, this back porch has an additional use for color TV signals. During the entire time from the front porch, through the synchronization pulse, to the end of the back porch, the signal stays at or below the blanking level, and the electron gun is off.

It is during this blanking interval that the gun sweeps back around, ready to draw the next line. Here is an exaggeration of what might happen to the image if there were no horizontal synchronization pulses to define the start of each line. Without the sync pulses 'reminding' the receiver exactly when each line begins, the analog circuitry would drift slightly.
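To put rough numbers on one complete line, here is a small sketch of the timing budget. These are the approximate NTSC figures mentioned above; other standards differ slightly, and the active-video duration here is itself an approximation.

```python
# Approximate NTSC horizontal line timing, in microseconds.
# These are nominal round figures; the real standards specify tight tolerances.
FRONT_PORCH_US = 1.5    # short gap before the sync pulse
H_SYNC_US      = 4.7    # the horizontal synchronization pulse itself
BACK_PORCH_US  = 4.7    # gap after the pulse (later reused for the color burst)
ACTIVE_US      = 52.6   # the visible part of the line

LINE_US = FRONT_PORCH_US + H_SYNC_US + BACK_PORCH_US + ACTIVE_US
print(f"one line  ~ {LINE_US:.1f} us")                 # ~63.5 us
print(f"line rate ~ {1e6 / LINE_US:,.0f} lines/sec")   # roughly 15,700 lines per second
```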

Below is an animation showing a single scan line of a small image containing just 64 lines. You can use the UP and DOWN buttons to select which of the lines from the image is rendered when the animation is stopped, or see how the image is painted at various speeds.

The rendered line is shown below the graph of the luminosity values. To avoid ragged edges, and issues with a potentially sharp transition when a line started or ended with a bright signal, old TVs typically didn't display the entire transmitted image.

A small part of the left and right of each line was cropped off, as was a little at the top and bottom (sort of like zooming in slightly). This was referred to as Overscan. It was a useful safety feature in the days of analog TV, whose components were, well, analog, and could drift.

In digital televisions, this is no longer an issue. You probably have controls on your digital TV or cable box to turn off overscan and get more of the picture you are paying for! The non-visible lines clipped out of the viewed area were later used to add additional metadata to the image, such as closed-captions, and in the UK, the awesome Teletext service.
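As a rough illustration of what the overscan cropping described above amounts to on a digital frame, here is a minimal sketch; the 5% margin is an assumption for the example, since real sets varied.

```python
def crop_overscan(frame, margin=0.05):
    """Return the frame with a small border cropped from every edge.

    `frame` is a list of rows (each row a list of pixel values); the
    5% margin is illustrative, roughly what analog overscan used to hide.
    """
    rows, cols = len(frame), len(frame[0])
    dy, dx = int(rows * margin), int(cols * margin)
    return [row[dx:cols - dx] for row in frame[dy:rows - dy]]

# Example: a 480 x 640 frame loses ~24 rows and ~32 columns on each edge.
frame = [[0] * 640 for _ in range(480)]
cropped = crop_overscan(frame)
print(len(cropped), len(cropped[0]))  # 432 576
```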

It's not quite as simple as we've described it above. When television was first invented, bandwidth was limited, yet there was still the desire to keep the refresh rate as high as possible to prevent flicker and to exploit persistence of vision, in which a rapidly flickering image is interpreted by the brain as a continuous image. Depending on your region, a TV picture refreshes 50 or 60 times per second.

A solution to this dilemma is interlacing. Rather than repaint the entire image 60 times each second, the screen is still refreshed at that rate, but on each pass only half the image is updated; every other line is skipped. Alternately, the even-numbered, then the odd-numbered, scan lines are transmitted. These mesh together like interlocking fingers, or a perfect riffle shuffle of a deck of cards.

It's not a perfect solution: if objects are moving rapidly, they are at different positions on the odd and even lines. This can cause image 'tearing', and the edges of objects can look like they are being viewed through a comb. When the signal is transmitted as alternating fields of every other line, it is called an interlaced signal. For the bandwidth-preservation reasons described above, this is how over-the-air TV transmissions were broadcast.

When every line of a frame is rendered, in order, this is called progressive scan. The classification of a transmission is usually indicated by the number of lines, followed by a lower-case i or p to represent the type of scan; you may see references today to signals such as 480i, 720p, or 1080i. Here are a couple of interactive animations showing the different rendering styles. The first is progressive scan: each line is drawn sequentially, in order down the screen. Next is interlaced: first the odd, then the even lines are drawn.
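To make the two orderings concrete, here is a toy sketch for a nine-line picture; it ignores half-lines and blanking entirely.

```python
def progressive_order(total_lines):
    """Every line drawn top to bottom, one full frame at a time."""
    return list(range(1, total_lines + 1))

def interlaced_order(total_lines):
    """Odd-numbered lines first (one field), then even-numbered (the other)."""
    odd = list(range(1, total_lines + 1, 2))
    even = list(range(2, total_lines + 1, 2))
    return odd + even

print(progressive_order(9))  # [1, 2, 3, 4, 5, 6, 7, 8, 9]
print(interlaced_order(9))   # [1, 3, 5, 7, 9, 2, 4, 6, 8]
```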

In any field, only half the lines are drawn. In odd fields of the picture, lines start at the top left, and only half a line is drawn as the last line of the field. Even fields start their first line half-way across the screen, and finish in the lower right. Because of the overscan described earlier, these half-lines are never seen.

Numerically, lines are labeled based on the order they are rendered (the scan-line ordering in the signal). This means that, because of interlacing, adjacent lines on the display are not numerically adjacent. In interlaced video, there are two fields in every frame, typically labeled field 1 and field 2 (odd and even). Field 2 starts with its sync pulse midway through a line, so even-numbered lines are on field 2.
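To see why display neighbors are not signal neighbors, here is a toy mapping from the order a line arrives in the signal to the row it occupies on screen, again for a simplified nine-line raster with no half-lines; real field numbering is fussier than this.

```python
def display_row(signal_line, total_lines=9):
    """Map a 1-based transmission-order line number to its on-screen row."""
    field1 = (total_lines + 1) // 2        # lines carried by the first field
    if signal_line <= field1:
        return 2 * signal_line - 1         # first field fills the odd rows
    return 2 * (signal_line - field1)      # second field fills the even rows

print([display_row(n) for n in range(1, 10)])  # [1, 3, 5, 7, 9, 2, 4, 6, 8]
# Display rows 3 and 4 sit next to each other on screen, yet they arrive as
# signal lines 2 and 7 -- far apart in the transmission order.
```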

But what about the half-lines: how do you label these? And should you call the first half-line zero or one? Unless you work in the industry, I'd not worry about this.

An additional set of pulses below blanking level is inserted to mark the start of each frame. These provide vertical synchronization and also allow the beam to sweep back up to the top of the screen to render the next frame. This is the vertical blanking interval. These trains of pulses are more complex, and comprise a series of pre-equalization pulses, followed by field-sync (vertical) pulses called broad pulses, followed by post-equalization pulses, before the scan lines for the next frame are delivered.

The pre-equalization pulses are six pulses at half the pulse width of the horizontal sync pulses, but at twice the frequency, so that they occupy the same time period as three regular lines; the mid-line sync pulse is ignored, keeping the timing correct, and by making the pulses half the width, the energy content stays the same.

Next are six broad pulses, delivered over the time of three additional lines. Finally comes a set of post-equalization pulses, which are identical to the pre-equalization pulses. In total, these pulses take the space of nine scan lines. In addition to the vertical sync pulses, a total of roughly the first 20 lines of every field are reserved for vertical blanking and additional control information (V-chip, stereo audio, subtitles, …).

These lines are called the vertical blanking interval, and nothing transmitted in them is rendered. What this means is that, on an analog NTSC signal, the highest possible vertical resolution is about 485 lines.
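The arithmetic, as a tiny sketch; the 20 and 25 blanked lines per field are the round figures used in this article, and the exact counts in the standards differ by a line or so.

```python
def active_lines(total_lines, vbi_lines_per_field):
    """Lines left for picture after both fields lose their blanking lines."""
    return total_lines - 2 * vbi_lines_per_field

print("NTSC:", active_lines(525, 20))   # 485 lines of picture, at most
print("PAL: ", active_lines(625, 25))   # 575 lines of picture, at most
```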

For PAL, the vertical blanking interval is 25 lines per field, resulting in an active resolution of 575 lines. Just because there are half-lines, do not think of these as discontinuities. The train of pulses is continuous and at a constant rate; it's just that there is a non-integral number of lines in each field. A good analogy for how this works is to imagine the signal like a brick wall built using a regular running bond (called a stretcher bond).

In any small section of wall, it's possible to identify even and odd fields, but when these are built up alternating between odd and even, the bricks form courses that are continuous. In this analogy, the vertical mortar lines between bricks are the horizontal sync pulses, and the transitions between odd and even fields are the vertical sync pulses.

So how many lines should a picture have? Obviously, there is a desire to make the count as high as possible (for the best picture quality), but this is in tension with the bandwidth needed, which pushes the count as low as possible.

In France, they had an 819-line standard for a while (the first HD? Around the time of WWII - impressive, but a bandwidth hog), with a lower line count before that. At first glance, these really do seem like arbitrary numbers.

The answer to the puzzle is ratios. Because of interlacing, the horizontal and vertical timebases needed to be precise and locked in an exact ratio to each other. The timing signals were generated through a series of electronic dividing circuits, each division being by an odd number.
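As a worked illustration, the common line counts factor neatly into chains of small odd divisors. The sketch below includes the UK's 405-line and the pre-war 441-line standards for comparison; the real divider chains in period hardware were arranged in various orders, so treat this as arithmetic, not circuit design.

```python
def odd_divider_chain(n, divisors=(3, 5, 7)):
    """Greedily factor a line count into a chain of small odd divider stages."""
    chain = []
    for d in divisors:
        while n % d == 0:
            chain.append(d)
            n //= d
    return chain if n == 1 else None   # None: not reachable with these stages

for lines in (405, 441, 525, 625):
    print(lines, "=", " x ".join(map(str, odd_divider_chain(lines))))
# 405 = 3 x 3 x 3 x 3 x 5
# 441 = 3 x 3 x 7 x 7
# 525 = 3 x 5 x 5 x 7
# 625 = 5 x 5 x 5 x 5
```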

Technology constraints of the '30s and '40s necessitated using only small integers, preferably no greater than 7, for stability. I should probably have answered this question first, rather than burying it right here, but to answer it you need an appreciation of how the analog system worked. The question is: why devise a system with an odd number of lines in the first place, and cause all this headache? Wouldn't it be simpler to define an interlaced system with an even number of lines?

The answer is yes, and for digital protocols, this is what has been done. However, remember back to the time when the television standards were being defined.

In the example clip, the toy moves forward very slightly from one frame to the next. By putting together 15 or more subtly different frames per second, the brain integrates them into a moving scene. Fifteen per second is about the minimum possible -- any fewer than that and it looks jerky.

When you download and watch the MPEG file offered at the beginning of this section, you see both of these processes at work simultaneously.

Your brain is fusing the dots of each image together to form still images and then fusing the separate still images together into a moving scene. Without these two capabilities, TV as we know it would not be possible. A few TVs in use today rely on a device known as the cathode ray tube, or CRT, to display their images. LCDs and plasma displays are other common technologies.

It is even possible to make a television screen out of thousands of ordinary light bulbs! You may have seen something like this at an outdoor event like a football game. Let's start with the CRT, however. The terms anode and cathode are used in electronics as synonyms for positive and negative terminals. For example, you could refer to the positive terminal of a battery as the anode and the negative terminal as the cathode.

In a cathode ray tube, the "cathode" is a heated filament, not unlike the filament in a normal light bulb. The heated filament sits in a vacuum created inside a glass "tube." The "ray" is a stream of electrons that pour off the heated cathode into the vacuum. Electrons are negative, and the anode is positive, so it attracts the electrons pouring off the cathode. In a TV's cathode ray tube, the stream of electrons is focused by a focusing anode into a tight beam and then accelerated by an accelerating anode.

This tight, high-speed beam of electrons flies through the vacuum in the tube and hits the flat screen at the other end of the tube. This screen is coated with phosphor, which glows when struck by the beam. In the basic diagram of a CRT, then, there is a cathode and a pair (or more) of anodes, there is the phosphor-coated screen, and there is a conductive coating inside the tube to soak up the electrons that pile up at the screen end of the tube.

However, in this diagram you can see no way to "steer" the beam -- the beam will always land in a tiny dot right in the center of the screen. That's why, if you look inside any TV set, you will find that the tube is wrapped in coils of wires.

The steering coils are simply copper windings (see How Electromagnets Work for details on coils). These coils are able to create magnetic fields inside the tube, and the electron beam responds to the fields. One set of coils creates a magnetic field that moves the electron beam vertically, while another set moves the beam horizontally.
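In practice the two coil sets are not pointed at arbitrary spots but swept with ramp ("sawtooth") waveforms: a fast ramp for the horizontal deflection and a much slower one for the vertical. Here is a minimal, idealized sketch; the periods are the approximate NTSC values used earlier, and real sweep circuits shape the flyback rather than snapping back instantly.

```python
def sawtooth(t, period):
    """Normalized sweep value in -1..1: a steady ramp, then an instant flyback."""
    phase = (t % period) / period
    return 2.0 * phase - 1.0

H_PERIOD = 63.5e-6     # seconds per scan line (approximate NTSC figure)
V_PERIOD = 1.0 / 60.0  # seconds per field (approximate)

# 100 microseconds into a field: the beam is partway across its second line,
# and has crept only slightly down from the top of the screen.
t = 100e-6
print(sawtooth(t, H_PERIOD))  # horizontal position, roughly +0.15
print(sawtooth(t, V_PERIOD))  # vertical position, roughly -0.99 (near the top)
```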

By controlling the voltages in the coils, you can position the electron beam at any point on the screen.

A phosphor is any material that, when exposed to radiation, emits visible light. The radiation might be ultraviolet light or a beam of electrons.

Any fluorescent color is really a phosphor -- fluorescent colors absorb invisible ultraviolet light and emit visible light at a characteristic color. In a CRT, phosphor coats the inside of the screen. When the electron beam strikes the phosphor, it makes the screen glow. In a black-and-white screen, there is one phosphor that glows white when struck. In a color screen, there are three phosphors arranged as dots or stripes that emit red, green and blue light.

There are also three electron beams to illuminate the three different colors together. There are thousands of different phosphors that have been formulated.

They are characterized by their emission color and by how long the emission lasts after they are excited. In a black-and-white TV, the screen is coated with white phosphor and the electron beam "paints" an image onto the screen by moving across the phosphor a line at a time. To "paint" the entire screen, electronic circuits inside the TV use the magnetic coils to move the electron beam in a "raster scan" pattern across and down the screen.

The beam paints one line across the screen from left to right. It then quickly flies back to the left side, moves down slightly and paints another horizontal line, and so on down the screen.

In this figure, the blue lines represent lines that the electron beam is "painting" on the screen from left to right, while the red dashed lines represent the beam flying back to the left. When the beam reaches the right side of the bottom line, it has to move back to the upper left corner of the screen, as represented by the green line in the figure. When the beam is "painting," it is on, and when it is flying back, it is off so that it does not leave a trail on the screen.

The term horizontal retrace is used to refer to the beam moving back to the left at the end of each line, while the term vertical retrace refers to its movement from bottom to top. As the beam paints each line from left to right, the intensity of the beam is changed to create different shades of black, gray and white across the screen.
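As a pseudocode-style sketch, the raster pattern just described looks like this; paint_pixel is a hypothetical stand-in for actually driving the beam, and the retrace steps appear only as comments.

```python
def paint_pixel(intensity):
    """Stand-in for actually driving the electron beam (hypothetical)."""
    pass

def raster_scan(frame):
    """Paint a frame line by line, left to right, top to bottom.

    `frame` is a list of rows of intensity values (0 = black, 255 = white).
    """
    for row in frame:                 # top to bottom
        for intensity in row:         # left to right: beam on, brightness varies
            paint_pixel(intensity)
        # horizontal retrace: beam off, fly back to the left edge
    # vertical retrace: beam off, fly back to the top-left corner

raster_scan([[0, 128, 255], [255, 128, 0]])   # a tiny 2-line, 3-pixel "image"
```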

Because the lines are spaced very closely together, your brain integrates them into a single image. A TV screen normally has about 480 lines visible from top to bottom. In the next section, you'll find out how the TV "paints" these lines on the screen.

Standard TVs use an interlacing technique when painting the screen. In this technique, the screen is painted 60 times per second but only half of the lines are painted per frame. The beam paints every other line as it moves down the screen -- for example, every odd-numbered line.

Then, the next time it moves down the screen it paints the even-numbered lines, alternating back and forth between even-numbered and odd-numbered lines on each pass. The entire screen, in two passes, is painted 30 times every second. The alternative to interlacing is called progressive scanning , which paints every line on the screen 60 times per second. Most computer monitors use progressive scanning because it significantly reduces flicker.

Because the electron beam is painting all 525 lines 30 times per second, it paints a total of 15,750 lines per second (see the quick check after this paragraph). Some people can actually hear this frequency as a very high-pitched sound emitted when the television is on. When a television station wants to broadcast a signal to your TV, or when your VCR wants to display the movie on a video tape on your TV, the signal needs to mesh with the electronics controlling the beam so that the TV can accurately paint the picture that the TV station or VCR sends.
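A quick check of that line-rate figure; the NTSC total of 525 lines per frame includes the lines hidden in the blanking intervals.

```python
LINES_PER_FRAME = 525   # NTSC total, including the lines hidden in blanking
FRAMES_PER_SEC  = 30    # two interlaced fields per frame, 60 fields per second

print(LINES_PER_FRAME * FRAMES_PER_SEC)   # 15750 -> a roughly 15.75 kHz whine
# (Color NTSC actually runs a hair slower, at about 15,734 lines per second.)
```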

A signal that contains all three of these components -- intensity information, horizontal-retrace signals, and vertical-retrace signals -- is called a composite video signal.

One line of a typical composite video signal looks something like the image on this page. The horizontal-retrace signals are 5-microsecond pulses (abbreviated as "us" in the figure) at zero volts.
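To tie the pieces together, here is a rough sketch that builds one line of such a signal as a list of (duration in microseconds, volts) segments. The 0.3 V blanking level and 1.0 V white level are the common 1-volt peak-to-peak convention, and the porch and sync durations reuse the approximate figures from earlier; these are assumptions for illustration, not the exact values in the figure.

```python
def composite_line(pixels):
    """pixels: brightness values from 0.0 (black) to 1.0 (white) for one line."""
    line = [
        (1.5, 0.3),   # front porch, held at the blanking level
        (4.7, 0.0),   # horizontal sync pulse, down at zero volts
        (4.7, 0.3),   # back porch, back up to the blanking level
    ]
    active_us = 52.6 / len(pixels)               # split the active video evenly
    for p in pixels:
        line.append((active_us, 0.3 + 0.7 * p))  # brightness mapped to voltage
    return line

# A three-"pixel" line: black, mid-gray, white.
print(composite_line([0.0, 0.5, 1.0]))
```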


