CRT was a popular display technology in TVs and computer monitors. With the advent of more modern technologies, the name is no longer widely used.
CRT stands for “cathode ray tube”, the glass imaging tube used in older TVs and computer monitors before flat screens appeared. According to How-To Geek, the advantage of the CRT is a flexible display with no moving parts.
Because the technology was widely used for many years, TVs and monitors built around it are often simply called CRT TVs or CRT monitors.
How CRT TV works
Before the discovery of the electron, scientists called the electron flow “cathode rays” because it moved from the cathode to the anode. Based on this theory, German engineer Karl Ferdinand Braun created the cathode-ray tube (CRT) in 1897, a vacuum tube that uses magnetic deflection to display the waveform of an alternating current.
The CRT is a glass vacuum tube consisting of three main components: an electron source (also called the electron gun), an electron deflection system and a luminescent screen (the phosphor screen).
A color CRT screen has three electron guns, one each for red, green and blue. They shoot streams of electrons onto the phosphor screen, stimulating it to emit light. The intensity of each electron beam can be adjusted, changing the brightness in specific regions.
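The additive mixing described above can be sketched in a few lines. This is an illustrative model only, not real display-driver code: the function name and the [0, 1] intensity scale are assumptions chosen for the example.

```python
# Illustrative model of a color CRT pixel: each of the three electron
# guns has an adjustable beam intensity in [0.0, 1.0], and the perceived
# color is the additive mix of the red, green and blue phosphors.

def pixel_color(red_gun: float, green_gun: float, blue_gun: float) -> tuple:
    """Map three beam intensities to an 8-bit RGB triple."""
    def clamp(x: float) -> float:
        return max(0.0, min(1.0, x))
    return tuple(round(clamp(i) * 255) for i in (red_gun, green_gun, blue_gun))

# All three guns at full intensity light all three phosphors -> white.
print(pixel_color(1.0, 1.0, 1.0))  # (255, 255, 255)
# Only the red gun firing -> pure red.
print(pixel_color(1.0, 0.0, 0.0))  # (255, 0, 0)
```

Dimming all three beams equally lowers brightness without changing hue, which is how a CRT darkens a region of the picture.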
Most CRT TVs and monitors draw the image in horizontal lines from top to bottom, scanning the whole screen 30 or 60 times per second (raster graphics). Meanwhile, the CRT in an oscilloscope or some arcade machines traces the outlines of objects directly (vector displays).
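The difference between the two scanning styles can be sketched as follows. The function names are hypothetical, not a real display API; this is a minimal model of where the beam goes in each case.

```python
# Illustrative sketch of the two CRT scanning styles described above.

def raster_scan(frame):
    """Raster display: the beam visits every pixel, left to right and
    top to bottom, redrawing the whole frame 30 or 60 times per second."""
    for y, row in enumerate(frame):
        for x, intensity in enumerate(row):
            yield x, y, intensity  # beam brightness at this pixel

def vector_draw(segments):
    """Vector display (oscilloscope-style): the beam traces only the
    outline segments of the object being drawn."""
    for (x0, y0), (x1, y1) in segments:
        yield (x0, y0), (x1, y1)  # deflect the beam between endpoints

# A raster display always does width x height work per frame...
frame = [[0] * 640 for _ in range(480)]
print(sum(1 for _ in raster_scan(frame)))  # 307200 pixels per frame

# ...while a vector display's work depends only on the shape's complexity.
triangle = [((0, 0), (10, 0)), ((10, 0), (5, 8)), ((5, 8), (0, 0))]
print(sum(1 for _ in vector_draw(triangle)))  # 3 beam strokes
```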
To function as a complete monitor, a CRT needs many additional circuits, such as a power supply and signal-processing boards to receive and handle the incoming signal. These components vary in structure and size depending on the monitor type and manufacturer.
The television boom was built on TVs using CRT technology. Computers then adopted the CRT monitor as an output device, eliminating the need to continuously print data onto paper.
Why don’t we use CRT anymore?
Some nostalgic users still prefer CRT monitors for gaming. Outside that group, the technology has almost disappeared from the market.
CRTs were most commonly used from the 1950s to the mid-2000s, on TVs and then on computer monitors. In the United States, most of the production lines for commercial CRT monitors have been shut down since the mid-2000s, some remaining until the 2010s. Currently, several companies still manufacture or refurbish CRT monitors, but not for the mass market.
CRT TVs died out because flat-screen technologies (mostly LCD) offered both commercial and practical advantages. Flat screens are cheaper to produce; they are thin and light, and they consume less energy and radiate less heat than CRTs. Flat screens also render at high resolutions and are easier to manufacture in large sizes.
The advantages of CRT
In the 2000s and 2010s, CRTs still held some advantages over flat screens, such as richer colors and faster response times. Over time, however, that gap has shrunk thanks to improvements in flat-screen technology.
Older gamers still favor CRT monitors for how well they handle low resolutions. On modern TVs, the output of old game consoles can look stretched or blurry, but on a CRT monitor the image is sharp, with vibrant colors.
In addition, many classic gaming accessories only work with CRT monitors, such as the light gun in Nintendo’s Duck Hunt. Finally, games look better on CRT monitors because developers exploited the CRT’s characteristics to blend colors and add sheen to the image. These effects do not render well on modern displays.
After the successful launch of the M1 processor, Apple is expected to introduce MacBooks with the M2 chip in the second half of this year, so it may be worth waiting before making a purchase decision.
According to Nikkei, TSMC began mass production of the Apple M2 processor in April, with shipments expected as early as July to equip MacBook models in the second half of the year.
Similar to its predecessor the M1, the M2 uses a system-on-chip (SoC) architecture, integrating the CPU, GPU, RAM and neural engine on a single chip.
The M2’s specifications and upgrades over its predecessor are unknown, except that it is manufactured by TSMC on an enhanced 5 nm (5 nm+) process. The Taiwanese company also makes the M1, the first processor Apple developed on the ARM architecture with the goal of replacing Intel CPUs. According to Nikkei, producing these advanced chips takes at least three months.
Earlier, analyst Ming-Chi Kuo of TF International Securities predicted that Apple would upgrade the MacBook Pro line in the second half of the year, adding more ports.
At the end of April, a leak from an Apple manufacturing partner mentioned two unreleased laptops with Apple Silicon chips: 14- and 16-inch MacBook Pro models. The leaked drawings show the products no longer have the Touch Bar, while the MagSafe charging port, HDMI port and SD card slot return.
Besides the MacBook, Nikkei predicts the M2 processor will be used in many other devices. Currently, Apple uses the M1 chip in the iPad Pro 2021 as well as the MacBook Pro, MacBook Air, Mac mini and iMac. When launching the new iMac, Apple said the M1 chip delivered 85% faster performance than the old Intel-based iMac.
You should not buy a MacBook M1 at this time
Since going on sale in November 2020, MacBooks with the M1 chip have been praised by many technology sites for their performance and energy efficiency. However, the chip supports only up to 16 GB of RAM, and its integrated graphics lag behind what Intel-based configurations offer. Tim Cook, Apple’s CEO, said the transition to the new processors would take about two years to complete.
The switch to Apple’s self-developed processors was a major blow to Intel, the largest US chip maker, whose x86 architecture has dominated the market for years.
According to research firm IDC, Mac sales in 2020 rose 29% to 23.1 million units, making Apple the world’s fourth-largest computer maker, after Lenovo, HP and Dell. Experts say demand for computers remains high despite the market’s difficulties from the chip shortage. According to IDC, some MacBook and iPad models had their production delayed due to a lack of chips and components.
“Using self-developed processors is a move that sets Apple apart from its competitors,” said Joey Yen, an analyst at IDC. “So far, Apple has had a successful M1 launch and a good product experience based on general user reviews,” Yen said.