Most of the time, when laypeople are asked what they think of the latest trends in technology, and for their opinions on the products about to hit the recycling bin, the typical response is that the usefulness of the new far outweighs that of the old. This view is so common it has become an entrenched stereotype, and its effects are very visible: a huge, almost exponential demand for the "new" product and the exact inverse for the "obsolete" one. Electronics practitioners tend to find it unnecessary to study old technologies and instead focus only on the new. Anything outside the new is considered ancient and negligible.

Let us take a closer look at exactly how technology has evolved from then to now (I won't be tackling much of the earlier 1800s, since I've pretty much covered that topic in one of my previous posts). Perhaps one can still remember the old black-and-white television set and the vacuum tube diode? These technologies were popular during the early 1900s (Vladimir Zworykin filed his television patent in 1923, and John Fleming invented the vacuum tube diode in 1904). Vacuum tubes served as the computing elements of most digital equipment at that time. Each logic element needed its own bulky tube, and because of this physical feature, the first computers occupied entire rooms. In spite of the occasional real bugs (insects) that would disrupt operation (hence the popular term "computer bug"), such machines were considered state-of-the-art until that fateful year, 1947, when the transistor was invented. Logic operations could now be done in solid state, which paved the way for smaller digital equipment. Computers finally became personal.

But the miracles didn't stop there: Gordon Moore predicted that transistor density on chips would double roughly every two years. Everything digital became smaller and smaller until you could finally play your favorite music on a device the size of your watch, the most popular being the iPod. And with the development of new technology came lower market prices. All of a sudden, everyone forgot about the bulky vacuum tube diodes and the bipolar junction transistors. Now what they're familiar with are CMOS, 3D/tri-gate transistors, Ivy Bridge, memristors, Core i7 processors, graphene-based transistors, smart materials, fiber optics, and so on.
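To get a feel for what that doubling means, here is a minimal sketch in Python. The starting point (the Intel 4004's roughly 2,300 transistors in 1971) is a well-known figure; the two-year doubling period is the assumption, and the projections are illustrative, not historical data:

```python
# Illustration only: projecting transistor counts under Moore's law,
# assuming a two-year doubling period from the Intel 4004 (1971, ~2,300).

BASE_YEAR, BASE_COUNT = 1971, 2300

def projected_transistors(year, doubling_period_years=2):
    """Return the projected transistor count for a given year."""
    return BASE_COUNT * 2 ** ((year - BASE_YEAR) / doubling_period_years)

for year in (1971, 1991, 2011):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
# 1971: ~2,300 / 1991: ~2.4 million / 2011: ~2.4 billion
```

Twenty doublings in forty years takes you from thousands of transistors to billions, which is roughly what actually happened.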

Then again, there seems to be one common factor in all these developments: they're all innovations in DIGITAL applications. Faster speeds, larger memory, smaller size, everything-er has been focused on the digital/computing market. But what about communications and other electronics applications? Could a CMOS device do better than the old technology at, for example, modulating broadcast signals at transmitter stations?

Some may say I'm pointing out something too obvious, but it's a point I believe needs reiterating, because it is tempting to neglect with all the ads going around about the latest trends in engineering. Just because something is obsolete doesn't mean it's useless now. In fact, the term "obsolete" means that something is no longer in use, and to say that old electronics is no longer in use is a big fallacy.

Old technology is still important, and it is important to remember it. Take the bulky vacuum tubes that were once used in ENIAC and other old computing equipment. Some communications applications, such as broadcasting, use very high-power signals, and such signals cannot be handled by typical solid-state components: because of their small size, the amount of power they can dissipate is very limited, and they will simply be destroyed. Vacuum tubes, with their unwanted bulk in digital applications, find great use in communications requiring high power. (A related device, the klystron amplifier, is used in high-power communications as well.)
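A back-of-the-envelope calculation shows why. The sketch below uses the standard thermal relation T_j = T_ambient + P × θ_JA; the thermal resistance and power figures are assumed for illustration, not taken from any datasheet:

```python
# Rough illustration of why power handling limits device choice.
# THETA_JA and the power levels below are assumed, not datasheet values.

AMBIENT_C = 25.0          # assumed ambient temperature
THETA_JA = 60.0           # assumed C/W for a small plastic-package transistor
MAX_JUNCTION_C = 150.0    # typical silicon junction temperature limit

def junction_temp(power_w):
    """Estimate junction temperature for a given dissipated power."""
    return AMBIENT_C + power_w * THETA_JA

for power in (1.5, 10_000):   # a modest load vs. a broadcast-level load
    t = junction_temp(power)
    verdict = "OK" if t <= MAX_JUNCTION_C else "DESTROYED"
    print(f"{power:>8} W -> {t:,.0f} C ({verdict})")
```

A watt or two keeps the junction comfortably below its limit; kilowatts of broadcast power would heat a small transistor thousands of times past it. A transmitter tube, by contrast, is physically built to shed that heat through forced-air or water cooling.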

Aside from differences in application, there is another reason why obsolete technology is relevant. If inventors were to invent anything they thought of out of the blue, and manufacturers were to simply keep manufacturing whatever these inventors dreamed up, the businesses built on those products would go bankrupt. Why? Say Charles invented a cool new phone that was 99% more energy efficient, 99% faster, and 99% cheaper than competing phones. Yet when it was introduced to the market, no one would buy it. It turns out that Charles' "super-efficient" phone used a completely different modulation scheme, a completely different machine architecture, and a completely different battery-charging method, so that no other phone or base station could interact with it, and a very unusual charger was required to re-energize it. Plus, if the phone needed repairs, the only company that could fix it was the one that manufactured it. Does this scenario sound familiar? For Apple it does. Ask a non-Apple user why he or she isn't an Apple fan, and compare the reason given to the one above. Of course, Apple thrived because its products can still communicate with other phones and can still be sensibly charged. Apple didn't phase out BACKWARD COMPATIBILITY in its products. What would have happened if the iPhone or iPad couldn't communicate with other mobile devices? If they didn't feature Wi-Fi?

In order for a product to be backward compatible, its inventor must know how the old technology worked and what principles governed its operation. That way, the new invention can function together with the older technology. A simple way to picture this is a negotiation that falls back to older standards, as sketched below.
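Here is a minimal sketch of that idea in Python. The device names and modulation schemes are hypothetical, chosen only to mirror Charles' phone from the example above:

```python
# A minimal sketch of backward compatibility as capability negotiation.
# Device names and modulation schemes are hypothetical examples.

LEGACY_SCHEMES = ["FM", "AM"]   # schemes every older device understands

class Phone:
    def __init__(self, name, schemes):
        self.name = name
        self.schemes = schemes  # ordered newest/most preferred first

    def connect(self, other):
        """Pick the first modulation scheme both devices support."""
        for scheme in self.schemes:
            if scheme in other.schemes:
                return f"{self.name} <-> {other.name} via {scheme}"
        return f"{self.name} cannot talk to {other.name}"

old_phone     = Phone("OldPhone", LEGACY_SCHEMES)
new_phone     = Phone("NewPhone", ["QAM-4096"] + LEGACY_SCHEMES)  # keeps fallback
charles_phone = Phone("CharlesPhone", ["QAM-4096"])               # fallback dropped

print(new_phone.connect(old_phone))      # works: falls back to FM
print(charles_phone.connect(old_phone))  # fails: no common scheme
```

The phone that kept the legacy schemes can still talk to everything already out there; Charles' phone, for all its efficiency, cannot.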

There are numerous other reasons why obsolete technology remains crucial, but I believe the two most important are its distinct applications and the need for backward compatibility with it.