
A Look Back at the Trend-setters of 2017

This year's haul of outstanding emerging technologies is summarized below by yours truly.

Laser-Equipped Drones that Zap Small Unwanted Pests (in a fixed area)

I've seen this topic popping up in popular tech news (mostly from innovation labs) these past few months, and the drones do offer quite an effective solution. I remember a YouTube video detailing how an innovation lab (Intellectual Ventures) plans to use the technology in the fight against malaria: stationing guard posts at all four corners of a farmer's residence and zapping mosquitoes with photonic lasers. (more information in the link below)

Figure 1. A time-lapse photo of a mosquito getting zapped is shown (t=0 starting at the leftmost frame).

Technically, the idea has already been around for years, first introduced to the public around six years ago, but it has only gained traction quite recently. Take an application to ichthyology as an example. An increase in marine ectoparasites due to changes in climate and weather patterns has been causing salmon fishermen sleepless nights. But such nights have been lessened with the help of lice-hunting underwater drones. In the fish pens of the far North Sea in Norway, they perambulate underwater, scouring the premises for sea lice. These bots can fry the critters at a distance of as much as 2 meters. Their identification mechanism is similar to how smartphones pick out human faces, but works at a much faster pace. Thus far, estimates suggest that only two such drones will be needed per fish pen.
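The detect-then-zap loop described above can be sketched in a few lines. To be clear, this is a toy stand-in for the drones' actual software, which I haven't seen: the pixel threshold and the sample frame below are made up for illustration, and only the 2-meter range figure comes from the reports.

```python
# Toy sketch of a detect-then-target loop. Hypothetical values throughout:
# only the 2 m firing range is taken from the article; the threshold and
# "frame" are illustrative stand-ins for the real pattern-matching step.
MAX_RANGE_M = 2.0

def detect_lice(frame, threshold=0.8):
    """Return (row, col) positions whose intensity exceeds the threshold,
    a crude stand-in for the face-detection-like classifier on the drone."""
    hits = []
    for r, row in enumerate(frame):
        for c, value in enumerate(row):
            if value >= threshold:
                hits.append((r, c))
    return hits

def in_range(distance_m):
    """Only fire the laser if the target is within the quoted 2 m range."""
    return distance_m <= MAX_RANGE_M

frame = [
    [0.1, 0.2, 0.9],
    [0.1, 0.85, 0.2],
]
targets = detect_lice(frame)
print(targets)        # [(0, 2), (1, 1)]
print(in_range(1.5))  # True
print(in_range(2.5))  # False
```

The real system presumably runs a trained classifier over camera frames rather than a fixed threshold, but the control flow (classify, range-check, fire) is the same shape.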

Promising as it is, the solution still needs to be backed by formal documentation before it is worth considering as an alternative to more expensive and tedious methods.

It does make me wonder whether we could use other naturally occurring factors in our environment as a substitute for lasers. Would it be possible, or cheaper, to focus sound waves on a certain locale to cause permanent injury to an insect a millimeter in size? Or perhaps a giant mosquito net would prove more economical?

Deep, Deep Learning, Artificial Intelligence and the Race Towards Quantum Supremacy

Most prominent in machine vision applications (such as self-driving cars and cancer identification), machine learning has taken almost all sectors of the industry by storm. This year, things have gone up a notch with Google's push to create a 7x7 array of qubits (a portmanteau of the words "quantum" and "bit") on a single integrated circuit.

Figure 2. Google's 2x3 qubit array quantum computing chip is shown above.
Source: Lucero, E. (2017, June) "Google Aims for Quantum Computing Supremacy."  IEEE Spectrum, p. 8

I'm a mere dilettante when it comes to quantum computing, but the main issue this technology seems to address is error correction. Physicists say such a system is still far from what truly motivates the study (which, I believe, is the replication of the human brain?). But if Google succeeds in this endeavor, it will have a powerful decryption tool before the year ends. By the way, Google isn't the only player making significant strides in this field. IBM has also pledged to jump-start a project for a 50-qubit system in the coming years. What's more, it plans to make the system accessible through the cloud!
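To put the 7x7 figure in perspective: simulating an n-qubit state on a classical machine takes 2^n complex amplitudes, so the cost doubles with every added qubit. A quick back-of-the-envelope check (the 16 bytes per amplitude below assumes double-precision complex numbers, my assumption, not a figure from the article):

```python
# Why a 7x7 qubit array is pitched as a "supremacy" milestone: an n-qubit
# state vector needs 2**n complex amplitudes to simulate classically.
def state_vector_size(n_qubits):
    return 2 ** n_qubits

def memory_gib(n_qubits, bytes_per_amplitude=16):  # complex128 assumption
    return state_vector_size(n_qubits) * bytes_per_amplitude / 2**30

print(state_vector_size(6))    # the 2x3 test chip of Figure 2: 64 amplitudes
print(state_vector_size(49))   # a 7x7 array: 562949953421312
print(round(memory_gib(49)))   # 8388608 GiB, i.e. about 8 PiB of RAM
```

Six qubits fit in a pocket calculator's memory; 49 qubits already outstrip any classical machine's, which is roughly the argument behind the race.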

Figure 3. Shown is a car equipped with self-driving hardware.
Are you ready to give up the driver's seat for a set of preconditioned algorithms? If your answer is yes then sadly we're not on the same page. If you're answer is no, we're still not on the same page because I don't have a car (i.e. I have a penchant for taking walks and using public transportation). Weighing the bliss one feels of actually driving down an open highway to just sitting in the passenger's seat is quite difficult for one who lacks in experience. But one thing I am firm about - I wouldn't like a huge ugly chunk of whatchamacallit sitting at the top of my car! *Blech*

The self-driving car has caused a lot of ruckus in the media, yet it seems to have failed to gain the public's favor. (see the link below for more information)

Could it be because of all the accidents and mishaps during test runs? Or perhaps the security risks?

I personally took a MOOC on machine learning through Coursera around two years ago, so I have a decent grasp of what is essentially happening inside a self-driving car. The biggest problem that came to my mind at the time was this: if the algorithm/neural network adapts to a training set provided by "us," then how can we ascertain that we have fed it enough training data? This can be estimated computationally, but in a practical sense, given a chaotic world brimming with stochastic processes, how sure is sure enough?
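One standard, if partial, answer to "how sure is sure enough?" is held-out validation: measure error on data the model never adapted to. The sketch below uses a deliberately toy task and "model" (a single learned threshold) of my own invention, just to show the train/validate split; real self-driving stacks face the far harder problem that the validation set itself may not cover the world's edge cases.

```python
import random

# Toy illustration of held-out validation. The task, data, and "model"
# are made-up stand-ins; only the train/validate methodology is the point.
random.seed(0)

def make_data(n):
    # Synthetic task: the true label is 1 whenever the feature exceeds 0.5.
    xs = [random.random() for _ in range(n)]
    return [(x, 1 if x > 0.5 else 0) for x in xs]

def fit_threshold(train):
    # "Training": pick the cutoff that best separates the training labels.
    best, best_err = 0.5, float("inf")
    for cand in [i / 100 for i in range(101)]:
        err = sum((x > cand) != bool(y) for x, y in train)
        if err < best_err:
            best, best_err = cand, err
    return best

data = make_data(1000)
train, valid = data[:800], data[800:]   # hold out 20% the model never sees
t = fit_threshold(train)
val_err = sum((x > t) != bool(y) for x, y in valid) / len(valid)
print(val_err)  # low held-out error is evidence, not proof, of enough data
```

The catch, and the point of the paragraph above, is that low validation error only certifies performance on worlds that resemble the validation set.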

Again, I have only a smattering of background in machine learning, which makes my statements above quite contentious.


Popular posts from this blog

Calculator Techniques for the Casio FX-991ES and FX-991EX Unraveled

In solving engineering problems, one may not have the luxury of time. Most situations demand immediate results. Falling behind schedule is costly and damaging to one's reputation. Therefore, every precaution must be taken to expedite calculations. The following introduces methods to tackle such problems speedily using the Casio FX-991ES and FX-991EX calculators.

►For algebraic problems where you need to find the exact value of a dependent or independent variable, just use the CALC or [ES] Mode 5 functions or [EX] MENU A functions.

►For definite differentiation and integration problems, simply use the d/dx and integral operators in the COMP mode.

►For models that follow the differential equation dP/dt = kP and models that follow a geometric function (i.e. A*B^x):

-Simply go to Mode 3 (STAT) 5: e^x
-For geometric functions: Mode 3 (STAT) 6: A*B^x
-(Why? Because the solution to the D.E. dP/dt = kP is an exponential function, P = P₀e^(kt).)
When we know the boundary con…
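For readers without the calculator at hand, the regression behind those STAT modes can be sketched in Python: fitting y = A*B^x by taking logarithms so it becomes a straight line, ln(y) = ln(A) + x·ln(B), and running ordinary least squares. The data points below are synthetic (they follow y = 2·3^x exactly), chosen so the recovered constants are easy to check.

```python
import math

# Sketch of what the calculator's A*B^x regression mode computes:
# linearize y = A * B**x via logarithms, then least-squares fit the line.
def fit_exponential(points):
    xs = [x for x, _ in points]
    ls = [math.log(y) for _, y in points]  # requires y > 0
    n = len(points)
    mx, ml = sum(xs) / n, sum(ls) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxl = sum((x - mx) * (l - ml) for x, l in zip(xs, ls))
    slope = sxl / sxx
    intercept = ml - slope * mx
    return math.exp(intercept), math.exp(slope)  # A, B

# Synthetic data following y = 2 * 3**x exactly:
A, B = fit_exponential([(0, 2), (1, 6), (2, 18), (3, 54)])
print(round(A, 6), round(B, 6))  # 2.0 3.0
```

Note that least squares on ln(y) is not identical to least squares on y itself, but it is the standard (and what I believe the calculator implements) because it reduces to linear regression.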

Common Difficulties and Mishaps in 6.004 Computation Structures (by MITx)

May 6, 2018
VLSI Project: The Beta Layout [help needed]

Current Tasks:
►Complete 32-bit ALU layout [unpipelined] in a 3-metal-layer C5 process.
►Extend Excel VBA macro to generate code for sequential instructions (machine language to actual electrical signals).

Current Obstacles/Unresolved Decisions:
►Use of complementary CMOS or pass-transistor logic (do both? time-expensive; will depend on schedule)
►Adder selection: Brent-Kung, Kogge-Stone, or Ladner-Fischer (Brent-Kung takes up the most space but seems to be the fastest; consider fan-out) (do all? time-expensive; will depend on schedule)
►Layout requirements and DRC errors

Please leave a comment on the post below for advice. Any help is highly appreciated.

Yay or Nay? A Closer Look at AnDapt’s PMIC On-Demand Technology

Innovations in making product features customizable have recently been gaining popularity. Take AnDapt, for example, a fabless start-up that unveiled its Multi-Rail Power Platform technology for On-Demand PMIC applications a few months back. (read all about it here: Will PMIC On-Demand Replace Catalog Power Devices?) Their online platform, WebAmp, enables the consumer to configure the PMIC based on desired specifications. Fortunately, I got hands-on experience during the trial period (without the physical board (AmP8DB1) or adaptor (AmpLink)). In my opinion, their GUI is friendly, but it lacks a verification method for tuning (i.e. for the entered combination of specs). How would we know whether it will perform as expected, or whether there are contradicting indications that yield odd behavior? Also, there is not just one IP available but many, catering to differing numbers of channels and voltage requirements (each with its own price tag).
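The missing verification step I'm wishing for could be as simple as a rule check run before committing a configuration. The sketch below is entirely hypothetical, not AnDapt's API: the device limits and rail names are invented for illustration, since the real AmP8DB1 constraints aren't public in a form I can quote.

```python
# Hedged sketch of a pre-commit configuration check for a PMIC tuning GUI.
# LIMITS is a hypothetical device profile, not AnDapt's actual datasheet.
LIMITS = {"max_channels": 8, "vout_min": 0.6, "vout_max": 5.5}

def validate_config(rails):
    """Check a {rail_name: vout_volts} request against device limits.
    Returns a list of human-readable problems; empty means plausible."""
    problems = []
    if len(rails) > LIMITS["max_channels"]:
        problems.append("too many rails for this device")
    for name, vout in rails.items():
        if not LIMITS["vout_min"] <= vout <= LIMITS["vout_max"]:
            problems.append(f"{name}: {vout} V outside supported range")
    return problems

print(validate_config({"core": 1.0, "io": 3.3}))  # [] -> looks plausible
print(validate_config({"core": 0.2}))             # flags the 0.2 V rail
```

A real checker would also catch cross-rail interactions (sequencing, total load, contradictory spec combinations), which is exactly the class of "queer behavior" a per-field GUI can't see.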
Every new emerging technology has the potential to oversh…