Humanity has adopted technology in waves. Knowing that, I wanted to test whether these waves of new technology adoption could be auto-generated using AI and code. Below (Figure 1) is the resulting Technology Adoption Map.

Figure 1
How was this obtained? An unsupervised learning algorithm (AI) was fed a large set of technology topics. Each topic was represented as a numerical vector capturing data on publications, code, developers, bugs and so on. The AI system crunched the data and recursively surfaced clusters of technologies, shown below (Figure 2).
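The actual pipeline isn't published with this post, but the recursive clustering step can be sketched. The sketch below is a minimal, hypothetical version: topic names and feature vectors are made up, and a tiny two-way k-means stands in for whatever clustering algorithm was really used. It recursively bisects the topic vectors, yielding clusters at increasing depth, which is the structure behind a map like Figure 2.

```python
import numpy as np

def kmeans2(X, iters=20, seed=0):
    """Split the rows of X into two clusters with a tiny k-means (k=2)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), 2, replace=False)]  # random initial centers
    for _ in range(iters):
        # assign each point to its nearest center
        labels = np.linalg.norm(X[:, None] - centers[None], axis=2).argmin(axis=1)
        for k in (0, 1):
            if (labels == k).any():
                centers[k] = X[labels == k].mean(axis=0)  # recompute centers
    return labels

def recursive_clusters(names, X, min_size=2, depth=0, max_depth=3):
    """Recursively bisect topic vectors, yielding (depth, member_names) pairs."""
    yield depth, names
    if len(names) <= min_size or depth >= max_depth:
        return
    labels = kmeans2(X)
    for k in (0, 1):
        mask = labels == k
        # recurse only on proper, non-empty sub-clusters
        if mask.sum() and mask.sum() < len(names):
            yield from recursive_clusters(
                [n for n, m in zip(names, mask) if m], X[mask],
                min_size, depth + 1, max_depth)
```

With well-separated toy vectors (say, systems languages vs. deep-learning frameworks), the first recursion level already splits the two families apart; deeper levels would surface finer waves.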

Figure 2
The clusters are interesting. For example, there is a “C” cluster of the base languages that have existed for decades. You can see trends such as the advent of the internet and the evolution from SQL to NoSQL databases. The rise of the internet also proliferated Python, PHP, HTML and Java, including through the spread of platforms like WordPress. Then the internet becomes “efficient”, with better front ends and various “framework”-based client-server architectures. As you delve deeper you see the rise of AI, cryptocurrencies, IoT, VR, and gaming. Here you can start to feel the presence of waves. The advent of deep learning later breaks into specific aspects like TensorFlow. The “battle” within the Internet of Things (IoT) between Arduino and Raspberry Pi is also visible.
The AI showed the waves. Cryptocurrency moves with the InterPlanetary File System (IPFS) and Monero, with Ethereum now picking up. The deeper recursions surface the later waves. The rise of Kotlin, which Google endorsed in a big way at its 2018 I/O conference, is also observed. When these recursions were reduced to two dimensions, the standout features were the “spread of code” and the “rise of a code developer base”, which are plotted in Figure 1. Plotted along these dimensions, the map is a fair representation of the waves and the trends in adoption. The aspects/waves are named differently because, due to the dimensional reduction, they are not exactly the same as the recursions.
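The post doesn't say which dimensionality-reduction technique was used, so as one plausible sketch, here is PCA via the SVD: the two leading components play the role of the two axes of Figure 1, which the post then interprets and labels by hand (“spread of code”, “rise of a code developer base”).

```python
import numpy as np

def project_2d(X):
    """Reduce topic feature vectors to 2 dimensions with PCA (via SVD).

    Returns each topic's coordinates along the two directions of
    greatest variance -- the kind of axes a 2-D adoption map is built on.
    """
    Xc = X - X.mean(axis=0)                      # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:2].T                         # project onto top-2 components
```

The axis labels themselves are a human interpretation: PCA only guarantees that the first axis carries at least as much variance as the second.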
One may argue that this is the “supply” side of the equation, indicating the spread of development (source code, libraries, etc.). Does the “demand” side validate it?
“Demand” from industry can be broken down by the typical usage cycle. This starts with research and runs through to the actual production/delivery of goods and services using the technology. Companies, driven by shareholder requirements, are very diligent about their investments in research focus areas and what they publish. Research is a lead indicator of what is coming, but when research on a given technology rises above a critical mass, it is also an indication of adoption by companies. Production requires people to develop with these technologies, so jobs are a great indicator of mass adoption of a technology.
While research precedes production by years, patents can be another measure that in certain cases is an outcome of research. Patents have not been used in this blog, but the possibility was demonstrated in the previous blog (Auto-generate reports via coding & AI). “Demand”, represented through the research surrogate, is shown below (Figure 3): the top research areas in 5-year blocks are plotted.
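The 5-year-block analysis behind a chart like Figure 3 can be sketched in a few lines. This is a hypothetical reconstruction, not the post's actual code: it assumes the research data has been boiled down to `(year, topic)` publication records, buckets them into 5-year blocks, and keeps the most frequent topics per block.

```python
from collections import Counter, defaultdict

def top_topics_by_block(records, block=5, top_k=3):
    """Group (year, topic) publication records into 5-year blocks and
    return the most frequent research topics in each block."""
    counts = defaultdict(Counter)
    for year, topic in records:
        start = year - year % block            # e.g. 2013 -> 2010
        counts[f"{start}-{start + block - 1}"][topic] += 1
    return {blk: [t for t, _ in c.most_common(top_k)]
            for blk, c in sorted(counts.items())}
```

Plotting the resulting per-block counts over time would give the rise-and-fall curves of each research area.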

Figure 3
Remember that research precedes adoption, which is why Raspberry Pi appears early even though its release came at the beginning of 2012. The trends are very similar to the “supply” side: from the internet to front-end and cloud technologies, to the rise of AI (deep learning, machine learning, natural language processing), virtual reality (VR), and 3D and gaming technologies. At a directional level, this validates what was seen from the AI-generated adoption maps.
Plotting research focus against market demand (actual jobs advertised) for these top research areas (most recent, 2016–2018) paints an even firmer picture (Figure 4).
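One way to quantify how well the two sides agree, beyond eyeballing a plot like Figure 4, is a rank correlation between research counts and job-ad counts, paired by technology area. This is an illustrative addition rather than something the post computes; the numbers below are hypothetical.

```python
def rank(xs):
    """Rank values (1 = smallest); ties broken by position for simplicity."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for pos, i in enumerate(order, 1):
        r[i] = pos
    return r

def spearman(research, jobs):
    """Spearman rank correlation between per-area research focus
    (e.g. publication counts) and market demand (job-ad counts)."""
    n = len(research)
    rx, ry = rank(research), rank(jobs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))
```

A value near +1 would mean the areas companies research most are also the ones they hire for most, which is the “firm picture” the figure suggests.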

Figure 4
AI (deep learning and machine learning) jobs are in great demand, with an enormous research focus on applying these techniques in new fields. Areas like Pixel Art are up-and-coming, and companies have started recruiting. The trends are supported by jobs in the market, which implies budgets and resources being deployed.
AI has auto-generated a directional picture of technology adoption. These reports can be generated in real time and on demand. Industries like consulting, media, and publishing, where “language processing” has been key, are increasingly up for disruption. This approach can replace the need for “white collar” consultants conducting surveys and collecting “CXO quotes” to decipher which technologies matter. The billions of sensors and trillions of documents already out on the internet hold the answers, and their number is only increasing.
Data holds the truth. Code+AI can unlock it.
In the world of AI… data matters, intelligence matters.