Oct 04 2018
Are biosciences and computation for bioscience still hot topics? In the early 2000s, the promise of bioscience was everywhere around us, and much of it was powered by advances in computation and big data. The sequencing of the human genome, completed in 2003, is but one example of how computation, together with dramatic reductions in the cost of DNA sequencing, has accelerated bioscience. Yet this trend has seemingly given way, at least in press coverage, to other computationally accelerated areas like artificial intelligence, the Internet of Things (IoT), and self-driving vehicles. What has pushed computational bioscience out of the forefront of the trends that are changing our world?
In short, the problem is too much data. As an article in Wired pointed out four years ago, the 3 billion DNA base pairs and roughly 20,000 genes in the human genome are but a small part of the human genetic ecosystem which, once the trillions of microbes living in the human body are included, stands at roughly 100 billion base pairs and over a million genes. It is estimated that over 15 petabytes of genetic sequencing data are produced per year across the globe. Worse yet, much of this data is not as well structured as data from physics, business, or psychology. The result is a problem that has become intractable with today's technology. While this has not stopped progress in the biosciences, it is almost certain that a huge number of new insights are sitting in this data, beyond what we can yet take advantage of.
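To make the scale concrete, here is a rough back-of-envelope sketch (not from the article; the 2-bits-per-base encoding and the 30x sequencing coverage are illustrative assumptions) of how much raw data the genomes described above represent:

```python
# Back-of-envelope arithmetic for the data volumes discussed above.
# Illustrative assumptions: one DNA base (A/C/G/T) fits in 2 bits, and a
# typical sequencing run reads each position ~30 times ("30x coverage").

human_genome_bases = 3_000_000_000     # ~3 billion base pairs
ecosystem_bases = 100_000_000_000      # ~100 billion incl. the microbiome

BITS_PER_BASE = 2                      # 4 symbols -> 2 bits each
COVERAGE = 30                          # redundancy factor in sequencing

def raw_bytes(bases, coverage=COVERAGE):
    """Raw storage for `bases` positions read `coverage` times."""
    return bases * coverage * BITS_PER_BASE // 8

print(f"One human genome at 30x: {raw_bytes(human_genome_bases) / 1e9:.1f} GB")
print(f"Genetic ecosystem at 30x: {raw_bytes(ecosystem_bases) / 1e12:.2f} TB")
```

Even with this idealized 2-bit encoding, a single ecosystem-scale sample lands in the terabyte range; real sequencer output (e.g., FASTQ with per-base quality scores) is considerably larger, which is how a worldwide total in the petabytes per year accumulates.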
The real question for the biosciences is whether there are new computational technologies and models that can accelerate the analysis of this treasure trove of data. Concepts like artificial intelligence, computational storage, and the fusion of clinical and research data may hold the key to further accelerate the biosciences and help us move towards the productization of these innovations. This three-part blog series will investigate how new IT technologies are helping to reshape the biosciences.
More likely than not, you have interacted with several content delivery networks (CDNs) today. Whether you are browsing popular websites, downloading new applications, or watching web-based video streaming services, CDNs are the virtual backbone of today's internet. CDNs allow companies large and small to get physically closer to their customers, and to scale their delivery capacity dynamically to support seasonal peaks in business as well as long-term growth. CDNs provide proxy services for nearly all websites in the world today, serving their customers' content (the websites and applications those customers create) to the customers' end users.
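The proxy model described above can be sketched in a few lines. This is a toy illustration of the caching idea behind a CDN edge node, not any real CDN's API; the `Origin` and `EdgeNode` names are invented for the example:

```python
# Toy sketch of CDN edge caching: an edge node serves cached copies of
# origin content locally and only contacts the origin on a cache miss.

class Origin:
    """Stands in for the customer's own web server."""
    def __init__(self, content):
        self.content = content
        self.requests = 0              # counts how often the origin is hit

    def fetch(self, path):
        self.requests += 1
        return self.content[path]

class EdgeNode:
    """Stands in for one CDN Point of Presence (PoP) near the users."""
    def __init__(self, origin):
        self.origin = origin
        self.cache = {}

    def get(self, path):
        if path not in self.cache:     # miss: go back to the origin once
            self.cache[path] = self.origin.fetch(path)
        return self.cache[path]        # hit: serve from the edge

origin = Origin({"/index.html": "<html>hello</html>"})
edge = EdgeNode(origin)

for _ in range(1000):                  # 1000 users request the same page
    edge.get("/index.html")

print(origin.requests)                 # origin was contacted only once
```

A thousand user requests cost the origin a single fetch; everything else is absorbed at the edge. That absorption is what lets a small company survive a seasonal traffic spike without scaling its own servers.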
Initially, CDNs primarily served time-sensitive information like streaming audio and video content. As users began to expect highly responsive web experiences, CDNs utilized their widespread network of Points of Presence (PoPs) to serve website traffic as well. The importance of this increased as e-retail began to emerge as a significant revenue source for a variety of companies. This was also closely followed by software delivery services, especially in the gaming sector where users would “slam” the internet when new games were released. Similar models followed for popular software releases in non-gaming markets, especially as the release of frequent updates became the norm to patch security and other concerns in applications.
Today, CDNs form the virtual infrastructure of the World Wide Web. Worldwide CDN revenue is expected to grow from $7.47B (US) in 2017 to $30.89B by 2022, according to Markets and Markets, with the largest segment being media delivery. With this kind of growth, one can expect new entrants to the market and the introduction of game-changing technologies that will radically reshape the competitive landscape. If you want to find out more, please follow our five-part blog series, which will be available on LinkedIn and on the NGD Systems website.