Andrew Shreve: Biomaterials Science: technology development and education

Profile: Prof. Andrew P. Shreve, Regents' Professor, Department of Chemical and Biological Engineering at the University of New Mexico, NM, USA.

Professor Andrew P. Shreve contributes to understanding the impacts of the rise of biotechnology and nanotechnology, and the educational challenges these developments force us to confront:

"We need to ensure that scientists and engineers are aware of the nature of the humanities and fine arts, and that artists and historians are aware of the nature of scientific endeavors and basic scientific principles. From the science and engineering side, the role that society plays in influencing technology development is almost entirely missing from present training, and from the arts and humanities side, basic understanding of scientific principles and methods of inquiry are all too often lacking."

Introduction

The SFI-sponsored workshop addressing the Growing Gap Between Physical and Societal Technologies considered the development of physical technologies in Biotechnology, Information sciences, Nanotechnology and Cognitive sciences. These brief notes present some relevant topics that arise from consideration of the Biotechnology and Nanotechnology areas in particular, with some extension to more traditional disciplines of Biology and Materials Science. Three topics are presented: (1) Selected examples of actual technologies are briefly discussed, with the aim of showing how Bio-Nano areas are increasingly intertwined and that new technology development often occurs in a multi-disciplinary context, (2) Broader themes related to technology development over the past decades in both Materials Science and Biosciences are summarized, and (3) A need for different educational approaches in a time of rapidly evolving and broadly interdisciplinary physical and societal technologies is presented.

Examples of Technologies at the Interface of Biology and Materials Science

As specific examples of technology evolution, I present three types of technologies at the interface of biology and materials science. The first, bioinstrumentation, is relatively mature; technology development is progressing along a path of optimization of system components, interspersed with occasional innovative transformations, all in the context of an integrated, multidisciplinary technology platform. The second, 3D cell culturing and tissue engineering, is still in a growing development phase and is just beginning to transition into larger-scale manufacturing arenas, where process control and optimization approaches may lead to rapid transformation in the next few decades. The third, DNA-based nanotechnology, is at a very early stage where basic feasibility issues of material cost, yield and scalability are still critical limitations. A common feature of all these examples is that they illustrate how much of modern technology development relies on multidisciplinary and interdisciplinary approaches, and they raise important questions about the optimal training and education needed for scientists and engineers in today's world. The challenge is even greater in that technology development and maturation is also shaped by societal technologies. Investigators of the future increasingly need an understanding not only of science and engineering, but also of the broad sweep of the humanities and social sciences, themes explored in the final two sections.

Bioinstrumentation: Over the past few decades, Biology has transformed from a largely descriptive science to a highly quantitative one, involving acquisition, processing and modeling of vast amounts of data. One set of enabling tools for this transformation is the general area of bioanalytical instrumentation, the set of new instruments and analysis methods that provide quantitative data about biological systems. There are many examples, but an illustrative one is flow cytometry, including fluorescence-activated cell sorting (FACS) methods.[1] Modern instruments in this area reflect ≈50+ years of technology development and are mostly provided by large instrumentation companies. These instruments are widely deployed in research, academic, and clinical settings, and are a "gold standard" for rapid analysis of heterogeneous cell populations, providing multi-parameter interrogation of ≈50,000 individual cells per second. A few examples of applications include T-cell counting for monitoring the status of immunocompromised individuals, rapid measurement of biomolecular interactions for drug development, and detection of heterogeneous blood cell populations as an indication of blood doping in competitive sports. For our purpose, the important aspect of flow cytometry is that a modern instrument involves the integration of chemistry, biology, fluidic engineering, materials science, lasers, high-performance optics and detectors, data acquisition hardware, signal processing methods, and analysis of large multivariate data sets. In other words, while advances or rapid innovations in a single discipline or type of technology might drive the improvement of a particular module within a flow cytometer, overall optimization of instrument performance requires the integration of multiple areas of science and technology and a solid understanding of the sources of current performance limits. This situation is very common in bioinstrumentation and provides a framework that must be incorporated into understanding (or predicting) the future rate of technology development.[2]

Tissue engineering and 3D cell culturing: A different example of developing technology at the interface of Biology and (nano)Materials Science is the implementation of new methods for three-dimensional (3D) cell culturing and tissue engineering. Many types of cells, including most of those in complex multi-cellular organisms such as humans, are naturally found in 3D structures (e.g., tissues). Traditionally, however, cells in laboratory settings have been grown on two-dimensional (2D) surfaces and in artificially controlled nutrient environments. There is increasing awareness that growing cells in non-natural physical and chemical settings may alter their biological properties, and that studies of cells grown under such conditions may not translate to understanding the behavior of cells in actual tissues.[3] In addition, there is the emerging and important technology of tissue engineering.[4] Here, investigators working at the interface of materials science and biology attempt to coax cells seeded into a 3D material architecture to proliferate and grow, and ultimately to reproduce a naturally occurring tissue. These types of technologies have developed over the last 25 years or so, and currently involve a broad range of academic and commercial institutions, with the latter ranging from small-scale entrepreneurial ventures to large bioinstrumentation and biomanufacturing entities. For our discussion, the primary point is that technology development in this exciting new area is also highly multidisciplinary, relying on materials science (e.g., polymer science, 3D printing and soft-material fabrication methods, multiscale structural characterization of materials), biology (e.g., extracellular matrix properties, cell-cell communication, stem cell biology, tissue growth) and medicine (e.g., artificial organs and implants, surgical transplantation, regenerative methods).

DNA nanotechnology: A final example of the blurring of boundaries between Bioscience and Materials Science is the area of DNA Nanotechnology.[5] DNA is, of course, an information carrier in biological systems, and in its most prevalent form in cells is structured as a double-helix formed by the pairing of two complementary strands. Each strand is a polymer containing a long sequence of chemical compounds known as bases, denoted as A, C, G and T. In forming the double-helix, bases on one strand pair with bases on the other strand, following the Watson-Crick base pairing rules of A–T and G–C. From a materials science perspective, this type of high-fidelity association of two polymers in solution allows for programmable assembly of materials. Thus, for example, one can add complementary DNA strands to metal nanoparticles, and the particles will assemble in solution. More complex structures than simple pairing of particles are possible using different types of DNA sequences and arrangements. These kinds of nanoparticle assemblies have emergent optical or electronic properties that are sensitive to local chemical and physical environments, and are useful in sensing and genetic analysis, catalysis, and materials characterization, to name just a few applications. An even more sophisticated application of DNA as a component of materials is provided by DNA origami, where up to hundreds of short DNA strands (oligomers) are designed to interact with a longer DNA polymer.[6] Each short strand is designed to attach to spatially separated locations on the long strand, bringing those locations together. By careful design, these hundreds of "staple" strands can be used to fold the overall DNA assembly into a pre-selected 2D or 3D shape that has a particular structure on the nanometer length scale, hence the name DNA origami. Traditionally, such high precision structures are produced using very expensive material fabrication tools (e.g., e-beam writers and the like) in so-called top-down processing. In contrast, DNA-based assembly can produce structures with exquisite control of geometry on nanometer length scales through a self-assembly process in solution, one example of several so-called bottom-up self-assembly methods. The contrast couldn't be more striking – in top-down processing you use incredibly expensive clean rooms and highly precise manufacturing tools, and in self-assembly you essentially use beakers and test tubes in a chemistry lab. These kinds of DNA-based material assembly methods have been developed and employed at the lab bench for about 20 years or so. In terms of technology impact, they are potentially disruptive, but still at a very early stage and must address key questions of scalability and reliability in moving forward.
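
As a minimal illustration of the programmability that Watson-Crick pairing provides, the short Python sketch below computes the strand that pairs with a given sequence and checks whether two strands are perfect complements. The sequences and function names are purely illustrative assumptions, not drawn from any of the cited work, and real hybridization of course also depends on strand length, temperature, and solution conditions.

```python
# Minimal sketch of Watson-Crick complementarity, the rule that makes
# DNA-directed assembly "programmable" (illustrative only; sequences are made up).

PAIRING = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(strand: str) -> str:
    """Return the strand that pairs with `strand` in antiparallel orientation."""
    return "".join(PAIRING[base] for base in reversed(strand))

def can_hybridize(strand_a: str, strand_b: str) -> bool:
    """True if the two strands are perfect Watson-Crick complements."""
    return strand_b == reverse_complement(strand_a)

if __name__ == "__main__":
    probe = "ATGCGTTA"                       # e.g., a strand grafted onto one nanoparticle
    target = reverse_complement(probe)       # the strand grafted onto a second particle
    print(target)                            # -> TAACGCAT
    print(can_hybridize(probe, target))      # -> True: the two particles would associate
    print(can_hybridize(probe, "TAACGCAA"))  # -> False: a single mismatch breaks perfect pairing
```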

Technology Development in Materials Science and Biosciences

It's hard to predict how technologies will develop over time – what is the pace of performance increase, the cost of deployment, adoption by society, impact on society and so forth. A few case studies are presented here to guide further discussion. We will see a tremendous rate of performance increase in some technologies, such as DNA sequencing or semiconductor manufacturing, but other technologies show extremely slow rates of improvement, with drug discovery being a well-known example. Why is there such different behavior? An attempt to answer this question is a relevant and important research topic, and improved understanding could help guide further discussion of the interplay of societal and physical technologies.

Moore's Law: In the Materials Science area, the single most well-known example of technology development and technology forecasting is Moore's Law.[7] In 1965, Gordon E. Moore, then at Fairchild Semiconductor, wrote a brief article for the trade journal Electronics.[8] In that article he commented on how he expected the field of integrated circuitry to develop over the next ten years. Moore performed an analysis based on two competing forces. One was that most of the cost of an integrated circuit at that time was independent of the number of elements in the circuit (e.g., reflecting packaging and related fixed costs). On that basis, the cost per element should decrease roughly in inverse proportion to the number of elements. The other was that as the number of elements increased, beyond some point the overall yield of the process decreased. Beyond that number of elements, the cost per element would thus rapidly increase. Putting these two factors together, he suggested that there is a minimum in a plot of cost per element versus number of elements. Further, based on experience from 1959 (the date of the first planar transistor, in essence an integrated circuit with one element) to 1965 (the time of his writing the article) and based on his knowledge of future generations of technology that he could foresee over the next few years, he proposed that the number of elements per integrated circuit would double every year for the next ten years. Thus, with about 60 elements per circuit in 1965, he foresaw an increase by a factor of ≈2¹⁰ ≈ 1000, to about 60,000 elements per circuit in 1975. That happened, and in revisiting this projection in 1975, Moore looked forward another decade and proposed that the doubling time would lengthen to about 2 years, but that the basic trend of rapid growth would continue. (The details are a bit messy, as Moore actually expected the one-year doubling time to continue for a few more years before transitioning to a two-year doubling time. This didn't happen, as Moore explains in his 1995 article.[9])
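
Moore's cost argument can be captured in a few lines. The sketch below is a toy model only: the constants and the exponential yield falloff are assumptions chosen to make the shape of the curve visible, not numbers from Moore's article. It simply shows how a minimum-cost integration level emerges from the competition between shared fixed costs and declining yield.

```python
# Toy model of Moore's 1965 cost-per-element argument (illustrative only;
# all numbers are assumptions for the sketch, not taken from the original article).
import math

FIXED_COST = 10.0      # assumed packaging and other per-circuit costs, independent of N
ELEMENT_COST = 0.05    # assumed processing cost per element at perfect yield
YIELD_SCALE = 100.0    # assumed scale at which yield losses start to dominate

def cost_per_element(n_elements: int) -> float:
    """Fixed costs are shared across elements, but yield falls as circuits grow."""
    shared_fixed = FIXED_COST / n_elements
    yield_fraction = math.exp(-n_elements / YIELD_SCALE)  # crude stand-in for yield loss
    return shared_fixed + ELEMENT_COST / yield_fraction

if __name__ == "__main__":
    candidates = range(1, 1001)
    best_n = min(candidates, key=cost_per_element)
    print(f"cheapest integration level: ~{best_n} elements per circuit")
    print(f"cost per element there: {cost_per_element(best_n):.3f}")
```

With these assumed parameters the minimum falls near 90 elements per circuit; the point is the existence of an optimal integration level, which Moore expected (and observed) to shift rapidly upward as processing improved.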

Doubling of technology performance every two years corresponds to exponential performance increase with a rate of about 35% per year (i.e., y ∝ exp(0.35 t), with y a measure of performance and t in years). The remarkable fact about the overall semiconductor industry is that this rate of growth didn't just continue through one decade, or two decades, but has persisted for 50+ years from 1965 to 2015 and beyond. The corresponding increase in integrated circuit complexity and number of transistors per unit area and a corresponding decrease in cost per transistor form the basis of a set of exponentially improving performance curves that are now collectively known as Moore's Law. The resulting orders of magnitude increase in performance of all semiconductor-based technologies, including computational power, has empowered many of the transformations of society seen over the last several decades.[10]
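
As a quick sanity check of the figures above (simple arithmetic, not tied to any particular data set), the lines below convert a two-year doubling time into a continuous annual growth rate and compound it over the five decades that the trend has persisted:

```python
# Quick check of the rates quoted above (straightforward arithmetic, no data).
import math

DOUBLING_TIME_YEARS = 2.0
rate = math.log(2) / DOUBLING_TIME_YEARS           # continuous growth rate
print(f"annual growth rate: {rate:.1%}")            # -> ~34.7%, i.e., roughly 35% per year

YEARS_OF_PERSISTENCE = 50
factor = 2 ** (YEARS_OF_PERSISTENCE / DOUBLING_TIME_YEARS)
print(f"growth factor over {YEARS_OF_PERSISTENCE} years: {factor:.1e}")  # -> ~3.4e+07
```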

There are several aspects of Moore's Law that warrant additional discussion.[11] First, the definition has evolved over time, starting with a specific prediction of the number of elements per integrated circuit, and becoming more of an observation applied to any rapidly, exponentially improving technology within the semiconductor industry. Second, the rate of performance increase described by Moore's Law has, in a sense, become self-fulfilling. The rapid doubling time of all things semiconductor is now deeply embedded in the culture of the entire semiconductor and electronics industry, and major corporate entities invest enormous sums of money in capitalization costs to maintain this rate of growth. Third, Moore's Law is, of course, not a "law" in the physical sense (i.e., it's not comparable to a construct like the 2nd Law of Thermodynamics). However, because of its cultural acceptance and external enforcement mechanisms (e.g., through market penalties), it may be similar to a "law" in the societal sense.[12] The entire semiconductor and computer hardware industry is, in some strong sense, governed by Moore's Law. As an aside, one of the consequences of Moore's Law is that the number of transistors currently manufactured has reached almost unimaginably large numbers, more than 10²⁰ per year.[13] That's a number several orders of magnitude larger than the estimated 10¹⁴–10¹⁵ synapses in a human brain.[14]

In its present form, Moore's Law can't continue forever. Physical limitations are being reached as, for example, the size of semiconductor features in fabricated devices approaches the scale of a few atoms. There is some indication over the last few years that the rate of performance improvement is beginning to slow, though this is subject to some disagreement.[15] On the other hand, there are a number of nascent, potentially transformative, computational strategies in development. Although continued performance gains are certainly expected over the next several years from traditional, Moore's Law kinds of efforts, the next decade or two may well see a transformation of computational performance as entirely new kinds of architectures come into play.

Drug discovery: As we discuss the interconnection of physical and societal technologies, an important observation is that not every physical technology shows Moore's Law type performance (e.g., a rate of increase of ≈35% per year). A well-known case is that of drug discovery. The rate of approval of so-called new molecular entities by the US Food and Drug Administration has been relatively unchanged over the last 70 years. For example, in the 1950s, the rate was about 18 per year, and in the 2010s, it is about 30 per year.[16] Even if one were to suppose that there has been exponential growth in new drug approvals (and it's not clear that the data support an exponential model), the effective exponential rate is less than 1% per year. This is despite a substantial increase in both public and private investment in drug development, with investment increasing by nearly 6-fold in constant dollars just from 1970 to 2010.[17]
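
For comparison with the semiconductor case, a short calculation using the rounded approval counts quoted above (and an assumed span of roughly 60 years between the two figures) shows how small the implied exponential rate is:

```python
# Implied exponential growth rate of new-drug approvals, using the rounded
# figures quoted above (~18/year in the 1950s, ~30/year in the 2010s).
import math

early_rate, late_rate = 18.0, 30.0   # approvals per year, approximate
elapsed_years = 60.0                 # roughly mid-1950s to mid-2010s, an assumed span

implied_rate = math.log(late_rate / early_rate) / elapsed_years
print(f"implied growth rate: {implied_rate:.2%} per year")   # -> ~0.85%, i.e., under 1%
```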

Clearly, something is very different for drug discovery than for semiconductor manufacturing! That difference has profound societal impacts as well. No one would deny that there has been substantial improvement in the treatment of many debilitating diseases over the past four or five decades. For example, five-year survival rates for most types of cancer have improved by 20 or 30%, and now exceed 90% for some commonly occurring cancers.[18] However, in only the last 25 years, supercomputing power has increased by a factor of about one million. If drug discovery and related technologies for medical intervention had improved at anything close to that rate, most diseases would have been completely eradicated and medical treatment would have decreased in cost to the point of being readily available world-wide.

DNA sequencing: There are biological technologies with rapidly increasing performance. One example is DNA sequencing technology. The cost per base sequenced, or perhaps more relevantly the cost per full sequence of a human genome, has decreased super-exponentially over the last 16 to 17 years. Per estimates of the US National Institutes of Health, the cost of sequencing the first human genome in 2001 was ≈$100 million, while the current cost in 2017/2018 is about $1000.[19] This decline corresponds to a five order of magnitude decrease in cost over about 16 years, a performance increase that would correspond to a doubling time of about one year. In fact, the performance over the last 10 years, from 2008 onward, has improved at an accelerating rate, with cost decreasing by a factor of ≈10,000 in just 10 years. The effects of this technological transformation are yet to be fully experienced, but the cost of fully sequencing a human genome is now within the range of standard medical tests in the developed world. It is likely that the ready availability of sequence information will enable the development of truly personalized medicine, though it will also raise significant ethical and privacy concerns.
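
Using the NIH cost estimates cited above, a brief calculation illustrates why this decline corresponds to an effective cost-halving time of roughly one year:

```python
# Effective halving time of genome sequencing cost, using the NIH estimates
# cited above (~$100M in 2001, ~$1,000 in 2017/2018).
import math

cost_2001, cost_2017 = 1.0e8, 1.0e3   # US dollars per genome, approximate
elapsed_years = 16.0

halvings = math.log2(cost_2001 / cost_2017)
print(f"orders of magnitude of cost reduction: {math.log10(cost_2001 / cost_2017):.0f}")  # -> 5
print(f"effective cost-halving time: {elapsed_years / halvings:.2f} years")               # -> ~1 year
```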

Educational approaches adapted for rapidly evolving technologies

Probably for as long as there has been technological change, and certainly for the last several decades, scholars and thinkers have pondered the interplay of physical and societal technologies. For example, James Burke, in the transcript of a 1983 NASA-sponsored lecture (published in 1985), notes:[20]

"The main thing, it seems to me, is to remember that technology manufactures not gadgets, but social change."

Burke went on to argue for a rethinking of educational approaches as a component of a possible positive adaptation to technological change:[21]  

"The other future I mentioned is a good deal more difficult to forecast. .... I suppose what I’m suggesting is a crash restructuring of the educational system. I’ve been a teacher myself, so I know how easy this is to say and how difficult to do. However, if we were to manage some kind of interdisciplinary curriculum that taught people not the facts, which would be obsolete before they used them, but how to use the data systems to juxtapose, to look for relationships in knowledge, to see patterns in the way things happen and affect their lives, then perhaps we would be moving toward a very different type of society, one free of a central paradigm at all."

There is increasing awareness that educational approaches, at all levels, are not optimized for a world that is increasingly shaped by rapidly accelerating physical (and societal) technologies. Educational institutions and systems are perceived as being very resistant to change. In fact, if one were to design a system to be resistant to change, then the end point might look much like the current dominant structure of higher educational institutions, organized by disciplines with very slowly changing boundaries and limited intercommunication. One should not be surprised. In part, educational institutions have likely evolved to fill a societal need of maintaining accumulated knowledge and facilitating its transfer to subsequent generations. Further, as the quantity of factual knowledge and information has exploded in recent decades, if anything, traditional disciplinary structure has become even more entrenched and more finely divided. For example, Chemistry has largely separated into sub-disciplines of analytical chemistry, organic chemistry, physical chemistry, inorganic chemistry, materials chemistry, etc., that have only limited interaction with one another. Similar trends are apparent in Physics, Biology, or Engineering, and we are probably all familiar with increasing specialization in Medicine.

The trend toward increased specialization, particularly in rapidly evolving fields, is a response to an increasing rate of knowledge production. However, even with intense specialization, knowledge production in many areas now appears to exceed the rate at which knowledge can be taught. As a result, students are trained in sets of facts that are partly obsolete even before graduation, and most practitioners of higher education in the sciences and technology face the uncomfortable situation of teaching topics, and sometimes entire fields, that didn't even exist at the time of their own training. In the face of such rapid change, institutions struggle to find a balance between imparting traditional, deep, factual knowledge embedded in disciplinary structures and imparting a way of learning and assimilating new information across a breadth of topics that will allow students to formulate questions to guide their own subsequent intellectual development. Further, there is the need to provide students with both the skills and the opportunities to engage in learning throughout their lives.

In the context of teaching students how to pose creative questions and critically assess the information they find to address those questions, one can argue for the need of broad interdisciplinary exposure. We need to ensure that scientists and engineers are aware of the nature of the humanities and fine arts, and that artists and historians are aware of the nature of scientific endeavors and basic scientific principles. From the science and engineering side, the role that society plays in influencing technology development is almost entirely missing from present training, and from the arts and humanities side, basic understanding of scientific principles and methods of inquiry are all too often lacking.

One major barrier to change in educational endeavors is the lack of resource material. Traditionally, particularly at the higher education level but also at other levels, instructional resource material is developed within tight disciplinary structures. Developing new material to support broadly interdisciplinary educational approaches is hard. But, this is the kind of effort that small teams drawn from different disciplines can effectively execute, and with broad dissemination of material now easy, it's the kind of effort that can have widespread impact.

All of these trends and opportunities in education, particularly higher education, require transforming many of the traditional approaches that colleges and universities have relied upon for the last century or more. Much has been written on the topic of the university of the future, but the changes that universities will actually undertake in response to accelerating changes in both physical and societal technologies still remain uncertain.[22]

Conclusion

To close, we can again look back to the 1980s, when thinkers were considering these very same issues. Isaac Asimov stated:[23]

“We live in a world now in which education isn’t geared for creativity and the kinds of jobs you have destroy any feeling of creativity you might have had to begin with. ... I have great hopes that if you could get to youngsters, have them grow up with the aid of a computer, so they would have a one-to-one relationship with the wisdom of the world (the computerized wisdom of the world), they could follow up what interested them at their own speed and time. They would not be put to work doing things that are stupid enough for a computer to do but would be encouraged to do things that are more human. It may turn out that what we call creativity is a much more common feature of the human mind than we think."

Thirty years later, I'm not sure there's much to add to this assessment of both the challenge and the opportunity we face as our physical and societal technologies continue to evolve.

References

[1] See, for example, A. Adan, G. Alizada, Y. Kiraz, Y. Baran, A. Nalbant, Flow cytometry: Basic principles and applications, Crit. Rev. Biotechnol. 37 (2017) 163-176. DOI: 10.3109/07388551.2015.1128876.

[2] J. McNerney, J.D. Farmer, S. Redner, J.E. Trancik, Role of design complexity in technology improvement, Proc. Nat. Acad. Sci. USA 108 (2011) 9008-9013. DOI: 10.1073/pnas.1017298108.

[3] S.A. Langhans, Three-dimensional in vitro cell culture models in drug discovery and drug repositioning, Front. Pharmacol. 9:6 (2018). DOI: 10.3389/fphar.2018.00006; A. Muir, M.G. Vander Heiden, The nutrient environment affects therapy, Science 360 (2018) 962-963. DOI: 10.1126/science.aar5986.

[4] F.J. O'Brien, Biomaterials and scaffolds for tissue engineering, Materials Today 14 (2011) 88-95. DOI: 10.1016/S1369-7021(11)70058-X.

[5] N.C. Seeman, H.F. Sleiman, DNA nanotechnology, Nature Reviews Materials 3 (2017) 17068. DOI: 10.1038/natrevmats.2017.68.

[6] P.W.K. Rothemund, Folding DNA to create nanoscale shapes and patterns, Nature 440 (2006) 297. DOI: 10.1038/nature04586.

[7] G.E. Moore, “Moore’s Law at forty” in Understanding Moore’s Law: Four Decades of Innovation, D.C. Brock, Ed. (Chemical Heritage Foundation, 2006), Chapter 7, 67-84.

[8] G.E. Moore, Cramming more components onto integrated circuits, Electronics April 19, 1965, 114-117, Reprinted in G.E. Moore, Cramming more components onto integrated circuits, Proc. IEEE 86 (1998) 82-85. DOI: 10.1109/JPROC.1998.658762. 

[9] G.E. Moore, Lithography and the future of Moore's Law, Proc. SPIE 2437 (1995). DOI: 10.1117/12.209151.

[10] See https://ourworldindata.org/technological-progress (accessed 21 August, 2018).

[11] G.E. Moore, 2006, op. cit.

[12] Thanks to Jenna Bednar for informative discussions on this topic.

[13] https://spectrum.ieee.org/computing/hardware/transistor-production-has-reached-astronomical-scales (accessed 21 August 2018).

[14] S. Herculano-Houzel, The remarkable, yet not extraordinary, human brain as a scaled-up primate brain and its associated cost, Proc. Nat. Acad. Sci. USA 109 (2012) 10661-10668. DOI: 10.1073/pnas.1201895109 (note that estimates of synapses per neuron are in the range of several thousand). Thanks to J.D. Farmer for suggesting this comparison.

[15] L. Eeckhout, Is Moore's Law slowing down? What's next?, IEEE Micro 37 (2017) 4-5. DOI: 10.1109/MM.2017.3211123 (note that this entire issue of IEEE Micro is targeted at "Architectures for the post-Moore era").

[16] M.S. Kinch, A. Haynesworth, S.L. Kinch, D. Hoyer, An overview of FDA-approved new molecular entities: 1827-2013, Drug Discov. Today 19 (2014) 1033-1039. DOI: 10.1016/j.drudis.2014.03.018.

[17] L.D. Fricker, Drug discovery over the past thirty years: Why aren't there more new drugs, Einstein J. Biol. Med. 29 (2013) 61-65. DOI: 10.23861/EJBM201329461.

[18] https://seer.cancer.gov/csr/1975_2015/browse_csr.php?sectionSEL=2&pageSEL=sect_02_table.09 (accessed 21 August 2018).

[19] https://www.genome.gov/sequencingcostsdata/ (accessed 21 August 2018).

[20] Transcript available as “The Impact of Science on Society”, NASA SP-482 (1985); Available at https://history.nasa.gov/sp482.pdf (accessed 21 August 2018), page 21.

[21] Ibid, page 23.

[22] Selected books on this overall topic, each with a distinct viewpoint, are: (a) C.M. Christensen and H.J. Eyring, The Innovative University: Changing the DNA of Higher Education from the Inside Out, (Jossey Bass, 2011) ISBN 978-1118063484; (b) J.J. Selingo, College Unbound: The Future of Higher Education and What It Means for Students, (New Harvest, 2013) ISBN 978-0544027077; (c) C.N. Davidson, The New Education: How to Revolutionize the University to Prepare Students for a World in Flux, (Basic Books, 2017) ISBN 978-0465079728.

[23] NASA SP-482 (1985), op. cit., page 73-74.
