
Biotechnology: Combining Engineering with the Biological Sciences

What is Biotechnology?

The word “biotechnology” is a portmanteau of “biology” and “technology”. Also known in shortened form as “biotech”, it uses biological and natural processes to tackle some of the world's biggest technological and industrial problems. Anything that uses biological cells or living material - including genes and gene sequences - in technological applications qualifies as biotechnology. Although the public mind largely associates it with medical applications, its origins lie in the vital technology of agriculture, which predates civilization. Applying biology and technology together has been happening for the best part of 6,000 years (1), since the dawn of the Agricultural Revolution in the Neolithic.

The term “biotechnology” was not coined until 1919 (2), well after the Enlightenment and into the modern era of science. The UN Convention on Biological Diversity defines it as "any technological application that uses biological systems, living organisms, or derivatives thereof, to make or modify products or processes for specific use" (3). It is so closely interwoven with most of the biological sciences, particularly in industrial-scale processes such as agriculture and medicine, that it is, and will remain, fundamental to future developments.


A History of Biotechnology

The Neolithic Agricultural Revolution

To examine the origins of biotechnology - of humans using biological material to improve their lifestyle, increase longevity and create convenience - we need to go back to the dawn of human civilization itself (1). The Neolithic Revolution, or the First Agricultural Revolution as it is sometimes known, is arguably the point at which human culture began. Humans largely stopped being nomadic hunter-gatherers and began planting crops and breeding livestock for food and clothing. Settlements became increasingly permanent rather than seasonal, and the first long-term camps and villages appeared.

Nobody is quite sure how and when it started, but it's clear that the ideas were traded between groups of people and may even have arisen independently in different parts of the world. From East Asia across the Middle East, the Russian Steppe and into Europe, farming spread quickly. It's a common belief that Native Americans were all nomadic, but this is not strictly true. Many tribes were before the arrival of Christopher Columbus, but many more were not, particularly the Pueblo and Hohokam (4) cultures of the southwestern US. Many of these cultures practised what archaeologists call “subsistence farming” and “surplus agriculture” - producing more food than is needed for a season, in contrast to simply gathering enough food in one area before moving on to the next.

Several major biotechnological “discoveries” were made during the Neolithic Agricultural Revolution. The first is the development of bread. Evidence for the making of flatbreads or something similar goes back 30,000 years, but archaeologists have traced the advent of leavened bread and the use of yeast to around 6,000 years ago (1). It's possible that, as with many such discoveries, the arrival of leavened bread was accidental. Mixing wheat with yeast to create bread is one of the oldest and simplest applications of biotechnology. Yeast occurs naturally, and its use creates irreversible chemical and nutritional changes when heat is applied.

The second major change was agricultural crossbreeding, applied to both livestock and crops using the same principles. Again, it is not clear precisely when this began, although archaeozoological evidence suggests animal husbandry may predate crop farming, but not by much. Evidence suggests animal husbandry and the first crop fields began some 10,000 years ago in the ancient Near East (5). Selective crossbreeding was applied to semi-domesticated livestock to encourage preferred attributes such as greater dairy and meat production, and to increase yield and resilience in crops. Most of the livestock and staple crops we raise, plant and consume today never evolved to be that way but exist because of thousands of years of selective crossbreeding. This was the original genetic modification.

Beer and Wine as Important Early Biotechnologies

Surplus agriculture led to an incredible population boom wherever it was adopted. With the use of irrigation and careful crossbreeding, some of the world's earliest civilizations were able to settle, propagate, expand and create some of humanity's greatest achievements - art, culture and literature, and monument and city building. None of this would have been possible without surplus agriculture supporting the concentrated urban populations that enabled it. But these growing populations required more and more food, which meant more crossbreeding for even greater yields and hardiness in some of the Old World's most difficult places to settle. The great cities of ancient Mesopotamia may have formed in the Fertile Crescent, in and around modern Iraq, but some of these settlements required environmental engineering projects such as irrigation to bring masses of water to the cities to feed and water their populations and sustain their agriculture.

But urban centers bring public health problems, many of which are borne and transmitted by the waterways. Beer may have originally developed as a by-product of the bread fermentation process, but it soon became one of the staples of civilization in the ancient world. Water was often dirty and contaminated (6), and one of the best ways to remove pollutants and harmful microbes, and still get people to drink, was fermentation. Beer and wine may be seen as luxuries in the modern world, but to ancient people they were among the safest ways to consume fluids. It's the reason ancient Mesopotamia developed beer and later Mediterranean civilizations developed wine. In the Far East, societies produced sake (rice wine) or versions of it.

Fermentation is now an important cultural marker; every society that produces alcohol is proud of its beer, spirit or wine heritage. But fermentation is another biotechnological process that, from the time it was developed as an industrial process, was beneficial to humanity. A naturally occurring process (the degradation of organic material) breaks down and destroys harmful bacteria that occur naturally in water (7). A similar process allows cheese to be made from raw dairy, with harmful bacteria eliminated by beneficial bacteria, producing a dairy product safe for human consumption.

The Enlightenment

These technologies served human civilization rather well for thousands of years, even though how fermentation worked, or why certain enzymes could extend the life of milk by turning it into cheese, was not understood until the scientific revolution. The Enlightenment advanced the cause of many sciences, not least biology and its associated subdisciplines. One of the earliest contributions of biotech to medical science, and one that took it firmly outside its origins in food technology, was the vaccine. Although the reasons why and how vaccines worked were little understood at the time, the application of specific viruses and bacteria to trigger the immune system into creating immunity is now one of the founding ideas behind immunology.

Biotechnology as an area of study, as we would understand it, arrived late in the Enlightenment. Its major contributor in the early 19th century, before he published his seminal works on evolutionary theory, was Charles Darwin. By 1859, when he published On the Origin of Species, he had already written extensively on the ability of humanity to influence species through artificial selection. Observing how pigeon varieties inherited the attributes of their parents, he described the process we would today call “selective crossbreeding” (9).

The scientists of this period did not understand genetics, but they did have some of the tools that would allow them to examine the tiniest life forms. It was Louis Pasteur who finally examined the bizarre and (then) mysterious phenomenon of yeast fermentation. In another paper, he theorized a lactic fermentation process in dairy products similar to that of alcohol, indicating that in both cases a type of yeast (though a different one) was responsible (10). He further demonstrated how microorganisms such as bacteria could spoil food, work that would later lead to “pasteurization” - a method used to destroy harmful bacteria and prolong the life of dairy produce.


Into the Modern Age: The 20th Century

Industrialization of farming practices began in the 17th century during what is called the “Second Agricultural Revolution”, or “The British Agricultural Revolution” because it began in Britain (11, p1). It wasn't just towns and cities that industrialized and adopted mass-production methods for goods; agriculture did too. Right across the developed world, thanks to selective crossbreeding, yields rose far higher than at any time before, and they would continue to rise until the end of the 20th century as a continually growing population demanded ever higher yields.

Many of our best-known crop cultivars were developed in the late 19th and early 20th centuries, thanks to new techniques and a more tailored approach to soil types, water and cultivation (11, p15). The early 20th century saw incredible advances in agricultural technology. The intensive farming that followed the two world wars created the massive surpluses we have today, largely due to advances in biotechnology and the understanding of concepts such as microbiology. New herbicides and pesticides were developed to deter pests and reduce infection or infestation of crops. The potato blight that had caused so much starvation in the previous century has never (yet) been repeated in the developed world.


But biotech was not just about increasing yields through disease prevention. Chaim Weizmann, who would later become Israel's first President, started life as a biochemist and developed a fermentation method for converting cornstarch into acetone. Although the process was initially used to make explosives during World War One (12), cornstarch itself is today used as a thickening agent in many foodstuffs, especially sauces. The growth of biotechnology as a medical science also took an incredible leap forward in the early 20th century. Alexander Fleming's discovery of the Penicillium mold led to the first antibiotics and put biotech's foot firmly in the realm of medical treatment (13). This technology would initially help humans, but antibiotics would increasingly be used in agriculture too: by preventing certain diseases in livestock, farmers could improve survival rates in cattle and sheep and ensure higher yields.

The late 19th century through to the 21st has been a period of exponential growth in human longevity and health, but a new problem arose in this industrial world: climate change. Adapting to a warming world - more erratic weather, the appearance of new diseases, the mutation of existing diseases and their spread into areas previously untouched - would form the crux of many problems in the 21st century. With antibiotic resistance also increasing, researchers would need to look elsewhere, and to new technologies, to solve the problems of the new millennium.

21st Century Developments

Now, biotechnology is finding new ways to branch out. The “Green Revolution” is underway, thanks in part to our understanding of new biotechnologies such as genetics and genomics. The best-known of the modern technologies built on biotechnological principles is genetic modification. Humans have modified crops and animal species for millennia through trial-and-error crossbreeding of related species and selective breeding of individuals with favored attributes. Those attributes are intended to deliver greater yields (grain and seed in plants, milk and meat in livestock), disease or pest resistance, and hardiness (less vulnerability to erratic weather). Modern genetic modification works with the same goals in mind, but by selecting single genes to insert into an existing species. Many see this technology as vital to alleviating or even solving hunger in the developing world, especially in marginal landscapes - areas already vulnerable to slight changes that may suffer further when the full effects of climate change are realized (14).

Genetic modification in agriculture is still an under-used technology, thanks largely to public mistrust, but GM has many potential applications. It took until nearly the end of the second decade of the 21st century for the first trials to go ahead: GM wheat and potatoes were authorized in both the developed (15) and the developing world in 2017 (16). Access could prove problematic in the short term due to cost in the developing world (14), but as with any technology, costs will eventually fall with widespread use and faster, more efficient production methods. Further developments in the late 20th and 21st centuries include hydroponics and its potential role in colonizing other planets.

But agricultural yields are not the only potential application here. The Zika virus outbreak that hit Brazil and other parts of South America and Mesoamerica in 2016-17 saw an unusually high level of the disease. Zika is carried by mosquitoes, and in a typical year cases number in the hundreds right across the tropics. But in a single season, with Brazil the worst affected, thousands of children were born with birth defects. One novel solution was to release genetically modified mosquitoes (17), and data demonstrated that the tactic was a success. We can expect further outbreaks of insect-borne diseases to be fought with genetic tools in the future.

Major Applications of Biotechnology

Some of these applications have already been touched on in the history section, but here we provide some real-world examples of how biotechnology has helped, is helping, or may help in the future.

In Agriculture

Genetic Modification: This is the oldest application of biotechnology and it's still important today. It's unlikely humans will ever stop developing agricultural biotechnology: as the human population continues to increase, we're going to need to produce higher yields. Higher yields may be the answer to feeding a growing population without encroaching on more virgin land such as rainforest or taking up valuable space needed for housing elsewhere. We have always sought to increase yield, pest resistance, hardiness and safety through selective crossbreeding, but genetic modification has proven itself a vital technology for coping with our food demands of tomorrow. Repeated tests and studies have shown it is safe, despite some concerns over ethics and potential allergens.

Hydroponics: As we look to colonize other worlds (with a crewed mission to Mars likely around the 2030s), we're going to need to feed a lot more people, and feed them on worlds presently unable to sustain life because they lack soil or have little to no nutrients in their soils. This will mean not just genetically modifying plants to produce high yields with little energy, water or light input, but also to grow in some of the harshest conditions in the solar system. It could also mean hydroponics - growing crops without soil, in nutrient solutions whose fertility can be closely monitored (18). A number of trials have been run over the decades, and hydroponics is not just a potential technology: in some areas, it is already a vital method of production.

Biofuel production: One of the biggest challenges of the modern world is to continue to supply fuel. Fossil fuels are a finite resource, and it's a matter of when, not if, renewable energy production replaces oil, gas and coal. Hydroelectricity, solar power and wind turbines are the most common alternatives, but crops are also grown and converted into biodiesel. The organic material used in biofuels, such as Miscanthus (elephant grass) and prairie switchgrass (19), is high yield and low maintenance. The ability to convert organic material into fuel is a comparatively recent development.

Animal sciences: People with qualifications in biotechnology may work in animal science on some of the biggest issues of today. While livestock production and veterinary science are the most obvious areas, animal scientists also examine organic materials from animals to create goods for human use - for example, tanning leather, or examining the use of bone mulch and animal waste as natural pesticides and fertilizers. There is also some overlap with medical science and genetics, as researchers seek to understand why some animal species are genetically immune to certain diseases and investigate those genes for human and animal applications (34).


In Bioengineering

Bioengineering, or biological engineering, is an applied science that uses biological theory to solve real-world problems. It both complements and contrasts with traditional engineering practice, which uses math, science and computing to design structures and tools. Bioengineering adapts the sciences by looking at ideas, theories and practices that already exist in nature. Biological engineers aim to mimic existing biological systems, or modify them, to replace, enhance or otherwise improve upon current engineering solutions. It brings together two seemingly disparate sciences to create a necessary overlap.

Food technology is an area of overlap with agriculture, but bioengineering has designed and created several alternative food sources. Synthetic proteins used in meat-substitute foods are one example, but in recent years researchers have focused on producing synthetic meat (21). Rearing livestock for meat is one of the most land-intensive food production methods and is arguably bad for the ecology. It also raises the ethical dilemmas that lead some people to turn vegetarian or vegan for animal welfare reasons. If bioengineering can create synthetic meat with all the nutrients of conventional meat, and do so safely, it could eliminate both the ethical and ecological problems while providing for our future meat requirements.

In Biomimetics / Biomimicry

A form of bioengineering, biomimetics (also known as biomimicry) is the application of natural processes and systems as models for solving complex engineering problems. One of the earliest and simplest examples is how humans attempted to fly by copying the flapping wings and motions of birds. This method failed due to a poor understanding of the physics of flight - speed and lift - while focusing solely on flapping motions and the shape of bird wings. That failure aside, biomimetics has useful applications today, with many success stories. One of the most recent is mimicking termite mound design to create the energy-efficient buildings of the future (23). The Eastgate Building - a combined commercial office block and shopping mall in Zimbabwe - uses efficient internal climate control: the entire structure is built on the same principles as a termite mound to regulate temperature, using natural processes to keep it cool in summer and warm in winter. Remarkably, the complex has no heating or air conditioning system. Other examples include redesigning medical needles based on the structure of a mosquito's mouthparts for more efficient injections and blood sampling, with less damage to veins and arteries, and the development of sonar, informed by the study of echolocation in whales and dolphins.

In Manufacturing

Biotechnology is also proving vital in catering to our consumer society. One of the biggest problems of this decade is the build-up of plastics in our oceans: microbeads and other discarded plastics are harming wildlife on a massive scale, and 2018 proved to be a year of growing awareness of the dangers of plastics. Plastic currently relies on oil as a raw material, but it can be made from biodegradable, safer materials - and it already is, in some of the plastics we use every day. Bioengineering could provide environmentally friendly production methods for a range of sustainable materials by mimicking biological systems or applying biological processes to existing technology. The science has already developed eco-plastics made from cellulose for grocery bags, and used agricultural waste to make packaging for water bottles, and hard and flexible plastics (20). In some cases, these are indistinguishable from plastics made from petrochemicals. Cellulose (a paper derivative) has proven particularly useful in that it degrades in just a couple of years and leaves no residue when incinerated, unlike conventional plastics.

We also expect biotech in manufacturing to go down to the microbiological level - enzymes and genes - to solve some of our production problems in the future. In theory or in practice, these applications include genetic manipulation of biomass (22, p5), “green” chemical production (22, p8), and agricultural feed and fertilizers (22, p31). This is potentially vital for industry: moving away from petrochemicals and oil as a major resource could reduce the carbon footprint of manufacturing almost everywhere.

In Environmental Remediation

This is the application of biotech to cleaning up after an oil spill or gas leak (24). Biotechnologists who work in this area use either natural or genetically modified biological material to counteract the damage such a disaster can cause. Typical materials include bacteria (microbial remediation), fungi (mycoremediation) and algae to consume polluting material, or plant species (phytoremediation) to soak it up through natural soil leaching. Bacteria are often preferred, as they will usually convert dead organisms (biological life killed in the disaster) into organic materials such as nutrients. In each case, the job of the introduced organisms is to break down organic or inorganic pollutants into harmless or even beneficial substances, through acids, enzymes or the byproducts of their “eating” the material (25).

Biomedical Technology

Since the Enlightenment, biotechnology has seen its greatest number of applications develop and grow through the medical sciences. Today, there is no area of medicine that has not been touched by what is called “biomedical technology”. The first application is disease detection and diagnosis (1). Understanding the nature of disease has not always been an exact science; many physicians rely on experience and descriptions of symptoms to diagnose a condition, even today with all the technology at their disposal. But biotechnological methods such as genetic testing - looking for specific proteins and markers to determine which condition a patient has - are among the most recent developments (26). Now that we are unlocking the genetic codes and biological markers of most diseases, this process could become easier in the future.
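As a purely illustrative sketch of the marker-based idea described above (not any real clinical test - the marker motifs and condition names here are invented for the example), a minimal Python version of the logic might look like this:

    # Hypothetical example: flag known marker motifs in a patient's DNA sample.
    # The motifs and condition names are invented for illustration only.
    MARKERS = {
        "AGGTCTTAGC": "hypothetical condition A",
        "TTCGGACGTA": "hypothetical condition B",
    }

    def find_markers(sequence: str) -> list[str]:
        """Return the conditions whose marker motif appears in the sequence."""
        sequence = sequence.upper()
        return [condition for motif, condition in MARKERS.items() if motif in sequence]

    # Toy patient sequence containing the first marker.
    print(find_markers("CCATAGGTCTTAGCGGAT"))  # ['hypothetical condition A']

Real genetic tests work with far larger panels of variants and probabilistic interpretation, but the principle - matching a patient's sequence against known markers - is the same.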

Genetics is also informing a number of other areas, such as virotherapy (27) - the use of viruses in the treatment of disease. There are three branches: anti-cancer oncolytic viruses, viral vectors for gene therapy (the use of non-replicating viruses as carriers for a genetic change) and viral immunotherapy. The latter is similar to traditional vaccination (infecting a patient with a far less harmful virus that confers immunity to harmful related viruses) but differs in that it delivers antigens directly to the immune system. The production of traditional medicines, vaccines and antibiotics is also technically biotechnology, as researchers often examine the genetic or chemical structure of a naturally occurring substance and manipulate or synthesize it for conventional medicines (35).

Biotechnology may have future applications in a concept called “personalized medicine”. This is precisely what it sounds like - a move away from a “one size fits all” approach to healthcare towards one tailored to the unique needs of the patient, considering lifestyle and genetics and using targeted therapies, treatments and care. Naturally, this will involve some interesting technologies, such as gene sequencing to devise more effective cancer treatments for the individual, diagnostics to determine the type of cancer a patient has (not just where it is - there are different types of breast cancer, for example), and the use of biomarkers (28).

The Sub-Divisions of Biotechnology

Biotechnology is a division of biology where the science overlaps and fuses with engineering and applied technology, but there are also multiple subdivisions of biotechnology based on intersections with other areas of study. Some of these are relatively new developments, based on technologies that are only now making it into the mainstream.

Biotechnology is often organized into a “color code”; the subdisciplines below will often fall into more than one category:

  • Blue biotechnology: Using marine biological systems to develop applications, products and services - for example, examining sonar (already mentioned above) and the streamlined shapes of fish and sea mammals to develop submersibles
  • Green biotechnology: This is the application of biotech to the agricultural sciences. Genetic modification and conventional crossbreeding are the most common, but it also researches and develops fungi, algae and bacteria in such applications as environmental remediation
  • Yellow biotechnology: This is one of the oldest methods of biotech and applies to food - not to crossbreeding, but to applied and deliberate fermentation and other processes in the production of dairy produce and alcohol
  • White biotechnology: This is concerned with the industrial process of creating packaging and other mass-produced materials. Alternative plastics (also known as eco-plastics) will come under this as we search for viable and biodegradable alternatives
  • Red biotechnology: The application of biotechnology for medical purposes - some of which have already been discussed here
  • Gray biotechnology: typically applied to anything involving environmental remediation and toxic clean-up
  • Brown biotechnology: This concerns the application of biotechnology to deserts and other arid, marginal landscapes. Creating GM seeds to survive in the harshest of conditions comes under this area
  • Gold biotechnology: Computing and robotics - see below for further details
  • Violet biotechnology: Ethics, law and policy in the application of biotechnology in any industry
  • Dark biotechnology: Fusing military biotechnology with healthcare, this is the examination and study of chemical and biological pathogens used as weapons - and, of course, the application of biotech to disaster relief, treatment and cures

Bioinformatics

Bioinformatics is where biotechnology meets information technology, acting as a form of theoretical biology (29). It's an interdisciplinary and cross-disciplinary field that develops software-based tools for analyzing biological data. With the advent of Big Data, it's likely to experience a heyday over the next few years and into the 2020s. It brings together experts and data from such broad fields as information technology, the biological sciences, engineering and math for the analysis and interpretation of biological information. In essence, it applies statistics, computational processing and modelling to the biological sciences. Typical applications for bioinformatics include investigating large-scale population trends and identifying individual genes as candidates for common diseases. Bioinformatics allows researchers to examine and process more information, faster and more often, using enormous processing power and statistical analysis, including digital mapping where appropriate. The further implications are clear - not just for medical use (human and zoological/veterinary), but in agritech and gene simulation modeling too (30). We can identify specific genes that make crops hardier and more likely to produce higher yields, selectively breed for them, and project how they might work with other genes.
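As a toy illustration of this kind of analysis (the sequences, the marker motif and the "high-yield"/"low-yield" groups below are all invented, not real crop data), a minimal Python sketch comparing how often a hypothetical hardiness marker appears in two groups of plant samples might look like this:

    # Toy bioinformatics sketch: compare the frequency of a hypothetical
    # "hardiness" marker motif between high-yield and low-yield plant samples.
    # All sequences and the motif itself are invented for illustration.
    MARKER = "GATTACA"

    high_yield = ["TTGATTACACG", "AAGATTACATT", "CCCGTTAGGAA"]
    low_yield = ["TTAGCCGGAAT", "GGATTACAAAC", "CCTTGGAACCG"]

    def marker_frequency(sequences: list[str], motif: str) -> float:
        """Fraction of sequences in which the marker motif appears."""
        hits = sum(motif in seq for seq in sequences)
        return hits / len(sequences)

    # A higher frequency in the high-yield group would make the marker a
    # candidate for further study; a real analysis would use far more data
    # and a proper statistical test.
    print("high-yield group:", marker_frequency(high_yield, MARKER))  # ~0.67
    print("low-yield group:", marker_frequency(low_yield, MARKER))    # ~0.33

Production bioinformatics pipelines do the same thing at vastly greater scale, across whole genomes and thousands of samples, with statistical models to separate genuine associations from chance.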

Biorobotics

Biorobotics combines biomechanics with computer-controlled robotics. Engineers work to develop robots whose applied methods, mechanics and theory are inspired by living systems; in theory, these should have advantages over traditional robotics. By examining the evolution of biological systems, we can design better robots, prosthetics and industrial automation (31). The most obvious applications are in prosthetics and the automation of dangerous jobs to reduce workplace injuries and deaths, but with many more jobs expected to move towards automation in the next decade, the potential is enormous. Miniaturization may help us examine disease genetics further, for example - this is called nanobiotechnology (32). Some robots have even been used in trials examining childhood development to identify autism, and in surgery. As far as environmental applications are concerned, the ability of robots to mimic and adapt to animals in a natural environment may go some way towards understanding ecosystems and putting in place structures for environmental conservation (33).

Sources
