Perhaps it seems counter-intuitive when talking about the future of science to look back at its beginnings; perhaps it even seems strange to discuss the concept of a beginning of science. Nevertheless, it is important to look back before moving forward.
Science is an incredibly ambiguous word. In some ways the concept of science predates humanity in the use of tools such as sticks to extract ants from anthills. This was, at the time, a technological advancement. Another was fire, a key step in the development of humanity.
About 9,000 years ago farmers used their understanding of the world around them to grow better crops. Humanity has always used its understanding of the stars to navigate.
But are these examples of science?
Not if you apply our current understanding of science as the study of the physical and natural world, or as a systematically organised body of knowledge on a particular subject; by these definitions, science is surprisingly new to humanity.
The ultimate origin of science can be debated back and forth but the important issue here is that science is something created by humans, for humans.
Samir Okasha, author of Philosophy of Science: A Very Short Introduction, put it this way: “Scientific theories are invented rather than discovered so they will always bear the imprints of human creativity and they will always be products of their time.”
It’s a similar story with maths and numbers. Numbers were created by human beings. They are part of our attempt to organise and understand the world. In fact, legend has it that when Hippasus, a man who lived about 2,500 years ago, discovered that the square root of two could not be written as a ratio of whole numbers, he was murdered for attempting to share his findings with the world.
Look up the story, it’s incredibly interesting. Of course, we now know that the square root of two exists, just not in a tangible way. You could never write it out in full, for example, because it is irrational: its decimal expansion never ends and never repeats. Similarly, imaginary numbers exist, just not as tangible counting numbers like one, two, or three.
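To make Hippasus’ discovery concrete, here is a minimal Python sketch (purely illustrative; the function name is invented for this example) that brute-force searches for a fraction whose square is exactly two and, as the classic proof guarantees, comes back empty:

```python
from fractions import Fraction

def find_rational_root_of_two(limit=200):
    """Search every fraction p/q with 1 <= p, q <= limit for one
    whose square is exactly 2. The classic proof by contradiction
    shows none exists at ANY limit; this search merely illustrates
    that fact with exact rational arithmetic."""
    for q in range(1, limit + 1):
        for p in range(1, limit + 1):
            if Fraction(p, q) ** 2 == 2:
                return Fraction(p, q)
    return None  # no fraction in range squares to exactly 2
```

However large you make `limit`, the search finds nothing; the full proof, by contradiction on a fraction in lowest terms, takes only a few lines.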
Like maths, because science is created by humans it will ultimately bear the human imprint, and our understanding of it will change as we move forward. The natural world remains the same but our study of it and our understanding of it changes. And it is that study and understanding, known as science, which we’re focusing on now.
The evolution of science
[pullquote align=”right”]“Scientific theories are invented rather than discovered so they will always bear the imprints of human creativity.”
One of the primary misconceptions about science is the idea that science equals truth. Science, instead, simply equals our current understanding of the world around us.
For example, humanity used to know that the sun revolved around the Earth. Then, in 1543, the Copernican Revolution began. It was a slow revolution, no doubt, but a revolution nonetheless, and once it had run its course humanity knew that the Earth revolved around the sun.
Similarly, about 300 years ago we began to accept Newtonian physics, a set of rules which explain the world around us and which are still taught in schools today.
However, about 100 years ago Einstein came along and brought humanity special relativity. We now know that Newtonian physics is only an approximation and that the world is significantly more complicated than we originally thought.
Similar examples can be found in the splitting of the atom: humanity once knew that the atom was the smallest unit of matter, and now knows that, in fact, it is made up of many subatomic particles.
The theory of evolution and, later, the discovery of DNA taught humanity that all living beings come from the same place, billions of years ago.
In 1962 Thomas Kuhn shook the scientific community with the publication of The Structure of Scientific Revolutions. In this book the philosopher of science broke down what he believed went into making a revolution like the Copernican Revolution take place.
The varying criteria for such a dramatic shift in understanding included not only scientific elements like logic and a greater understanding of the material, but also several sociological elements like pressure within the scientific community and enthusiasm for the new ideas.
To many scientists, this felt like an attack, accusing science of being based on elements outside of logic and reasoning.
The debate, which continues to this day, raises several issues, including the seemingly obvious nature of science as moving toward truth.
In this debate there are, generally, two sides, the side of realism and that of instrumentalism. The first, realism, is the belief that science is trying to tell us the way the world is and should be regarded as true, at least until we know better.
The second, instrumentalism, is the belief that certain elements of science shouldn’t be regarded as concrete truth, but just as a way of explaining the data of observation.
Though the belief in instrumentalism has declined over the years, Okasha believes that it’s still an issue.
“The general dichotomy between thinking of science as trying to tell you what the world is like in and of itself as opposed to trying to construct theories that enable us to predict the data we see, irrespective of whether they’re really true in any deep sense, that dichotomy still exists.”
Understanding the universe
Keeping all this in mind, perhaps one of the most interesting areas of science sitting on the edge of a potential revolution is physics, specifically subatomic physics.
Several years ago, scientists discovered the Higgs boson, a key part of the current understanding of physics, known as the Standard Model. This model, also known as the “theory of almost everything”, is where physics currently sits in its understanding of the universe.
But if history has taught us anything it is that our understanding of things can drastically change. They might not, and many believe they won’t, but as science begins to look further into the mysteries of black holes, dark matter, and dark energy, there is a potential for a shift to take place.
On this potential change in understanding, Okasha thinks it’s most probable that science will eventually settle down.
“It could be that the basic scientific world view evolves to a point and then doesn’t change much after that, and the day to day activity of what science is just becomes filling in the details and extending the scientific picture to more and more phenomena but without fundamentally shaking [the core of what science is].”
So eventually our understanding, according to Okasha, will most likely settle down, with no more big reveals and changes in understanding. The only issue here is that the universe is pretty big and there is still so much we have yet to learn. Our basic scientific world view may never again change drastically, or it might keep changing for the next several hundred years; there is really no way of telling.
Okasha made it pretty clear when asked about the future of physics that it is ultimately futile to try to predict anything. Nobody knows what will happen and, if the last fifty years are anything to go by, the study of physics is deeply unpredictable.
Cloning the world
[pullquote align=”right”]“We’re effectively as far up the ladder of using animal models as one can go. That leaves only humans.”
The second area that seems to sit on a precipice within the scientific community is that of biology.
Though it doesn’t seem set for the type of revolution Kuhn talked about, with advancements in areas such as cloning the question of what happens next grows ever more pressing.
Though we probably won’t see a Jurassic Park-style recreation any time soon, researchers have already started studying the woolly mammoth and are on their way to being able to clone it, or at least to create a hybrid with an Asian elephant.
This, in turn, raises a good number of ethical questions about the power that humans hold, or should hold, over the cloning of animals. It also raises the question of whether or not we’ll ever get to human cloning, and what that means for humanity.
In 2007, the Oregon Health and Science University successfully cloned primate embryonic stem cells. In an interview with the Johns Hopkins University Gazette, Jeffrey Kahn, the Levi Professor of Bioethics and Public Policy at the Johns Hopkins Berman Institute of Bioethics, spoke about the implications and possibilities of human cloning.
“We’re effectively as far up the ladder of using animal models as one can go. Oregon Health & Science University has cloned non-human primates. That leaves only humans.”
This doesn’t necessarily mean creating a living human being, as most of the cloning would be for medical purposes to help combat various diseases, but it does mean that, potentially, we aren’t all that far off.
If it’s any consolation, back in 2005 the United Nations adopted a Declaration on Human Cloning, calling on member states to prohibit all forms of human cloning. So, at least for now, there’s little to worry about.
Another interesting area in biology is that of medical biology, specifically relating to antibiotics. Due to the overuse of antibiotics, the fear of superbugs – bacteria able to resist the effects of antibiotics – has grown.
Whether the fear is justified or not, the British government has estimated that by 2050 “drug resistant bacteria will kill more than 10 million people worldwide”.
Now, this is assuming we don’t find a better alternative to antibiotics, a task already being undertaken by numerous scientists around the world, so we don’t know for sure that this will happen. It’s just an estimate.
Either way, there’s a strong chance that today’s antibiotics will soon be a thing of the past.
Mind vs. Brain
[pullquote align=”right”]“Studying the subconscious is like hitting a moving target.”
Retired neurologist and journalist Robert Burton has spent a great deal of time correcting any misconceptions the public may have about what we know about the brain and about the concept of our minds.
In his book A Skeptic’s Guide to the Mind he breaks down much of what we do in fact know and areas where there is debate as to what the data shows us.
As medical science moves further into understanding the brain and mapping out just how it works, Burton is cautious about what we might expect.
“Over time, science is likely to unravel basic elements of the brain such as neurons, synapses, their interconnections, and even how fetal neurons migrate to their final place in the mature brain. My doubts are with the assumption that such knowledge will give us an understanding of the so-called ‘hard problem of consciousness’.”
The brain is a phenomenal part of humanity. Arguably the one thing that makes each of us an individual, its beauty lies in its ability to produce consciousness, an awareness of self, and a bank of memories.
That notion of the brain as a memory bank is an incredibly hard thing to wrap our heads around, let alone study.
“The basic building blocks of subconscious thought – our old experiences and memories – are constantly being reshaped by subsequent experience. We underscore some memories, modify, exaggerate, minimise, or forget others. Studying the subconscious is like hitting a moving target.”
Perhaps most startling is Burton’s reminder as to the difference between our brains and what we often refer to as our minds.
A large part of his book is dedicated to this concept of the mind and how the mind is not the brain but is instead something created by the brain. When asked, Burton said this is the biggest misconception about the brain.
What’s more, he raises the point that, just as other animal species act differently in groups than as individuals, human beings most likely do too, and the mind, he says, most likely plays a big role in this.
“Most commonly dismissed is the very high likelihood that many aspects of our mind are actually collective. Other animal species clearly demonstrate collective behavior that is distinct from their behavior as an individual. Unless we have evolved differently than the rest of the animal kingdom (highly improbable), we also behave differently as groups than as individuals.
“If so, the study of individual minds gives a skewed view of the contributions to our overall behaviour. It is unfortunate that we have such a strong but illusory sense of personal self, equipped with personal agency, that we generally think of ourselves as having unique minds subject to unique control and fail to consider how collective thought will determine the future of our civilisation.”
A good way to imagine this is by thinking of a computer built to simulate a mind. When alone, the computer would run by itself and wouldn’t interact with anything else.
But when in the company of other computers it would recognise them, perhaps through wireless connections, and function differently, interacting with and changing its behaviour based on the other computers in the room, with every other computer doing the same. It would, quite literally, operate differently.
It’s not a perfect analogy as a computer does not function like a brain, and it’s a difficult concept to try to understand, let alone confirm; but if true, it means that our current studies are giving us a misleading understanding of our own brains.
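The analogy above can be sketched in a few lines of Python (a toy illustration only, not a model of any real system; the function and its behaviour are invented for this example):

```python
def behaviour(own_tendency, neighbours=None):
    """A toy 'computer': alone, it acts purely on its own state;
    networked, its output shifts toward the group average."""
    if not neighbours:
        return own_tendency
    group = [own_tendency] + list(neighbours)
    return sum(group) / len(group)

solo = behaviour(1.0)                   # acting alone
in_group = behaviour(1.0, [0.0, 0.0])   # same unit, different output
```

Studying `solo` alone would tell you nothing about `in_group`, which is the point of the analogy: observing individuals in isolation can mislead you about how they behave together.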
Rise of the artificial intelligence?
[pullquote align=”right”]“We generally think of ourselves as having unique minds subject to unique control.”
Like Burton, Okasha questions science’s ability, at least in the near future, to study consciousness. Both believe that, though the workings of the brain can be learned, far more questions need to be answered before we understand how we remember, and continue to remember, experiences and how we develop our own awareness.
One aspect that both Burton and Okasha do raise points about is the advancement of the science of technology, with Okasha discussing the advancement in general and Burton talking specifically about computational neuroscience.
As more and more money flows into the advancement of technology, Okasha points out, it’s natural to predict greater movement in that field and he is fairly confident in believing that we’ll see progress there. How much progress exactly remains to be seen.
An interesting aspect of this progress, and something that has come up in numerous sci-fi films, is the concept of artificial intelligence (AI).
There are computers that can beat humans at chess, and robots are currently being built to play football, but the development of true AI is a whole other story.
For Burton, the answer to whether or not we’ll be seeing AI any time soon is simple.
“For those aspects of life that can be reduced to data points, I do believe that AI will be a fitting model. However, I remain dubious that those elements of life that cannot be completely captured by numbers (most of experience) are amenable to AI.”
What this means is that although we will probably be able to create a computer that functions like a brain, storing that brain with memories and experiences, and having the machine actually relive those memories, extract data from them and learn, is a whole other issue.
To build a robot that knows all the answers is one thing (think google.com) but to build one that can learn the answers and develop understanding is another.
The nature of science
[pullquote align=”right”]“Scientists need to choose what to study and to focus on one thing at a time.”
As Okasha is a philosopher of science, our conversation revolved heavily around what that means when talking about scientific pursuits and the future of science. Although he was wary of making predictions, one of the fascinating topics touched upon was the notion of scientific truth.
When looking at science in the future and the notion of scientific truth, one thing becomes clear from talking to Okasha. Science is not cut and dried. It is a human creation and the study of things we have yet to understand.
“In a sense, all scientific theories can never really be proved to be true in the strict literal sense in which you can prove that Pythagoras’ theorem is true. The reason for that is that scientific theories talk about what’s going to happen in all times and places.”
Science is based on observation, and making general claims from observation is, according to many philosophers of science, not logically watertight, because it relies on induction rather than deductive reasoning.
Now, this does not mean that scientific theories are wrong or should be ignored. Okasha actually says the opposite, pointing out that “you can achieve a high degree of certainty [using observation]” and scientists are constantly testing their observations to check for errors.
What it does mean is that the very nature of science can, and should, change as more and more evidence becomes available. Throughout history, when new evidence contradicting a certain belief has come to light, scientists have shifted their thinking.
Though change sometimes comes rather slowly (the Copernican Revolution took many, many years), this notion of science is an important one, and it is one that has allowed advancements to occur.
There are numerous aspects of science that sit on the edge of an incredibly exciting, and somewhat bewildering, future and it is because of this notion of science as changeable and adaptable that they are possible.
As is evident from talking to Okasha, science will keep moving forward, despite what certain people say when discoveries such as the Higgs boson are made. He doubts there is an ‘end to physics’, or to any other scientific subject; at least not for a long time.
As Okasha puts it, “scientists need to choose what to study and to focus on one thing at a time. There are a potentially unlimited number of things you can, in principle, study.”
The next fifty years hold a number of potential advancements and could reveal more than we can imagine. If there is one thing for certain, though, it’s that there is no telling what any of these secrets are. After all, science is about fumbling around in a dark room looking for answers, sometimes to questions we haven’t even asked yet.
Featured image, the 1940s comic book character Captain Future, by Colleen A. Bryant.