Biology Education Is Dead. Long Live LLMs
Imagine you are a biology student in the final year of your degree. You wake up in your dorm room, grab your phone, and check the news. "CRISPR-GPT is a new LLM capable of designing and running complex CRISPR experiments. Even untrained individuals had a 100% success rate in their experiments." You sit there, bleary-eyed, anxiety rising in your chest. A year ago you chose to pursue a genetics specialization, hoping to do research in the world of genes, DNA modifications, and all those things that seemed like science fiction at the time. Of course, you were unaware of the state of the job market back then. And of the AI hype slowly consuming the field. As you near graduation, you wade into the job market and realize something: there is very little for you to do. Not only is the market flooded with applicants (the "collateral damage" of daily restructurings), but AI is simply faster than you could ever be, holds far more information, and costs barely anything to use, as it seems to feed on the dreams of the new generations rather than food. You plop down on the bed, close your eyes, and try not to think about it.
On the other side of the wall, a fellow student wakes up half an hour before class. They have an assignment to submit... and a blank page where words should be. Instead of showering and downing a quick coffee as on most mornings, they power up their laptop and feed a few instructions into their ChatGPT conversation. They attach a file with the western blot images from the last lab, and neat rows of text start to appear on their screen, explaining the results, why the different bands are where they are, and what that means. After a couple more prompts to "humanize" the text and a quick skim for any leftover em-dashes, they paste it into a Word document, slap their name somewhere, convert it to a PDF, and upload it. Fifteen minutes have passed. No shower today, but they grab a quick coffee and make it to class just in time.
The Unstoppable Rise Of Technology
Somehow, people are surprised at what is happening in education with Artificial Intelligence (AI) and Large Language Models (LLMs). Go to a family gathering or a group of friends and try to discuss it. All, or most, act surprised that students will readily use AI to "cheat", write essays, and avoid doing work for their classes. Perhaps you are too! But should we really be?
From Encyclopedia To Google And Wikipedia
If you are reading this, chances are your parents had a dictionary or encyclopedia at home they used for coursework. Books and books on different subjects adjacent to their major. But you likely had other tools. Google. Wikipedia. Online books.
We live in the Information Age. It's right there in the name. Access to information has been democratized more in the past 50 years than in the millennium before. And teachers have been saying the same thing ever since this revolution started. Which, if we boil the nuance away, comes down to "it's bad".
When calculators arrived in classrooms, math teachers actually took to the streets to protest. I have endlessly heard that we should not use Wikipedia as a source. Book passages and references are still used in academic settings, as if it were not all a quick Google search away. But what should we trust? There is so much unchecked information out there that some even prefer to call this the Disinformation Age.
The Challenge In Education: Transitioning From Building Knowledge To Information Processing
Let's stop for a moment and put ourselves in the second student's shoes. Why not write the essay with ChatGPT? Is that really unacceptable, ethically wrong, as many teachers claim? I say it isn't. We, as humans, can never hold more knowledge than AI, LLMs, Google, etc. It is physically impossible; our brains cannot store that much information. The internal sense of dread at having to compete with these new technologies is rough. If you could access all that information in a few clicks, would you really ignore it and do the work "by hand"? If you were asked to dig a hole in the ground with your bare hands, and someone left a shovel next to you, would you not use it? Can you honestly say you wouldn't? After all, a future without shovels or LLMs seems unthinkable now.
And therein lies our problem as a society and educators. Technology is here to stay. We cannot keep asking students to forgo the tools at their disposal, especially when it is clear that the skills they build without them will be outdated in a few years, as AI advances.
Biology Education Is Dead As We Know It
This week, the paper describing CRISPR-GPT came out. It is, simply put, amazing. For many researchers, it will make things easier. And yet, for many others, it will mean fewer jobs. What once was a coveted and hard-to-acquire skill set is now easily replaced with an AI/LLM.
This will keep happening. Literature search, western blot analysis, lab management. Drug discovery, protein folding, DNA structures. All of this is or soon will be done by AI, or at least it will take very little training to run an AI that solves these problems for you. The same is happening in other fields. So what can we do?
If You Can't Beat Them, Join Them
Entry-level jobs are already disappearing en masse. We have a responsibility to new generations to teach them skills that will serve both scientific progress and their own lives and careers. So, as they say, if you can't beat them, join them.
Using ChatGPT and other AI models must be integrated into education. We cannot shame students who choose to use tools that make their lives easier. Yes, there is value in knowing how to do things yourself, and that should be taught. But if students leave university unprepared to use all the tools available (and use them well!), the system has not done its job. It falls to us to teach students how to speed up research, searches, and science using these tools.
That may take many forms. In my mind, teachers can encourage students to use ChatGPT, then let other students grade the results and critique the LLM's work. Reason beyond what the models can. Explain the flaws in their output. Double-checking claims and finding good references is paramount. Students must be AI-literate: able to craft prompts that elicit accurate, useful responses, and able to build beyond what the AI puts out. Teachers can challenge students to explain several concepts in a limited amount of space using only prompts, without writing anything themselves. And I am sure there are many other, better ways to integrate AI in the classroom than what I have come up with.
The era where knowledge alone was the main driver of employment in science is over. How we handle large streams of information, how we make sure our scientific arguments are airtight, and how we build on top of machine-stored knowledge will be the keys to future success.
After all, an AI may provide you with a step-by-step protocol for an experiment, but if the reasoning behind why you wanted to perform that experiment is flawed, you will have wasted your time anyway. If the results cannot validate your hypothesis, you may have walked in circles.
Standing On The Shoulders Of Giants
Science has a beautiful saying that I have always found depressing: "standing on the shoulders of giants". Although it refers to all those who came before us and contributed to science, to me these "giants" really are the pioneers of techniques, the innovators, the disruptors: the ones who made the experiments we run possible, who created the knowledge we have and need in order to progress.
The truth is that barely anything we invent is new. Concepts are reapplied to different areas, and new things may be discovered, but few things are actually built from scratch. And yet, the democratization of information allows more of us to stand on these proverbial shoulders. We have more access than ever to protocols, references, and results: we can search for them, generate them with AI, and so on. All knowledge is more readily available than ever.
Few people are truly able to build something more, something new. And repeating what your great peers have already done is not something to be dismissed: the world needs more people to perform established tasks than to invent new ones.
But with more people than ever having access to these cutting-edge tools that provide knowledge and experience and do tasks for us, those who are actually able to understand them, integrate them, and evolve them will be the true winners. The next "giants".
It is our generation's duty to teach students to deal with the flood of information that AI can produce. To think critically, so that they do not merely perform an experiment, but perform the right experiment. To connect the dots when the machine cannot. And to take the leap between knowledge and innovation, from idea to invention. Because AI and LLMs only have the knowledge we provide, the results we validate, and the protocols and logical leaps we make. Because, although they are disruptive, world-changing tools, they are at heart just that. Tools. And we still need human minds to use them.