A private school in London is opening the UK's first classroom taught by artificial intelligence instead of human teachers. The school says the technology allows for precise, bespoke learning, while critics argue AI teaching will lead to a "soulless, bleak future".
The "learning facilitators" they mention are the key to understanding all of this. The school still needs humans in the room to maintain discipline and make sure the kids actually engage with the AI. But roles that used to be teachers have been redefined as "learning facilitators", and apparently some former teachers have rejoined the school under that title.
Like a lot of automation, the main selling point is deskilling roles, reducing pay, and making people more easily replaceable (you don't need a teaching qualification to be a "learning facilitator" to the AI), while producing a worse service that is just good enough, provided it is wrapped in hard-to-verify claims and assumptions about what education actually is. Of course it also means you get a new middleman parasite siphoning off funds that used to flow to staff.
They could just have the kids read actual books designed by actual pedagogic experts, which actually help students learn through study.
Now nobody knows whether the "AI" is even teaching accurate material, whether it only draws on properly vetted sources, or whether the structure it proposes makes any sense.
Yes, teachers are fallible, but they are also human and can emotionally understand what is going on during learning in a way a trained algorithm just cannot. It also means there has to be a clearly defined "goal" of knowledge and competencies, and the algorithm can only fill the holes against that goal rather than encourage students to seek knowledge beyond the established set.
Also, I am skeptical how much of it is even "AI" in the sense of needing a machine-learning approach, rather than regular computer tests of which "level" has been reached in each category and where there is still room to improve. Chances are this could be done with an Excel sheet, or with something like the sketch below.
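To make that concrete, here is a minimal sketch in Python of the non-ML logic I mean; all names and thresholds are hypothetical, not taken from the article: score quiz results per category, assign a "level", and recommend the weakest category to work on next.

```python
# Hypothetical sketch of a non-ML "adaptive" tracker: per-category scores,
# fixed thresholds for levels, and "work on your weakest category next".
# A spreadsheet could do the same thing.

from dataclasses import dataclass, field

@dataclass
class StudentRecord:
    # fraction of questions answered correctly, per subject category
    scores: dict[str, float] = field(default_factory=dict)

    def level(self, category: str) -> str:
        score = self.scores.get(category, 0.0)
        if score < 0.5:
            return "beginner"
        if score < 0.8:
            return "intermediate"
        return "advanced"

    def next_focus(self) -> str:
        # recommend the category with the lowest score
        return min(self.scores, key=self.scores.get)

record = StudentRecord(scores={"fractions": 0.45, "geometry": 0.9, "algebra": 0.7})
print(record.level("fractions"))  # beginner
print(record.next_focus())        # fractions
```

Nothing in that requires machine learning, which is exactly the point.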
Unfortunately this trend is happening in the States even without the AI buzzwords (though those are there too). You give every kid a tablet with educational apps that feed into a curriculum algorithm. The algorithm tells teachers which student needs help with what; they basically become facilitators to the app. Then you also have "student summarizers" which will "analyze" a student's written or audio submission and flatten it down to some uniform stats.
The thing is, this is a private school charging the sort of fees that attract really good teachers and use them as a selling point, so I don't actually think being cheap is the goal here. I think some idiot thinks this is actually a good idea.
In some areas of the USA, teaching degrees aren't required to actually teach. I hope this doesn't spread worldwide.
Aren’t there laws about who gets to teach kids? I know there are strictures on teacher-to-student ratios, but how can those exist without a written definition of what a teacher is?
There is a lot of benefit to be had, though. It will likely suck at first, and I think the tendency to outsource this kind of thing is idiotic. The government needs to be both the AI administrator AND the provider, because AI is extremely privacy-invasive and should never be commercialized in any capacity with kids. I don’t support even the school having full access to a child’s prompting. I say this because I have intimate knowledge of what kind of information can be accessed this way and how invasive it is; I only run my own open-source models on my own offline hardware. The only person within a school with full access to a child’s prompting should be someone bound to confidentiality and a Hippocratic-style oath, like a licensed psychiatrist, with no obligations or bias towards the school’s petty interests.
The education system is largely antiquated at this point. I’m all for supporting my community with living-wage jobs. Our reductionist culture is a big part of why we are falling apart. When we are presented with efficiency improvements, we are too stupid to adapt and too stupid to use them as a resource. We flush away that newly created value instead of immediately investing it in ourselves.
The world has moved on from the era when the traditional teacher was most relevant. Audio-visual information is our primary form of communication. With readily available video, it is criminal to continue live lecturing and the presentation of static information. There is no chance that a live presentation comes anywhere near the quality of a polished, edited video, and very little chance that any given lecturer is truly the best at presenting that information. That framing also glosses over the fact that there is an enormous range of personalities and ways of thinking, so it is extremely unlikely that any given teacher connects well with each individual student. We have had readily available video communication for over a decade. Some university professors readily use the medium and offer class time as more of a workshop or lab environment. Most primary schools lack this kind of adoption of technology, complexity, and efficiency needed to keep up with a changing world. In truth, we don’t even require teachers to be lifelong learners.
I expect much the same Luddism with AI. When it comes to teaching kids, this pushes AI to the point where it needs serious supervision to be effective. Maintaining a child’s autonomy and right to privacy is absolutely critical for the future of society as a whole. However, the ability of AI to adapt to an individual’s way of thinking and help with individualized problem solving is something that no teacher is capable of with more than one student at a time.
Most of us had to persist through our frustration in order to learn. AI can directly and individually address that frustration and find a solution. It is not always correct, but it is in the same realm of accuracy as an above-average teacher. Maybe you too were aware of just how many teachers did not even know the subjects they were tasked with teaching in primary school; I certainly was.
christ
it doesn’t do this
I’m sorry your teachers sucked badly enough that you could replace them with a prerecorded video and a statistical language model that’s notorious for generating confident, dangerous lies. I don’t think most kids should have that kind of experience in school, though, and if they currently do, maybe we should do what it takes (funding, regulation, strikes) not to go in that direction.
The thing is, technology could absolutely play a huge role in advancing education, allowing students to approach material at their own pace and (algorithmically, not with black-box bullshit) adjusting problem sets to optimize what they get out of the learning; see the sketch after this comment for the kind of transparent logic I mean.
But that should be done to free the actual teacher to spend their time one-on-one with students in areas where they need the extra attention, not to replace them with some unreliable bullshit machine.
(It should also probably be only part of the schedule. Various group settings have a lot of value in a lot of contexts, both for the material and for the social stuff.) But you could absolutely enhance learning.
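As a minimal sketch of the transparent, rule-based adjustment I mean above (all names and thresholds here are hypothetical, not from the thread): pick the next problem's difficulty from a student's recent accuracy, with rules a teacher can read and audit rather than an opaque model.

```python
# Hypothetical sketch of an auditable problem-set adjuster: difficulty moves
# up or down based on recent accuracy over a small sliding window.

from collections import deque

class ProblemSelector:
    def __init__(self, window: int = 5):
        # keep only the last few results so the difficulty can recover quickly
        self.recent = deque(maxlen=window)
        self.difficulty = 1  # 1 = easiest, 5 = hardest

    def record(self, correct: bool) -> None:
        self.recent.append(correct)

    def next_difficulty(self) -> int:
        if not self.recent:
            return self.difficulty
        accuracy = sum(self.recent) / len(self.recent)
        # simple, auditable rules: step up when the student is comfortable,
        # step down when they are struggling, otherwise hold steady
        if accuracy >= 0.8 and self.difficulty < 5:
            self.difficulty += 1
        elif accuracy <= 0.4 and self.difficulty > 1:
            self.difficulty -= 1
        return self.difficulty

selector = ProblemSelector()
for outcome in [True, True, True, True, False]:
    selector.record(outcome)
print(selector.next_difficulty())  # 2: the student is doing well, so step up
```

The point of keeping the rules that simple is that a teacher can see exactly why a student was moved up or down and can override it, which is the opposite of a black box.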
No, it can’t.
Quod grātīs asseritur, grātīs negātur. (What is asserted without evidence may be denied without evidence.)
We’ve been using “video communication” to teach for half a century at least; Open University enrolled students in 1970. All the advantages of editing together the best performances from a top-notch professor, moving beyond the blackboard to animation, etc., etc., were obvious in the 1980s when Caltech did exactly that and made a whole TV series to teach physics students and, even more importantly, their teachers. Adding a new technology that spouts bullshit without regard to factual accuracy is necessarily, inevitably, a backward step.
You’re being downvoted just because you’re going against the hive mind that exists on Lemmy for some reason, saying that AI might not actually be the devil incarnate.
With a bunch of useless “nuh uh” replies in response.
AI has been incredibly useful for me and it’s fucking amazing. I can’t wait for the huge amount of equality that can be afforded to people who want to learn (and not shitpost).
I completely agree. AI as it exists now is a really useful tool, and as long as AGI doesn’t develop into a paperclip maximiser (which we should take steps to prevent, focusing on AI alignment), it will also be extremely useful, perhaps being the thing that finally means humans don’t have to work anymore. At least we won’t have to work as much.