Jean was just 16 when he was left outside the front door of the headquarters of UK Visas and Immigration in Croydon, London – alone, frightened and without any paperwork to show who he was. He had arrived in Britain just hours before – his first ever trip outside his home country in Central Africa and the first time he had even travelled out of his home town.
The previous few days had been a nightmare in which he had witnessed a horrific attack on his family. He himself had been subjected to torture and, left without anyone else to turn to, he had managed to find help from a trusted friend of the family.
She had brought him to the UK by plane, and Jean briefly thought he might have found safety with her, until he was taken to Croydon’s Lunar House and told he was now on his own.
“She said ‘I can’t help you anymore’. All I remember is she said ‘go into the building and tell them who you are’. It was a bit of a struggle to let her go. I was scared, I didn’t want to go into that building. But she convinced me to go inside. I found out later that it was an immigration centre,” Jean, a pseudonym used for safety reasons, told The Independent.
“I was feeling confusion and fear. The weather, the language… everything was new to me. I was just lost. Initially I was scared seeing people in uniform because that brought me back to what I had witnessed at home. I was traumatised from what I had experienced.”
Thousands of unaccompanied asylum-seeking children seek help from the UK authorities each year, with the majority of them aged 16 or 17. In the year ending March 2025, there were 3,707 asylum claims from lone children.
For those aged 17 and under, social services must provide somewhere safe to live, as well as clothes, food, education and help with an asylum claim.
However, hundreds of children are wrongly assessed by Home Office officials as adults, meaning they do not get the help they are entitled to and are often put into dangerous situations.
Data obtained by the Helen Bamber Foundation revealed that at least 678 children in 2024 were wrongly classified as adults after a human “visual assessment” at the border.
David Bolt, the independent chief inspector of borders and immigration, found that factors like “lack of eye contact” were used to make decisions, and said that children were being “pressured” into declaring they were over 18. From a sample of 55 cases that the inspector looked at where the Home Office had said the asylum seeker was “significantly over 18”, 76 per cent were in fact found to be children.
Ministers now plan to replace human judgement with AI facial-recognition technology, in a move that charities and rights groups have said amounts to an “experiment on migrants” that will lead to “serious, life-changing consequences”.
The Home Office is in the market for “an algorithm that can accurately predict the age of a subject”. A government contract notice, seen by The Independent, says that the technology “could have a number of use cases for Home Office, an example could/ would be to assist in determining the age of those that are encountered without verifiable identity documentation”.
The three-year contract, which will start in February next year, is valued at £1.3 million. Announcing the plans in July, then-Home Office minister Dame Angela Eagle said that the AI facial age estimation technology would be the “most cost-effective option”.
The aim is for facial age estimation to be “fully integrated into the current age assessment system over the course of 2026”, she said.
It is not yet clear at which stage of the asylum process the AI age-estimation technology would be used; whether it would be deployed on children as they arrive in the UK on small boats, or whether it would be used to inform final asylum claim decisions. The Home Office has said that the technology will be used to assist officials, and that no final decisions have been made about the stage of the process at which it will be integrated.
If it is used on arrival, the algorithm would have to account for the ageing effect of traumatic journeys, past torture and abuse – experiences that can often make young asylum seekers appear older.
Jean initially found help from social services when he arrived in the UK in 2012, and was housed in care with other children. However, Home Office officials later decided he wasn’t a child after all, and his support was taken away.
The decision was devastating. “I was called to an interview at 4pm. They gave me a time when offices are about to close, that’s my understanding now, but I didn’t realise it at the time,” he said.
“I had to wait for them to receive me at 5pm. They said ‘you are not a child’, saying you’re a liar. I told them ‘I’m not a liar, I know who I am, I know my age’. When someone is at a desk questioning your age, you feel like you are invisible. You have to fight for your identity, and it’s not easy to fight for yourself.
“You feel like you have to isolate yourself to deal with what you have been through, constantly questioning: why, why, why? You feel like you want to end everything because they don’t believe you, and I know for sure that many young people are in the same situation.”
He explained that the immigration officials told him he had to go to the offices of a charity, the Refugee Council, and that he should find his own way there: “They gave me a map, and it was a long journey to get there, especially as I was struggling with the language. I managed to get there but it was about to close, and I got sent to a hostel to sleep”.
A now 17-year-old boy with little English, he was housed with adult asylum seekers in a hostel. He felt extremely unsafe and thought it would be a good decision to leave, something he later viewed as a mistake.
“I was traumatised, anxious, and I just wanted to be by myself. That was the idea,” he explained. This then led to around four years sleeping rough in London – until a stranger who saw him begging for money at a train station directed him to the Notre Dame charity in Leicester Square.
He got a referral to the migrant charity Freedom from Torture, which was able to help Jean submit a fresh asylum claim. A judge’s decision to grant him sanctuary in 2018, and a recognition that he should have been helped as a child refugee all those years ago, means Jean now has a roof over his head in council-provided accommodation.
On the day of our interview, he has heard that he is now a British citizen. However, he fears for others like him who arrive in the UK as children but are told they are liars.
Speaking about the government’s plans to use AI to help with decision-making, he said: “It’s a way of not treating people as human beings. They’re treating us as a tool to train their AI.
“They’re testing something and it’s like we aren’t human. They’re thinking ‘okay, let’s use them’.
“Making decisions based on a computer, we all know it’s not always accurate. They need to understand that a lot of young people are going through trauma, and they may look different at that moment when they really need help.”
Kamena Dorling, director of policy at the Helen Bamber Foundation, said the government’s plans were “concerning unless significant safeguards are put in place”.
She added: “Existing evidence has found that AI can be even less accurate and more biased than human decision-making when judging a person’s age, with similar patterns of errors.
“Crucially, AI cannot account for factors that can significantly alter a young person’s appearance after fleeing conflict and persecution and making dangerous journeys, such as trauma, malnutrition and exhaustion.”
Anna Bacciarelli, senior AI researcher at Human Rights Watch, said: “The UK government’s plans to use facial age estimation are misguided at best, and should be scrapped immediately.
“In addition to subjecting vulnerable children and young people to a dehumanising process that could undermine their privacy, non-discrimination and other human rights, we don’t actually know if the technology works. There are no standardised industry benchmarks, and simply no ethical way to train and audit this technology on like-for-like populations.
“In the UK, it has been used in the past in shops and bars, not refugee processing centres”.
A Home Office spokesperson said: “Robust age assessments are a vital tool in maintaining border security.
“We will begin to modernise that process in the coming months through the testing of fast and effective AI age estimation technology. We then intend to integrate facial age estimation into the current system, subject to the results of testing and assurance.”









