The alleged sexual assault, perpetrated by a group of strangers, was against the in-game avatar of a girl aged under 16. It is the first case of a sexual offence in virtual reality to be investigated by a UK police force.
Although the victim suffered no physical injuries, a senior police officer familiar with the case told the New York Post that she may have experienced emotional and psychological trauma similar to that of someone sexually assaulted in the real world.
Prosecuting the case may prove legally complex, since existing laws define sexual assault as physical touching in a sexual manner without consent.
“We have a duty to take issues like this seriously. It might seem strange to some, but I do think this is something well worth looking into,” home secretary James Cleverly told LBC.
“I know it is easy to dismiss this as being not real, but the whole point of these virtual environments is they are incredibly immersive,” he continued. “And we’re talking about a child here, and a child has gone through sexual trauma.
“It’s also worth realising that somebody who is willing to put a child through a trauma like that digitally may well be someone that could go on to do terrible things in the physical realm.”
The metaverse is an online space where users interact with each other through digital recreations of themselves. According to Statista, the number of UK metaverse market users is expected to reach 24.9m by 2030, over a third of the population.
This is not the first alleged sexual assault in the metaverse. In 2022, a 21-year-old woman claimed her avatar was raped while other metaverse users watched. According to her, a user led her into a private room and made advances that caused her hand controllers to vibrate, “creating a very disorienting and even disturbing physical experience during a virtual assault,” said a report in Wion.
According to a police officer speaking to the Daily Mail, the metaverse has become rife with such crimes.
This has prompted the NSPCC (the National Society for the Prevention of Cruelty to Children) to urge immediate action, looking to tech firms to address the issue of abuse in the virtual reality space.
“Companies must act now and step up their efforts to protect children from abuse in virtual reality spaces. It is crucial that tech firms can see and understand the harm taking place on their services, and law enforcement has access to all the evidence and resources required to safeguard children,” said Richard Collard, associate head of child safety online policy at the NSPCC.
While the Online Safety Act, passed last year, grants regulators the authority to sanction social media companies for content on their platforms, it has yet to be enforced. Ofcom, the communications regulator, is currently developing guidelines for the practical implementation of the rules.
In response to the allegations, Meta, the parent company of Facebook and Instagram and an operator of the metaverse platform where the alleged incident took place, said such behaviour has no place on its platform. The company highlighted an automatic protection feature called “personal boundary”, designed to keep unfamiliar users at a distance, and said it was committed to investigating the incident once details become available.