Splitting a large AI across several devices lets you run it in private (New Scientist)

Running your own AI locally by splitting it across multiple devices offers more privacy than using online services

Powerful artificial intelligence based on large language models can be computationally split to run on several smartphones. That could enable people to use AIs locally without relying on a cloud service’s data centres – and without needing to share sensitive queries or personal information with a tech company. “Our key motivation was privacy,” says Sangeetha Abdu Jyothi at the University of California, Irvine. “If users want to run large-language-model queries without revealing their questions to providers, how can we do that?”
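The article does not describe the partitioning scheme in detail, but the general idea, splitting a model's layers across devices so that each device holds only a slice of the model and only intermediate activations travel between them, can be sketched in a few lines. The toy Python example below is an illustrative assumption, not the researchers' implementation: the "phones" are simulated in one process, the "model" is a stack of small dense layers, and the pipeline-style split is one simple way such a split could work.

```python
# Illustrative sketch only: splitting a toy model's layers across simulated
# devices so no single device holds the whole model. Standard library only.
import random


def make_layer(dim, seed):
    """Create a toy dense layer: a dim x dim weight matrix."""
    rng = random.Random(seed)
    return [[rng.uniform(-0.1, 0.1) for _ in range(dim)] for _ in range(dim)]


def apply_layer(weights, x):
    """Apply one layer: y = ReLU(W @ x)."""
    return [max(0.0, sum(w * xi for w, xi in zip(row, x))) for row in weights]


class Device:
    """Simulated device holding only its assigned slice of the model."""

    def __init__(self, name, layers):
        self.name = name
        self.layers = layers

    def forward(self, activations):
        # Only intermediate activations cross device boundaries,
        # never the full set of weights.
        for layer in self.layers:
            activations = apply_layer(layer, activations)
        return activations


def split_across_devices(all_layers, num_devices):
    """Partition layers contiguously (pipeline-style) across devices."""
    chunk = (len(all_layers) + num_devices - 1) // num_devices
    return [
        Device(f"phone-{i}", all_layers[i * chunk:(i + 1) * chunk])
        for i in range(num_devices)
    ]


if __name__ == "__main__":
    DIM, NUM_LAYERS, NUM_DEVICES = 8, 12, 3
    model = [make_layer(DIM, seed=i) for i in range(NUM_LAYERS)]
    devices = split_across_devices(model, NUM_DEVICES)

    x = [1.0] * DIM  # stand-in for an embedded query
    for dev in devices:  # run the pipeline device by device
        x = dev.forward(x)
    print("output from", devices[-1].name, ":", [round(v, 4) for v in x])
```

In a real deployment the activations would be sent over a network between phones rather than passed within one process, and the model would be a large language model rather than a toy stack of layers, but the privacy argument is the same: the user's query and the full model never sit on a third party's server.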

Read the full article in New Scientist.