30s Summary
Near Protocol aims to build the world’s largest open-source AI model, roughly 3.5 times bigger than Meta’s current one. Thousands of contributors will take part through its Near AI Research hub, starting with training a smaller model. The project will scale up across seven models, with privacy protected by encrypted Trusted Execution Environments. The roughly $160 million cost will be covered by selling tokens. The effort depends on new technology for training across a decentralized network of computers, and it champions decentralized AI over AI controlled by a single entity.
Full Article
Near Protocol is planning to create the world’s biggest open-source artificial intelligence (AI) model. Would you believe it’s going to be 3.5 times bigger than Meta’s current model? They announced this plan at a recent conference in Bangkok.
The plan is to get thousands of people to contribute to the model through the new Near AI Research hub. In fact, participants can start training a smaller model from today!
The project will progressively scale up in size and complexity across seven different models, and only the top contributors will move on to the larger, harder ones. The models will be monetized, and user privacy will be protected with encrypted Trusted Execution Environments.
Funding is, of course, substantial: the training and computing are expected to cost about $160 million. But Near Protocol’s co-founder, Illia Polosukhin, says the funds can be raised by selling tokens. He says, “The tokenholders get repaid from all the events that happen when this model is used. So we have a business model, a way to monetize it, raise money, and have a loop that lets people reinvest back into the next model.”
Near Protocol, by the way, is one of the few cryptocurrency projects potentially capable of pulling this off. Polosukhin co-authored “Attention Is All You Need,” the landmark Transformer paper that paved the way for ChatGPT. Alex Skidanov, the other co-founder, who also worked at OpenAI, acknowledged that this is a huge project with enormous hurdles.
For example, training a model this large would normally require tens of thousands of GPUs in one place. To avoid that, the team is considering a decentralized network of computers, but that would require inventing brand-new technology, since current training methods depend on very fast connections between machines. Based on DeepMind’s recent research, though, it just might be possible.
Whatever happens, Polosukhin emphasizes that decentralized AI technology must win. As Edward Snowden explained at the conference, if one company controls all the AI, we’ll effectively have to do whatever that company says, and there will be no decentralization. In short, the path forward is decentralized AI.