News
Microsoft’s model BitNet b1.58 2B4T is available on Hugging Face, but it does not run on GPUs and requires Microsoft’s own inference framework, bitnet.cpp.
Memory requirements are the most obvious advantage of reducing the precision of a model’s internal weights: BitNet b1.58 2B4T needs only about 400MB of memory.
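As a rough back-of-envelope check (assuming the ~1.58 bits per weight implied by the ternary encoding, and ignoring activations and other overhead), 2 billion parameters do work out to roughly 400MB:

```python
# Approximate weight storage for a 2B-parameter model at 1.58 bits/weight.
# Illustrative arithmetic only; real memory use also includes activations,
# embeddings, and framework overhead.
params = 2_000_000_000
bits_per_weight = 1.58          # log2(3) for ternary {-1, 0, 1} weights
mem_mb = params * bits_per_weight / 8 / 1e6
print(f"{mem_mb:.0f} MB")       # roughly 400 MB
```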
Microsoft's New Compact 1-Bit LLM Needs Just 400MB of Memory
Microsoft’s new large language model (LLM) puts significantly less strain on hardware than other LLMs, and it’s free to experiment with. The 1-bit LLM (1.58-bit, to be more precise) uses only the values -1, 0, and 1 for its weights, which could make it practical to run LLMs on small devices such as smartphones. Microsoft put BitNet b1.58 2B4T on Hugging Face.
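The ternary weight idea can be illustrated with the "absmean" quantization scheme described in the BitNet b1.58 paper: scale each weight matrix by its mean absolute value, round, and clip to [-1, 1]. This is a minimal sketch (function and variable names are illustrative, not from Microsoft's code):

```python
import numpy as np

def absmean_ternary_quantize(w: np.ndarray, eps: float = 1e-6):
    """Quantize a weight matrix to {-1, 0, 1} plus a per-tensor scale.

    Sketch of the absmean scheme from the BitNet b1.58 paper:
    divide by the mean absolute weight, round, clip to [-1, 1].
    """
    scale = np.mean(np.abs(w)) + eps             # per-tensor scaling factor
    w_q = np.clip(np.round(w / scale), -1, 1)    # ternary values -1, 0, 1
    return w_q.astype(np.int8), scale            # dequantize as w_q * scale

# Example: quantize a small weight matrix
w = np.array([[0.9, -0.05, -1.2],
              [0.3,  0.0,  -0.4]])
w_q, scale = absmean_ternary_quantize(w)
```

Because every stored weight is one of three values, matrix multiplication reduces largely to additions and subtractions, which is why such models are attractive for CPUs and small devices.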