News

Microsoft’s model BitNet b1.58 2B4T is available on Hugging Face, but it doesn’t run on GPUs and requires Microsoft’s purpose-built bitnet.cpp framework.
Memory requirements are the most obvious advantage of reducing the precision of a model's internal weights. The BitNet b1.58 ...
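For a rough sense of the savings, the sketch below compares the weight storage of a 2-billion-parameter model at 16 bits per weight with the same weights packed at a theoretical 1.58 bits each. The figures are back-of-envelope estimates for illustration, not Microsoft's published numbers.

# Back-of-envelope weight-memory estimate (illustrative assumptions,
# not Microsoft's published figures).
import math

params = 2_000_000_000            # roughly 2B weights, as in BitNet b1.58 2B4T

fp16_bytes = params * 2           # 16 bits (2 bytes) per weight
ternary_bits = math.log2(3)       # ~1.58 bits of information per ternary weight
ternary_bytes = params * ternary_bits / 8

print(f"FP16 weights:    {fp16_bytes / 1e9:.1f} GB")
print(f"Ternary weights: {ternary_bytes / 1e9:.2f} GB (theoretical packing)")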
Microsoft’s new large language model (LLM) puts significantly less strain on hardware than other LLMs, and it’s free to experiment with. The 1-bit LLM (1.58-bit, to be more precise) uses only -1, 0, and 1 to represent its weights, which could make it practical to run LLMs on small devices, such as smartphones. Microsoft put BitNet b1.58 2B4T on Hugging Face.
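To make the ternary format concrete, here is a minimal NumPy sketch of absmean weight quantization in the spirit of the BitNet b1.58 paper: each weight matrix is scaled by its mean absolute value, then rounded and clipped to -1, 0, or 1. It is an illustrative sketch, not Microsoft's released code; the function names and the small test matrix are made up for the example.

# Illustrative sketch of absmean ternary quantization (in the spirit of
# the BitNet b1.58 paper); not Microsoft's production implementation.
import numpy as np

def quantize_ternary(w: np.ndarray, eps: float = 1e-8):
    """Map a float weight matrix to values in {-1, 0, +1} plus one scale."""
    scale = np.abs(w).mean() + eps                      # absmean scaling factor
    w_ternary = np.clip(np.round(w / scale), -1, 1).astype(np.int8)
    return w_ternary, scale

def dequantize(w_ternary: np.ndarray, scale: float) -> np.ndarray:
    """Approximate reconstruction of the original weights."""
    return w_ternary.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 4)).astype(np.float32)      # toy weight matrix
w_q, s = quantize_ternary(w)
print(w_q)                                               # only -1, 0, and 1 appear
print(np.abs(w - dequantize(w_q, s)).mean())             # average quantization error

The "1.58-bit" label reflects the information content of a three-valued weight: log2(3) is about 1.58 bits, compared with 16 bits for a typical FP16 weight.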