By comparison, GPT-4's parameter count is estimated at one trillion, roughly six times that of its predecessor, GPT-3. Parameter count may not be a reliable measure of LLM efficacy, however, as Stability AI noted in its blog post announcing the launch of StableLM:
"We're releasing the first of our large language models, starting with 3B and 7B param models, with 15-65B to follow. Our LLMs are released under CC BY-SA license. We're also releasing RLHF-tuned models for research use. Read more→ https://t.co/R66Wa4gbnW" — Stability AI (@StabilityAI) April 19, 2023
“StableLM is trained on a new experimental dataset built on The Pile, but three times larger with 1.5 trillion tokens of content […] The richness of this dataset gives StableLM surprisingly high performance in conversational and coding tasks, despite its small size of 3 to 7 billion parameters.”

It's unclear at this time exactly how robust the StableLM models are. The Stability AI team noted on the organization's GitHub page that more information about the models' capabilities would be forthcoming, including model specifications and training settings.

Related: Microsoft is developing its own AI chip to power ChatGPT

Provided the models perform well enough in testing, the arrival of a powerful open-source alternative to OpenAI's ChatGPT could prove interesting for the cryptocurrency trading world. As Cointelegraph reported, people are building advanced trading bots on top of the GPT API, as well as new variants that incorporate third-party tool access, such as BabyAGI and AutoGPT. The addition of open-source models to the mix could be a boon for tech-savvy traders who don't want to pay OpenAI's access premiums.

Those interested can test out a live interface for the 7B-parameter StableLM model hosted on Hugging Face. However, as of this article's publication, our attempts to do so found the website overwhelmed or at capacity.