[3] It was announced on September 7, 2023,[4][5] and an initial paper was published four days later.[6]
Although initially intended for use in IBM's cloud-based data and generative AI platform Watsonx alongside other models,[7] IBM later opened the source code of some of its code models.[10][11][1]
A foundation model is an AI model trained on broad data at scale such that it can be adapted to a wide range of downstream tasks.[4][13]
On May 6, 2024, IBM released the source code of four variations of Granite Code Models under the Apache License 2.0, a permissive open-source license that allows free use, modification and sharing of the software, and made them available on Hugging Face for public use.[14][15]
According to IBM's own report, the Granite 8B model outperforms Llama 3 on several coding-related tasks within a similar parameter range.
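The open-sourced code models can be loaded with the Hugging Face transformers library. The following is a minimal sketch; the repository identifier, prompt, and generation settings are illustrative assumptions rather than details from IBM's release.

```python
# Minimal sketch: loading an open-sourced Granite Code model from Hugging Face
# with the transformers library. The model identifier and generation settings
# are illustrative assumptions, not taken from IBM's documentation.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm-granite/granite-3b-code-base"  # assumed repository name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Ask the model to complete a code prompt.
prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```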