They shipped standard with a version of BASIC built into the computer, so the user could work in BASIC without inserting a cartridge or loading software from external media.
Full tokenization means that all keywords are converted to tokens and all extra space characters are removed.
Partial tokenization still converts keywords to tokens but leaves the extra space characters in the stored source.
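
To illustrate the difference, here is a minimal sketch in Python of how an interpreter might "crunch" a typed program line into tokens. The keyword table, token values, and the tokenize function are hypothetical, illustrative only, and not taken from any particular machine's ROM.

    # Illustrative keyword table; real machines each had their own fixed
    # token values defined by their BASIC ROM.
    KEYWORDS = {"PRINT": 0x99, "GOTO": 0x89, "IF": 0x8B, "THEN": 0xA7}
    TOKEN_NAMES = {tok: kw for kw, tok in KEYWORDS.items()}

    def tokenize(line, full=True):
        """Split a typed line into (line number, token list).

        Keywords become single-byte tokens in both modes. With full=True,
        spaces outside string literals are discarded as well (full
        tokenization); with full=False they are stored exactly as typed
        (partial tokenization).
        """
        text = line.lstrip()
        digits = ""
        while text and text[0].isdigit():
            digits, text = digits + text[0], text[1:]
        number = int(digits)

        tokens, i, in_string = [], 0, False
        while i < len(text):
            ch = text[i]
            if ch == '"':                    # track string literals: never
                in_string = not in_string    # crunch or strip inside them
            if not in_string and ch != '"':
                if ch == " " and full:
                    i += 1                   # full tokenization drops spaces
                    continue
                for kw, tok in KEYWORDS.items():
                    if text.startswith(kw, i):
                        tokens.append(tok)   # keyword stored as one token byte
                        i += len(kw)
                        break
                else:
                    tokens.append(ch)        # everything else stored as-is
                    i += 1
                continue
            tokens.append(ch)
            i += 1
        return number, tokens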
How to test for full tokenization: enter a line with extra spaces, such as 10   PRINT   "HELLO", and then LIST the program. A fully tokenizing BASIC displays it as 10 PRINT "HELLO", without the extra spaces that were entered; a partially tokenizing BASIC lists the spaces exactly as typed.
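
Continuing the sketch above, a hypothetical list_line routine rebuilds the displayed text from the stored tokens and makes the test easy to see. How spaces are reinserted around expanded keywords varies from machine to machine; the rule below is only an assumption for the sketch.

    def list_line(number, tokens):
        """Rebuild the text LIST would display: expand each keyword token
        back to its name, adding a space on either side only where none
        was stored (assumed behavior, varies by machine)."""
        out = str(number)
        for j, t in enumerate(tokens):
            if isinstance(t, int):           # keyword token -> keyword text
                if not out.endswith(" "):
                    out += " "
                out += TOKEN_NAMES[t]
                nxt = tokens[j + 1] if j + 1 < len(tokens) else ""
                if nxt != " ":
                    out += " "
            else:
                out += t
        return out.rstrip()

    # The test from the text: type a line with extra spaces, then LIST it.
    typed = '10   PRINT   "HELLO"'
    print(list_line(*tokenize(typed, full=True)))    # 10 PRINT "HELLO"
    print(list_line(*tokenize(typed, full=False)))   # 10   PRINT   "HELLO"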