Unlike Moore's law scaling, a new computer class is usually based on lower-cost components that have fewer transistors per chip, fewer bits on a magnetic surface, or the like.
It also takes up to a decade to understand how the class formed, evolved, and is likely to continue.
Established market-class computers, aka platforms, are introduced and then continue to evolve at a roughly constant price (subject to learning-curve cost reduction) with increasing functionality or performance, driven by Moore's law: more transistors per chip, more bits per unit area, and hence increased functionality per system.
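The constant-price, exponentially-growing-functionality claim above can be made concrete with a small back-of-the-envelope sketch. The 2-year doubling period used here is an assumption for illustration; commonly quoted values for Moore's law range from 18 to 24 months.

```python
# Illustrative sketch (assumption: a 2-year doubling period) of Moore's-law
# growth in transistor count at a roughly constant system price.

def transistors_after(years, doubling_period_years=2.0, initial=1.0):
    """Relative transistor count after `years` of Moore's-law scaling."""
    return initial * 2 ** (years / doubling_period_years)

# Over one decade at a 2-year doubling period: 2**5 = 32x the transistors
# (and thus roughly 32x the functionality) at about the same price point.
print(transistors_after(10))  # 32.0
```

This is why a platform held at constant price still roughly quintuples its doublings per decade, while a new class instead spends the transistor budget on a smaller, cheaper device.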
Roughly every decade, technology advances in semiconductors, storage, networks, and interfaces enable the emergence of a new, lower-cost computer class (aka "platform") that serves a new need. The new class is enabled by smaller devices: more transistors per chip; less expensive storage, displays, I/O, and networking; and a unique interface to people or to some other information-processing source or sink.
Beginning in the 1990s, a single class of scalable computers or mega-servers, built from clusters of a few to tens of thousands of commodity microcomputer-storage-networking bricks, began to cover and replace mainframes, minis, and workstations, becoming the largest computers of the day; when applied to scientific calculation, they are commonly called supercomputers.