The emergence of Web3D dates back to 1994, with the advent of VRML (Virtual Reality Modeling Language), a file format designed to store and display 3D graphical data on the World Wide Web.
[citation needed] The main drawback of the technology was the requirement to use third-party browser plug-ins to perform 3D rendering, which slowed the adoption of the standard.
[citation needed] Between 2000 and 2010, one of these plug-ins, Adobe Flash Player, was widely installed on desktop computers and was used to display interactive web pages, run online games, and play video and audio content.
[citation needed] Eventually, Adobe developed Stage3D, an API for rendering interactive 3D graphics with GPU acceleration for its Flash Player and AIR products,[4] which was adopted by software vendors.
[13] Among notable WebGL frameworks are A-Frame, which uses HTML-based markup for building virtual reality experiences;[14] PlayCanvas, an open-source engine alongside a proprietary cloud-hosted creation platform for building browser games;[15] Three.js, an MIT-licensed framework whose origins trace back to the demoscene of the early 2000s;[16] Unity, which obtained a WebGL back-end in version 5;[17] and Verge3D, which integrates with Blender, 3ds Max, and Maya to create 3D web content.
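For illustration, the following is a minimal sketch of how a scene might be set up with Three.js; it assumes the library is available as the "three" npm package, and the cube geometry, material, and camera settings are arbitrary example values rather than anything drawn from the cited sources.

```ts
import * as THREE from 'three';

// Scene, camera, and WebGL renderer filling the browser window
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// A simple rotating cube as placeholder content
const cube = new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshNormalMaterial()
);
scene.add(cube);
camera.position.z = 3;

// Render loop driven by the browser's animation frame callback
function animate(): void {
  requestAnimationFrame(animate);
  cube.rotation.x += 0.01;
  cube.rotation.y += 0.01;
  renderer.render(scene, camera);
}
animate();
```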
[21] Other upgrades include sparse accessors and morph targets for techniques such as facial animation, along with schema tweaks and breaking changes that address corner cases or improve performance, for example replacing top-level glTF object properties with arrays for faster index-based access.
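As a rough illustration of that last change, the sketch below contrasts the two top-level layouts using TypeScript object literals; the property names "meshes", "nodes", and "mesh" are real glTF properties, but the identifiers and heavily truncated values are invented for brevity and do not form spec-valid assets.

```ts
// glTF 1.x: top-level properties were dictionaries keyed by string IDs,
// and references were made by name.
const gltf1 = {
  meshes: { cubeMesh: { primitives: [] } },
  nodes: { rootNode: { meshes: ['cubeMesh'] } }, // reference by string ID
};

// glTF 2.0: top-level properties are arrays, and references are integer
// indices, enabling faster index-based access.
const gltf2 = {
  meshes: [{ primitives: [] }],
  nodes: [{ mesh: 0 }], // index into the top-level "meshes" array
};

// Resolving a reference becomes a direct array lookup:
const rootMesh = gltf2.meshes[gltf2.nodes[0].mesh];
console.log(rootMesh);
```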