Computational lexicology

First, collaborative activities between computational linguists and lexicographers led to an understanding of the role that corpora played in creating dictionaries.

Computational lexicologists then moved on to building large corpora to gather the basic data that lexicographers had used to create dictionaries.

The advent of markup languages led to the creation of tagged corpora that could be more easily analyzed to create computational linguistic systems.

WordNet can be considered such a development, as can newer efforts at describing syntactic and semantic information, such as Fillmore's FrameNet work.

Outside of computational linguistics, the ontology work of artificial intelligence can be seen as an evolutionary effort to build a lexical knowledge base for AI applications.

In this respect, the various data models of computational lexicons have been studied since 2003 by ISO/TC 37 within the Lexical Markup Framework project, which led to an ISO standard (ISO 24613) in 2008.