The term is used for more than one approach: extremely large datasets, for example, may be divided between co-operating systems as in-memory data grids.
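As a rough sketch of the data-grid idea (not of any particular product), the following Python example hash-partitions a key-value dataset across several dictionaries standing in for the memory of co-operating nodes; the node names, key format and routing function are assumptions made for the illustration.

```python
# Minimal sketch: hash-partition a dataset across co-operating "nodes".
# Real in-memory data grids (e.g. Hazelcast, Infinispan) add replication,
# rebalancing and network transport; here each node is just a local dict.
from zlib import crc32

NODES = ["node-a", "node-b", "node-c"]      # hypothetical grid members
grid = {name: {} for name in NODES}         # each node's in-memory store

def owner(key: str) -> str:
    """Route a key to a node using a stable hash of the key."""
    return NODES[crc32(key.encode()) % len(NODES)]

def put(key: str, value) -> None:
    grid[owner(key)][key] = value

def get(key: str):
    return grid[owner(key)].get(key)

# Spread a large dataset over the grid, then read one entry back.
for i in range(100_000):
    put(f"order:{i}", {"id": i, "amount": i % 500})

print(get("order:4242"))
```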
With disk-based technology, data is loaded onto the computer's hard disk in the form of multiple tables and multi-dimensional structures against which queries are run.[9]
Information technology (IT) staff may spend substantial development time on optimizing databases, constructing indexes and aggregates, designing cubes and star schemas, data modeling, and query analysis.
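The snippet below is an illustrative sketch of that kind of work, using Python's built-in sqlite3 module as a stand-in for a disk-based RDBMS: it creates a small star schema (a fact table plus a dimension table) on disk and adds an index of the sort built to speed up reporting queries. The file, table and column names are assumptions made for the example.

```python
# Illustrative disk-based structures: a tiny star schema plus an index.
import sqlite3

conn = sqlite3.connect("warehouse.db")      # stored on disk, not in RAM
conn.executescript("""
CREATE TABLE IF NOT EXISTS dim_product (
    product_id INTEGER PRIMARY KEY,
    name       TEXT,
    category   TEXT
);
CREATE TABLE IF NOT EXISTS fact_sales (
    sale_id    INTEGER PRIMARY KEY,
    product_id INTEGER REFERENCES dim_product(product_id),
    sold_on    TEXT,
    amount     REAL
);
-- Index added so reporting queries avoid full scans of the fact table.
CREATE INDEX IF NOT EXISTS idx_fact_sales_sold_on ON fact_sales(sold_on);
""")
conn.commit()
conn.close()
```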
Though SQL is a very powerful tool, arbitrarily complex queries against a disk-based implementation take a relatively long time to execute and often degrade the performance of transactional processing.
To obtain results within an acceptable response time, many data warehouses have been designed to pre-calculate summaries and to answer only specific queries.
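Continuing the hypothetical sales schema above, the sketch below pre-calculates such a summary once (in practice this would be refreshed by a batch job) so that a specific reporting query reads a few aggregate rows instead of scanning the detail table.

```python
# Pre-calculate a summary table, then answer a canned query from it.
import sqlite3

conn = sqlite3.connect("warehouse.db")
# Detail table as in the earlier sketch (recreated here so the snippet
# also runs on its own).
conn.execute("""CREATE TABLE IF NOT EXISTS fact_sales (
                    sale_id    INTEGER PRIMARY KEY,
                    product_id INTEGER,
                    sold_on    TEXT,
                    amount     REAL)""")

conn.executescript("""
DROP TABLE IF EXISTS agg_sales_by_day;
CREATE TABLE agg_sales_by_day AS
    SELECT sold_on, SUM(amount) AS total_amount, COUNT(*) AS num_sales
    FROM fact_sales
    GROUP BY sold_on;
""")

# The specific report now reads summary rows rather than the raw detail.
for sold_on, total in conn.execute(
        "SELECT sold_on, total_amount FROM agg_sales_by_day ORDER BY sold_on"):
    print(sold_on, total)
conn.close()
```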
With an in-memory approach, by contrast, users query data loaded into the system's memory, thereby avoiding slower disk-based database access and its performance bottlenecks.[11]
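A minimal way to see this with standard tooling is SQLite's in-memory mode, where the whole table lives in RAM and queries never touch the disk; the schema and rows below are invented for the illustration and do not represent any particular in-memory product.

```python
# The entire database lives in RAM; ad-hoc queries are served from memory.
import sqlite3

mem = sqlite3.connect(":memory:")
mem.execute("CREATE TABLE sales (region TEXT, amount REAL)")
mem.executemany("INSERT INTO sales VALUES (?, ?)",
                [("north", 120.0), ("south", 75.5), ("north", 42.0)])

for region, total in mem.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region"):
    print(region, total)
mem.close()
```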
A range of in-memory products provide the ability to connect to existing data sources and offer access to visually rich, interactive dashboards.
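As a rough sketch of connecting to an existing data source, the snippet below copies an on-disk SQLite database (standing in for the existing source) into an in-memory database using the standard library's backup API; interactive queries, such as those behind a dashboard, could then be served from RAM. Commercial in-memory products use their own connectors rather than this mechanism, and the file name is an assumption.

```python
# Load an existing on-disk data source into memory, then query it from RAM.
import sqlite3

disk = sqlite3.connect("warehouse.db")      # the pre-existing data source
mem = sqlite3.connect(":memory:")
disk.backup(mem)                            # copy the database pages into RAM
disk.close()

# List what is now available in memory for interactive querying.
tables = [name for (name,) in mem.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]
print("tables available in memory:", tables)
mem.close()
```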
The investment is more likely to be suitable where speed of query response is a high priority and where data volumes and the demand for reporting facilities are growing significantly; it may still not be cost-effective where the information is not subject to rapid change.