The IT industry is constantly redefining the term "efficient" by creating new software and hardware solutions designed to increase application performance. The task is made harder by a parallel trend: the steady growth in the amount of data. This matters especially for analytical systems, where most processing involves data sets of millions of records. Data warehouses were therefore among the first to adopt "in-memory computing" solutions, that is, solutions that eliminate reads from slow hard drives by keeping data in main memory.
The basic expectations of in-memory technologies are twofold: first, rapid access to current information and to the analyses needed to respond adequately to changing market conditions; second, a lower cost of purchasing and operating the IT infrastructure required to store huge data sets.
From the end user's point of view, however, it is the effect, namely the speed of operation, that counts most of all. And these expectations keep rising. Not long ago, analysts were satisfied when generating a report in the data warehouse took a dozen or so seconds; now they want that time reduced to less ...