I need some suggestions on an optimal way to update the data in the cube. Whenever new data arrives and a row is inserted into that table, we need to update the current values in the cube, as if by a trigger.


The table must be defined in the metadata repository, and it must contain columns for all of the measures.
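For illustration, here is a minimal sketch of such a table, created from Python with pyodbc. The table name, column names, and connection string are all invented for the example; only the shape matters, with one column per measure alongside the dimension keys.

```python
# Minimal sketch: create a fact/staging table whose columns match the
# cube's measures. All names below (SalesFact, Quantity, Amount, the
# connection string) are hypothetical placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=DataWarehouse;Trusted_Connection=yes;"
)
cursor = conn.cursor()
cursor.execute("""
    CREATE TABLE dbo.SalesFact (
        DateKey     INT            NOT NULL,  -- dimension key
        ProductKey  INT            NOT NULL,  -- dimension key
        Quantity    INT            NOT NULL,  -- measure
        Amount      DECIMAL(18, 2) NOT NULL   -- measure
    );
""")
conn.commit()
```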

Depending on whether you are incrementally updating the cube or rebuilding it, you must manage the input data accordingly.
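As a sketch of that difference, the snippet below builds an XMLA Process command for an SSAS partition: ProcessAdd appends only the newly arrived rows (incremental), while ProcessFull discards and rebuilds the partition. All object IDs are placeholders, and actually sending the command to the server (through SSMS, PowerShell, or whatever channel you already use) is left out.

```python
# Sketch: build an XMLA Process command for an SSAS partition.
# ProcessAdd = incremental (load only new rows); ProcessFull = rebuild.
# MyOlapDb, SalesCube, Sales, and Sales_Current are made-up IDs.

def build_process_xmla(process_type: str) -> str:
    assert process_type in ("ProcessAdd", "ProcessFull")
    return f"""
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>MyOlapDb</DatabaseID>
    <CubeID>SalesCube</CubeID>
    <MeasureGroupID>Sales</MeasureGroupID>
    <PartitionID>Sales_Current</PartitionID>
  </Object>
  <Type>{process_type}</Type>
</Process>
"""

# Placeholder flag: in practice this would come from your ETL bookkeeping,
# e.g. whether anything other than pure inserts happened since last load.
only_new_rows = True
xmla_command = build_process_xmla("ProcessAdd" if only_new_rows else "ProcessFull")
print(xmla_command)
```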

This is a balance between the load that processing puts on your relational database, the performance of your SSAS queries, and the latency of your cubes.

If you use proactive caching, the cube listens for changes in the underlying data source and processes your MOLAP partitions on the schedule you choose.
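That schedule is controlled by the partition's proactive caching properties. Below is a rough sketch of the standard settings in XSD duration format; the values are examples only, not recommendations.

```python
# Illustrative proactive caching settings for a partition: wait for 10
# seconds of silence after a change before reprocessing, never wait
# longer than 10 minutes in total, and serve the stale MOLAP cache for
# at most 30 minutes before falling back to the relational source.
PROACTIVE_CACHING_SETTINGS = """
<ProactiveCaching>
  <SilenceInterval>PT10S</SilenceInterval>
  <SilenceOverrideInterval>PT10M</SilenceOverrideInterval>
  <Latency>PT30M</Latency>
</ProactiveCaching>
"""
```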

These events are known as facts and are stored in a fact table.

Dimensions categorize and describe data warehouse facts and measures in ways that support meaningful answers to business questions. A data warehouse organizes these descriptive attributes as columns in dimension tables.

It's hard to give you specific guidance on your best setup, but in any case there is no such thing as a trigger writing to an OLAP cube. Instead, you need to look into the different storage modes for your cubes:

1. With MOLAP, see the proactive caching options above.
2. There is also a ROLAP (Relational OLAP) storage mode, which stores the aggregations in indexed views in a relational database. There is no copy of the data in the cube, but the data is available in the cube immediately. When the source data is updated, the aggregations are processed again.
3. If ROLAP performance is too slow, there is a third option called HOLAP (Hybrid OLAP), which stores the data in the relational database but the aggregations in the OLAP database.
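In short, the three storage modes differ in where the detail data and the aggregations live:

Storage mode | Detail data                      | Aggregations
MOLAP        | copy stored in the cube          | stored in the cube
ROLAP        | stays in the relational database | indexed views in the relational database
HOLAP        | stays in the relational database | stored in the OLAP database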