Dynamic Cache Management Technique
computer science crazy
Joined: Dec 2008
22-09-2008, 10:13 AM
The memory hierarchy of high-performance and embedded processors has been shown to be one of the major energy consumers. Extrapolating current trends, this share is likely to increase in the near future. In this paper, a technique is proposed that adds a small mini-cache, called the L0-cache, between the I-cache and the CPU core. This mechanism can supply the instruction stream to the datapath and, when managed properly, can largely eliminate the need for high utilization of the larger, more power-hungry I-cache.
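The filter-cache arrangement described above can be sketched in a few lines. The following is a hypothetical simulation, not the paper's model: each instruction fetch probes a tiny L0-cache first and falls back to the I-cache only on an L0 miss, so a tight loop that fits in the L0 pays the larger cache's energy cost only on its first pass. The sizes and relative energy numbers are illustrative assumptions.

```python
# Hypothetical sketch of an L0 filter cache sitting between the CPU core and
# the I-cache. Parameters below are assumptions for illustration only.

L0_LINES = 8          # assumed: tiny direct-mapped L0-cache, 8 lines
LINE_SIZE = 16        # bytes per cache line
E_L0, E_L1 = 1, 10    # assumed relative energy per L0 / I-cache access

def fetch(addr, l0, stats):
    """Fetch one instruction address through the L0 -> I-cache hierarchy."""
    line = addr // LINE_SIZE
    idx = line % L0_LINES
    stats["energy"] += E_L0              # the L0 is probed on every fetch
    if l0[idx] == line:                  # L0 hit: the I-cache stays idle
        stats["l0_hits"] += 1
    else:                                # L0 miss: pay for an I-cache access
        stats["energy"] += E_L1
        l0[idx] = line                   # refill the L0 line

l0 = [None] * L0_LINES
stats = {"energy": 0, "l0_hits": 0}
# A tight loop re-executes the same four cache lines, so after the first pass
# every fetch hits the L0 and the I-cache is never touched again.
for _ in range(100):
    for addr in range(0, 64, 4):
        fetch(addr, l0, stats)
```

With these assumed numbers, 1600 fetches incur only 4 I-cache accesses, which is the effect the technique relies on: hot code lives in the cheap cache.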
Cache memories are accounting for an increasing fraction of a chip's transistors and overall energy dissipation. Current proposals for resizable caches fundamentally vary in two design aspects: (1) cache organization, where one organization, referred to as selective-ways, varies the cache's set-associativity, while the other, referred to as selective-sets, varies the number of cache sets, and (2) resizing strategy, where one proposal statically sets the cache size prior to an application's execution, while the other allows for dynamic resizing both across and within applications.
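The two resizable-cache organizations named above can be contrasted with a small size calculation. This is a hypothetical sketch with illustrative parameters: selective-ways shrinks the cache by disabling associativity ways, while selective-sets shrinks it by halving the number of sets (dropping index bits); both can reach the same reduced size by different routes.

```python
# Hypothetical comparison of selective-ways vs. selective-sets downsizing.
# The full-cache configuration below is an assumption for illustration.

LINE = 32  # bytes per cache line

def cache_size(sets, ways):
    """Total capacity in bytes of a set-associative cache."""
    return sets * ways * LINE

FULL_SETS, FULL_WAYS = 256, 4                    # assumed 32 KB full cache
full = cache_size(FULL_SETS, FULL_WAYS)

# selective-ways: keep all 256 sets, enable only 2 of the 4 ways
sw = cache_size(FULL_SETS, 2)
# selective-sets: keep all 4 ways, index into only 128 of the 256 sets
ss = cache_size(128, FULL_WAYS)
```

Both variants halve the 32 KB cache to 16 KB, but they differ in what is preserved: selective-ways keeps the index width (same set mapping, less conflict tolerance), while selective-sets keeps the associativity (same conflict tolerance, different index bits).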
Five techniques are proposed and evaluated that dynamically analyze the program's instruction access behavior and proactively guide the L0-cache. The basic idea is that only the most frequently executed portion of the code should be stored in the L0-cache, since this is where the program spends most of its time.
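The hot-code filtering idea above can be illustrated with a short profiling sketch. This is a hypothetical example, not one of the paper's five techniques: count how often each basic block executes in a trace and treat only the blocks above a frequency threshold as L0-cache candidates. The trace and threshold are assumptions.

```python
# Hypothetical sketch: pick L0-cache candidates by basic-block frequency.
from collections import Counter

def hot_blocks(trace, threshold):
    """Return the set of basic-block ids executed at least `threshold` times."""
    counts = Counter(trace)
    return {block for block, n in counts.items() if n >= threshold}

# A loop body (blocks 1 and 2) dominates a trace with occasional cold code.
trace = [0] + [1, 2] * 50 + [3]
candidates = hot_blocks(trace, threshold=10)
```

Here blocks 1 and 2 each execute 50 times and qualify, while the cold entry and exit blocks (0 and 3) do not, matching the principle that the program spends most of its time in a small fraction of the code.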
Results of the experiments indicate that more than 60% of the energy dissipated in the I-cache subsystem can be saved.
seminar project explorer
Active In SP
Joined: Feb 2011
03-03-2011, 11:27 PM
The memory hierarchy of high-performance and embedded processors is one of the major energy consumers. The article describes a technique that introduces a new mini-cache, the L0-cache, located between the level-1 instruction cache (I-cache) and the processor core. When managed properly, the L0-cache can supply the instruction stream to the datapath, reducing utilization of the larger, more expensive I-cache.
The proposals for resizable cache designs are broadly classified into two types:
-Based on cache organization:
Here the cache's set-associativity is varied by the selective-ways organization, while the number of cache sets is varied by the selective-sets organization.
-Based on resizing strategy:
Here one proposal statically sets the cache size prior to an application's execution, while the other allows dynamic resizing both across and within applications.
The techniques used for the dynamic management of the cache include:
-Simple Method
-Dynamic Confidence Estimation Method
-Restrictive Dynamic Confidence Estimation Method
-Dynamic Distance Estimation Method
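The confidence-estimation flavor of these methods can be sketched as follows. This is a hypothetical illustration of the general idea, not the paper's exact algorithm: a saturating counter tracks how well recent branches were predicted, and instruction fetch is steered to the L0-cache only while confidence is high, i.e. while the processor is likely inside a hot, well-behaved loop. The counter width and threshold are assumptions.

```python
# Hypothetical sketch of confidence-based L0-cache steering. The 3-bit
# counter and the threshold of 5 are illustrative assumptions.

MAX_CONF = 7        # assumed 3-bit saturating confidence counter
THRESHOLD = 5       # fetch from the L0 only at or above this confidence

def steer_fetches(branch_outcomes):
    """Given a trace of branch predictions (True = correctly predicted),
    return, per fetch, whether the L0-cache is accessed."""
    conf = 0
    decisions = []
    for correct in branch_outcomes:
        conf = min(conf + 1, MAX_CONF) if correct else 0  # mispredict resets
        decisions.append(conf >= THRESHOLD)
    return decisions

# A run of correct predictions builds confidence; one misprediction sends
# fetches back to the regular I-cache until confidence recovers.
trace = [True] * 8 + [False] + [True] * 8
use_l0 = steer_fetches(trace)
```

The restrictive variant listed above would tighten when the L0 is consulted further; the distance-estimation method instead keys off how far execution has strayed from recently cached code.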