A Unity interceptor that seamlessly introduces caching for parallel method invocations.
The solution involves using the Unity interception mechanism to introduce behaviours, and building out the parallel cache behaviour using Reactive Extensions.
The interception behaviour uses the parameters of the method call (along with method metadata) to generate a cache key. This implementation has its limitations: it works well with value type parameters, but reference type parameters need to implement IEquatable&lt;T&gt; for the system to be able to generate a cache key.
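To illustrate the idea (the original is C#; this is a minimal Python sketch, and the function name `make_cache_key` is my own), a key can be derived from the method name plus the equality semantics of each argument. The same caveat applies: arguments without meaningful equality produce unreliable keys, which is the IEquatable limitation mentioned above.

```python
import hashlib

def make_cache_key(method_name, args):
    """Build a cache key from a method name and its arguments.

    This relies on each argument having stable, meaningful equality
    (the analogue of value types or IEquatable<T> reference types in
    the C# original). repr() stands in for that here; objects with a
    default identity-based repr would yield unstable keys.
    """
    raw = method_name + "|" + "|".join(repr(a) for a in args)
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()
```

Two calls with equal arguments map to the same key, so a second invocation can be served from cache; any argument difference produces a distinct key.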
The key component of this design is enabling it in a parallel invocation scenario. The invocations can take a significant amount of time to process, and we do not want to issue a new request while a similar request is in flight. Enter Reactive Extensions. Using Reactive Extensions, the parallel cache behaviour subscribes to any matching in-flight request and waits for its result. When the first invocation completes, all waiters are notified and the same result is dispatched to every subscriber.
http://codereview.stackexchange.com/questions/16075/improve-parallel-cache-with-reactive-extensions-unity-interception
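The dedup-in-flight behaviour can be sketched without Rx at all. The following is a minimal Python illustration of the same idea (the class name `ParallelCache` and its API are my own, not from the original code): the first caller for a key runs the expensive invocation, and concurrent callers with a matching key block until that single invocation completes and then receive the same result.

```python
import threading

class ParallelCache:
    """Deduplicate concurrent invocations per cache key: the first
    caller computes the value, later callers with the same key wait
    on the in-flight computation instead of issuing a new request."""

    def __init__(self):
        self._lock = threading.Lock()
        self._inflight = {}   # key -> Event signalled when the result lands
        self._results = {}    # key -> cached value

    def get_or_add(self, key, factory):
        with self._lock:
            if key in self._results:        # already cached: fast path
                return self._results[key]
            event = self._inflight.get(key)
            if event is None:               # we are the first caller
                event = threading.Event()
                self._inflight[key] = event
                owner = True
            else:                           # a matching request is in flight
                owner = False
        if owner:
            value = factory()               # expensive call runs exactly once
            with self._lock:
                self._results[key] = value
                del self._inflight[key]
            event.set()                     # notify all waiting subscribers
            return value
        event.wait()                        # wait for the first invocation
        return self._results[key]
```

This mirrors what the Rx-based behaviour achieves with subscriptions: one publisher per key, many subscribers receiving the single result.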
I also noticed that there is a similar project on GitHub (linked below), but it was too complex for my needs. It does appear to have a more extensive locking mechanism; I've relied on the Parallel Extensions in .NET 4 to avoid rolling my own locking mechanisms.
https://github.com/reactiveui/ReactiveUI/blob/master/ReactiveUI/ObservableAsyncMRUCache.cs