C# Singleton Collection Objects and ASP.NET

Bohica69

Gawd
Joined
Jul 12, 2005
Messages
676
I'm working on a project that has a need for a bunch of dropdowns that we are going to pull from the db.

So my thought is this: make a singleton collection for each table that corresponds to a dropdown; then I only have to hit the db one time. The options in the dropdowns are, for all intents and purposes, fairly static - that is to say, they can change occasionally, and restarting IIS when they do is acceptable.

Basically, what I'm thinking is that when the collection is first called, its constructor goes out, calls the db, and builds the collection. Then each time it's called after that, it will already be in memory. The methods for adding to and deleting from the collection are private to the collection itself - I don't want anybody to be able to add or delete values from the collection once it's built.
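Something like the following is what I have in mind - this is just a minimal sketch, and the class name, member names, and sample data (StateList, LoadFromDb, the state strings) are placeholders, not the real project's code:

```csharp
using System;
using System.Collections.Generic;
using System.Collections.ObjectModel;

// Lazily initialized singleton that loads its list from the db exactly once.
public sealed class StateList
{
    private static readonly object _sync = new object();
    private static StateList _instance;

    private readonly ReadOnlyCollection<string> _items;

    // Private constructor: this is the one place the db gets hit.
    private StateList()
    {
        _items = new ReadOnlyCollection<string>(LoadFromDb());
    }

    public static StateList Instance
    {
        get
        {
            // Double-checked locking so concurrent page requests
            // don't each build (and query for) their own copy.
            if (_instance == null)
            {
                lock (_sync)
                {
                    if (_instance == null)
                        _instance = new StateList();
                }
            }
            return _instance;
        }
    }

    // ReadOnlyCollection throws NotSupportedException on Add/Remove,
    // so callers can bind it to a dropdown but can't modify it.
    public IList<string> Items { get { return _items; } }

    private static IList<string> LoadFromDb()
    {
        // Stand-in for the real query against the lookup table.
        return new List<string> { "Alabama", "Alaska", "Arizona" };
    }
}
```

Binding is then just `myDropDown.DataSource = StateList.Instance.Items;` followed by `DataBind()`, and every page after the first one is served out of memory.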

My question is this: how concerned do I have to be about memory in this case? The lists I'm talking about aren't that big, and the objects in the lists don't have that many properties. Doing it this way seems MUCH more efficient than having each page build the objects each time when you have a lot of visitors, even though the memory usage of the application when nobody is using it will be higher because those objects are still in memory.

Your thoughts and opinions please.

Thanks.
 
If you were concerned about memory, I don't think you'd be using managed code. (That's probably too harsh, sure. But that's what I think of it.)

Using a little extra memory all of the time is preferable to hitting the database every single time the page is rendered. The database will use memory to do your query, it will do I/O and so on. Your query and the pages it touches in the database will be cached (if you have a DBMS that's worth a spit), but all that means is the DBMS is taking the extra memory for you instead of you doing it yourself. And I think the DBMS would use more than you would.

Also, if each connection is getting the data, at least for an instant that connection context has the data independently of the others. Which means that you've got multiple copies of the data, one for each connection. And that means your peak usage without the singleton cache is far more than with it -- nConnections*sizeof(menu_data) compared to 1*sizeof(menu_data).

If nobody is connected to the server, what would that memory be used for otherwise? How much memory is it in question?
 
If the lists aren't that big I don't think using the singleton pattern is going to buy you much in terms of performance. Hitting the db for small lists won't be an issue.

Oh, and having to restart the IIS server when you change the items in the list would be a giant HACK.
 
Stupendous said:
Oh, and having to restart the IIS server when you change the items in the list would be a giant HACK.
Nothing stops the singleton objects from sensing or being informed of an update and refreshing.
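For instance, the cache could expose a refresh method that an admin page or a timer calls when the lookup table changes - a rough sketch, with all names (MenuCache, Refresh, LoadFromDb) and the sample data invented for illustration:

```csharp
using System.Collections.Generic;
using System.Threading;

// Cached lookup list that can be rebuilt at runtime instead of
// restarting IIS when the underlying table changes.
public static class MenuCache
{
    private static IList<string> _items = LoadFromDb();

    public static IList<string> Items { get { return _items; } }

    // Called when the singleton is informed of an update.
    public static void Refresh()
    {
        IList<string> fresh = LoadFromDb();
        // Atomic reference swap: readers see either the old complete
        // list or the new complete list, never a half-built one.
        Interlocked.Exchange(ref _items, fresh);
    }

    private static IList<string> LoadFromDb()
    {
        // Stand-in for the real db query.
        return new List<string> { "Option A", "Option B" };
    }
}
```

Readers never take a lock; only the rare refresh pays the cost of another db hit.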
 
mikeblas said:
Nothing stops the singleton objects from sensing or being informed of an update and refreshing.

Of course nothing stops that, but his solution mentioned restarting the IIS server, which is a poor solution.
 
How much memory is it in question?

To be honest, I don't know exactly. What made me ask the question was that this morning, as I was working on the objects and unit testing them, I was watching the ASP.NET worker process in Task Manager as I refreshed the page, and with each refresh the worker process consumed more and more memory. It didn't always go up, but it was tending in that general direction. Now, figuring that I implemented the classes correctly (and I think I did, but I could be wrong - Profiler against SQL Server only sees one hit to the db), I would not have guessed that the worker process memory usage would climb with each refresh of the page. I figured the memory usage would move around, sure, but not as much as I was seeing, and like I said, it was definitely going up. My concern was that this was going to get deployed to production and cause problems.

Good point about informing the objects of an update - I hadn't considered that. Piggybacking on regularly scheduled maintenance isn't a hack - it's using limited resources wisely.
 
Your working set will grow and grow until GC happens. That's the way managed code works (as a generalization).
 