Visual Studio 2010 soooo slow!

So at work I maintain and fix bugs in a huge library of code for various backend services. The number of files is ridiculous: the code itself is grouped under a logical directory structure, but the header files are spread out across 300+ directories.

Typically we work on a small number of files, usually located in a single folder. Each "group" of code has its own makefile and pulls from the appropriate sources, of course.

I thought that instead of using vi and grepping for references across countless files, I could maybe use something like VC++ 2010 so that I would have access to a call hierarchy and other quick-referencing features.

So I set up the project, copied all of the code over, included it in the project... Then I had to set up the include path. Something like 300-400 folders of header files. I included them all and got rid of *most* of the invalid references.

The problem is that every time I edit/save a file, it seems to re-parse everything and go into a "scanning #includes" pass across 40,000 files. CPU spikes to 80-100% and this goes on for 15+ minutes. During this time none of the features work: no syntax highlighting, no IntelliSense, etc. I can still edit files, but I have none of the features that I wanted to begin with.

Am I stuck with using vi because this project is too complex for an IDE to handle? :(
 
While I expect there's a fix for this particular problem, if you're already working in vi then you might want to investigate Ctags, which will do source code cross-referencing.
 
What are your system specs?

When was the last time the hard drive was defragged? And NOT with the built-in Windows one unless you are running Windows 7; the defragger in every version past ME was total crap.

What about a chkdsk /f ? If the file system has issues, it can cause all sorts of funky problems.

How much RAM is VS2010 using when you have that project open?

What OS are you running?

VS2010 really shouldn't reparse everything when editing only a single file... at least it has never done that to me.

It could be some IDE setting that is causing it to be that slow.
 
Do you have any 3rd party refactoring tools or plugins running in VS? Try disabling those if you can.

Some of those can be pretty demanding on large codebases.
 
hmm... I'll have to investigate some of this.

I think the "scanning for #includes" thing is periodic, because I came into work this morning and saw it doing it again. It goes through 40,000+ files each time.

Maybe the include path is just too complicated? I pretty much did a recursive search for all folders containing *.h files, then chained those folders together into a huge include path, with the individual paths separated by semicolons of course. But there are something like 340 of these paths... I'd imagine that a neater and more efficient project would have its includes under a much smaller directory structure? I'm not sure... I've never really worked on a large codebase like this before.
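Just to illustrate what I have in mind (the directory and header names below are made up): a neater layout would put only the root of the header tree on the include path and spell out the rest in each #include, so there's one search directory instead of 340.

// Hypothetical layout: only the repository root is on the include path,
// and each #include names the header relative to that root instead of
// relying on hundreds of separate include directories.
#include "billing/invoice.h"        // rather than adding ...\billing to the path
#include "network/tcp_listener.h"   // rather than adding ...\network to the path
#include "common/logging/logger.h"  // rather than adding ...\common\logging

int main()
{
    // One search directory keeps both build times and the IDE's
    // include scanning manageable.
    return 0;
}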

System specs aren't good...

Pentium 4 2.80 GHz
2 GB of cheap RAM
Probably a slow 5400 RPM HD

Can't do much about that, it's what I was given to work with. :(

I suppose a defrag might be in order.

Thanks for the suggestions guys!
 
I remember using VS 2003 and it was slow; 2010 is probably worse.

I prefer using Notepad++ and a command-line compiler. It makes for a much simpler environment to work in, and you don't need a huge hog of a program open just to code.
 
^ With this code, having the ability to right-click on variables and open their definitions is a huge plus when it comes to tracing through it and understanding how it works. Even the tooltips are a big help: I know the type of something just by hovering over it. Things are so deeply nested here that it sometimes takes forever just to find out what a single line of code is doing, because you have to check 20 referenced files. If I can get to those files via call hierarchies and right-clicks, that saves A TON of time grepping around.

If I were writing my own code from scratch, then yeah, Notepad++ would do. But I'm working on code dating back to '98 that has been worked on by many different people, so tracing through it and understanding it is a huge part of the effort!

So anyway... I wonder if there is a way to turn off the periodic "Scanning #includes for additional files", because that is what is really slowing me down.
 
Ah wow, yeah, I see. Actually I'd love a program with those tooltips and drop-down menus, as I tend to forget what I named my own functions, so being able to type "object." and get a drop-down would be nice. But I have yet to find an app that isn't slow doing it, so I gave up.
 
Actually I think I found the option I was looking for. "Rescan solution interval". I set it to 0. Let's see if that helps.
 
2010 C++ coding does seem to drag a bit compared to 2008, even with simple programs. I haven't had much of a chance to play with it since I'm working on a few projects under 2008. This is on my 2.8 GHz C2D, 4 GB laptop running Win7.
 
Why can't companies realize that they would get much more production out of a faster system? It's not like a decent computer costs thousands of dollars anymore.

http://www.officedepot.com/a/products/508866/HP-Pavilion-p6510f-Desktop-Computer-With/
$480

And at Officemax, the same computer is even cheaper at a measly $450.
http://www.officemax.com/technology/computers/desktop-computers/product-prod3030482?history=hoo0g4v0|categoryId~10004^categoryName~technology^parentCategoryID~category_root^prodPage~25^region~1@3jdkliky|categoryId~283^categoryName~computers^parentCategoryID~cat_10004^prodPage~25^region~1^refine~1@x02tkgyi|prodPage~15^refine~1^region~1^categoryName~Desktop+Computers^categoryId~324^parentCategoryID~cat_283@ztd1vw4l|prodPage~15^sort~Price+%28Low-High%29^refine~1^position~1^region~1

The last big company I worked with wouldn't upgrade their systems either.... it was horrible... They even had people that had to use AutoCAD using old Dell P4 systems with onboard video with 512MB of RAM. SOOOOOOOOOO Stupid.
 

I agree with this. Sadly companies just like to blame the economy and refuse to spend money on things they should be spending it on, yet turn around and spend it on really stupid stuff.

For example, the client I spend most of my time working for is always finding excuses not to buy IT things they really need, yet they turned around and spent $30K to outsource the development of their local intranet. They are locked into a contract where they are not allowed to modify it themselves; they have to call the company each time, and it usually costs about $1K.

Meanwhile, there are still some old MDG P4's in production. Every now and then we even find a P3 or lower.

That's another problem: they have WAY too many different models of computers. Instead of buying 500 computers at a time, they only buy 1-2 as needed, so we end up with like 50 different models. I feel sorry for the L2 guy sometimes; he has so many different images to juggle, it's not even funny. There are like 3 types of RAM in use, like 10 different power supplies, etc... parts are hardly interchangeable. They don't even stick to one company. Sometimes they go MDG, sometimes they go IBM, other times they go HP... it's ridiculous.
 

Visual Studio 2010 is DEFINITELY a good bit heavier than 2008. I haven't done any C++ with it yet, but all the C# projects I've tested don't load or compile as fast as they did in 2008.

That said, Visual Studio 2010 is fucking amazing. There's so much in this tool, and it's so well thought out. It's an example of how Microsoft, when focused, is the best software maker on the planet.
 
You should take a look at your code to see whether you're including headers everywhere. Always try to use forward declarations when possible. Frankly, because of its massively flawed include propagation, C++ is shit-tier in this respect, which is why you have to be very careful when adding a new include: structure your code so that the base-level headers at the root of the include tree rarely need to be touched. If you don't, and your code is more than 1,000 lines, you'll get bitten in the ass really quickly.

tl;dr - improperly structured code will increase your compile and IDE scan times by up to orders of magnitude.
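As a quick sketch of what I mean (Widget and Manager are made-up names): a header that only touches a type through pointers or references can forward-declare it and let the .cpp do the actual #include, so edits to widget.h don't drag a re-parse through everything that includes manager.h.

// widget.h (hypothetical) -- imagine a heavy header with many includes of its own
#pragma once
class Widget
{
public:
    void refresh();
    // ... lots of members and nested #includes in real life
};

// manager.h -- forward-declares Widget instead of including widget.h
#pragma once

class Widget;  // forward declaration: enough, because this header only
               // refers to Widget through pointers and references

class Manager
{
public:
    Manager() : current_(0) {}
    void attach(Widget* w);   // fine with an incomplete type
    Widget* current() const;  // also fine
private:
    Widget* current_;         // pointer member: no full definition needed
};

// manager.cpp -- only the .cpp pulls in the full definition
#include "manager.h"
#include "widget.h"  // a change to widget.h now rebuilds manager.cpp,
                     // not every file that happens to include manager.h

void Manager::attach(Widget* w) { current_ = w; }
Widget* Manager::current() const { return current_; }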
 