- Sep 6, 2011
Yeah right, you write a better book then. It helped me get my A+ certification as well. You're just mad because it doesn't tell you how to break your computer by overclocking it. By the way, I have computers old enough to prove that overclocking is bad: they haven't been overclocked and have been running for up to 12 years without any problems. She tells you everything you need to know in that book and doesn't BS about Microsoft's history like Mike Meyers does. She even has a PhD, which makes her more credible than most if not all other authors.

You are in fact wasting your time overclocking. You don't overclock a server or a computer used as a business client, because it just causes problems; no matter how stable you can get the thing to run, it will never be 100% stable. I could go on and on about how it shortens the life of the computer, and how you waste money buying new hardware because your computer died prematurely due to overclocking. The fact is that the components of the computer were not designed to handle those clock speeds.

Do what you want with your hardware, but I know from experience that overclocking can actually decrease performance in some ways. You should at least benchmark before and after overclocking to get a good idea of how much performance you actually gain, instead of overclocking everything right out of the box. Eventually, though, you'll get the idea that it's not worth the trouble, and that it's better just to have a completely stable computer that may only crash due to file corruption in the software from viruses, malware, or similar files that corrupt the system, which is also part of the problem.
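For what it's worth, the "benchmark before and after" advice above can be done with a trivial script. This is only a toy sketch (the workload, function name, and iteration count are made up for illustration, not a real benchmark suite like Cinebench or Geekbench): run the same CPU-bound loop at stock clocks and again after overclocking, and compare the wall-clock times.

```python
import time

def cpu_benchmark(iterations=2_000_000):
    """Toy CPU-bound workload: sum of square roots.

    Returns elapsed wall-clock seconds for the loop. Run it once at
    stock clock speeds and once after overclocking, then compare.
    """
    start = time.perf_counter()
    total = 0.0
    for i in range(1, iterations):
        total += i ** 0.5
    return time.perf_counter() - start

# Example: record a baseline timing at stock clocks.
baseline = cpu_benchmark()
print(f"workload took {baseline:.3f}s")
```

If the post-overclock timing is only a few percent lower than the baseline (or worse), the stability risk the comment describes probably isn't worth it.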
I am not sure if you are a really smart troll or just really dumb.
I do not care to comment on anything you have said except to say you are 100% wrong with your opinions.
EDIT: Congrats on your A+; it is the start of an IT career. Perhaps over the course of that career you will actually learn something.