Microsoft Announces DX12 Support for Windows 7

The problem is that I don't trust their updates; they break things a lot. I get why they push them (I use Linux and manually update quite often), but with Windows, not so much. They are doing the best they can, though.


I don't either. I think they messed this shit up. But it's better to take the risk of getting a bad update than to leave known exploits open.
 
I don't either. I think they messed this shit up. But it's better to take the risk of getting a bad update than to leave known exploits open.
I agree, which is why I switched to Linux, a much more secure OS whose updates I trust and apply often. It's too bad; I only recently switched, for that reason.
 
For people that are using their machines just for gaming and no browsing or other internet activities?
Depending on how you rate things, I find it pretty awesome that when you try to block all Windows telemetry IPs or domains in the hosts file, it simply gets ignored under Windows 10.

So much for "supported security" in Windows 10; that makes me feel much safer.

It would be pretty much impossible to game without internet these days. Suggesting going with an almost-dead OS is hardly a good idea security-wise.
 
As Skydriver pointed out, Windows 7 goes EoL in January 2020... so what's the point?
Who cares? Updates aren't the be-all and end-all; besides, it's half the market. W10 isn't a valid option for many, and isn't much longer-lived either.
 
Who cares? Updates aren't the be-all and end-all; besides, it's half the market. W10 isn't a valid option for many, and isn't much longer-lived either.

Read the first page of this thread to find the people who care. They certainly exist and have already made their opinions known in this thread.
 
Read the first page of this thread to find the people who care. They certainly exist and have already made their opinions known in this thread.
I didn't mean that I didn't care about the DX12 change of policy; I was just throwing my hat into the W7 corner. MS can take a hike with the rubbish fire that has been W10.
 
I didn't mean that I didn't care about the DX12 change of policy; I was just throwing my hat into the W7 corner. MS can take a hike with the rubbish fire that has been W10.

Oh I know. I just wanted to answer your question since it was such an easy one to answer.
 
Instead of seeing it for what it is, there are a lot of misplaced feelings in this thread, whether about OS preference, "oh damn, I already switched," or flat-out off-topic debate. I'm just going to chime in here: MS is doing this for one reason only. Developers have been asking for it, customers have been asking for it, but the sole reason MS is doing this is publicity. Make no mistake, they are giving the keys only to developers who they trust will make sure Windows 10 performs better on any possible benchmark that may result from this limited access. It's all a controlled environment, so they can point out "hey, look, we were right, Windows 10 is faster" in a last-ditch effort to bring people on board. It's a pathetic move, but I'm happy to see something come out of it.
 
This doesn't make sense. Isn't DX12 one of the tools to get you to migrate to 10? I'm for it, but I don't see the benefit if it's only good till 2020. What am I missing?
It makes perfect sense. Microsoft already got enough customers to switch to Windows 10. The portion of the population that would consider going from 10 back to 7 because of this is insignificant at this point. All new laptops ship with 10, so Microsoft doesn't have any skin left in the game of keeping DX12 a Win10 exclusive.
 
It would be pretty much impossible to be gaming without internet these days. Suggesting going with almost dead OS is hardly a good idea security wise.

When the alternative is a live OS that disregards simple things that should matter for security, what exactly are you gaining under Windows 10? You can push traffic through the firewall as long as you can DNS-spoof, because you know which domains are unfiltered. That means you can get a billion and one security patches, but you cannot patch what the OS ignores by default.

So limiting your exposed connection to the gaming client, and skipping browsing and other activities, is not that bad.
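The hosts-file complaint above is worth unpacking: a hosts file only stops software that resolves names through the system resolver, and a client that ships with hardcoded IPs (or its own resolver) never consults it. A minimal Python sketch of the idea, using made-up domain names and IPs:

```python
# Sketch of why hosts-file blocking is bypassable.
# The domain and IPs here are illustrative, not real endpoints.

HOSTS_FILE = {"telemetry.example.com": "0.0.0.0"}        # our "block" entry
UPSTREAM_DNS = {"telemetry.example.com": "203.0.113.10"}  # what DNS would say

def resolve(name):
    """A well-behaved resolver: consult the hosts file before DNS."""
    return HOSTS_FILE.get(name) or UPSTREAM_DNS[name]

# A well-behaved client is blocked: the lookup returns the null address.
assert resolve("telemetry.example.com") == "0.0.0.0"

# A client with a hardcoded endpoint skips name resolution entirely,
# so the hosts entry is never consulted and the "block" does nothing.
hardcoded_endpoint = "203.0.113.10"
assert hardcoded_endpoint != resolve("telemetry.example.com")
```

Blocking at the network edge instead (a firewall or DNS server you control, such as the pfSense setup suggested later in this thread) avoids the problem, because the traffic is dropped regardless of how the client resolved the address.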
 
But speculative execution flaws can be patched via firmware and OS updates... so it's not really the same thing. If Intel were to say "we are no longer providing updates for Core Duo processors," then yes, people would be stupid to continue using them.

The point is valid: anyone willing to run an OS without security updates is no better than your average anti-vaxxer. You're not just putting your own data at risk; you're putting everyone else at risk as well. Even if all you do is game and you don't trust it with your banking info, etc., becoming part of a botnet or helping to spread other crap isn't OK.

For now, Windows 7 has updates... and for people willing to pay for them over a couple of years, go ahead. But today, if you're a moron running Windows XP connected to the net, you deserve to catch some hate.

Except that it can't be patched, only mitigated, so you aren't actually protected; you just think you are.
 
That's kind of disingenuous.

Meltdown has at this point been more or less completely mitigated in software (albeit at a performance hit, though less of one now than with the first mitigations). Spectre is harder to mitigate, but also much more difficult to take advantage of.

There are some areas of real concern (for instance, untrusted guest operating systems running inside VMs on affected hosts), but this is not a realistic attack vector for most users, and thus much less of a concern.

Furthermore, the fact that one thing has security problems is not a reason to ignore easily preventable security problems in something else. This line of reasoning is just fundamentally illogical.

Meltdown has, but not Spectre. You are still vulnerable even with the mitigations (noted on page 7 of the whitepaper), and my point still stands.
 
It makes perfect sense. Microsoft already got enough customers to switch to Windows 10. The portion of the population that would consider going from 10 back to 7 because of this is insignificant at this point. All the laptops are shipping with 10.. so Microsoft doesn't have any skin left in the game of keeping DX12 as Win10 exclusive.
"Perfect sense"? Fine, you say MS doesn't have skin in the game keeping DX12 off 7. Let's assume that's true. Here are the options:

Do nothing = some users may still trickle into 10 later + no additional effort for MS
Release DX12 for Win7 four years after saying that's impossible = confuse everyone, add additional support costs for at least a year, more work

I'm not seeing the perfect sense.
 
Meltdown has, but not Spectre. You are still vulnerable even with the mitigations (noted on page 7 of the whitepaper), and my point still stands.

I feel like I addressed this very point in my comment. Spectre is not yet fully mitigated, and there is some question as to whether it can be at all without a hardware redesign.

The flipside of that coin is that it is much harder to take advantage of, and doesn't really apply to the use cases of typical consumers.
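For readers wondering where their own machines stand in this debate: on Linux, the kernel reports per-vulnerability mitigation status under `/sys/devices/system/cpu/vulnerabilities/` (on Windows, the SpeculationControl PowerShell module plays a similar role). A small Python sketch that reads and interprets those status strings; the directory path is the real kernel interface, but the sample strings in the asserts are only illustrative of its reporting format:

```python
import glob
import os

def read_vuln_status(base="/sys/devices/system/cpu/vulnerabilities"):
    """Collect the kernel's status string for each known CPU vulnerability
    (Linux only; this directory exists on kernels from early 2018 onward)."""
    status = {}
    for path in glob.glob(os.path.join(base, "*")):
        with open(path) as f:
            status[os.path.basename(path)] = f.read().strip()
    return status

def is_handled(status_line):
    """'Not affected' and 'Mitigation: ...' count as handled;
    strings starting with 'Vulnerable' do not."""
    return status_line.startswith(("Not affected", "Mitigation:"))

# Illustrative strings in the kernel's reporting format:
assert is_handled("Mitigation: PTI")   # Meltdown with page-table isolation
assert is_handled("Not affected")
assert not is_handled("Vulnerable")
```

This makes the distinction in the posts above concrete: "Mitigation:" means the kernel is working around the flaw (often at a cost), not that the hardware flaw itself is patched.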
 
What makes it dead? XP is dead; W7 isn't.

I said almost dead because it will be EOL soon. EOL means officially dead for something like an OS, or anything security-related; it doesn't matter if people keep using it.
 
I said almost dead because it will be EOL soon. EOL means officially dead for something like an OS, or anything security-related; it doesn't matter if people keep using it.

It won't be for another year, and they just released DX12. Doubt it.
 
It won't be for another year, and they just released DX12. Doubt it.

Unless it goes the way of XP, I highly doubt it has much life left once security patches stop. It's also less than a year away: W7 is scheduled to go EOL in January 2020. I honestly haven't looked back once I finally pulled the trigger on W10. I went with Pro, and you can turn off most of the annoying things. Updates happen at night for me and haven't bothered me. With the several dozen machines I have in the house, I've never had any issues. Not sure if it's really just paranoia or hearsay, like how Vista was supposedly so bad and slow. I never had any Home editions, so I suppose those could be crappy; I can't speak to that.
 
Unless it goes the way of XP, I highly doubt it has much life left once security patches stop. It's also less than a year away: W7 is scheduled to go EOL in January 2020. I honestly haven't looked back once I finally pulled the trigger on W10. I went with Pro, and you can turn off most of the annoying things. Updates happen at night for me and haven't bothered me. With the several dozen machines I have in the house, I've never had any issues. Not sure if it's really just paranoia or hearsay, like how Vista was supposedly so bad and slow. I never had any Home editions, so I suppose those could be crappy; I can't speak to that.

Windows Vista was slow compared with what came before it. It wasn't paranoia or hearsay, as you claim. I've dual-booted Vista and XP on the exact same hardware, and in every one of the numerous tests I threw at it, it was factually slower, in some cases by a very large margin too. I'm talking more than twice as slow.
 
I'm talking more than twice as slow.

This is true, at least initially. Patched Vista seemed competitive and even indistinguishable from Windows 7. This of course happened as Windows 7 was developed, because the two are essentially the same operating system with slight UI differences.

Windows 7 exists because they needed something 'new' to market after the initial blunders with Vista.
 
Windows Vista was slow compared with what came before it. It wasn't paranoia or heresay as you claim. I've dual booted Vita and XP both on the exact same hardware, and in every numerous test I've thrown at it, it was factually slower. And in some cases by a very large degree too. I'm talking over twice as long slow.

This is bullshit. Vista was plagued mainly by hearsay and shitty misadvertising, as well as a lack of knowledge. How many people have you heard bitch about Vista who, when asked, had never actually used it? Of course it ran worse on machines designed for XP (like those with 512MB of RAM). It did, however, run well on mid-range hardware or devices actually designed for it. Microsoft shot themselves in the foot with "Vista Capable," as sellers were selling the wrong hardware, which barely met minimum spec. The same issue arose when netbooks came out: wrong hardware, wrong expectations, and predatory salespeople, coupled with the usual consumers who know nothing about what they are purchasing (and of course nobody could be bothered to read up online).

I ran Vista on a Core 2 Duo and later a Quad, and it worked great (4GB and 8GB of RAM, respectively). Vista ran fine on 2GB or more of RAM, which was not an issue for desktops; laptops, however, were on a much slower improvement curve, so they suffered from shitty hardware for longer. I only used Vista Ultimate 64-bit, so I cannot speak for its 32-bit counterpart, but the 64-bit one was a very decent and stable OS after SP1. For comparison, XP only became decent and stable after the SP2 release. When XP came out, it was just as much of a mess, and it took years for Microsoft and software/hardware vendors to get their shit together, while people cried about how they were better off on Win98 or Win2K. You have to realize that most vendors didn't bother getting 64-bit drivers out despite having access to the Vista beta. Vista had newer security and driver models, which did in fact help fix a lot of the malware infestations that were so common on XP.

I ran a tech shop at CompUSA back at that time, so I know all about this stuff, as well as having my own experience servicing and using Vista. The same story goes for pretty much all operating systems: a newer OS is not going to run better on the same hardware you've been using. Same deal with Windows, same deal with Mac.
 
Unless it goes the way of XP, I highly doubt it had much life left once security patches stop. It's also far less than a year left, W7 is scheduled to go EOL this December. I honestly haven't looked back once I finally pulled the trigger on W10. I did go with a Pro and you can turn off most of the annoying things. Updates happen at night for me and haven't bothered. With several dozen machines I have in the house, I've never had any issues. Not sure if it's really just paranoia or hearsay like how Vista was so bad and slow. I never had any Home editions so I suppose these can be crappy, I can't speak for that.
I bet they'll extend it. I went with LTSB and I still hated it. Performance was fine, but it often broke the tweaks that were required (fuck the Start menu and the ads on it), so I decided to switch to Linux. I haven't looked back; I hate W10's spying and its refusal to respect the hosts file when blocking MS.
 
We received four builds of Windows Vista within a space of a week before golden master. There were two 'official' pre-golden releases and two 'unofficial' pre-golden releases. The two official ones ended up being as bad as the actual golden master we received. The two unofficial ones worked great.

Microsoft has only themselves to blame for how bad Vista was at launch.

When they released the public preview for Windows 7 (we were no longer official beta testers for Windows products at this point), I installed it on my personal machine, and it worked almost flawlessly until Microsoft forced it to expire. While I am using Windows 10 (Insider Edition), which, for the most part, works as well as beta software can, I still prefer Windows 7.
 
Hi All

I remember the huge debate over Microsoft's ability to implement DX12 in Windows 7.
 
This is bullshit. Vista was plagued mainly by hearsay and shitty misadvertising, as well as a lack of knowledge. How many people have you heard bitch about Vista who, when asked, had never actually used it? Of course it ran worse on machines designed for XP (like those with 512MB of RAM). It did, however, run well on mid-range hardware or devices actually designed for it. Microsoft shot themselves in the foot with "Vista Capable," as sellers were selling the wrong hardware, which barely met minimum spec. The same issue arose when netbooks came out: wrong hardware, wrong expectations, and predatory salespeople, coupled with the usual consumers who know nothing about what they are purchasing (and of course nobody could be bothered to read up online).

I ran Vista on a Core 2 Duo and later a Quad, and it worked great (4GB and 8GB of RAM, respectively). Vista ran fine on 2GB or more of RAM, which was not an issue for desktops; laptops, however, were on a much slower improvement curve, so they suffered from shitty hardware for longer. I only used Vista Ultimate 64-bit, so I cannot speak for its 32-bit counterpart, but the 64-bit one was a very decent and stable OS after SP1. For comparison, XP only became decent and stable after the SP2 release. When XP came out, it was just as much of a mess, and it took years for Microsoft and software/hardware vendors to get their shit together, while people cried about how they were better off on Win98 or Win2K. You have to realize that most vendors didn't bother getting 64-bit drivers out despite having access to the Vista beta. Vista had newer security and driver models, which did in fact help fix a lot of the malware infestations that were so common on XP.

I ran a tech shop at CompUSA back at that time, so I know all about this stuff, as well as having my own experience servicing and using Vista. The same story goes for pretty much all operating systems: a newer OS is not going to run better on the same hardware you've been using. Same deal with Windows, same deal with Mac.

Nope, you're wrong; I used a stopwatch. Vista took more than twice as long as XP to boot to the desktop, with fresh installs on the exact same machine. You weren't even there when these tests were performed, so you are not in a position to say how it actually went down.
 
This is true, at least initially. Patched Vista seemed competitive and even indistinguishable from Windows 7. This of course happened as Windows 7 was developed, because the two are essentially the same operating system with slight UI differences.

Windows 7 exists because they needed something 'new' to market after the initial blunders with Vista.

This is completely and totally untrue. Vista was a resource hog with no reason to be hogging those resources, and it was quite different from Win7. You needed at least 2 gigs of RAM to make Vista usable at even a basic level; you did not need that with Win7. When Win7 was released, you could put it on XP-era hardware and it had no problems at all running, and in most cases that I saw, it ran faster and more efficiently than XP ever did. When Win7 came out, a buddy of mine wanted me to put it on his laptop, which was a single-core AMD system with 512 megs of RAM running XP. Win7 ran better on that laptop than XP ever did, and that laptop was designed with XP in mind.

No matter how "patched" Vista was, it was never even close to the same thing as Win7.
 
Nope, you're wrong; I used a stopwatch. Vista took more than twice as long as XP to boot to the desktop, with fresh installs on the exact same machine. You weren't even there when these tests were performed, so you are not in a position to say how it actually went down.

One of the biggest reasons for this was the ultra-aggressive SuperFetch Vista used. Even on a Q6600 with 4 gigs of RAM, it took forever for Vista to start up and become usable at the desktop, and much of that was because of SuperFetch. It would thrash the hard drive so much that it took close to 5 minutes to load startup programs, with SuperFetch doing whatever it wanted, before the desktop was even remotely usable.
 
One of the biggest reasons for this was the ultra-aggressive SuperFetch Vista used. Even on a Q6600 with 4 gigs of RAM, it took forever for Vista to start up and become usable at the desktop, and much of that was because of SuperFetch. It would thrash the hard drive so much that it took close to 5 minutes to load startup programs, with SuperFetch doing whatever it wanted, before the desktop was even remotely usable.

Exactly. I never saw behavior like that on XP, and I used it from day one and didn't stop using it completely until Win7 dropped.
 
They were equivalent in my testing. UI differences, and that's it.

I wouldn't admit that, since it says something is seriously flawed with your testing methodology and/or your ability to observe results or make comparisons.
 
Nope, you're wrong I used a stopwatch. Vista took over twice as long as XP took to boot to the desktop with fresh installs on the same exact machine. You weren't even there when these tests are performed, so you are not in a position to say how it actually went down.

Who really cares how long it takes to boot? You said it is a much slower OS no matter what. How about providing the specs of your hardware, so we know what you used to make this comparison? Do you also realistically expect it to be faster, or the same speed, on the same hardware as an OS that predates it by 5-6 years?
 
Who really cares how long it takes to boot? You said it is a much slower OS no matter what. How about providing the specs of your hardware, so we know what you used to make this comparison? Do you also realistically expect it to be faster, or the same speed, on the same hardware as an OS that predates it by 5-6 years?

Well, in all the testing and observation I did regarding XP vs. Win7, Win7 was always as fast as or faster than XP, and there was an even larger time gap between those OSes than between XP and Vista.
 
One of the biggest reasons for this was the ultra-aggressive SuperFetch Vista used. Even on a Q6600 with 4 gigs of RAM, it took forever for Vista to start up and become usable at the desktop, and much of that was because of SuperFetch. It would thrash the hard drive so much that it took close to 5 minutes to load startup programs, with SuperFetch doing whatever it wanted, before the desktop was even remotely usable.

It did not take that long to boot; maybe a driver was a problem on your machine. My main box had 4 Seagate 1TB drives in RAID 10, and the thing booted in a jiffy, probably about a minute to usable. On my other machine, a media center with a single drive, it didn't take 5 minutes either. Both were 4GB systems; I think the media PC was 2GB at first and then I upgraded it to 4GB. I've never seen such bad results on a healthy system with Vista or any other OS.
 
Well, in all the testing and observation I did regarding XP vs. Win7, Win7 was always as fast as or faster than XP, and there was an even larger time gap between those OSes than between XP and Vista.
I really didn't see much difference between Vista Ultimate x64 and 7 Ultimate x64 on the same machines. I originally had it on an EVGA nF780 with a Q9450, 8GB of RAM, and mechanical RAID 10. It was the same on my nF680 with an E7200 (I cannot remember exactly; it was some kind of dual-core chip; way back in the day it was an E6300). The nF680 booted off a single WD Black drive. The only time I saw a significant improvement was when I made the jump to SSD in 2011. Also, to be fair, Win7 is a polished version of Vista, and MS did tweak it as well as make it scale better. Drivers matured a lot too. I made my switch to 7 only because of a catastrophic drive failure which I couldn't recover from, so I decided to just upgrade the OS as well. I really don't recall seeing any significant performance increase from the OS update alone.
 
I bet they'll extend it. I went with LTSB and I still hated it. Performance was fine, but it often broke the tweaks that were required (fuck the Start menu and the ads on it), so I decided to switch to Linux. I haven't looked back; I hate W10's spying and its refusal to respect the hosts file when blocking MS.

I don't like telemetry, but so much software does it nowadays that I'm not sure it's really worth the effort to fight. For some things, if you run pfSense or the like, it's not too hard to block it on your network. If you have W10 Pro, you can block stuff via GPO; as far as I know, it does respect that. The hosts file is really a legacy mechanism anyway.
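For reference, the GPO in question (Computer Configuration > Administrative Templates > Windows Components > Data Collection and Preview Builds > Allow Telemetry) maps to a registry value. A .reg sketch of the equivalent setting; note that on Pro the lowest level actually honored is 1 (Basic), while 0 (Security) is only honored on Enterprise, Education, and LTSB editions:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\DataCollection]
"AllowTelemetry"=dword:00000001
```

Unlike a hosts-file entry, a policy set this way is something the OS is documented to honor, which is why it is the saner route on Pro and above.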
 
Who really cares how long it takes to boot, you said it is a much slower OS no matter what. How about providing specs of your hardware so we know what you used to make this comparison? Do you also realistically expect it to be faster or the same speed on same hardware as OS that predates it by 5-6 years?

That's a lie. I never claimed that it's a much slower OS no matter what. You need to work on your reading comprehension, because it needs improvement.
 
I don't like telemetry, but so much software does it nowadays that I'm not sure it's really worth the effort to fight. For some things, if you run pfSense or the like, it's not too hard to block it on your network. If you have W10 Pro, you can block stuff via GPO; as far as I know, it does respect that. The hosts file is really a legacy mechanism anyway.
I agree; it's why I have as much of it blocked as possible. After learning what it can do, like back in 2016, it's really scary what else we could be manipulated into.
 