cageymaru

Fully [H]
Joined
Apr 10, 2003
Messages
22,074
Intel has disclosed a new set of security flaws collectively called the L1 Terminal Fault (L1TF). These flaws were discovered in conjunction with researchers at KU Leuven University and other universities. The researchers call their discoveries Foreshadow and Foreshadow - Next Generation (NG). Intel even suggests that disabling SMT (Hyper-Threading) may be warranted in some use cases when running virtualized operating systems.
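On Linux, kernels carrying the L1TF mitigation patches report their status through sysfs. A quick way to check a box (assuming a kernel new enough to have the `l1tf` entry; older kernels simply won't have it):

```shell
#!/bin/sh
# Report the kernel's view of L1TF exposure, if the interface exists.
VULN_DIR=/sys/devices/system/cpu/vulnerabilities

if [ -r "$VULN_DIR/l1tf" ]; then
    printf 'l1tf: %s\n' "$(cat "$VULN_DIR/l1tf")"
else
    echo "l1tf: no sysfs entry (kernel predates the L1TF patches?)"
fi

# SMT status matters for the virtualization scenario Intel mentions;
# this control file also only exists on patched kernels.
if [ -r /sys/devices/system/cpu/smt/control ]; then
    printf 'smt:  %s\n' "$(cat /sys/devices/system/cpu/smt/control)"
fi
```

On a patched system this prints something like `l1tf: Mitigation: PTE Inversion; VMX: conditional cache flushes, SMT vulnerable`, which tells you both whether the kernel mitigation is active and whether SMT is still a concern for your VMs.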

Foreshadow

At a high level, SGX is a new feature in modern Intel CPUs which allows computers to protect users' data even if the entire system falls under the attacker's control. While it was previously believed that SGX is resilient to speculative execution attacks (such as Meltdown and Spectre), Foreshadow demonstrates how speculative execution can be exploited for reading the contents of SGX-protected memory as well as extracting the machine's private attestation key. Making things worse, due to SGX's privacy features, an attestation report cannot be linked to the identity of its signer. Thus, it only takes a single compromised SGX machine to erode trust in the entire SGX ecosystem.

Foreshadow - Next Generation (NG)

While investigating the vulnerability that causes Foreshadow, which Intel refers to as "L1 Terminal Fault", Intel identified two related attacks, which we call Foreshadow-NG. These attacks can potentially be used to read any information residing in the L1 cache, including information belonging to the System Management Mode (SMM), the Operating System's Kernel, or Hypervisor. Perhaps most devastating, Foreshadow-NG might also be used to read information stored in other virtual machines running on the same third-party cloud, presenting a risk to cloud infrastructure. Finally, in some cases, Foreshadow-NG might bypass previous mitigations against speculative execution attacks, including countermeasures to Meltdown and Spectre.
 
Do I read this correctly in that there are 2 vectors, and only one of them is SGX? I wonder if AMD is impacted by the L1 related vulnerability.
 
Do I read this correctly in that there are 2 vectors, and only one of them is SGX? I wonder if AMD is impacted by the L1 related vulnerability.
I think in the article they said that they haven't gotten it to work on AMD systems yet. Doesn't mean that they are immune.
 
Am I right in thinking that this situation is like the malware/virus situation on the Windows platform vs. say Mac/Linux? It isn't that exploits don't exist on AMD processors; it's just that one platform (Intel) has a larger install base, so it is the first target?
 
I don't like it when these types of exploits are called "flaws" because it's literally impossible to design anything that cannot be exploited to some degree given time and the desire to do it. When Intel was making these processors, nobody at the time was thinking "Hey, let's do this, which exploits that, and then we're in like Flynn." So while some responsibility for how the situation is handled after such exploits are created/discovered/made use of does fall back on the makers of the processors, I can't honestly point a finger at Intel and say "Hey, look, even a 1st grader could have seen what was going to happen here..." because at the time that wasn't a possibility for anyone.

People find exploits all the time; hell, I think sendmail still has shit exploited to this day, and it's what, 35+ years old now? :D
 
Good God, if this keeps up Intel will have to include disposable gloves to install the infested thing.

 
I'm not too worried about this crap anymore. I do the best I can to keep an up-to-date system with patches, run internet security, and practice safe computing. If someone wants to hack my system to see what porn sites I've been on or what YouTube videos I have watched, I say have at it. I'm sure it matters to a much greater degree in a business environment than on my home system.
 
It says a lot that OSes have gotten so secure that it's become easier to hack the underlying HW.

That being said: it's clear that Core is showing its age. It's about time Intel started developing a new architecture.
 
Pity I bought an Intel CPU right before Spectre/Meltdown happened. Looks like it's going to be AMD for the foreseeable future.
 
I think I'd like to go back to the '80s or '90s style of computing, where it was near strictly functional work, and spend the rest of my time living my life in peace.

Keep the performance improvements of course.
 
I don't like it when these types of exploits are called "flaws" because it's literally impossible to design anything that cannot be exploited to some degree given time and the desire to do it. When Intel was making these processors, nobody at the time was thinking "Hey, let's do this, which exploits that, and then we're in like Flynn." So while some responsibility for how the situation is handled after such exploits are created/discovered/made use of does fall back on the makers of the processors, I can't honestly point a finger at Intel and say "Hey, look, even a 1st grader could have seen what was going to happen here..." because at the time that wasn't a possibility for anyone.

People find exploits all the time; hell, I think sendmail still has shit exploited to this day, and it's what, 35+ years old now? :D
A flaw is a flaw. It may not be an obvious flaw, and likely (I should say, by definition) wasn't purposefully designed with that flaw in mind, but it is still a flaw (virtually all designs have flaws, though they aren't all obvious or serious). Nobody is saying Intel is dumb for making such a flawed CPU; they (Intel, and the media) are just reporting the exploits so people know what their exposure is and what solutions are currently available.

As far as these go, the important things are how easy it is to implement and execute an exploit, how easy it is to mitigate the exploit or eliminate the flaw altogether, and what level of access is needed to exploit a system. What's unimportant is what you call it...
 
I'm not too worried about this crap anymore. I do the best I can to keep an up-to-date system with patches, run internet security, and practice safe computing. If someone wants to hack my system to see what porn sites I've been on or what YouTube videos I have watched, I say have at it. I'm sure it matters to a much greater degree in a business environment than on my home system.

Banks use Intel CPUs. When your bank account is emptied or your credit card is maxed you will care about this crap.
 
Don’t worry, Intel has a lot of lakes left to drain to fool you into thinking that you’re buying a new CPU...

Intel, the intelligent customers are not buying your rehashed ‘new’ CPUs. We are waiting for a new architecture first.
 
What the fuck has Intel even been doing since they released Sandy Bridge in 2011? Their chips haven't improved in any meaningful way, and they keep finding more security issues with them.

I think one of the reasons we're seeing all these flaws and exploits come out is that Intel hasn't done much with the architecture. When the same architecture is used for so long, it's more likely that flaws will be discovered and exploited. Larger architectural changes can, by design or by accident, remove flaws. When these changes happen more often, the exploits that can be found either aren't found in time to be truly useful, or the flawed architecture is naturally phased out by replacements down the line.
 
Am I right in thinking that this situation is like the malware/virus situation on the Windows platform vs. say Mac/Linux? It isn't that exploits don't exist on AMD processors; it's just that one platform (Intel) has a larger install base, so it is the first target?

A few things. First, Linux and Mac are not more secure only because they have a smaller install footprint. Linux is just as large a target for criminals as Windows is, if not larger. Sure, there are more Windows machines in the world, and if you want grandma's credit card or a botnet of desktops, Windows is your target. If you're after everyone's credit cards at the same time, you're targeting Linux-running servers (or the morons given access to the right Linux servers).

To your main point, however: no, this is not at all an install-base issue. This is an Intel-doesn't-know-wtf-they-are-doing issue. 20 years of shortcuts and bad decisions (some possibly at the behest of government agencies) have left a lot of issues with their underlying arch. Intel's branch prediction was designed by a complete moron (no, I'm not joking). Their routine allows the CPU's prediction engine to directly access privileged memory outside of user space, which is just plain stupid design work.

[The easiest way to explain Intel's issue is like this. If you create a text file in any operating system, even Windows, and you are the owner of that file, you can set it so only you may open it. Memory the CPU is using works the same way: memory can be assigned to specific users. Intel made a choice years ago that made their speculation a lot faster than everyone else's: they decided the engine didn't need to check privileges at the start of a calculation. It just used the memory, and only at the end of the calculation did it perform the check. This makes their routine a lot faster, as it isn't respecting protected memory locations at all; and if it tosses that branch of calculation, it never has to perform the check at all. The catch, of course, is that it's a really, really stupid way to operate, as you can get the chip to perform calculations that copy the memory contents before it ever checks whether it should be allowed to do that. Which is basically what Spectre and Meltdown are. On AMD chips, the speculation routine won't ever be operating in protected memory spaces, as it checks memory privileges first, thereby making it mostly immune to those types of attacks.]
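The check-before vs. check-after distinction described above can be sketched as a toy model. This is purely for intuition, not real exploit code: the "cache" is just a Python set standing in for the real CPU cache side channel, and the addresses and secret value are made up.

```python
SECRET = 42          # byte at a "kernel" address the attacker may not read
cache = set()        # addresses "warmed" by speculative execution

def speculative_load(addr, privileged, check_first):
    """Return the loaded value, or None if the access faults.

    check_first=True  models verifying privileges before the load
                      touches the cache (the AMD-style design per the post).
    check_first=False models checking after the load (the Intel-style
    design): the architectural result is squashed, but the cache
    footprint remains.
    """
    if check_first and not privileged:
        return None                   # load never happens
    value = SECRET                    # speculative load executes
    cache.add(0x1000 + value)         # side effect: a cache line is touched
    if not privileged:
        return None                   # fault: architectural result discarded
    return value

# Unprivileged attacker, check-after-load: the value itself is squashed...
assert speculative_load(0xFFF0, privileged=False, check_first=False) is None
# ...but the cache footprint reveals the secret anyway.
leaked = next(a - 0x1000 for a in cache)
print("leaked:", leaked)              # leaked: 42

# Check-before-load: no load, no cache footprint, nothing to probe.
cache.clear()
assert speculative_load(0xFFF0, privileged=False, check_first=True) is None
print("cache after check-first:", cache)   # set()
```

The real attacks recover the cache footprint by timing memory accesses rather than reading a set, but the ordering question (does the privilege check gate the load, or merely the retired result?) is the same one the post describes.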

Bottom line is... Intel appears to have taken a lot of shortcuts, which = higher performance at a severe potential cost to security. It would seem now it's all coming to light.

I am curious now, btw, if Intel's recent downgrade by Goldman Sachs is perhaps related to these new issues.
 
another one?...why is AMD never affected (to a large degree) by these things?...is their architecture so world class that they are immune?...I thought Intel had all the best engineers on their payroll
 
another one?...why is AMD never affected (to a large degree) by these things?...is their architecture so world class that they are immune?...I thought Intel had all the best engineers on their payroll

I wouldn't say world class; it's probably just not as corner-cutting as Intel's architecture. After all, there's a reason Intel has the IPC gains and clock advantages that they do, and that's probably because they cut corners in the architecture to get them. I'm sure a lot of this stems back to the creation of the Core 2 architecture, when they were desperate as hell to get ahead of AMD and to get people to forget about NetBurst, so they put reward ahead of risk in what they were designing.
 
It isn't that AMD's engineers are better than Intel's. Intel's engineers know how stupid their no-look speculative engine is.

The difference is that AMD is headed by one of the best and few true genius engineering minds of our generation in Lisa Su. She would never sign off on something so boneheaded... Intel is headed by money suits, who say, "OK, so this check here is slowing things down 10%. What if we just didn't do it? 'Cause we have sales numbers to hit."
 
Maybe this is Intel's sneaky way to keep those cheapskate internet hosts on the cash cow range. Cheap VMs bad, dedicated servers better. One process, one chip. ;-)
 
Maybe this is Intel's sneaky way to keep those cheapskate internet hosts on the cash cow range. Cheap VMs bad, dedicated servers better. One process, one chip. ;-)

You can design around this flaw FYI.
But it might pose a problem for pure cloud-providers ;)
 
Banks use Intel CPUs. When your bank account is emptied or your credit card is maxed you will care about this crap.
I will care when someone demonstrates a viable real-world attack made using any of these exploits. We haven't seen any targeted attacks that used speculative execution methods. I'm not saying I want to see those, but until I do see a successful attack that relied only on speculative execution and no other pre-existing security hole, I'll regard this as a remote possibility that hasn't been demonstrated as a real threat, especially for a home user. I couldn't care less about banks losing a bit of performance over this; they can afford it.
 
I've never thought that I would have a nice argument for running a old-ass server (Xeon X3353): it's *safer*! LOL
 
I'm not too worried about this crap anymore. I do the best I can to keep an up-to-date system with patches, run internet security, and practice safe computing. If someone wants to hack my system to see what porn sites I've been on or what YouTube videos I have watched, I say have at it. I'm sure it matters to a much greater degree in a business environment than on my home system.

Basically this. At work I just categorize it as "this is cyber security in 2018," and there's nothing we can do about it. Keep up to date on the latest threats, attack vectors, etc., and patch the hell out of your systems. Put in the proper systems to detect when someone is trying to break in, or already is in, etc.

There are teams forming up in colleges and professional businesses whose goal is to find these flaws and exploit them. The flaws existed for years... let's get past that... at least now they are being found. 5 or 10 years ago, cyber security was a threat, yes, but nowhere near as significant as it's been in the past year or two, and it's only going to keep getting worse. If anyone reading this is in college or about to go into college and wants a career in IT, seriously consider studies in cyber security... it'll be worth your while.
 
Banks use Intel CPUs. When your bank account is emptied or your credit card is maxed you will care about this crap.
Care about it, sure, but my options for doing anything about it are limited to basically my own systems.
 
I am curious now, btw, if Intel's recent downgrade by Goldman Sachs is perhaps related to these new issues.

Their downgrade is related to potential for growth. Intel is having 10nm and 7nm problems AND their architecture has a major flaw the competition's does not. So Intel got downgraded and AMD got upgraded.

That may not be their stated reasoning. It may have worked out that the guys who do their due diligence called up the CIOs or CTOs of the companies they hold a lot of stock in and asked where they see their IT hardware dollars going, and the answer was not Intel. But we all know why that would be.
 