At least the US Government isn't just outright assassinating political dissidents. The '60s were rough; it wasn't your cellphone getting you killed, it was being a capable enough person to organise people without the internet.
Snowden's basic point was the government *is* psychotic enough to burn...
I can seat the connectors fine, but if you put too much strain on them they just start to pop out. Why would people put strain on them? Cuz the cards are way out of spec physically, and then the dumbass connectors stick straight out even further. It's just one connector, it could be pointing in so...
Crysis was fucked because it was designed for 5-6 GHz single-threaded CPUs that never eventuated, and it was hella CPU-bound for a long time.
If I want to break my system I use Fortnite; it gets a new unobtainium setting every three months with Epic showing off the latest engine improvements.
16GB? What? Fuck off, no way. You can put so many of them in 4RU, holy shit.
They'll be like hen's teeth to buy.
And when you're handing out PCIe 5 lanes, who cares that it's x8? That means it only takes four of my lanes.
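A rough sketch of the lane math, assuming the usual ~2x per-lane bandwidth bump each PCIe generation (numbers are ballpark, one direction, ignoring encoding overhead):

    # Approximate one-direction PCIe bandwidth per lane, in GB/s.
    # Ballpark figures; real-world throughput lands a bit lower.
    GB_PER_LANE = {3: 1.0, 4: 2.0, 5: 4.0}

    def link_gb_s(gen: int, lanes: int) -> float:
        """Aggregate one-direction bandwidth of a PCIe link."""
        return GB_PER_LANE[gen] * lanes

    print(link_gb_s(4, 8))   # ~16 GB/s: a Gen4 x8 card's worth of traffic
    print(link_gb_s(5, 4))   # ~16 GB/s: the same budget in four Gen5 lanes

Each generation roughly doubles per-lane bandwidth, so an x8 card's worth of Gen4 traffic fits in four Gen5 lanes.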
Think new new version of this...
I think you all missed the part where DLSS allows nVidia to turn AI into Raster. Raster is stuck where it is, but DLSS will get really really really good. Or at least it will become the only game in town.
GPT-3 was trained on fuckin' Voltas. The new new LLMs are yet to be built, and the current ones are delivering massive, undeniable accelerations to workloads; in some industries you're uncompetitive without them.
But the thing is, while training is a bitch, if you build something with...
Apple didn't stop making laptops and PCs. Witness the M3 Ultra!
But they're a phone company now. With a lil bitty PC department.
nVidia will keep making the most kick-ass gfx cards, but it will be a tiny department reusing stuff developed for datacenters, the way Apple's PC side reuses stuff developed for phones.
The goal is models that give a majority of people a 30% boost in productivity without dimming the lights of the whole planet.
However, how we finally get there might leave nVidia looking like SGI, which would be ironic. Or Google looking like Yahoo.
And the VCs sure as shit can't tell one AI...
AMD could massively slash their GPU prices and go for market share.... but where would they fab them?
The save-the-company, win-the-future moves are where you get 100x+ markups, and those customers are buying every kind of accelerator as fast as they can be shipped.
I feel like the next...
AMD has figured out tiling GPUs on their MI300. They can make any capacity APU they want now.
They can also focus on upgrading just the GPU tiles while the GPU I/O die stays the same and the card stays validated, so they can do much faster cycles.
But they've got to get their architecture...
I dunno. GPT-4 isn't nearly as dumb as some of my friends, and isn't going to choose Windows for a desktop, for example. Or get knocked up by a dipshit. Or buy a house with a mortgage right before interest rates are obviously set to climb for decades.
The number of humans who actually change their minds based on explicit data is pretty small. Most go by their gut, which is in fact making statistical guesses based on its training data.
But programming based on a gut instinct about which function you should build next is a risky process, unless...
I used to haul around a 7 kg laptop with a 2 kg power supply. It had the first dual-core AMD chip, might have been a desktop part. RAIDed hard drives. 17" screen. Maybe it had SLI GPUs? I had the first-model Dell 24" wide HD LCD with a handle I bolted to the top. I traveled 'door to door'...
Plenty of people use Asahi as a daily driver.
I love my Air cuz not only does it have super fast RAM, but when I use it heavily for work all day, every day, I still only have to charge it every other night. I never take a charger with me anywhere.
“Cisco buried the lede.” >10,000 network devices backdoored through unpatched 0-day
This is a full root of Cisco's big switches. The amount of access that could give you is unthinkable.
Also, is there nobody on AMD's dev team making drivers specifically for CounterStrike who is good enough at CounterStrike to care about getting banned? It seems like playtesting should have triggered VAC, so wtf...
American food tastes so weird. The easiest single explanation is HFCS, which is also the simplest example of why American democracy is so broken.
AI ain't gonna fix shit while y'all are voting for your own destruction.
Basically these vulnerabilities have existed for a long, long time and are now white-hat knowledge.
Nobody really cares, except anyone doing workloads or storing data on cloud services. On cloud services you are *actively* sharing your hardware with malicious local access. On fucking purpose...
I asked Bard about a particular rocket launch start-up. There are hundreds of them, but I asked it about mine. It knew the name of the company and the location. Knew my name. It even told me it was a fan. It's a very unique launch company though. When I asked it about the launch system it...
There's a fork of Stable Diffusion built for AMD. I'm still just tooling around with a 6900 'til I have a better idea of what's happening with the tech. At the moment CUDA is very core, and the 4090 Ti, if it appears, will be the new hotness.
I'm shopping. The main driver for me is playing with large personal local AI assistants. Not for dev work, but for actually using one as a daily driver securely. I don't know where the field is going, but it's likely I'm gonna want as much VRAM as I can get, and might even go two 7900 XTXs at this...
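For a sense of why VRAM is the limiting factor, a back-of-envelope sketch (the model sizes and the 20% runtime-overhead fudge factor are illustrative assumptions, not measurements):

    # Back-of-envelope VRAM needed just to hold LLM weights, plus an
    # assumed ~20% headroom for KV cache and runtime overhead.
    def weights_gb(params_billions: float, bits_per_weight: int) -> float:
        raw = params_billions * 1e9 * bits_per_weight / 8 / 1e9
        return raw * 1.2

    for params, bits in [(13, 16), (34, 8), (70, 4)]:
        print(f"{params}B @ {bits}-bit: ~{weights_gb(params, bits):.0f} GB")
    # 13B @ 16-bit: ~31 GB, 34B @ 8-bit: ~41 GB, 70B @ 4-bit: ~42 GB

Even quantized, the bigger models blow past a single 24GB card, which is why a second 7900 XTX (48GB combined) starts to look sensible.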
I mean, they could be using AI tools to analyze games from different angles, remove a bunch of tedium, and help actual writers write better articles... but clearly those sites were already clickbait; now it's just one step more automated.
PCIe 4?
If they had a chip twice the size, hell, even at that size: where else can you get 100+ GB of VRAM outside of renting nV cloud?
It can ingest 24 4K streams? This is why I got out of TV, it all just got way too easy.
It feels like a stopgap for an M3 super chip next year. It's enough to...
I own an Index.... from like five years ago now? It was $1,000. What does $3,500 get you in 2023 if it can't even play VR hand-fighting Skyrim with ChatGPT NPCs? (Over 700 mods to get it going, but holy shit.)
Meh, if the game you're playing is "how much VRAM can I feed my AI model," and the 48GB card is over $8k.... a Mac with 64, 96, or 128GB of unified memory is suddenly pretty appealing.
I just scared myself. My home workstation is as below. It's across three monitors (one portrait) and a full-wall projector. I wonder, if I got a lil dock and plugged this nifty lil Air M2 laptop in, would I notice the difference? Outside of Fortnite, VR, AI, editing, and maybe compiling?
What if I...
WHO ARE THESE PEOPLE THAT HAVE TO EDIT RED RAW ON AN ENTRY LEVEL MAC?!?!?
It's like: if I got my junk stuck in a bear trap, would a spoon be sufficient to get it out, or do I really need a fork?