Star Wars Battlefront Official System Requirements

I always go with a minimum of 16GB of memory in any system I build. My main rig was just upgraded to 32 a few months ago, so bring it :cool:.
For gaming, it certainly is. 64-bit processors and operating systems have been mainstream for 10 years now. The programs are finally catching up and now people are bitching about it :rolleyes:.
No, it's really only been a must in the last 5 years, now that 64-bit OSes have taken off.
Keep in mind that XP 64 was very buggy, with little to no support.
Vista 64 was a flop; even the OEMs didn't want to support it.
Windows 7 64-bit started to take off because it's a pretty rock-solid OS.
 
I'm actually pretty surprised by some of the comments on this thread.

64-bit? How many years has 32-bit been dead?

16GB RAM. Really... ok, I know there are a lot of 8GB machines out there, but 16GB of DDR3 for the majority of existing systems is dirt cheap right now. I mean, CHEAP.

An Nvidia 660 video card required... That's 3 gens back now. (3-4 years?)

I'm seriously not trying to be snobby, and I'm sorry for anyone that needs to buy hardware to play, but you've got to admit it's pretty good humor coming from a bunch of [H]'ers. Don't you think?

Computers age in dog years, don't you know. ;)
 
So you need a 64-bit OS now? Is the 32-bit OS dead?
32-bit is still around for compatibility and older systems, but for anything serious, the 4GB RAM limit is a giant brick wall for many individuals.
I found this out the hard way with Far Cry 4; at first I wasn't happy that it required 64-bit Windows to just run, but once I realized how much RAM it could burn through, then I understood why it was required.
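For anyone wondering where that 4GB wall actually comes from: a 32-bit pointer can only address 2^32 bytes, which works out to 4 GiB, and a 32-bit Windows process normally gets even less of that for itself. Here's a quick Python sketch (just my own illustration, nothing from the game or DICE) that does the arithmetic and tells you whether the build you're running is 64-bit:

import struct
import sys

# A 32-bit pointer can address 2**32 bytes in total: exactly 4 GiB.
print(f"{2**32 / 1024**3:.0f} GiB addressable with 32-bit pointers")

# Pointer size of the running build: 4 bytes on a 32-bit build, 8 on 64-bit.
print(f"{struct.calcsize('P') * 8}-bit process")

# Standard check for a 64-bit Python build.
print("64-bit build:", sys.maxsize > 2**32)

On a 64-bit OS with a 64-bit build, the last two lines report 64-bit and True; on a 32-bit setup they report 32-bit and False, and anything past 4 GiB simply isn't addressable.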

I always go with a minimum of 16GB of memory in any system I build. My main rig was just upgraded to 32 a few months ago, so bring it :cool:.
For gaming, it certainly is. 64-bit processors and operating systems have been mainstream for 10 years now. The programs are finally catching up and now people are bitching about it :rolleyes:.

No, 64-bit operating systems only started to become mainstream around 2010 or so.
I remember circa 2005 when Windows XP x64 was released, and in 2007 when Vista 64-bit was released.

Not every system back then had a 64-bit CPU that could even run them, let alone the ability to use more than 4GB of RAM (2GB was standard in 2008), which made the jump a bit pointless, or at least ahead of its time.
Driver and software support were the biggest issues, and even though 64-bit x86 CPUs have been around since circa 2003, mainstream OS support didn't really come about until 2010 for both home and enterprise.

So really, it's only been mainstream for about five years, not ten, and it's been an uphill battle, though the climb is becoming much easier now.
 
32-bit is still around for compatibility and older systems, but for anything serious, the 4GB RAM limit is a giant brick wall for many individuals.
I found this out the hard way with Far Cry 4; at first I wasn't happy that it required 64-bit Windows to just run, but once I realized how much RAM it could burn through, then I understood why it was required.



No, 64-bit operating systems only started to become mainstream around 2010 or so.
I remember circa 2005 when Windows XP x64 was released, and in 2007 when Vista 64-bit was released.

Not every system back then had a 64-bit CPU that could even run them, let alone the ability to use more than 4GB of RAM (2GB was standard in 2008), which made the jump a bit pointless, or at least ahead of its time.
Driver and software support were the biggest issues, and even though 64-bit x86 CPUs have been around since circa 2003, mainstream OS support didn't really come about until 2010 for both home and enterprise.

So really, it's only been mainstream for about five years, not ten, and it's been an uphill battle, though the climb is becoming much easier now.

Wrong.

Windows Vista x64 was released in 2006, not 2005......
 
The 16GB 'recommended' memory requirement sounds like BS. Companies seem to be using these requirements as sales pitches to get people to upgrade.
Could you describe, in depth, the backroom deals between game developers/publishers and RAM manufacturers? How do game makers benefit financially from inflated RAM requirements? Like, does the CEO of DICE secretly hold stock in Gskill or something? Is there some sort of kickback going on?

Explain the conspiracy if you would, please.
 
32-bit is still around for compatibility and older systems, but for anything serious, the 4GB RAM limit is a giant brick wall for many individuals.
I found this out the hard way with Far Cry 4; at first I wasn't happy that it required 64-bit Windows to just run, but once I realized how much RAM it could burn through, then I understood why it was required.



No, 64-bit operating systems only started to become mainstream around 2010 or so.
I remember circa 2005 when Windows XP x64 was released, and in 2007 when Vista 64-bit was released.

Not every system back then had a 64-bit CPU that could even run them, let alone the ability to use more than 4GB of RAM (2GB was standard in 2008), which made the jump a bit pointless, or at least ahead of its time.
Driver and software support were the biggest issues, and even though 64-bit x86 CPUs have been around since circa 2003, mainstream OS support didn't really come about until 2010 for both home and enterprise.

So really, it's only been mainstream for about five years, not ten, and it's been an uphill battle, though the climb is becoming much easier now.

Vista 64-bit and 32-bit were the same. XP 64-bit was not Windows XP; it was Server 2003 with a few desktop features added back in. It worked, and while it didn't have the best support in all areas, it worked.

That said, come Vista, the 64-bit OS started to become more mainstream, as Vista itself was released as both 32-bit and 64-bit with no difference between the two versions at the base level. Both were Vista, both had the same features, and both ran the same programs (outside of 64-bit requirements).

Most new computers you purchased that had a 64-bit CPU came with the 64-bit OS installed, although you could opt to downgrade to a 32-bit OS if you wanted. Most people, though, would have just taken whatever it was sold with as an option when going to dell.com or wherever; only people who knew what they were selecting would have made that change.

The OS itself ran just fine. I personally had no more of an issue finding 64-bit drivers than I did 32-bit drivers, although driver support was horrible at first for everything, thanks to the hardware makers. Software-wise, all my 32-bit programs ran just fine. So I personally think it is safe to say that most people buying a new computer during the Vista time frame or later would have received a 64-bit OS. Anyone who upgraded from XP to Vista would have still had the 32-bit OS unless they elected to do a clean install of Vista.

The fact that it was being put on all new computers makes it mainstream. It was out there, and it came in the box for you to install alongside the 32-bit version. I can only think of a single OEM computer I have come across running a 32-bit version of Vista or above, and that was because it only had a 32-bit CPU: one of those cheap $175 Black Friday specials.


As for the requirements: seeing the reaction of people here makes me laugh.
 
Could you describe, in depth, the backroom deals between game developers/publishers and RAM manufacturers? How do game makers benefit financially from inflated RAM requirements? Like, does the CEO of DICE secretly hold stock in Gskill or something? Is there some sort of kickback going on?

Explain the conspiracy if you would, please.

game developers like Ubisoft or DICE partner up with Nvidia or AMD...develop their games sometimes with major input from the GPU manufacturers (like Alien Isolation and AMD)...title is tested and if title performs well across a variety of hardware then possible crippling GPU-exclusive effects are added such as GameWorks or system requirements are inflated to get people to upgrade from their GTX 580 to a GTX 980...

hope that helps crusty...if there's any other help you need, I'll be here :cool:
 
game developers like Ubisoft or DICE partner up with Nvidia or AMD...develop their games sometimes with major input from the GPU manufacturers (like Alien Isolation and AMD)...title is tested and if title performs well across a variety of hardware then possible crippling GPU-exclusive effects are added such as GameWorks or system requirements are inflated to get people to upgrade from their GTX 580 to a GTX 980...

hope that helps crusty...if there's any other help you need, I'll be here :cool:

That's fine, and I don't think anyone says that game manufacturers don't do that for GPUs, but he asked about RAM. I'm pretty sure there's no hidden RAM agenda here to convince people to run out and buy 16GB of G.Skill RAM, which is what he was saying.
 
...title is tested and if title performs well across a variety of hardware then possible crippling GPU-exclusive effects are added such as GameWorks or system requirements are inflated to get people to upgrade from their GTX 580 to a GTX 980...
I asked about RAM tin foil lunacy, not the Gameworks Illuminati controversy.
 
I don't think anyone says that game manufacturers don't do that for GPUs
They don't do that for GPUs. It's not a conspiracy.

Game devs do not receive kickbacks from Nvidia for every 980ti sold. This is some serious Underpants Gnome shit you guys are on.

1) Devs inflate sys reqs. by integrating Gameworks
2) User buys a GPU
3) ?????
4) Profit

Tell me how DICE/Rocksteady make money by forcing their customers to purchase Gskill/Nvidia hardware.
 
Tell me how DICE/Rocksteady make money by forcing their customers to purchase Gskill/Nvidia hardware.

of course not for every individual GPU sold...but if you think there's no secret backdoor handshake agreements between companies then you need to get out of your bubble...partnerships like this are about long term investment and not short term gain...there's a reason certain games always perform better with 1 GPU or CPU etc manufacturer...you're focusing on G.Skill because you know your position makes no sense so you need to focus on the area where the 'Illuminati' influence cannot easily be seen :D
 
Wrong.

Windows Vista x64 was released in 2006, not 2005......
Might want to read what I wrote again...

I remember circa 2005 when Windows XP x64 was released, and in 2007 when Vista 64-bit was released.

Windows Vista 32-bit and 64-bit general availability (mainstream release) was on January 30, 2007.
So, you're kinda wrong on both parts there, genius. :rolleyes:
 
Vista 64-bit and 32-bit were the same. XP 64-bit was not Windows XP; it was Server 2003 with a few desktop features added back in. It worked, and while it didn't have the best support in all areas, it worked.

That said, come Vista, the 64-bit OS started to become more mainstream, as Vista itself was released as both 32-bit and 64-bit with no difference between the two versions at the base level. Both were Vista, both had the same features, and both ran the same programs (outside of 64-bit requirements).

Most new computers you purchased that had a 64-bit CPU came with the 64-bit OS installed, although you could opt to downgrade to a 32-bit OS if you wanted. Most people, though, would have just taken whatever it was sold with as an option when going to dell.com or wherever; only people who knew what they were selecting would have made that change.

The OS itself ran just fine. I personally had no more of an issue finding 64-bit drivers than I did 32-bit drivers, although driver support was horrible at first for everything, thanks to the hardware makers. Software-wise, all my 32-bit programs ran just fine. So I personally think it is safe to say that most people buying a new computer during the Vista time frame or later would have received a 64-bit OS. Anyone who upgraded from XP to Vista would have still had the 32-bit OS unless they elected to do a clean install of Vista.

The fact that it was being put on all new computers makes it mainstream. It was out there, and it came in the box for you to install alongside the 32-bit version. I can only think of a single OEM computer I have come across running a 32-bit version of Vista or above, and that was because it only had a 32-bit CPU: one of those cheap $175 Black Friday specials.

Yeah... I remember that era well, and driver support was garbage for a few years when Vista was out (not Microsoft's fault).
Netbooks were becoming mainstream, and many laptops at the time were not 64-bit compatible (Core Duos, Atoms, AMD Turions, etc. - all 32-bit CPUs).

Core 2 Duo/Quad laptops at that time were very expensive (relative to others), and would probably only equate to about 1/3 to 1/2 of the systems sold at the time (counting the other CPUs I just mentioned).
Software was mostly compatible, you are right, but I wouldn't say that 64-bit was mainstream.

It was on the rise in the late 2000s, but it didn't really start to become mainstream until 2010.
 
Then how?

Honestly, everything you're posting sounds like typical PC gamer tin foil hat bullshit, right down to the ...trailing off in every sentence. :rolleyes:

it's not 'trailing off every sentence'...it's used to separate sentences so everything doesn't look like a giant wall of text...easier to read with spaces in between...speaking of posting etiquette, I see you like to cut off posts and only quote specific sentences or words taken out of context...if you don't have any intelligent rebuttals to the post taken as a whole then why bother responding to it in fragments?
 
They don't do that for GPUs. It's not a conspiracy.

Game devs do not receive kickbacks from Nvida for every 980ti sold. This is some serious Underpants Gnome shit you guys are on.

1) Devs inflate sys reqs. by integrating Gameworks
2) User buys a GPU
3) ?????
4) Profit

Tell me how DICE/Rocksteady make money by forcing their customers to purchase Gskill/Nvidia hardware.

As far as I'm concerned, whenever I see a logo/sequence when you first launch a game that says "made for Nvidia" or however it's put, there's a very good chance Nvidia got that in there somehow. It may not be direct cash, but even offering to help code parts of the game, or in some way lending assistance so it's slightly more compatible with Nvidia cards, could be a kickback. The developer saves some money on making the game, and Nvidia stands to gain potential customers due to benchmarks favoring their products. Or yes, it could be simple cash advertising: Nvidia pays the developer to add "made for Nvidia cards" or whatever the spin phrase is, and Nvidia gets advertising to increase sales of their cards.
 
As far as I'm concerned, whenever I see a logo/sequence when you first launch a game that says "made for Nvidia" or however it's put, there's a very good chance Nvidia got that in there somehow. It may not be direct cash, but even offering to help code parts of the game, or in some way lending assistance so it's slightly more compatible with Nvidia cards, could be a kickback. The developer saves some money on making the game, and Nvidia stands to gain potential customers due to benchmarks favoring their products. Or yes, it could be simple cash advertising: Nvidia pays the developer to add "made for Nvidia cards" or whatever the spin phrase is, and Nvidia gets advertising to increase sales of their cards.

Because AMD never does this... ;)
 
Yeah... I remember that era well, and driver support was garbage for a few years when Vista was out (not Microsoft's fault).
Netbooks were becoming mainstream, and many laptops at the time were not 64-bit compatible (Core Duos, Atoms, AMD Turions, etc. - all 32-bit CPUs).

Core 2 Duo/Quad laptops at that time were very expensive (relative to others), and would probably only equate to about 1/3 to 1/2 of the systems sold at the time (counting the other CPUs I just mentioned).
Software was mostly compatible, you are right, but I wouldn't say that 64-bit was mainstream.

It was on the rise in the late 2000s, but it didn't really start to become mainstream until 2010.

I guess the definition of mainstream is the issue here. I myself consider it to have started going mainstream as soon as anyone could buy a PC and it would be on it, software was being developed to take advantage of it to some degree, and it wasn't just something that only a small number of people would or could make use of. That might not have been as early as 2005, but during the lifespan of Vista, before 7 was out, it was more and more common to see 64-bit CPUs in a lot of average-priced systems. My sister is still running an 8 or 9 year old Dell that came with 64-bit Vista on it; I just updated it to Windows 10 for her. She didn't pay anything crazy for it at the time.

AMD Turion was a 64-bit CPU. That is what I had in my work laptop at the time. Worked well for what I used it for.

Now, we are also talking about the average person here. For people building mid- to high-end gaming computers, I am pretty sure they had 64-bit CPUs in 2005-2006. I doubt many people doing high-end gaming waited until 2010 or later to move beyond a 32-bit single-core CPU. So for those people, i.e. anyone posting in this thread, there is no reason for any of them to still be running a 32-bit OS on their gaming rig.
 
Because AMD never does this... ;)

Sorry, I wasn't picking on Nvidia intentionally. It's just easier to stick to one brand than two or three and confuse it somewhere, but yes, both Nvidia and AMD are guilty.

Personally, I use whatever is the better bang for the buck at the time of purchase. My current card and the last one were Radeons; the last 4 before that were Nvidia.
 
AMD Turion was a 64-bit CPU. That is what I had in my work laptop at the time. Worked well for what I used it for.

Ah, you are right about that.

Now, we are also talking about the average person here. For people building mid- to high-end gaming computers, I am pretty sure they had 64-bit CPUs in 2005-2006. I doubt many people doing high-end gaming waited until 2010 or later to move beyond a 32-bit single-core CPU. So for those people, i.e. anyone posting in this thread, there is no reason for any of them to still be running a 32-bit OS on their gaming rig.

Well, some people do like to dual boot, and 32-bit OSes are necessary to run older 16-bit applications as well, including games (emulators, mostly).
But yes, for a mainstream gaming system, I agree.
 
Might want to read what I wrote again...



Windows Vista 32-bit and 64-bit general availability (mainstream release) was on January 30, 2007.
So, you're kinda wrong on both parts there, genius. :rolleyes:

Sorry!!
 