Is this a good processor, would it be good enough to play newer games?
Intel Core 2 Duo is a brand of processor with several hundred different models, differing in clock speed by probably over 2GHz.
You have to be more specific. Generally, yes, multicore processors are good.
I'd recommend ATI, but it's your pick.
Intel® Core™ 2 Duo SU7300 (1.3GHz / 800MHz FSB / 3MB cache) is the exact processor. Also this graphics card: 1024MB NVIDIA® GeForce® GT 335M. With 4GB of RAM.
That's an incredibly weak CPU. My 4-year-old laptop has a 2GHz Core 2 Duo. I strongly suggest you find something better.
Aight.
Don't bother with the 3092379523 core ones. Most shit doesn't use them anyway.
But also, graphics cards for new games need to be very good. You almost need to upgrade every year if you want to play on high settings.
Wrong. Most programs these days are being developed to use multicore processors. You get so much more for your money buying a multicore processor compared to an old single-core one, especially since single-core technology isn't really being developed any more. For gaming, a quad-core processor is ideal, and something around 3GHz is generally a good speed. Getting that in a laptop is going to be difficult, though; a 2.8GHz dual-core is probably the best you'll get, which is pretty good too.
This isn't true at the moment. Graphics development has been much slower over the last few years because of consoles.
Quote:
But also, graphics cards for new games need to be very good. You almost need to upgrade every year if you want to play on high settings.
An 8800 GTS can still play most games at high settings, and that's a four-and-a-half-year-old card.
The GPU is what's important for a game, not the CPU.
I don't agree. Obviously a single core is shit. But you'll be hard-pressed to find a program that uses more than four cores.
Of course, I'm not sure about PCs, but Macs use some cores for the game and then some for running other programs. But there's really no need, because you don't need to do anything else besides play the game while you're playing it. I'm not sure if PCs do this automatically or not.
Again, I do not agree at all. Games are developed for the PC (unless they're exclusive to a certain console) and then optimised for the consoles afterwards (shitter graphics and physics, etc.).
And yeah, Ninja, that's wrong. CPUs are still very important.
This topic is epic lulz. Raging E-Peen up in dis biatch.:cackle:
That build should run any game on medium settings just fine, which is about the best you'll get out of a laptop for a reasonable price.
Quote:
Intel® Core™ 2 Duo SU7300 (1.3GHz / 800MHz FSB / 3MB cache) is the exact processor. Also this graphics card: 1024MB NVIDIA® GeForce® GT 335M. With 4GB of RAM.
Most games these days are being developed to work with multicore processors, like the Battlefield series.
There are a large number of games that just get ported to PC from the consoles. It's actually very rare that a game is made on PC first and then ported to consoles, because dumbing a game down to work on weaker hardware can be very difficult. Over the last five years, graphical development has been incredibly slow; graphics only get slightly better each year. PC graphics basically hit their peak in 2007 with the release of Crysis, and no other game has beaten it since, except maybe Metro 2033, and even that isn't that good-looking. The reason is that developers don't want to make a game they can't easily release on all platforms.
Quote:
Again, I do not agree at all. Games are developed for the PC (unless they're exclusive to a certain console) and then optimised for the consoles afterwards (shitter graphics and physics, etc.).
I'm quite certain I know a lot more about this than you do. 90% of the cycles in a game are graphics-related. A game has to calculate millions of precise floating-point vectors, translate those into between 1 and 8 million pixels on the screen, and then run extra shader passes; these are entirely GPU-based operations. When a programmer writes a graphics-heavy program, they usually fork it into one or two physics threads, an AI thread, and about a dozen graphics threads. The graphics threads run entirely on the GPU; they don't even share an instruction set with the rest of the program. It's the reason GPUs have a dozen cores while CPUs only have between 2 and 8.
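(Just to make that concrete: here's a bare-bones C++ sketch of that kind of fork. The loop functions are made up; no real engine is written like this, it's just the shape of it.)
Code:
#include <atomic>
#include <thread>

std::atomic<bool> running{true};

// Hypothetical subsystem loops; a real engine would do actual work here.
void physics_loop() { while (running) { /* integrate rigid bodies */ } }
void ai_loop()      { while (running) { /* update NPC decisions  */ } }

void render_loop() {
    for (int frame = 0; frame < 600; ++frame) {
        // build and submit draw calls; the heavy shading work runs
        // on the GPU via the graphics driver, not on a CPU thread
    }
}

int main() {
    std::thread physics(physics_loop); // "one or two physics threads"
    std::thread ai(ai_loop);           // "an AI thread"
    render_loop();                     // main thread feeds the GPU
    running = false;                   // tell the workers to stop
    physics.join();
    ai.join();
}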
Not true. Simple programs run with one or two cores, but they aren't intensive anyway. Take a look at your Activity Monitor sometime: programs are highly parallelised. Most programs run about 8 threads at any given time, and more cores means more of those can run at the same time. Finder alone runs 6 to 8 threads, Safari runs between 15 and 25, iTunes runs between 10 and 25... Games will use significantly more, because they use a technique called thread pooling. Most of their threads dispatch to the GPU, but you'll still have 8 to 10 of them running on the CPU.
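(If anyone's wondering what thread pooling actually looks like, this is a minimal C++ sketch; the class name and task counts are made up, and real engines are fancier about it. A fixed set of workers pulls tasks off a shared queue, so more cores just means more tasks run at once.)
Code:
#include <condition_variable>
#include <cstdio>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// Minimal thread pool: N workers pull tasks off one shared queue.
class ThreadPool {
public:
    explicit ThreadPool(unsigned n) {
        for (unsigned i = 0; i < n; ++i)
            workers.emplace_back([this] {
                for (;;) {
                    std::function<void()> task;
                    {
                        std::unique_lock<std::mutex> lock(m);
                        cv.wait(lock, [this] { return stop || !tasks.empty(); });
                        if (stop && tasks.empty()) return; // drained, shut down
                        task = std::move(tasks.front());
                        tasks.pop();
                    }
                    task(); // run outside the lock so workers stay parallel
                }
            });
    }
    void submit(std::function<void()> task) {
        { std::lock_guard<std::mutex> lock(m); tasks.push(std::move(task)); }
        cv.notify_one();
    }
    ~ThreadPool() {
        { std::lock_guard<std::mutex> lock(m); stop = true; }
        cv.notify_all();
        for (auto& w : workers) w.join();
    }
private:
    std::vector<std::thread> workers;
    std::queue<std::function<void()>> tasks;
    std::mutex m;
    std::condition_variable cv;
    bool stop = false;
};

int main() {
    unsigned cores = std::thread::hardware_concurrency();
    ThreadPool pool(cores ? cores : 2); // one worker per core, 2 as a fallback
    for (int i = 0; i < 16; ++i)        // 16 tasks share however many cores exist
        pool.submit([i] { std::printf("task %d done\n", i); });
} // destructor drains the queue and joins the workers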
Macs can use CPUs for games because of Grand Central Dispatch, a feature that Windows doesn't have. On a Windows PC, graphics threads run on GPUs and processor threads run on CPUs; there is no overlap. And Macs will only push graphics threads onto the main CPU if the GPU is struggling to keep up, so if you have a high-end GPU, it won't need to do that. Are you gaming on a Mac?
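(For reference, this is roughly what Grand Central Dispatch looks like from code: a tiny sketch assuming macOS and its libdispatch C API, with a made-up work function. You hand tasks to a system-managed queue and the OS picks which cores run them.)
Code:
#include <dispatch/dispatch.h> // macOS / libdispatch only
#include <cstdint>
#include <cstdio>

// Hypothetical unit of work; GCD calls it with the context pointer we pass.
static void simulate_step(void* ctx) {
    std::printf("step %ld ran on a GCD-managed thread\n", (long)(intptr_t)ctx);
}

int main() {
    // A global concurrent queue backed by an OS-sized thread pool;
    // the program never creates or counts threads itself.
    dispatch_queue_t q = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_group_t g = dispatch_group_create();
    for (intptr_t i = 0; i < 8; ++i)
        dispatch_group_async_f(g, q, (void*)i, simulate_step);
    dispatch_group_wait(g, DISPATCH_TIME_FOREVER); // block until all 8 finish
    dispatch_release(g);
}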
I'm sure you know a lot about the technical aspects. What I do know is that there are some more recent games out there that are very physics-intensive. A game like BF:BC2, for example, would run much, much better for me with a 3GHz quad-core than with my silly 2.6GHz dual-core. A game like Minecraft is also very CPU- and RAM-intensive. Of course, Minecraft is a special case, but it is worth mentioning.
Minecraft uses an outdated engine. Thanks to something called OpenCL, large amounts of physics calculation are also being pushed onto the graphics card, and the latest DirectX is pushing more and more physics to the much faster GPU too. RAM has nothing to do with the CPU or GPU anymore: programmers can connect RAM directly to the GPU with the newer APIs, so there's no need for the CPU to act as the middleman. The limiting factor is the bus speed of the RAM and how close it is to the processors.
As time goes on, the amount of physics done by the CPU will only go down.
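(Since OpenCL keeps coming up: here's a stripped-down C++ sketch of pushing a physics step onto the GPU with it. It assumes an OpenCL 1.x driver is installed, all error checking is omitted, and the kernel is a toy Euler integrator, nothing from a real game.)
Code:
#ifdef __APPLE__
#include <OpenCL/cl.h>
#else
#include <CL/cl.h>
#endif
#include <cstdio>
#include <vector>

// Toy kernel: one GPU work-item advances one particle per time step.
static const char* src =
    "__kernel void integrate(__global float* pos,\n"
    "                        __global const float* vel,\n"
    "                        const float dt) {\n"
    "    int i = get_global_id(0);\n"
    "    pos[i] += vel[i] * dt;\n"
    "}\n";

int main() {
    const size_t n = 1 << 20; // a million particles
    std::vector<float> pos(n, 0.0f), vel(n, 1.0f);

    // Boilerplate: grab the first GPU, set up a context and a queue.
    cl_platform_id plat;  clGetPlatformIDs(1, &plat, nullptr);
    cl_device_id dev;     clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, nullptr);
    cl_context ctx      = clCreateContext(nullptr, 1, &dev, nullptr, nullptr, nullptr);
    cl_command_queue q  = clCreateCommandQueue(ctx, dev, 0, nullptr);

    // Compile the kernel at runtime, as OpenCL does.
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, nullptr, nullptr);
    clBuildProgram(prog, 1, &dev, nullptr, nullptr, nullptr);
    cl_kernel k = clCreateKernel(prog, "integrate", nullptr);

    // Copy particle data into GPU-side buffers.
    cl_mem dpos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 n * sizeof(float), pos.data(), nullptr);
    cl_mem dvel = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                                 n * sizeof(float), vel.data(), nullptr);
    float dt = 1.0f / 60.0f;
    clSetKernelArg(k, 0, sizeof(dpos), &dpos);
    clSetKernelArg(k, 1, sizeof(dvel), &dvel);
    clSetKernelArg(k, 2, sizeof(dt), &dt);

    // One parallel step across all particles, then read the result back.
    clEnqueueNDRangeKernel(q, k, 1, nullptr, &n, nullptr, 0, nullptr, nullptr);
    clEnqueueReadBuffer(q, dpos, CL_TRUE, 0, n * sizeof(float), pos.data(),
                        0, nullptr, nullptr);
    std::printf("pos[0] after one step: %f\n", pos[0]); // expect ~0.0167
}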
I agree with all of this and I know it's true, but the situation with current games, in my experience and many of my friends', is that the CPU does matter. I wasn't aware that people were starting to figure out how OpenGL could take care of physics calculations too. That is very interesting. Does it have something to do with OpenCL? Bear with me, my knowledge about graphics APIs is limited.
It is OpenCL. OpenCL allows you to push regular floating-point work onto the graphics card. Of course the CPU matters, but the performance difference you'll see between 2 and 4 CPU cores is much smaller than what you'll see between 8 and 16 graphics cores. The bottleneck is still the graphics; physics won't choke the system, because it's done in steps that are automatically reconfigured in the event of any lag, so any game should be able to run on any modern CPU as long as the GPU is capable. Intel chips aren't made for calculation-intensive stuff anyway; why do you think consoles don't use them? The Xbox has PowerPC chips in it, and the PS3 has a custom Cell processor.
Ah, I misread, I thought you wrote OpenGL up there.
I would assume consoles use hardware that specializes in game processing, because that's what consoles are for. PCs have a more general purpose though.
What are you guys, dummies? You're so dummies if you don't know GPU is more important than CPU for gaming. Crunching graphics numbers is better done by graphics processors, and GAMES ARE LIKE 90% GRAPHICS, OK?!
I'm not sure what threads are. I'm going on my limited knowledge here.
But I said I don't know of any programs that use more than four cores.
And then you said you don't agree with that, because simple programs run on one or two?
That doesn't make sense lol
What I was saying about Macs using CPU cores was regarding the parts of the games that actually use the CPU. I didn't mean what you said about Grand Central Dispatch,
but the part which allows people to easily write programs that use multiple cores. (I didn't even know it did what you said.)
Sorry if that made no sense, I should be sleeping.
Basically I was saying to the OP that he's not going to need a CPU with a billion cores or whatever they're coming up with these days,
because the games aren't going to use them. Unless the games are set up for this, and I'm not sure many actually are.
For example, I run a program and check Activity Monitor, and it's using all of one core (or close to) and just won't use the other ones.
So I have to start another user account and run the program in there to get it to use the other core. (I did this with iMovie converting videos a while ago.)
Reading the rest of your posts, it basically seems like you agree with what I was saying, though: he doesn't need a hugely expensive CPU.
But I think we're saying it for different reasons.
You're saying games don't use CPUs much anymore.
I'm saying that even if they do, it doesn't matter, because they won't use more than two anyway, as far as I've noticed.
I game on my Mac only rarely (played TF2 and Garry's Mod twice and Minecraft a couple of times in the two and a half years I've had it).
I'm pretty much console-exclusive, coz I switched to a Mac for artwork purposes, coz they're relatively easier to use for that than PCs,
and they seem to not need to be updated as much. Although maybe that's because of the slowing of graphics progress you mentioned.