
Showing posts with label new processor. Show all posts

Friday, June 26, 2015

Best Gaming CPUs For The Money: June 2015 (CPUs Hierarchy Chart / Processors Hierarchy Chart)

What about this other CPU that’s not on the list? How do I know if it’s a good deal or not?
This will happen. In fact, it’s guaranteed to happen because availability and prices change quickly. So how do you know if that CPU you have your eye on is a good buy in its price range?
Here is a resource to help you judge if a CPU is a reasonable value or not: the gaming CPU hierarchy chart, which groups CPUs with similar overall gaming performance levels into tiers. The top tier contains the highest-performing gaming CPUs available and gaming performance decreases as you go down the tiers from there.
This hierarchy was originally based on the average performance each CPU achieved in our test suite. We have since incorporated new game data into our criteria, but it should be known that any specific game title will likely perform differently depending on its unique programming. Some games, for example, will be severely graphics subsystem-limited, while others may react positively to more CPU cores, larger amounts of CPU cache, or even a specific architecture. We also did not have access to every CPU on the market, so some of the CPU performance estimates are based on the numbers similar architectures deliver. Indeed, this hierarchy chart is useful as a general guideline, but certainly not as a one-size-fits-all CPU comparison resource. For that, we recommend you check out our CPU Performance Charts.

You can use this hierarchy to compare the pricing between two processors, to see which one is a better deal, and also to determine if an upgrade is worthwhile. I don’t recommend upgrading your CPU unless the potential replacement is at least three tiers higher. Otherwise, the upgrade is a largely lateral move and you may not notice a worthwhile difference in game performance.
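That three-tier rule of thumb is easy to mechanize. Below is a minimal sketch in Python; the tier numbers in the dictionary are hypothetical placeholders for illustration, not positions taken from the actual chart:

```python
# Hypothetical tier positions (0 = top tier of the hierarchy chart).
# These assignments are illustrative only, not the chart's real tiers.
TIERS = {
    "Core i7-4790K": 0,
    "Core i5-4670K": 1,
    "Core i5-2500K": 2,
    "Core i3-2100": 4,
    "Athlon II X4 630": 7,
}

def is_worthwhile_upgrade(current, candidate, tiers=TIERS):
    """Rule of thumb: upgrade only if the replacement sits at least
    three tiers higher (i.e., at least three rows closer to tier 0)."""
    return tiers[current] - tiers[candidate] >= 3
```

For example, under these placeholder tiers, moving from a Core i3-2100 to a Core i5-2500K is only a two-tier jump (a lateral upgrade), while moving from an Athlon II X4 630 to a Core i5-4670K clears the three-tier bar easily.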




Summary
There you have it folks: the best gaming CPUs for the money this month. Now all that’s left to do is compare their performance to your budget before you decide which one is right for you. We even put in the work to help find the best prices.
Also remember that the stores don’t follow this list. Things will change over the course of the month and you’ll probably have to adapt your buying strategy to deal with fluctuating prices. Good luck!

Source: Tom's Hardware

List CPUs: Core i7-2600, -2600K, -2700K, -3770, -3770K, -3820, -3930K, -3960X, -3970X, -4770, -4770K, -4790K, -5775C, -5820K, -5930K, -5960X, Core i7-965, -975 Extreme, -980X Extreme, -990X Extreme, Core i5-5675C, -4690K, -4670K, -4590, -4670, -4570, -4430, -3570K, -3570, -3550, -3470, -3450P, -3450, -3350P, -3330, -2550K, -2500K, -2500, -2450P, -2400, -2380P, -2320, -2310, -2300, Core i7-980, -970, -960, Core i7-870, -875K, Core i3-4370, -4170, -4160, -3250, -3245, -3240, -3225, -3220, -3210, -2100, -2105, -2120, -2125, -2130, FX-9590, 9370, 8370, 8350, 8320, 8150, 6350, 4350, Phenom II X6 1100T BE, 1090T BE, Phenom II X4 Black Edition 980, 975, Core i7-860, -920, -930, -940, -950, Core i5-3220T, -750, -760, -2405S, -2400S, Core 2 Extreme QX9775, QX9770, QX9650, Core 2 Quad Q9650, FX-8120, 8320e, 8370e, 6200, 6300, 4170, 4300, Phenom II X6 1075T, Phenom II X4 Black Edition 970, 965, 955, A10-6800K, 6790K, 6700, 5800K, -5700, -7800, -7850K, A8-3850, -3870K, -5600K, 6600K, -7600, -7650K, Athlon X4 651K, 645, 641, 640, 740, 750K, 860K, Core 2 Extreme QX6850, QX6800, Core 2 Quad Q9550, Q9450, Q9400, Core i5-650, -655K, -660, -661, -670, -680, Core i3-2100T, -2120T, FX-6100, -4100, -4130, Phenom II X6 1055T, 1045T, Phenom II X4 945, 940, 920, Phenom II X3 Black Edition 720, 740, A8-5500, 6500, A6-3650, -3670K, -7400K, Athlon II X4 635, 630, Core 2 Extreme QX6700, Core 2 Quad Q6700, Q9300, Q8400, Q6600, Q8300, Core 2 Duo E8600, E8500, E8400, E7600, Core i3-530, -540, -550, Pentium G3460, G3260, G3258, G3250, G3220, G3420, G3430, G2130, G2120, G2020, G2010, G870, G860, G850, G840, G645, G640, G630, Phenom II X4 910, 910e, 810, Athlon II X4 620, 631, Athlon II X3 460, Core 2 Extreme X6800, Core 2 Quad Q8200, Core 2 Duo E8300, E8200, E8190, E7500, E7400, E6850, E6750, Pentium G620, Celeron G1630, G1620, G1610, G555, G550, G540, G530, Phenom II X4 905e, 805, Phenom II X3 710, 705e, Phenom II X2 565 BE, 560 BE, 555 BE, 550 BE, 545, Phenom X4 9950, Athlon II X3 455, 450, 445, 440, 435, 425, Core 2 Duo E7200, E6550, E7300, E6540, E6700, Pentium Dual-Core E5700, E5800, E6300, E6500, E6600, E6700, Pentium G6950, Phenom X4 9850, 9750, 9650, 9600, Phenom X3 8850, 8750, Athlon II X2 265, 260, 255, 370K, A6-5500K, A4-6400K, 6300, 5400K, 5300, 4400, 4000, 3400, 3300, Athlon 64 X2 6400+, Core 2 Duo E4700, E4600, E6600, E4500, E6420, Pentium Dual-Core E5400, E5300, E5200, G620T, Phenom X4 9500, 9550, 9450e, 9350e, Phenom X3 8650, 8600, 8550, 8450e, 8450, 8400, 8250e, Athlon II X2 240, 245, 250, Athlon X2 7850, 7750, Athlon 64 X2 6000+, 5600+, Core 2 Duo E4400, E4300, E6400, E6320, Celeron E3300, Phenom X4 9150e, 9100e, Athlon X2 7550, 7450, 5050e, 4850e/b, Athlon 64 X2 5400+, 5200+, 5000+, 4800+, Core 2 Duo E5500, E6300, Pentium Dual-Core E2220, E2200, E2210, Celeron E3200, Athlon X2 6550, 6500, 4450e/b, Athlon X2 4600+, 4400+, 4200+, BE-2400, Pentium Dual-Core E2180, Celeron E1600, G440, Athlon 64 X2 4000+, 3800+, Athlon X2 4050e, BE-2300, Pentium Dual-Core E2160, E2140, Celeron E1500, E1400, E1200

Thursday, November 17, 2011

Qualcomm Launches Eight Snapdragon S4 Chips

New additions to the S4 family, six of which include cellular baseband features.
Qualcomm has added eight SKUs, with models including the MSM8660A, MSM8260A, MSM8630, MSM8230, MSM8627, MSM8227 (all with modems) as well as the APQ8060A and APQ8030.
All new chips are based on the 28 nm, dual-core Krait processor with a clock speed ranging from 1.0 to 1.7 GHz, as well as Adreno 225 or 305 graphics units.
Qualcomm did not say which commercial products will integrate these processors, but noted that devices with these chips could be surfacing in early 2012, which could mean that Snapdragon S4 smartphones/tablets should be at CES in January. The company also announced four additional new processors (MSM7225A, MSM7625A, MSM7227A, MSM7627A) in its S1 family, which, however, are based on an older ARM11 core and the ARMv6 architecture. The current Krait core is based on a heavily modified ARMv7 architecture. Qualcomm claims that its chips are more efficient than ARM's Cortex-A9 design.

Source: Tom's Hardware

Sandy Bridge-E: Core i7-3960X Is Fast, But Is It Any More Efficient?

Ironically, when it comes to performance, Intel’s Core i7-3960X is the real Bulldozer. Since its power consumption is lower than that of the Gulftown-based Core i7, it should deliver amazing performance per watt as well. Is that really the case?
Intel's Sandy Bridge-E design takes the company's 32 nm Sandy Bridge architecture to the next level. As you likely saw in Chris Angelini’s full review on Sandy Bridge-E And X79 Express, the new high-end processor family offers more of almost everything: more cores, more cache, more memory channels, and more PCI Express connectivity, resulting in better benchmark scores in almost every discipline.
While the new processor design, which is now available as the Core i7-3960X and Core i7-3930K (with the Core i7-3820 to follow some time next year), delivers more performance, we've already seen the first review machines based on X79 Express lower power consumption versus the Gulftown/X58 combination, thanks to the dual-chip platform layout. AMD probably doesn't want to learn in detail what this could mean in terms of performance per watt, since the six-core Core i7-990X was already faster than AMD's flagship FX-8150.

Monday, November 14, 2011

Intel Core i7 3960X (Sandy Bridge E) Review: Keeping the High End Alive

If you look carefully enough, you may notice that things are changing. It first became apparent shortly after the release of Nehalem. Intel bifurcated the performance desktop space by embracing a two-socket strategy, something we'd never seen from Intel and only once from AMD in the early Athlon 64 days (Socket-940 and Socket-754).
LGA-1366 came first, but by the time LGA-1156 arrived a year later it no longer made sense to recommend Intel's high-end Nehalem platform. Lynnfield was nearly as fast and the entire platform was more affordable.
When Sandy Bridge launched earlier this year, all we got was the mainstream desktop version. No one complained because it was fast enough, but we all knew an ultra high-end desktop part was in the works. A true successor to Nehalem's LGA-1366 platform for those who waited all this time.

Left to right: Sandy Bridge E, Gulftown, Sandy Bridge
After some delays, Sandy Bridge E is finally here. The platform is actually pretty simple to talk about. There's a new socket (LGA-2011), a new chipset (Intel's X79), and of course the Sandy Bridge E CPU itself. We'll start with the CPU.

Friday, November 11, 2011

CPU Chart, Processors Hierarchy 2011

What about this other CPU that’s not on the list? How do I know if it’s a good deal or not?
This will happen. In fact, it’s guaranteed to happen because availability and prices change quickly. So how do you know if that CPU you have your eye on is a good buy in its price range?
Here is a resource to help you judge if a CPU is a reasonable value or not: the gaming CPU hierarchy chart, which groups CPUs with similar overall gaming performance levels into tiers. The top tier contains the highest-performing gaming CPUs available and gaming performance decreases as you go down the tiers from there.
This hierarchy was originally based on the average performance each CPU achieved in our charts test suite using only four game titles: Crysis, Unreal Tournament 3, World in Conflict, and Supreme Commander. We have since incorporated new game data into our criteria, but it should be known that any specific game title will likely perform differently depending on its unique programming. Some games, for example, will be severely graphics subsystem-limited, while others may react positively to more CPU cores, larger amounts of CPU cache, or even a specific architecture. We also did not have access to every CPU on the market, so some of the CPU performance estimates are based on the numbers similar architectures deliver. Indeed, this hierarchy chart is useful as a general guideline, but certainly not as a one-size-fits-all CPU comparison resource. For that, we recommend you check out our CPU Performance Charts.
You can use this hierarchy to compare the pricing between two processors, to see which one is a better deal, and also to determine if an upgrade is worthwhile.

Thursday, November 10, 2011

Holiday Budget System Buyers' Guide: Celeron, Athlon II, Llano


Sandy Bridge Celerons
Intel released Sandy Bridge-based Celeron CPUs in early September, and these started appearing in retail channels by the middle of that month; we provided a brief overview of these parts. The Celeron that stands out is the G530, a dual-core CPU clocked at 2.4GHz with 2MB L3 cache and on-die Intel HD Graphics. This processor lacks Hyper-Threading and Quick Sync support, and it has a TDP of 65W (though it will generally use far less power than that). While Intel's suggested pricing is a meager $42, retail prices have stayed steady since its release at $55-60. It is the least powerful Intel dual-core CPU, with only the single-core G440 available for less money.
If you've been building and using computers for years, you know there is a stigma attached to the Celeron name. For a long time, Celerons were crippled to the point of near-unusability for even the most basic tasks. That has changed: our basic benchmarks indicate that the G530 is not at all an almost-garbage CPU. The Celeron stigma is dead.
Athlon II X2s
AMD's Athlon II X2 Regor-based 45nm dual-cores have been a mainstay of budget computing since their introduction in 2009. The Athlon II X2 250, clocked at 3.0GHz with 2MB L2 cache, is essentially as capable today as it was two years ago for basic usage. For example, 1080p videos on YouTube are no more difficult to decode and Microsoft Office 2010 isn't much more CPU-hungry than Office 2007 was. Given that most computers I assemble are budget systems, I've now used the Athlon II X2 250 for more builds than any other CPU. Is that about to change?
Llano APUs
AMD's most recent APUs (accelerated processing units) have also expanded into the budget processor range. These Fusion APUs combine both the CPU and Radeon "cores" on a single die. Anand reviewed the most capable APU back in June, and compared the A6 model to Intel's Sandy Bridge Pentium in late August. The more recently released 32nm A4-3300 chip (overviewed by Anand in September) is a dual-core part clocked at 2.5GHz with 1MB total L2 cache and featuring AMD's Radeon HD 6410 graphics—160 GPU cores clocked at 443MHz. Its nominal TDP is 65W. Priced around $70, the A4-3300 is only about $10 more than the Celeron G530 and Athlon II X2 250. It promises better graphics performance—but how does the least expensive A-series APU compare to inexpensive discrete video cards, and do you sacrifice processor performance for better graphics?
Battle of the Budget Processors: Benchmarks
While we didn't put the Celeron G530 and A4-3300 through our extensive Bench suite, here are a few benchmarks that show how they stack up against the venerable Athlon II X2 250. All benchmarks were performed using an Antec Neo Eco 400W power supply, a Western Digital Blue 500GB WD5000AAKX hard drive, and a 2x2GB kit of DDR3-1333 with a clean installation of Windows 7 Enterprise 64-bit, with only the manufacturer-supplied drivers installed.
Conversion of a PowerPoint Presentation to a PDF
For this benchmark, I converted a 100-slide, 25MB PowerPoint file to a PDF using Microsoft Office 2010's integrated "Save as PDF" option. As you can see, the Athlon II CPU performs this task slightly faster than the Celeron, though in reality you're only going to notice a difference if you're converting extremely large PowerPoint files. The Fusion APU is substantially slower—this is a difference you will notice in real-world usage scenarios.
7-Zip performance
These values were obtained using 7-Zip's built-in benchmark function with a 32MB dictionary. AMD's Athlon II CPU has a more noticeable advantage here over the Celeron—you will notice a difference if compressing/decompressing either many files or large files. The A4-3300 again performs palpably worse--no surprise given its lower 2.5GHz clock compared to the Athlon's 3.0GHz.
FastStone image resizing
For this test, I resized 50 4200p pictures down to 1080p resolution using FastStone's batch image conversion function. Again, the two CPUs perform similarly, though this time Intel takes the lead. The AMD APU once again lags significantly behind the two CPUs.
x264 HD encode test
Graysky's x264 HD test (v. 3.03) uses x264 to encode a 4Mbps 720p MPEG-2 source. The focus here is on quality rather than speed, thus the benchmark uses a 2-pass encode and reports the average frame rate in each pass. The difference between the Athlon II and Celeron CPUs is essentially nil; both offer better performance than the AMD APU.
Power consumption
Like the above benchmarks, all components were held equal for power consumption testing sans the CPU and motherboard. For the Athlon II platform, I used the ASRock 880GM-LE motherboard, for the Intel platform I used the ASRock H61M-VS motherboard, and the APU was tested on an ASRock A55M-HVS. This is where the efficiency of the newer architectures truly outshines the older Athlon II design. Measurements were taken using a P3 International P4400 Kill A Watt monitor and reflect the entire system, not just the CPU.
Intel's Celeron still leads for low power use, but Llano is at least within striking distance. The older Athlon II X2 uses around 50% more power than Llano for these two tests--or around 17 to 30W more power. Taking the lower number and going with a system that's only powered on eight hours per day, we end up with a difference of around 50kWh per year--or $4 to $15 depending on how much you pay for electricity. If you're in a market where power costs more, obviously there's a lot to be said for going with the more efficient architectures.
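The annual-cost figure above is simple arithmetic, and can be reproduced with a small sketch (using the article's own numbers: a ~17W gap, eight hours of use per day, and electricity at roughly $0.08 to $0.30 per kWh):

```python
def annual_kwh(extra_watts, hours_per_day):
    """Extra energy consumed per year, in kWh, for a given wattage gap."""
    return extra_watts * hours_per_day * 365 / 1000.0

gap_kwh = annual_kwh(17, 8)   # ~49.6 kWh per year
low_cost = gap_kwh * 0.08     # ~$4 at cheap electricity rates
high_cost = gap_kwh * 0.30    # ~$15 at expensive rates
```

At the 30W end of the gap, the same formula yields roughly 88 kWh per year, so the savings argument only gets stronger for heavier use.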
Gaming benchmarks
Next we test how the AMD A4-3300 APU's graphics prowess stacks up against a budget GPU. The AMD Athlon II and Intel Celeron CPUs were paired with an AMD Radeon HD 5670 512MB GDDR5 discrete GPU, as neither chip's integrated graphics is capable of producing a tolerable gaming experience. The A4-3300 was not paired with a discrete GPU.
Left 4 Dead 2
For the Left 4 Dead 2 benchmark, we used a 1024x768 resolution with all settings at maximum (but without antialiasing). The AMD APU delivers almost 40 frames per second by itself, so no discrete graphics card is required. Subjectively, gameplay was smooth and fluid on the APU. However, bumping up the resolution to even 720p could be an issue, even with less demanding games.
DiRT 3
For the DiRT 3 benchmark, we used DirectX 11 at 1024x768 resolution, but this time graphics options were set to the low preset. Even then, the AMD APU struggled to breach the 30 frames per second threshold, and DiRT 3 clearly did not run as smoothly as Left 4 Dead 2. That said, it remained playable, and if you're tolerant of lower resolutions, it performs fine in windowed mode.
Keep in mind that we're using the bottom-rung Llano APU for these tests, and it's a pretty major cut from the A6 models--half the shader cores, but with a slightly higher clock, and only a dual-core CPU. Where the A6 and A8 can legitimately replace budget discrete GPUs, the same cannot be said for the A4 APUs. The lowest priced A6-3500 will set you back around $100, but it drops the CPU clock to 2.1GHz and only adds a third core. Meanwhile the quad-core A6-3650 will run $120 ($110 with the current promo code), but it sports a 2.6GHz clock with the HD 6530D graphics (and a higher 100W TDP). At that point, you might also be tempted to go for the A8-3850, with the full HD 6550D graphics and a 2.9GHz clock, which brings the total for the APU to $135. All of these APUs will work in the same base setup as our Llano build, but obviously the price goes up quite a bit. If you'd like added processing and graphics power, though, the quad-core parts make sense.
Summary
As you can see, the Athlon II and Celeron CPUs are very evenly matched across a range of basic productivity tests, while the Fusion APU typically lags behind, at least for office productivity and encoding tasks. That said, the A4-3300 is capable of delivering an acceptable gameplay experience for casual gamers without necessitating a discrete GPU. Additionally, Intel's newer Sandy Bridge architecture and AMD's newer Llano architecture result in dramatically lower total system power consumption at both idle and load compared to the aging AMD Regor architecture.
So which CPU should you buy for your budget build? In terms of upgradeability, socket AM3 is still viable. In the short term, Phenom II quad-cores are already inexpensive, starting at just over $100—so they will be even cheaper in another year or two. Of course, Bulldozer CPUs are compatible with many AM3+ motherboards and could be a wise upgrade in a few years as well. Intel's LGA 1155 socket is also very upgrade-friendly—the Celeron G530 is, after all, the least powerful Sandy Bridge CPU (aside from the sole single-core SKU). The Core i3-2100 will likely sell for less than $100 in another year or so (at least on the secondhand market), and more powerful Core i5 and i7s could keep today's Intel budget build alive and well for maybe as much as five more years. Like the Celeron G530, AMD's socket FM1 has nowhere to go but up from the A4-3300 APU. That said, LGA 1155 currently offers far more powerful CPUs than the high-end A8-3850.
I think in this case, given how evenly the CPUs perform (aside from power consumption), and that both offer lots of upgrade potential, the decision will come down to overall platform cost and features. The A4-3300 APU does offer an acceptable general, basic computing experience—its real strength is its ability to play less resource-intensive games without the extra cost of a discrete GPU. We cover a few budget AMD and Intel platform motherboards on the next page.

Wednesday, November 9, 2011

Nvidia Tegra 3: A Whole New Level of Mobile Gaming


The final details on Tegra 3 have finally been unveiled. It's not exactly a revolutionary design, but Nvidia's new SoC promises to deliver notebook-level performance.
Today, Nvidia formally unveils Tegra 3 (codenamed "Kal-El"). Given the number of leaks within the past few months, though, most of the technical specifications regarding this latest SoC don't really surprise us.
Two months back, Nvidia published two whitepapers that basically spelled out what Tegra 3 would be all about. At that time, the company was adamant about referring to its quad-core SoC only as Kal-El. But who were they kidding? We all knew it was Tegra 3.
Tegra 3 Highlights:
  • 5x performance of Tegra 2
  • Better battery life
  • 3x faster GPU
What's interesting is that Nvidia is promising better performance and improved battery life (compared to Tegra 2). That's a tall order in the world of notebooks, but it's an even more difficult task when you're dealing with an embedded architecture. The voltage requirements are much tighter and we're dealing with power consumption an order of magnitude lower than the familiar x86 processors.
However, Kal-El is technically a quint-core SoC (Cortex-A9 architecture), because it features a fifth "companion" CPU core that handles low-overhead tasks such as syncing email, playing a ringtone, and keeping applications alive in standby mode. We've already covered the technical CPU side of how Nvidia accomplishes this in an earlier post, so it's not entirely new to us. (For those curious, SoC cache size is 32/32 KB L1 per core and 1 MB L2 shared among the four cores. The companion core has access to the full 1 MB of L2 cache when the main cores are idle.)
The end result is that we should finally be able to watch Flash video at sites like Hulu without major stutter, which has always been one of our major complaints concerning Android devices. After all, what good is touting Flash compatibility over Apple's iOS devices when video playback is choppy?
NVIDIA Tegra 3: Side by Side Comparisons
On the GPU end, we should make it clear that Nvidia has basically recycled and supercharged its Ultra Low Power GeForce GPU from its Tegra 2 SoC. The graphics core is still restricted to OpenGL ES 2.0, so it's not a move up in the same way we'd think of DX10 to DX11. Furthermore, unlike Nvidia's desktop GPUs, both SoCs are based on an architecture that pre-dates the company's unified design.
With Tegra 2, you’re looking at four pixel shader cores and four vertex shader cores. This means the SoC operates most efficiently when the ULP GeForce GPU is presented with an even mix of vertex and pixel shader code. Tegra 3 basically doubles the number of pixel shaders, which means it's going to operate most efficiently when faced with an uneven, pixel-heavy mix of code.
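To see why the shader split matters, here's a toy utilization model (my own sketch, not anything from Nvidia): frame time is set by whichever unit type is the bottleneck, so efficiency peaks when the workload mix matches the unit mix.

```python
def shader_efficiency(pixel_units, vertex_units, pixel_work, vertex_work):
    """Fraction of total shader capacity doing useful work when frame
    time is dictated by the busier (bottleneck) unit type."""
    frame_time = max(pixel_work / pixel_units, vertex_work / vertex_units)
    total_units = pixel_units + vertex_units
    return (pixel_work + vertex_work) / (frame_time * total_units)

# Tegra 2 (4 pixel + 4 vertex): an even workload mix keeps every core busy.
# Tegra 3 (8 pixel + 4 vertex): a 2:1 pixel-heavy mix is needed for full
# utilization; with an even mix, the pixel shaders finish early and sit idle.
```

Under this model, a 4+4 design hits 100% utilization on an even mix, while an 8+4 design needs twice as much pixel work as vertex work to do the same.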
While the graphics core hasn't undergone a revolutionary overhaul, you are going to see games optimized for Tegra 3 that enable dynamic lighting, motion blur, and more realistic water and ballistic effects. This is partly due to the fact that many games continue to be CPU bound, so the quad-core architecture is really going to help.
NVIDIA Tegra 3: Developers Bring Next-gen Games to Mobile
However, the increased memory bandwidth also increases the capabilities of the GPU. How much is a matter of debate, but we expect that this will open up a whole new world of mobile gaming.
Nvidia is already advertising the fact that you can hook up a Tegra 3 tablet to a 3D monitor or HDTV and use a PS3, Xbox 360, or Wii controller to play games. That's already possible with a Honeycomb-based tablet (minus the 3D part), but it's a less-than-ideal situation because Tegra 2 lacks the horsepower to make gameplay completely smooth.
Tegra 3 promises to close that gap, which highlights the fact that we're coming to a point of greater convergence between devices. Nvidia may have finally taken the first step in demonstrating that this isn't just a pipe dream, but a real possibility as the technology matures. A tablet that replaces your gaming console? That's something that looks mighty tempting.

Gaming CPU Hierarchy Chart


What about this other CPU that’s not on the list? How do I know if it’s a good deal or not?
This will happen. In fact, it’s guaranteed to happen because availability and prices change quickly. So how do you know if that CPU you have your eye on is a good buy in its price range?
Here is a resource to help you judge if a CPU is a reasonable value or not: the gaming CPU hierarchy chart, which groups CPUs with similar overall gaming performance levels into tiers. The top tier contains the highest-performing gaming CPUs available and gaming performance decreases as you go down the tiers from there.
However, a word of caution: this hierarchy is based on the average performance each CPU achieved in our charts test suite using only four game titles: Crysis, Unreal Tournament 3, World in Conflict, and Supreme Commander. While we feel this represents an acceptable cross-section of typical gaming scenarios, a specific game title will likely perform differently. Some games, for example, will be severely graphics subsystem-limited, while others may react positively to more CPU cores, larger amounts of CPU cache, or even a specific architecture. We also did not have access to every CPU on the market, so some of the CPU performance estimates are based on the numbers similar architectures deliver. Indeed, this hierarchy chart is useful as a general guideline, but certainly not as a one-size-fits-all CPU comparison resource.
You can use this hierarchy to compare the pricing between two processors, to see which one is a better deal, and also to determine if an upgrade is worthwhile. I don’t recommend upgrading your CPU unless the potential replacement is at least three tiers higher. Otherwise, the upgrade is a largely lateral move and you may not notice a worthwhile difference in game performance.
Gaming CPU Hierarchy Chart
Intel / AMD
Core i7-2600, -2600K
Core i7-965, -975 Extreme, -980X Extreme
Core i7-970, -960
Core i5-2500, -2500K

Core i7-860, -870, -875K, -920, -930, -940, -950,
Core i5-750, -760
Core 2 Extreme QX9775, QX9770, QX9650
Core 2 Quad Q9650
Core i3-2100
Phenom II X4 Black Edition 975
Core 2 Extreme QX6850, QX6800
Core 2 Quad Q9550, Q9450, Q9400
Core i5-650, -655K, -660, -661, -670, -680
Phenom II X6 1100T BE, 1090T BE, 1075T
Phenom II X4 Black Edition 970, 965, 955 
Core 2 Extreme QX6700
Core 2 Quad Q6700, Q9300, Q8400, Q6600, Q8300 
Core 2 Duo E8600, E8500, E8400, E7600
Core i3-530, -540, -550
Phenom II X6 1055T
Phenom II X4 945, 940, 920, 910, 910e, 810
Phenom II X3 Black Edition 720, 740
Athlon II X4 645, 640, 635, 630
Athlon II X3 455, 450, 445, 440, 435
Core 2 Extreme X6800
Core 2 Quad Q8200
Core 2 Duo E8300, E8200, E8190, E7500, E7400, E6850, E6750
Phenom II X4 905e, 805
Phenom II X3 710, 705e
Phenom II X2 565 BE, 560 BE, 555 BE, 550 BE, 545
Phenom X4 9950
Athlon II X4 620
Athlon II X3 425
Core 2 Duo E7200, E6550, E7300, E6540, E6700
Pentium Dual-Core E5700, E6300, E6500, E6600, E6700
Pentium G6950
Phenom X4 9850, 9750, 9650, 9600
Phenom X3 8850, 8750
Athlon II X2 265, 260, 255
Athlon 64 X2 6400+
Core 2 Duo E4700, E4600, E6600, E4500, E6420
Pentium Dual-Core E5400, E5300, E5200 
Phenom X4 9500, 9550, 9450e, 9350e
Phenom X3 8650, 8600, 8550, 8450e, 8450, 8400, 8250e
Athlon II X2 240, 245, 250
Athlon X2 7850, 7750
Athlon 64 X2 6000+, 5600+
Core 2 Duo E4400, E4300, E6400, E6320
Celeron E3300
Phenom X4 9150e, 9100e
Athlon X2 7550, 7450, 5050e, 4850e/b
Athlon 64 X2 5400+, 5200+, 5000+, 4800+
Core 2 Duo E5500, E6300
Pentium Dual-Core E2220, E2200, E2210
Celeron E3200
Athlon X2 6550, 6500, 4450e/b, 
Athlon X2 4600+, 4400+, 4200+, BE-2400
Pentium Dual-Core E2180
Celeron E1600
Athlon 64 X2 4000+, 3800+
Athlon X2 4050e, BE-2300
Pentium Dual-Core E2160, E2140
Celeron E1500, E1400, E1200

Summary

There you have it folks: the best gaming CPUs for the money this month. Now all that’s left to do is to find and purchase them.
Also remember that the stores don’t follow this list. Things will change over the course of the month and you’ll probably have to adapt your buying strategy to deal with fluctuating prices. Good luck!

Saturday, November 5, 2011

AMD Bulldozer Speed Record Broken at 8.58GHz

Maybe with liquid helium, he'll achieve ludicrous speed.

So, about that Guinness World Record-beating clock speed of 8.46 GHz we talked about last week – that's been beaten again.
The very same Andre Yang that achieved that remarkable speed has upped his efforts – and his AMD FX-8150's limits – to an astounding 8.58 GHz.
Yang kept the same Asus Crosshair V Formula motherboard, but this time he cranked the voltage up from 1.992V to 2.076V. This was also done with liquid nitrogen, so it's possible that there's still room for more with the even-more-effective liquid helium.

source: tomshardware.com

Tuesday, February 16, 2010

Phenom II X2 555 Vs. Pentium G6950: New Budget Dual-Core Titans : AMD's New CPU Portfolio And The New Phenom II X2 555


Intel finally has quad-threaded processors to compete in the sub-$200 space that AMD has dominated for so long: the Clarkdale-based Core i3 and Core i5 CPUs. Notice we said quad-threaded: these are still dual-core parts with Hyper-Threading, yielding four logical cores. With its launch earlier in January, the company now offers a handful of viable value options for the LGA 1156 platform, with attractive scalability to higher-end Core i5 and Core i7 models.

AMD isn't taking this frontal assault on its turf sitting down, of course, and its retaliation strategy employs a sizable mix of clock speed bumps and reduced prices. The already-attractive price/performance ratio of the sub-$200 CPU market will most definitely take a turn for the better, and you, the enthusiast, win again.

In the midst of all of this new model chaos, we couldn't help but notice the Phenom II X2 555 Black Edition. At an aggressive 3.2 GHz, this is the fastest dual-core CPU that AMD has ever made, and the best part is that it boasts the same $100 price tag as its predecessor, the 550. Intel's counterpoint, the new Clarkdale-based Pentium G6950, is about $5 cheaper and has a slower clock rate of 2.8 GHz, but it does have the advantage of an efficient 32nm process and reportedly-unholy overclocking headroom.

So we couldn't help but wonder: which of these two entry-level offerings is the better bet? How does stock performance compare to a more expensive option, like the quad-core Core i5-750? And could either of these processors offer budget-busting performance if we overclock them, despite their dual-core "limits?"

We certainly slammed headfirst into a few surprises along the way (not all of them pleasant), and we didn't walk away innocent of a mistake or two. But before we dig into the dual-core battle, let's spend a little time looking at AMD's new processor portfolio.


                  Phenom II X2 555    Pentium G6950
Codename:         Callisto            Clarkdale
Process:          45nm                32nm
CPU Cores:        2                   2
Clock Speed:      3.2 GHz             2.8 GHz
Socket:           AM2+/AM3            LGA 1156
L1 Cache:         2 x 128KB           4 x 32KB
L2 Cache:         2 x 512KB           2 x 256KB
L3 Cache:         6MB                 3MB
Thermal Envelope: 80W                 73W