Sony PS Move motion controller review

JAPANESE ELECTRONICS GIANT Sony announced the Playstation (PS) Move back in 2009 with much fanfare. The company didn't have any titles to show off - it still doesn't, really - but it was wowing the crowds at games expos with the lag-free and responsive motion controller.
Sony had previously made promises it couldn't keep at E3, with games demos that proved to be tweaked engines rather than games running live. So the tech demos it ran for the PS Move looked very impressive but were taken with a healthy pinch of salt. Then Sony started demoing titles for the upcoming PS Move with the same lag-free, responsive motion controller.
Those games demos weren't a trick of the light. Sony got the press interested with talk of magnetometers, gyroscopes, accelerometers and pin-point accuracy. As usual, Sony got the edge with advanced technology, after spending five years watching its market share drift to Wii punters.
The company also recently launched ahead of Microsoft's rival Kinect technology, hitting UK shelves a month before the Vole. But, as usual, Sony hasn't landed a killer blow with a decent array of titles to support the PS Move and entice the great unwashed. There are a couple of patches for good titles like Heavy Rain and a rerelease of Eye-Pet. There's even a rejigged version of another old PS2 title that used the Playstation camera as a cheap augmented reality trick, now called Start the Party. But there's no killer app to sell PS Moves by the truckload, which is a shame. We think those big glowing orbs have much more potential than the Wii, and the jury is still out on Microsoft's Kinect.
Sony sent The INQUIRER two PS Move motion controllers and an array of launch titles, including Start the Party, Kung Fu Riders, Eye-Pet and Sports Champions. We had to purchase the Playstation Eye camera separately but it can be bought as part of the Starter Pack. The pack will only set you back about £40, which is about £80 less than Microsoft's Kinect, so it is a much cheaper option.
[Image: Sony PS Move motion controller]
Installation is simple, with the PS3 automatically picking up the camera. The controllers need to be plugged in via USB before they are recognised by the system, but the USB cable only needs to be plugged in again for recharging. The controllers have a claimed ten-hour battery life, not bad given they also have to power the glowing orb light and vibration control. We did run out of battery, which meant trying to use the PS Move with the controller disconnected - not recommended. Because the camera picks up so much depth of field in some games, like table tennis in Sports Champions, you need a lot of room to move. As most games support up to four players, it left us wishing the PS3 had more than two USB ports for recharging controllers. The camera takes up one port, which leaves just one USB port available for recharging.
We also stuck to Sony's rigid set-up instructions to make sure the gameplay experience was optimised for the PS Move. Like Kinect, Sony's control system requires a large, clean, tidy and well-lit room. For one player at a time it's not a problem, but Sony obviously thinks we're all millionaires living in huge houses. We needed at least six to eight feet of space from the TV, with another six feet either side. This proved most difficult on Eye-Pet, where I tried to fit myself, two children, one cat and one augmented reality virtual pet into a tight space on the floor. We also had a real problem calibrating the controllers for this game, and success seemed to come down to chance more than anything else.
The PS Move controller is, on paper and in hand, a great piece of technology. The big ball at the end lights up to provide a point of reference that the camera can track. That's not just very tight tracking left, right, up and down, but backwards and forwards as well. The system automatically assigns a different colour to each player's controller as it is connected, and games use many other colour changes for different functions to add to the gameplay experience.
The ball can also be converted, using Sony's tech voodoo, into anything you can hold in your hand. In this review, with the help of my boys, we held a sword, bat, torch, axe, Frisbee, swatter, hammer, fan, hair shaver, bow and pencil, to name but a few. All were rendered with Sony's superior graphics engine, increasing in size the nearer we moved to the camera and decreasing as we moved away. But it was the lag-free play that held our attention. Try as we might, spinning the controllers in our hands, we couldn't catch the PS Move out, so there's not even a millisecond of lag to report.
[Image: PS Move controller buttons]
Aside from the glowing orb, the controller has a huge PS Move button in the middle, a trigger button for your right finger and Sony's traditional square, triangle, circle and cross buttons around the large Move button. There are also small Select and Start buttons on the left and right hand sides. The main buttons sat naturally underneath our fingers, and most of the simpler games like Start the Party didn't use any other buttons.
But the square, triangle, circle and cross buttons were a little harder to find by feel, especially in manic games like the awful Kung Fu Riders. With so much time devoted to holding down the PS Move button in some games, a move to another button takes some getting used to. This might be more of a problem when Sony finally releases some headline titles like Socom 4 or other FPS games. Where quick fragging is of the essence, that vital move to the side buttons could cost valuable milliseconds.
The layout feels very comfortable to use, and Sony has done itself a favour by making the controller slender and robust enough to take a few whacks from frustrated players throwing it across the room. I was also pleased that my kids only suffered shattered egos rather than skulls when beating each other up in the gladiator duels in Sports Champions. Without enough space to move, both boys spent more time actually hitting each other with the spongy ball than landing virtual blows.
But that all-important gameplay mechanic with the controllers is, unequivocally, fantastic. Sports Champions lacks the cutesy charm of Nintendo's Wii Sports, but we'd rather play its table tennis game than anything the Wii has to offer. The PS Move adds so much depth and control that the gameplay of table tennis opens up. You can orchestrate smashes on Wii tennis with a dainty flick, but the PS Move asks its players to actually perform a smash. Ditto for lobs, backhands, slices and topspins. The 360 degree representation of the bat meant we got full control over gameplay, giving some games richer depth than we expected.
Start the Party is a cheap and dirty post pub five minute knee trembler. We enjoyed it thoroughly at the time but you might regret it in the morning. You are not asked to think beyond following the mighty Bruce Campbell's instructions to whack, stab or swat something on screen. It is an old PS2 idea with the camera displaying our mug on screen in a virtual world and picking up our controller movement. The idea is enhanced by the PS Move and it proved to be my kids' favourite.
[Image: PS Move in play]
Eye-Pet is also a revamped older title that has been given the Move makeover. Calibration was awful. Sports Champions has a much more complicated calibration process, because its games ask a lot more of the PS Move, yet it worked every time. Ditto for Start the Party, which only required us to point the controller at the camera once and press the PS Move button. But Eye-Pet's calibration hardly ever worked, even though it should have been relatively plain sailing.
Kung Fu Riders is a terrible launch title, a one-trick pony idea extended to an entire game. Performing kung fu moves on a chair while earning points for style is a neat idea badly executed, and the game should never have seen the light of day. Even my normally forgiving kids didn't rate this one.
In Short
It is hard not to judge Sony's technology harshly in a land where content is king. Without a killer app to show off this great technology, the PS Move could be over before it begins. The company's hyperbole is bang on the money for us, however. It is truly a marvel to play with and we didn't think we'd enjoy it as much as we did. µ
The Good
Affordable price for motion control gaming, lag-free precision controllers.
The Bad
Launch titles aren't going to entice the masses, side buttons are hard to find.
The Ugly
PS Move has to be more than a technical demonstration that Sony won't fully support if punters don't buy in to the idea.
Bartender's Score
8/10

Crossfiring the AMD HD6870 review

WE RECENTLY looked at the plain vanilla AMD Radeon HD6870 graphics card holding its own against the super overclocked Nvidia Geforce GTX460 from Gainward. This time, we'll see how a pair of those HD6870 cards scales up in Crossfire and holds its own against the Green Goblin's GTX580.
Nvidia's GTX580 is the new king of the single-GPU block until the HD6970 appears in another month or so, even accounting for its freshly rumoured delays. The comparison is also interesting because a pair of HD6870s sells for close to the actual retail price of a single high-end GTX580, in particular the $550-class overclocked varieties from Asus, Gainward and others.
Since I tested the GTX580 on higher end systems, I set up another fast configuration to review the HD6870 Crossfire pair for a more apples-to-apples comparison. I used the Xeon version of the Intel Core i7 980X six-core extreme CPU - also known as the Xeon X3680 - together with an Asus Sabertooth X58 high-end mainboard built with MIL-spec grade components. Twelve gigabytes of memory made up of three 4GB A-Data DDR3-1600 high density DIMMs, a 160GB Intel SSD and a Xigmatek 700W PSU completed the setup. Ah yes, there was also the new Xigmatek Aegir six-pipe heatsink-fan assembly, keeping the CPU around 37C at idle, or just 4C above Singapore room temperature. More about these in a separate Sabertooth review.
[Image: two HD6870s in Crossfire on the Sabertooth X58]
As you can see here, the two graphics cards fit nicely into the two x16 PCIe graphics slots on the mainboard. Unlike some other high end X58 mainboards, the Sabertooth has no extra PCIe bridges or x8/x16 switches, so PCIe latency is not affected by any such extra circuitry. This should lead to the best theoretical scaling measurements, even though the differences aren't large.
We ran the latest Catalyst 10.10 drivers on the usual Windows 7 64-bit operating system, with both 3Dmark Vantage for DX10 and Unigine Heaven 2.1 for DX11.
Here are the results for the single HD6870:
[Image: benchmark results, single HD6870]
... and for the dual HD6870 Crossfire setup:
[Image: benchmark results, HD6870 Crossfire]
As you can see, in both Performance and Extreme modes there's quite decent scaling when adding the second card - even better in the Full HD-class Extreme setting, with nearly double the score. With that performance, the card pair beats a single GTX580 in 3Dmark Vantage by some margin. On the comparative GTX580 setups, the 3Dmark Vantage results hover around the 24,000 mark in Performance mode and 13,000 in Extreme mode, though those figures also benefit from the PhysX offload in the CPU benchmarks.
Heaven single (top) and dual-card (bottom) results:
[Image: Unigine Heaven 2.1 results, single and dual HD6870]
However, Unigine Heaven, even in this most recent update, still shows basically no speedup from a dual-card setup of any kind.
In summary, while awaiting the HD6970 and HD6990 - which should hand the single GPU performance crown back to AMD - Radeon aficionados can get some pretty decent performance for a few bucks by pairing up two HD6870s in Crossfire. The scaling is good, the cards are reasonably low power, two of them together cost just a bit more than a single GTX580, and you have plenty of display output options with Eyefinity.
In the meantime, AMD's rescheduling of its HD6900 series announcement will hopefully result in a bit higher clock speed for the new cards, just in case they need to keep some distance from the newly resurgent Nvidia. µ



Avermedia Windows 7 TV Starter Kit review

YOU WOULD THINK by now that Freeview certified TV tuners for PCs would be ten a penny, but Freeview’s website relegates the entire topic of DVB-T tuners to a single answer in its FAQ. Of course, there are dozens of digital tuners out there of all shapes and sizes, and from experience we know that most branded models work pretty well with Windows 7 Media Center.
But Avermedia has seen fit to go one step further and get one of its USB tuner dongles officially blessed by both Microsoft and Freeview, turning a fairly standard product into the Windows 7 Starter Kit. Obviously, this should mean that it’s an easy sell for those worrying types who like the comfort of a logo, but it does seem a little late in the day.
[Image: Avermedia Windows 7 TV Starter Kit]
This minimalist kit consists of an Avertv Hybrid Volar HX dongle and a driver CD that includes some cheesy Vole-produced how-to videos. The dongle itself has been available for about three years in non-certified form, and is a hybrid analogue/digital model. This means it can be used for either kind of TV input or an FM antenna, but not both at once. It's quite chunky, so for some installations you'll definitely need an extension cable (stingily not supplied).
According to Avermedia, as this bundle is aimed solely at Windows 7 users, there's no remote control supplied. We're still chewing over the logic of that one. If you want a remote you'll have to fork out about £20 for a third party Media Center model. The dongle has a proprietary AV input socket on the side, but the cable isn't supplied, unlike with the standard Volar HX unit advertised on the Avermedia website.
The dongle drivers installed easily and it was immediately recognised by Windows 7 Media Center. We successfully tried both analogue and digital inputs, and found picture quality to be indistinguishable from our other Hauppauge tuners. Although it's not supplied, you can download the free Avertv viewing app from the website and use it with this tuner if you wish, and it's also compatible with Avermedia's Snug TV location-shifting service, for which a 30-day free trial is available to registered owners.
In Short
The Avermedia Windows 7 TV Starter Kit is an easy plug-and-go solution for Windows 7 users who just want something guaranteed to work. However, it really brings nothing new technically to the world of PC TV. µ
The Good
Full Windows and Freeview certification, analogue or digital reception, no software needed for Windows 7.
The Bad
Bulky, no remote, no AV input cable, no USB cable, expensive.
The Ugly
An old product in a new box, but it still works well enough.
Bartender's score
6/10

AMD Radeon HD6970 and HD6950 review

WITH ITS LATEST PARTS, the last major launch of the 40nm GPU generation, AMD has now presented its answer to the recently announced and seemingly quite well performing Nvidia chippery, the GTX580. The Radeon HD6970 and its lower-cost sibling, the HD6950, fare as well as expected in the benchmarks, and provide a nice feature boost too.
The twin graphics engines for much faster geometry and rasterisation plus massively sped up tessellation, Powertune real-time performance and power adjustment, as well as higher double precision floating point performance, are just some of the benefits. Image quality wise, there's Enhanced Quality Anti-Aliasing (EQAA) and morphological anti-aliasing (MLAA), plus support for deep colour and other new pixel formats.
To display all that, the twin Displayport 1.2 and HDMI 1.4a ports together with dual DVI outputs support up to six monitors from a single card. Finally, the approximately 3TFLOPS single precision and nearly 1TFLOPS double precision floating-point performance will help in compute jobs too, aided by the doubled memory capacity for locally executed tasks without hopping non-stop over the slow PCIe connection.
[Image: HD6970 in the X58 test system]

How about the performance boost versus the best of the past generation, the HD5870? In this case, besides a couple of benchmarks comparing the old reference HD5870, I took along what is probably the best HD5870 card ever made: the Asus Matrix 5870 Platinum, a custom design in everything from the PCB, with the best componentry and massive voltage tweaks, to the much more powerful yet less noisy cooling and, yes, a full 2GB of GDDR5 memory just like the new cards - which helps in multi-monitor Eyefinity setups, too. You can see it here inside the same test setup:
[Image: Asus Matrix HD5870 Platinum in the X58 test system]
Why is it interesting? Well, such a card can now be had at quite a discount since AMD launched the new cards and, on top of 6 per cent higher processing and memory speeds than the reference design, its doubled memory offers a more equal apples-to-apples comparison against the new cards in detail-intensive and other memory capacity-dependent benchmarks, even beyond, say, 3Dmark11.
Coming back to the naming, I still think these cards should rightfully have been called the HD6870 and HD6850, as the up to 30 per cent speedup from the HD5870 and HD5850 kind of justifies that. What is now called the HD6870 would then have been the HD6770, as the original plans presumably intended. And the dual GPU successor to the HD5970, which would have been the HD6970, will now be called the HD6990 early next year.
The HD6970 and HD6950 are just as long as the HD5870, if not longer, so beware the case space requirements when purchasing. Otherwise, I think the card design, even though boxy, is just as modern as the HD5870 generation's. I squeezed them into the same very compact Gigabyte chassis that housed our test platform for last month's HD6870 review - the uber-reliable Asus Sabertooth X58 mainboard with military grade componentry, hosting the Xeon version of the 980X 3.33GHz six-core Intel processor with twelve gigabytes of A-Data DDR3-1600 memory and Windows 7 64-bit Ultimate running off a 160GB Intel X25-M SSD.
The new Xigmatek Aegir direct heat-pipe touch HSF cooled the CPU, quite well at that with 37C idle temps with Turbo enabled. The Aegir is a bit more compact than the previous models, and direct touch heat pipe models like this are now prevailing in the high end heatsink market. A Xigmatek 700W PSU feeds the whole setup. This PSU actually handled two HD5870s in Crossfire very well too, together with the 6-core setup, not bad for a 700W unit.
The display used was the brand new Gigabyte Envision P2271wL LED monitor, a low power, high contrast 1920x1080 unit with a 10 million to 1 contrast ratio, full HD resolution and a 5ms response time. The very compact, light monitor has none of the usual hardware buttons, just on-screen touch controls for power and settings. At just 3.5kg net weight and less than 30W power consumption, the monitor is perfect for mobile benchmark testbeds, too. It's just a pity there's no 1920x1200, 16:10 aspect ratio version available.
[Image: Gigabyte Envision P2271wL monitor]

So, the cards we will look at here will be, in order of reference: the original reference ATI Radeon HD5870 1GB; the Asus Matrix HD5870 Platinum 2GB at 900MHz GPU clock factory pre-set; and the AMD HD6950 and HD6970, both reference 2GB designs.
Here are the results. Let's look at the now already old 3Dmark Vantage DX10 test in both Performance and Extreme modes:
First the HD5870 reference...
[Image: 3Dmark Vantage results, reference HD5870]
Then the Asus Matrix HD5870 Platinum...
[Image: 3Dmark Vantage results, Asus Matrix HD5870 Platinum]
Followed by HD6950...
[Image: 3Dmark Vantage results, HD6950]
And finally, the HD6970 top of the crop...
[Image: 3Dmark Vantage results, HD6970]
Well, you can see that there are differences, but not that much. The HD6950 is about the same as the old HD5870, and a little slower than the Asus HD5870 pre-overclocked Platinum version.
Now we move to the Unigine Heaven 2.1 benchmark, focused on DX11 of course:
[Image: Unigine Heaven 2.1 results, HD5870 2GB, HD6950 and HD6970]

Here, the difference between the old and new becomes far more obvious, due to the HD6900 series' optimisations for DX11 and tessellation.
Finally, here's our first run on the new 3Dmark 11 benchmark, just in the basic mode this time. We only ran it on the three cards that had the full two gigabytes of memory here, the Asus Matrix HD5870 as well as the HD6950 and HD6970:
[Image: 3Dmark 11 results, Asus Matrix HD5870, HD6950 and HD6970]
We can see that, despite the new DX11 enhancements in the HD6900 series, the best of the HD5870 does come really close here in 3Dmark11. The differences between the generations are not so pronounced as in the Heaven test.
In Short
The AMD Radeon HD6970 and HD6950 bring along a bunch of new features and power management capabilities, plus the required high-end speedup to fend off the Nvidia GTX580 assault as well as its lower-end GTX570 spin-off. With 8-pin plus 6-pin power connectors on the HD6970, there's enough extra juice flowing for some decent overclocking headroom even on this reference design. And, if the GPU vendors fit efficient but slightly slimmer coolers - say, 1.75-slot thick instead of the full two-slot depth - we could have Crossfire pairs or even quads without totally blocking the airflow between the cards. After all, Crossfire now scales quite a bit better than in the previous generation, with near double performance from two cards in quite a number of games, and triple the single card performance in Quadfire setups. So, just make sure system airflow is not an issue when putting these cards together.
And if you have problems finding them at launch, keep in mind that good upper-end HD5870 cards like the Asus Matrix HD5870 are still near the top of the performance bracket. I wonder how a Matrix HD6970 would look then - over 950MHz GPU clock for a start?
In the future, we'll look at the GPU compute performance as well as a Crossfire test, likely together with a Nvidia Geforce GTX580 setup. µ

AMD Phenom II X6 1090T CPU Review

Introduction
 
AMD has had mastery of the budget end of the processor market for some time now, for reasons including price/performance, low motherboard prices and platform longevity (it doesn't change sockets at the drop of a hat). The only downside has been sacrificing the high performance market to Intel (albeit at a much higher price). Recently Intel launched its six-core processor, the Core i7-980X, at the usual "Extreme Edition" price of around $1,000 (or £1,000 if you happen to live in the UK, due to sales tax and other historical factors), putting it out of reach of all but a few enthusiasts and professionals in specialized fields such as video editing.
Today AMD is launching its own six-core processor, code named Thuban. Two models launch today: the Phenom II X6 1090T (3.2GHz stock and up to 3.6GHz with Turbo Core) and the Phenom II X6 1055T (2.8GHz stock and up to 3.3GHz with Turbo Core). Not only are these launching at aggressive clock speeds and with a boosting technology to rival Intel's Turbo Mode, but the estimated street price for the flagship model is under the $300 mark. We have tested the 1090T and it promises to really shake up the current status quo, with performance that in some cases beats the best Intel CPUs available.
By painstakingly repeating our tests with one to six cores enabled, we are able to see how various applications perform with differing numbers of cores, allowing us to establish the multi-core efficiency of games such as Far Cry 2 and benchmarking tools like 3DMark Vantage. The testing is by no means comprehensive - with two to three weeks to spare we could have tested every recent game and application for completeness - so our apologies in advance if your favourite application is not included in our representative sampling.
Of more universal interest is comparing the efficiency of the latest Intel and AMD architectures, to see where the bottlenecks lie and to predict how future trends and architectures will affect performance.

Processor Architecture
To go with the Thuban launch we are getting a new chipset, the 890FX, which promises better performance and greater headroom for overclocking. The board we tested with was the ASUS Crosshair IV Formula.   
 
SATA-3 is now standard, although USB 3.0 still has to be provided by third-party hardware (an NEC chip in our case). Now on to the die that has been the subject of intense speculation these last few months:

Each core has 64KB of L1 data and instruction cache and 512KB of L2 cache. 6MB of L3 cache is shared between the cores. A 45nm process and manufacturing optimizations keep the processor within a thermal envelope of 125W despite the addition of two extra cores. This TDP will be of key importance when we discuss the Turbo Core feature.
 
The CPU 
The X6 range fits into a standard AM3 socket (a BIOS update may be required for current/recent motherboards), showing AMD's commitment to platform longevity and ease of upgrading.
We received final shipping product for our testing and not an engineering sample so we are confident that our tests will reflect the actual performance that consumers will experience.


Turbo Core
Intel has been using its Turbo Mode for some time now with i5/i7 processors to boost the speed of one or two cores by a few steps when thermal envelopes allow. The greatest benefit is gained in applications that are not highly threaded and so cannot otherwise fully utilize all available cores. AMD now has this feature built into its latest range in the form of Turbo Core, which allows three cores to be boosted by up to 500MHz when the other three are at low utilization. This is more than was originally expected, and is done by cleverly reducing the speed of the lightly used cores to 800MHz and lowering their voltage, while increasing voltage to the boosted cores. This is all handled automatically by the processor, although some motherboards (such as the ASUS one we tested with) allow the Turbo Core feature to be tweaked independently of the usual CPU adjustments. The net effect is to maximise processor performance with any type of application while staying within the 125W thermal envelope.
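As a rough sketch of how that boost decision plays out, here is a toy model in Python. The clocks come from the review (the 1090T's 3.2GHz stock, 3.6GHz boost and 800MHz parked speeds); the decision rule itself is our own simplification, not AMD's actual firmware logic.

```python
# Toy model of Turbo Core on the Phenom II X6 1090T, as described above.
# Clock figures are from the review; the rule is our simplification.

def core_clocks(busy_cores, total=6, base=3200, boost=3600, parked=800):
    """Return per-core clocks in MHz for a given number of busy cores.

    If half the cores or fewer are busy, the busy ones get the Turbo
    Core boost while the rest are parked at 800MHz; otherwise all six
    run at the stock clock, keeping the package inside its 125W budget.
    """
    if busy_cores <= total // 2:
        return [boost] * busy_cores + [parked] * (total - busy_cores)
    return [base] * total

print(core_clocks(2))  # lightly threaded: two boosted, four parked
print(core_clocks(6))  # fully loaded: all six at stock
```

The point of the model is the trade-off: the parked cores' saved power budget is what pays for the boosted cores' extra voltage and clock.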

Overclocking
Traditionally, AMD processors have been more difficult to overclock than their Intel rivals, with most users only able to manage modest overclocks without exotic cooling.
We used a Corsair H50, which gives the benefits of water cooling with the ease of installation of an air-cooled HSF. In cost and performance terms it is similar to a high end air cooling solution, but without the bulky heatsink or noisy CPU fan. Please note that, due to the small reservoir in these sealed, budget water block and radiator combo systems, they should not be used for extreme overclocking; if the processor temperature gets above 70 degrees Celsius it should be brought back down immediately to prevent the water turning to steam and permanently "unsealing" the system.
 
The CPU-Z screens show all the relevant information. Here the processor is running under load at stock speeds but will throttle back to 800MHz when idle or at low utilization.


The screenshots above are quite real - the Phenom II X6 1090T booted straight into Windows at over 4GHz, with our motherboard taking care of all adjustments. That was our first attempt at overclocking, and had we more time before the launch deadline we would have seen just how far we could go. Given the time constraints we were only able to run some benchmarks in Everest Ultimate Edition, shown later on. The system was stable at 4GHz under stress testing for the one hour we could spare for that purpose.
AMD's new manufacturing process should have overclockers rubbing their hands with glee, especially given the price. We estimate that an entire system based around the Phenom II X6 1090T, including monitor and budget SSD, can be purchased for the price of an Intel i7-980X processor alone.

The Problem with Multi-Tasking
Since this review is primarily about multi-core efficiency, it is worth explaining the inherent problems with multi-tasking. This may surprise some readers, as we already have supercomputers made up of thousands of Intel or AMD processors, and if they did not scale well then research institutions would not buy them to predict climate change, work out where minerals are buried and so on. The reason they work so well is that it is easy to split millions of independent operations among thousands of cores. Splitting one thread across multiple cores is actually quite difficult.
The problem involves concurrency, monitors and semaphores, and is too involved to go into here, although interested readers are encouraged to read the Wikipedia article on the "Dining Philosophers" problem, which explains it in easy to visualize terms.
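For the curious, here is a minimal sketch of that problem and one classic fix. This is our own illustration, not from the article: five philosophers share five forks, and a semaphore that admits only four to the table at once prevents the circular wait that would otherwise deadlock them.

```python
# Dining Philosophers with a seat-limiting semaphore (our illustration).
# Each philosopher needs the fork on the left and right to eat; letting
# at most N-1 sit down at once breaks the circular-wait deadlock.
import threading

N = 5
forks = [threading.Lock() for _ in range(N)]
table = threading.Semaphore(N - 1)   # at most N-1 philosophers seated
meals = [0] * N

def philosopher(i, rounds=100):
    left, right = forks[i], forks[(i + 1) % N]
    for _ in range(rounds):
        with table:          # take a seat (blocks if table is full)
            with left:       # pick up left fork
                with right:  # pick up right fork
                    meals[i] += 1   # "eating"

threads = [threading.Thread(target=philosopher, args=(i,)) for i in range(N)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(meals)   # every philosopher finishes all 100 rounds, no deadlock
```

Remove the `with table:` line and the same code can deadlock when all five grab their left fork simultaneously, which is exactly the hazard programmers face when coordinating work across cores.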
Until quantum computing is viable we will have to rely on programmers making allowances for multiple cores and programming accordingly. Some games and applications are already optimized to a limited degree for multiple cores, and in theory every application gets some boost from a second core, even if just by offloading the usual Windows background processes onto the otherwise unused core.
It has been clear for some years that clock frequencies cannot continue to increase due to manufacturing limits; they have remained roughly constant around the 3GHz mark for about six years. Instead, it seems future gains will come from increasing the number of cores in a CPU, whether physical or virtual (as with HyperThreading). Our tests aim to show which architectures are best suited to getting the most out of extra cores, where the bottlenecks are and, hopefully, give an indication of how the architectures will scale as core counts increase.
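There is a simple formula, Amdahl's law, that puts a ceiling on those gains: if a fraction p of a program can run in parallel, n cores can speed it up by at most 1 / ((1 - p) + p/n). A worked example (our illustration, not the article's data):

```python
# Amdahl's law: best-case speedup on n cores when a fraction p of the
# work is parallelisable. Even a 90%-parallel workload gains only 4x
# from six cores, because the serial 10% comes to dominate.

def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for cores in (1, 2, 4, 6):
    print(cores, "cores:", round(amdahl_speedup(0.9, cores), 2), "x")
```

This is why the fully threaded synthetic benchmarks later in this review scale almost linearly while real games, which carry a larger serial fraction, do not.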

Test Setup



Test Configuration

System Hardware (Intel system / AMD system)

CPU: Intel Core i7-870 (2.93GHz, 8MB cache) / AMD Phenom II X6 1090T (3.2GHz, 6MB cache)
Motherboard: ASUS Maximus III Gene / ASUS Crosshair IV Formula
CPU cooler: Corsair H50 (both systems)
RAM: Kingston KHX2133C8D3T1K2/4GX 4GB 2133MHz DDR3 Non-ECC CL8 DIMM kit of 2, Intel XMP Tall HS, CAS 8-8-8-24 / Kingston KHX1600C8D3T1K2/4GX 4GB 1600MHz DDR3 T1 Series Non-ECC CL8 DIMM kit of 2, XMP, CAS 8-8-8-24
Graphics: ATI Radeon HD5850 (both systems)
Hard drive: Maxtor 300GB SATA-2 (both systems)
Sound: SupremeFX X-Fi built-in / Realtek 1200 8-channel High Definition Audio CODEC
Network: Gigabit LAN controller / Realtek 8112 Gigabit LAN controller
Chassis: Antec 902 Midi Tower / Antec P183 Ultra Quiet
Power: Antec TruPower 750W / Antec EarthPower 1000W

Software

Operating system: Windows 7 Professional (both systems)
Graphics driver: ATI Catalyst 10.3 (both systems)
Chipset: Intel P55 / AMD 890FX
Applications (both systems): SiSoft Sandra 2009, 3DMark Vantage Pro, PCMark Vantage Pro, Everest Ultimate, CPU-Z, Far Cry 2, HAWX, Resident Evil 5
All games were tested at the maximum available settings and initially at 1280x1024, so we can be sure of hitting CPU limitations before GPU-related bandwidth or fill rate limits. We selected Far Cry 2 (first person shooter), HAWX (air combat) and Resident Evil 5 (horror) for our tests, as they are newish titles suited to benchmarking that make most systems struggle.


Test Results - SiSoft Sandra

 
The results show fairly linear scaling as we add cores. It should be noted that synthetic tests such as SiSoft Sandra scale quite well and are mainly useful as an indication of bottlenecks, and of what programmers can achieve if they overcome the hurdles they face. The Thuban processor is able to match its costlier Intel rival.
 
The processor multimedia results also scale well although real-life differences will not be as pronounced as this chart indicates. Here the newest AMD processor takes a clear lead by virtue of extra cores.
 
Interestingly, the memory bandwidth results show that a single core cannot make full use of the available bandwidth, which is particularly the case for the AMD Phenom II architecture. Dual core or higher is required to overcome this limitation. Ultimately, the 2000MHz DDR3 of the Intel platform makes all the difference over the 1600MHz DDR3 in the AMD system.

Test Results - Everest Ultimate Edition 
Everest is a very comprehensive benchmark suite that is set to take the synthetic crown from SiSoft Sandra. We limited our testing to the CPU and FPU benchmarks provided.
 
  • CPU Queen: a simple integer benchmark that focuses on the branch prediction capabilities and misprediction penalties of the CPU. It finds the solutions to the classic "Queens problem" on a 10 by 10 chessboard.
  • CPU PhotoWorxx: an integer benchmark that performs common tasks used in digital photo processing.
  • CPU ZLib: an integer benchmark that measures combined CPU and memory subsystem performance through the public ZLib compression library. It uses only basic x86 instructions and is HyperThreading, multi-processor (SMP) and multi-core (CMP) aware.
  • CPU AES: an integer benchmark that measures CPU performance using AES (a.k.a. Rijndael) data encryption. It utilizes the public domain C code of Vincent Rijmen, Antoon Bosselaers and Paulo Barreto in ECB mode.
Since all of these tests are fully threaded we see a linear increase in performance as the number of cores increases. It is worth noting the effect of overclocking to 4GHz - a massive 25% overclock - with a commensurate increase in performance.
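The near-linear scaling these fully threaded tests show can be sketched with a toy CPU-bound workload spread across worker processes. This is only an illustration of the principle - the prime-counting task, chunk sizes and worker counts here are our own invention, not anything Everest actually runs:

```python
import multiprocessing as mp
import time

def count_primes(bounds):
    """CPU-bound toy workload: count primes in [lo, hi) by trial division."""
    lo, hi = bounds
    total = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            total += 1
    return total

def run(workers, limit=20000):
    """Split [0, limit) into equal chunks and farm them out to worker processes."""
    step = limit // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], limit)  # absorb any rounding in the last chunk
    start = time.perf_counter()
    with mp.Pool(workers) as pool:
        total = sum(pool.map(count_primes, chunks))
    return total, time.perf_counter() - start

if __name__ == "__main__":
    # On an unloaded multi-core machine the elapsed time should drop
    # roughly in proportion to the worker count, as in the chart above.
    for n in (1, 2, 4):
        primes, elapsed = run(n)
        print(f"{n} worker(s): {primes} primes, {elapsed:.2f}s")
```

A workload only scales like this when the chunks are independent; the synthetic suites are written that way deliberately, which is why they flatter multi-core chips more than most real applications do.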

The FPU Julia benchmark measures single precision (32-bit) floating-point performance through the computation of several frames of the popular "Julia" fractal. The code behind this benchmark is written in assembly and is extremely optimized for every popular AMD and Intel processor core variant, utilizing the appropriate x87, 3DNow!, 3DNow!+ or SSE instruction set extension.
The FPU Mandel benchmark measures double precision (64-bit) floating-point performance through the computation of several frames of the popular "Mandelbrot" fractal. The code behind this benchmark is written in assembly and is extremely optimized for every popular AMD and Intel processor core variant, utilizing the appropriate x87 or SSE2 instruction set extension.
The FPU SinJulia benchmark measures extended precision (80-bit) floating-point performance through the computation of a single frame of a modified "Julia" fractal. The code behind this benchmark is written in assembly and is extremely optimized for every popular AMD and Intel processor core variant, utilizing trigonometric and exponential x87 instructions.
As with the CPU tests, the FPU benchmarks are highly threaded and we can see a linear performance increase with the number of cores.
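All three fractal benchmarks are built on the same escape-time iteration. Everest's real code is hand-tuned assembly; as a rough sketch of the underlying maths only, here is a minimal Python version of the Mandelbrot calculation (viewport and iteration counts are our own illustrative choices):

```python
def mandelbrot_iterations(c, max_iter=256):
    """Return how many iterations of z -> z*z + c it takes to escape |z| > 2,
    or max_iter if the point never escapes (i.e. c is in the set)."""
    z = 0j
    for i in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:
            return i
    return max_iter

def frame(width=40, height=20, max_iter=64):
    """Compute one small frame over the classic Mandelbrot viewport."""
    rows = []
    for y in range(height):
        row = []
        for x in range(width):
            c = complex(-2.5 + 3.5 * x / width, -1.25 + 2.5 * y / height)
            row.append(mandelbrot_iterations(c, max_iter))
        rows.append(row)
    return rows
```

Each pixel is computed independently of every other, which is why these benchmarks thread so cleanly: a frame can simply be cut into strips, one per core.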

Test Results - PC Mark Vantage Pro
PC Mark Vantage tests a whole range of activities from web browsing to photo manipulation and music conversion.

Performance is fairly consistent across all resolutions, so the resolution chosen makes little difference; common sense suggests simply using the highest resolution for this type of activity for ease of use.

Test Results - 3D Mark Vantage Pro
Of much more interest to gamers is 3D Mark Vantage and this is the de facto standard for synthetic 3D graphics benchmarks for a wide variety of gaming types.

Performance scales well except for single cores which just don't have the raw power to get the job done. There is just a hint of a leveling out at high numbers of cores but we will need to wait for 8+ core processors to confirm this.

The CPU score is of most interest to us and here we can see something quite interesting. While the i7-870 seems to be hitting some kind of bottleneck at around 3 cores, the AMD processors scale linearly all the way up to 6 cores. This bodes extremely well for AMD's architecture and for future systems with even more cores.
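The shape of the Intel curve is what Amdahl's law predicts: once any serial fraction remains in a workload, each extra core buys progressively less. A quick sketch of the formula (the serial fractions below are made-up numbers for illustration, not measured values for either chip):

```python
def amdahl_speedup(cores, serial_fraction):
    """Amdahl's law: speedup = 1 / (s + (1 - s) / n),
    where s is the fraction of the work that cannot be parallelised."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# A workload that is 10% serial flattens out quickly...
for n in (1, 2, 4, 6, 8):
    print(f"{n} cores, 10% serial: {amdahl_speedup(n, 0.10):.2f}x")

# ...while a fully threaded one (0% serial) scales linearly with cores,
# which is the behaviour the AMD CPU score shows up to 6 cores.
```

On these numbers a 10%-serial workload manages only about 3x on 6 cores, so even a small serial component is enough to produce the plateau we see at around 3 cores.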

The only purpose of the GPU benchmark component is to show that even a single core can make full use of our Radeon 5850 card. We would really like to return to this benchmark when we can get our hands on dual Radeon 5970 cards to see just how many cores are needed to feed high end Crossfire / SLI setups.

Test Results - MultiCore Analysis
So far we have looked at synthetic benchmarks and these tend to be well threaded to make full use of all available cores. This is not always the case in the real world and now we look at some recent 3D games with the emphasis on core scaling. Tests are run at 1280x1024 to avoid any GPU limitations at high resolutions.

In Far Cry 2 it seems that a dual core processor is just as good as a quad or hex core one. Although fairly recent, this game was designed some time ago, and we have learned that future games from the developers will be fully threaded to take account of many (they declined to say how many) cores.

The situation is similar for Tom Clancy's HAWX, with even a budget AMD processor beating the i7-870 and the new Thuban ahead by quite a margin. All seem to hit a bottleneck at 4 cores, with 2 cores being the "sweet spot".


Resident Evil 5 appears to show the Phenom II X4 635 scaling well, but the Phenom II X6 1090T demonstrates that 2 cores are adequate, which is consistent with the Intel CPU's performance. The real reason for the discrepancy appears to be the lack of L3 cache on the budget processor; this is one of the few real-world cases where we see that omission having such a profound impact on gaming.

Test Results - Overall Gaming Performance
Now we have compared differing numbers of cores, it’s worth showing the performance of the above games with all cores active but at varying resolutions to show the maximum performance that can be expected. After all, no consumer is going to purchase a CPU and then disable one or more of its cores to see how much it slows down.
 
All processors can run at good speeds at all resolutions. If we had not tested with different numbers of cores we would not be able to tell from the above results that a 2-core Lynnfield runs this game just as well as a 4-core one and that the AMD Phenom II X4 635 processor needs at least 3 cores to keep up. The Phenom II X6 1090T takes the performance crown again from its more expensive Intel quad core rival.
 
Performance is virtually identical across differing resolutions, hiding the single-core AMD issue seen earlier. This is a game that will not tax even basic systems, and anything more than 2 cores is wasted here.
 
Here the i7-870 manages to pull ahead due to some kind of bandwidth limitation. We would like to do a quad Crossfire test to really strain each CPU, but that will have to wait until we have more graphics cards in our test lab. This game is playable at all resolutions with any of the three processors.
 
Conclusion
We’ve done something not seen in other reviews: we examined the multi-core efficiency of the latest architectures from Intel and AMD, going beyond the simple results of running benchmarks at default (and sometimes overclocked) speeds.
By using the motherboard BIOS to selectively disable cores we can look at per-core performance, which gives us a much greater insight into each architecture’s potential than just interpreting the results from the more traditional benchmarks.
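Disabling cores in the BIOS is the cleanest approach because the operating system never sees them. For readers who want a rough software approximation on Linux, a benchmark process can instead be pinned to a subset of cores via CPU affinity; note this only restricts scheduling rather than powering cores down, so it is not a perfect substitute. A minimal sketch:

```python
import os

def restrict_to_cores(pid, cores):
    """Pin process `pid` (0 = the current process) to the given set of
    logical CPU indices. Linux-only: os.sched_setaffinity is not
    available on every platform."""
    os.sched_setaffinity(pid, set(cores))
    return os.sched_getaffinity(pid)

if __name__ == "__main__" and hasattr(os, "sched_setaffinity"):
    # Run this process as if it only had (up to) two cores,
    # then launch the benchmark from here so it inherits the mask.
    two = set(sorted(os.sched_getaffinity(0))[:2])
    allowed = restrict_to_cores(0, two)
    print("now scheduled on cores:", sorted(allowed))
```

The same effect is available from the shell with `taskset`; child processes inherit the affinity mask, which is why launching the benchmark from the pinned process works.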
The release of AMD's 6-core Thuban processors marks an exciting time for PC enthusiasts. In the past the fastest AMD processor has been significantly slower than the fastest Intel processor, the only consolation being that it also cost a lot less. That equation has changed: the top-end processors from AMD and Intel are now effectively tied on performance, and AMD's lower pricing is likely to play a big part in consumer purchasing decisions. This may change if AMD think the Thuban processors are priced too cheaply, or, if shortages are encountered, retailers may increase prices as happened when the Radeon 5800 series was first released. More likely, Intel will cut into their ample margins and lower prices now that they have a fight on their hands in their flagship categories.
AMD have made a strong play for the high end of the processor market with the release of the Phenom II X6 1090T and 1055T processors. Importantly, they have done this without charging a premium as Intel have been content to do with their "Extreme Edition" price point. The strategy of dominating the low end / mainstream market and using that as a springboard for the high end as they have done in the GPU arena with ATI may be putting them back on an equal competitive footing with Intel - something which can only be good for the consumer.
As for the Phenom II X6 1090T? It's a great product at a great price, and we know from speaking with game developers that several titles due out this year will make full use of all 6 cores. If you're in the market for a new CPU then 6 cores is the way to go. For those unable or unwilling to spend $1,000 on Intel's i7-980X but who would like similar performance at a fraction of the price, AMD's Thuban is the only option and, as the benchmark results in this review show, represents tremendous value for money.
