Recent years have seen plenty of wailing and gnashing of teeth over the supposed "death" of the PC gaming industry, and more broadly over the future of desktop PCs (and gaming systems in particular). The last couple of days have seen a pair of editorial pieces emerge discussing just these concerns and what they mean for both the PC and gaming industries - what are your thoughts on the points they put forward?
For many of our readers, that last statement might be passed off as opinion. Sadly, there will always be people who refuse to believe in something until it becomes undeniable. I'm a little more calculating, since I operate a corporation centered on sales and service of this platform, in addition to my experience with this website. Nevertheless, I've already published statistical facts that backed my initial pessimistic feelings, but that wasn't enough for some of you. I knew that to really convince my readers, my message would need to be solid. For those who still remain tethered to this industry, consider this editorial a friendly forewarning.
How video games killed desktop PC computing at Benchmark Reviews
Just as you get to a battle or see some bad guys, the suit tells you to activate stealth or armor, letting you know a battle is about to begin. The suit talks to you now, telling you what to do, where to go, what power to use. I don't want that; why am I having my hand held throughout the game? I want the open-ended gaming that the original games from Crytek offered in Far Cry and Crysis.
Games these days don't want you to struggle. They want you to enjoy your time and finish the game, which is usually cut down so that DLC (downloadable content) can be released later for a charge. If the game comes out and two weeks later there's DLC, why not ship the game as a whole, instead of cutting it up and charging for the extra content?
Are consoles holding back gaming tech? at Tweak Town
Will we be seeing a new series of AMD GPUs to refresh their line-up this summer? Possibly, if this latest speculation is to be believed.
The SI tapeout happened in February, more likely earlier than later, so let's call it 6 weeks ago. For those doing the math, both of the last two generations, Evergreen and Northern Islands, taped out later in the calendar year, and the resultant chips were launched in late Q3 or early Q4.
The SI/7xxx cards are a tweaked, more efficient, and polished Cayman/69xx design, so they should be pretty low risk, and the spins will be minimal. If all goes really well, think 6 weeks from mid-February, or about now, for silicon to come back from the fab, then 2 weeks for testing, and 3 months for production wafers to start popping up. That would put delivery of cards to users in late July with time to spare; add 6 weeks for a possible, even likely, respin, and late August is the outer bound. Since this is a pretty low risk part, we don't expect there to be many huge difficulties with the design.
SemiAccurate has the full story.
It's new AMD graphics driver time once again, squeezing in just before the month of March comes to an end. Here's what's new this time around:
Seamless GPU Compute support
- The AMD Accelerated Parallel Processing (APP) OpenCL runtime is now enabled by default within AMD Catalyst. Applications that leverage OpenCL for GPU based compute tasks will automatically benefit from the significant performance boost that this provides.
Highlights of the Linux AMD Catalyst™ 11.3 release include:
Support for new Linux operating systems
- This release of AMD Catalyst™ Linux introduces support for the following new operating systems:
The AMD Game web site has the latest WHQL driver set. If you'd rather try something even more exciting, then you can also now nab the Catalyst 11.4 preview driver, which has a lot more changes and improvements to offer users:
AMD Catalyst™ Driver 11.4 early Preview Features:
- Support for the AMD Radeon™ HD 6790 series of products
- A number of performance optimizations, as highlighted here:
- Performance enhancement for Shogun 2 (DX11 version) when running Anti-Aliasing
- Enhancements to the AMD Catalyst Control Center™
- New task based Display Management controls
- Simplifies the configuration of displays and display settings
- New Eyefinity setup group
- Setting up an Eyefinity group has never been easier
- New branding (based on system configuration)
- AMD based platform – AMD VISION Engine Control Center
- Discrete AMD GPU with Intel CPU – AMD Catalyst Control Center™
- AMD Catalyst update notification (found within the Information Center)
- Please note this functionality is not yet enabled, but will be in a future AMD Catalyst release
- This feature will be used to notify users that new AMD Catalyst software packages are available
- Includes a fix where Dragon Age 2 would hang in DirectX 11 on ATI Radeon™ HD 5800 Series products
- Includes a fix where MediaEspresso 6.5 and 6.0 may crash while initializing
- Includes a fix for game corruption issues in BulletStorm
- Includes a fix for the AMD Catalyst Control Center install; resolves the issue of not being able to launch CCC after Catalyst has finished installing
- Includes a fix for the Tessellation slider sometimes setting the incorrect value
You can download the preview driver from this page.
Yesterday Intel released a new range of mainstream SSDs making use of 25-nanometre memory. Are they worthy of consideration in the ever-growing solid-state storage market?
The 320 Series boasts 25nm flash memory. PC Perspective got a first-hand look at 25nm production early last year. We had been waiting for this memory to make an appearance in an Intel part, and our wait is finally over.
A single die of 25nm flash holds a whopping 8GB. While multiple dies can be stacked inside each chip package, the more you stack, the greater the chance a failed part will cause a TSOP to be considered bad during the production process. For this reason, larger die capacities and fewer dies per chip make things cheaper to produce all around. This should make for some competitive pricing as well.
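The yield trade-off described above is easy to sketch numerically. If a package is only good when every stacked die is good, the package yield falls with each extra die. The per-die yield below is a made-up illustrative number, not anything from Intel; only the 8GB-per-die figure comes from the article.

```python
# Illustrative sketch of die-stacking yield; the per-die yield is assumed.
DIE_CAPACITY_GB = 8      # one 25nm flash die, per the article
PER_DIE_YIELD = 0.98     # hypothetical probability that a single die is good

def package_yield(dies_per_package, per_die_yield=PER_DIE_YIELD):
    """A package (TSOP) is good only if every stacked die is good."""
    return per_die_yield ** dies_per_package

for dies in (1, 2, 4, 8):
    capacity = dies * DIE_CAPACITY_GB
    print(f"{dies} die(s): {capacity:3d} GB/package, "
          f"package yield ~ {package_yield(dies):.1%}")
```

Even with a generous 98% per-die yield, an 8-die stack only survives about 85% of the time, which is why bigger dies and fewer stacks per package cut costs.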
As the 320 model number suggests, the third-generation X25-M is meant to slot in below the high-end 510 Series. Don't get your hopes up for a new standard in storage performance. Intel appears to have focused most of its efforts on improving this drive's reliability and lowering costs. Those strike me as sensible points of focus for a mid-range SSD making the transition to 25-nano flash. As Intel astutely points out, the performance delta between decent SSDs and mechanical hard drives is huge next to the comparatively minor differences in performance between various SSD models. The Intel 320 isn't meant to challenge for the SSD performance crown. Instead, it's looking to lure more folks into the SSD fold.
When briefing the press last week, Intel spent much more time talking about reliability than it did discussing performance. We haven't heard much from SSD makers on this front, but Intel revealed some interesting figures about the X25-M, which has been deployed internally throughout the company. Among the 50,000 drives pressed into service by Intel's IT department, the annual failure rate is claimed to be 0.61%. Intel also quotes a 0.26% failure rate for the over 100,000 X25-Ms in use by ZT Systems, an enterprise customer running the drives in a datacenter environment. For the over 800,000 SSDs that Intel has shipped into the distribution channel, the failure rate is said to be only 0.4%. Since these figures come from Intel, we'll need to add salt—but perhaps only a sprinkle. Earlier this year, a French retailer released reliability stats on hard drive failures. Intel SSDs had a failure rate of 0.59%, while the solid-state competition from Corsair, Crucial, Kingston, and OCZ ranged from 2.17-2.93%.
- Benchmark Reviews
- Bjorn 3D
- Guru 3D
- Guru 3D (SLI)
- Hardware Canucks
- Hardware Heaven
- Hardware Heaven (SLI)
- Hot Hardware
- Inside HW
- PC Perspective
- Pure Overclock
- The Tech Report
- Tom's Hardware
- Tweak Town
- X-Bit Labs
NVIDIA have just launched a teaser video for what they're calling a next-generation graphics product. What will it be? Who knows, but the fact that GeForce GTX 590 specifications and images are floating around online could be a clue...
If you assumed, like I did, that the number of people buying PCs with discrete GPUs versus integrated parts would be about the same the world over, you'd be wrong - at least, according to these figures from AMD.
Chip Chick has the full story.
The iPad 2 is here, but is it worthy of consideration? Anandtech takes a look at the pros and cons.
There's no support for Flash. Like it or not, Flash support is still an important part of the overall PC experience. Eventually Apple will either cave or become irrelevant, or HTML5 will replace Flash entirely on the web. One way or another, this problem gets solved.
Multitasking is a pain. When the iPad first debuted there was no hope for multitasking, but now with the feature it's still far from magical. I need to tap the home button twice to bring up a task switcher, then tap or swipe/type before getting to the application I'm trying to switch to. There's no alt+tab (or cmd+tab) and no immediately visible task/dock bar of my currently running apps. Copying data between apps is a pain as I can't physically look at two things at once; there's just constant switching required to get things done. When I get a new email on the iPad I have to stop what I'm doing, go read the email and then switch back to what I was doing. The same goes for if I need to respond to an IM quickly while writing in Pages. With apps only running full screen and no support for windows, using a tablet can often seriously reduce productivity. These are solvable problems. Apple and Microsoft figured out how to do it on the desktop after all, but we're just not there yet with tablets.
Alongside multitasking is the performance problem. With the original iPad even deleting several emails at a time was a bit choppy, and web page rendering performance needed tons of work. As always Apple does its best to hide the limitations of the platform but I must point out that even the iPad 2 with a pair of ARM Cortex A9s has lower CPU performance than a netbook with a single core Atom. The fact that you can't really tell most of the time is a testament to Apple's software engineering, but it doesn't change reality.
Check out the full article over here.
Is it time to say goodbye to the DirectX API? AMD's Richard Huddy believes that the time might be ripe to do just that, and explains his reasoning.
'I certainly hear this in my conversations with games developers,' he says, 'and I guess it was actually the primary appeal of Larrabee to developers – not the hardware, which was hot and slow and unimpressive, but the software – being able to have total control over the machine, which is what the very best games developers want. By giving you access to the hardware at the very low level, you give games developers a chance to innovate, and that's going to put pressure on Microsoft – no doubt at all.'
Of course, there are many definite pros to using a standard 3D API. It's likely that your game will run on a wide range of hardware, and you'll get easy access to the latest shader technologies without having to muck around with scary low-level code. However, the performance overhead of DirectX, particularly on the PC architecture, is apparently becoming a frustrating concern for games developers speaking to AMD.
Read his thoughts in full at bit-tech.