••• Katool Shines •••

Dear friend, if you use any of this blog's material, please be sure to cite the blog's name as the source. Thank you.

AMD A10-7850K (Kaveri) Review: One Step Forward, Two Steps Back

AMD's Kaveri unites four x86 Steamroller cores with a GCN-based Radeon R7 graphics core. It is manufactured on a 28-nm process and supports the HSA specification. AMD believes this cocktail gives it an outstanding product, able to compete with the Core i5. The results of our tests, however, lead us to a different conclusion.


For the last few years, we've been watching AMD's processor division lose ground in traditional PCs. The company is preaching about the importance of mobile and embedded solutions while keeping silent on desktop CPU issues. What we see is that AMD first lost the top-end CPU market to its rival and is now talking about offering only low-end desktop processors with integrated graphics. At least, this is implied by the proposed roadmap, where there are no updates to the flagship FX series but plenty of focus on accelerated processing units (APUs), which combine x86 and graphics cores within a single semiconductor die. Against this background, AMD's new APU design, codenamed Kaveri, comes out as the company's key processor product in 2014. Building on the ideas implemented in the Trinity and Richland APUs, the Kaveri is going to be viewed critically in this article, since we remain loyal fans of desktop PCs.

 There’s nothing particularly bad about AMD’s shifting its focus to processors with integrated graphics. After all, the majority of Intel’s desktop CPUs have the same internal design. The problem is that AMD, unlike Intel, does not strive to push the performance bar higher, preferring to set other priorities. The FX series was about multicore processors handling a lot of computing threads simultaneously. Now, with the APU having fewer x86 cores, the integrated graphics core comes to the fore. The Kaveri is optimized for affordable mobile gadgets, so it is supposed to sport a high performance-per-watt ratio. And this ratio is being improved not by increasing performance but by lowering power consumption and heat dissipation, which is now limited to 35 or even 15 watts.

Desktop users are offered special versions of the Kaveri design with a TDP up to 95 watts, yet AMD doesn’t even claim that they deliver high performance, talking instead about certain new capabilities. Judging by everything we know about it, the Kaveri can’t bring about any changes to the desktop processor market. The new series, just like AMD’s previous APUs, consists of affordable products for home, office and entry-level gaming computers.

It would be wrong to say that the Kaveri is absolutely unexciting for desktop users, though. It features a new version of the Bulldozer microarchitecture, codenamed Steamroller. Its graphics core is now based on the GCN design. And the APU at large complies with the heterogeneous system architecture (HSA) specs. Even though these innovations can’t make the new processors interesting for gamers or enthusiasts, they are quite exciting in their own way. At least, they show us what direction AMD is going in and may even suggest whether AMD will ever return to developing high-performance desktop processors as a top priority.

Two desktop Kaveri-based products have been available since the beginning of 2014: the A10-7850K and the A10-7700K. They are not shipped in mass quantities, yet their availability is not a problem. We will be discussing the senior model, which works at higher clock rates and features the full complement of shader processors in its integrated graphics core. In other words, the A10-7850K is AMD's fastest current APU. Both models have a TDP of 95 watts.

There is a third Kaveri-based desktop APU, which features a TDP of 65 watts. This energy-efficient A8-7600 model is not yet available in retail, so we will review it at some other time.

Netflix embraces profiling

It was a dark and stormy night when I first joined Netflix in January 2007. Or light and breezy. Who knows? I can't remember three days ago. The membership was a (requested) Christmas gift, and there was much rejoicing throughout the Fox household, which, at the time, was two-fifths its current size. Back then, you could set up separate DVD/Blu-ray queues for different family members and alternate which queue your next disc was sent from. My wife, the hottest of all Megan Foxes, and I took advantage of this feature. Meaning there was almost always at least one movie in the house one of us had no interest in viewing.

And then Reed Hastings drove a stake through the heart of marital movie bliss and banished this feature to the junk pile of Betamax tapes and HD-DVDs. Boo. Hiss. So my wife's profile sat there for years. Still clickable, but with nothing in the queue and no way to add anything to the queue (although now such additions would be of the streaming variety, as we had given up the disc rentals after the Great Qwikster Flambé of 2011). So lonely. So forlorn.

Eventually, my wife pulled a double and spawned two offspring at once just to prove she was still a tough, Iowa farm girl at heart even though she doesn't know how to take a pork tenderloin from piglet to deep fryer. These spawn were then followed by a third and surgically guaranteed final Fox, and our Netflix Instant Queue ranneth over with the evil that is the whiny, Canadian scourge known as "Caillou." Sigh, eh.

Then, earlier this very summer, Netflix announced that they were bringing profiles back—up to five per account. They finally rolled out this much-requested feature beginning August 1, with the caveat that it could take up to two weeks for profiles to propagate amongst the user hordes. Naturally, my account was on the tail end of receiving the upgrade, but it did show up Monday evening. Which is when the fun really began. Which is sarcasm.

Three weeks with the Samsung Galaxy S4


An iPhone user's perspective
— 3:37 PM on July 1, 2013

Last year, I bought an iPhone 5. I'd been set on ditching iOS for Android at the time, but weeks of careful research had left me no closer to finding an Android handset I really liked. Then, one day, in a moment of weakness, I stepped into an Apple Store. I walked up to one of the display stands and started playing with the iPhone 5, and I realized how fast and light it was. And all the king's horses and all the king's men couldn't make me pocket my credit card again.

Fast forward eight months, and I'm now toting a Samsung Galaxy S4. Check it out:

Okay, so I didn't really switch phones. This thing is a loaner from Samsung. I've been using it in parallel with my iPhone 5 for the past three weeks, and the experience has been interesting, to say the least. I've always had an abstract awareness of the Android platform's advantages and pitfalls, but I'd never before had the opportunity to spend so long with it—especially not on a top-of-the-line handset.

And top-of-the-line the Galaxy S4 certainly is. Barely three months old, this phone packs a quad-core Qualcomm Snapdragon 600 SoC, two gigs of RAM, and a 5" PenTile Super AMOLED display with a 1920x1080 resolution (PPI count: 441). There's 100Mbps LTE and NFC and all kinds of other bells and whistles, too. The thing is almost impossibly thin and light, at just 0.31 inches and 4.6 ounces.
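(As an aside, the 441 PPI figure follows directly from the numbers quoted above: it's just the diagonal pixel count divided by the 5-inch diagonal. A quick sanity check in Python, using only the specs listed here:)

```python
import math

# Pixel density of a 1920x1080 panel with a 5-inch diagonal.
width_px, height_px, diagonal_in = 1920, 1080, 5.0
ppi = math.hypot(width_px, height_px) / diagonal_in  # diagonal pixels / diagonal inches
print(round(ppi))  # 441
```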

Coming from years of daily iPhone use, the Galaxy S4 looks massive despite its thinness. It's imposing, and the screen crowds the front surface with its size, leaving barely any room for buttons or ornaments. Yet the resolution is so high that the PenTile pixel layout's trademark screen-door effect is invisible. Text looks as sharp as a printed page, or close to it, and flat colors are flat, with no pixel grid anywhere in sight. That blend of screen real estate and resolution is terrific for everything from web browsing to video playback to e-book reading.

Pick up an iPhone 5 after an hour spent with the Galaxy S4, and the Apple device looks like a toy. The difference is that stark.

On the software front, the Galaxy S4 runs a version of Android 4.2.2 Jelly Bean customized with Samsung's TouchWiz interface. Google apps abound, and the basic behavior of the Android OS is very much intact. However, TouchWiz adds its own flavor to the stock experience, and there are plenty of Samsung-specific apps and widgets along for the ride.

I've never been a big fan of TouchWiz, and my weeks with the Galaxy S4 didn't change that. The interface elements are too drab, too angular, and the sound effects are too cheesy. By default, the phone makes a watery "bloop" each time you tap on a menu item. A grating two-tone whistle lets you know about new e-mails and texts, and a new-age jingle plays whenever you unlock the device. (The jingle is accompanied by a sparkly pixie dust effect.) It's sad, but at times, the uninspired UI and crummy sound effects conspire to make the phone seem cheap, very much unlike the high-tech device it really is. Not even Apple's worst skeuomorphic over-indulgences are quite so bad.

Samsung really crowds those home screens, too. Three of the five default ones are taken up by Samsung widgets like S Travel, Story Album, Walking mate, and Samsung Hub. Yet another home screen is filled with carrier-specific fare. The remaining screen (the middle one) is occupied by a ginormous weather app, the Google search field, and shortcuts to default apps. There's not a single free spot for your own app shortcuts. Adding home screens or clearing up existing ones isn't difficult, but first impressions matter—and out of the box, the Galaxy S4 doesn't feel like a blank slate; it feels like a device borrowed from a Samsung executive.

Getting acquainted with the Galaxy S4 is, in many ways, a lot like setting up a notebook PC. Android blends flexibility and redundant clutter very much like Windows. For instance, there are at least three different ways to get into the Settings app from the home screen, and the phone ships with two different e-mail apps and two different web browsers out of the box. The notification system sometimes fills up with multiple identical Google Play icons, each one heralding a different app update. It can get a little crowded. Meanwhile, all those carrier and manufacturer widgets feel like the smartphone equivalents of Dell and HP bloatware: things bound to satisfy marketers more than users.

That's all very different from what you get with iOS, which reminds me a lot of circa-1995 Mac OS: clean and easy to navigate, but also pared down and rigid. There's a lot to be said for the extra flexibility Android provides, like the option to set a default web browser, change the default keyboard, automatically update apps, and manage wireless connections from the notifications pane. A handful of those features is coming to iOS in version 7 this fall, but the rest will remain exclusive to Android for the foreseeable future.

Anyway, enough generalizations about software. What's the Galaxy S4 like to use on a day-to-day basis?

 


 

Life with the Galaxy S4
The Galaxy S4 feels a little slow during day-to-day use, I'm sad to say. It's slower than my iPhone 5 at unlocking, at opening apps, and at switching between them. It takes longer to get to the camera from an unlocked state, and even Google Now takes more time to respond to voice searches than the same app on iOS. I expected this killer, state-of-the-art handset to run circles around the older Apple one, or at the very least to be comparably quick, but that's just not the case. The loss in performance got frustrating at times, like when I needed to take a picture or look up something online quickly. Going from a fast phone to a slower one is never fun.

The frustration doesn't stop there. Samsung has made the front bezel extremely thin, which means the gap between the bottom of the phone and the display is very small. That gap accommodates the home button as well as back and menu buttons that are hidden until pressed. If you use the phone one-handed for a little while, I guarantee you'll hit one of those buttons by accident. (It's not just me. TR's biz guy, a long-time Android devotee, has the same issue with his Galaxy S4.) This annoyance is compounded by the fact that each button has a secondary action tied to it. Pressing and holding the back button brings up a "multi-window" tray, which collapses into a little pull-out tab. The first time I brought up the tray by accident, I had no idea how it happened, and I had to Google for a way to turn it off. Ugh.

Not even that gorgeous five-inch screen is a home run for Samsung. It's big, yes, but it's also noticeably dim, even at the highest "auto" setting. I could get the luminosity to match my iPhone's only if I disabled automatic brightness altogether, which is probably terrible for battery life—and even then, the iPhone's maximum brightness was still higher. On top of that, the S4's screen takes on a blue cast when viewed off-center, and I noticed some ghosting when scrolling down lists. Those may be small kinks in what's an otherwise amazing screen, but my iPhone 5 has no such problems.

Oh, and the default keyboard is terrible. Samsung replaces the stock Jelly Bean keyboard with one of its own design, which has inexplicably small keys and a baffling lack of autocorrect functionality. I actually made more typos on it than on my iPhone, despite the huge difference in screen real estate. The solution? Head to Google Play and download Jelly Bean Keyboard, which restores the Google default. But that really isn't something one should have to do on a brand-new, $800 smartphone.

Put together, those deficiencies make the Galaxy S4 feel a little hamstrung by the stock software. Perhaps a third-party ROM closer to the Jelly Bean default could make things better. Such a ROM might do away with the drabness of TouchWiz and the clutter of the default widgets. It might take care of the auto brightness problem, too, and it might even resolve the accidental button-press issue, since Jelly Bean is supposed to have software buttons on the screen.

Apparently, a version of the Galaxy S4 with stock Google firmware can be ordered right from the Google Play store in the United States. That model wasn't available to me, though, and I couldn't root the Galaxy S4, since I had to send it back to Samsung at the end of my three-week test drive. Even if that hadn't been the case, rooting has its dangers—like the fact that it voids your warranty. Some folks may have no qualms about cheating Samsung by restoring the stock firmware before getting the phone serviced, but a hardware failure could make that impossible. That means users who can't easily get the stock Google version of the S4 may be better off sticking with TouchWiz and putting up with its flaws. And that's really too bad.

There is a lot to like about the Galaxy S4. It's thin and comfortable to hold, the display is excellent with the brightness cranked up, and the large footprint means the device doesn't slide around in my pocket like the iPhone 5. The performance may be a little lackluster, but it's definitely not terrible. Also, most of the software eccentricities I bemoaned can be resolved by a little tweaking and tinkering.

That said, after spending three weeks with this device, I have little desire for a TouchWiz-infused Galaxy S4 of my own. Rather, I'm eager to try the stock Google version... and to see Apple release a bigger iPhone.

Those next-gen games? Yeah, one just arrived


— 2:49 PM on March 1, 2013

We've been swept away by a wave of talk about next-gen consoles since Sony unveiled the specs for the PlayStation 4, and we're due for another round when Microsoft reveals the next Xbox. The reception for the PS4 specs has largely been positive, even among PC gamers, because of what it means for future games. The PS4 looks to match the graphics horsepower of today's mid-range GPUs, something like a Radeon HD 7850. Making that sort of hardware the baseline for the next generation of consoles is probably a good thing for gaming, the argument goes.

Much of this talk is about potential, about the future possibilities for games as a medium, about fluidity and visual fidelity that your rods and cones will soak up like a sponge, crying out for more.

And I'm all for it.

But what if somebody had released a game that already realized that potential, that used the very best of today's graphics and CPU power to advance the state of the art in plainly revolutionary fashion, and nobody noticed?

Seems to me, that's pretty much what has happened with Crysis 3. I installed the game earlier this week, aware of the hype around it and expecting, heck, I dunno what—a bit of an improvement over Crysis 2, I suppose, that would probably run sluggishly even on high-end hardware. (And yes, I'm using high-end hardware, of course: dual Radeon HD 7970s on one rig and a GeForce GTX Titan on the other, both attached to a four-megapixel 30" monitor. Job perk, you know.)

Nothing had prepared me for what I encountered when the game got underway.

 

I've seen Far Cry 3 and Assassin's Creed 3 and other big-name games with "three" in their titles that pump out the eye candy, some of them very decent and impressive and such, but what Crytek has accomplished with Crysis 3 moves well beyond anything else out there. The experience they're creating in real time simply hasn't been seen before, not all in one place. You can break it down to a host of component parts—an advanced lighting model, high-res textures, complex environments and models, a convincing physics simulation, expressive facial animation, great artwork, and what have you. Each one of those components in Crysis 3 is probably the best I've ever seen in an interactive medium.

And yes, the jaw-dropping cinematics are all created in real time in the game engine, not pre-rendered to video.

But that's boring. What's exciting is how all of those things come together to make the world you're seeing projected in front of your face seem real, alive, and dangerous. To me, this game is a milestone; it advances the frontiers of the medium and illustrates how much better games can be. This is one of those "a-ha!" moments in tech, where expectations are reset with a tingly, positive feeling. Progress has happened, and it's not hard to see.

Once I realized that fact, I popped open a browser tab and started looking at reviews of Crysis 3, to find out what others had to say about the game. I suppose that was the wrong place to go, since game reviewing has long since moved into fancy-pants criticism that worries about whether the title in question successfully spans genres or does other things that sound vaguely French in origin. Yeah, I like games that push those sorts of boundaries, too, but sometimes you have to stop and see the forest full of impeccably lit, perfectly rendered trees.

 

Seems to me like, particularly in North America, gamers have somehow lost sight of the value of high-quality visuals and how they contribute to the sense of immersion and, yes, fun in gaming. Perhaps we've scanned through too many low-IQ forum arguments about visual quality versus gameplay, as if the two things were somehow part of an engineering tradeoff, where more of one equals less of the other. Perhaps the makers of big-budget games have provided too many examples of games that seem to bear out that logic. I think we could include Crytek in that mix, with the way Crysis 2 wrapped an infuriatingly mediocre game in ridiculously high-zoot clothing.

Whatever our malfunction is, we ought to get past it. Visuals aren't everything, but these visuals sure are something. A game this gorgeous is inherently more compelling than a sad, Xboxy-looking console port where all surfaces appear to be the same brand of shiny, blurry plastic, where the people talking look like eerily animated mannequins. Even if Crytek has too closely answered, you know, the call of duty when it comes to gameplay and storylines, they have undoubtedly achieved something momentous in Crysis 3. They've made grass look real, bridged the uncanny valley with incredible-looking human characters, and packed more detail into each square inch of this game than you'll find anywhere else. Crysis 3's core conceit, that you're stealthily hunting down bad guys while navigating through this incredibly rich environment, works well because of the stellar visuals, sound, and physics.

My sense is that Crysis 3 should run pretty well at "high" settings on most decent gaming PCs, too. If not on yours, well, it may be time to upgrade. Doing so will buy you a ticket to a whole new generation of visual fidelity in real-time graphics. I'd say that's worth it. To give you a sense of what you'd be getting, have a look at the images in the gallery below. Click "view full size" to see them in their full four-megapixel glory. About half the shots were taken in Crysis 3's "high" image quality mode, since "very high" was a little sluggish, so yes, it can get even better as PC graphics continues marching forward.

Frame-pacing driver aims to revive the Radeon HD 7990


Can a driver tweak prevent microstuttering?
— 6:21 PM on August 1, 2013

When the Radeon HD 7990 first hit the market back in April, it didn't get the sort of reception one might expect for a graphics card that could easily claim to be the world's fastest. The trouble it encountered had been brewing for quite a while, as PC gamers became increasingly aware of a problem known as microstuttering that plagues multi-GPU configurations like AMD's CrossFire and Nvidia's SLI.

The problem has to do with the fact that frames are rendered in interleaved fashion between the two GPUs. That division of labor ought to work well, at least in theory, but the GPUs can go out of sync, dispatching and delivering frames at uneven intervals. In the worst cases, the two GPUs might dispatch or deliver frames at practically the same time. The second frame doesn't really offer any visual benefit when that happens, although its presence will inflate FPS averages.
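To make the FPS-inflation point concrete, here's a minimal sketch using made-up delivery timestamps for two alternate-frame-rendering GPUs that have drifted out of sync. The numbers are purely illustrative, not measurements from the 7990:

```python
# Hypothetical frame-delivery timestamps (ms): frames arrive in tight pairs,
# each pair followed by a long gap, because the two GPUs are out of sync.
timestamps_ms = [0.0, 2.0, 33.0, 35.0, 66.0, 68.0, 99.0, 101.0]

# Frame times are the gaps between successive deliveries.
frame_times = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
print(frame_times)                       # [2.0, 31.0, 2.0, 31.0, 2.0, 31.0, 2.0]

avg_fps = 1000.0 / (sum(frame_times) / len(frame_times))
print(round(avg_fps))                    # ~69 FPS, which looks healthy on paper

# But the animation only advances as smoothly as the long gaps allow.
print(round(1000.0 / max(frame_times)))  # ~32 FPS of actual, perceived fluidity
```

The average says one thing; the gaps your eyes actually see say another, which is exactly why frame-by-frame measurement matters.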

Since the 7990 is a CrossFire-on-a-stick product with two GPUs lurking under its expansive cooler, it's prone to microstuttering problems just like a pair of video cards would be.

Because we'd been stalking this problem for a couple of years, we were ready to pinpoint it when the 7990 arrived. We conducted a deep investigation, capturing each frame of animation produced by the card and converting the captures into detailed, frame-by-frame benchmark results and slow-motion videos. We concluded that two Radeon GPUs were often no better than one, visually speaking—and that AMD's thousand-dollar Radeon HD 7990 wasn't really any better for gaming than a much cheaper video card with a single GPU.
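For readers wondering what "frame-by-frame benchmark results" might boil down to, here's a rough sketch of the kind of summary that can be computed from captured per-frame times. The 50 ms cutoff and the metric names are illustrative assumptions on my part, not the article's exact methodology:

```python
def frame_time_report(frame_times_ms, threshold_ms=50.0):
    """Summarize a list of per-frame render times (milliseconds)."""
    ordered = sorted(frame_times_ms)
    # 99th-percentile frame time (nearest-rank method).
    p99 = ordered[min(len(ordered) - 1, max(0, round(0.99 * len(ordered)) - 1))]
    return {
        "avg_fps": 1000.0 * len(frame_times_ms) / sum(frame_times_ms),
        "p99_frame_time_ms": p99,
        # Total time spent on frames slower than the threshold: a proxy for
        # how much visible stutter the player actually sits through.
        "time_beyond_threshold_ms": sum(t - threshold_ms
                                        for t in frame_times_ms
                                        if t > threshold_ms),
    }

# Example: 95 smooth frames at ~60 Hz plus a handful of 55 ms hitches.
print(frame_time_report([16.7] * 95 + [55.0] * 5))
```

Average FPS alone would rate that example run as perfectly playable; the other two numbers are what expose the hitches.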


 