TECHnalysis Research Blog

May 13, 2014
Computing in the Cloud

The move to cloud-based computing models and cloud-based services continues at a breakneck pace. Seemingly everything is migrating to the cloud, and along with that transfer comes an assumption that the local capabilities of attached client devices, particularly the CPU, don’t really matter. The reality of the situation, however, is much different.

The quality of virtually any cloud-based service is directly impacted by the horsepower of the device you’re using to access it. Don’t believe me? Think about it this way. If the local CPU (and graphics) didn’t matter, you should be able to have the same experience when you visit a web site or other cloud-based service from several different devices across the same network connection. As we all know, however, that just isn’t the case. In fact, the experience can vary widely between different devices, and one of the core reasons—pardon the pun—has to do with the local compute power of the connected device.
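The comparison described above is easy to try yourself. Here's a minimal sketch in Python (the column itself includes no code, and the data here is hypothetical): it times only the purely local part of handling a cloud response—parsing and summarizing a payload—which is the work that lands on the client's CPU no matter how fast the network is. Running the same script on two different devices over the same connection isolates the local compute difference.

```python
import json
import time

def client_side_work(payload: str) -> float:
    """Time the purely local part of handling a cloud response:
    parsing the payload and deriving something from it."""
    start = time.perf_counter()
    records = json.loads(payload)
    # A stand-in for the rendering/layout work a real client performs.
    _total = sum(r["value"] for r in records)
    return time.perf_counter() - start

# Simulated server response (hypothetical data; a real comparison would
# fetch the identical URL from each device under test).
payload = json.dumps([{"id": i, "value": i * 2} for i in range(100_000)])
elapsed = client_side_work(payload)
print(f"local processing took {elapsed:.3f}s")
```

The elapsed time will differ noticeably between, say, a desktop PC and a low-end tablet, even though the "cloud" half of the transaction is identical.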

Other factors can certainly make an impact too—everything from whether the service is accessed via a dedicated app or through a browser, to the choice of browser, the efficiency of the underlying operating system, and the speed of the memory and storage subsystems. The bottom line is that, though it may not appear that way at first, cloud-based computing solutions are dependent on the speed of the attached client. In other words, cloud computing isn’t all about the cloud.

Part of the reason for this has to do with how web-based applications and services are created and delivered. Generally speaking, there is a split of the computing load between the server hosting the application or service and the local device accessing it. The amount of the split can vary tremendously, from, say, 90% on the client and 10% on the server, to the exact opposite (or more likely, somewhere in between). But even in cases where significantly more of the workload occurs on the server, the performance of the client matters, because the local device needs to render the results of any server-processed workload to the local screen (at the very least). Plus, in true “thin client”-type environments, where 100% of the work is done on the server and a series of pixels are sent down a network connection—sometimes called “screen scraping”—the local device also has to deal with the protocols used to take those packets and convert them into pixels on the screen.
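The two ends of that split can be sketched in a few lines of Python (a hypothetical workload invented for illustration—summarizing a list of numbers for display—not anything from the column itself):

```python
def server_heavy(data):
    # Server does the aggregation; the client only has to
    # format and display a tiny result object.
    return {"mean": sum(data) / len(data), "max": max(data)}

def client_heavy(data):
    # Server ships the raw data; the client's own CPU does
    # the same aggregation before anything can be displayed.
    return {"mean": sum(data) / len(data), "max": max(data)}

data = list(range(1, 11))
# Both splits produce the same answer...
assert server_heavy(data) == client_heavy(data)
# ...but at scale they place very different demands on the client.
```

Either way the user sees the same result; what changes is which processor earned it—and even in the server-heavy case, the client still has to turn that result into pixels.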

The issues can also go beyond performance. Even today, there are still some web-based application services that don’t run on certain operating systems or certain chip architectures. Believe it or not, the good ol’ x86-based PC is still the most compatible cloud computing solution.

There’s no question that we’re seeing an evolution of computing models and that more and more of the applications we use every day are moving to the web. When it comes to computing in the cloud, however, even if you’re connecting globally, you’re still computing locally, and that’s going to continue to drive the evolution of devices for many years to come.

Here's a link to the original column:
