It's a rant, so I'm not going to critique this post too much, but I'd like to call this out:
> [...] Intel is desperately trying to figure out what to do to combat the phones and tablets that are eating them alive from the ankles up. It is pretty obvious that the company both doesn’t understand what the problem is and is actively shutting out all voices that explain it to them.
I don't think this is true. Intel certainly understands the market and where it's headed. However, they are committed to x86/64. What Intel is doing, in my view, is taking a series of huge but calculated risks. They seem to be betting that:
- Laptops will stick around and have Intel Inside for quite a while. The market may be boring, but it will be there for years. Corporate America helps.
- Servers won't be switching to ARM any time soon (I'd argue this is the riskiest bet).
- The desktop and enthusiast/gamer PC market will be around for a while, and also won't be switching to ARM any time soon.
So all of these "shoo-ins" buy them time, and I believe they think that in time they can pull off the biggest risk of all:
- Intel is betting that the biggest differentiating factor is, and will be, performance per watt. They are willing to gamble that they will eventually eclipse ARM cores in this area. In their view, if they have an x86/64 core that trounces competing ARM architectures in performance per watt, then phone, tablet, and set-top box manufacturers won't have a problem putting those chips in their devices.
Granted, I'm not saying I think Intel is 100% correct or that they'll succeed with their long-term bets; I just don't think they are as clueless as this rant makes them out to be.
No doubt about it, though, UltraBooks DO suck.
EDIT: I'm going to revise my statement on UltraBooks. Not all of them suck. In particular, the Lenovo Yoga is fantastic.
What they are missing is that it's not performance per watt, but performance per watt per dollar.
Intel may be able to build lower-power, faster chips, but their foundries aren't cheap, neither are the thousands of engineers Intel puts on each SoC, and technology scaling is only getting more and more expensive.
It's a classic case of the innovator's dilemma. Intel is optimized to sell very fast chips that cost hundreds of dollars, but the performance of fast-ish chips that cost a few dollars has almost caught up to them, and Intel simply can't compete without a complete restructuring. A further complicating factor is that ARM is a weird many-headed chimera [1] that Intel simply can't kill the way it did x86 competitors like AMD.
[1] http://www.anandtech.com/show/7112/the-arm-diaries-part-1-ho...
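To make that metric concrete, here's a toy calculation; every number below is invented for illustration, not taken from any real benchmark:

    // Hypothetical perf-per-watt-per-dollar comparison. The chips and
    // numbers are made up; only the arithmetic is the point.
    interface Chip {
      name: string;
      perf: number;  // benchmark score, arbitrary units
      watts: number; // typical power draw
      price: number; // unit price, USD
    }

    const chips: Chip[] = [
      { name: "fast x86 part", perf: 1000, watts: 15, price: 300 },
      { name: "cheap ARM SoC", perf: 400,  watts: 2,  price: 20 },
    ];

    for (const c of chips) {
      const perfPerWatt = c.perf / c.watts;
      const perfPerWattPerDollar = perfPerWatt / c.price;
      console.log(`${c.name}: ${perfPerWatt.toFixed(1)} perf/W, ` +
                  `${perfPerWattPerDollar.toFixed(2)} perf/W/$`);
    }
    // fast x86 part: 66.7 perf/W, 0.22 perf/W/$
    // cheap ARM SoC: 200.0 perf/W, 10.00 perf/W/$

Even if the x86 part closed the raw performance-per-watt gap, the dollar in the denominator is the part Intel's cost structure makes hard to close.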
Not too sure what you are referring to by "Ultrabooks" (something ARM-specific?), but I have an "Ultrabook" and I absolutely love it. It's my work machine, my gaming machine, and my web surfing machine. It may even become my Steam Machine.
It's light, powerful, and compact. I can throw it in my backpack and forget it's even there. What's not to like about Ultrabooks?
(For anyone wondering, it's a Dell XPS 13 running Ubuntu)
Not saying Intel is ignorant of these facts, but the Microsoft/Intel hegemony is shrinking. Not dying, shrinking.
1. Apple and Google built an OS. Intel isn't willing to go after software. Are they smart to limit themselves? Microsoft is at least pretending to make hardware – alienating all the OEMs in the process – and Intel doesn't have what it takes to make software?
2. Cheap laptops have traditionally been Intel's enterprise play. Dell/HP/Lenovo are still selling 1366x768. Is Intel serious about their integrated graphics, while their customers are still being outfitted with 1366x768?
3. The server market is not a safe place to hide. Talk of how Intel is safe on servers ignores the consumer market and Intel is obviously not going to ignore the consumer.
So are we down to a single supplier for all our computers (Apple)? What will you do when Apple screws up? But more to the point, Intel had better watch out before Apple just ditches them entirely.
> No doubt about it, though, UltraBooks DO suck.
> EDIT: I'm going to revise my statement on UltraBooks. Not all of them suck. In particular, the Lenovo Yoga is fantastic.
You mean, in comparison to ARM-based tablets?
I suspect neither you nor the OP have been paying attention lately? Neither had I until I started looking. I got a Samsung Series 9 earlier this year and I'm very happy with it. I think many people would love it, but they don't know it exists. The same goes for a few of the others.
Any bet that the PC (i.e., the laptop) is going away always needs to ask which platform the comments on the website it's posted to are being written on.
I'm willing to bet some here are typing on a tablet, but I'm sitting here in the kitchen with my laptop.
Ugh. Change tack[1], which is a sailing reference[2]. As for the actual content, I feel this analysis lacks nuance. Mobile is booming, of course, but the PC is not dead, nor will it be dead five years from now. There are a hundred use cases for which a desktop or laptop is the only practical solution. Fantasize all you want about businesses abandoning real machines for iPads; reality begs to differ.
[1] http://idioms.thefreedictionary.com/change+tack [2] http://en.wikipedia.org/wiki/Tack_(sailing)
> During this time Windows 8 came out and PC sales dropped 15% in the first full quarter after launch.
I don't think it's all Windows 8's fault. The average desktop PC is just too powerful.
I've been using Visual Studio 2010/2012/2013 with an i3 and an SSD for years now, and I rarely run up against any sort of performance bottleneck.
To give a sense of my performance requirements: the solution I work on has 28 projects and takes about 50 seconds for a clean build. Visual Studio takes care of incrementally building the projects during normal development, so usually I'm looking at ~5 seconds to build and then launch the debugger.
I have absolutely no need to upgrade. No need = no sale.
I'm using Windows 8 as my operating system. It takes one step forward and one step back. I'm looking forward to Windows 8.1, but there's nothing so seriously wrong with Windows 8 that I need 8.1.
When I'm sitting in front of my PC and using Visual Studio, I'm not thinking "I wish this was actually a docked tablet". I have an iPad for mobility.
PC sales are probably undergoing a bit of a course correction as people who are satisfied with tablets buy tablets instead of PCs. But I suspect PCs will be around for a long time to come and, until that day, there's nothing for them to "[come] back" from.
Same here. 4 years ago I bought 2 desktops, one for work and one for home. I quit contracting and got employed, so both sit in my office at home now. I still use them, develop on them, do everything else on them, and they are fine. I won't be buying a new desktop for at least another year, if then. And I CERTAINLY won't be 'upgrading' to Win8.
http://support.microsoft.com/gp/lifecycle-Windows81-faq
What is fun is that I doubt this applies to Server 2012, even though it is based on the same codebase.
Oh well... This rant makes little sense and has a lot of anger. The phrase "The PC is over and PC sucks" appears several times with little explanation other than citing the growth of other markets. The truth is there is no replacement for the PC, and there doesn't seem to be a serious one coming any time soon.
People can't make movies, edit images properly, use a compiler, debug, or work with a nontrivial spreadsheet on phones or tablets. Until that changes, the desktop PC won't die. PCs might not be as popular as before, nor have the same upgrade cycle; they might have lost relevance as a growing market, but they are far from dead.
Well, that's a very popular position taken recently in the media: "The PC is dying... tablets will replace everything we know... look, the sales are going up, it means it's a zero-sum game and PC share will go down to ZERO!"
As you mentioned, the future is fragmentation, certainly not a monolithic tablet-only future. There are still too many incentives to keep using PCs for many, many uses.
I don't agree with the "people can't create" thing with tablets.
I do a podcast entirely on my iPad, I also record music on it - and in both cases significantly prefer it to doing the same on my computer.
I also sketch stuff (although a decent digitiser would help) and my daughter records and edits films on it as well (she says it's too fiddly editing video on the Mac). I write numerous blog posts, and I've written one essay on it (using an external keyboard). I've even used it for coding (well, not really, just as an SSH client to a Linux box, but the built-in 3G plus Mosh made it extremely convenient for doing non-UI work).
I've not had to do any spreadsheet stuff on it, and I can see why that would be a weakness. Nor have I had to do any Photoshop-level image editing (I have done simple image editing).
But for most "creation" tasks I find the iPad to be competent, and in some cases (especially audio editing) to be superior to a "computer".
Of course, none of the stuff is "professional" level creation - but that still fits perfectly with Steve Jobs' cars vs trucks analogy - most people don't need that level of control.
Well, all that is true. Except my non-technical parents won't be buying any more PCs. They can do everything they want on their iPad. And I'm pretty sure they don't want media editors, compilers, or complex spreadsheet editors either.
> The phrase "The PC is over and PC sucks" appears several times with little explanation other than citing the growth of other markets.
There are some non-market explanations in the article and I could personally think of many more.
The PC could be way better than it is in technical terms, mostly (but not only) on the software side. The main problem is that once Windows reached a de facto monopoly, it had little incentive to innovate and instead had reasons to stay backwards compatible.
My own thoughts on the negligence and indolence of the PC industry are full of rage. But this guy makes my rants seem a little tame. I love it!
I believe I have an unpopular opinion about desktop PCs. The conventional thinking is that desktop computing is boring because a modern PC does everything it is intended to do just fine. That may be true, but the problem is that the industry is not interested in establishing new usage patterns—new things the PC should do.
At the end of last year, I started a series of rants about how modern technology sucks [1] with particular emphasis on the frustrating stagnation of desktop computing and the bothersome way every new portable computing device wants to be a center of attention.
I was pleasantly surprised that the author of the linked article hits the target squarely when he lists off what PCs need. The first item: better displays. He may be speaking more about laptops (and they are deserving of the shame), but allow me to rant a bit about my preferred computing medium—desktops.
The stagnation of desktop displays is, and has been for a decade, the crucial failure of desktop computing. Display stagnation is the limitation that allows all other limitations to be tolerated. It is the barrier that leads the overwhelming majority of users (and even pundits!) who tolerate mediocrity to declare everything else—from processors, to memory and GPUs—as "good enough." I absolutely seethe when I hear any technology declared good enough (at least without a very compelling argument).
Desktop displays, and by extension, desktop computing is so far from good enough that it should be self-evident to anyone who observes users interacting with tablets or mobile phones(!) while seated at a desktop PC. Everything that is wrong with modern computing can be summarized in that single all too common scene:
1. Desktop displays are not pleasant to look at. They are too small. They are too dark. They are too low-fidelity. And they often have annoying bezels down the middle of your view because we routinely compensate for their mediocrity by using more of them, side-by-side.
2. The performance of desktop computers is neglected because "how hard is it to run a browser and Microsoft Office?" This leads to lethargy in updating desktop PCs, both by IT and by users ("I don't want the hassle"). In 2013, I suspect many corporate PCs in fact feel slower than a modern tablet or even mobile phone.
3. Desktop operating systems are actively attempting to move away from (or at least marginalize) their strong suits of personal applications and input devices tailored for precision and all-day usage.
4. Desktop computers--and more accurately personal home networks--have lost their role as the central computing hub for individuals by a misguided means of gaining application omnipresence: what I call "the plain cloud." This is because none in the desktop industry (Microsoft most notably) are working to make personal networks appreciably manageable by laypeople.
5. Mobile phones and tablets are often free of IT shackles and therefore enjoy more R&D (more money to be made).
Desktop displays stopped moving forward in capability in 2001, and in large part regressed (as the article points out) since then. Had they continued to move forward--had the living room's poisonous moniker of "HD" spared computer monitors its wrath--I believe we would have breathtaking desktop displays by now. In that alternate universe, my desktop is equipped with a 50+" display with at least 12,000 horizontal pixels.
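For a sense of scale, here's the arithmetic on that alternate-universe display; the 16:9 aspect ratio is my assumption, not the parent's:

    // Back-of-the-envelope pixel density for a 50" display with
    // 12,000 horizontal pixels, assuming 16:9.
    const diagonalInches = 50;
    const widthInches = (diagonalInches * 16) / Math.hypot(16, 9); // ~43.6"
    const ppi = 12000 / widthInches;
    console.log(`~${ppi.toFixed(0)} PPI`); // ~275 PPI

    // ~275 PPI is roughly phone-"retina" density at desk size, versus
    // the ~90-110 PPI typical of desktop monitors in 2013.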
Desktop computing needs to leverage immersion (without nausea; VR goggles need not apply, yet). Large form-factor super-high-definition displays would bring all manner of new technology needs with them:
1. Gesture controls.
2. Ultra high-bandwidth wired networking (win for wired network folks) to move super high definition files.
3. Ultra high-capacity storage.
4. Extremely fast processors and GPUs to deal with a much greater visual pipeline.
Such a computing environment is a trojan horse for today's tablets: it turns tablets into subservient devices as seen in science fiction films such as Avatar. The tablet is just a view on your application, allowing you to take your work away from the main work space briefly until you return. I say trojan horse, but that's not quite right because I actually want this subservient kind of tablet very much. I do not want a tablet that is a first-class computing device in its own right (even less do I want a phone to be a first-class computing device). I only want one first-class computing device in my life, running singular instances of applications for me and me only, and I want all my devices to be subservient to that singular application host.
For the time being, that should be the desktop PC. In the long haul, it could be any application host (a local compute server, a compute server I lease from someone else, or maybe even a portable device as envisioned by Ubuntu's phone). But for now, the desktop should re-assert its rightful role as a chief computing environment, making all other devices mobile views.
[1] http://tiamat.tsotech.com/technology-sucks
One thing desktop displays desperately need is a way for each pixel to become brighter than its surrounding pixels at will. A lot brighter. Like, 50x brighter.
Dynamic brightness range is a necessary step for writing a 3D renderer that makes you feel like you're looking out a window. 256 levels of brightness aren't nearly enough.
We don't need the ability to specify that a pixel should be brightness level 64823 vs 64824. It doesn't need to be that fine-grained. What we need is the ability to overdrive the brightness of specific pixels. That way sunlight filtering through tree leaves will actually give the impression of sunlight filtering through tree leaves.
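A quick sketch of why the 256 levels run out; the gamma value and the 50x target are assumptions for illustration:

    // If the top 8-bit code must represent a highlight 50x brighter
    // than diffuse white, how many codes are left for normal content?
    // Assumes a simple gamma-2.2 encoding.
    const GAMMA = 2.2;
    const MAX_CODE = 255;
    const HIGHLIGHT_RATIO = 50; // peak luminance / diffuse white

    function encode(luminance: number): number {
      // luminance is linear, in units of diffuse white (1.0 = white)
      return Math.round(MAX_CODE * Math.pow(luminance / HIGHLIGHT_RATIO, 1 / GAMMA));
    }

    console.log(encode(1));               // ~43: code for ordinary white
    console.log(encode(HIGHLIGHT_RATIO)); // 255: code for the sun

    // Everything from black to ordinary white gets squeezed into ~43 of
    // the 256 steps, which bands badly. Hence the parent's suggestion:
    // don't add fine-grained steps everywhere, just let specific pixels
    // be overdriven far above the normal range.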
It's funny how your opinion contrasts with mine.
From what I've seen, displays have become a commodity, and that's a good thing. I can go buy any kind of display, choose a size, and I'm probably fitted with more pixels than I'll ever need; the panels are neat, flicker-free, and flat, and the best part is that they cost next to nothing. You buy a laptop and choose the size based on how much hardware you want to carry around, not because you need a huge screen, because all laptop screens are "good enough," as you mentioned. I haven't had a laptop with less than 1440x900 for... a little less than a decade. And the resolution has never been inadequate for browsing, coding, drawing, writing, and watching movies, which is what I mostly do.
This is completely the opposite of what we had in the '90s, when a 15" CRT was the baseline, you were always a bit short of resolution, and you never had the money to buy that huge one-cubic-meter display that could do your 1280x960 at 60Hz or something, and for which you probably had to upgrade your graphics card and probably your PC too. That totally, totally sucked. One could live with the basic resolution and screen size, but I remember the agony of something better always being almost within reach. These days screens are something you don't think twice about. Everything is, again, dreadfully good enough, and if you need something professional you can get that too, and it probably won't cost you the price of a small car.
In the last 10 years or so I haven't considered once whether I should soon upgrade to a "better" display or to a laptop with a "better" screen. That's bliss, IMHO.
You raise an interesting issue. But is there enough money in large displays?
Let's look at use cases, and focus on the average person (assuming that the gamer market is too small to justify opening a top manufacturing line for large displays).
Use case 1: movies. Even assuming there's value in 4K resolution (which is not certain), you still need to align and improve so many industries to make it work. Really hard.
Use case 2: games. Most games plain folk play are casual games, and playing Angry Birds at high resolution maybe doesn't justify spending that much money. And even if there are ideas for such games, it's still a chicken-and-egg problem.
Maybe the right strategy is to sell 10" retina displays, used at a short distance, to let people experience quality cheaply, build the relevant industries, and then offer large retina displays.
Man do I agree with you. I think the disconnect is that consumer tablet/mobile devices lack the ability to be views for our desktops. PCs are powerful enough to be the locus of consumer computing; we just need a good architecture to make them so.
Not to mention: when it comes to real productivity, you can't beat the desktop. It isn't the profitable sector for manufacturers, but it's still the productivity toolset.
We just need to make them exciting to the overall ecosystem.
This is one way WebRTC will be really useful (since, IMO, consumer IT is mostly Web or mobile apps). With signalling services in place, we'll be able to run hosts from desktops without registering domain names or establishing fixed IPs. The user-centric, desktop-hosted systems can grow out of that.
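A rough sketch of that flow; the signalling endpoint here is a made-up placeholder (any relay that can pass the offer, answer, and ICE candidates between peers would do), while the rest is the standard browser WebRTC API:

    // A desktop "host" reachable without a domain name or fixed IP.
    const signal = new WebSocket("wss://example-signal.invalid/room/42");

    const pc = new RTCPeerConnection({
      // A public STUN server lets both peers discover their reachable
      // addresses even behind NAT, so no fixed IP is needed.
      iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
    });

    // The desktop host exposes its "application" over a data channel.
    const channel = pc.createDataChannel("app");
    channel.onopen = () => channel.send("hello from the desktop host");

    // Trickle ICE: forward candidates to the other peer as they appear.
    pc.onicecandidate = (ev) => {
      if (ev.candidate) {
        signal.send(JSON.stringify({ candidate: ev.candidate }));
      }
    };

    signal.onopen = async () => {
      // Create and publish an offer; the tablet/phone "view" answers.
      const offer = await pc.createOffer();
      await pc.setLocalDescription(offer);
      signal.send(JSON.stringify({ offer }));
    };

    signal.onmessage = async (msg) => {
      const data = JSON.parse(msg.data);
      if (data.answer) {
        await pc.setRemoteDescription(data.answer);
      } else if (data.candidate) {
        await pc.addIceCandidate(data.candidate);
      }
    };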
> Desktop displays stopped moving forward in capability in 2001
Are you kidding? In 2001, it was pricy to get a 15" 1280x1024 LCD monitor. Are we in the same state of affairs today? Would you be willing to wager that the quality, viewing angle, etc of that LCD monitor compares favourably to today's models?
I don't get this article. It seems to be nothing but hearsay and opinion with no references to facts.
Also, the author's point just seems nonsensical to me. I have a big-ass powerful PC and I love it! I use it for gaming, game development, 3D modelling, and music making. I built it myself from components and set up the OS and software exactly the way I want, with Win7 and CrunchBang (Linux). No other device would be able to do exactly what I want like this.
Maybe most people don't want that, but most people have never been technically minded; if they're happier with tablets, then that's fine.
Just want to validate this... I feel the same way. PCs are faster than they've ever been and cheaper than they've ever been. I just bought a smoking-fast custom computer for my parents in an awesome tiny mini-ITX case with a fast SSD and an awesome Ivy Bridge i5 (I think a 3570K?). Including the monitor it was < $1000. I priced out even cheaper PCs with cheaper, larger cases for $300. I also have a big-ass powerful PC and I love it.
I'm in the complete opposite boat - I think the laptop experience still generally sucks. Battery life has only recently improved to a great point, but performance is still generally lacking.
Meanwhile, a great desktop lasts longer than ever, is cheaper than ever, and does everything extremely fast. I have a 5-year-old desktop that outperforms a lot of laptops out there, including my new MacBook Air and my work laptop (not even a month old), and that desktop pales compared to my half-year-old desktop (which cost maybe $200 more than the cheapest 13" MacBook Air).
I think part of the shift in the market is due to how long-lasting desktops have become (and thus declining sales), plus some of the improvements in more mobile devices. I'm highly skeptical of any call that the desktop is going away anytime soon, though, because the mobile experience is still seriously lacking in the sweet spot of performance, battery life, weight, and price.
I hope this trend does not lead to a slowdown in PC innovation.
What PC innovation? PCs are the same trash-can-sized devices they were two decades ago. They just get a little faster every couple of years.
Innovation will be driven by mobile (both phones and tablets). It requires a lot more innovation to build these smaller devices that operate all day on a battery.
I think people are really underestimating where things like perceptual computing are going. It's something Intel is also invested in.
But maybe people should start looking at which jobs require a computer to get essential work done versus those that don't. People who really think that entire generations are not going to need computers to do work are seriously mistaken, especially in BRIC/developing countries. I don't think the question really is whether PCs are dying; the question should be what the hell can I do with a ~$1000 machine other than look at cat pictures. We can thank Microsoft mostly for that. Seriously, I think people really underestimate how turned off the entire industry is by Windows 8; when people need to upgrade, the only reasonable choice is Apple.
Remember, Apple is the only company that is actually increasing laptop sales (MacBook Air models). Clearly there is a market; it's just not being served by the current parties.
As opposed to what... tablets? Or are you suggesting we leap forward to computer terminals right out of the Hitchhiker's Guide? I couldn't read much further than that. Really bad article.
The computers people currently own?
I just recently purchased an Ultrabook; it was on sale and it was a great deal. It had what I wanted: super thin, long battery life, and most importantly a high-res display.
But just the other day I was annoyed by the fan running while I wasn't doing anything intensive. So I checked my power settings and was surprised that the minimum processor speed on battery was set to 100% and the cooling policy set to active. So I changed it. Then a few minutes later I checked it again and it was back to 100%/active. I finally started doing some Googling and found out the f'ing trackpad driver was the cause of those power settings getting reset! Several driver updates and a BIOS update later, the machine is where it should have been to start with.
I can only imagine how many people out there have this exact machine, which is chock full of Intel's best frequency stepping and power management technology, and it's all completely disabled.
Total Apple fanboy rant. The latest Ultrabooks are superior to the MacBook Air, IMO. They are faster, with better battery life, and cost less. I prefer Windows 8 and 8.1 over Mac OS X ML and over iOS 7. I will never buy another iPad or iPhone (I have an iPad 3 and iPhone 5 atm), as I prefer the flexibility of my Ultrabook, and Android phones have leapfrogged iPhones in almost all aspects.
The article is a giant pile of stupid. Umm...I think my Air is a nice laptop. It's packed full of Intel stuff. Why the anti-Intel rant? He's dumb or a liar. Intel doesn't care whether Apple/Intel wins, or Microsoft/Intel wins, or ??/Intel wins.
> What do users want and ask for vocally? Screens that aren’t garbage quality, resolutions that are not worse than mainstream laptops from 2007, SSD instead of error prone and driver dependent ‘hybrid’ garbage, an OS that isn’t grating to the user, decent Wi-Fi, good build quality, and a decent price.
Do they really? IMO, most consumers couldn't care less about any of that; it's a tech-savvy minority that wants higher-quality screens and SSDs. That's exactly why we are seeing zero innovation in the PC monitor space: the market doesn't really care. It cares about price most of all, which leads to the popularity of low-res screens and slow HDDs in the first place.
This 'the PC is dead' nonsense will come full circle eventually. Phones and tablets are PCs; we just haven't yet got to the point where we can satisfactorily dock them with a full desktop accessory set.
I personally see a scenario where everyone has a nice big LCD screen, a full-sized QWERTY keyboard, and probably still a mouse in their study at home, but carries their 'beige box' in their pocket. Just 5-10 years out, IMHO. Unfortunately I think Windows is still positioned best to make this happen.
> Unfortunately I think Windows is still positioned best to make this happen.
Have you looked at the Ubuntu phone OS? It's designed to be the same software, hardware, and UX from phone to desktop. I don't necessarily like it, but it shows promise of doing exactly what you say Windows is best positioned to do.
> Unfortunately I think Windows is still positioned best to make this happen.
As a user who has been very impressed by the combination of http://elementaryos.org/ and the fact that Valve/Steam has recommitted to Linux, I think Windows is on the downturn and Linux is on the upswing.
This is also the direction I think computing is taking. In Apple's case, you might still have the Mac Pro, and perhaps MacBooks, but the iMac and Mac Mini will be replaced by the iPad, which can connect to a monitor, keyboard, and mouse. On the iPad itself you will have the iOS experience, and in desktop mode you will have the current OS X experience. This is the inevitable outcome of the convergence between the two.
This article sucks? Why? Because it's not semi-accurate; it's totally inaccurate, everywhere.
And writing "sucks" in every paragraph doesn't make it right.
A 10% speed improvement a year isn't nothing. The small battery improvement this year? Oh... we went from 6 hours of battery life to 13... it's only double, it sucks! Heck, it's better than my smartphone with the screen on.
The rest is on Windows, which is an OS, not the OS. (Which isn't even a _bad_ OS, despite the hate for Microsoft.)
What's actually happening is that the PC market is basically saturated with machines that pretty much do whatever anybody asks of them.
The market has pretty much plateaued. Pretty much everybody has a PC at home and work. Most households already have multiple computers. Heck, I know entirely non-technical powerwasher/gutter cleaner guys who have 2 or 3 computers. In fact, I don't know a single person older than 10 years old who doesn't have at least one Personal Computer of some kind.
Any commodity off-the-shelf PC will pretty much do whatever you ask of it (at least for most consumers). I used to replace my computer every year or two just so I could run modern software. I haven't felt compelled to do so for the last 6 years, and even now I'm 50/50 on doing it. The rMBP my work issued to me is fantastic for virtualization, but unbelievable overkill for everything else I do (mostly email, Word, and web).
There's just not much of a reason to buy more machines outside of regular replacement rates due to failure and total obsolescence and new humans buying them as they get old enough.
It's not that PCs aren't coming back, it's that the constant growth in the market has plateaued.
Everybody was hoping China, India, and Africa would explode as 3/5ths of the world's humanity moved into the middle class and needed computers, but the growth has been far slower than was hoped, and these first-time computer buyers won't really be constantly upgrading like previous markets did -- the market characteristics are such that it won't be a simple repeat of the 80s, 90s, and early 2000s.
Smartphones and Tablets are an entirely new segment and still growing (though showing some signs of flattening out as well). That's why they're exciting, because those markets are still building out and upgrading. But there are signs that those segments are flattening as well.
Tablets and phones are awesome, but they're definitely not a replacement for a general-purpose PC. Even my mother and father, who're quite the luddites, regularly need capabilities that don't work well on a tablet -- like doing taxes. Even if those things were magically fixed and working awesomely tomorrow, they'd still want a bigger screen than a tablet affords.
PCs aren't going anywhere; it's just that the market has to shift to sustaining itself rather than growing (growth being infinitely more expensive, meaning loads more money sloshing around in the secondary markets). This is fundamentally the problem that both Intel and Microsoft are dealing with. Apple escaped it largely because they created new segments to grow into.
Heck, the one new market segment that PC makers did manage to get into, netbooks, they managed to screw up so bad that the entire segment was dead within just a few years. (If you think of where netbooks needed to go as a segment, the Surface Pro would probably be a reasonable outcome, except that market is totally hosed now and Microsoft has to rebuild it).
Also it becomes even better by imagining it in Bane's voice.
Netbooks didn't get screwed up, it's just that what most people want from a cheap device is primarily content consumption which is done better by tablets.
The upgrade cycle with phones and tablets might be starting to slow as well.
For those who didn't make it to IDF, it felt dead. There was very little attendance in most sessions and the expo floor was also pretty much empty. They actually moved food into demo areas so it looked like there was buzz. I'm pretty sure the "outside of Intel" attendee count was remarkably low.
PCs are dead, but Intel will be fine. Bay Trail will be the beginning of the end for ARM as Intel brings its massive lead in fab technology to bear on the mobile market.
You heard it here first.
Bay Trail has the vast expense of cutting-edge foundries and the entire expensive might of Intel R&D behind it, and it just got beaten by a CPU produced with far less engineering and R&D expense on a last-generation foundry.
Doesn't look good for Bay Trail then.
The fact that tablet devices and phones have entered the market, reducing the need to do everything on a PC, doesn't spell the end of PCs. It just means they aren't the only go-to computer anymore, which is a good thing for everyone.
A rock-solid PC in the home, connected to a nice big monitor and other useful peripheral devices, is a good thing to have. Be it a compact PC, laptop, or desktop; Windows or something else.
"Post PC" is a stupid agenda-driven term. We live in a "post horse and cart" world, but the PC has no inherent limitations preventing it from evolving. If you bother to look, there's currently more enclosures, cases, and interesting "desktop" configuration variety for PCs than ever before, cheaper than ever before.
In short, the article sucks.
Well, I am currently shopping for some used ThinkPad T60's and T61's because I cannot stand the shite keyboards (and also, not infrequently, displays) that have taken over current designs.
This doesn't really speak to market trends, I guess, but making your products physically unpleasant to use probably isn't helping your cause.
I should delete this comment as a pointless rant... but, I'm shopping for 7 year old laptops, dammit. I want to type quickly and pain-free, and also have some vertical context without eyestrain.
I don't think PCs are dying. I think computing consumption is increasing, so desktop productivity looks like it is declining.
I think when a dock for tablets or phones finally happens for consumers, they'll just get it. Desktop mode is not intended for your fat fingers on a touch screen; "Metro" mode is for that. Windows 8 is all about getting off the bus in consumption/tablet mode, docking at your desk, having your dual monitors with keyboard and mouse light up, and going into productivity mode.