
Sorry, PC companies: you've apparently managed to perfect the PC

37 points | technologizer | 13 years ago | techland.time.com

61 comments

[+] simonsarris|13 years ago|reply
I've felt this way since I built my last desktop in 2008. I was sort of waiting for the "gee, it's time to upgrade" mark to roll around in 3 or 4 years, but it hasn't happened yet. Any games I want to play it still runs very well, and it still feels very fast to me even compared to modern systems.

When my friends ask for advice I tell them that if they like the keyboard and screen, it's just plain hard to be disappointed with anything new.

I think I can pinpoint when this happened. It was the SSD. Getting an SSD was the last upgrade I ever needed.

~~~

Something does worry me slightly about the large shift to tablets, which are great devices in their own right. It's hard(er) to create new content on a tablet, and I don't really want that becoming the default computer for any generation.

I think it's extremely healthy to have the lowest bar possible to go from "Hey, I like that" to "Can I do that? Can I make it myself?"

I think it's something hackers, especially those with children, should ask themselves: would I still be me if I had grown up primarily around content-consumption computing devices instead of more general-purpose laptops and desktops?

[+] unclebucknasty|13 years ago|reply
Nailed it from top to bottom.

That's about the time I noticed that I no longer felt the need to upgrade.

And, yeah, tablets feel very constrained for creation. It is a consumption device; perfect term. I am typing on a Galaxy Tab now, and after over a year it still drives me slightly nuts to type more than a couple of sentences, not to mention the thought of actually designing or developing on this thing.

Maybe that will change as new input devices and peripherals are developed. But, at some point if it evolved enough in that direction, then it would no longer be a tablet per se, but more a PC in a different form factor.

[+] rogerbinns|13 years ago|reply
Just to note that some of us are exceptions to this feeling of adequacy, but that is possibly because I am a software developer. My current workstation has 32GB of RAM and an 8-thread i7. That is the maximum amount of memory the motherboard can take, and Intel had a few faster i7 models, but none sufficiently faster single-threaded to justify the massive price difference. Even my laptop is an i7 with 16GB of RAM (again, both maxed out).

I am having to go in and make all the work I do increasingly parallel and pipelined. This is a lot more effort than straightforward code.
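To illustrate the "pipelined" part of that, here is a minimal sketch of the idea (not the commenter's actual code): two stages connected by a queue, running concurrently so neither sits idle while the other works. The stage functions are stand-ins for real work like decoding and encoding.

```python
# Minimal sketch of a two-stage pipeline: stage1 produces, stage2 consumes,
# and a bounded queue keeps them running concurrently in lockstep.
import queue
import threading

DONE = object()  # sentinel marking the end of the stream


def pipeline(items, stage1, stage2):
    """Run stage1 and stage2 concurrently, connected by a queue.
    Returns stage2's results in input order."""
    q = queue.Queue(maxsize=4)  # small buffer bounds memory use
    results = []

    def producer():
        for item in items:
            q.put(stage1(item))
        q.put(DONE)

    def consumer():
        while (item := q.get()) is not DONE:
            results.append(stage2(item))

    t1 = threading.Thread(target=producer)
    t2 = threading.Thread(target=consumer)
    t1.start(); t2.start()
    t1.join(); t2.join()
    return results
```

For CPU-bound Python stages you would swap the threads for processes (e.g. `multiprocessing`), but the shape of the restructuring is the same.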

I used to have a first-gen i7 and am now on a 3rd-gen i7. I recently transcoded a DVD that I had done before, and the performance difference was incredible. Before, it would take several hours; now it takes 10 minutes! (Improved software may deserve a lot of the credit too.)

I will admit that sometimes I do feel the adequacy. During the tail end of the Sudoku boom I decided to write a solver as an intellectual exercise. I started out with a brute-force solver that tried every possibility. It was written in Python and I gave no thought to optimising data structures or the code - I just needed a baseline. It solved most puzzles before the return key had sprung all the way back. It took less than 5 seconds on "hard" puzzles. The most difficult puzzle I could find took under 45 seconds. I gave up at that point.
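A brute-force solver of the kind described is only a few lines of Python. This is a hypothetical reconstruction, not the original code: plain backtracking over a 9x9 grid, with no optimised data structures.

```python
# Naive brute-force Sudoku solver: try each digit in each empty cell,
# backtracking whenever a constraint is violated. 0 marks an empty cell.

def allowed(grid, r, c, digit):
    """Check the row, column, and 3x3 box constraints for placing digit."""
    if any(grid[r][i] == digit or grid[i][c] == digit for i in range(9)):
        return False
    br, bc = 3 * (r // 3), 3 * (c // 3)
    return all(grid[br + i][bc + j] != digit
               for i in range(3) for j in range(3))


def solve(grid):
    """Solve the grid in place; returns True if a solution was found."""
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                for digit in range(1, 10):
                    if allowed(grid, r, c, digit):
                        grid[r][c] = digit
                        if solve(grid):
                            return True
                        grid[r][c] = 0  # undo and try the next digit
                return False            # no digit fits here: backtrack
    return True                        # no empty cells left: solved
```

Even this unoptimised version solves typical puzzles almost instantly on commodity hardware, which is exactly the "adequacy" being described.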

[+] coldtea|13 years ago|reply
>Something does worry me slightly about the large shift to tablets, which are great devices in their own right. It's hard(er) to create new content on a tablet, and I don't really want that becoming the default computer for any generation.

Depends on the type of content.

For writing text, painting, or making electronic music for example, there are tons of extremely easy to use apps for tablets.

For example, with something like Procreate, people can draw nearly as intuitively as with paper and brushes, something that is difficult on a PC unless you have one of those Wacom monitors/tablets.

[+] hobb0001|13 years ago|reply
> It's hard(er) to create new content on a tablet, and I don't really want that becoming the default computer for any generation.

At the risk of using yet another car analogy, I see PCs becoming the pickup trucks of the computer industry. There will always be plenty of them around, and they will be designed and built primarily for utility. Many people who don't really need the utility will still buy them because they like the style. Over the years, the pendulum will swing back and forth between mobility and power, and there will inevitably be the SUV crossovers as well.

[+] PakG1|13 years ago|reply
What mitigation steps are you taking for when your SSD dies a horrible death? Do you keep all your important files saved on another SSD or in the cloud? And you'll just buy a new SSD and re-image? Or are you doing something else?
[+] DigitalSea|13 years ago|reply
There was an article posted yesterday in which IDC's report blames Microsoft and Windows 8 for the decline in PC sales, but it's refreshing to see proper journalism acknowledging that isn't solely the case and that, in fact, Apple is selling fewer Macs as well. The world is moving toward mobile devices. PCs will always serve a purpose, but some people don't need a PC at all. As a developer and a bit of a designer, I couldn't picture myself coding on a tablet, nor designing on one (prototyping a design, maybe). It's a changing landscape; the likes of Google Glass give us a glimpse of what a future without desktop computer domination looks like.

The real issue here, as touched upon in the article, is that new computers don't really offer an advantage over older ones. Upgrading from a 386 to a 486 back in the day was a reason to upgrade, but my current machine, a spec'd-out Core i7, will last me until it stops working in 4 to 5 years' time. The only sector of computing probably still thriving is storage; people probably upgrade their hard drives more often than their computers. Computing has reached a point where a CPU will last 4 years but a hard drive only lasts as long as it has space left.

[+] unclebucknasty|13 years ago|reply
>As a developer and a bit of a designer, I couldn't picture myself coding on a tablet nor designing on one

That pretty much describes me and the thing is that even though I still need a PC for the same reasons as you, I haven't felt the desire to upgrade as before. The reason? Pretty much exactly what the article states.

I used to upgrade every couple of years, and maybe slightly more frequently. And, I would see significant performance/productivity gains per upgrade.

But, unlike before, I don't feel any performance "pain" with my current desktop or mobile workstation. They are both 3-4 years old, Windows 7 64-bit machines, and pack plenty of power/memory for my (heavy) use. So, I no longer feel a need for better performance.

Seems there was a time when software (including the OS) pushed the hardware, such that users were ever hungrier for more power. But now it seems that hardware has gotten out in front permanently, for all intents and purposes. A slightly above-average consumer rig with sufficient memory can now handle just about any task most folks will throw at it, with virtually no latency.

[+] mahyarm|13 years ago|reply
There is a certain class of developer for whom something like the ASUS Transformer is perfect.

They can:

  - Play all sorts of media
  - Download things with uTorrent
  - Browse the web with WebKit
  - Access terminals
  - Use SSH
  - Write their PHP websites in a highlighting text editor
  - Edit images

Getting all of that in a package that:

  - Is under 2 pounds
  - Has 10-hour battery life
  - Costs $300 if you shop around (almost disposable)
  - Is backed up to the cloud

Add an mSATA SSD and more RAM and it would work for even more developers, photographers, and other professionals. A student who just needs to write papers and use Facebook? Especially them.
[+] ams6110|13 years ago|reply
> for some people a PC isn't needed at all

I'd go even farther than that: for a lot of people, a PC is something they never really wanted. For a long time it was the only practical way to get the things they did want: Email, YouTube, online shopping, social media, music, the web in general. Now that those things can be done, and done well, from a phone or a tablet, and now that you can watch Netflix on your TV with a game console or even directly, a PC is just clutter.

[+] DannoHung|13 years ago|reply
> isn't solely the case and in-fact Apple are selling fewer Mac's as well

Gartner's numbers are a bit different actually, though they both have Lenovo pegged as growing slightly. http://www.gartner.com/newsroom/id/2420816

What if it's just that the market for crappy laptops (Ultrabooks or not) is disappearing?

[+] AngryParsley|13 years ago|reply
The sad thing about this is that tablets and phones aren't nearly as good at content creation. A physical keyboard is still the fastest brain --> computer interface in town. Also, tablets and phones aren't self-hosting. You can't develop iOS apps on an iOS device. This makes it much harder for inexperienced people to get into programming. Taking the plunge into programming will be like deciding to buy an instrument and learning to play it.

While most people are never going to write software, those who do will be hurt by the drop in PC sales. In the past, PC R&D costs were borne by the general public. Now the public is moving to mobile devices, but developers still need to buy full-fledged computers. Lower PC sales means costs will go up (since R&D can't be spread across as many units) or manufacturers won't develop new features as quickly.

There's some silver lining: the technologies used in tablets overlap quite a bit with those used in laptops. Developers won't be stuck completely in the past, but future PCs might be a little too tablet-y for their tastes. (This is already happening with Windows 8).

[+] unholyalliance|13 years ago|reply
I don't think the price of PCs will matter. You can always grab a Bluetooth keyboard if you want to type something out, and there are even web-based IDEs available, which means you're not limited to one platform. Right now, you can add a keyboard to your droid or iPhone and start hacking away.

Additionally, the programming experience is in many ways focused too much on the text based code itself, and less on the act of creation. It may be that changing the PC/developer interface causes a revolution in the way that people program.

[+] rdouble|13 years ago|reply
> The sad thing about this is that tablets and phones aren't nearly as good at content creation. A physical keyboard is still the fastest brain --> computer interface in town.

Tablets are a faster brain --> computer interface than a keyboard for visual arts like drawing, painting, video, and photography. Arguably music too, because you can simulate many different types of input, from drum machines to strings to even wind instruments (a la Smule's Ocarina).

[+] davidroberts|13 years ago|reply
Last year I bought a new motherboard with an eight-core processor when my old one died. Just a couple of days ago I realized that my experience with that setup is exactly the same as with the previous one I bought in 2008. The 1TB hard drive I bought in 2009 is only half full. 16GB of RAM runs no better than 4. Two cores humming along at 3000 MHz can handle everything I throw at them. The others sit idle.

It's a huge change from ten years ago, when I would anxiously await the day I could afford a new rig because I was already pushing my three-year-old one to its limit. Desktop PC technology has clearly reached the point where its capabilities far exceed the needs of ordinary users.

[+] PostOnce|13 years ago|reply
The ordinary-user argument. I argue it too, because it's true for now. Facebook and email don't require 8 cores, a Kepler card, and 32GB of RAM. They require an $80 Pentium 4 machine. YouTube HD is about the only thing a normal person uses that would push that, other than games.

The best counter-argument I've seen so far is that these modern machines are capable of great but uninvented or unpopularized things. If developers give users a reason to upgrade, they will. Nvidia wouldn't exist if game developers hadn't made 3D games to take advantage of their hardware. Same with PCs, developers have to give them a purpose.

I agree with that argument, and I hope someone capable steps up and makes it happen.

[+] bhauer|13 years ago|reply
I still think the problem is the complete lack of innovation with desktop displays for the past twelve years [1]. I very badly want a home computing environment that features a ~50 inch high-DPI screen that I view at a distance of approximately 2 to 3 feet.

I feel high resolution, high density displays would reinvigorate what we currently call "desktop" computing.

[1] http://tiamat.tsotech.com/displays-are-the-key

[+] feral|13 years ago|reply
I have a dual monitor setup, with 27 inch (2560x1440) and 24 inch flat IPS displays.

They are fantastic to develop software on, and they cost less than my PC.

I couldn't have afforded anything like this 12 years ago. I guess it depends what you call 'innovation', but from a making-my-work-experience-nice perspective, I'm delighted with the progress of desktop displays.

I'm not sure how much benefit I'd get from higher res, with the distance the displays are from my eyes?

[+] unholyalliance|13 years ago|reply
50 inches high? Do you mean wide? Given our physiology, with two horizontally placed eyes, we're better suited to a wide screen than a tall one. I'm not convinced your field of view could even make use of an entire 50-inch-high screen at that distance.
[+] okr|13 years ago|reply
On my dev computer at work I used to be able to run the whole application environment. No more. Getting a bigger and bigger machine isn't affordable for me, not to mention the space it takes, the heat it produces, and the noise from that clumsy PC box.

So we started to build our own infrastructure, enabled virtualization, and now give everyone what they need, growing as needed. It feels like a natural development to me.

The modularization of racks is getting better: separate hot-swappable, interconnected CPU, fast RAM, and slower storage units. It feels like a PC itself again. Maybe that will shrink and we'll get it at home again.

I like the idea of owning my own PC. But I think it gets more and more difficult to have everything on it. I will end up with a lot of servers anyway.

On the other hand, a lot of people develop web applications. For that I don't need much power.

-> I need more power -> some people don't need much power => maybe that's one reason why the PC market shrinks

:)

[+] DanBC|13 years ago|reply
Are they taking into account the global economic crisis?

Because I'm pretty sure that businesses were the main buyers of new computers, and that they're not going to buy new ones unless they really need to in this climate.

I agree that Vista, when launched, put a lot of people off.

[+] thirsteh|13 years ago|reply
I think you are overestimating how bad the "global economic crisis" is. Besides, computers are vital to many companies--they're not an expendable "luxury."
[+] georgemcbay|13 years ago|reply
The PC (or the Mac) is, of course, a long way from perfect, but you can count me as another data point for the theory that hardware pretty much reached "good enough for just about anything" a few years back, and I'm saying that as someone who is a programmer and a gamer, so for Joe Q. Public running Office and Chrome this point was reached even sooner.

Core 2 Duo w/ 4 gigs of RAM was, I'd guess, basically the tipping point for normal users, Nehalem w/ 8 gigs of RAM, GeForce 4xx and an SSD for the system disk was the tipping point for people like me.

I used to upgrade my system yearly (buying parts off Newegg, reusing existing bits where they made sense to do so) but now it is more like every three years and growing each time.

There are, of course, lots of ways PC manufacturers can turn this around with increased novel input methods, more hybrid devices and especially an increased focus on higher resolution screens (which has a multiplier effect because if you truly boost your on-screen resolution, you'll soon start feeling cramped by your CPU, GPU and memory again), but the days of tossing out more powerful CPUs, GPUs and a bit more RAM (and then calling it a day as far as new features are concerned) are over.

[+] tracker1|13 years ago|reply
Upgrading from my C2D to a first-gen i7 was the first time I didn't feel pain and the need to upgrade... I did go with an SSD at that time, and spent about $1500 on that desktop, iirc (main case, not monitors, keyboard, etc.)... I recently replaced it with AMD's top 8-core option, which works better for me than an i3/i5 at that price.

I only upgraded because my system was unstable, and it was likely the motherboard (which would mean replacing the mb+cpu) or the power supply; either way, pretty much the same effort/cost as upgrading both. The new system runs great... the irony is that most of my non-work stuff gets done on my HTPC in the living room or my Nexus tablet. My C2D MacBook Pro and my desktop aren't used that much.

[+] Everlag|13 years ago|reply
I must say, despite the gloomy outlook, is the fact that most of the market is running on old machines really a bad one?

Why do you buy a new PC? Why, of course, to do something your old one couldn't. Right now, you have six-year-old consoles, so your standard port won't be incredibly pretty or taxing.

Give it a few months after the launch of the next-gen consoles and PC sales should see an uptick as people start buying the awesome-looking ports being crapped out by the dozen by the big AAA dev shops.

While it won't fix the market, it should have a serious effect on the profitability of a PC business. Combine that with the fact that your tablet, which was nearing current console power, now looks pretty weak, and there's even more reason for the market to keep chugging along as consumers realize their all-in-one isn't the beauty they thought it was.

You guys say you haven't upgraded in years because you can run the new games on a very decent level? Just you wait.

[+] hayksaakian|13 years ago|reply
I used to decry notebooks/laptops as unfit for getting "real work" done, but nowadays they are commonplace.

Beyond work, everything else is easier/better on a tablet/phone.

At this point the only reason I have a windows PC any more is for "HD" gaming and the occasional windows only software.

[+] drewcoo|13 years ago|reply
I would guess that Windows Blue is another data point to support the thesis. Windows sales have traditionally been tied to hardware sales. With declining hardware sales it seems like a way to try to keep milking the cash cow.

And . . . when I Googled for a link to info about Blue for context, lo and behold, I found this: http://online.wsj.com/article/SB1000142412788732374100457841...

[+] jpxxx|13 years ago|reply
Basically, PCs are dumb, boring work terminals for old people.

Children coming of computing age when the iPhone was released are now 9 years old, perhaps on their second or third portable gaming device, and lobbying their parents for an iPhone.

What on Earth would they possibly want a slow, dirty, heavy keyboard computer for? You can't even take a picture with it unless it's an Apple.

[+] pyre|13 years ago|reply

  | Basically, PCs are dumb, boring work
  | terminals for old people.
Or that kid down the street who used his laptop to get something into the App Store and is actually making money...