Could somebody point me to a technical explanation of why it's sometimes non trivial to just compile your app against x86-64 and call it a day?
For example, something I encounter every day is Visual Studio and its helper processes being 32-bit. Because Visual Studio regularly, even on the latest 15.7 preview, shits the bed with OutOfMemoryExceptions on our large solution, I'm inclined to rage "why don't they just make it 64-bit? If it could just load more into memory it could get past this indexing hurdle and give me back the UI". But I also understand that if it were that simple they would have done it by now.
Something else, which I understand better, is the LabVIEW RT and FPGA modules only working on 32-bit LabVIEW. I would assume that's related to compiling for and deploying to the 32-bit ARM/x86 RT targets.
The main issue is that 64-bit may cause an app to use more memory. Every pointer doubles in size, the alignment of a structure containing a pointer may grow, and so on. If you allocate a lot of structures and they all grow, your memory use becomes noticeably bigger.
Sometimes legacy code can make assumptions about pointer size. These hacks were more common in the days of porting older systems to 32-bit but it could still happen moving to 64-bit.
If there’s code that tries to manually populate the bytes of data structures, sometimes bugs appear when the target field size changes (e.g. somebody ends up not initializing 4 of the 8 bytes in a now-wider field).
In the case of Apple, the huge pain will be their decision to not port all of Carbon to 32-bit (despite Forstall getting on stage years ago and stating that Carbon was going to be 64-bit “top to bottom”, it never happened). This can mean addressing a whole pile of nothing-to-do-with-64-bit-whatsoever problems before even starting to solve 64-bit problems.
Usually it's old code that assumes sizeof(int) == sizeof(void *) - you could stuff a pointer into an int on 32-bit platforms, but you can't when an integer is 32 bits long and a pointer is 64. In C (or Fortran, ...) it's pretty easy to make this mistake, especially in old K&R-style C - so there's some work to be done porting over.
First, it could be as simple as depending on legacy proprietary software that you don't have the sources for, so you can't recompile it as 64-bit. On macOS this is less of a problem because 32-bit was used only for a short transition period, but think of wanting to run 32-bit Windows software with Wine.
Another reason is that the program was not written with portability in mind, so it works well on 32-bit but shows strange behaviors on 64-bit. That could be due to any number of things, so to use it on 64-bit you must not only recompile the program but debug and fix it.
And then it can be done on purpose: yes, Microsoft compiles Visual Studio as 32-bit deliberately. The main advantages of 64-bit are a bigger address space and a couple more registers; otherwise, on the Intel architecture, the performance is about the same. But with 64-bit you consume significantly more memory, because every pointer inside your program is now twice as big. So if your program doesn't need an address space bigger than 32 bits and you want to save some RAM, staying 32-bit is not as stupid an idea as it might seem. Just as it's not a stupid idea to use a 32-bit OS on a PC with 2 GB of RAM or less.
They've actually talked about why they haven't done this here[1]:
"So why not just move Visual Studio to be a 64-bit application? While we’ve seriously considered this porting effort, at this time we don’t believe the returns merit the investment and resultant complexity. We’d still need to ship a 32-bit version of the product for various use cases, so adding a 64-bit version of the product would double the size of our test matrix. In addition, there is an ecosystem of thousands of extensions for Visual Studio (https://visualstudiogallery.msdn.microsoft.com) which would need to also port to 64-bit. Lastly, moving to 64-bit isn’t a panacea – as others have noted (https://blogs.msdn.microsoft.com/ricom/2016/01/11/a-little-6...), unless the work doesn’t fit into a 32-bit address space, moving to 64-bit can actually degrade performance."
Also, a lot of people don't realize that there is a 64-bit version of the toolsets[2] (for C++ at least). I don't tend to have high memory use by Visual Studio itself but often run out of heap space using the compiler and linker so having access to those can be very helpful.
> Could somebody point me to a technical explanation of why it's sometimes non trivial to just compile your app against x86-64 and call it a day?
Because the program uses a deprecated 32-bit API.
Once the deprecated 32-bit API is dropped, the program simply won't compile.
Also, the modern 64-bit API is different. And by different I mean it has a completely different design and set of interfaces. So to get the program to work again you have to port the program from using the old 32-bit API to using the non-deprecated 64-bit API. That's non-trivial work.
> Because Visual Studio regularly, even on the latest 15.7 preview shits the bed with OutOfMemoryExceptions on our large solution, I'm inclined to rage "why don't they just make it 64 bit?
There are technical reasons alone, and there are decisions made based on other factors.
IIRC the product manager for Visual Studio was against blindly moving VS to 64-bit just because it uses a lot of memory. He considered that fixing the symptom rather than the cause, and wanted the team instead to spend their time identifying inefficiencies and memory leaks.
Now... whether that decision has paid off is not for me to call, but I do appreciate the reasoning behind it.
If they can make it work well with a 32-bit constraint, that's clearly better than yet another app which uses 8GB memory for no reason.
BTW: the 4 GB limit is per-process, not for the OS as a whole. You can easily have multiple 32-bit processes consuming more than 4 GB of memory in total - you don't need to port to 64-bit to get that extra memory.
macOS makes the 64-bit transition a bit harder than that because a lot of the older user space APIs were purposely not ported to 64-bit. If your app was written against the Carbon APIs, you have to do a fair amount of work to change all the GUI code.
Imagine you have a dependency on a third-party library, and they only offer a 32-bit DLL, or the company is out of business and you are stuck with the 32-bit dynamic library. You can't just "compile against x86_64" in this case.
I think the long term plan is to keep adding stuff to VS code, keeping it modular so that it can be anything from a simple editor to the full IDE VS is, and then phase out Visual Studio. Achieves multiple goals in one go, and gives a smooth transition path without the second system syndrome pressure. Everybody wins.
Completely agree, and usually the answer I get is: well, plugin developers should just move their stuff out of process and deal with it. I went to GDC and talked to JetBrains about ReSharper C/C++ - which is a phenomenal product, yet limited by what else lives with it in the process space. So since last week I've had to kill (uninstall/disable) everything else but it, because I depend on it. I briefly had to re-enable the Qt plugin and the P4 one, but had to disable a lot of Microsoft's (whatever I can) and Intel's, and had to leave Sony's SN-DBS, otherwise compile times are slow as hell... So it's manageable, and there was a way to set up different environments (haven't looked deeply into it)... but really, moving to 64-bit would solve a lot of it, even if it would take more memory (bigger pointers - that's usually the stupid excuse).
> Could somebody point me to a technical explanation of why it's sometimes non trivial to just compile your app against x86-64 and call it a day?
"In 32-bit programs, pointers and data types such as integers generally have the same length. This is not necessarily true on 64-bit machines. Mixing data types in programming languages such as C and its descendants such as C++ and Objective-C may thus work on 32-bit implementations but not on 64-bit implementations." [1]
It's non-trivial to ensure every developer writes portable code, multiplied by developers' varying levels of skill.
In a language like C++, assuming that an integer is 4 bytes, or that all pointer types are the same size, can introduce hard-to-track-down bugs, which may leave 64-bit versions of apps unstable.
If people have done architecture-specific things they may not compile (or worse, crash inexplicably at runtime).
I tried doing more or less that (compiling a C++ Windows app for 64-bit) a few jobs ago and eventually discovered that way down in a library somewhere a dev (who had long since departed) did some trickery in string processing routines that relied on arguments being lined up on the stack, which is no longer the case with the x86_64 calling convention. That remains the worst piece of code I've ever worked with - it's innocuous to look at, and any developer who understands why that works should also have known better than to do it.
More generally, integer size issues can arise - if `int` remains 32 bits it's no longer enough to capture the difference between two pointers (and obviously should never have been used for that, but often these things happen).
Two basic APIs are not available in 64-bit: Carbon and QuickTime. Steve Jobs promised to port Carbon to 64-bit when Mac OS X was introduced. So many of us developed for Carbon, and a few years later Carbon was deprecated and Apple announced they would never port it to 64-bit. Even software from Apple such as iTunes was built on Carbon, and it took years for Apple to get the Carbon code out of iTunes...
This is particularly heinous for those of us who do music production. A lot of plugins are orphaned at some point (meaning the developers no longer update them), and if you created a piece of music with those plugins (instruments, effects) and can no longer load them, you can't open your old files.
I often open up sketches from several years prior and consider working more on them. I keep a full 32-bit stack of music stuff left installed exactly for that reason. Usually, if I open them up, then I'll go ahead and move them over to 64-bit semi-equivalents, but that's difficult if you can't even hear what you were doing with the old one.
As an ambient composer myself, I understand your pain, but that is the unavoidable fate of all abandonware technologies.
Abandonware technologies can, however, be used through emulators. If I can use my first computer, the Amstrad CPC 6128 my father bought in 1988, with its magnificent 4 MHz CPU and mind-blowing 128 KB of memory, from any OS (including iOS and Android) via emulators, I am sure using 32-bit VSTs and AUs should be a walk in the park with VirtualBox.
All you will have to do is bounce those plugins to audio and move the audio to your 64-bit OS, or even (it's probably possible) make the two OSes talk to each other so you can use both 32- and 64-bit versions at the same time.
A friend of mine who also does music production keeps a 2012 Mac Pro on Snow Leopard for backward compatibility (with old specialized hardware, PCI Express cards, and software).
For plugins it might well be possible to create a 64 bit shim plugin that fires up the old 32 bit plugin in a virtualised 32 bit environment, with thunks to translate backwards and forwards.
While you will possibly have to upgrade your OS for some software sooner rather than later, MacOS High Sierra should continue to get security updates for a while, and you will be able to run it in a virtual machine like Virtualbox or Veertu, or something else built on Apple's native hypervisor for a very long time.
A lot of apps never made the switch on iOS. I have some useful ones that were orphaned. As I recall, only the original Intel Macs used the 32-bit Core Duo chips, and those lost OS support long ago.
It's weird that DOS (via DOSBox) and old Windows programs are often still runnable, but somehow Mac applications just don't age nearly as well.
I really think if some linux distros can get it together and get some good application support, now is the time for them on the desktop/laptop.
Going to Apple Menu -> About This Mac -> System Report -> Software -> Applications will show a list of applications, with the rightmost column indicating 64-bit support. Roughly 95% of mine are currently 64-bit.
Microsoft and Apple have different approaches to backwards compatibility based on the markets they're in. Microsoft makes most of their money on enterprise, whereas Apple makes most on consumers. Enterprise values running old apps, consumers value running today's apps. Each makes sense for their market... and desktop Linux doesn't make sense for either.
Someone else will know the details a lot better, but Apple has changed the licensing terms on many of its older OS X versions to legally permit virtualization if the host is a Mac. Which OS version images I can download for free in the store right now appears to be linked to which ones I had ever used under my current App Store account.
Given that a lot of current Macs come with only 8 GB of RAM, it is not ideal to run a lot of applications in virtual machines, but applications that need much RAM have almost certainly moved to 64-bit anyway.
Again, someone else will understand it better, but I think products like Veertu were built on Apple's built in hypervisor framework, and it would not be difficult for people to build additional free virtualization options in MacOS. My point is that the reasons for choosing Linux over MacOS are already sufficient without targeting users of older Mac software, and that people who are using older Mac software probably also use new Mac software, and won't have too difficult a time using the older stuff too.
A while ago I would have said that something like Wine would be great for the best software I can remember, which only ran on early Macs. Except that these days you can do that in JavaScript in any browser, and while the software was good, I've already spent a few weekend afternoons playing with it, and I don't need to anymore.
I miss my 32bit iOS games more than I'd miss any 32bit MacOS applications, which I could run anyway through virtualization. I also wish they'd stop disappearing features from Apple software in general, but I think sunsetting 32bit apps or Carbon are generally better for the Macintosh ecosystem.
Re: iOS, yeah, I was looking at my purchased app history to get inspiration for a new project and a decent chunk of apps I bought around the time of the first iPad (as well as newer ones) are no longer available, which is a shame. I can’t really blame the authors for not going back and updating a £1.99 app 6 years after the fact, but it’s a shame and I hope there won’t be another “extinction event” like this. Am I right in understanding that Bitcode should make architecture changes much less of an issue for iOS in future?
The only ones I have that aren't 64-bit are games, interestingly enough. And Calculator and DVD Player, which is odd; I'd expect those to have been ported by Apple by now. That said, the DVD Player app is not a big deal, as I realize I don't have any DVD drives any longer... oh wait, I bought a USB one; I'd have to find it.
Looking at the list I see: Steam stuff, Blizzard stuff, WebEx, and something for the Logitech mouse. Sadly unsurprising - the biggest players are the slowest to move.
Can't imagine why this push is necessary. One of the primary advantages of x86_64 as opposed to other 64-bit architectures is that normal 32-bit applications run natively with no performance hits. As others notice, this is likely going to serve no other purpose than to make abandoned 32-bit applications completely unusable.
Of course, it could also be related to Apple's intention to switch to ARM chips in the near future, and getting everything on consistent 64-bit to aid the porting effort. I can't imagine the developers are going to enjoy low-level mapping of 64-bit x86 instructions to 64-bit ARM though...
This will be what finally moves me away from Adobe software.
I use one Adobe app: Illustrator CS5. My needs are fairly specialist and I haven't needed any new features introduced since CS4 (2008; multiple artboards, finally).
Adobe's only upgrade option is £240pa for a single-app subscription. Fortunately, alternative drawing programs have come on a long way since CS5, so I'll almost certainly jump ship to one of those.
Gee, I'm building, configuring a server based on an Asus motherboard and the AMD FX-8350 processor, 8 cores, 64 bit addressing.
Surprise! I discovered that Windows XP 32 bit Professional SP2 (service pack 2) will install and run! It sees all 8 cores, and the version of Microsoft's TASKMGR plots the activity separately on each of all 8 cores. It also sees the full 16 GB of main memory and is willing to use 2 GB of it with 5 GB of paging space.
And I discovered that the Western Digital (WD) Data Lifeguard Tools CD, IIRC version 11.1, boots and runs! This is amazing since what boots is old DOS! The DOS part will boot from a CD/DVD USB (universal serial bus) drive, but then the WD software doesn't run. But if I boot from a SATA (serial advanced technology attachment) CD/DVD drive, then the WD software does run.
If I have the Windows version running and put the WD CD in the SATA drive, then the WD software appears to run as a Windows application!
My most important application is 32 bit editor KEdit, and I've discovered that it runs fine on Windows 10 64 bit Home Edition on an HP laptop with a 64 bit Intel processor with two cores and 4 threads.
So, lesson: With Windows, AMD, Intel, and ASUS, a lot of 32 bit computing still works! Sorry Apple!
My first intention installing Windows XP was just to run some experiments on using the WD Tools to backup and restore a bootable partition, but I've since discovered that apparently my trusty old copy of Nero for CD/DVD reading/writing that I long used on XP appears to install on Windows 10 on the HP laptop but as far as I can tell won't read or write CDs or DVDs. So, for routine reading/writing CDs and DVDs, apparently I should keep a bootable partition with XP.
Sorry, Apple, 32 bit computing won't go away soon: The reason is standard and old in computing -- there is a lot of old software people very much still want to run.
What is the point of an OS if not to run the software the user has purchased? I don't care, and more importantly don't want to have to care, about the OS beyond it being a launcher for software that allows me to make money. Such a bizarre approach from Apple, reducing the user to an open wallet willing to repurchase software that already works. Also, with Apple forcing the user onto the rolling-release treadmill, it's rather annoying that one can't simply stay on a stable version.
I've got an old Core Duo (32-bit) iMac which is still working fine despite its age (I think it's a late 2009).
My little brother uses it to browse the web.
The sad thing is that it became useless with OS X. Safari couldn't be updated, nor could any other browser, leading to an inability to browse the web because of HTTPS certificate compatibility.
I had to install Linux (with a lot of tricks) and it works flawlessly.
It's sad to see a working computer become obsolete in only 10 years, and while I will probably continue to use Apple products, I feel like there is something _wrong_ with this. That's one of the reasons I am worried about buying an Apple Watch. Obsolescence.
All in all I get it, and I know that it'll probably pay off for them, just like removing the DVD player did, but it'll take time (for me) to get used to the fact that, at least in the Apple ecosystem, things last longer than usual but also become useless sooner than usual.
On the other hand, it is a bit crazy and impressive that OS X/macOS has had to support 32-bit x86 binaries for >12 years because they sold 32-bit Core Duo machines from January 2006 (introduction of Core Duo iMac) until August 2007 (replacement of Core Duo Mac Mini with Core 2 Duo version). 12 years of binary support because they sold 32-bit x86 CPUs for 20 months.
[1] https://visualstudio.uservoice.com/forums/121579-visual-stud...
[2] https://docs.microsoft.com/en-us/cpp/build/how-to-enable-a-6...
It starts with C not having fixed sizes for its datatypes.
Sure, there were always macros/typedefs with fixed sizes, and C99 introduced the stdint header.
However, not everyone actually uses them.
Then there are the bit-fiddling algorithms, unions, and casts that assume a specific memory layout.
Followed by code that might actually become UB when switching to another memory model.
All of that scattered across hundreds of files, not written by a single person, with a history of decades of code changes.
https://docs.microsoft.com/en-us/cpp/build/common-visual-cpp...
http://www.informit.com/articles/printerfriendly/2339636
[1]: https://en.wikipedia.org/wiki/64-bit_computing#64-bit_data_m...
The reason for that is quite similar to the Y2K problem.
https://en.wikipedia.org/wiki/Year_2000_problem
Why would the average developer even have to care?
I don't have the ecosystem tie-in that I used to have. What good is owning a bunch of apps if you can't use them?