Apple's 64-bit Chip Could Change Things, but Not Overnight
Test and Monitor | Posted November 26, 2013

[Image credit: Cult of Mac]

The 64-bit nature of the A7 chip is nice. However, developers say any real exploitation of those 64 bits is a long way off.

Apple's latest generation of non-Mac products sports a new processor touted as 64-bit, which is causing a bit of an uproar in the tech sector. And we all know the impact Apple can have far beyond its own borders.

For the record, the performance gains in the new A7 over earlier versions don't come from being a 64-bit chip. They are the result of an aggressive new architecture and some performance tweaking by Apple engineers. The result is a very fast processor that flattens past generations, according to recent benchmarks.

The move to 64 bits on PCs, and especially on servers, has been a significant one. A decade ago, people deployed row upon row of 32-bit Xeon servers that, once the sysadmins loaded Windows Server, IIS, and other software, often had less than a gigabyte of memory left over for the applications themselves. The result was server sprawl, with a whole bunch of servers running at 2-3% utilization.

The advent of 64-bit architectures (capable of addressing as much as 16 exabytes of RAM) and the operating systems to run on those servers led to a massive revolution in server architecture. Virtualization and the cloud would never have happened in a 32-bit world. But on the client side? PCs routinely come with 8GB of memory, mostly because RAM is now cheap. Unless you are doing graphics or video editing, you would have to push your system hard to come anywhere near exhausting it.

In the smartphone and tablet world that ARM dominates, it's a completely different situation. The ARM cores used all across the industry are 32-bit, but that's not an impediment for smartphone and tablet developers. Nothing they do comes even remotely close to the 4GB addressing limit of a 32-bit processor.
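
Both ceilings fall straight out of address width. Here is a minimal C sketch of the arithmetic behind the 4GB and 16-exabyte figures; it isn't Apple-specific, just 2^32 versus 2^64 expressed in bytes:

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* A 32-bit pointer can distinguish 2^32 byte addresses: 4 GB. */
    uint64_t limit32_bytes = 1ULL << 32;

    /* A 64-bit pointer can in principle address 2^64 bytes: 16 exabytes.
       2^64 overflows a 64-bit integer, so express it in gigabytes instead. */
    uint64_t limit64_gb = 1ULL << (64 - 30);

    printf("32-bit address space: %llu bytes (%llu GB)\n",
           (unsigned long long)limit32_bytes,
           (unsigned long long)(limit32_bytes >> 30));
    printf("64-bit address space: %llu GB (about 16 exabytes)\n",
           (unsigned long long)limit64_gb);
    return 0;
}
```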

If anything, AnandTech points out, there are good reasons to stay 32-bit. Putting 8GB of memory in a phone would mean eight times as much memory to power and initialize as in a 1GB phone like the iPhone 5S, and with all the other drains on the battery, Apple is not about to do that. Plus, 32-bit apps get none of the 64-bit benefits, and 64-bit apps tend to use more memory than a 32-bit version of the same app.
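
Part of that memory growth is simply pointer size: every pointer doubles from 4 to 8 bytes in a 64-bit process. A quick C sketch shows the effect; the struct here is hypothetical, but it's typical of the pointer-heavy object graphs apps build:

```c
#include <stdio.h>

/* A hypothetical pointer-heavy node, typical of linked structures in apps. */
struct node {
    struct node *next;     /* 4 bytes on 32-bit, 8 bytes on 64-bit */
    struct node *prev;
    void        *payload;
    int          flags;
};

int main(void) {
    /* Compiled 32-bit, this struct is 16 bytes; compiled for arm64 (or any
       LP64 target) it grows to 32 bytes once alignment padding is counted. */
    printf("sizeof(void *)      = %zu bytes\n", sizeof(void *));
    printf("sizeof(struct node) = %zu bytes\n", sizeof(struct node));
    return 0;
}
```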

Given that the iPhone 5S has only 1GB of DRAM, being able to address more than 4GB does seem rather pointless. Developers feel the memory issue isn't significant for tablets and phones.

“The most common thing associated [with 64-bit] is memory addressing, and the ability to address more than 4 GB of memory. But I don't see Apple making the choice to go 64-bit in iOS because they want to stuff more than 4 GB of memory into iPhones and iPads,” says Ronan O'Ciosoig, co-founder and head of mobile development for Mobile Genius, LLC. 64-bit allows for many other things, such as video editing and filters, he says.

But app development? Maybe not. "64-bit architecture also means a larger address space for RAM, but mobile devices haven't even reached the limit on this anyway," says O'Ciosoig. "Mobile applications in general don't use a lot of RAM compared to their desktop counterparts. Most apps run just fine on 32 bits."

O'Ciosoig praises the iPhone 5S camera and its video recording and editing; this is where the chip's 64-bit nature, with its extra registers, could come in handy. "This is where the extra code can be compiled to run faster. Run real-time algorithms and filters for improved image quality, focus and stability, faster frame rates and so on," he says.

However, Mark Reid, a freelance Mac and iOS software developer in the U.K., has a different take on 64-bit: it's not just about more accessible memory. "In the case of this 64-bit transition, it's not just about being able to address more memory," Reid says. "The chip itself has larger registers. That means that more data can be stored on the processor, so there will be times that it doesn't even need to go to memory. Apple has been able to make huge improvements to the set of instructions the chip uses to get its work done, giving huge speed improvements over the 32-bit counterpart."
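
A toy example makes Reid's point about register width concrete. The helper below is hypothetical, not anything from Apple's frameworks; it works entirely on 64-bit values, which on a 32-bit ARM core each span a register pair and expand into several instructions, while on the A7's 64-bit cores each value fits in one register and the whole function compiles down to a handful of instructions:

```c
#include <stdint.h>

/* Hypothetical 64-bit mixing step, the kind of arithmetic hashes and
   checksums do constantly. On armv7 each uint64_t occupies two 32-bit
   registers; on arm64 it lives in a single 64-bit register, so the
   shift, xor, and multiply each map to one instruction. */
uint64_t mix64(uint64_t a, uint64_t b) {
    a ^= b >> 31;
    return a * 0x9E3779B97F4A7C15ULL;
}
```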

Apple's Xcode developer tools compile an app in both 64-bit and 32-bit versions at the same time, so it's really not as much of a hurdle as the 64-bit transition was, and still is, for Windows developers, Reid notes.
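
In practice that means the same source file gets built twice, once into a 32-bit slice and once into a 64-bit slice, and the compiler tells you which one it is working on. A minimal sketch, relying only on the __LP64__ macro that Apple's toolchain defines when targeting 64-bit:

```c
#include <stdio.h>

int main(void) {
    /* __LP64__ is defined when compiling the 64-bit (arm64 or x86_64)
       slice, where longs and pointers are 8 bytes wide. */
#if defined(__LP64__)
    printf("64-bit slice: pointers are %zu bytes\n", sizeof(void *));
#else
    printf("32-bit slice: pointers are %zu bytes\n", sizeof(void *));
#endif
    return 0;
}
```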

“I would agree with many industry bigwigs that 64-bits just isn't all that significant for consumers,” O'Ciosoig adds. “It didn't make much difference on the desktop when this change happened – so it isn't likely to be even perceivable for most people.”

Will the Mac OS Migrate Again?

In 2005, Apple announced it would abandon the PowerPC architecture for Intel's x86 chips, marking the second CPU migration in the Macintosh's history. The first, from Motorola's 68000 series, wasn't too painful, since Motorola also made the PowerPC chip; shared technology and architecture made it easy to emulate a 68k environment on PowerPC.

That mattered, because much of Mac System 7 was written in low-level 68k assembly for performance reasons, yet the transition was still relatively smooth. Likewise, the move to Intel was helped along by the fact that Mac OS X was built on NeXTSTEP, which in turn used the Mach kernel and BSD Unix, both of which already ran on x86.

With the release of the 64-bit A7, there is speculation (but no real hints) that Apple might unite its products under ARM, and possibly under one OS, either iOS or OS X. That would mean the same pain for developers as last time. At best, it would mean one code base and two compiles, one for x86 and one for ARM. At worst, it would mean two very different code bases.

Both iOS and OS X use Mach-O, the Mach object file format, a format for executables, object code, shared libraries, dynamically loaded code, and core dumps. One of its advantages (or shortcomings, depending on how you look at it) is that code built this way uses only the registers present on every CPU it targets. Given the difference in architecture between ARM and x86, that's really hitting the lowest common denominator.
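
Mach-O is also what makes the "one code base, two compiles" option workable: a "fat" Mach-O file carries a header that lists a separate slice for each architecture, which is the mechanism behind shipping multiple binaries in one app package. The sketch below is macOS-only (it leans on the system's <mach-o/fat.h> header) and simplified to the classic fat header; it just reads that header and prints the slices it finds:

```c
#include <stdio.h>
#include <stdint.h>
#include <arpa/inet.h>   /* ntohl: fat headers are stored big-endian */
#include <mach-o/fat.h>  /* struct fat_header, struct fat_arch, FAT_MAGIC */

int main(int argc, char *argv[]) {
    if (argc != 2) {
        fprintf(stderr, "usage: %s <binary>\n", argv[0]);
        return 1;
    }

    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return 1; }

    struct fat_header hdr;
    if (fread(&hdr, sizeof hdr, 1, f) != 1 || ntohl(hdr.magic) != FAT_MAGIC) {
        printf("Not a fat (multi-architecture) Mach-O file.\n");
        fclose(f);
        return 0;
    }

    uint32_t count = ntohl(hdr.nfat_arch);
    printf("Fat binary with %u architecture slice(s):\n", count);

    /* Each fat_arch entry records which CPU type the slice targets and
       where its bytes live inside the file. */
    for (uint32_t i = 0; i < count; i++) {
        struct fat_arch arch;
        if (fread(&arch, sizeof arch, 1, f) != 1) break;
        printf("  cputype=%d  offset=%u  size=%u bytes\n",
               (int32_t)ntohl((uint32_t)arch.cputype),
               ntohl(arch.offset), ntohl(arch.size));
    }

    fclose(f);
    return 0;
}
```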

So what do developers think of this notion? They don’t like it, nor do they think it will happen.

"A7 is just the beginning and has a long way to go before it would be capable of competing with Intel's CPUs,” says Adam Lindley, a developer with ACME AtronOmatic, maker of the MyRadar weather app for smartphones. “Apple doesn't really like to rush things. If they were to make that kind of switch we probably wouldn't see it happen for at least four to six years from now, but then again Mac has gone through three major CPU architectures and the Mach kernel does allow for multiple binaries in the app package.”

“The biggest boon Apple will receive from switching to ARM 64 for iOS devices would be that OSX and iOS can share even more of the base/foundation source code, with them both being 64 bit," Lindley adds.

There is no chance at all that this is going to happen in the immediate future, says O'Ciosoig. “But maybe in a few years we will have some kind of convergence. For now, the processing power of an Intel chip is a high multiple of what the A7 is capable of.” Plus, Intel and ARM are on opposite sides of the spectrum when it comes to microchip architecture. “ARM is low power RISC design principles whereas Intel have stuck to their CISC model since the 80s,” he says.

CPUs are more than just instruction-processing pipelines, points out Maurice Sharp, a developer and author of the book Learning iOS 7 Development (Addison-Wesley); they contain much more functionality. For example, Intel's new Haswell line of chips for laptops has an integrated GPU and dedicated instructions for AES encryption. The ARM chips Apple uses in the iPhone are more about low-energy, high-speed instruction processing, but they also include specialty hardware, such as an image signal processor for stabilizing camera shots.

"So the basic answer is, sure, they could switch to ARM on Mac, the question is ‘Why would they?’” says Sharp. “Doing so would be a fair bit of engineering work, even though both Mac OS and iOS have the same parent. If you consider the ROI (return on investment) of doing that work, it is not clear it makes sense at the moment.”

Switching to ARM would mean switching the kernel and possibly other low-level code, which would essentially change what Mac OS is about, since it's built on the Mach kernel.

Sharp thinks a more likely scenario is a longer term migration to iOS for everything. “You can already see it happening. Many of the changes in Mavericks made desktop applications more like their iOS cousins,” he says. “The look and feel also is starting to unify, in as much as it makes sense for desktop/laptop versus handheld/tablet.”

The bottom line, Sharp says, is that it is technically feasible to make the switch, but the transition cost does not seem worthwhile at the moment. So enjoy your x86 coding.
