If you go to any tech conference or developer event, you will inevitably find yourself surrounded by glowy Apple logos. Although this might have baffled developers from years ago, now it’s to be expected. The only people who might be baffled are aspiring coders who start to wonder, "Do all developers use Macs? If so, why? And do I need to switch to Mac if I want to start working in this field?"
These are exactly the questions we will try to answer in this article.
First of all, as we mentioned earlier, if you found a way to travel back in time, let’s say 15 years, and told developers that Apple devices would become so popular among their peers, they would have laughed at your ridiculous ideas and doubted that you did any sort of time traveling whatsoever. That’s because back then, most developers used a pure Unix/Linux environment or Windows.
So what changed?
Well, Apple computers were always great devices to learn to code on, but what really made a difference was the release of OS X and Apple’s switch to Intel.
The Unix Command Line
Of course, this shift didn’t happen overnight. It was a gradual process. For the past 15 years, Apple has been running an operating system that’s built on top of Unix. Moreover, OS X is certified as Unix by The Open Group, the organization behind the Single UNIX Specification. In contrast, not even popular modern Linux distributions like Ubuntu and Mint can boast the same certification, since they’re Unix-like systems built on GNU rather than certified Unix.
The Unix shell is extremely important to developers because it allows them to run programs in almost any language directly from the command line environment without having to use a specialized IDE.
A command-line environment lets users interact with a computer by typing commands. This is how most people communicated with their computers before GUIs (graphical user interfaces) made it possible to simply click on graphical elements.
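To make this concrete, here is a minimal sketch of what "running a program directly from the command line" looks like in a Unix shell. The filename `hello.sh` is just an example; these commands work in any POSIX shell, including the one that ships with macOS.

```shell
# A command-line environment: you type a command, the shell runs it.
# Here we create a tiny script and execute it directly -- no IDE involved.
printf '#!/bin/sh\necho "Hello from the shell"\n' > hello.sh
chmod +x hello.sh       # mark the script as executable
./hello.sh              # prints: Hello from the shell
```

The same pattern works for programs in almost any language: write the source file, invoke the interpreter or compiler from the prompt, and run the result.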
The default Windows command-line environment doesn’t offer the same advantages as the Unix shell on macOS.
Furthermore, most tech companies want their employees to know how to work on Unix-based systems. Developers who can’t afford a Mac will often dual-boot Linux and Windows for a while to gain the experience they need, but eventually many of them buy a Mac, either second-hand or through buy-now-pay-later electronics financing.
Applications and UX
Let’s not forget that developers are also end-users. Much like any other end-user, they prefer well-polished and efficient applications. Part of the shift to Macs is due to the UI and the wide availability of quality applications.
As time passed and more developers switched to Macs, they created more tools to make their work easier, which, in turn, made the platform even more attractive, bringing in more developers.
What about the Hardware?
The build quality of Macs also plays a role in their popularity. They’re durable and require less maintenance. The displays handle glare well, which helps when you need to work in your car or outdoors. Since macOS is more power-efficient than Windows, the battery lasts longer, and combined with the aluminum chassis, Macs don’t heat up as quickly.
Moreover, if you’re trying to make software that runs well on Apple devices, you need to use Apple devices.
While it’s true that Windows computers offer more flexibility in terms of upgrades, and many have similar specs at lower prices, another advantage of getting a Mac is the resale value. Since Apple is a well-respected brand and Macs are known to run well even after several years, developers can upgrade every three years or so by reselling their used devices for around half of what they paid for them. This helps offset the high initial purchase price.
We’ve already established that if you need to develop macOS or iOS software, then you automatically need a Mac. But you can also use a Mac to run all the main operating systems: install Windows or Linux in a virtual machine and you won’t have any problems. On the other hand, it’s difficult (and against Apple’s licensing terms) to run macOS on a Linux or Windows PC. This means that if you have to develop and test software for Windows and Linux, you can still use your trusty Mac.
Furthermore, almost all commercial software companies provide reliable Mac versions, which is not something you can say for Linux. So, as a developer, a Mac is a much more versatile tool than a Linux or Windows computer.
Does this mean I need to get a Mac if I want to learn how to code?
It depends. A Mac is only a tool. It will not teach you how to code or make you a better programmer. But it is more versatile. You have to consider the type of development you want to do.
For example, if you want to develop software for Windows, then a Windows computer is the obvious choice. The same applies to a lot of enterprise software. For web development, Macs work really well, but Linux is also a great alternative. For iOS development, you don’t have much of a choice.
But if you want to learn how to code, you can do it from any platform. The computer you’re already using can get you started on your path. Having said that, getting a Mac can be a huge benefit further down the line.
You also have to take into account your personal preferences. If you can’t afford a Mac right now, you don’t need to start skipping lunch or anything like that. You can still learn without one. But if you can, and you also like the aesthetic, the applications, and the Apple ecosystem, then you should definitely consider making the switch.
Many developers have tried to resist switching either for emotional or philosophical reasons, or they simply thought that Windows computers were more customizable and offered more bang for their buck. Eventually, most of them did concede, which is why you see all those glowy Apple logos at tech conferences.