The M1 chip is fantastic. Imagine a blazingly fast processor, all-day battery life, and no thermal issues. It sounds great, at least on paper. Still, it had to go.
In case you want a single-sentence summary – some data science libraries are either impossible or nearly impossible to run natively, connecting two external displays is a nightmare, and eGPUs aren't supported at all.
That's precisely how the article is structured, so feel free to jump to the section that interests you the most:
Application and Library Compatibility
I'm not a big fan of the Anaconda Python distribution. It's a great idea overall, but I prefer a clean installation of Python 3 and managing dependencies on the fly. Still, Anaconda seemed like the go-to option on the M1 chip.
The default Python 3 on the M1 was 3.9.x, which you first have to downgrade to 3.8.x to make some libraries work. Not a big deal, but an extra step for sure. Even after the downgrade, the only consistent thing I saw when installing libraries natively was a bunch of red lines in the Terminal.
Want to install TensorFlow? Great, but please install a specific version of NumPy and five other packages beforehand. Needless to say, these versions get overridden when you install other packages, unless you specify some extra parameters (or install everything in a virtual environment). Kind of a hassle if you want TensorFlow always available.
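One way to catch that silent-override problem early is to compare what's actually installed against your pins. Here's a minimal sketch – the package names and version numbers are purely illustrative, not the actual pins TensorFlow required:

```python
# Minimal sketch: verify that pinned dependency versions survived a later
# `pip install`. Package names and versions are illustrative only.
def check_pins(installed, pins):
    """Return names of packages whose installed version differs from the pin."""
    return [name for name, version in pins.items()
            if installed.get(name) != version]

# Made-up example: numpy was silently upgraded past the pin, six was not.
installed = {"numpy": "1.19.5", "six": "1.15.0"}
pins = {"numpy": "1.18.5", "six": "1.15.0"}
print(check_pins(installed, pins))  # → ['numpy']
```

In practice you'd feed `installed` from `importlib.metadata.version()` or `pip freeze`, but the idea is the same: fail fast instead of discovering a broken TensorFlow import a week later.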
Anaconda worked fine, but there wasn’t an official release for the M1 chip at the time of testing. This means the entire distribution runs through an emulator called Rosetta 2, which does a terrific job. Still, it’s not native support.
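If you're unsure whether your Python process is actually native or translated, macOS exposes a sysctl key for exactly this – `sysctl.proc_translated` is 1 for a Rosetta-translated process and 0 for a native one. A small best-effort check (it simply returns False on non-macOS systems):

```python
import platform
import subprocess

def running_under_rosetta():
    """Best-effort check: is this process translated by Rosetta 2?

    On Apple Silicon, `sysctl -n sysctl.proc_translated` prints 1 for a
    translated (x86_64) process and 0 for a native arm64 one. On other
    platforms the key (or the command) doesn't exist, so fall back to False.
    """
    if platform.system() != "Darwin":
        return False
    try:
        out = subprocess.run(
            ["sysctl", "-n", "sysctl.proc_translated"],
            capture_output=True, text=True, check=False,
        )
        return out.stdout.strip() == "1"
    except OSError:
        return False

print(running_under_rosetta())
```

Running this inside an Anaconda-installed Python on an M1 at the time would have printed True – a quick way to confirm you're paying the emulation tax.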
For my daily job, I need to communicate with cloud databases a lot, mainly Oracle. The only sane way to do so with Python is through Oracle Instant Client, which hasn't been ported to the new chip.
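To make the failure mode concrete, here's a hedged sketch of how a typical Oracle connection attempt looks from Python via the cx_Oracle package (which loads the Instant Client shared libraries under the hood). The DSN and credentials are placeholders; on an M1 without an ARM build of Instant Client, the connect call fails with a library-loading error:

```python
# Hedged sketch: connecting to Oracle via cx_Oracle, which requires the
# Oracle Instant Client shared libraries (x86_64-only at the time).
def connect_to_oracle(dsn, user, password):
    """Return (connection, None) on success, or (None, error message)."""
    try:
        import cx_Oracle  # fails outright if the package isn't installed
    except ImportError:
        return None, "cx_Oracle is not installed"
    try:
        conn = cx_Oracle.connect(user=user, password=password, dsn=dsn)
        return conn, None
    except cx_Oracle.DatabaseError as exc:
        # On M1 this is typically a DPI error: client library cannot be loaded
        return None, str(exc)

# Placeholder DSN/credentials for illustration only.
conn, err = connect_to_oracle("dbhost/service_name", "user", "password")
print(err)
```

There's no real workaround here short of running your whole Python stack through Rosetta 2, which defeats the point of the chip.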
This was just a short list of things that didn’t work or didn’t work as expected. I’m sure any other tech professional can add issues to the list.
External Display Support
Feel free to skip this section if you’re using a single external display.
13″ isn't enough for comfortable 8+ hour work sessions. Sure, working from home means working from bed on some days, but you'll need that extra screen real estate more often than not.
I’m using two Dell U2419H monitors. One of them is in a normal horizontal position, while the other is pivoted vertically, as you can see from the following image:
Say what you want, but writing code on a vertical monitor is not something you can easily let go of. Take a look at the following image and you’ll immediately get the gist:
In a nutshell – vertical space is a massive productivity booster for anything involving code. Having a single monitor and pivoting it isn't the best option. Having two relatively cheap and color-accurate monitors is the way to go for me.
Unfortunately, the M1 chip in the MacBook Pro and MacBook Air supports only a single external monitor. There are some ways around it, like purchasing a docking station with DisplayLink, but the recommended ones weren't available on Amazon the last time I checked.
eGPU Support
I'm completely fine with sacrificing a dedicated Nvidia GPU to get an ultraportable and sleek-looking laptop. Still, connecting a GPU via Thunderbolt was always an option with Intel-based Macs. Not a cheap one, but it was there.
The M1 chip changed that for the worse. It doesn't support eGPUs at all, and there's nothing you can do about it. That means you can forget about occasional gaming sessions. I'm aware no one buys Macs for gaming, but having the option couldn't hurt.
The M1 chip does come with a Neural Engine, which should help a bit with basic deep learning tasks if you're into that. But it's still a laptop, so don't expect crazy performance. You'll have to switch to Colab or cloud GPUs for that.
This last point shouldn’t be a deal-breaker if you’re into deep learning since mobile GPUs can only get you so far.
To conclude – M1 Macs will get you 90% of the way there, but they are not the best option if you need something ultra-specific. It's easy to go along with the hype, but once it passes, the frustration kicks in. At least that was the case for me.
As many would say – never buy the first generation of Apple products. I agree.
What are your experiences with the M1 chip? I'm eager to hear both pros and cons from folks in any IT profession.