Adding more RAM: Not always the solution. https://t.co/O220nQ2SJT
— SwiftOnSecurity (@SwiftOnSecurity) May 8, 2018
Consider me triggered.
There’s a lot in data science that comes down to plain good software engineering or programming skill. In fact, quite a bit more than many people think. Even as I started out — taking a bootcamp to bring myself up to speed — I noticed I was writing much tighter code than anybody else, including the instructor, despite working in a new language and with new concepts. I dismissed that thought for a while, but it keeps coming up, and I’ve long passed the point at which I’m willing to keep patting myself on the back for just being a naturally gifted programmer.
Katherine Scott sort of crystallized this for me with a string of tweets in February. Key among them was the one in which she notes that
We’re computer scientists FIRST. Compute time and storage add up to real money; like millions of dollars. That’s solving a real world problem that people will gladly pay for.
As one who is working mostly on AWS servers these days, the whole thing about compute time and storage adding up to real dollars resonates. But you don’t learn that in school where everything is free. And if you’ve got a decent rig at home you may not learn it there either.
So yesterday was time for my own rant, responding to SwiftOnSecurity‘s comment above. It wasn’t even directly related to the specific technology that was being discussed, but the “just add more RAM” idea tends to trigger me.
Why understanding your technology stack at a lower level is important: Doesn't matter if you're managing application memory, SQL databases, or the WSUS database. You make decisions (even if just going with "default") and decisions have consequences. #SoftwareEngineering https://t.co/ClvUdcXGvG
— Michael Gat (@michaelgat) May 8, 2018
Here’s a slightly better edited version:
Understanding your technology stack at a lower level is important: It doesn’t matter if you’re managing application memory, SQL databases, or the WSUS database. You always make decisions (even if just going with “default”) and decisions have consequences. Sure you can always add memory, or storage, or an extra/faster CPU on the server, or move it to the cloud and pay per cycle. But that’s sloppy and has real costs. Wasting the hardware advances we’ve made on sloppy practices is unprofessional.
I’ve been off in the wilds of project management for years and am far from being a great developer today. But I know one thing: tight code, tight environments, and tight operations matter. The difference may be only a few microseconds on your PC, but in a deployed app that’s multiplied by millions every day, maybe every hour, sometimes even more than that. Those millions or billions of extra cycles, extra bytes of storage, and extra heat that has to be extracted from the data center by costly cooling systems all have real-world costs that add up. If you aren’t looking hard at how you’re doing things and putting effort into minimizing them, you’re missing the point.
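The multiplication above is easy to sketch on the back of an envelope. Here's a rough illustration in Python; every number in it (the 5 ms of avoidable work, the traffic volume, the per-vCPU-hour price) is an assumption for illustration, not real billing data, but the shape of the arithmetic is the point.

```python
# Back-of-envelope: what a small per-request inefficiency costs at scale.
# All figures below are illustrative assumptions, not real billing data.

extra_seconds_per_request = 0.005    # 5 ms of avoidable work per request
requests_per_day = 100_000_000       # traffic for a busy deployed app
vcpu_hour_cost = 0.05                # assumed cloud price per vCPU-hour

extra_cpu_seconds_per_day = extra_seconds_per_request * requests_per_day
extra_vcpu_hours_per_day = extra_cpu_seconds_per_day / 3600
daily_cost = extra_vcpu_hours_per_day * vcpu_hour_cost
annual_cost = daily_cost * 365

print(f"Extra CPU time per day: {extra_cpu_seconds_per_day:,.0f} s")
print(f"Approximate annual cost: ${annual_cost:,.0f}")
```

Five milliseconds nobody would notice on a laptop turns into roughly 139 wasted vCPU-hours per day, on the order of a couple of thousand dollars a year under these assumptions — for one inefficiency, in one service, before you count the storage and cooling.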
Engineering, someone once said, is what happens when you combine science and economics. You can’t forget the economics if you want to be good at it. It doesn’t matter if you’re a developer, a devops engineer, a sysadmin, an architect, managing the WSUS database, or whatever. If you’re not using detailed knowledge of how things work to optimize the operation, you’re not doing your job. Throwing RAM or any other resource at a problem before making full use of the resources you already have is just sloppy.
So, as SwiftOnSecurity commented, sometimes more RAM is not the answer. Usually it’s not. That’s where you go after everything is already as optimized as it can get. Make it run on a Raspberry Pi first, then tell me why you need more hardware. I just had a chat with a junior dev working at a very rich company, who happened to be at my local coffee place. He said he wouldn’t work for anybody who didn’t give him a new computer at least every 18 months; anything older “wasn’t good enough.” I told him he’d be lucky to get a Pi 3B+ from me. I’m building nets on my 4th-gen i7 (admittedly with an upgraded GPU), and he’s not doing anything that demands nearly as much. His work is just not that special, nor is the app he’s building. I need to see that you’re squeezing all the performance out of what you’ve got before I’ll think about upgrading. I don’t care about “cool factor.”
Seriously, if you haven’t figured out how to get the most out of what you already have, don’t ask me for more RAM/disk/CPU or anything else. Buy a Pi Zero and use its limitations to force yourself to write tight code. Then we’ll talk.
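What does “tight code” under a Pi Zero’s memory limits look like in practice? One small, hedged example (the filename is a placeholder): process a file as a stream instead of slurping it all into RAM.

```python
# A memory-frugal habit that a 512 MB Pi Zero forces on you: stream data
# one line at a time instead of loading the whole file into memory.

def count_error_lines(path):
    """Count lines containing 'ERROR' while holding only one line in RAM."""
    count = 0
    with open(path) as f:       # iterating a file object reads lazily
        for line in f:
            if "ERROR" in line:
                count += 1
    return count

# The memory-hungry equivalent -- open(path).read().splitlines() -- pulls
# the entire file into RAM, which works fine on a workstation and falls
# over on constrained hardware once the logs get big.
```

Both versions give the same answer on a small log; only the constrained machine makes you notice the difference — which is exactly the point of learning on one.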
And by the way, did I mention that I’ll be speaking about “Can you do it on a Raspberry Pi?” at ITX-NZ in Wellington on July 12? I will address some of the issues from this thread and explain why everybody should learn on minimalist hardware.