Maybe about a decade ago, a friend of mine had a little kitten named Olive. I was over at their house one summer day, drinking a glass of ice water filled with that crushed ice like you get in a Slurpee.
The kitten got interested in my water, so I tipped the glass and let her have a few licks. She went nuts. She stuck her little head in my glass and greedily lapped at the ice water. I took a swig, and she jumped up on me and tried to get more. That is the power of ice water. The sweet cold taste of clear mountain water off a glacier. Unbelievable. No wonder it was a luxury in Roman times. They'd cart the ice down from the mountains, packed in straw, and store it in rock tunnels. And the Romans would enjoy ice water, same as us. Except, of course, we don't have to worry about diphtheria and typhoid fever.
So this morning, I go to one of those old-fashioned drinking fountains, the Oasis brand. The ones that have the Greek-letter-looking EBCD on the handle. Since I was dehydrated from the night before and had biked into work, the water - ice cold - was beyond refreshing. It was indescribably good, and I understood why that little kitten had so greedily lapped it up.
What a luxury that is! Clean cold water. Jesus, we don't have any conception of just how amazing it is until, well, after the Apocalypse, I suppose.
I'll miss coffee as well. And ice cream.
But that's not what I want to talk about. So, I'm thinking lately about neural nets, and how basically everyone is going neural net/machine learning crazy lately. As if it were finally an appreciated luxury like ice water.
Well, it's nothing of the kind. It's true that the current versions are impressive, but that's really more due to advances in hardware and sheer processing power than to any game-changing software advances - like, for example, coding expertise, of which there is hardly any. I mean, perceptrons were doing symbolic logic in the 1940s. ANDs, ORs and NOTs had all been successfully implemented in Hebbian networks. The XOR (which computer scientists said neural nets couldn't do, and therefore couldn't be good computers) was figured out in 1946. Here's how you do it:
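Here's a minimal sketch of one classic construction - my illustration, not any particular historical circuit - using a two-layer network of threshold units with hand-picked weights. The trick is that XOR is "one input or the other, but not both": an OR unit and an AND unit in a hidden layer, and an output unit that fires when OR is on and AND is off.

```python
# XOR from threshold units: XOR(a, b) = OR(a, b) AND NOT AND(a, b).
# Weights and thresholds are set by hand, McCulloch-Pitts style.

def step(x):
    """Heaviside step activation: fires 1 if input is at or above zero."""
    return 1 if x >= 0 else 0

def xor(a, b):
    # Hidden layer: one OR unit and one AND unit.
    h_or = step(a + b - 0.5)    # fires if at least one input is 1
    h_and = step(a + b - 1.5)   # fires only if both inputs are 1
    # Output unit: fires when the OR unit is on and the AND unit is off.
    return step(h_or - h_and - 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor(a, b))  # prints the XOR truth table
```

The point a single-layer perceptron can't make is exactly the one the hidden layer makes for you: no single weighted sum of a and b separates (0,1) and (1,0) from (0,0) and (1,1), but two thresholded sums feeding a third do it trivially.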
But we don't care, because we don't really want to build computers. We know how to build computers. We have had computers since at least 500 BCE, and probably long before.
We want to build electronic brains. And we do it by reverse engineering real brains - still primitive aping, really. The deep learning networks we have, impressive as they are in performance, are woefully sparse and scarecrow-stiff in comparison to real brains. Neurons do amazing calculations inside themselves, chemicals and hormones flood the network with broadcast signals, and the narrowcast signals we ape with these weighted links in our nets are just so fucking paltry and primitive compared to the symphonic splendor of even worm brains.
So, when all those experts tell the kids that it is important to learn to code, all I can think of is that it is rather like telling kids in the 60s that it is important to learn Morse code. Because, you know, telecommunications! Satellites! All those satellites are gonna be beep beep booping at each other, so you better know all about it, kids!
But the fact of the matter is that the new model to go by is the neural net. Everything can be a neural net. Shipping networks, cars, anthills, beehives, classrooms, you name it. And telling people they can get by with a little IF THEN ELSE is, quite frankly, a little dishonest. It's getting their hopes up that when they lose that make-work factory job, they'll earn big bucks as coders! Well, sorry, but the code doesn't need your stupid help. All it needs from you is what you want it to do.
Is there one place where learning to code might come in handy? Yeah, biology, but you gotta be smart for that. So, I guess you all better hope that the basic living income comes to pass...
Oh, yeah, and you should probably kill your boss while you are at it.