Monday, February 15, 2021

Linked lists, dictionaries, and AI

When I was learning the craft of programming, I spent a lot of time learning about data structures (linked lists, trees, and other things). How to create them. How to add a node. How to remove a node. How to find a node. There was a whole class in college about data structures.

At the time, everyone learning computer science learned those data structures. Those data structures were the tools to use when designing and building programs.

Yet now in the 21st century, we don't use them. (At least not directly.)

We use lists and dictionaries. Different languages use different names. C++ calls them 'vectors' and 'maps'. Perl calls them 'arrays' and 'hashes'. Ruby calls them ... you get the idea. The names are not important.

What is important is that these data structures are the ones we use. Every modern language implements them. And I must admit that lists and dictionaries are much easier to use than linked lists and balanced trees.
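The difference in effort is easy to see side by side. Here is a small sketch in Python (any modern language would look similar): first the kind of linked list we once wrote by hand, then the built-in equivalents.

```python
# The old way: a hand-rolled singly linked list, with explicit node bookkeeping.
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

def prepend(head, value):
    """Add a node to the front of the list, returning the new head."""
    node = Node(value)
    node.next = head
    return node

def find(head, value):
    """Walk the list, node by node, looking for a value."""
    while head is not None:
        if head.value == value:
            return head
        head = head.next
    return None

# The modern way: one line each, no nodes, no pointers.
items = []
items.append(42)           # add
has_42 = 42 in items       # find

ages = {}
ages["alice"] = 30         # add
has_alice = "alice" in ages  # find
```

Everything the linked-list code does explicitly, the built-in list and dictionary do for us behind the scenes.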

Lists and dictionaries did not come for free, though. They cost more in terms of both execution time and memory. Yet we, as an industry, decided that the cost of lists and dictionaries was worth the benefit (which was less time and effort to write programs).

What does this have to do with AI?

It strikes me that AI is in a phase equivalent to the 'linked list' phase of programming.

Just as we were convinced, some years ago, that linked lists and trees were the key to programming, we are (today) convinced that our current techniques are the key to AI.

It would not surprise me to find that, in five or ten years, we are using completely different tools for AI.

I don't know what those new tools will be. (If I did, I would be making a small fortune implementing and selling them.)

But just as linked lists and trees morphed into lists and dictionaries with the aid of faster processors and more memory, I think AI tools of today will morph into the tools of tomorrow with better hardware. That better hardware might be faster processors and more memory, or it might be advanced network connections and coordination between processes on different computers, or it might even be better data structures. (The last, technically, is of course not hardware.)

Which doesn't mean we should stop work on AI. It doesn't mean that we should all just sit around and wait for better tools for AI to appear. (If no one is working on AI, then no one will have ideas for better tools.)

We should continue to work on AI. But just as we replaced the code that used older data structures with code that used newer data structures, we should expect to replace early AI techniques with later AI techniques. In other words, the things that we build in AI will be temporary. We can expect to replace them with better tools, better models -- and perhaps not that far off in the future!
