The commercialization of computers

Computer commercialization set research back decades and we still haven't recovered. I explore why that is and end on some hopeful notes.

Transcript

Eric Normand: Why are there no new ideas in computing? By the end of this episode, I hope to explore this idea, something I've been thinking about a lot: that there was this golden age in the '60s and '70s when a whole bunch of stuff got invented. Then we took some steps back, and we're still rediscovering all that stuff. What happened?

My name is Eric Normand and I help people thrive with functional programming.

There's a Stack Overflow question about something Alan Kay said: that there have been no new inventions in computing since 1980.

To a lot of us, that's pretty surprising, because we work in computing and we feel like new stuff is coming out all the time. The person asking was like, "What does he mean by this?" And Alan Kay answered the question himself.

He set it out as a challenge: what are the things that you think are new? [laughs] A lot of answers came in to his challenge. He'd say, "Well, but in 1967 there was a paper where they talked about that, you know. Maybe this is a new implementation, you know, it has a lot of practical improvements, but it's the same idea, just newer."

He just goes down the list. People coming up with ideas and he just says, "Well, you know, here's the paper." What's going on? Why is it that things stopped and we're still trying to catch up to where people were back then?

Alan Kay has mentioned it before. I didn't really understand it. I've been thinking about it, researching it. The best answer I can come up with...I mean, something I just think about all the time is the microcomputer revolution.

The commercialization of microcomputers has really set us back. In one sense it set us forward, because we have cheap computers now, because they became consumer grade. I don't know how many computers I've owned; I stopped counting.

There are computers in so many devices now. That wasn't the case when microcomputers first came out. We have all this new computing power. I can go onto Amazon and start up computers in the cloud, as many as I can afford, and they're not expensive.

It's amazing. We have all this great power, but the systems we use, the programming languages, the operating systems, the software, all stem from this time, the time when we had to rebuild everything.

Imagine you were a researcher at a university and you had, maybe not a mainframe, but a minicomputer. Not a microcomputer, but a step up, an order of magnitude bigger than what was available in something like an Apple II.

It was a powerful computer for the time. It cost tens of thousands of dollars, like $40,000-$50,000 in the dollars of the time, so pretty expensive. You could do research on it. You could work in an advanced language with a compiler, with a decent editor.

Then you look at this microcomputer that has no compiler; the language on it is BASIC. If you need to do anything sophisticated, you have to go down to assembly. You would look at it and say, "I can't do the algorithm research that I'm doing in my job on this small computer."

Here's another way to look at it. Steve Jobs went to Xerox PARC, saw the Smalltalk system that Alan Kay's team had developed, and was blown away by it. Smalltalk was running on a $30,000 machine that took up the whole desk and had these big tapes for storing your project on; you had to load in the tape.

Once it was all loaded, it was very interactive. You had a mouse and you could change the code as it's running and move documents around and stuff like that. It was a big inspiration for the Macintosh. From there, Windows and everything we use today...

The Smalltalk system had this really advanced, object-oriented language in the '70s. You could click on anything, see the code for how it worked, modify it, hit save, and the behavior of the system would change right there in front of your eyes. No recompilation step or anything; it would just be different.

In fact, they did this in the demo, but Steve Jobs missed it. The other programmers that were with him from Apple saw it. Anyway, what's important is that when they went back and said, "We got to put this on the Macintosh," what did they develop it in? Assembly.

They made a pretty nice GUI, for what could run on the hardware that they had, the commodity, consumer hardware that they were making. They wrote it in assembly. You couldn't change any code on it. It was a very dumbed-down system.

They had to, to make it fit on the hardware they expected people to be able to afford. It's that kind of step down that we took: from this expressive, super forward-looking system, the Smalltalk system, to a system that, yes, was easier to use and, yes, had a lot of nice things about it.

But in terms of a programming machine, with advanced concepts like being able to modify code and understand how every piece of the system worked, it was a big step down. A backward step, if you will.

Over time, the commodity microcomputers being produced eventually caught up to where the Alto was; that's the machine the Smalltalk system was running on. They were much cheaper, but the programming systems never caught up.

The programming systems, maybe they're not in assembly anymore; they're in C or Objective-C or what have you. But they never took that leap and said, "Wait, we can run Smalltalk now. Let's start over and reinvent what we have from that beginning."

They needed to sell every year. They were a business. They needed to commercialize this, and they didn't want to start over. People were learning the GUI. They didn't want to have to say, "Well, here's a new GUI that we invented for you," because people had invested in training their staff.

They didn't want to have to retrain them on the new GUI. Right? There are all these investments and sunk costs going into it. They couldn't change it.

They could make incremental improvements, but they couldn't say, "Whoa, wait, let's stop, because we could now do on our hardware what they were doing in the '70s. We would be basically where they were with that advanced, expensive computer, except on our cheap commodity computer." But no one's ever done that.

It's odd that no one's ever done it, but you can see why. It's because they're always trying to get the next release out, just adding features, adding features. They're never doing the big rewrite. Big rewrites are expensive. They're risky, so you just keep coming out with the next thing.

We find ourselves in 2019. That's when I'm recording this. I have devices here that have operating systems that don't have some of the features that Smalltalk had back in the day. I can't click on anything and change how it works.

The software we write for them, in general, is still from the batch-computing era, where we submit a program to a compiler, which compiles it and generates an executable. To change anything, we have to basically start the whole thing over.

It does not provide objects that I can interact with in the way a Smalltalk system did. Any Smalltalk object would have methods you could call, messages you could pass it.
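Just to make that concrete, here's a little sketch in Python, since Python is dynamic enough to hint at the idea. This isn't Smalltalk, and the names are made up; the point is that the running object itself is the thing you message and the thing you change.

```python
# A rough stand-in for the Smalltalk experience: a live object you can
# inspect, message, and redefine while the program keeps running.
# These names are invented for illustration; this is not Smalltalk.

class Document:
    def __init__(self, title):
        self.title = title

    def render(self):
        return f"[{self.title}]"

doc = Document("My Notes")
print(doc.render())              # [My Notes]

# In a live system you don't stop, edit a file, and recompile.
# You change the behavior of the running system directly:
def fancy_render(self):
    return f"*** {self.title.upper()} ***"

Document.render = fancy_render   # redefine the method on the live class
print(doc.render())              # *** MY NOTES ***  (same object, new behavior)
```

In Smalltalk it went further, because the editor, the debugger, and your objects all lived together in one persistent image, but the flavor is the same: the running system is the thing you edit.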

As an example, I'm recording this on my computer here. I've got a window open. You can't see it, but I'm going to talk about it. It's QuickTime Player; that's what it's called. It's a piece of software that records video.

How do I get the frames out of this? Shouldn't there be an interface to this object on the screen that lets me query it for a frame at a certain time? And how do I get the raw stream of frames out? This QuickTime Player can talk to the camera, so it somehow knows how to do it.

I want to use the encoding that the QuickTime Player knows how to use. I basically don't own this software. There are only a hundred things I can do through a mouse-driven interface. Yes, there are keyboard shortcuts, but that's not what I'm talking about. I cannot write software that interacts with this software. I can't change how this software works.
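Here's the kind of handle I wish that window gave me. This is a toy mock in Python; nothing in it talks to the real QuickTime Player, and every name is invented. It just shows the shape of the interface I'm asking for.

```python
# A toy mock of the interface I wish the running player exposed.
# None of this is a real QuickTime API; it's all invented for illustration.

class ScriptablePlayer:
    """Hypothetical: a running media player exposed as an object I can message."""

    def __init__(self, frames, fps=30):
        self.frames = frames          # pretend these came straight from the camera
        self.fps = fps

    def frame_at(self, seconds):
        # query it for a frame at a certain time
        return self.frames[int(seconds * self.fps)]

    def raw_frames(self):
        # the raw stream of frames it already has
        yield from self.frames

player = ScriptablePlayer(frames=[f"frame-{i}" for i in range(300)])

print(player.frame_at(seconds=2.5))          # one frame, on demand
print(sum(1 for _ in player.raw_frames()))   # my code, composed with their object
```

If the real player were built out of objects like that, writing software that interacts with this software would just mean sending it messages.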

They could do that in the '70s on the Smalltalk systems. To this day, when you get Smalltalk software, they give you the source code, or they give you an image that contains the whole thing. It's everything. "Here's our software." You can start to mess with it. You change it. You make it do what you want it to do.

I think it's a total shame because we get this software...I'm thinking about video editing software. Final Cut Pro, that's what I use sometimes. I want to change it. I want to make it different, and I can't. I have to learn how it wants me to work with video. I want to work with it in a different way, and that doesn't count.

It's my computer. I paid for it. It's my software. I paid for it, and I can't modify it. We took this huge step back. A lot of ideas are now reaching the light of day. People are reading the old papers because they're on the Internet now. That's a good thing. People are talking about them. There are conferences about old papers.

People are implementing the ideas in their favorite programming languages. That's awesome, but I still feel like that's just piecemeal. We need a revolution. We need to start over. We need something to reset where we are.

These computers...this thing on my desk is way more powerful than an Alto from the 1970s. It's ridiculous. It costs way less, even accounting for inflation, especially accounting for inflation. But it's not as powerful in the sense of giving me power as a user.

Of course, it's not going to happen, because Final Cut Pro costs something like $300. What if they let you develop your own editing software in an easy way? Meaning, what if Final Cut Pro was a bunch of objects?

What if it was not a full suite with a fixed interface? What if it was a bunch of objects that you could program together? I've seen the Smalltalk demos. They were editing video back in the '70s. It was a simple interface, and they were like, "Oh, let me throw something together to help me make this video I just recorded."
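To give a flavor of what I mean by a bunch of objects you could program together, here's a toy sketch, again in Python. None of this is Final Cut Pro's actual API, and it's not the PARC system either; the classes are invented just to show what throwing an edit together could look like.

```python
# A toy sketch of video editing as composable objects instead of a fixed suite.
# These classes are invented for illustration; this is not any real editor's API.

class Clip:
    def __init__(self, name, seconds):
        self.name = name
        self.seconds = seconds

    def trim(self, start, end):
        # return a new, shorter clip instead of mutating this one
        return Clip(f"{self.name}[{start}:{end}]", end - start)

class Timeline:
    def __init__(self):
        self.clips = []

    def append(self, clip):
        self.clips.append(clip)
        return self               # allow chaining

    def render(self):
        total = sum(c.seconds for c in self.clips)
        return f"rendered {len(self.clips)} clips, {total} seconds total"

# "Let me throw something together to help me make this video I just recorded."
intro = Clip("intro", 30).trim(5, 20)
talk = Clip("screencast", 600)
print(Timeline().append(intro).append(talk).render())
```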

That's what they did at PARC: they just threw it together. That's what I want to be able to do. Well, this is turning into a real downer of an episode. I don't want to end on a negative note. The industry is growing, but it started growing so fast when those microcomputers came out. It's been doubling very quickly.

The number of researchers...There were COBOL programmers back then, working on mainframes and stuff. Those were run-of-the-mill code monkeys, I have to say it. They were just people getting a job done: making some banking software, getting the business automated, whatever they were doing. They weren't computer researchers. They were engineers.

The number of computer researchers was really small, and they did amazing stuff, like the Xerox PARC work. That's worth trillions of dollars now; it's added that much to the economy. Now there's this wave of programmers just expanding. It's growing, doubling every five years. That's what I've heard. I don't know if that's true, but it's growing at a fast rate. Everyone knows that.

So fast that universities can't keep up. Boot camps are popping up because there are so many people who want to learn and are willing to pay a lot of money. Anyway, there are people who grew up in industry as part of this wave who are becoming aware that there was a time before microcomputing.

Those people are starting new projects. They're doing the required background research. They're looking back into how we got where we are and what was going on before. I hope one day to count myself among them, but I can't really right now. I don't have the time to read all the stuff I need to read. I am somewhat aware that this is a problem.

I'm hopeful about the number of people who can work on this, who dedicate time to it, who do computer research like it was done before the microcomputer revolution. I'm hopeful that something new is going to come out of it. It might take 20 years. It might take 50 years, because the commercialization has just been so rapid and huge, right?

It's just this huge wave. It's overtaken any of that prior research. Maybe the research is even happening at the same scale; it just looks tiny, and you can't see it next to all of the new versions of software coming out, and "Oh look, a new OS version came out," and "Oh, look at this cool new hardware."

Then there are things like the end of Moore's law. The end of Moore's law means that maybe Intel doesn't have the advantage that it once had. It used to be that Intel would just keep making the same kind of processors but with more transistors. Same architecture. They advanced it, but it was the same architecture.

They knew it was flawed. They weren't happy with the architecture, but because of the commercialization, this is what was selling, this is what was making money, so they kept making them. They tried other things, other chips, other architectures, but they weren't as successful, so they dropped them.

Back then, you could just wait six months and the processor would be twice as fast. Any custom architecture designed for a certain problem or a certain kind of language, the kind that could use cleverer tricks to go faster in certain cases, couldn't keep up: the Intel stuff would double just by making the transistors smaller.

Any advantage you had with cool architectures didn't matter. Now, at the end of Moore's law, maybe that matters more and more. We see it with Apple. They're coming out with custom chips, like for those AirPods. Maybe they have different architectures in there that allow them to be smaller, lower energy, et cetera. That could be a thing.

There are also so many computing resources available. I feel like something could happen there. Maybe we'll realize that we can start doing stuff that looks wasteful from the old paradigm but makes sense in the new one, because there's a vast cloud of computers waiting to run your code. I don't know about that, but I feel like there could be something there.

There's the no-code movement that looks very hopeful to me, and I'm optimistic. The timeframe, I'm not optimistic about. [laughs] I want it now. I don't want to wait 25 years, but things have to take the time that they take.

Awesome. I hope I haven't bummed you out. I hope I ended on a positive enough note, because I am hopeful. I just think that we took a huge step down to build up this industry, and we're only now collectively realizing how bad that backward step was and how we still haven't caught up to where they were.

We are realizing it, and more and more people are realizing it. There are conferences talking about it. There are conferences talking about the papers. A lot of the papers that people bring up are old papers from a long time ago when this stuff was invented.

Yes, thank you. If you liked this episode, you can go to lispcast.com/podcast, and there you'll find all the old episodes with audio, video, and text transcripts. You'll also find links to subscribe in whatever format you want, and links to find me on social media, where I'd love to get into a discussion with you about this topic.

This has been my thought on functional programming. My name is Eric Normand. Thank you for listening, and rock on.
